Recycled spamtraps

Spamtraps strike fear into the hearts of senders. They’ve turned into a monster metric that can make or break a marketing program. They’ve become both a measure and a goal, and I think some senders put far too much emphasis on spamtraps instead of worrying about their overall data accuracy.

Recently I got a question from a client about the chances that any address they were currently mailing would turn into a recycled spamtrap. Assuming both a well behaved outbound mail server and a well behaved spamtrap maintainer, the answer is never. Well behaved spamtrap maintainers will reject every email sent to one of their spamtrap feeds for 6–12 months; some reject for longer. Well behaved mail servers will remove addresses that consistently bounce and never deliver.

Of course, not everyone is well behaved. There are maintainers who don’t actively reject mail; they simply pull the domain out of DNS for years and then start accepting mail again. Well behaved mail servers can cope with this: they create a synthetic bounce when they get NXDOMAIN for an address and eventually remove the address from future mailings. There have been cases in the past where spamtrap maintainers purchased expired domains and turned them into spamtraps immediately. No amount of good behaviour on the part of the sender will cope with that situation.
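That well behaved behaviour can be sketched in a few lines: count an NXDOMAIN lookup failure as a bounce just like an SMTP rejection, and stop mailing any address that fails a few deliveries in a row. This is a minimal illustration, not any real MTA’s implementation; the class name, result labels and threshold are all mine.

```python
from collections import defaultdict

SUPPRESS_AFTER = 3  # consecutive failures before we stop mailing (illustrative)

class BounceTracker:
    """Toy model of well behaved bounce handling."""

    def __init__(self, suppress_after=SUPPRESS_AFTER):
        self.suppress_after = suppress_after
        self.failures = defaultdict(int)   # address -> consecutive failures
        self.suppressed = set()

    def record(self, address, result):
        """result is one of 'delivered', 'smtp_bounce', 'nxdomain'."""
        if result == 'delivered':
            self.failures[address] = 0
            return
        # NXDOMAIN is not an SMTP response, but a well behaved server
        # counts it as a bounce all the same.
        self.failures[address] += 1
        if self.failures[address] >= self.suppress_after:
            self.suppressed.add(address)

    def should_mail(self, address):
        return address not in self.suppressed
```

An address that keeps resolving to NXDOMAIN gets suppressed long before the domain can come back to life as a trap; a badly behaved server that only reacts to SMTP responses keeps mailing it forever.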

On the flip side, some MTAs never correctly handle an undeliverable address when the reason is anything other than a direct SMTP response. Generally these are built on open source MTAs by people who don’t realise there are mail failures outside of SMTP failures.

There are three general cases where recycled spamtraps will show up on a list.

  1. The list’s bounces have been improperly handled.
  2. An address has not been mailed for more than a year.
  3. Someone signs up an address that’s a recycled spamtrap (the same way a pristine trap gets added to a list).

ESPs have to worry about recycled spamtraps in another common case. A new customer brings over a list and decides to retry addresses that their previous ESP marked as bounced. (It happens. Regularly.)
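The guard for that case is straightforward: before mailing an imported list, drop anything the previous provider already marked as undeliverable. A hypothetical sketch, assuming the old suppression data came over with the list; the function and argument names are mine, not any ESP’s API.

```python
def filter_imported_list(addresses, previously_bounced):
    """Return only the addresses the previous ESP never marked as bounced.

    Retrying previously bounced addresses is exactly how recycled
    spamtraps end up getting hit from a new ESP.
    """
    bounced = {a.strip().lower() for a in previously_bounced}
    return [a for a in addresses if a.strip().lower() not in bounced]
```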

Recycled addresses are a sign that there is a problem with the long term hygiene of a list. As with any spamtrap, they point to problems with data collection and maintenance. The traps aren’t the problem; they’re just a symptom. Fix the underlying issue with data maintenance and traps cease to be an actual issue.

Related Posts

The source of deliverability problems

Most deliverability problems don’t start where many people think they do. So very often people call looking for deliverability help and tell me all about the things they’re doing to reach the inbox. They’ll tell me about content, they’ll tell me about bounces, they’ll talk about complaints, engagement, opens and clicks. Rarely will they bring up their list source without some prompting on my part.

The reality is, though, that list source is the root of deliverability success and deliverability problems. Where did those addresses come from, and what do the people who gave them think you’re going to do with them?
Outsourcing collection to a third party can cause significant issues with delivery. Letting other people collect addresses on your behalf means you lack control over the process. And if you’re paying per address, then there is a monetary incentive for that company to pad the list with bogus addresses.
Sometimes there are even issues with having your own employees collect addresses from customers. For instance, a retailer requires sales associates to collect a minimum percentage of addresses from customers. The company even ties the associates’ evaluations to that percentage. Associates have an incentive to submit addresses from other customers. Or a retailer will offer a discount for an address, and customers want the discount but not the mail, so they give a fake address.
All of these things can affect deliverability.
Address collection is the key to delivery, but too many companies just don’t pay enough attention to how they’re collecting addresses and entering into the relationship with subscribers. This is OK for a while, and delivery of small lists collected like this can be great. But as lists grow in size, they come under greater scrutiny at the ISPs and what used to work doesn’t anymore.
The first step to diagnosing any delivery problem is to look at the list. All of the things ISPs use to measure reputation measure how well you’re collecting addresses. Changing IPs or domains or content doesn’t change the reason mail is being filtered. It just means the filters have to figure out something new to key on.
Want great deliverability? Start with how you’re collecting addresses.
Want to fix deliverability? Start with how you’ve collected addresses, how you’ve stored them and how you’ve maintained them.


Dueling data

One of the things I miss about being in science is the regular discussions (sometimes heated) about data and experimental results. To be fair, I get some of that when talking about email stuff with Steve. We each have some strong view points and aren’t afraid to share them with each other and with other people. In fact, one of the things we hear most when meeting folks for the first time is, “I love it when you two disagree with each other on that mailing list!” Both of us have engineering and science backgrounds, so we can argue in that vein.
One of the challenges of seemingly contradictory data is figuring out why it seems to disagree. Of course, in science the first step is always to look at your experimental design and data collection. Did I do the experiment right? (Do it again. Always do it again.) Did I record the data correctly? Is the design right? So what did I do differently from what you did? For instance, at one of my labs we discovered that mixing a reagent in plastic tubes created a different outcome from mixing the reagent in glass vials. There are so many variables that you don’t even think of as variables, and they affect the outcome of an experiment.
