Recycled spamtraps

Spamtraps strike fear into the hearts of senders. They’ve turned into a monster metric that can make or break a marketing program. They’ve become a measure and a goal, and I think some senders put way too much emphasis on spamtraps instead of worrying about their overall data accuracy.

Recently I got a question from a client about the chances that any address they were currently mailing would turn into a recycled spamtrap. Assuming both a well behaved outbound mail server and a well behaved spamtrap maintainer, the answer is never. Well behaved spamtrap maintainers will reject every email sent to one of their spamtrap addresses for 6–12 months; some reject for longer. Well behaved mail servers will remove addresses that consistently bounce and never deliver.

Of course, not everyone is well behaved. There are maintainers who don’t actively reject mail; they simply pull the domain out of DNS for years and then start accepting mail. Well behaved mail servers can cope with this: they create a synthetic bounce when they get NXDOMAIN for an address and eventually remove the address from future mailings. There have been cases in the past where spamtrap maintainers purchased expired domains and turned them into spamtraps immediately. No amount of good behaviour on the part of the sender will cope with this situation.
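The NXDOMAIN handling described above can be sketched roughly like this. This is an illustrative toy, not any real MTA's implementation; the class name, the retry threshold, and the resolver interface are all assumptions made for the example:

```python
# Illustrative sketch: treat NXDOMAIN as a synthetic hard bounce and
# suppress addresses that fail on several consecutive attempts.
from collections import defaultdict

MAX_CONSECUTIVE_BOUNCES = 3  # hypothetical threshold


class BounceHandler:
    def __init__(self, resolve_mx):
        # resolve_mx: callable taking a domain and returning a list of
        # MX hosts, or None when the domain no longer exists (NXDOMAIN).
        self.resolve_mx = resolve_mx
        self.bounce_counts = defaultdict(int)
        self.suppressed = set()

    def attempt_delivery(self, address):
        if address in self.suppressed:
            return "suppressed"
        domain = address.split("@", 1)[1]
        if self.resolve_mx(domain) is None:
            # Domain is gone: record a synthetic bounce even though no
            # SMTP conversation ever happened.
            self.bounce_counts[address] += 1
            if self.bounce_counts[address] >= MAX_CONSECUTIVE_BOUNCES:
                self.suppressed.add(address)
            return "bounced"
        self.bounce_counts[address] = 0  # success resets the count
        return "delivered"
```

After three consecutive NXDOMAIN results the address lands on the suppression list and is never retried, which is exactly the behaviour that keeps a well maintained list from mailing a resurrected domain.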

On the flip side, some MTAs never correctly handle undeliverable addresses when the failure is anything other than a direct SMTP response. Generally these are built on open source MTAs by people who don’t realise there are mail failures outside of SMTP failures.

There are three general cases where recycled spamtraps will show up on a list.

  1. A list’s bounces have been improperly handled.
  2. An address has not been mailed for more than a year.
  3. Someone signs up an address that’s a recycled spamtrap (the same way a pristine trap gets added to a list).

ESPs have to worry about recycled spamtraps in another common case. A new customer brings over a list and decides to retry addresses that their previous ESP marked as bounced. (It happens. Regularly.)
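Guarding against that case is mostly a matter of honouring the previous ESP's suppression data at import time. A minimal sketch, with purely hypothetical names (no particular ESP's API is implied):

```python
# Illustrative import-time filter: drop any address the previous ESP
# already recorded as bounced, rather than retrying it.
def filter_import(new_list, previously_bounced):
    """Return only addresses never marked as bounced before."""
    bounced = {addr.lower() for addr in previously_bounced}
    return [addr for addr in new_list if addr.lower() not in bounced]
```

Simple as it is, skipping this step is how long-dead addresses, and the recycled traps among them, come back onto a list.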

Recycled addresses are a sign that there is a problem with the long term hygiene of a list. As with any spamtrap, they’re a sign of problems with data collection and maintenance. The traps aren’t the problem, they’re just a symptom. Fix the underlying issue with data maintenance and traps cease to be an actual issue.

Related Posts

Organizational security and doxxing

The security risks of organizational doxxing. 
These are risks every email marketer needs to understand. As collectors of data, marketers are a major target for hackers and other bad people. Even worse, many marketers don’t collect valid data and risk implicating the wrong people if their data is ever stolen. I’ve talked about this before in a number of posts about misdirected email, where people get mail not intended for them. Consumerist, as well, has documented many incidents of companies mailing the wrong person with PII. Many of these stories end with the company not allowing the recipient to remove the address from the account because the user can’t prove they own it.
I generally focus on the benefits to the company of verifying addresses. There are definite deliverability advantages to making sure an email address belongs to the account owner. But there are also PR benefits to not revealing PII attached to the wrong email address. With Ashley Madison, nearly every article mentioned that email addresses were never confirmed. But how many other companies don’t verify email addresses and risk leaking personally damaging data belonging to non-customers?
Data verification is so important. So very, very important. We’ve gone beyond the point where any big sender should just believe that the addresses users give them are accurate. They need to do it for their own business reasons and they need to do it to prevent incorrect PII from being leaked and shared.

Read More

Dueling data

One of the things I miss about being in science is the regular discussions (sometimes heated) about data and experimental results. To be fair, I get some of that when talking about email stuff with Steve. We each have some strong view points and aren’t afraid to share them with each other and with other people. In fact, one of the things we hear most when meeting folks for the first time is, “I love it when you two disagree with each other on that mailing list!” Both of us have engineering and science backgrounds, so we can argue in that vein.
One of the challenges of seemingly contradictory data is figuring out why it seems to disagree. Of course, in science the first step is always to look at your experimental design and data collection. Did I do the experiment right? (Do it again. Always do it again.) Did I record the data correctly? Is the design right? So what did I do differently from what you did? For instance, at one of my labs we discovered that mixing a reagent in plastic tubes created a different outcome from mixing it in glass vials. There are so many variables you don’t even think of as variables that affect the outcome of an experiment.

Read More