Permission Based Emails? Are you sure?

Yesterday I wrote about the ReturnPath study showing that 21% of permission based email does not make it to the inbox. There are a number of reasons I can think of for this result, but I think one of the major ones is that not all the mail they are monitoring is actually permission based. I have no doubt that all of the RP customers say that the mail they’re sending is permission based; I also have no doubt that not all of it is.
Everyone who sends mail sends permission based email. Really! Just ask them!
In 10 years of working professionally with senders I have yet to find a marketer who says anything other than that all their email is permission based. Every email marketer, from those who buy email addresses to those who do fully confirmed, verified opt-in with a cherry on top, will claim all their email is permission based. And some of the mailers I’ve worked with in the past have been listed on ROKSO. None of these mailers will ever admit that they are not sending permission based email.
Going back to ReturnPath’s data, we don’t really know what permission based email means in this context, so we don’t know whether the mail is being legitimately or illegitimately blocked. My guess is that some significant percentage of the 20% of email to the probe accounts that doesn’t make it to the inbox is missing because the sender does not have clear recipient permission.
When even spammers describe their email as permission based email marketing, what value does the term have?

Related Posts

Campaign stats and measurements

Do you know what your campaign stats mean? Do you know what it is that you’re measuring? I think there are a lot of emailers out there who have no idea what they are measuring and what those measurements mean.
The most common measurement used is “open rate.” There’s been quite a bit of discussion recently about open rates, how they’re calculated, and whether there is a better way. In my own opinion, open rate can be useful, but only in some circumstances. More often it is a distraction from real measurements.
Not only have there been recent discussions about “open rate” versus “render rate” and a lot of confusion about the underlying issues, but I’ve also been working through some campaign stats questions with other people that suggest maybe they don’t actually understand the numbers they’re using.
For instance, what do the delivery statistics reported by the various mailbox monitoring companies mean? If you have 100% inbox delivery as measured by the program, does that mean all your mail has reached the recipient’s inbox?
What about bounce rates? Everyone says “keep them low,” but what does low mean? How do you measure them?
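As a rough illustration of the measurement question, here is a minimal sketch of how a bounce rate is commonly computed, using made-up campaign counts and assuming “bounce” simply means a message the receiving server refused:

```python
# Hypothetical example: computing a simple bounce rate from campaign counts.
# "Low" only means something relative to what you sent and how you count bounces.

sent = 100_000          # messages handed to the MTA for this campaign
hard_bounces = 1_200    # permanent failures (bad address, domain gone, etc.)
soft_bounces = 800      # temporary failures that never eventually delivered

bounce_rate = (hard_bounces + soft_bounces) / sent * 100
print(f"Bounce rate: {bounce_rate:.1f}%")   # 2.0% in this made-up example
```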
Over the next few posts, I’ll talk about the different stats and measurements in common use and what they do and don’t mean.

Read More

ReturnPath customers?

Someone posted the following question about ReturnPath in the comments:

Read More

Delivery Metrics

Last week ReturnPath published a study that shows 20% of permission based email fails to be delivered to the inbox. For this study, ReturnPath looked at the mail sent by their mailbox monitor customers and counted the number of deliveries to the inbox, the number of deliveries to the bulk folder and the number of emails that were not delivered.
At US ISPs, 21% of the permission based emails sent to the ReturnPath probe network did not make it to the inbox. Of the mail sent, 3% went to the bulk folder and 17% did not make it to the mailbox at all. MSN/Hotmail and Gmail were the worst ISPs to get mail to; each failed to deliver more than 20% of the mail sent to them. At Canadian ISPs, even less of the mail made it to the inbox, primarily because primus.ca makes up such a large portion of the Canadian market and uses Postini as a filter. Postini is quite an aggressive filter and takes no feedback from senders.
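To make the arithmetic concrete, here is a minimal sketch of the seed-list accounting behind numbers like these. The counts are invented for illustration and only roughly mirror the rates in the study; they are not ReturnPath’s data:

```python
# Hypothetical seed-list accounting; the counts below are invented for illustration.
seeded = 1000                            # messages sent to probe/seed mailboxes
in_inbox = 790                           # found in the inbox
in_bulk = 30                             # found in the bulk/spam folder
missing = seeded - in_inbox - in_bulk    # never seen in either folder

print(f"Inbox placement: {in_inbox / seeded:.0%}")   # 79%
print(f"Bulk folder:     {in_bulk / seeded:.0%}")    # 3%
print(f"Missing:         {missing / seeded:.0%}")    # 18%
```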
ReturnPath’s take-home message from the study is that one set of metrics is not enough to effectively evaluate a marketing program. Senders need to know more about their mailings than they can discover from just the bounce rate, the revenue, the response rate or the open rate.
There are a lot of reasons an email doesn’t get to the recipient’s inbox or bulk folder. Mail can be hard blocked at the MTA and rejected by the ISP outright. Mail can be soft blocked at the MTA, where the ISP slows down sending; sometimes this is enough to make the sending MTA stop attempting to deliver the mail, so the mail never shows up. Both of these types of blocks are usually visible when looking at the bounce rate.
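In SMTP terms, hard and soft blocks generally show up as different reply codes: a 5xx reply is a permanent rejection, while a 4xx reply is a temporary deferral the sending MTA is expected to retry. Here is a minimal sketch of that distinction; the specific codes are standard SMTP examples, not any particular ISP’s responses:

```python
# Hypothetical classification of SMTP replies into hard vs. soft blocks.
def classify_smtp_reply(code: int) -> str:
    if 500 <= code < 600:
        return "hard block"   # permanent failure: the ISP rejected the mail outright
    if 400 <= code < 500:
        return "soft block"   # temporary failure: the ISP wants the sender to slow down and retry
    if 200 <= code < 300:
        return "accepted"     # the ISP took the message (which still doesn't guarantee inbox delivery)
    return "unknown"

print(classify_smtp_reply(550))  # hard block
print(classify_smtp_reply(421))  # soft block
print(classify_smtp_reply(250))  # accepted
```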
Some ISPs accept mail but then fail to deliver it to the recipient. Everything on the sender end says the ISP accepted it for delivery but the ISP just drops it on the floor. This is the type of block that a mailbox monitoring program is best able to identify.
Despite all the discussions of numbers, many marketers are still not measuring the variables in their email campaigns. Ken Magill wrote today about a study released by eROI that indicates more than a third of marketers are not doing any testing on their mailings.
Now, both of these studies were done in an attempt to sell products. Even so, the numbers discussed should make smart senders think about what they are measuring in their email campaigns, how they are measuring those factors, and what the measurements mean.

Read More