Last week Return Path published a study showing that 20% of permission-based email fails to be delivered to the inbox. For the study, Return Path looked at the mail sent by their mailbox monitor customers and counted how many messages were delivered to the inbox, how many were delivered to the bulk folder, and how many were not delivered at all.
At US ISPs, 21% of the permission-based email sent to the Return Path probe network did not make it to the inbox: 3% of the email went to the bulk folder and 17% did not make it to the mailbox at all. MSN/Hotmail and Gmail were the worst ISPs for getting mail to the inbox; each failed to deliver more than 20% of the mail sent to them. At Canadian ISPs even less of the mail made it to the inbox, primarily because primus.ca is such a large portion of the Canadian market and uses Postini as a filter. Postini is quite an aggressive filter and accepts no feedback from senders.
Return Path’s take-home message from the survey is that one set of metrics is not enough to effectively evaluate a marketing program. Senders need to know more about their mailings than they can discover from the bounce rate, revenue, response rate, or open rate alone.
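As a rough illustration of why no single number tells the whole story, here is a minimal sketch (in Python, with made-up campaign counts; none of the field names come from Return Path's reporting) of how the bounce rate, seed-based placement rates, and response rates describe different slices of the same send.

```python
# Hypothetical campaign counts; field names and numbers are illustrative only.
campaign = {
    "sent": 100_000,
    "bounced": 2_000,      # rejected at the MTA (visible in the bounce log)
    "seeds_sent": 200,     # probe/seed addresses included in the send
    "seeds_inbox": 158,    # seeds that arrived in the inbox
    "seeds_bulk": 6,       # seeds that arrived in the bulk/spam folder
    "opens": 18_000,
    "conversions": 900,
}

accepted = campaign["sent"] - campaign["bounced"]

metrics = {
    # What the bounce log alone tells you.
    "bounce_rate": campaign["bounced"] / campaign["sent"],
    # What a seed/probe network adds: where accepted mail actually landed.
    "inbox_placement": campaign["seeds_inbox"] / campaign["seeds_sent"],
    "bulk_placement": campaign["seeds_bulk"] / campaign["seeds_sent"],
    "missing": 1 - (campaign["seeds_inbox"] + campaign["seeds_bulk"]) / campaign["seeds_sent"],
    # Response metrics, relative to the mail the ISPs accepted.
    "open_rate": campaign["opens"] / accepted,
    "conversion_rate": campaign["conversions"] / accepted,
}

for name, value in metrics.items():
    print(f"{name:>18}: {value:6.1%}")
```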
There are many reasons an email never reaches the recipient's inbox or bulk folder. Mail can be hard blocked at the MTA and rejected by the ISP outright. Mail can be soft blocked at the MTA, with the ISP slowing down the sending connection; sometimes that is enough to make the sending MTA give up on delivery, so the mail never shows up. Both of these types of blocks are usually visible in the bounce rate.
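Roughly speaking, hard blocks and soft blocks show up as permanent (5xx) and temporary (4xx) reply codes in the SMTP conversation with the receiving MTA; deferrals that never clear eventually expire and bounce. Here is a minimal sketch, using a hypothetical delivery log, of how a sender might classify those responses:

```python
def classify_smtp_response(code: int) -> str:
    """Rough classification of an SMTP reply code from the receiving MTA.

    5xx replies are permanent failures (hard blocks); 4xx replies are
    temporary failures (soft blocks / deferrals) that the sending MTA will
    retry until its retry window expires, at which point the message bounces.
    """
    if 500 <= code < 600:
        return "hard block"   # e.g. 550 rejected for policy reasons
    if 400 <= code < 500:
        return "soft block"   # e.g. 421 too many connections, try again later
    if 200 <= code < 300:
        return "accepted"     # accepted for delivery -- not the same as inboxed
    return "unknown"

# Hypothetical delivery-log entries: (recipient, final SMTP reply code).
log = [("a@example.com", 250), ("b@example.com", 550), ("c@example.com", 421)]
for rcpt, code in log:
    print(rcpt, classify_smtp_response(code))
```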
Some ISPs accept mail but then fail to deliver it to the recipient. Everything on the sender's end says the ISP accepted the mail for delivery, but the ISP just drops it on the floor. This is the type of loss that a mailbox monitoring program is best able to identify.
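A mailbox monitoring program works by seeding the list with probe addresses and then checking which folder, if any, each probe message landed in. A minimal sketch, with made-up probe addresses and results:

```python
# Probe addresses included in the send (all names here are hypothetical).
seeds_sent = {"probe1@isp-a.example", "probe2@isp-a.example", "probe3@isp-b.example"}

# Where each seed was actually found after the send, as reported by the
# probe accounts; mail that was accepted but silently dropped appears nowhere.
found = {
    "probe1@isp-a.example": "inbox",
    "probe3@isp-b.example": "bulk",
}

missing = seeds_sent - found.keys()
inboxed = [s for s, folder in found.items() if folder == "inbox"]
bulked = [s for s, folder in found.items() if folder == "bulk"]

print(f"inbox: {len(inboxed)}/{len(seeds_sent)}")
print(f"bulk:  {len(bulked)}/{len(seeds_sent)}")
print(f"accepted but never delivered: {sorted(missing)}")
```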
Despite all the discussion of numbers, many marketers are still not measuring what happens in their email campaigns. Ken Magill wrote today about a study released by eROI indicating that more than a third of marketers do no testing at all on their mailings.
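For senders not testing anything, even a basic subject-line split is a start. Here is a minimal sketch of one, with hypothetical subject lines and open counts; no particular ESP's split-test feature is being described:

```python
import random

# Split the list randomly into two cells, one per subject line.
recipients = [f"user{i}@example.com" for i in range(10_000)]
random.shuffle(recipients)

half = len(recipients) // 2
cells = {
    "A: 20% off this week": recipients[:half],
    "B: Your spring picks are in": recipients[half:],
}

# After the send, compare response per cell (opens here, but the same
# structure works for clicks, conversions, or seed-based inbox placement).
opens = {"A: 20% off this week": 1_150, "B: Your spring picks are in": 1_420}
for subject, cell in cells.items():
    print(f"{subject!r}: open rate {opens[subject] / len(cell):.1%}")
```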
Both of these studies were, of course, done in an attempt to sell products. Even so, the numbers should make smart senders think about what they are measuring in their email campaigns, how they are measuring those factors, and what the measurements mean.
Primus Canada only has 100,000 Internet users, tiny in comparison to Telus, Rogers, and Bell, all of which have millions of subscribers.
http://www.primustel.ca/en/about/career/career_finance.html
ABOUT PRIMUS
As a leading Internet Service Provider (ISP) in Canada, Primus Canada has over 100,000 Internet subscribers accessing 46 national points-of-presence across Canada.
Hi, Neil,
I got the Primus / low delivery rates connection directly from the RP report, from the caption on the figure showing non-delivery rates in Canada:
Laura, you are correct that low deliverability at Primus is a reason for lower overall deliverability to Canada, as our report states. It’s just not the primary reason. There are a number of factors at play (including the impact of US marketers sending into Canada, which is a big part of our client base and is thus represented here) that are leading to lower deliverability in Canada. Sorry if it was confusing.