Meaningless metrics

I’ve been having some conversations with fellow delivery folks about metrics, delivery, and bad practices. Sometimes a sender will have what appear to be good metrics, but those numbers aren’t the result of any good practices. They’re managing to avoid the clear indicators of bad practices (complaints, SBL listings, blocks, etc.), but only because the metrics they’re watching aren’t meaningful.
It made me laugh, then, when a friend posted a link to a Business Insider article about how many website metrics aren’t useful indicators of the business value of a website. From there I found the original blog post referenced in the article: Bullshit Metrics. It’s a great post; you should go read it.
I’d say the concluding paragraph has as much relevance to email marketing as to web marketing.

Despite the internet’s evolution, bullshit metrics perpetuate a constant cycle of poor understanding. Let’s strive to understand how our businesses are doing and to pick better metrics–the harsher, the better. Let’s stop fooling ourselves with numbers that don’t represent reality. And let’s push the industry forward as a whole because collectively we’ll all benefit.

The sooner we can get away from opens as a useful email metric, the better the email industry is going to be.

Related Posts

When an open is not a sign of interest

A lot of people, myself included, use opens as one of the measures of engagement. As a general rule, this is not a bad measure. However, there are people who will open email not because they’re interested in it, but because they know it is spam.
Take, for instance, the email address I acquired in 1993. Yes, I still have this address. I stopped using it to sign up for lists in 1999 and stopped using it for most of the rest of my mail around 2001. This address, though, is on any number of spam mailing lists. The spam that gets through is usually sent by hard-core spammers. The ISP that hosts that mailbox uses Communigate Pro to filter mail, so much of the casual spam is filtered.
Generally, if I open an email (and load images or click through) on that account it is only in order to track down a spammer. For instance, I’m getting a lot of spam there from affiliates offering me the opportunity to purchase printing services for a very low price. I have actually been opening the mail, and clicking through. But I’m not clicking through because I’m interested in purchasing. I’m clicking through to see if my reports to abuse@ printer are resulting in any action against the spammers. (They’re not).
The thing is, though, I know that by clicking through on ads, I’ve now been promoted by the spammer to the “clicks on emails! it’s a live address!” list. Which only means I’m going to get more spam from them. Lucky me.
Using clicks and opens as a measure of engagement isn’t necessarily bad. But when using them you have to understand the limitations of the measurement and that what you may think it’s telling you isn’t actually what it’s telling you.

Read More

Delivery Metrics

Last week ReturnPath published a study showing that 20% of permission-based email fails to be delivered to the inbox. For this study, ReturnPath looked at the mail sent by their mailbox monitor customers and counted the number of deliveries to the inbox, the number of deliveries to the bulk folder, and the number of emails that were not delivered at all.
At US ISPs, 21% of the permission-based email sent to the ReturnPath probe network did not make it to the inbox. 3% of the email went to the bulk folder and 17% did not make it to the mailbox at all. MSN/Hotmail and Gmail were the worst ISPs to get mail to; each failed to deliver more than 20% of the mail sent to them. At Canadian ISPs, even less of the mail made it to the inbox, primarily because primus.ca is such a large portion of the Canadian market and they use Postini as a filter. Postini is quite an aggressive filter and takes no feedback from senders.
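The arithmetic behind placement figures like these is simple: each probe message ends up in the inbox, the bulk folder, or nowhere, and the rates are fractions of the total sent. A minimal sketch, using hypothetical counts (not ReturnPath’s actual data):

```python
# Hypothetical seed-network tally: where 100 probe messages landed.
results = {"inbox": 80, "bulk": 3, "missing": 17}

total = sum(results.values())
# "Failed to reach the inbox" combines bulk-foldered and vanished mail.
not_inbox = (results["bulk"] + results["missing"]) / total

print(f"inbox placement:  {results['inbox'] / total:.0%}")
print(f"failed the inbox: {not_inbox:.0%}")
```

The point the study makes falls out of this directly: a sender watching only accepted-vs-bounced counts would report near-100% "delivery" here, while a fifth of the mail never reached the inbox.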
ReturnPath’s take home message on the survey is that one set of metrics is not enough to effectively evaluate a marketing program. Senders need to know more about their mailings than they can discover from just the bounce rate or the revenue rate or response rate or open rate.
There are a lot of reasons an email doesn’t get to the recipient’s inbox or bulk folder. Mail can be hard blocked at the MTA and rejected by the ISP outright. Mail can be soft blocked at the MTA, with the ISP slowing down sending; sometimes this is enough to cause the sending MTA to stop attempting to deliver the mail, so the mail never shows up. Both of these types of blocks are usually visible when looking at the bounce rate.
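In SMTP terms, a hard block typically shows up as a 5xx permanent rejection and a soft block as a 4xx temporary deferral, which is why both are visible in bounce logs. A rough sketch of that mapping over raw reply codes (a simplification; it ignores enhanced status codes and any particular MTA’s log format):

```python
def classify_bounce(smtp_code: int) -> str:
    """Map an SMTP reply code to a rough block category.

    5xx -> permanent rejection (hard block), 4xx -> temporary
    deferral (soft block / throttling), 2xx -> accepted.
    """
    if 500 <= smtp_code <= 599:
        return "hard"
    if 400 <= smtp_code <= 499:
        return "soft"
    if 200 <= smtp_code <= 299:
        return "accepted"
    return "unknown"

# A 550 rejection is a hard block; a 421 "try again later" is a soft block.
print(classify_bounce(550))  # hard
print(classify_bounce(421))  # soft
```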
Some ISPs accept mail but then fail to deliver it to the recipient. Everything on the sender end says the ISP accepted it for delivery but the ISP just drops it on the floor. This is the type of block that a mailbox monitoring program is best able to identify.
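Silent drops like this leave no trace in bounce logs, which is why they can only be detected from the recipient side: compare what was sent to each probe address against what actually arrived anywhere in the mailbox. A hypothetical sketch of that comparison (made-up message IDs):

```python
# Hypothetical seed-list check: which messages did the ISP accept
# but never deliver to any folder of the probe mailbox?
sent_ids = {"msg-001", "msg-002", "msg-003", "msg-004"}
seen_in_mailbox = {"msg-001": "inbox", "msg-003": "bulk"}  # folder per message

dropped = sorted(sent_ids - seen_in_mailbox.keys())
print(dropped)  # ['msg-002', 'msg-004']
```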
Despite all the discussions of numbers, many marketers are still not measuring the variables in their email campaigns. Ken Magill wrote today about a study released by eROI that indicates more than a third of marketers are not doing any testing on their mailings.
Now, both of these studies were done in an attempt to sell products. However, the numbers discussed should make smart senders think about what they are measuring in their email campaigns, how they are measuring those factors, and what the measurements mean.

Read More

Metrics, metrics, metrics

I’ve been sitting on this one for about a week, after the folks over at IBM/Pivotal Veracity called me to tell me about this. But now their post is out, so I can share.
There are ISPs providing real metrics to senders: QQ and Mail.ru. Check out Laura Villevieille’s blog post for the full details.

Read More