Testing and data-driven decisions

A lot of my education in the sciences focused on how to get a statistically valid sample. There’s a lot of math involved in picking the right sample size, and an equal amount in choosing the right statistical tests to analyse the data. One of the lessons of grad school was: the university has statistics experts, use them when designing studies.
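To give a flavour of that math, here’s a minimal sketch of the standard normal-approximation formula for sizing a test that compares two proportions. The rates in the example are hypothetical, and a real study design involves far more than this one calculation.

```python
from math import ceil
from statistics import NormalDist

def sample_size_two_proportions(p1, p2, alpha=0.05, power=0.80):
    """Per-group sample size to detect a difference between two
    proportions with a two-sided z-test (normal approximation)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_beta = z.inv_cdf(power)           # ~0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# Hypothetical example: reliably detecting a lift from 8% to 9%
# takes roughly 12,000 subjects per group, more than intuition suggests.
print(sample_size_two_proportions(0.08, 0.09))
```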


Even in science, not everything we test has to be statistically rigorous. Sometimes we just want an idea of whether there’s something there. Is there a difference between doing X and doing Y? Let’s run a couple of pilot tests and see what happens. Is this a line of inquiry worth pursuing?

Much of my statistical knowledge comes from practice, not theory. Most of my advanced classes included some stats, but I never actually took a statistics class. That leaves me in a strange position when listening to people talk about the testing they do. I know enough statistics to question whether their results are valid and meaningful, but not enough theory to dig down into the numbers and explain why.

In marketing, we do a lot of testing, and we use the results of that testing to drive decisions. We call this data-driven marketing. I know a lot of marketing departments and agencies do have statisticians and data scientists on hand.

I am sure, though, that some tests are poorly designed and incorrectly analysed. Bad data leads to poor decisions, which lead to inconsistent or unexpected results. The biggest problem is people who fail to go back and question whether the data used to make a decision means what they think it does.
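As one concrete example of the kind of check that often gets skipped, here’s a hypothetical sketch of a two-proportion z-test, a standard first-pass way to ask whether an A/B difference in open or click rates is bigger than random noise. The counts below are invented for illustration.

```python
import math

def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for a difference between two proportions,
    e.g. open rates for two email variants. Returns (z, p_value)."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    # Pooled rate under the null hypothesis of no real difference.
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal p-value
    return z, p_value

# Hypothetical counts: variant A opened by 820 of 10,000 recipients,
# variant B by 880 of 10,000. The 60-open gap looks like a win...
z, p = two_proportion_z_test(820, 10_000, 880, 10_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # ...but p is about 0.13: consistent with noise
```

Even this simple check would flag many “winning” variants as indistinguishable from chance, and it still assumes independent, repeatable trials, which email filtering doesn’t reliably provide.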

Email, and particularly filtering, has a lot of non-repeatable elements. Gmail’s filters, for instance, adapt constantly. Without carefully constructed, controlled, and repeated tests we’re never going to tease out the specifics. The even bigger challenge is that the process of testing will, in and of itself, change the results: run the same series of tests over and over and the filters may adapt, behaving differently for test 11 than they did for test 2.

Another thing that leads to poor decision making is assuming our own preferences are representative of our audience’s. Even unconsciously, many of us design marketing programs that fit the way we like to be marketed to. To make good decisions, we need to question our own biases and think about what our audience actually wants.

Finally, there is a lot of value in looking at how people behave. One thing I’ve heard repeatedly from marketers over the years is that what people say they want is often different from how they actually act.

Overall, to make good marketing decisions we can’t just collect random bits of data and use them to justify what we wanted to do anyway. The data always reflects the question we asked, but not always the question we wanted answered. Blindly using data, without thinking about our own biases, leads to poor outcomes.
