Deliverability session at Connections 2016
If you’re at Connections 2016 stop by our session at 3:00 in the Sidney Marcus Auditorium. Bring your pressing deliverability questions.
Ken Magill will be interviewing me on the Truths and Myths of Email Deliverability, November 12 at the 2015 All About eMail Virtual Conference & Expo. Ken has a bunch of questions he wants to ask me, but he’s also expecting to take a lot of questions from the audience as well.
Speaking of myths, there has been discussion lately about recycled spamtraps. Apparently, there are people who believed that every ISP uses recycled spamtraps. When Hotmail and Gmail recently said they don’t use recycled traps, people were quite upset to discover they had believed something that wasn’t true.
It’s a mess. So much about email works like a game of telephone. One person says “Hotmail uses recycled spamtraps,” someone else repeats “big ISPs use recycled spamtraps,” and then a third person says “all ISPs use recycled spamtraps.” People try to correct this type of misinformation all the time, but sometimes it’s hard to clarify.
So show up to our session, let Ken lob questions at me, lob some of your own, and we’ll see what myths we can clear up.
One of the things I miss about being in science is the regular discussions (sometimes heated) about data and experimental results. To be fair, I get some of that when talking about email with Steve. We each have strong viewpoints and aren’t afraid to share them with each other and with other people. In fact, one of the things we hear most when meeting folks for the first time is, “I love it when you two disagree with each other on that mailing list!” Both of us have engineering and science backgrounds, so we can argue in that vein.
One of the challenges of seemingly contradictory data is figuring out why it seems to disagree. Of course, in science the first step is always to look at your experimental design and data collection. Did I do the experiment right? (Do it again. Always do it again.) Did I record the data correctly? Is the design right? What did I do differently from what you did? For instance, at one of my labs we discovered that mixing a reagent in plastic tubes produced a different outcome than mixing it in glass vials. There are so many things you don’t even think of as variables that can affect the outcome of an experiment.
My speaking schedule is coming together for Q1 and Q2 this year.
Email Evolution Conference. March 30 – April 1. New Orleans, LA. I’ll be participating on the “All You Ever Wanted to Know about Deliverability (But Were Afraid to Ask)” panel Friday morning. The other panelists are Chris Arrendale, Alyssa Nahatis and Matthew Vernhout. This panel should be quite a bit of fun, as we all know each other and have collaborated in the past. I’m looking forward to it. Come prepared with questions!
Salesforce Connections. May 10 – 12. Atlanta, GA. Another panel on deliverability, this time with Mickey Chandler from SFMC and Melinda Plemel from ReturnPath. We’ll each be giving our three best tips for improving deliverability and then taking questions from the audience. We have all been around a long time; in fact, Mickey used to work for me at MAPS back in 2000. We’re all ready to answer those questions you’ve always had but never knew whom to ask.
Email Innovations Summit. May 17 – 19. Las Vegas, NV. Not a panel! I’ll be speaking about the technical developments in email that will affect sending, marketing and deliverability. If you’ve ever wanted to know how to talk to the technical folks, this is the session to attend. I’ll be explaining some of the terminology and teaching attendees what they need to care about and what they just need to know exists. Register with my code (SPKATK) and save 15%.