It’s lovely to see more organizations consistently testing to boost performance! But it’s sad to see marketers running tests that return inconclusive or just plain useless results. Here are three of the most common testing mistakes my team and I run across when working with clients, along with tips for how you and your team can avoid making them in the future!
This year I asked the members of Only Influencers to brag about their accomplishments for 2017. The response was outstanding, and we will be making this an annual event. So, here is what the members of the email marketing community accomplished this year:
I remember when CSS was barely usable in HTML email and plain-text versions were our attempt to be ‘mobile-friendly’. Table-based, inline-styled emails were all the rage way back in 2007. It seems like ages have passed since, although Outlook will always remind us of the good old days. Now, emails are mobile responsive, contain kinetic elements, and even display contextual content using real-time information gathered about the recipient - dilly, dilly!
Change is hard! With email, change can be even more overwhelming, because what you are doing now is likely something with which you are comfortable. However, isn’t it time to leave that comfort zone? We tend to lose sight of the impact that small changes to our email program can have. We test subject lines and day of week, the length of the email, text vs. image, the frequency with which a subscriber should be mailed, etc. The list of what you can test goes on and on. Testing is great because it gives us good insight into how we should be engaging with our subscribers. However, one thing I don’t see happening often is a change in the creative template. Both with my clients and in what I see as a consumer (and yes, I get tons of emails), I see the same template time and time again.
A few months back, I wrote about using data to determine the timing, content, and impact of your email marketing campaigns (See: 6 Steps to Putting Data to Work in Email). In the last few months, the element I find myself discussing most with colleagues and clients is the latter: What data can we use to attribute the impact of our hard-working email marketing campaigns?
When it comes to challenges in attribution, I’ve heard it all:
It never fails that a few months before a brand marketer is due to renew their marketing automation contract, they reach out to tell me that they are secretly looking to switch to a new platform and want my opinion. After we have spent a little time talking about their current state, what they want out of their future state, and the business goals they are looking to accomplish, we usually determine that their current platform meets their needs. However, one thing always stands out: they typically need what I refer to as a “Marketing Automation Tune-Up,” which, simply put, is just recalibration and adjustment of what they are currently doing.
Person-first personalization is a hot topic at the moment. Forrester Research, The Relevancy Group, and other thought leaders are writing extensively on the subject. Why? Because consumers have raised their expectations. When consumers have personalized experiences with Spotify, Netflix, or any other similar brand, they subsequently expect all other brands to deliver the same ‘surprise and delight’ experience.
It’s so easy to get caught up in the details, the big and little decisions that need to be made every day and then tracked and managed and… well, you know. At the same time, as marketers we all want to focus on strategy, on the vision of what we are trying to achieve and who our audience really is and all the great thought leadership that goes along with that. It’s hard to find the balance when you’re stuck in the trenches. That’s why a guide can be helpful.
Most of all, differences of opinion are opportunities for learning - Terry Tempest Williams
As some of you may already know, Alchemy Worx, the agency I founded in 2001, was recently acquired by SellUp, an email marketing agency with offices in NYC and Manila founded by Allan Levy – whom I have known for many years and respect a great deal. One of the primary reasons for the sale was to free up the time I was spending running the agency full time so I could spin off the software division of Alchemy Worx into a separate company. The new company, Touchstone Intelligent Marketing, exists to help plug two significant gaps in the marketing clouds: testing and subscriber-level reporting. The first of these, Touchstone Tests, a testing tool that lets you try out any number of subject lines quickly and without burning out your customer database, is the source of the data I am about to share.
In “The Blueprint for Better Performance Testing” I walked you through how I look at an existing email campaign to come up with hypotheses for testing.
But what do you do when you don’t have an existing campaign to look at – what do you do when you’re developing a performance testing plan for a brand-new product or service?
By Ryan Brelje, Content Marketing Manager at Iterable
More than 10,000 ambitious companies fiercely compete in the expanding subscription retail space. With the industry validated by the successes of pioneers like Birchbox and Dollar Shave Club, brick-and-mortar mega-stores like Sephora and Walmart have entered the arena to vie for market share. In a high-stakes game where customer churn is only one click away, how are these companies keeping their customers engaged while they await their deliveries?
Because email marketers are under-resourced, busy people – and often new to the profession or without anybody to show them the ropes – they look to "best practices" as silver bullets that will fix their problems or keep them on the right side of the law.
Coupled with our history of being associated with spam, it's easy to see why marketers are so focused on following best practices. A tactic starts as a solution to a common problem; the solution becomes a trend, and before you know it, it's promoted to a best practice.
However, I see too many marketers rushing to implement best practices without questioning whether something is truly a best practice, a trend, or a bad habit that has evolved into a rule.
The concepts of retention and predictive marketing have been around for quite some time; however, at their inception, only the largest stores could afford to invest in this type of data warehouse and management. Over the past few years, solutions have evolved to help retailers gain access to their data and enable retention and predictive marketing, but adoption has mostly been limited to innovators and early adopters. For the past three years, we’ve conducted the Retention and Predictive Marketing survey to better understand trends in the marketplace, adoption of retention and predictive marketing, and the barriers and successes that retailers who’ve invested in this technology are experiencing. This year’s survey was completed by hundreds of retailers, spanning industries and revenue levels. Here’s a breakdown of the results from the 2017 Retention & Predictive Marketing Report.
List size is a metric that is ridiculous on its own. As every email marketer knows, it’s not about the size of the list but the quality.
The battle over when to suppress users is age-old, commonly fought between those who are focused on the size of the list and those on the email side who understand that inactives hurt deliverability.
Your customer data is a goldmine of information just waiting to be discovered. You know that emails reflecting a customer's data are more relevant and more likely to be acted on, but too many marketers stop at basics like name, gender, or location.
“The most important word in the vocabulary of advertising is TEST. Never stop testing, and your advertising will never stop improving.”
– David Ogilvy, Founder, Ogilvy and Mather
Considered the Father of Advertising
Sending an email marketing message is easy. But boosting bottom-line performance from send to send? That’s a little more difficult. The key to success here is testing.
If you’re not doing any performance testing, now’s the time to start. If you are already doing regular testing, commit to upping your game. Either way, here’s a quick walk-through of the process my team and I follow to help guide you.