September 04, 2003
Blog Entry

Surprising Email Test Results: Text vs HTML Survey Versions

"Everyone talks in general about text vs HTML, but hardly anyone has specific test data they can share on it," Joanne Blatte, the editor of our upcoming Email Metrics Guide 2nd Edition, complained to me last week.

I was about to email about 50,000 of our readers a survey to gather data for our media kit, so I figured, why not toss a test cell into it? Heck, why not toss in two?

Here's how the test worked and what I learned:

We randomly generated three cells (I prefer this to slicing the list into parts, so there's less chance of skewing results).
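For anyone wanting to replicate this, here's a minimal sketch of that kind of random cell assignment in Python. The function name and the toy address list are just illustrative; the real test used roughly 50,000 addresses:

```python
import random

def split_into_cells(addresses, n_cells=3, seed=42):
    """Shuffle, then deal round-robin into n_cells test cells.

    Random assignment avoids the bias you can get by slicing a list
    that's sorted by signup date, domain, or alphabet into chunks.
    """
    rng = random.Random(seed)   # fixed seed makes the split reproducible
    shuffled = list(addresses)
    rng.shuffle(shuffled)
    return [shuffled[i::n_cells] for i in range(n_cells)]

# Toy example; in the real test this was ~50,000 addresses.
group_a, group_b, group_c = split_into_cells(
    [f"reader{i}@example.com" for i in range(9)]
)
```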

Group A was sent a text-only note with a link to the survey form online.

Group B was sent an identical note with the same link, except that a graphic of our logo sat at the very top, which meant the message was HTML.

Group C was sent a virtually identical note with our logo at the top, but these recipients also received the actual survey form in the body of their email. If the form worked in their email system (Lotus Notes, some versions of Eudora, and others can't use forms), they could submit their answers immediately instead of clicking through to a survey page.

If the form didn't work, they also had a link to click to get to the survey online.
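To make the Group C mechanics concrete, here's a minimal sketch of that kind of message built with Python's standard email library. The survey URL, questions, and addresses are all made up; the post doesn't say how the actual messages were assembled:

```python
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

SURVEY_URL = "https://example.com/survey"   # hypothetical survey address

html_body = f"""\
<img src="https://example.com/logo.gif" alt="Our logo">
<!-- The logo image is also what makes opens measurable: HTML clients
     fetch it from the server; a text-only message gives you nothing
     to count, which is why Group A's opens can't be told. -->
<form action="{SURVEY_URL}" method="post">
  <p>1. How useful do you find our newsletter?</p>
  <label><input type="radio" name="q1" value="very"> Very</label>
  <label><input type="radio" name="q1" value="somewhat"> Somewhat</label>
  <p><input type="submit" value="Submit answers"></p>
</form>
<p>If the form above doesn't work in your email program,
   <a href="{SURVEY_URL}">click here to take the survey online</a>.</p>
"""

msg = MIMEMultipart("alternative")
msg["From"] = "editor@example.com"      # same "from" for every cell
msg["Subject"] = "Quick reader survey"  # same subject for every cell
# Plain-text part is the fallback for clients that reject HTML outright.
msg.attach(MIMEText(f"Please take our reader survey: {SURVEY_URL}\n", "plain"))
msg.attach(MIMEText(html_body, "html"))
```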

Every test cell received the same "from" and "subject" line; however, the sizes of the messages were obviously different. The surveys were also all identical. (And, to my shame, rather badly written -- my fault. Never copywrite a survey when you are tired late at night.)

Our list is professionals at work - fewer than 10% are Hotmail, Yahoo or AOL addresses. I culled newbies because I didn't feel it was the right stage in our relationship to ask loads of demographic questions, so every name we mailed had been on our house file for at least 30 days and had received at least four newsletters from us (most many more than that).

My expectations: an open rate of only 20% or so, since it was the week before Labor Day (and also a bank holiday in the UK). I also expected the HTML "lite" version to win overall, for no good reason beyond gut feeling.

Actual results were:

Group A text-only:
Opens - can't tell with text
% clicked on link - 8.5%
% of sent who completed survey - 7.5%

Group B HTML lite:
Opens - 35%
% clicked on link - 8%
% of sent who completed survey - 7%

Group C HTML form:
Opens - 35.4%
% clicked link to use form online - 1%
% of sent who completed survey - 7.2%

Color me completely stunned. Never in a million years did I dream the results would be so similar.
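For what it's worth, the gaps really are within noise. Here's a quick two-proportion z-test on the completion rates, assuming the roughly 50,000 names split evenly into cells of about 16,666 each (the post doesn't give exact cell sizes):

```python
from math import sqrt, erfc

def two_proportion_z(hits1, n1, hits2, n2):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p1, p2 = hits1 / n1, hits2 / n2
    pooled = (hits1 + hits2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, erfc(abs(z) / sqrt(2))    # normal two-sided p-value

n = 50000 // 3                          # assumed equal cell sizes
z, p = two_proportion_z(round(0.075 * n), n, round(0.070 * n), n)
print(f"z = {z:.2f}, p = {p:.2f}")      # roughly z = 1.75, p = 0.08
```

Even the widest gap in the table (Group A's 7.5% versus Group B's 7%) comes out short of significance at the usual 5% level, so at these cell sizes it could plausibly be chance.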

If the list had not been regular readers, I suspect there would have been more profound differences. Perhaps if people have a strong enough relationship with your brand, your email format doesn't affect response as much as it would for a newbie?

Anyway, if you've ever conducted a text vs HTML test, please do let me know what you learned at aholland@marketingsherpa.com. Thanks.
