Email Analytics: Testing Subject Lines

Do you have your email marketing analytics handy? Today we’re going to look at subject lines.

A recent study by the retention marketing firm Retention Science analyzed email subject lines to determine why some were opened and others were trashed. By studying 260 million delivered emails from 540 campaigns, they discovered that less is more.

Subject lines with 6 to 10 words performed best, generating a 21 percent open rate, well above the industry average. Subject lines with 5 or fewer words ranked second with a 16 percent open rate, and those with 11 to 15 words returned a minimal 14 percent open rate. Despite this, the majority of emails sent (52 percent) had subject lines in the 11-to-15-word range.

Now it gets tricky. Their research also showed that 35 percent of emails were opened on mobile devices, where only the first 5 or 6 words of a subject line might be displayed. So if you stick to 6 to 10 words, make sure your first 5 or 6 are compelling. (Very similar to the outdoor billboard rule of thumb: 8 words or fewer.)

Another interesting finding is that including movie titles and song lyrics in subject lines can increase open rates. “Results showed those with subject lines referencing movies or songs were opened 26 percent of the time, while emails with more traditional subject lines were opened 16 percent of the time.” Since we’re in the book business, we should twist this theory a bit and consider testing book titles or memorable lines from books.

With your analytics in front of you, look at several weeks of email campaigns. Count the number of words in the subject lines and divide them into three categories: 0 to 5 words; 6 to 10 words; and 11 to 15 words. Now look at the open rates in each category. Do your analytics agree with the results from the study?
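If you have your campaign data in a spreadsheet export, the bucketing above can be done in a few lines of code. This is a minimal sketch; the campaign figures here are made up, and you would substitute the subject lines, opens, and delivered counts exported from your own email analytics.

```python
def word_bucket(subject):
    """Return the length category for a subject line."""
    n = len(subject.split())
    if n <= 5:
        return "0-5 words"
    if n <= 10:
        return "6-10 words"
    return "11-15 words"

def open_rates_by_bucket(campaigns):
    """campaigns: list of (subject, opens, delivered) tuples.
    Returns the aggregate open rate for each length category."""
    totals = {}  # bucket -> [opens, delivered]
    for subject, opens, delivered in campaigns:
        t = totals.setdefault(word_bucket(subject), [0, 0])
        t[0] += opens
        t[1] += delivered
    return {bucket: t[0] / t[1] for bucket, t in totals.items()}

# Hypothetical numbers for illustration only:
campaigns = [
    ("New arrivals this week", 180, 1000),
    ("Ten signed first editions just landed in the store", 210, 1000),
    ("A long subject line that rambles on well past the point "
     "where most readers stop", 140, 1000),
]
print(open_rates_by_bucket(campaigns))
```

With several weeks of real campaigns plugged in, comparing the three rates tells you whether your list behaves like the one in the study.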

Now, see whether your email marketing program offers A/B testing. Also known as split testing, this is a way of determining which of two campaign options is more effective at encouraging opens or clicks. In an A/B test you set up two variations of one campaign and send them to a small percentage of your total recipients. Half of the test group receives Version A and the other half receives Version B. The result, measured by opens or clicks, determines the winning campaign, and that version is sent to the remaining subscribers.
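Your email program normally handles the split for you, but the mechanics are simple enough to sketch. This is a minimal illustration of the split step only, using a hypothetical list of addresses and a 20 percent test group; the fraction and seed are arbitrary choices, not recommendations.

```python
import random

def ab_split(recipients, test_fraction=0.2, seed=42):
    """Randomly split a list into (group_a, group_b, remainder).
    group_a and group_b each get half of the test fraction;
    the remainder waits for the winning version."""
    rng = random.Random(seed)  # fixed seed so the split is repeatable
    shuffled = recipients[:]
    rng.shuffle(shuffled)
    test_size = int(len(shuffled) * test_fraction)
    half = test_size // 2
    return shuffled[:half], shuffled[half:test_size], shuffled[test_size:]

# Hypothetical list of 1,000 subscribers:
recipients = [f"subscriber{i}@example.com" for i in range(1000)]
group_a, group_b, remainder = ab_split(recipients)
# 100 addresses get Version A, 100 get Version B,
# and the winning subject line goes to the remaining 800.
```

The key points the sketch captures: the assignment is random, the two test halves never overlap, and the bulk of the list is held back until a winner emerges.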

If you have A/B testing available, test two different subject lines the next time you send out a campaign. You can test different lengths or different phrasing. Try using a specific book title vs. a more generic subject. Test your own theories. The program will send the test emails to a small sample of your recipients, and whichever message receives the higher open rate will be delivered to the rest of your list. Then go back to your analytics and check for improvement.

Beth Golay

Beth is a reader, writer, marketer and Books & Whatnot founder. Even though she knows better, she’s a sucker for a good book cover and will positively swoon if a book is set in appropriate type.

@BethGolay