Where & why emails (don’t) get delivered – Sender Score benchmark data from Return Path

The email deliverability specialists at Return Path have just released their new “Sender Score™ Benchmark Report 2012”. It’s packed with fresh insights and tips centering on global email deliverability and the senderscore.org reputation database.

The PDF report is an interesting read. However, being as data-addicted as I am, I just had to play around with the numbers myself. Let’s look beyond the tables.

“Sender Score” – what’s that?

As you might know, the Sender Score reflects the reputation of a given mailer IP. Go ahead, give the online database a try and query it. The score ranges from 0 (very bad) to 100 (perfect). The scoring itself is based on several indices: spam complaint rates, email volume changes, unknown user rates, spam trap hits, and more.
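If you’d rather check scores programmatically than through the web form, a DNS lookup works too. Here is a minimal Python sketch, assuming the commonly documented lookup zone score.senderscore.com, which answers a query for the reversed IP with an address of the form 127.0.4.&lt;score&gt;:

```python
# Minimal sketch: query the Sender Score for a mailer IP via DNS.
# Assumption: the zone score.senderscore.com returns 127.0.4.<score>
# for the reversed IP (e.g. 4.3.2.1.score.senderscore.com for 1.2.3.4).
import dns.resolver  # pip install dnspython


def sender_score(ip):
    """Return the Sender Score (0-100) for a mailer IP, or None if unlisted."""
    reversed_ip = ".".join(reversed(ip.split(".")))
    try:
        answer = dns.resolver.resolve(reversed_ip + ".score.senderscore.com", "A")
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return None  # IP not listed in the reputation database
    # The score is encoded in the last octet of the returned address.
    return int(answer[0].to_text().rsplit(".", 1)[-1])


print(sender_score("1.2.3.4"))  # e.g. 97, or None
```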

The good thing about the Sender Score is that it predicts your future email deliverability, i.e. how many emails will likely get through to users’ Gmail, Hotmail, or Yahoo! Mail inboxes. If your Sender Score decreases, it’s time to take measures and improve the underlying indices. But what is a good score, what is a bad one? And how can you improve it? The benchmark report holds the answers.

Sender Scores (and indices) by country

[Figure: two world maps – countries shaded by average Sender Score, with stacked bars per country for spam complaint rates (black), unknown user rates (red), and spam trap hits (cyan)]
Can you guess all the colored countries? 😉 I thought a map might be an interesting complementary visualization to the tables in the PDF, so I made those two…

  • What do you see here? The darker a country’s green fill color, the higher its Sender Score (see the wide legend scale at the bottom). For instance, Canadian and U.S. mailer IPs have the best Sender Scores; they therefore provide senders with the best deliverability. Brazil, on the other hand, fares worst. Gray means “N/A, no data available”. Note that light green doesn’t necessarily mean “good” or “ok” (e.g. UK or Australia): Return Path characterizes the whole range from 0 to 60, which achieves an average inbox placement rate of only 21%, as bad.
  • I also highlighted Europe in the middle of the upper map. I must say it surprised me, too, that European countries, with the exception of the UK, have such low average Sender Scores. That doesn’t resonate at all with Europe’s strict opt-in laws, nor with the list of the 10 worst spam countries. However, the reason may lie in the indices, i.e. complaint rates (black), unknown user rates (red), and spam trap hits (cyan), which I plotted as stacked bars over the corresponding countries (a way to re-create this kind of chart is sketched below). This way you can immediately see, for example, that Germany has, in comparison to its European neighbors, the biggest problems with spam complaints, indicated by the huge black segment. It also has the least problems with spam traps.
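In case you want to build such a stacked-bar overlay yourself, here is a minimal matplotlib sketch. The rates below are made-up illustrative numbers, not values from the report:

```python
# Sketch: stacked bars of the three reputation indices per country.
# The rates are hypothetical placeholders; the real numbers are in
# Return Path's report tables.
import matplotlib.pyplot as plt

countries    = ["Germany", "UK", "France", "Spain"]
complaints   = [2.1, 0.9, 1.2, 1.4]  # spam complaint rate in % (hypothetical)
unknown_user = [1.0, 0.8, 1.5, 1.6]  # unknown user rate in %   (hypothetical)
spam_traps   = [0.2, 0.5, 0.6, 0.7]  # spam trap hits in %      (hypothetical)

fig, ax = plt.subplots()
ax.bar(countries, complaints, color="black", label="complaint rate")
ax.bar(countries, unknown_user, bottom=complaints, color="red",
       label="unknown user rate")
bottoms = [c + u for c, u in zip(complaints, unknown_user)]
ax.bar(countries, spam_traps, bottom=bottoms, color="cyan",
       label="spam trap hits")
ax.set_ylabel("rate (%)")
ax.legend()
plt.show()
```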

Deliverability indices by industry

  • Social networks are the bad dudes when it comes to sending email. This is consistent with the benchmarks-by-industry data, which I modeled in 3D some days ago (“Social Networks and Online Communities” is colored yellow in the rotating cube). According to Return Path, one problem with social networks is the friend-finder feature. (That, by the way, is one thing Facebook was sued for here in Germany some months ago.) By uploading and mailing whole address books, the networks send a huge number of emails to dead addresses (“unknown users”). It is highly recommended to keep this rate under 2%, and the networks seem to miss that mark (a quick way to check your own rate is sketched below). In addition, social networks hit the most spam trap addresses.
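To see how your own sending compares to that recommended 2% threshold, here is a minimal sketch, assuming you count hard bounces flagged as “unknown user” (typically SMTP status 5.1.1) in your send logs:

```python
# Sketch: compute the unknown user rate from send statistics and check
# it against the recommended 2% ceiling. The counts are hypothetical.
def unknown_user_rate(sent, unknown_users):
    """Unknown user rate in percent of all sent emails."""
    return 100.0 * unknown_users / sent


rate = unknown_user_rate(sent=120_000, unknown_users=3_100)  # hypothetical
print("unknown user rate: %.2f%%" % rate)
if rate > 2.0:
    print("Above the recommended 2% threshold - time to clean the list.")
```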

Good and bad Sender Scores by email providers

The “Sender Score” becomes less abstract when directly translated into inbox placement rates (e.g. “a Sender Score of 80 means an 85% inbox placement rate”). Luckily, Return Path provides such tables for Gmail, Hotmail, and Yahoo! Mail. Plotting inbox placement rates against Sender Scores gives something like this:
[Figure: inbox placement rate vs. Sender Score for Gmail (left), Hotmail (middle), and Yahoo! Mail (right), with fitted curves]
  • It seems Yahoo! Mail (right figure) poses fewer difficulties in getting through to the recipient’s inbox than Gmail and Hotmail. The curve in the plot sits high: even with a very low Sender Score (left side), nearly 50% of all emails still make it through to the inbox. Maybe that’s partly because Yahoo! works more closely with email certification authorities, like Germany’s eco? Or, more likely, its spam filters just aren’t as harsh, or are simply worse, in comparison.
  • With Gmail and Hotmail I tried some curve fitting (a re-creation of such fits is sketched after this list). The shape of the Gmail plot in particular seemed well suited to an exponential fit. And as you can see, the dotted continuous approximation fits the discrete data points quite well visually. So you can expect an inbox placement rate of about 50% with a Sender Score of 80. The curve tells us another important thing: it doesn’t matter much whether you improve your Sender Score from 20 to 40 or to 60; it’s all equally bad. But it makes a considerable difference whether you score 70 or 80, and the difference becomes even more significant between 80 and 90.
  • One interesting thing about the Hotmail plot in the middle: a Sender Score between 50 and 60 seems to yield a smaller inbox placement rate than the worse Sender Scores in the range from 0 to 50. The quadratic nonsense-fit shows a minimum around a Sender Score of 45. How can this be? Well, I guess it just shows that the Sender Score is no perfect 1:1 representation of inbox placement rates across all email providers. It can’t be, if only because it’s impossible to incorporate every conceivable index. The Sender Score is just an indicator, and it should be used as such.
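For the curious, here is how such fits can be re-created with NumPy and SciPy. The data points below are hypothetical stand-ins for the report’s tables, purely to show the mechanics:

```python
# Sketch: exponential fit (Gmail-style) and quadratic fit (Hotmail-style)
# of inbox placement rate vs. Sender Score. All data points are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

scores = np.array([10, 20, 30, 40, 50, 60, 70, 80, 90, 100])
gmail  = np.array([ 8, 10, 12, 15, 19, 25, 35, 50, 72, 95])  # placement in %


def exp_model(s, a, b):
    """Exponential model: slow growth at low scores, explosive at the top."""
    return a * np.exp(b * s)


(a, b), _ = curve_fit(exp_model, scores, gmail, p0=(5.0, 0.03))
print("Gmail fit: placement = %.2f * exp(%.4f * score)" % (a, b))
print("predicted placement at score 80: %.1f%%" % exp_model(80, a, b))

# Quadratic "nonsense fit": np.polyfit returns coefficients c2, c1, c0,
# so the parabola's minimum lies at -c1 / (2 * c2).
hotmail = np.array([30, 27, 24, 22, 21, 23, 35, 55, 75, 92])  # placement in %
c2, c1, c0 = np.polyfit(scores, hotmail, deg=2)
print("Hotmail quadratic minimum at score %.0f" % (-c1 / (2 * c2)))
```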

Let me know what you think of this.


