Driving Deliverability with Data Quality (A Tale of Two Clients)

Over the past year of consulting, I've had two client cases that caused me significant inner turmoil. Beyond fixing flawed technical setups, most cases could be resolved with one sentiment: "send wanted content." And that most often was THE solution: boil things down to sending the content, as it stood, to those who engaged, because they had indicated they wanted it through some action with the message.

We never really went beyond that, as the inbox was found again. Anything that delved into creating wanted content for those who had disengaged, or into optimizing and improving what was going out, was recommended but left to departments specializing in research, copywriting, data analytics, etc.

But these two client cases had issues rooted deeper in their programs. Frankly, we couldn't really address what wanted content was, because we struggled to even get the content viewed.

Keep in mind, I love being able to solve problems. I love it even more when I can solve them quickly, relatively speaking. In these two scenarios, one was never solved, and the other felt like it would never be solved, or at least never put on a path to a solution.

If it wasn't content, then what was the issue? Poor data quality. The core of an email program.

This likely isn't a huge surprise to you, dear reader. In Only Influencers' recent Ask Me Anything (AMA) webinar survey (view it on-demand), poor data quality (aka '"bad" data in my email list') was the top issue senders were experiencing.

[Chart: webinar survey results — biggest challenge facing senders]

Good data quality is a combination of the following factors (see the sketch after this list):

  • Email exists
  • Email delivers: Keep in mind this is dynamic; what delivers today may not deliver tomorrow.
  • Email does not utilize a disposable domain: Disposable domains accept mail for a short period of time, say 24 hours, and often utilize a shared inbox approach where others can also take advantage of this short-term mailbox.
  • Email gave confirmed explicit consent
  • Email gave consent willingly: Alison Gootee has a great post on a business' need (or lack thereof) for consent over on Spamhaus' blog; in short, don't force a lead to enter an email address. This factor also catches addresses entered into email lists unknowingly via bot attacks, typos, etc.
  • Email has activity: Activity such as email engagement, past or recent purchase history, website activity, account activity, etc.
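
To make these factors concrete, below is a minimal sketch of how a record might be scored against them. All of the field names, the DISPOSABLE_DOMAINS set, and the 180-day activity window are hypothetical stand-ins for whatever verification data and thresholds a real program has.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical stand-in: in practice this list comes from a verification
# service or bounce logs, and it changes constantly.
DISPOSABLE_DOMAINS = {"mailinator.example", "tempmail.example"}

@dataclass
class Subscriber:
    email: str
    delivers: bool                  # latest delivery status from bounce logs
    confirmed_consent: bool         # completed a confirmed opt-in
    consent_was_forced: bool        # e.g., content gated behind an email field
    last_activity: datetime | None  # open, click, purchase, login, etc.

def data_quality_issues(sub: Subscriber, activity_window_days: int = 180) -> list[str]:
    """Return the quality factors this record fails (existence is implied
    by delivery: a mailbox that doesn't exist doesn't deliver)."""
    issues = []
    domain = sub.email.rsplit("@", 1)[-1].lower()
    if not sub.delivers:
        issues.append("does not currently deliver")
    if domain in DISPOSABLE_DOMAINS:
        issues.append("disposable domain")
    if not sub.confirmed_consent:
        issues.append("no confirmed explicit consent")
    if sub.consent_was_forced:
        issues.append("consent not given willingly")
    cutoff = datetime.now() - timedelta(days=activity_window_days)
    if sub.last_activity is None or sub.last_activity < cutoff:
        issues.append("no recent activity")
    return issues
```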

When one of these factors isn't met, it doesn't mean the program is destined to fail. The fact of the matter is that a lot of data doesn't check off every item. And some senders see no issues, but that's likely because other measures are in place to ensure the data not only gets to a healthy place, but also does so following best practices. An example is an opt-in form that sends new sign-ups a confirmed opt-in email, verifying both the consent and that the user intended to submit the address.
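
As a sketch of that confirmed opt-in pattern, assuming a hypothetical in-memory store and send call (neither is a real library's API): a signup is held as pending and only becomes an active subscriber once the tokenized confirmation link is clicked.

```python
import secrets

# Hypothetical in-memory store; a real program would persist these.
pending = {}        # token -> email address awaiting confirmation
subscribed = set()  # confirmed addresses

def send_confirmation_email(email: str, token: str) -> None:
    # Placeholder for your ESP's send call; the link carries the token.
    print(f"To {email}: confirm at https://example.com/confirm?t={token}")

def handle_signup(email: str) -> None:
    """Hold the signup as pending and ask the mailbox owner to confirm."""
    token = secrets.token_urlsafe(32)
    pending[token] = email
    send_confirmation_email(email, token)

def handle_confirmation(token: str) -> bool:
    """Activate the subscription only for a matching pending token."""
    email = pending.pop(token, None)
    if email is None:
        return False  # unknown token: typo'd, expired, or bot-submitted
    subscribed.add(email)
    return True
```

Addresses that never confirm (typos, bot submissions, padded signups) simply never enter the mailable list.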

However, as fewer and fewer of these criteria are met, problems become inevitable.

Case Study: Client A

The client I was unable to rehabilitate couldn't confidently claim any of these items. Their addresses came from street teams who were incentivized by the number of signups they collected. This led to non-existent emails, emails with consistent delivery issues, emails that never gave permission (whether via typos or people intentionally subscribing other addresses), and, because these were entered "on the fly," no activity data other than a sign-up date.

I ran an audit for my client and, right from the get-go, identified that the data was the problem. The content they planned to send was likely wanted, as they were sending welcome emails with information about the service. Unfortunately, suggestions to improve the data were met with, "we can't do that."

We tried a number of things, all while knowing the issue was still there and we’d likely see no improvement. The hardest part of this was the time and effort it took to get to the point where my client could justify to their leadership that the issue was the data.

As we dug in and did more soul searching, we were able to update some processes so the street team's incentives were based on quality rather than quantity. Unfortunately, the good volume wasn't enough to overpower all the bad, and the email program never found its footing. Although this didn't turn the tide for my client, the success we did see gave the management team the data they needed to accept that the lead collection method had to be reimagined.

Case Study: Client B

The second client struggled with letting go of a database from eons ago. Theirs was a long-cycle product, so they had data suggesting that aged records were valuable. Email, however, would argue otherwise.

As we dug in, we found they were not only mailing everyone in their database, many of whom had been mailed 100+ times without interacting, but they also had sources of addresses from different stores where consent may never truly have been given. New-to-file leads were also automatically subscribed, even if they had only reached out about a product.

There was a lot to tackle, so to start, the core change was to send only to those with some form of positive campaign engagement (opens or clicks) in the last 6 months.
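
A minimal sketch of that engagement gate, assuming hypothetical campaign-event records exported from an ESP (in practice this would usually live in the ESP's segment builder or a SQL query):

```python
from datetime import datetime, timedelta

# Hypothetical event shape: {"email": str, "event": "open"|"click"|"bounce", "at": datetime}
def engaged_audience(events: list[dict], window_days: int = 180) -> set[str]:
    """Addresses with at least one positive engagement inside the window."""
    cutoff = datetime.now() - timedelta(days=window_days)
    return {
        e["email"]
        for e in events
        if e["event"] in ("open", "click") and e["at"] >= cutoff
    }
```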

While that was running, we also worked on updating the signup process so we could limit incoming customers to those who wanted to receive marketing content (i.e., we moved from automatic subscription to opt-out, although I'm still recommending opt-in). We flagged those who showed no activity and no high propensity to convert based on other data points. We identified and flagged risky lists from sources that couldn't be verified. And there is still so much more left to do. With the restrictions in place, though, we did, after about 2 months, see movement.
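
Those flagging steps might look something like the sketch below; the risky_sources set and the propensity threshold are hypothetical stand-ins for whatever source tracking and scoring a program actually has.

```python
def suppression_flags(record: dict, risky_sources: set[str],
                      propensity_floor: float = 0.2) -> list[str]:
    """Tag a record with flags; flagged addresses are held back, not mailed."""
    flags = []
    # Neither recent activity nor a modeled likelihood to convert.
    if not record.get("recent_activity") and record.get("propensity", 0.0) < propensity_floor:
        flags.append("inactive_low_propensity")
    # Acquired through a source whose consent story can't be verified.
    if record.get("source") in risky_sources:
        flags.append("unverified_source")
    return flags
```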

2 months.

It felt like an eternity, for me and for the client. There were times they were losing hope, but I reassured them as we went along that the campaign metrics were improving. Are those a bulletproof measure? No, but a positive trend is an indication that things are improving. And then, finally, Google Postmaster Tools showed an improvement in domain and IP reputation. Had we taken a stricter approach, narrowing the engagement window down to 3 months, we likely would have reached that point faster, but that wasn't acceptable to the client.

In Conclusion

These are just two examples of cases that kept me up at night. If your program is seeing issues, looking at what could cause complaints is a key step (audience selection versus content, for example), but a lot of times it boils down to the data. Do you have their consent? Are the email addresses coming into your database legitimate? Are you setting the correct expectations around what they are signing up for? Are you collecting the email address for the right reason, or can a relationship exist without one? Are you sending to them because they are showing interest (or because you hope they will)?

If you can answer these questions with a 'Yes,' you have a great foundation. Then you can focus on the next steps: improving the messaging you send, how you welcome subscribers into your program, and how you set expectations. You can work on building out loyalty programs and value-add messaging. The strongest programs build on top of deliverability instead of just making sure it passes.

Photo by Maria Teneva on Unsplash
