
ESP Compromises and Their Lack of Security

Terry Zink

Over at Word to the Wise, Laura Atkins has a post up about the real problem with ESPs: their lack of internal security procedures, which resulted in the breach of many thousands of email addresses (most notably at Epsilon). However, Atkins isn't only criticizing ESPs' lack of security but also the industry's response, wherein they have suggested countermeasures that are irrelevant to the problem. Better spam filtering, signing mail with DKIM, and securing consumers' machines are all beside the point; the problem is that ESPs do not have security policies in place that would have prevented this breach.

As I read through the comments on the blog, what Atkins is getting at is that Epsilon should have implemented policies such that even if a third party breached them, the damage would have been contained:

ESPs must address real security issues. Not security issues with sending mail, but restricting the ability of hackers to get into their systems. This includes employee training as well as hardening of systems. These are valuable databases that can be compromised by getting someone inside support to click on a phish link.

Not everyone inside an ESP needs access to address lists. Not everyone inside an ESP customer needs full access to address lists. ESPs must implement controls on who can touch, modify, or download address lists. These controls must address technical attacks, spear phishing attacks and social engineering attacks.

To clarify further, here's a brief 411 on the above paragraphs:

  • Not everyone inside an ESP needs to access address lists.

    The idea behind this is that email addresses are stored in databases. When someone opts in, their address is written to a database somewhere on a central server (or servers, for redundancy). But who can access these lists? Can any old ham-and-egger with limited technical know-how log on to the server, run a SELECT statement, and dump the output to a file?

    In our environment, not just anyone can do this. Only certain people have access to key servers. And even in those cases, many of us only have read access. We cannot dump the output to a file and then transfer the contents of that file offsite. Atkins' point is that if everyone in the company has access to all the servers by default, then an attacker's job is easier: they only need to successfully phish anyone in an organization of a couple of hundred employees. If access is restricted by default and opened up only to those who need it, the attacker must instead successfully phish one of the few people with access to the right servers. This narrows the window of opportunity in which a hacker must succeed, kind of like having to fire two shots into the exhaust port on the Death Star.

  • Key personnel require training on social engineering attacks.

    Social engineering has been around forever. Heck, I use it myself in many of my magic performances. As I said in my presentation at Virus Bulletin in Vancouver in October 2010, the best strategy for combating social engineering attacks is education. People who don't know about these types of scams are more prone to fall for them when certain emotions are invoked. By teaching people about the tricks that hackers use, we can make them more resistant to falling for those tricks.

    Thus, ESPs (indeed, any company) need proper security training in place to illustrate the risk of these types of attacks. If employees are aware of the risks, it further reduces the odds of an attacker's success. Someone is more likely to recognize a scam and not open the attachment or click the link to a malicious site.

    This is a part of the human element of narrowing the attack surface.

  • Key data must be encrypted.

    The rationale behind this is that even if an attacker steals the data, because it is encrypted it is in a format that is not useful to them. By the time they break the encryption, the data will no longer be useful, because valuable data frequently has a time limit. Storing data in plain text is risky because if you lose it, you have given the thief everything they need to use it to their advantage.
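The first and third controls above (deny-by-default access to address lists, and encryption at rest) can be sketched together. The following is a minimal, hypothetical illustration in Python; the role names and function signatures are my own inventions, and the hand-rolled HMAC-SHA256 counter-mode cipher is a toy for demonstration only — a real deployment would use a vetted library and a proper key-management system, not this:

```python
import hashlib
import hmac
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a keystream by running HMAC-SHA256 in counter mode (toy cipher)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt with a fresh random nonce; the nonce is prepended to the ciphertext."""
    nonce = os.urandom(16)
    stream = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, stream))

def decrypt(key: bytes, blob: bytes) -> bytes:
    """Split off the nonce and XOR the ciphertext against the same keystream."""
    nonce, ciphertext = blob[:16], blob[16:]
    stream = _keystream(key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, stream))

# Deny by default: only roles named here may export address lists at all.
EXPORT_ROLES = {"list-admin"}

def export_addresses(role: str, key: bytes, encrypted_rows: list) -> list:
    """Gate the export on role membership before any decryption happens."""
    if role not in EXPORT_ROLES:
        raise PermissionError("role %r may not export address lists" % role)
    return [decrypt(key, row).decode() for row in encrypted_rows]
```

The point of the sketch is structural: a phished support account that lacks the "list-admin" role cannot reach the decrypted list at all, and even a raw database dump yields only ciphertext.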

The problem is that even if these security procedures were implemented, they still wouldn't harden an organization against a spear-phishing attack. Software companies need to take part of the blame. Why do I say this? Because RSA, the victim of the recent attack, could have implemented all of this and still been hacked.

In the RSA attack, hackers got the random seed and the algorithm used to create the random tokens. They got this by sending phishing mails, with a subject line along the lines of "Documentation for 2011" and an attachment, to a few employees who would have had the proper access. The mails were caught by spam filters, but the employees went into their junk folders, believed the messages were false positives, and opened the document. Mayhem ensued.

These people know about attacks. The reason they went into the junk folder is that they have experienced times when their spam filters blocked legitimate mail. They have been subconsciously trained not to trust their spam filter 100% because it has obviously missed legitimate messages in the past. Whose fault is it that users have been trained to go into their junk folders and retrieve legitimate mail?

Even if they had their antimalware signatures and software patches up to date, this malware exploited zero-day vulnerabilities and compromised the machine. The attackers knew what they were looking for and got what they needed. They did some pre-operational surveillance to target the people they needed to target, got in and out, and then covered their tracks (this is very similar to what happened to Google in January 2010). The point is this: whose fault is it that the software they were using contained a zero-day vulnerability?

If an attacker specifically targets someone, goes to the time and effort to customize a zero-day, and knows their way around the inside (or has the technical know-how to navigate and map it quickly), then creating policies to resist these types of attacks is going to put constraints on people. The reality is that some people need access to data to do their jobs. Code is written somewhere in plain text that allows people to reverse engineer it and steal critical data. People will sometimes fall for phishes.

If I get you to name any card at any number, I have an advantage over you. Everything looks perfectly fair, but it's not: I know what I'm doing, and my odds of success are very high, whereas you are merely looking to be entertained. Every one of us has other things to do during the day and gets distracted; protecting against phishing is not high on our list of priorities because we are looking to do our day-to-day jobs. Attackers who want the data have the advantage of surprise and subterfuge. That advantage is the difference between success and failure. As Sun Tzu said, "He will win who, prepared himself, waits to take the enemy unprepared."

I don't know what the answer is, but I know it's not simple. The industry needs to come up with a comprehensive guide to securing IT environments. People need training. Data probably should be encrypted. But none of this is going to solve everything.

In the meantime, think of a number. Next time we meet up in person, I'll tell you what it is.

By Terry Zink, Program Manager



Not certain if the point CAUCE made was clear Neil Schwartzman  –  Apr 08, 2011 10:04 AM PDT

What we said and are saying is that now is the time for the ESP community to embrace _all_ of the best practices they have dithered over, to a degree, for a long time, like email authentication. That allows receiving systems to successfully distinguish good mail from the inevitable spear-phishing that will happen.

Security on their outbound email obviously would not have prevented Epsilon. What would have prevented Epsilon is them, and every other ESP and mailer, treating their email lists as their most valuable commodity, which they are. That means hiring security professionals to harden their networks and doing things like database encryption, which is industry standard in other sectors.

I did understand the CAUCE point Terry Zink  –  Apr 08, 2011 10:39 AM PDT

I did get the point, but the point of my article was that even if you do all sorts of security procedures, you are not completely hardened.

> What would have prevented Epsilon is they, and every other ESP and mailer
> treating their email lists like their most valuable commodity, which it is. That
> means hiring security professionals to harden their networks, and doing things like
> database encryption, that is industry standard in other sectors.

Not necessarily.

According to Microsoft's internal data classification procedures (specified in a Word doc and publicly accessible at this link: http://is.gd/8A8bXY), data is classified into three categories: High Business Impact (HBI), Medium Business Impact (MBI) and Low Business Impact (LBI).

HBI consists of authentication credentials and highly sensitive PII like medical or financial records. MBI consists of PII that is not highly sensitive, like political opinions, name, address, and email addresses.

MBI must be encrypted while it is in transit, but HBI must be encrypted while in transit *and* while stored, and accessed using two-factor authentication.

Thus, for Epsilon to encrypt email addresses while stored would not necessarily be industry standard. I understand your point that perhaps they *should* be treated as HBI, but classifying data as HBI has all sorts of other implications.
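That classification scheme can be expressed as a small lookup of handling requirements. A hypothetical sketch follows; the category names come from the Word doc cited above, but the specific control flags are my own shorthand, not Microsoft's:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Handling:
    """Controls required for a data category (shorthand flags, not official)."""
    encrypt_in_transit: bool
    encrypt_at_rest: bool
    two_factor_access: bool

# HBI: credentials and highly sensitive PII. MBI: ordinary PII such as
# email addresses. LBI: everything else.
CLASSIFICATION = {
    "HBI": Handling(encrypt_in_transit=True, encrypt_at_rest=True, two_factor_access=True),
    "MBI": Handling(encrypt_in_transit=True, encrypt_at_rest=False, two_factor_access=False),
    "LBI": Handling(encrypt_in_transit=False, encrypt_at_rest=False, two_factor_access=False),
}

def required_controls(category: str) -> Handling:
    """Look up the minimum required controls for a classification category."""
    return CLASSIFICATION[category]
```

The table makes the argument concrete: email addresses, classified as MBI, require transit encryption but not encryption at rest, so storing them unencrypted does not by itself fall below this standard.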

Encryption at rest, privilege / duty separation etc .. Suresh Ramasubramanian  –  Apr 10, 2011 9:17 AM PDT

These are absolutely basic and necessary.  And this is something that shouldn't be overlooked by any ESP.

Epsilon isn't the first ESP to be compromised, and it won't be the last.



