By Jon Espenschied
November 16, 2006
The Akron Children's Hospital is having a tough autumn. A couple of
weeks ago, the hospital began sending out notification to patients and
their parents regarding a messy security breach. On September 6, 2006,
ACH staff discovered that the security of two systems had been
breached: one containing private records of around 230,000
patients, and the other containing details of some 12,000 donors. Early
reports indicate that the hospital received erroneous assurances from
its first responders that the incident was relatively minor, but
contacted the FBI once the full scope became apparent.
It's worth noting that ACH has been fairly open about the situation
in recent weeks, but there are a series of assertions on its public
page that bug me. ACH says that "Immediately upon discovery of the
unauthorized entries, we retained computer security consultants to
determine the extent of the breaches. They have found no evidence that
any specific data was downloaded, tampered with, or compromised;
however, the opportunity to view the data existed."
A news site in the region quoted Bob Howard, the hospital's director of
planning, as saying "We don't know that anybody was actually affected.
All we know is, it's possible. The information was visible for the two
hackers who were able to get into the system. We don't even know if they
[...]"
I am, to put it mildly, unconvinced. There goes another quarter million
people's personal information into the Internet ether, but we're not
supposed to worry? What does it mean, in the age of the Internet, to
say that an intruder or attacker could "access" or "view" information,
but that it was not "taken" from the database? These are old-school
distinctions that ought to have been swept aside by even the dimmest
awareness of MP3 sharing and downloading, among many other examples.
When one teenager copies another's MP3, is the data "taken" in anything
but a licensing sense? Of course not.
A more blatant example of this outmoded thinking came some months ago,
when a researcher for the Veterans Administration was relieved of a
laptop computer containing the personal data of 26.5 million veterans.
Aside from the issue of whether the individuals responsible for putting
those millions of records on a home-bound laptop ought to be ground into
dust for blatant disregard of others' welfare, there were a number of
highly questionable public statements made.
Initial reports indicated that forensic experts were hard at work on the
laptop once it was recovered back in June, trying to determine what had
happened to the information prior to its return. VA Secretary Jim
Nicholson was quoted as saying "Law enforcement has in their possession
the laptop and hard drive," and "They are diligently conducting forensic
analysis on it to see if they can tell whether it's been duplicated or
utilized or entered in any way, and that work is not complete. However,
they did say to me that there is reason to be optimistic." Worse, Bill
Chase, FBI Special Agent in Charge, said forensic tests on the recovered
laptop showed that "the sensitive files were not accessed."
This is a bunch of hooey. Any data thief with "skilz" worth a pack of
gummy bears would put on a pair of nitrile gloves, pop the drive from
the laptop by removing as few as one screw, hook the drive to a
read-only controller similar to the ones used for forensics, dd the
drive byte-for-byte to another hard drive, and place the reassembled
system back where it could be "found" by a nun whose convent needed the
reward money.
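The cloning step described above can be sketched with dd. Since the
device path of a drive behind a write-blocking controller is an
assumption rather than anything from the article, an ordinary file
stands in for the removed drive here:

```shell
# Stand-in for the removed laptop drive; on a real job the input
# would be the raw device (e.g. /dev/sdb) attached through a
# read-only controller so nothing is written back to the evidence.
printf 'veterans database contents' > suspect_drive.img

# Byte-for-byte copy. On a failing drive one would add
# conv=noerror,sync to push past bad sectors and keep offsets
# aligned, at the cost of zero-padding short reads.
dd if=suspect_drive.img of=clone.img bs=4M

# Matching checksums: the clone is indistinguishable from the
# source, and nothing on the source records that a copy was made.
sha256sum suspect_drive.img clone.img
```

The point of the exercise is that the source drive is only ever read,
which is precisely why "the files were not accessed" cannot be proven
from examining the drive after its return.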
If gross evidence were left on the drive, maybe the FBI forensics
experts could show that some event or another positively did happen.
But proving a negative statement is orders of magnitude harder, and
often either infeasible or just plain impossible. FBI spokespersons gave
the strong impression that forensics researchers could determine that
the last data accessed wasn't the VA database based on screwdriver
marks on the screws and residual charges on capacitors in the laptop.
Give me a %$#*@ break. It's nonsense statements like this -- connecting
the unconnectable and asserting the unprovable -- that make otherwise
competent forensics labs look foolish.
Proving a negative is hard to do, but prognosticating what might have
happened to data, as happened with ACH, is truly dangerous. When ACH
states that investigators "have found no evidence that any specific data
was downloaded, tampered with, or compromised; however, the opportunity
to view the data existed," the statement is logically inconsistent.
The opportunity to view data, along with the information that the
intrusions took place over a long period of time, is itself evidence
that data was accessed -- which is indistinguishable from viewed, read,
copied or stolen.
That's not to say that there aren't rational assessments that can be
made about the aftereffects of a public breach. ACH spokespeople
recently indicated that there have been no reports of misuse of the
information in the wild, just as there have been, to my knowledge, no
confirmed incidents traceable to the VA laptop loss. While financial
gurus like to say that past performance is not a reliable indicator of
the future, this kind of empirical information is useful and reasonable,
if not very comforting.
It's also worth mentioning that both the VA and ACH are noncommercial
enterprises, and thus lucky in a way. Commercial companies such as
credit card-issuing banks can be mortally wounded merely by the public
mention that private information was exposed in a security breach.
While ACH may continue to receive post-hack patients, the decision of
consumers to switch away from a post-hack bank is much easier to make.
While it may be hazy to many people exactly which personal details
would facilitate identity theft or constitute an unacceptable breach
of privacy, most intuitively understand that if their financial
information is accessed, an actual theft may not occur until days,
months or even years later. However, it's disconcerting to hear CIOs,
forensics investigators, FBI agents, and other people that ought to know
better cling to old pre-digital notions of theft, or actively try to
push the idea that you can control what happens to data after public
disclosure. It'd be nice if we lived in simpler times, but
unfortunately we don't.
Jon Espenschied has been at play in the security industry for enough
years to become enthusiastic, blasé, cynical, jaded, content and
enthusiastic again. He is currently a senior security consultant in
Seattle, where his advice has been ignored by CEOs and auditors alike.