By Ryan Blitstein
Feb. 05, 2007
Microsoft says the new Vista computer operating system is the most
secure Windows yet. But in its millions of lines of complex computer
code, there are bound to be at least a few holes.
Now that Vista has been released to the public, hackers all over the
world are banging away at it, hoping to exploit these flaws with devious
software, or malware, such as viruses and worms. It's one reason the
Redmond, Wash., software company called in big guns like Scott Stender
to help beat the hackers to the punch. Stender, co-founder of San
Francisco security firm iSEC Partners, was a senior member of the teams
Microsoft hired to hunt for Vista's weak spots.
Without this "plumbing" work done on Vista, holes in the operating
system would leave users susceptible to malware that slows down their
computers, steals their credit card numbers, or deletes the important
files that keep machines running.
Just a day after the Vista launch, Microsoft acknowledged a flaw in
Vista's new speech recognition feature, which might allow a malicious
audio file playing through computer speakers to take over a machine.
Though it's an unlikely scenario, that the vulnerability even exists
shows why Microsoft decided to give experts such as Stender early access
to Vista code. That's a strategic shift for a company long accused of
careless security practices.
The true measure of success for Vista's security defenses will come over
the next year, as vulnerabilities are discovered. Although no one knows
for sure, several Microsoft and security analysts said they expect Vista
to have a lower count of first-year vulnerabilities than its last
version, Windows XP.
If so, it will show Microsoft's major effort to boost security paid off.
The company handed out copies of an early version of Vista to 3,000
hard-core security researchers and hackers at the Black Hat security
conference last year. It also rented office space at its headquarters to
large computer-security companies, such as McAfee and Symantec, so their
engineers could work directly with the Vista teams.
"We've learned as a company that it's not just our own expertise we
have to rely on," said Stephen Toulouse, senior product manager with
the Microsoft Security Technology Unit.
Although Stender and his iSEC colleagues can't reveal everything about
what they found in Vista (or how they found it) because they signed a
non-disclosure agreement, iSEC and Microsoft were able to provide some
details about the work they did to discover chinks in Vista's armor.
The process of finding software flaws, according to Stender, is probably
not what you think: "There's this perception of a Mountain Dew-fueled
night where bugs come out of nowhere and you patch them, and that's just
the end of it," he said. In reality, it involves months, sometimes
years, of painstaking work, requiring security experts to scrutinize
programmers' code as they predict how it could be used in real life.
They tried to think like a computer user, whether expert, average,
newbie or malicious.
"Like with anything, you break it down into consumable parts," Stender
said. "You look at how a person might interact with that part of
software. How can a person misuse it? Then you test to see whether that
misuse is possible. If it is, that becomes a flaw."
From experience, security researchers know the most common flaws that
might show up in software code. A buffer overflow -- which happens when
a program tries to store data in a section of computer memory beyond its
assigned spot -- is one flaw that hackers might take advantage of to,
say, run malicious code that takes over a machine.
Security experts may search for vulnerabilities just by reading through
raw code on a piece of paper or computer screen, or by looking at the
binary code, which has been translated by a compiler into 1s and 0s that
can be read by a machine. They might also find them by reviewing the
design of a software project before any code has been written.
Often, the hunt is automated, run by tools that simulate hundreds of
thousands of user inputs in an attempt to crash the system.
Once a flaw has been spotted, the security team investigates to find out
whether it's an isolated problem or part of a new class of issues that
need to be analyzed and fixed.
"You're never going to be able to 'test in' security to a product,"
Stender said. "You keep on testing to find all the bugs, but you'll
never be able to find them all and patch them all."
Knowing that perfection was unreachable, Microsoft's goal was to
engineer Vista with security in mind from the early stages. In 2002,
about the time the Vista project began, Chairman and Chief Software
Architect Bill Gates sent a memo outlining the Security Development
Lifecycle, which revamped software engineering at the company.
Instead of just focusing on what cool features they could add to a piece
of software, Microsoft programmers were trained to think about potential
security problems before they wrote a line of code. Each engineer was
assigned a "buddy" from the internal Security Technology Unit to guide
them through the process of searching for security vulnerabilities.
Nevertheless, there are threats to Vista that even experts such as iSEC
won't be able to catch, so Microsoft is recommending that consumers buy
additional security software, whether it's Microsoft's own anti-virus
and backup product, Windows Live OneCare, or more extensive software
such as McAfee's Internet Security Suite, which adds features like
content filtering to protect children from objectionable photos.
"Microsoft wants to make the operating system and the user experience
more secure, but Microsoft is not a security vendor," said Gartner
security analyst Neil MacDonald. "I don't think its goal in life is to
be the next Symantec."