
Anti-hacking laws 'can hobble net security'








http://www.theregister.co.uk/2007/06/18/hacking_laws_discourage_research/ 

By Robert Lemos
SecurityFocus
18th June 2007

Jeremiah Grossman has long since stopped looking for vulnerabilities in 
specific websites, and even if he suspects a site has a critical 
flaw that an attacker could exploit, he's decided to keep 
quiet.

The silence weighs heavily on the web security researcher. While ideally 
he would like to find flaws, and help companies eliminate them, the act 
of discovering a vulnerability in any site on the internet almost always 
entails gaining unauthorised access to someone else's server - a crime 
that prosecutors have been all too willing to pursue.

"I have long since curtailed my research," said Grossman, who serves as 
the chief technology officer for website security firm WhiteHat 
Security. "Any web security researcher that has been around long enough 
will notice vulnerabilities without doing anything. When that happens, I 
don't tell anyone, rather than risk reputational damage to myself and my 
company."

Grossman's fears underscore the fact that security researchers who find 
flaws in websites are crossing a line and trespassing on systems that do 
not belong to them. However, applying the law to good Samaritans 
interested in eliminating possible online risks only undermines the 
security of the Internet, a working group of researchers, digital-rights 
advocates and federal law enforcement officials concluded this week.

"I think that if you look at the software security world, there have been 
many, many cases of someone knowing about a vulnerability before you do 
and using it out in the wild," said Sara Peters, editor for the 
Computer Security Institute. "There is no way to say that these same 
things are not happening in the web world. Assuming that nothing is 
going wrong because you haven't heard about it is a very myopic and 
callow way of looking at it."

Dubbed the Working Group on Web Security Research Law, the panel of 
experts has started to study whether researchers have any ability to 
play the good Samaritan and find security flaws in websites without 
risking prosecution. The group met at the Computer Security Institute's 
NetSec on Monday and released an initial report that raises more 
questions about the status of web vulnerability research than it 
answers for concerned bug hunters.

While security researchers have been able to test computer software and 
disclose details about any flaws found, the working group concluded that 
there is no way to test a web server without prior authorisation and not 
run the risk of being prosecuted. Software security researchers are free 
to disclose flaws fully or take part in a process that allows the vendor 
to plug the holes, while web researchers who disclose vulnerabilities 
in a way that angers the website owner could easily be reported to law 
enforcement.

"The way it is right now, if you find a vulnerability and the site owner 
finds out about it, you can be held culpable for anything that happens 
after that," Peters said. "Perhaps that is a bit of hyperbole, but not 
much. There is no culpability for the website owner."

The working group's report, available from the Computer Security 
Institute (registration required), includes four case studies, among 
them that of Eric McCarty.

In June 2005, McCarty, a prospective student at the University of 
Southern California, found a flaw in the school's online application 
system and notified SecurityFocus of the issue.

SecurityFocus contacted the school at the request of McCarty and relayed 
the information to USC, which initially denied the seriousness of the 
issue but eventually acknowledged the vulnerability after McCarty 
produced four records that he had copied from the database. In April 
2006, federal prosecutors leveled a single charge of computer intrusion 
against McCarty, who accepted the charge last September.

As part of its policy, SecurityFocus did not publish an article on the 
issue until USC had secured its database.

While CSI's Peters believes that good Samaritans should be given some 
leeway, a few of the comments found on McCarty's computer by the FBI - 
and repeated in court documents - suggested that vengeance was a motive. 
For that reason, Peters suggests that security researchers who decide to 
look for vulnerabilities in websites use discretion in dealing with site 
owners.

"You can't let anyone run wild and hack into websites indiscriminately," 
Peters said. "If you publicly disclose a vulnerability in a website you 
are pointing a big red arrow at a single site, so there needs to be some 
discretion."

The working group also concluded that the web is becoming increasingly 
complex as more sites share information and increase interactivity, 
characteristics of what is referred to as Web 2.0. Earlier this year, 
security researchers warned that Asynchronous JavaScript and XML (AJAX), 
a technology that many sites use to add Web 2.0 features, brings 
additional risks to the table for security researchers and vulnerability 
analysts.

"AJAX is not necessarily adding more vulnerabilities to the landscape, 
it is making it more difficult for the scanner vendors to find the 
vulnerabilities," said WhiteHat Security's Grossman, who is also a 
member of the working group. "The sites still have vulnerabilities, but 
they are harder to find."

Independent researchers finding vulnerabilities in websites could put 
pressure on site owners to secure their part of the internet. However, 
the working group could not agree on whether the law should be changed 
to allow for good Samaritans.

That likely leaves liability as the best stick, said Grossman, who 
believes website owners should be held liable to some extent for any 
consumer data lost due to a vulnerability in their site.

"I think the motivation has to be monetary," he said. "Right now, the 
website owners are the ones that have to pay for the security, but the 
consumer is the one bearing all the costs of failure."

Such an equation, he said, is unlikely to add up to better security.

This article originally appeared in Security Focus.

Copyright © 2007, SecurityFocus


