Is it really in society's best interest to impose severe punishment on young hackers? The question arose after the recent arrest of Jeffrey Lee Parson, the teenager who wrote a variant of the Blaster worm that brought down thousands of PCs. Many experts advocate prosecution and heavy penalties for such virus writers—even when they are teenagers.
But does this approach make sense? Virus experts have dismissed Parson as little more than a "script kiddie." In other words, he assembled his worm from simple, existing computer scripts rather than writing anything sophisticated himself. When our youth indulge in such experimentation, should they be punished with stiff prison sentences? More importantly, what does the success of such simple viruses say about our government's ability to deal with more sophisticated and experienced hackers? Many experts wonder whether the government should spend more time and resources trying to understand today's virus technology, rather than just pursuing "spammers and scammers."
This point brings us back to finding the best fate for our young hackers. Perhaps the government should put their misplaced talents to use. Hackers like Parson could help catch other potential computer criminals, for example. A parole stipulation could require them to prevent or mitigate the effects of a certain number of virus attacks. In this way, convicted hackers might come to appreciate the damage caused by their mischief, and the government would climb the "learning curve" of today's virus technology.
As experts have pointed out, even the simplest Visual Basic (VB) script can infect an e-mail application and abuse it in a way that cripples e-mail servers. In many of these cases, benevolent hackers have argued that by exposing such flaws, they prevented even worse attacks from occurring.
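To see how low that bar really is, consider a minimal, benign sketch of the mechanism such scripts abuse. This is an illustration, not any particular worm's code: it assumes only that Microsoft Outlook is installed, and the recipient address is a placeholder. The snippet sends a single message through Outlook's standard automation interface; mass-mailing worms simply wrap the same handful of calls in a loop over the victim's address book.

    ' A benign sketch: scripted e-mail via Outlook's automation interface.
    ' Assumes Outlook is installed; the recipient address is a placeholder.
    Dim app, msg
    Set app = CreateObject("Outlook.Application")
    Set msg = app.CreateItem(0)        ' 0 = olMailItem (a new mail message)
    msg.To = "[email protected]"      ' placeholder address
    msg.Subject = "Scripted-mail demo"
    msg.Body = "Sent by a few lines of VBScript."
    msg.Send                           ' hands one message to the mail client

Nothing here requires real skill, and that is precisely the point: the barrier to entry is the e-mail application's own scripting interface, not programming expertise.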
Distinguishing between "good" and "evil" hackers is essential. Malevolent hackers create viruses only for monetary gain or to damage others; they should be punished to the full extent of the law. Benevolent hackers, on the other hand, often provide real value by demonstrating weaknesses that can adversely affect all users. For example, many large corporate software companies have denied the existence of serious security flaws in their products; in response, hackers have created viruses whose sole purpose was to expose those flaws. Some observers have suggested that a National Hackers Society (NHS) could reward such hackers with public recognition, college scholarships, or money (paid by grateful clients).
This approach would undoubtedly face legal challenges. A company whose security measures have been exposed as inadequate may find itself fending off lawsuits from the users who were victimized. Still, such a move is not without precedent. Consider the recent hacker-friendly bill in New Hampshire: it requires wireless operators to secure their networks, and operators who don't lose some of their ability to prosecute the individuals who gain illegal access. With any of these approaches, the difficulty lies in setting up the metrics: the parameters that determine what actually constitutes a security bug the vendor should have caught. Of course, such an exercise would itself be useful.
Here again, the idea of a National Hackers Society has value. The organization could serve as a beta tester of sorts (albeit after the actual product release), performing its services on a pay-per-bug basis. Not everyone would embrace this practical approach, however, because it implies that products ship with known security bugs. That assumption comes as no surprise to experienced software developers, but it might come as a shock to the general public. Then again, a National Hackers Society could also serve as an educational body for both the public and the legislature.
In my opinion, the best way to convince vendors of the benefits of constructive hacking is a return-on-investment (ROI) argument: it is far cheaper to have a benevolent hacker find security bugs than to have a predatory one wreak havoc on thousands or millions of users. Of course, one can still argue that the vendor should have found these problems in the first place. Again, that is an argument for an NHS-like society—one that does nothing but hack other people's applications. I know that similar organizations exist, usually under the guise of software "quality" testers or security specialists. But I doubt they have much experience actually hacking into code.
What do you think? Should virus hackers be given stiff prison sentences, or is there a better way for them to repay their debt to society? Please share your thoughts or experiences with me. I can be reached at [email protected].