Transterrestrial Musings  


Biting Commentary about Infinity, and Beyond!


Looks Like They Got Him

It looks like the guy behind the "Blaster" worm was a teenager from Minnesota.

One of the tenets of nanotechnology is that the distinction between hardware and software will increasingly blur in the years and decades ahead. This is an example of that happening. Had this guy demolished a skyscraper (empty of people), he'd be considered a terrorist, but I doubt that he'd have done that, or even considered it, because that's such an obviously huge destruction of property.

But what he did instead destroyed an equivalent amount of wealth in lost data and lost productivity. We need to come up with a way to make it very clear to people, particularly young people, that there's no substantive difference between those two crimes.

Posted by Rand Simberg at August 29, 2003 10:11 AM

Listed below are links to weblogs that reference this post from Transterrestrial Musings.
Secure software and Microsoft
Excerpt: I’ve seen a lot of commentary on the recent virus attacks that targeted Microsoft software, some of it truly bizarre....
Weblog: Thought Mesh
Tracked: August 30, 2003 07:26 AM
Comments

Well, actually there is a big difference between the two. With the skyscraper, you can quantify pretty easily what the damage was. It's right there. With a computer worm or virus, it's highly subjective.

A further beef, though, is that in both skyscrapers and computers, we design them to be friendly to destructive forces. If we didn't put 50,000 people in a single pair of buildings, then we wouldn't have the tempting target that was the World Trade Center. At least the WTC was designed well enough that it withstood the collision of two fuel-laden planes for an hour, and in the process saved many thousands of lives.

Microsoft has created a computer environment that is highly favorable to the spread of computer viruses and worms. An 18-year-old can inflict substantial damage on the world because that's the way the system is designed. If you look at a Microsoft license (or, for that matter, the license for any software sold for the personal computer), you quickly see that they disavow all responsibility for damage that their software does to you. The kid may be responsible for the Blaster worm, but Microsoft is responsible for the environment that spawned thousands of viruses and worms.

Posted by at August 29, 2003 10:51 AM

I got nailed by this worm, and it's my first experience with a virus. I guess I've been lucky up to this point. I'm mad. No, I'm very mad. A solution to this problem (concerning this 18-year-old) is to put him in prison for a very long time. Then he can see how his computer skills help him when his cellmate tells him to assume the position. Most likely the government will give him a short sentence and then hire him at $80K a year.

Posted by Steve at August 29, 2003 11:35 AM

According to one article, he did not write the original Blaster worm; he just modified it into one of the (several) variants. One article actually claimed that his variant had infected 7,000 computers (out of the hundreds of thousands infected altogether). See http://www.cnn.com/2003/TECH/internet/08/29/worm.arrest/index.html.

Seems to me the real culprit is still out there.

The only effect this worm had on my computer was a few extra lines in my firewall log. Then again, I run Mac OS X.

Posted by AndrewS at August 29, 2003 11:57 AM

> A solution to this problem (concerning this 18 year old)is to put him in prison for a very long time.

How is that going to help?

The situation we have here (to borrow Rand's analogy) is a building contractor (Microsoft) who built a building (Windows) with a bomb already inside (all the security vulnerabilities) and a big red button next to the front door with a sign that says, "Don't press this button - it will blow up the building" (the periodic publication of the security holes). Jeffrey Parson pressed the button, but he didn't build the bomb, and putting him in jail isn't going to stop future viruses.

BTW, the reason that Microsoft puts the bombs in the building is because it wants to convince you that the public street outside the building (the Internet) isn't safe, and that you need to abandon it in favor of the private sidewalk that Microsoft is building (.Net). Steve, along with most of the people in the U.S., has fallen for this scam hook, line, and sinker.

Posted by Erann Gat at August 29, 2003 02:38 PM

Microsoft should be beaten about its corporate head for all the security flaws in Windows, but that doesn't absolve someone who exploits those flaws to inflict damage on others.

One could as easily argue that companies who build homes with wood frames are responsible for arson.

Posted by enloop at August 29, 2003 02:50 PM

The computer virus problem is a complex issue that defies simple solutions. As much as I would like to blame Bill Gates for every evil afflicting the computer world, I am afraid that is not possible. Given the complexity of every operating system, the uncounted millions of lines of code, and the multiplicity of applications, the holy grail of a totally secure OS is unattainable. Man is not perfect and neither are his machines, nor will they ever be.

My networks (I am an IT manager) run on Novell. Microsoft Outlook and IIS are banned. The few Microsoft application servers that I have are maintained in an "isolation ward." The only virus problem I have is the crap coming in from outside, and if a station should happen to get infected, the problem doesn't spread.

Is Novell immune? No, though I believe it is inherently a much more secure system than Windows. It merely isn't targeted. If Novell had 90% of the market, 18-year-olds in Minnesota would be trying to demolish NDS, and some might succeed.

There is no such thing as a safe that can't be cracked.

Worst of all, the really smart ones don't get caught. The dumb ones and the copycats do. The hapless character in Minnesota was a dumb one. Even the Feds admit they have no idea who started Blaster, much less Sobig.F or Welchia.

The people who ferret out these creeps have my utmost respect. It is a tough job and they get little recognition. Alas, it is also probably lifetime employment.

jrb

Posted by Jon R Brenneman at August 29, 2003 04:53 PM

The only effect Blaster had on me -- assuming it had any at all, but I haven't bothered to look -- would be the same. Then again, I run Microsoft Windows 2000.

...with regular updates in addition to the firewall.

Posted by Kevin McGehee at August 30, 2003 02:03 AM

"Given the complexity of every operating system, the uncounted millions of lines of code, and the multiplicity of applications, the holy grail of a totally secure OS is unattainable."

I cringe every time I hear this mischaracterization of the truth. Although a totally secure OS is technically not possible, it is -almost- completely wrong to blame this on the millions of lines of code.

There is only one reason, one only, that it is not technically possible to -totally- secure an OS, and that is because an unauthorized user just has to do what an authorized user does in order to gain access. Usually this means finding or guessing someone's password, where the guessing is usually done in an automated fashion. This is why passwords should not be short words found in a dictionary.
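
To put a rough number on that, here is a minimal sketch in C (the word list and the try_password function are invented stand-ins for a real authentication interface, not any actual cracking tool): a guesser working through a word list needs at most as many tries as the list has entries, so a password drawn from a 50,000-word dictionary falls in at most 50,000 automated attempts, while eight random lowercase letters already allow 26^8 (about 200 billion) possibilities.

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical stand-in for a login attempt; a real attack
       would be probing an actual authentication interface. */
    static int try_password(const char *guess)
    {
        return strcmp(guess, "arcturus") == 0;  /* the "secret" password */
    }

    int main(void)
    {
        /* An invented word list; a real tool would read a large
           dictionary file and iterate it the same way. */
        const char *dict[] = { "password", "letmein", "dragon", "arcturus" };
        size_t n = sizeof(dict) / sizeof(dict[0]);

        for (size_t i = 0; i < n; i++) {
            if (try_password(dict[i])) {
                printf("guessed in %zu tries: %s\n", i + 1, dict[i]);
                return 0;
            }
        }
        printf("not in dictionary\n");
        return 1;
    }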

What the "millions of lines of code" phrase alludes to is the second way to gain access: through cracks in the software. The idea here is to get an unauthorized executable to run on the target system, giving access to an unauthorized user. A common way is to take advantage of poor error checking to cause a buffer overrun into memory that the target will execute (buffer overruns happen quite often, but usually don't act as executable code). There is no technical reason why these cracks should exist in an OS. So a totally secure OS -IS- attainable, given the qualification that unauthorized users may still gain access by borrowing an authorized access path. There are even techniques to make this less likely (biometrics, randomly changing passwords, and so forth...)
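
To make the buffer-overrun mechanism concrete, here is a minimal sketch in C (the function, buffer size, and names are invented for illustration; this is not code from any real worm): the crack is simply a copy whose length nobody checks.

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical handler for a network message. The fixed-size
       stack buffer plus the unchecked strcpy is the classic flaw:
       a message longer than 64 bytes overruns buf and can overwrite
       the saved return address, letting a carefully crafted message
       redirect execution to the attacker's own bytes. */
    void handle_message(const char *msg)
    {
        char buf[64];
        strcpy(buf, msg);   /* no length check: the crack */
        printf("got: %s\n", buf);
    }

    /* The repair is one line of error checking: bound the copy. */
    void handle_message_safely(const char *msg)
    {
        char buf[64];
        strncpy(buf, msg, sizeof(buf) - 1);
        buf[sizeof(buf) - 1] = '\0';  /* strncpy may not terminate */
        printf("got: %s\n", buf);
    }

    int main(void)
    {
        handle_message_safely("hello");
        return 0;
    }

This is exactly the kind of crack described above: there is no technical reason for it to exist, and avoiding it is mechanical.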

A tired or inexperienced programmer might write code that has a crack that can be exploited, but this is certainly the OS vendor's fault. What about the cracker's/hacker's responsibility? Like anybody else, which includes the OS vendor, each of us is individually responsible for the consequences of our actions. However, a strong case can be made that crackers are part of the environment that makes for a secure OS. They're the unpaid testing department (why should a company use profits to test and fix an OS when the public will do it for them?). Can anything that hasn't been robustly tested be considered secure?

The other thing I find wrong is the desire to inflict maximum punishment on some fool who was likely not the instigator. There's plenty of culpability to go around. People shouldn't steal, but if I decide the safest place to store my hundred-dollar bills is on the public sidewalk... I might be a little bit responsible for the loss.

So yeah, I can find plenty of blame for those whose only creed is profit.

Posted by ken anthony at August 30, 2003 09:36 AM

Mr Anthony is absolutely correct when he states that TECHNICALLY there is no reason why system "cracks" could not be eliminated. On the other hand, given the realities of the software publishing world and human nature, I can only reiterate my stated scepticism that it will ever happen. I am willing to bet that if you walk into a library and pull a randomly selected book off a shelf, I will be able to find an editing error, probably many. TECHNICALLY, there is no reason why this should be. Nevertheless, the likelihood that the book you hand me is error free is sufficiently small that I will readily make the bet.

Software programs are complex. Worse, they are in a constant state of flux as new features are added or (rarely) removed. Programmers in most companies are under considerable pressure to get products to market. But they do not have to be tired or inexperienced to make a mistake. They all make mistakes. The corporate cultures they labor under largely determine how many of those mistakes will be caught. My "millions of lines of code" comment referred to complexity as a factor in the process. It was not intended to be construed as the sole cause of, or a mitigating excuse for, flawed software.

I am reminded of the Mars probe that was lost because someone failed to reconcile English and metric units of measure. Obviously, subsequent testing failed to find the flaw. Should it have been found? Yes. Will yet-to-be-imagined errors occur in future mission software? Undoubtedly yes. The testing process is no more immune to error than the programming process. We can work to minimize that error, but we will never totally eliminate it. The people I fear the most are those with the hubris to assert that THEY never make mistakes. Some actually believe what they are saying. Sadly, they are all too common.

I agree completely with the comments regarding user culpability. No matter how many times users are reminded not to open unsolicited e-mail attachments, some inevitably do. If I had a nickel for every user who has his password written on a post-it stuck to his monitor, I could retire right now. You can rail and rant. You can station a security goon behind every user. You can load a workstation up with so much spyware that it barely runs. The indomitable user will STILL find a way to screw something up.

No one is devoid of the Will to Fail. We had an incident at my company some months back that involved transmission of classified federal government material via unsecured e-mail. The culprit? The company's Chief Operating/Facility Security Officer. The IT manager (me) ended up wasting most of a day, while concurrently directing a cascade of withering vitriol at anyone unfortunate enough to venture near, sanitizing the machines of the offender and recipients, the mail server, the backup tapes ... The moral? If there is a human factor involved, sooner or later there WILL be a failure. (I must humbly confess to having actually made some mistakes myself.)

As to your last point, concerning throwing the book at luckless individuals who get caught, that is simply the way our legal system works. Deterrence is an equal, if not greater, objective of a prosecution. Those who get caught are made public spectacles in the not-always-vain hope that doing so may discourage others from similar indiscretions. Unfair? Perhaps, but the alternatives are worse.

jrb

Posted by Jon R Brenneman at August 30, 2003 03:51 PM


The real question in my mind is two-fold. First, why wasn't the Blaster patch (available a month before the worm) part of Microsoft's Update fixes? I downloaded the patch independently, which, as far as I can tell, was the only way to get it -- after the worm got big, M$ put a notice box on Update, but not the patch! Second, why hasn't this been brought up as an issue?

The patch gets issued; a month later the worm spreads like mad (because the DCOM problem didn't require the user to do anything, unlike Sobig), and you still have to seek out the patch.

I use Update every day or two -- why didn't it fix the problem before it became one?

Oh, yeah, let's lock up some fat 18-year-old and give M$ another billion -- stupid move, Rand, you buy into Gatesland?

Posted by at August 31, 2003 07:29 PM

"...given the realities of the software publishing world and human nature, I can only reiterate my stated scepticism that it will ever happen."

I share your skepticism and certainly acknowledge the realities that give force to your statements.

I think it's the realities themselves that have to be addressed. If Microsoft puts out products with bugs and security holes, given all their resources, imagine the dilemma for a small software publisher. It costs real money to prevent bugs and holes. The metric I've read somewhere is that even the most careful person makes about one error in a thousand. The solution is to have two or more programmers code the same function and then do a code review. Even if both programmers make errors, it's likely they will make different errors, and a code review can find them.
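
A minimal sketch of that redundancy idea in C (the routine is invented for illustration; this is not a claim about any particular company's process): two independently written versions of the same specification, cross-checked so that a mistake in either one shows up as a disagreement.

    #include <stdio.h>
    #include <stdlib.h>

    /* Two independently coded versions of one spec:
       the sum of the first n positive integers. */
    static long sum_loop(int n)
    {
        long s = 0;
        for (int i = 1; i <= n; i++)
            s += i;
        return s;
    }

    static long sum_formula(int n)
    {
        return (long)n * (n + 1) / 2;
    }

    int main(void)
    {
        /* The review step: any error in either version (off-by-one,
           typo, overflow) surfaces as a disagreement between the two. */
        for (int n = 0; n <= 1000; n++) {
            if (sum_loop(n) != sum_formula(n)) {
                fprintf(stderr, "versions disagree at n=%d\n", n);
                return EXIT_FAILURE;
            }
        }
        puts("all cross-checks passed");
        return 0;
    }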

Obviously this is more expense than some software companies can afford. The other thing is that small companies often use less experienced programmers, who cost less but may be unable even to see the errors in their code.

But Microsoft is a different animal. The most significant thing is that along with applications they also write this thing called an OS. The OS is unique in that it can provide security even when insecure applications run on it. The only real question is whether they have both the money and the responsibility to do so. I think the answer is plain.

The other question is how a user verifies the security of the OS. There is only one possible way, and that's for the user to have access to the source code. Microsoft has a lot of products to keep them profitable, but an OS is unique and should be treated differently from your average application. The source needs to be verifiably secure, especially since public utilities rely on this software.

I support the right of Microsoft to maintain the secrecy of their source, but I think this will eventually push at least public services to look for other viable alternatives. I just wish the open source community would look at the things Microsoft does right and try to follow the example. I'm a strong believer in free and open competition (and just as opposed to strong-arm tactics, which have no place in an open marketplace).

Posted by ken anthony at September 1, 2003 09:02 AM

