Johnny's Software Saloon

Weblog where I discuss things that really interest me. Things like Java software development, Ruby, Ruby on Rails, Macintosh software, Cocoa, Eclipse IDE, OOP, content management, XML technologies, CSS and XSLT document styling, artificial intelligence, standard document formats, and cool non-computing technologies.

Location: Germantown, Maryland, United States

I like writing software, listening to music (mostly country and rock but a little of everything), walking around outside, reading (when I have the time), relaxing in front of my TV watching my TiVo, playing with my cat, and riding around in my hybrid gas/electric car.

Saturday, January 16, 2010

Quite a decade

As often happens when we start a new year or a new decade, The Register has published an article reflecting on the past decade.

The theme of the article is how big companies became bigger, and how some really big companies failed, in the past decade.  It also looks back at some of the technologies that caught on smoothly, like USB.

It is a fascinating retrospective.  The title is a little demented since it picks on Google for being the opposite of what they are.

I just ignore that, like I ignored the claptrap that made the spectacular hacking of Silicon Valley possible.  All the illogical statements like: hackers do not find bugs and flaws in software themselves, they wait until patches come out; a bug in a product that has been out for over eight years is a "zero day" vulnerability, because the press release says so; IE is safe now, because one more flaw out of a cornucopia has been fixed.

The facts in the article are interesting.  The title, I dunno.  I suppose it was an ad.  It did not fit the facts, and really we all learned decades ago to ignore "paid testimonials" from watching TV commercials.

Looking back over 2000-2009, there were a lot of dramatic changes.  More than the article hinted at, since it fixed its gaze mainly on a few companies, and not the broader canvas of computing.  In the really long run, I am not sure those companies matter.  I think the computing situations we are in matter more.
  • search is fundamentally important because the web is so dynamic and so big
  • computing has become incredibly untrustworthy
  • cost of flaws is pushed off onto the public, with the press and the government supposed to take on the roles of technical support
  • computer languages and application servers are incredibly powerful today
  • web browsers are much faster and more efficient than IE6 and Netscape 4
  • companies that talk the most about all the great things they are doing with computer security are the ones whose products get hacked far more than anyone else's
  • security companies do get hacked the same ways the rest of the industry does
  • we cannot shut down giant botnets or stop them from attacking US companies and agencies
All of these things are somewhat surprising to me.  Consider history.

In the 1970's, computers had just transitioned from DP centers to hobbyist desktops, and finally to appliances.  TV set-top video games like the Atari were introduced.  Schools taught programming courses on minicomputers and mainframes - with terminals, and paper and magnetic tape readers.  You could get access to a computer at MIT by request, if you could explain a worthy project.  People who brought down computers were outcasts, and banned.  It was truly the "honor system".  Hackers were not a threat to the average person or typical company at all.  Operating systems consisted of some device drivers, a file system for hard disks, and maybe a BASIC interpreter.  You turned on your computer, and it was on instantly or within a matter of seconds.  The US was a self-contained computing bubble, and we really did not attack ourselves that much, obviously.  We were building something.

In the 1980's, computer displays transitioned from raw text to friendly GUIs, and so desktop publishing, word processing & spreadsheets caught on.  Now, we were using these tools we had created to write and print letters, organize personal and business information, and basically retire typewriters.  Wonderful video games for Apples and PCs arrived that were better than arcade and TV set video games, in some cases.  Two-dimensional graphical role-playing games arrived on home computers.  Computer flaws that caused serious problems were pretty rare.  Macs just worked.  PCs just ran MS-DOS most places during this era.  The first PC and Mac viruses came out.  The Mac had absurd vulnerabilities, like the ability to attach executable code to desktop icons.  Society's biggest dangerous hack in that era was a malfunctioning worm that shut down government, commercial, and university computers all around the United States for a day.  Academics at MIT and elsewhere snagged copies of it, analyzed it, solved the problem of how to stop it, and shared the solution.  So the worm did not live long.

In the 1990's, personal computers were fast, attractive, and easier to use.  RISC computing rose to its peak and began to fall.  The Intel/AMD x86 processor architecture emerged as dominant.  The 68k architecture expired; the Mac switched from it to PowerPC (RISC).  3D computer games for PCs debuted.  Windows finally caught on.  The web caught on when Mosaic was introduced early in the decade.  The Wall that symbolically separated Eastern Europe from the Western world was torn down.  Eastern Europe had serious economic problems.  The Soviet Union gave East Germany back to West Germany.  Much of the rest of the former Soviet bloc declared "I quit," and left as well!  Russia had to rely on conventional trade agreements to make economic deals.  East European gangs were unleashed.

What happened in this latest decade was the US got hacked, hard.  Over and over, the Windows operating system got pounded.  Mistakes by users, errors by programmers, and flaws in server, email, and browser applications began to be exploited by curious, noisome little vandals.  In 2001, these suddenly had very serious ramifications when exploited.  Global worms infested millions of Windows PCs in 18 minutes.  It was a game-changer, or at least a warning shot over the bow.  A couple of years later, that warning rang true.  Criminals bent on thievery, extortion, and exploitation started hitting those same Windows computers.  They never let go - they just grew, and grew their giant armies of compromised computers.

Computers are much faster today than they were a decade ago.  No doubt faster and easier to use than almost anyone dreamed they could become, back in the 1970's.  GNU C sped up the state of the art in compiler technology, making very fast software possible.  Intel and AMD made very fast computer processors.  Every application has benefited from those types of improvements.  Speed has gone well.  Safety has not.

Computers are vulnerable for a lot of reasons today.  Most of those reasons were evident in the 1970's or 1980's.  Computer language designers declared they would not get in programmers' way, and so would not stop them from making mistakes in their programs.  C - still the main language used today for system programming, along with its descendants C++ and Objective-C for applications programming - had a lot of risks.  It made it easy for a programmer to make a mistake and never notice it.  Many mistakes were made.  Though unnoticed by their creators and maintainers, they were noticed by criminal elements.
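
To make that concrete, here is a minimal, hypothetical C sketch of the kind of mistake the language cheerfully accepts - the greet function and its names are my own illustration, not code from any real product.  It copies into a fixed-size buffer with no length check: it compiles cleanly and works fine on short inputs, while a long input silently overwrites adjacent stack memory.  That is the classic buffer overflow criminals learned to exploit.

#include <stdio.h>
#include <string.h>

/* Hypothetical example: a greeting routine with a classic C flaw. */
static void greet(const char *name)
{
    char buffer[16];               /* fixed-size stack buffer */
    strcpy(buffer, "Hello, ");     /* fits: 7 characters plus the NUL */
    strcat(buffer, name);          /* no length check: any name longer
                                      than 8 characters runs past the
                                      end of buffer and corrupts the
                                      adjacent stack memory */
    printf("%s\n", buffer);
}

int main(int argc, char *argv[])
{
    /* Fine for short input; an attacker-supplied long argv[1]
       smashes the stack with no warning from the language. */
    greet(argc > 1 ? argv[1] : "world");
    return 0;
}

The compiler and the running program both stay silent right up until someone feeds in a name that is too long.  Safer calls like strncat existed, but nothing in C forced anyone to use them - exactly the "we will not get in your way" philosophy I just described.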

Today, commerce and security are threatened by a ball dropped three decades ago.  That ball has kept rolling.  How long before we switch balls or fix it?
