Tuesday, May 6, 2008
This isn't a post about embarrassing teenage angst. It is actually a story about natural resources, the industrial revolution, and programming for Windows. Ars Technica, one of the better tech sites on the internet, recently did a series on the lead-up to the most recent malaise in Windows programming. As I read the piece, I kept noticing parallels between Microsoft's difficulties and those of the early innovators of the industrial revolution. Even if the only thing you know about 19th-century Britain is that the nascent United States kicked it out of the better parts of North America, the comparison is actually pretty relevant.

Rule Britannia

Britain ran the 19th century. It controlled the sea lanes, dominated world trade and finance, and governed nations ten times its landmass and population by occasionally machine-gunning down crazy-brave hordes of natives who were attacking with spears. It got this way not only because it was the first to recognize and reap the massive benefits of new industrial machinery, but also because it ruthlessly closed that technology off from the rest of the world to retain its advantage. Many of the first major industries in America were built on stealing that technology anyway, not completely unlike piracy in China at the moment.

In the late 19th century this initial advantage actually started to hurt Britain in places. A "second" industrial revolution began when better techniques for producing things like steel were developed. Nations a little behind the times could buy the newer equipment fresh, without the perceived cost of replacing older machinery. A newly united Germany bears mention here.

Relevance to Windows development

According to the Ars series (which is not at all alone in this suggestion), programming in Windows is a pile of old equipment that needs replacing.
Old 16-bit Windows software still runs on the 32-bit versions of the newest systems (the 64-bit editions finally dropped it), but the odd things you have to account for to maintain that backwards compatibility make programming new software a tremendous pain. Relative latecomers (in terms of popularity) like Apple and the more user-friendly Linux distributions such as Ubuntu can and do start from scratch to build better tools, without incurring the kind of fallout Microsoft would face if it did the same. Apple pulled this off by running its old applications in a virtual machine. Desktop Linux distributions don't really have to worry about this, because their user base isn't large enough for broken compatibility to cause mass customer loss.

The apparently sexier place to do development is the web, which is OS-neutral and can increasingly do things that were once limited to the desktop. The work that still needs to be done on a local machine, such as heavy audio, image, and video processing, has plenty of good tools on alternatives to Windows. The only exception here seems to be the gaming industry, which in general steadfastly refuses to develop for multiple operating systems because, from what I understand, the open graphics API (OpenGL) is more difficult to use than Microsoft's DirectX.