As a custom-built PC enthusiast, I've always been one to keep an eye on the developments of new hardware and software. As a younger lad, enthusiastic about PC games such as Tribes, Rune, Starcraft, and Diablo 2, I was always keen to have the latest, greatest, fastest, and biggest available. Upon realizing there was a bit more to life than having the highest kill count or a perfect victory record, my uses for computers have changed - but my enthusiasm remains.
Today, as a digital musician, digital media producer, and web developer, my goals are oriented much more towards stability and reliability than peak numbers and having the absolute newest rig out there. Because of this, I buy generation-old computer parts - they're significantly cheaper, have a broader driver base, and everyone has already figured out the kinks to get them running "just right." As a result, I can build a system that is more stable than any branded PC on the market.
But it's still not perfect. Inevitably, Photoshop will still crash on the occasional render of a 1.5-gigabyte file. Something may still throw an error when the system shuts down. And there will always be that niggling sense that the system isn't as fast as it could be, or perhaps even should be, given all the touted features my hardware supposedly has.
So what's the deal? With all the amazing technology we have today - quad-core processors, solid-state PCIe drives, wireless broadband, and touch-screen cell phone PDAs that will put your grandma's desktop to shame - why can't my iPod and Windows Media Player get along? Why does it take my system 40 seconds to go from power-on to a usable state (compared to a TV, which is instantaneous)? And yes, there are excuses - of course there are excuses - but what I'm talking about is where we should be, not where we currently are.
Manufacturers are bent on releasing a new product every 3 months, with higher numbers, more features, and fancier packaging. This is the (apparently) successful business model of planned obsolescence. The result? More potential hardware configurations than any software developer could realistically be expected to support.
Rather than flooding the market with 6 lines of graphics cards, and 6 trim levels of each, every 6 months, manufacturers should focus their development on releasing fewer, more functional, and more stable products.
Video game consoles are the perfect example of this. There is only ONE system configuration, and it works very well within its set bounds. It has its limitations, and even flaws, but if they're broadly known they can be avoided or even turned to the developer's benefit. The result? Watch the releases of games from a console's launch through to the end of its life - the first games look on par with the previous console generation, and by the end it's as if they're using twice the power the console is capable of. And all on hardware that is at times 1/10th the power of a decent desktop gaming rig.
The reason is optimization. I would even use the term "mastery," given the context of the discussion: language (in this instance, a coded language). In the computer world, new hardware - and new languages to use that new hardware - is thrown at us from all directions on a monthly basis. Before we have a chance to fully realize the capabilities of the current technology, it becomes incompatible with the new and must be thrown out or heavily modified. Its potential is never fully realized.
I don't mean to say that the growth and expansion of a language or technology is bad, or that sterilizing its organic nature into stagnation is the solution. My proposition is that we stop pumping it full of steroids and growth hormones and let it evolve naturally.

Chip manufacturers: release one chip a year, make sure everyone has drivers for it well in advance of the public launch, and make it solid, stable, optimized, and streamlined. Make products that will still hold their own 5 or 6 years down the road. Video card manufacturers: offer only one model in one size, maybe with a choice of how much memory. Again, release products that will still be usable when the newest games come out. And developers: optimize the code! Don't be lazy and count on the hardware manufacturers to come up with something by launch day that will barely be able to run your game. Level the playing field - go for stability - and video games will actually be able to utilize the capabilities of the hardware. Keep in mind that programs take years to develop, and they are developed against the technology available at the time. So how is it that by the time they launch, the most powerful systems can barely manage them? With the technology at our disposal, this should not be an issue.
What I'm saying is, don't abandon something before giving it time to come into its own. Master the technology we have before deciding where to go next.
Technology - why such a hurry? Faster processors mean nothing if the software can't even use them.