Jimmy Ruska's Blog
Computer Revolution of 2009 Posted on Monday, January 26, 2009

Here are a few things I look forward to this year.

Solid State Disks

Solid state disks are predicted to become more reliable, energy efficient and faster in 2009. You can see the advantages and disadvantages of SSDs vs. HDDs on Wikipedia. The netbook revolution of 2009 will fuel demand for research and improvement. One to two hours of battery life just isn't enough time to watch a movie on a trip or last through a couple of back-to-back classes in college with confidence.

Electronic Paper Display

Electronic paper is a type of energy-efficient display that only recently started being used in mass-produced consumer products like the Amazon Kindle. The Kindle is much easier on the eyes and can open PDF, DOC, TXT and similar formats. Reading ebooks from a regular screen can get tiring on the eyes, but buying the $30-$40 paper version of something like an O'Reilly book just because it's slightly more annoying to read the online version seems wasteful, not that they don't deserve the revenue. Some sites sell the PDF version for much less than the paper version of a book. The Kindle, and hopefully similar devices, will offer a middle ground where I can get a book cheap online and read it without straining my eyes. As more competition comes out and the technology improves it will definitely be worth it.

Right now the Kindle is priced at $359, which is probably more than all the books, ebooks and screencasts I've ever bought online combined. There's no way it costs anywhere near $350 to make a 10.3-ounce Kindle; however, since demand and hype are high and they control the supply, they can get away with it. Given that Amazon will probably make a huge profit from people purchasing books off their Whispernet, they could afford to sell it below production cost. Amazon should follow iTunes' lead in stripping the DRM from their downloads. $9.99 is certainly cheaper than $20-$30 for a paperback, but with real books you can lend them and re-sell them. The bandwidth for tiny PDF documents is minimal compared to the creation and distribution of a physical book. Many complaints have been made about holding the device without accidentally pressing buttons and about the unreliability of Whispernet.

USB 3.0

USB 2.0 tops out at 480 Mbit/s, while USB 3.0 is expected to max out at 5.0 Gbit/s. Check out the specifications on Wikipedia.
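To put those numbers in perspective, here's a rough back-of-the-envelope sketch in C (my own illustrative example, using the raw signaling rates above; real-world throughput is lower once protocol overhead kicks in) for how long copying a DVD-sized 4.7 GB file would take over each bus:

    /* usb_speed.c - toy comparison of theoretical USB transfer times */
    #include <stdio.h>

    int main(void) {
        const double file_bits = 4.7 * 8e9;   /* 4.7 GB expressed in bits */
        const double usb2_bps  = 480e6;       /* USB 2.0: 480 Mbit/s */
        const double usb3_bps  = 5e9;         /* USB 3.0: 5.0 Gbit/s */

        printf("USB 2.0: %.0f seconds\n", file_bits / usb2_bps);  /* ~78 s  */
        printf("USB 3.0: %.1f seconds\n", file_bits / usb3_bps);  /* ~7.5 s */
        return 0;
    }

Roughly 78 seconds versus about 7.5 seconds, at least on paper.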

Programming Shift and the Multi-Core Revolution

Clock speeds used to almost double every year, but we're getting to the point where silicon is starting to reach its limits. Instead the focus has shifted to multi-core computing, with dual-core and quad-core machines now somewhat common in stores. The problem is that many programs still don't have support for multiple cores and can't use them to their advantage.
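Taking advantage of extra cores usually means explicitly splitting the work into threads. Here's a minimal sketch in C with POSIX threads (my own toy example, not taken from any particular program) where a loop that would normally keep one core busy is divided across four workers:

    /* threads.c - split a loop across worker threads
     * compile with: gcc -std=c99 -O2 -pthread threads.c */
    #include <pthread.h>
    #include <stdio.h>

    #define NTHREADS 4

    struct chunk { long start, end; double sum; };

    static void *worker(void *arg) {
        struct chunk *c = arg;
        for (long i = c->start; i < c->end; i++)
            c->sum += 1.0 / (i + 1);               /* some per-element work */
        return NULL;
    }

    int main(void) {
        pthread_t tid[NTHREADS];
        struct chunk chunks[NTHREADS];
        long n = 100000000;
        long per = n / NTHREADS;
        double total = 0;

        for (int t = 0; t < NTHREADS; t++) {       /* hand each thread a slice */
            chunks[t].start = t * per;
            chunks[t].end   = (t == NTHREADS - 1) ? n : (t + 1) * per;
            chunks[t].sum   = 0;
            pthread_create(&tid[t], NULL, worker, &chunks[t]);
        }
        for (int t = 0; t < NTHREADS; t++) {       /* wait and combine results */
            pthread_join(tid[t], NULL);
            total += chunks[t].sum;
        }
        printf("sum = %f\n", total);
        return 0;
    }

Written the traditional single-threaded way, the same loop would leave the other cores idle no matter how many the machine has.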

It's not a big deal yet, at least not to the point where the majority of people start complaining about programs that can't use their extra cores. The opposite is true. Most people are clueless and often run into issues like a process using 100% CPU. Rather than end the process, they shut down their computer and curse machines for being dumb and stupid. The more cores there are, the more unoptimized processes are allowed to run rampant while the clueless user happily uses Outlook Express, browses through all their wonderful IE toolbars, chats on AIM and downloads more spyware'd 'cool games'.

Differences will be more obvious as even more cores are added. A 16-core or even 32-core machine with slower but much more energy-efficient processors is what I'm hoping for. If they keep using super fast but energy-inefficient cores, a 32-core system at 3 GHz per core would be insanely fast but might start competing with your air conditioning in terms of energy consumption, and the case would have to be something like a mini-freezer. AMD and Intel have long known how to make energy-efficient processors, but there's not a market for slower machines. Even with several cores, if a program can only use one core and the cores are 400 MHz AMD Geodes, it's still going to feel like your program is running on a 400 MHz machine. Even if we probably already have the technology to build such a machine, the world isn't really ready for it. Programmers need to get up to speed.

Google Chrome is a great example of a program that can take advantage of multi-core systems. Each tab is an individual process that can be automatically assigned to any specific core by the operating system. If one tab crashes it doesn't crash the whole browser. Even on a single core the design is more robust than running everything in one process, and it takes full advantage of future multi-core machines.
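Here's a minimal sketch of the process-per-tab idea (my own illustration, not Chrome's actual code): the parent "browser" forks one child process per "tab", and when one of them crashes the parent just notices and keeps running:

    /* tabs.c - toy process-per-tab isolation demo */
    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/wait.h>
    #include <unistd.h>

    static void run_tab(int id) {
        if (id == 1) {
            fprintf(stderr, "tab %d: crashing\n", id);
            abort();                      /* simulate a renderer crash */
        }
        printf("tab %d: rendered fine\n", id);
        exit(0);
    }

    int main(void) {
        int id, status;
        pid_t pid;

        for (id = 0; id < 3; id++)
            if (fork() == 0)
                run_tab(id);              /* child: isolated "tab" */

        while ((pid = wait(&status)) > 0) /* parent outlives any crashed child */
            if (WIFSIGNALED(status))
                printf("browser: tab process %d died, showing a sad-tab page\n",
                       (int)pid);
        puts("browser: still running");
        return 0;
    }

Because each tab is an ordinary process, the operating system's scheduler is free to spread them across whatever cores are available, which is what makes the design scale so naturally.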

Linux Revolution

With Vista being a huge disappointment to people who know what they're talking about, a lot hinges on Windows 7 being a great operating system. More and more people are discovering Firefox, OpenOffice, Pidgin, GIMP and other very successful open source replacements for common proprietary applications. The open source revolution is hurting public perception of Microsoft, making more people willing to experiment and consider alternatives. In the last few years Linux has improved considerably, and the speed at which it's improving makes me fairly certain it will become at least somewhat comparable to Microsoft in market share within the next 5-10 years. One problem is that Linux developers often use cross-platform frameworks like GTK+ and wxWidgets, which let programs run on a variety of operating systems, so many of the killer apps Linux users have made run on Mac and Windows just fine.

My First Linux: Debian Woody

When I tried Debian Woody in 2003 I was honestly frustrated. The installation process on my laptop gave me huge difficulties. I spent several days tinkering and roaming message boards for solutions. I did manage to get KDE running after the annoying "no screens found" error and some files the CD complained about not finding, but my Ethernet card wasn't supported and the package manager drove me insane. One program would need an updated version of a certain library, and installing the new library would break older programs. As a beginner it wasn't very fun, and I didn't know enough C to really get into the developer culture.

I later got a dedicated server for my website and over time got more familiar with Linux. I dual-booted Mandriva with XP and used it every now and then to study C with gcc. For some reason whenever I used it, my disk would start making loud noises like it was writing something, though the process list didn't show anything unusual running. Later on Mandriva went commercial and would no longer give updates, so in time it felt outdated, just a vestigial chunk of my partition, and I removed it. I still messed with my dedicated server every now and then. Sometimes forced, sometimes for fun.

I started looking into Linux again when the Ubuntu craze came around, and so far I've been happy to have it on its own partition alongside XP. I'd mainly switch over when I wanted to mess with gcc. I've been using my dedicated server and managing it over SSH for a long time, but this year I'll try to make the transition to using GNOME more than XP. It seems inevitable that Microsoft will lose this war, maybe not immediately with Windows 7, but definitely in the next few years. As long as I can code I can take an active part in the battle.




Tags: best of 2009, technology changes 2009, solid state drives, linux




