
Saturday, September 12, 2009

The Way of the Future

I will start this very special entry with a very ugly truth I realized: "This is my last trimester. I'm getting out of school, finding a job, making some money, and I just realized I'm already old news; I'm not a fresh programmer but only the latest iteration of a dying paradigm."

Okay, the people who know me think I'm a talented individual, so it may have shocked them to read that last paragraph. The truth is, it's all true: we are a dying breed.

I have always used a personal metric to figure out where IT is headed: I just watch what new trend Microsoft has adopted (people can hate Microsoft, but if something new is good, Microsoft will adopt it, buy it, or copy it; either way, they end up using it). So when I found out Microsoft was making a big deal about concurrency in the upcoming .NET 4.0, I knew something was happening, something big.

So what's the big deal about concurrency? To put it simply, we all know the future of processors is adding more and more cores. But what happens when your code's performance doesn't improve, when your code is only 1% faster on a machine with twice as many cores as before? You've just found the tip of the iceberg, and believe me, it's a huge problem for us programmers.
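That ceiling is usually stated as Amdahl's law: if only a fraction of a program can run in parallel, extra cores speed up only that fraction. A minimal sketch in Python (the function name and the 5% figure are my own illustration, not from the post):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Max speedup when only part of a program runs in parallel (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A program that is only 5% parallel barely benefits from a second core:
print(round(amdahl_speedup(0.05, 2), 3))     # about 1.026, i.e. roughly 2.6% faster
print(round(amdahl_speedup(0.05, 1000), 3))  # even 1000 cores give barely ~1.05x
```

Which is exactly the "twice the cores, 1% faster" scenario above: unless we rewrite our code to raise the parallel fraction, the extra cores sit idle.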

It seems the problem is that we still develop software the way von Neumann's model said we would, one instruction at a time, but our hardware is already past that point. It's our code and our coding skills that are outdated; we just aren't thinking in parallel.

Multicore processors displaced single cores because pushing a single core's clock frequency ever higher caused overheating and higher power consumption. Multicore processors don't overheat as easily as single-core processors do, but the more important reason is that they use less power (running those uber datacenters, filling entire airplane hangars both in the US and in Europe, is possible now). The point is, we can all agree more cores is how Intel and AMD are going to shape our future.

The answer to this problem is to think in parallel; we need new skills. There won't be a magical compiler to handle parallelism for us (researchers have been trying to achieve that since the '80s). Our current dominant platforms (Java and .NET) can't solve this issue, nor can dynamic languages like Ruby (a personal favorite) or Python. We need new skills.

Why do our most used tools and languages fail to solve this problem? Because they share resources (RAM). If A + B = C, then C - A = B; we all know that's true, but what happens when, running on a datacenter with 600 nodes (servers) à la Facebook, one of those nodes changes the value of B just as another is checking the second half of the equation? Bad things. So our answer is to lock the memory, just as our databases lock write access to a record to avoid this kind of problem (and then more and more of our datacenters' cores sit waiting, because while a resource is locked they can't do anything until it's free). The problem is at the core of all our tools. Some experts have said the problem is C itself, and I ask you: which of our modern tools have not borrowed some of C's ideas? (I'm assuming your language of choice isn't written in C.) I hope you guys are following.
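The race-then-lock story above can be reproduced in a few lines. This is a minimal sketch in Python's standard threading module; the counter and the thread/iteration counts are my own illustration:

```python
import threading

counter = 0
lock = threading.Lock()

def add_many(n):
    """Increment the shared counter n times, holding the lock for each update."""
    global counter
    for _ in range(n):
        with lock:          # without this lock, two threads can read the same
            counter += 1    # old value, and one of the increments is silently lost

threads = [threading.Thread(target=add_many, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # with the lock, always 200000
```

The lock makes the answer correct, but it also serializes the hot path: while one thread holds it, every other core that wants that counter just waits, which is exactly the datacenter-scale problem described above.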

So what's the answer? At the moment we may not feel this pain in our daily jobs (unless you work for a huge company like Amazon or Google, which may need concurrency to use all the power of multicore processors); our programs don't use all the power of our hardware, but we can endure for a few more years (especially in DR, where changes arrive with an 8-year delay). Microsoft and Apple have made wonderful efforts to use the power of multicore processors and to make it easy to jump on the bandwagon to the future: concurrent programming. Both companies have added APIs and libraries for devs in the current iterations of their operating systems. Will that be enough? No, but it's a start. (Sorry, Linux users; apparently at the moment Linux doesn't care much about concurrency, which may make some devs leave that OS over its lack of vision, as in my personal case.)

Most of my personal friends in the IT world know I don't care about the money (which I need to eat and to "satisfy" all of my future GF's whims; yes ladies, I'm single, and yes Sab, I know), but I do care about the problems I solve and the achievements I make programming solutions for myself or for the company I'm working for. So my personal solution is to learn concurrent languages like Erlang, F# (runs on .NET), or Clojure and Scala (both run on the JVM). I know most people don't have time, but I'm learning in my little free time and I already see some benefits: I'm coming up with more sophisticated solutions because my mind has a broader scope now.
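What makes languages like Erlang different is that processes share nothing and communicate only through messages, so no lock is needed at all. That style can be sketched even in Python with one queue per "process"; the worker function and the message names here are my own illustration, not the API of any of those languages:

```python
import queue
import threading

inbox = queue.Queue()    # the worker's mailbox: the only shared object
replies = queue.Queue()  # mailbox for answers back to the caller

def worker():
    """Own all state privately and react only to messages, Erlang-style."""
    total = 0
    while True:
        msg = inbox.get()
        if msg == "stop":
            replies.put(total)
            return
        total += msg  # no lock needed: only this thread ever touches `total`

t = threading.Thread(target=worker)
t.start()
for n in (1, 2, 3, 4):
    inbox.put(n)
inbox.put("stop")
t.join()

result = replies.get()
print(result)  # 10
```

Since the worker's state is never shared, there is nothing to lock and nothing to corrupt; that is the mental shift these languages force on you.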

So to end this article, I conclude with the phrase that started it all: "This is my last trimester. I'm getting out of school, finding a job, making some money, and I just realized I'm already old news; I'm not a fresh programmer but only the latest iteration of a dying paradigm."

Lastly, I leave a very good link about this problem.

Is this a problem you're facing, or do you believe it won't affect IT? Leave your comments.