I had noted the announcement of the IBM-Sony-Toshiba "Cell" processor last week, but had not paid it close attention. While (as a relative of the PowerPC architecture) it may end up in future-generation Macs, its initial use will be in the Playstation 3; console game systems really aren't on my radar these days. However, IDFuel's Dominic Muren gave me a well-deserved (virtual) smack upside the head -- the Cell processor looks to be a pretty big deal for some distinctly worldchanging-relevant reasons.
If you haven't heard of it, the Cell is a new processor design that debuts at speeds well beyond the best from Intel, with the potential to run ten times the speed of today's fastest computers (and you thought Moore's Law was dead...). That's nice, but faster processors are hardly a surprise. One aspect that makes it interesting is its "multi-core" architecture: each processor comprises multiple sub-processors, each able to handle tasks independently (even running different operating systems). What makes the Cell very interesting, however, is that the task-shuffling between and among cores isn't limited to a single Cell processor: the chips can send tasks across networks, creating ad-hoc distributed computing groups. And since the Cell runs at relatively low power -- the one used in the Playstation 3 will likely consume 30 watts, about the same as a mobile Pentium -- it can be used in a much wider array of hardware than traditional PC processors. IBM, Sony and Toshiba are already talking about the Cell as useful for mobile devices (phones, PDAs) and home electronics. In a Cell-enabled environment, not only can various devices talk to each other, they can share computing resources.
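Cell's actual programming model is its own beast (SPE local stores, DMA transfers), and none of that is shown here, but the general shape of the idea -- a device fanning independent work units out to whatever cores happen to be available -- can be sketched generically. A toy illustration, with the worker pool standing in for local or networked cores:

```python
from concurrent.futures import ThreadPoolExecutor

# Illustrative sketch only: Cell's real programming model (SPE local
# stores, DMA transfers) is different. This just shows the shape of
# the idea -- a device fans independent work units out to whatever
# "cores" are available, locally or, in the Cell vision, on a network.
def offload(tasks, available_cores=4):
    """Run a list of independent zero-argument tasks on a pool of cores."""
    with ThreadPoolExecutor(max_workers=available_cores) as pool:
        return list(pool.map(lambda task: task(), tasks))

# Three heavy, independent jobs a small device might hand off:
jobs = [lambda n=n: sum(i * i for i in range(n)) for n in (10, 100, 1000)]
results = offload(jobs)
```

The point is that the tasks don't care where they run; in a Cell-style environment the pool could grow or shrink as devices come and go.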
IDFuel imagines what such a world would look like:
But now, if you are in a bus terminal with 15 people talking on phones, and none of them are playing games or taxing their processors, there could be over 100 cores available to your phone to do some crazy calculation unheard of on a cellphone. Or, your entire house of appliances has the potential to use the computing power of the rest of your house. Do you want a toaster with the capability to perform 5th-order differential equations of heat flux, but want to keep the price reasonable? Just allow it to tap a little of your PC's idle power. Or, how about a car that can calculate the best route from Chicago to New York taking into account a predictive model for traffic patterns over the next 12 hours based on current weather, traffic, and road condition reports? No sweat: you're on a road full of Cell-enabled cars just waiting to loan you some P-time.
One hopes that nobody near you is taking that same trip and wants to run the same calculations...
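The toaster line is a gag, but the calculation it names is a classic parallelizable workload. As an illustration (nothing Cell-specific here), one explicit finite-difference step of 1-D heat diffusion looks like this; each interior grid point updates independently of the others, which is exactly the kind of data-parallel arithmetic a vector core chews through:

```python
# Toy 1-D heat diffusion: one explicit finite-difference time step,
# with the ends of the rod held at fixed temperature. Every interior
# point's update depends only on its neighbors' *old* values, so all
# points can be computed in parallel.
def heat_step(u, alpha=0.25):
    """Advance a temperature profile one time step (fixed endpoints)."""
    return [u[0]] + [
        u[i] + alpha * (u[i - 1] - 2 * u[i] + u[i + 1])
        for i in range(1, len(u) - 1)
    ] + [u[-1]]

profile = [0.0, 0.0, 100.0, 0.0, 0.0]   # a hot spot in the middle
profile = heat_step(profile)             # heat spreads to the neighbors
```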
The Cell processor looks to be fast, low-power, inherently distributed and relatively inexpensive (while no price has been discussed, it's unlikely the Playstation 3 will be more than a few hundred dollars, meaning the chip will be a fraction of that). This is exactly the kind of computing technology I love to see.
Don't believe the Cell hype. There's nothing "worldchanging" about this processor. There are already processors on the market that consume less power and perform better than this thing is supposed to.
The "10x faster" claim is only true for some very specialized tasks (audio/video/encryption) that can use the SPEs (the eight green blocks in the picture). For most applications, those will just sit idle and consume die area and power.
A Cell processor consuming 30W will be vastly outperformed by a Pentium M at 25W, and the latter will cost less and will have more applications written for it.
For video games (PS3), set-top boxes and as a building block in a supercomputer, the Cell is a very nice design, but for general-purpose computing it is not very exciting.
It's really amazing how IBM gets away with their claims. Usually when giant corporation X says that their new product Y will beat the crap out of the competition, people are suspicious (and rightfully so), but not in this case. Why? I would really like an answer to that question.
Frederik - it's hardly fair to say with one side of your mouth that the Cell is really good at some tasks while simultaneously insisting the Pentium M will outperform it. It's impossible for both of those statements to be true.
The Pentium M will probably outperform the Cell in integer-heavy applications - AI, spreadsheets, word processors.
The Cell has a PPC core similar to a Mac G4 (but running at 4 GHz) AND has 8 dedicated vector microcomputers.
The Pentium line was designed from the ground up to be a superior stand-alone processor. Dell will not be switching to the Cell any time soon (and neither will Apple, for the same reason). The Cell was designed from the ground up to be a superior network-based, floating-point-intensive processor. The Cell is expected to output 25-30 DP-FP GFLOPS per processor. Just as a comparison, the Earth Simulator supercomputer in Japan (which cost $300 million (or was it $350?)) churns out 2-5 DP-FP GFLOPS per processor. The Cell is also better at sharing resources than the ES cores.
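Taking the commenter's rough figures at face value (actual peak numbers for both chips differ, so this is arithmetic on the claim rather than a benchmark), the per-processor comparison works out like this:

```python
# Back-of-envelope check using the rough figures quoted above; real
# peak numbers differ, so treat this as arithmetic on the claim, not
# as a benchmark result.
cell_dp = (25.0, 30.0)   # claimed DP GFLOPS per Cell processor
es_dp = (2.0, 5.0)       # claimed DP GFLOPS per Earth Simulator processor

conservative = cell_dp[0] / es_dp[1]   # worst case for the Cell
generous = cell_dp[1] / es_dp[0]       # best case for the Cell
print(f"One Cell ~ {conservative:.0f}x to {generous:.0f}x an ES processor")
```

Even the conservative end of the claim puts one Cell at several Earth Simulator processors' worth of double-precision throughput.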
Worldchanging? I don't know. At the very least we'll have to wait for real-world benchmarks. Just keep in mind one thing: you know those "very specialized" tasks the Cell is good at? It's not just video games. It's also protein folding, fusion modeling, weather modeling - you know, the "big iron" of computer tasks. If a Cell-based supercomputer 50x more powerful than the Earth Simulator helps us make some medical or physical breakthroughs, that'll change something.
I probably should have said that the Pentium M will outperform this in non-parallel workloads, which make up the bulk of all software in use today.
And I didn't just say video games, I said that it will be very useful as a building block in a supercomputer, so I agree with you about protein folding and weather sims etc.
The PPC core on this thing is very different from anything released to date. Apart from the instruction set, there are very few similarities with the G4, or any other CPU on the market for that matter. It will be interesting to see how it performs.
Although video games might seem far afield from the worldchanging milieu, I think there are several really important things to note about the cell architecture chips coming out.
1) Game boxes continue to represent the cutting edge in terms of processing power. As the debate above already indicates, that processing power tends to be particular in nature (number-crunching), but it is frequently astonishing how much faster video game consoles are than more generalized processors - an order of magnitude or more. This alone makes them worth watching from a technology perspective.
2) More and more, game boxes are not just game boxes - they offer generalized computing services, provide a means of community through communications, and in some cases (Xbox) run a "conventional" operating system. When you consider that they only cost $149, they start to look intriguing in their own right.
3) Cell architecture is pretty radical, not for what it includes, but for what it leaves out, i.e., the elimination of many layers of abstraction that are normal in modern programming. It's not unlike a return to writing assembly code, for those who remember it. While this will require a major readjustment for programmers, it offers a lot of promise for significant performance increases.
4) Cell architecture moves distributed computing from a software abstraction (running a linux cluster, say) into hardware. Moreover, this distributed computing is highly generalized - rather than splitting a task between 2 processors, or across a known cluster size, you can bring to bear virtually limitless processing on demand. Again, the potential performance gains, in at least some types of computing, are enormous.
5) Power consumption is not only significantly lower, but becomes much more fine-grained, and can be managed even more efficiently than mobile processors allow.
6) As anyone who's used a computer for 20 years can attest, despite their ever-increasing speed, they still take just as long to start up and to launch applications. Software continues to increase in complexity as we increase our demands on it. Providing "real" or "human" experiences on a computer, including virtual environments and/or complex behaviors that simulate "thinking", will place ever greater demands on computers. CA offers a means to "leapfrog" Moore's Law by negating the importance of individual processor speed and bringing to bear a network of computing power.
As a side note, Richard Feynman gave a talk in the '60s in which he demonstrated that, at least in theory, the lower limit of scale in computer design is actually the atom. Quantum computing, though still nascent, is computing at such a scale. And quantum computing very much requires a major shift in thinking about software and how to write it. CA may very well provide an interesting bridge on the way toward computers built on a molecular scale; regardless, we have a long way to go before we really hit the wall on Moore's Law.
Without question CA could die on the vine as another seemingly good idea. But I absolutely think it holds the promise of creating whole new ways of computing, and of using computers we haven't even thought of yet. They aren't just (and some would say aren't even) faster processors, but a fundamentally different processing architecture, and thus by definition could be the killer app of "personal computing."
Here are two excellent articles on Cell Architecture for those interested in reading more:
"The '10x faster' claim is only true for some very specialized tasks (audio/video/encryption) that can use the SPEs (the eight green blocks in the picture). For most applications, those will just sit idle and consume die area and power."
Reminds me of the much hyped hyperthreading... ;)
Due to privacy concerns and battery life (until we get that solved), this sort of thing actually getting out there the way the IDFuel folk envisage is unlikely, which is a shame... but this Cell thing is quite cool nonetheless. ;)
One of my lecturers did their thesis on pervasive technologies and the like. I haven't had a chance to read through it yet, but it ties in with this somewhat.