Following up on our previous article, today Kotaku (linked above) reports that according to inside sources Sony may be ditching the Cell Processor for the PlayStation 4 console.
Below are the details, to quote: "The PlayStation 4 will not use Sony's Cell processor nor any possible successor to the vaunted chipset that was introduced to the world through the PlayStation 3, gaming industry sources tell Kotaku.
What we're hearing from sources follows a Forbes rumor last week that chip-maker AMD would make the graphics chip for the PS4, a shift from the PS3's use of a graphics chip from AMD rival Nvidia.
The abandonment of the Cell architecture would thrill the many game developers who have struggled with the complex chipset, but it could also be viewed as the admission of a mistake.
Cell was the pet project of PlayStation creator Ken Kutaragi, who dreamed that the chip, a "Power Processing Element" married to eight "Synergistic Processing Elements", would make the PS3 the most impressive gaming console ever. He spoke of a home equipped with multiple devices powered by Cell, all of them linking to each other to increase the computational power driving any one of the devices.
Cell was not the revolution Sony hoped and hyped that it would be. It also never managed to make the PS3 appear significantly more powerful than the year-older Xbox 360. That could have been the Cell's fault, or simply the result of development decisions that compelled game creators to make their games run on both the PS3 and the generally-more-popular Xbox 360.
But with no Cell or Cell successor in the PS4, what would Sony do? Here's where the reporting turns to speculation. One theory I've heard is that AMD will provide both the CPU and GPU for the PS4, meaning that AMD, not Sony, would engineer the main processing and graphics chips for the machine. Should AMD be doing that, they could go with the AMD Fusion architecture, which puts CPU and GPU on the same chip.
AMD has already been putting chips like this out (one was considered for the MacBook Air), which would enable Sony to turn to developers and say: you could be working with the PS4 architecture right now; just work on an AMD Llano chip or something.
Would developers like that? They'd have to prefer it to Cell and, what do you know, here's one of gaming history's best programmers, id's John Carmack, saying in an interview with PC Perspective last year that AMD Fusion-style chip architecture is "almost a foregone conclusion" for the future of computing.
A Sony rep declined to comment on this story, citing the company's policy not to comment on rumors and speculation."
Stay tuned for more PS3 Hacks and PS3 CFW news, follow us on Twitter and be sure to drop by the PS3 Hacks and PS3 Custom Firmware Forums for the latest PlayStation 3 scene updates and homebrew releases!
The PS3 has been more stable during its life than the 360 has been, and the PS3 is smaller than most home theater receivers, so I fail to see the large size as a valid argument.
As far as graphics go, ATI/AMD is not a bad choice if they intend to keep advancing the parallel processing architecture. And given the advances in CPU architecture since the PS3 was developed, the main CPU should be able to calculate physics fast enough to negate the need to run physics calculations on the graphics processor. Theoretically, the graphics processor will be able to work faster without the added overhead of calculating physics.
If they work it right, they could make every 240Hz television operate like the dedicated PlayStation television that allows two players to each see their own full screen at the same time without seeing the other player's screen. Hell, they could probably allow that same capability for four simultaneous players if the television has a fast enough refresh rate.
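The arithmetic behind that comment can be sketched quickly: a display that time-multiplexes full-screen views among players divides its refresh rate evenly between them. A minimal sketch (the function name is illustrative, not from any Sony API):

```python
# Per-player effective refresh rate when a TV alternates frames
# between simultaneous full-screen views (as the dedicated
# PlayStation television does for two players).

def per_player_hz(tv_refresh_hz: float, players: int) -> float:
    """Each player sees every Nth frame, so the panel's refresh
    rate is split evenly among the N full-screen views."""
    return tv_refresh_hz / players

# A 240Hz panel split two ways still gives each player 120Hz;
# split four ways, each player gets a standard 60Hz picture.
print(per_player_hz(240, 2))  # 120.0
print(per_player_hz(240, 4))  # 60.0
```

So a 240Hz set has headroom for four players at 60Hz each, which is what the comment above is getting at.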
I really don't want another console the size of a house that guzzles electricity, is incredibly loud, and takes another two years after release before a stable revision shows up. Just make a console that works this time, Sony.
Let's just hope it has a proper graphics card this time and that they don't pull an MS and use an entry-level GPU like a 6670, which is 2010 technology and only a fraction as powerful as the 6970, which is in turn stomped by the 7970. Let's also hope they decide to include some RAM this time, especially considering DDR3 costs less than dirt right now.