TheInquirer (linked above) claims to have been informed by a Sony rep at CES that Intel bought the win and will design the PlayStation 4 GPU. Until confirmation directly from Sony is available we'll label this as a rumor, but to quote:
The nice Sony engineering lady at CES told us that Intel essentially bought the win, a theoretically good architecture, no imminent threats of going bust, and not being hated by Sony all contributed too.
With a couple of deliverables satisfied, the PS4 GPU belongs to Intel. No word if this is going to be the entire architecture, CPU as well, or not. That, from what we are told, is not final yet.
Moving on to the XBox3 GPU, also due in 2012, we hear strong rumors (but have not confirmed yet) that it is an ATI design. Given the close ties between ATI and MS over DirectX, the bad blood between Nvidia and MS over the grand Nvidia DX10 neutering, plus memories of the XBox1, this is not a surprise either. The bed was made for short-term profits years ago; time to lie in it.
That brings us to the Wii2, also due around the same time. Given that the ArtX / ATI guys have won just about every Nintendo GPU since they went 3D, this one appears to be a no-brainer. If you take into account that it will likely be evolutionary, it is almost assuredly an ATI win as well, but we haven't confirmed it either way.
Stay tuned for more PS3 Hacks and PS3 CFW news, follow us on Twitter and be sure to drop by the PS3 Hacks and PS3 Custom Firmware Forums for the latest PlayStation 3 scene updates and homebrew releases!
It is true that Intel's onboard graphics is, well, pathetic, but it was never meant to be a 3D powerhouse, but instead an option for those who do not want to buy an expensive graphics card. The current GPU which Intel is working on seems to be very powerful (on paper at least), you can read about it on Wikipedia: http://en.wikipedia.org/wiki/Larrabee_(GPU)
As for backwards compatibility with the PS3, I think it's something we can bear to lose if it means a cheaper console and better games for the PS4. Hopefully Sony will get its act together and put more research into making Blu-ray cheaper; only then will it become a more mainstream platform.
The only graphics cards I have ever seen Intel make are integrated, and these are the worst graphics cards out there. Performance-wise, a GeForce 4 MX still beats most of Intel's graphics cards. Even if Intel magically made something decent, simply saying that they are going to use Intel is going to make Sony lose face.
I totally agree about the Cell, though: it hasn't delivered anything that a normal processor couldn't do, it costs an arm and a leg to use, and it's difficult to develop for. They need to ditch it for the PS4. While this would mean no PS3 backwards compatibility, Sony has been jerking people around this generation so much in that regard that it shouldn't matter.
The Xbox1 was never supposed to have DX10; the article is just saying that Microsoft has been giving Nvidia a hard time over the ability to use DX10. With the first Xbox, Nvidia ripped off Microsoft with massive costs for a sub-par GPU, which is why they went to ATi for the Xbox 360. The same thing is happening now: Nvidia is ripping off Sony for the RSX, and eventually Sony will go to Intel or ATi, the latter being the better option. If Sony does go Intel, the GPU will most probably be an Intel Larrabee.
For the Xbox 3 it is obvious Microsoft will go to ATi, unless they are stupid. I think the PS4 should lose the Cell processor, since it's only good for number crunching and is worthless in a game console; instead, move to a more mainstream Intel processor like the Core 2 or i7, or a custom-made processor based on the Core 2 or i7.
The major issue with the current-gen consoles is the amount of hardware problems they are having. I'm beginning to think that both MS and Sony will eventually end up dropping their current-gen consoles and releasing new ones. Hopefully it doesn't come down to this and the consoles go on strong.
As for the Wii2, there's no doubt in my mind that they will continue to add useless features and cheap hardware, and they'll make a fortune doing it.