238w ago - Will Intel's Atom chip make it into smartphones? That's an interesting question.
And it's one the company seemed intent on addressing at its investor conference earlier this week, and in particular at a session on the next generation of Atom chips given by Anand Chandrasekher, head of the company's Ultra Mobility Group.
The next chip is called Moorestown, and it's due out early next year. While this chip is likely to be used mainly in products such as netbooks, Chandrasekher said it would also allow for broader use in "mobile internet devices." That chip will be followed in 2011 by Medfield, which is the version really aimed at smartphones.
The smartphone market has been asking for a number of features that Intel chips haven't been able to provide, Chandrasekher acknowledged: lower total power, lower power in use, all-day battery life, performance, and broadband connectivity. Intel is taking a number of chip generations to get there.
The existing Atom chip (known as Menlow) brought active power down to one-tenth that of previous Intel chips, which were primarily designed for notebooks. The Moorestown version focuses instead on bringing down the power the chip uses when it is idle (which is most of the time). He said Moorestown...
251w ago - Intel has dropped more details of its upcoming 32nm processor lineup, codenamed Westmere and featuring on-die graphics circuitry.
Rather than a wholesale architecture change, the move to 32nm is a die-shrink of the existing 45nm Nehalem architecture. Intel will start manufacturing the new chips at the end of the year, for mass release in early 2010.
"It's Nehalem upgraded. Nehalem - the largest design change - is the tock. Westmere is the tick we're announcing today," said Steve Smith, Director of Operations for Intel's Digital Enterprise Group, referring to Intel's tick-tock processing architecture strategy.
The first 32nm chips will pair two processor cores with integrated graphics in the same package: Clarkdale for the desktop and Arrandale for the thin-and-light notebook market.
The 32nm Westmere processor core has been combined with a 45nm die containing integrated graphics and an integrated memory controller. "Intel has had, for the last decade or so, integrated graphics in our chipsets. So we're taking that, currently shipping in our 4 series chipset, and integrating that," added Smith.
Before that, though, there will be some more 45nm chips. "We have a variety of products that we expect to be shipping in the second half of 2009," said Smith. These long-awaited 45nm quad-core...
252w ago - Killzone Community Editor Victor Zuylen shared the following via PS Blog today:
With Killzone 2 receiving lots of positive attention from the press, I figured now would be a good time to give you a taste of the bonus content which can be unlocked on Killzone.com through the game.
After you've uncovered a piece of Helghast intel in Killzone 2's single player campaign, you can connect to Killzone.com to claim a special reward.
These unlockable rewards come in various shapes and sizes - some are simply printable versions of the propaganda posters you see in-game, others contain details on the wildlife of the Maelstra Barrens or pages from a Helghast soldier's secret journal.
(Please note: To view these documents you will need Adobe Reader.)
If you're good enough, you can even unlock and download physical objects from the Killzone universe. "That's impossible," I hear you cry at your monitor, "You can't download physical objects through the Internet - let alone objects from a fictional...
252w ago - Following up on the article from a few days ago: Sony will reportedly use Intel's Larrabee graphics chip in its upcoming PlayStation 4.
To quote: We know for a fact that Jeffrey Katzenberg at DreamWorks likes Larrabee, a lot. That apparently was one of the reasons DreamWorks dropped Advanced Micro Devices.
So, chalk that up as one big win for Intel's somewhat-murky next-generation graphics chip due late this year or 2010.
Now Sony? A report this week in the U.K.-based technology Web site The Inquirer claims Sony favors Larrabee over Nvidia for its PlayStation 4. (The other major piece of silicon used in the current PlayStation is a Cell processor developed jointly by IBM, Sony, and Toshiba.)
For the record, an Intel spokesperson said the company "cannot comment on rumor or speculation." Sony in Europe reportedly didn't mince words, however, comparing the report to some of the 20th century's great fiction. Though another reported comment from Sony is more insipid and PR-like.
252w ago - The Inquirer (linked above) claims to have been informed by a Sony rep at CES that Intel bought the win, and will design the PlayStation 4 GPU. Until confirmation directly from Sony is available we'll label this as a rumor, but to quote:
The nice Sony engineering lady at CES told us that Intel essentially bought the win; a theoretically good architecture, no imminent threat of going bust, and not being hated by Sony all contributed too.
With a couple of deliverables satisfied, the PS4 GPU belongs to Intel. No word if this is going to be the entire architecture, CPU as well, or not. That, from what we are told, is not final yet.
Moving on to the XBox3 GPU, also due in 2012, we hear strong rumors (but have not confirmed yet) that it is an ATI design. Given the close ties between ATI and MS over DirectX, the bad blood between Nvidia and MS over the grand Nvidia DX10 neutering, plus memories of the XBox1, this is not a surprise either. The bed was made for short-term profits years ago; time to lie in it.
That brings us to the Wii2, also due around the same time. Given that the ArtX / ATI guys have won just about every Nintendo GPU contract since Nintendo went 3D, this one appears to be a no-brainer. If you take into account that it will likely be evolutionary, it is almost assuredly an ATI win as...