Amidst talk that the PC's days as a major gaming platform could be numbered, blockbuster titles such as Assassin's Creed are welcome signs pointing to just the opposite.
Sadly, it is very likely that this game will be remembered for a controversy surrounding a strange decision: removing support for DirectX 10.1, which transferred an initial performance advantage for ATI's Radeon cards over to Nvidia. Did Nvidia have a hand in this one? We took a closer look to find out.
In the beginning, everything looked perfect. The DX10.1 API used in Assassin's Creed enabled anti-aliasing in a single pass, which allowed ATI's Radeon HD 3000 hardware (which supports DX10.1) to flaunt a competitive advantage over Nvidia's cards (which support only DX10.0).
But Assassin's Creed had problems. We noticed various reports citing issues such as broken widescreen scaling, camera loops, and crashes, mostly on Nvidia hardware.
Ubisoft became aware of these complaints, which ultimately led to the announcement of a patch. According to Ubisoft Montreal, this patch will remove support for DX10.1, and it is precisely this decision that set Internet forums on fire.
So, what is it that convinced Ubisoft to drop the DirectX 10.1 code path? Here is the official explanation: