

DirectX 11 by Nvidia

Geforce GTX 480 and GTX 470 reviewed: Fermi performance benchmarks

Nvidia's Geforce GTX 480 and GTX 470 have arrived. PC Games Hardware puts the new DirectX 11 graphics cards to the test.


After a delay of several months and countless rumors, Nvidia's Fermi architecture has finally reached the retail market. The first cards of the new DirectX 11 generation are the Geforce GTX 480 and the Geforce GTX 470. They are aimed at the high-end segment, where they have to deal with AMD's ATI Radeon HD 5800 series. PC Games Hardware tests which of the two factions wins this face-off.

Geforce GTX 480 and GTX 470 reviewed: The architecture
We have summarized the most important technical specifications of the Geforce GTX 400 series to give you a quick overview. All in all, a Fermi chip consists of four GPCs (Graphics Processing Clusters), six memory controllers including ROPs and Level 2 cache, the Host Interface for communication with the PC, and a control center called the Giga Thread Engine. The GF100 has three billion transistors - about 40 percent more than ATI's Cypress chip with 2.15 billion. Each GPC has its own rasterizer, a geometry unit called the PolyMorph Engine, and four shader multiprocessors (SMs) with 32 ALUs and 4 TMUs each. The memory interface is 384 bits wide.
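The per-cluster figures above multiply out to the full chip's shader count. A quick sanity check of that arithmetic (our calculation, using only the unit counts stated in the text):

```python
# Sketch: shader total implied by the GF100 layout described above.
# Figures taken from the text: 4 GPCs, 4 shader multiprocessors (SMs)
# per GPC, 32 ALUs per SM.
gpcs = 4
sms_per_gpc = 4
alus_per_sm = 32

total_alus = gpcs * sms_per_gpc * alus_per_sm
print(total_alus)  # 512 - the fully enabled chip's shader count
```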

For the launch of the Fermi architecture Nvidia releases the Geforce GTX 480 and the GTX 470, and a rumor that has been making the rounds on the Internet lately has proven to be true: Neither of the two cards has a fully enabled GF100 chip with 512 ALUs. The GTX 480 has 480 shader units and 60 texture units, while the GTX 470 offers 448 ALUs and 56 TMUs.
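Working back from those unit counts shows how many shader multiprocessors are actually enabled on each card. This is our own back-of-the-envelope calculation, assuming 32 ALUs and 4 TMUs per SM (our reading of the GF100 layout; the full chip would have 16 SMs):

```python
# Sketch: derive enabled SM counts from the shipping cards' ALU totals.
# Assumptions (not stated verbatim in the article): 32 ALUs and
# 4 TMUs per SM; a fully enabled GF100 has 16 SMs.
ALUS_PER_SM = 32
TMUS_PER_SM = 4

for name, alus in [("GTX 480", 480), ("GTX 470", 448)]:
    sms = alus // ALUS_PER_SM
    print(f"{name}: {sms} of 16 SMs enabled, {sms * TMUS_PER_SM} TMUs")
# GTX 480: 15 of 16 SMs enabled, 60 TMUs
# GTX 470: 14 of 16 SMs enabled, 56 TMUs
```

The derived TMU counts (60 and 56) match the figures quoted above, which supports the 4-TMUs-per-SM assumption.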

In terms of clock speeds Nvidia doesn't take big steps compared to the predecessor, the GT200. The GTX 480 runs at a core frequency of 700 MHz and the shader ALUs run at 1400 MHz. The GDDR5 video memory has been set to 1846 MHz. Given that the 0.4 nanosecond VRAM used for the card is rated for 2500 MHz, it almost seems like Nvidia wants to keep the maximum power consumption down.
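The headroom argument can be checked with two quick calculations of our own (not from the article): the memory bandwidth that follows from the bus width and data clock, and the clock rating implied by the 0.4 ns chip timing:

```python
# Sketch: arithmetic behind the memory figures above (our calculation).
# GDDR5 transfers data on both edges of the quoted 1846 MHz data clock.
bus_bits = 384
mem_clock_mhz = 1846

bandwidth_gbps = bus_bits / 8 * mem_clock_mhz * 2 / 1000  # GB/s
print(round(bandwidth_gbps, 1))  # 177.2 GB/s

# A 0.4 ns cycle time corresponds to the VRAM's 2500 MHz rating:
cycle_ns = 0.4
print(round(1000 / cycle_ns))  # 2500 MHz
```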







--
Author: Spille, Vötter, Sauter (Mar 27, 2010)






Copyright © 2015 by Computec Media GmbH