Geforce G210: Nvidia's first DirectX 10.1 card reviewed

Nvidia's 40 nanometer generation has to show what it is capable of. PC Games Hardware tests whether the switch to 40 nm, DirectX 10.1 and more was worth the effort.
It took some time before Nvidia released its first graphics cards with DirectX 10.1 support, but with the transition to TSMC's 40 nanometer process the Geforce lineup is now, almost two years after AMD's Radeon family, compatible with the (currently) most recent API.

For Nvidia the upgrade from DirectX 10 to DirectX 10.1 still isn't really important. That is hardly surprising, since only the new GPUs and some editions for OEM partners can make full use of the features offered by the API. The high-end desktop graphics cards of the Geforce GTX series remain limited to DirectX 10.0.

Perhaps this is also the reason why Nvidia is unusually tightfisted with product samples and even technical details. All information in this article is therefore based on what can be found on the Nvidia website, on tests we ran ourselves and on general conclusions.

Geforce G210: Technical Details
We got our test sample, a Geforce G210, from an auction on eBay. The card was originally part of an HP PC system and was produced by the OEM manufacturer Pegatron. The Geforce G210 is the smallest version of the new series, which is based on the GT218 chip. In the table below we compare its technical details to those of the Geforce 9400 GT as well as the Radeon HD 3470 and HD 4350.



[Picture: GT218 chip next to a ruler]
Of course we also took a look at the "naked" card without its cooler: The graphics chip is the smallest ever measured in the PC Games Hardware test lab at 57.3 square millimeters - unfortunately there is no information about the number of transistors. This is 20 square millimeters less than the RV710 (55 nm) of the HD 4350. That leaves only the question of whether Nvidia was able to squeeze enough performance into such a small chip.

Geforce G210: Cooling, Loudness and Power Consumption
The card itself is quite small - the board is only 168 millimeters long, and the height would also suit a low-profile card if a matching slot cover were available (retail cards can of course offer one). Our Geforce G210, however, is a full-height version since it provides Dual Link DVI, D-Sub VGA and HDMI outputs.

An active single-slot cooler with a 45 millimeter fan handles temperature regulation. The warm air is blown towards the (closed) slot cover and thus passes over the G210's capacitor array - this probably doesn't do the lifetime of those components any favors.

Below is a chart listing our results for the temperature, loudness and power consumption tests. For comparison we added the results of a passively cooled Radeon HD 4350 from MSI.

Results                              Geforce G210 (Pegatron OEM)   Radeon HD 4350 (MSI)
Loudness Idle                        0.5 sone / 25.1 dB(A)         0 sone (passive)
Temperature Idle                     42 °C                         n.a.
Power consumption Idle               8.7 watts                     7.2 watts
Loudness Race Driver Grid            1.3 sone / 32.1 dB(A)         0 sone (passive)
Temperature Race Driver Grid         61 °C                         n.a.
Power consumption Race Driver Grid   19.5 watts                    16.3 watts
Loudness Furmark                     2.8 sone / 42.5 dB(A)         0 sone (passive)
Temperature Furmark                  65 °C                         n.a.
Power consumption Furmark            21.4 watts                    17.6 watts


During our tests we noticed an interesting behavior of the idle mode: Besides the full 3D frequencies, GPU-Z reveals two additional levels: 405/405 MHz for core and VRAM as well as 135/135 MHz for the sleep mode. Apparently, however, the RAMDAC frequency is linked to the core clock: As soon as Dual Link DVI speed is required (two transmitters at 165 MHz each) - as is the case for resolutions above 1,920 x 1,200 pixels - the clocks are not reduced below 405/405 MHz and the 135 MHz level is skipped. We noticed the same phenomenon with a 30 inch display connected via Dual Link DVI as soon as the resolution exceeded 1,280 x 800 pixels. With a 24 inch display at 1,920 x 1,200, on the other hand, the lowest frequency level was reached. All in all, this means (a small read-out sketch follows after the list):
• 30 inch LCD with Dual Link DVI connection and resolution bigger than 1,280 x 800: 405/405 MHz
• 24 inch LCD with Single Link DVI connection and resolution up to 1,920 x 1,200: 135/135 MHz
• CRT monitor with analog connection and resolution up to 1,920 x 1,200 @ 60 Hz: 135/135 MHz; at higher resolutions, or higher refresh rates at 1,920 x 1,200: 405/405 MHz.
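
For illustration only, here is a minimal sketch of how such clock levels could be polled from the command line. It assumes a recent driver whose nvidia-smi tool offers the --query-gpu interface - something the 2009-era drivers used for this review do not provide; our actual observations above were made with GPU-Z.

    # Illustration only: poll the current core and memory clocks to see which of
    # the levels described above (135/135, 405/405 or full 3D clocks) is active.
    # Assumes a recent driver whose nvidia-smi offers the --query-gpu interface;
    # the 2009-era drivers used for this review do not, so treat this as a sketch,
    # not as part of our actual test procedure.
    import subprocess
    import time

    def read_clocks():
        """Return (core_mhz, mem_mhz) for the first GPU as reported by nvidia-smi."""
        out = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=clocks.gr,clocks.mem",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        core, mem = (int(v.strip()) for v in out.strip().splitlines()[0].split(","))
        return core, mem

    if __name__ == "__main__":
        # Sample once per second; with a Single Link display the readings should
        # drop to the lowest level after a short idle period.
        for _ in range(10):
            core, mem = read_clocks()
            print("core: %d MHz, memory: %d MHz" % (core, mem))
            time.sleep(1)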

However, this has only a small effect on the idle power consumption: Compared to the value listed in the chart above, the power consumption at 405/405 MHz is only 0.5 watts higher and thus lies almost within measuring tolerance.

A bug in our test sample's fan control was far more annoying: As soon as the graphics card was put under load and the otherwise fine-grained control raised the fan speed above its 24 percent idle setting, the speed was never reduced again until the whole system was rebooted.

Geforce G210: Benchmarks
Given that the card is aimed at the absolute entry-level market, we decided to adjust our game benchmarks a little. We use older games without any modifications and run them at a width of 1,280 pixels without anti-aliasing or anisotropic filtering. To ensure comparability with our normal benchmarks we still use maximum graphics settings and our usual test system, even if this setup might seem inappropriately fast for the G210 and the HD 4350 we used for comparison. For these graphics cards to be held back by the processor, we would probably have to use one of the slowest dual core CPUs currently available.




The HD 4350 beats the Geforce G210 in all tests. The higher the demands on shader performance, the bigger the gap between the two competitors - even though the 20 fps mark is only exceeded in two cases.

Geforce G210: DirectX 10.1 benefit?
Of course we also wanted to know whether Nvidia's cards benefit from DirectX 10.1 in the same way AMD's Radeon models do. By now there are several games that utilize DirectX 10.1, among them Battleforge and Tom Clancy's HAWX. Only in Battleforge were we able to record a DirectX 10.1 benefit - a typical issue at this performance level: We have already shown that slower graphics cards gain less from DirectX 10.1 than current high-end cards.

Although HAWX activated its DirectX 10.1 mode, there was no measurable performance benefit. We therefore ran the integrated Battleforge benchmark at low details (shaders: medium) but with 2x MSAA and SSAO enabled - both features benefit from DirectX 10.1.




Here the Radeon also beats the Geforce G210, but both cards benefit from DirectX 10.1: the Radeon runs about 7 percent faster and the Geforce even about 10 percent. Nvidia's "Mission DirectX 10.1" was thus successful, but given the actual frame rates in these games the benefit remains theoretical.
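
For reference, the percentage gains quoted above follow from the raw frame rates with a simple calculation. The sketch below uses purely hypothetical fps values as placeholders; the measured numbers are shown in the chart, not reproduced here.

    # Illustrative only: how a DirectX 10.1 gain like the ones quoted above is
    # derived. The fps values are hypothetical placeholders, not measurements.
    def dx101_gain(fps_dx10, fps_dx101):
        """Relative speedup of the DX10.1 path over the DX10 path, in percent."""
        return (fps_dx101 / fps_dx10 - 1.0) * 100.0

    print("%.1f percent faster" % dx101_gain(14.0, 15.4))  # prints: 10.0 percent faster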

Geforce G210 (Pegatron): Conclusion
The graphics chip, which isn't intended for gaming, doesn't convince in every respect. Although superior gaming performance wasn't to be expected given its specifications, the HD 4350 was still faster in our benchmarks. Either way, both cards are only suitable for older games or for accelerating the modern user interfaces of current operating systems. Video fans might also be interested in the HD acceleration both cards offer. The PhysX advantage many Geforce cards have over the Radeons doesn't come into play here: With only 16 shader units the chip is too slow for GPU PhysX, so the drivers simply don't offer the option.

The technology transition to DirectX 10.1, on the other hand, convinces: Like the Radeon, the Geforce benefits from Microsoft's API. The implementation of the 40 nanometer technology could have been better, however - at least as far as our test sample is concerned: Active cooling that gets annoyingly loud in some situations shouldn't be necessary for a chip drawing about 22 watts. We also couldn't measure a noticeable power consumption advantage over the 55 nanometer Radeon cards.

All in all, the G210 is thus a card that will primarily make Nvidia happy: With a die size of less than 60 square millimeters, low prices and high volumes should be possible - at least until the OEMs start asking for DirectX 11 cards. If you want to upgrade your system or need a card for video playback and don't care about gaming performance at all, the Geforce G210 is worth considering - but look for a passively cooled version.


--
Author: Carsten Spille (Sep 03, 2009)







Comments (13)

ruyven_macaran Re: Geforce G210: Nvidia's first DirectX 10.1 card reviewed
Super Moderator
22.02.2010 13:40
I would suggest taking a look at AMD/ATI's offerings, as all contemporary Radeon cards feature an onboard sound solution via HDMI. Nvidia cards usually need a digital audio feed from the mainboard or a secondary soundcard to pass any sound through HDMI.
As for quality or performance: I'm not very much into multimedia, but aside from little differences in hardware-accelerated HD decoding (which should be of little relevance on modern PCs that usually have enough CPU power anyway), I would not expect any differences in usability. So noise, power consumption (and perhaps driver support/stability/known compatibility problems) should be your only concerns.
ruyven_macaran Re: Geforce G210: Nvidia's first DirectX 10.1 card reviewed
Super Moderator
04.01.2010 00:45
@dunno: While the 9400 GT lags slightly in terms of shader units times clock, it has twice the number of rasterisation units and roughly 80 percent more memory bandwidth, so it's quite likely to beat the G210 in almost every application.

@unregistered: The system would easily pass the official minimum requirements of FS X. However, FS X is known to require (besides a good chunk of CPU power - I'm not sure about multi-core support, but if all four cores are used you should be fine - and loads of RAM - 4 GB are definitely not too much) a bit more GPU power than the G210 can offer, and as much VRAM and VRAM bandwidth as it can get (it gets hardly any of the latter on a G210).
If you want to use Nvidia, I would suggest a GTS 250 as the minimum, and it might be a good idea to look into some benchmarks to see whether more than 1 GB of VRAM would be beneficial.
pcghx_Carsten Re: Geforce G210: Nvidia's first DirectX 10.1 card reviewed
Junior Member
07.09.2009 22:53
What exactly is it that you want to know from DXVA Checker? It lists quite a few things across multiple frames...
