Pixel acceleration on Pandora

Borderlands tech test: Graphics card benchmarks with Unreal Engine 3 - Geforces ahead

Our benchmark test with 12 graphics cards shows which model copes best with the modified Unreal Engine 3 of Borderlands.
Borderlands is set in a post-apocalyptic future on the planet Pandora. As a Soldier, Siren, Berserker or Hunter (or as a foursome in seamless co-op!) you go hunting for a legendary treasure. The story is simple, but the gameplay is highly addictive: Borderlands mixes a classic first-person shooter with a loot-driven RPG in the vein of Diablo. The result is fast, violent combat with constant leveling and ever more weapons to find - the only downer is the repetitive mission design.


Borderlands: UE3 meets concept art
As with its previous titles, developer Gearbox once again relies on Epic's Unreal Engine 3, albeit in a heavily modified form. The biggest difference to most UE3 games is the distinctive concept-art style: hand-drawn textures combined with black outlines produce a cel-shading look that is unique in this form and makes Borderlands stand out from the crowd. The rather simple visuals - which liven up considerably during fights - are backed by an impressive technology package: the modified UE3 offers parallax occlusion mapping, soft shadows, HDR rendering, screen space ambient occlusion and adaptive depth of field - the latter is a matter of taste, but we like it. DirectX 10 can be activated via the .ini file, and multisampling or supersampling can be forced via the driver (the executable has to be renamed to "UT3.exe" first).
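For reference, the DX10 switch sits in the engine configuration file in the user profile. The file path and key name below reflect the usual Unreal Engine 3 layout as we understand it for Borderlands, so treat the excerpt as a sketch rather than official documentation:

; (example - path and key name are our assumption of the standard UE3 layout)
; My Documents\My Games\Borderlands\WillowGame\Config\WillowEngine.ini
[SystemSettings]
; change AllowD3D10 from False to True to enable the DirectX 10 renderer
AllowD3D10=True

Save the file and restart the game afterwards; forcing MSAA or SSAA via the driver still requires the renamed "UT3.exe".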

Borderlands: Benchmark and general performance
As the test sequence we use the game chapter "Headstone Mine", which is far more demanding than, for example, the Badlands. As you can see from the following chart, even this scene is not a worst case: the first 12 seconds show our benchmark; for the rest of the run we fight a dozen enemies with a magnum, an electric-shock shotgun and a rocket launcher. The more enemies and effects appear on screen, the further the frame rate drops. While a Geforce GTX 285/1G at 1680 x 1050 with 4x MSAA and 16:1 AF mostly stays around 40 fps, at 1920 x 1200 with 4x MSAA and 16:1 AF the frame rate drops below 30 fps and aiming becomes spongy. Without MSAA, however, Borderlands runs absolutely smoothly.
[Chart: Borderlands frame rate progression with a Geforce GTX 285]
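Readers who log their own runs can derive comparable figures easily: average fps is the number of frames divided by the elapsed time, while the minimum is set by the slowest single frame. The following Python sketch is purely illustrative and assumes a hypothetical frame-time log in milliseconds; it is not the capture tooling used for this article.

# Illustrative only: compute average and minimum fps from per-frame
# render times in milliseconds, e.g. taken from a frame-time log.
def fps_stats(frame_times_ms):
    seconds = [ms / 1000.0 for ms in frame_times_ms]
    avg_fps = len(seconds) / sum(seconds)  # frames divided by elapsed time
    min_fps = 1.0 / max(seconds)           # slowest single frame
    return avg_fps, min_fps

# Example: mostly 25 ms frames (40 fps) with a few 40 ms spikes (25 fps)
sample = [25.0] * 400 + [40.0] * 20
avg, worst = fps_stats(sample)
print("avg: %.1f fps, min: %.1f fps" % (avg, worst))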
Borderlands: Benchmark results
Typically for the UE3, Gearbox's shooter-RPG shows a considerable fps drop once 4x MSAA is activated. Without anti-aliasing the frame rate rises by between 30 and more than 100 percent, on average by about 50 percent. Another peculiarity: Nvidia's Geforce models perform much better; a GTX 260-216 easily beats AMD's current flagship, the Radeon HD 5870. The UE3 generally favors Geforces, but in this case we assume that AMD has not yet optimized its drivers - even though we use the latest Catalyst 9.10 WHQL driver. We remain in contact with the Canadians; the test reflects the current situation.
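For illustration: a card averaging 30 fps with 4x MSAA would, at the average gain of 50 percent, reach about 45 fps without anti-aliasing - and, depending on the model, anywhere between roughly 39 fps (plus 30 percent) and more than 60 fps (plus 100 percent).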

Nvidia's current Geforce 195.39 beta driver offers SLI support for Borderlands.



More interesting articles about Borderlands:
Borderlands: Amazing screenshots plus details about Unreal Engine 3
Borderlands: System requirements revealed
Borderlands: New screenshots of the DirectX 10 shooter - Update: Behind the Scenes trailer




--
Author: Marc Sauter (Oct 30, 2009)







Comments (59)

Comments 56 to 59
Dave Baumann Re: Borderlands tech test: Graphics card benchmarks with Unreal Engine 3 - Geforces ahead
Junior Member
03.11.2009 17:16
Quote: (Originally Posted by chizow)
Actually after reading it again, Dave Baumann seems to be posing a less-than-honest hypothetical situation where AMD's "ISV" dev relations team secured a co-marketing deal and suggested DX10 optimizations first along with DX10.1 optimizations instead of simply focusing on DX10.1 optimizations that would've been limited to AMD parts, effectively locking Nvidia out.

Actually, the DX10 codepath was already in the title. Our ISV Dev Rel made some suggestions to improve that code as well as to further improve it with DX10.1 - the point being that we could have just made the changes for DX10.1 and left it at that; generally speaking, that's not the way our Dev Rel works, though.

This is not a hypothetical situation but an actual one. Every DX10 owner who played the game reaped the benefits.
chizow Re: Borderlands tech test: Graphics card benchmarks with Unreal Engine 3 - Geforces ahead
Senior Member
03.11.2009 14:41
Quote: (Originally Posted by Unregistered)
From what I am reading, a DX10 implementation was already in place, but AMD could have left that alone and just focused on the DX10.1 path.


Actually after reading it again, Dave Baumann seems to be posing a less-than-honest hypothetical situation where AMD's "ISV" dev relations team secured a co-marketing deal and suggested DX10 optimizations first along with DX10.1 optimizations instead of simply focusing on DX10.1 optimizations that would've been limited to AMD parts, effectively locking Nvidia out.

The reason I say it's less-than-honest is that DX10.1 is a strict superset of DX10 but offers few remarkable additional features. The only notable improvement it offers is a "gather" instruction which allows multiple samples to be fetched with a single call, which accounts for the slight performance gains with MSAA. A DX10.1 implementation is going to include all the features needed for a DX10 implementation, just as DX11 is going to fully support DX10 hardware with additional modular features.
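(To make the gather point concrete: where a DX10 shader needs four separate point-sampled fetches to read the four texels of, say, a shadow-map filter footprint, the DX10.1/Shader Model 4.1 Gather instruction returns all four values of one channel in a single fetch - fewer texture instructions for the same data.)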

Obviously devs will look to support the lowest common denominator that gives them the largest potential install base; this applies to just about every industry out there and is a large part of why PC gaming has largely been tied to DX9 (OS install base and consoles being the most limiting factors). Obviously this begins to change with Win 7 and DX11 both raising the bar, but the change isn't significant enough to shift the paradigm completely to widespread DX11 support in games; it simply opens the door for widespread DX10 adoption with the potential for additional DX11 features.
chizow Re: Borderlands tech test: Graphics card benchmarks with Unreal Engine 3 - Geforces ahead
Senior Member
03.11.2009 14:20
Quote: (Originally Posted by Unregistered)
You are talking about superiority while you are obviously forgetting:
1) The endless rebrands and renames. Is that how much Nvidia respects you?
2) The failed Vista drivers that drove people insane.
3) The failed bumps of their mobile chips. A cousin of mine is a proud owner of a laptop equipped with Nvidia's notorious chips and the laptop is now a paperweight. The funny thing is that they just had their first baby, so there are too many expenses and he can't afford to buy a new one, and he needs it for his work. He has solely Nvidia to thank for that!
4) The launch prices of the GTX 280 and GTX 260! LOLOLOL! Is that how much Nvidia respects you? They showed exactly what their intentions are and people are still bowing before them. I mean COME ON!
5) No DX10.1 and the benefits it could bring to the gaming market, i.e. 20% better anti-aliasing performance. Now they remember to bring us AA in UT3-engine-based games? ZOMG!
6) Showing some paper-constructed mock-up of a graphics card for Fermi? lol, who does that? Still no DX11, btw.
7) PhysX you say? For what? Some asswipe papers being kicked around in Batman and some smoke and steam that completely disappear without PhysX? Why do you think they force the steam and smoke to disappear? Could they have just left them there without physics properties? Of course they could. We have seen gazillions of cubic meters of smoke and steam in hundreds of games and now Nvidia decided that we need PhysX to see them moving left and right. And that with a 50% PERFORMANCE HIT for some crappy effects. Will you people get serious?
8) 3D Vision? Oh, now I have to change my monitor because Nvidia wants me to AND I have to pay like 250 euros for their crappy glasses? Do you see what Nvidia is doing? PAY PAY PAY and while you are at it, PAY some more!

So... WHO exactly is full of empty promises and fails to deliver? Certainly not ATI!

Please, I've been monitoring this industry since the S3 ViRGE and I don't need Nvidia to tell me what's right and what's not!

Cheers!


1. Is that a joke? Obviously you're forgetting the countless rebrands of R200 and R300, not to mention HD 2900 to HD 3870 and 4870 to 4890; hell, even the 5870 could be considered a rebrand given it's just the equivalent of a 4870 X2 fused into a single die.

2. The Vista issues that were largely corrected by MS in their numerous significant WDDM-specific hot fixes (at least 4). I'm sure we would've seen more ATI problems had they actually had any DX10 parts of note on the market, but of course, they didn't (see R600).

3. Yep, every company manages a turd of a chip every once in a while; I just find it funny how AMD supporters always bring up the Geforce mobile problems yet fail to mention anything about that fireball of a chip ATI sold Microsoft for the Xbox 360. $1 billion for RROD compared to $200 million for the Geforce mobile problems...hmmmm, lol. As for your cousin, if his laptop were an affected part, it would've been covered under warranty.

4. Both companies will charge what the market will bear; this is simple economics. Are you claiming ATI/AMD has never charged huge premiums on its parts when it was able to? For someone who has been "following the industry since the S3 ViRGE", I guess you must've slept through the X800 XTX, X850 XT PE and X1900 XTX launches, which were all $500-600+ at launch. Hell, even the 4870 X2 launched at $600, and the 2900 XTX "Draggin' Head" was supposed to launch for $500+ before it got canned. Why? Because their relative performance lead in the market allowed for such a premium. Claiming either side is innocent of actions expected of any going concern in a capitalist economy is simply ignorant and hypocritical, but not surprising when dealing with AMD and its supporters!

5. Why would Nvidia promote features its hardware cannot support, especially such unremarkable ones as those offered by DX10.1? And no, this isn't the same as PhysX, as I'm sure that's where your argument will turn next; there's nothing preventing AMD from supporting PhysX on its own hardware other than its own agenda and reasons not to.

6. Uh, everyone does that; mock-ups are standard operating procedure in numerous industries. I guess you've never been to a car show or a trade show like CES? Do you think all those motherboard or consumer electronics prototypes are working samples? As for no DX11 support, are you claiming Fermi won't support DX11? It just shows the extent to which certain people will attempt to discredit and undermine the competition, I guess!

7. PhysX is clearly the biggest advancement in PC gaming since DX9 and will only continue to grow and impress with DX11/DirectCompute. Obviously AMD also sees value here or they wouldn't spend so much effort trumpeting their own ever-changing, vaporware alternatives. PhysX clearly improves every game it's implemented in more than any other feature promised in DX11, but it's funny you emphasize the performance hit incurred. You do realize tessellation in DX11 results in a nearly identical performance hit when enabled, right? Check out the Unigine Heaven benchmark results...

8. Yes, with Nvidia you're expected to pay for additional features, but at least you get that baseline level of support - working drivers in popular games - included free of charge. With AMD and its fans, apparently everything is supposed to come free and everyone else is supposed to fix their problems for them. I guess it should be no surprise that in the end most of those problems simply don't get fixed. As for 3D Vision, the reviews speak for themselves; obviously other industries like TV and film also see value in it. 3D Vision's requirements are simply pushing the envelope and helping to advance technologies that will benefit all consumers down the road, most notably true 120 Hz LCDs.

Which brings us full circle about who is failing to deliver. Nvidia is offering support, solutions and innovative new features for its hardware, sometimes at a premium. AMD offers excuses, half-assed workarounds and empty promises, all bundled with the power of community! I guess you get what you pay for!

See, I've been following the industry far longer than you have, and yes, there was a time I actually preferred ATI and AMD hardware, but then I came to realize it was worthwhile to simply pay a little more for a superior product with superior support.
