What hardware is necessary for Left 4 Dead?

Left 4 Dead: GPU and CPU benchmark review

Valve's new shooter is now available via Steam and from some retailers. PCGH has benchmarked the game to review the performance of processors and graphics cards in Left 4 Dead.
[Picture: Left 4 Dead with maximal details]
Left 4 Dead: GPU and CPU benchmarks: Introduction
Like most of Valve's games since Half-Life 2, Left 4 Dead uses the familiar Source Engine, which has been upgraded with new effects borrowed from the movies. Among those are the somewhat controversial film grain and a reduced color palette. How those new features affect the graphics can be seen in our article: Left 4 Dead: Graphics settings compared.

Left 4 Dead: GPU and CPU benchmarks: Benchmark sequence
Our benchmark sequence "Final Battle" takes place in the last fight the characters have to survive. Due to the masses of zombies, the real-time demo mainly stresses the processor. But flames, dynamic shadows and detailed textures are also challenging for the graphics card, especially at high resolutions.

This benchmark is a worst-case scenario: if you get more than 50 fps here, the rest of the game will run smoothly.
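A comparable worst-case measurement can be reproduced at home with the Source engine's built-in demo tools. A minimal sketch, assuming the developer console is enabled (Options > Keyboard > Advanced); the demo name "finalbattle" is a placeholder:

```
// Record your own demo during the finale, then type "stop" when done
record finalbattle
// Replay the demo as fast as the hardware allows; the engine prints
// the average frame rate to the console when the run finishes
timedemo finalbattle
// Alternatively, show a live fps counter while playing
cl_showfps 1
```

Because a timedemo replays identical input every run, it is the fairest way to compare settings or hardware changes against each other.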

Left 4 Dead: GPU and CPU benchmarks: Test system
• Motherboard: Intel X48 / AMD 790FX
• RAM: 2x 2,048 MiByte DDR2-800

CPUs
• Core 2 Extreme QX9650 (3.0 GHz, 12 MiByte L2 Cache)
• Core 2 Duo E8400 (3.0 GHz, 6 MiByte L2 Cache)
• Core 2 Quad Q6600 (2.4 GHz, 8 MiByte L2 Cache)
• Core 2 Duo E6850 (3.0 GHz, 4 MiByte L2 Cache)
• Core 2 Duo E6600 (2.4 GHz, 4 MiByte L2 Cache)
• Pentium Dual-Core E2160 (1.8 GHz, 1 MiByte L2 Cache)

• Phenom X4 9950 (2.6 GHz, 2 MiByte L2/L3 Cache)
• Phenom X4 9650 (2.3 GHz, 2 MiByte L2/L3 Cache)
• Phenom X3 8650 (2.3 GHz, 1.5 MiByte L2 Cache, 2 MiByte L3 Cache)
• Athlon 64 X2 6400+ (3.2 GHz, 2x 1 MiByte L2 Cache)
• Athlon 64 X2 5000+ (2.6 GHz, 2x 512 KiByte L2 Cache)
• Athlon 64 4000+ (2.6 GHz, 512 KiByte L2 Cache)
• Athlon 64 3200+ (2.0 GHz, 512 KiByte L2 Cache)

Graphics cards
(Clock speed: GPU/VRAM)
• Radeon HD 4870 (750/1,800 MHz, 512 MiByte)
• Radeon HD 4850 (625/993 MHz, 512 MiByte)
• Radeon HD 3870 X2 (2x 825/900 MHz, 2x 512 MiByte)
• Radeon HD 3870 (775/1,125 MHz, 512 MiByte)
• Radeon HD 3850 (669/830 MHz, 256 MiByte)
• Radeon X1900 XTX (650/775 MHz, 512 MiByte)

(Clock speed: GPU/Shader/VRAM)
• Geforce GTX 280 (602/1,296/1,107 MHz, 1,024 MiByte)
• Geforce GTX 260-216 (576/1,242/999 MHz, 896 MiByte)
• Geforce 9800 GTX+ (738/1,863/1,107 MHz, 512 MiByte)
• Geforce 9800/8800 GT (602/1,512/900 MHz, 512 MiByte)
• Geforce 8800 GTX (575/1,350/900 MHz, 512 MiByte)
• Geforce 7900 GTX (650/800 MHz, 512 MiByte)

Driver
• Geforce 180.43 Beta (default A.I. with VSync off)
• Catalyst 8.11 WHQL (default A.I. with VSync off)



--
Author: Marc Sauter (Nov 19, 2008)







Comments (13)

Comments 10 to 13
L4Dfan L4D is very scalable
Junior Member
20.02.2009 22:20
L4D works very well on old hardware. I have a single-core Athlon 64 4000+ with 1 GB RAM and an Nvidia 7600 GT 256 MB graphics card.

For online play I set everything to medium (no shadows, no bump mapping, no rain). This gives me a smooth 44fps average at 1600x1200 (measured by doing a timedemo of some intense action). Still looks nice - perfect for multiplayer.

For a more atmospheric single player experience I set the shader up to high to get shadows etc. This gives me 30fps average at 1024 x 768. Still very playable - although it can get choppy during horde attacks.

I recently upgraded the graphics card to a Radeon HD 4670 512 MB - only £50. Wow! It makes a really big difference. Now I can run *everything* at max, 16x aniso + 8x AA, 1600x1200, and still get 30 fps - *and* the frame rate seems to hold up better in horde attacks now.

I think the HD 4670 is an ideal upgrade for an old single-core processor. It won't dramatically increase your fps because the processor is the bottleneck - but it allows you to use high quality settings at the same fps - and it's very cheap.

However, my main point is you can certainly enjoy the game on an old single core machine. Thanks Valve!!
mathesar Re: Left 4 Dead: GPU and CPU benchmark review
Junior Member
24.11.2008 08:10
The review states: With a Geforce 7 the shooter is almost unplayable.

This isn't exactly true... I have an AMD X2 4800+ / 2 GB / Geforce 7800 GTX 512 MB / WinXP and I'm able to run L4D at 1280x800 with all settings on High and 4x FSAA / 2x AF at a decent framerate. It's not silky 60 fps smooth all the time, but it still runs smoother than the Xbox 360 version at these settings, and I've been playing it online every night since its release.

The biggest performance hit seems to occur whenever I have my flashlight on and there are tons of zombies onscreen; as soon as I turn off the flashlight the framerate jumps way up.
trancemode Re: Left 4 Dead: GPU and CPU benchmark review
Junior Member
20.11.2008 20:38
That's odd - somehow I get more than the review shows.
I am using:
CPU - Intel Core 2 Quad Q6600
GPU - EVGA Nvidia GTX 280
RAM - Corsair 4 GB DDR2-800
MOBO - Abit Fatal1ty 650i chipset
HDD - Western Digital VelociRaptor 10,000 rpm
CPU water cooling + GPU water cooling
Monitor - ViewSonic 2240 22-inch 2 ms gaming monitor
OS - Windows XP SP3, DX9
Nvidia driver - 180.48 (new)

At 1680 resolution, with the recommended best settings in Left 4 Dead, I checked my fps and got a stable 130-150 fps; even in mass fights with all the effects it doesn't drop below 80 fps. But the heat is a lot, lol.

Maybe it just depends on everyone's specs.

Copyright © 2014 by Computec Media GmbH