New tech info about Metro 2033

Exclusive tech interview on Metro 2033

Metro 2033 is scheduled for release in mid-March. PC Games Hardware obtained new information about the game and the technical aspects of its 4A Engine from Chief Technical Officer Oles Shishkovstov.
Unfortunately, we initially received incorrect information about who answered our questions: it was not producer Dean Sharpe, but Chief Technical Officer Oles Shishkovstov.

Metro 2033 will be released for PC and Xbox 360 on March 16 in North America and on March 19 in Europe. PC Games Hardware had the chance to interview the Chief Technical Officer of the forthcoming action game Metro 2033. Oles Shishkovstov revealed several interesting pieces of information. He told us, for example, why the developers at 4A Games decided to build a whole new engine and what advantages the PC version of Metro 2033 will have. Other topics include support for multi-core CPUs and DirectX 11. You might also want to take a look at the system requirements of Metro 2033.

PCGH: You announced that you programmed a brand new engine for Metro 2033. Did you develop the engine code from scratch, or can some remnants of code from the latest iteration of the X-Ray Engine be found in your new base technology?
Oles Shishkovstov: Definitely from scratch. Knowing X-Ray architecture (which happened to be designed by me as well in 2001-2002), it became apparent that it just doesn't scale into the future. The weakest points of X-Ray were its inherent inability to be multi-threaded, the error-prone networking model, and simply awful resource and memory management which prohibited any kind of streaming or simply keeping the working set small enough for (back then) "next-gen" consoles.

So we've started from a clean sheet of paper + a bunch of middleware solutions to hook things up relatively quickly.
PCGH: Why did you decide against licensing a commercial product like Unreal Engine 3, or even the X-Ray Engine that quite a few members of your team are familiar with? Why was it necessary to develop your own in-house technology, and why is it tailored to the needs of the title?
Oles Shishkovstov: Actually, at some point we thought about licensing UE3 (X-Ray was clearly out of the question). After evaluating all the pros and cons we found that it was not as good as advertised: it was too bloated and too "old-school". And worst of all, it demanded a lot more people and their time to get simple things done. And we became really confident we'll make a better engine/development environment in every aspect.
PCGH: You announced that your game will be developed for PC, Xbox 360 and PS3 and that the PC version will come with DX11 and GPU PhysX support. So we assume that the game won't be developed as a pure cross-platform title. Is that right? Do you develop the game for each platform separately?
Oles Shishkovstov: Why? I don't understand your logic. The same code executes on all of the mentioned platforms. They use the same content. It's the engine architecture which bends towards the platform it is running on. Of course there are minor tweaks and optimizations all over the place, but we develop only one game.

Actually designers work on their PCs and don't bother with other platforms, they don't even have a single switch that is platform specific - it all just works and works extremely well. We are quite proud of that.
PCGH: Besides DX11 and GPU PhysX, are there any other visual or technical differences between the PC version and the console versions? Which technical features can't be realized on the consoles?
Oles Shishkovstov: When you have more performance on the table, you can either do nothing, as you say (and as most direct console ports do), or you add features. Because our platforms got equal attention, we took the second route. Naturally most of the features are graphics related, but not all. The internal PhysX tick-rate was doubled on PC, resulting in more precise collision detection and joint behavior. We "render" almost twice the number of sounds (all with wave-tracing) compared to the consoles. Those are just a few examples, so you can see that not only the graphics get a boost. On the graphics side (and that's not a complete list):
• most of the textures are 2048^2 (consoles use 1024^2)
• the shadow-map resolution is up to 9.43 Mpix
• the shadow filtering is much, much better
• the parallax mapping is enabled on all surfaces, some with occlusion-mapping
• we've utilized a lot of "true" volumetric stuff, which is very important in dusty environments
• from DX10 upwards we use correct "local motion blur" (sometimes called "object blur")
• the light-material response is nearly "physically-correct" on higher quality presets
• the ambient occlusion is greatly improved (especially on higher quality presets)
• sub-surface scattering makes a lot of difference on human faces, hands, etc.
• the geometric detail is somewhat better - because of different LOD selection (not even counting DX11 tessellation)
PCGH: If you had to advertise the visuals of Metro 2033, what would you mention? Can you list some very modern and advanced rendering techniques that your renderer utilizes (HDR, SSAO, Parallax Occlusion Mapping, Tone Mapping and other post effects...)? Don't be afraid to use technical terms.
Oles Shishkovstov: First and foremost: 3D Stereo. The game was developed in 3D and for 3D. That's the only game developed that way as far as I know. Playing the game under DX11 in 3D Stereo is just mind-blowing - it's much, much more amazing than watching Avatar in IMAX.

Technically interesting things include deferred reflections, analytical anti-aliasing (AAA), screen-space sub-surface scattering for skin shading, etc, etc.

For example, even with multisample anti-aliasing disabled, the player still gets a properly anti-aliased picture on higher-quality presets. It's only near-vertical or near-horizontal edges that look better with traditional MSAA.
PCGH: When developing, do you try to leverage very modern PC technologies like dual-GPU rendering on SLI or CrossFire systems? In other words: as far as overall performance is concerned, can players increase it remarkably by buying a second graphics card or a card with two GPUs?
Oles Shishkovstov: Yes we constantly test performance on SLI setups. You should expect linear scaling with the number of GPUs in the system. Of course there are limitations imposed by drivers which are outside of our control.
PCGH: By now multi-core CPUs have become very popular and the number of players with such machines is rapidly increasing. Did you integrate multi-core support into the engine from the beginning?
Oles Shishkovstov: Yes, the engine was architected to be multi-threaded from the start. That's the only sensible route to go. You just cannot add multi-threading later in the development cycle or it will be horribly sub-optimal.
PCGH: How many cores are supported, and what is the expected performance gain from 2, 4 or even 8 cores?
Oles Shishkovstov: We support at least two cores, and up to whatever count you have. You should expect linear scaling with the number of cores, when you aren't GPU limited.
PCGH: What different systems run in separate threads? How are the threads scheduled?
Oles Shishkovstov: Every engine that assigns specific systems to specific threads is just badly architected, usually because of historical roots. That architecture doesn't scale. The only way to be future proof is the so-called task model, where threads are just workers and there are thousands of tasks organized in a kind of run-time generated dependency tree. We do exactly that.
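
For readers curious what such a task model looks like in practice, here is a minimal, hypothetical sketch in C++ (illustrative only, not 4A Engine code): worker threads pull ready tasks from a shared queue, and a task only becomes runnable once all of the tasks it depends on have finished.

#include <atomic>
#include <condition_variable>
#include <deque>
#include <functional>
#include <mutex>
#include <thread>
#include <vector>

struct Task {
    std::function<void()> work;            // the actual job
    std::atomic<int>      pendingDeps{0};  // unfinished dependencies
    std::vector<Task*>    dependents;      // tasks waiting on this one
};

class TaskScheduler {
public:
    explicit TaskScheduler(unsigned workers = std::thread::hardware_concurrency()) {
        for (unsigned i = 0; i < workers; ++i)
            threads_.emplace_back([this] { WorkerLoop(); });
    }
    ~TaskScheduler() {
        { std::lock_guard<std::mutex> lock(mutex_); quit_ = true; }
        cv_.notify_all();
        for (auto& t : threads_) t.join();
    }
    // Declare "after" as depending on "before"; set up before submitting roots.
    static void AddDependency(Task& before, Task& after) {
        after.pendingDeps.fetch_add(1, std::memory_order_relaxed);
        before.dependents.push_back(&after);
    }
    // Hand a task with no unfinished dependencies to the worker threads.
    void Submit(Task& task) {
        { std::lock_guard<std::mutex> lock(mutex_); ready_.push_back(&task); }
        cv_.notify_one();
    }
private:
    void WorkerLoop() {
        for (;;) {
            Task* task = nullptr;
            {
                std::unique_lock<std::mutex> lock(mutex_);
                cv_.wait(lock, [this] { return quit_ || !ready_.empty(); });
                if (quit_ && ready_.empty()) return;
                task = ready_.front();
                ready_.pop_front();
            }
            task->work();
            // Finishing this task may make some of its dependents runnable.
            for (Task* next : task->dependents)
                if (next->pendingDeps.fetch_sub(1, std::memory_order_acq_rel) == 1)
                    Submit(*next);
        }
    }
    std::vector<std::thread> threads_;
    std::deque<Task*>        ready_;
    std::mutex               mutex_;
    std::condition_variable  cv_;
    bool                     quit_ = false;
};

A production engine would use lock-free queues and per-frame task pools, but the dependency-counting idea is the same.
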
PCGH: Does your engine profit from SMT/Hyperthreading, or do you recommend turning it off for maximum performance?
Oles Shishkovstov: Definitely - any properly written engine code will benefit from SMT/Hyperthreading. For example, on Xbox 360 we get almost a 50% speedup from it. So if your CPU has this feature, don't turn it off!
PCGH: We've read that your game offers an advanced physics simulation as well as support for Nvidia's PhysX (GPU-calculated physics). Can you tell us more details?

Does the regular, CPU-calculated physics affect visuals only, or is it also used in gameplay terms, like enemies getting hit by shattered bits of blown-away walls and the like?
Oles Shishkovstov: Yes, the physics is tightly integrated into game-play. And your example applies as well.
PCGH: Besides GPU PhysX support, why did you decide to use Nvidia's physics middleware instead of other physics libraries like Havok or ODE? What makes Nvidia's SDK so suitable for your title?
Oles Shishkovstov: We chose the SDK back when it was still the NovodeX SDK (that's even before they became AGEIA). It was a high-performance and feature-rich solution. One of the reasons we did this: they had a complete and customizable content pipeline back then, which is important when a relatively small team is writing a new engine.
PCGH: What are the visual differences between physics calculated by CPU and GPU (via PhysX, OpenCL or even DX Compute)? Are there any features that players without an Nvidia card will miss? What technical features cannot be realized with the CPU as "physics calculator"?
Oles Shishkovstov: There are no visible differences, as they both operate on ordinary IEEE floating point. The GPU just allows more compute-heavy stuff to be simulated, because GPUs are an order of magnitude faster in data-parallel algorithms. As for Metro 2033 - the game always calculates rigid-body physics on the CPU, but cloth physics, soft-body physics, fluid physics and particle physics run on whatever the users have (multiple CPU cores or the GPU). Users will be able to enable the more compute-intensive stuff via an in-game option regardless of what hardware they have.
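
As a rough illustration of that split (hypothetical code, not the 4A Engine's or PhysX's actual API), the routing might look like this: rigid-body simulation always runs on the CPU, while the purely visual effect simulations run wherever they can and are gated by the in-game option.

// Hypothetical sketch of the CPU/GPU physics split described above.
enum class PhysicsDevice { CpuMultiCore, Gpu };

struct PhysicsConfig {
    bool advancedEffectsEnabled; // the in-game option mentioned by Shishkovstov
    bool gpuPhysicsAvailable;    // e.g. a CUDA-capable Nvidia card was detected
};

struct Frame { /* per-tick simulation state would live here */ };

void SimulateRigidBodies(Frame&) { /* game-play collisions, joints - always CPU */ }
void SimulateEffects(Frame&, PhysicsDevice) { /* cloth, soft bodies, fluids, particles */ }

void StepPhysics(Frame& frame, const PhysicsConfig& cfg)
{
    // Game-play relevant physics stays on the CPU on every platform, so the
    // results are identical no matter which graphics card is installed.
    SimulateRigidBodies(frame);

    if (cfg.advancedEffectsEnabled) {
        // The heavier, purely visual simulations go to the GPU when one is
        // available and are otherwise spread across the CPU cores.
        const PhysicsDevice device = cfg.gpuPhysicsAvailable
                                         ? PhysicsDevice::Gpu
                                         : PhysicsDevice::CpuMultiCore;
        SimulateEffects(frame, device);
    }
}
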
PCGH: You promised that Metro 2033 will support the DX11 API that comes with Windows 7 and will, for example, utilize hardware tessellation.
Oles Shishkovstov: Yes.
PCGH: What were the deciding technical advantages of the DX11 API/Shader Model 5?
Oles Shishkovstov: Although the API is still awkward from pure C++ design perspective, the functionality is here. I really enjoy three things: compute shaders, tessellation shaders and draw/create contexts separation. The most significant of those three things - compute shaders. They enable a whole new class of algorithms and performance optimizations.
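
For context, dispatching a compute shader under Direct3D 11 takes only a few calls. The sketch below shows generic D3D11 usage (not Metro 2033 code) and assumes the device context, the compiled shader and the output UAV already exist.

#include <d3d11.h>

// Run a compute shader over a full-screen target, one thread group per
// 16x16 pixel tile (the group size is declared in the HLSL via
// [numthreads(16, 16, 1)]).
void DispatchFullscreenCS(ID3D11DeviceContext* ctx,
                          ID3D11ComputeShader* cs,
                          ID3D11UnorderedAccessView* output,
                          UINT width, UINT height)
{
    ctx->CSSetShader(cs, nullptr, 0);
    ctx->CSSetUnorderedAccessViews(0, 1, &output, nullptr);

    ctx->Dispatch((width + 15) / 16, (height + 15) / 16, 1);

    // Unbind the UAV so the same resource can be sampled as a texture later.
    ID3D11UnorderedAccessView* nullUav = nullptr;
    ctx->CSSetUnorderedAccessViews(0, 1, &nullUav, nullptr);
}
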
PCGH: In what way does it allow you to optimize or simplify the rendering process in Metro 2033?
Oles Shishkovstov: The user can disable the additional DX11 features, and in that case he or she will find that performance under DX11 is better than, for example, under DX10.

That's a result of many small tweaks which are now available under DX11.
PCGH: Do you use DX11 multi-threading to lighten the load on the CPU?
Oles Shishkovstov: If you mean command-buffer chunks, then no. Being a cross-platform title, we've implemented similar stuff on the engine side with comparable (if not better) performance.
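
A bare-bones illustration of that engine-side approach (hypothetical types, not 4A Engine code): worker threads record platform-agnostic draw commands into their own buffers during scene traversal, and a single render thread later turns them into native API calls in a fixed order.

#include <cstdint>
#include <vector>

struct DrawCommand {
    uint32_t mesh;         // handles into the engine's resource tables
    uint32_t material;
    uint32_t instanceData;
};

class CommandBuffer {
public:
    // Called from exactly one worker thread, so no locking is required.
    void Record(const DrawCommand& cmd) { commands_.push_back(cmd); }
    const std::vector<DrawCommand>& Commands() const { return commands_; }
    void Clear() { commands_.clear(); }
private:
    std::vector<DrawCommand> commands_;
};

// The render thread consumes the per-thread buffers in a fixed order, so the
// frame is deterministic regardless of how the work was distributed.
void SubmitAll(const std::vector<CommandBuffer>& perThreadBuffers)
{
    for (const CommandBuffer& buffer : perThreadBuffers)
        for (const DrawCommand& cmd : buffer.Commands())
            (void)cmd; // translate into the platform's native draw call here
}
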
PCGH: In what way will the DX11 visuals differ from the graphics rendered with DX10(.1)/DX9 hardware, or will DX11 just speed up the rendering process in Metro 2033?
Oles Shishkovstov: If you disable the DX11-specific effects, you should get exactly the same picture as in the DX10 code path. The only difference will be the performance gain.
PCGH: What are the graphical features that can only be rendered with shader model 5 hardware?
Oles Shishkovstov: We have almost half of the scene tuned for and rendered with tessellation and displacement mapping. That makes a big difference in image quality, although it comes at some performance cost. Also, don't forget to look at the diffusion DOF - that's important to get a "cinematic look and feel".
PCGH: Assuming that Metro 2033 will support DX9 too, when do you think game development will be at a juncture where it's more viable to put all the effort into one rendering path using only DirectX 11 (with down-level paths) and drop support for XP?
Oles Shishkovstov: Yes, Metro supports DX9 too, and we've put a lot of effort into providing features that are usually considered DX10+ exclusive. For example, soft particles are available in DX9 as well, while other implementations usually need DX10-level hardware to do it.

Actually it's not that difficult to support older operating systems. We'll drop support when it accounts for less than one percent of our potential user base.
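
For reference, the core of the soft-particles trick mentioned above is just a depth-based fade. The sketch below expresses it in C++ for readability (in practice it lives in the pixel shader, and the fade distance is an assumed tuning value, not one taken from the game).

#include <algorithm>

// Fade a particle pixel out as it approaches the opaque scene behind it,
// hiding the hard intersection line between the billboard and the geometry.
float SoftParticleFade(float sceneDepth,     // linear depth of the opaque scene
                       float particleDepth,  // linear depth of the particle pixel
                       float fadeDistance)   // distance over which to fade out
{
    const float separation = sceneDepth - particleDepth;
    return std::min(std::max(separation / fadeDistance, 0.0f), 1.0f);
}
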
PCGH: Can you comment on the rumor that Metro 2033 is secretly being treated as a launch title for Nvidia's new generation of DX11 cards (codename: Fermi)? Is there any truth to it? Have you already had the chance to test the DX11 qualities of the GF100 with the DX11 version of your game?
Oles Shishkovstov: We will not comment on rumors :) Yes we were able to constantly test Metro 2033 on Fermi.


--
Author: Kristoffer Keipp (Mar 08, 2010)







Comments (18)

Comments 15 to 18
Yapa Re: Exclusive tech interview on Metro 2033
Senior Member
10.03.2010 17:34
Nvidia has the TWIMTBP program, which helps developers during game development to better optimise the game for PC and Nvidia hardware. Nvidia invests in this to sell more games and keep PC gaming alive.

Yes, they DO pay developers to optimise their games for Nvidia and use PhysX etc... but what is wrong with that? In the end we get a better PC port... after all, most games are now console ports, and without Nvidia helping and paying developers to make the game better (such as AA, PhysX etc.) we would have rather boring and average console ports.

If ATI wanted to do this they could, but they don't want to invest... at least not yet, because they were the underdogs. Now ATI is on top for a while, and they have lifted their game with much better driver release times.

I've also seen the ATI/AMD logo on a few games recently; two well-known examples are STALKER: CoP and Supreme Commander 2, both of which feature the ATI/AMD logo when starting the game...

Yapa
ruyven_macaran Re: Exclusive tech interview on Metro 2033
Super Moderator
10.03.2010 13:01
There are actually several TWIMTBP titles that performed better on ATI than on contemporary Nvidia cards.
(Obviously there are opposite examples as well - after all, it does make a difference when a developer gets full support, (future) hardware, extensive bug testing and so on from one company and virtually nothing from the other. Not all developers spend the time and money to compensate for AMD's lack of developer support.)
chizow Re: Exclusive tech interview on Metro 2033
Senior Member
09.03.2010 23:10
Quote: (Originally Posted by connos)
It's not Nvidia's AA solution, it's a common solution that is used in all Unreal 3 engine games.


No, it is not used in all Unreal 3 engine games, as Unreal 3 does NOT support AA by default in DX9. While both AMD and Nvidia have implemented driver workarounds that allow forced AA via driver in DX9, Nvidia specifically wrote the in-game AA implementation for Batman. Nothing is preventing AMD users from using the common solution of forced driver AA and nothing is compelling Nvidia to implement this specific feature for AMD hardware gratis.

At some point AMD fans and consumers need to hold AMD accountable for things they say and promises made in the press. Otherwise you'll continue to get the same sub-par level of support and game support for your hardware.

www.bit-tech.net/bits/int...
Quote: (bit-tech)
bit-tech: Given Nvidia licensed its own MSAA technology for Unreal Engine 3, why don't you just do the same thing? Put your code in as well and when the game detects your vendor ID it uses this code instead.

AMD's Richard Huddy: We're currently working with Eidos and we want that to be in there in a future update. That's not a commitment to it, but we are working with Eidos to make it happen because I believe it's in every consumer's interest.
