I did a great deal of Google searching for information on this subject and came up empty-handed. So in this topic I would like to reach out to the Oculus Rift community and discuss, preferably with evidence, the effect the Oculus Rift has on CPU performance in games.
In games like Arma 3 and GTA 4, CPU performance becomes the bottleneck long before the GPU does on most high-end systems today. I believe this is due to the massive amount of data the CPU has to prepare and send to the GPU to render. You can see this by increasing the draw distance in these games: it puts a significantly higher load on the CPU while GPU usage falls off dramatically.
My concern is: what happens when we suddenly have to render a second perspective? How does this scale against CPU performance? If a single frame on a 2D monitor has the CPU bottlenecking at, say, 70 fps, what happens when we introduce the Oculus Rift and must render an additional view from a slightly offset perspective? Does it cut the frame rate clean in half, down to 35? Or does the game engine know to 'copy and paste' the accumulated geometry data and send it off to the GPU again with only a slight hit?
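To make the question concrete, here is a rough back-of-the-envelope model in Python. The numbers and the CPU-time split are my own assumptions, not measurements: frame rate is taken to be limited by whichever of the CPU or GPU takes longer per frame, and two extremes are compared, the engine redoing all CPU-side work for the second eye versus sharing the simulation and only duplicating draw-call submission.

# Back-of-the-envelope model of a CPU-bottlenecked frame (all numbers hypothetical).
# Assumption: frame rate is limited by whichever of CPU or GPU takes longer per frame.

def fps(cpu_ms, gpu_ms):
    """Frame rate when CPU and GPU work overlap; the slower one limits throughput."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 14.3   # ~70 fps worth of CPU work per frame (simulation + draw-call submission)
gpu_ms = 7.0    # GPU has headroom, i.e. we are CPU bottlenecked

print(f"2D baseline:          {fps(cpu_ms, gpu_ms):5.1f} fps")

# Worst case: the engine re-runs all CPU-side work for the second eye.
print(f"Stereo, CPU work x2:  {fps(cpu_ms * 2, gpu_ms * 2):5.1f} fps")

# Best case: simulation is shared; only draw-call submission (say ~30% of CPU time,
# a made-up split) is duplicated, while GPU work roughly doubles.
submit_fraction = 0.3
stereo_cpu = cpu_ms * (1 + submit_fraction)
print(f"Stereo, shared sim:   {fps(stereo_cpu, gpu_ms * 2):5.1f} fps")

Under these made-up numbers the worst case lands almost exactly at half the frame rate (~35 fps), while the shared-simulation case loses only around a quarter, which is exactly the spread this testing is meant to pin down.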
If anyone out there with a Rift would be willing to do some testing, I think it would be a great contribution to the community and the world really, in determining what kind of performance users can expect to have on their computer when they pick up an Oculus Rift.
Testing this is rather difficult, as it requires a relatively controlled environment. That means no multiplayer, and repeating the same scenario over and over to get results. It also helps to narrow things down to specific games, in particular games that are notorious for CPU bottlenecks, since we are not concerned with GPU usage in these tests. I will provide a simple setup using the two games mentioned above to show you how to test for CPU bottlenecks.
Required Software:
  • Third-party stereoscopic 3D driver to inject into the game for Oculus Rift support
  • MSI Afterburner or EVGA PrecisionX
  • Arma 3 and/or GTA 4
  • Any other game that introduces a CPU bottleneck
Once you have all these items, the next step is to set up the GPU monitoring program (the MSI/EVGA OSD) to display GPU usage in-game. This is done through RivaTuner Statistics Server, which comes with both programs. Go into the Monitoring options for whichever program you chose, and make sure that GPU Usage and FPS are being monitored and shown in the OSD. Test this by entering a 32-bit game and looking for the information in the top left of the screen.
Now that you have the diagnostics ready, it's time to set up the game scenario and create a CPU bottleneck. In either game this is very simple to do. Here are the settings for each game that create a definitive CPU bottleneck no matter what GPU or CPU you own.
Arma 3:
Object Quality = Ultra
Terrain Quality = Ultra
Shadow Quality = Low
View Distance = 12,000m
Object Distance = 5,000m
SSAO = Disabled
Anti-aliasing = Disabled
GTA 4:
Shadow Quality = Low
View Distance = 100
Detail Distance = 100
Traffic Density = 100
With the above settings, you should be completely CPU bottlenecked no matter what setup you are running. To confirm it (and you almost certainly will be), look at the OSD in the top-left corner: GPU usage should be well below 99%. Most likely it will be very low, somewhere around 15-50% depending on how powerful your graphics card is; the faster the card, the lower the usage. Either way you should be CPU bottlenecked, with the GPU under a very light load.
Now that you have your CPU bottleneck, the next step is to create a scenario that you can return to with the Oculus Rift equipped and compare 2D performance to stereoscopic 3D. Log your 2D frame-rate numbers and GPU usage, quit the game, and reload it with the Oculus Rift enabled. Again, note the frame-rate and GPU usage.
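As a minimal sketch of how the two logged runs could be summarized (the numbers and field names below are placeholders, not real results):

# Compare logged 2D vs. stereo (Rift) runs of the same CPU-bottlenecked scenario.
# All values are placeholders to show the calculation, not real data.

runs = {
    "2D":     {"avg_fps": 70.0, "gpu_usage_pct": 40},
    "stereo": {"avg_fps": 48.0, "gpu_usage_pct": 65},
}

loss_pct = (1 - runs["stereo"]["avg_fps"] / runs["2D"]["avg_fps"]) * 100
print(f"Frame-rate loss going to stereo: {loss_pct:.1f}%")

# If GPU usage is still well below ~99% in the stereo run, the CPU remains the
# bottleneck and the loss reflects extra CPU-side work per frame, not GPU load.
still_cpu_bound = runs["stereo"]["gpu_usage_pct"] < 95
print("Still CPU bottlenecked in stereo:", still_cpu_bound)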
Once you have some data, please post it here. The goal is to find out how much performance is lost when you are already in a CPU-bottlenecked situation. I believe this will be the much larger issue in hitting a 60 fps-per-eye target: you can nearly always scale GPUs in SLI/Crossfire to meet demand, but you cannot increase CPU performance beyond overclocking and/or buying a faster chip, and we are already seeing CPU bottlenecks at the top end of the hardware spectrum in 2D mode.
If you already have some information to share on this topic, please provide it here as well. I am eager to see what the findings are. Thanks.

ARMA 3 Benchmarked: GPU & CPU Performance (Sep 17, 2013)

Bohemia Interactive has been around for more than a decade, earning awards including 'Best PC Game Developer of the Year' for its 2001 PC exclusive Operation Flashpoint: Cold War Crisis, but you're probably most familiar with the developer through its Arma franchise, which has just received its third major entry.

Like most of its previous releases, Bohemia Interactive is offering Arma 3 exclusively for PC, and the studio has no doubt been working tirelessly to ensure a smooth launch. As part of its development process, the company held lengthy alpha and beta phases that ran for more than six months starting back in March.

Built with Real Virtuality 4, Arma 3 expands on its predecessors' realistic military experience with features including an enhanced mission editor, DirectX 10 and 11 support, improved physics across the board, underwater environments, volumetric clouds, better lighting and a 20km view distance with photo-realistic terrain.

Arma 3 offers the largest official terrain in the franchise, with approximately 270 km² of ground area on the Aegean island of Altis and another 20 km² on Stratis. Between its expansive world and graphical advancements, it's no surprise that the developer's recommended specifications are set relatively high.

Although you can purportedly get by with a dual-core Intel or AMD processor, 2GB of memory and a 512MB GeForce 8800 GT or Radeon HD 3830, the developer recommends playing with at least a Phenom II X4 980 or Core i5-2300, 4GB of RAM and a 1GB GTX 560 or HD 7750 if you intend to play with DirectX 11 effects.

Testing Methodology

Naturally, that's precisely how we intend to test the game as we explore the performance of more than two dozen DX11 graphics card configurations from AMD and Nvidia. We'll use the latest drivers and each GPU setup will be driven by an overclocked Core i7-4770K (4.0GHz) to remove the potential of our CPU bottlenecking the GPU(s).

We will use Fraps to measure frame rates during 90 seconds of gameplay from Arma's Infantry Showcase. Before starting the test, we plan to turn the difficulty down to the easiest level ('Recruit') and then further handicap the AI by setting its skill level to 0, which should let us get past the AI without dying and keep the results consistent.
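For anyone wanting to reduce a Fraps benchmark log to comparable numbers, here is a small Python sketch; it assumes the 'frametimes' CSV has a header row and a cumulative timestamp in milliseconds as the last column of each row, so check your own log's layout before trusting it.

import csv

# Reduce a Fraps "frametimes" log to average and minimum fps.
# Assumption: each data row ends with a cumulative timestamp in milliseconds
# since the start of the benchmark; verify this against your file.

def fps_stats(path):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip header row
        stamps = [float(row[-1]) for row in reader if row]
    frame_ms = [b - a for a, b in zip(stamps, stamps[1:])]
    avg_fps = 1000.0 * len(frame_ms) / (stamps[-1] - stamps[0])
    min_fps = 1000.0 / max(frame_ms)
    return avg_fps, min_fps

if __name__ == "__main__":
    avg, low = fps_stats("arma3 frametimes.csv")  # hypothetical file name
    print(f"avg: {avg:.1f} fps, min: {low:.1f} fps")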

Our test begins in a hilly wooded area with four fellow soldiers who accompany us down the hill and out from the cover of the trees, through a valley and toward a small town, where we take enemy fire that we ignore in the interest of continuing the test.

We'll run Arma 3 in DX11 mode at three common desktop resolutions: 1680x1050, 1920x1200 and 2560x1600. For the ultra-quality test we'll use the ultra preset without changing anything except disabling v-sync; the very high-quality test uses the very high preset, again with v-sync disabled.

As usual, we'll be looking for an ideal of 60fps or faster.

  • HIS Radeon HD 7970 GHz (3072MB) Crossfire
  • HIS Radeon HD 7970 GHz (3072MB)
  • HIS Radeon HD 7970 (3072MB)
  • HIS Radeon HD 7950 Boost (3072MB)
  • HIS Radeon HD 7950 (3072MB)
  • HIS Radeon HD 7870 (2048MB)
  • HIS Radeon HD 7850 (2048MB)
  • HIS Radeon HD 7770 (1024MB)
  • HIS Radeon HD 6970 (2048MB)
  • HIS Radeon HD 6870 (1024MB)
  • Gigabyte GeForce GTX Titan (6144MB)
  • Gainward GeForce GTX 780 (3072MB)
  • Gainward GeForce GTX 770 (2048MB) SLI
  • Gainward GeForce GTX 770 (2048MB)
  • Gainward GeForce GTX 760 (2048MB)
  • Gainward GeForce GTX 680 (2048MB)
  • Gigabyte GeForce GTX 670 (2048MB)
  • Gainward GeForce GTX 660 Ti (2048MB) SLI
  • Gainward GeForce GTX 660 Ti (2048MB)
  • Gigabyte GeForce GTX 660 (2048MB)
  • Gainward GeForce GTX 650 Ti Boost (2048MB)
  • Gainward GeForce GTX 650 Ti (2048MB)
  • Gigabyte GeForce GTX 580 (1536MB)
  • Gigabyte GeForce GTX 560 Ti (1024MB)
  • Gigabyte GeForce GTX 480 (1536MB)
  • Intel Core i7-4770K (3.50GHz)
  • x2 8GB Crucial DDR3-2133 (CAS 11-12-11-24)
  • Asrock Z87 Extreme9 (Intel Z87)
  • OCZ ZX Series 1250w
  • Crucial m4 512GB (SATA 6Gb/s)
  • Microsoft Windows 8 64-bit
  • Nvidia Forceware 326.80 Beta
  • AMD Catalyst 13.10 (Beta)