Windows – Intel HD Graphics 4000 is used instead of Nvidia GeForce 630M for an old game

dual-gpu, gpu, laptop, windows, windows-8

I recently bought a Toshiba Satellite P855-32X laptop.

The first thing I did after setting up this laptop was, of course, to update all drivers to the newest versions.
I configured everything I needed, and it works like a charm.

I can play all the newest games without problems, but a game called "Dark Reign 2" [DirectX 7] runs at ~20–25 FPS max (no matter whether settings are at their lowest @ 640×480 or highest @ 1366×768), while on my older laptop with an i3-370M and an AMD Radeon HD Mobility 5740 [512 MB GDDR3 VRAM] it runs at 120 FPS (though I always limit it in-game to 60).

My laptop has a right-click (RMB) option on applications that lets you specify which GPU to run an application with:
[screenshot: context menu for choosing the GPU to run an application with]

But after launching DxDiag with either option, I always get this screen:
[screenshot: DxDiag output]

I tried looking into the Nvidia Control Panel, but it doesn't have as many options as it should:

[screenshot: Nvidia Control Panel]

I also made sure my "default GPU" is set to the Nvidia GPU, yet the problem still persists.

I tried disabling my integrated Intel HD Graphics:

[screenshot: Intel HD Graphics disabled]

But when I tried to run the game, it gave me an error that there is no hardware acceleration support (for a game from 1999, heh).

So I looked it up in DxDiag:

[screenshot: DxDiag display information]

And what surprises me here is that there is absolutely no GPU/manufacturer name, and the default Windows drivers are being used.
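
(As a cross-check of what DxDiag reports, a minimal Win32 sketch like the following can list the display devices Windows currently sees and which one is attached to the desktop; it assumes a C++ compiler and the Windows SDK. With the IGP disabled, only a basic, non-accelerated display device would be expected to show as attached.)

    // Minimal sketch: enumerate display devices and show which one
    // is attached to the desktop. EnumDisplayDevicesW is a standard
    // Win32 API (windows.h / user32.lib).
    #include <windows.h>
    #include <cstdio>

    int main() {
        DISPLAY_DEVICEW dd = {};
        dd.cb = sizeof(dd);
        for (DWORD i = 0; EnumDisplayDevicesW(nullptr, i, &dd, 0); ++i) {
            wprintf(L"Device %lu: %s %s\n", i, dd.DeviceString,
                    (dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
                        ? L"(attached to desktop)" : L"");
        }
        return 0;
    }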

All my drivers are up to date; all VC++ redistributables, .NET Framework versions, Windows updates, dependencies, and the DirectX End-User Runtime (DX9) are installed and current.

I really don't know what the problem is, and it frustrates me that I can't play an old game at more than 25 FPS while all the newer games (2005+) run at a far more playable framerate - and that, on an older laptop, the game runs like a charm.

What is going on and how can I fix this? I really don't understand this.

I suspect my GeForce is "connected" to the motherboard with the Intel chipset in between? Is there any way to circumvent this, if that is true?

Edit:
I forgot to mention that I did run the game with "use integrated GPU" and the results were the same (game running at 25 FPS max).

Best Answer

I authored a question on this subject a few years ago, so I might as well chime in with what I know.

Your laptop uses a technology called Nvidia Optimus to render video output from two GPUs: the integrated Intel graphics processor [IGP] and the more powerful Nvidia graphics card [DGPU]. This is accomplished by connecting the laptop's screen to the framebuffer of the IGP only, and allowing the DGPU to write pages of memory directly into that framebuffer. In this way, both cards can render output to the same screen, even simultaneously. When an application calls for DGPU rendering, the DGPU writes output to the portion of the screen that the application occupies. In the case of a full-screen application such as a game, the DGPU writes to the entire framebuffer of the IGP. A much more detailed description of this process is available in the Nvidia Optimus whitepaper.

When you run a graphics-heavy application such as a game on an Optimus-enabled machine and experience poor performance, it is logical to start by ensuring that the application is using the DGPU rather than the IGP. You can do this via the context menu entry you showed or, somewhat more reliably, through the Nvidia Control Panel: select "Manage 3D settings" from the pane on the left, select your application, then set the "Preferred graphics processor" to the Nvidia chipset.
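
As a side note, applications can also request the DGPU themselves: Nvidia's Optimus driver looks for an exported global named NvOptimusEnablement in the executable. Since you mention the game is open-source, a rebuild with this export would prefer the high-performance GPU without any control panel settings. A minimal sketch:

    // Exporting this symbol from the .exe tells the Nvidia Optimus
    // driver to prefer the high-performance (discrete) GPU.
    // Documented in Nvidia's Optimus rendering guidelines.
    extern "C" {
        __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
    }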

You can verify that the application is actually running on the Nvidia GPU by using the Optimus Test Viewer. This tool indicates whether or not the DGPU is enabled and can list which processes are making use of it.
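
Relatedly, you can dump the adapters that modern DirectX exposes, and in what order, with a short DXGI listing; a minimal sketch, assuming the Windows SDK. (Adapter 0 is what most games pick by default, and on an Optimus machine that is typically the IGP; note that a 1999 DirectX 7 title predates DXGI and uses the older DirectDraw path, so this only shows what the system exposes today.)

    // List the GPUs exposed to DirectX. Adapter 0 is the default
    // choice for most games, so on an Optimus machine the IGP
    // usually appears first.
    #include <dxgi.h>
    #include <cstdio>
    #pragma comment(lib, "dxgi.lib")

    int main() {
        IDXGIFactory* factory = nullptr;
        if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
            return 1;
        IDXGIAdapter* adapter = nullptr;
        for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
            DXGI_ADAPTER_DESC desc;
            adapter->GetDesc(&desc);
            wprintf(L"Adapter %u: %s (%zu MB dedicated VRAM)\n",
                    i, desc.Description, desc.DedicatedVideoMemory / (1024 * 1024));
            adapter->Release();
        }
        factory->Release();
        return 0;
    }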

A final workaround for Optimus-related issues exists in the hardware outputs of the video card. The Nvidia Control Panel, as in your screenshot, can display which physical outputs are connected to which monitors. From your screenshot, it appears that the Nvidia GPU has one physical output. You can try plugging an external monitor into this output and confirming that it appears connected correctly in the Nvidia Control Panel. If so, your monitor is hooked directly to the framebuffer of the DGPU, meaning that Optimus is not in use and all rendering on that monitor will take place on the DGPU.

Based on the discussion in the comments on your question, you have done the following:

  1. Forced use of the DGPU for your game through the Nvidia control panel
  2. Verified through use of the Optimus Test Viewer that the game is using the DGPU
  3. Connected a monitor to the DGPU's hardware output and ran the game on that monitor

And despite all of this, the game still runs very poorly. I can only conclude from this that the problem is not Optimus-related but something else - possibly a compatibility issue arising from such an old game, or from some property of your new laptop's configuration. You have mentioned that this game is open-source; if there is an active development community, they may be the next best bet for finding a resolution to this problem.
