Your laptop has Nvidia Optimus Technology (Dell help page):
> NVIDIA Optimus technology (not available on 3D panel)
>
> NVIDIA® Optimus™ technology automatically optimizes your battery life while maintaining the graphics performance you expect — completely, seamlessly and transparently — whether you’re watching a movie, surfing the Web or playing a game.
>
> How does it work?
>
> This intelligent graphics technology switches between discrete and integrated graphics processors automatically whenever it determines what kind of application is being used. If you are simply surfing the Web, the GPU switches to the integrated version, therefore helping to extend your battery life. It's that easy to experience long battery life and amazing visuals without having to manually change settings.
>
> Watch an HD movie, surf the Web or play games knowing you can get the long battery life you need and the performance you expect from NVIDIA Optimus technology.
Basically, your system is not using the Nvidia card because it doesn't need to. If you were to start watching an HD movie or playing a game, it would switch to the Nvidia hardware. Since you're not currently doing anything that needs the Nvidia card to kick in, your laptop saves power by turning off the Nvidia card and using the integrated graphics instead.
I authored a question on this subject a few years ago, so I might as well chime in with what I know.
Your laptop uses a technology called Nvidia Optimus to render video output from two GPUs: the integrated Intel graphics processor [IGP] and the more powerful Nvidia graphics card [DGPU]. This is accomplished by connecting the laptop's screen to the framebuffer of the IGP only, and allowing the DGPU to write pages of memory directly into that framebuffer. In this way, both cards can render output to the same screen, even simultaneously. When an application calls for DGPU rendering, the DGPU writes output to the portion of the screen that the application occupies. In the case of a full-screen application such as a game, the DGPU writes to the entire framebuffer of the IGP. A much more detailed description of this process is available in the Nvidia Optimus whitepaper.
When a graphics-heavy application such as a game performs poorly on an Optimus-enabled machine, it is logical to start by ensuring that the application is using the DGPU rather than the IGP. You can do this via the context menu entry you showed, or, somewhat more reliably, through the Nvidia control panel. Simply select "Manage 3D settings" from the pane on the left, select your application, then set the "Preferred graphics processor" to the Nvidia chipset.
You can ensure that the application is running on the Nvidia GPU by using the Optimus Test Viewer. This tool will indicate whether or not the DGPU is enabled, and can list which processes are making use of it.
A final workaround for Optimus-related issues exists in the hardware outputs of the video card. The Nvidia control panel, as in your screenshot, can display which physical outputs are connected to which monitors. From your screenshot, it appears that the Nvidia GPU has one physical output. You can try plugging an external monitor into this output and confirming that it appears connected correctly in the Nvidia control panel. If so, your monitor is now hooked directly to the framebuffer of the DGPU, meaning that Optimus is not in use and all rendering on that monitor will take place on the DGPU.
Based on the discussion in the comments on your question, you have done the following:
- Forced use of the DGPU for your game through the Nvidia control panel
- Verified through use of the Optimus Test Viewer that the game is using the DGPU
- Connected a monitor to the DGPU's hardware output and run the game on that monitor
And despite all of this, the game still runs very poorly. I can only conclude from this information that the problem is not Optimus-related but is some other problem, possibly a compatibility issue arising from such an old game, or from some property of the configuration of your new laptop. You have mentioned that this game is open-source; if there is an active development community, it may be your next best bet for finding a resolution to this problem.
Best Answer
Many recent laptop models come with "switchable graphics": two graphics cards, with the system designed to switch between them depending on the circumstances. Unfortunately, the mechanism for managing these graphics is far from standardized, and each laptop manufacturer has its own management solution. Looking at Asus's support website, I can't tell exactly which utility is used to manage the cards, but my guess is Power4Gear, which is their power-management utility.
Alternatively, there may be a BIOS option to force the Nvidia graphics on all the time, but that's probably not what you want.