Is there a benefit to selecting a lower refresh rate on a monitor?

display

I have a Samsung Odyssey G9 super ultrawide monitor that can switch between three refresh rates: 60Hz, 120Hz, and 240Hz. I like to leave it on 240Hz even when I'm not playing games that can hit that refresh rate, simply because I like how smooth the desktop feels when I move windows around all that screen real estate.

Are there potential downsides to leaving it constantly running at the maximum refresh rate of 240Hz? It does seem to give off more heat, but more importantly, could there be any degradation in display quality, etc.?

Best Answer

As with any electronics, switching faster means more energy is expended per unit of time: the dynamic power drawn by the digital circuitry rises with its switching frequency.
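
A back-of-envelope way to see this is the standard dynamic-power relation for CMOS logic. This is a general rule of thumb, not a figure from Samsung's specifications:

```latex
% Dynamic (switching) power of the digital electronics driving the panel.
% alpha = activity factor, C = switched capacitance, V = supply voltage,
% f = switching frequency (which scales with the refresh rate).
\[
  P_{\text{dynamic}} \;=\; \alpha \, C \, V^{2} f
\]
% Because P_dynamic is linear in f, clocking the pixel pipeline at 240 Hz
% instead of 60 Hz roughly quadruples this component of the power draw;
% backlight and standby power are separate terms.
```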

Even if the image is barely changing from frame to frame at the actual panel (though it can be), the data rate from the computer is far higher at a higher refresh rate. That means more processing by the monitor's electronics, more panel refreshes, and faster cycling through the pixel rows and columns.
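
To put rough numbers on the data-rate part, here is a quick sketch for a 5120x1440 panel (the G9's resolution), assuming plain 8-bit RGB and ignoring blanking intervals and Display Stream Compression, so these are illustrative raw figures rather than actual link rates:

```python
# Rough, uncompressed pixel data rate for a 5120x1440 panel at
# different refresh rates. Ignores blanking, chroma subsampling and
# DSC; the point is only that the data rate scales linearly with
# refresh rate.

WIDTH, HEIGHT = 5120, 1440      # panel resolution (assumed)
BITS_PER_PIXEL = 24             # assuming 8-bit RGB, no HDR

for refresh_hz in (60, 120, 240):
    bits_per_frame = WIDTH * HEIGHT * BITS_PER_PIXEL
    gbit_per_s = bits_per_frame * refresh_hz / 1e9
    print(f"{refresh_hz:>3} Hz: ~{gbit_per_s:5.1f} Gbit/s of raw pixel data")
```

That works out to roughly 10.6 Gbit/s at 60Hz versus about 42.5 Gbit/s at 240Hz, and every one of those bits has to be clocked through the monitor's scaler and panel drivers.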

So by any reasonable measure, a higher refresh rate should use more energy. How much more depends on the panel, the driving electronics, and whether any intelligent frame skipping or power saving is going on.

I have a monitor that actually reports its energy efficiency, and it does show that higher refresh rates reduce efficiency, but HDR (which drastically turns up the backlight for contrast) impacts efficiency far more severely.
