Is base clock speed still relevant for Turbo Boost processor performance?

cpu · performance · turbo-boost

With the advent of Turbo Boost, is it still important to consider base clock speed when comparing processors for performance, or can I look strictly at the Max Turbo Frequency?

For example, using Intel's processor comparison to compare the i5-3570 to the i5-3570S, I can see the 3570 is 0.3 GHz faster in base clock speed, but the chips have an identical Max Turbo Frequency (as an aside, I find it strange for the "S" to have a lower base clock speed, since Intel says the "S" suffix indicates a "performance-optimized lifestyle" part).

Assuming I provide adequate cooling to allow the chip to easily reach the max turbo frequency, is it reasonable to expect the chips to operate at similar frequencies under normal circumstances? Or can the base clock speed still be a limiting factor?

Best Answer

"Turbo Boost" only kicks in when the CPU is under very high utilization and, depending on power profile, the CPU decides that it can do better with higher clock speed. On laptops you might not see turbo boost having as much of an effect, especially on battery power, because the software or firmware might not want to use turbo boost and its accompanying energy consumption because it'd burn through your battery too fast.

On desktop computers, assuming you aren't worried about your electric bill, you can set your power profile to "performance", which should allow Turbo Boost to kick in whenever it would be useful and will run the CPU at its full base clock speed most of the time.
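If you want to watch this happen yourself, here's a minimal sketch using the third-party psutil package (an assumption on my part: that it's installed via `pip install psutil`; what the OS reports, and whether the min/max fields are populated, varies by platform and driver):

```python
import time
import psutil  # third-party: pip install psutil

def report(label):
    freq = psutil.cpu_freq()  # namedtuple(current, min, max), in MHz
    print(f"{label}: current={freq.current:.0f} MHz "
          f"(min={freq.min:.0f}, max={freq.max:.0f})")

report("idle")

# Peg the CPU with busy work so the governor has a reason to boost.
end = time.time() + 5
while time.time() < end:
    sum(i * i for i in range(10_000))

report("under load")
```

On a desktop with the "performance" profile active, the "under load" reading should climb toward the chip's max turbo figure.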

Here's something to consider.

Assuming the following:

- Both processors have an equally deep instruction pipeline.
- The speculative execution engine on both processors is the same (generally only true of CPUs from the same generation).
- The processors have the same number of hardware threads (cores and Hyper-Threading).
- The processors have the same Thermal Design Power (TDP).

Then we should expect that, when the CPU does not engage Turbo Boost, e.g. under a modest load, the processor with the higher base clock speed will get more work done, faster, within the same power budget.

This is not always true, and I'm oversimplifying a bit, because other factors can cause my assumptions to miss the whole picture, but this is the general idea.
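To put rough numbers on the general idea, taking the linear-scaling simplification at face value (the clock figures below are, as I recall, Intel's published specs for these two parts: 3.4 vs 3.1 GHz base, 3.8 GHz turbo for both, consistent with the 0.3 GHz gap you quoted):

```python
# Back-of-the-envelope comparison under the assumptions above: equal IPC
# and core count, so throughput scales linearly with clock speed.
base_3570  = 3.4   # GHz, i5-3570 base clock (Intel spec)
base_3570s = 3.1   # GHz, i5-3570S base clock (Intel spec)
turbo_both = 3.8   # GHz, identical Max Turbo Frequency

print(f"base-clock advantage: {base_3570 / base_3570s - 1:.1%}")   # ~9.7%
print(f"turbo advantage:      {turbo_both / turbo_both - 1:.1%}")  # 0.0%
```

So the 3570 has roughly a 10% edge while both chips sit at their base clocks, and no edge at all once both have boosted.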

To take it to an extreme: if you had an old 486 processor with the same TDP as a Core i7 but operating at only 30 MHz, you had better believe the i7 @ 2.6 GHz would be worlds faster, assuming that (somehow) both CPUs were otherwise equal in architecture / pipeline / caching / etc.

Since most typical desktop applications (browser, word processing, email) will not trigger turbo mode, you might see very slight improvements in some processing times with a faster base clock, but 0.3 GHz is not really anything to write home about; if it were 1 GHz, maybe you could notice. Remember, though: if the CPU is pegged at 100% utilization for any substantial amount of time, Turbo Boost will probably kick in, and once that happens both CPUs are operating at the same clock rate, so any difference in performance is negligible (assuming, as I said, that the other factors are equal across the CPUs).

The i5-3570 and i5-3570S are from the same microarchitecture generation, and both are designed for the same market at a similar price point. But here's the critical difference:


The i5-3570 has a Max TDP of 77 Watts, whereas the i5-3570S has a Max TDP of 65 Watts!

That 12-watt difference means the 3570 is allowed to consume more power, which is also probably why its base clock is higher. So, mystery solved: it isn't a better microarchitecture or anything like that which makes the 3570 faster; it's that it has a bigger power budget. Of course, we would expect the chip that can draw more energy to be faster, given the same microarchitecture.
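As a rough sanity check (a sketch, not a model of the actual silicon): dynamic CPU power scales roughly with capacitance × voltage² × frequency, so a bigger power budget buys a higher sustainable clock, and power tends to grow faster than linearly because voltage usually rises along with frequency:

```python
# Compare the TDP headroom to the base-clock gap between the two parts.
tdp_3570, tdp_3570s = 77, 65       # watts (Intel Max TDP figures)
clk_3570, clk_3570s = 3.4, 3.1     # GHz  (published base clocks)

print(f"TDP ratio:        {tdp_3570 / tdp_3570s:.2f}x")   # ~1.18x
print(f"base-clock ratio: {clk_3570 / clk_3570s:.2f}x")   # ~1.10x
```

An ~18% larger power budget buying only an ~10% higher base clock is consistent with that super-linear power-versus-frequency relationship.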
