I have a laptop with an Intel Core i5 M430 at 2.27 GHz.
The CPU has two REAL cores, but it also has Hyper-Threading, so Windows sees it as a 4-core CPU.
On a REAL dual-core CPU, a single-threaded program will run on a single core at 2.27 GHz, right?
My question is: on my 4-core CPU, does the same program run at 1.13 GHz (2.27 / 2)?
In other words, is the frequency of each real core split in two in order to simulate a 4-core CPU?
I need to know this in order to run a CPU-hungry program at maximum speed. If I run two instances of that program, I should finish my data processing twice as fast, because I have two real cores. But if I start 4 instances, will I finish the processing 4 times faster, or are these "2 extra virtual cores" just another eye-candy feature from Intel?
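To make the test concrete, here is a minimal Python sketch of the benchmark I have in mind; `busy_work` is just a placeholder standing in for one instance of my real program, and the loop count is arbitrary. If the two extra cores were pure eye candy, the 4-instance run should take roughly twice as long as the 2-instance run.

```python
import multiprocessing as mp
import time

def busy_work(_):
    # Placeholder CPU-bound workload; stands in for one
    # instance of my real data-processing program.
    total = 0
    for i in range(50_000_000):
        total += i * i
    return total

def time_instances(n):
    # Run n copies of the workload in parallel processes and
    # report the wall-clock time for all of them to finish.
    start = time.perf_counter()
    with mp.Pool(processes=n) as pool:
        pool.map(busy_work, range(n))
    return time.perf_counter() - start

if __name__ == "__main__":
    for n in (1, 2, 4):
        print(f"{n} instance(s): {time_instances(n):.2f} s")
```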
I used CPU Overload to start 2 and then 3 very CPU-intensive threads. In Resource Monitor, the "CPU Total" graph shows only 50% and 75% utilization, respectively.
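For reference, the "CPU Total" graph in Resource Monitor averages over all logical processors, so 2 busy workers out of 4 logical CPUs reads as 50% and 3 out of 4 as 75%, which matches what I saw. Here is a minimal sketch that reproduces the test without the CPU Overload tool, using processes rather than Python threads (CPython's GIL would keep pure-Python threads on one core):

```python
import multiprocessing as mp
import os

def spin():
    # Busy-loop forever so one logical CPU stays pinned at 100%.
    while True:
        pass

if __name__ == "__main__":
    n_busy = 2  # try 2 and 3, as in the CPU Overload test
    n_logical = os.cpu_count()
    print(f"{n_logical} logical CPUs; pinning {n_busy} of them")
    print(f"expected 'CPU Total': {100 * n_busy / n_logical:.0f}%")
    workers = [mp.Process(target=spin, daemon=True) for _ in range(n_busy)]
    for w in workers:
        w.start()
    input("Watch Resource Monitor, then press Enter to stop.\n")
```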