Multicore CPUs like Intel's Core 2 Extreme and AMD's dual-core Athlon 64 have brought about better performance, better power management, and a way for the industry to free itself from a slavish devotion to sheer clock speed.
But multicore CPU architectures are creating a nightmare for programmers, particularly those who want to take full advantage of the new chips' power. The upshot? Much of your brand-new CPU's potential, like an uneducated brain, is going to waste.
http://www.wired.com/techbiz/it/news/2007/08/multicore
It's true that many popular games, both new and old, don't make use of multiple cores. So when I bought my AMD 4400+ chip a couple of years ago (2 x 2200 MHz), I didn't expect that certain things would run worse than they did on a single-core 3000 MHz CPU, because they're only making use of a single 2200 MHz core. I suspect that if certain MMOs (SWG, Vanguard) made use of those secondary cores, they'd run a whole lot smoother.
The benefit is that, even without multithreaded applications, I can, for instance, encode a DVD on one core while playing a game on the other. Tasks running in the background are also far less noticeable. I am looking forward to seeing what four cores are like once they become more mainstream.
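To make the point above concrete: a single-threaded program pegs one core while the others sit idle, but a CPU-bound job that is explicitly split into parallel workers can use them all. Here is a minimal sketch (my own illustration, not from the article) using Python's multiprocessing module; the prime-counting task and the chunking scheme are just example choices.

```python
import multiprocessing as mp

def count_primes(bounds):
    """CPU-bound work: count primes in [lo, hi) by trial division."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    limit = 50_000
    cores = mp.cpu_count()
    # Split the range into one chunk per core; run serially, this
    # whole loop would occupy a single core no matter how many exist.
    step = limit // cores
    chunks = [(i * step, limit if i == cores - 1 else (i + 1) * step)
              for i in range(cores)]
    with mp.Pool(cores) as pool:
        total = sum(pool.map(count_primes, chunks))
    print(total)  # prime count below 50_000, same as the serial answer
```

The same idea applies to games and MMO clients: unless the developer partitions the work (rendering, physics, networking) into independent threads or processes like this, the extra cores contribute nothing to that one program.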