CPUs stopped being judged primarily by clock speed about two decades ago, when AMD started shipping lower-frequency processors with the Athlon XP line for the consumer market (thanks to Jeremy Johnson for pointing that out). There is a so-called “MHz wall,” which Intel hit back around the year 2000, where they couldn’t push the frequency much higher for consumer use, and AMD stopped chasing raw MHz in favor of building better, more efficient architectures to get better computing performance.
The result was AMD becoming the darling of the CPU wars of that era (not unlike how Ryzen has become the darling of today), with the Athlon XP beating out the higher-frequency Pentium 4 chips. This continued with the Athlon 64 range against the Pentium 4, until Intel took the crown back with the Core line more than half a decade later.
The MHz wall is real, but overclocking enthusiasts have pushed frequencies above 4 GHz and even 8 GHz through hardcore solutions like cooling the processor with liquid nitrogen. That isn’t sustainable, practical, or even usable outside of benchmarking, where systems are built just for the sake of hitting a number, with no concern for making them actually usable.
Intel and AMD therefore went another route and started devising methods to improve performance other than raising the clock frequency.
Many other features figure into the equation: core count and hyperthreading, PCIe lanes, cache size, and so on.
GHz alone really doesn’t tell you much these days.
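To see why clock speed alone is misleading, here is a minimal back-of-the-envelope sketch: a rough upper bound on throughput is cores × clock × instructions-per-cycle (IPC). All the figures below are hypothetical examples, not specs for any real chip, and real-world performance also depends on cache, memory, and the workload itself.

```python
def peak_gips(cores: int, clock_ghz: float, ipc: float) -> float:
    """Very rough peak throughput in billions of instructions per second.

    This ignores cache, memory bandwidth, and workload parallelism,
    so it is only an illustration, not a real benchmark.
    """
    return cores * clock_ghz * ipc

# Hypothetical numbers: a higher-clocked quad-core vs. a lower-clocked
# 16-core chip with a better architecture (higher IPC).
high_clock = peak_gips(cores=4, clock_ghz=4.2, ipc=2.0)
many_cores = peak_gips(cores=16, clock_ghz=3.5, ipc=2.5)

print(f"4-core @ 4.2 GHz:  {high_clock:.1f} GIPS")
print(f"16-core @ 3.5 GHz: {many_cores:.1f} GIPS")
```

Even with a 700 MHz lower clock, the chip with more cores and better IPC comes out far ahead on this crude estimate, which is exactly why the GHz number on the box stopped being the headline figure.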
The Ryzen 9 3950X, for instance, is one of the best consumer processors out there today, but it “only” has a base frequency of 3.5 GHz, which is lower than your i7-7700’s.
What is your opinion on this? Can you share your thoughts or give me some advice?