Over the years, as technology has changed and I've stopped paying attention, I've become pretty ignorant about a lot of hardware and how it works.
My first Pentium computer had a 100MHz processor that eventually got upgraded to 133MHz when the motherboard had a problem under warranty and Gateway 2000 sent me a new board. That was like 14 years ago. Ever since, I've used processor speed as my gauge of how good a computer's specs were. Eventually I had a 1GHz processor, then a 2GHz one...

But now I'm noticing that "high powered" machines like my 2.5GHz MacBook Pro are outpaced on paper by a new desktop with a 3.4GHz processor that costs like $500. So is a 1.6GHz MacBook Air for $1,500 only half as fast as a $600 Dell at 3.4GHz? I'm guessing probably not, so what makes the difference?