Well, you did say "a GTX 580 is capable of playing any game at 1080p resolution and never go below 60FPS," and that's clearly not true for all games.
Well, that statement was predicated on the CPU clock speed argument, which I'm sure you understand.
An i7 920 at 3.33 GHz is pretty damn good, and it's unreasonable to expect people to clock up to 4 GHz+ or buy a 980X or what have you. But you are probably right; I'd have to see GPU utilization tests like you said, though.
About multi-GPU scaling: I would guess that the 580s and higher-end ATI cards probably scale much better at higher resolutions (which is what people should be playing at with cards like these), but I didn't bother to check any benchmarks. In some games they don't scale well at all, for some reason.
An i7 920 @ 3.33 GHz really isn't that impressive, and you won't see the CPU bottleneck go away until 3.8 GHz and upwards, IMO. This is just information I have gleaned from a lot of benchmarking on my part and testing different hardware and overclock combinations. I run my workstation 24/7 at 4 GHz (i7 860), and when I do benchmarking and such I bump it to the max of 4.5 or 4.6 GHz, which is bordering on unstable... I noticed that I didn't see all that much of an improvement with 2x 5850s with the i7 860 at stock clocks (2.66 GHz, I think), but when I started bumping the frequencies up I ran benchmarks incrementally and noticed a huge increase once I got to around 4 GHz+.
I agree that an overclock above 3.5 GHz for any CPU is outside the norm, but we are talking about high-end PC gaming, which doesn't fit into the "normal" category to begin with, so high overclocks become fairly common in this sector. Again, this is all my opinion and what I have observed in my many, many, many years of internet geekdom.
Oh, on a side note: Crysis 2 is pretty darn GPU-intensive, and I now find myself looking for another 5850...