Another issue is that this highlights the 2K, 4K and 8K race. In my view we never needed to go over 2K. I absolutely cannot tell the difference between 2K and 4K, and that is with a top-end screen. My two older 2K screens have just as good a picture as my 4K one. Telling the difference between 2K, 4K and 8K is like trying to tell the difference between 0.1%, 0.05% and 0.025% THD. This race has just brought a boatload of trouble.
This AV technology is in reality reaching, or has already reached, maturity, and these "improvements" are now essentially futile and counterproductive.
What needs to happen now is a drive toward reliability and better ways of doing things. That means active speakers with DSP-corrected time delays, so most people will actually hear the dialog and not have to turn up the center channel and ruin the balance. These sorts of advances will make a real difference and improve the experience. Whether a screen or AVR is 2K, 4K or 8K is of zero consequence.
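To make the time-delay point concrete, here's a minimal sketch of the arithmetic a DSP performs to align arrival times from speakers at different distances. The room layout and distances are made up for illustration; only the speed of sound is a real constant.

```python
# Rough sketch of per-speaker delay alignment, as a DSP in an active
# speaker system or AVR might compute it. Distances are hypothetical.
SPEED_OF_SOUND_M_S = 343.0  # at roughly 20 C

# Listener-to-speaker distances in meters (made-up room layout)
distances = {"left": 3.2, "center": 2.9, "right": 3.1, "sub": 4.0}

farthest = max(distances.values())

# Delay each closer speaker so all first arrivals line up with the
# farthest speaker's arrival time at the listening position.
delays_ms = {
    name: (farthest - d) / SPEED_OF_SOUND_M_S * 1000.0
    for name, d in distances.items()
}

for name, ms in delays_ms.items():
    print(f"{name:>6}: delay {ms:.2f} ms")
```

Real room-correction systems measure these distances with a microphone and also correct frequency response, but the time-alignment step is essentially this calculation.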
You are correct only if people sit at average American household viewing distances from the TV screen.
You are also correct that the Xbox Series X and PS5 will not really be able to output video at 4k120 with high graphics quality settings enabled. They'll be lucky to do 4k60 consistently, so an AVR that barfs at 4k120 really doesn't matter all that much for console gamers.
The one thing you forget is that Audioholics readers and high-end PC gamers are NOT normal people.
For people who have high-end gaming PCs with an NVIDIA RTX 3080 or 3090, or the upcoming AMD Radeon 6800XT and 6900XT, 4k120 gaming is well within reach. As long as one sits well within the critical resolving distance for 1080p on that TV (rough numbers in the sketch below), games in 4K will be noticeably sharper and more detailed.
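For anyone who wants to sanity-check the viewing-distance claim, here's a back-of-the-envelope sketch using the common rule of thumb that 20/20 vision resolves about 1 arcminute per pixel. The 50" diagonal comes from my own setup mentioned below; the function name and exact cutoffs are just illustrative, not a vision-science result.

```python
import math

# "Critical resolving distance": roughly how far away a viewer with
# ~20/20 vision (about 1 arcminute per pixel) can sit before adjacent
# pixels blur together. Rule-of-thumb estimate only.
ARCMIN_RAD = math.radians(1.0 / 60.0)

def resolving_distance_ft(diagonal_in: float, horizontal_px: int) -> float:
    # 16:9 panel: horizontal size = diagonal * 16 / sqrt(16^2 + 9^2)
    width_in = diagonal_in * 16.0 / math.hypot(16.0, 9.0)
    pixel_pitch_in = width_in / horizontal_px
    # Distance at which one pixel subtends one arcminute, in feet
    return pixel_pitch_in / math.tan(ARCMIN_RAD) / 12.0

for name, px in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f'50" {name}: ~{resolving_distance_ft(50, px):.1f} ft')
# Prints roughly 6.5 ft for 1080p, 3.3 ft for 4K, 1.6 ft for 8K --
# which is why sitting at ~5 ft makes 4K worthwhile and 8K marginal.
```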
Heck even with my 50" at 8ft, I can tell if a video on youtube is streaming at 4k30 vs 1080p30. Does it really improve my viewing of enjoyment of youtube videos to watch them in 4k, of course not since the compression is so high and the source material is not super detailed (I'm watching talking heads most of the time with cameras that aren't always focused 100% correctly on their faces.)
But with my existing video card, gaming at 4k60 vs 1080p120, the extra resolution is quite noticeable since the backgrounds are rendered in so much more detail. However, the loss in smoothness when dropping from 120Hz to 60Hz is noticeable too, when panning or when objects move by quickly. So 4k120 would be ideal for my gaming, and if a $2,000 AV receiver that claims HDMI 2.1 support doesn't work at 4k120, I'd be really, really disappointed and would want my money back.
Note that I sit within 5ft of my 50" HDMI 2.1 OLED TV when gaming.
I agree with you that 8K is really, really bonkers, since a person with normal vision (not golden eyes; those can resolve the difference from any viewing distance ;-)) would have to sit so close to the TV to resolve the difference between 4K and 8K that it's simply impractical.