I disagree with you on a few things.
"I have yet to hear of a video card manufacturer working closely with a game developer on a card not yet released"
It happens all the time with Nvidia and AMD. They want to make sure those games perform at their peak, because a strong launch helps them sell graphics cards. That is why you often see pre-release and just-released games offered as a giveaway with those cards.
"I can't speak for your case, but whenever I house sit for my dad when he goes on long vacations, I put my PC in the living room, hooked up to the home theatre. High end PCs + home theatre gear is a match that I am surprised very few people think to do."
There is no doubt it works, and for people with the minimal technical competence to set it up, it works great. The bottom line is that consoles are designed for the largest possible audience, and I am sorry, but requiring that kind of setup would alienate a large share of the market. Consoles release patches, updates, and fixes pretty much automatically; PCs are not quite on the same level because of their open architecture versus the consoles' closed one. If a game is bugged, every console has the same hardware, so it is much easier to fix. The console OS is 100% dedicated to the console. I could go on, but you get the idea. This is why even the Steam boxes have been unsuccessful to this point.
"I doubt streaming services will take over. There would be too much input lag for your controls. Even playing an RPG you would have about 100ms lag at least (unless you were fairly close to the server), so a 1/10 of a second delay in response to your inputs, if not more. This would make any shooter, or online competitive game(even League of Legends or DOTA) unplayable."
I agree with you on this point; it is not a near-term evolution. But as someone who used to play Dark Age of Camelot and EverQuest over dial-up, it has the potential to work as internet speeds increase, GaaS (gaming as a service) gets better, and cloud computing continues to develop. If you're a PC gamer who requires the absolute best in input lag, response time, etc., you have the money to invest in a top-end PC system; most of America does not.
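To put that ~100ms figure in rough perspective, here is a quick back-of-the-envelope sketch in Python. Every number in it is an assumption I am making for illustration, not a measurement:

# Rough latency budget for cloud game streaming vs. local play.
# All figures below are assumed ballpark values, not measurements.

local_pipeline_ms = {
    "input_polling": 8,        # controller/USB polling
    "game_simulation": 16,     # one frame of game logic at 60 fps
    "render_and_scanout": 16,  # one frame to render and display
}

streaming_extra_ms = {
    "network_round_trip": 40,  # varies wildly with distance to the server
    "video_encode": 10,        # server-side encode of the rendered frame
    "video_decode": 10,        # client-side decode
    "jitter_buffer": 20,       # buffering to smooth out packet timing
}

local_total = sum(local_pipeline_ms.values())
streaming_total = local_total + sum(streaming_extra_ms.values())

print("Local play: ~%d ms input-to-display" % local_total)     # ~40 ms
print("Streaming:  ~%d ms input-to-display" % streaming_total)  # ~120 ms

Even with generous assumptions, the streamed path roughly triples the input-to-display delay, which is why it hurts shooters far more than slower-paced genres.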
Streaming Gaming has come a long way:
http://www.nvidia.com/object/cloud-gaming.html
I am a PC gamer and have invested a lot of money in my PC and gear, but I still find myself going back to console gaming.
On a side note, regarding a comment from your earlier post:
"In PC gaming it has been clear for a while now that you want clock speed for gaming performance rather than additional cores. Granted, console games will be written for optimization on the console, but I have yet to see console ports to PC take advantage of high levels of cores. The fact that these latest generation consoles have processors under 3 GHz let alone not closer to 4Ghz (which has has been the standard for several years now) is just as scandalous as the trend in home theatre receivers cutting power delivery to make room for proprietary software features."
Judging CPUs primarily on GHz misses half the picture. Clock speeds ran into power and heat limits years ago, which is why CPUs are increasing core counts at a much more dramatic rate than clock speed (along with multi-threading, larger secondary caches, etc.).
DirectX 12 and AMD Mantle were specifically designed to spread work across multiple cores on the latest-generation Intel/AMD CPUs. Games which support them correctly typically see a nice little jump in frame rate. That being said, they are still not at the consoles' level of optimization, which again gives consoles an advantage: game makers only have to design for one CPU/GPU configuration (at least until now), which lets them maximize the resources of each.
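To illustrate why extra cores only pay off when the engine is actually written to use them, here is a quick Amdahl's-law sketch; the parallel fractions are made-up illustrative values, not numbers from any real engine:

def amdahl_speedup(parallel_fraction, cores):
    # Amdahl's law: overall speedup when only part of the
    # per-frame work can be spread across cores.
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# An engine where only ~30% of frame work runs in parallel (illustrative):
print(round(amdahl_speedup(0.30, 8), 2))  # 1.36x on 8 cores

# An engine built around DX12/Mantle-style multi-core job
# submission, say ~80% parallel (illustrative):
print(round(amdahl_speedup(0.80, 8), 2))  # 3.33x on 8 cores

That gap is basically the consoles' fixed-hardware advantage in code form: developers can tune for the heavily parallel case because every unit has the same core count.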