Xbox One X: Consoles Hit the Downside of the Innovation Curve

gene

Audioholics Master Chief
Administrator
The biggest story out of last week's E3 was the debut of Xbox One X. Yes, Microsoft has fired the first salvo in the next-generation game console war. Xbox One X is sure to excite dedicated Xbox fans who feel they've been missing out on true 4K gaming. This time Microsoft has made Ultra-HD resolutions and HDR the prime motivator for your next console upgrade.

But with no platform-exclusive titles and no truly innovative changes, Xbox One X may look more like another incremental upgrade than a truly next-gen console.



Read: Xbox One X: Consoles Hit the Downside of the Innovation Curve
 
Cos

Audioholic Samurai
I see your point; I also see a device:
- That is a fully functional 4K Blu-ray player, supporting Dolby Atmos and HDR10
- Has HDMI 2.1 and supports FreeSync, to improve sync with the display and reduce tearing
- Will more than likely have the best of the third-party titles, which I spend more time on than exclusives (Nintendo excluded, damn you Zelda and Mario Kart)
- Innovation takes time, and hopefully they take their time with augmented/virtual reality
- The ability to supersample from 4K down to 1080p to further improve the quality of visuals
- True cross-play on games incoming, starting with Minecraft across all devices (hopefully Sony gets on board with this)


Both Sony and MS suck at innovation: Kinect, the PS4 Eye, and Sony's VR headset, which has sold a million units, one of them to me, and I still can't find a game worth playing on it.

I agree with some of the points in the article, but overall I think there are a lot of positives. Try to build a PC with the same level of power at that price point and I think you will be truly challenged.

They definitely need to catch up with exclusives, but to be honest, Sea of Thieves looks like a lot of fun, as well as a few others announced at E3.
 
Wayde Robson

Audioholics Anchorman
Yeah, Cos. I see your point too. But I'm skeptical that the supersampling of textures is going to amount to a hill of beans; literally, I don't think you'll see any more individual beans in the hill.

Totally with you on not caring much about exclusives. Where I was going with that was mainly that we're just not seeing new games that ONLY the One X can play. Compare that with the old days, when a next-gen console gave you completely new experiences.

Welcome to the era of x86 consoles. The days of consumers being blindsided by face-melting innovations are past. And you know... maybe that's a good thing. Maybe console gaming in general (and PC gaming to a lesser degree) has matured to a point where upgrades will be more iterative and predictable and less "next-gen"... if that makes sense.

I fully expect the big "innovations" to happen in the PC world first then trickle over to consoles.
 
slipperybidness

Audioholic Warlord
I have never seen any reason to think that this would be a true next-gen console. In fact, what I have seen is that these incremental upgrades may be the future, at least as far as MS is concerned.

Really nothing new, this is old-hat for any PC gamer.
 
shadyJ

Speaker of the House
Staff member
I have never seen any reason to think that this would be a true next-gen console. In fact, what I have seen is that these incremental upgrades may be the future, at least as far as MS is concerned.

Really nothing new, this is old-hat for any PC gamer.
 
panteragstk

Audioholic Warlord
I agree with those saying this is an upgrade, not a next gen console. Sony and MS just launched the PS4 and XBone too early and decided to "upgrade" them a bit later. That's really all it is.

 
gene

Audioholics Master Chief
Administrator
Guys;

Admittedly I don't follow gaming consoles as closely as I should. I have a first gen XBOX360 that is dying. If I got an XBOX One, will ALL of my XBOX 360 games still work? What about a handful of my XBOX games?

I believe Sony Playstation is backwards compatible but not too sure about XBOX beyond the 360.
 
slipperybidness

Audioholic Warlord
Guys;

Admittedly I don't follow gaming consoles as closely as I should. I have a first gen XBOX360 that is dying. If I got an XBOX One, will ALL of my XBOX 360 games still work? What about a handful of my XBOX games?

I believe Sony Playstation is backwards compatible but not too sure about XBOX beyond the 360.
Nope. Xbox one is only compatible with selected XBOX360 games. In general, some of the best and most well-regarded 360 games, but not every single game is backwards compatible. Original XBOX compatibility was only recently announced, so I don't know enough about that yet (but I think it will be selected titles too).

Here you go:
http://www.xbox.com/en-US/xbox-one/backward-compatibility

Now, to the real gem in your statement: You really have a 1st gen Xbox360? The oldest white model, I think it didn't even have HDMI, right??? And--you never got the Red Ring of Death :eek::eek::eek:???

If you REALLY have an original Xbox360 that hasn't RRoD (yet), then stop on your way home for a lotto ticket!

Seriously, I had 2 RRoDs. And every single one of my friends had at least 1 RRoD if not more.
 
Tao1

Audioholic
I don't know that these are interesting times for consoles. This should be the End Times for consoles. There is absolutely no reason for their existence with the different PC form factors available these days. Consoles are running on the same hardware as PCs, but can't be upgraded, and force developers to code games for additional operating systems. With a home theatre PC you have control of the cost and specs of the system, and it will do the same job as a console, plus a whole lot more.

Backwards compatibility is nothing new. You could (and still can) do this on PC since the days of DOS. Not having backwards compatibility on consoles has been one of many anti-consumer practices by console makers. I really don't understand how they ever got away with it, but these days it is becoming more and more clear that they have no benefit over PC, so I don't know why they are trying to alienate their customer base. They are like the Apple of gaming.

In PC gaming it has been clear for a while now that you want clock speed for gaming performance rather than additional cores. Granted, console games will be written for optimization on the console, but I have yet to see console ports to PC take advantage of high core counts. The fact that these latest-generation consoles have processors under 3 GHz, let alone closer to 4 GHz (which has been the standard for several years now), is just as scandalous as the trend of home theatre receivers cutting power delivery to make room for proprietary software features.

Here is an older video outlining the shortcomings of consoles. It is a few years out of date, but points such as hardware cost and game cost are still relevant these days:

A quick note on 4k gaming:

Since games draw their images on the fly, they suffer from aliasing. Since the introduction of Nvidia's DSR technology, the savvy tech head has found that running a game at 4K with no filters gives better image quality and less aliasing than running at 1080p with full anti-aliasing and other filters. This is achieved with about the same hit to performance, depending on the game.

You may not see individual jagged edges at a distance, but you will be able to see aliasing crawl (shimmering). Rendering the game at 4K instead (even on a 1080p display) all but removes that shimmering. An example of this phenomenon can be seen here:


In short, native 1080p gaming is pretty much obsolete, and you don't even need a 4K display, or in most cases a more powerful graphics card, to move past it. If a game doesn't run well at 4K with filters off, then 1440p with some filters will still give a better image than 1080p with full filters, all for about the same performance cost.
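
To make the supersampling idea concrete, here is a toy Python sketch (a plain box-filter downsample, not Nvidia's actual DSR pipeline, and the circle rasterizer is just a stand-in scene): render the same shape once at native resolution and once at 4x, then average the 4x render back down. The hard on/off edge turns into intermediate values, which is exactly the smoothing that removes the stair-stepping and shimmer.

```python
# Toy supersampling demo: the "4K render scaled to 1080p" case produces
# smooth edge values where the native render only has hard on/off pixels.
import numpy as np

def render_circle(size):
    """Hard-threshold rasterization of a circle; the edge aliases at low sizes."""
    ys, xs = np.mgrid[0:size, 0:size]
    cx = cy = (size - 1) / 2.0
    r = size * 0.4
    return ((xs - cx) ** 2 + (ys - cy) ** 2 <= r ** 2).astype(float)

def downsample(img, factor):
    """Box filter: average each factor x factor block (the supersample step)."""
    h, w = img.shape
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

native = render_circle(64)                        # "1080p" render
supersampled = downsample(render_circle(256), 4)  # "4K" render scaled down

# The native edge only has 2 intensity levels (pure stair-steps); the
# supersampled edge has many in-between values that hide the jaggies.
print("native edge levels:      ", len(np.unique(native)))
print("supersampled edge levels:", len(np.unique(supersampled)))
```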
 
Cos

Audioholic Samurai
I have never seen any reason to think that this would be a true next-gen console. In fact, what I have seen is that these incremental upgrades may be the future, at least as far as MS is concerned.

Really nothing new, this is old-hat for any PC gamer.
I don't think the consoles, for the most part, have been anything other than incremental upgrades; the same can be said for PC gaming. The exception to this is Nintendo; they're the only company to truly innovate and have success with it (the Wii), and to a lesser extent the new Switch.

As much as PC gamers say there should only be PC games, it just doesn't work. Parents want to put their kids in front of the TV. They don't want to wait a minute for the system to boot up, find the right game file, click on it, and hope that the PC game doesn't have a ton of bugs and actually works.

I PC game, and I have a high-end PC with a 4K display, yet I would still rather sit in front of my 4K TV and boot up the same game on my Xbox One S or PS4.

The problem with PC games is that developers are always targeting the next great video card or CPU rather than mastering the limits of existing hardware. It was amazing to watch how much they could squeeze out of the PS3 and Xbox 360 after having that long to develop for them. In the PC world, playing the newest games ends up requiring a video card upgrade every 2-3 years.

The future is game streaming services, where on-prem hardware is no longer the issue; then say goodbye to PC and console gaming.
 
gene

Audioholics Master Chief
Administrator
Nope. Xbox one is only compatible with selected XBOX360 games. In general, some of the best and most well-regarded 360 games, but not every single game is backwards compatible. Original XBOX compatibility was only recently announced, so I don't know enough about that yet (but I think it will be selected titles too).

Here you go:
http://www.xbox.com/en-US/xbox-one/backward-compatibility

Now, to the real gem in your statement: You really have a 1st gen Xbox360? The oldest white model, I think it didn't even have HDMI, right??? And--you never got the Red Ring of Death :eek::eek::eek:???

If you REALLY have an original Xbox360 that hasn't RRoD (yet), then stop on your way home for a lotto ticket!

Seriously, I had 2 RRoDs. And every single one of my friends had at least 1 RRoD if not more.
Yes, my XBOX 360 is white with no HDMI. I had the RROD twice, and got Microsoft to fix it both times. The 2nd time there was a catastrophic failure, but I used the RROD as an excuse for them to repair previous work. Now I have to smack it while it turns on or the DVD drive doesn't read the disc. It's on its last legs, but I play the following games pretty regularly and don't want to lose them:
  • Mortal Kombat (4 different ones including Justice League and Injustice)
  • Halo 3
  • Star Trek Legacy - yea the only Star Trek game on a modern console
  • Disney Infinity (for my youngest daughter)
  • NHL Hockey
But if these games aren't compatible, I may just pick up a used XBOX 360 w HDMI.
 
Tao1

Audioholic
I don't think the consoles, for the most part, have been anything other than incremental upgrades; the same can be said for PC gaming. The exception to this is Nintendo; they're the only company to truly innovate and have success with it (the Wii), and to a lesser extent the new Switch.

As much as PC gamers say there should only be PC games, it just doesn't work. Parents want to put their kids in front of the TV. They don't want to wait a minute for the system to boot up, find the right game file, click on it, and hope that the PC game doesn't have a ton of bugs and actually works.

I PC game, and I have a high-end PC with a 4K display, yet I would still rather sit in front of my 4K TV and boot up the same game on my Xbox One S or PS4.

The problem with PC games is that developers are always targeting the next great video card or CPU rather than mastering the limits of existing hardware. It was amazing to watch how much they could squeeze out of the PS3 and Xbox 360 after having that long to develop for them. In the PC world, playing the newest games ends up requiring a video card upgrade every 2-3 years.

The future is game streaming services, where on-prem hardware is no longer the issue; then say goodbye to PC and console gaming.
If you let a computer sleep, it resumes in about 5 seconds. Then have the game icons on the desktop, and bam, kids are playing games on a home theatre PC in 30 seconds or so.

I can't speak for your case, but whenever I house sit for my dad when he goes on long vacations, I put my PC in the living room, hooked up to the home theatre. High end PCs + home theatre gear is a match that I am surprised very few people think to do.

Splitting hairs maybe, but video games are developed for the current hardware, then released about a year later when they are finished. I have yet to hear of a video card manufacturer working closely with a game developer on a card not yet released. Developers wouldn't want to do that as they risk making a game that people with the other brand of card can't play. They are already torn on using Nvidia HairWorks, WaveWorks, and other proprietary technologies which gamers with AMD video cards can't make use of. Sadly, it is mostly planned obsolescence, and I blame Nvidia for that, as they haven't had much competition to keep them giving us their best the last few years.

I doubt streaming services will take over. There would be too much input lag for your controls. Even playing an RPG you would have about 100ms lag at least (unless you were fairly close to the server), so a 1/10 of a second delay in response to your inputs, if not more. This would make any shooter or online competitive game (even League of Legends or DOTA) unplayable.

Also graphics quality would be highly degraded, especially in scenes with fast motion due to compression to be able to stream video to many users of the service at a reasonable bandwidth usage.

At best I could see the poor man's gamer using this service. The average gamer would probably be frustrated with the control lag, and there will always be enthusiasts out there who would not put up with either drawback.
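
To put rough numbers behind that ~100 ms figure, here is a quick back-of-the-envelope tally in Python. Every line item is an illustrative assumption rather than a measurement of any real service; the point is only how quickly the stages stack up compared with local rendering.

```python
# Hypothetical input-to-photon budget for a cloud-streamed game (all numbers
# are illustrative assumptions, not measurements of any particular service).
budget_ms = {
    "controller sampling + client input processing": 8,
    "network round trip to the data center":        40,  # grows with distance
    "server runs the game and renders one frame":   16,  # one 60 fps frame
    "video encode on the server":                    8,
    "video decode on the client":                    8,
    "TV / display processing":                      20,
}

total = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{stage:<48} {ms:>3} ms")
print(f"{'total (no retransmits, no jitter)':<48} {total:>3} ms")
# Roughly 100 ms end to end, versus something like 30-50 ms for the same game
# rendered locally, which is why twitch shooters feel the difference first.
```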
 
slipperybidness

Audioholic Warlord
Yes, my XBOX 360 is white with no HDMI. I had the RROD twice, and got Microsoft to fix it both times. The 2nd time there was a catastrophic failure, but I used the RROD as an excuse for them to repair previous work. Now I have to smack it while it turns on or the DVD drive doesn't read the disc. It's on its last legs, but I play the following games pretty regularly and don't want to lose them:
  • Mortal Kombat (4 different ones including Justice League and Injustice)
  • Halo 3
  • Star Trek Legacy - yea the only Star Trek game on a modern console
  • Disney Infinity (for my youngest daughter)
  • NHL Hockey
But if these games aren't compatible, I may just pick up a used XBOX 360 w HDMI.
Yeah, and swapping out a disc drive in a 360 is no trivial task (like it should be!).

It's been a long time since I dealt with the 360, so just going by memory here. I think that there is some ID tag or something associated with the disc drive unique to each unit. So, when you swap it you still have to do a re-flash. Basically, the procedure to make the swap is the same as if you were hacking the console. Really, at that point, you might as well hack it, but that also gets it banned from XBOX Live. Also, I know that the 360 used at least 3 different drives depending on the model. I can't remember if those were interchangeable or not.

Back in the heyday of the 360, many people would have 1 legit system to play XBOX Live, and one hacked system to do all the other stuff.
 
Wayde Robson

Audioholics Anchorman
This should be the End Times for consoles. There is absolutely no reason for their existence with the different PC form factors available these days.
Consoles are PCs now. Now that consoles have adopted x86 architecture, there is literally no difference.

Except that on consoles you play at a static level of performance per iteration. I used to PC game; I used to build a new PC every few years specifically to play a certain game. But now I only console game, due to lack of time and willingness to put in the effort. I honestly don't get to play as much as I used to.

Sure, I sometimes fantasize about grabbing a high-powered gaming laptop, bridging it to my HDTV, and playing a modded-out version of Skyrim or whatever else; that would be fun. But... honestly, it'll probably never happen. I have too much work to do, and I am not competitively trying to play anything.
 
Cos

Audioholic Samurai
If you let a computer sleep, it resumes in about 5 seconds. Then have the game icons on the desktop, and bam, kids are playing games on a home theatre PC in 30 seconds or so.

I can't speak for your case, but whenever I house sit for my dad when he goes on long vacations, I put my PC in the living room, hooked up to the home theatre. High end PCs + home theatre gear is a match that I am surprised very few people think to do.

Splitting hairs maybe, but video games are developed for the current hardware, then released about a year later when they are finished. I have yet to hear of a video card manufacturer working closely with a game developer on a card not yet released. Developers wouldn't want to do that as they risk making a game that people with the other brand of card can't play. They are already torn on using Nvidia HairWorks, WaveWorks, and other proprietary technologies which gamers with AMD video cards can't make use of. Sadly, it is mostly planned obsolescence, and I blame Nvidia for that, as they haven't had much competition to keep them giving us their best the last few years.

I doubt streaming services will take over. There would be too much input lag for your controls. Even playing an RPG you would have about 100ms lag at least (unless you were fairly close to the server), so a 1/10 of a second delay in response to your inputs, if not more. This would make any shooter or online competitive game (even League of Legends or DOTA) unplayable.

Also graphics quality would be highly degraded, especially in scenes with fast motion due to compression to be able to stream video to many users of the service at a reasonable bandwidth usage.

At best I could see the poor man's gamer using this service. The average gamer would probably be frustrated with the control lag, and there will always be enthusiasts out there who would not put up with either drawback.
I disagree with you on a few things.

"I have yet to hear of a video card manufacturer working closely with a game developer on a card not yet released"

It happens all the time with Nvidia and AMD; they want to make sure those games perform at their peak because it helps them sell graphics cards. That is why you often see pre-release and just-released games being offered as giveaways with those cards.

"I can't speak for your case, but whenever I house sit for my dad when he goes on long vacations, I put my PC in the living room, hooked up to the home theatre. High end PCs + home theatre gear is a match that I am surprised very few people think to do."

There is no doubt it works, and for people who have the minimal technical competency to do that, it works great. The bottom line is that consoles are designed for the largest audience, and I am sorry, you would alienate a large market share in doing so. Consoles release patches, updates, and fixes pretty much automatically; PCs are not quite on the same level due to their open vs. closed architecture. If a game is bugged, every console has the same hardware, so it's much easier to fix. The console OS is 100% dedicated to the console. I could go on, but you get the idea. This is why even the Steam Machines have been unsuccessful to this point.

"I doubt streaming services will take over. There would be too much input lag for your controls. Even playing an RPG you would have about 100ms lag at least (unless you were fairly close to the server), so a 1/10 of a second delay in response to your inputs, if not more. This would make any shooter, or online competitive game(even League of Legends or DOTA) unplayable."

I agree with you on this point, it is not a fully evolved experience yet, but as someone who used to play Dark Age of Camelot and EverQuest over dial-up, I think it has the potential to work as internet speeds increase, GaaS (gaming as a service) gets better, and cloud computing continues to develop. If you're a PC gamer and require the absolute best in input lag response, etc., you have the money to invest in the best PC system; most of America does not.
Streaming Gaming has come a long way:
http://www.nvidia.com/object/cloud-gaming.html

I am a PC Gamer, I have invested a lot of money in my PC and gear, but I still find myself going back to console gaming.

On a side note to your earlier post comment

"In PC gaming it has been clear for a while now that you want clock speed for gaming performance rather than additional cores. Granted, console games will be written for optimization on the console, but I have yet to see console ports to PC take advantage of high levels of cores. The fact that these latest generation consoles have processors under 3 GHz let alone not closer to 4Ghz (which has has been the standard for several years now) is just as scandalous as the trend in home theatre receivers cutting power delivery to make room for proprietary software features."

Judging CPUs primarily on GHz is partially ignorant. This is why CPUs are increasing the number of cores at a much more dramatic rate than clock speed (multi-threading, larger secondary caches, etc.). DirectX 12 and AMD Mantle were specifically designed to use multiple cores on the latest-generation Intel/AMD CPUs. Games which support them correctly will typically see a nice little jump in frame rate. That being said, PC games are still not at the consoles' level of optimization there, which again gives consoles an advantage, as game makers only have to design for one level of CPU/GPU (at least until now), which lets them maximize the resources of each.
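
As a rough illustration of that multi-core point (plain Python, nothing to do with the actual DirectX 12 or Mantle APIs), here is a toy sketch showing the same per-frame CPU work finishing sooner once it is spread across cores instead of running on a single thread:

```python
# Toy comparison: one "frame" of CPU work done serially vs. farmed out to
# worker processes. The speedup only shows up because the code was written
# to split the work, which is the whole point about engines targeting DX12.
import os
import time
from concurrent.futures import ProcessPoolExecutor

def chunk_of_frame_work(n):
    """Stand-in for one slice of a frame's CPU work (culling, physics, AI)."""
    total = 0
    for i in range(n):
        total += i * i
    return total

FRAME_CHUNKS = [500_000] * 8  # eight equal slices of work for one "frame"

if __name__ == "__main__":
    # Single-threaded style: one core grinds through every slice in order.
    t0 = time.perf_counter()
    serial = [chunk_of_frame_work(n) for n in FRAME_CHUNKS]
    t1 = time.perf_counter()

    # Multi-core style: the slices run in parallel across worker processes.
    with ProcessPoolExecutor() as pool:
        list(pool.map(chunk_of_frame_work, FRAME_CHUNKS))   # warm-up pass
        t2 = time.perf_counter()
        parallel = list(pool.map(chunk_of_frame_work, FRAME_CHUNKS))
        t3 = time.perf_counter()

    assert serial == parallel
    print(f"serial frame time:   {(t1 - t0) * 1000:6.1f} ms")
    print(f"parallel frame time: {(t3 - t2) * 1000:6.1f} ms "
          f"({os.cpu_count()} logical cores)")
```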
 
Cos

Audioholic Samurai
Yes, my XBOX 360 is white with no HDMI. I had the RROD twice, and got Microsoft to fix it both times. The 2nd time there was a catastrophic failure, but I used the RROD as an excuse for them to repair previous work. Now I have to smack it while it turns on or the DVD drive doesn't read the disc. It's on its last legs, but I play the following games pretty regularly and don't want to lose them:
  • Mortal Kombat (4 different ones including Justice League and Injustice)
  • Halo 3
  • Star Trek Legacy - yea the only Star Trek game on a modern console
  • Disney Infinity (for my youngest daughter)
  • NHL Hockey
But if these games aren't compatible, I may just pick up a used XBOX 360 w HDMI.
You would be SOL for most/all of those games:

http://www.escapistmagazine.com/articles/view/video-games/walkthroughs/17781-Xbox-One-All-The-Backwards-Compatible-Games

  • Xbox One sells a Halo pack that includes all previous versions
  • Others, you are kinda SOL unless you opt to get the Xbox One versions
 
Tao1

Audioholic
Consoles are PCs now. Now that consoles have adopted x86 architecture, there is literally no difference.

Except that on consoles you play at a static level of performance per iteration. I used to PC game; I used to build a new PC every few years specifically to play a certain game. But now I only console game, due to lack of time and willingness to put in the effort. I honestly don't get to play as much as I used to.

Sure, I sometimes fantasize about grabbing a high-powered gaming laptop, bridging it to my HDTV, and playing a modded-out version of Skyrim or whatever else; that would be fun. But... honestly, it'll probably never happen. I have too much work to do, and I am not competitively trying to play anything.
The difference is in their proprietary nature.
Spoiler: my listing of the differences turned into a bit of a rant
-They can't be upgraded. This ends up being wasteful as you can't reuse components by selectively upgrading one component at a time.

-Their build quality is generally questionable, as they sacrifice the cooling the box needs. Microsoft went so far as to make this a life-limiting feature with the red ring of death. The initial models had a 50% failure rate, and they were sued for it. Anyone in manufacturing would probably also be skeptical of such a device having a feature to measure the expected life left in a consumer item. With such a failure rate, it seems they tried too hard to min-max getting customers to buy second, or even third, Xboxes.

-Their operating systems are limited. You cannot run other programs, or have the same freedoms as on PC (talking Linux as well as Windows; not just Windows).

-The console makers create a walled garden with anti-consumer practices similar to Apple's. They wield these limitations as a weapon to gouge customers, for example by historically charging users for the privilege to play online when this was free on PC or Mac. Some, like Nintendo, use proprietary connectors to charge a premium on controllers and accessories. They have also resisted giving players the use of a mouse and keyboard to play games. The customer should be able to choose whichever tool they find best for the experience.

-Going further on the walled-garden approach, console companies dictate complete control over game updates. This is mainly to ensure quality control, so that console owners don't receive a bad experience, but it has also been wielded as a weapon against developers. One game community I help out with, to the point of being under an NDA agreement, has shown me a glimpse of what it is like to deal with one of the console makers. It is a cross-platform game, but it isn't a huge moneymaker for that console. The console maker reserves the right to do their own quality control on any patch before it is released, but they don't offer the developer much professional courtesy in how they do it. They simply do it when they get around to it, which either holds back patches for all platforms (which can be a major problem if the fix is for a major issue), or, if the patch deals with content, that console community is left out, as the patch is put forward on PC due to timing necessity, alienating and infuriating the console players.

-The console companies also keep an iron grip on pricing. A big release will be about $60 on launch day. However, retailers of PC games have full market leeway to charge whatever they want. I have not paid full price for a game, even on launch day, for a few years now; $45 is common on day 1 for a $60 game. There are also many more sales events during the year, making PC games cheaper by a large margin, period.

-Going further on price, PC components are mass produced to the point that they aren't any more expensive than the hardware used in consoles. You can make a PC with similar performance for a similar price...or customize it to what you want or need.


Just like Apple, they take the technology and limit it so the consumer can only use it in a dictated way. My point is that there are exactly zero reasons to have a console over a home theatre PC with games; there are only drawbacks, as I listed above. Thus there is no place in the market for any of these platforms over a PC in the living room.

To give some context of where I am coming from: years ago I did partake in the 'haughtier than thou' PC Master Race attitude to have a bit of fun at my friends' expense here and there. However, that came to a screeching halt when I saw the things that the console manufacturers were doing to their customers. Charging for the privilege to game online, as well as being left out of content because the console makers simply didn't give a damn about performing QC on developer additions in a timely manner. Games being more expensive while having quality dumbed down for performance, or some content stripped outright. Console boxes dying prematurely, yet being just out of warranty, etc.

I was absolutely shocked and outraged at the treatment console customers received. And yes this is coming from someone who has had to deal with Microsoft and their shenanigans. I have had nothing but sympathy for console gamers since; especially when I still hear horror stories of the crap they have to deal with. Since then I have seen the evidence play out, and there is absolutely no benefit being offered by consoles over PC.

Sorry to get soapboxy, but my pet peeve in life is when people (console makers in this case) take advantage of others and exploit them.
 
Tao1

Audioholic
I disagree with you on a few things.

"I have yet to hear of a video card manufacturer working closely with a game developer on a card not yet released"

It happens all the time with Nvidia and AMD; they want to make sure those games perform at their peak because it helps them sell graphics cards. That is why you often see pre-release and just-released games being offered as giveaways with those cards.

"I can't speak for your case, but whenever I house sit for my dad when he goes on long vacations, I put my PC in the living room, hooked up to the home theatre. High end PCs + home theatre gear is a match that I am surprised very few people think to do."

There is no doubt it works, and for people who have the minimal technical competency to do that, it works great. The bottom line is that consoles are designed for the largest audience, and I am sorry, you would alienate a large market share in doing so. Consoles release patches, updates, and fixes pretty much automatically; PCs are not quite on the same level due to their open vs. closed architecture. If a game is bugged, every console has the same hardware, so it's much easier to fix. The console OS is 100% dedicated to the console. I could go on, but you get the idea. This is why even the Steam Machines have been unsuccessful to this point.

"I doubt streaming services will take over. There would be too much input lag for your controls. Even playing an RPG you would have about 100ms lag at least (unless you were fairly close to the server), so a 1/10 of a second delay in response to your inputs, if not more. This would make any shooter, or online competitive game(even League of Legends or DOTA) unplayable."

I agree with you on this point, it is not a fully evolved experience yet, but as someone who used to play Dark Age of Camelot and EverQuest over dial-up, I think it has the potential to work as internet speeds increase, GaaS (gaming as a service) gets better, and cloud computing continues to develop. If you're a PC gamer and require the absolute best in input lag response, etc., you have the money to invest in the best PC system; most of America does not.
Streaming Gaming has come a long way:
http://www.nvidia.com/object/cloud-gaming.html

I am a PC Gamer, I have invested a lot of money in my PC and gear, but I still find myself going back to console gaming.

On a side note to your earlier post comment

"In PC gaming it has been clear for a while now that you want clock speed for gaming performance rather than additional cores. Granted, console games will be written for optimization on the console, but I have yet to see console ports to PC take advantage of high levels of cores. The fact that these latest generation consoles have processors under 3 GHz let alone not closer to 4Ghz (which has has been the standard for several years now) is just as scandalous as the trend in home theatre receivers cutting power delivery to make room for proprietary software features."

Judging CPUs primarily on GHz is partially ignorant. This is why CPUs are increasing the number of cores at a much more dramatic rate than clock speed (multi-threading, larger secondary caches, etc.). DirectX 12 and AMD Mantle were specifically designed to use multiple cores on the latest-generation Intel/AMD CPUs. Games which support them correctly will typically see a nice little jump in frame rate. That being said, PC games are still not at the consoles' level of optimization there, which again gives consoles an advantage, as game makers only have to design for one level of CPU/GPU (at least until now), which lets them maximize the resources of each.

I will have to get back to you on devs working with unreleased graphics cards. I am fairly sure this isn't the case, especially with many wanting to release on console as well, but I am going to ask around to be sure I get it right either way.

I think Windows-based machines are at a level where the average user can update hassle-free. Windows updates on its own, and probably more average users have Windows machines than consoles. Steam, GOG Galaxy, and other DRM services also take care of game updates in a similar way.

A bit anecdotal, but I have only had one quirk with a game in years, and that was with the new Doom reboot. Everything else has worked smoothly.

Also I have seen the opposite effect: Games made for PC, but expanded cross platform. The game I help out with has more quirks come up on the console end, than on PC, Linux, or Mac. That is even after the console maker's own QC process.

Yeah, it is true that measuring clock speed alone on processors is ignorant. I was just trying to avoid going into a full essay in my post. Initial estimates of the performance of the new Xbox One X processor are that it is probably equivalent to an i3.

Microsoft making the new Xbox will be a good thing on that end, as I am sure they will force developers to make games on DX12 or not at all. However, a processor's capability is more or less instructions per clock x clock cycles per second x number of cores. That new processor is fairly handicapped, giving up about 40% of the clock speed of common desktop processors these days.
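
To show how that rule of thumb plays out, here is a tiny sketch; the IPC and clock figures below are loose assumptions for illustration only, not benchmark results:

```python
# Crude "IPC x clock x cores" estimate. The IPC values are rough guesses used
# purely to illustrate the arithmetic, not published numbers.
def relative_throughput(ipc, ghz, cores):
    """Very rough upper bound on fully multithreaded throughput."""
    return ipc * ghz * cores

console_cpu = relative_throughput(ipc=1.0, ghz=2.3, cores=8)  # Jaguar-class, Xbox One X
desktop_cpu = relative_throughput(ipc=2.0, ghz=4.0, cores=4)  # hypothetical desktop quad-core

print(f"console score: {console_cpu:.1f}   desktop score: {desktop_cpu:.1f}")
print(f"desktop advantage if a game scales across all cores: "
      f"{desktop_cpu / console_cpu:.2f}x")
# With cores=1 (a game that doesn't spread its work), the gap is far larger,
# which is why clock speed still matters so much for poorly threaded games.
```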
 
Cos

Audioholic Samurai
I will have to get back to you on devs working with unreleased graphics cards. I am fairly sure this isn't the case, especially with many wanting to release on console as well, but I am going to ask around to be sure I get it right either way.

I think Windows-based machines are at a level where the average user can update hassle-free. Windows updates on its own, and probably more average users have Windows machines than consoles. Steam, GOG Galaxy, and other DRM services also take care of game updates in a similar way.

A bit anecdotal, but I have only had one quirk with a game in years, and that was with the new Doom reboot. Everything else has worked smoothly.

Also I have seen the opposite effect: Games made for PC, but expanded cross platform. The game I help out with has more quirks come up on the console end, than on PC, Linux, or Mac. That is even after the console maker's own QC process.

Yeah, it is true that measuring clock speed alone on processors is ignorant. I was just trying to avoid going into a full essay in my post. Initial estimates of the performance of the new Xbox One X processor are that it is probably equivalent to an i3.

Microsoft making the new Xbox will be a good thing on that end, as I am sure they will force developers to make games on DX12 or not at all. However, a processor's capability is more or less instructions per clock x clock cycles per second x number of cores. That new processor is fairly handicapped, giving up about 40% of the clock speed of common desktop processors these days.

I think it was Batman or another game that was supporting features in an Nvidia card that wasn't even released yet. I can't see it being much different from a development console kit being delivered to a software company so they can figure out how to best utilize resources.

I also agree that Windows updates have gotten a lot better, but I will still disagree: it is not as straightforward as it is on consoles. There are a lot more moving parts, and a greater variety of parts, that need to be supported than on a standardized console; that is my point. Consoles are sold to everyone, not just the average PC user, which is why they need to be more friendly in that regard.

The CPU in a console can be weaker than a PC's for a large number of reasons. It probably only has to support about 40% of the overhead of a standard PC CPU. It has pretty much one purpose: to support console gaming. These are not off-the-shelf CPUs; in the case of the Xbox/PS4, and more specifically the Xbox One X, they are highly customized to support a console environment. I was trying to find the article on how the Xbox One X was designed, but there were some hardware-specific changes done to tweak gaming performance, i.e., memory bandwidth, GDDR5, etc. There is no doubt that the CPU is significantly less powerful on the console vs. the PC, but a PC would also need a significantly more powerful CPU to do the same things that can be done on a console. I have no idea where that delta is.
 
Cos

Audioholic Samurai
The difference is in their proprietary nature.
Spoiler: my listing of the differences turned into a bit of a rant
-They can't be upgraded. This ends up being wasteful as you can't reuse components by selectively upgrading one component at a time.

-Their build quality is generally questionable, as they sacrifice the cooling the box needs. Microsoft went so far as to make this a life-limiting feature with the red ring of death. The initial models had a 50% failure rate, and they were sued for it. Anyone in manufacturing would probably also be skeptical of such a device having a feature to measure the expected life left in a consumer item. With such a failure rate, it seems they tried too hard to min-max getting customers to buy second, or even third, Xboxes.

-Their operating systems are limited. You cannot run other programs, or have the same freedoms as on PC (talking Linux as well as Windows; not just Windows).

-The console makers create a walled garden with anti-consumer practices similar to Apple's. They wield these limitations as a weapon to gouge customers, for example by historically charging users for the privilege to play online when this was free on PC or Mac. Some, like Nintendo, use proprietary connectors to charge a premium on controllers and accessories. They have also resisted giving players the use of a mouse and keyboard to play games. The customer should be able to choose whichever tool they find best for the experience.

-Going further on the walled-garden approach, console companies dictate complete control over game updates. This is mainly to ensure quality control, so that console owners don't receive a bad experience, but it has also been wielded as a weapon against developers. One game community I help out with, to the point of being under an NDA agreement, has shown me a glimpse of what it is like to deal with one of the console makers. It is a cross-platform game, but it isn't a huge moneymaker for that console. The console maker reserves the right to do their own quality control on any patch before it is released, but they don't offer the developer much professional courtesy in how they do it. They simply do it when they get around to it, which either holds back patches for all platforms (which can be a major problem if the fix is for a major issue), or, if the patch deals with content, that console community is left out, as the patch is put forward on PC due to timing necessity, alienating and infuriating the console players.

-The console companies also keep an iron grip on pricing. A big release will be about $60 on launch day. However, retailers of PC games have full market leeway to charge whatever they want. I have not paid full price for a game, even on launch day, for a few years now; $45 is common on day 1 for a $60 game. There are also many more sales events during the year, making PC games cheaper by a large margin, period.

-Going further on price, PC components are mass produced to the point that they aren't any more expensive than the hardware used in consoles. You can make a PC with similar performance for a similar price...or customize it to what you want or need.


Just like Apple, they take the technology and limit it so the consumer can only use it in a dictated way. My point is that there are exactly zero reasons to have a console over a home theatre PC with games; there are only drawbacks, as I listed above. Thus there is no place in the market for any of these platforms over a PC in the living room.

To give some context of where I am coming from: years ago I did partake in the 'haughtier than thou' PC Master Race attitude to have a bit of fun at my friends' expense here and there. However, that came to a screeching halt when I saw the things that the console manufacturers were doing to their customers. Charging for the privilege to game online, as well as being left out of content because the console makers simply didn't give a damn about performing QC on developer additions in a timely manner. Games being more expensive while having quality dumbed down for performance, or some content stripped outright. Console boxes dying prematurely, yet being just out of warranty, etc.

I was absolutely shocked and outraged at the treatment console customers received. And yes this is coming from someone who has had to deal with Microsoft and their shenanigans. I have had nothing but sympathy for console gamers since; especially when I still hear horror stories of the crap they have to deal with. Since then I have seen the evidence play out, and there is absolutely no benefit being offered by consoles over PC.

Sorry to get soapboxy, but my pet peeve in life is when people (console makers in this case) take advantage of others and exploit them.
"They can't be upgraded. This ends up being wasteful as you can't reuse components by selectively upgrading one component at a time"

Agree, but the only really significant gains you usually get from upgrading come from the video card. A higher-end video card can be anywhere from $250-$700+, just about the price of buying a new console.

Intel and AMD have also been pretty consistent about changing socket types, which requires you to buy a new motherboard.

100% agree with you on build quality; MS screwed the pooch with the Xbox 360, but future incarnations have been pretty bug-free for me, and a lot of design changes to cooling have been made in the Xbox One X that allow it to have an even smaller form factor.

It's not a PC, so the OS is a closed system; you cannot run other programs on it, because that is not what it is designed to do. On the plus side, there is no "bloatware," and the OS is highly optimized for its specific task of gaming and entertainment.

100% agree it's a controlled environment, NDAs, etc. I don't think it's a good thing, but it does have its advantages too, as they use it to provide the best experience for their audience (as you pointed out).

Controlled pricing sucks, I agree, but console manufacturers do NOT make money on consoles in the first few years of the lifecycle; they are either sold at cost or at a loss to gain market share. They make money on games, so they need to control pricing or charge a ton more for the hardware. I was not a big fan of paying for internet gaming, but that has changed significantly. I usually pay $39 or less for Xbox Live with discounts I find online, and I get about 20+ games for free each year, games I will actually play. To me it is a great deal.

To build a PC for $249.00 that matches an Xbox One S or PS4 is a little challenging, IMO. To build a PC that matches the Xbox One X is next to impossible; you could get close for about $200 more. Again, you can always upgrade the PC, unlike a console.
 
