AMD vs. Intel and Nvidia - The Next-Gen GPU War is ON!

panteragstk

Audioholic Ninja
Nvidia has posted such strong performance numbers for their Ampere platform that AMD has its work cut out for it trying to match, or even get close to, that level of performance. I don't think Nvidia is sweating AMD's moves at the moment. I am rooting for AMD because strong competition makes for better product choices for us consumers.
Same boat, but AMD hasn't really been able to compete in the high-end GPU space for quite some time. They still make good GPUs, but I haven't owned one that wasn't for an HTPC in many years. The main reason is that their drivers are (or at least were) terrible to deal with.

I did recently grab a new AMD Ryzen 3900X for my main PC, simply because I was on an older Intel platform that had started showing its age. I typically don't upgrade my CPU but once a decade, so I opted for PCIe 4.0 support even though I don't "need" it.

I do think Intel is still doing great in the CPU market, but they've been on 14nm for a long time. That, and the number of socket changes they've made, is making their long-time customers mad. The guys that have to have the fastest gaming CPU will still buy Intel, but AMD has given them some stiff competition for the first time in a long while. It'd be nice to see AMD as the gaming king again after so many years.
 
shadyJ

Speaker of the House
Same boat, but AMD hasn't really been able to compete in the high-end GPU space for quite some time. They still make good GPUs, but I haven't owned one that wasn't for an HTPC in many years. The main reason is that their drivers are (or at least were) terrible to deal with.

I did recently grab a new AMD Ryzen 3900X for my main PC, simply because I was on an older Intel platform that had started showing its age. I typically don't upgrade my CPU but once a decade, so I opted for PCIe 4.0 support even though I don't "need" it.

I do think Intel is still doing great in the CPU market, but they've been on 14nm for a long time. That, and the number of socket changes they've made, is making their long-time customers mad. The guys that have to have the fastest gaming CPU will still buy Intel, but AMD has given them some stiff competition for the first time in a long while. It'd be nice to see AMD as the gaming king again after so many years.
I have used Radeon graphics cards for a long time and never had a problem with them. However, for my most recent build, I bought a 5600 XT based card, and I just could not get the thing to work correctly. It only started to get wonky after I installed the drivers. I got tired of wrestling with the issue and just replaced it with a GeForce RTX 2060, and I haven't had any problems since. The shame of it is that the 5600 XT is $100 less but capable of the same performance as the RTX 2060, when it works. My build didn't have any weird parts combination either; it was a very vanilla build. AMD could have so much more graphics card market share if they just had stable drivers. Stability should be their first priority. Who cares if you get 200 FPS if the machine isn't reliable?
 
Irvrobinson

Audioholic Spartan
Same boat, but AMD hasn't really been able to compete in the high-end GPU space for quite some time. They still make good GPUs, but I haven't owned one that wasn't for an HTPC in many years. The main reason is that their drivers are (or at least were) terrible to deal with.
Agree. AMD has recently won a small amount of share back, but the market for discrete client GPUs is not where the action is. It's in datacenter servers and supercomputers. The real threat to AMD probably isn't Nvidia, it's Intel's latest integrated Xe graphics, which will likely take a big bite out of the low end of the discrete client GPU market. The discrete Xe part, whenever it comes, if it comes, is also more likely to hurt AMD than Nvidia.

I did recently grab a new AMD Ryzen 3900X for my main PC, simply because I was on an older Intel platform that had started showing its age. I typically don't upgrade my CPU but once a decade, so I opted for PCIe 4.0 support even though I don't "need" it.
Geek. :) PCIe 4.0 does have an advantage for client computers, but it is doubtful customers will see it per se. For a given amount of throughput PCIe 4.0 consumes less power than 3.0, and halving the number of conductors for a given throughput requirement makes board design easier, though I suspect 4.0 may need a more expensive board material.

I do think Intel is still doing great in the CPU market, but they've been on 14nm for a long time. That, and the number of socket changes they've made, is making their long-time customers mad. The guys that have to have the fastest gaming CPU will still buy Intel, but AMD has given them some stiff competition for the first time in a long while. It'd be nice to see AMD as the gaming king again after so many years.
Ignore fab process "node numbers". Intel has had so many variations of their 14nm process that even Intel engineers have admitted in the press they lose track of which one a given product uses. Intel's new 10nm is impressive technology, as are their new "11th Generation" CPUs, but they are still behind the AMD/TSMC team overall IMO. Being late to market with PCIe 4.0 in the client market was mildly uncompetitive; in the server market, with the latest SSDs and 100Gb+ networks, it was pretty inexcusable. Here's a relatively non-technical blog on the new Intel client CPUs:


Here's a great analysis on the effects of fab process on different portions of ICs based on the circuit type:

 
panteragstk

Audioholic Ninja
I have used Radeon graphics cards for a long time and never had a problem with them. However, for my most recent build, I bought a 5600 XT based card, and I just could not get the thing to work correctly. It only started to get wonky after I installed the drivers. I got tired of wrestling with the issue and just replaced it with a GeForce RTX 2060, and I haven't had any problems since. The shame of it is that the 5600 XT is $100 less but capable of the same performance as the RTX 2060, when it works. My build didn't have any weird parts combination either; it was a very vanilla build. AMD could have so much more graphics card market share if they just had stable drivers. Stability should be their first priority. Who cares if you get 200 FPS if the machine isn't reliable?
Those are pretty much the issues I had with them over the years, though my AMD/ATI cards were all for HTPC builds. At the time AMD/ATI had a decent picture-quality advantage over Nvidia (things have leveled out in that area now), so I used them in every HTPC I had without issue for years. Then I started having random driver problems: de-interlacing would break, a fix would come out, but it would break something else. It got annoying, so I just switched to Intel integrated graphics and didn't look back.

My last HTPC had a 1030 in it and it worked great. I don't need an HTPC any longer so I have a spare GPU now.

I've heard rumors of AMD selling off its GPU division, but I don't think they're true. AMD is (and has been) the go-to for game consoles; I don't think they'd give that business up.
 
panteragstk

Audioholic Ninja
Agree. AMD has recently won a small amount of share back, but the market for discrete client GPUs is not where the action is. It's in datacenter servers and supercomputers. The real threat to AMD probably isn't Nvidia, it's Intel's latest integrated Xe graphics, which will likely take a big bite out of the low end of the discrete client GPU market. The discrete Xe part, whenever it comes, if it comes, is also more likely to hurt AMD than Nvidia.



Geek. :) PCIe 4.0 does have an advantage for client computers, but it is doubtful customers will see it per se. For a given amount of throughput PCIe 4.0 consumes less power than 3.0, and halving the number of conductors for a given throughput requirement makes board design easier, though I suspect 4.0 may need a more expensive board material.



Ignore fab process "node numbers". Intel has had so many variations of their 14nm process that even Intel engineers have admitted in the press they lose track of which one a given product uses. Intel's new 10nm is impressive technology, as are their new "11th Generation" CPUs, but they are still behind the AMD/TSMC team overall IMO. Being late to market with PCIe 4.0 in the client market was mildly uncompetitive; in the server market, with the latest SSDs and 100Gb+ networks, it was pretty inexcusable. Here's a relatively non-technical blog on the new Intel client CPUs:


Here's a great analysis on the effects of fab process on different portions of ICs based on the circuit type:

The only reason I decided to jump to PCIe 4.0 is that my eyes are on a new RTX 3080, and from what I read the 2080 Ti was bottlenecked when using only PCIe 3.0 x8 (which is how my setup was configured), so I "might" have had an issue getting full performance. That's no longer an issue, and I can use the new crazy-fast SSDs that have no real-world benefit yet, but it's still cool tech.

I agree the node numbers don't mean much; my point was more about how many issues they had shrinking their process. At least, I've read of some issues; I don't keep up THAT much with the nitty-gritty of how the chips are made. The 10th-gen parts are still great chips, but AMD isn't really playing catch-up anymore, so we'll see an interesting battle between the two in the consumer space.

I think Intel has had the data center market cornered for a long time. I know AMD has their EPYC line, but I don't hear much about it. Not that I'm in that space.

You are correct that the low-end GPU market will suffer for AMD and Nvidia, but AMD has been winning the APU battle for a long time. Intel has been saying they're upping their GPU game for a long, long time, and we still don't have a discrete GPU from them. If we get one, I doubt it'll be all that impressive. Could be wrong, though.
 
Irvrobinson

Audioholic Spartan
The only reason I decided to jump to PCIe 4.0 is that my eyes are on a new RTX 3080, and from what I read the 2080 Ti was bottlenecked when using only PCIe 3.0 x8 (which is how my setup was configured), so I "might" have had an issue getting full performance. That's no longer an issue, and I can use the new crazy-fast SSDs that have no real-world benefit yet, but it's still cool tech.
Hmmm... I thought a card like that would use an x16 connector, not an x8. Shows you how little I know about client GPUs.
 
panteragstk

Audioholic Ninja
Hmmm... I thought a card like that would use an x16 connector, not an x8. Shows you how little I know about client GPUs.
It does, but the board limits bandwidth when using two cards, so my x16 slot was running at x8. Granted, there was an issue with either the motherboard or the CPU, so I was actually only getting x4 bandwidth out of the x16 slot.

Stuff was broke, now it's not. New system should last me quite a while.
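For anyone curious about the numbers behind the x16/x8 discussion above, here's a rough back-of-the-envelope sketch (my assumptions: 128b/130b encoding for Gen3/Gen4 and no protocol overhead, so real-world figures run a bit lower):

```python
# Rough per-direction PCIe link bandwidth: raw line rate scaled by
# the 128b/130b encoding used by Gen3 and Gen4 (overhead ignored).
LINE_RATE_GT = {3: 8.0, 4: 16.0}  # GT/s per lane

def pcie_gbytes_per_sec(gen: int, lanes: int) -> float:
    payload_gbits = LINE_RATE_GT[gen] * 128 / 130 * lanes
    return payload_gbits / 8  # bits -> bytes

print(f"Gen3 x16: {pcie_gbytes_per_sec(3, 16):.1f} GB/s")  # ~15.8
print(f"Gen3 x8:  {pcie_gbytes_per_sec(3, 8):.1f} GB/s")   # ~7.9
print(f"Gen4 x16: {pcie_gbytes_per_sec(4, 16):.1f} GB/s")  # ~31.5
```

Since Gen4 doubles the per-lane rate, a Gen4 x8 link carries the same payload as a Gen3 x16 link, which is also why the board can halve the lane count without giving up throughput.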
 
Irvrobinson

Audioholic Spartan
It does, but the board limits bandwidth when using two cards, so my x16 slot was running at x8. Granted, there was an issue with either the motherboard or the CPU, so I was actually only getting x4 bandwidth out of the x16 slot.

Stuff was broke, now it's not. New system should last me quite a while.
Ah ha! That is so stupid I never would have guessed it. Glad you found a solution.
 
Cos

Audioholic Field Marshall
As someone who picked up NVDA between $165 and $189 a share, I can't complain. I think the ARM acquisition makes sense because it fits into their AI strategy: Nvidia DGX systems use the processing power of their hardware for deep-learning training and accelerated analytics. I think that provides more highly profitable future growth than video cards.
 
panteragstk

Audioholic Ninja
As someone who picked up NVDA between $165 and $189 a share, I can't complain. I think the ARM acquisition makes sense because it fits into their AI strategy: Nvidia DGX systems use the processing power of their hardware for deep-learning training and accelerated analytics. I think that provides more highly profitable future growth than video cards.
Yep. They'd better not screw up ARM's business model, though. That's where the value comes from. I get that Nvidia also makes ARM variants, but they don't want to screw themselves out of all the licensing money they'll get from pretty much everyone.

Even at $500 a share, their stock still looks good. I should have bought some YEARS ago, but that's how it goes with any company that starts doing well.
 
Cos

Audioholic Field Marshall
Yep. They'd better not screw up ARM's business model, though. That's where the value comes from. I get that Nvidia also makes ARM variants, but they don't want to screw themselves out of all the licensing money they'll get from pretty much everyone.

Even at $500 a share, their stock still looks good. I should have bought some YEARS ago, but that's how it goes with any company that starts doing well.
I have been lucky: got in on Apple when they started making iPods, bought Skyworks because they made chips for Apple phones, and got in early enough on Shopify. Notice I am not talking about some of my bad picks... don't want to think about those ;)
 
