AMD vs. Intel and Nvidia - The Next-Gen GPU War is ON!

panteragstk

Audioholic Spartan
Nvidia has posted such strong performance numbers for their Ampere platform that AMD has its work cut out for it trying to match or even come close to that level of performance. I don't think Nvidia is sweating AMD's moves at the moment. I am rooting for AMD because strong competition makes for better product choices for us consumers.
Same boat, but AMD hasn't really been able to compete in the GPU high end for quite some time. They still make good GPUs, but I haven't owned one that wasn't for an HTPC in many years. The main reason: their drivers are, or at least were, somewhat terrible to deal with.

I did recently grab a new AMD Ryzen 3900x for my main PC simply because I was on an older Intel platform that had started showing its age. I typically don't upgrade my CPU but once a decade, so I opted for PCIe 4.0 support even though I don't "need" it.

I do think Intel is still doing great in the CPU market, but they've been on 14nm for a long time. That and the amount of socket changes they've made are making their long-time customers mad. The guys that have to have the fastest gaming CPU will still buy them, but AMD has given them some stiff competition for the first time in a long time. It'd be nice to see AMD as the gaming king again after so many years.
 

shadyJ

Speaker of the House
Same boat, but AMD hasn't really been able to compete in the GPU high end for quite some time. They still make good GPUs, but I haven't owned one that wasn't for an HTPC in many years. The main reason: their drivers are, or at least were, somewhat terrible to deal with.

I did recently grab a new AMD Ryzen 3900x for my main PC simply because I was on an older Intel platform that had started showing its age. I typically don't upgrade my CPU but once a decade, so I opted for PCIe 4.0 support even though I don't "need" it.

I do think Intel is still doing great in the CPU market, but they've been on 14nm for a long time. That and the amount of socket changes they've made are making their long-time customers mad. The guys that have to have the fastest gaming CPU will still buy them, but AMD has given them some stiff competition for the first time in a long time. It'd be nice to see AMD as the gaming king again after so many years.
I have used Radeon graphics cards for a long time and never had a problem with them. However, for my most recent build, I bought a 5600XT based card. I just could not get the thing to work correctly. It only started to get wonky after I installed the drivers. I became tired of wrestling with the issue and just replaced it with a GeForce RTX 2060 and haven't had any problems. The shame of it was that the 5600XT is $100 less but capable of the same performance as the RTX 2060, when it works. My build didn't have any weird parts combination either; it was a very vanilla build. AMD could have so much more graphics card market share if they just had stable drivers. Stability should be their first priority. Who cares if you get 200FPS if the machine isn't reliable?
 
Irvrobinson

Audioholic Spartan
Same boat, but AMD hasn't really been able to compete in the GPU high end for quite some time. They still make good GPUs, but I haven't owned one that wasn't for an HTPC in many years. The main reason: their drivers are, or at least were, somewhat terrible to deal with.
Agree. AMD has recently won a small amount of share back, but the market for discrete client GPUs is not where the action is. It's in datacenter servers and supercomputers. The real threat to AMD probably isn't Nvidia, it's Intel's latest integrated Xe graphics, which will likely take a big bite out of the low end of the discrete client GPU market. The discrete Xe part, whenever it comes, if it comes, is also more likely to hurt AMD than Nvidia.

I did recently grab a new AMD Ryzen 3900x for my main PC simply because I was on an older Intel platform that had started showing its age. I typically don't upgrade my CPU but once a decade, so I opted for PCIe 4.0 support even though I don't "need" it.
Geek. :) PCIe 4.0 does have an advantage for client computers, but it is doubtful customers will see it per se. For a given amount of throughput PCIe 4.0 consumes less power than 3.0, and halving the number of conductors for a given throughput requirement makes board design easier, though I suspect 4.0 may need a more expensive board material.
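To put rough numbers on the halving-the-conductors point, here's a back-of-the-envelope sketch; the per-lane rates and the `link_bandwidth` helper are my own approximations (128b/130b encoding at 8 and 16 GT/s), not anything from the thread:

```python
# Approximate one-direction PCIe bandwidth per lane, in GB/s.
# Gen3 runs 8 GT/s and Gen4 runs 16 GT/s, both with 128b/130b encoding.
PER_LANE_GBPS = {"3.0": 8 * 128 / 130 / 8, "4.0": 16 * 128 / 130 / 8}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Approximate one-direction link bandwidth in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

# A Gen4 x8 link matches a Gen3 x16 link with half the conductors.
print(round(link_bandwidth("3.0", 16), 2))  # 15.75
print(round(link_bandwidth("4.0", 8), 2))   # 15.75
```

Same throughput, half the traces, which is where the routing and power savings come from.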

I do think Intel is still doing great in the CPU market, but they've been on 14nm for a long time. That and the amount of socket changes they've made are making their long-time customers mad. The guys that have to have the fastest gaming CPU will still buy them, but AMD has given them some stiff competition for the first time in a long time. It'd be nice to see AMD as the gaming king again after so many years.
Ignore fab process "node numbers". Intel has had so many variations of their 14nm process even Intel engineers have admitted in the press they lose track of which one their product is using. Intel's new 10nm is impressive technology, as are their new "11th Generation" CPUs, but they are still behind the AMD/TSMC team overall IMO. Being late to market with PCIe 4.0 in the client market was mildly uncompetitive, in the server market it was pretty inexcusable with the latest SSDs and 100Gb+ networks. Here's a relatively non-technical blog on the new Intel client CPUs:


Here's a great analysis on the effects of fab process on different portions of ICs based on the circuit type:

 
panteragstk

Audioholic Spartan
I have used Radeon graphics cards for a long time and never had a problem with them. However, for my most recent build, I bought a 5600XT based card. I just could not get the thing to work correctly. It only started to get wonky after I installed the drivers. I became tired of wrestling with the issue and just replaced it with a GeForce RTX 2060 and haven't had any problems. The shame of it was that the 5600XT is $100 less but capable of the same performance as the RTX 2060, when it works. My build didn't have any weird parts combination either; it was a very vanilla build. AMD could have so much more graphics card market share if they just had stable drivers. Stability should be their first priority. Who cares if you get 200FPS if the machine isn't reliable?
That's pretty much the issue I had with them over the years, but my AMD/ATI cards were all for HTPC builds. At the time, AMD/ATI had a decent picture-quality advantage over Nvidia (things have leveled out in that area now), so I used them in every HTPC I had without issue for years. Once I started having random driver issues (de-interlacing would break, a fix would come out, but it would break something else), it got annoying, so I just switched to Intel integrated graphics and didn't look back.

My last HTPC had a 1030 in it and it worked great. I don't need an HTPC any longer so I have a spare GPU now.

I've heard rumors of AMD selling the GPU division, but I don't think they're true. AMD is (and has been) the go-to for game consoles. I don't think they'd give that business up.
 
panteragstk

Audioholic Spartan
Agree. AMD has recently won a small amount of share back, but the market for discrete client GPUs is not where the action is. It's in datacenter servers and supercomputers. The real threat to AMD probably isn't Nvidia, it's Intel's latest integrated Xe graphics, which will likely take a big bite out of the low end of the discrete client GPU market. The discrete Xe part, whenever it comes, if it comes, is also more likely to hurt AMD than Nvidia.



Geek. :) PCIe 4.0 does have an advantage for client computers, but it is doubtful customers will see it per se. For a given amount of throughput PCIe 4.0 consumes less power than 3.0, and halving the number of conductors for a given throughput requirement makes board design easier, though I suspect 4.0 may need a more expensive board material.



Ignore fab process "node numbers". Intel has had so many variations of their 14nm process even Intel engineers have admitted in the press they lose track of which one their product is using. Intel's new 10nm is impressive technology, as are their new "11th Generation" CPUs, but they are still behind the AMD/TSMC team overall IMO. Being late to market with PCIe 4.0 in the client market was mildly uncompetitive, in the server market it was pretty inexcusable with the latest SSDs and 100Gb+ networks. Here's a relatively non-technical blog on the new Intel client CPUs:


Here's a great analysis on the effects of fab process on different portions of ICs based on the circuit type:

The only reason I decided to jump to PCIe 4.0 is that my eyes are on a new RTX 3080 GPU, and from what I read, the 2080 Ti bottlenecked when only using PCIe 3.0 x8 (which is how my setup was), so I "might" have had an issue getting full performance. No longer an issue, and I can use the new crazy-fast SSDs that have no real-world benefit, but it's still cool tech.

I agree, the node numbers don't mean much, but my point was more about how many issues they had shrinking their process. At least I've read of some issues; I don't keep up THAT much with the nitty-gritty of how all the chips are made. The 10th series is still a great chip, but AMD isn't really playing catch-up anymore, so we'll see an interesting battle between the two in the consumer space.

I think Intel has had the data center market cornered for a long time. I know AMD has their EPYC line, but I don't hear much about them. Not that I'm in that space.

You are correct that the low-end GPU market will suffer for AMD and Nvidia, but AMD has been winning the APU battle for a long time. Intel has been saying they're upping their GPU game for a long, long time, and we still don't have a discrete GPU from them. If we get one, I doubt it'll be all that impressive. Could be wrong, though.
 
Irvrobinson

Audioholic Spartan
The only reason I decided to jump to PCIe 4.0 is that my eyes are on a new RTX 3080 GPU, and from what I read, the 2080 Ti bottlenecked when only using PCIe 3.0 x8 (which is how my setup was), so I "might" have had an issue getting full performance. No longer an issue, and I can use the new crazy-fast SSDs that have no real-world benefit, but it's still cool tech.
Hmmm... I thought a card like that would use an x16 connector, not an x8. Shows you how little I know about client GPUs.
 
panteragstk

Audioholic Spartan
Hmmm... I thought a card like that would use an x16 connector, not an x8. Shows you how little I know about client GPUs.
It does, but the board limits bandwidth when using two cards, so my x16 slot was running at x8 bandwidth. Granted, there was an issue with either the motherboard or the CPU, so I was actually only getting x4 bandwidth out of the x16 slot.

Stuff was broke, now it's not. New system should last me quite a while.
 
Irvrobinson

Audioholic Spartan
It does, but the board limits bandwidth when using two cards, so my x16 slot was running at x8 bandwidth. Granted, there was an issue with either the motherboard or the CPU, so I was actually only getting x4 bandwidth out of the x16 slot.

Stuff was broke, now it's not. New system should last me quite a while.
Ah ha! That is so stupid I never would have guessed it. Glad you found a solution.
 
Cos

Audioholic Field Marshall
As someone who picked up NVDA between 165 and 189 a share, I can't complain. I think the ARM acquisition makes sense because it fits into their AI strategy. Nvidia DGX uses the processing power of their hardware for deep-learning training and accelerated analytics. I think that provides for more future, highly profitable growth than video cards.
 
panteragstk

Audioholic Spartan
As someone who picked up NVDA between 165 and 189 a share, I can't complain. I think the ARM acquisition makes sense because it fits into their AI strategy. Nvidia DGX uses the processing power of their hardware for deep-learning training and accelerated analytics. I think that provides for more future, highly profitable growth than video cards.
Yep. They'd better not screw up ARM's business model, though. That's where the value comes from. I get that Nvidia also makes ARM variants, but they don't want to screw themselves out of all that licensing money they get from pretty much everyone.

Even at $500 a share, their stock still looks good. I should have bought some YEARS ago, but that's how it goes with any company that starts doing well.
 
Cos

Audioholic Field Marshall
Yep. They'd better not screw up ARM's business model, though. That's where the value comes from. I get that Nvidia also makes ARM variants, but they don't want to screw themselves out of all that licensing money they get from pretty much everyone.

Even at $500 a share, their stock still looks good. I should have bought some YEARS ago, but that's how it goes with any company that starts doing well.
I have been lucky: got in on Apple when they started making iPods, bought Skyworks because they made chips for Apple phones, and got in early enough on Shopify. Notice I am not talking about some of my bad picks... don't want to think about those ;)
 
BMXTRIX

Audioholic Warlord
As much as I love the nVidia cards, they are still beyond my budget and my needs. I do think that nVidia is taking aim at the PS5 and the new Xbox. They have got to be thinking about AMD, but everything about their pricing seems aimed at the next-generation consoles and making PC gaming affordable to the masses. With the recent PS5 announcement, it is clear that consoles are still going to be relevant for another decade.

My kids have 1080 and 2070S cards, but my brand new PC only has a 1060 in it. I'm okay with that as it blows away my previous AMD Radeon whatever I had.

One of the tech channels, maybe Linus Tech Tips... no wait, Gamers Nexus did a PCIe G4 vs. G3 shootout for consumers. They basically proved there is a difference between the two, but that it is completely irrelevant. So, you know, progress for the sake of progress.
 
panteragstk

Audioholic Spartan
As much as I love the nVidia cards, they are still beyond my budget and my needs. I do think that nVidia is taking aim at the PS5 and the new Xbox. They have got to be thinking about AMD, but everything about their pricing seems aimed at the next-generation consoles and making PC gaming affordable to the masses. With the recent PS5 announcement, it is clear that consoles are still going to be relevant for another decade.

My kids have 1080 and 2070S cards, but my brand new PC only has a 1060 in it. I'm okay with that as it blows away my previous AMD Radeon whatever I had.

One of the tech channels, maybe Linus Tech Tips... no wait, Gamers Nexus did a PCIe G4 vs. G3 shootout for consumers. They basically proved there is a difference between the two, but that it is completely irrelevant. So, you know, progress for the sake of progress.
For sure. There isn't a difference...yet. When graphics cards get bandwidth hungry enough we may see a difference, but not till then. I just figured, I'm upgrading, so why not go with the latest stuff.

We'll see how long it takes for the 30xx series cards to get back in stock. I don't want to have to upgrade for a few years.
 
Sheep

Audioholic Warlord
For sure. There isn't a difference...yet. When graphics cards get bandwidth hungry enough we may see a difference, but not till then. I just figured, I'm upgrading, so why not go with the latest stuff.

We'll see how long it takes for the 30xx series cards to get back in stock. I don't want to have to upgrade for a few years.
You'll see more benefit in storage speeds right now, as the PCIe pipeline serves NVMe storage as well. From what I've read, the power requirements for PCIe 4.0 are also reduced, meaning video cards can run cooler and quieter.

I'm in a tough spot right now, as I recently rebuilt my PC (~1.5 years ago) with an i7 8700K and 32GB of RAM, but my video card is only an R7 370, an emergency replacement for the previous card whose cooler failed. I don't game anymore, but I use a 4K monitor for photo editing, which puts a decent load on my system, and GPU acceleration is available via Lightroom (though it only works decently with top-tier cards). I hope the RDNA2 cards are good and follow the bang-for-buck trend AMD is known for. Seems getting anything 3000-series Nvidia is not possible with current stock levels.
 
panteragstk

Audioholic Spartan
You'll see more benefit in storage speeds right now, as the PCIe pipeline serves NVMe storage as well. From what I've read, the power requirements for PCIe 4.0 are also reduced, meaning video cards can run cooler and quieter.

I'm in a tough spot right now, as I recently rebuilt my PC (~1.5 years ago) with an i7 8700K and 32GB of RAM, but my video card is only an R7 370, an emergency replacement for the previous card whose cooler failed. I don't game anymore, but I use a 4K monitor for photo editing, which puts a decent load on my system, and GPU acceleration is available via Lightroom (though it only works decently with top-tier cards). I hope the RDNA2 cards are good and follow the bang-for-buck trend AMD is known for. Seems getting anything 3000-series Nvidia is not possible with current stock levels.
I have an NVMe drive now, but it's PCIe 3.0. It did get a boost, but nothing like the new ones that have been announced/tested. That will be the plan eventually, especially since my board can run three NVMe slots in RAID 0 to get crazy performance numbers. Not that I need that, but it'd be more about having access to all the drives as one rather than as three separate ones, which would be annoying to deal with.
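For what it's worth, ideal RAID 0 scaling is easy to sketch; the ~3.5 GB/s per-drive figure and the link cap below are hypothetical numbers picked for illustration, not specs from this build:

```python
def raid0_throughput(per_drive_gbps: float, drives: int, link_cap_gbps: float) -> float:
    """Ideal striped sequential throughput, capped by the PCIe link feeding the array."""
    return min(per_drive_gbps * drives, link_cap_gbps)

# Three hypothetical ~3.5 GB/s Gen3 drives behind a ~15.75 GB/s uplink.
print(raid0_throughput(3.5, 3, 15.75))  # 10.5
```

Real-world numbers land well below that ideal once chipset uplink sharing and controller overhead get involved.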
 
Sheep

Audioholic Warlord
I have an NVMe drive now, but it's PCIe 3.0. It did get a boost, but nothing like the new ones that have been announced/tested. That will be the plan eventually, especially since my board can run three NVMe slots in RAID 0 to get crazy performance numbers. Not that I need that, but it'd be more about having access to all the drives as one rather than as three separate ones, which would be annoying to deal with.
Be careful with boards fully populated with NVMe drives; sometimes filling all the slots can cut into your main PCIe slot (x16 down to x8).
 
panteragstk

Audioholic Spartan
Be careful with boards fully populated with NVMe drives; sometimes filling all the slots can cut into your main PCIe slot (x16 down to x8).
That's why I bought a new system. That's what was happening on my old board, so I bought a new platform that doesn't have that problem. The CPU has four dedicated lanes for one NVMe drive, and the chipset has eight more for NVMe.

Basically, 24 lanes on the CPU and another 16 in the chipset. I won't be running out any time soon.

https://www.guru3d.com/articles-pages/amd-ryzen-9-3950x-review,4.html
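The lane tally above works out as a quick sketch; the per-device breakdown below is an assumed, typical X570-style allocation, not a statement about this exact board:

```python
# Hypothetical lane budget matching the counts cited above.
cpu_lanes = {"GPU slot (x16)": 16, "NVMe (x4)": 4, "chipset uplink (x4)": 4}
chipset_lanes = {"extra NVMe (x8)": 8, "other I/O (x8)": 8}

print(sum(cpu_lanes.values()))      # 24
print(sum(chipset_lanes.values()))  # 16
```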
 

pcosmic

Audioholic
I bought a whole buncha AMD stock as soon as I heard Lisa Su was taking over AMD. I have known that high-IQ nerd from a long time ago... bwaaahaha

Now I watch gleefully as she sinks the Intel/Nvidia low-IQ creatures and makes money for me! Long live Lisa! Long live AMD!
 
