News Crypto Miners Fool Nvidia's Anti-Mining Limiter With $6 HDMI Dummy Plug

I was just saying they are increasing their output of products; that 9.2% AMD share is bigger than it was last year, so we will see more products in 2021 compared to 2020. It just takes 3-4 months for new orders to get to stores.
A lot of those 7nm wafers are undoubtedly going toward console APUs though. The PS5 and Series X use chips roughly 4 to 4.5 times the size of the Zen 3 chiplets used in their CPUs, and Microsoft and Sony combined might be selling around 20 million consoles during their first year on the market. The contracts for these parts were likely put in place years in advance, so AMD needs to fulfill them, despite the parts not likely making them much profit per wafer.
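The wafer math above can be sketched roughly. The die areas and yield below are illustrative assumptions (a console APU around 4-4.5x the area of a Zen 3 chiplet, per the estimate above), not official figures:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Approximate gross die candidates per wafer using the common
    wafer-area-minus-edge-loss estimate."""
    r = wafer_diameter_mm / 2
    return int(
        math.pi * r**2 / die_area_mm2
        - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    )

# Assumed figures: ~360 mm^2 console APU vs ~81 mm^2 Zen 3 chiplet,
# both on a standard 300 mm wafer.
console_dies = dies_per_wafer(300, 360)
chiplet_dies = dies_per_wafer(300, 81)
print(f"console APU candidates per wafer: {console_dies}")
print(f"Zen 3 chiplet candidates per wafer: {chiplet_dies}")

# With an assumed 70% yield, wafers needed for ~20 million console APUs:
wafers_for_consoles = 20_000_000 / (console_dies * 0.7)
print(f"wafers needed for consoles: {wafers_for_consoles:,.0f}")
```

Under those assumptions, a single wafer yields roughly five times as many chiplet candidates as console APU candidates, and the console contracts alone would tie up a large six-figure wafer count, which is why the graphics chips get whatever capacity is left over.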

Their graphics cards also utilize large chips, and while those are likely more profitable on a per-wafer basis than the console parts, they are not likely to be nearly as profitable as the chiplets that go into their Zen 3 CPUs, and AMD likely doesn't have the same kinds of guarantees with their board partners to make the chips available in large quantities. So, with their CPUs being in short supply, and probably being their most profitable use for their limited manufacturing capacity, I suspect most of the capacity that they are not contractually obligated to put toward console chips is being put toward their CPUs. The graphics card chips may only get a relatively small amount of their available capacity, just enough to keep the board partners from getting too upset. These cards may be as rare as the Vega 56 and 64 cards that AMD launched during the last crypto fad, which were pretty much nonexistent until after the mining rush subsided.

2. Turing sales were vastly below Nvidia's expectations. They were disappointed in sales, but Turing had the worst MSRP/performance ratio in a long, long time. This forced Nvidia to roll out the 1660 Ti and below, but even that didn't sell as well as they hoped. This was all during mining lows. This is more subjective proof that true gamers who don't mine will balk at the price.
While I would agree that mining is by far the number one reason why graphics card prices are high, I wouldn't say Nvidia was "forced" to roll out 16-series cards. Those were undoubtedly planned from the start. The 1660 Ti was released just five months after the initial Turing offerings, and they obviously didn't design and roll out a new graphics chip and new card designs in just a few months.

Intel has had its own in-house IGPs for something like 20 years and Xe is its new IGP thing, so I wouldn't worry about Xe going away any time soon even if Intel decides to scrap the discrete consumer variants.
Xe might not go away, but Intel doesn't always support things particularly well in the long-term, and they generally haven't provided all that much driver support for their integrated graphics post-release. I suspect driver support may improve when they are initially trying to get a foothold in the dedicated graphics space, but who knows how long that will last, and how support will be down the line. They might end up being a good line of cards, but I think the pricing is going to need to be very competitive to change people's preconceived notions of Intel graphics not being particularly good.
 
Xe might not go away, but Intel doesn't always support things particularly well in the long-term, and they generally haven't provided all that much driver support for their integrated graphics post-release.
The latest drivers for Intel's HD2500-4000 graphics were released in October 2020. I'd say that's a decently long while, though updates for old-ish IGPs are only for security reasons. That said, I doubt AMD and Nvidia put much effort into optimizing their all-in-one drivers for GPUs older than two generations either.
 
Intel has had its own in-house IGPs for something like 20 years and Xe is its new IGP thing, so I wouldn't worry about Xe going away any time soon even if Intel decides to scrap the discrete consumer variants.

Your statement is based on the fact that Intel has made iterative advancements on IGPs for 20 years, and assumes Xe and game-ready drivers will follow the same path. They are very much apples and oranges in comparison. This architecture and business model have very little in common with IGPs.
 
The latest drivers for Intel's HD2500-4000 graphics were released in October 2020. I'd say that's a decently long while, though updates for old-ish IGPs are only for security reasons. That said, I doubt AMD and Nvidia put much effort into optimizing their all-in-one drivers for GPUs older than two generations either.
GCN 1 is still supported, mainly because the changes across GCN generations were minimal and the architecture is still in use in APUs.