Nvidia Unlikely To Unveil 2018 Graphics Cards At GDC, GTC

Status
Not open for further replies.
I doubt we are going to see these cards before October. Nvidia is already making so much money off its current lineup.

Also, the new cards are supposed to use HBM2 or a new memory interface... given the current memory situation, the prices are going to be insane once you add the Nvidia tax (they love to price their products with a healthy profit margin).

A Vega card is already selling for 2-3 times its MSRP; I don't expect things to change anytime soon.
 
What they need to do is make a sub-category of cards for the asshats who are into mining. That way, when it comes time for the majority of the world to upgrade their PCs, we are not having to pay $1,000 for a $600 card. I am sure I am not the first person to have thought of this, but either way, I said it lol.
 


Don't think it's official yet -- Techspot's article only has the 20-series numbers in its title, but the body of the article indicates they could be the 11-series or the 20-series (https://www.techspot.com/news/73456-nvidia-new-geforce-cards-rumored-arrive-next-month.html).

Although, while nVidia & AMD both have a history of "restarting" their number sequences, historically they tend to lower the numbers for the newer sequence -- i.e. the top-line card in the generation preceding the GTS 150 was the 9800 GTX/GTX+. And while it kind of made sense when they skipped over 800-series GPUs (used in laptop models but not the desktop ones), I don't see any purpose in moving from 1000-series to 2000-series...unless they're hoping the unwashed masses will assume they're twice as powerful ("2000 is 2 times as big as 1000!!")...
 

You're correct in that what you're suggesting has been said 100 times before. But AMD has already said that a large part of the supply issue (and the associated price gouging) has to do with VRAM shortages, meaning that creating a separate line of mining GPUs wouldn't help anything.
 
More time, then, that computer builders will have to endure the pain of high prices and lack of supply.

With supply channels empty for the 10 series, you would like to believe that Nvidia will start building the new cards sooner rather than later (both to make money and to keep their customers happy).
 

That won't make anywhere near as much of a difference as you think it will: most alt-coins are memory-intensive, and memory is currently the main bottleneck to increasing GPU production. If you want alt-coin miners to buy your crypto-oriented CPU/GPU, you need to make it more cost-effective than GPU-mining. If you do that, though, pressure on the RAM supply will get even worse. If you restrain alt-coin miners' memory supply to increase GPU production, alt-coin miners will get back to GPU-mining. If you sink more RAM into alt-coin miners, then you still have no RAM to make more GPUs.

Rinse and repeat for every type of component common to GPUs and alt-coin miners. The only thing that can 'fix' this is popping the crypto bubble.
 


Or we teach miners how to use the big FPGAs that Intel is marketing for AI use, as those have more than enough memory to mine "FPGA-resistant" crypto. Sure, a dev board is over $2k, but so are some GPUs.
 


As a miner I guess I deserved that ...

They do make mining-specific cards, i.e. cards without any graphics outputs, like the Gigabyte P106-100 6G:

https://www.gigabyte.com/Graphics-Card/GV-NP106D5-6G-rev-11-12-13#kf

The issue is they don't make enough of them, since the manufacturers play it safe: nobody wants to be stuck with thousands of cards if the crypto market were to suddenly collapse and never come back.

On top of that, the warranty is only 3 months, since the manufacturer knows what the intended use will be.
 
Well, it still does - it's just a matter of sitting and waiting and getting lucky enough to snag, say, a video card in the 3-1/2 minute window when one shows up at MSRP.

Never thought I'd be glad to pay the $50-over-MSRP-premium for a Founders Edition card, or that such a price would be the best deal that I could possibly get.
 
Kind of a true/false statement about Nvidia not being pushed to create newer/faster products. Yeah, AMD has a long way to catch up; they have for years. But games are still pushing for newer/faster architecture. There are AAA games a year old that even SLI 1080 Tis can't keep up with on max settings in 4K, let alone a single card. That was Nvidia's big push when talking about the 10-series lineup: being able to play in 4K. Yeah, they can, but not anywhere near max settings in a LOT of games. And that's not even counting games that are about to be released, such as Far Cry 5. I'm willing to bet that a pair of 1080 Tis won't be able to run that on Ultra in 4K.
 
There are 3 million-plus ex-mining cards that will eventually come back on the market, and many are top tier. When they do, nVidia and AMD will not profit, nor will they sell as many of their newest releases to gamers. Some used sales will go to other miners looking to upgrade; the rest will go to gamers. If gamers can't afford the latest tech, they can at least upgrade to a higher-tier used card, right? If crypto continues, those upgrade replacement sales will mostly go to miners. These GPU makers should have whipped up miner-class cards early on; if they do it now, and those cards are cheaper and more efficient, then a huge number of cards will be dumped and new card sales for gaming will tank. There are millions of gamers who would be very happy to buy an RX 470/480/570/580 or GTX 1070/1080 for a good price.
 
Having just bought a nice 4K 120 Hz TV, I'm in the market this October for a new graphics card. Whether it's called 1180 Ti or 2080 Ti doesn't matter; as long as they release a new card this year, I'll be happy. The big question for me is whether Nvidia will officially support the new HDMI 2.1 specification and push all the businesses making Nvidia graphics cards to make sure the new cards meet it. Otherwise I will have to consider AMD for a new graphics card, as they have already announced support for HDMI 2.1. I would really like to stay away from AMD for a new graphics card, though, so hopefully Nvidia will support HDMI 2.1 with the announcement of the new cards in the near future, as I seriously need a replacement for my GTX 1060.
 
That's another reason Nvidia doesn't need to push out faster hardware: most people do not play at 4K. 4K just hasn't caught on to the point that they have to worry about satisfying demand for it. It would take 4K monitor prices dropping drastically before the majority of gamers switched over, thus increasing the demand for higher-end cards.
 
Didn't they boast, not that long ago, about their unified architecture and how it kept them going strong? And now... we've got what looks like two or three different architectures going at the same time, if I understand things correctly.
 

At least three problems with this approach:

    1) Dedicated mining cards would need to be cheaper to be attractive to miners, otherwise why not just buy the full GPU?

    2) As mentioned above, there are component shortages (particularly RAM), so making dedicated mining cards doesn't really solve the supply issues anyway.

    3) Card resale value still matters to miners. If the bottom falls out of the mining market you want to be able to offload your cards. In that case a full gaming GPU retains at least some value, even in a second hand market flooded with ex-mining GPUs, while a dedicated mining card with no video outputs would be little more than a paperweight.
 

If Nvidia's supply channel is empty, that means they're selling cards as fast as they can make them, so they're obviously making money...
And as has been said, limited VRAM supply is a large contributor to the graphics card shortage, meaning that even if Nvidia were to release a new lineup, graphics card availability as a whole might not improve at all.
 

Eh, I think the real issue with those "mining" cards is that they are the exact same as a regular graphics card (GTX 1060 in the example you linked) except without display outputs. Meaning they perform no better, while having little to no resale value (and reduced warranty like you said). Unless you could get them for significantly cheaper than the equivalent regular graphics card, they never really made sense to begin with regardless of availability.
 


...it's also pretty much the end of the line if you need to upgrade. Crikey, a 1080 Ti today would cost almost as much as the entire system it would go in, which I built several years ago.

Still don't like prebuilds, as they often come loaded with crapware (including W10 Home Edition, which I put in that category), and they always seem to scrimp on something important like the case (poor airflow or too small) or a PSU barely adequate to support what the system has installed.

Were I to build a "new" system, it would be primarily for CPU rendering using older-generation components: a low-power GPU to run the displays, as many CPU cores as possible (dual 8-10 core Xeons), and something like 128 GB of quad-channel DDR3 ECC memory.

...oh and running on W7 Pro.

Wouldn't be as fast as GPU rendering, but then I don't have to worry about exceeding VRAM limits, as I do large-resolution-format, gallery-quality images that would likely eat even a P6000 for brekkie.
 

...yeah, I've been on their "Notify Me" list for a while for a 1070, and it's been nothing but crickets in my inbox.
