Report: PC Graphics Shipments Down in Fourth Quarter

[citation][nom]onephatkiller[/nom]More like the games are designed on a PC and then ported to consoles. PC gaming will always be the cutting edge. That's why consoles have off the shelf PC components in them.[/citation]

Consoles aren't going to play ports. That would make absolutely no sense. Consoles have more limited hardware and need their games developed and optimized for them to make up for it. Where have you been these last several years? Almost all big PC games are ported from consoles, and many of the ones that aren't are weak stuff like Diablo 3 and WoW or F2P/P2W games.

EDIT: Also, just which shelf stocks the PS4's hardware? I guarantee you can't buy it just anywhere. It may have a basis in PC hardware, but it is custom-designed. That's been done for pretty much every popular console that I can think of for the last two decades; they've got custom hardware. PowerPC CPUs are far more common than x86 CPUs for consoles.
 
[citation][nom]dimar[/nom]I'm using GTX 570 and don't see any point to upgrade. Will probably be getting one of nVidia 700 series, or AMD 8000 series.[/citation]
Graphically, that's a weak GPU. A 650 Ti would offer about the same performance, use half the electricity (110 W vs. 219 W) and is inexpensive. The eventual 28nm refresh/name change won't offer you much more. Maxwell and Volcanic Islands will offer a lot, but are far from release.
 
[citation][nom]wdmfiber[/nom]Graphically, that's a weak GPU. A 650 Ti would offer about the same performance, use half the electricity (110 W vs. 219 W) and is inexpensive. The eventual 28nm refresh/name change won't offer you much more. Maxwell and Volcanic Islands will offer a lot, but are far from release.[/citation]

A GTX 570 is not as slow as a GTX 650 Ti, especially in memory-bandwidth-heavy situations such as most of the very newest games. The GTX 570 is only a little weaker than the GTX 660.

If we went purely by GPU core performance, the GTX 650 Ti and the GTX 570 would be pretty similar, but the GTX 570's far faster memory interface shows its worth in gaming. In modern games, the GTX 570 is much closer to the GTX 660 than to the GTX 650 Ti.
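For anyone who wants to sanity-check that claim, here's a rough back-of-the-envelope sketch in Python. The bus widths and memory clocks are the commonly published reference specs (an assumption on my part; check your own card's listing), so treat the numbers as approximate:

[code]
# Rough memory-bandwidth comparison using commonly published reference specs.
# bandwidth (GB/s) = (bus width in bytes) * (effective data rate in GT/s)
def bandwidth_gbps(bus_width_bits, effective_rate_mtps):
    return (bus_width_bits / 8) * (effective_rate_mtps / 1000)

cards = {
    "GTX 570":    (320, 3800),  # 320-bit bus, ~3.8 GT/s effective GDDR5
    "GTX 650 Ti": (128, 5400),  # 128-bit bus, ~5.4 GT/s effective GDDR5
    "GTX 660":    (192, 6008),  # 192-bit bus, ~6.0 GT/s effective GDDR5
}

for name, (bus_bits, rate) in cards.items():
    print(f"{name}: ~{bandwidth_gbps(bus_bits, rate):.0f} GB/s")
# Prints roughly: GTX 570 ~152 GB/s, GTX 650 Ti ~86 GB/s, GTX 660 ~144 GB/s
[/code]

So on paper the GTX 570 has nearly twice the memory bandwidth of the GTX 650 Ti and sits much closer to the GTX 660, which is exactly where it lands in bandwidth-heavy games.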
 
"...during the fourth quarter of 2012, only 28.8 million discrete GPUs were shipped. .."
"...seeing a 13.6-percent drop in quarter-to-quarter discrete GPU sales, followed by Intel which saw only a 2.9-percent drop..."

Does Intel make discrete graphics cards???
 
[citation][nom]rdc85[/nom]"...during the fourth quarter of 2012, only 28.8 million discrete GPUs were shipped..." "...seeing a 13.6-percent drop in quarter-to-quarter discrete GPU sales, followed by Intel which saw only a 2.9-percent drop..." Does Intel make discrete graphics cards???[/citation]

Intel doesn't make discrete consumer graphics cards, and they haven't for a long time. They do have the Xeon Phi, which might arguably count as a graphics card.
 
Not really surprised that AMD did better than Nvidia this quarter.

I recently bought my first AMD GPU, a 7970 (over a GTX 670), because I found it offered better value for money, with a complimentary copy of Far Cry & BioShock Infinite included, compared to Nvidia's completely lackluster, insipid and uninspired "Gear Up" offering of pay-to-win games.
 
I can understand how this looks strange to a person with an IQ of less than 50 or who is totally unaware of the market; the rest understand why - everyone is waiting for the next generation of cards from both camps rather than spending that money on an almost generation-old card...
 
There are so many video card options out there that will run any game on ultra with maxed AA smoothly above the 30 fps mark on a 1080p screen (which the majority of people use).

Until gaming software catches up to the hardware, there is no real "need" to buy anything.
 
Well, the average Mr. Smith is just fine with an Intel or AMD APU for reading email and using Facebook... SO 95% of customers don't need a discrete GPU. The remaining 5% use discrete cards or move to mobile gaming systems (smartphones or tablets...). Also, there hasn't been a really new DX version for quite a long time, so even desktop gamers don't need new cards at this moment. There has to be some really good reason to upgrade. Maybe Crysis 3 or something like that will push demand up for some time, but if you already have a 5xxx- or 4xx-series card or newer, you're just fine at this moment.
 
[citation][nom]hannibal[/nom]Well, the average Mr. Smith is just fine with an Intel or AMD APU for reading email and using Facebook... SO 95% of customers don't need a discrete GPU. The remaining 5% use discrete cards or move to mobile gaming systems (smartphones or tablets...). Also, there hasn't been a really new DX version for quite a long time, so even desktop gamers don't need new cards at this moment. There has to be some really good reason to upgrade. Maybe Crysis 3 or something like that will push demand up for some time, but if you already have a 5xxx- or 4xx-series card or newer, you're just fine at this moment.[/citation]

DX11 came out in 2009 and didn't get very popular until 2011/2012. Then it was updated to DX11.1 with Windows 8 in 2012. Those are pretty recent.

Regardless, DX isn't what's important for gaming. You can make an extremely intensive DX9.0c game if you want, and you can make a very light DX11 game if you want to. There are many games where cards like the Radeon 4000 series would be a huge bottleneck for anyone wanting a decent 1080p gaming experience, and the Radeon 5000 series is still not great for 1080p. You're only just fine if you don't mind a poorer gaming experience.
 
The report states that AMD's APU shipments increased 0.8 percent from 3Q12 in the desktop sector, but dropped a dramatic 19.1 percent in notebooks.

Ouch.

Maybe this is the *re-boot* of an APU switch-over from Trinity to Richland on mobile?

I understand that AMD has not announced any Richland mobile ... but they may be holding it back to steal some Haswell thunder.

 
I think graphics cards are outpacing gaming technology. I can play pretty much any game on High and Ultra with my GTX 480, so why upgrade!!!
 
[citation][nom]ryu750[/nom]I think graphics cards are outpacing gaming technology. I can play pretty much any game on High and Ultra with my GTX 480, so why upgrade!!![/citation]
This is only going to get worse for Nvidia and ATI/AMD as game designers focus on console titles; the consoles due out in the next 1-2 years will launch with what is currently last-gen graphics hardware. We see more and more console ports on PC, and this trend will decrease the demand for high-end graphics hardware even further.
 


Pure FUD on my part, but it sure is interesting that 8 of those Jaguar x86-64 cores and GCN are going into the new PlayStations.

I think that bodes well for future cross-over console/PC games (and likely for AMD).


 
How much would 30,000+ used GPUs affect things if/when Bitcoin ASIC miners hit the mass market?

I paid $70 for my HD 5850 three months ago. Soon I'll be able to buy another for $60 or less. xD

pEACe





 
Utter BS... PC sales were only down 4.9% according to Intel's earnings report. For sales being so DIRE, Intel only made 2.5% less profit... LOL. NV is up 8%. NV also stole a LOT of market share from AMD (~12% in desktops/20% in notebooks).

A discrete GPU doesn't ship with EVERY PC. 50% of PC graphics are onboard from Intel, which is the largest PC GPU producer (of crap, but still). The 4.9% drop came from AMD; NV was not affected, as they stole enough share to make it up, or they'd be losing $1.18B like AMD instead of making $725 million (well, subtract the Intel payment from that figure). Record sales, margin and cash at NV this quarter. How does that happen if the situation is so DIRE? Peddie needs to put the crack pipe down.
 


And what are you using compute for at home? Folding@home or Bitcoin mining? It's a waste of GPU space for 95% of us. Hence NV cut it out of regular Kepler, giving us more GAME power for less watts/heat/noise. For those who want it, they put it back in Titan. 167,500 computers are signed up for Folding@home (to solve cancer, etc., or create some new med I won't get a dime from if it's solved), and FAR fewer for Bitcoin.

Even if I'm the one who cures cancer, I'll get nothing from Folding@home but a big electric bill they won't pay and an expensive prescription if I actually GET cancer... LOL. Consider how many GPUs were sold versus that number and you'll see compute doesn't matter yet at home. Rightly, NV cut out compute unless you pay for it. AMD including it is costing them (well, helping the $1.18B loss, as opposed to NV's profits). Free games and product pricing are killing them too.

Only Civ5 has proven it useful in games so far (and it barely taps that source, really just as a benchmark of the maps). Maybe next year this will be important, but it isn't today (at home). YET.

I only care about the games, as that's what I buy a HOME card for. I'm not sure why people quote compute as a reason to buy AMD at home. So few REALLY use it, and for the apps that do use it (Adobe, SolidWorks, Pro/E, CAD/CAM stuff, etc.), CUDA/OpenGL is well supported. Not sure why Tom's uses OpenCL filters for Adobe when OpenGL is there too. I believe you can go either way in all Adobe apps that support this tech (someone correct me if not). OpenGL GPU acceleration has been supported since CS4, AFAIK.
"MGE is new to Photoshop CS6 and uses both the OpenGL and OpenCL frameworks"
etc...
http://helpx.adobe.com/photoshop/kb/photoshop-cs6-gpu-faq.html

http://helpx.adobe.com/photoshop/kb/gpu-opengl-support-photoshop-cs4.html
A few Q's down, the CS6 FAQ even mentions a GPU sniffer, for crying out loud... Yet Tom's used OpenCL filters instead of OpenGL. You should use whichever is faster for a given card, correct? Would you purposely choose OpenCL if OpenGL ran faster on [insert name here] card? You'd use whichever your drivers are best at running. You should run both and report whichever is faster in whatever you're testing, noting which one it is in the chart, etc. Because that's what you'd do at home.
"To use OpenGL acceleration, your system must support OpenGL v2.0 and Shader Model 3.0 or later
To use OpenCL acceleration, your system must support OpenCL v1.1 or later"
http://www.nvidia.com/docs/IO/123576/nv-applications-catalog-lowres.pdf
CUDA-enabled apps catalog. Check it out; for people actually making money on their gamer/workstation, there are a lot of pro apps using it.
http://www.pointsinfocus.com/learning/digital-darkroom/enable-cuda-in-premier-pro-cs6-without-a-quadro/
Works for pretty much all recent cards (it's a simple text file change, described at the link above). A LOT of pro apps support this tech in some way, shape or form.
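For reference, the "simple text file change" that article describes is, as I understand it, adding your card's name to Premiere Pro CS6's CUDA whitelist file. A hedged sketch in Python (the install path and the card string below are examples only; use your own install directory and the name your card actually reports, and run it with admin rights since the file sits under Program Files):

[code]
# Append a GPU name to Premiere Pro CS6's CUDA whitelist if it isn't already there.
# Path and card name are illustrative examples; adjust for your own system.
from pathlib import Path

whitelist = Path(r"C:\Program Files\Adobe\Adobe Premiere Pro CS6\cuda_supported_cards.txt")
my_card = "GeForce GTX 570"  # example entry; use the name your card actually reports

lines = whitelist.read_text().splitlines()
if my_card not in lines:
    whitelist.write_text("\n".join(lines + [my_card]) + "\n")
    print("Added", my_card)
else:
    print(my_card, "is already whitelisted")
[/code]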

AMD doesn't have PhysX either, which in a title like Borderlands 2 makes a huge difference in realism while keeping performance up. NV has been laying the CUDA groundwork for 6 years (it made its debut with v1.0 in Feb 2007 and is now at 5.x). It's finally paying off in the pro world (check that PDF, that's a LOT of apps). There are 56 CUDA centers across 20 countries now. They are getting this stuff taught in colleges too (CUDA classes).
http://www.nvidia.com/object/gpu-accelerated-applications.html
There's the list that includes regular-type apps (CAD stuff, 3ds Max, SolidWorks, CATIA, CS6, etc.). Unfortunately, AMD is way behind in the ecosystem game. This is not surprising when you see they haven't made a dime in the last 10 years overall (rather, lost $3-4B over that time). You can't inspire people to program for you for free; it takes money for teachers, classes, curriculum, centers, etc...

http://investing.money.msn.com/investments/financial-statements?symbol=US%3aAMD
Look at the losses. That sucks. I fear very expensive GPUs in our near future if AMD doesn't turn this around; NV would be our only option. That's the 10-year summary. It isn't pretty. It's worse than I thought. Quick math tells me it's a ~$5B loss, not $3-4B. I stand corrected 🙁
 



What balance sheets are you looking at? AMD lost ~$500 million, NV made ~$210 million this quarter. Last time I checked, a business is a success if you MAKE money, and an abysmal failure if you haven't made money for 10 years overall (in that time AMD lost ~$5B).

For the year, AMD lost $1.18B; NV made $725 million.
Rather than retype the whole thing:
http://www.anandtech.com/show/6746/tegra-4-shipment-date-still-q2-2013
Read my comment. There is only ONE last I checked. Simply scroll down :)

Wisely, Anand hasn't responded to me in 5 days :)

They won't respond to my Titan article posts on their site either (well, color me shocked if they do; Ryan didn't fare too well last time he did that, and Jarred didn't either). Those lovely games you mention are part of why AMD is going broke. I put off my 660 Ti purchase at Black Friday hoping they'd release something I could buy in March/April, but now everything is most likely delayed to November (watch for an NV announcement of a delay in response to AMD shortly). Now I'm just disappointed in both of them... LOL. But I sincerely hope that when they do release their new stuff, AMD isn't offering me a half-dozen free games, killing the very reason I'd wait to buy their card! I want them to MAKE MONEY off of it. The pay-to-win games don't cost nearly as much as AAA titles. The difference is making money or losing it. AMD has to start acting more like NV or go broke.

If someone really wants my [very relevant] wall of text, I can post it here so nobody has to delve through the 440 posts on that article at AnandTech (actually the main two are on page 5 if viewing all). It highlights performance in 14 games that they left out of their "new testing suite".

You know, crap games like StarCraft II, Diablo 3, Skyrim, Borderlands 2, Guild Wars 2, Crysis 2, Crysis 3, Assassin's Creed III, Batman: Arkham City, F1 2012, DiRT 3, Max Payne 3, WoW: Mists of Pandaria, etc... You know, just some junk games nobody plays. But he included Crysis Warhead from 2008 and DiRT Showdown, which nobody plays... ROFL. I posted a link for him to the Warhead servers, which show no players :) Combined, they barely sold 100k units (Showdown doesn't even show up on VGChartz because it didn't sell 100K!). But AnandTech thinks you need to know how these two awesome games with no sales run, for the next few years, in their new benchmark suite... ROFLMAO.

Their dumb site flagged all my data links as spam... ROFL, so I was forced to just give page references with only one link to each site (TechReport, TechPowerUp, PCPer, etc.; I think I even mentioned Tom's in there - hardware site links are spam?). I can see Blazorthon laughing. I wonder what would happen if I tried to post it here without breaking it up and changing most of the links? At first I thought it was the 2,650 words... LOL. A good 1/5 of the links in the post were to their own site's benchmarks... ROFL. Quit laughing, blaz... :) Nah, go ahead; I about died laughing when it said something like "this post is apparently spam, we do not accept that here" :) The link limit over there must be around 9-10 per post.
 