Imagination Eyes Low-Cost Android Consoles With High-End PowerVR GT7900 GPU


tastyKake

Reputable
Feb 24, 2015
5
0
4,510
This is cool, but I feel like these companies miss the point of "performance". It's impressive that the design can handle a million tris on a mobile chip (I remember making sub-1k-poly models for games just ten years ago), but the demos only prove they can draw those scenes, and at equally unimpressive frame rates. It's up for debate, but I think the main focus should be a smooth 60fps benchmark instead of how many polys it can push or how many lights are in a scene. They understand that for video with 4K@60fps, so please extend that to real 3D performance.
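
To put rough numbers on the 60fps point (my arithmetic, not anything from the article), the frame-budget math looks like this:

```cpp
#include <cstdio>

int main() {
    const double target_fps = 60.0;
    const double frame_budget_ms = 1000.0 / target_fps;      // ~16.7 ms to do everything in a frame
    const double quoted_tris     = 1000000.0;                 // the headline 1M-triangle scene
    const double tris_per_second = quoted_tris * target_fps;  // 60M tris/s sustained, just for geometry
    std::printf("%.1f ms/frame, %.0f tris/s needed\n", frame_budget_ms, tris_per_second);
    return 0;
}
```

In other words, drawing the scene once is easy; drawing it 60 times a second, with lighting and post-processing inside the same 16.7 ms, is the hard part.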
 

TallestJon96

Distinguished
Dec 27, 2014
256
0
18,810
Mobile parts are making excellent progress, but I have to wonder how much of that power will actually be used. Tablets can use more than phones, and an Android console is interesting, but I don't think that will get much momentum.

With this rapid progress we could have an interesting situation where, by the end of the PS4 generation, mobile systems, especially tablets, will have more power and higher resolution for games than the consoles do.
 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290
"The PowerVR GT7900 aims to bring "PC-class" gaming with powerful hardware and support for the OpenGL ES 3.1 graphics API"

That made me laugh a little.
 

alextheblue

Distinguished
"The PowerVR GT7900 aims to bring "PC-class" gaming with powerful hardware and support for the OpenGL ES 3.1 graphics API"

That made me laugh a little.

Yeah. However, keep in mind that the article doesn't mention that these GPUs are available with optional support for DX11.2 and OpenGL 4.3. Those aren't in the base configurations because Android uses OpenGL ES plus the AEP, which is supported by default, but the capability is there if you want it for your device.
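
For what it's worth, on a real device you can see which of those paths the driver actually exposes at runtime. A minimal sketch, assuming an EGL/GLES context is already current and using only standard GLES 3.x calls:

```cpp
#include <GLES3/gl31.h>
#include <cstring>
#include <cstdio>

// Prints the driver's GL version and whether the Android Extension Pack is exposed.
void report_gl_capabilities() {
    const char* version = reinterpret_cast<const char*>(glGetString(GL_VERSION));
    std::printf("GL_VERSION: %s\n", version ? version : "(no current context)");

    // On ES 3.x, extensions are enumerated one at a time via glGetStringi.
    GLint count = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &count);
    bool has_aep = false;
    for (GLint i = 0; i < count; ++i) {
        const char* ext = reinterpret_cast<const char*>(glGetStringi(GL_EXTENSIONS, i));
        if (ext && std::strcmp(ext, "GL_ANDROID_extension_pack_es31a") == 0) has_aep = true;
    }
    std::printf("AEP: %s\n", has_aep ? "yes" : "no");
}
```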

Anyway, yeah, I would kind of like to see them make a return to discrete PC graphics... but it's difficult to justify the attempt. You'd have a hard time competing with the value of integrated graphics on the low end, and an uphill battle against the price and performance of existing discrete solutions. So while I think it would be interesting, from a financial point of view it doesn't make sense for them to spread their limited resources out further at this time.
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310
This will be facing the 14nm replacement for Nvidia's X1 later this year in time for Christmas, so it's already going to be behind. I'd expect NV to double the X1's 512 FP32 / 1024 FP16 numbers when they go to Samsung in Q2/Q3. Samsung says mass production for Qualcomm/Nvidia starts in Q2, so those parts will be in Christmas devices just like this chip. Everyone is now up against NV's gaming drivers, discrete heritage, and developers' long experience with that tech on PCs. No comment on phones, that part remains to be seen, but I see the larger form factors as Nvidia's to lose from here on out. Gaming is taking over mobile, and I don't see how a company like IMG.L (which was forced into mobile ages ago by AMD/NV) can dominate gaming in anything (same for Qualcomm etc.).

I hope AMD gets into this soon, or at least by the time games are being aimed squarely at X1-level GPUs and above on mobile. They need to get a great 14/16nm ARM SoC out and reap the benefits alongside NV as mobile gaming finally moves up to console/PC-like stuff. It's far easier to port from the same architecture on the PC where you made your game to a mobile chip that has every feature of its big brother, just scaled down. Devs have been using Maxwell for ages (in GPU terms anyway... LOL), so there's nothing new to learn to port a PC game to X1 (or to something AMD would put out based on their current tech). I definitely don't see myself buying an Xbox One/PS4 knowing 14nm chips hit at Christmas. That's already enough power to blow away an Xbox 360/PS3 game, and it would be good enough for most people to skip consoles this time. We should start getting some really good ports, or new content that looks and plays great on mobile, soon (say, Christmas and after).

I have a PC for real gaming ;) But I'd rather buy mobile at 14nm as another side to play on, one that always has very cheap games compared to consoles' $60 titles. I hope devs port PC/console games like mad and that the cash is used to pave the way for NEW IP on mobile. At 14nm even the junkers will be K1-level GPU-wise, and that should be a nice audience for devs to shoot at, with a fairly powerful common GPU baseline. Most games out on mobile now weren't aimed at anything above Tegra 4 levels and many are already pretty great. It's only going to get better now that they'll all be on OpenGL ES 3.2 etc. These things are certainly gaining abilities fast (H.265 etc.).
 

bit_user

Polypheme
Ambassador
I dunno about that. The fastest tablets on the market in 2013 weren't faster than a PS3.

It's just hard to overcome power constraints. The fact that they're not even targeting this GPU at tablets, and yet it has less than half the raw compute performance of a PS4 (and certainly less than half the memory bandwidth) makes me think that even at 10 nm, they won't be reaching PS4 performance levels in a tablet power envelope.

What you're talking about is matching that performance with just a couple percent of the power consumption. That's a tall order, much less inside of a decade. But who knows, maybe better battery technology will enlarge tablets' power envelopes enough to make it happen.
 

bit_user

Polypheme
Ambassador

I don't know which point you're laughing about. Based on the specs, it sounds comparable to an entry-level discrete GPU card. So, technically, you could call it PC-level gaming. But OpenGL ES probably means it'll take a fair porting effort for any PC or console game.
 

bit_user

Polypheme
Ambassador
Really? What other chips have doubled between 20 nm and 14 nm? When have GPUs ever doubled in a single generation (comparing cards with equivalent TDP)?

I doubt this. While X1 might support DX12 and OpenGL 4.5, Android doesn't. That means a pretty substantial port. Also, the parameters of any mobile chip are going to differ from its PC and console cousins, which will require significant tuning (as will the smaller screen size). And NV doesn't have a big enough chunk of the mobile market for developers to optimize specifically for it; most will be aiming at a far lower common denominator, meaning much of the X1's power will probably go to waste.
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


TL;DR for some, but I'm sure stockholders in some of these companies would want to know this stuff. :)

You're talking desktops, I'm talking SoCs. On desktops the things that used to double every generation now only inch forward: a few more MHz on the same memory, pretty much the same number of channels, etc. Early on in discrete you got doubling of bandwidth via faster new RAM or more channels, along with new GPUs (just like we've had through mobile's early revisions up to now). Today they can't simply add channels cheaply on discrete without blowing up cost. The low-hanging fruit isn't there on desktop anymore, but it still is on mobile.

http://www.anandtech.com/show/8811/nvidia-tegra-x1-preview/2
The 28nm K1 is doubled by the 20nm X1. This has been going on for pretty much every revision of many SoCs when shrunk (especially with the help of better surrounding parts): in this case 3x FP16, close to that for FP32, 2x perf/watt, etc. The next page there shows Manhattan offscreen 1080p at 63fps vs. the K1's 31fps. A double, right? Not sure why you wouldn't think you could double the GPU again at 14nm with FinFET on top. I expect a 14nm X1 replacement to come with double the cores and probably faster memory, larger caches, and higher GPU clocks (same for Qualcomm, Apple, etc.). Put all that in a device and you'll most likely get 2x the fps in many games.

http://www.extremetech.com/extreme/186336-tegra-k1-gpu-posts-some-beastly-benchmarks-but-is-it-good-enough-that-devs-will-actually-use-it
K1 vs T4. Double in 3dmark.
Unreal Engine 4 already runs on K1, so any game running Unreal 4 can easily be ported to K1/X1 etc. The same can probably be said for Qualcomm, which demonstrated Unreal 4 on a reference tablet a while back last year.

http://www.anandtech.com/show/8296/the-nvidia-shield-tablet-review/5
3DMark 1.2 Unlimited again: 31K for K1 vs. 16K for T4 in the Shield handheld, and that's the overall score; the GPU test is 36K for K1 vs. 17K for T4 (over double). Same page: Basemark Dunes is more than double, the Hangar benchmark is just over a triple offscreen, and T-Rex HD is 68fps for K1 vs. 13.7fps for T4... LOL. If you look at T4 vs. T3 you see much the same. You could likely do the same exercise on Qualcomm chips, and Apple (Imagination) has been doing the same from rev to rev.

http://www.anandtech.com/show/7190/nvidia-shield-review-tegra-4-crossroads-pc-mobile-gaming/5
Shield vs. Nexus 7 (the T3 version; the 2013 Nexus 7 is in there too, but look at the T3 one): 19fps vs. 3.5fps in 3DMark Ice Storm, and the graphics score is 7x higher for T4 vs. T3. Even the physics side was over double, and Ice Storm Extreme was a good 6.5x faster. Basemark X offscreen shows the same, at about 4x for T4 vs. T3. Way over 2x faster. GFXBench shows the same a page later, almost a triple again.

Why would you not expect a die shrink plus FinFET on X1 to add another SMX? I'm guessing 512 cores vs. the X1's 256 for the Christmas chip. GPU performance has been exploding every revision from every vendor thanks to the low-hanging fruit, much like we had years ago on desktop. Shrinks still bring huge returns on mobile because of how much difference it makes to shrink a chip (and add more stuff) when you're dealing with sub-10W tablets. Check each revision of the iPads and you'll see the same from Apple.

http://www.anandtech.com/show/6472/ipad-4-late-2012-review/4
iPad 4 (32nm A6X) vs. iPad 3 (45nm A5X): once you get to REAL benchmarks (not synthetic stuff) like Egypt HD, it's a double, from 21fps to 40fps for the iPad 4. Same story, both running the same resolution.
"Despite sub-2x gains in a lot of the synthetic tests, Egypt HD shows us what's possible in a simulated game: the new iPad is roughly twice the speed of the previous gen model when running at the panel's native resolution."

The same benchmark at 1080p offscreen shows 25fps for the iPad 3 vs. 47fps for the iPad 4 (almost a double here too). Do you read the reviews or what? I could run through Qualcomm's revisions but you should already get the point. Note that Apple did it with a decrease in die size too: A5X at 165mm^2 vs. A6X at 123mm^2. And in the same AnandTech review they improved battery life while doing it. Again, that's the point: they shrunk the SoC, got doubles in all the gaming numbers, and improved battery life. Both the iPad 3 and 4 have 42.5Wh batteries.

I can't tell you how many chips have doubled from 20nm to 14nm because we have no examples yet... LOL. But to assume it won't happen, after I've shown you repeatedly that it happens every generation, is kind of silly. And we're not just talking GPU here; the GPU is surrounded by many things that also get upgraded per revision (faster memory, higher clocks, etc.) and all of that helps each new wave of SoCs and devices. The only way I see this not happening for the Christmas chip (after X1) is if it's a straight shrink of the same tech, whereas the revisions before it were all quite different (more features etc., as Anand points out). But I still say they'll just put in 2x the SMXs. I do think it will slow down some (much like desktops) at 10nm, and we'll see whether they can do 4x SMX (and the equivalent for everyone else) or not. If the 10nm versions are Pascal or something, that might not be relevant and they'll get a double some other way (not directly tied to core counts). Either way, my point is so far borne out by many SoC makers. Until I see a revision NOT double, you're probably wrong ;) It has ALWAYS happened so far (and far more than double in many GPU metrics, as shown by NV's T3-T4-K1-X1 progression).

http://www.tomshardware.com/reviews/tegra-k1-kepler-project-denver,3718-3.html
Porting isn't hard. NV has noted that all of the ports they've been involved with so far took less than a few weeks with SMALL teams, and they commented that most of the team's time went to mapping gamepad functions, not porting code (for Portal and Half-Life 2). You already have OpenGL ES 3.1 + AEP; I'm not sure why you wouldn't think higher levels will come soon. Once all the chips can do OpenGL 4.x, don't you think there will be an AEP rev 2 or something (or a later Android version after Lollipop) that supports those features? You do know that ES 3.1/3.2 (the latter surely coming soon for mobile) is a subset of its big brother, 4.x, right?
https://developer.android.com/about/versions/lollipop.html
Gameloft etc are already amping stuff up. Games don't look ugly on android these days (or ios).

https://developer.nvidia.com/tegra-android-development-pack
Why do you think NV is releasing tools/debuggers for OpenGL 4.x apps on ARM socs?
"Tegra Graphics Debugger 1.3 with added support for OpenGL ES 3.1 Android Extension Pack, OpenGL 4.3 and 4.4 applications."

Sorry, I just don't see your points. Google wants gaming on Android and wants DX/consoles dead (hence the AEP, and likely full OpenGL coming soon), and all the SoCs do what I described above. Just as Wintel wanted Windows/Intel everywhere, Google wants Android/ARM everywhere. As chips move to full OpenGL (NV won't be alone for long), so will games on Android. Then again, if ES 3.2/4.0 etc. can do everything OpenGL 4.x can, there's no need; it's still an easy port, just done a different way. The whole idea here is to get desktop stuff onto mobile, then assault desktops. Period. Games first, then apps, then full-blown ARM PCs with discrete GPUs etc. Valve (which hates the same stuff: DX, Windows, the MS Store) and others will help the mission. The X1's power (and that of other upcoming SoCs) won't go to waste as Unreal 4, Unity 5, etc. games amp up on mobile.

Unity5 info, directly aimed at many platforms.
http://unity3d.com/company/public-relations/news/unity-announces-unity5
The whole point of both engines (and others coming I suspect) is easy cross-platform stuff.
"Unity 5 brings a wealth of new capabilities and features enabling all developers, from the smallest indie teams to the largest studios, to create amazing games and interactive experiences, and painlessly deploy them to almost any platform."

That's in the first paragraph, so clearly this is the point, and Unreal 4 tells the same story: easy deployment everywhere is the goal of both engines. Porting older games (such as Half-Life 2, Portal, etc.) is easy too (weeks, as noted!) since they can do this stuff in ES 3.1 and more with AEP. No doubt more can be done, faster, as they move to ES 3.2 or 4.0 (glNext? Maybe there's no 4.0 and everyone just goes glNext).

http://steamcommunity.com/app/41070/discussions/0/630800446936367743/#c630800447016107026
Serious Sam 3 running on K1 talk from devs:
"Yes. We were even more amazed than you are. That's the first version of SS3 for Android. And it would not have been possible without this Tegra miracle. :) This chip looks like something that will knock everyone's socks off. We couldn't believe how good it is. It supports full OpenGL4, and we practically directly ported the entire PC game to it, along with PC content. It was not cut down like e.g. for Xbox 360.

As an additional point, when we compiled the code without errors and ironed the glitches in initialization code, as soon as it started the graphics correctly for the first time, it went right through the intro sequence, and into the game. Without any noticeable visual errors. I think I can safely say this is the first time ever. Normally, we'd have a few weeks of unexplainable black screens, inverted colors, garbage in textures, broken shaders, etc, until it looks ok visually. With K1, this just worked."

EASY. That's a developer bragging about it, and this is K1... NOT X1... LOL. It's a dev talking, not me, and that was OVER a year ago. X1 will certainly make things even easier, and I think we'll see much more as everyone gets to this level later this year or next. I'm guessing they're holding it for the next chip and a Shield handheld update, probably in a few months (a GDC announcement maybe? who knows); you'd need a big release or two to show it off. Note what the dev (AlenL, working on SS3) says about OpenGL in response to someone asking whether it runs full GL or ES:
"KRON, the engine works on both full OpenGL and the ES. Just that the latter is limited in quite a few aspects. This is the main advantage of K1 - support for full OpenGL."

It doesn't sound like the game is limited by Android; he makes it sound like it's running full OpenGL. Again, X1 just ups the game even more. The release could also be delayed by waiting on Android itself: he notes in a reply to another user that Linux had an issue until all four cores could run at 100%, which was patched, and he expects the same fix for Android soon. He also notes SS3 on K1 was running at the medium graphics level; I suspect X1 can do high at least, and who knows whether 14nm can hit ultra (depending on the device's resolution, I'd guess).

From AlenL's post 56 (he also notes a T4 backport may not happen, too little power):
"(*) Note though that nvidia's drivers are top of the line. I expect a lot of driver problems with other vendors and that may be a very limiting factor."

We'll see how long it takes for NV to take over mobile gaming devices ;) Devs go where it's easiest to port first. We're just at the beginning, and as I said before, NV has major advantages now in features, in devs, and, as that dev says, in DRIVERS that work. Note the post from someone asking whether OpenGL 4 is on Android (posts 70/71) and the dev's answer:
"The version we have, yes. But who knows what each vendor puts in. That's why we also support GLES3 and also GLES2 if some additional extensions are present."

http://www.androidpolice.com/2014/07/22/upcoming-shield-games-include-trine-2-war-thunder-the-talos-principle-and-more/
Talos principle uses Serious Engine 4 (SSam3 uses 3.5 IIRC).

http://www.nvidia.com/content/PDF/tegra_white_papers/Tegra_K1_whitepaper_v1.0.pdf
Check out pages 19/20. Tim Sweeney's (Epic's Unreal guy) comment is in there too, saying you can easily run PC stuff on mobile from here forward.

http://gamersrespawn.co.uk/topic/20395-can-trine-2-on-android-match-xbox-360-and-ps3/
Trine 2 K1 dev talk, comparing the PS4, PS3, Xbox 360, etc. versions. Note the levels they mention leaving out of the Xbox 360/PS3 versions but including on K1, because optimizing them for those consoles was too tough (so they gave up; not so on K1, which he notes runs just like the PS4/PC versions). And one more comment there about the porting:
"The studio had a headstart of sorts owing to the proliferation of platforms Trine 2 already supports. The Kepler graphics architecture in Tegra K1 is derived from a full desktop part, and supports every major graphics API. Android ports typically need to be down-converted to the more restricted OpenGL ES - Trine 2 is a pure conversion from desktop code - this is the "secret sauce" in the latest generation of mobile graphics processors, the path that makes mobile a viable target for a huge library of existing titles."

So it sounds like if you have the engine, driver, and chip support, you run FULL OpenGL, right? In the next paragraph the dev says they used the Mac/Linux OpenGL renderer as the base. It also doesn't sound like "SIGNIFICANT TUNING" was required. I mean, they gave up on the monster levels for the Xbox 360/PS3 versions, but NOT for K1; it's all in there. Further:

"Indeed, Trine 2 on Mac OS X and Linux used OpenGL for rendering and that renderer served as the basis for the K1 port."

I'm not quite sure I'm reading his statement correctly, but clearly the future won't be scaled down to ES. They say plainly that this is the point of K1 onward (both the Trine and Serious Sam devs).

http://www.anandtech.com/show/8043/some-thoughts-on-halflife-2-and-portal-on-shield
"Nvidia has stated that Half Life 2 on Android is a port of the Linux version with OpenGL support to OpenGL ES, and based upon a casual playthrough of both Half Life 2 and Portal, it’s not immediately obvious that there are any issues with the engine port itself. In fact, it runs quite well."

http://blogs.nvidia.com/blog/2014/06/25/google-io-tegra-aep-gaming/
“Through our close collaboration with NVIDIA, Epic’s Unreal Engine 4 ‘Rivalry’ project demonstrated at Google I/O shows what’s possible when PC-class gaming technologies and performance are brought to mobile devices,” said Tim Sweeney, founder of Epic. "In under three weeks, Epic created the “Rivalry” demo. How could something so visually complex be created in such a short period of time? As it turns out, Epic had previously created the scene to show off UE4 engine features. However it was made for a DX11-class PC. Working together, the Epic team and NVIDIA dev tech engineers were able to port the demo, along with new, original content, to Android and AEP. The “Rivalry” demo Google showed was actually running the same high-end desktop rendering pipeline."

The mobile gaming future is bright any way you want to slice it. I used NV as the example here, but make no mistake, others will be here too; it's just a matter of when, and then gaming on mobile explodes for real. You won't be able to claim the power is wasted when everyone can use it (NV device or not).

https://developer.nvidia.com/content/glnext-gdc-2015
glNext is coming at GDC this week (March 5th). With Neil Trevett being president of Khronos (and creator of the OpenGL ES working group there, so definitely a mobile guy) as well as head of mobile ecosystems at NV, I'm guessing this will be the version that truly merges mobile and PC. So the next version of Android (or the one after it?) will probably run it. Everyone will join in on this, no doubt (Qualcomm, Apple, etc. will all use it); you don't want to be left out of the best gaming content.

http://www.pcworld.com/article/2880035/attention-linux-gamers-valve-khronos-to-reveal-next-gen-opengl-successor-at-gdc.html
Watch out DX12. Valve will push this for linux and probably a steamOS for android at some point (or NV will, no brainer for both sides). One more way for Gabe to stick a knife into Bill Gates/DX12/Windows/MS Store ;)

I think we're done here ;)
 

bit_user

Polypheme
Ambassador
Wow, do you have a blog? For such substantial posts, I think most people would want a bigger potential audience.
 

bit_user

Polypheme
Ambassador
BTW, a lot of your citations just support the idea that full-blown OpenGL is needed for these ports to be so easy and successful. And I'm still thinking that NV is a bit early, in that most android devices won't have it.

And yes, porting to just about anything, these days, is going to be easier than XBox 360 & PS3 (especially). Those were rather weird beasts.
 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290


The OpenGL ES 3.1 point, written in the context of "PC-class" gaming features and performance (as in, the only thing I quoted)... If you're at all familiar with OpenGL ES, it isn't remotely comparable to DirectX or OpenGL in feature set, so I found the statement amusing. Not sure if that helps; I basically just reiterated my original comment.
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


I suppose I should. Many have said that on seeking alpha etc. But I'm not in it for that (yet?). For me it's about stocks for myself, but if someone else benefits great :) The posts also serve to find data OTHERS have read that I just can't get to due to time. So you put it out there to see if there's anything else you haven't read yet.

For now call me paranoid (who can blame me today with all the personal attacks etc). I'd rather be anonymous at least for a while longer...ROFL. One wrong word in the open today, and 1000 jerks come out to ruin your life or business. I have ~20yrs left to work (depending on stocks or how much I like my job at some point maybe less...LOL), so a few dollars from a dumb blog isn't worth having my life possibly devastated. :(
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


If the others are that far behind, NV will take over all the high-end stuff: tablets at least, Android consoles, probably TVs soon, etc., though I have no idea on phones until NV makes a deal with Samsung to drop the lawsuit in exchange for modem use, free fabbing for a while, or something. They'll all be racing to full OpenGL now, or maybe it all gets replaced by glNext. Once you port a game it can be sold for years as people gain the hardware to run it, and since porting is easy it's worth doing even if you have to wait on future sales. You might be right about Trine 2, but Half-Life 2 and Portal were ported to T4, which doesn't have full OpenGL (the Shield handheld can play both); heck, T4 doesn't even fully support OpenGL ES 3.0, and those were the FAST ports. Trine 2 was more work (time, really), but not hard, as the devs said. That's why I think they should port the best older games first, then do full OpenGL ports once cash from the quick ports of some really great games starts coming in. Most games don't hit 20 million units even when they're major successes, and a lot of today's gamers have never seen the great hits from 5-10+ years ago.

Between OpenGL ES 3.1 and AEP (which all major SoC vendors should support this year) you have a feature set very close to full OpenGL 4.0 (at least). I'm guessing PowerVR/Qualcomm/ARM's next GPUs will all support glNext or full OpenGL, and I'd also guess glNext is aimed at bringing the two together, but we'll know for sure in a few days. I don't think feature sets will be a problem much longer, and tons of Unreal Engine 3 games (among other engines) are already being ported. It's not like you have to port BF4 today or something ;) Take the low-hanging fruit that were huge hits and that run on pretty much everything supporting ES 3.1, or better yet AEP on top, which everyone should support by Christmas. Geometry and tessellation shaders are both available via ES 3.1 + AEP (among the other features that combo exposes), so it's already pretty much 4.0. I'd be shocked if everyone isn't on full 4.x soon, or simply supplanted by glNext (which is aimed at DX12, so surely better than 4.x). If you're not doing one or the other by some point next year, you're going to be way behind. Adreno 430 already supports DX11.2/ES 3.1, so it has to be pretty close to 4.x too. Imagination, Qualcomm, and ARM should all be there with their next revisions.
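
On the geometry/tessellation point, that's literally how AEP exposes it: the shader asks for the extension and everything else goes through the normal ES 3.1 entry points. A minimal pass-through geometry shader sketch (assumes an ES 3.1 context on a driver that advertises AEP; error checking omitted):

```cpp
#include <GLES3/gl31.h>

// The GL_GEOMETRY_SHADER_EXT token comes from EXT_geometry_shader, one of the AEP extensions.
#ifndef GL_GEOMETRY_SHADER_EXT
#define GL_GEOMETRY_SHADER_EXT 0x8DD9
#endif

GLuint make_passthrough_geometry_shader() {
    const char* src =
        "#version 310 es\n"
        "#extension GL_EXT_geometry_shader : require\n"
        "layout(triangles) in;\n"
        "layout(triangle_strip, max_vertices = 3) out;\n"
        "void main() {\n"
        "  for (int i = 0; i < 3; ++i) { gl_Position = gl_in[i].gl_Position; EmitVertex(); }\n"
        "  EndPrimitive();\n"
        "}\n";
    GLuint shader = glCreateShader(GL_GEOMETRY_SHADER_EXT);
    glShaderSource(shader, 1, &src, nullptr);
    glCompileShader(shader);   // check GL_COMPILE_STATUS in real code
    return shader;
}
```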

One more point: IF NV wins the suit, watch for others to license their GPU IP, which of course would speed the whole thing up for everyone, I'd guess ;) We all know writing GPU drivers is TOUGH; as the devs noted, only NV gets it right so far. Even AMD/NV have had issues over the years on desktop, and they've been doing it for 20 years. Just like the devs said, I think everyone else will have driver issues as games amp up, which of course would lead to more vendors jumping on NV GPU IP and drivers. I hope AMD joins in before NV gets a major foothold, or they'll miss yet another party; GPUs taking over mobile could make them a player again, or at least bring some decent profits. NV is now at 70% of discrete and Intel has been over 80% in CPUs for a while; there's plenty of room for NV/AMD to take over mobile GPUs, and both know DRIVERS better than the entire SoC army. They'd just be replicating their desktop success yearly, while the rest probably have to figure this stuff out for years. All the good driver people work at AMD/NV; Intel proves you can't get it right even after decades, and IMG.L was forced off the desktop ages ago too. They're too small to keep up as gaming amps up, IMHO, and if Apple doesn't buy them I think they're eventually dead on the GPU front (already driven from desktops once, and now mobile is becoming desktop, GPU-wise, anyway). It's certainly going to get interesting very soon :)
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


ES 3.1 is a subset of OpenGL 4.3. OpenGL ES 3.1 adds compute shaders, stencil textures, accelerated visual effects, high quality ETC2/EAC texture compression, advanced texture rendering, standardized texture size and render-buffer formats, and more.
https://www.opengl.org/registry/doc/glspec45.compatibility.pdf
See page 30 (or page 5, depending on how you count). Add the AEP extensions in Android on top (which the chip discussed here can do, as can X1 and others) and you get even more, such as geometry/tessellation shaders, ASTC texture compression, per-sample interpolation and shading, and other advanced rendering capabilities. It's certainly not as far off as 3.1 alone.
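
The compute-shader addition is the big one in that list. A tiny sketch of what it looks like from the application side (assuming an ES 3.1 context and an already-created SSBO; the doubling kernel is just an illustration):

```cpp
#include <GLES3/gl31.h>

// ES 3.1 compute shader that doubles every float in a shader storage buffer.
static const char* kComputeSrc =
    "#version 310 es\n"
    "layout(local_size_x = 64) in;\n"
    "layout(std430, binding = 0) buffer Data { float v[]; };\n"
    "void main() { v[gl_GlobalInvocationID.x] *= 2.0; }\n";

void run_doubling_kernel(GLuint ssbo, GLuint num_floats) {
    GLuint cs = glCreateShader(GL_COMPUTE_SHADER);
    glShaderSource(cs, 1, &kComputeSrc, nullptr);
    glCompileShader(cs);                                   // check status in real code
    GLuint prog = glCreateProgram();
    glAttachShader(prog, cs);
    glLinkProgram(prog);
    glUseProgram(prog);
    glBindBufferBase(GL_SHADER_STORAGE_BUFFER, 0, ssbo);
    glDispatchCompute(num_floats / 64, 1, 1);              // one work-group per 64 elements
    glMemoryBarrier(GL_SHADER_STORAGE_BARRIER_BIT);        // make the writes visible to later reads
}
```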

https://developer.qualcomm.com/blog/new-tool-helps-game-developers-ease-transition-console-pc-mobile
Qualcomm's take from a week or so ago, discussing AEP, DirectX, etc.

Unreal Engine 4.7 (released a few days ago) now supports ES 3.1 + AEP on Android. I'm fairly certain a lot of gamers will be happy with the quality we'll see coming from this combo soon; they already have some pretty good-looking stuff now. For instance, Trine 2 only came out in December 2011, and the Android port is better than the Xbox 360 version in some respects (they left levels out of the console versions because they were too complex to attempt, which wasn't the case for the K1 port), though the FXAA post-process anti-aliasing seen on console is missing from the Android build. That might just be because K1 was too slow. X1 should change a few things, and the chip this article is about should be well above K1 too. I'm not saying it's as good, but saying "not even remotely comparable" seems a bit of a reach.

http://www.eurogamer.net/articles/digitalfoundry-2014-can-trine-2-on-android-match-xbox-360-and-ps3
From just over a year ago, where the dev talks about porting Trine 2, performance, etc. on K1. This was before AEP, too, so games will only get better going forward. I'm thinking an X1+AEP port of something will obviously be better than a Trine 2 port to K1.

What huge features are missing that make you say "not remotely comparable"? It's not feature-for-feature, but we're not talking DX7 levels here or something.

http://blog.imgtec.com/powervr/powervr-series7xt-gpus-push-graphics-and-compute-performance
I just realized they support OpenGL 4.4 as well, so NV isn't so alone after all (just early, which figures for a PC GPU company). I wonder why that wasn't in the Tom's article discussed here. So I'm pretty sure Adreno 450 or 530 (S815 or S820) will come with full OpenGL 4.4 too, which means Google will add it to Android shortly anyway. So I guess this whole argument is kind of moot... LOL. ARM themselves can't be far behind with Mali, and then you have everyone on it, so leaving it out of Android would be stupid (that, or glNext support instead of OpenGL 4.4/4.5 itself). So no worries: 3.1 and even the first rev of AEP won't be with us long before being totally updated. My guess is we'll have it all by Christmas for the November Nexus device, or at worst Q1 (S820 stuff is out by then too).
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310
http://www.pcper.com/reviews/General-Tech/GDC-15-What-Vulkan-glNext-SPIR-V-and-OpenCL-21

There you go, Vulkan is Vulkan. No more ES. It runs everywhere. So as I assumed before, they've merged now.
"Outside of Windows, the Khronos Group is the dominant API curator. Expect Vulkan on Linux, Mac, mobile operating systems, embedded operating systems, and probably a few toasters somewhere.

On that topic: there will not be a “Vulkan ES”. Vulkan is Vulkan, and it will run on desktop, mobile, VR, consoles that are open enough, and even cars and robotics. From a hardware side, the API requires a minimum of OpenGL ES 3.1 support. This is fairly high-end for mobile GPUs, but it is the first mobile spec to require compute shaders, which are an essential component of Vulkan."

I would assume Google will add support for this ASAP, since most announced GPUs can do ES 3.1 or more (X1 etc.). The November Nexus device will probably be announced with it, along with whatever they call X1's 14nm replacement ;) Google will want to go with the best gaming-driver team in the business for this (unless AMD joins soon, that pretty much leaves NV for now). They say they will still continue with all the old stuff for certain needs (OpenGL/ES will keep being upgraded, I guess). We'll have to see how this all plays out this year as we get more data on exactly how it works (used as a full-on replacement for ES or not? we'll see).
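
For the curious, the explicit flavor of the new API shows up even in the smallest program. A minimal sketch of creating and tearing down a Vulkan instance, assuming the standard vulkan.h loader header (nothing here is vendor-specific):

```cpp
#include <vulkan/vulkan.h>
#include <cstdio>

int main() {
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.pApplicationName = "vulkan-hello";
    app.apiVersion = VK_API_VERSION_1_0;

    VkInstanceCreateInfo ci{};
    ci.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    ci.pApplicationInfo = &app;

    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&ci, nullptr, &instance) != VK_SUCCESS) {
        std::printf("No Vulkan driver available on this device.\n");
        return 1;
    }
    std::printf("Vulkan instance created.\n");
    vkDestroyInstance(instance, nullptr);
    return 0;
}
```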
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310
Is Vulkan glNext, or is that coming later at GDC? I see the glNext guy speaking on the 5th, but I'm confused whether this is it or something else. At least the roster of speakers and sessions seems pretty exciting :)

https://www.khronos.org/news/events/gdc-2015
https://developer.nvidia.com/gdc-2015
The first link for everybody (all the big players have speeches), and 2nd is pure NV stuff at GDC. Interesting week from all.
 

iBoob2

Reputable
Mar 15, 2015
3
0
4,510
@somebodyspecial

Love your take on the whole dynamics of the industry here. Thanks.

I wonder what your thoughts are on the newly unveiled Nvidia Shield console. Do you think Apple will try to compete this fall with a console (Apple TV 4?) with monster specs using this GT7900 chip? And let's say they do attempt to compete with Nvidia's Shield, how would they fare in your opinion?

Nvidia provided the GPU for Sony's PS3, which is why we see Capcom games available for the Shield: it's familiar territory for Capcom, and porting over to Nvidia's familiar architecture just made sense for them, failing company that they are. If that's true, then any developer that made a game for PS3 could also bring it over to the Shield (provided no other contracts persist). I really can't see Apple being a major player in the gaming department now or in the future unless they get some major game devs on board.

This whole set-top box/console business may become obsolete sooner than anybody thinks, once it gets integrated directly into the televisions themselves (a la smart TVs). The TV is the one piece of technology that's a staple in living rooms across the globe. The only issue I see is that the life cycle of televisions right now is what, 5-7 years? That would have to come down to 1-2 years for everybody to turn a profit, which could happen if prices dropped significantly. If people are dumb enough to buy practically the same phone every year for $600+, I don't see why TVs wouldn't be the same; the only thing stopping it is the sheer size and bulk of the product. And then TVs themselves will become obsolete once VR/holographic tech invades our living rooms... but that's quite a few years away, I think.
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


Thanks :)
Over time I think they'll be a huge success (I mean Nvidia and the Android gaming ecosystem as a whole), since their goal here is to kill Wintel (mostly Intel, but DirectX too, since NV can't make an x86 box) and build a full-fledged ARM-based PC: discrete GPU, 500W PSU, 16-32GB of memory, etc., all the stuff in a gaming PC but without the $350 Intel chip and without the Windows license fee. Nvidia would love to stab Intel in the back for the chipset death with their own CPU (Denver, and every rev after it) and, at some point, lock Intel out of using their GPUs. Of course that is years of Android games getting better away, so to speak; AMD would also need to be much weaker first, so Intel has no choice but themselves for gaming cards. NV wants part of Intel's $60B in revenue.

A mere 10% of that more than doubles the company's revenue, and I think they'll do far better than that just due to the sheer number of Android units out there that are about to get FAR better GPUs than last year's. At 14/16nm they'll all be at least K1-level, and they'll grow in power each year. All NV has to do is double the die size and sell it for $100-150 at 4GHz or so, and you have a very nice competitor (assuming a PC-style heatsink/fan and everything else a PC has). But you need more games before doing that, which is why I guess NV aimed low with the $200 console: get cheap consumers first, which drives more units, a massive number of ports, and finally some new content that pushes the limits and justifies a full ARM PC. Then come full 64-bit apps, and it's an all-out war ;)

Apple will compete, but may just end up using NV, since Imagination is named in the suit over SoC GPUs, makes less than $60 million a year, and had to borrow $20 million just to buy MIPS (basically broke compared to every other player). Apple's problem would probably be price, which would keep them from dominating here, and they also have no GPU of their own yet. I think many will end up licensing NV GPU IP for a price; we'll find out more in June/July as the case progresses. Everything above could happen much faster if NV ends up the de facto Android standard in every SoC. I don't think anyone can make a GPU today without losing a suit to NV, AMD, or maybe Intel. AMD is likely just waiting (strapped for cash) until NV's suit is done so it can be used as evidence in their own case, if they have anything worth fighting over; they've lost $6 billion in a dozen years, so they might not have a ton of patents (whereas NV has 7000+, and even Intel couldn't get out of paying them). Apple only gets ports right now because Macs hit 10% market share in PCs. They don't appear to be making gaming the center of everything the way Google actually SAYS it is. Google gets that games bring units, units bring devs, and devs (games, that is) bring users to play them. We all just go where the games are.

Unless TVs become upgradable, I doubt they'll sell yearly like a phone or tablet, though you could have a module that pops in to replace the SoC/memory/storage, I guess. A console or handheld, on the other hand, is portable to anyone's house, on the road, etc. NV will keep revving the tablet, console, and handheld while growing units and games tuned for their hardware; they'll do the same thing they did so well on PC (AMD has a chance to get in here too, if they can before NV has a huge library optimized for their hardware). Consoles can be tossed for better versions far more easily than giving up a $700+ TV. There can also be multiple generations out at once (like Xbox 360/Xbox One), just at different resolutions or whatever, making it easy for devs to focus on the game knowing the underlying hardware is the same, just with more cores/MHz. Vulkan (a common API) will help this over the next few years. Ports will get easier and easier, and so will shipping a game on all platforms at once (Unreal can output to multiple targets, just like Unity, etc.). We just need another year of hardware features consolidating (everyone on ES 3.1, or better, Vulkan) and Android will really take off, since devs will see that everything can do Xbox 360+ levels of detail and gets faster every year. 14/16nm and beyond really ups their game.

http://www.joystiq.com/2015/01/15/gdc-survey-esports-rising-consoles-cresting/
PC/mobile still lead dev interest by double. More was added to console this year, but I think those are an afterthought, done after the PC/mobile versions (where the massive unit counts are in both cases). Interesting times ahead as mobile ramps up power and perf, making consoles (Xbox One/PS4, that is) pointless. I don't believe there will be another generation of traditional consoles (MS/Sony/Nintendo), and I'll be surprised if the current ones last as SoCs ramp up and make them irrelevant through sheer unit counts on mobile. Also, as they ramp up game streaming (and work out the kinks) they'll be able to blow past consoles already: a ton of GRID servers placed in the right locations around the world could mitigate latency as profits come in, making them even better for people who can't afford to constantly upgrade a PC, don't have money for a console and its $60 games, and like cheap Android games when not streaming. I think NV will sell a good number of units and gain more yearly as the catalog grows on both the streaming and Android sides. Imagination will lose AGAIN to NV; NV drove them from the PC years ago, and as an even weaker company today (against a very strong NV) it will happen again. Maybe Apple buys them at some point, or NV does (the better choice), with all those patents handy for sue-happy Apple to trample everyone. That wouldn't make me happy as a consumer, but with $140B in cash now and $200B expected by Christmas, it's a simple cash purchase for Apple, who currently has more money than they know what to do with.
 