Nvidia Tegra K1 Benchmarks from Lenovo ThinkVision 28

Status
Not open for further replies.

therealduckofdeath

Honorable
May 10, 2012
783
0
11,160
@esrever Tablets and phones tend to use identical or very similar processors today, so they're good enough for a reference. And, as I suspected, the Nvidia claim about it being as powerful as a desktop PC or high-end console was nothing but hyperbole... :)
 


Even when listed as the same part, many tablet SoCs will run faster than phone SoCs due to thermal throttling in phones. Tegra 4's power consumption is why it's only in a few tablets and not in any phones.
 

fteoOpty64

Distinguished
Apr 15, 2006
22
0
18,510
This is a nice "surprise" which I think was well done and discreetly persuaded by Lenovo. It is one way to unofficially demonstrate that the K1 is already in "risk production" and thus has the potential to throw competitors into a jam. Here's looking at you, QC! A display like the ThinkVision 28 is innovative in that it can be used for Android games while connected as a main or secondary display for a PC or laptop. For larger tablets of 12 inches, the CPU power can certainly be useful. So let the super-tablets start competing...
 

kratosbellic

Distinguished
May 6, 2011
134
0
18,690
What next, compare an R9 290X with a phone and see how revolutionised the PC platform is? Next, show phones that actually have the Tegra K1 and then compare them with a Nexus 5 or GS4.
 

therealduckofdeath

Honorable
May 10, 2012
783
0
11,160
@esrever The Nexus 7's Tegra 3 was clocked lower than a lot of contemporary phones with that chip, so it's not quite that easy. The Galaxy Note 3 and Note 10.1 2014 Edition use identical processors. It's not quite like laptops and PCs in the phone/tablet market; they usually tend to use identical processors, and differences in performance are often down to the type of RAM and storage. The Tegra 4 was a bad, bad design; that's why almost no one used it.
 

Jeff Gilleese

Honorable
Dec 14, 2013
14
0
10,510
Isn't the Snapdragon 805 supposed to have a 40% graphics performance gain over the 800? If so, wouldn't that put the 805 in a dead heat with the K1 running at its full 2.5 GHz clock speed?
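For what it's worth, the question above is just scaling arithmetic. A minimal sketch, using purely hypothetical benchmark scores (the article's actual numbers are not reproduced here), would look like:

```python
# Back-of-the-envelope check of the claimed 40% uplift.
# All scores below are hypothetical placeholders, not measured results.
adreno_330 = 100.0        # hypothetical Snapdragon 800 GPU score
s805 = adreno_330 * 1.40  # apply the rumored 40% uplift for the 805
k1 = 145.0                # hypothetical Tegra K1 score at full clocks
gap = (k1 - s805) / s805  # relative gap between the two
print(f"805: {s805:.0f}, K1: {k1:.0f}, gap: {gap:.1%}")
```

If the real scores land within a few percent of each other like this, "dead heat" is a fair description.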
 


fteoOpty64

Distinguished
Apr 15, 2006
22
0
18,510


That is the claim, and the 805 also uses a 128-bit memory bus, which gives it a RAM bandwidth advantage, but the trick here is which SoC will ship first in volume. Timing is the key, as products will be launched based on manufacturers' strategies to capture the most market share while stomping the competition using benchmarks as a measurement point. I suspect 12-inch tablets will finally be laptop replacements in a serious way, just because they can have a strong CPU and a great GPU to match their bigger battery size.
Also, QC's claim may be a max of 40% but average around 25%, which is not bad but not great compared to the K1. One point I am puzzled by is that the K1's GPU is claimed to be clocked at 950 MHz, which seems excessively high for mobile, as most clock at around 400 MHz. How do you really keep power drain down to under 3 W for the GPU? Or maybe it can spike to 8 W like the Exynos 5250 did in the Nexus 10? Aggressive power gating would just limit performance.
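The 128-bit bus point above is easy to put a number on. A minimal sketch, assuming LPDDR3-1600 memory (an assumption; the actual memory configuration varies by device):

```python
# Peak memory bandwidth = (bus width in bytes) x (transfers per second).
# LPDDR3-1600 (1600 MT/s) is assumed here for the Snapdragon 805.
bus_bits = 128
transfers_per_s = 1600e6
bandwidth_gb_s = (bus_bits / 8) * transfers_per_s / 1e9
print(bandwidth_gb_s)  # 25.6 GB/s
```

A 64-bit bus at the same data rate would halve that figure, which is the advantage being claimed.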
 

jasonelmore

Distinguished
Aug 10, 2008
626
7
18,995


wtf dude? Tegra 4 is not a bad design; it's pretty damn powerful. Look at the Tegra Note 7 tablet benchmarks, or even Shield for that matter. It blows everything else out of the water, including Apple's A7, which uses the latest Rogue GPU that is supposed to be "bleeding edge".

It came to the market a few months late. Typical smartphone launches are in spring, and Tegra 4 came out in June with Shield. The design wins are not what Nvidia wanted, but they still did a great job considering they are stuck on the same process node as everyone else, and to this day it's one of the only mobile SoCs that can do OpenGL 3.0.
 

blubbey

Distinguished
Jun 2, 2010
274
0
18,790
Imo it'd be nice to see how battery life/power consumption will advance in a few years. Supposedly it's ~5 W, right? It'd be awesome to see a Shield 5 Lite (for example) with similar performance to this but that can last more than a day playing games: the battery-focused counterpart to the full-fat, performance-oriented Shield. Still, what amazes me is that in less than 10 years the 360 has been put in the palm of your hand, and soon you can probably play quite high-fidelity (PS4/X1-rivalling) games; that's hilariously awesome. What's quite sad is that the Wii U's GPU is less powerful than this too. That came out around 2012, and less than 2 years later mobile has caught up. In less than 5 years you'll be hard-pressed to find a high-end phone with as little performance as it.
 

Robert Sydbrink

Honorable
Oct 11, 2013
2
0
10,510
I thought it would score higher on graphics because it has a lot more power than the Adreno 330. And how can the GPU in the A7, with less calculation power, get a higher score? Is every damn cross-platform benchmark better optimized for iOS? Haven't seen one that gives fair numbers yet.
 

therealduckofdeath

Honorable
May 10, 2012
783
0
11,160
@jasonelmore The problem is, it draws too much energy. No point in having a thousand megahertz and all the bells and whistles if the phone or tablet runs out of juice in a few hours. Let's just hope they've learnt that with the K1.
 

teh_chem

Honorable
Jun 20, 2012
902
0
11,010
Same old story from the Tegra line year after year at tech shows: blazing benchmark results in demo devices vs. existing processors, little-to-no detail on the platform, then lackluster comparisons to current processors by the time it's released. Let's just see how it pans out. Tegra 1-4 were overhyped. Where are all of these Tegra 4 and 4i devices? So far, relatively few tablets are using the Tegra 4 despite it being out for 4-6 months, and the 4i apparently still hasn't shipped in devices. And now allegedly "Tegra 5" still won't have on-chip LTE support. I'm curious what Nvidia's internal direction with Tegra really is.
 