1080p and 1366x768?

Status
Not open for further replies.

wheresperry

Honorable
Jul 13, 2012
Hi everyone,

I feel like this question is based on at least some solid thought, but on the other hand I can't help feeling I'll get laughed off the internet for asking it :na:

Would a laptop with 1366x768 resolution be able to play 1080p videos? If so, would they be true 1080p?

Thanks in advance, lol
 

bucknutty

Distinguished
It's the player's job to scale the video down to fit the screen. For example, when you grab the corner of the media player window and shrink it into a corner of the screen, Windows Media Player just rescales the video to 480x270 or whatever the size of your window is.

In short, the video will play and look great, but it will only be displayed at the maximum resolution of your monitor.
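To make that concrete, here's a rough Python sketch (my own illustration, not code from any actual player) of the fit-to-screen math a player does:

```python
# Shrink a video frame by one common factor in both directions so it
# fits the screen without being stretched or cropped.
def fit_video(src_w, src_h, screen_w, screen_h):
    scale = min(screen_w / src_w, screen_h / src_h)  # keep aspect ratio
    return round(src_w * scale), round(src_h * scale)

# A 1920x1080 video on a 1366x768 laptop panel:
print(fit_video(1920, 1080, 1366, 768))  # -> (1365, 768)
# The player still decodes every 1920x1080 frame; it just resamples each
# one down to ~1365x768, so you never see true 1080p detail.
```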
 

edit1754

Distinguished
May 14, 2012
Which laptop are you looking at, and how much are you paying for it?

The objective answer to your question is yes, 1920x1080 media will scale down to 1366x768, and no, it will not be true "1080p".

But if it is reasonable to do so (and if this laptop is 15.6"), you should make a point of avoiding a 1366x768 resolution in a 15.6" display. 15.6" 1366x768 panels fit very little onscreen, make everything look large, and tend to have very poor image quality due to low contrast. This, however, is not the case for 15.6" 1920x1080 displays.
 
Solution

wheresperry

Honorable
Jul 13, 2012

http://shop.lenovo.com/us/laptops/ideapad/y-series/y570

I know, I know. There is a good deal floating around out there for a better HP with a 1920x1080 screen... but I just don't really like it that much. I used one of my friends' laptops for a fairly long time (1366x768, 15.6", same as the Lenovo), and honestly there was never a point where I had a negative thought about the screen, and I'm very attentive when it comes to visuals.
 

edit1754

Distinguished
May 14, 2012
Consider the ASUS N53SM-AS51 instead.
- Since the version of the GT 555M included with the Y570 is actually a misbranded, higher-clocked GT 540M / GT 630M, the GT 630M GPU in this ASUS is quite similar. The GT 630M and the Y570's GT 555M both have 96 shaders, and both receive lower benchmark scores than the 'true' GT 555M with its 144 lower-clocked shaders. The Y570's GPU tends to benchmark closer to a GT 630M than to a 'true' GT 555M, although the Lenovo's GPU does have faster memory.
- Buying from Newegg does not incur sales tax charges, but buying from Lenovo.com does. Therefore, the price difference is actually less than it appears to be.
- The ASUS N53SM-AS51 includes a Core i5 processor, not a Core i7. However, which processor you get should be one of the least of your concerns. Gaming is bottlenecked by the GPU long before the choice of CPU matters, especially with a lower-midrange GPU such as the ones in question here. And for basic usage (multitasking, movie watching, MS Office, email, web browsing, etc.), it makes essentially no difference which processor you have. Any perceived slowness during basic usage will be due to the hard drive speed, and can generally only be remedied by installing an SSD.

http://www.amazon.com/gp/product/B007MW73A4/ref=s9_simh_gw_p147_d0_i1?pf_rd_m=ATVPDKIKX0DER&pf_rd_s=center-3&pf_rd_r=1G4R11RPSPA91D9HF1TF&pf_rd_t=101&pf_rd_p=470938811&pf_rd_i=507846

But one of the biggest issues with 15.6" 1366x768 displays, even if you don't notice how little they let you fit onscreen, is that they generally cannot properly reproduce dark colors, since they tend to use cheap low-tier LCD panels. Find a 15.6" 1366x768 laptop and display a dark image on the screen. You should notice the issue right away: black isn't dark, and actually appears as a grayish, purplish, or bluish gradient from the top to the bottom of the screen, sometimes with a quick "dark point" after which it starts getting lighter again. This will change depending on how you tilt the screen. You will really notice the issue if you compare it to any decent desktop monitor.
 
Sorry, but shaders do not matter more than clocks for GPUs. As shader count increases, the performance gained from each additional shader drops off sharply, because there are limits to how parallel a job can run. Clock frequency increases, by contrast, scale performance almost perfectly nearly all of the time, unless memory bandwidth holds the GPU back. This is why the 1280-shader Radeon 7870 at 1GHz can just about match the 1792-shader Radeon 7950 at 800MHz. The 7950's shader count advantage is much larger than the 7870's clock frequency advantage, and the 7950 even has a substantial memory bandwidth advantage, yet it is not noticeably faster than the 7870.

This drop-off means that low shader counts scale better than high ones (going from 512 to 1024 shaders helps more than going from 1024 to 2048), so low-end GPUs suffer less from this effect than high-end GPUs, but they still don't gain as much from added shaders as from sheer clock frequency increases. The same is much less true for embarrassingly parallel compute tasks, but it is nearly unavoidable with gaming performance.
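To put rough numbers on that (a back-of-envelope sketch of my own; the parallel fraction p is an assumption, not a measured value), here is the 7870-vs-7950 comparison run through Amdahl's law:

```python
# Amdahl's law: with n parallel units and parallel fraction p of the work,
# speedup(n) = 1 / ((1 - p) + p / n). Clocks scale the whole chip; extra
# shaders only speed up the parallel part. (Crude model: ignores ROPs,
# memory bandwidth, etc.)
def perf(shaders, clock_mhz, p):
    speedup = 1.0 / ((1.0 - p) + p / shaders)
    return clock_mhz * speedup

# Radeon HD 7870: 1280 shaders @ 1000 MHz; HD 7950: 1792 shaders @ 800 MHz
for p in (0.999, 0.9995, 0.9999):
    ratio = perf(1792, 800, p) / perf(1280, 1000, p)
    print(f"p={p}: 7950/7870 perf ratio = {ratio:.2f}")
# -> 0.91, 0.97, 1.07: for plausible p the two land within a few percent
# of each other, matching the observation that the 7950 isn't clearly faster.
```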
 

wheresperry

Honorable
Jul 13, 2012

I honestly don't know a lot about computers; could you do your best to explain this in a "simpler" way?

If it matters, these are the specs for the 555M variants, with the one in the Lenovo marked:

144 cores 709MHz (GF106), 128Bit GDDR5, e.g. MSI GX780
144 cores 590MHz (GF106), 192Bit DDR3, e.g. Dell XPS 17, Alienware M14x
144 cores 590MHz (GF106), 128Bit DDR3, e.g. Schenker XMG A501 / A701 (Clevo W150HRM / W170HN)
96 cores 753MHz (GF108), 128Bit GDDR5, e.g. Lenovo Y570p / Y560p  <-- this one
144 cores 525 MHz (GF116), 128 Bit DDR3, e.g. Medion Akoya P6812
 


http://en.wikipedia.org/wiki/Amdahl%27s_law
That gives a more detailed explanation of it. As you spread a task among more and more threads, it gets harder and harder to utilize them well with complex tasks, and with a GPU, the shader cores aren't the only part of the hardware that affects performance. No matter how many cores you have, there are also the ROPs and more to consider. Increasing the GPU frequency also increases the frequency of that other hardware, but increasing core count doesn't help when you don't also increase the number of ROPs and the memory bandwidth to feed them. This is just one example. All of this, along with the link at the top of this post, causes GPU shader scaling to get worse and worse as counts rise, among even more reasons (such as the increasing distance, measured in transistors, between each part of the hardware).

I'm not very familiar with many lower-end mobile Nvidia cards, but if I had to guess, I'd think that the MSI would somewhat beat the Lenovo, which in turn would beat the rest of the laptops more significantly. The Lenovo's GPU has the higher frequency, but a less-than-10% frequency advantage is unlikely to beat a 35% core count advantage. However, that's just raw graphics performance; which laptop is better for you overall might not depend strictly on graphics performance. How the aforementioned ASUS fits in here, I'd need more time to look into, and I'm out of time for the day until much later tonight.
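For a very rough sense of scale (my own back-of-envelope numbers based on the spec list above, not benchmarks), you can multiply cores by clock:

```python
# Crude "raw shader throughput" estimate: cores * clock. Real performance
# also depends on ROPs, memory bandwidth, and how well the shaders scale,
# so treat these as paper comparisons, not benchmark results.
gt555m_variants = {
    "MSI GX780 (144 @ 709 MHz, GDDR5)":  (144, 709),
    "Lenovo Y570 (96 @ 753 MHz, GDDR5)": (96, 753),
    "Dell XPS 17 (144 @ 590 MHz, DDR3)": (144, 590),
}

for name, (cores, mhz) in gt555m_variants.items():
    print(f"{name}: {cores * mhz // 1000} units")
# -> MSI ~102, Lenovo ~72, Dell ~84: the MSI variant leads on paper, but
# since extra shaders scale imperfectly, the real gap should be smaller.
```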
 

wheresperry

Honorable
Jul 13, 2012

Thanks.

1st point (the MSI somewhat beating the Lenovo): Could you elaborate on this a bit? I think I get what you're saying, but the wording might be a little weird and I want to make sure.

2nd point (overall fit vs. raw graphics): This is the thing: I've fallen in love with the Lenovo. The ASUS, while a great laptop from obviously a very good company, just seems a bit... I don't know... soulless/generic to me. I'd be willing to compromise slightly to get the Lenovo, which I would thoroughly enjoy using.

How much better would you say the Asus performs regarding gaming than this Lenovo?

http://shop.lenovo.com/SEUILibrary/controller/e/web/LenovoPortal/en_US/catalog.workflow:item.detail?GroupID=457&Code=08626JU&category-id=F41BC8E1656C5F85171FBDB172571912

(coupon gets it down to $700 flat)

Thanks for the feedback everyone.
 

edit1754

Distinguished
May 14, 2012
The ASUS doesn't perform better for gaming, but it looks better for gaming and for everything else due to its significantly better display quality, and it is also better for general usage because you can fit more on your screen.
 


1. I was saying that I don't deal much with Nvidia's low-end graphics cards (I'm slowly working my way into them), but judging from the specs of each graphics setup, the MSI's graphics beat the Lenovo's somewhat, and the rest of the laptops don't even come close to the MSI and the Lenovo.

I think that edit1754 answered your other question excellently.
 

cbrunnem

Distinguished


then how come the new 680 has 1500 plus cores?
 


What do you mean? The 680 has 1536 CUDA cores simply because that is how many Nvidia chose to give it. Shader count increases don't scale perfectly, but they do scale upwards, and they are an easy way to add performance. Which do you think is easier for getting, say, another 40% performance: increasing the core count, or redesigning the cores for greater performance and/or die-shrinking them (a die shrink still requires a redesign, just not as much of one as a new architecture)? The answer is simple. It is much easier and cheaper in R&D to make a larger GPU with more cores than it is to die-shrink a GPU or, even worse, create a new architecture. Companies have to do those things eventually anyway, because core count can only be increased so far, but adding cores makes for a great way to differentiate lower-end and higher-end cards.
 

cbrunnem

Distinguished


ah i thought you were talking about core counts. should have read more carefully
 