AMD CPU speculation... and expert conjecture


I don't care what you set the resolution to, only what block 54 of the EDID reports. The only way to find out is to use an EDID reader to show you the contents of block 54; Extron provides a really good free one.

http://www.extron.com/product/software.aspx?id=edidmanager
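If you're curious what that descriptor looks like in the raw bytes: the base block is 128 bytes, the first detailed timing descriptor sits at bytes 54-71, and the whole block must sum to 0 mod 256 (byte 127 is the checksum). Here's a rough Python sketch of the decode, with field offsets per the VESA EDID 1.3 spec; `edid` is assumed to be the raw base block, e.g. read from /sys/class/drm/*/edid on a Linux box.

[code]
# Decode the preferred (first) detailed timing descriptor of an EDID
# base block, bytes 54-71, per VESA EDID 1.3. A sketch, not a full parser.
def preferred_timing(edid: bytes):
    assert len(edid) >= 128
    # Byte 127 is the checksum: all 128 bytes must sum to 0 mod 256.
    assert sum(edid[:128]) % 256 == 0, "bad EDID checksum"

    d = edid[54:72]
    clock_mhz = (d[0] | d[1] << 8) / 100     # stored in 10 kHz units
    h_active = d[2] | ((d[4] & 0xF0) << 4)   # low 8 bits + high 4 bits
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    return clock_mhz, h_active, v_active
[/code]

Fed the LG dump below, this returns (85.5, 1360, 768), which is exactly the preferred mode the Extron tool reports.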

Here is an example from a certified 42-inch 1080p LG XCanvas display bought a few years ago.

http://www.lg.com/us/tvs/lg-42LG50-lcd-tv

Time: 8:38:21 AM
Date: Monday, February 24, 2014
EDID Manager Version: 1.0.0.14
___________________________________________________________________

Block 0 (EDID Base Block), Bytes 0 - 127, 128 BYTES OF EDID CODE:

0 1 2 3 4 5 6 7 8 9
000 | 00 FF FF FF FF FF FF 00 1E 6D
010 | 01 00 01 01 01 01 06 12 01 03
020 | 80 73 41 96 0A CF 74 A3 57 4C
030 | B0 23 09 48 4C AF CF 00 31 40
040 | 45 40 61 40 81 80 A9 40 01 01
050 | 01 01 01 01 66 21 50 B0 51 00
060 | 1B 30 40 70 36 00 C4 8E 21 00
070 | 00 1E 02 3A 80 18 71 38 2D 40
080 | 58 2C 45 00 C4 8E 21 00 00 1E
090 | 00 00 00 FD 00 30 58 1F 64 11
100 | 00 0A 20 20 20 20 20 20 00 00
110 | 00 FC 00 4C 47 20 54 56 0A 20
120 | 20 20 20 20 20 20 01 83

(8-9) ID Manufacture Name : GSM
(10-11) ID Product Code : 0001
(12-15) ID Serial Number : N/A
(16) Week of Manufacture : 6
(17) Year of Manufacture : 2008

(18) EDID Version Number : 1
(19) EDID Revision Number: 3

(20) Video Input Definition : Digital
(21) Maximum Horizontal Image Size: 115 cm
(22) Maximum Vertical Image Size : 65 cm
(23) Display Gamma : 2.50
(24) Power Management and Supported Feature(s):
RGB Color, Non-sRGB, Preferred Timing Mode

(25-34) Color Characteristics
Red Chromaticity : Rx = 0.636 Ry = 0.336
Green Chromaticity : Gx = 0.300 Gy = 0.690
Blue Chromaticity : Bx = 0.134 By = 0.034
Default White Point: Wx = 0.282 Wy = 0.297

(35) Established Timings I

720 x 400 @ 70Hz (IBM, VGA)
640 x 480 @ 60Hz (IBM, VGA)
640 x 480 @ 72Hz (VESA)
640 x 480 @ 75Hz (VESA)
800 x 600 @ 56Hz (VESA)
800 x 600 @ 60Hz (VESA)

(36) Established Timings II

800 x 600 @ 72Hz (VESA)
800 x 600 @ 75Hz (VESA)
1024 x 768 @ 60Hz (VESA)
1024 x 768 @ 70Hz(VESA)
1024 x 768 @ 75Hz (VESA)
1280 x 1024 @ 75Hz (VESA)

(37) Manufacturer's Timings (Not Used)

(38-53) Standard Timings

640x480 @ 60 Hz (4:3 Aspect Ratio)
800x600 @ 60 Hz (4:3 Aspect Ratio)
1024x768 @ 60 Hz (4:3 Aspect Ratio)
1280x1024 @ 60 Hz (5:4 Aspect Ratio)
1600x1200 @ 60 Hz (4:3 Aspect Ratio)

(54-71) Detailed Descriptor #1: Preferred Detailed Timing (1360x768 @ 60Hz)

Pixel Clock : 85.5 MHz
Horizontal Image Size : 708 mm
Vertical Image Size : 398 mm
Refresh Mode : Non-interlaced
Normal Display, No Stereo

Horizontal:
Active Time : 1360 Pixels
Blanking Time : 432 Pixels
Sync Offset : 64 Pixels
Sync Pulse Width: 112 Pixels
Border : 0 Pixels
Frequency : 47 kHz

Vertical:
Active Time : 768 Lines
Blanking Time : 27 Lines
Sync Offset : 3 Lines
Sync Pulse Width: 6 Lines
Border : 0 Lines

Digital Separate, Horizontal Polarity (+), Vertical Polarity (+)

Modeline: "1360x768" 85.500 1360 1424 1536 1792 768 771 777 795 +hsync +vsync

(72-89) Detailed Descriptor #2: Detailed Timing (1920x1080 @ 60Hz)

Pixel Clock : 148.5 MHz
Horizontal Image Size : 708 mm
Vertical Image Size : 398 mm
Refresh Mode : Non-interlaced
Normal Display, No Stereo

Horizontal:
Active Time : 1920 Pixels
Blanking Time : 280 Pixels
Sync Offset : 88 Pixels
Sync Pulse Width: 44 Pixels
Border : 0 Pixels
Frequency : 67 kHz

Vertical:
Active Time : 1080 Lines
Blanking Time : 45 Lines
Sync Offset : 4 Lines
Sync Pulse Width: 5 Lines
Border : 0 Lines

Digital Separate, Horizontal Polarity (+), Vertical Polarity (+)

Modeline: "1920x1080" 148.500 1920 2008 2052 2200 1080 1084 1089 1125 +hsync +vsync

(90-107) Detailed Descriptor #3: Monitor Range Limits

Horizontal Scan Range: 31kHz-100kHz
Vertical Scan Range : 48Hz-88Hz
Supported Pixel Clock: 170 MHz
Secondary GTF : Not Supported

(108-125) Detailed Descriptor #4: Monitor Name

Monitor Name: LG TV

(126-127) Extension Flag and Checksum

Extension Block(s) : 1
Checksum Value : 131

___________________________________________________________________

System Information Summary:

Processor : AMD A10-6800K APU with Radeon(tm) HD Graphics
Operating System : Microsoft Windows 7 Ultimate
OS Version : 6.1.7601
Service Pack : 1.0

Video Controler:

Device ID : VideoController1
Name : AMD Radeon HD 8670D
Adapter Compatibility: Advanced Micro Devices, Inc.
Video Processor : AMD Radeon HD 8670D (0x990C)
Video RAM : 1.00 GB (1,073,741,824 Bytes)
Availability : Running or Full Power
Driver Version : 13.251.0.0
PNP Device ID : PCI\VEN_1002&DEV_990C&SUBSYS_99011849&REV_00\3&267A616A&0&08
Resolution : 1360x768
Max Refresh Rate : 75 Hz
Min Refresh Rate : 23 Hz
Current Refresh Rate : 60 Hz
Current Scan Mode : None-Interlaced
Status : OK


___________________________________________________________________

Block 54 states a preferred resolution of 1360x768, as nearly all HDTVs do (well, any made before Gen 10 motherglass). Block 72 states that it can do 1920x1080, which makes it a 1080p display (not 1080i or whatever BS you want to add). 1360x768 is not 720p; 1280x720 is 720p. 1360x768 is not 1080i either; 1080i is 1920x540 scanned twice. 1360x768 is the internal resolution the screen's scaler uses to convert the incoming HDTV signal into the proper number of physical elements on screen.

To be 1080p a display doesn't need 1920x1080 as its native resolution; that's a VERY common misunderstanding. To be "1080p" the display merely needs to accept a 1080p/24 signal.

Now if you happened to get a display with 1920x1080 in block 54, count yourself extremely lucky, as those were incredibly rare for screens under 50 inches in size.
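By the way, you can sanity-check the modelines in the dump yourself: the refresh rate is just the pixel clock divided by the total raster, and the horizontal frequency is the pixel clock divided by the horizontal total. The totals are the last numbers of the horizontal and vertical groups of each modeline (1792 and 795 in the first one above). A quick check:

[code]
# Sanity-check a modeline: refresh = pixel clock / (htotal * vtotal).
# Totals taken from the two modelines in the dump above.
def refresh_hz(clock_mhz, h_total, v_total):
    return clock_mhz * 1e6 / (h_total * v_total)

print(refresh_hz(85.5, 1792, 795))    # ~60.0 Hz for the 1360x768 mode
print(refresh_hz(148.5, 2200, 1125))  # exactly 60.0 Hz for 1920x1080
print(148.5e6 / 2200 / 1e3)           # 67.5 kHz horizontal, as reported
[/code]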
 

jdwii

Splendid


Yeah, I'm wondering what he means too; I missed that whole comment. Facts don't lie: the PS3 actually came out ahead by a little (more like a tie to me), and the Wii did win in hardware sales.
 


Initially the 360 had a huge lead on the PS3 due to better exclusive software; honestly, Halo is what sold the 360. Once Blu-ray started picking up steam and Sony got a few titles under their belt, sales started catching up. And now the PS4 is continuing on those sales, as MS has no good system-exclusive launch titles. With most software being multi-platform there is, quite literally, no reason to buy an XBONE over a PS4, while the PS4 enjoys the reputation established with the PS3. Not to mention the RRoD problems the 360 became associated with.

Product sales are all about marketing and establishing an image, not about technical specs. Nobody remembers technical specs; only enthusiasts even argue over them.
 

jdwii

Splendid


Downloading the software now... However, I don't see how that makes the TV 720p instead of 1080p when it's just the preferred resolution? Also, if text looks more crisp at 1080p vs 720p, doesn't that mean it's a 1080p TV, or at least closer to being one?

EDIT: here is what I got. I'm the lucky one.


Time: 11:25:33 PM
Date: Sunday, February 23, 2014
EDID Manager Version: 1.0.0.14
___________________________________________________________________

Block 0 (EDID Base Block), Bytes 0 - 127, 128 BYTES OF EDID CODE:

0 1 2 3 4 5 6 7 8 9
000 | 00 FF FF FF FF FF FF 00 04 72
010 | 65 32 01 00 00 00 31 14 01 03
020 | 80 46 27 78 0A 11 3D A5 53 4C
030 | 9A 26 0F 47 4A BF 6F 00 71 4F
040 | 81 C0 D1 C0 B3 00 81 80 95 00
050 | 01 01 01 01 02 3A 80 18 71 38
060 | 2D 40 58 2C 45 00 BA 88 21 00
070 | 00 18 01 1D 00 72 51 D0 1E 20
080 | 6E 28 55 00 BA 88 21 00 00 1E
090 | 00 00 00 FD 00 38 4C 1E 4B 0F
100 | 00 0A 20 20 20 20 20 20 00 00
110 | 00 FC 00 41 54 33 32 36 35 0A
120 | 20 20 20 20 20 20 01 C9

(8-9) ID Manufacture Name : ACR
(10-11) ID Product Code : 3265
(12-15) ID Serial Number : N/A
(16) Week of Manufacture : 49
(17) Year of Manufacture : 2010

(18) EDID Version Number : 1
(19) EDID Revision Number: 3

(20) Video Input Definition : Digital
(21) Maximum Horizontal Image Size: 70 cm
(22) Maximum Vertical Image Size : 39 cm
(23) Display Gamma : 2.20
(24) Power Management and Supported Feature(s):
RGB Color, Non-sRGB, Preferred Timing Mode

(25-34) Color Characteristics
Red Chromaticity : Rx = 0.641 Ry = 0.321
Green Chromaticity : Gx = 0.297 Gy = 0.603
Blue Chromaticity : Bx = 0.148 By = 0.058
Default White Point: Wx = 0.276 Wy = 0.290

(35) Established Timings I

720 x 400 @ 70Hz (IBM, VGA)
640 x 480 @ 60Hz (IBM, VGA)
640 x 480 @ 67Hz (Apple, Mac II)
640 x 480 @ 72Hz (VESA)
640 x 480 @ 75Hz (VESA)
800 x 600 @ 56Hz (VESA)
800 x 600 @ 60Hz (VESA)

(36) Established Timings II

800 x 600 @ 75Hz (VESA)
832 x 624 @ 75Hz (Apple, Mac II)
1024 x 768 @ 60Hz (VESA)
1024 x 768 @ 70Hz(VESA)
1024 x 768 @ 75Hz (VESA)
1280 x 1024 @ 75Hz (VESA)

(37) Manufacturer's Timings (Not Used)

(38-53) Standard Timings

1152x864 @ 75 Hz (4:3 Aspect Ratio)
1280x720 @ 60 Hz (16:9 Aspect Ratio)
1920x1080 @ 60 Hz (16:9 Aspect Ratio)
1680x1050 @ 60 Hz (16:10 Aspect Ratio)
1280x1024 @ 60 Hz (5:4 Aspect Ratio)
1440x900 @ 60 Hz (16:10 Aspect Ratio)

(54-71) Detailed Descriptor #1: Preferred Detailed Timing (1920x1080 @ 60Hz)

Pixel Clock : 148.5 MHz
Horizontal Image Size : 698 mm
Vertical Image Size : 392 mm
Refresh Mode : Non-interlaced
Normal Display, No Stereo

Horizontal:
Active Time : 1920 Pixels
Blanking Time : 280 Pixels
Sync Offset : 88 Pixels
Sync Pulse Width: 44 Pixels
Border : 0 Pixels
Frequency : 67 kHz

Vertical:
Active Time : 1080 Lines
Blanking Time : 45 Lines
Sync Offset : 4 Lines
Sync Pulse Width: 5 Lines
Border : 0 Lines

Digital Separate, Horizontal Polarity (-), Vertical Polarity (-)

Modeline: "1920x1080" 148.500 1920 2008 2052 2200 1080 1084 1089 1125 -hsync -vsync

(72-89) Detailed Descriptor #2: Detailed Timing (1280x720 @ 60Hz)

Pixel Clock : 74.25 MHz
Horizontal Image Size : 698 mm
Vertical Image Size : 392 mm
Refresh Mode : Non-interlaced
Normal Display, No Stereo

Horizontal:
Active Time : 1280 Pixels
Blanking Time : 370 Pixels
Sync Offset : 110 Pixels
Sync Pulse Width: 40 Pixels
Border : 0 Pixels
Frequency : 45 kHz

Vertical:
Active Time : 720 Lines
Blanking Time : 30 Lines
Sync Offset : 5 Lines
Sync Pulse Width: 5 Lines
Border : 0 Lines

Digital Separate, Horizontal Polarity (+), Vertical Polarity (+)

Modeline: "1280x720" 74.250 1280 1390 1430 1650 720 725 730 750 +hsync +vsync

(90-107) Detailed Descriptor #3: Monitor Range Limits

Horizontal Scan Range: 30kHz-75kHz
Vertical Scan Range : 56Hz-76Hz
Supported Pixel Clock: 150 MHz
Secondary GTF : Not Supported

(108-125) Detailed Descriptor #4: Monitor Name

Monitor Name: AT3265

(126-127) Extension Flag and Checksum

Extension Block(s) : 1
Checksum Value : 201

___________________________________________________________________

System Information Summary:

Processor : AMD Phenom(tm) II X6 1100T Processor
Operating System : Microsoft Windows 8.1 Pro
OS Version : 6.3.9600
Service Pack : 0.0

Video Controler:

Device ID : VideoController1
Name : AMD Radeon HD 6900 Series
Adapter Compatibility: Advanced Micro Devices, Inc.
Video Processor : AMD Radeon Graphics Processor (0x6719)
Video RAM : 2.00 GB (2,147,483,648 Bytes)
Availability : Running or Full Power
Driver Version : 13.251.0.0
PNP Device ID : PCI\VEN_1002&DEV_6719&SUBSYS_E182174B&REV_00\4&5AC7D5A&0&0010
Resolution : 1920x1080
Max Refresh Rate : 75 Hz
Min Refresh Rate : 56 Hz
Current Refresh Rate : 60 Hz
Current Scan Mode : None-Interlaced
Status : OK


___________________________________________________________________
Sorry, I don't know how to do a spoiler.
 
Downloading the software now... However, I don't see how that makes the TV 720p instead of 1080p when it's just the preferred resolution? Also, if text looks more crisp at 1080p vs 720p, doesn't that mean it's a 1080p TV, or at least closer to being one?

EDIT: here is what I got. I'm the lucky one.

Yes, you got lucky as f*ck with that display. The "preferred" resolution is the resolution the display's scaler works at natively, frequently called the native resolution. Thing is, "native resolution" is any resolution the screen can work at without requiring overscanning / underscanning.

LCDs are manufactured with a concept similar to CPUs. There is a gigantic panel known as "motherglass" that is then cut into smaller panels to be shipped to display manufacturers. The vast majority of the world's panels come from Samsung and LG. When producing panels, four or more defects render the entire panel bad, and even one defect forces them to sell the panel as grade B instead of grade A (a zero-defect premium panel). It's similar to CPU dies off a wafer: the larger the die, the more likely you'll have a defect that renders that die useless / subpar. This is what makes big screens so damn expensive; you can't get many of them off a single motherglass, and a single defect renders the entire panel practically useless. A way to get around that is to use a smaller internal resolution, which allows them to have a defect and still produce a viable panel since it's being masked.

That's why I said most HDTVs are running a native resolution much less than 1920x1080. Unless they bought one within the past two or three years, there is a very high chance the manufacturer used this trick to keep the displays affordable. That's something I was told by a friend of mine who happens to work at Samsung Korea.
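To put rough numbers on the motherglass economics: the chance of a panel escaping with zero defects falls off exponentially with its area, the same Poisson math used for CPU dies on a wafer. A toy model follows; the defect density is a made-up illustrative figure, not a real fab number.

[code]
import math

# Toy Poisson yield model: P(zero defects) = exp(-D * A).
# D = 0.5 defects/m^2 is hypothetical, purely for illustration.
def grade_a_yield(area_m2, defects_per_m2=0.5):
    return math.exp(-defects_per_m2 * area_m2)

# Approximate areas of 16:9 panels by diagonal size:
for diag, area in [(32, 0.28), (42, 0.49), (55, 0.83)]:
    print(f'{diag}" panel (~{area} m^2): {grade_a_yield(area):.0%} grade A')
[/code]

Even with that modest defect density, grade-A yield drops from ~87% on a 32-inch panel to ~66% on a 55-inch one, which is why masking a defect behind a lower internal resolution is worth money.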

Spoilers are used with [spoil ][/spoil ] or the button right next to the quote.
 


Never mind, those specs don't always matter. Remember Star Fox? It needed a built-in chip (the Super FX, running at ~10.5 MHz) on top of the SNES's main CPU (the Ricoh 5A22) to run. Someone managed to port the first stage to the Genesis by simply brute-forcing it on the Genesis's main CPU (the legendary Motorola 68000), which runs at 7.67 MHz. No one would argue the Genesis is stronger than the SNES, but there you go.

So yeah, all people care about is pricing and games. And on both, Sony has the edge.
 

jdwii

Splendid


Every console used multiple CPUs/DSPs; the Genesis's sound controller was a bloody Zilog Z80, for crying out loud!

As far as the Genesis/SNES CPUs go, in terms of raw processing power, the SNES wins, even more so when the Super FX is considered. In terms of accessing memory, however, the SNES CPU runs at a much lower rate. So it depends on what you are doing at the time. Either way, it's impressive the Genesis could actually brute-force Star Fox if it wanted.

Point being, all people care about are the games, not the technical specifications behind them. That being said, marketing one console as 1080p when the competition is 720p and $100 more expensive is an easy enough sell to make.
 

CooLWoLF

Distinguished
The Xbox One and PS4 each have good qualities and features (I have both).

I find the UI on the Xbox One to be far superior to the PS4's. And the multimedia capabilities with the Kinect and your cable box work fantastically; I will never go back to using the plain cable box! This was a great decision by MS, in my opinion.

The PS4 definitely has better graphics so far as I have been able to tell. Killzone: Shadow Fall is a beautiful game.

As far as the games, it's way too early to even talk about this. This happens every single new generation; we only get a handful of games/exclusives. Wait till Halo comes out (and it looks like we may get two Halo titles this year alone).
 

Cazalan

Distinguished


I smell a class-action lawsuit coming, which will get twice as confusing for consumers when the 4K marketing BS ratchets up.
 

juanrga

Distinguished
BANNED


As mentioned before, the concept of an APU is not restricted to having a GPU inside:

http://en.wikipedia.org/wiki/Accelerated_processing_unit

The HSA specification explicitly considers the general class of TCUs, which satisfy the HSAIL ISA.



AMD said that 3 years ago and has not changed its plans for exascale supercomputers, because the laws of physics are the same now as they were 3 years ago...

I don't know how you got the 4x ratio, but it is wrong. The AMD chief engineer is using the common HPC practice of quoting DP performance. The APU he mentioned gives 14x more theoretical performance than the R9-290X (which offers only 790 GFLOPS). Moreover, as I already said to you before, you cannot just compare raw FLOPS. In practice the single APU will be much faster than 14 discrete "R9-290X in uber mode" cards working together.

Let me mention that the APU designed by Nvidia engineers is a 20 TFLOPS beast. I will leave it as homework for you to work out how many discrete "R9-290X in uber mode" cards working together you would need to match the performance of the Nvidia supercomputer APU.
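(For anyone checking the arithmetic behind that homework, taking the double-precision figures quoted in this post at face value, raw FLOPS only:)

[code]
# Raw-FLOPS ratio only; the post above argues raw FLOPS aren't
# directly comparable in practice anyway.
r9_290x_dp_tflops = 0.790   # the "790 GFLOPS" DP figure quoted above
nvidia_apu_tflops = 20.0    # the 20 TFLOPS Nvidia exascale APU
print(nvidia_apu_tflops / r9_290x_dp_tflops)  # ~25.3 cards to match
[/code]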

I already said before that the process node is 10nm.

The link you provided doesn't say what you claim it does. On the contrary, he confirms things I have said and you and others denied. He clearly says that the current supercomputer architecture doesn't scale up, and he mentions some of the challenges in the design of exascale supercomputers.

What he says is all well-known. The exascale designs from AMD, Nvidia, and Intel provide explicit solutions to the problems. Regarding interconnects, the AMD design includes a NIC of 40-100 GB/s, and the Nvidia design includes an interconnect of 150 GB/s. This is beyond the buses used in current supercomputers based on the outdated CPU+dGPU architecture...

I have given you before a slide from SC13 (aka Supercomputing 2013) explaining how the supercomputers of tomorrow will use APUs instead of the discrete cards used today:

[Slide: Xeon Phi Knights Landing CPU/GPU form factor, "Today" vs "Tomorrow"]


Can you see the words "Today" and "Tomorrow"? Intel expects the supercomputers of 2015 to use the APU on the right of the slide. What you propose (CPU+dGPU) is outdated and terribly slow.
 


Won't work, as technically the displays are indeed 1080p displays. They accept a 1080p/24 signal and display it on the screen for the viewer to watch; that is all that's required for them to be branded as such. Marketing then runs with it and spins it to mean something similar to what we think exists on PC monitors. I fully expect "4K" displays to have internal resolutions of 1920x1080 or 1600x900. People also need to realize that when you're looking at a display from 8+ feet away, the resolution isn't nearly as important as it is when you're sitting three feet away. HD is more about the source material having enough quality that the display can render it for your environment. That's why all the definitions are based on media resolution vs physical pixel count.
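The viewing-distance point is easy to put numbers on: 20/20 vision resolves roughly one arcminute, so past a certain distance individual pixels blur together and extra resolution buys nothing. A rough sketch, using the usual one-arcminute acuity rule of thumb:

[code]
import math

# Farthest distance at which one pixel still subtends ~1 arcminute
# (20/20 acuity); beyond that, extra resolution is invisible.
# Assumes a 16:9 panel and uses the small-angle approximation.
def max_useful_distance_m(diag_in, h_pixels):
    width_m = diag_in * 0.0254 * 16 / math.hypot(16, 9)
    pixel_m = width_m / h_pixels
    return pixel_m / math.radians(1 / 60)

print(max_useful_distance_m(42, 1920))  # ~1.7 m: 1080p detail gone past this
print(max_useful_distance_m(42, 1280))  # ~2.5 m: even 720p is enough past this
[/code]

Eight feet is about 2.4 m, so on a 42-inch screen at that distance you genuinely cannot tell 1080p from 720p.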
 

Cazalan

Distinguished


You're missing some basic fundamentals of how the Xeon differs from the Phi regarding reliability, scalability, and compute. Take for instance the Large Hadron Collider compute network. It is a distributed network of one Tier-0 compute site, 11 Tier-1 compute sites, 140+ Tier-2 compute sites, and uncounted Tier-3 access sites. You can bet that the Tier-0 and Tier-1 compute sites are clustered Xeons with hardware redundancy, or comparable POWER-based systems; they can't afford any downtime, as the data collection process is mission critical.

As you get down to Tier-2, the local storage is less critical, but they still have controlling nodes that are Xeons. The compute-only portion can be done twice, thrice, or restarted; this is where your GPU compute or Phi comes into play. Only at the bottom rungs of this tiered network will 100% Phi even be possible. This is why there will never be a 100% Phi HPC. Even Intel is saying they're only planning to make 1 Phi for every 13 Xeons.

Now this ratio will likely change over time as Intel increases the capabilities of the Phi, but there is a long road ahead before these even have the possibility of merging into a singular HPC solution. Maximum reliability and maximum performance are divergent paths. Also, with 8,000 scientists running experiments, there's a very high probability the workloads will continue to run better on Xeon than on Phi.
 

Cazalan

Distinguished


I've tried explaining this to you before, but there is no fundamental difference between the interconnects of the recent past, today, and the interconnects of tomorrow. They are all high-speed serial I/O, or, to use the more popular term, MGTs (multi-gigabit transceivers). The future performance of these interconnects is simply an extrapolation of what they think will be achievable. There are parts shipping today with 32 Gbps/lane; they're just expensive. That's why PCs lag behind with their 8 Gbps/lane PCIe 3.0 ports. Any number of protocols can be used with MGTs (PCIe, InfiniBand, Dragonfly, SATA, Ethernet, Aurora, Fibre Channel, etc.).

http://en.wikipedia.org/wiki/Multi-gigabit_transceiver

Fast forward a couple of years to when 50 Gbps transceivers are widely available, and that will give you the 100 GB/s of bandwidth in a typical x16 channel. For the 150 GB/s mark you'll need 75 Gbps transceivers. Again, this has nothing to do with CPU + dGPU or CPU + CPU or APU + APU. It's just the natural progression of the interconnect technology.
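The arithmetic behind those figures, for anyone following along, is just lanes times line rate over 8 bits per byte, before line-coding overhead (8b/10b, 64b/66b, etc.):

[code]
# Aggregate link bandwidth: lanes * line rate / 8 bits per byte.
# Raw rate only; real links lose a few percent to line coding.
def link_gb_s(gbps_per_lane, lanes=16):
    return gbps_per_lane * lanes / 8

print(link_gb_s(8))    # 16 GB/s: a PCIe 3.0 x16 slot, pre-overhead
print(link_gb_s(50))   # 100 GB/s with the 50 Gbps transceivers
print(link_gb_s(75))   # 150 GB/s with 75 Gbps transceivers
[/code]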
 
Bandwidth isn't an issue, and hasn't been for a while. It's latency that hurts you the most. You send a set of commands and need to wait for their acknowledgement to return before you send another set, and during that pause you can't do much else. Big data is more focused on I/O latency than bandwidth right now, with the sole exception being storage, as that can be buffered and multiplexed easily enough.
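That latency point is the bandwidth-delay product in disguise: a synchronous request/acknowledge exchange moves at most one window of data per round trip, no matter how fat the pipe. A toy illustration with made-up numbers:

[code]
# Synchronous request/ack: at most one window in flight per round trip,
# capped by the raw link rate. All figures here are illustrative.
def effective_gb_s(window_bytes, rtt_us, link_gb_s):
    per_rtt = window_bytes / (rtt_us * 1e-6) / 1e9
    return min(per_rtt, link_gb_s)

# A 100 GB/s link, a 2 us round trip, 64 KiB in flight at a time:
print(effective_gb_s(64 * 1024, 2.0, 100))  # ~32.8 GB/s: latency-bound
[/code]

Triple the bandwidth of that link and the answer doesn't move; halve the round-trip time and it doubles.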
 

juanrga

Distinguished
BANNED


I have provided you two quotes from Intel HPC representatives (Walczyk and Hazra) saying clearly that Intel plans to sell HPC systems made only of the new socketed Knights Landing Phi, without any Xeon CPU. You can continue deleting the quotes and the links... but you are no longer disagreeing with me; you are disagreeing with Intel.




The APU-APU interconnect is a sped-up version of the existing CPU-CPU interconnects plus the extensions needed for exascale supercomputing. As I have said to you plenty of times, the current computer architecture cannot simply be scaled up to exascale. You cannot take existing PCIe/Dragonfly, triple the bandwidth, and call it done. New paradigms are needed.

A particularly interesting extension is the development of a self-customizable memory model based on the needs of the application: applications can have flat or hierarchical memory depending on what is optimal for them. This unique memory model is not available by using current interconnects and just scaling the bandwidth, as you suggest...

Moreover, the key to exascale performance is in the interconnects inside each APU. No CPU-dGPU interconnect can compete with that.
 

ColinAP

Honorable


I have never heard it claimed anywhere that a 1080p TV isn't actually 1080p native. In Europe at least, if something is marketed as "Full HD" (as opposed to "HD Ready") or 1080p, then it has to have a native display resolution of at least 1920x1080, or else the EU or country-level Trading Standards would come down on the company like a ton of bricks.
 


The problem is your use of the word "heard". The devil is in the details, notably in the technical definitions used. You, as a layman, think "native resolution" and immediately associate that with 1920x1080 for a total of 2,073,600 physical pixels. That's not the technical definition, though. What resolution does the screen accept without overscan / underscan? Above I posted the EDID information off a 42-inch LG XCanvas 1080p display. Block 54 denotes its "preferred" resolution of 1360x768, with a second supported resolution of 1920x1080. The presence of that second supported resolution is what makes it, technically speaking, a 1080p screen. The number of actual pixels present is irrelevant to the classification of the display; only its supported resolutions matter. This is very common amongst older displays, prior to Gen 10 (sometimes 9) motherglass.

Btw, these displays are indeed sold in Europe. Where do you think the LCD panels for all those screens come from?
 

colinp

Honorable


I believe you are mistaken.

It would cause a public outcry if news ever emerged that a TV marketed as having a resolution of 1920x1080 doesn't actually have it. I've not heard about it; I've not read about it anywhere, any time. In the absence of any evidence from a mainstream source, I am more inclined to believe that your TV is either faulty, or your software is misreporting the resolution, or you have been misled by LG, probably unintentionally.
 


And you can believe whatever you want to. I merely provide the technical information, information that's easy to verify. If you have an HDTV in your house and a laptop or other HDMI-enabled device, you can easily check it with the EDID program posted above. The net is replete with people asking why their 1080p display lists 1360x768 as the recommended resolution, and why the text is hard to read if they set it to 1920x1080.
 
I started a new sticky on crypto-mining hardware adjacent to this one in the CPU section, and invite you all to participate and help educate other users on the appropriate hardware required.

Please feel free to post some of the graphics and CPU benchmarks, and if you have advice, please assist others who post on that topic.

Thanks.
 

ColinAP

Honorable


The net is replete with stupid people.
 

juggernautxtr

Honorable
Mmmmm, gonna have to look at my LG 42" HD 1080p TV that I use as a monitor, though I paid $700 for mine.

As for APUs, they will stay around; I don't see them going away. They're too useful for the laptop/tablet market and the low-end desktop, and they'll most likely be working their way up as power and performance improve. Given how many of these they could fit on a blade-style server, the benefits outweigh any discrete option for power/space.
 