AMD Unveils ATI Eyefinity Six Monitor Output

kick ass, I think 3 would work best for most games: a center screen to concentrate on and two side screens for peripheral vision. 6 screens, unless they had no bezels, would be terribly distracting.

Oh, they also did some of these tests using 6 30" monitors, which works out to 7680x3200; WoW pulled 80 fps with maxed settings. Very cool.
 
[citation][nom]corei5_equals_pwnd[/nom]AMD just keeps batting them right out of the park. I'm definitely buying some more AMD products in the near future. They just pissed all over the over-hyped Core i5 parade.[/citation]



So, now the score is Intel:785234, AMD:1?
 
mavroxur: I hereby nominate you for the 2009 Fanboy of the Year award.
 
The big deal about this isn't 2D at all. It's 3D. The math works like this:

With a 46" 1080p LCD screen positioned 5 feet from your face, the visual arc between pixel (960, 540) and pixel (961, 540) is roughly 0.0199 degrees.

Now assume the viewer's pupils are 2.75" apart and alternating-image 3D glasses-style technology. If the "left eye" image of an object is drawn 2.75" minus 1 pixel to the left of the corresponding "right eye" image, the object appears to be about 658 feet away. At 2.75" minus 2 pixels, it appears to be only about 329 feet away.

So by increasing the horizontal resolution, you also increase the resolution of the perceived depth of rendered objects. Over the same visual arc as above, with twice the horizontal resolution you can display objects that appear to be ~1300 feet, ~650 feet, ~440 feet and ~330 feet away. The closer an object is, the more accurate the shifted-image approach is for representing depth, but even so, this should have a significant impact on simulated-depth resolution in 3D.

That is what I see as the benefit of being able to support such dramatically higher resolutions.
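
If it helps, here's that geometry as a minimal Python sketch (the function name and defaults are mine; it assumes a 16:9 panel and that the rays from each eye through its on-screen image converge behind the screen):

[code]
import math

def perceived_depth_ft(px_offset, h_px=1920, diag_in=46.0,
                       view_in=60.0, ipd_in=2.75):
    """Apparent distance of an object whose left-eye image sits
    (ipd minus px_offset pixels) to the left of its right-eye image.

    Similar triangles give image separation s = ipd * (1 - V/D),
    so D = ipd * V / (ipd - s), and here ipd - s = px_offset * pitch.
    """
    width_in = diag_in * 16 / math.hypot(16, 9)  # 16:9 panel width
    pitch_in = width_in / h_px                   # one pixel, ~0.0209" here
    return view_in * ipd_in / (px_offset * pitch_in) / 12  # inches -> feet

print(f"1 px short: ~{perceived_depth_ft(1):.0f} ft")  # ~658 ft
print(f"2 px short: ~{perceived_depth_ft(2):.0f} ft")  # ~329 ft
# Doubling the horizontal resolution halves the pixel pitch, so the
# representable depths fill in: ~1300, ~650, ~440, ~330 ft.
for k in range(1, 5):
    print(f"{k} px @ 3840 wide: ~{perceived_depth_ft(k, h_px=3840):.0f} ft")
[/code]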
 
For those asking why a 50" 1080p HDTV couldn't just be used instead of 6 monitors: please understand that resolutions such as 1080p are always 1080p. 1920 x 1080 simply gets scaled up; you have 1920 pixel columns and 1080 pixel rows regardless of physical size, so a bigger set just has bigger pixels. The aspect ratio is 16:9: multiply 1920 by 9, then divide the product, 17,280, by 16, and you get 1080. Most monitors use 1920 x 1200 (as mlopinto2k1 explained), which is a 16:10 aspect ratio.
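
A quick sanity check of that arithmetic (a minimal Python sketch; the helper name is mine):

[code]
def rows_for(width, ratio_w, ratio_h):
    """Pixel rows implied by a pixel width and an aspect ratio."""
    return width * ratio_h // ratio_w

print(rows_for(1920, 16, 9))   # 1080 -> a 1080p HDTV, at any physical size
print(rows_for(1920, 16, 10))  # 1200 -> the typical 16:10 PC monitor
[/code]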

Here is a separate article from CNET on the subject:

http://news.cnet.com/8301-13924_3-10350261-64.html?tag=newsEditorsPicksArea.0

AMD is using a resolution of 5760 x 2400 for each monitor. That would be an aspect ratio of 24:10 (so, I'm guessing Samsung would make custom monitors for it).

The math of it: 1920 x 1080 equates to 2,073,600 pixels per frame, roughly 2 megapixels.

1920 x 1200 equates to 2,304,000 pixels, roughly 2.3 megapixels.

5760 x 2400 equates to 13,824,000 pixels, roughly 14 megapixels.

13,824,000 x 6 monitors equates to 82,944,000 pixels, roughly 83 megapixels.

With the 3D, you multiply 82,944,000 x 3, which is 248,832,000, close to their target of 268 megapixels. I'm not sure if they are doing something a little extra, but the math is off by about 20 megapixels.
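
For anyone replaying that arithmetic, here's a minimal Python sketch of the numbers as given (see TeraMedia's note below on what 5760 x 2400 actually covers):

[code]
# Pixel counts as quoted above.
for name, (w, h) in {
    "1080p HDTV":    (1920, 1080),
    "16:10 monitor": (1920, 1200),
    "5760 x 2400":   (5760, 2400),
}.items():
    print(f"{name}: {w * h:,} px (~{w * h / 1e6:.1f} MP)")

total = 5760 * 2400 * 6              # six panels, as assumed above
print(f"x6 monitors: {total:,} (~{total / 1e6:.0f} MP)")
print(f"x3 for 3D:   {total * 3:,} (~{total * 3 / 1e6:.0f} MP)")  # ~249 vs 268 MP
[/code]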

 
I work in the financial industry, where multi-monitor setups are everywhere. We have plenty of 6-, 8- and even some 12-monitor configs here. We currently just use a mix of 4-port and 2-port ATI Fire or NV Quadro cards. Six ports on a single card would save space and most likely money; three 4-port cards in a single SFF PC get pretty crowded.
 
[citation][nom]stevo[/nom]AMD is using a resolution of 5760 x 2400 for each monitor. That would be an aspect ratio of 24:10 (so, I'm guessing Samsung would make custom monitors for it).[/citation]

stevo, I think they're using 1920 x 1200 for each monitor. 1920 x 3 = 5760, and 1200 x 2 = 2400. That's a 3x2 array of 1920 x 1200 monitors, yielding 5760 x 2400 resolution...
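
In other words (a minimal Python sketch, on the same assumption of 1920 x 1200 panels in a 3x2 grid):

[code]
panel_w, panel_h = 1920, 1200   # one 16:10 monitor
cols, rows = 3, 2               # the 3x2 Eyefinity wall
wall_w, wall_h = panel_w * cols, panel_h * rows
print(f"{wall_w} x {wall_h} = {wall_w * wall_h:,} px (~{wall_w * wall_h / 1e6:.1f} MP)")
# 5760 x 2400 = 13,824,000 px: that's all six monitors combined,
# so there's no further x6 to apply.
[/code]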

I still say that this is all about 3D, not 2D. I've been wondering for a while now how they were going to get sufficient resolution to make 3D look good for mid-range and distant objects, given that the distance between pixels is visible to the naked eye.
 
[citation][nom]fanboy_committee[/nom]mavroxur: I hereby nominate you for the 2009 Fanboy of the Year award.[/citation]



And you've been nominated for the "Hiding behind a fake forum name to talk crap" award.

 
With the new i7s and i5s coming out (and i3 just around the corner), people were wondering how AMD was going to stay alive. The answer is ATI. This stuff is so cool.
 
Other than some specific business interests and occasional gamers with money to burn, this won't become viable until they can get rid of the lines in between.
 
There is a lot of misunderstanding surrounding this hype.
As has been mentioned above, the multi-monitor setup is far better than any single display possible; you get much higher resolution. We're not talking about pixel doubling here.
I've watched Devil May Cry being played on one of AMD's six-monitor setups. *AWESOME* You notice the bezels for the first five minutes, then your mind just filters them out.
Note also that this could not be accomplished before, even with a six-head graphics card. Games would only play on one monitor, or could be pixel-doubled onto multiple monitors. With the introduction of DX11, you can run on six monitors with each monitor set to max resolution, if the game allows it.
 
DAMNIT! I wrote this whole paragraph, forgot to sign in and LOST IT! Ahhh!!! Anyway, did my math wrong before, yada yada yada. It would be 5760x2400 for 16:10 and 5760x2160 for 16:9. No matter how you look at it, it is more pixels than a large 52" 1080p monitor, obviously!!! But taking 6 52" 1080p widescreen TVs and mounting them together would be sick!
 
[citation][nom]mlopinto2k1[/nom]DAMNIT! I wrote this whole paragraph, forgot to sign in and LOST IT! Ahhh!!! Anyway, did my math wrong before, yada yada yada. It would be 5760x2400 for 16:10 and 5760x2160 for 16:9. No matter how you look at it, it is more pixels than a large 52" 1080p monitor, obviously!!! But taking 6 52" 1080p widescreen TVs and mounting them together would be sick![/citation]

hells yes, it would be like a screen room. I would be surprised if, after this technology unveiling, someone doesn't start working on a large screen (40"+) that has a resolution around 5800x2300. It would be expensive, but so are 6 1920x1080 or 2560x1600 monitors, and there wouldn't be any bars.
 
That "eye-definition optical clarity" boasted by ATI will not be anywhere near reality on traditional LCD monitors, of which can only produce at best 100 DPI. (A dense-pixel LCD costs about $8,00 for only a 22" monitor)

Not to complain, though; it's still really cool, and at this point I've had more than enough reasons to stop buying Nvidia cards.
 
[citation][nom]tkgclimb[/nom]hells yes, it would be like a screen room. I would be surprised if, after this technology unveiling, someone doesn't start working on a large screen (40"+) that has a resolution around 5800x2300. It would be expensive, but so are 6 1920x1080 or 2560x1600 monitors, and there wouldn't be any bars.[/citation]
It'll be insane before we know it. I can just see it, "30 inch 5760x2400"... or larger as you suggested.
 
"well i guess i know whose gpu im gonna get. ATI all the way. and that is just with one card wait till you get crossfire. im am so glad i waited to upgrade."

If only it were that simple...
I own two ATi 4870 X2s for quadfire, and it's amazing... when it works.
You'd be surprised at the number of games that don't have profiles, which results in either horrible performance or Crossfire not working at all:
Street Fighter IV has been out for 3-4 driver releases, and it has yet to get a Crossfire profile. (Basically, you HAVE to disable Catalyst AI, and with it Crossfire, for the game to work; otherwise it bugs out in the ship stern map.)
Mini Ninjas: two driver releases in, it runs at 10 FPS... on quad chips... amazing experience. =\
Batman: Arkham Asylum doesn't seem to take advantage of CrossfireX at all.

If ATi got their act together with Crossfire, this would make so much sense, but they haven't... so if anything I'd buy just a single-chip card, which is a shame, because this setup would benefit a lot from Crossfire.
 
As a multi-monitor user... I run 3 monitors... BUT I've never been a big fan of spanning a single image, game or application over multiple monitors. The monitor bezels are extremely annoying and ruin the experience. I use my multi-monitor display so I can have three programs up at the same time. Before this ever becomes mainstream (if ever), bezel-less monitors need to be made.
 
[citation][nom]TeraMedia[/nom]stevo, I think they're using 1920 x 1200 for each monitor. 1920 x 3 = 5760, and 1200 x 2 = 2400. That's a 3x2 array of 1920 x 1200 monitors, yielding 5760 x 2400 resolution...I still say that this is all about 3D, not 2D. I've been wondering for a while now how they were going to get sufficient resolution to make 3D look good for mid-range and distant objects, given that the distance between pixels is visible to the naked eye.[/citation]

Thanks for clarifying that TeraMedia. The other article wasn't very clear on that.
 
why the fuck would you put 6 small screens together when you can get one big screen? fucking lines in everything, motherfucker
 