Philips Intros Quad HD, UltraWide Monitors

Status
Not open for further replies.

drew455

Distinguished
Jun 17, 2007
212
0
18,690
Stop making them wider. I hate scrolling up/down. It makes programming harder when you can only see 12 lines at a time. Next thing ya know it's gonna be "4K is here: 4096x640".
 

pacomac

Honorable
Sep 24, 2012
129
0
10,680
Most people think of 1080p as HD, which certainly doesn't put these anywhere near Quad HD! This is the same resolution iMacs have had for years now.
 
Guest

Guest
They need to do away with those lame 1080 displays and move up in the world. I've been gaming on a 30-inch 2560x1600 display for two years now and never looked back. Yeah, they cost a lot, but so do cars, houses, and cable bills, and people still buy them...
 

TheCapulet

Distinguished
Jan 21, 2009
123
0
18,680
I imagine these super-wides are less for gamers looking for the best aspect ratio and more for the medical industry, where Philips holds a huge market share bordering on a monopoly.
 

blibba

Distinguished
Aug 27, 2008
166
0
18,680


Happens to be the same aspect ratio as many films.
 

InvalidError

Titan
Moderator

Buy a monitor with pivot support.

For programming, I flip my LCD from 1920x1080 to 1080x1920.

I wish 1200p were more common, since 1080xNNNN often means having to scroll horizontally.
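
For anyone wanting to script that flip, a minimal sketch, assuming a Linux/X11 setup with xrandr installed; the output name "DP-1" is a placeholder, so check "xrandr --query" for your real one:

    import subprocess

    # Rotate the display 90 degrees into portrait (1080x1920).
    # "DP-1" is an assumed output name; substitute yours from "xrandr --query".
    subprocess.run(["xrandr", "--output", "DP-1", "--rotate", "left"], check=True)

    # To restore landscape:
    # subprocess.run(["xrandr", "--output", "DP-1", "--rotate", "normal"], check=True)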
 

alidan

Splendid
Aug 5, 2009
5,303
0
25,780


Stop coding at 72-point font? I get 57 lines on 1920x1200 (rough math at the end of this post).



Talking to a person who has $1000-2000 to drop on a monitor and another $1000-2000 on GPUs.
Cars are required.
A TV is required (if you have kids, you know what the hell I mean).
A PC that can play a game at decent settings at 2560x1600... that is so optional it's hard to justify, and I have been in the market for a 2560x1600 for years now, not for gaming mind you, but for the extra space I would have.



What sucks about these monitors is that they are not wide enough, yet they cost more than two similar monitors.
We need a 3840x1080 or a 5760x1080 to come out for only slightly ($100-200) more than getting two or three monitors separately.
Hell, I would get a 3840x1080, and that's not even for gaming.
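
Circling back to the 57-lines figure above, a rough Python check; it assumes a line height of about 1.4x the font's pixel size, which is an estimate rather than any spec:

    # Rough check of the "57 lines on 1920x1200" figure.
    # Assumes line height ~= 1.4x the font's pixel size (an estimate).
    def visible_lines(vertical_px, font_px, line_spacing=1.4):
        return int(vertical_px // (font_px * line_spacing))

    print(visible_lines(1200, 15))  # 57 lines at a ~15 px font on a 1200-tall panel
    print(visible_lines(1080, 15))  # 51 lines on a 1080-tall panel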
 

InvalidError

Titan
Moderator

Personally, I much prefer the 16:10 aspect ratio for everyday computing.

For programming/reading, I have a portrait-mode 1080p display, but 1080 is about 100 pixels short of wide enough to avoid horizontal scrolling. 1200p in portrait mode would have been ideal for me, but those displays tend to cost nearly twice as much.
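
That "100 pixels short" figure is easy to sanity-check; a sketch assuming a monospace glyph around 9 px wide, which is an estimate, not a measurement:

    # Horizontal pixels needed for common code line widths,
    # assuming a ~9 px wide monospace glyph (an assumption).
    GLYPH_PX = 9
    for columns in (100, 120, 132):
        print(columns, "columns ->", columns * GLYPH_PX, "px")

    # 120 columns -> 1080 px: the full panel width, with nothing left for
    # line numbers, gutters, or scroll bars -- hence the horizontal
    # scrolling; a 1200-wide portrait panel leaves ~120 px of slack.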
 

InvalidError

Titan
Moderator

Is that even a question? Those are "professional" displays, and apart from those intended for use with active-shutter 3D glasses, they are pretty much all 60Hz, just like 99% of other LCDs out there.

TVs with fancy 240Hz "refresh" rates still only have a 60Hz input rate. The extra 180 frames per second are either repaints of the last input frame or interpolated between the two previous input frames to make motion look smoother.
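
A toy Python/NumPy sketch of that ratio; real TVs use motion-compensated interpolation rather than this plain crossfade, so this only illustrates how 60 input frames become 240 output frames:

    import numpy as np

    def interpolate(frame_a, frame_b, steps=3):
        # Yield `steps` blended frames between two real 60 Hz input frames.
        for i in range(1, steps + 1):
            t = i / (steps + 1)
            yield ((1 - t) * frame_a + t * frame_b).astype(frame_a.dtype)

    a = np.zeros((1080, 1920), dtype=np.float32)  # real input frame N
    b = np.ones((1080, 1920), dtype=np.float32)   # real input frame N+1
    out = [a, *interpolate(a, b)]                 # 4 output frames per input frame
    print(len(out))                               # 60 Hz in -> 240 Hz out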
 

paleh0rse

Distinguished
Oct 8, 2010
7
0
18,510

Wrong answer. When it comes to TVs, you are correct; however, modern GPUs used in gaming rigs are quite capable of outputting a true 120+ FPS, especially in CrossFire or SLI. Therefore, displays capable of receiving 120+ FPS (those with a 120Hz or higher refresh rate) can display the actual frames without leveraging vsync and/or capping the FPS at 60.

I currently use a 144Hz ASUS LCD monitor for gaming. There are many gamers, myself included, who are eagerly waiting for the first 120+ Hz IPS displays with low response times to finally hit the market.
 

InvalidError

Titan
Moderator

What display interface do you use? DisplayPort officially maxes out at 60Hz, HDMI officially maxes out at 75Hz, and DVI maxes out at 120Hz. There is no standard that officially goes to 144Hz.
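
Since the exact ceilings are in dispute here, a back-of-envelope pixel-clock check for 1920x1080 at 144Hz; the timing totals are approximate reduced-blanking values and the per-interface limits are rough figures, not quotes from the specs:

    # Pixel clock for 1920x1080 @ 144 Hz with approximate reduced blanking.
    H_TOTAL, V_TOTAL, REFRESH = 2000, 1111, 144   # assumed timing totals
    pixel_clock_mhz = H_TOTAL * V_TOTAL * REFRESH / 1e6
    print(f"{pixel_clock_mhz:.0f} MHz")           # ~320 MHz

    # Rough pixel-clock ceilings at 24 bpp (approximations, not spec quotes):
    limits = {"single-link DVI": 165, "dual-link DVI": 330, "DisplayPort 1.2": 720}
    for name, mhz in limits.items():
        print(name, "fits" if pixel_clock_mhz <= mhz else "too slow")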
 

mapesdhs

Distinguished
TheCapulet, for medical apps you want one of the following, not a consumer colour
display (and anything that's only sRGB-capable is still consumer IMO):

http://www.kikatek.com/P288710/K9601618-BARCO-MDCG-10130-Coronis-Fusion-30
http://www.ampronix.com/content/web/barco_coronis_fusion_10mp.asp

It amuses me that 10-bit precision is now regarded as high-end, given that 12 bits
and even 16 bits per channel have existed for nearly 20 years (genuine precision,
not via any kind of dithering), though of course one needs CRTs to convey
that level of fidelity. Nobody makes such things anymore because it's
horribly expensive to do; in 2002 it took 10GB of VRAM to run such a design
properly for multi-display output with subsample AA, etc., which cost a fortune.
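
To put the precision gap in numbers, a quick sketch (the ramp length here is arbitrary):

    import numpy as np

    # Distinguishable levels per colour channel at each precision.
    for bits in (8, 10, 12, 16):
        print(f"{bits}-bit: {2 ** bits:>6} levels per channel")

    # Quantizing a smooth ramp at 8 bits collapses it to 256 steps -- the
    # source of visible banding in smooth gradients.
    ramp = np.linspace(0.0, 1.0, 4096)
    print(np.unique(np.round(ramp * 255)).size)   # 256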

Quality is expensive. Consumer displays will get better as and when the
economics permit.

Yesterday I went hunting for a 2560x1600 display. The models that were available
had prices way beyond my budget, so I bought a 1440 model instead, which was
about 75% cheaper. It's the same effect that pushed industry production away
from 1200-height displays to 1080: just the economics of purchasing demand.
1080 panels were cheaper to make, so the rise in demand reinforced the cheapness
ever more. We have affordable IPS panels now because the volumes are higher.

I'm sure we'll have well-priced 4K+ OLED (or whatever) displays some day; it
just might take a while.

Ian.

PS. If anyone cares, I settled on the following after reading oodles of reviews,
etc., partly because it supports sync-on-green, which I need for my SGIs:

http://www.tftcentral.co.uk/reviews/dell_u2713hm.htm

It cost the equivalent of about $620 US (the price was 407 UKP total).

 