Just wanted to say thanks

So... tell us how it looks in games already?

_______________________________________

The world sucks. Keep your pants on.
 
Hehe, gotta love that weight, huh? It puts me off cleaning behind the desk when I try to move my monitor around.

Well, it looks like my old Philips Brilliance 109S is still there, doing a mighty fine job at 1024x768 at 85Hz, and it will go to 1600x1200. As mentioned above, going higher involves a bit of Windows tweaking and a higher refresh rate, and I really don't need to do that now.

<A HREF="http://www.koalanet.com/australian-slang.html" target="_new">Aussie slang</A>
 
Looks awesome. There are two visible lines at 1/4 and 3/4 of the way down the screen. I'm not sure if they're supposed to be there or if it's a defect in the tube. Regardless, you can only notice them when the screen is white or close to it, so I don't really care. Not a problem at all.

Speaking of white, I had to turn the brightness down to 30% to keep blank white screens from blinding me. Definitely a big difference, and it'll be nice when watching movies with the lights off, or watching TV in bed.

So far, everything is perfectly crisp: 2D pictures, 3D in games, text on websites (even at 1600x1200). I'm definitely happy with this monitor. And games are much more immersive on a bigger screen; I have a feeling that future parts of Morrowind are going to scare the crap out of me :tongue:

<font color=blue>Hi mom!</font color=blue>
 
ffffff where did you get it!? It's not listed as available on PriceGrabber!

_______________________________________

The world sucks. Keep your pants on.
 
Some games do not scale to higher resolutions, and 1024x768 on the desktop, which is really a personal preference, is fine for me. I actually found higher resolutions harder to use, but as you say, they can scale if you adjust fonts and graphics. In the end, the straw that breaks the camel's back is price: monitors that run at high resolutions are usually, though not always, the most expensive.

The higher resolution for games is an entirely different argument. I'll quote my other long post:

"Also, the advent of anti-aliasing becoming a default feature on graphics cards now makes running 1600x1200 less economical and even less aesthetic. Running 2x AA in 1024x768 or 4x in 800x600 is much more rewarding than running no AA in 1600x1200 because you get playable framerates and generally better image quality. From personal experience, running 4x 1024x768 almost complete elminates jagged edges (jagged edges are unnoticeable when playing), so if you ask me if I think the games industry is pushing for higher resolutions, I say no. If you think about it, FSAA 4x in 1024x768 resolution is really like running a game in 2048x1536 without FSAA. The reason is simple. FSAA 4x draws the scene 4 times and then collates the image into one. So if we do the math we get:

1024 x 768 x 4 = 3,145,728 samples
2048 x 1536 = 3,145,728 samples

One of the very reasons FSAA was invented was to free us from having to buy monitors and graphics cards that support 2048x1536 at stable refresh rates (after all, they aren't cheap!). Also, keep in mind that a monitor you buy today will probably last 7 or 8 years, and during that time anti-aliasing technology will improve, whereas your monitor's maximum resolution, unfortunately, will stay the same. Just look at the Matrox Parhelia and think about what ATi and especially nVidia will do. Theory suggests that anti-aliasing can be done with just 15-50% of the processing power of current FSAA if you target only the jagged edges instead of the whole screen. That leaves higher resolutions not only inaccessible to many because of the monitor requirements, but also comparatively wasteful, since they demand more processing power and thereby lower the framerate!"
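If you want to sanity-check that arithmetic yourself, here's a quick throwaway Python snippet; the resolutions and the 4x factor are just the ones from the quote above, nothing more:

<pre>
# Rough sanity check of the supersampling math quoted above:
# 4x supersampled FSAA renders about 4 samples per output pixel.
def samples(width, height, fsaa=1):
    return width * height * fsaa

print(samples(1024, 768, fsaa=4))  # 3145728
print(samples(2048, 1536))         # 3145728 -- the same sample count
</pre>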

I'm being optimistic in the post above. I'm hoping Matrox's 16X FAA (fragment anti-aliasing) will be successful and that others will follow its lead. The anti-aliasing quality alone should be a factor when buying video cards nowadays, but since people care a lot more about speed than quality, or think that speed=quality, I'm not sure if 16X FAA is getting the attention it deserves.

We won't know anything for sure until we get benchmarks for the Parhelia-512 using 16X FAA, but based on what we know in theory, it should make higher resolutions obsolete for everything except 2D graphics. For the professional user, by all means, run high resolutions and get the monitors that can handle them. But for the average user who plays games every now and then, it's smarter to save the money and not pay for features you're not going to use. I'm not trying to bash higher resolutions; I'm working against the notion that people are actually going to use them. A recent survey among gamers who played Half-Life showed that 50% of them used 1024x768, with the rest split evenly between 800x600 and resolutions higher than 1024x768.
 
Cake


I think I know why we disagree: you are looking at the issue from a gamer's point of view. I admit that running games at lower resolutions has many advantages. I look at the issue from a business-application point of view: CAD/CAM, document imaging, word processing, financial spreadsheets, where the value of high resolution is clear and the payback on your investment is quite short, even for an expensive monitor.

There have been many studies on this subject, the most notable by Alistair Sutcliffe and Peter Faraday of the Centre for HCI Design in London. They concluded that larger monitors at higher resolutions increase user productivity significantly.

I do not have a soft copy of the study, but if you send me a fax number I can fax you a hard copy.

Also, prices are not that bad these days. I have a 21" monitor that does 1600x1200 at 85Hz, my new C1035, for only $499. See this link:


http://shop.monitorsdirect.com/product.asp?sku=1908711

Jim Witkowski
Chief Hardware Engineer
Cornerstone / Monitorsdirect.com


<A HREF="http://www.monitorsdirect.com" target="_new">MonitorsDirect.com</A>
 
I did some more thinking, and I believe the argument I posted earlier is actually flawed. As games become more detailed, polygon counts will increase and the polygons themselves will shrink. So the games industry actually <i>is</i> going to push for higher resolutions, but not for another 10 years. Ten years is a very rough estimate for the point at which we will say "running 1024x768 doesn't look as good as 1600x1200." We may notice an impact within 5 years if we start to see games dominated by polygons whose edges are only about 4 pixels long.
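To put a rough number on that, here's my own back-of-the-envelope Python; the 4-pixel edge length is an assumption from my guess above, not a measurement:

<pre>
# How long the same polygon edge appears at two resolutions.
# Assumes the edge spans 4 pixels at 1024x768, as guessed above.
edge_at_1024 = 4
scale = 1600 / 1024.0        # linear increase in pixel density
print(edge_at_1024 * scale)  # ~6.25 pixels at 1600x1200
</pre>

So the higher resolution only starts paying off once there's fine polygon detail for those extra pixels to capture.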

However, I do think there is a limit to how far we will go. At some point, resolutions will be so high that people cannot notice any difference, and we're awfully close to that point right now. Perhaps 2048x1536 is that particular resolution? Anyway, $499 for a 21" 1600x1200 monitor is a steal. Thanks for sharing your point of view; it's nice to hear opinions from someone who designs monitors for a living.
 
There are two visible lines at 1/4 and 3/4 of the way down the screen. I'm not sure if they're supposed to be there, or if it's a defect in the tube.
I don't know if anyone has answered your question yet,
but yes, these lines are supposed to be there. All Sony tubes have them as well; they are used to stabilize the aperture grille.
After a while you will stop noticing them.
<pre>unless you post from home too, for these boards are mostly grey </pre>
 
<i>flyin says:</i>
ffffff where did you get it!? It's not listed as available on PriceGrabber!

I found a great price on there, then it disappeared. There were about a dozen, and they're all gone. Dunno what happened. I got mine from <A HREF="http://www.emscomputing.com" target="_new">EMS Computing</A> for $318, shipping included. They weren't incredibly fast, but no complaints.

<i>globe111 says:</i>
Yes, these lines are supposed to be there. All Sony tubes have them as well; they are used to stabilize the aperture grille.
After a while you will stop noticing them.

Ok, thanks. That's what I figured. And I've pretty much stopped noticing them. The only time I do is when the lines are on white or nearly white.

<font color=blue>Hi mom!</font color=blue>
 
On my own Vivitron I always notice them whenever I type in Word XP. But that's the only time I ever notice them.

Censorship makes us so much more creative.
 
There are two visible lines at 1/4 and 3/4 of the way down the screen.
Yep, they are supposed to be there. What is so amazing is that Sony could design such technology, think that it's okay, and then SELL it! Goes to show what suckers we consumers are.
As you can see, I don't like the Trinitron lines, so I have a shadow mask instead.
What is also clear from your post is that you didn't follow the number one golden rule when buying a monitor: GO SEE IT.
 
sjonnie:

I don't appreciate your referring to us consumers as "suckers." If you believe Sony, along with other brands, incorporates the metal dividers simply to rip off customers, then...well, I don't care what you believe.

I went to Best Buy and CompUSA recently to follow your "GO SEE IT" method. The problem? Their monitors weren't set up to be tested for quality, convergence, moire, etc. They were set up to advertise the brand-name computers they have for sale. And it's impossible (or at least improbable) to "GO SEE" every monitor available. Some high-quality monitors aren't available at large chains, let alone on display for testing.

While your opinions of Trinitron technology are appreciated, your blatantly biased and offensive opinions are not. Please be helpful, not hurtful.

_______________________________________

The world sucks. Keep your pants on.
 
It's a Diamondtron made by Mitsubishi, not a Trinitron made by Sony.

I'd much rather have an awesome monitor with two hardly noticeable lines than something that sucks ass, but does it across the entire screen.

You didn't mention what monitor you have, either.

<font color=blue>Hi mom!</font color=blue>