LCD vs CRT

Page 2 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.
Yeah, same thing happened to me. I've got a 7800GT as well, and FEAR is the same. Not sure about Command & Conquer, will have to check it out next time I play.
 
Why do you think companies like Dell advertise new computers and mention that you now get an LCD monitor? Because nobody really likes CRTs, otherwise that would be a selling point.

This cracked me up for two reasons. First, we are discussing LCD vs. CRT for gaming, and you bring up Dell. ROFL. Dell's first top-of-the-line gaming XPS systems were advertised for ultimate gaming and had a GF FX5200 in them. :lol:

Second is the typical uneducated consumer who buys a computer for looks. LCDs sure are a selling point. I have 3 new CRTs in stock because everyone who wants a build also wants an LCD now. And Dell's LCD-loaded ads, paired with their name and seemingly cheap prices... WORK. But Dell will sell you a crappy, bloated, un-upgradeable piece of junk with an analog LCD for a cheap price, and people jump on them only to find out the shortfalls later. All they know is Dell, black, and LCD... it must be new and good. Not to mention all their bundled LCDs and free upgrades are analog only, and often they charge big bucks for one with DVI inputs.

It's late so I'll ramble. A funny story that goes in line with this: I had a customer buy one of my used beige spare LAN gaming systems. When he picked it up, he saw a few more systems set up near his. He pointed to one and said "someday I want one of those fast black ones like that". I informed him that while it's brand new and black, everything in his system is far superior to that black internet-surfing Celeron. But if he wanted to spend $100 more, he could have the new, black, slower one. I still chuckle picturing him point and say in awe, "fast black ones like that". 😛
 
It can be funny what looks and a name can do. About 2 or so years ago I saw a 200 MHz computer sell for $350 at an auction just because it had Gateway on the front. People kept bidding on it, but it was not worth that much money for a used computer running at 200 MHz. I had to laugh.
 
When we really think about it, it comes down to what you're doing with the monitor. For gaming, video editing, photo editing and the like, I would go with a CRT. Most professional editing systems use CRTs because they are easier to calibrate using a Spyder and can provide a closer match in color. The other thing about LCD tech: when a graphic is changing, only the moving part of the image updates; the background and surrounding parts don't change. Very apparent on LCD televisions.

In a business application, LCDs are a better choice, mainly because of text. But you won't find an LCD that's been around for ten years without a few dead pixels in it. CRTs just don't have that problem. With LCDs, dead pixels are the name of the game. Most companies still say that a dead pixel right out of the box is acceptable. In fact, you can have up to five dead pixels in different areas of the screen and some companies still won't take the monitor back.

My money, for gaming and editing anyway, is on CRT. Just more reliable.
 
I'm kind of a new-generation LCD fanboy. After reading all of this and some other stuff, I might actually buy a good old-technology CRT for my next monitor, mainly because it's cheaper and will apparently last a lot longer. However, for any web surfing @ 1280x1024 I'll probably get an LCD as well 8)
 
You need to add a con for LCDs. You left out the interpolation when not in native resolution. Any advantage an LCD has for text only happens when run at native res. And for gaming, well, rule out most midrange and lower cards as having the power to run the newest titles at a 19" or larger LCD's native res.
Added. I listed resolution changes in CRT, but not the con for LCDs.

Hope that list helps :)

On a side note, I had my 19" Dell Trinitron (was a Sony, so very nice) for almost 10 years before it finally started having problems (being a bit too blurry, and finally the image began to shake randomly). Now, while I can't speak much to the longevity of my LCD panel, I've used other LCD panels that have been around for 2 years with little problem. I fully expect my LCD to last at least 3 years, and probably closer to 5. A good CRT can last longer, but it will go eventually, and at about 5 years, it won't be as good as the day you bought it as far as sharpness is concerned.

CRTs are nice, and work fine for many people. LCDs work well too, and have matured quite nicely in the past 3 years or so. There are some applications that each is better suited for (LCDs work better in multi-monitor setups due to the thin borders and smaller profile, for example). Ultimately, it's a personal choice, and what works best for you may not work well for someone else.
 
These people need to learn how to set up an LCD and know its limitations. If you were so smart you could turn the interpolating off.

Whether interpolation is 'on' or not, you can't fill a 1280x1024 display with a 1024x768 image without interpolation, whether it's done by your monitor or by the card. It's just a limitation of the fixed grid of square pixels on an LCD.
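To put rough numbers on that (a quick Python sketch; the resolutions are just the ones from this thread):

```python
# Stretching a 1024x768 frame onto a 1280x1024 panel: the scale
# factors aren't whole numbers, so most panel pixels sample from a
# point *between* two source pixels and must blend (interpolate).
src_w, src_h = 1024, 768
panel_w, panel_h = 1280, 1024

scale_x = panel_w / src_w   # 1.25
scale_y = panel_h / src_h   # ~1.33

# Example: panel column 3 maps back to source column 2.4 --
# between source pixels 2 and 3, so the scaler has to blend them.
src_x = 3 / scale_x
print(scale_x, scale_y, src_x)
```

Note also that the horizontal and vertical factors differ (1.25 vs ~1.33), which is why a 4:3 image looks stretched on a 5:4 panel on top of looking soft.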

If I were so smart, I could set up my CRT properly?? Are you talking about the display frequency?

Frequency, colour temp, brightness, contrast, etc.

Here's an idea: learn how to type; "the you" makes no sense!

Oh, the typo threw you, whoopee. Yeah, I guess filling in the blanks for you is something people are going to have to get used to.

But I'll get on that exercise thing. Of all the LAN parties, I've NEVER heard someone brag about how nice their CRT was.

And that would matter how? Just because people go to LANs doesn't mean they're any good, or even good judges of quality hardware. If your argument for which is better is based on portability, it simply shows how weak the rest of the features are by comparison.

So while you make some compelling arguments, there is a reason why LCDs are gaining ground fast.

Gaining ground means even you don't find them to be equal.

Why do you think companies like Dell advertise new computers and mention that you now get an LCD monitor? Because nobody really likes CRTs, otherwise that would be a selling point.

No the reason companies like DELL sell computers with LCDs is that most people use them for text. Also no hardcore gamers buy their desktops from DELL, so I wouldn't use that as a 'pro'.

If you are good at gaming, an LCD wouldn't make a difference. For videos, LCDs were bad for that... about 10 years ago!

And they still haven't equalled CRTs after 10 years.

I have had LCDs now for several years, and have NEVER had a video ghosting or any other problem that made me wish for a CRT.

So? Many people haven't seen the need to move beyond integrated graphics, either. The fact that you ONLY work on an LCD just ensures you won't know what you're missing.

With the new LCDs, the human eye really can't distinguish between the current latency times, so 12ms to 8ms really is a moot point.

You don't know WTF you're talking about. First, the human eye can distinguish far faster differences, and so can the visual cortex/brain. Next, the 12ms and 8ms panels are far from that fast with black-to-white-to-black transitions. Look at any of THG's reviews to see the true latency of those panels; even the 4ms panels don't fare so well. Perhaps you should do a little more research, instead of just commenting on your very limited experience.
 
2 questions:

1. Why do some games support my native of 1280x1024, but some (like FEAR) only go up to 1280x960?

1280x960 is 4:3, so it's the preferred resolution for a standard 4:3 CRT if you don't want to distort the image (1280x1024 is actually 5:4).
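You can check the shape of any resolution by reducing width:height with a GCD (quick Python sketch; `aspect` is just my own helper name):

```python
from math import gcd

def aspect(w, h):
    """Reduce a width x height pair to its simplest ratio."""
    g = gcd(w, h)
    return w // g, h // g

print(aspect(1280, 960))    # (4, 3) -- the shape of a standard CRT tube
print(aspect(1280, 1024))   # (5, 4) -- slightly taller, hence the
                            # distortion when forced onto a 4:3 tube
print(aspect(1024, 768))    # (4, 3)
```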

2. The human eye sees images at about 60fps. What millisecond response time would be able to display at 60fps?

The human eye is far more sensitive than that. The question is the perception of the brain, and that depends on many things: contrast, motion blur, angularity, convergent/divergent motion, and similarity of images.

So if a card can do a better job of simulating motion blur, you perceive more fluid motion at low frame rates; but if you have some of the factors above, which appear a lot in things like jump-fragging, then the task is harder and the threshold is raised even higher.

As for 60fps and what that equals in milliseconds, that would be about 16.7ms (1000/60).
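In other words, frame time is just 1000 divided by the frame rate (tiny Python sketch; the function name is mine):

```python
def frame_time_ms(fps):
    """Milliseconds available to display one frame at a given frame rate."""
    return 1000.0 / fps

for fps in (30, 60, 85):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
# At 60 fps that's ~16.7 ms, so a panel whose *true* black-white-black
# response takes longer than that will still be mid-transition when
# the next frame arrives.
```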

Oh yeah, how big can they make real LCDs (not projection)? Last time I remember, it was 42".

They just showed off a 100+ inch one (3+ megapixel) last week.
 
To add to list;

Pros for CRTs:
Higher colour range/depth
Higher contrast.
Truer blacks and whites.

Cons for CRTs:
Need to warm up for best/true picture quality (5-30mins)
Can have a visible aperture grille (especially Trinitrons).
Can have sympathetic flicker (can be adjusted though).
Greatly affected by EMF.

Pros for LCD:
Ease of set-up.
DVI quality is pretty solid, less chance of interference.

Cons for LCDs:
Poor white and black levels.
Poor contrast ratio.
Visible pixel lines, especially on large panels with low res.
Fragile (adjusting the bezel can put pressure on them).

Now these are things to consider, but despite all the pros/cons, I still say neither does everything well. I can and do work on both, though, and working on both I notice the limitations, especially when doing a quick Photoshop job on my laptop and then looking at it on the CRT: what looked OK before, I suddenly notice the areas that need work. But I prefer reading lots of text on LCDs, perfect for surfing.
 
Yep, your list, along with Grape's additions, makes up a very useful list. It's good for people to know the limitations of both, as far too often people just 1) believe newer is better and 2) take people's happiness with their own monitor to be fact and not preference. I won't talk anyone out of an LCD, but I will point out where both excel and struggle and let them decide. Lately most people I deal with seem to still just want the LCD because it looks cool and takes up little space.
 
CRTs will always be better than LCDs for anything. They can more accurately reproduce colors, the picture is uniform across the screen, and they look good at any resolution. The downside is that they're bigger, heavier, and use more power.

The best LCDs out there are around the same quality as a CRT now. But they're expensive. A good LCD monitor like a Samsung will satisfy almost anyone, though. And as long as you get a quality monitor whose response time is below 12ms (actual, not some number the manufacturer spouts to sell them), you won't have any problem in games or watching DVDs.

I have an 8ms Samsung 915N and love it. The colors are good and no ghosting whatsoever. No dead pixels either after 2 years. So as long as you buy quality products and don't get the cheapest piece of crap out there (as most Americans do, I'm American btw), then you'll be fine. You get what you pay for though.

There's also the aspect of your vision to think about. With a CRT, the image is set behind the screen. With an LCD, it's like reading a piece of paper. My eyes have actually improved since I started using an LCD, because I'm on the computer all the time (my line of work calls for it).
 
CRTs will always be better than LCDs for anything. They can more accurately reproduce colors, the picture is uniform across the screen, and they look good at any resolution. The downside is that they're bigger, heavier, and use more power.
Actually, a well-adjusted LCD does colors just fine; it's just that few people bother to adjust them properly. And a low-end CRT will have an identical problem.

Blacks and whites are more of an issue with LCDs. The newer ones are getting better at this, but due to the nature of LCDs, blacks will likely always be an issue (black is produced by blocking the backlight instead of being the absence of light, hence the problem).

I'll filter through that other pro/con list and add it to mine later. I disagree about Pixel lines being an LCD only thing, but that's personal preference. Not sure why I confused DFI and DVI input 😳
 
LCD monitors are an excellent choice. I have a Samsung SyncMaster 730bf, and as soon as I started using it I changed my point of view about CRTs.
And, in my opinion, the images look better on an LCD monitor 8)
 
CRTs offer better performance than LCDs; however, LCDs consume less power and space (which is why I have one). I personally love my Hyundai L90D+, no ghosting whatsoever, and 700:1 contrast.
 
2 questions:

1. Why do some games support my native of 1280x1024, but some (like FEAR) only go up to 1280x960?

2. The human eye sees images at about 60fps. What millisecond response time would be able to display at 60fps?

I believe it's your graphics card that is limiting 1280x960. I swear I've seen FEAR benchmarks on THG done @ 1600x1200.

I didn't see this added anywhere, but FEAR has a setting.cfg file where you can change the resolution.
 
CRTs are the better ones... or so they say. A friend of mine bought a 17" LCD screen, and it looks beautiful even at 1024x768 8O
But they are expensive.

If I had to choose, I'd buy a CRT screen for the moment, burn it up, and buy an LCD when they don't cost that much anymore for the performance they deliver.

I've seen there are some "flat" CRT screens, not as flat as an LCD, but they save a lot of space :wink: (the one I saw was 17").
 
So? Many people haven't seen the need to move beyond integrated graphics, either.

'Tis true. 😳 I'm a huge fan of integrated graphics. I've never actually purchased a discrete graphics card for any of my system builds (although, when I build my own system I do always make sure the mobo has an up-to-date discrete graphics card upgrade path). Right now I'm on a Dell Optiplex GX100 using an Intel i810 IGP. Works okay for me. I know its limitations. :wink: Hey, it was free.

I currently am using a 14" LCD I bought from Walmart a couple months ago for the paltry sum of $125. Sure, it's cheap in more ways than one, and the color quality pales in comparison to the 15" Samsung LCD on my wife's Dell, but it takes up a lot less space and only uses a quarter of the power of my 17" CRT. I do miss the color quality and size of the CRT's screen, but this LCD looks a lot nicer.
 
'Tis true. 😳 I'm a huge fan of integrated graphics. I've never actually purchased a discrete graphics card for any of my system builds (although, when I build my own system I do always make sure the mobo has an up-to-date discrete graphics card upgrade path). Right now I'm on a Dell Optiplex GX100 using an Intel i810 IGP. Works okay for me. I know its limitations. :wink: Hey, it was free.

Hey, free can't be beat unless you expect good gaming. And that's the thing: we're talking best-case scenario for all these things. For many people the GMA900/950 is MORE than what they will ever need. I'm a stickler for 2D quality when I need it, but for surfing and general stuff like text editing a NeoMagic would be fine, let alone the latest integrated options, which truly are finally getting better. For a 2D pro I'd always worry about EMI from using the noisy board, but for probably the majority of users out there integrated will be fine.

Of course, like Pauldh says, you are a luddite/heretic and will not be easily tolerated without much grog. Thankfully I'm on a lot of cold medicine right now, so you're fine. :tongue:
 
Hey, does anyone know of any 17" CRTs that do 1280x1024 @ 85Hz?
That's going to be hard to find now. There were some for which 12x10 @ 85Hz was the recommended res. Not sure just how high-quality the current ones are compared to a few years ago.

Samsung 700NF
http://www.samsung.com/Products/Monitor/DiscontinuedModels/AQ17NSBU.asp?page=Specifications

Viewsonic PF775 Supports 12x10 90hz
http://www.viewsonic.com/support/desktopdisplays/crtmonitors/proseries/pf775/

Pretty cheap just to grab a 19", as most will run 12x10 @ 85Hz.
 
Why do they need 85 Hz? I have no problem with 60 Hz at 1024x768 on a 19".

A higher refresh rate will shorten a monitor's lifetime, I think.
 
60Hz can be painful to one's eyes. I had the same problem with CRTs at 60Hz, on a plethora of monitors.
I thought if there's no visible flickering then everything is fine. I didn't know that even if your eyes don't notice a difference between 60 Hz and 85 Hz, 60 Hz can still cause eye strain.