Any of you still playing with CRT?

Status
Not open for further replies.

VGAmike

Honorable
Apr 10, 2012
55
0
10,630
I still keep my CRT 17" Sony gpd 200 monitor as a treasure. No input lag, no ghosting, superb colors, and no need to crank up resolution beyond 1280x1024. In this resolution games looks fantastic!! I only use a LCD panel for office.

Don't you think new panels force you to have tons of extra GPU power for nothing??
 
I don't think so. LCDs are available at 1280x1024 as well, but higher resolution looks better; that's obvious. Also, LCDs tend to have bigger screens, which I consider a plus. Take a 24" LCD monitor with a 16:9 aspect ratio: it has about 1588 cm² of surface area. A 17" CRT, on the other hand, has only about 895 cm², barely more than half of that!

On the resolution debate: lower resolutions give you very little screen space. Considering 1920x1080 has about 2.6 times as many pixels as 1024x768, the extra screen space makes everything much more enjoyable.
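If anyone wants to sanity-check those numbers, here's a quick Python sketch (the sizes and resolutions are just the ones mentioned in this thread; note a 17" CRT's viewable area is actually a bit smaller than the nominal tube size):

import math

def screen_area_cm2(diagonal_in, aspect_w, aspect_h):
    # Width/height follow from the diagonal and aspect ratio, then convert in^2 -> cm^2.
    diag_units = math.hypot(aspect_w, aspect_h)
    width_in = diagonal_in * aspect_w / diag_units
    height_in = diagonal_in * aspect_h / diag_units
    return width_in * height_in * 2.54 ** 2

lcd = screen_area_cm2(24, 16, 9)   # ~1588 cm^2
crt = screen_area_cm2(17, 4, 3)    # ~895 cm^2
print(f'24" 16:9 LCD: {lcd:.0f} cm^2, 17" 4:3 CRT: {crt:.0f} cm^2, ratio {lcd / crt:.2f}x')

# Pixel counts: 1920x1080 vs 1024x768 -> about 2.64x as many pixels
print(f"Pixel ratio: {1920 * 1080 / (1024 * 768):.2f}x")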

Last semester we had to do some experiments at the university, and there were some CRTs in that lab. Damn. I'll admit that when I first got an LCD I didn't notice much of an improvement, but seeing a CRT again after 6 years on LCDs... it's night and day. The CRT flickers, and it makes my eyes hurt after 30 minutes of looking at it.

About the input lag and colors: do you really notice the supposedly better colors and lack of input lag? I personally don't feel any input lag on my LCD, nor do its colors look bad to me. I've never experienced ghosting either.

I'd say there's no contest.
 

fil1p

Distinguished
Nov 29, 2010
944
0
19,360
^Same

Also, LCDs have advanced a whole lot. Even my first LCD was great, and a big improvement over flickering CRT displays.
 

VGAmike

Honorable
Apr 10, 2012
55
0
10,630
I agree with all of you that LCDs are WAY better for static images, with better geometry and no flickering. They're an improvement over CRTs for daily tasks such as surfing the web or office work. BUT for gaming there is no contest (at least in my opinion). You haven't noticed ghosting??? And colors... well, contrast is vastly superior on a good CRT. Blacks are actually black, and there's much more color depth. I have my computer plugged into 2 screens: one is my CRT, the other a decent 20" Samsung SyncMaster at 1680x1050... and trust me, bigger is not always better...
 
Well, you have made up your mind then... I have a 24-inch Apple Cinema Display on my computer and I can tell you your colors are not better than the ones on this screen. I have never had ghosting issues, so I'm not sure about that one. Also, as the other poster said, unless the refresh rate was 85 Hz I could see the flicker and would get a headache in short order. I'm betting your LCD is a lower-quality TN display; your personal comparison might be correct, but it doesn't make "CRT is better" a blanket truth. After this screen I would never disgrace my desk with a huge chunk of CRT ever again...
Thent

I moved from a 21" Sony Trinitron, almost the best CRT they made, back in 2003.
 

VGAmike

Honorable
Apr 10, 2012
55
0
10,630

LOL, that's the reason why you don't remember how great CRTs were, 8 years is a long time! :p (just joking)

I have tried many LCDs, and... even now in 2012, there isn't a single one that has better colors, better contrast, or better response time than a good CRT. LCDs are larger, thinner, and as I said before they are much better for some tasks, but they are a step backwards in many respects, which I consider a big disappointment...
 
Generalizations lead to poor discussions. Comparing 1 CRT with 1 LCD allows you to draw no worthwhile conclusions. Even comparisons among LCDs are kind of pointless, as an IPS panel behaves very differently from a TN panel. Put a bunch of gamers in front of a $300 Asus 120 Hz panel and a $2000 Dell IPS panel and 19 out of 20 will choose the $300 monitor, with the most common comment being that the true color of the IPS panel looks "dull or washed out".

Every LCD has ghosting. It may be beyond one's ability to notice it, and this will vary by individual, but in the test lab it's quite apparent. See the little race car pictures here (scroll way down):

http://www.tftcentral.co.uk/reviews/dell_u3011.htm

I have a 24" Nanao monitor 1600 x 1200 upstairs that I paid $2400 for ..... geez, could be last millenium <g>..... It's not as bright as the 24" 120 Hz Asus TN monitor, the 24" Dell IPS monitor also has its strengths. But overall, other than the smaller resolution, the CRT more than holds its own against the other two.....but those two were both under $500.

An expensive, high quality CRT can be better than an expensive, high quality LCD in many respects but your run of the mill $200 CRT can't hold a candle to what can be had for $200 in an LCD.
 

maxinexus

Distinguished
Jan 1, 2007
1,101
1
19,360
Save your eyes, man! Do this experiment: game for at least 8 hours on a CRT (which you say is superior in quality to LCD) and take a picture of your eye... bloody red and inflamed.
Do the same with an LCD and maybe you'll see a few burst capillaries. So do yourself a favor: stop bombarding your face with electrons and move on to something safer for your eyes ;)
I remember times when I gamed on my 14" CRT for 12-15 hours; holy shish, I didn't have eyes, just two red balls lol
 

warezme

Distinguished
Dec 18, 2006
2,452
57
19,890
Sinius, like LCDs there are good CRTs and bad CRTs. Also, unless you have a decent CRT, the refresh rate may be too low and you will get that flickering eyestrain. If you increase the refresh rate on the CRT above 75 Hz (some say even higher), your eye will not notice the flickering and the eyestrain goes away. Some of the good CRTs had better contrast, but as a CRT ages it loses its brightness, making the image look more washed out or dimmer. CRTs are nice for some things, but they have their weaknesses (okay, a lot of weaknesses), which is why they are gone for the most part.

Unfortunately, the LCD industry is stuck at low resolutions and cheap, low-quality panels. I don't like Apple, but if they can kick the LCD industry in the butt and make them realize that high quality and high resolution are what people would rather have, like on the new iPad, then I will give them kudos for that.
 

mightymaxio

Distinguished
Nov 9, 2009
1,193
0
19,360
I use to run a 22" CRT with a 2048 x 1536 resolution which is better than my 1920 x 1200 monitor although i prefer the LCD over the CRT for obvious reasons such as weight, color, and brightness.
 

VGAmike

Honorable
Apr 10, 2012
55
0
10,630
Well, I guess I'm the only one who remembers what a real black screen looks like, how good it is to see the same colors from different viewing angles, and that changing resolution isn't "forbidden" the way it is on an LCD. It annoys me that on a brand new super 1080p LCD panel, if you want to change the resolution (something very common on a CRT), the image quality turns to pure crap!
I suppose a really expensive LCD can deliver some of those features in a single panel (not all of them, I'm sure).
I hope these low-quality screens (that we are forced to buy) die out fast and new technology gives us a true successor to the CRT. SED technology died before it was born; let's see what happens with OLED or AMOLED or whatever-LED comes next...
 

mightymaxio

Distinguished
Nov 9, 2009
1,193
0
19,360
CRTs had over 40 years of refinement, you have to remember; ever since TVs were made they were built with tubes, and originally CRT monitors almost always broke or had major issues when they were first coming out. I don't ever want to deal with a CRT again, even with some of the "extra" features. Weight, portability, color, resolution, and power consumption are all better with LCD and LED technology than CRTs ever had. Remember how, when you turned on a CRT, the lights would dim for a split second because of the huge power draw?

Why would you even need to change the resolution on an LCD? If the resolution seems too big and you can't see anything because the text is too small, there's compensation for that built into Windows itself, which lets you adjust the text size, window size, menu size, icon size, etc...
 

warezme

Distinguished
Dec 18, 2006
2,452
57
19,890

I wholeheartedly agree that today's cheap LCD panels need to go. Even expensive panels are not that great. I have 3 rather expensive Alienware 23" 1080p panels, and while they are fast at 120 Hz with 5 ms response, their contrast and color are lacking compared to a good CRT. I have to disagree on changing resolutions, though. I have no issues changing from one resolution to the next with pretty good scaling. An LCD driven by a decent video card and drivers has two ways it can scale. One method is monitor scaling: whenever you switch resolutions, if the new resolution does not match your screen's aspect ratio, the monitor will scale according to its built-in processing. Some LCDs do a decent job, others do a crappy job. I notice that if I have the driver do the scaling I get a better result. Plus I get extra color/contrast/brightness controls on top of the ones on the monitor, and I can choose not to scale at all. That means if I choose any resolution below the maximum, it reproduces that resolution inside the screen with a border, unscaled, with no artifacts at all.
 

VGAmike

Honorable
Apr 10, 2012
55
0
10,630

I see, very interesting. So there really is a way to move away from native resolution and still get decent image quality, like on a CRT! I didn't know that!! How can I do it? And how can I tell whether I'm getting the best possible scaling from my GPU and monitor?
 

VGAmike

Honorable
Apr 10, 2012
55
0
10,630

Why would I need to change resolution?? For gaming, for example! Am I forced to play at 1920x1200 only because that is the maximum resolution of the monitor?? That is crazy, but we have gotten used to it and we think it's normal. We are forced to buy a GPU that can deliver enough performance to drive a "1080p" panel. A good 17" CRT monitor could maybe go up to 1600x1200, BUT if you played at 1024x768 or 1280x1024, the image quality was still pretty decent.
And as I said before, I realise that LCD panels have their strengths, lots of them. I haven't worked on my CRT since I got my 1680x1050 LCD panel, which is great for office tasks and surfing the web.
 
lol, what a hilarious thread.
I held onto my 2-CRT setup for YEARS. I had a pair of Mitsubishi 19" monitors (got them free from a closing business that had no idea what they were worth), and they knocked the socks off of all but the best IPS screens on the market. Push them to 85+ Hz and the blinking problem went away. Stellar contrast ratio, true blacks, pure whites and solid colors, no ghosting or lag; truly they were the best thing ever... and they also served as heaters in winter!

The problem: after about 12 years of constant use they got blurry. I started getting bloodshot eyes and feeling major eye strain and fatigue. Finally I settled on a single large TN screen: 28" (now rated 27" for some reason), 1920x1200. Much clearer text than the Mitsubishis ever had, but the color/contrast/blacks/whites are clearly inferior. Still, the reduced eye strain was well worth the switch. Input lag is there, but not noticeable in day-to-day use. The ghosting I feared never showed up. And the best feature: at 28" and 16:10 I can put two full-page documents/browsers side by side with no problems, which used to require 2 CRTs. I can design ledger-sized documents at 100% size to see exactly how things will look (except that print obviously has higher resolution).
I would like a good IPS screen, but they are still too expensive in the size I require (I sit ~3-4 feet from my monitor, so I need a larger screen). The reality is that I have grown used to the color gamut of my LCD, and I still have one of the old CRTs for true color-accurate work when it's required (it's been about a year, though).

Long story short: I finally moved over to LCD ~4 years ago, and for day-to-day use there is no going back. If color is a huge deal to you, then get an IPS monitor. But really the TN screens are pretty good, and have improved a lot in the last 4 years to the point where the good ones are at near-IPS quality (with the noted exception of black level and backlight bleeding on larger screens, though even that is much improved from what it used to be). For what I paid for my mondo-sized TN screen 4 years ago I could simply get a good-quality LED TV today with better color/contrast/black level, and have an even larger screen... but it would be hard to give up the 16:10 aspect ratio, because it is very nice for doing work (though gaming and movies are just fine on 16:9). Also, being limited to 1080/1200p is pretty frustrating, as it is a relatively low resolution.

As for the future: 120 Hz is a requirement for my next screen, especially as I do more gaming now, but it is also smoother for video watching because it divides evenly by more video frame rates than 60 Hz does. LED backlighting will also be a requirement, to get true whites and better colors again. IPS may not be required, as normal screens are pretty good now and backlight bleed is slowly becoming less of an issue. OLED may be a good route to go if it is widely available by the time I am ready to buy again (2-4 years). 3D and touch capability may be required too, depending on which way things go with movie and game formats, but I would generally like to avoid these if possible. Higher resolution will be a must! I do 1080p video editing now, which means that something like 4K resolution would be much appreciated. My screen was fine when I was merely viewing 1080p content, but when editing I only get ~1/4 resolution on playback because the editing interface eats most of the screen. If they can cram 1080p onto a 10" screen now, I do not see why we do not have much higher resolutions for desktop displays at a relatively affordable price (under $400).
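Quick sanity check on that divisibility point, in Python (just common video frame rates against the two refresh rates, nothing more):

# Which common video frame rates divide evenly into a given refresh rate?
frame_rates = [24, 25, 30, 48, 50, 60]

for refresh in (60, 120):
    even = [fps for fps in frame_rates if refresh % fps == 0]
    print(f"{refresh} Hz shows these evenly: {even}")

# 60 Hz  -> [30, 60]      (24 fps film needs 3:2 pulldown, which causes judder)
# 120 Hz -> [24, 30, 60]  (each frame is repeated a whole number of times)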
 

warezme

Distinguished
Dec 18, 2006
2,452
57
19,890
I have been a big fan of Nvidia for a while because of the ease of their drivers. Under the Display tab there is an option called "Perform scaling on", and your choices are GPU (the driver does the scaling) or Display (your monitor does the scaling). Under "Adjust desktop size and position" you have the options for aspect ratio, full screen, or no scaling (bars according to ratio). I also check the option to control color via the driver (GPU). Like I said, this is with the Nvidia control panel. I moved away from ATI because of their drivers, so I could not tell you how those are set. I had to be careful with the Alienware monitors because they have a fairly good built-in scaler that was on by default; it was overriding things and causing weird issues when I first got them and tried doing the software setup. They still have some minor glitches when switching from 60 to 120 Hz at different resolutions.
 

It is just a difference of technologies. You are not forced to play at 1920x1200 because it is the max the monitor will display; you are forced there because it simply is the resolution of the monitor. If you go above that, the input gets lost and confused. If you go below it, you get an ugly blow-up effect because each displayed pixel overlaps multiple physical pixels. There are ways to combat it (like keeping to resolutions with the same aspect ratio, or resolutions that divide evenly into the native resolution), but it will never go away. CRT technology simply did not work that way, which is why the 'suggested resolution' was merely a suggestion and you could often go above or below it, or use a completely different aspect ratio entirely, without any major image quality issues. Any fixed-pixel technology suffers from that effect (LCD, plasma, OLED, AMOLED, etc.), so do not expect that to change any time soon.
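To put rough numbers on that blow-up effect, here's a small Python sketch (the native and requested resolutions are just examples from this thread): unless the panel can use a whole number of physical pixels per logical pixel on both axes, neighbouring pixels have to be blended, which is what reads as blur.

from fractions import Fraction

NATIVE = (1920, 1200)  # example native panel resolution

def scale_factor(requested, native=NATIVE):
    # How many physical pixels each logical pixel must cover on each axis.
    return Fraction(native[0], requested[0]), Fraction(native[1], requested[1])

for res in [(1920, 1200), (960, 600), (1680, 1050), (1280, 1024), (1024, 768)]:
    sx, sy = scale_factor(res)
    clean = sx.denominator == 1 and sy.denominator == 1 and sx == sy
    verdict = "maps cleanly to whole pixels" if clean else "pixels get blended (blurry)"
    print(f"{res[0]}x{res[1]}: scale {float(sx):.3f} x {float(sy):.3f} -> {verdict}")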
 

VGAmike

Honorable
Apr 10, 2012
55
0
10,630
Caeden V
Yes, I know, and it's a pity. Don't you think it's a step backwards, at least for gaming? (Current monitors really have only one usable resolution.) But maybe it's not as bad as I think. Warezme said something interesting about scaling that I didn't know. Monitor scaling vs GPU scaling... I'll have to take a look at that.
 
Reading more posts in this thread, I think there is a huge confusion between color accuracy and color saturation. Color accuracy means the image is 'true to life' and reproduces what the content's author originally intended with their photo or movie (or game, as the case may be). Generally most people would not like a properly calibrated screen for movies and games, as it would look dull or bland to them (thank you to the TV market for having terribly oversaturated images for so many years). Saturation is merely the ability of a screen to show bolder colors; it gives a good 'wow' factor but is not very useful. CRT monitors (with the exception of the cheapest ones) were generally very true to life and much better to work on, because what was displayed was much more accurate to life and to print. LCD monitors (or at least the popular ones) generally blow the saturation out of proportion. A lot of people like it because it makes the images 'pop' more (even I have to admit to liking it in some games and movies), but if you are doing anything where the color has to be 'right' instead of 'enjoyable', then pulling out an old $150 CRT (probably $800+ when it was new) can be a much cheaper way to verify things than buying a $500+ LCD.

In addition to the accuracy of the color is the color range a screen can display. CRTs (even relatively cheap ones) could easily display 10- or even 12-bit graphics with a high degree of accuracy (I loved my Matrox G550 GPU for a long time because 10-bit color was so much better when I was in school, especially on reds, which can be hard to reproduce). TN displays can only do 6-8 bit color, while IPS struggles with 10-bit. AMOLED can finally do 10-bit well, but is relegated to phones and some laptop screens, and we are not likely to see it in desktop displays for a good long time. The problem is so pervasive that all of the mainstream graphics cards (including the highest-end GTX and HD cards) only do 8-bit color processing, and all games are made for this. If you want 10-bit graphics you have to purchase a professional GPU (Quadro, FirePro, or specialty GPUs like Matrox), and again there is little benefit for gamers, as all games are designed with 8-bit in mind.
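Just to put numbers on the bit-depth point (this is pure arithmetic, not a claim about any particular panel):

# Shades per channel and total displayable colors for common bit depths
for bits in (6, 8, 10, 12):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} shades per channel, {levels ** 3:,} colors")

# 6-bit:  64 shades/channel,        262,144 colors (many TN panels, often dithered up)
# 8-bit:  256 shades/channel,    16,777,216 colors
# 10-bit: 1024 shades/channel, 1,073,741,824 colors
# 12-bit: 4096 shades/channel, ~68.7 billion colors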

When it comes down to it, high-end panel technologies are just now beginning to catch up with good CRTs from the '90s, and you pay a high price for that technology. Cheap panels are still below the color quality of even the cheapest CRTs. But when it comes to size, power use, eye strain, and general convenience, LCD wins every time, and most people (myself included) are more than willing to sacrifice a little image quality for a more modern panel.
 

GPU scaling is MUCH better, but it is still not going to match the quality of your native resolution.
As for it being a step backwards: I'm not sure. It was definitely a step back when panels first came out and changing resolution was common practice, but these days you can play most games at 1080/1200p with very good settings on a $150 graphics card. Up until recently I was gaming on an old 9800 GT at high settings with very rare fps issues (sadly only at the exciting and crazy bits). My new 570 obviously blows that out of the water and games very well (way more than my 'needs', but I wanted CUDA support for Premiere, so I splurged and bought it). Still, a 9800 GT only costs $80 or less these days ($62 on Newegg) and still has game (especially if you do not mind lowering some of the settings). If nothing else, adding an $80 card to an average PC will give you better graphics than a $200 console, so I find little reason these days to lower the resolution on a monitor to play a game.

Even some onboard video cards can play modern games at 1080p with decent settings. I know that AMD can, and I am sure the Intel IB graphics will do well enough for the masses.
 