Ancient CRT monitor hits 700Hz — resolution compromised to just 120p to reach extraordinary refresh rate

I don't see where the article says what GPU was used to drive it, but maybe I can save folks some trouble. Nvidia's Maxwell (GTX 900) was the last generation to have analog out.

I was thinking about buying a Fury, but then I saw it lacked analog and I still had a pair of 24" CRT monitors. So, when I noticed that the GTX 980 Ti both had analog outputs and was deeply discounted (due to the launch of the GTX 1070 and 1080), I snapped one up.
 
What's the maximum frequency they support? I haven't looked in a while, but the ones I saw were all trash (1080p @ 60 Hz).
No clue what they're limited to; I just know that's what they're using for this testing, from checking out the channel.

I believe this is the adapter they're using: https://www.delock.com/produkt/62967/merkmale.html?g=1023

edit: It sounds like what the good quality ones all have in common is the Synaptics VMM2322 chip.
 
What's the maximum frequency they support? I haven't looked in a while, but the ones I saw were all trash (1080p @ 60 Hz).
If you look at it as pixels per second, 1920x1080x60 is about 4.6x 320x120x700. That's not counting the porches, but even with them, I doubt the 700 Hz mode involves significantly more scanning than 1080p60. That's why this CRT gimmick works at all. The dongles may not allow it even if the DAC update rate is the same, though.
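If anyone wants to sanity-check that ratio, here's a quick back-of-the-envelope sketch (active pixels only; blanking intervals are ignored):

```python
# Back-of-the-envelope pixel-rate comparison: 1080p60 vs. 320x120 @ 700 Hz.
# Active pixels only; the porches/sync intervals are ignored here.

def pixel_rate(width, height, refresh_hz):
    """Active pixels scanned per second for a given mode."""
    return width * height * refresh_hz

rate_1080p60 = pixel_rate(1920, 1080, 60)   # ~124.4 million pixels/s
rate_120p700 = pixel_rate(320, 120, 700)    # ~26.9 million pixels/s

print(f"1080p60 : {rate_1080p60 / 1e6:.1f} Mpx/s")
print(f"120p700 : {rate_120p700 / 1e6:.1f} Mpx/s")
print(f"ratio   : {rate_1080p60 / rate_120p700:.1f}x")  # ~4.6x
```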
 
In the early 00s I was gaming at 200 Hz and 200 FPS on my 22" CRT screen, with a Sony tube of course.
Mind you, NOT at the screen's max resolution, and of course with lowered in-game image settings; back then, we played the game for the game and not for its looks.

About the same time, some of my fellow clan members were gaming on early LCD screens, and I was like, WTF?! How can you even look at this lagging garbage?
I was getting headaches just watching them play.

Still, a few of them, like me, were most often in the top 50 worldwide in the games we played.
 
If you look at it as pixels per second, 1920x1080x60 is about 4.6x 320x120x700. That's not counting the porches, but even with them, I doubt the 700 Hz mode involves significantly more scanning than 1080p60.
OMG. With the mention of porches, I'm suddenly having semi-traumatic flashbacks of having to manually compute my own X11 mode timings, in Linux. They never looked as good as the automatic timings Windows would use.
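For anyone who never had that particular pleasure: the pixel clock in a modeline is just total pixels per line times total lines times the refresh rate, where the totals include the front porch, sync pulse, and back porch. A minimal sketch, with completely made-up blanking numbers for the 320x120 @ 700 Hz mode:

```python
# Toy modeline pixel-clock calculation. The blanking values below are
# hypothetical, just to show where the porches enter the arithmetic.

def pixel_clock_mhz(h_active, h_front_porch, h_sync, h_back_porch,
                    v_active, v_front_porch, v_sync, v_back_porch,
                    refresh_hz):
    """Pixel clock (MHz) = h_total * v_total * refresh."""
    h_total = h_active + h_front_porch + h_sync + h_back_porch
    v_total = v_active + v_front_porch + v_sync + v_back_porch
    return h_total * v_total * refresh_hz / 1e6

# Guessed timings for 320x120 @ 700 Hz: 416 total columns, 137 total lines.
print(f"{pixel_clock_mhz(320, 16, 32, 48, 120, 3, 4, 10, 700):.1f} MHz")  # ~39.9
```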

In the early 00s I was gaming at 200 Hz and 200 FPS on my 22" CRT screen, with a Sony tube of course.
I typically would back off the resolution a tad, just so I could get higher refresh rates, like 85 Hz. For me, it was mainly about reducing flicker, since I didn't play games.

About the same time, some of my fellow clan members were gaming on early LCD screens, and I was like, WTF?! How can you even look at this lagging garbage?
I was getting headaches just watching them play.
Yeah, with low framerates and no backlight strobing, the latched-pixel effect can result in excessive motion blur.
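As a rough rule of thumb, on a full-persistence (sample-and-hold) display, a tracked object smears across however many pixels it moves during the time each frame stays lit, which is why strobing or very high refresh rates help. A quick sketch, with example numbers only:

```python
# Rule-of-thumb motion blur on a sample-and-hold display: the eye tracks the
# object, so it smears over the distance it moves while each frame is held.
# The object speed below is just an example figure.

def hold_blur_px(speed_px_per_s, refresh_hz, persistence_fraction=1.0):
    """Approximate blur width in pixels for a tracked moving object."""
    frame_time_s = 1.0 / refresh_hz
    return speed_px_per_s * frame_time_s * persistence_fraction

speed = 960  # example: object crossing the screen at 960 px/s
for hz in (60, 144, 700):
    print(f"{hz:>3} Hz, full persistence: ~{hold_blur_px(speed, hz):.1f} px of blur")
```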
 
CRT was still the best tech compared to flatscreens we have now. No ghosting, much better, brighter lighting and colors, and refresh rates actually being real (the screen is refreshed in its entirety, not on a per-pixel basis like on today's flatscreens). Downsides were their weight, heat/power draw, but also the sheer size of these things.
 
CRT was still the best tech compared to flatscreens we have now.
I disagree.

No ghosting,
Oh, yes they did! If you flashed a bright image on screen, there would usually be an afterglow. Unlike LCD, you can't just apply a higher voltage to switch a pixel off faster. Once excited, the phosphors had an exponential decay curve.
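To put the decay point in concrete terms: the afterglow falls off roughly as I(t) = I0 * exp(-t / tau), so you can't cut it short, you can only wait for it to fade. A tiny illustration with a made-up persistence constant:

```python
import math

# Illustrative phosphor afterglow: I(t) = I0 * exp(-t / tau).
# tau is a made-up persistence constant, not a measured value for any tube.

def afterglow(i0, tau_ms, t_ms):
    """Remaining intensity t_ms after the beam leaves, for decay constant tau_ms."""
    return i0 * math.exp(-t_ms / tau_ms)

tau = 2.0  # hypothetical persistence, in milliseconds
for t_ms in (0.0, 1.0, 2.0, 5.0, 10.0):
    print(f"t = {t_ms:4.1f} ms -> {afterglow(1.0, tau, t_ms) * 100:5.1f}% of peak")
```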

Downsides were their weight, heat/power draw, but also the sheer size of these things.
Not to mention:
  • focus
  • convergence
  • purity
  • flicker
  • sensitivity to electrical interference
  • sensitivity to magnets
  • floating blacks
  • geometrical distortions
  • burn-in
 
I've found it rather interesting that a fair number of the advantages and disadvantages of OLED and CRT overlap. Though realistically every display technology has something that it does very poorly.

I'm still holding out hope that there will be a breakthrough in micro LED, but I won't be holding my breath. Sony's multilayer LCD technology seems extremely interesting; should they be able to deliver it at a reasonable price, it may actually carve out a solid chunk of the market.
 
I've found it rather interesting that a fair number of the advantages and disadvantages of OLED and CRT overlap. Though realistically every display technology has something that it does very poorly.
I ran dual 24" monitors until 2020, at which point I borrowed an LCD monitor from my job (we were allowed to take equipment home, due to pandemic & work-from-home requirements). For the longest time, I had been holding out for OLED. It's therefore somewhat ironic that I bought my own LCD only last year, once OLED monitors hit the mainstream. However, I just felt too much uncertainty about the degree to which burn-in was still an issue. Also, there were minor complaints about text quality, on some monitors, due to their subpixel layout. Price was another minor consideration, but wouldn't have stopped me if everything else were right.

I've got to say that when I got my high-end plasma TV, more than a decade earlier, I was disappointed that it looked noticeably worse than my CRT. It had a lot of dithering noise that forced me to position it farther away than I had intended.

BTW, in all my years of running CRTs, burn-in wasn't something I personally experienced. However, I used a screen blanker and kept the contrast relatively low.
 
I would prefer a monitor with high resolution and a "good enough" refresh rate over one with a high refresh rate and low resolution.
Still enjoying my Neo Odyssey G8, 4K @ 240 Hz 😊
 
CRT is just the best for PC gaming. One of the strongest selling points of PC gaming is its immense backwards compatibility, but most games before the 7th gen weren't made for HD 16:9 resolutions: there will always be some undesirable stuff, and the textures and art style get utterly destroyed by modern 4K OLEDs.

Cyberpunk looks best on an OLED monitor with proper HDR, but it still looks amazing on a CRT monitor.
Half-Life 1 looks best on a CRT monitor, but it doesn't look right on an OLED; actually, it looks disgusting on an OLED.

That asymmetry is why CRT will always be the PC gaming display. The ability to go through 50 years of gaming in one sitting, with all of it looking somewhat *right*, is just incredible. No, it may not be worth the inconveniences that come with actually using a CRT monitor, but it is something that can't be faked.
 
I've found it rather interesting that a fair number of the advantages and disadvantages of OLED and CRT overlap. Though realistically every display technology has something that it does very poorly.

I'm still holding out hope that there will be a breakthrough in micro LED, but I won't be holding my breath. Sony's multilayer LCD technology seems extremely interesting; should they be able to deliver it at a reasonable price, it may actually carve out a solid chunk of the market.
OLED is the best tech right now, but I expect the tech to go extinct when micro LED hits the mass market. 5 to 10 years, maybe?
 
I ran dual 24" monitors until 2020, at which point I borrowed an LCD monitor from my job (we were allowed to take equipment home, due to pandemic & work-from-home requirements). For the longest time, I had been holding out for OLED. It's therefore somewhat ironic that I bought my own LCD only last year, once OLED monitors hit the mainstream. However, I just felt too much uncertainty about the degree to which burn-in was still an issue. Also, there were minor complaints about text quality, on some monitors, due to their subpixel layout. Price was another minor consideration, but wouldn't have stopped me if everything else were right.

I've got to say that when I got my high-end plasma TV, more than a decade earlier, I was disappointed that it looked noticeably worse than my CRT. It had a lot of dithering noise that forced me to position it farther away than I had intended.

BTW, in all my years of running CRTs, burn-in wasn't something I personally experienced. However, I used a screen blanker and kept the contrast relatively low.
Burn-in on OLEDs is overblown. If the correction algorithm is perfected, there should be no burn-in no matter how hard you try, since the algorithm pushes more voltage to overworked subpixels to retain their brightness. If the algo messes up, there might be issues.
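To illustrate the idea (this is not any vendor's actual algorithm, just a sketch of the concept): the panel tracks accumulated on-time per subpixel and raises the drive level of the more worn ones, so their light output still matches what's requested.

```python
# Conceptual sketch of OLED wear compensation, NOT any vendor's real algorithm.
# Idea: track cumulative on-time per subpixel and raise the drive level of
# worn subpixels so their light output still matches the requested brightness.
# The 2% luminance loss per 1000 h figure below is an assumption for illustration.

def compensation_gain(on_time_hours, loss_per_1000h=0.02):
    """Drive gain needed to offset an assumed luminance loss per 1000 h of on-time."""
    remaining_efficiency = max(1.0 - loss_per_1000h * (on_time_hours / 1000.0), 0.1)
    return 1.0 / remaining_efficiency

for hours in (0, 2_000, 5_000, 10_000):
    print(f"{hours:>6} h on-time -> drive gain {compensation_gain(hours):.3f}x")
```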

But the thing is, an OLED that has some burn-in still has better uniformity than a brand-new IPS or VA. Ghosting, IPS bleed, haloing, and a plethora of other flaws are inherent to IPS and VA displays, so people worrying about their OLED not being perfect after two years of use is kinda weird when every other tech is imperfect straight from the factory.
 
OLED is the best tech right now, but I expect the tech to go extinct when micro LED hits the mass market. 5 to 10 years, maybe?
OLED is situational, because you can absolutely still get burn-in even with the new algorithms; it's just a lot more subtle. It's especially problematic on the edges of bright windows if they're always in the same place, like if you had your desktop set up with spreadsheets and the like. If you're mostly gaming and consuming media, then the issues are basically nonexistent.

It's also still awful in bright rooms compared to the alternatives.

Overall visual quality and response time, though, are second to none on the consumer market.

MicroLED, if they can get manufacturing under control, is the future for sure.

Sony also has a really fascinating technology they used for their new reference displays. It has two display panels so it can get brighter than anything else while maintaining perfect visuals. I'm not sure that will ever come to consumer displays and I imagine the response times wouldn't be great, but it's hard to say overall.
 
OLED is situational, because you can absolutely still get burn-in even with the new algorithms; it's just a lot more subtle. It's especially problematic on the edges of bright windows if they're always in the same place, like if you had your desktop set up with spreadsheets and the like. If you're mostly gaming and consuming media, then the issues are basically nonexistent.

It's also still awful in bright rooms compared to the alternatives.

Overall visual quality and response time, though, are second to none on the consumer market.

MicroLED, if they can get manufacturing under control, is the future for sure.

Sony also has a really fascinating technology they used for their new reference displays. It has two display panels so it can get brighter than anything else while maintaining perfect visuals. I'm not sure that will ever come to consumer displays and I imagine the response times wouldn't be great, but it's hard to say overall.
Yes, you can get burn-in if you use windows side by side 8 hours a day, every day. But in order to actually notice it, you need to test specifically for it (you know, switching through all the colors, etc.). On the other hand, what you do indeed notice is all the imperfections non-OLEDs have straight from the box.

They are not awful in bright rooms; they are awful if the light is directly in front of them. So yeah, you need to think about your setup before getting one, I guess.
 