What are the limitations of CRT display technology? resolution, nits, ....

  • #36
jman5000 said:
Believe me, I've tried converting film to higher framerates without interpolation to get rid of the duplicate images the eye perceives; it doesn't work.
Of course, without some form of interpolation you can't upconvert in any useful way. Way back, my department were developing the use of motion vectors to improve motion portrayal in 525/60-to-625/50 TV standards conversion. This link is the abstract of a paper I found; it is not free, but it mentions that motion portrayal can be enhanced beyond simple interpolation. If you are interested in frame rate upconversion then you might find that link interesting. A search on the term 'motion vector' should take you to some useful material.
You will have seen slo-mo sequences in sports TV coverage, and they do much better than simple interpolation. In most respects, TV has far more potential for quality improvement than film can ever have.
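For anyone curious what those motion vectors look like in practice, here is a minimal sketch in Python (with illustrative block and search sizes, not those of any real standards converter) of the block-matching step that estimates one vector per block between two frames:

```python
# Minimal block-matching motion estimation: for each 8x8 block of the current
# frame, search a +/-4 pixel window in the previous frame for the best match
# by sum of absolute differences (SAD). All parameters are illustrative.
import numpy as np

def estimate_motion(prev, curr, block=8, search=4):
    h, w = prev.shape
    vectors = np.zeros((h // block, w // block, 2), dtype=int)
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            target = curr[by:by + block, bx:bx + block].astype(int)
            best, best_v = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue
                    cand = prev[y:y + block, x:x + block].astype(int)
                    sad = np.abs(target - cand).sum()
                    if best is None or sad < best:
                        best, best_v = sad, (dy, dx)
            vectors[by // block, bx // block] = best_v
    return vectors
```

Real converters refine this per pixel and handle occlusions; interpolating along these vectors is what takes you beyond simple frame blending.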
 
  • #37
sophiecentaur said:
Of course, without some form of interpolation you can't upconvert in any useful way. Way back, my department were developing the use of motion vectors to improve motion portrayal in 525/60-to-625/50 TV standards conversion. This link is the abstract of a paper I found; it is not free, but it mentions that motion portrayal can be enhanced beyond simple interpolation. If you are interested in frame rate upconversion then you might find that link interesting. A search on the term 'motion vector' should take you to some useful material.
You will have seen slo-mo sequences in sports TV coverage, and they do much better than simple interpolation. In most respects, TV has far more potential for quality improvement than film can ever have.
Can you think of a hypothetical way to have a display that doesn't hold samples and instead works like a CRT, with the image fading immediately? Games cannot really be run at super high framerates, so overcoming blur with more frames isn't really feasible. I wonder if FED and SED displays would have been sample-and-hold?

Obviously, if you had a display with a high enough refresh rate you might be able to just simulate the CRT method of drawing images, but that level of display seems like sci-fi.
 
  • #38
sophiecentaur said:
Actually, the whole thing is full of flaws. @Baluncore has pointed out some of them.
Also, it's a bit late to be trying to improve the shadow mask design. That ship has sailed.
Would you mind pointing out some of those flaws? I'm not asking you to spend a whole afternoon thinking through all the problems, but if you immediately see something wrong with it I'd appreciate it, as now I'm curious what's wrong with it.
I already noticed it's not actually open circuit, being in contact with the phosphors, but disregarding that, what else was wrong with it?
 
  • #39
I found an article linking to all the other articles on the Blur Busters website about the various sources of motion blur, if anyone wants to see it. It explains this much better than I do, especially if you read many of the links. The blur I was describing how to overcome with 1000 Hz displays is eye-tracking blur.
https://blurbusters.com/faq/lcd-motion-artifacts/

More specifically, this article covers eye-tracking blur: https://blurbusters.com/faq/oled-motion-blur/
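The rule of thumb behind eye-tracking blur is simple enough to put numbers on: while the eye tracks a moving object, each frame smears across the retina for as long as the pixels stay lit, so blur width ≈ tracking speed × persistence. A quick sketch with illustrative numbers:

```python
# Back-of-envelope eye-tracking motion blur on a sample-and-hold display.
# Perceived blur width ~= tracking speed x pixel visibility time.
# The pan speed and persistence values below are illustrative only.

def blur_width_px(pan_speed_px_s, persistence_s):
    return pan_speed_px_s * persistence_s

pan = 960  # px/s, a brisk but easily trackable pan
for label, persistence in [("60 Hz sample-and-hold", 1 / 60),
                           ("144 Hz sample-and-hold", 1 / 144),
                           ("1000 Hz sample-and-hold", 1 / 1000),
                           ("1 ms strobe / CRT-like impulse", 0.001)]:
    print(f"{label:32s} -> ~{blur_width_px(pan, persistence):5.1f} px of blur")
```

This is why a ~1 ms impulse can out-resolve even a fast sample-and-hold panel in motion: the visibility time, not the refresh rate itself, sets the smear.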
 
  • #40
This thread has split into two now - motion is a second issue.
jman5000 said:
Can you think of a hypothetical way to have a display that doesn't hold samples
You can't avoid a sampled image these days, so you are stuck with samples. Before long, and if necessary, they will probably invent a solid-state display with high enough power to flash the pixels for significantly less than the frame repeat period. That could mean impulse samples rather than long ones.

jman5000 said:
Would you mind pointing out some of those flaws?
There is a list of them in the earlier post I mentioned from @Baluncore.

I've been thinking more about CRTs, and there are issues with the choice of a horizontal raster display: it favours horizontal motion portrayal. Your interest seems to be mostly in horizontal motion, and perhaps most games are designed with more of that (?).
jman5000 said:
I think a large portion of these artifacts is tied to the human brain doing some work on the image it perceives, and therefore isn't a directly measurable attribute without using a human eye.
The human eye is there all the time, of course, but it is not hard to make a mechanical display (a large rotating cylinder) as a reference moving image. Much easier than trying to produce fast-moving unimpaired images electronically. Do you fancy making one of those and having it in the corner of your games room? :wink:

I must say, I found the Blur Busters display examples very hard work to look at, and there is an enormous snag: they have no reference CRT or 'real' images.
 
  • #41
jman5000 said:
Technically we already have similar motion handling on certain LCDs that use a strobing backlight, but the implementation is usually bad, creating artifacts that look even worse, and the colors end up looking so washed out and dim it's not worth using.
Still, refining that approach and doing it right looks more promising than resurrecting a tech now long dead.

jman5000 said:
Stuff that is filmed at 24 fps, like nearly all Hollywood films and YouTube videos, doesn't gain anything by running at a higher framerate than it was captured at.
Now AI is working on that, both resolution and fps. The results are quite surprising for a tech barely out of its cradle. It also looks like an area worthy of interest (more useful and interesting than those ch(e)atbots).
 
  • #42
sophiecentaur said:
I must say, I found the Blur Busters display examples very hard work to look at, and there is an enormous snag: they have no reference CRT or 'real' images.
What exactly do they need a CRT for? Are you saying you don't believe the assessment that reducing how long a pixel is visible reduces the perceived motion blur? I have a 144 Hz LCD, and setting it to refresh rates of 60/100/144 definitely lets me run that UFO test faster without blur, and panning around in a game is definitely less blurry. Whatever it is coincides with refresh rate, so even if it isn't directly caused by refresh rate, raising it seems to keep lowering perceived blur.
 
  • #43
sophiecentaur said:
This thread has split into two now - motion is a second issue.
Are you saying I need to make a second thread specifically for motion? I'm probably close to done bugging you now; there's not much more I can ask. Well, that's not true: I could probably talk your ear off asking questions about CRTs, but I won't.

Also, in relation to horizontal vs. vertical motion on a CRT: does that distinction matter? I've never noticed a discrepancy between the two directions on my CRT, but then again maybe I never see stuff move vertically.
 
  • #44
Rive said:
Still, refining that approach and doing it right looks more promising than resurrecting a tech now long dead.
Now AI is working on that, both resolution and fps. The results are quite surprising for a tech barely out of its cradle. It also looks like an area worthy of interest (more useful and interesting than those ch(e)atbots).
I hope so, but it's been two decades of having a literally worse image in motion than what we had with the super-high-end CRT monitors, and while we have 4K now we still haven't caught up in motion resolution, which is kind of the whole point of a display.

I know there are theoretical ways of doing it, such as super-bright LCDs that can use strobing, but those still lack the deep blacks, so then you are looking at doing it with MicroLED instead, and MicroLED isn't even available to consumers at this point.

The only other option is an OLED that gets so bright that you could insert a thick rolling black bar covering something like 75% of the frame duration; that dims the image but gains motion clarity by reducing how long each pixel is visible. With a bright enough OLED the black bar wouldn't matter, as long as I was okay with running at nits comparable to a CRT. The current black-bar insertion in some OLEDs only manages around 600p resolution in motion and ends up even dimmer than a CRT.
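The brightness penalty of that rolling bar is easy to quantify: if the pixels are visible for only a fraction d of the frame, average luminance falls to d of peak, so peak luminance must rise by 1/d to compensate. A minimal sketch, with a target brightness chosen purely for illustration:

```python
# Brightness penalty of black-frame / rolling-bar insertion.
# If pixels are lit for a fraction `duty` of each frame, the panel must
# run 1/duty times brighter to keep the same average brightness.
# The 200-nit target below is an illustrative figure, not a standard.

def required_peak_nits(target_avg_nits, duty):
    return target_avg_nits / duty

target = 200  # desired average brightness, nits
for duty in (1.0, 0.25, 0.10):
    print(f"duty cycle {duty:4.0%}: ~{required_peak_nits(target, duty):.0f} nits peak needed")
```

So the 75% bar above (25% duty) needs a panel four times brighter, and CRT-like ~1 ms visibility at 60 Hz (about 6% duty) needs roughly sixteen times.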

I hope technology can keep improving. I don't know, though; CRTs didn't keep improving ad infinitum. Who is to say we won't hit similar limitations for OLED/LCD? Do we know the limitations, and is it just a matter of developing the tools to manufacture around them?
 
  • #45
jman5000 said:
Who is to say we won't hit similar limitations for OLED/LCD
Just noticed that a solution got skipped here. I wonder if you have checked plasma displays (both their actual parameters and their potential for development) against your requirements/expectations?
 
  • #46
Rive said:
Just noticed that a solution got skipped here. I wonder if you have checked plasma displays (both their actual parameters and their potential for development) against your requirements/expectations?
From what I read on the internet, plasmas supposedly scaled poorly in terms of power and weren't efficient enough at higher resolutions to pass regulations. You are right that plasma might have been the solution, if only faster-decaying phosphors had been used; the plasmas that were sold typically used phosphors with about 4 ms persistence.
 
  • #47
jman5000 said:
weren't efficient enough at higher resolutions to pass regulations
... to be honest, I have some doubts that a CRT would pass those parts of the regulations at the resolutions/frequencies expected these days...

For me, I see a far higher chance of (partial) resurrection for plasma than for CRT.
 
  • #48
jman5000 said:
I've never noticed a discrepancy between the two directions on my CRT
If you wave your hand across the screen side to side and then up and down, there's a noticeable difference. But I can't repeat that or other experiments because my last CRT went away 15 years ago. There's a wealth of old papers on motion portrayal on raster-scan CRTs; just dig around. It won't have been written on a computer, though!

I have to admit that I find video games rather tiresome (but I am in a minority), so I don't really care too much about the amazing specs they need for processor and connection speeds. But I am very impressed by the eye-watering performance that's available.
jman5000 said:
an OLED that gets so bright
I suspect that OLEDs are just a passing phase. I suspect that a bright enough device with a short duty cycle will involve high voltages and high mean power consumption. That could limit applicability to non-portable products. But I'm watching this space.

The future will be in the minds of the developers, and they will ration out the advances to maximise profits by regulating the rate at which new stuff becomes available. I remember the idea that Kodak always had a set of 'new' films and processes to bring onto the market every time (and only after) another company brought out a 'copy' of the latest Kodak product.
 
  • #49
sophiecentaur said:
I have to admit that I find video games rather tiresome (but I am in a minority), so I don't really care too much about the amazing specs they need for processor and connection speeds. But I am very impressed by the eye-watering performance that's available.
I'll admit I'm also getting tired of video games; only the few that stand out catch my attention nowadays. But that is just it: we need super high refresh rates to lower the blur from sample-and-hold; it's a fundamental flaw of the approach. Yes, more refreshes give a more continuous image, but the clarity of those images is bad because of the long persistence. My CRT running at 94 Hz flat out looks better in motion clarity and smoothness than my high-end 144 Hz LCD once objects exceed a certain speed threshold that isn't hard to exceed. The clarity is better even at 60 Hz; though I'll admit the smoothness feels less than the 144 Hz LCD at that point, it still ends up looking better.

This isn't just for games, though; it applies to movies as well. Having super high framerates just produces the image duplications, and they still have the blur from sample-and-hold as well. Running 24 fps movies on a higher-Hz display doesn't actually decrease the persistence, since the frames are duplicated to fit into however many Hz it is; to do otherwise would result in a super dim image.
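For reference, that duplication is just pulldown. A small sketch of how 24 fps maps onto a 60 Hz panel (the classic 3:2 cadence), showing that each unique film frame is still held for two or three refreshes, i.e. its full ~42 ms:

```python
# How 24 fps film frames are distributed across 60 Hz refreshes (3:2 pulldown).
# Each film frame is simply shown for 2 or 3 refreshes, so the hold time per
# unique image is unchanged by the higher refresh rate.

def pulldown_repeats(film_fps=24, display_hz=60):
    repeats, owed, shown = [], 0.0, 0
    for _ in range(film_fps):
        owed += display_hz / film_fps   # 2.5 refreshes owed per film frame
        n = int(owed + 0.5)             # cumulative rounding -> 3, 2, 3, 2, ...
        repeats.append(n - shown)
        shown = n
    return repeats

r = pulldown_repeats()
print(r[:8])                # [3, 2, 3, 2, 3, 2, 3, 2]
print(sum(r), "refreshes")  # 60
```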

Like I said, these should be solvable problems if we can get even brighter pixels than we have now. I know there are some displays around two thousand nits nowadays, so maybe that would be bright enough, but nobody has tried using those displays in ways that improve motion clarity.
 
  • #50
jman5000 said:
Having super high framerates just produces the image duplications
Why would it involve something as basic as image duplication? I think your experience with increasing frame rates reflects processing that is not top range. If you want to see how it can be done really well, then look at slow-mo produced at the TV picture source, before transmission. It's very processor-intensive, and it builds a picture for each frame using motion vectors for each pixel. The motion portrayal is very smooth, with very little blur. I'd imagine that the cost of circuitry to do this in real time in a TV monitor may just be too much.
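As a rough illustration of that idea, here is a sketch of synthesising a temporal midpoint frame from a per-block motion field (such as the one from the block-matching sketch earlier in the thread). Real converters work per pixel and handle occlusions; this shows only the core move:

```python
# Motion-compensated midpoint frame: pull each block from the previous frame,
# displaced halfway along its estimated motion vector. vectors[i, j] = (dy, dx)
# means the block now at (i*block, j*block) matched the previous frame at
# (i*block + dy, j*block + dx). The block size is illustrative.
import numpy as np

def midpoint_frame(prev, vectors, block=8):
    h, w = prev.shape
    out = np.zeros_like(prev)
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            dy, dx = vectors[by // block, bx // block]
            # at the temporal midpoint the block has covered half its path,
            # so fetch it from halfway back along the vector (clamped to frame)
            sy = min(max(by + dy // 2, 0), h - block)
            sx = min(max(bx + dx // 2, 0), w - block)
            out[by:by + block, bx:bx + block] = prev[sy:sy + block, sx:sx + block]
    return out
```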

Rive said:
For me, I see a far higher chance of (partial) resurrection for plasma than for CRT.
Haha. And heat your house at the same time. 'Electronic tubes' use as much power for their heaters as your whole solid-state display, I'd bet. But we still use one electron tube in our homes and there is, as yet, no alternative: the heroic magnetron is a legend. I read a story that the very first one manufactured in a lab worked really well, the result of some brilliant lateral thinking about how electrons behave in magnetic fields.
 
  • #51
Okay, so modern displays seem unable to reach the super high refresh rates needed for the clarity they would afford. Do you think this is due to a limitation in how fast the logic can refresh the screen, or could it be done if the manufacturers felt so inclined?
I feel like this is probably a dumb question that stems from me not knowing how electrical stuff works, but if it is a logic issue, as absurd as this sounds, would it be possible to use a micro CRT that does the logic of painting the pixels sequentially, and make some sort of analog system to convert that into an OLED lighting up the screen in exactly the same way? So instead of painting a phosphor, the beam paints a grid of sensors, each representing a pixel on the OLED.

I watched a video on analog computers and how they can be used to do specific calculations at insane speeds for specific tasks. That's kind of what I was thinking with this, since I'd want the larger display to mimic the actions of the smaller one.
 
  • #52
jman5000 said:
as absurd as this sounds, would it be possible to use a micro CRT that does the logic of painting the pixels sequentially, and make some sort of analog system to convert that into an OLED lighting up the screen in exactly the same way?
You are right; it does sound absurd, and I don't think you will get much further by inventing these things in your head. The level of sophistication in TV monitors depends largely on how much money 'they' want to spend on development and how much profit they can make. I advise you to read around a bit more, beyond your Blur Busters site, if you want to have a better understanding.
BTW, analogue computers were once the only way to solve certain problems, but digital computers have progressed in leaps and bounds. I remember, in the late 60s, being shown an analogue computer that was used to investigate underwater guided weapons. It was the size of a large wardrobe, and the guy was very proud of what it could do; but at the time almost no one even had a personal electronic calculator, and all computers occupied the best part of a large air-conditioned room. Things digital took off very fast after that, and I'm not aware of modern analogue techniques for problem solving. (No doubt someone will put me straight about that; such is PF.)
 
  • #53
The video I watched about modern analog computing was this:
The guy in the video claims modern digital computing chips were only 1-4x faster than the chip his company had, yet their chip consumed just 3 watts.
That company went bankrupt, last I knew.
 
  • #54
Thread is veering off the OP topic, so it may be closed soon...
 
  • #55
berkeman said:
Thread is veering off the OP topic, so it may be closed soon...
That's fair. I don't really have any more questions.
 
  • #56
jman5000 said:
but if it is a logic issue
It is not. The last few generations of CRT TVs were already digital, and high-performance CRT monitors could not have been made without fine digital control. The issue was rather the difficulty of manufacturing and the problems of wide-frequency optimisation of the magnetic parts/drivers.
And, of course, the competition they lost.

sophiecentaur said:
I'm not aware of modern analogue techniques for problem solving.
It's a tech that is good to flirt with but hard to actually do, so it resurfaces from time to time, especially when some new area requires new solutions - but then it vanishes just as fast.
This time it was with neural networks, machine learning and AI.
No real success so far.
 
  • #57
Rive said:
especially when some new area requires new solutions
Imo, the way forward for that stuff is specialist 'co-processors' for commonly used functions in regular computers. But, of course, you need an extra bolt-on for every one of those functions. That's the beauty of the general-purpose digital computer.
Rive said:
This time it was with neural networks, machine learning and AI.
Both of those terms tend to be used out of their true context. Many examples of artificial intelligence are better described as 'sheer grunt', but which term would go best in an advert?
 
  • #58
I'm actually wondering: if I want to get more brightness out of x amount of phosphor, is it impossible to get y amount of brightness without z amount of acceleration voltage? Does that mean making any type of CRT with higher brightness would be dangerous, because the acceleration voltage would generate X-rays?
 
  • #59
Also, what physically happens that causes CRT burn-in? Could a phosphor generate 10x the nits but be exposed to the beam for a much shorter time and be just fine in terms of damaging and burning the phosphor?
 
  • #60
jman5000 said:
is it impossible to get y amount of brightness without z amount of acceleration voltage?
The beam current is also relevant (P = VI).
CRT design (particularly with three colour systems) is compromise all the way. You have to get good beam focus and colour convergence over a huge range of angles. There must also be a limit to the power density that the screen mechanism can handle.
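As a quick worked example of the P = VI point, with values that are typical orders of magnitude rather than those of any specific tube:

```python
# Beam power delivered to the screen is anode voltage times beam current,
# so brightness can be raised with current as well as acceleration voltage.
# Both figures below are illustrative orders of magnitude.
anode_kv = 25    # acceleration voltage, kV (typical for a colour CRT)
beam_ua = 500    # average beam current, microamps (illustrative)

power_w = (anode_kv * 1e3) * (beam_ua * 1e-6)
print(f"beam power delivered to the screen: {power_w:.1f} W")  # 12.5 W
```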
 
  • #61
sophiecentaur said:
The beam current is also relevant (P = VI).
CRT design (particularly with three colour systems) is compromise all the way. You have to get good beam focus and colour convergence over a huge range of angles. There must also be a limit to the power density that the screen mechanism can handle.
I ask because I was looking into the FED/SED display tech that was going to come out in the late 2000s, which had an electron emitter for every single subpixel. There is a 720p prototype you can see on YouTube. Wikipedia states it would allow for increased brightness due to not needing to steer the beam, but it makes me wonder whether that is actually true, since I thought increased brightness meant higher acceleration voltage, which is bad for X-rays. I'm also not sure if these would be impulse displays or not.
 
  • #62
jman5000 said:
I ask because I was looking into the FED/SED display tech that was going to come out in the late 2000s, which had an electron emitter for every single subpixel. There is a 720p prototype you can see on YouTube. Wikipedia states it would allow for increased brightness due to not needing to steer the beam, but it makes me wonder whether that is actually true, since I thought increased brightness meant higher acceleration voltage, which is bad for X-rays. I'm also not sure if these would be impulse displays or not.
Can you provide a link to this please?

If there is no shadow mask, that eliminates a big e-beam power loss for color CRTs.
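To put a rough number on that (the transmission figure below is an assumption for illustration; values on the order of 15-20% are often quoted for shadow masks):

```python
# Rough gain from eliminating the shadow mask: if the mask intercepts most
# of the beam, a per-subpixel emitter with no mask delivers the full beam
# power to the phosphor. The 18% transmission here is an assumed figure.
mask_transmission = 0.18
gain = 1 / mask_transmission
print(f"removing the mask puts ~{gain:.1f}x more beam power into the phosphor")
```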
 
  • #63
berkeman said:
Can you provide a link to this please?

If there is no shadow mask, that eliminates a big e-beam power loss for color CRTs.
Here is the prototype. It's just a video demo, though.

The wiki for SED, which may have inaccurate info because it's Wikipedia: https://en.wikipedia.org/wiki/Surface-conduction_electron-emitter_display

But that is probably too general for you. I found this more technical explanation as well: https://www.eetimes.com/a-technical-comparison-between-sed-and-fed/

I can't find the info stating they could be run at higher brightness; I might have imagined that. Apparently this never made it to market due to patent disputes, although I don't know if it would even do anything better than current displays, honestly.

I'm kind of curious whether these would have issues with longevity: as I understand it, ion bombardment is a problem with CRTs, and I think it would be exacerbated by making the vacuum cells really small. Actually, thinking about it, even if you could create brighter vacuum displays, wouldn't that just result in heavier ion bombardment? It seems like a pretty fundamental property of vacuum tubes that limits what you can do.
 
