
I suspect even most hardcore retrocomputer hobbyists care most about emulating the parts of a display that actually came up in use of the machine. If I were an eccentric billionaire who wanted a replica of the Mona Lisa to hang on my wall and enjoy regularly without the inconvenience of weekly flights to France, I'd care much more about my money going into making a product that got the visible details of the canvas right (https://hyper-resolution.org/view.html?pointer=0.055,0.021&i...) than one that spoofed the proper results if I carbon-dated it or something. I think the same concept applies here. I don't really care as much if the only thing an emulator can't replicate is a clever but (to an observer) comically specific physics-based test for authenticity, as long as I get everything I'd notice while using the computer correct at a fraction of the price. In the context of preservation, just knowing that some other (far richer than me) person or org is keeping a single-digit number of the actual artifacts maintained for future reference is good enough for me.


I'm not sure I follow. CRTs draw the image to the screen in a fundamentally different way than modern displays: the electron beam moves sequentially left to right, top to bottom. This analog process, happening at 50 or 60 fields per second (with a line rate around 15 kHz), is what gives authentic arcade machines their look and feel. Same for old computer terminals. My understanding is that to reproduce this effect on a modern display, you'd need an extremely high refresh rate. Properly replicating it requires addressing some pretty low-level aspects of the system. Hardware limitations are bound by the laws of physics, after all.

Beyond just the aesthetics, there are practical reasons why this is important, whether it be lightgun idiosyncrasies or how the game "feels," which can affect timing and such for competitive players. There's a lot more to preserving the look, feel, and compatibility of displays for old computer systems than most realize, and the rabbit hole goes quite deep on this one.


    there are practical reasons why [how the electron gun works is]
    important, whether it be lightgun idiosyncrasies or how the
    game "feels,"
This is always interesting to discuss because there are so many factors at play! To put it in less than a zillion words:

The way a game "feels" in this context is essentially a function of input latency. The old-style "chasing the beam" hardware, plus a CRT display, adds up to something very close to a true zero-lag environment.

Here's a breakdown of the input lag in a modern environment, for contrast. These are all latencies that don't exist in something like, say, a Sega Genesis/Megadrive hooked up to a CRT: http://renderingpipeline.com/2013/09/measuring-input-latency...

In an ideal emulation situation, you could theoretically recreate something close to a zero-lag analog environment (in terms of latency) without necessarily simulating the path of the electron beam itself.

Although, as the linked article implies, there are a lot of bits in the emulation stack that would need to be optimized for low latency. High refresh rate displays get you part of the way there "for free."
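
As a back-of-the-envelope illustration of how those stages stack up, here's a sketch in Python. Every number below is an assumption for the sake of the example, not a measurement:

    # Illustrative input-to-photon latency budget for an emulator on a
    # modern display. All figures are assumed for the example.
    stages_ms = {
        "controller_poll": 4.0,     # 125 Hz USB polling -> ~4 ms average wait
        "os_input_delivery": 1.0,   # event reaches the emulator process
        "emulator_frame": 16.7,     # a whole frame is rendered before presenting
        "compositor_queue": 16.7,   # one frame of buffering in the display stack
        "display_scanout": 8.3,     # average wait for scanout to reach the pixel
    }
    total = sum(stages_ms.values())
    print(f"~{total:.0f} ms, i.e. ~{total / 16.7:.1f} frames at 60 Hz")

On the Genesis-plus-CRT setup, nearly every one of those line items is simply absent: the console races the beam, and the CRT emits photons as the signal arrives.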


Not everything is a game that requires minimal latency, though. For, say, a terminal, or a CDC 6x00 console, some lag is perfectly acceptable.


Sure, and even many games don't particularly benefit from it. However, it's a really remarkable thing to play e.g. Mega Man or Smash Bros. in a true lag-free environment.


I wonder about that. Might it be that a specialized display controller on, say, an OLED display would be enough?

You could then have the controller artificially drive it line by line instead of refreshing the whole screen.
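
A minimal sketch of that per-line driving loop, in Python for readability. The panel.write_line call stands in for a hypothetical line-addressable controller interface; nothing like it exists as a standard API:

    import time

    LINES = 262                            # approximate NTSC lines per field
    FIELD_HZ = 60
    LINE_TIME = 1.0 / (LINES * FIELD_HZ)   # ~63.6 microseconds per line

    def drive_field(panel, framebuffer):
        """Push one field to a hypothetical line-addressable OLED
        controller one scanline at a time, mimicking a CRT's sweep."""
        start = time.perf_counter()
        for y in range(LINES):
            panel.write_line(y, framebuffer[y])   # hypothetical call
            # Spin until this line's slot in the 60 Hz field has passed.
            while time.perf_counter() - start < (y + 1) * LINE_TIME:
                pass

In practice you'd want this in an FPGA or the controller silicon itself; a general-purpose OS can't reliably hit 63-microsecond deadlines from userspace.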


Perhaps. One issue I foresee is the way CRTs glow. The phosphor doesn't light/dim immediately the way an LED does, so there's some amount of fade-in/fade-out that happens on a CRT as the beam moves across the screen. I imagine this could be difficult or impossible to reproduce with a traditional OLED screen. Some old games rely on this persistence, along with the slow refresh rates, to create a sort of dithering/aliasing effect.


Phosphor decay is not terribly difficult to simulate to an acceptable degree. Doing it at the pixel level is pretty easy, doing it at the phosphor level is computationally harder but not much more complicated.

The larger issue w.r.t. this specific quirk of CRTs is that we're running out of human beings that are familiar with what this is "supposed" to look like, and actually care.

I care a lot, but I'm old.


I'm not aware of any cases where it's been emulated in any acceptable manner. I can't be bothered to do the math myself, but I imagine doing this well would be beyond the capabilities of modern displays (probably requiring refresh rates in the thousands of Hz). Maybe some special FPGA-based controller with an OLED, like was suggested above, could make it possible. I'm not sure.


Can you talk more about why you feel it would be infeasible? I'm a guy with a house full of CRTs so I am genuinely interested.

What sorts of things are advanced filters like CRT-Royale missing? https://www.google.com/search?q=crt-royale

Each individual phosphor dot on a CRT is not terribly tricky to emulate.

The brightness at any given moment is a fairly simple decay function based on how long it's been since the electron gun lit it up. On top of that, you would typically want to apply some level of bloom to simulate the way light is diffused by the glass. Sure, you've got a few million dots to simulate, but this is also an embarrassingly parallel problem.

Now of course, admittedly, you're only simulating that phosphor glow decay at the refresh rate of your monitor -- 60 Hz, 144 Hz, 240 Hz, whatever -- instead of the effectively continuous decay you'd get in real life. However, I don't think that is a practical issue.
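
As a sketch of what that per-frame update might look like, assuming a simple exponential decay (real phosphor curves are non-linear, closer to a power law, and vary by phosphor type):

    import numpy as np

    DECAY_TAU = 0.002   # assumed: seconds to fall to ~37% brightness

    def advance(phosphor, beam_frame, dt):
        """Decay the persistent phosphor state by dt seconds, then
        deposit the newly scanned content at full energy."""
        phosphor *= np.exp(-dt / DECAY_TAU)             # embarrassingly parallel
        np.maximum(phosphor, beam_frame, out=phosphor)  # beam re-excites dots
        return phosphor

    # At 240 Hz output you'd call advance() with dt = 1/240, passing in
    # only the slice of the emulated frame the beam covered in that window.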

You're clearly thinking of factors I'm not and I'm genuinely interested. To my mind, the visual aspects of CRTs are pretty easy to simulate, but not the near-zero lag.


The thing you can't emulate is the phosphor coating. It simply looks different because the light isn't coming from a backlight; the front of the display is itself emitting. And with vector graphics you don't have pixels at all; the light shines quite beautifully in a way I don't think is possible with backlit displays.


> The thing you can't emulate is the phosphor coating. It simply looks different because the light isn't coming from a backlight; the front of the display is itself emitting.

I did say OLED, not LCD, precisely because of that.


Are you sure it's the decay they're using, and not the natural blurriness and texturing?

And some phosphors have a much longer decay than others, but you could easily emulate those long-term effects on a normal screen.


That, and the fade follows a non-linear curve. It's pretty cool, but it takes quite a lot of math to match the physics going on.


It would need to redraw the whole screen to account for the phosphor decay. To do that with line resolution and an NTSC signal, you'd have to redraw it roughly 15,000 times per second (60 fields of about 250 visible lines each). You'd draw the current line at full brightness and "decay" the rest of the frame according to that phosphor's persistence. Since there is some quantization, you could reduce the frequency of decays as the line gets older.
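
Spelling out the arithmetic (using the full NTSC line count rather than just the visible lines):

    fields_per_second = 60
    lines_per_field = 262.5   # NTSC total, including blanking; ~250 visible
    print(fields_per_second * lines_per_field)   # 15750.0 line-rate updates/sec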


On the concept of these very weird displays: I remember an HP oscilloscope that had a monochrome CRT and a polarizer in front cycling between R, G, and B components on every refresh cycle. The screen resembled a DLP projection in that you'd see separated color frames when your eyes moved, but a stable color image when you fixated on one part of the screen. A very neat way of producing crazy-small color pixels on a 7-inch-ish CRT.

And yes, that device cost about the same as my house back then (2002).

https://hackaday.com/2019/01/17/sharpest-color-crt-display-i...


I'll give you an example from the LCM (Living Computers Museum). They had a PLATO terminal with its original plasma flat-panel display. I'd been reading about PLATO for years and had even run it in emulation, but I'd never seen the actual hardware before visiting the LCM.

The experience on the original terminal was way different from emulation. The way the screen worked and the tactile feel of the keyboard were the core of the experience. Sitting at an actual terminal really changed my understanding of the system because it gave me a physical context that software emulation could not provide. You'd be hard-pressed to emulate the eye-melting nature of the original plasma display or the stiffness of the keyboard.


The physical experience is a huge part of the overall thing. I have a C64 Maxi and it's absolutely amazing, exquisitely close to the original (but with an HDMI output and USB ports).


    I'd care much more about my money going into
    making a product that got the visible details
    of the canvas right than one that spoofed the
    proper results if I carbon-dated it or something
You've inadvertently highlighted one of the challenges of preservation: identifying which aspects matter.

Does fooling a carbon dating test matter? This is purely subjective, but for most people surely not.

But interestingly you've linked to an ultra high resolution image viewer that lets the viewer drill down into a nearly microscopic view of the painting. If a person doesn't know much about art, they might think that if you could take something like this and hang it on your wall, it would be a pretty damn good replica of the real thing. It would certainly be cool, I have to admit. Hell, I'd love it on my wall.

And yet, it's utterly different than the real thing. Paintings in real life are three dimensional. Van Gogh in particular is one who used thick gobs of paint. Each fraction of a micron of the painting has height and its own reflective properties which interact with the light in the room as you walk around and observe it.

    if I get everything I'd notice while using the
    computer correct at a fraction of the price.
Well, that's the thing. It's certainly up to the individual whether or not they give a crap about any particular detail.

If you don't care about how oil paintings actually look in real life, or what video games actually looked and felt like, and you choose to brand all of the things you don't understand or don't care about as "comical", then... well, more power to you. That's your choice.

But some people choose to care.



