Feature request for future versions of emulators
Posted: Fri May 14, 2010 6:28 am
I wanted to post this in a general emulation subforum, but it only allows for posting in forums specific to one emulator and excluding the others, so I posted here instead. Move it if you feel like it.
I would love it if you could implement the original hardware's LDTV (240p/288p) display modes in the console emulators, especially, but not limited to, 2D consoles like the SNES, NES, MD, etc., because you don't get any more resolution out of these games (with some exceptions), and displaying them interlaced causes nasty flicker.
EDIT: I bring this up because the "Zelda Collector's Edition" NES games use 240p, so the hardware allows for it. I don't know about support in homebrew devkits, but technically it is possible.
And now a "little" explanation for those that don't know what I'm talking about but want to know.
As most know, analogue TV works in interlaced mode, but not everyone knows or realizes how interlacing is achieved (of course I'm not talking about tech people here). You might think it's some complex thing, or maybe even a fixed feature of your TV, but actually, it's nothing like that.
Picture information in analog video consists, roughly, of one continuous "line" of video information (whether it is transmitted as RGB, YPbPr, S-Video or Composite video). It doesn't even have pixels; it's just a stream of continuous intensity variations. To make this line look like a 2D picture on your CRT, you need a couple more signals: vertical and horizontal synchronization.
What these two signals do is "chop" this line into segments and arrange them one after another from top to bottom of the screen. They do so by deflecting the electron beam the CRT's gun shoots towards the inner side of the screen surface, so it travels horizontally from side to side and from top to bottom.
These two signals are SAWTOOTH waves.
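As a rough sketch (my own illustrative model, not tied to any real video standard's timing), a sawtooth can be described as a value that ramps linearly over one period and then snaps back to the start:

```python
# Minimal sawtooth model (illustrative only): the value ramps linearly
# from 0 to 1 over one period, then snaps back to 0 for the next sweep.
def sawtooth(t, period):
    return (t % period) / period

# The horizontal sweep is fast (one ramp per scanline) and the vertical
# sweep is slow (one ramp per field); here we sample one vertical ramp.
samples = [sawtooth(t, period=1.0) for t in (0.0, 0.25, 0.5, 0.75)]
print(samples)  # [0.0, 0.25, 0.5, 0.75]
```

The ramp portion is what drags the beam across (or down) the screen; the snap-back is the retrace.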
Now let's concentrate on the vertical sync signal. It controls the vertical position the electron beam is at each moment: the higher this signal is, the higher (or lower, depending on the specification) the beam will point.
When you are sending an interlaced signal, the peak and bottom of this wave are shifted a little between fields.
In PAL (50 Hz), since it uses 625 lines (of which 576 are today considered the visible portion), we could say this signal has 625 possible voltage values (not counting the intermediate values between them).
Each "hertz" of an interlaced video mode is called a field. There's the odd field and the even field, and each of them consists of every other line. The odd field contains lines/positions 1, 3, 5, 7... of the scale we have defined, while the even field consists of 2, 4, 6, 8...
The sync signals are not merely "informative"; they actually DRIVE the electron gun. So the voltage you supply at any given moment defines the region of the screen the beam will be aiming at. If position 1 of the sync signal is 0 volts and position 625 is 1 volt, then each position is 1/625 volts higher than the previous (it could be the other way around). 1/625 = 0.0016, so in the setup we have defined, each "step" would be 16 ten-thousandths of a volt.
For the odd field, the vertical sawtooth wave would go from 0.0000 to 0.9984, and when displaying the even field, it would go from 0.0016 to 1.0000. The result is that these half frames, the fields, get displayed at alternating heights on the physical screen, and voilà, we have an interlaced picture.
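Using the made-up 0-to-1-volt scale from above (my illustration, not a real standard's voltages), the half-step offset between the two fields can be sketched like this:

```python
LINES = 625        # line positions on the 0..1 V scale from the text
STEP = 1 / LINES   # ~0.0016 V per position

def field_voltages(offset_steps, lines_per_field=LINES // 2):
    """Voltages the vertical sweep passes through for one field:
    offset_steps = 0 for the odd field, 1 for the even field.
    Consecutive lines of one field are two positions apart."""
    return [(2 * n + offset_steps) * STEP for n in range(lines_per_field)]

odd = field_voltages(0)   # 0.0000, 0.0032, 0.0064, ...
even = field_voltages(1)  # 0.0016, 0.0048, 0.0080, ...

# Every even-field line sits one step (half a line pitch) below the
# corresponding odd-field line, so the two fields interleave on screen.
print(round(even[0] - odd[0], 4))  # 0.0016
```

The constant one-step shift between the two lists is the whole trick behind interlacing.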
But what if the vertical sync signal went between the SAME values for EVERY field?
If you haven't guessed yet, this results in the lines of each field being displayed at the same height as the lines of the previous one. The concept of odd and even fields disappears... even the very concept of fields disappears, since they become full frames, and you get a PROGRESSIVE picture with half the resolution of the "normal" interlaced mode.
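Continuing the same made-up scale, the progressive trick amounts to dropping the per-field offset, so every sweep retraces exactly the same heights:

```python
LINES = 625
STEP = 1 / LINES

def progressive_frame(lines_per_frame=LINES // 2):
    # Every sweep starts at the same voltage: no half-line offset, so
    # each new frame's lines land exactly on the previous frame's lines.
    return [2 * n * STEP for n in range(lines_per_frame)]

frame_a = progressive_frame()
frame_b = progressive_frame()
print(frame_a == frame_b)  # True: the "fields" are now identical frames
```

Since consecutive sweeps hit identical positions, each one is a complete (half-height) frame, which is exactly the 240p/288p case.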
Realizing this was rather cool for me, because I was under the (wrong!) impression that classic CRT TVs somehow IMPOSED interlacing, but it's all in the sync signals, and consoles and TV-connected home computers have been doing this LDTV "hack" for 30 years or more.
All consoles up to the N64 produced 240p/288p most of the time. Some games included higher (VERTICAL) resolution video modes for some parts of the game or for the entirety of it (like Turok 2 hi-rez, but not PD hi-rez, which only increases horizontal resolution, not vertical), but this was costly at the time and these modes usually caused slowdowns, so they were only used in menus and other things. Another drawback of interlaced mode is that it introduces flicker not present in LDTV resolutions, since those are progressive. As an interesting note, apparently some SNES games, I don't know which, feature sections programmed in the high-resolution interlaced video mode, but the hardware imposed extra limitations when using this mode, so it was not used for fast games.
Interestingly, LDTV modes were never allowed for broadcasts, although there was no technical limitation preventing it. I guess it has to do with TV's original intent of transmitting 30 or 25 frames per second using the pairs of even and odd fields as halves of one frame, instead of 60 or 50 independent frames, as is often done today.