Per Game RT4K setting
Posted: Fri Aug 02, 2024 9:00 am
by retro
Not sure if this is possible or not.
I have been having problems with the Star Wars: Rogue Squadron games when the "optimize display for RetroTink 4K" option is enabled in Swiss.
This is especially apparent in Rogue Squadron II, where the display goes haywire.
I bet this has something to do with these two games constantly changing resolutions.
I haven't tested many other GameCube games, so I'm not sure whether this affects other titles too.
So the question is: is it possible to toggle this option on and off on a per-game basis?
Re: Per Game RT4K setting
Posted: Fri Aug 02, 2024 2:58 pm
by Extrems
This issue has already been addressed in master and will be fixed in the next release.
Re: Per Game RT4K setting
Posted: Sun Aug 04, 2024 12:05 am
by Lumina333
When exactly do those games change resolutions anyway?
Re: Per Game RT4K setting
Posted: Mon Aug 05, 2024 2:42 am
by retro
Extrems wrote: ↑Fri Aug 02, 2024 2:58 pm
This issue has already been addressed in master and will be fixed in the next release.
Awesome! Many thanks!
Re: Per Game RT4K setting
Posted: Mon Aug 05, 2024 2:44 am
by retro
Lumina333 wrote: ↑Sun Aug 04, 2024 12:05 am
When exactly do those games change resolutions anyway?
I believe it's only noticeable if you're using a GCVideo HDMI dongle like the Carby or GCHD.
I noticed the resolution changes between cutscenes (e.g. the X-wings approaching the Death Star, or every time you die) and gameplay.
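For the curious, a mode switch like that boils down to reprogramming the console's Video Interface. Here's roughly what it looks like in homebrew (libogc) terms - the VIDEO_* calls and the GXRModeObj presets are the real libogc ones, but the helper function and when a game would call it are my own illustration; retail games use Nintendo's SDK, not libogc:

Code: Select all

#include <gccore.h>

/* Hypothetical helper: how a game might jump between, say, an
 * interlaced gameplay mode and a different cutscene mode. */
static void switch_video_mode(GXRModeObj *mode, void *xfb)
{
    VIDEO_Configure(mode);           /* reprogram the Video Interface timing */
    VIDEO_SetNextFramebuffer(xfb);   /* point the VI at a matching framebuffer */
    VIDEO_Flush();                   /* latch the new register values */
    VIDEO_WaitVSync();               /* the change lands on the next field */
    if (mode->viTVMode & VI_NON_INTERLACE)
        VIDEO_WaitVSync();           /* standard extra wait for non-interlaced modes */
}

Every one of those VIDEO_Configure calls changes the signal timing on the wire, which is exactly what GCVideo and the RT4K downstream have to re-lock to.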
Re: Per Game RT4K setting
Posted: Mon Sep 02, 2024 12:02 pm
by N7Kopper
retro wrote: ↑Mon Aug 05, 2024 2:44 am
Lumina333 wrote: ↑Sun Aug 04, 2024 12:05 am
When exactly do those games change resolutions anyway?
I believe it's only noticeable if you're using a GCVideo HDMI dongle like the Carby or GCHD.
I noticed the resolution changes between cutscenes (e.g. the X-wings approaching the Death Star, or every time you die) and gameplay.
In short, standard-definition CRTs are dumb and just try to display whatever gets fed to them, even if it's off-spec. (This is in fact where 240p and 288p come from.) While some game developers in the Atari days pushed the envelope so far that some TVs weren't compatible anymore, in general you could throw any old nonsense at the TV and it wouldn't bother to check whether it made sense. In an age of standardisation, this doesn't fly so well, so these modes need to be accounted for - and that accounting takes time. Time that gamers often don't have. That's why any even remotely modern system outputs at a consistent resolution and handles internal resolution changes internally.
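To make the "accounting takes time" part concrete: a CRT just chases the incoming sync pulses, while a digital scaler has to watch the timing for at least a frame or two, classify it, and then reconfigure its whole pipeline. A toy model of that classification step (every name and number below is invented for illustration, not taken from any real scaler firmware):

Code: Select all

#include <stdio.h>

/* Toy model: a scaler can only identify the signal after counting
 * sync timing over a full frame - during a mode change it is blind. */
typedef struct {
    int lines_per_frame;   /* lines counted between vertical syncs */
    int interlaced;        /* half-line vsync offset detected? */
} SignalTiming;

static const char *classify(SignalTiming t)
{
    if (!t.interlaced && (t.lines_per_frame == 262 || t.lines_per_frame == 263))
        return "240p";
    if (!t.interlaced && (t.lines_per_frame == 312 || t.lines_per_frame == 313))
        return "288p";
    if (t.interlaced && t.lines_per_frame == 525)
        return "480i";
    if (t.interlaced && t.lines_per_frame == 625)
        return "576i";
    return "unknown - keep measuring, screen stays blank";
}

int main(void)
{
    /* arbitrary example timings, not what Rogue Squadron actually uses */
    SignalTiming before = { 263, 0 };
    SignalTiming after  = { 525, 1 };
    printf("%s -> %s\n", classify(before), classify(after));
    return 0;
}

A CRT never runs anything like classify() - its deflection circuits simply follow the sync pulses, spec or no spec.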
Re: Per Game RT4K setting
Posted: Sat Oct 19, 2024 1:04 pm
by Papy.G
No need for a per-game setting to address that: just set it globally, since you'll be using the RT4K for all your games.
Unless, that is, you play some games on a CRT and SWRSII on another display through a GCVideo device and the RT4K.
N7Kopper wrote: ↑Mon Sep 02, 2024 12:02 pm
In short, standard-definition CRTs are dumb and just try to display whatever gets fed to them, even if it's off-spec. (This is in fact where 240p and 288p come from.) While some game developers in the Atari days pushed the envelope so far that some TVs weren't compatible anymore, in general you could throw any old nonsense at the TV and it wouldn't bother to check whether it made sense. In an age of standardisation, this doesn't fly so well, so these modes need to be accounted for - and that accounting takes time. Time that gamers often don't have. That's why any even remotely modern system outputs at a consistent resolution and handles internal resolution changes internally.
They're not exactly trying to display whatever… It's more that the beam is directly controlled by the console's video generator, with only analog, near-instant signal processing. Newer TVs have to "know" and recognize a given format in order to adapt it to the panel, and that takes some time, while CRTs supported changing formats seamlessly from frame to frame - and in some cases even within a frame.
The GameCube itself handles internal resolutions and output scaling, and that doesn't prevent such format changes. Handling resolution internally isn't the best way to achieve the best final picture, as we've seen from late developments and hacks, but it's an easy way for manufacturers to cheat and lie about a game's real resolution (broadcasters often use this technique nowadays too).
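To illustrate that internal-resolution-vs-output point: on the GameCube the framebuffer width and the displayed width are separate fields of the video mode, and the VI scales between them horizontally. A rough libogc-style sketch - GXRModeObj, its fields, and the VIDEO_* calls are real libogc, but the function and the 512/640 split are just an example, not what any particular retail game does:

Code: Select all

#include <gccore.h>

static GXRModeObj mode;

/* Hypothetical example: render narrower than you display and let the
 * Video Interface scale the picture back out on the way to the TV. */
void use_narrow_framebuffer(void)
{
    mode = TVNtsc480IntDf;   /* start from a stock 480i preset */
    mode.fbWidth = 512;      /* internal framebuffer: 512 pixels per line */
    mode.viWidth = 640;      /* VI output: still a full-width picture */
    VIDEO_Configure(&mode);  /* VI now scales 512 -> 640 horizontally */
    VIDEO_Flush();
}

The TV only ever sees the full-width signal, which is why the "real" rendering resolution can be lower than whatever the back of the box claims.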