Re: Cloning the GameCube component cable
Posted: Wed Jul 22, 2015 1:04 am
Posted a video of the Shuriken Video V1 in use: https://www.youtube.com/watch?v=rMpSMagFTyI
Gamecube/Wii support & news forums
https://www.gc-forever.com/forums/
megalomaniac wrote:
Here is a quote from the very first post:

Unseen wrote:
The board can output analog RGB (both CSync and H/V-Sync available) or Component video.

Here is another from page 3:

meneerbeer wrote:
If you use the RGB output from the board and the game is 480p, can you then display it on a computer monitor with VGA?

Unseen wrote:
Yes
RGBHV = VGA compatible
Streetwalker wrote:
It has to be converted to RGB by the display anyway; using these luma/chroma-based colorspaces only allows reducing the bandwidth for composite and S-Video by chroma subsampling, and doesn't really make sense once you use better cables.

Not true at all: HDMI supports the YUV (YPbPr) colorspace. Nearly all Blu-ray players use it, so it's very widely accepted.
andre104623 wrote:
Ok, some more time with it on the TV reveals that when you change video modes, like starting a game in 480i, the game asks if you want progressive scan and you press yes; when it makes the change from i to p, sometimes the picture either won't recover or comes back with artifacts. Other times it's fine. Need to play around with it more.

My guess is that your wiring is longer than in the Shuriken video. This problem sounds a lot like the one I had with my own board. I changed the code a bit, which I described in the thread; I think that will fix your problem. I could compile a new version for you, but I will have access to ISE this weekend at the earliest, since my good PC is at my parents'.
andre104623 wrote:
Ok, some more time with it on the TV reveals that when you change video modes, like starting a game in 480i, the game asks if you want progressive scan and you press yes; when it makes the change from i to p, sometimes the picture either won't recover or comes back with artifacts. Other times it's fine. Need to play around with it more.

I suspect this is because I had a workaround for this problem in the non-OSD versions that I planned to move to the software side, but then forgot to add there. You could try an older non-OSD version to check if it helps, although I'm planning to fix this in a different way.
meneerbeer wrote:
If the fix works, then perhaps Unseen can add it to the Github?

If I can find it... I tend to ignore patches whose existence I'm not aware of. ;)
Unseen wrote:
If I can find it... I tend to ignore patches whose existence I'm not aware of.

I will upload my modified file here this weekend, I guess (I do not have it right now), together with a modified bitstream for Andre so he can see if it works.
meneerbeer wrote:
In case you want to try it before then: basically, in the gcdv_decoder you need to store the VDATA and CSEL inputs in a register at the rising edge, for instance as VDATA_BUF and CSEL_BUF. Then you change the rest of the VDATA and CSEL references to VDATA_BUF and CSEL_BUF. That way the path your input needs to travel is a bit shorter. This has solved all the instability problems I was having with 480p.

Ah, that sounds reasonable. I thought I could get away without registering the external inputs because in theory the sampling point (falling edge of Clock54) is during a "quiet" time in the signal, but it certainly wouldn't hurt anything. Thanks for the hint, I'll add it in the next release.
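meneerbeer's suggestion is essentially an input register stage. A minimal sketch of the idea (signal widths and surrounding declarations are assumptions here, not GCVideo's actual code):

```vhdl
-- hypothetical sketch: latch the external inputs once before use, so the
-- combinational path from the pads into the decoder logic gets shorter
signal VDATA_BUF : std_logic_vector(7 downto 0); -- width assumed
signal CSEL_BUF  : std_logic;

process(Clock54)
begin
    if rising_edge(Clock54) then
        VDATA_BUF <= VDATA;
        CSEL_BUF  <= CSEL;
    end if;
end process;

-- the rest of gcdv_decoder then reads VDATA_BUF/CSEL_BUF instead of
-- the raw VDATA/CSEL pins
```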
meneerbeer wrote:
Good stuff! The 5V is used as an HDMI detect by the TV. I know that my parents' TV needs this signal; my own TV just needs ground and all TMDS pairs and that is it. I guess the 100 Ohm resistor was the problem then.

Wiring is very short, maybe 1/4 inch, or for you guys overseas maybe 1.5 cm or less. I placed it inside the Cube this time.
The fix works very well for me. I have been playing with it the whole weekend and not a single problem occurred.
If the fix works, then perhaps Unseen can add it to the Github?
meneerbeer wrote:
Unseen, did you have a look at how well the GC's video output matches the CEA-861 spec?

Not down to the last pixel; the details may depend on the game that is running.
Quote:
Is the hsync low when it is active?

The bit in the flag byte sent by the GC appears to use 1 for "sync active" all the time, but the top-level module translates this to low-active HSync and VSync.
Quote:
Basically I need GCVideo's output for 480p to be fully compliant with CEA-861, so an hsync length of 62 pixels, a front porch of 16 pixels, etc.

If the timing sent by the Cube is not sufficiently compliant, you could try to modify it; the blanking regenerator module does this to extend the active area to a full 720xwhatever.
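For reference, the CEA-861 480p timing meneerbeer mentions could be collected in one place; a sketch with the standard 720x480p values (the constant names are made up):

```vhdl
-- CEA-861 720x480p @ 59.94/60 Hz, 27 MHz pixel clock (names assumed)
constant H_ACTIVE      : natural := 720;
constant H_FRONT_PORCH : natural := 16;
constant H_SYNC_LEN    : natural := 62;
constant H_BACK_PORCH  : natural := 60;  -- 858 pixels per line in total
constant V_ACTIVE      : natural := 480;
constant V_FRONT_PORCH : natural := 9;
constant V_SYNC_LEN    : natural := 6;
constant V_BACK_PORCH  : natural := 30;  -- 525 lines per frame in total
-- both sync pulses are low-active in this mode
```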
Quote:
I have not been able to check how well the video output matches the specifications. Chipscope seems to suck really badly, and hooking up my GC to an Altera FPGA for SignalTap is a bit troublesome.

It may be easier to measure this with a few counters directly on the FPGA and use the existing OSD to show the results. ZPUVideoInterface.vhd already measures the size of the active area that the GC generates.
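A measurement counter like the one suggested could look roughly like this; the signal names (HSync assumed low-active, PixelEna as the pixel clock enable) are assumptions, not GCVideo's actual code:

```vhdl
-- hypothetical sketch: count how many pixels the HSync pulse stays low
-- and latch the result so the OSD code can display it
signal hsync_count : unsigned(11 downto 0) := (others => '0');
signal hsync_len   : unsigned(11 downto 0) := (others => '0');
signal prev_hsync  : std_logic := '1';

process(Clock54)
begin
    if rising_edge(Clock54) then
        if PixelEna = '1' then
            prev_hsync <= HSync;
            if HSync = '0' then
                hsync_count <= hsync_count + 1;
            else
                if prev_hsync = '0' then      -- pulse just ended
                    hsync_len <= hsync_count; -- latch width for the OSD
                end if;
                hsync_count <= (others => '0');
            end if;
        end if;
    end if;
end process;
```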
meneerbeer wrote:
I see, I am using Charcole's hdmidirect code, which I modified to take video input and then mix the incoming audio + video into an HDMI signal. The code seems to fail miserably when doing simulations.

I haven't been able to simulate that code either, but there is another Spartan 3A HDMI project out there which worked for me both in simulation and on actual hardware. I think its implementation is a bit cleaner too.
Quote:
I just tried to generate my own video signal with 480p CEA-861 timings, plugged into the modified hdmidirect code, and I now have HDMI with sound working on a Xilinx FPGA, so it seems it is actually synthesized correctly. It does not work with GCVideo however, so I think the video timings are different for 480p. :(

Could be, although I hope that many displays do not demand a timing that is "CEA-compliant down to the very last pixel". Did you pass the pixel clock enable signal to the encoder? GCVideo always runs at 54 MHz internally, but pixels from the Cube arrive at either 27 or 13.5 MHz depending on the video mode; since 13.5 MHz is too low for DVI/HDMI, the DVI encoder is fed a synthetic enable signal that results in a 27 MHz pixel clock for all video modes, effectively doubling the horizontal resolution for 15 kHz modes.
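The scheme Unseen describes (one 54 MHz clock plus an enable instead of separate pixel clocks) can be illustrated with a toggle divider. This is only a simplified model of the idea, since in GCVideo the enable is tied to the Cube's data stream rather than free-running:

```vhdl
-- hypothetical sketch: a single 54 MHz domain with a clock enable that
-- marks every second cycle, giving an effective 27 MHz pixel rate
signal div2       : std_logic := '0';
signal PixelEna27 : std_logic;

process(Clock54)
begin
    if rising_edge(Clock54) then
        div2 <= not div2;
    end if;
end process;

PixelEna27 <= div2;
-- the DVI encoder only advances when PixelEna27 = '1'; in 15 kHz modes
-- each 13.5 MHz Cube pixel is simply sent twice at this 27 MHz rate
```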
Quote:
Oh, uh, do not bother to help too much with this if you do not feel like it.

I like it when other people reduce my workload. ;)
Quote:
As you are working on this yourself, most likely with a way cleaner implementation, there is not really much of a point, I guess. Right now I am just too lazy to write the packet generation in VHDL myself.

We'll see... My initial code to send just one infoframe per video frame (to get things right before making it more complicated) was rather messy; so messy, in fact, that it was mis-synthesized, but figuring that out took a few weeks and a second FPGA running the HDMI receiver part of bunnie's NeTV to check the actual data on the wire. The new version of that code is now much shorter and cleaner, but now I'm thinking about writing a microcode assembler, because editing sequencer outputs in VHDL is tedious:

Code: Select all
case address is
-- preamble
-- enc c3c0 bt4 shft hmo dmo ldp 1st done
when 0 => data <= "010" & "0101" & "0" & "0" & "1" & "1" & "1" & "1" & "0";
when 1 => data <= "010" & "0101" & "0" & "0" & "1" & "1" & "0" & "1" & "0";
when 2 => data <= "010" & "0101" & "0" & "0" & "1" & "1" & "0" & "1" & "0";
when 3 => data <= "010" & "0101" & "0" & "0" & "1" & "1" & "0" & "1" & "0";
Unseen wrote:
Could be, although I hope that many displays do not demand a "CEA-compliant down to the very last pixel" timing. Did you pass the pixel clock enable signal to the encoder? GCVideo always runs at 54 MHz internally, but pixels from the Cube arrive at either 27 or 13.5 MHz depending on the video mode; since 13.5 MHz is too low for DVI/HDMI, the DVI encoder is fed a synthetic enable signal that results in a 27 MHz pixel clock for all video modes, effectively doubling the horizontal resolution for 15 kHz modes.

Yes, I passed the clock enable signal. At first I thought the clock enable signal was over-engineering things a bit, but it actually makes sense now. I guess it saves on PLL outputs. I have to say, your code is really nice to work with and hack things into!
Unseen wrote:
The new version of that code is now much shorter and cleaner, but now I'm thinking about writing a microcode assembler, because editing sequencer outputs in VHDL is tedious:

Code: Select all

case address is
    -- preamble
    --                 enc     c3c0    bt4   shft  hmo   dmo   ldp   1st  done
    when 0 => data <= "010" & "0101" & "0" & "0" & "1" & "1" & "1" & "1" & "0";
    when 1 => data <= "010" & "0101" & "0" & "0" & "1" & "1" & "0" & "1" & "0";
    when 2 => data <= "010" & "0101" & "0" & "0" & "1" & "1" & "0" & "1" & "0";
    when 3 => data <= "010" & "0101" & "0" & "0" & "1" & "1" & "0" & "1" & "0";

Interesting. I do not completely understand yet how that code would look, but I guess that should also be a lot easier to synthesize.
meneerbeer wrote:
At first I thought the clock enable signal was over-engineering things a bit, but it actually makes sense now. I guess it saves on PLL outputs.

Actually, I'm just scared of working with multiple clock domains. =)
Quote:
I have something working now!

Cool!
meneerbeer wrote:
Interesting. I do not completely understand yet how that code would look, but I guess that should also be a lot easier to synthesize.

It's basically the list of desired outputs for each pixel. A single data packet with all the surrounding bits is always 44 pixels long (8 preamble, 2 guard, 32 data, 2 guard), and the only thing that changes between different packets is the transmitted data, so it can be modeled as a counter that steps linearly through a ROM whose outputs control what should happen on the line (e.g. "set encoder mode to TERC4", "shift the packet data by one bit", "switch from calculating the ECC to transmitting it"). The HDMIDirect code mostly does this by checking for specific horizontal pixel positions; the table-driven approach just needs a counter that is started at some point and stops when the "done" output is active. It's similar to a microcoded CPU, and it will probably save some resources because the table can be synthesized as a Block RAM.
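The counter side of that table-driven sequencer might be sketched like this (StartPacket, PixelEna and the exact widths are assumptions; "data" is the ROM output shown in the earlier snippet, with the "done" column as its rightmost bit):

```vhdl
-- hypothetical sketch: step linearly through the 44-entry control ROM
-- once a packet transmission is started, and stop when "done" is set
signal address : unsigned(5 downto 0) := (others => '0');
signal running : std_logic := '0';

process(Clock54)
begin
    if rising_edge(Clock54) then
        if PixelEna = '1' then
            if StartPacket = '1' then
                address <= (others => '0');
                running <= '1';
            elsif running = '1' then
                if data(0) = '1' then        -- "done" column of the ROM
                    running <= '0';
                else
                    address <= address + 1;
                end if;
            end if;
        end if;
    end if;
end process;
-- the other bits of "data" directly drive the encoder mode, shift
-- enables etc. for each of the 44 pixels of a data packet
```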