So I’ve been polishing off my MAME cabinet, and the last move was getting a nice big TV set to replace the little 15" computer monitor I’d been using to test the system.
In the store, there wasn’t a lot of choice. My video card has VGA, DVI and S-Video out. My options, then, are straightforward:
I can either get a cheapo TV with only coax in, and have an RF modulator convert the S-Video signal to coax.
I can get a TV with S-Video in, and just run S-Video from the card to the TV.
And that’s really it: I couldn’t find an RF modulator that converted S-Video to composite, which is the only other real choice. My VGA out seems pretty useless for hooking up to a TV, and the DVI would only be useful if I dropped hundreds and hundreds on an HDTV. No.
I plumped for the S-Video, and it looks rather poorer than I expected. I’ve hooked S-Video up before for TV-out, and remember it looking bad, but I just assumed I had configured things wrong.
It looks considerably worse than, say, my Xbox or PS2 over S-Video. Is there a reason for this? Everything from 640x480 to 1024x768 produces nearly unreadable text.
It’s not so bad it can’t be read, but why so bad? Even 640x480 is annoying. It makes me wonder how horrible coax would have looked!
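Part of the answer may just be arithmetic. An NTSC TV only has roughly 480 visible scanlines, so any desktop resolution taller than that has to be squeezed down by the TV-out encoder before it ever reaches the screen. A quick back-of-envelope sketch (the ~480-line figure is the standard NTSC visible-line count; the resolutions are just the ones I tried):

```python
# Rough estimate of how much vertical detail survives when a TV-out
# encoder scales a desktop resolution down to NTSC's ~480 visible
# scanlines. Interlacing and the encoder's flicker filter blur things
# further, so this is an optimistic upper bound.
NTSC_VISIBLE_LINES = 480

def vertical_survival(desktop_height):
    """Fraction of desktop scanlines the TV can actually display."""
    return min(1.0, NTSC_VISIBLE_LINES / desktop_height)

for w, h in [(640, 480), (800, 600), (1024, 768)]:
    print(f"{w}x{h}: about {vertical_survival(h):.0%} of lines survive")
```

At 1024x768 only about five-eighths of the scanlines can survive, so small desktop text gets mangled before interlacing and the flicker filter even have their say. It doesn’t explain why 640x480, which fits the line count, still looks annoying; that part is down to the encoder itself.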
At the end of the day, I’m going to be running stuff like the god damn New Zealand Story on it, so it won’t matter much. But the “why is this worse than console S-Video?” is getting to me a little. My suspicion is that the TV-out controller on my card (Radeon 8500) is somewhere between “meh” and “Bah!” on the quality scale.