Am I messing with this Progressive Scan stuff for no reason?

So, if my fancy-pants HDTV has a built-in de-interlacer, that means I can stop messing with the component cables on my PS2 to get Shadow of the Colossus’s progressive scan mode working, right?

The TV has a 480i input and a 480p input. The 480p input MUST have a progressive source, or the screen is messed up. So I can play Colossus, but nothing else – even the Sony menus and loading screens are garbled. It doesn't right itself until, presumably, the game loads its settings from the memory card.

However, I looked in the manual and it says the TV can deinterlace the signal on the 480i component inputs. What does this mean? Where am I? You're not my daughter.

Honestly, for me the PS2's progressive scan mode has always been a little pissy, though I have never had the problem you're describing with any PS2 progressive scan games (Beyond Good & Evil, SOCOM 1 and 2, GT4).

Honestly, it kinda sounds like a TV issue rather than a PS2 one. You did set the progressive scan option in the PS2 startup menu, though, right?

Sorry I am not much help and good luck!

The easiest test is probably the eyeball one. Try loading the game via both inputs on your set, and see if one looks better than the other to you. If you can’t tell by looking, then it’s probably not worth worrying about. Just use the input that lets you see the menus as well as the game.

Your set's component inputs are separated between 480p and 480i? Is the 480p input just 480p, or does it also take 720p and 1080i? The manual saying the TV can deinterlace signals on the 480i input just means that it uses its own scaler for interlaced sources; all HDTVs do that. But that doesn't mean much for video games: unless you have a really high-end set with a fantastic scaler, it's better to let the PS2 send out a progressive picture itself than to let the TV deinterlace. I've used my Xbox on some extremely expensive TVs, and it always looked better sending out its own 480p, 720p, or 1080i picture than when it was hooked up over S-Video and the TV did the deinterlacing.

It has one set of component inputs specifically for 480i and one set which can take 480p or 1080i. There are various other non-component inputs (S-Video 480i, DVI 1080i, coax lol).

However, if something is plugged into the 480i inputs, an extra option appears in the on-screen menu, asking if I want this input treated as Interlaced or Progressive. My current belief is that the TV has a built-in deinterlacer to accomplish this.

If I plug into the 480p inputs, no such option appears. However, the screen flashes “progressive” momentarily when that channel is switched on.

I haven’t eyeballed anything yet, but because my TV is too dumb to sensibly autodetect, testing each one would require unplugging and replugging inputs and so on, and I can’t be bothered if it’s all the TV equivalent of buying gold-tipped speaker cable and all that nonsense.

Looking it up, the consensus seems to be that there is no way of telling whether your output device or your TV has the better scaler without eyeballing it yourself. One site, arguing against buying progressive scan DVD players, said that almost nothing actually outputs true progressive anyway; the players just have built-in scalers, so you may as well let the TV do the legwork and buy the cheaper player.

Well, yes, eyeballing it is pretty much the best way to tell. But I will say that the PS2 and Xbox on my friend's 62" Mitsubishi DLP both look better when the systems and games are set to run in progressive mode. If they're just running interlaced, the TV's scaler will improve the picture as much as it can, but they definitely look better when running in whatever progressive or high-def mode they can output. 480p games don't look loads better than ones that just run in 480i, but the difference is there, and of course 720p and 1080i games look quite a bit better.

Unfortunately, it’s more complex than that.

/jargon warning

For movies on DVD, it's an issue of doing IVTC (inverse telecine) to take your (usually) 3:2 pulled-down 60i video (from a 24p film source) and reconstructing the original 24p. Given how poorly DVD field flagging has been done, it's generally a matter of looking at the fields, making a best guess at the cadence, and doing the reconstruction. Both the TV and the DVD player are capable of that; however, it's hard to find a device that handles all sources well. That this is even an issue is one of the biggest tech cock-ups of the whole DVD era.
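To make the cadence concrete, here's a toy Python sketch of 3:2 pulldown and IVTC. The function names and the label-based "frames" are made up for illustration; a real deinterlacer has to compare actual field contents to spot the repeats, since, as above, the flags can't be trusted.

```python
# Toy model of 3:2 pulldown and inverse telecine (IVTC).
# Film frames are just labels here; a real deinterlacer compares
# pixel data between fields to find the repeated ones, since DVD
# field flags are so often wrong.

def pulldown_32(frames_24p):
    """Spread 24p film frames across 60i fields in a 2,3,2,3 cadence."""
    fields = []
    for i, frame in enumerate(frames_24p):
        copies = 2 if i % 2 == 0 else 3   # alternate 2 fields, then 3
        for _ in range(copies):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

def ivtc(fields_60i):
    """Reconstruct the 24p frames by collapsing the repeated fields."""
    frames, last = [], None
    for frame, _parity in fields_60i:
        if frame != last:
            frames.append(frame)
            last = frame
    return frames

film = ["A", "B", "C", "D"]   # four 24p frames
tape = pulldown_32(film)      # ten 60i fields: AA BBB CC DDD
print(ivtc(tape))             # ['A', 'B', 'C', 'D'] -- 24p recovered
```

Four frames become ten fields, which is exactly the 24 fps to 60 fields/sec ratio; collapsing the repeats gets the film frames back.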

For video games you are starting with something that is likely 60p or 30p. If it's 60p and you output 60i, you've thrown away half the data, and there's nothing the TV can do to make that up. It can try to reconstruct 60p by vertical interpolation (but that will just look a bit smoother than plain line doubling) or, god help you, it could try to reconstruct 30p by combining pairs of fields, which will look like crap, since the paired fields come from different moments in time.
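Here's a sketch of that data loss in the same toy style (made-up function names, scanlines modeled as plain numbers): interlacing a 60p source keeps only alternating lines of each frame, and a "bob" deinterlacer can only guess the missing lines back, never recover them.

```python
# Why 60p -> 60i is lossy: each frame is a list of scanline values.
# Interlacing keeps even lines from even frames and odd lines from
# odd frames; "bob" deinterlacing then has to invent the missing
# lines, here by repeating the nearest kept neighbour.

def to_60i(frames_60p):
    """Throw away half of each frame, alternating fields."""
    return [[line for j, line in enumerate(frame) if j % 2 == i % 2]
            for i, frame in enumerate(frames_60p)]

def bob(field, parity, height):
    """Rebuild a full frame from a single field by duplicating lines
    (real sets interpolate, but the missing lines are still guesses)."""
    frame = [None] * height
    for j, line in enumerate(field):
        frame[2 * j + parity] = line
    for j in range(height):
        if frame[j] is None:
            frame[j] = frame[j - 1] if j > 0 else frame[1]
    return frame

f0, f1 = [10, 11, 12, 13], [20, 21, 22, 23]   # two 60p frames
fields = to_60i([f0, f1])
print(fields)                 # [[10, 12], [21, 23]] -- half the lines gone
print(bob(fields[0], 0, 4))   # [10, 10, 12, 12] -- 11 and 13 are lost for good
```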

For a 30p source, it's possible that the TV can reconstruct the original frames from 60i (adding some slight latency, of course), but I don't know whether TVs will do that in practice.
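That reconstruction is usually called "weave" deinterlacing, and for a true 30p source it's lossless, since both fields of each 60i pair come from the same instant. A quick sketch, using the same toy scanline model as above:

```python
# Weave deinterlacing: for a 30p source sent as 60i, each frame was
# split into two successive fields, so pairing them back up restores
# the original exactly -- at the cost of buffering one field, which
# is where the slight latency comes from.

def frame_to_fields(frame):
    """Split one 30p frame into its top and bottom fields."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Interleave matching top/bottom fields into a full frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame += [t, b]
    return frame

original = [10, 11, 12, 13]            # one 30p frame, four scanlines
top, bottom = frame_to_fields(original)
print(weave(top, bottom) == original)  # True -- nothing was lost
```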

From a practical standpoint, a still image in 480p and in 480i isn't going to look that different. On a TV that can actually display 480p or 480i natively (i.e., a good CRT), the progressive image will look more solid.

Of course, there’s always the question of what your TV does with 480i.

And I understood it all, which means I have already wasted too much of my life learning the inconsequential details that explain the inconsequential differences.

Seriously, thanks. That made me see the pitfalls of deinterlacing much better, especially with regard to how data is lost :)