Win7 - resets resolution every time I reboot

Almost every time I boot (randomly, but more often than not), my computer resets the resolution to 1024x768. Even more disturbing, sometimes when it does this I can't select my monitor's native resolution (1680x1050) no matter what I do, neither from the desktop nor from the Catalyst Control Center. I've tried everything, including re-detecting the monitor, and it still won't let me.

I also notice that Windows recognizes my monitor as an unknown non-PnP monitor. I have a Dell 2007W or some such monitor, and there are no Win7 drivers for it on the Dell site.

The only way I can get my monitor at its native resolution is to reinstall the Catalyst drivers. Then it magically finds the native resolution, and it also correctly identifies my monitor by its exact model number (!). After a reboot, all of this randomly gets reset to an unknown non-PnP monitor with no option to select the native resolution.

Any ideas on why this is happening?

The computer is only a month old:

i5 700 quad core, 2.66 GHz
Radeon 5770, 1 GB, latest Catalyst drivers
4 GB RAM
Win7 Home Premium

Seems like the video card isn’t getting the proper EDID from your monitor. I’ve heard there are ways to force it to use a specific EDID, but never tried it myself. Something is probably wrong either with your monitor or your card.
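
For what it's worth, Windows caches whatever EDID it last read from the monitor in the registry, under HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY. If you want to check whether an EDID is actually there after one of the bad boots, here's a rough Python sketch (untested, just the general idea) that walks those keys and, where an EDID block is present, decodes the native resolution from its first detailed timing descriptor:

```python
# Rough diagnostic sketch (assumes Windows + Python 3 with the standard
# winreg module): walk the cached monitor entries under
# HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY and decode the native
# resolution from the first detailed timing descriptor of each EDID block.
import winreg

DISPLAY_PATH = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def subkeys(key):
    """Yield the names of all subkeys of an open registry key."""
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
        except OSError:
            return
        i += 1

def native_resolution(edid):
    """Parse the first detailed timing descriptor (bytes 54-71) of an EDID."""
    dtd = edid[54:72]
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return h_active, v_active

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_PATH) as display:
    for model in subkeys(display):
        with winreg.OpenKey(display, model) as model_key:
            for instance in subkeys(model_key):
                try:
                    params = winreg.OpenKey(model_key, instance + r"\Device Parameters")
                    edid, _ = winreg.QueryValueEx(params, "EDID")
                except OSError:
                    # No EDID cached for this instance -- consistent with the
                    # "unknown non-PnP monitor" symptom.
                    print(model, instance, "-> no EDID stored")
                    continue
                w, h = native_resolution(edid)
                print(model, instance, f"-> EDID present, native {w}x{h}")
```

If the bad boots show no EDID stored (or garbage), that points at the cable/analog link rather than the driver.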

Oh dear god please tell me this isn’t a hardware issue. That would be the third time I’d have to get this thing fixed in the month since I’ve owned it.

It’s connected through VGA, if that’s any help?

Why are you using VGA and not DVI?

That's a good point. You might get the proper EDID if you went DVI. Drop the analog, man; it's time to move to digital!

I find that DVI looks far too sharp. Sounds crazy, I know, but it makes games look much too aliased, even at 1680x1050.

Uh ok. Not sure what to do with all of that but using a DVI cable might solve the problem.

Between this and your other thread, there’s one common denominator when you switch from analog to digital: your eyes. Maybe you should have them looked at? Seriously though, DVI/HDMI will always have a crisper picture at higher resolutions, because it’s a fully digital stream that isn’t susceptible to interference and whatnot. Analog is not going to be a viable option for much longer (some newer TV sets and monitors are opting out of analog HD connections entirely), so… get used to it, I guess?

I sorta had the same problem when moving from my HD CRT to my HD LCD TV. The picture just looked too sharp to me; I could actually pick out the compression artifacts from my cable feed. After a little while, though, I got used to it. Blurring your picture is not really the answer to aliased video. But I guess you could smear some Vaseline on your glasses or something.