So… finally 4k 60fps at max settings in all the games? :)

@wumpus would get it and probably play Bejeweled 3.

Actually, while it looks to be faster than the Titan Xp/1080 Ti in every respect, it probably isn't much faster for games; this particular GPU is design-optimized for compute. I wouldn't be surprised if the apples-to-apples framerate advantage is only 15%. Best case scenario, 30%.

Yeah, it seems like it wouldn’t be worth it unless you are seriously into deep learning stuff

Yeah, unlike previous Titans this really is their ultra-expensive compute card being sold to consumers, HBM and all. Great if you’re doing deep learning, not so much for gaming.

Seems like a good reason to start digging into deep learning :)

I can haz CheezGPU?

Regarding the monitor discussion: We’re now close enough to CES I’d avoid spending serious cash on a gaming monitor until we see what is dropping in early 2018. HDR is going to quickly become a MUST HAVE.

All I’m wishing for is a 1080p, IPS, G Sync capable monitor. Not too much to ask.

Well the $3k Nvidia card looks about 50% faster than the current Titan:

What are you waiting for @wumpus ?

About 42%, actually: roughly 30% of that from additional cores, and the remaining 12% from IPC improvements. That actually isn't bad for a GPU that isn't game-optimized.

At 1080p, is gsync worth it? I guess with a monitor that has a really high refresh rate?

The best part of gsync isn’t high refresh rates, it is eliminating screen tearing and making things feel smooth even when frame rates are relatively low, imo. So, yeah, it could definitely make sense. But then, it’s a big price premium for a relatively low-end setup and one wonders if the money wouldn’t be better spent elsewhere.

Yeah, I was just thinking that it would resolve any potential issues with a high refresh rate on a 1080p monitor, because it only updates when needed.

G-sync adds $200 to the price of the monitor so no, I wouldn’t bother at 1080p when you can buy a GTX1060 for $215 that will absolutely smoke 1080p. Spend that money on a faster GPU, then worry about adaptive sync.

I recently bought a 1060 and that sort of trapped me into 1080p. And I’m hoping G Sync at 1080p will drop in price by next year, making that monitor, if available, affordable to me. I don’t think I will be upgrading to 1070 or 1080 in the next 3 years but I do need a good monitor soon.

The 1060 can do 1440p, just not at >60fps with everything at max settings. I would try to get gsync for that, though.

Wouldn’t that be moot though? If it can’t go >60fps, G Sync will be overkill except for screen tearing.

That’s all gsync ever does, it stops tearing when you can’t push out enough frames to match the monitor’s native refresh.

I thought that without adaptive sync, monitors have to stay at some even fraction of their max refresh rate, so if your game drops from 60 to 55 fps, your monitor is only going to refresh at 30 or 45 fps. So adaptive sync allows you to actually display in between those steps.

That’s only the case if you’re using double buffered Vsync. Turning v-sync off or using triple buffering allows all the frames to be displayed. The difference with G-sync/Freesync is that those frames don’t have to be displayed on the proscribed refresh cycle. It smooths things out by letting full frames display as soon as they’re ready rather than trying to match a 60hz cadence. Without it you can swing between new frames every 16ms to 33ms and back and forth unpredictably. People feel that as a stuttering or jerkiness that goes away if the monitor can just show a frame every 20ms instead, for example.

Ah, good to know, thanks.