When do the next generation GPUs drop?

Why would the 1440p/4K “onslaught” be an issue? They can always be run at 1080p if desired.

You know, if they weren’t priced so absurdly high I might bite. I paid a premium for the GTX 1080, but it was a huge leap in performance from my 970. I just can’t convince myself to shell out so much for relatively minor gains, and the ray tracing stuff won’t really be much of a factor until the next generation is ready to roll out anyway.

Pretty much, but then switch on DLSS on the RTX 2080 and it’s far better than any Pascal card.

RTX 2080 performance with standard TAA reveals that the card enjoys a straight 30 per cent lead over GTX 1080, and it’s basically on par with the GTX 1080 Ti - a state of affairs that’s fairly common in the standard benchmarks to come. DLSS grants the RTX 2080 a further 39.5 per cent of raw performance, which clearly takes it well beyond the capabilities of even the most powerful Pascal cards.
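If you stack those two figures, here’s the rough back-of-the-envelope math (this sketch assumes the DLSS gain multiplies on top of the TAA result, which is how the review phrases it):

```python
# Combine the two uplifts quoted above; assumes they stack multiplicatively.
gtx_1080 = 1.00                          # baseline
rtx_2080_taa = gtx_1080 * 1.30           # "straight 30 per cent lead"
rtx_2080_dlss = rtx_2080_taa * 1.395     # "a further 39.5 per cent"
print(f"RTX 2080 + DLSS vs GTX 1080: {rtx_2080_dlss:.2f}x")  # ~1.81x
```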

I was always told that running below a panel’s native resolution isn’t as good as running at native. Is that not true anymore? Maybe that was only a thing in the early LCD era?

https://www.overclockers.com/nvidia-geforce-rtx-2080-and-rtx-2080-ti-review/

Yes, games using Nvidia’s proprietary antialiasing technique will perform better than older AA techniques. But it is proprietary, and the question is how many games will implement that feature without Nvidia’s dev relations group doing it for them and/or Nvidia paying them off.

@Rock8man: 1080p on a 4K monitor is pixel-perfect: each 1080p pixel is represented by precisely four physical pixels.
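For anyone curious what “pixel-perfect” means here, it’s just integer (nearest-neighbour) scaling; a minimal sketch, with made-up array names, of how one 1080p pixel maps to an exact 2x2 block at 4K:

```python
import numpy as np

# Integer 2x upscale: every source pixel becomes an exact 2x2 block,
# so nothing is blended or interpolated.
frame_1080p = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
frame_4k = frame_1080p.repeat(2, axis=0).repeat(2, axis=1)
assert frame_4k.shape == (2160, 3840, 3)
```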

I’m confused. Is DLSS simply replacing another AA solution, so that the performance gains are from reduced AA overhead, or is it also upscaling so the gains are from lower rendered resolution? If you’re gaming at 4K using it, is that a native 4K or an AI imputed 4K?

True! For a hefty price premium though. I’d still prefer to buy the cheap new 1080p monitor if they continue making them.

DLSS is a per-game, dedicated AA algorithm: Nvidia takes a given game, runs samples of its rendering through a supercomputer, and generates a super-efficient AA algorithm specific to that game, which they then apply via their drivers/GeForce Experience. How many games are going to support it, and how much will cost and time be a barrier? Who knows?

Why? High-DPI is amazing for text, reading webpages, etc., and 1080p will look perfect on a 4K monitor. You’ll probably lose FreeSync/G-Sync though, or pay extra for a monitor that keeps it.

I wonder if the super computer also says - don’t put any points in ranged skills, they suck. That would be some deep learning.

You can now get an EVGA 1070 for $350. That’s pretty tempting, given that I’m currently limping along on a single Radeon HD 6970 from 2011.

I have loved my EVGA 1070. Works great for 1440p and below.

@marquac 's people did a review. I trust Canadians!

I read the TechReport write-up; it looks like a 2080 Ti is generally a 100% improvement over my GTX 1070… at 4K. I don’t own a 4K panel, and likely won’t anytime soon. So how much of a boost it gives at 1440p would be interesting to read about, but then I’m still running an i7 4770 CPU, which leads to the obvious realization that if I were to drop that crazy amount of bank on a video card, it’s not just the card but the entire system it would be installed in.

Maybe 2020, my 4770 will be 6 years old and I’ll be ready for a complete system overhaul.

I enjoy Linus vids.

The one point you missed, I believe, is that DLSS allows 4K with AA to run faster than 4K without any AA, primarily because it renders at a lower resolution and then uses the game-specific DLSS algorithm to upscale to 4K, giving a performance boost with the AA applied. DLSS is supposed to take care of any upscaling artifacts as well, since the algorithm is specialized per game.
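A toy way to see where that headroom comes from is just counting shaded pixels (the internal resolution below is an assumption for illustration; Nvidia hasn’t published exact per-game figures, and the upscaling network itself is their proprietary black box):

```python
# Why render-low-then-upscale wins: far fewer pixels shaded per frame.
native_4k = 3840 * 2160
internal = 2560 * 1440   # assumed internal render resolution, for illustration
print(f"Pixels shaded per frame: {internal / native_4k:.0%} of native 4K")
# -> 44%; the tensor-core upscale only has to be cheaper than shading the rest
```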

One more here, this time from Gamers Nexus:

I watched a lot of youtube today.

These articles and vids make me even more excited for the 2080 Ti.

Yeah, it’s a beast, Brian; wish I had the spare $500-600 for it. Even without DLSS or RTX features.

They pretty much squeezed better-than-1080 Ti SLI performance into one card with the RTX 2080 Ti.

The nice tech editor at RPS agrees that the 2080 Ti is really meant for 4K and a silly waste below that.