I’m still sitting on a 970 and I’m looking at both the 1660 ti and the 2060. Good to hear it’s a noticeable bump (the numbers say it is, but sometimes the numbers don’t translate into anything I can see).

I went from playing Apex with everything on Low, including the lowest texture budget, at framerates in the 40s to high 50s, to everything on Ultra at 90+. Very pleased. Though for your situation (starting with a better card) the 2060 might be more future-proof/better at 1440p.

Have a 2000 series card? “Play” the RTX demos for yourself:

NVIDIA has unlocked the ray tracing ability on 10 and 16 series cards as well, looking forward to giving this a shot.

I wouldn’t get too excited. Without the RT cores on your card the performance will be awful.

Sony just “announced,” via an interview in Wired, that the PS5 will have an AMD CPU/GPU with ray-tracing support. That was surprising to me, but that means that we should see pretty wide adoption of the tech starting in late 2020. I would imagine most cross-platform games that come to PC from that point on will support RTX.

Nvidia must be doing a little happy dance this morning.

If you assume Q4 2021 for release that doesn’t seem too silly.

I’ll be an old man by then. A bunch of Qt3ers probably won’t make it :/

Get off my lawn!

hahah

(you get the extra h because discourse won’t let me do four characters, another reason for this useless sentence)

Anyone know why the console makers have been sticking with AMD? I’m guessing it is pricing?

I think it’s because AMD has teams that can do semi-custom chips for console makers, like ARM has been doing with Apple on the A-series chips. I haven’t seen Nvidia do the same for anyone.

Intel doesn’t do custom work and they don’t have high-end GPU tech. Nvidia can’t do x86 CPUs. AMD is the only one that offers both.

I betcha the next-gen consoles will be a lot less “custom” than in the past. Rather than designing an integrated SoC with both CPU and GPU on the same die, we’ll see a normal CPU and completely separate GPU on the same substrate, like the Kaby Lake-G, with a separate AMD Vega GPU and its own HBM.

With this setup there’s no reason they couldn’t go with Nvidia.

Just because they support AMD’s implementation doesn’t mean they’ll support Nvidia’s. That said, if they go to the effort to make a game look good with raytracing, they may want to show that off to the widest possible audience.

Fair, and I was also just reminded that it’s unlikely that this chip will have dedicated ray-tracing components on it. Instead, I think AMD is counting on software, which means implementing ray-tracing on this chip and on an Nvidia card will be different, but I’m not technical enough to know how different.

My assumption is Sony will write extensions to GNMX and MS will use the DXR DirectX extensions. Any AMD implementation would be tooled to work on PCs as well, so I would expect rough parity with actual games and handling pretty much all of DXR. And again that could come from either AMD or Nvidia.

Part of the reason might be the backwards compatibility they mentioned for PS4 titles. I’m no engineer, but I’d imagine it’s easier not to radically change the architecture if that’s a priority.

I know I’m on board if I can simply trade in my PS4 and not worry about my games.

Nobody is suggesting they go to ARM or (heh) PPC, but they could easily use Intel for the CPU component.

Perhaps. That would be a pretty radical departure, though. Current systems have a unified memory architecture, and as long as the memory system can cope with the demands of both CPU and GPU, that’s a pretty big advantage to give up.