When do the next-generation GPUs drop?

Wait, you shouldn’t have to join Insider to get 1809. You’d get some newer, less stable build if you joined Insider.

It is odd: it was still keeping me on 1803 even though I don’t have any of the things that were supposedly blocking it. So I just tried the Insider program to see what would happen, and it loaded 1809. Then I opted back out, and 1809 is still here.

You could have just used this
https://www.microsoft.com/en-us/software-download/windows10

I’ve tried that before and it failed.

LOL, Radeon VII is $699, same price as a 2080. AMD snatches defeat from the jaws of victory.

So they caught up to the 2080, which gives up a bunch of die real estate to the RT and tensor bits, and decided to price it in the same range as the 2080. I really hope Intel does well with their discrete GPU project.

So that is how graphics cards are made!

What actually happened here was AMD built this new GPU, Vega 20, intended for scientific visualization. It was never intended for consumer sales. Then Nvidia didn’t focus on rasterization performance improvements in their next gen and jacked up pricing so far that AMD could afford to sell a slightly cut-down Vega 20 to consumers at the same price/performance and still make a profit, so that’s what they did. That’s why it has 16GB of VRAM.

I’d buy that. The theory I mean, not the card.

Yeah, the 16GB of HBM2 and new HPC-focused stuff will make it pretty attractive to people doing AI, scientific research, etc. They’ll be able to sell well enough to that market without having to undercut Nvidia’s prices. This isn’t a high-volume part, after all.

I still can’t figure out why a gamer would buy it, given that the RTX 2080 offers the same rasterization performance, also does ray tracing, and probably runs much cooler. But that’s why the card exists, at any rate.

I had some time and figured out how to show the frame rate in BFV. I compared RT on and off at 1440p on Ultra settings. Shockingly, the 2070 I got wasn’t that one unique Pokémon that would make a liar of the whole internet.

Without RT I was hovering in the 80s. With RT on I was hanging more in the high 30s.

That being said, it was playable to me, although I noticed the lag difference. I’d totally leave it on if it added anything noticeable, but it doesn’t. What a horrible game to showcase this technology. I really hope that it doesn’t tank the whole push, as I just love the tech.

I think Metro Exodus, which is out in a month, is the first reasonably big title with a fuller RTX implementation. It was in the announcement demos.

This is the first game demo I saw that made RTX look worth a damn, because it has global illumination and real ambient occlusion. It’s clearly better to me, and a fairly dramatic difference. Whether that will be worth the FPS hit, who knows. Metro isn’t known for hitting high frame rates to begin with, and I just got a 2080 and have been enjoying ultrawide 60-90fps gameplay, often on the highest settings. I’m guessing I’ll get 30fps with RTX on? Ugh.

Right, low FPS might be OK if it has exploration sections, but less so for action sequences. Maybe RTX is smart enough to only run when it’s noticeable? I guess the Battlefield example suggests not.

Yeah, but RTX was announced for Tomb Raider and BFV prior to their release too, yet only one got RTX patched in after release and we’re still waiting on the other. Have they explicitly said that RTX will be available in Metro on day 1?

No, it’s either on or off. They could, I guess, do something that takes framerate into account, but it would be very tricky to fade on and off. It’s not like texture resolution or LOD; it’s global illumination, so flipping it on or off is going to smack you in the face with the difference.

Haha, I didn’t even think of that. Yeah, I bought a 2080 in spite of RTX. I like the idea, but it’s pretty clear to me at this point that we won’t be doing much, if any, RTX gaming on this first iteration. God damn miners; I would have a 1080 Ti if it weren’t for that whole fiasco.

I’m not sure that’s totally accurate. At least in BFV, what they’ve stated is that they generate a heatmap of the current rasterization and figure out which pixels have a high chance of viewing objects with a material of a certain smoothness. Only if there’s a high probability of an area having that smoothness level (meaning a high likelihood of a ray bouncing off for a visible reflection) will it actually trigger a ray to be cast for that pixel.

That’s how BFV got its performance increase over the original RTX version: they optimized the way they determine which pixels rays need to be cast from, and fewer rays = better performance. So in theory they can control the FPS impact by making sure fewer rays are needed in areas of high action (via what materials they use in the scene); rough sketch below.

How practical that is without it being jarring is another question though.
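
For the curious, here’s a toy sketch of what that smoothness-gated ray dispatch could look like. This is purely my illustration of the idea as described above; the function names, the threshold value, and the fallback path are all made up, not DICE’s actual implementation:

```python
# Toy sketch of smoothness-driven ray culling, as I understand DICE's
# description. Names and the threshold are my own, not from their code.

import random

SMOOTHNESS_THRESHOLD = 0.6  # hypothetical cutoff; the real one is tuned


def trace_reflection_ray(pixel_id):
    # Stand-in for the expensive part: BVH traversal on the RT cores.
    return ("traced", pixel_id)


def shade_frame(gbuffer):
    """gbuffer: list of (pixel_id, smoothness) produced by rasterization."""
    out = []
    for pixel_id, smoothness in gbuffer:
        if smoothness >= SMOOTHNESS_THRESHOLD:
            # Glossy enough that a mirror-like reflection is likely
            # visible, so this pixel earns a real ray.
            out.append(trace_reflection_ray(pixel_id))
        else:
            # Rough surface: skip the ray and fall back to cheaper
            # screen-space or cube-map reflections.
            out.append(("fallback", pixel_id))
    return out


# Fewer pixels over the threshold means fewer rays and higher FPS, which
# is the knob artists can turn by using rougher materials in combat areas.
frame = [(i, random.random()) for i in range(8)]
print(shade_frame(frame))
```

The point is just that the expensive call only happens for pixels likely to show a reflection, so material choices directly set the ray budget.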

For what BFV does you may be right, but they are not using it for global illumination like Metro does. I don’t see turning that on and off being practical.

Ahhhhh. That’s very interesting. So due to performance constraints we’re back to using tricks, rather than “it just works”. Well, hopefully the tricks won’t be necessary next generation.