When do the next generation GPUs drop?

Consoles have 8GB of RAM, period; they have zero dedicated GPU memory. Is the 8GB of RAM in the PS4 even comparable in speed?

The new GDDR5X RAM in the GTX 1080 uses a 256-bit memory bus and delivers 320GB/s of memory bandwidth.

The Xbox One provides 68.3 GB/s of main memory bandwidth and 102 GB/s for its eSRAM, while the PS4 memory provides 176 GB/s.
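
If you want to sanity-check those figures, peak bandwidth is just bus width times effective data rate; here's a quick Python sketch using the commonly quoted bus widths and per-pin data rates (approximate peak numbers, not what you'd see in practice):

```python
# Peak memory bandwidth in GB/s: bus width (bits) / 8 * effective data rate (Gbps per pin).
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(256, 10.0))   # GTX 1080: GDDR5X @ 10 Gbps  -> 320.0 GB/s
print(peak_bandwidth_gbs(256, 5.5))    # PS4: GDDR5 @ 5.5 Gbps       -> 176.0 GB/s
print(peak_bandwidth_gbs(256, 2.133))  # Xbox One: DDR3-2133         -> ~68.3 GB/s
```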

The PS4 has DDR5 memory. I learned that from Sheldon.

Yeah, consoles have 8GB, but the OS and general overhead eat some of that up, and of course game code needs memory aside from graphics/textures.

Both consoles give games access to around 5GB, but that's obviously split between game data and video/frame buffer use.
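
To put the frame buffer part of that in perspective, the buffers themselves are tiny next to textures and other game data; a rough back-of-the-envelope sketch (assuming 32-bit color and simple double buffering, and ignoring render targets, depth buffers, and everything else a real engine allocates):

```python
# Rough swap-chain memory: width * height * bytes per pixel * buffer count.
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=2):
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

print(framebuffer_mb(1920, 1080))  # ~15.8 MB at 1080p
print(framebuffer_mb(3840, 2160))  # ~63.3 MB at 4K
```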

The PS4 Pro actually has 1GB of DRAM for OS stuff, separate from the 8GB of GDDR.

Depends on what AMD's Vega is capable of; it will be first to market in the first half of '17. Nvidia isn't set to drop anything new until Q4 at the earliest, I think.

Okay, I think I've waited long enough in that case (since the release of the 980 Ti)!

(But will wait and see Vega impressions at least.)

You don't need HBM2. GDDR5X provides enough bandwidth for the current generation. AMD made a bad bet.

These 1080 Ti benchmarks are significant, particularly at 4K, where more grunt was needed.

For example

I'm definitely getting one of these to replace my GTX 980 whenever a decent Gigabyte card comes out in a non-blower configuration, especially considering the 980 doesn't hold up all that well on my 2560x1440 144Hz monitor.
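
For what it's worth, 1440p at 144Hz actually asks for more raw pixel throughput than 4K at 60Hz, which is part of why the 980 struggles. A quick comparison, counting pixels only and ignoring per-frame costs that don't scale with resolution:

```python
# Raw pixel throughput (megapixels per second) at a given resolution and refresh rate.
def megapixels_per_second(width, height, hz):
    return width * height * hz / 1e6

print(megapixels_per_second(2560, 1440, 144))  # ~531 MPix/s for 1440p @ 144Hz
print(megapixels_per_second(3840, 2160, 60))   # ~498 MPix/s for 4K @ 60Hz
```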

As for why Gigabyte... I'll likely be getting that one for free.

Truly impressive that it beats the Pascal Titan X. What's the point of that more expensive card now?

There is none. In previous generations the Titan retained better compute performance, but that's not the case this time.

Basically, that's why the Ti has the bizarre 11GB configuration: to give the Titan some niche. Obviously for gamers the best choice is clear-cut, but apparently there's some segment of the market where the extra 1GB and the extra ROPs matter. Or at least Nvidia believes so.

Of course, then you remember that the Quadro P5000 and P6000 are rocking 16 and 24 GB of VRAM, respectively, on non-gamer-oriented boards, and you really start trying to figure out what sets the Titan apart...

Yayyy, my 1080 Ti finally shipped. I can't wait to hack its exhaust bracket to bits and horrify people here.

(This bracket is even better because they dropped the stupid DVI-D port.)

Oh man, I would like a 1080 Ti. I have a non-Ti 1080, and the extra 20+ fps would be very convenient for some games in 4K.

With that said, something to keep in mind for people interested in 4K for whom money is a significant factor: what those graphs won't tell you is that they are mostly done at maxed-out settings. Take Hitman at 4K: that graph shows 50 fps for the 1080 and 75 for the 1080 Ti. 'Well,' you might say, 'I don't want to play the game at 50 fps, so I guess it's the 1080 Ti or no 4K gaming at all for me.' What the graph doesn't tell you is that you can easily get 60 fps with the 1080 by turning off one or two niche graphics options that make almost no perceivable difference to the visual quality whatsoever. I played Hitman at 4K at 60 fps and it looked fantasmo.

The upcoming RX 500 series that's dropping in April: is that Vega?

Yes. Vega is Q2.

Exactly. The Witcher 3, Batman: Arkham Knight, and Rise of the Tomb Raider all play well enough at 4K with some settings dialed down, using a non-Ti 1080.

No, the RX 500 series is basically going to be rebadged RX 400 (maybe with tiny clock speed bumps). I've heard nobody suggest that Vega would launch at the same time, even if it's still supposed to be Q2.