When do the next generation GPUs drop?

Yep, count me in as waiting for an overclocked, low-noise Asus card. Love my 980 Ti in that configuration. I plan on swapping it out and putting it into an SFF box, if I can find one that will support its power and airflow needs. Will probably start a new thread about that.

Also, does anyone have the power draw on these? Is it higher than the 980 Ti? Does each card take an 8+8 or an 8+6 pair of power connections? My box is kitted for 2x 8+6, but my understanding is that adapters can take that 6-pin up to an 8-pin no sweat…

This says it takes a single 8-pin per card and draws roughly 70% of the power of the 980 Ti, so it looks like I should be good to go:

Also, here’s the replay:
https://m.youtube.com/watch?feature=player_embedded&v=it3HVZMSBfY

What I read on wccftech was that the 1080 has a single 8-pin connection.


Yep, the 1080 is 180W TDP, compared to 250W on the 980 Ti and 165W on the 980, with a single 8-pin connector. That’s plenty of headroom: an 8-pin is rated for 150W and the PCIe slot supplies another 75W, so you have 225W available against a 180W TDP.

OK, other than price I can’t see any reason not to get two of ’em. I had heard that a lot of VR games had problems with the 980 Ti in SLI; think that will be a problem here?

SLI generally sucks. Crossfire too. Alternate frame rendering is going out of fashion.

DX12 and Vulkan support something called explicit multi-adapter (EMA). Rather than NV and AMD supporting specific games in their drivers and presenting a “virtual GPU”, game developers target multiple adapters themselves. This allows for all kinds of cool stuff, including heterogeneous setups: an AMD 390 paired with a 380, or even AMD and NV mixed in the same computer. Ashes of the Singularity was the first game to support this.

That sounds like a ton of work for the developers, but really, when you think about it, the trick is to get Unreal, Unity, Frostbite, and CryEngine to support EMA, and then everything else will fall in line. It’s possible, and even plausible, that the console refreshes (PS4 Neo, NeXbone) will come with multiple AMD Polaris GPUs, which would push EMA quite handily.
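To make “explicit” concrete, here’s a minimal sketch of the unlinked multi-adapter path, assuming a Windows/D3D12 build environment (the function name and structure are just illustrative): instead of the driver hiding the second card behind a virtual GPU, the game walks every adapter in the box itself and creates a device on each one.

```cpp
// Minimal EMA-style setup: enumerate every GPU in the system and create
// an explicit D3D12 device on each. The engine, not the driver, then
// decides how to divide work among them. Link against d3d12.lib/dxgi.lib.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateDevicesOnAllAdapters()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND;
         ++i)
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip WARP/software rasterizers

        // Any D3D12-capable adapter gets its own device: an AMD 390 next
        // to a 380, or AMD and NV side by side, are all handled the same.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(),
                                        D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);
    }
    return devices;
}
```

From there the app can copy resources between devices and assign, say, post-processing to the second GPU, which is exactly the kind of thing AFR-based SLI could never express.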

So that’s all very cool tech, and it may be worthwhile at some point in the future, but it isn’t now. And SLI/Crossfire suck too: tons of games don’t work with them, and those that do don’t scale well. SLI/Crossfire is a lot of money for relatively little gain.

Moral of the story: get the fastest single-GPU solution you can reasonably afford. If EMA works out in the future, maybe pick up another one then.

I’ve followed this advice since multi-GPU chaining was first a thing and never regretted it. On price-to-performance you’re just kicking yourself in the dick going SLI/Crossfire, period.

It’s not worth the bullshit.

Just hope Vulkan gets 100% adoption everywhere, along with a D3D-to-Vulkan translation app, so I can stick with Win 8.1 (or go to Linux) :)

OK, will see what’s what. The display-warping feature they showed has me really considering triple 1080p displays. I’d reckoned I’d need more than one card for that…

Awesome, glad you liked the article! The Founders Edition card available for pics had only an 8-pin connector.

Per one of the mag editors there at the launch…

I have an i3-3220 and I’m CPU-limited in TW3 and FO4 with a slightly overclocked 290X.

I can only imagine how bad it will get when games become even more multithreaded.

Damn. I bought a GTX 970 a month ago because I was told that consumer Pascal GPUs wouldn’t arrive till 2017. If I had known this I would have waited for the 1070.

EVGA? They have a Step-Up program for 90 days, but you might have had to register it.

No, one is enough to drive 4 monitors.

Everything stusser posted above is true, but in case that didn’t dissuade you, let me crap on your dreams some more =)

SLI is garbage. I buy my cards two at a time because I drive 5 monitors, plus now a Vive and soon a Rift, and I rarely if ever turn it on. If I played a single game at 4K all the time and it was confirmed to work awesome with SLI, maybe it would be worth it… but I doubt it.

It’s also a pain to turn SLI on and off, especially these days, because it won’t enable unless you kill every app that uses certain video-card features. This includes weird shit like OneDrive for Business and Outlook. If you drive more than 4 monitors you have to turn it off to use them. Many games also just don’t work with SLI, or introduce rendering artifacts or flicker, so that’s another reason you’re toggling it all the time.

Then, when you finally do get a game that supports it on day one and actually makes a 25% performance difference, you end up with flashing textures somewhere. The latest Tomb Raider game was a great example: all the pre-rendered CGI content integrated with the in-game graphics flickered terribly, ruining a lot of key moments in the game.

I can’t remember the last game I played that worked satisfactorily in SLI.

Triple-display gaming is also kind of garbage. For one, you have to toggle it because you don’t want to leave it on all the time: the drivers include special features to mitigate working with a single viewport vs. 3 discrete monitors (like maximizing windows to a single display), but they’re buggy and not worth the workflow interruption of leaving it on.

It rarely works well, even in supported games, unless you fiddle, patch, config, and tweak. I mean, you can turn it on for any game that works at weird resolutions, but then you end up with minimaps on the far right and have to hope for fan patches unless the developers directly support it.

When it does work, yes, it’s great, even with the distorted viewpoints we have today. Having that extra viewing area made some games, especially survival horror, easier. Which I guess may be a bad thing, but I appreciated the hell out of it.

If this new NVidia tech actually fixes that distortion, it would be considerably better, I think, but you still have the issues above. It’s just not well supported by games, and the history of both SLI and triple-display support points to poor developer uptake. It’s a very niche market and developers just don’t care.

If developers ever get around to supporting it all properly I am sure we will hear about it and you can buy another card then.

Oh cool, suddenly I feel less ambivalent about my three-week-old 980 Ti. However, I notice on the EVGA page describing the program (http://www.evga.com/support/stepup/) that the best available step-up is currently a 970. Is the program not available for high-end cards, or is it just that nothing new has been released recently that constitutes a step up for that range of cards?

Edit: there’s discussion of this exact issue on this Reddit page. It appears to be uncertain whether the 1080 will be available for the step-up program at or near launch.

Realistically the latter. The 1070 (the most probable step-up) doesn’t really exist in a consumer-facing way yet, and probably won’t for a month or so.

Enough to drive them convincingly at high frame rate with all effects turned on?

There are not two GTX 1080 models made by nVidia; only the “Founder’s Edition” exists. There is no cheaper nVidia-made card than the $700 Founder’s Edition, which ships first.

Just to be clear: nVidia is making one official GTX 1080 and one official GTX 1070 model.

The “Founder’s Edition” is not specially binned.

The “Founder’s Edition” is not pre-overclocked.

The “Founder’s Edition” uses the new industrial design and cooler from nVidia. Historically, this is what we would call the “reference cooler.” The cooler is more-or-less identical to the previous reference models. It’s got vapor chamber cooling, a VRM blower fan, and a large alloy heatsink under the shroud. There is a backplate on the GTX 1080 Founder’s Edition.

This card is not “limited edition,” despite a name that would indicate as much, and will remain in production through the life of the GTX 1080 product line.

Well, you wouldn’t want 4-monitor gaming; that’s terrible. It needs to be an odd number so you have a primary screen for your reticle, and likely other HUD elements. But in terms of 2D desktop space, sure, more than enough.

Otherwise it depends on the game and card. SLI would help, if you can get it to work, or if you have the patience to wait for NVidia to sort it out, only to realize months later that they never really did and it’s still glitchy despite their driver notes saying it’s fixed.

Do I sound bitter? I have gone through AMD (CrossFire) and then 2 NVidia (Surround Vision) revisions, and came to the conclusion that SLI is shit and will continue to be shit until NVidia comes up with a better solution than AFR. AFR pisses developers off because they like to do funky things between frames that AFR can interfere with: since AFR alternates whole frames between GPUs, any effect that reads back the previous frame, like temporal AA or motion blur, forces a sync between cards.

Who knows, the new NVidia SLI bridge and DX12 could pave the way for this, but again, that’s up to the developers now.

Yep, the “founder’s edition” is just a reference card priced at an extra hundred bucks. That $100 is the price you pay for getting a 1080 before late June (at best).

Pure crazy