When do the next generation GPUs drop?

Also, early reports are that it only generates 30 MH/s, around the same as a GTX 1070. Given its power consumption, it’s a terrible coin miner. Could be broken drivers; it’s still very early.

https://nl.hardware.info/reviews/7517/23/amd-radeon-rx-vega-56--64-review-minder-euros-meer-watts-testresultaten-amd-radeon-rx-vega-voor-coin-mining

Yes indeed, same here, Scott.

The rumors about it being great for mining were started by a vendor trying to pump up preorders. They never made a lot of sense; why would RX Vega be any good when the Vega FE wasn’t? The 100 MH/s number was particularly unbelievable. Vega doesn’t have anywhere near enough memory bandwidth for that; the absolute theoretical maximum is about 60 MH/s, and nothing ever reaches the theoretical max.
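
For a rough sanity check on that ceiling (a back-of-the-envelope sketch, assuming Ethash’s 64 random 128-byte DAG reads per hash and Vega 64’s roughly 484 GB/s of HBM2 bandwidth; approximations, not benchmarks):

```python
# Rough Ethash ceiling for Vega 64; all figures are approximate.
# Ethash is memory-hard: each hash does 64 random 128-byte reads from the DAG.
BYTES_PER_HASH = 64 * 128        # ~8 KiB of DAG traffic per hash
PEAK_BANDWIDTH = 484e9           # ~484 GB/s HBM2 peak for Vega 64

ceiling = PEAK_BANDWIDTH / BYTES_PER_HASH
print(f"Theoretical maximum: {ceiling / 1e6:.0f} MH/s")  # ~59 MH/s
```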

It’ll be interesting to find out exactly what went wrong, because it’s not just that they had to pump up the voltages to reach their target clock speeds; it’s pretty miserable on other metrics too. What are they doing with all that silicon? There were supposed to be half a dozen major new architectural features; why are none of them making any difference? And why the long delay between FE and RX Vega? It clearly wasn’t about finishing the “magic drivers”.

It might still be good for mining because it has fp8 and fp16 support, although that would require rewriting the miners.

You can’t rewrite the underlying hash functions using floating point. The core operations are mostly bit operations, e.g. XORs, ANDs, rotates. Low accuracy floating point will be great for machine learning though.
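
To be concrete, the inner loops of these hashes are built from integer operations like the following (a made-up rotate/XOR-style fragment for illustration only, not any particular coin’s hash round); there’s no floating-point shortcut for this kind of bit-twiddling:

```python
MASK64 = (1 << 64) - 1  # keep everything in 64-bit words

def rotl64(x, n):
    """Rotate a 64-bit word left by n bits."""
    return ((x << n) | (x >> (64 - n))) & MASK64

def mix(a, b):
    """Illustrative add/rotate/XOR-style step, not a real hash round."""
    a = (a + b) & MASK64
    b = rotl64(b, 23) ^ a
    a = (a ^ (b & rotl64(a, 41))) & MASK64
    return a, b
```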

It seems that with GPUs AMD hasn’t been competitive (performance, power efficiency, PCB length) since the 6xxx or 7xxx series?

The Fury Nano competed on PCB length, and the RX-480 was very competitive on price.

No SR-IOV on consumer Vega either.

It’s really strange: with CPUs, AMD finally understood that they shouldn’t play these silly artificial market segmentation games. But in GPUs, where you’d think they could use any possible advantage? Nope.

They’ve added mining-specific instructions to the ISA in Vega. Too soon to know how those will pan out.

The correct answer is to get the card to decrypt its own BIOS. Go, math!

Meanwhile, can someone tell me that my 7870XT is still good for games? I just want to not always browse the video cards page of online stores…it’s a bad habit…

7870 should be OK for 1080p gaming. Not 60fps and not maxed settings, but generally OK. It’s a bit faster than the release PS4 and Xbone GPUs. You can stretch it out another year or two if required.

I’ve had the same habit lately, and it’s a depressing one. You know things are bad when I check every day, hoping against hope, trying to find a sale where I can get a 1080 for the low low price of . . . MSRP.

I’m selling my old 980ti if you wanna upgrade, Kao.

It’s up to 36 MH/s vs 32 for a 1080 Ti with the first driver tweaks. If it gets another 10 or 15 percent, it might be a real problem for anyone buying it for games again.

Hey let’s put out a driver specifically for miners so any gamers that want Vega are SOL

Based on the power utilization and heat generation, there’s no particular reason for gamers to buy Vega anyway, unless they have a freesync monitor. If it excels at coin mining, at least AMD can sell their product.

Are there people that buy high end GPUs but refuse to buy a G-Sync or Freesync monitor?

Not exactly refuse, but the lower cost of a Vega+FreeSync package over a 1070 or 1080+Gsync package is probably going to tip the decision for me. I have power to spare and don’t really care about heat.

No, but you might already have a freesync monitor. And like the Fish said, freesync monitors are around $200 cheaper than g-sync. That’s really AMD’s only competitive advantage.

Pity Nvidia is too stubborn to enable freesync support in their drivers, which they could trivially do.

Yeah, I skipped paying the price for a gsync monitor last year when I built this system and grabbed the 1070. It was enough work getting my wife to not veto spending what I already was. I would love it if they enabled freesync support, or opened up g-sync. I will possibly spring for a monitor when the next gen nvidia cards come out.