They need to place explosives on the non-mining cards so that if the card detects it’s being used to mine and not game, the explosives detonate.

Some interesting statements from Nvidia:

Taiwan Semiconductor Manufacturing Company (TSMC) chairman Morris Chang has recently said that he expects the company to see year-on-year revenue growth of 10-15% in the first half of 2018 thanks to demand from the cryptocurrency segment. Market watchers believe many cryptocurrency miners are likely to turn to procuring ASICs from suppliers such as Bitmain.

Bitmain is ready to release ASIC products in April, eyeing cryptocurrencies that have been relying on GPUs for mining, and the move is expected to reduce miners’ demand for graphics cards.

Policies from governments worldwide on cryptocurrencies and significant price changes also have weakened the returns for mining.

Seeing the trend, Nvidia has recently started placing restrictions on its downstream graphics card partners, forbidding them to publicly promote cryptocurrency mining activities or actively sell its consumer graphics cards to miners, the sources said. Nvidia hopes to shift its main sales target back to consumers in the gaming market, the sources added.

Nvidia also has further increased its GPU quotes recently, which the sources believe is meant to help cover the gap that may occur after GPU demand starts sliding.

Since profitability from graphics cards has been weakening, Nvidia and AMD have both been decelerating the developments of their new GPU architectures and prolonging their existing GPU platforms’ lifecycle.

Maybe some good news finally…

“Since profitability from graphics cards has been weakening, Nvidia and AMD have both been decelerating the developments of their new GPU architectures and prolonging their existing GPU platforms’ lifecycle” doesn’t seem like good news to me.

Also, how the hell is that possible when every GPU they make is immediately purchased way above MSRP?

Google bans cryptocurrency ads, which sounds like it should start slowing down and receding the value quite a bit. That should in turn weaken the grip miners have on GPUs, start bringing GPU prices down, and once again give Nvidia/AMD a reason to put out more powerful cards. In theory.

No kidding!

Yeah, it’s total bullshit. What he really meant is that GPUs have gotten most of their performance increases from the semiconductor process free lunch, but that’s come to a screeching halt for everyone (Intel, TSMC, GloFo). Sure, you can still wring out more performance from a newer process. But it’s also more expensive per transistor, 14nm might end up being the sweet spot for that for a long time.

But if you’re the chairman of a foundry, apparently it’s impossible to admit that your business has hit a wall. Better to blame the customers of your customers.

I think stusser was referring to how profitability could be falling when they sell literally everything they make at full price right now. And I agree, that’s a valid question. I know they are forecasting and trying to align engineering resources around things; every company does. But it’s still a cash cow and hasn’t dipped yet.

I think what you’re referring to is that the comment is based on the idea that there isn’t much gain left, so they say “profitability weakening” instead of “we’re reaching a wall in how much further we can go.”

Also they’re raising prices for the past three generations. Release MSRPs:

GTX 1080: $599 ($699 for the Founders Edition, which was all that was available for ~6 weeks)
GTX 980: $549
GTX 780: $500

It’s just bullshit, plain and simple. Unless they’re referring to R&D costs rising out of alignment with pricing and sales volumes, which is possible, but they didn’t actually say that.

Now AMD profitability is certainly lower, because they accepted poor deals to supply custom SoCs for Microsoft and Sony consoles. Hopefully volumes are making up for that by now, though.

Nvidia profitability should be great, but they have less incentive to invest in R&D because AMD is still a solid tier below their products on both performance and power efficiency.

I don’t think that’s quite what I meant.

Look, I assume we can all agree that the original statement about “GPU profitability weakening” is just absurd. There’s no argument to be had there.

The interesting question is why Chairman Chang made a statement that’s so obviously untrue. I suggest it’s them trying to get ahead of the curve on the blame game. GPU performance increases are going to stall. It’s better for TSMC if the narrative is “Nvidia stopped investing in GPUs” rather than “the price/performance ratio of new semiconductor processes is worse”.

Noted, and very possibly the case, too.

The article layout is a little weird. That statement was attributed to ‘market sources,’ so I don’t think the TSMC chairman said it. Maybe.

More interesting to me was the last bit stating that Nvidia’s Turing-based GPUs will enter mass production in Q3.

I know it pretty much halted with CPUs a while ago, but won’t the inherent parallelism of GPUs let it keep coasting for a while, even at a slower pace? At least relative to CPU single-threaded perf.

That gets really expensive as adding cores makes your die bigger. But sure, up to a point.

Yeah, GPU dies are already pretty damn big compared to CPU dies. There’s only so much bigger they can go on silicon; past a point you start running into serious heat and yield issues.

Used 1070 ITXs appear to be going for $500-$550


Something finally in stock. Do I upgrade for $90-150?

Someone make my decision for me.

Half size 1080 seems like a bad idea but the newegg reviews say it works okay?

Yeah, it’s fine. Cooling doesn’t really matter on the 1080; they all get to around 2GHz.

My understanding is that the more cores you have, the bigger the die; but the bigger the die, the less you’re able to work around defects in the silicon, so costs go up because more chips have to be thrown out.
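That intuition matches the classic Poisson yield model, where the expected fraction of defect-free dies falls off exponentially with die area. A minimal sketch (the defect density and die areas below are illustrative assumptions, not figures from this thread):

```python
import math

def poisson_yield(die_area_mm2: float, defect_density_per_mm2: float) -> float:
    """Expected fraction of defect-free dies under a simple Poisson
    defect model: Y = exp(-A * D), where A is die area and D is the
    average defect density."""
    return math.exp(-die_area_mm2 * defect_density_per_mm2)

# Hypothetical numbers: 0.001 defects/mm^2, a ~150 mm^2 CPU-sized die
# vs a ~450 mm^2 GPU-sized die.
cpu_yield = poisson_yield(150, 0.001)  # ~0.86
gpu_yield = poisson_yield(450, 0.001)  # ~0.64
```

Tripling the die area doesn’t just triple the silicon cost per chip; it also cuts the share of good dies, which is why big GPU dies get expensive fast.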

Our local PC chain has had a shortage of GPUs and PSUs and halted their price match policy on both

Will it get cheaper to continue producing with the same process? Maybe CrossFire or SLI is the way to go (if you have a couple grand to spare to buy some GPUs).