The A.I. Thread of OMG We're Being Replaced

It’s a pretty recent development–maybe since Monday?

We have a thread for stuff like this, so the appropriate person can notice/respond. I’ve seen the youtube embeds break / unbreak myself, and other people have reported those issues.

Ah, good idea! I’ve just gotten in the habit of totally glossing over the Hardware section and it’s become functionally invisible to me.

Hey bud, this started occurring after a Discourse update. Deleting the browser cache seemed to resolve it for most folks on Qt3. I assume that's still something you can do in a phone browser.

That did the trick!

I feel this way about pretty much all techbro-dom these days, not just the AI salesmen.

Regarding that article, I feel like either the author is kind of missing some obvious things, or I am.

I mean, like this part:

There was something that Huang said during the keynote that shocked me into a mild panic. Nvidia’s Blackwell cluster, which will come with eight GPUs, pulls down 15kW of power. That’s 15,000 watts of power. Divided by eight, that’s 1,875 watts per GPU.

The current-gen Hopper data center chips draw up to 1,000W, so Nvidia Blackwell is nearly doubling the power consumption of these chips. Data center energy usage is already out of control, but Blackwell is going to pour jet fuel on what is already an uncontained wildfire.

The new processors draw more power… But they offer much higher performance, so the actual power efficiency of computation is going up. That means you use less power for the same amount of work. That’s good, not bad, right?

Am I missing something here?

Say these GPUs are being used for bitcoin mining. Sure, they’re more efficient than the previous generation. But you’re still burning the equivalent of a first-world country’s worth of electricity for bullshit. The problem is that all this AI has caused these tech giants’ carbon-zero promises to go out the window.

The author says that we can’t bring nuclear power on line in time, but completely ignores the rapid growth of renewables.

One million Blackwell GPUs would suck down an astonishing 1.875 gigawatts of power. For context, a typical nuclear power plant only produces 1 gigawatt of power.

On the other hand…

China’s rapid solar rollout has put it on track to meet its renewable goals years ahead of schedule, with installed solar capacity of 655 gigawatts (GW) as of March, the most in the world by far, well ahead of second-placed United States with upwards of 179 GW at the end of 2023.

So, if we get a million of these rolling, we’d be using about 1% of US installed solar capacity.
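
If anyone wants to check the arithmetic, here’s the back-of-envelope version in Python, using only the numbers from the quotes above (nameplate capacity, so purely illustrative):

```python
# Back-of-envelope check of the figures quoted above.
watts_per_gpu = 15_000 / 8                   # 15 kW Blackwell cluster split across 8 GPUs
total_gw = watts_per_gpu * 1_000_000 / 1e9   # a million GPUs, converted from watts to gigawatts

us_solar_gw = 179      # US installed solar capacity at end of 2023, per the quote
china_solar_gw = 655   # China's installed solar capacity as of March, per the quote

print(f"{watts_per_gpu:.0f} W per GPU")                        # 1875 W
print(f"{total_gw:.3f} GW for a million GPUs")                 # 1.875 GW
print(f"{total_gw / us_solar_gw:.1%} of US solar capacity")    # ~1.0%
print(f"{total_gw / china_solar_gw:.1%} of China's capacity")  # ~0.3%
```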

The author was assigned to cover the keynote, and decided to write about something else instead.

But… You would burn even more electricity for that bullshit if you were using older GPUs.

The argument that efficiency is bad… Is not a good argument.

MidJourney is all “Hey, you guys like porn right? Just think of all the porn possibilities in your future with us!”

https://www.reddit.com/r/midjourney/comments/1eg3mg8/were_in_the_endgame_now/

What I find really interesting is that in that video you have photorealistic people, pretty much perfect in every way… except the hands are still messed up. (Note that in most of the clips they don’t show hands at all, but when they do… they’re weird.)

It’s so interesting to me that hands, specifically, seem to be so difficult. I almost feel like faces would be harder, or at least, faces would be more likely to trigger an uncanny valley situation since our brains are so heavily focused on them.

It’s like the AI systems are all Rob Liefeld.

What a great interview! Thanks for posting. The guy is a wonderful communicator of these ideas. I guess he’s spent a lot of time thinking about them!

Doesn’t the volume of work always expand to fill any available capacity, whatever capacity you consider? It’s not like energy use is going to go down, is it?

Thank you, sir!

Jevons paradox isn’t an effective argument against increasing efficiency though.

Improving the efficiency of GPUs isn’t going to make people do less computation, obviously. It’ll let us do more computation for the amount of energy we expend.

Another example is adding insulation to homes. You’d expect people to use less energy to heat their homes, but it turns out they just turn the thermostat up.

It’s less of a paradox when you consider that efficiency gains effectively increase the utility of a dollar’s worth of energy. That makes it worth using more.

Yeah, it’s kind of obvious when you consider normal patterns of human usage. When you have resources, you tend to use them, as there’s limited utility in conserving them. A business has money to use; it doesn’t serve much purpose to have it sit idle. When people get raises, they don’t tend to just save more money, they tend to increase their standards of living.
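
A toy version of the insulation example, with numbers I just made up to show the shape of it:

```python
# Jevons-style rebound, illustrated with invented numbers for the insulation example above.
# Assume insulation halves the energy needed per degree of indoor warmth.
kwh_per_degree_before = 2.0   # made up: kWh per degree per day, uninsulated
kwh_per_degree_after = 1.0    # made up: kWh per degree per day, insulated

thermostat_before = 18        # made up: where people kept the thermostat when warmth was pricey
thermostat_after = 22         # made up: warmth is cheaper now, so the thermostat goes up

print(kwh_per_degree_before * thermostat_before)  # 36.0 kWh/day before
print(kwh_per_degree_after * thermostat_after)    # 22.0 kWh/day after: less, but nowhere near half
```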

Again, though, the idea from that article that increasing the efficiency of GPUs is somehow bad was just a straight-up dumb argument. He didn’t even seem to understand the notion of computational efficiency; he was just looking at the raw power consumption of a single GPU, without considering what the GPU could do with that power.

The implication of improved GPUs isn’t going to be reduced power usage. It’s going to be increased computational power per watt, letting us do more stuff for the same amount of energy. Again, it’s good.
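
To put rough numbers on it (the power figures are the ones quoted from the article; the throughput ratio is a made-up assumption just to illustrate the point):

```python
# Toy illustration: a chip can draw more power and still cost less energy per unit of work.
# Power figures come from the quoted article; the 3x throughput ratio is a made-up assumption.
hopper_watts = 1_000         # per-GPU draw quoted for Hopper
blackwell_watts = 1_875      # per-GPU draw quoted for Blackwell

hopper_throughput = 1.0      # normalize Hopper to 1 unit of work per second
blackwell_throughput = 3.0   # HYPOTHETICAL: assume roughly 3x the work per second

# Energy per unit of work (joules) = watts / (units of work per second)
print(hopper_watts / hopper_throughput)        # 1000.0 J per unit of work
print(blackwell_watts / blackwell_throughput)  # 625.0 J per unit of work: less energy for the same job
```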

The obvious solution is to build just a single GPU, and people can fight for which project is the most important to run on it. That way, the utility provided for the energy used will be maximized.