Massive CPU Security Flaws Revealed


#442

Academics and corporations have already built quantum computers, but they have only a very limited number of qubits, maxing out at 72 last I saw, and they are still slower than traditional computers that don’t use alternate universes. So they need to get much wider, with lower error rates and longer persistence, to achieve what they call (in a rather dramatic fashion) quantum supremacy.


#443

But what if those universes want their calculations back and invade us with an interdimensional battle fleet?


#444

Whatever you do, remember to bring a towel.


#445

I still say quantum computing should take a back seat to 3D computing. It would take an entirely new way of manufacturing chips though and an entirely different way of programming.


#446

The beauty of 3D computing is that through the power of Artificial Intelligence, Nvidia DLSS would do it twice as fast.


#447

I remember having a lot of hope that diamond substrates would become a new thing and make the heat density problems much easier to deal with, but it doesn’t seem to have turned into anything commercially practical. It feels like we are trying to jump ahead on the technology evolution S-curves without some of the intermediate steps.


#448

Why use DLSS when you could use Radeon 3DD for twice the heat and 10% slower rendering?


#449

Well, compound semiconductors like GaN aren’t skipping ahead; they’re a whole new process that needs to mature, but they’re not too different from silicon. They’re the obvious next step.

3D architecture on silicon/GaN/carbon (crystalline diamond), carbon nanotubes, graphene, etc.: those are much bigger leaps ahead. And quantum computing is conceptually even further out, if quantum supremacy ends up being real.


#450

I have my suspicion that if someone based a paper on the old-school laws of entropy and applied them to the quantum computing dream, he or she would be able to show that the energy you put into a quantum computer to extract information from the qubits is the same as the amount of energy you put in to extract it from a normal computer. And that person would be famous.

From an information systems viewpoint, quantum computing sounds like a perpetual motion machine.
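Roughly, the classical floor I’m thinking of is Landauer’s bound (a textbook result, not anything specific to these chips): erasing one bit of information at temperature $T$ costs at least

$$E_{\min} = k_B T \ln 2 \approx 2.9 \times 10^{-21}\,\text{J at } T = 300\,\text{K},$$

though in principle reversible computation, quantum or classical, can sidestep that erasure cost, so the perpetual-motion analogy only goes so far.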


#451

OK, it’s time.


#452

Don’t panic! Due to a terrible miscalculation of scale the entire battle fleet will be accidentally swallowed by a small dog.


#453

If alternate universes with people exist, isn’t it likely they are already fucking with us?


#454

Did anyone ever get anywhere with gallium arsenide? That is what Seymour Cray was trying to use at the end.


#455

Clearly we are the alternate universe…


#456

We’re the backup universe in case the “true” universe borks itself.


#457

If we are the backup to the “true” universe, then the “true” universe is in a boatload of shit.


#458

This forum needs likes.


#459

That’s awesome. Like a quaternion in Hilbert space.


#460

Gallium Nitride (GaN) is starting to pop up in aftermarket power adapters (where it’s credited with making them smaller and lighter).

Intel has been talking about it, too, with some references mentioning it as something they’ll explore during the 7nm cycle (I’ve yet to see any realistic projection of an actual marketable product from it, though). It’s expected to be really good at high-temperature, high-frequency operation.


Ars has a good write-up from 2016.


#461

https://www.theregister.co.uk/2019/03/05/spoiler_intel_flaw/

SPOILER, the researchers say, will make existing Rowhammer and cache attacks easier, and make JavaScript-enabled attacks more feasible – instead of taking weeks, Rowhammer could take just seconds. Moghimi said the paper describes a JavaScript-based cache prime+probe technique that can be triggered with a click to leak private data and cryptographic keys not protected from cache timing attacks.

Mitigations may prove hard to come by. “There is no software mitigation that can completely erase this problem,” the researchers say. Chip architecture fixes may work, they add, but at the cost of performance.
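For anyone wondering what a cache prime+probe actually looks like at the low level, here’s a rough sketch of the timing primitive it relies on (a simplified illustration, not the researchers’ code; the stride, way count, and “slow” threshold are all hypothetical guesses):

```c
/* Very rough sketch of the timing primitive behind a cache
 * prime+probe attack.  x86-64 with GCC/Clang; the 4 KiB stride,
 * way count, and "slow" threshold are illustrative guesses, and a
 * real attack needs a proper eviction set for the victim's address. */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <x86intrin.h>          /* __rdtscp, _mm_mfence */

#define WAYS 8                  /* assumed associativity of one L1 set */

/* Measure the latency of a single load in TSC cycles. */
static uint64_t probe(volatile uint8_t *addr)
{
    unsigned aux;
    _mm_mfence();
    uint64_t t0 = __rdtscp(&aux);
    (void)*addr;                /* the access being timed */
    uint64_t t1 = __rdtscp(&aux);
    _mm_mfence();
    return t1 - t0;
}

int main(void)
{
    /* Prime: load WAYS lines that all map to the same cache set
     * (same page offset, 4 KiB apart on a typical L1). */
    uint8_t *buf = aligned_alloc(4096, WAYS * 4096);
    for (int i = 0; i < WAYS; i++)
        buf[i * 4096] = 1;

    /* ... victim activity would happen here ... */

    /* Probe: a reload that comes back slow (roughly >100 cycles,
     * machine-dependent) means that line was evicted, i.e. the
     * victim touched something mapping to the same set. */
    for (int i = 0; i < WAYS; i++)
        printf("line %d: %llu cycles\n", i,
               (unsigned long long)probe(&buf[i * 4096]));

    free(buf);
    return 0;
}
```

As I understand it, the SPOILER part is about recovering physical-address information that makes building those eviction sets much faster, and the JavaScript variant does the timing with in-browser timers instead of rdtscp.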

I’m seriously considering going back to AMD with my next build. What a cluster.