The thread of sh*tty things being done with AI

I do think this needs to be broken out from the general AI thread, as one of the main problems with AI is what shitty things shitty people will do with it.

I’m leading with this article. Hydra would be envious.

Yeah, 37,000 people targeted for assassination via AI.

This appears to be the bloke who did it.

They added: “There were times when a Hamas operative was defined more broadly, and then the machine started bringing us all kinds of civil defence personnel, police officers, on whom it would be a shame to waste bombs. They help the Hamas government, but they don’t really endanger soldiers.”

The people they interviewed don’t sound any less coldhearted than the AI does.

I feel like the thread should be named “…being blamed on AI”. It’s very easy to make these tools give you the exact answers that you want.

To contribute, here’s another example: Using AI to Set Prices Can Be a Form of Price Fixing, FTC and DOJ Say | Inc.com

Indeed, that’s why they’re using it. They get the indiscriminate killing they want, and they get to blame it on software.

Yeah, it’s probably just a very thin moral fig leaf.

I think we can add this from a while back:

Salad AI…

A guy named Bob Miles appears to be in charge there; Daniel Sarfati also appears to play a key role.

There’s a funny case of “we don’t know what we’re doing, so maybe porn”. If they don’t know what they’re doing, I would have expected opting out to result in you generating nothing.

I guess people don’t realize that these activities cost them money via electricity, unless they are leeching off their parents, school, etc.

On the Salad website, the company explains that “some workloads may generate images, text or video of a mature nature”, and that any adult content generated is wiped from a user’s system as soon as the workload is completed. This fuzzy generalisation around sexually explicit content creation is likely because Salad is unable to view and moderate images created through its platform, and is thereby covering its bases by marking all image generation as “adult” in general.

I think they don’t care - especially teens using their folks’ stuff ;)