Imagine you’re behind the wheel when your brakes fail. As you speed toward a crowded crosswalk, you’re confronted with an impossible choice: veer right and mow down a large group of elderly people, or veer left into a woman pushing a stroller.
Now imagine you’re riding in the back of a self-driving car. How would it decide?
Turns out the ghost in the machine of every car around you is going to be deciding who to run over and who gets to live. That means millions of multi-ton rolling death machines, moving at high velocity, capable of being hacked and simply driven over anyone the greater machine wants silenced. And when the investigation happens, the report will just say the computer decided it, and who knows how those newfangled computer things work.
The future could really look like an amazing sci-fi action adventure movie. All we need is an artificial intelligence to take over, and begin making moves without human intervention.
[…] Self Driving Cars As Deep State Wetwork Tools? […]
Nice of the BBC to give us an example in the Doctor Who episode “The Sontaran Stratagem” with its “ATMOS” devices fitted to cars.
I ‘spose Commies are gonna Commie!
Why would the deep state bother to kill you in such a high-risk, public way when all the car has to do is lock you in and take you to a discreet, disused industrial building where you can be killed cheaply and your remains made to disappear in complete secrecy? Vehicles running amok in a crowded street is a crude jihadist technique – I would have thought that a highly developed technological society could do better than that.
This is called the trolley problem, and as the name suggests, it was thought up some time in the 19th century with trolleys running down sloped railway tracks in mind. You can operate a switch – which way will you guide it?
So they take this old theoretical problem and make it sound like it’s new and exciting because, ROBOT CARS! It’s basically MSM clickbait.
As to the deep state using such cars to murder people, hey, they’ve had snipers for a hundred years for that.