Slaughterbots
November 19, 2017 1:16 PM

UC Berkeley professor Stuart Russell and the Future of Life Institute have created an eerie viral video titled "Slaughterbots" that depicts a future in which humans develop small, hand-sized drones programmed to identify and eliminate designated targets. In the video, the technology is initially developed with the intention of combating crime and terrorism, but the drones are taken over by unknown forces who use the powerful weapons to murder a group of senators and college students. UC Berkeley professor's eerie lethal drone video goes viral [Warning: graphic violence]
posted by chavenet (64 comments total) 32 users marked this as a favorite
 
This is basically the plot to the Black Mirror episode, Hated in the Nation, except the drones (robo-bees in this case) weren't even originally designed to be weapons.
posted by lilies.lilies at 1:35 PM on November 19, 2017 [8 favorites]


> "We have an opportunity to prevent the future you just saw, but the window to act is closing fast."

So we’re counting on police, the military, criminals and/or politicians to all pass on the opportunity to have their very own army of slaughterbots because of ethical issues and fears of unintended consequences? That worked out well with nuclear weapons.
posted by The Card Cheat at 1:44 PM on November 19, 2017 [11 favorites]


Why would anyone program in the Three Laws of Robotics when you could make a bigger profit NOT programming in the Three Laws of Robotics? Oops. I guess that's why.
posted by rikschell at 1:53 PM on November 19, 2017 [4 favorites]


The kind of people who only respect and believe in power are not the kind of people we should trust with it. Unfortunately, nobody believes in anything else anymore it seems. We need to figure out some new common ethical norms and standards of decency and get everybody onboard fast.

Ha ha. Oh dear lord how my heart aches for the future!
posted by saulgoodman at 1:53 PM on November 19, 2017 [7 favorites]


So, this is a viral video in support of a ban on lethal autonomous weapons...

And yet it chooses basically the least-bannable killer robot imaginable as its example! A standard quadcopter platform, plus explosives, plus software. Quadcopters are simple enough that an undergrad could build one from scratch even if we tried to ban them, explosives are already about as banned as people can make them (which is not banned enough, as people can make them), and software is possibly the most difficult category of thing to ban that has ever existed in human history. (I say, while at this very second a libdvdcss-based application is ripping a new addition to my video collection)

I am all in favor of not permitting the military's fleets of killbots to make the life-or-death decisions with no human in the loop, fine. But I don't see how that missing permit is going to do much to stop terrorists like the ones hypothesized in this video.
posted by roystgnr at 1:55 PM on November 19, 2017 [17 favorites]


They set a Slamhound on Turner's trail in New Delhi, slotted it to his pheromones and the color of his hair. It caught up with him on a street called Chandni Chauk and came scrambling for his rented BMW through a forest of bare brown legs and pedicab tires. Its core was a kilogram of recrystallized hexogene and flaked TNT.

- Count Zero, William Gibson
posted by tclark at 2:02 PM on November 19, 2017 [49 favorites]


The idea reminds me a little bit of the bolo in The Counselor, but with auto-delivery.
posted by hwestiii at 2:08 PM on November 19, 2017 [1 favorite]


Please excuse me if I seem less than terrified by a hypothetical future where a small device can kill dozens of people with a minimum of effort. We already live in that future and have done for over a hundred years. What exactly is the difference that makes a gun on a RC helicopter more terrifying than guns in the hands of millions of Americans? Is it because they can fly? Is it the automation thing? As far as I can tell robots will never be as effective at killing people as people are at killing people.
posted by runcibleshaw at 2:18 PM on November 19, 2017 [12 favorites]


What exactly is the difference that makes a gun on a RC helicopter more terrifying than guns in the hands of millions of Americans?

Anonymity plus economies of scale.
posted by mhoye at 2:26 PM on November 19, 2017 [49 favorites]


As far as I can tell robots will never be as effective at killing people as people are at killing people.
Robots will never question unlawful orders, deliberately fail to kill people, or have costly PTSD diagnoses.
posted by xyzzy at 2:31 PM on November 19, 2017 [4 favorites]


What exactly is the difference that makes a gun on a RC helicopter more terrifying than guns in the hands of millions of Americans?

Also the potential for more precise targeting than the Las Vegas shooter was capable of. So while every person may not have reason to be more terrorized, potential targets should be.
posted by oneswellfoop at 2:36 PM on November 19, 2017


Hopefully it's more like Bolo from The Mighty Boosh.
posted by Brocktoon at 2:46 PM on November 19, 2017 [1 favorite]


Anonymity plus economies of scale.

Are you referring to RC helicopters or guns?
posted by 2N2222 at 2:55 PM on November 19, 2017 [2 favorites]


This ends in the cyberpunk scenario where smaller robots that kill killer robots beget smaller killer robots which beget smaller robots that kill killer robots until the air is basically just robots killing each other.
posted by ethansr at 3:09 PM on November 19, 2017 [9 favorites]


Needs more backflips.

*goes back to stockpiling microwave magnetrons.
posted by loquacious at 3:22 PM on November 19, 2017 [6 favorites]


Come on, 2017, I *just* beat Horizon Zero Dawn. Can we be a little less on the nose?
posted by fast ein Maedchen at 3:30 PM on November 19, 2017 [2 favorites]


It's really about anonymity, accuracy, quantity, and no fear of consequences. Sure, suicide bombers, or even suicide knife wielders, could do more or less the same thing. Not so easy to find millions of highly skilled kamikazes though.

It's probably time to require a license to operate drones for any reason ever. Even if just deploying fully autonomous units, somebody has to be in charge, kind of thing.

Thinking about technical solutions... There isn't much sign of hobby-scale semiconductor fabs, so mandated built-in controls in certain key chips is still a possibility. GPS and Wi-Fi chips are the obvious things to target. Some kind of complicated shutdown condition?
If the built-in accelerometer detects inhuman motion, and the built-in proximity sensor detects a living creature within 15m, shut down immediately.
Very large scale actors would easily bypass such limitations, but any degree of anonymity would be pretty impossible. Detailed analysis of the non-compliant parts would fingerprint the source(s) pretty easily.

It's a hard problem.
posted by Chuckles at 3:30 PM on November 19, 2017 [4 favorites]
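The mandated shutdown condition proposed in that comment could be sketched in a few lines of firmware-style logic. Everything here — the sensor interfaces, the 8 g cutoff for "inhuman motion", the helper names — is a hypothetical illustration of the idea, not any real chip's spec:

```python
from typing import Optional

# Hypothetical chip-level kill switch of the kind proposed above.
# Thresholds are illustrative assumptions only.
INHUMAN_ACCEL_G = 8.0      # sustained acceleration no hand-carried device should see
PROXIMITY_LIMIT_M = 15.0   # shutdown radius proposed in the comment

def should_shut_down(accel_g: float, nearest_living_m: Optional[float]) -> bool:
    """Return True if the mandated built-in controls should halt the device."""
    inhuman_motion = accel_g > INHUMAN_ACCEL_G
    living_nearby = (nearest_living_m is not None
                     and nearest_living_m < PROXIMITY_LIMIT_M)
    return inhuman_motion and living_nearby
```

A drone diving at a person trips both conditions and halts; a phone jostled in its owner's pocket trips neither. The hard part, of course, is not the logic but making the sensors and the mandate tamper-proof.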


There isn't much sign of hobby-scale semiconductor fabs, so mandated built-in controls in certain key chips is still a possibility. GPS and Wi-Fi chips are the obvious things to target. Some kind of complicated shutdown condition?

The result would be underground fabs or a bustling trade in pre-legislation chips.
posted by leotrotsky at 4:08 PM on November 19, 2017 [1 favorite]


I guess it seems scarier than dealing with the usual gun-wielding maniacs because we'd be entirely defenseless. How effective would a SWAT team be against this threat? We'll need counter-terrorism drones. A big group of people will want to own lethal micro drones "for self defence" and fight any legislation that might in any way deprive them of that sort of defence.

Even less sophisticated drones could likely undermine current safety procedures at airports by being able to attach themselves to taxiing jets. How do you stop that from happening? How much more control of our lives will we be willing to surrender to safeguard our families? How much more wealth will disappear down the black money hole of "security"?
posted by bonobothegreat at 4:14 PM on November 19, 2017 [2 favorites]


The real concern would be targeted assassinations of political figures. Particularly when one vote can have huge consequences. Democratic Senators in states with Republican Governors, or Republican Justices with Democratic Presidents. It’s the final step in the escalation of political brinksmanship. Drones can move faster than security people can react. If successful, it will result in security zones that will really isolate political figures from their constituents.
posted by leotrotsky at 4:15 PM on November 19, 2017 [6 favorites]


What's the slogan from that episode of the Simpsons? "Your job is not to fight the wars of the future, your job is to maintain and service the robots that will fight the wars of the future."

The US already has drones flying over the skies of multiple countries, blowing people up with abandon. I guess since they're run by a bunch of sweaty pilots in a facility somewhere, there's some semblance of humanity guiding the decision. Automating it like a Facebook marketing campaign is the kind of thing that makes keyboard commandos go weak at the knees.

Since STEM is all about coulda, not about shoulda, I suspect a "we have a death robot gap" will be a political fight sooner rather than later.
posted by fifteen schnitzengruben is my limit at 4:24 PM on November 19, 2017 [1 favorite]


Autonomous target identification seems like the most difficult part. If you don't have that, you have to send a video signal back to the controller. A video signal requires high bandwidth, which in practice means a high carrier frequency (short wavelength), which limits the distance the controller can be from the drone. To get around that, the controller needs to either use a public network (making it potentially trackable) or build a private high-frequency repeater network (which would also draw attention).

If someone does achieve autonomous target identification, it will likely suck a fair amount of power, again limiting range.

Quadcopter style drones are also very noisy, and I believe that some of that noise is an inevitable result of the physics of small rotors and heavy loads. So the would-be assassin is forced into a noisy takeoff within a couple of kilometers of their target, and a noisy approach toward their target.

Some of these problems will be solved, but I suspect it'll be a while before this threat matches that of the Beltway sniper. For now, you'd probably be better off training a crow to hate your assassination target, and then strapping a bomb to the crow.
posted by clawsoon at 4:39 PM on November 19, 2017 [2 favorites]
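The range intuition in that comment can be put on a back-of-envelope footing with the free-space path loss formula, FSPL(dB) = 20·log10(d) + 20·log10(f) + 20·log10(4π/c). The 100 dB link budget below is purely illustrative (real links have antenna gains, obstructions, and regulatory power limits), but it shows how a video-capable high-frequency link trades away range:

```python
import math

C = 3.0e8  # speed of light, m/s

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB (ideal: no obstacles, no antenna gain)."""
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / C))

def max_range_m(link_budget_db: float, freq_hz: float) -> float:
    """Distance at which free-space loss consumes the whole link budget."""
    return 10 ** ((link_budget_db
                   - 20 * math.log10(freq_hz)
                   - 20 * math.log10(4 * math.pi / C)) / 20)

budget = 100.0  # dB, an illustrative assumption
r_900mhz = max_range_m(budget, 900e6)  # low-bandwidth control-link band
r_5ghz = max_range_m(budget, 5.8e9)    # common analog-video band
```

With this (made-up) budget the 900 MHz link reaches a couple of kilometers while the 5.8 GHz link reaches only a few hundred meters — range scales as 1/f for a fixed budget, so the 5.8 GHz link gets roughly 1/6 the distance. That lines up with the "noisy takeoff within a couple of kilometers" constraint above.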


Gibson also revisits this in the Peripheral with swarms of nanobots that take someone completely apart.
posted by kokaku at 4:43 PM on November 19, 2017 [1 favorite]


Also the Knife Missiles from Iain M. Banks' Culture books (less swarm than high-powered AI-driven flying killing machine that usually acted solo).
posted by kokaku at 4:59 PM on November 19, 2017 [1 favorite]


Some of these problems could be solved by using fixed-wing drones instead of 'copter drones. For a given amount of power and load, they can fly farther, faster and quieter. However, higher speed also means that they'd have to process target data that much faster; with a fixed-wing craft, there's no such thing as sitting still and observing the scene. With long enough wings, it could float in silently, but then you've got a 10-foot albatross floating down toward the target.

The more that I think about this, the more I think that a land-based drone would be more effective. It could quietly drive (or be dropped from a drone) into place the night before, quietly sit and process the scene, and have a stable platform for aiming and firing. It could carry bigger batteries, a bigger payload, and more visual processing power. It wouldn't have to use any of its energy just to hold itself up. Drop it into the rough of your target's favourite golf course the night before a game and drive it into the trees. Then, let it watch.

I may, however, be completely wrong about this. (Morally, I'm definitely completely wrong about this.)
posted by clawsoon at 5:08 PM on November 19, 2017 [1 favorite]


training a crow to hate

sounds like a weird metaphor for the internet
posted by Sebmojo at 5:08 PM on November 19, 2017 [2 favorites]


Now, can I bid Bitcoin via smart contract to hire a set of these for personal use?
posted by Samizdata at 5:09 PM on November 19, 2017 [1 favorite]


I think that a land-based drone would be more effective.

Sounds familiar
posted by leotrotsky at 5:22 PM on November 19, 2017 [3 favorites]


So, there's no Gene Simmons in this remake? Came for the Runaway reference, left disappointed that I had to make it myself.
posted by sysinfo at 5:23 PM on November 19, 2017 [8 favorites]


Oh, there it is!
posted by sysinfo at 5:23 PM on November 19, 2017 [3 favorites]


Cicada drone, years undetected until the victim walks by and zap.
posted by sammyo at 5:24 PM on November 19, 2017 [1 favorite]


Drones for me, but not for thee, because that's exactly where this is going.
posted by Beholder at 5:42 PM on November 19, 2017 [4 favorites]


Seems like you could have a two part system:

First part, with either facial recognition or just remotely connected to a person, uses a laser designator to paint a target. This can be distant from the target, and larger in size. Could also be disguised to blend into environment, and if it goes quiet and still after it does its job it’d be pretty hard to detect.

Second part is a small laser guided missile. It’d be loud, but very fast to target. The designator and the missile need not be anywhere near each other.
posted by leotrotsky at 5:42 PM on November 19, 2017 [1 favorite]


Laser designators can paint a target up to 5 kilometers out.
posted by leotrotsky at 5:51 PM on November 19, 2017 [1 favorite]


An important but subtle point here is that when you're deploying competing weapons platforms reaction time is a competitive advantage, and human decisions are slow; another way to say that is that any weaponized systems that give humans a window for moral choice are at a pure disadvantage over those that don't.

If we decide to deploy those systems, then we're going to see atrocity after atrocity that we can only understand forensically, if at all, and only if we put the work in. And given the state of infosec we'll be lucky to get one generation of use-as-intended before it's all about who can game these systems at various different scales. So in about 30 years there will be no-go zones on military maps in regions where autonomous weapons systems no longer have verifiable command/control/ownership infra, and many of those will be semipopulated, quasi-feral urban areas.

Autonomous weapons are going to be our generation's landmines.
posted by mhoye at 6:08 PM on November 19, 2017 [10 favorites]


I've made a similar comment before, but it's funny when we watch the Boston Dynamics videos of the quadruped robots. Those things will eventually be used to kill people. Not necessarily the ones built by BD, but some progeny of them.

We're in this race to nullify and continue advantages of asymmetric warfare. On one hand, I'm terrified of governments having exclusive access to Gibsonian murderdogs, on the other hand, democratizing bomb swarms doesn't seem like a step in the right direction either.

• Quadruped MurderDogs.
• Bomb Swarms.
• Hacking driverless/IOT Cars.
• Social Media Bot Armies.
• Social media's tendencies to radicalize subgroups.
• Dismantling of monoculture/objective truths/consensual reality etc.
• Emergent AI and the current international AI Arms Race.
• CRISPR technology.
• CRISPR technology in conjunction with biowarfare.
• Bespoke, DNA specific biowarfare.
• Rise of totalitarian/populist governments.
• Whatever the hell nanobots are.
• Equifax-style bullshit and hacking in general.

What am I missing? I wish I could just sign up for a paleolithic LARPing enclave, sort of like a cross between the twist in The Village and the Savage Reservations of Brave New World.
posted by Telf at 6:09 PM on November 19, 2017 [4 favorites]


I initially thought this was about the NZ butcher robots, in their own way horrifyingly efficient. I saw one in action recently but photos weren't allowed. Being in a room with something that appeared to think (variable-length pauses before cutting) was unsettling. Quite apart from the fact that these will displace skilled jobs. No links, as I couldn't find anything illustrative but non-disturbing.
posted by unearthed at 6:39 PM on November 19, 2017 [1 favorite]


The winged drones will likely drop clusters of wheeled or quadruped drones that can sit and observe before hatching the killer micro drones.
posted by bonobothegreat at 6:52 PM on November 19, 2017 [2 favorites]


I just want to stand on a balcony and release a bunch of these while shouting "Fly, my pretties" and then cackling. Is that too much to ask?
posted by greenhornet at 7:46 PM on November 19, 2017 [6 favorites]


Autonomous weapons are going to be our generation's landmines.

No, because their batteries will run out. Landmines last longer because the charges don't use energy just sitting there. Autonomous systems have much higher overhead.

...unless they're solar powered.
posted by leotrotsky at 8:01 PM on November 19, 2017 [1 favorite]


No, because their batteries will run out. Landmines last longer because the charges don't use energy just sitting there. Autonomous systems have much higher overhead.

Not so much on the landmines. I know there has been work done on landmines that can redistribute themselves to fill gaps in a minefield, which is terrifying in and of itself.
posted by Samizdata at 8:03 PM on November 19, 2017 [1 favorite]


mhoye: An important but subtle point here is that when you're deploying competing weapons platforms reaction time is a competitive advantage, and human decisions are slow; another way to say that is that any weaponized systems that give humans a window for moral choice are at a pure disadvantage over those that don't.

That's true in Clausewitzean "absolute war", but most wars are fought with intentionally limited means and goals. Achieving a morally plausible victory in addition to victory on the battlefield has been a goal of most, if not all (looking at you, Assyrians and Mongolians), successful empires. In part, it makes financial sense: Conquered areas are more likely to surrender and more likely to produce surplus wealth and taxes after being conquered if the victor made "moral" choices while killing, and taxpayers at home are more likely to support a "moral" war. That's not always true, but it has been true more often than not.

So in about 30 years there will be no-go zones on military maps in regions where autonomous weapons systems no longer have verifiable command/control/ownership infra, and many of those will be semipopulated, quasi-feral urban areas.

The modern instances which come to mind of no-go massacre zones, in which all moral restraints were left behind in the service of killing as many people as quickly as possible, were at opposite ends of the technological spectrum: nuclear bombs in Hiroshima and Nagasaki, and machetes in Rwanda.
posted by clawsoon at 8:14 PM on November 19, 2017 [2 favorites]


An additional reason this particular scenario is worrisome is that it gives significantly more destructive power to a small group. In the case of a bomb or guns, you are still somewhat limited by the number of people you can convince to carry out an attack.
In the video's storytelling, one person could have programmed the thousands of killbots and then unleashed them on a city.
posted by mulligan at 8:27 PM on November 19, 2017 [1 favorite]


I'm all for this. I have a little girl now. And everyone keeps congratulating me and going "aww, isn't she sweet" instead of shitting themselves in terror. Which I just can't understand for the life of me. Because the way things are going, to make sure she has a future worth living in, it's looking more and more like I'm going to have to kill a whole lot of people. Like millions of them. Tens of millions more likely. And I just don't see how I'm going to manage it without some serious industrial scale automation.
posted by Naberius at 8:32 PM on November 19, 2017 [3 favorites]


Count me on team this isn't nearly as scary as shit that is already being used to kill people right now every day.
posted by aspersioncast at 8:58 PM on November 19, 2017


Daniel Suarez wrote a good book about this. Also has a TED talk.
posted by ryoshu at 9:10 PM on November 19, 2017 [1 favorite]


there will be no-go zones on military maps in regions where autonomous weapons systems no longer have verifiable command/control/ownership infra, and many of those will be semipopulated, quasi-feral urban areas.

Second Variety, PKD
posted by j_curiouser at 9:13 PM on November 19, 2017 [4 favorites]


I think that a land-based drone would be more effective. It could quietly drive (or be dropped from a drone) into place the night before, quietly sit and process the scene, and have a stable platform for aiming and firing.

Not mutually exclusive. People generally choose to build drones that either fly or roll / walk / crawl / whatever, but you could pretty easily build one that does both. Obviously you sacrifice some of your weight budget for the secondary mobility system, but it could be a valid tradeoff. I don't know if it would offer a big tactical advantage over a purely flying drone, but... it's certainly possible.

If someone does achieve autonomous target identification, it will likely suck a fair amount of power, again limiting range.

Seems like a bet against Moore's Law; always dangerous. There's a huge amount of commercial pressure to build very space/power-efficient devices, and accompanying software, which can be applied to this problem; it's not a military exclusive domain. The same hardware that makes Snapchat's face swapping algo run faster on your phone could potentially let a small autonomous platform do target identification at some point in the not-too-distant future. The computational power of off-the-shelf mobile hardware is getting better all the time. And you don't have to light up all the silicon all the time; it can sit there idle until you need to use it, not drawing power.

[re NZ lamb butchering robots] No links, as I couldn't find anything illustrative but non-disturbing.

"Non-disturbing" seems like a high bar considering the subject matter... But how can you not love everything about this video? The production values, the music, and most of all the can't-say-it-with-a-straight-face name (the "Automated Lamb Boning System"). Anyway, the pièce de résistance is the "Hindquarter System" at around 3m10s.

I suspect a "we have a death robot gap" will be a political fight sooner rather than later.

Set your time machine to April... 2016.
[W]e believe we're in a world of what we call "fast followers." And as the Deputy Secretary of Defense, I'm okay with that. I just want to make sure that the United States and NATO are the fast leaders. And if people -- if adversaries try to copy us, we will always want to be ahead of what they're trying to do.
[W]e believe quite strongly that the technological sauce of the Third Offset is going to be advances in Artificial Intelligence (AI) and autonomy. [...] A.I. and autonomy put inside these battle networks is going to allow collaborative human-machine operations to absolute new levels, allowing machines to do what they do best, and the humans to do what they do best, in what we call human-machine symbiosis.
So again, whenever we talk about this, usually I hear killer robots come up. I want to emphasize that this third offset strategy is about making the human better. It's about making the human operate better.
reaction time is a competitive advantage, and human decisions are slow; another way to say that is that any weaponized systems that give humans a window for moral choice are at a pure disadvantage over those that don't.

This is true and a point very well taken, although it's not new. The transcript mentioned above specifically mentions missile defense systems as an example of handing over control to an automated system as a necessary step in achieving parity with a competing system (incoming missiles) which otherwise have the advantage.

There was a fair bit of theoretical work done in the 60s and 70s on how to try and balance the competing objectives of maintaining positive control but also preserving reaction time and thus effectiveness, at the strategic level. It's a tradeoff, and it depends on your adversary's position as well as your own, and there's a mutual interest even between adversaries in not putting so much pressure on the other side that it causes a reduction in positive control.

What concerns me is that there doesn't seem to be a widespread understanding that going after an adversary's tactical command and control systems, similar to strategic ones, could also be a very stupid move, if it moves the fulcrum under the capability-vs-control tradeoff (e.g. by encouraging them to give their edge systems the ability to attack targets at will). That's how you get flying landmines and stuff.
posted by Kadin2048 at 9:24 PM on November 19, 2017 [2 favorites]


Second Variety, by Philip K Dick (of course), covers this extremely well.
posted by Slinga at 10:09 PM on November 19, 2017 [1 favorite]


You know who's not threatened by this? Hikikomori. The robots can't get you if you don't go outside. I know who is going to inherit the earth, and it won't be those sun-seeking extroverts.
posted by Balna Watya at 10:42 PM on November 19, 2017 [5 favorites]


Here's your fun terrifying land-based solution, of course from Boston Dynamics.
posted by Harald74 at 12:25 AM on November 20, 2017 [1 favorite]


To get around that, the controller needs to either use a public network (making it potentially trackable)

... so, burner phone SIM cards and anonymous VPN?
posted by sebastienbailard at 12:43 AM on November 20, 2017


If successful, it will result in security zones that will really isolate political figures from their constituents.

We currently accomplish this with massive wealth, lobbyists, and extreme ideologies.
posted by mecran01 at 3:22 AM on November 20, 2017 [3 favorites]


This discussion has gotten me thinking about how quiet muscles are. It'll be a great advance for slaughterbots when they can move themselves silently instead of buzzing and whirring and clanking.

(It also got me wondering whether early animal propulsion systems were louder. It seems like something which would be driven by heavy selective pressure in an arms race between quieter muscles and more sensitive ears.)
posted by clawsoon at 4:01 AM on November 20, 2017 [2 favorites]


See also Runaway (1984), starring Tom Selleck.
posted by biffa at 5:37 AM on November 20, 2017


clawsoon,
I had similar thoughts after watching both the trailer for the Pacific Rim sequel and that Japan vs US Megabots deal.

Our current system of hydraulics and cables just doesn't scale up or down very well.

Regarding early propulsion, I'd assume that ATP popped up pretty early in the evolutionary order of things. I have no idea how it works in insects or invertebrates, but I'm going to guess it's similar to what we mammals have going on. (If only I had access to all of humanity's collective knowledge in one simple search function.)

Anyway, the sliding filament model of muscle contraction is hella complicated but damn if it doesn't get the job done. I feel like we need to move away from the current system of servos and cables and get some synthetic actin/myosin sarcomere shortening going on. There's a lot to be said about serially organized contractile units working in unison. Certainly much more effective than limiting all the movement to one point. An antagonist muscle pair system would also really increase efficiency rather than having hydraulics expand or contract.
posted by Telf at 6:24 AM on November 20, 2017


The Ban Lethal Autonomous Weapons site (which the video points to) has an FAQ answering questions such as: What is the Convention on Conventional Weapons (CCW)? and Does the campaign have any comments on the agenda for the CCW meeting? plus a research and reports page.
posted by brainwane at 6:31 AM on November 20, 2017 [1 favorite]


until the air is basically just robots killing each other

In The Diamond Age this is called a "toner war"
posted by Lazlo Hollyfeld at 6:33 AM on November 20, 2017 [6 favorites]


Drones can move faster than security people can react.

The drone doesn't have to move toward the target, though; it can sit waaaay back with a stripped-down, single-shot rifle and stay powered down until the operator knows that the target is present. So the engines being loud doesn't really help when the movement is done days ahead of time.

Attacks on cars have been popular as assassination methods, this would make them harder to detect. A drone that is driven, in the dead of night, under a car. It attaches to the brake line (or whatever your desired target is), then cuts it at an opportune time.

Do we even have a good way of detecting drones right now? Anything that tries to identify the various means of communicating with drones and can point out what very general direction they are currently in?
posted by Slackermagee at 6:45 AM on November 20, 2017


This discussion has gotten me thinking about how quiet muscles are. It'll be a great advance for slaughterbots when they can move themselves silently instead of buzzing and whirring and clanking.

There were attempts at bomb dogs and spy cats and bat bombs, but in those cases the 'machine' carrying the military technology was too stupid to succeed (although war pigs appear to have worked better).

Technology is just catching up with what humans have been trying to do for a hundred years or more. Send something smart enough to follow orders into battle without worrying if it lives or dies, as long as it causes a lot of death and destruction in its path.
posted by AzraelBrown at 7:20 AM on November 20, 2017 [1 favorite]


...unless they're solar powered.

Or fuel themselves by consuming biomass.
posted by paper chromatographologist at 7:39 AM on November 20, 2017 [1 favorite]


What exactly is the difference that makes a gun on a RC helicopter more terrifying than guns in the hands of millions of Americans? Is it because they can fly? Is it the automation thing?

Traceability and autonomy of the UAVs are big factors. A person who would never perform a suicide bombing can perform many "suicide bombings" at no risk to his person, up to the limits of his budget, and until/unless he gets caught.

Essentially, it lowers the "cost" of a suicide bombing from a price of a highly devoted person's life, to the price of a mass produced UAV.
posted by theorique at 10:09 AM on November 20, 2017 [2 favorites]


What exactly is the difference that makes a gun on a RC helicopter more terrifying than guns in the hands of millions of Americans? Is it because they can fly? Is it the automation thing?

And sometimes distancing mechanisms make it psychologically easier to kill people. Someone who's willing to blow up a federal building and kill dozens of people might shrink from clubbing dozens of people to death with a rock.
posted by sebastienbailard at 10:40 AM on November 21, 2017 [2 favorites]


I feel like we need to move away from the current system of servos and cables and get some synthetic actin/myosin sarcomere shortening going on.

There's a team at Columbia working on "soft actuators" which are impressively muscle-like, although they have a totally different chemical path. They use a silicone-rubber matrix with embedded ethanol bubbles, which are actuated by heating the matrix above 78C using a resistive wire. The boiling ethanol causes the matrix to change shape.

It's pretty inelegant compared to mammalian (and maybe all) muscle tissue, but it does seem to work, and apparently you can spit it out of a 3D printer. I see no reason why you couldn't use something other than ethanol to bring the working temperature down, if you wanted.

Perhaps my mind is just in the gutter, but when I read about its "better compatibility for human-robot interaction", I think its initial applications might be, uh, more lover than fighter.
posted by Kadin2048 at 11:59 AM on November 21, 2017

