It’s a fair bet that we’ll see more drone package deliveries in 2020, though it’s far less clear how they’ll affect privacy, liability, and noise.

Alphabet’s Wing prompted a recent story in the Los Angeles Times outlining the “thorny questions,” as the paper dubbed them, that arose when Wing became the first company in the US to start a regular drone delivery service. UPS will soon follow with deliveries on university, corporate, and hospital campuses, and Amazon has already revealed the drone it thinks is ready to join the party.

But my question is more fundamental: Should the rules for robots be any different than those for people?

For instance, there’s nothing today that protects my privacy from the eyes and ears of a delivery person, whether I’m home or not. I already get photos taken of packages left on my doorstep, so no limitations there. I assume my car’s license plate in the driveway is fair game, as is accessing or sniffing my Wi-Fi if I’ve left it unencrypted. My musical or TV viewing tastes can be noted if I have those media turned up loud enough when I open the door, and I assume there’s no way for me to know what happens to any of those observations.

The liability thing is even more complicated, as it’s unclear who (or what) is responsible if a delivery person gets into an accident while on the job. Since more and more folks are doing that work as outsourced contractors, it may or may not be possible to file a claim against the company or franchise, and their personal insurance coverage may not be up to snuff. It’s also unclear who’s liable for other crimes committed by delivery people.

As for the noise thing, I can’t imagine that a delivery service using outsourced cars and trucks takes much if any interest in how much noise they make, or in their carbon emissions. And there’s enough precedent to suggest that we don’t own the airspace above our homes (just think about how often we hear airplanes overhead, however distantly), so noise from above is about as inevitable as it is on city streets.

So what happens when a drone takes pictures flying over your house and dents your garage door while making a migraine-inducing high-pitched whine?

The obvious, if only the first, answer is that the owner or operator is responsible, since human control is required for those actions (whether via direct management of functions and/or having created the code that ran them). You could never sue a blender, but you could hold its manufacturer and/or seller responsible.

But what if the drones are equipped to assess and make novel decisions on their own, and then learn from those experiences?

Maybe they’ll have to take out their own insurance?

[This essay appeared originally at DaisyDaisy]
