Devices Gone Wild

These days all sorts of things are connected to the Internet, even if they are not traditionally thought of as communications devices. We have looked at “devices” such as cars, pacemakers and dolls. We have looked at legal issues such as privacy, defamation and evidence.

The Internet of Things is a gift that keeps on giving … legal questions. Here are a few more examples of what can, and thus will, go wrong.

I. Alexa and the parrot

Alexa is Amazon’s digital assistant. It responds to voice commands and voice inquiries. It will find out information, play music, and order groceries. And in this story, a household parrot imitated the voice of the woman whom Alexa was trained to recognize, and ordered a number of watches. The woman found out about it only when the house started filling up with boxes.

As it happens, Amazon has a very generous return policy for non-digital goods, so sending all the parrot’s orders back should not have been a problem. Even for many digital goods, there is a possibility of returning them for a credit. In the absence of those policies, it seems likely that the parrot would have bound the household.

The purchases are funded by a credit card that the holder has pre-registered with Amazon and authorized for use. Merchants are allowed to impose charges on transactions when the card is not physically present, if the usual authentication steps have been taken on registration – as is the case with any online sale.

A cardholder who owns an Alexa device can protect him/herself with respect to any transaction by requiring a spoken confirmation code before an order is placed, or a confirmation step after the order. For that matter, one can turn the microphone or the whole device off when one is not using it. If the desire for convenient shopping is stronger than the desire for security, the risk properly falls on the consumer. The “tyranny of convenience” can lead us to many risky places.
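The confirmation-code protection can be illustrated with a minimal sketch. This is not Amazon’s actual API – the function and names are hypothetical – but it shows why a code defeats a mimic: the parrot can repeat a phrase it has overheard, but is unlikely to reproduce a code the owner never speaks within its earshot.

```python
# Hypothetical sketch of a voice-order gate requiring a pre-set
# confirmation code. No names here come from Amazon's real API.

def place_order(spoken_command: str, spoken_code: str, expected_code: str) -> str:
    """Accept a voice order only if the speaker supplies the pre-set code."""
    if spoken_code != expected_code:
        return "order rejected: confirmation code mismatch"
    return f"order placed: {spoken_command}"

print(place_order("buy a watch", "1234", "1234"))
# → order placed: buy a watch
print(place_order("buy a watch", "pretty bird", "1234"))
# → order rejected: confirmation code mismatch
```

The design choice matters legally as well: once such an option exists and the owner declines it, the merchant has a stronger argument that it provided an “opportunity to prevent” the error.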

One may note, moreover, the provision of the Uniform Electronic Commerce Act on material error in dealing with automated merchants, a provision implemented in most Canadian jurisdictions. Section 22 of the UECA says that an electronic document (such as a contract or offer to contract) has no legal effect if a natural person makes a material error in dealing with the electronic agent of another party and the electronic agent did not provide an opportunity to prevent or correct that error.

This provision is not likely to work against the merchant in this case, however. First, the party who made the error must be a “natural person” – and the parrot does not qualify. Was the parrot the legal agent for the owner of the Alexa device? Doubtful. Was what the parrot did an “error”? Not clear. Second, Alexa does allow the user to set up a confirmation code ahead of time, before any particular transaction, or to verify orders case by case. The UECA simply says that the electronic agent (Alexa? Amazon? The merchant’s site?) has to provide an opportunity to prevent or correct the error – it does not say at what stage that opportunity must be provided. Amazon’s options surely qualify as either prevention or correction.

The owner of the device may allege a different error: that of Alexa in mistaking the parrot’s voice for the voice of the person authorized to place orders with merchants. However, the owner of the device “trained” it to recognize his or her commands. Is the provider of the machine responsible for its being insufficiently trained? Perhaps if it were insufficiently trainable … which would be hard to prove.

Let us expand the example a bit. The trend these days is for many devices to have sensors and to be programmed to report what they sense – and to do something about it. Thus, a refrigerator will be able to sense that the milk or butter supply is low and to order some. In a thoroughly integrated home, the order would be sent through Alexa (though the fridge would not have to speak – but perhaps we should not rule that out). We can be sure that anyone receiving such orders will be able to tell – or Amazon will – that the orders were routed through Alexa rather than any of the other devices in the home. In any event, the legal issues would not be different if the fridge had a direct line to the supermarket, except possibly for the return policy.

A washing machine may detect that it is not working well and call a service operator for repair. At the first stage, the owner of the machine will probably enter the information about the repair shop. In the AI version, the machine will be able to pick a local shop based on its knowledge of its location and the shop’s.

What happens when the sensors sense wrongly, or the communications component of the device fails (after all, it is a device inside a fridge), and the fridge orders fifty pounds of butter – possibly one pound at a time, but without stopping? Who pays for those?

And if the washing machine chooses a local repair shop whose employees come and burgle the house instead of, or along with, repairing the machine, is the washing machine manufacturer liable for the loss? The AI creator? Amazon, if the communication was routed via Alexa?

To the extent that such questions are not disposed of in the agreement made when the device was acquired, the usual principles of contract and tort law will apply. Who should bear the risk, who can best avoid it, who can best insure against it? How foreseeable do the risks have to be before a contract excluding them will be effective against a consumer?

Ultimately a parrot ordering on Alexa is no different from a child doing so, except that a child might hide the packages once they are delivered.

In short, while the facts are unusual, the law is pretty ordinary. That is worth remembering when resisting calls for targeted legislation to solve problems that existing law can readily dispose of.

Does all this smartness and connectivity seem more risky than it is worth? The day is soon coming when all appliances will have this capacity, like it or not. The information harvested from the machines by their manufacturers, Amazon and its competitors will be too valuable to them to do without, even if the purchaser does not choose to use the automated ordering feature.

II. Fish and the high rollers

Communicating devices do not just connect with the Internet, they connect with each other, forming a communications network that may lack a network administrator, much less security staff. They may have nothing in common but their connectivity.

We have read about hackers who got into a car’s operating computer through the entertainment system, enough to run the car off the road. It seems pretty well established that the tire-pressure sensors might be enough of an entry point to control the vehicle.

Lately some hackers managed to get into a casino’s computer through the thermometer on the fish tank in the lobby, which was connected to the full system. From the computer the hackers extracted a list of names and addresses of the most devoted, or addicted, gamblers. That could be a valuable list to some people.

Connectivity for its own sake – or even for the sake of “convenience” – has its risks. Devices are often small, with not a lot of room for software that could be patched or upgraded. Their developers do not think much about system security. Often the devices have old and well-known flaws but not the almost equally old and well-known patches. Perhaps we should all be in less of a rush to hook up to a device, and to hook up all our devices to each other. Who knows who might be listening in?

A more thorough look at the challenges of a buggy Internet of Things is here.


Connectivity creates potential legal issues. It will be important to resolve as many as possible under existing legal principles, rather than running off for special legislation. The less there is a separate law of cyberspace, the more readily people (or at least lawyers) will understand the risks and remedies and fit them into existing relationships. Despite their apparent novelty, Alexa and her counterparts can sing familiar tunes.


  1. Client: “I didn’t do it! Sure, I said: ‘Alexa, order the hit.’ But I meant the #1 song that was playing–I didn’t know what Amazon would do.”


    Lawyer: “Um, one sec. Alexa, cancel order.”

  2. I.A. Alexa and the dog whistle:

    So – it appears that not only can Alexa understand parrots, it can understand sounds outside the range of human ears.

    I don’t suppose Alexa, or the Amazon shipping department, records the pitch at which the order was made.

    So if somebody sends a high-pitched order on your device, are you responsible for it?

    Suppose the high-pitched command was not to buy something but to send someone a legally offensive message … then what?

    This may be yet another hard-to-foresee bug that the manufacturers will remedy at source, but the gap between what the owners of the device can hear and what a “normal” machine may perceive will remain in any event.

    How would you restrain your device to protect your interests in such cases?
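    One plausible restraint – purely a sketch, assuming the device can estimate the dominant pitch of an incoming command, which no particular product is known to expose – is to reject commands whose pitch falls outside the range of ordinary human speech, so that an inaudible “dog whistle” order never reaches the ordering system.

    ```python
    # Hypothetical sketch: accept only commands whose dominant pitch
    # falls within the fundamental range of typical human speech,
    # rejecting ultrasonic "dog whistle" commands the owner cannot hear.

    HUMAN_SPEECH_HZ = (85.0, 300.0)  # rough fundamental range of adult voices

    def accept_command(dominant_pitch_hz: float) -> bool:
        low, high = HUMAN_SPEECH_HZ
        return low <= dominant_pitch_hz <= high

    print(accept_command(120.0))    # → True: a typical adult voice
    print(accept_command(20000.0))  # → False: ultrasonic, owner hears nothing
    ```

    A filter like this would not stop the parrot, whose mimicry is audible, but it would narrow the class of commands a device will act on to those its owner could at least have overheard.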