We recently had a discussion about police access to the recordings made by in-home digital assistants like Amazon’s Alexa and its (her?) ilk.
Now our focus turns to the actions of these devices when they do bad things themselves. This story reports that Siri, Apple’s version, routinely answered requests in Toronto for prostitutes by referring the inquirer to an “eSports bar” – one where clients play electronic sports games. Apparently the word may be too close to “escorts” for Siri’s powers of discrimination. It is clear – take it as established for the present discussion – that the bar is NOT a hangout for prostitutes.
So: Has a tort been committed? Even in these days of largely legal prostitution, it can lower the estimation of a bar in public opinion if it is known as a hangout for prostitutes. So: defamation.
Who is liable, if anyone? Canada’s rules of intermediary liability are a bit vague, but tend to impose liability once the intermediary knows of the defamation. Apple has been told.
Even in the US, where s. 230 of the Communications Decency Act gives very broad immunity to intermediaries, that immunity applies only to content provided by others. Siri’s algorithms for finding answers to questions are proprietary to Apple – so the company is not really an intermediary, is it? It’s a primary party.
There are jurisdictional issues, but can somebody sell products and services (the device is probably both) in Canada and escape all liability for how they work? Would this be a defamation action or a product liability action? (The bar owner does not currently seem inclined to sue, but that does not stop us from speculating.)
Is this a good case study for the Law Commission of Ontario’s current project on online defamation? Or is it open-and-shut except for the novel medium of publication?