In the back pages of comic books, there was often a curious advertisement: one that purported to sell x-ray glasses, which would allow the wearer to see through things.
Although first patented in 1906, these novelty items simply created an optical illusion and involved no x-rays at all. That didn’t prevent many young readers from buying them with the intent of being up to no good. As Roger Luckhurst explains in “X-Ray Specs”:
As anyone who spent a dollar (plus postage and packing) on mail order X-Ray Specs came bitterly to learn, Röntgen’s x rays were not involved in this technology. As the journalist Jack Hitt later reflected, ‘I remember X-Ray Spex, with that promise of seeing girls naked beneath their clothes, as my first shattering disillusionment with the world of adults and all their horrible lies.’
It is to much relief that this children’s toy could not be used for its often intended purpose. Technology, though, eventually catches up with all forms of mischief.
An app developed last year, DeepNude, used an open-source algorithm from UC Berkeley to take any photo of a woman and create an artificially nude version of her. The algorithm was trained on 10,000 nude images found online. The app was live for only four days before it was taken down amid widespread backlash, but not before people were horrified by what could be done with readily available deepfake tools online.
In response to these concerns, about a dozen bills were introduced in the US Congress and state legislatures last year. Strictly speaking, the use of this technology is not revenge porn, as the images are not true images of the subject.
For this reason, deep nudes would likely not meet the definition of the Canadian tort of public disclosure of embarrassing private facts. The introduction of that privacy tort in Ontario meant that three of the four privacy torts adopted from Professor Prosser in Jones had been affirmed.
The fourth was also introduced last year, in Yenovkian v. Gulian. The privacy tort of publicity which places the plaintiff in a false light in the public eye is therefore now recognized in Ontario, and may be useful in combatting trends like deep nudes.
The case emerged from a family law dispute in which the father had engaged in years of cyberbullying the mother across many online platforms. Justice Kristjanson grounded her decision in international legal principles such as the United Nations Convention on the Rights of the Child, endorsed by the Supreme Court of Canada in A.C. v. Manitoba (Director of Child and Family Services), which includes concepts of privacy for the family and the child.
The mother brought a cross-claim against the father, heard together with the family law trial, that included claims for invasion of privacy. Justice Kristjanson found that the new privacy tort applied in the circumstances:
 With these three torts all recognized in Ontario law, the remaining item in the “four-tort catalogue” of causes of action for invasion of privacy is the third, that is, publicity placing the plaintiff in a false light. I hold that this is the case in which this cause of action should be recognized. It is described in § 652E of the Restatement as follows:
Publicity Placing Person in False Light
One who gives publicity to a matter concerning another that places the other before the public in a false light is subject to liability to the other for invasion of his privacy, if
(a) the false light in which the other was placed would be highly offensive to a reasonable person, and
(b) the actor had knowledge of or acted in reckless disregard as to the falsity of the publicized matter and the false light in which the other would be placed. I adopt this statement of the elements of the tort. I also note the clarification in the Restatement’s commentary on this passage to the effect that, while the publicity giving rise to this cause of action will often be defamatory, defamation is not required. It is enough for the plaintiff to show that a reasonable person would find it highly offensive to be publicly misrepresented as they have been. The wrong is in publicly representing someone, not as worse than they are, but as other than they are. The value at stake is respect for a person’s privacy right to control the way they present themselves to the world.
 It also bears noting this cause of action has much in common with the tort of public disclosure of private facts. They share the common elements of 1) publicity, which is 2) highly offensive to a reasonable person. The principal difference between the two is that public disclosure of private facts involves true statements, while “false light” publicity involves false or misleading claims. (Two further elements also distinguish the two causes of action: “false light” invasion of privacy requires that the defendant know or be reckless to the falsity of the information, while public disclosure of private facts involves a requirement that there be no legitimate public concern justifying the disclosure.)
 It follows that one who subjects another to highly offensive publicity can be held responsible whether the publicity is true or false. This, indeed, is precisely why the tort of publicity placing a person in a false light should be recognized. It would be absurd if a defendant could escape liability for invasion of privacy simply because the statements they have made about another person are false.
Although there are no reported cases applying this tort since Yenovkian, it has the potential to capture the kinds of social ills we might see with deepfake technology. Fabricated images of individuals can still be highly offensive, and they should have a remedy in law.
In a different vein, another site launched earlier this year generates entirely artificial nudes, with the intent of eliminating much of the exploitation that can occur in the industry.
With statutory privacy reforms under consideration in Ontario, and the introduction of Bill C-11 last month, Canada may also be catching up to worldwide calls for stronger protections in this area. At present, however, the bill does not address data manipulated to place a person in a false light, and this remains a growing concern.