Artificial intelligence is already capable of creating a staggering array of content. It can paint, write music, and put together a musical. It can write movies, angsty poems, and truly awful stand-up comedy. But does it have ownership over what it produces?
For example, an AI at Google has managed to create sounds that humans have not heard before, merging characteristics of two different instruments and opening up a whole new toolbox for musicians to play around with. The company’s DeepDream is also capable of generating psychedelic pieces of art with high price tags; last year two sold for $8,000—with the money going to the artists who claimed ownership over the images.
As it stands, AIs in the US cannot be awarded copyright for something they have created. The current policy of the US Copyright Office is to reject claims made for works not authored by humans, but the policy is poorly codified. According to Annemarie Bridy, a professor of law at the University of Idaho and an affiliate scholar at Stanford University’s Center for Internet and Society, there’s no actual requirement for human authorship in the US Copyright Act. Nevertheless, the “courts have always assumed that authorship is a human phenomenon,” she says.
Eran Kahana, an intellectual-property lawyer at Maslon LLP and a fellow at Stanford Law School, doesn’t believe we should award authorship to AIs. He explains that the reason IP laws exist is to “prevent others from using it and enabling the owner to generate a benefit. An AI doesn’t have any of those needs. AI is a tool to generate those kinds of content.” He likens the situation to a word processor’s spell check. If you make a spelling mistake in something you’re writing and the computer corrects it, who owns the copyright to the final product? “Obviously not the computer,” Kahana quips. “The computer has no ownership of your writing.”
AI also raises some thorny issues by potentially impinging on the IP rights of others. This is especially true if a creative work is based on machine learning, a technique that allows an AI to learn for itself from data it is provided. If the input on which a creation is based—the data—is made by someone else, shouldn’t they then be the owner of whatever work is created from it? And even if a work is produced by an AI independently, can it really be classed as original?
A good example is the AI designed to imitate the works of the Dutch artist Rembrandt. “The Next Rembrandt” project asks “can the great master be brought back?” The AI creates original pieces in the style of the Dutch master, having been trained on data drawn from hundreds of his paintings. But who should own that work: Rembrandt, the person who gathered the data, or the AI itself?
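To see why the training data is so central to the ownership question, consider a deliberately tiny sketch of the same idea. This is not the Next Rembrandt pipeline (which used far more sophisticated modeling of brushstrokes and geometry); it is a character-level Markov chain “trained” on a stand-in text corpus, then sampled to produce new text in a similar style. Every character it generates is statistically derived from someone else’s input data, which is exactly where the IP puzzle arises.

```python
import random
from collections import defaultdict

# Stand-in for a corpus of someone else's (potentially copyrighted) work.
corpus = "to be or not to be that is the question"
order = 3  # how many preceding characters the model conditions on

# "Training": map each 3-character context to the characters that follow it.
model = defaultdict(list)
for i in range(len(corpus) - order):
    context = corpus[i:i + order]
    model[context].append(corpus[i + order])

# Generation: sample new text. Nothing here exists independently of the
# original input data, which is the crux of the ownership question.
context = corpus[:order]
output = context
for _ in range(80):
    choices = model.get(context)
    if not choices:
        break
    next_char = random.choice(choices)
    output += next_char
    context = output[-order:]

print(output)
```

A system like the Next Rembrandt replaces this toy statistical table with a far richer learned model, but the structural point is the same: the “new” work is a function of the data it was given.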
Bridy says another important legal question here would be who is liable for copyright infringement. “Obviously, you can’t sue software,” she says, concluding that under existing legislation the liability would likely fall on the programmer or owner of the infringing code. Kahana concurs: “You’d be accused, as the owner, of using a tool that caused infringement on others.”
For insight into how AI-generated intellectual property may be treated in the future, we can look to a court case currently unfolding in California. It involves not AIs, but animals.
Naruto, a rare macaque monkey, shot to fame in 2011 when it picked up a British photographer’s camera and seemingly took a selfie. David Slater, the photographer in question, claims he owns the copyright to the stunning images, despite not being the one who held the camera or clicked the shutter. Rather, he says the images belong to him because of the work he did in setting up the shot.
A court in San Francisco agreed with the US Copyright Office, determining that human authorship is a requirement for copyright protection, and that Naruto therefore cannot own the photographs it took. The ruling is now being appealed, with People for the Ethical Treatment of Animals (PETA) arguing that the copyright belongs to Naruto: “in every practical (and definitional) sense, he [Naruto] is the ‘author’ of the works.” The case is ongoing.
Unless we develop some form of legal framework that recognizes AIs as legal persons (which monkeys, for instance, are not), we cannot award an AI copyright. “And we’re a long way from that moment, if we’ll ever get there,” Bridy says. The most likely near-term solution would be to award copyright to the owners of the AI itself, much as employers automatically own the work their employees produce.
As the sophistication and complexity of AI continues to grow, so will the sophistication of the work it produces. This promises great benefits in science, technology, and medicine, and in the creative realms, too. If we don’t resolve these thorny issues of ownership now, we risk delaying the delivery of those benefits across all industries. Our laws need to adapt to the reality of the modern world, and they need to do so quickly.
https://www.weforum.org/agenda/2017/08/if-an-ai-creates-a-work-of-art-wh...