What’s in a Voice? The Legal Implications of Voice Cloning
This Note focuses on the legal implications of artificial intelligence voice cloning, in which algorithms are used to create convincing copies of human voices. Such clones are easily manipulated and are often deployed to spread misinformation online. The number of video and audio clips containing voice clones posted online has increased drastically over the past five years, and this trend will likely continue. As more individuals—mainly celebrities—fall victim to voice-cloning attacks, legal avenues for recourse will become highly desirable.
Because voice cloning is such a novel technology, courts have not yet had the opportunity to address it within the privacy tort sphere. This Note aims to provide guidance for future victims by examining existing causes of action and evaluating their applicability to instances of voice cloning. These causes of action include copyright infringement, (mis)appropriation of identity, defamation, and false light. After quickly determining that copyright and related intellectual property torts do not apply to instances of voice cloning, the Note turns to an in-depth examination of two prominent privacy torts: defamation and false light. Concluding that both defamation and false light causes of action can apply to instances of voice cloning, this Note recommends that victims pursue a false light claim whenever possible. Because the false light privacy tort is not currently recognized in all states, this Note urges those states' legislatures to reinstate the cause of action for instances of voice cloning.