tv | Senate Hearing on Digital Replicas AI Concerns | C-SPAN | May 4, 2024, 10:23pm-12:23am EDT

10:23 pm
10:24 pm
here for live coverage of the hearing on artificial intelligence. you can continue watching if you go to our website
10:25 pm
cspan.org. we continue now with a senate hearing on digital replicas and artificial intelligence. >> i opened that hearing with the debut of a new ai generated song, ai ai, a riff on frank sinatra's new york, new york, which mimicked frank sinatra's voice. the song was fun to create, with permission of course, but was my song protected speech? if i hadn't gotten permission, would the song have violated mr. sinatra's rights to his voice or his style? since that hearing, ai generated replicas have only grown more pervasive, from deepfake videos of celebrities hawking products and ai songs posing as
10:26 pm
legitimate hits, to scam calls mimicking a panicked grandchild's voice. ai generated videos of tom hanks and gayle king were used to promote products they never endorsed. a fake version of elon musk encouraged consumers to invest in a crypto scam. mcafee, a global leader in online protection, found one in four american adults have experienced an ai voice scam, with three quarters of victims having lost money. scammers using ai generated replicas of a grandchild's voice to trick a grandparent out of money have become so sophisticated that both the fcc and ftc have issued warnings. ai deepfakes don't stop there. we've seen other examples: nonconsensual explicit deepfake
10:27 pm
photos and videos, and election interference, which we address in the deceptive ai act that senators collins, hawley, and i have introduced. deepfake pornographic images of taylor swift circulated broadly on x, formerly known as twitter, before they were taken down. a voice clone of president biden encouraged voters to stay home during the new hampshire primary. and in slovakia, a deepfake likely had an impact on the outcome of a national election. in summary, as ai tools have grown increasingly sophisticated, it becomes easier to distribute fake images of someone. we can't let this challenge go unanswered, and inaction should not be an option. as president biden cautioned, we must regulate ai voice impersonation, but must do so thoughtfully, striking the
10:28 pm
right balance between defending individual rights and fostering creativity. a bipartisan group of senators convened nine ai forums last year, and senator schumer has encouraged committees to work on ai legislation on a bipartisan basis, just as we're doing today. that's why i was excited to release the no fakes act discussion draft last october with senators tillis, blackburn, and klobuchar. this bill would protect people from having their names, images, or likenesses used to say or do things they would never agree to or never say. the bill accomplishes this in two ways: by holding individuals and companies liable if they produce an unauthorized digital replica, and by holding platforms
10:29 pm
liable if they host or distribute an unauthorized digital replica when they know the person depicted did not authorize it. no fakes act protections would apply to all individuals, regardless of whether they commercialize their voices, images, or likenesses. our bill tries to be careful to balance these protections against free speech rights. the first amendment will of course apply to this bill whether we say it does or not, but we've made clear carveouts for, for example, parody and satire. over the past six months, we've had literally dozens of meetings and received hundreds of thousands of proposed revisions, tweaks, and wholesale changes on the draft from stakeholders who loved the draft, hated the draft, and everyone in between. that was exactly the point, and i appreciate the many
10:30 pm
constructive suggestions we've received. that's also the point of having a hearing today: folks who support the bill, who question the bill, who oppose the bill, and a real dialogue. let me close. the feedback has centered around five core areas: whether we should include a notice-and-takedown structure similar to the dmca; whether we've struck the right balance with first amendment exclusions; whether a 70 year postmortem term should be adjusted or narrowed; whether the bill should have preemptive impact over similar state laws; and whether the bill should create some process by which individuals with limited resources and minimal damages can enforce their rights under the law. so i look forward to continuing this work with my colleagues, and immediately following this hearing, to working to promptly formalize the no fakes act for introduction next month. with senator blackburn and senator tillis, and their great
10:31 pm
cooperation, we've assembled a wonderful panel. i encourage you as our witnesses to tell us what you like about the draft, what you dislike about the draft, and to be specific about what changes you would like us to consider and why. i'll introduce the witness panel in a moment, but let me next invite senator tillis to make his opening remarks. >> thank you. as you were going through the description of the no fakes act, i was reminded that i love the subcommittee, because we actually do work here, and we actually have a bunch of ip nerds or interested people show up. i really think the no fakes act is unique among the other bills that we've carried forward in terms of intellectual property, because it touches everybody. normally, it's about patent holders or creators, and this touches everybody, every socioeconomic stratum. it's interesting, but it's also one of the reasons why we've got to get it right. we've got to make sure that we
10:32 pm
come up with concrete solutions. we don't want to overreach. there is a need for legislation, so anyone who's in the "if it ain't broke, don't fix it" category, i respectfully disagree, but i'd be fascinated to hear your testimony if we have witnesses of that position. but we also don't want to miss the opportunity, or stifle opportunities for innovation. that's why it's so important to get it right. we don't even know what ai is going to look like ten years from now. interestingly enough, ai is going to make ai even more sophisticated over a much shorter period of time, so that's got to be instructive to our policy formulation. but we've all seen, as chairman coons has indicated, replicas, deepfakes, photos, videos, audio. we're going to show you an example here shortly, and the numbers are just growing. so we have to work on it, and we have to do the fair
10:33 pm
things. entertainers, politicians, and the public at large have been subject to fake media for much of the last 100 years. but now it's getting serious, and it's proliferating at a rate that requires congressional action. i think i'll move forward a bit in my comments, because senator coons did a good job of describing some of the challenges and some of the things we want to work on with our bill. but if staff is prepared, i'd like to show you a video, to give you a recent example. i use ai every morning as part of my news feed, so i work with ai every morning, and have for about two years, since chatgpt first released its beta version. it was a week or so ago that i
10:34 pm
saw the estate of tupac questioning the use of his likeness. we thought it was interesting, for folks who aren't following the issue as closely as us, to show the video. have we got staff ready to cue that up? ♪♪ ♪ we need ya the west coast savior if you deal with this viciously you seem a little nervous about all the publicity [ bleep ] canadian we need to know the west coast victory heard it on the budden podcast it's got to be true ♪♪ >> so that entire musical rendition is a product of ai. and interestingly, that image,
10:35 pm
one of those images, that name, image, and likeness, is the property of tupac's estate. the other is an ai generated image that, to the extent it was used for commercial purposes, was obviously in violation of copyright. this is not a hypothetical. this happened beginning a week or so ago, shortly after drake released that song. so we've got our work to do, and legislation addressing the misuse of digital replicas will have multi-billion dollar implications. we've got to get it under control. there are a lot of questions to be asked, and my office in particular is guilty of putting drafts out there, knowing that they're drafts. sometimes we even do it sooner, without the cooperation or involvement of other members, because we put scary stuff out there to give you a ghost of christmas future.
10:36 pm
in this case we didn't do that. we tried to work on putting together a discussion draft that makes sense, but we've got a lot of things we have to work out. you know, the questions that we need answered: is it wise to mandate that individuals have no right to license anything beyond their individual likeness? should we create an exception for harmless noncommercial uses? should there be a notice-and-takedown process? there's a lot. there's a litany. i hope that you all can come up with other ones, but i'll submit the rest of my written statement for the record. but we have to act. hopefully in this congress we can act, which means we have to move very, very quickly, or at a minimum lay down a baseline that we can pick up when we come back with a new congress, and get it right. so i look forward to advancing this with
10:37 pm
everybody's active collaboration. the only thing that really makes me mad is when i see somebody trying, through guerrilla warfare, to undermine the good faith efforts of this committee or my colleagues. if you're not at the table, you're going to be on the table. so why doesn't everybody just recognize our office is open to constructive criticism and use cases of where the policy doesn't make sense. but if you're in the "it ain't broke, don't fix it" category, you're not up with modern times. i look forward to a good, productive hearing today, and thank you in advance for your productive collaboration as this legislation moves forward. >> thank you, senator tillis, and thank you for another positive and engaging hearing. it's been a great experience serving on this committee with you. today, we welcome six witnesses to testify about the no fakes act. our first witness is robert kyncl, ceo of warner music group, who
10:38 pm
has a lot of experience. he spent over a decade as youtube's chief business officer, among other business engagements. next, we have twigs, a singer, songwriter, producer, and dancer who has used ai to help her create, and who has also had personal experience with unauthorized deepfakes. it's great to have a voice present from the creative community. next, we have the national executive director of sag-aftra, the screen actors guild-american federation of television and radio artists, a labor union representing 160,000 members who work in film, television, music, and more, also a voice of the creative community. then we have ben sheffner of the motion picture association. thank you, ben. we welcome graham davies,
10:39 pm
representing principally audio streaming platforms like spotify and youtube. mr. davies also has a long history in music. and finally, we welcome professor ramsey, who teaches and writes on intellectual property law. after i swear in the witnesses, each will have five minutes to provide a summary of your opening statement; the senators have your written statements. then we'll proceed to questioning. each senator gets five minutes for the first round. we will likely have two or even three rounds of questioning, time and attendance permitting. witnesses, would you please stand and raise your right hand to be sworn in. do you swear or affirm that the testimony you're about to give before the committee will be the truth, the whole truth, and nothing but the truth, so help you god? thank you all. let the record reflect the witnesses have been sworn.
10:40 pm
mr. kyncl, you may proceed with your opening statement. >> chairman coons, ranking member tillis, and members of the subcommittee, i'm robert kyncl, chief executive officer of the warner music group. being here today is something i could not have imagined as a young boy growing up behind the iron curtain in communist czechoslovakia. i attended the state university of new york, and there i met an amazing woman who eventually became my wife, and now we have two amazing american daughters. i'm a proud american citizen, and i have a deep appreciation for the freedoms at the heart of this great country, having grown up without them. for the past 25 years i've been a tech and media executive. i joined warner music last year after 12 years at youtube and eight years at netflix.
10:41 pm
warner music is home to an incredible array of artists and songwriters who are moving culture across the globe. one of those artists, twigs, is here with me today. she is an extraordinarily gifted singer, songwriter, actor, and performer. i would also like to thank duncan crabtree-ireland, who negotiated between sag-aftra and the record labels regarding ai and defends artists' rights. music has so often been the canary in the coal mine for broader trends in our society. more than any other form of communication or entertainment, music drives culture and innovation, and that's happening again with generative ai. today, music companies are helping artists, rights holders, and tech companies figure out this new world, which is both exciting and
10:42 pm
daunting. it's our job not only to help amplify artists' creativity, but to protect their rights, their livelihoods, and their identities. across the industry, legends from roberta flack to the beatles have embraced ai as a tool to enhance their creativity. at the same time, generative ai is appropriating artists' identities and producing deepfakes of artists singing, saying, or doing things they've never done before. you can hear my identity in my voice. through ai, it is very easy for someone to impersonate me and cause all manner of havoc. they could speak to an artist in a way that destroys our relationship. they could say untrue things to the media that would damage our business. unfettered deepfake technology has the potential to impact
10:43 pm
everyone, even all of you. your identities could be appropriated and used to mislead your constituents. the truth is, everyone is vulnerable: families defrauded by voice clones pretending to be relatives, people placed in pornography without their consent. some people have spoken of ai as a threat to freedom of speech, but it's precisely the opposite. ai can put words in your mouth, and ai can make you say things you didn't say or don't believe. that's not freedom of speech. we appreciate the efforts of this committee to address this problem, including the no fakes act discussion draft authored by chairman coons, ranking member tillis, senator blackburn, and senator klobuchar. your leadership kick-started efforts in this area, and we
10:44 pm
strongly support the bipartisan no ai fraud act introduced in the house earlier this year by representatives salazar and dean, and the recently enacted elvis act in tennessee. as the committee moves toward the introduction of a senate bill, there are three elements the bill should contain to be effective. one, an enforceable property right for likeness and voice. each person should be able to license or deny that right on free market terms and seek redress for unauthorized uses. two, respect for important first amendment principles, without going any further and providing loopholes that create more victims. and three, effective deterrents. to incentivize a vibrant and responsible commercial marketplace, we need to
10:45 pm
maintain consequences for ai model builders that knowingly violate a person's property rights. i applaud the committee for addressing these issues with urgency. congress should pass legislation this year, before the genie is out of the bottle, while we still have a chance to get this right. i look forward to answering your questions. thank you. >> thank you, mr. kyncl. twigs. >> as artists, we dedicate a lifetime of hard work and sacrifice in the pursuit of excellence, not only in the expectation of achieving commercial success and critical acclaim, but also in the hope of creating a body of work and recognition that is our legacy. so why am i here today? i'm here because my music, my acting, my dancing, the way that my body moves, and the way
10:46 pm
my voice resonates through a microphone are not by chance. they are reflections of who i am. my art is the canvas on which i paint my identity. it is the very essence of my being, yet this is under threat. ai cannot replicate the depth of my life journey, yet those who control it hold the power to mimic the likeness of my art, replicate it, and falsely claim my identity and intellectual property. this prospect threatens to rewrite and unravel the fabric of my very existence. we must enact regulation now to safeguard our authenticity and protect against misappropriation of our inalienable rights. three decades ago, we did not realize that the internet would embed itself so deeply into the core of our everyday lives. policies and controls to keep pace with the emergence of the technology were not put in place to protect artists, young
10:47 pm
people, and those who were vulnerable, and it ran away with us. ai is the biggest leap in technological advancement since the internet. you know the saying: fool me once, shame on you; fool me twice, shame on me. if we make the same mistake with the emergence of ai, it will be shame on us. let me be clear: i am not against ai. as a future-facing artist, new technologies are an exciting tool that can be used to express deeper emotions, create fantasy worlds, and touch the hearts of many people. in the past year, i have developed my own deepfake version of myself that is not only trained on my personality, but can also use my exact tone of voice to speak many languages. these and similar emerging technologies are highly available tools. this, however, is all under my control, and i can grant or refuse consent in a way that is
10:48 pm
meaningful. what is not acceptable is when my art and my identity can simply be taken by a third party and exploited falsely for their own gain, without my consent, in the absence of appropriate legislative control and restriction. history has shown us time and again that in moments of great advancement, those in the arts are the first to have their works exploited and commoditized. by protecting artists with legislation at such a momentous time in history, we are protecting the 5-year-old child of the future from having their voice, likeness, and identity taken without prior consent. i stand before you today because you have it in your power to protect artists and
10:49 pm
their work from the dangers of exploitation and the theft inherent in this technology if it remains unchecked. i am here on behalf of all creators whose careers depend deeply on their voice, likeness, and identity, and potentially on behalf of the wider image and related rights of society. you have the power to change this and safeguard our future. as artists, and more importantly, human beings, we are a facet of our given and developed identity. our creativity is the product of this lived experience, overlaid with years of dedication to qualification, training, hard work, and, dare i say it, significant financial
10:50 pm
investment and sacrifice. the very essence of our being, at its most human level, can be violated by the unscrupulous use of ai. it is vital we work together to ensure we do all we can to protect and create an intellectual property rights system, as well as protect the very basis of who we are. we must get this right. you must get this right, before it's too late. thank you. >> thank you. >> thank you very much, chairman coons, ranking member tillis, and the members of the subcommittee on intellectual property. i'm the national executive director of sag-aftra, the country's largest labor union for media
10:51 pm
and entertainment artists. i'm here to testify in support of the no fakes act. our members believe ai poses an existential threat to their ability to, one, control consent for the use of their digital representations; two, receive fair payment for their voice and likeness; and three, avoid having to compete against themselves in the marketplace. i was the negotiator for last year's historic agreement with the major entertainment studios, which was only finalized after the longest strike in 40 years, a strike that lasted four months. the broader public understands that ai poses real threats to them, and they fully support protections against those threats. for an artist, their image and likeness are the foundation of their performance, brand, and identity, developed over time
10:52 pm
through investment and hard work. sag-aftra has long fought for right of publicity laws and for voice and image protections. the exponential proliferation of technologies which allow for the replication of voices and likenesses in audio and visual works and sound recordings makes this work urgent for our members. it is vital that the intellectual property rights of our members and all of us are protected, and that service providers extend the same protections to individuals' images, likenesses, and voices that they now provide for other intellectual property rights. these rights should be transferable like other intellectual property, or any kind of property someone owns, with durational limitations on transfers during one's lifetime to ensure we don't enter an era of indentured servitude, as actress and sag-aftra member olivia de havilland recognized when she established the seven-year rule to end abusive
10:53 pm
contracts. some will argue there should be broad, categorical, first amendment-based exemptions to any legislation protecting these important rights. there are no stronger advocates for the first amendment than our members. they rely on first amendment rights to tell the stories artists in other countries are often too endangered to tell. however, the supreme court made clear over half a century ago that the first amendment does not require that the speech of the press, or any other media for that matter, be privileged over protection of the individual depicted. to the contrary, courts apply balancing tests to determine which right will prevail. balancing tests are critical, and they are incorporated into the discussion draft. they ensure that the depicted individual, who is protected and rewarded for the time and effort put into cultivating their persona, will not unduly burden the right of the press to report on matters of public interest or
10:54 pm
the right of the entertainment media to tell stories. at the same time, the tests help ensure the depicted individual is not compelled to speak for the benefit of third parties who would misappropriate the value associated with the persona they have carefully crafted. with new a.i. technologies that can realistically depict an individual's voice or likeness from a few seconds of audio or a single photograph, and with the constantly evolving capabilities of these technologies, it is all the more important that broad categorical exemptions be avoided and that the courts be empowered to balance the competing interests. it is also essential that action be taken to address these harms now. our members, the public, and our society are impacted right now by the use of deepfake technology, and we must take timely action. as just one of many examples of the use of deepfake technology: during the ratification campaign for our contract after the strike last year, an unknown party on the internet created an unauthorized
10:55 pm
deepfake video of me saying false things about our contract and urging members to vote against it. i had devoted more than a year of my life to a contract i deeply believe in. there was no federal right to protect me, no takedown right, and tens of thousands of people were misled about something that really mattered to so many of us. it is neither necessary nor appropriate to wait for broader artificial intelligence regulation to be adopted. this narrow and technology-neutral approach can and should proceed expeditiously. the companies behind many of these technologies are asking for rules so they better understand the appropriate boundaries on their conduct. the no fakes act provides important guidance while helping to ensure individuals are protected from exploitation that puts livelihoods and reputations at risk. thank you for this opportunity to speak, and i look forward to answering your questions. >> thank you, mr. crabtree-ireland. >> mr. sheffner. >> good day, members of the subcommittee. thank you for the opportunity to testify today on
10:56 pm
behalf of the motion picture association about legislation to regulate the use of digital replicas. for over a century, mpa members have employed innovative new technologies to tell compelling stories to audiences worldwide, from the introduction of recorded sound in the 1920s and color in the 1930s, to dazzling special effects for movies like this year's dune: part two, bringing the filmmaker's vision to the screen in the most compelling way possible. artificial intelligence is the latest such innovation impacting our industry. mpa sees great promise in a.i. as a way to enhance the filmmaking process and provide an even more compelling experience for audiences. we also share the concerns of actors and recording artists about how a.i. can facilitate the unauthorized replication of their likenesses or voices to supplant performances by them, which could potentially undermine their ability to earn a living practicing their craft. the no
10:57 pm
fakes act is a thoughtful contribution to the debate about how to establish guardrails against abuses of such technology. however, legislating in this area necessarily involves doing something the first amendment sharply limits: regulating the content of speech. it will take very careful drafting to accomplish the bill's goals without inadvertently chilling or prohibiting legitimate, constitutionally protected uses of technology to enhance storytelling. i want to emphasize this is technology that has entirely legitimate uses, uses that are fully protected by the first amendment and do not require the consent of those being depicted. take the classic 1994 film forrest gump, which depicted the fictional title character, played by tom hanks, navigating american life from the 1950s through the '80s, including interacting with real people from that era. famously, the
10:58 pm
filmmakers, using digital replica technology available at the time, had him interact and converse with presidents, or should i say former senators, kennedy, johnson, and nixon. to be clear, those depictions did not require the consent of their heirs, and requiring such consent would grant heirs or corporate successors the ability to censor portrayals they don't like, which would violate the first amendment. in my written testimony, i detail specific suggestions we have for improving the no fakes draft so it addresses real harms without encroaching on first amendment rights. here, i will highlight four points. first, getting the statutory exemptions right is crucial, and i want to thank the drafters for getting much of the way there. those exemptions give filmmakers the clarity and certainty they need to determine whether to move forward with spending tens of
10:59 pm
millions or hundreds of millions of dollars on a movie or tv series. if the statutory exemptions are not adequate, some producers will simply not proceed with their projects, a classic chilling effect that the first amendment does not allow. second, the bill should preempt state laws that regulate the use of digital replicas in expressive works. simply adding a federal layer on top of the existing patchwork of state laws would only exacerbate the problems associated with inconsistent laws in this area. third, the scope of the right should focus on the replacement of performances by living performers. going beyond that risks sweeping in wide swaths of constitutionally protected speech, which would make the statute vulnerable to being struck down on overbreadth grounds. fourth, the definition of digital replica must be focused on highly realistic depictions of individuals. it should not
11:00 pm
encompass, for example, cartoon versions of people you might see on shows like the simpsons or south park. lastly, before legislating, mpa urges the subcommittee to first pause and ask whether the harms it seeks to address are already covered by existing law, such as defamation, fraud, or state right of publicity law. often the answer will be yes, indicating that a new law is not necessary. if there is a gap in the law, for example regarding pornographic or election-related deepfakes, the best solution is narrow, specific legislation targeting that specific problem. thank you again for the opportunity to testify today, and i welcome your questions. >> thank you, mr. sheffner. mr. davies. >> good afternoon, and thank you to the committee for giving me the opportunity to speak today on this important issue. my name is graham davies, and
11:01 pm
i am president and ceo of the digital media association, representing the leading music streaming services. we support the committee's efforts to bring forward legislation at the federal level, which should preempt existing state laws, to keep pace with new technology. we join you in the objective of ensuring appropriate protections for individuals' likenesses, an important issue for us all, and support efforts to develop a clear and balanced way forward. our members benefit from clarity in the law when providing fans with great experiences. indeed, our members have a strong track record of licensing complex rights to deliver music to fans; they work closely with record labels and music publishers, with whom they have long relationships and robust contracts. this is our common objective. any new or increased rights should be appropriate and targeted, and they should not come at the expense of important freedoms of speech or creative expression, nor should they be overly broad
11:02 pm
to the point of creating confusion or needless litigation over the true objective of protecting personhood. the no fakes act proposes to sweep a broad range of legitimate replicas and downstream activities within its scope. the current draft punishes good and bad actors alike, and new rights should not undermine the global content supply chains on which the streaming industry depends. we are in the early stages of the application of a.i. by the artistic community, but we see existing practices for taking down illegal or deceptive content continuing to suffice in this new context. streaming services are the last point in the supply chain; only the originator of the content delivered to the services has
11:03 pm
the resources necessary to determine whether the content is legitimate or not. streaming services do not have any way to know the complex chain of rights in the content they receive from labels and distributors. to address the harms caused by a.i. technology used to imitate a musical artist, celebrity, or other public figure, we believe the committee's objectives are best achieved if new legislation is developed from existing right of publicity laws. this would have a number of advantages. firstly, there is a body of existing case law on how first amendment protections can be balanced with individual rights of publicity. secondly, liability sits squarely with the bad actors, those who create the deceptive content and first place it into the public sphere. thirdly, the focus is on commercial use with actual damages, which we believe have proven to be a sufficient deterrent. establishing federal law that preempts the existing patchwork of state right of publicity laws is necessary.
11:04 pm
music streaming is a global industry. we believe the rights pertaining to the person should remain inextricably tied to the individual for the duration of their life. this ensures each person is always able to maintain control of how their voice is used. the discussion draft released by the senators has been helpful to foster dialogue and encourage stakeholders to think about complex issues. i have included more detail in my written testimony, intended to support the next stages of discussion, and i look forward to continued work with the committee. thank you. >> thank you. professor ramsey. >> chairman coons and other members of the subcommittee, thank you for the opportunity to testify about the first amendment and the proposed no fakes act. i'm a professor of law at the university of san diego school of law. i teach
11:05 pm
intellectual property classes at usd and my scholarship focuses on the potential conflicts between trademark laws and the right to freedom of expression. the first amendment of the u.s. constitution commands that congress shall make no law that abridges the freedom of speech. congress generally lacks the power to restrict expression because of its message, ideas, subject matter, or content. this rule is subject to a few limited exceptions for historically unprotected speech such as fraudulent speech and obscenity. content-based regulations of speech are generally presumed invalid unless the government can prove the law is constitutional. the no fakes act imposes restrictions on the content of speech and targets the harms caused by unauthorized creation and dissemination of digital replicas or deepfakes of individuals, recordings nearly indistinguishable from that person's actual voice,
11:06 pm
image, or visual likeness. when the act applies to the use of digital replicas to impersonate people in fraudulent speech or misleading commercial speech, it is consistent with the first amendment. there is also no conflict with the first amendment when the act restricts the use of digital replicas in sexually explicit deepfakes without consent if those images or videos constitute obscene speech or child pornography. the problem is that the current version of the no fakes act also regulates non-misleading speech that is protected by the first amendment. congress must therefore prove the act satisfies constitutional scrutiny. the law must be narrowly tailored to directly and materially further its goals and not harm speech protected by the first amendment more than necessary. strict scrutiny analysis may be required when the government is regulating the unauthorized use of digital replicas in
11:07 pm
political messages, news reporting, entertainment, and other types of noncommercial speech fully protected by the first amendment. as it is currently drafted, i believe the no fakes act is not consistent with the first amendment because the law is overbroad and vague. however, i think a revised version of the law could satisfy intermediate and strict constitutional scrutiny. there are three ways congress can better protect first amendment interests in the law. first, it is critical that the law not suppress or chill protected speech more than necessary. the senate's proposed no fakes act does a better job than the no a.i. fraud act of setting forth specific exemptions from liability for certain non-confusing uses of another's image, voice, or likeness. the law can be improved in certain ways i discussed in my written testimony. it is also important that congress require online service providers to
11:08 pm
implement a notice and takedown system to make it easier to remove unauthorized deepfakes that violate the law. accused infringers must also be able to challenge takedown requests by filing a counter notification with the platform. my second recommendation is for congress to create separate causes of action that target the different harms caused by unauthorized uses of digital replicas. this includes, number one, the use of deepfakes to impersonate individuals in a deceptive manner. number two, uses of sexually explicit deepfakes. number three, uses that substitute for an individual's performance that they typically would have created in real life, such as a performance in a song or movie. these causes of action should have different requirements and distinct speech-protective
11:09 pm
exceptions. my third recommendation is that congress ensure each provision of the law adequately protects speech interests. congress can better protect expressive values by allowing the new federal statute to pre-empt inconsistent state laws that protect the right of publicity and digital replica rights, or laws that restrict the unauthorized use of digital replicas. if licensing of digital replica rights is allowed by the act, individuals should be able to consent for each different use of their digital replica. allowing others to control a person's identity rights through a broad licensing agreement will work at cross purposes with many of the stated goals of this proposed legislation. it could potentially lead to greater a.i. generated deception of the public. it can also stifle the right of people to make a living through their performances and result in the
11:10 pm
use of their image or voice in sexually explicit material that was authorized by the broad terms of a licensing agreement. i encourage congress to continue to protect the interests of both public figures and ordinary people in the no fakes act. i encourage you to continue consulting with stakeholders, academics, and attorneys with expertise in this field of law. i look forward to answering your questions as you continue to improve the act. thank you. >> thank you to all six of our witnesses for their preparation and engagement. i will start with questions exploring how a.i. replicas are impacting individuals and the entertainment ecosystem, and use the subsequent round to get to your perspectives on specific potential revisions to the no fakes act. mr. crabtree-ireland, thank you for sharing your personal experience of an a.i. generated deepfake in the context of the ratification fight for the most recent contract. given your experience, should a digital replica right apply to all individuals regardless of whether they are commercializing their
11:11 pm
image, voice, or likeness, or should it primarily protect people who make a living commercializing their image, voice, or likeness? why should it be available to everyone? >> it is a great question, chairman. we support a right available to everyone. myself and others have explained the impact this can have on people who make a living and whose career is based on their image, likeness, or voice. but the impacts are so obvious and real for so many americans outside of the scope of commercialized use. the example i gave is, in my mind, not a commercial use example; this is an example that could apply to anyone, and the impact is so serious. we do support the right on a broader basis that should be applicable to everyone. >> could you help us understand how you are using a.i. as a creative tool on the one hand, and briefly tell us about
11:12 pm
your experience with a.i. deepfakes and what you think the future of your industry looks like if we don't heed your urgent call for us to act? >> over the past year, i have been creating an a.i. version of myself that can use my tone of voice exactly to speak in multiple languages. i have done this to be able to reach more of my fans and be able to speak to them in the nuance of their language. i have currently explored french, korean, and japanese, which is really exciting for me. it means even with my upcoming album, i can explain in depth what it is about creatively. it also allows me to spend more time making art. often, being a music artist or any artist in this day and age requires a lot of press, promo, one-liners, so it means if it is something simple
11:13 pm
that does not require my heart, i can do a one-liner and give it to people to promote a piece of work, and it is harmless. ultimately, i can spend more time on what is really meaningful for my fans. the next question you asked -- >> your own experience with deepfakes. >> there are songs online, collaborations of myself with other artists, that i did not make. it makes me feel vulnerable because, first of all, as an artist, i think the thing that i love about what i do is that i'm very precise, i take my time with things. i'm very proud of my work and very proud of the fact i think my fans really trust me, because they know i put so much deep meaning of my north star into what i do
11:14 pm
. the fact that somebody could take my voice, change lyrics, change messaging, maybe work with an artist i did not want to work with, or work with an artist i wanted to but now the surprise is ruined, it leaves me vulnerable. i think if legislation isn't put in place to protect artists, not only would we let down artists who really care about what we do, who spend a long time developing ourselves and the way that we work, it also would mean the fans would not be able to trust people they spent so many years investing in. it would affect us spiritually and financially. honestly, if i'm honest with you, i'm surprised we are
11:15 pm
having this conversation because it feels so painfully obvious to me; it is hard to find the language, if i'm completely honest with you. >> there are a lot of painfully obvious things for congress to act on. your surprise is not unusual. >> ultimately, what it boils down to is my spirit, my artistry, my brand. my brand is my brand, i spent years developing it, it is mine and does not belong to anyone else to be used in a commercial sense or cultural sense or even for a laugh. i am me, i'm a human being, and we have to protect that. >> thank you. >> if i might briefly, we have seen a steady increase in the quality of deepfakes, with songs on streaming platforms virtually indistinguishable from those of talented artists like twigs. what are the challenges a.i. deepfakes are creating long-term for both the music industry and fans as well as performers?
11:16 pm
>> i think that twigs addressed one of those; no one can do that better than what she just did. i think the second one is that when you have these deepfakes out there, artists are competing with themselves for revenue on streaming platforms because there is a fixed amount of revenue within each of the streaming platforms. if someone is uploading fake songs of twigs and those songs are eating into that revenue pool, there is less left for her authentic songs. that is the economic impact of it long-term, and the volume of content that will flow into the digital service providers will increase exponentially, which will make it harder for the artist to be heard and reach lots of fans. creativity over time will be stifled. >> as you both put it,
11:17 pm
relationship impact, spiritual impact, financial impact. senator tillis, i turn to you. >> thank you, chairman, and thank you all for being here. ms. ramsey, i will start with you and others who may have an opinion on it. on notice and takedown, with your comments: this is a strict liability bill in its current form, and some of us think we have to wade into that. we also talked about having the individual informed of a takedown and having recourse. can you talk more about that briefly? >> sure. you might have a situation where somebody challenges your own personal use of your identity online and they are the one that is the bad actor, but they file a complaint with the online service provider, and the online service provider that wants to avoid liability automatically takes it down. that is one possibility. another would be that the person
11:18 pm
disseminating the image actually has a defense, an exception applies to this particular use. it could be news reporting or parody, so it is critical for the online service provider to be able to put that expression back up if it does not violate the law. under the copyright laws, my understanding is once the information is put back up, it stays up unless the copyright owner files a lawsuit. what is great about the notice and takedown procedure is that it allows ordinary people to get these unauthorized uses off of the internet. that is one real benefit of having a notice and takedown procedure and encouraging companies to adopt one. there are challenges with notice and takedown procedures, like eric goldman and others talked about. it is great you're talking to interested parties as you figure out these issues. >> anybody here have an opinion counter to that? okay.
11:19 pm
mr. kyncl, can you walk me through what rights are typically granted by artists to record labels under an exclusive sound recording agreement, and are likenesses included in that? >> it is a pretty wide range of rights, anywhere from full copyright rights to distribution-only rights where the copyright remains with the artist. increasingly they include likeness as well, because, as we work on open platforms with lots of user-generated music content, we are the ones who have a staff of people working to issue notices and take down the content, and increasingly we need the name, image, likeness, and voice rights to
11:20 pm
actually act on that under the agreements we have with the platforms. >> i believe you think digital replica rights need to be transferable? >> yes. >> why isn't a license enough? >> i think it should be the choice of the artist; the artist should have a choice to transfer the license. >> mr. sheffner, state-level right of publicity laws restricting commercial speech have existed for many decades; they have developed their own case law and are well understood. the new digital replica right proposed by the no fakes act would affect noncommercial speech beyond what most state laws currently cover. can you explain how novel this proposed right would be in the context of existing right of
11:21 pm
publicity laws, and how should we consider pre-empting similar state-level digital replica laws, especially when it is such new territory? >> thank you for the question, senator tillis. you're absolutely right, most state publicity laws have for more than a century been limited to commercial uses in advertisements or merchandise. what congress is considering doing here is novel; although sometimes described as a right of publicity, we think it is fundamentally different and would apply in expressive works like movies, tv shows, and songs, which are fully protected by the first amendment. there has developed a robust body of case law in the traditional right of publicity context which says yes, it applies if you put someone's face on a billboard or use it in an advertisement or on a lunchbox, but it does not apply, for example, to making a biopic or
11:22 pm
docudrama about somebody; you cannot use right of publicity law to censor those portrayals. this is a novel form of right which will be subject to heightened constitutional scrutiny, like professor ramsey described. because it applies in expressive works, it is really important up front to provide clarity to film producers so that when they are about to embark on a project, they know what is allowed and what is not. if it is too vague or too uncertain, they will shy away from using this technology to engage in those sorts of portrayals; that chills speech, and first amendment case law says that a statute is vulnerable to being struck down if it chills constitutionally protected speech. >> which is absolutely why we have to get it right. there is general consensus we have to make progress, and the chances of all this work being struck down are significant, so we have to do the leg work. thank
11:23 pm
you. i will yield for the second round. >> thank you. >> thank you, chairman and ranking member tillis, for bringing this bill before us. as you say, mr. chairman, the bill has gone through a lot of input from a lot of different groups. if i listened to your testimony accurately, it does not sound as though any of you think that we should not do something that will protect. i like the framing of protecting personhood. do any of you think we don't need to do anything in this area? looking at the statute, let's go down the list quickly: what do you like most about the current bill, and we will start with mr. kyncl, what about the current bill and the most important thing you would want to change, if anything. if you could keep your answer really short.
11:24 pm
>> i will start with what i believe it needs to contain, which is that it needs to contain consent for the use of people's name, likeness, and voice to train a.i. models and create outputs. that is what needs to happen. second, it needs to contain monetization, which is a fair market license that a person can exercise through consent. in order for that to happen, and in order for that to be operationalized by the platforms, we need two things. one is for the provenance of the content that generative a.i. models are trained on and outputting to be retained, which means they should keep sufficiently detailed records on what they trained on so that later on,
11:25 pm
that provenance can be embedded in the watermarks recognized by the platforms on which the content is. >> the point is simply that consent is a critical part of this, consent of the creator. >> and the provenance of the content. we are good at tracing provenance of clothing, cheese, wine; we should be able to do it with intellectual property as well. >> going down the line, we are talking about this particular bill: is there something in the bill you think is the most critical aspect of the bill that you support? is there anything you would change in the bill? >> i think the most important thing is to put the power in the hands of the artist. i want to be in control of my likeness, my brand, my legacy. i have sacrificed so many years to be good at dancing and singing, so much financial
11:26 pm
input, so much time, and i do it in the name of my legacy. i do it so that one day i can look back at my body of work and say that was me. that is what i want to be protected in the bill. >> thank you, senator. i think what i like most about this bill is the fact it is broader than limiting it to commercial use. the fact is, a commercial use limitation may have worked 100 years ago; a commercial use limitation does not solve the problems that we face today, especially because of generative a.i. we need the breadth reflected in this legislation. in terms of the one thing i would change, i would adopt a durational limitation on transfers or licenses of the rights during lifetime. it may not be as necessary after death, but during a lifetime, i think it is essential to make sure someone does not improvidently grant a transfer of rights early in their lifetime that
11:27 pm
turns out to be unfair to them. i think there are various standards we could look at for appropriate duration. >> 70 years is a bit long. >> i'm sorry, 70 years is the duration of the right in the bill after death; i'm talking about the duration of a transfer during life. if you had a 21-year-old artist granting a transfer of rights in their image, likeness, or voice, there should not be a possibility of licensing that for 50 years or 60 years during their life and not have the ability to renegotiate that transfer. i think there should be a shorter, perhaps seven-year limitation on those. >> that makes sense. >> senator, one thing we like about the bill is the first amendment exemptions. we think they are most of the way there in giving members clarity and certainty they need. i think they could be improved a little bit; we have specific, fairly technical changes we recommend. one thing we would recommend changing is there is currently essentially no pre-emption provision.
11:28 pm
we think it should be the opposite for the reasons i was just discussing with senator tillis: a novel law with uncertainties around the first amendment limits. we think it would be important not to pre-empt all existing state right of publicity law but to pre-empt state regulation of digital replicas in expressive works like movies, tv shows, and songs that are protected by the first amendment. >> thank you. the question -- >> mr. chairman, you mind if we continue with responses? >> building on some of the things said, the effort to protect personhood is something we very much encourage with the draft, and the fact it is a discussion draft. i think in terms of the key areas we want to focus on, it is where liability sits, and we would encourage it to be focused on the creator and those first releasing the
11:29 pm
content. we would prefer it was based around right of publicity laws, an existing body of law, rather than ip; actual damages rather than statutory damages; and a pre-emption provision. >> professor ramsey. >> thank you for your question. what i like most about the bill: i love the specific exemptions from liability, even though there might be additional revisions that should be made, and the fact you are protecting personhood. i will note that state right of publicity laws will sometimes apply to noncommercial uses of a person's identity. the zacchini case involved the broadcast of his entire act in a news report, and that is not commercial use. the comedy iii case, a california supreme court case, applied the law to a literal depiction of the three stooges in a lithograph, which is not commercial speech. there are some circumstances
11:30 pm
where current laws do apply to noncommercial uses of a person's identity, and also false endorsement laws, which can be implicated even with noncommercial speech. what should we change? i think there's a tie; you said to pick one but i have to pick two. i think we need separate causes of action, as i mentioned before, with distinct defenses. for example, a disclaimer might make sense if you are targeting deceptive impersonation of someone because it dispels confusion. a disclaimer does not make sense with a sexually explicit deepfake put out without consent. you might have different requirements with regard to commercial use: for a general broad federal right of publicity, you might have a commercial use requirement, whereas, when you talk about sexually explicit deepfakes or impersonation, it could apply to commercial and noncommercial speech. the other part of the tie is the provisions with regard to no limits on the scope of licensing. my concern is
11:31 pm
individuals without significant bargaining power at the early stages of their career might sign a contract, it may be a long contract with the digital replica provisions written in it, and sign away the right to their identity for a lengthy period of time and use in any context. i would like to see some way for congress to encourage or require those folks who are negotiating these agreements to perhaps have a specific use authorization for a certain movie, as opposed to use of your digital identity in any context. instead of rights signed away for a lengthy period of time, 50 or 60 years, i would say one to five years. i'm not an expert in the area on what is a good term, but i think it is critical to make it shorter rather than longer because a lot of people, even if they have attorneys, will not have the kind of bargaining power that the big studios, the big music companies will have.
11:32 pm
>> thank you, mr. chairman. i think what the professor is suggesting, different causes of action, is very intriguing but complicated, so we will think on it. thank you very much. >> senator blackburn. >> thank you, mr. chairman, and thank you for your good work on the bill. we have spent months working on a discussion draft and moving this forward, so i'm so pleased that today we are to the hearing stage on this. i represent tennessee, so it does not matter if you are on beale street or music row, or maybe you are working with naxos or one of the symphony distributors; we distribute more symphonic music out of nashville, tennessee than anybody else in the world. we have gospel, contemporary christian, church music, bluegrass, we have the museum of
11:33 pm
african american music. it is all right there. we are really so protective of our creators, and in tennessee, we have the good, the bad, and the ugly relationship when it comes to a.i. all of our people in manufacturing and logistics and healthcare are innovating and going to town with it, but i'm deeply concerned about what is happening to the creative community. we are working on no fakes and making certain that there is a way for artists to protect their name, image, and likeness, their voice; there is a way for them to exercise their constitutional right, to protect intellectual property, and to benefit from that property. that is going to be so
11:34 pm
important. mr. kyncl, i want to come to you. i appreciate the comment you made when we were visiting, preparing for the hearing. you said we got data privacy wrong. we still have not done data privacy, and we cannot afford to get a.i. wrong. it is going to require that we take action. tennessee stepped up last month and passed the e.l.v.i.s. act. this is a great piece of legislation. mr. chairman, they took much of what we put in the discussion draft and they put it in place to protect our innovators and to give them that state right of action. not all states are following suit on this, of course, and i
11:35 pm
think what we have done is establish that baseline for a federal action. i would like to hear from you, if you will, sir, about the need for a federal standard, a federal pre-emption on that right of action. >> thank you for your efforts on the e.l.v.i.s. act, truly groundbreaking. we are in a unique moment of time where we can still act and get it right before it gets out of hand. the genie is not out of the bottle, but it will be soon. as you mentioned, senator blackburn, we got it wrong on privacy; we waited too long. we can't get it wrong on identity. it is simply far too important.
11:36 pm
the speed at which this will happen will be afforded by the open sourcing of foundational a.i. models which are being developed. once that happens, everything accelerates exponentially, and therefore it is imperative congress acts this year. thank you. >> we have heard some commentators say that you have existing law when it comes to privacy or personal property and intellectual property protections, that you can rest on that existing law and that it is sufficient to go in and get a takedown order on some of these a.i. fakes. talk to me about why that is not sufficient. >> today, if you think about privacy, how many spam emails
11:37 pm
do you get every day in your inbox? quite a lot. your personal information is leaking everywhere, whether it is sold or taken; it is not safeguarded properly. when that happens with your face and your voice, it is a whole new game. this will happen at a volume impossible for every single person to personally manage, which means it has to be solved with technology. it is technology that will unleash it, and it has to be technology that helps manage it, which is why it is important for us to work with the technology platforms to solve this, and we have to have a working bill and working law that can be operationalized by all of us. the existing framework is simply whac-a-mole and does not work. >> let me ask you this, mr. chairman, if i could get one
11:38 pm
more question: do you think the platforms should be held responsible for unauthorized a.i. fakes they are continuing to allow to be distributed? >> i think we need to develop conditions that they should meet, and if they don't, then yes. there has to be an opportunity for them to cooperate and work together with all of us to make it so. that, i think, is the detail work that needs to happen. when we achieve that, it will work, and there will be good actors, and many of them are. i think it is through that collaboration that we wrestle this down. >> thank you, thank you mr. chair. >> thank you, senator blackburn, for your cooperation moving forward on this great bill. i have a series of questions to ask about potential tweaks, so i will try to move relatively quickly.
11:39 pm
mr. sheffner, you testified we have to include first amendment exemptions for uses in works that have public interest or newsworthy value. some people say that any work involving a celebrity is newsworthy or in the public interest, and that raises the challenge of how we define first amendment exceptions to ensure they don't just swallow the rule and permit all kinds of uses the bill is trying to stop. i would be interested in your views on how we narrow that, and professor ramsey, how would you craft the first amendment exceptions to make sure that they don't swallow the whole bill, with particular regard to what is newsworthy? >> sure, senator. we talked about it with your staff, who we have a great relationship with, we talked to stakeholders and listened to concerns they raised that maybe these exceptions are overbroad and could swallow the right itself. we have listened and suggested tweaks to make sure that those
11:40 pm
types of exceptions do not apply if the use of the digital replica is deceptive. we do not support fraud; fraud is not protected by the first amendment, and it should not be allowed. but one other thing i would say is that these types of statutory exemptions have been routinely included in state right of publicity laws over the last 25 years or so, since the late '90s. one thing we have not seen is abuse of those exceptions; they have worked very well at separating out the uses where you should need to get permission, putting somebody's face on a billboard or lunchbox, versus the biopics. >> got it. professor, briefly. >> christine farley and i recently wrote a paper about how we can balance trademark and free speech rights when someone uses a trademark and information
11:41 pm
in expressive works like a news report, entertainment, things like that. i think the proposal in that context might also work here. as you mentioned, some of these kinds of uses can be bad, impersonation, et cetera. one approach, in addition to listing out specific defenses: if an informational or expressive use contains a false statement or false representation, so you say this is a certain celebrity when it is not, or a certain teenage girl when it is not, that would be actionable, still, even though there is some argument it is expressive. or, if the use is likely to mislead a reasonable person about the source of the message or the speaker's identity. that way you would be able to at least have courts consider whether it is an informational or expressive use, with a safety valve so that if it is really causing harm because it is deceptive, you could still regulate it. >> understood. mr. davies, today you raised the concern that the bill lacks a mechanism for showing that
11:42 pm
your members have knowledge. should we incorporate a notice and takedown structure, and if so, should it be the dmca notice and takedown provisions, or is there another mechanism you would urge us to consider for establishing knowledge? >> thank you for the question. in terms of the current situation, our members, handling the leading streaming services and the majority of music streaming consumption, have processes that are working very well. i think the example you have used, and the other drake example, which is a common one, show there being no challenge in taking down the content expeditiously. we don't see our members needing additional burdens or incentives here, but we do understand the committee is keen to look at this; if there is to be secondary liability, we would very much seek that there be a safe harbor for effective takedown. i think the dmca takedown process we don't see as being a
11:43 pm
good process here; it was designed for copyright, and our position is that this is a different set of rights. that said, our members can absolutely work with the committee in terms of what we think would be an effective notice and takedown, building on the points the professor made: it is essential that we get specific information on how to identify the offending content so that it can be removed efficiently, we need information on the notifier in terms of why the content is offending and on what basis, and also notice so that, if there is an objection, it can take place. >> i want to talk about pre-emption briefly. professor, several witnesses described existing state right of publicity laws as a difficult-to-navigate patchwork. should the bill broadly pre-empt state laws or limit
11:44 pm
pre-emption to unauthorized digital replicas? >> i teach right of publicity law in my property survey; i think we need a federal right of publicity law. state laws are so different. if you go to jennifer rothman, she has a great blog that talks about the different laws; even within a state, statutory provisions have different rules than the common law provisions. >> your answer is yes. >> so yes, i'm just building up: we need pre-emption. the challenge is that, obviously, congress is doing a great job trying to get this right. if you get it right, then you pre-empt state laws, and it simplifies everything for litigants and judges. instead of having to figure out which law is going to apply in a particular case, there is right now forum shopping going on; people will file suit in whatever state is best for their interests. so yes, we need it. >> the 70-year postmortem provision is modeled after the copyright act.
11:45 pm
postmortem rights are important, but we understand 70 years is a long time, especially for individuals who do not commercialize their image, voice, or likeness. i would be interested, jump ball, several of you, perhaps mr. sheffner first and then others: should postmortem terms be longer for individuals who commercialize image, voice, and likeness? should they be limited, or reviewed and re-extended every decade or so? how would you handle postmortem rights? the draft has 70 years of rights postmortem; some of you enthusiastically supported that as part of your creative legacy, others have raised concerns. mr. sheffner, you kick this off, and we will do this quickly. >> sure, we view this through the lens that it is a content-based regulation of speech. as professor ramsey said in her opening statement, content-based regulation of speech needs to be justified by a compelling government interest
narrowly tailored to serve that interest. what we have said is that for living professional performers, use of a digital replica without their consent impacts their ability to earn a living; you have a compelling government interest in regulating there, and it would be appropriate for congress to regulate. postmortem, that job preservation justification goes away. i have yet to hear a compelling government interest in protecting digital replicas once somebody is deceased. i think there will be serious first amendment problems with extending a right that applies to expressive works postmortem. >> any other witnesses think preserving the legacy and property rights of an individual is worthy of some protection? professor ramsey and then mr. crabtree-ireland. >> this will not shock you, but i will say it depends on the goal of the law. if we are talking about a law regulating deceptive uses of someone's identity, talking
about a law that is governing sexually explicit deepfakes, it seems to me it is fine to have those rights last long-term, possibly life plus 70. talking about protection of a broad federal right of publicity, maybe not so much. i've not written in this area but would recommend looking at the works of people who have, like mark bartholomew; jennifer rothman is working on a paper. >> mr. crabtree-ireland. >> to me it is shocking to say this right does not deserve to be preserved and protected after death. after all the reasons twigs stated about how personal it is, it is an economic right, it is a personal right, and something that has real value. that it should dissipate upon death and make itself available to big corporate interests like the ones represented by folks here does not make any sense. i would argue there should not be a 70-year limitation at all. the right should be perpetual, and the reason is that every
one of us is unique; there is no other twigs and there never will be. there is no other you or any of us. this is not the same thing as copyright. it is not something we will use to create more creativity on top of later. this is about a person's legacy; it is about a person's right to give this to their family and let their family take advantage of the economic benefits they worked their whole life to achieve. from my perspective, this is an intellectual property right that deserves protection, and it should absolutely be protected after death. i'm waiting to hear a good reason why it shouldn't be, to be honest. >> in perpetuity, or not at all. make this brief. >> i agree with mr. duncan crabtree-ireland 100%. >> thank you. twigs, would you like to make a comment on that? forgive me. >> i worked so hard throughout the whole of my career; when i die, i would like anything i created to go to my family and
my estate, which would have clear instructions on the way i want to preserve my history and all of the art that i created. >> thank you. >> senator blumenthal. >> thank you, mr. chairman. i got off a plane 20 minutes ago coming from connecticut, so i do apologize for missing the bulk of the hearing. as you may have heard, we had no votes yesterday, so today was a partial day off and i had plans in connecticut, so i'm grateful to all of you for being here. we are very hopeful you are in good health and continue creating, and i'm a big fan of your work, so thank you for being here in particular. thank you, mr. chairman, for having this hearing, which focuses on a bill that you are going to introduce; i would
like to be added at the appropriate time as a cosponsor. i'm a strong supporter, and i believe there ought to be a federal right for people whose image and voice are used without their consent, whether it is an actor or a songwriter or a singer or an athlete. what is shared here is a right in one's own likeness and creation as a person, an individual right. i think there ought to be a right to take legal action under that right; a right without a remedy is unavailing, as we know from our first year in law school, which for me was quite a few years ago, but i have seen it repeated again and again in real life as a prosecutor, as an advocate, as a litigator. i would also like to focus on a complementary
remedy, which could be watermarking or identification, attribution, giving credit. not just the deepfake and the right to recover as a result of the use of it without attribution or credit, so to speak, without watermarking, but also that kind of identification, public crediting of a work. i'm asking not only in the abstract; i chair a different subcommittee, privacy, technology, and the law. the ranking member of that subcommittee and i, senator josh hawley of missouri, set forth a framework; it is the most comprehensive bipartisan framework right now. we should do more, adopting the kind of measure the senator and others have
proposed. it would provide a requirement for watermarking as well as an entity to oversee it. >> i'll take it. thank you for all of your work on this important issue.
i think, you know, without attribution, without use of watermarking, we won't be able to operationalize what we're talking about here today. so you're focusing on absolutely the right issue. and i think the important part in this is to determine the provenance of content that's being displayed, the degrees of similarity to the original, and then it is up to the rights holders, whether it's artists, music companies, movie studios, et cetera, to then negotiate the commercial relationships with the platforms, separate and aside from the laws and how it all works, using all of those mechanisms. we've actually done this when i was at youtube. this is precisely what we have done with user generated content. we've just done it in the copyright scheme, where it was the exact content referenced,
and so we built a whole framework around that. this is merely that on steroids, adapted for the ai age, with, you know, many more shades of gray and much more speed. but it's really just upgrading that; the framework exists. it has been developed by companies like youtube, which is best in class on that. and therefore, i'm hopeful that we can take it further and apply that to ai -- degrees of similarity, using watermarks to label content and --. >> thank you. >> another question. i mean, i can only really talk from personal experience. in the last six months, i had 85 of my songs leak online, which is basically the whole of
my experimentation for my next album. it was really scary, because it felt like having the whole of my notepad, i guess, of all my ideas being put out to the whole world before it was ready. but on the flip side of that, i felt very secure, because i was able to call up my label and say, hey, this has happened, and immediately they could go and take it down, and it just disappeared, and now you can't find it. so i think that watermarking would protect artists, because then we'll have a point of -- to go to to say, this has happened, and immediately, wherever it's been leaked online or put online, it can be taken down. but one thing i will say is that the thing that's really scary is once something is out in the world, we can't take it back. so if someone uses my likeness and says something offensive or harmful, people might think that that is me, and we've all seen, in the news, when someone does
something wrong, and the big story is, like, on the front page, the only thing on it. then it turns out they actually didn't do something wrong, it was a mistake, and the rewrite of it is so small. and i think that's the thing that i'm scared about: even if something does get out in the world that's not me, it's the reputational damage that it will do, and the financial and cultural harm that won't be able to be amended after the fact. >> very good point. if the chairman would give me a little more time, i'd be interested in the others' answers. thank you. >> i agree with mr. kyncl on the value of watermarking and other tools as well, c2pa, the coalition that is working on that. but i also just want to caution, especially on deepfakes: it was mentioned earlier, the idea of disclaimers solving problems there, or the idea of watermarking solving problems there. we also have to make sure that
tools that we use to protect against abuses of these technologies are realistic. and so expecting viewers of content online to read deeply into captions to find disclaimers or things like that, that doesn't really solve these problems. so i hope, as the committee considers what to do, it's not enticed into thinking that that type of solution actually solves the problem. it needs to be more front facing, so that the message that's delivered is received by all those who view it. >> thank you for the question, senator blumenthal. as mr. kyncl was talking about, in the copyright context, watermarking has proved useful in certain contexts -- youtube's content id system, which has been a great help in reducing the presence of pirated material on that platform. i would just say, again from our experience in copyright law, though, it's not a silver bullet. it sometimes can help identify the original source of material, but just because it's
out, just because it has a watermark on it doesn't stop it from being further disseminated, et cetera. so there's really no silver bullet in this context. >> thank you. my answer is going to build on things that have already been said. i think robert talked about the partnerships between the services and the rights holders. these are absolutely essential; this is where the content comes from for the services. so we're very reliant on them, on the data, on the metadata that exists. i think it would be true to say that data in the music industry already has significant challenges, and these are challenges we work on together. >> thank you. professor? >> so i'll incorporate by reference everything that's been said before, but then also say that i think someone using a digital replica to impersonate someone, or basically put out a sexually explicit deepfake, they're not going to use this kind of technology. so it's not going to help in
certain circumstances. >> and i met this, i think i use the word, monterey. if not, i meant to say, elementary. it's not a sub, i didn't need a substitute. so i think all these comments is very helpful invalid. >> on behalf of the chair, senator club char. >> thank you very much. that was an ai attempt, i know it was. and assailed kind of closed, not quite. okay. professor ramsey, since you ended there, i'll pick up where you were about some of these and some of the other witnesses mentioned about this deepfakes and how some of these things, whether it's sexually explicit images or whether it is the political robo calls or videos or ads, and i wasn't going to start this way, but it makes sense here, because of what you just said. to me, some of this, we just had to get off there. they're not going to be able to listen to a major candidate for president for three minutes, and then think it, then look and see a label. and i think that in other
12:00 am
countries, that's what they've done. that's why senator hawley and senator collins and a number of other senators have come together; we're marking up this bill, along with a labeling bill, in the rules committee on elections. could you talk about why that kind of targeted approach to some of these, like, hair on fire things is very important, given the timing of all of this? >> as you can expect, i love the fact that you're working on these targeted laws. but again, one of the things we need to do is protect ordinary people from impersonation. over thanksgiving, someone called my dad when i was standing right next to him. it sounded just like my brother, and he said he was in jail and he needed money to get out of jail. and my dad was not duped by this, but, you know, some people have been, as the senators have noted. so i think it's a great idea, but i think that, you know, we still need the broader act to deal with these kinds of issues for folks that
are not politicians, et cetera. >> exactly. and my state director's son is in the marines, and she, her husband, got a call where it was an impersonation; they scraped his voice. they didn't know where he was stationed, so we're going to see all of this deployed against military families as well, really all these kinds of scams. so it's going to be, i see this, you know, having some of the great uses, especially in healthcare, of ai, but then there's the hell part, and it should be our job to try to put the guardrails in place, which is why i'm so honored to be working with senator coons and tillis and blackburn on this bill. so one of the things that interested me during the testimony, you, mr. sheffner, and mr. crabtree-ireland, you kind of got to this, but both the no fakes act and this elections bill include exemptions, exceptions, for the use of digital replicas to ensure the bills do not chill speech protected by the first amendment. can you talk a little bit more, as we look at how we can write these in a way, as i have tried
with exceptions for satire in the elections bill with senator hawley, how we can do this to ensure that commonsense safeguards do not chill protected speech, and that this is upheld in a court? >> right, so, senator klobuchar, i just want to say, agreeing with professor ramsey, that i think your approach of having specific legislation on pornographic deepfakes and other legislation on election related deepfakes is really the right way to go. when you have a broad bill that essentially says you need permission to use digital replicas, and then lets courts kind of sort it all out, that's where you get into trouble, and you have an overbroad bill that is going to necessarily end up encompassing protected speech, which makes it vulnerable to being struck down on overbreadth grounds. so these kinds of exceptions, i think, are specific to the type of legislation. in the world of movies, our studios, the studios that we represent at the mpa, make a lot of movies that are based on or inspired by real people
and events. i went through, this morning, all the best picture nominees over the last five years; approximately half are based on or inspired by real people and events. our studios want to make sure that legislation like this doesn't interfere with their ability to do that. when you're talking about, say, nonconsensual pornographic deepfakes, you don't need those exceptions for biopics and satire and parody. that stuff is bad in almost every circumstance you can think of. and i think this narrowly targeted approach is really the right way to go. >> okay, so mr. duncan, you got the best long name in the world. >> thank you. >> could you talk about balancing that right of creators with the right of those whose voice or likeness may be at risk, sitting next to one of them right there, with twigs? and how do you believe we should balance that? >> absolutely. you know, i think we all agree that, obviously, the first amendment has to be
protected, and that expressive speech is important. i think, you know, the exceptions that are written into this discussion draft now are not that far off, but i think it's important that they not be expanded upon, nor be broader than necessary. because the fact is, we can't anticipate what this technology is going to do tomorrow. we cannot anticipate every iteration of this. and while there are certain specific uses, or concerns, that are being addressed by legislation like the legislation you referenced, there is a broader need for protection. the example i gave in my opening statement is one. twigs has given examples as they apply to her. and so we do need to have that proper balance. and i am concerned that we are only looking at one side of the first amendment consideration here. the other side of the first amendment consideration is the right that each of us has to our own freedom of speech, to be able to communicate our ideas, to associate ourselves with ideas that we want to associate with, and not be associated with ideas we disagree with. and that is being really trampled on right now by this
unfettered ability of people, without a federal right, to do things like the deepfake ai experience that she experienced, et cetera. and so i do feel like the committee is going to have to work on, you know, defining these exceptions, making sure they are no broader than necessary, to keep the legislation viable but also to make sure it doesn't swallow up the rule, like the chairman said. if we make them so broad that they swallow up the rule, then all of this work will have been for naught. and the reality is, today is not like 10 years ago. it's not like 30 years ago. this technology is fundamentally different, and what it can do with all of our faces and voices calls out, it screams out, for a remedy that's actually effective. >> and do you see, maybe anyone, twigs, any of you, mr. kyncl, want to get at this need for a national standard? just because senator blackburn's worked with us on this bill, and is going to be a cosponsor, and they just did
the -- of course, in minnesota, we have the dylan act and the prince act. no, i just made that up. but we do have people, as you know, who are fiercely, fiercely independent and protective of their incredible music in our state, and, but we have a common law in minnesota that's helpful. there's, like, this state, that state. talk about, a few of you, if you want to, just this need to have this national standard and why it's so important. >> i just want to comment on some of the things from before, which is, as someone who grew up without the first amendment, i value it probably more than those who have, because i do not take it for granted at all. and it seems alive and well in america, because half of the movies that were nominated for oscars, you know, were based on, you know, existing folks. so, i'm saying that any, you know, ai regulation that is respectful of the existing first amendment is not reducing it.
it's keeping it as it is, and it's alive and well. so i do think that we need to stay within the limits of the first amendment and not go beyond. as to national regulation, we work with global platforms. we're talking about global platforms, not even national. for global platforms, doing anything state-by-state is a very cumbersome process. twigs's content getting on a platform unauthorized, if we had to fight that on a, you know, state-by-state basis, it's untenable. it just doesn't work. >> very good. mr. davies, that'll be my last one, and then we'll go ahead. >> thank you. i just need to reinforce what rob has just said, you know? absolutely right. you know, music streaming is global. the success of this is having access to twigs's music from the uk or from tennessee or wherever. so it's high-volume. anything that adds complexity on a state-by-state level is a nightmare to this
industry. so we're very strongly in favor of preemption. >> very good. just the last thing, kind of along those lines -- don't laugh, it'll be very fast, and you can put it in writing if you want, mr. davies. in january, we heard testimony that generative ai has been used to create unauthorized digital replicas of news anchors making comments, and we have a number of things going on in the journalism area. i have a vested interest; my dad was a journalist for the minneapolis star tribune. but also, with senator kennedy, i have the bill to push for negotiation over content, and to get them reimbursed, mainly from google and facebook, for the use of this content, something that's going on in australia and canada, and i will not go on. but what steps can streaming services take to ensure that unauthorized digital replicas of journalists are not posted on the streaming platform?
>> senator, if i could follow up with you after; i'd want to get briefed on that. >> okay, excellent. thank you. >> thank you, senator klobuchar. back to senator tillis for his second round -- twigs, if you'd like to comment. >> oh, thank you. i'd actually like to go back to mr. sheffner's point about the desire to make very big and financially successful films about artists without consent. i think the problem is, if you're able to use an artist's voice and likeness without consent about their life story, you're giving the impression that it's, i guess, the equivalent of an autobiography rather than a biography, you know? and that's the confusion. if you're able to use my voice and my exact face, you're saying, this is what happened from my point of view, and it's not. it's what happened from a team of writers in hollywood that want to overdramatize things and maybe make it more tragic or, you know, more fantastical.
and i think that's what makes me really nervous and feel uncomfortable and very vulnerable. i don't think it's fair that even after an artist is deceased, somebody would be able to make a film about their life using them, you know? we can watch a film about a person, a star from the past, and if it's an actor, we know to take it with a pinch of salt. if it is the person themselves, then it just feels too unclear, and not fair, and actually not in, not in, what am i trying to say? not in the best interest of the artist's legacy. >> thank you. >> thank you. senator tillis. >> thank you, mr. chair. i'm going to be brief. i did have a question for you, mr. crabtree-ireland. under the current draft legislation, individuals only have a right to license out their digital
likenesses if they hire an attorney or they're a member of a labor organization. we've gotten some feedback, to your organization in particular, that this is a giveaway. can you give me other examples in law of this giveaway, or really vectoring everybody into legal counsel or a union? can you give me examples, other areas in law where this is the case, where you had to engage an attorney or a labor interest to move forward? >> sure, and i guess i would just say, it's not just our union. it would be, you know, any collective bargaining representative. but there are a number of examples in our current labor law, labor and employment law, where there are defined worker protections that then can be deviated from through a collective bargaining arrangement, but not through individual contracts. in this case, the proposal, i think, is a little broader, a
little more open, because of the option of securing individual representation by an attorney as an alternative, which is not normally present in those kinds of statutes. but i'm sure i can provide, i can't give you a longer list right now, but -- >> for the record, we're going to be submitting questions for the record for all of you, to provide an opportunity for additional information. mr. chair, i'd just say it's remarkable. if you take a look at the attendance in the audience, and the engagement from the members here, you're hard-pressed, i mean, on certain subjects, but on technical subjects like this, to have members come twice, or, a lot of times, it demonstrates the interest. twigs, i'm going to end my questions with you. i do believe that congress needs to act, but you need to understand that it's tough to get virtually anything done, even what appears to be common sense, for the reasons we've talked about. we're going to have constitutional questions we'll have to address. we have to get to a number of matters. and hopefully we do get it done this year. but you are, in your opening
statement, you were emotional, or appeared to be emotional, on one or two points, and i'm just trying, i think we need to understand, excuse me, one of the reasons maybe you got emotional is because this is an existential threat to creators. and i'm trying to figure out how we educate people on the difference between an original creation from a human being and something that was either created or augmented by a machine. and this is more of a societal thing that we have to sort out. at what point is society just prepared to say, boy, that sounds as good, i know it comes from a machine, it's not -- you mentioned something about the investment of your fans, that they've made in you. how do you invest in a relationship with a machine? i mean, we're at an interesting point in time in history where we could have billions of people think the inauthentic
creation of a machine is somehow as good as the hard work of a human being. i wonder, when we lose all the creators, philosophical question, at what point can those machines ever possibly match the creative genius of an individual? >> thank you. >> it's okay -- which makes no sense to me. >> i think there's two things here. i feel incredibly lucky to have spent the whole of my teenage years without a smartphone, so i straddle a generation where i memorized all my friends' numbers, i would walk to my friend's house. if we said we're going to meet at 1:00, i just would have to be there, you know? there was no texting and saying that i was going to be late.
i loved my brain back then. i loved how simple it was. i loved what truth was back then. i loved that i was able to think for myself. even where we're at with the internet now, it's so confusing, you know; even if you just want to find a simple news story, we can't. even if you want to find the truth about, you know, whether a food, even, is good for you or bad, we can't, you know? it's just a stream of nonsense. i look at a lot of my friends that have children and teenagers, and their mental health is really struggling. we're looking at young people that have anxiety, that have depression, because they're overwhelmed with information, and lack of truth, and lack of stability. the thing that scares me is that my fans look to me for a north star, a message, a sense of being. my work is something that they can find themselves in. and if you change the narrative of my work, we're just messing
with their brains. you know, the, like, solid essence of my work that i spent 10 years developing, if someone can just take it and make up something completely different, i feel so bad, because i'm harming people then, and there would be nothing that i can do about it. i think the way that we can prevent this from happening is putting the power in the hands of the artists, and also putting the power in the hands of people that are there to protect the artists, whether that's third parties like record labels or agents, or lawyers, you know? that's up to the artist to understand and to, you know, sign a contract, if we want to, you know? but i think the way that i've been experimenting with deepfakes is going to help my fans; it's going to help them understand the nuance of my language across all parts of the world. like, the way that i want to use it is not harmful. that's, i think, inherently, artists just
want to express their emotions, and say things that you can't say for yourself. so if you're putting words in our mouths, it's going to be devastating. >> i also agree. i'm very glad there were no cell phones back when i was a young person, but maybe for other reasons. and polaroids fade, but, but no. i do think that, you know, i'm glad that we're taking up this bill. i do feel strongly that we should do everything we can to try and move it in this congress. if not, then we just have to lean into it and get it done in the near future. but when we have these discussions, it points to all the other societal challenges, challenges for creators, that we need to get right. this technology, i love it. i interact with generative ai for about an hour every day as a part of my own study of it, a study that began back in the 1980s into artificial
intelligence, but for me personally. but we've got a lot of work to do. congress has a role to play. but we've got to be very, very careful not to overstep, not to trample the rights of others, and we're going to need your help and your continued engagement to get it right. so thank you all for being here today. >> senator tillis, thank you. thank you for, again, being a great partner. i have even more questions, but we have come to the end of our time, and you and senator blackburn have been terrific to work with. i am grateful to all of our witnesses for the way that you brought your skills, your value, your background, your creativity, your voice to this hearing today. and we've engaged in a lot of different challenging questions about how we could refine this, how we could narrow it. there have been a lot of members who participated. for those who did not participate, or those who still have other questions, the record will be open for questions for the record for the witnesses. they are due one week from
today, by 5:00 p.m. on may 7th, although, twigs, in your case, two weeks, before we wrap this up in cellophane and move forward. if i could, today's hearing was important to show that when we regulate the use of ai, we have to balance individual privacy rights and first amendment rights in a way that doesn't stifle creativity and innovation with these rapidly developing ai tools. it reinforces what we've heard today, why we need a clear policy to protect the image, voice, and likeness of all individuals from unauthorized ai replicas. and the feedback we heard, and that our staff has received over the last six months, is critical. i look forward to working with my colleagues and cosponsors and the witnesses and the others who attended today to refine this in the next week or two and get to the point where we can introduce it next month, so we move from discussion draft to reality. i think we need to seize the moment and move forward. thank you for your partnership, thank you for your testimony.
with that, this hearing is adjourned.