Understanding the Legal Side of AI Films: The Case of Kannada film ‘Love You’
- Isheta T Batra
- Apr 24
- 12 min read

Introduction: The Film That Raised Legal Eyebrows
In a surprising but defining moment for India’s entertainment and technology sectors, Love You—a full-length Kannada film generated entirely by artificial intelligence—has made headlines across the country. Directed and produced by Narasimha Murthy, a temple priest from Siddehalli village near Bengaluru, the film is being touted as one of the world’s first AI-generated feature films. But beyond the novelty, it’s setting the stage for something even more significant: a legal conversation India has yet to fully have.
For content creators, production houses, digital platforms, and media lawyers alike, this isn’t just another indie release—it’s a case study in what happens when creativity, AI tools, and legal grey zones collide. Who owns the copyright? Can moral rights apply to machine-generated scripts? What happens if AI unintentionally mimics someone’s voice or likeness? The answers aren’t just academic—they’ll shape how the Indian film industry contracts, monetizes, and protects content in a future where AI is no longer behind the scenes, but at the center of it.
As more creators experiment with AI-generated stories, performances, and visuals, Love You may well be remembered not only as a cinematic experiment, but as the moment India’s legal framework had to play catch-up with its creative ambitions.
Authorship in Question: Can AI Own Copyright?
Following the buzz around Love You, the core legal debate that has emerged concerns authorship and ownership. In traditional filmmaking, copyright vests in the person who creates the original work—be it the scriptwriter, director, composer, or producer. But what happens when the “creator” is a neural network or an AI model trained on data sets scraped from the internet?
Under Indian copyright law, the concept of authorship is intrinsically human. Section 2(d) of the Copyright Act, 1957 defines the “author” by reference to the person who creates the work, and for computer-generated literary, dramatic, musical, or artistic works, Section 2(d)(vi) deems the author to be the person who causes the work to be created. On either reading, a machine cannot itself be an author. So, when AI tools like ChatGPT, Midjourney, or Runway are used to generate scenes, dialogues, music, or even entire edits, legal authorship still defaults to the human who directed the AI to perform the task. But how do we assess originality or creative contribution when the human role is limited to prompts?
In the case of Love You, Narasimha Murthy’s role as the director who fed inputs into AI systems could arguably qualify him as the “author” or “owner” of the resulting content. However, without clarity in statute or jurisprudence, this ownership can be challenged, especially if any part of the film unintentionally reproduces existing copyrighted material pulled from training data.
For production houses and OTT platforms considering AI-generated content, this uncertainty can translate into real business risk. Contracts must now include warranties and indemnities related to AI-generated output, and stakeholders should carefully consider how to document creative control, input-output chains, and authorship attribution.
As India looks ahead to a future filled with AI-driven storytelling, this is no longer a hypothetical concern. Clarifying copyright authorship for AI-generated works is not just a policy issue—it is fast becoming a commercial necessity.
Moral Rights in an Artificially Created Work
Another legal grey zone surrounding Love You and other AI-generated films is the issue of moral rights—particularly the right of attribution and the right to integrity, as outlined under Section 57 of the Indian Copyright Act. Traditionally, these rights allow an author to claim authorship of a work and object to any distortion or mutilation that may harm their reputation. But how do these rights apply when the work is not human-made in the conventional sense?
In the case of an AI-generated film, where the “creative brain” behind the visual, audio, or narrative output is an algorithm, who exactly is entitled to claim moral rights? Is it the human feeding the prompts, the data scientist who trained the AI model, or the AI company that owns the algorithm?
In Love You, Narasimha Murthy may be publicly seen as the face behind the project, but without a formal acknowledgment of his creative control in a legal framework, enforcing his moral rights could become murky. This is especially problematic if someone repurposes the film, edits key elements, or uses parts of it in a different context. If he isn’t recognized as the author under law, can he even claim that his reputation has been harmed?
For producers, studios, and OTT platforms evaluating AI-generated content, it is vital to address moral rights in contracts. This includes clearly designating the human “curator” of the AI work, outlining the scope of attribution, and explicitly waiving or limiting objections to future edits or adaptations. Failing to do so can expose the project to future legal claims from collaborators, prompt engineers, or even AI tool developers.
As AI begins to blur the boundaries of authorship and creative control, stakeholders must start preemptively building clauses that define, limit, or assign moral rights—before they find themselves in a legal battle over who gets to be called the “creator” in a machine-assisted masterpiece.
Performer Rights and Personality Protection
As Love You captures attention for being a fully AI-generated feature, another critical question surfaces: what happens when the faces, voices, or likenesses used in such films resemble real individuals—or worse, are modeled on actual celebrities without consent?
Under Indian law, performers are granted protection through performers’ rights under Sections 38 and 38A of the Copyright Act, and their image or voice also falls within the scope of personality rights, especially if they are public figures. However, in the world of generative AI, models can be trained on publicly available images, videos, or audio samples, often without the explicit knowledge or approval of those whose likenesses are being referenced or emulated.
In Love You, if the AI-generated characters appear eerily close to known actors or if the voices sound too familiar, it could give rise to personality rights violations or passing off claims. This is particularly concerning in a digital environment where AI can replicate speech patterns, facial expressions, and physical likenesses with remarkable accuracy.
For producers and content creators working with AI tools, this creates a legal minefield. While the algorithm may generate content “originally,” the data sets it was trained on could include copyrighted performances or images of living persons. Without proper clearances or licensing, this exposes the production to serious liability—even if unintentional.
Going forward, it is essential to:
- Vet AI datasets to ensure they don’t include copyrighted performances or personality likenesses of known figures.
- Obtain written waivers or licenses if AI outputs resemble actual individuals.
- Use disclaimers where needed to indicate the fictional nature or synthetic origins of characters.
In an era where AI is redefining performance itself, legal frameworks must evolve to protect the commercial value of human identity. Meanwhile, stakeholders in Indian media—especially producers, OTT platforms, and AI tech developers—must take proactive steps to mitigate legal risk and preserve trust in digital content creation.
Censorship and Certification: A Legal Vacuum
With Love You being positioned as one of the world’s first full-length AI-generated feature films, another grey area becomes evident—how does India’s certification regime apply to AI-generated content?
Under the Cinematograph Act, 1952, every film intended for public exhibition must be certified by the Central Board of Film Certification (CBFC). However, the law is built on the premise that a human director, screenwriter, and editor are responsible for the content. When a film is written, directed, and edited by AI—without a traditional “human author” in the creative sense—the regulatory framework doesn’t quite know where to place accountability.
Who is responsible if the AI-generated film carries subliminal messaging, harmful stereotypes, or misrepresents religious or cultural sentiments? Can the CBFC ask the AI to justify the choices made in scene construction, dialogue tone, or visual cues? Obviously not.
From a legal standpoint, there are two glaring issues:
Absence of Regulatory Guidelines for AI-Generated Content
There is no policy under Indian law that specifies how the CBFC should evaluate content that has no clear human author. Should it be assessed through the same lens as conventional films, or should a new standard be developed?
Accountability in Content Vetting
In traditional cinema, the director or producer is held liable for the content. With AI in the driver’s seat, creators may attempt to distance themselves from controversial outputs. However, that doesn’t absolve legal responsibility. Producers using AI must take ownership and ensure the content undergoes pre-clearance checks before submission to the CBFC.
In practical terms, AI-generated films still need CBFC certification, and producers must sign off on the content as per standard procedure. But the lack of AI-specific checks opens the door to regulatory loopholes. Until formal legal updates catch up, creators using AI must treat the AI as a tool—not an entity—to ensure compliance with existing censorship laws.
This also underscores the growing need for sector-specific policy dialogue between AI developers, filmmakers, legal experts, and regulators, to safeguard creative freedom without compromising legal accountability.
Defamation, Obscenity, and Hate Speech: Who’s Liable?
With Love You ushering in a new era of AI-generated cinema, the legal question of liability for defamation, obscenity, and hate speech becomes more than just academic—it becomes urgent. In a country like India, where creative expression routinely walks a fine line with public sentiment, AI-generated dialogue, visuals, or storylines that cause offense could easily trigger criminal or civil proceedings.
Here’s the real issue: AI doesn’t have legal personhood. So if a film generated by AI ends up defaming a real individual or spreading offensive stereotypes, who takes the fall?
The legal system doesn’t allow for ambiguity here. Under Sections 499 and 500 of the Indian Penal Code (IPC) for defamation, Section 292 for obscenity, and Section 153A for hate speech, a person must be held accountable. In the case of AI films, that liability ultimately circles back to the producer, financier, or the person who commercially exploits the content.
Key Risk Areas for AI Films:
Defamation
An AI-generated character may resemble a real individual or public figure—either by accident or due to biased data inputs. If the portrayal is derogatory, it can lead to legal consequences.
Obscenity
The AI might produce sexually explicit or suggestive scenes without a creator intentionally directing it. But that doesn’t absolve the humans behind it.
Hate Speech
If the AI uses language or visuals that vilify a community or promote enmity between groups, Sections 153A and 295A of the IPC could apply—both of which are non-bailable offences.
What Filmmakers Should Practically Do:
Human Vetting of AI Outputs
Treat AI as a creative assistant, not a final authority. Every script, scene, or line of dialogue generated should undergo legal and ethical review.
Bias Testing
Before using AI models, filmmakers should test them for potential cultural, racial, gender, or political bias. This can significantly reduce the risk of inflammatory or defamatory content.
Disclaimers Won’t Always Help
Simply stating “this is an AI-generated film” will not shield you from liability. Indian courts focus on the effect of the content, not just intent.
AI-generated cinema is not outside the purview of law. If anything, it makes the need for robust pre-release legal due diligence even more critical. As the regulatory framework evolves, one thing remains clear: creative innovation does not imply legal immunity—and that applies whether your script was typed by a human hand or an algorithm.
Contracts in the AI Era: Rewriting Talent and Production Agreements
As AI-generated content enters mainstream filmmaking, traditional production agreements are due for a serious upgrade. The emergence of projects like Love You—an AI-generated feature—brings new complexities to the table for producers, talent, and technicians alike. The big question isn’t whether to use AI, but how to draft contracts that reflect the realities of AI-driven production.
Most standard talent agreements—whether for actors, directors, directors of photography, or music composers—presume human creativity as the foundation of the work. In an AI-assisted film, this assumption doesn’t always hold. For instance, who gets credit when AI edits or even generates scenes, dialogue, or music? Can the “performer” object if their digital likeness is used in scenes they never shot?
This blurred authorship means contracts must now account for new risks and new forms of collaboration.
Practical Clauses Stakeholders Should Consider:
AI Usage Disclosure Clause
Talent should be informed—clearly and contractually—about the extent of AI involvement in production, including whether their likeness, voice, or performance may be digitally altered, cloned, or extended.
Consent and Control Over Likeness
Add clauses that define how far AI can go in replicating an actor’s voice, expressions, or movements. This is crucial to avoid future disputes or claims of misuse or misrepresentation.
Attribution and Moral Credit
In AI-generated works, it becomes tricky to decide who deserves credit. Contracts should pre-define attribution rights, especially when a human director is overseeing but not manually crafting every creative choice.
Ownership of AI Outputs
Who owns the AI-generated assets—scripts, visuals, or character designs? Contracts with AI vendors, production companies, and creatives must clarify whether the production house retains full rights, or if any software developer or AI tool provider has a stake.
Indemnity and Liability Clauses
If AI-generated content leads to defamation or infringement (as discussed earlier), there must be clear indemnity clauses to determine whether the liability rests with the producer, the AI tool provider, or any other stakeholder.
Post-Production Approvals
Especially where AI tools are used to modify voices or expressions, or to create full digital body doubles, give performers final approval rights, or at least a say, in the final portrayal of their persona.
With India’s legal system still catching up to AI, strong contracts are the first and most reliable line of defense. They give clarity, assign accountability, and help avoid reputational and legal risk. Whether you’re a producer hiring AI service providers or an actor stepping into an AI-enhanced set, contractual safeguards are no longer optional—they’re critical.
As we move deeper into an AI-first future, entertainment lawyers, producers, and creators must work together to rewrite the rules—not just of storytelling, but of how stories are made, credited, and protected.
The Legal Opportunity: Time for Policy Innovation
The rise of AI-generated films like Love You isn’t just a technological disruption—it’s a wake-up call for India’s media and entertainment law framework. While much of the current conversation focuses on risks and grey areas, there’s also a clear opportunity: to craft forward-thinking policy that nurtures innovation while protecting creative and commercial interests.
India’s copyright regime, performer protection laws, and censorship rules were designed for an era of human-centric creativity. Today, AI is not just assisting creatives—it’s co-creating with them, and in some cases, independently generating entire content streams. The legal vacuum around AI authorship, moral rights, content liability, and personality protection cannot be plugged by judicial interpretation alone.
We need a proactive regulatory framework that acknowledges the new modes of storytelling—whether through generative AI tools, synthetic voice cloning, or deepfake-driven narrative construction.
What Policy Makers Should Focus On:
Recognizing AI Contributions Under IP Law
India could take cues from global jurisdictions by defining “AI-assisted works” and “AI-generated works” separately. This would help in clearly attributing rights between human creators and those using generative tools.
Digital Likeness Rights and Performer Consent
Establishing a statutory right to digital personality—covering voice, likeness, and motion—would go a long way in preventing misuse and ensuring fair compensation in an AI-enhanced entertainment landscape.
Mandatory Disclosure Norms
For public-facing content, especially theatrical releases and streaming shows, policymakers could explore mandatory disclosure if AI tools have been used to generate significant parts of the script, visuals, or performances.
Ethical Guidelines for Synthetic Content
Just as advertising has the Advertising Standards Council of India (ASCI), the film and digital content ecosystem could benefit from an AI content ethics board or code, especially for sensitive uses involving minors, religious content, or political messaging.
AI Tool Certification for Media Use
As AI tools become increasingly accessible, regulatory sandbox models could be adopted—similar to fintech—for pre-clearing or certifying AI systems intended for creative production.
Revisiting Censorship and Liability Provisions
With synthetic content capable of bypassing traditional checks, India’s censorship regime must evolve to include new-age accountability frameworks, particularly when content goes viral before certification.
This is the time for lawmakers, industry bodies, studios, tech companies, and artists to come together. What’s at stake is not just legality—it’s the future of Indian storytelling. If we get the balance right, India can be more than just a fast adopter of generative AI—it can be a global leader in building a responsible, innovation-friendly media law framework.
In an industry where content travels faster than regulation, the best legal response is not reactive—it’s anticipatory. The opportunity is real. The time is now.
Conclusion: “Love You” Isn’t Just a Film. It’s a Legal Precedent.
The emergence of Love You, touted as one of the world’s first AI-generated full-length feature films, is more than just a remarkable technological achievement. It marks a pivotal moment at the intersection of technology, media, and law. As the film industry embraces AI-driven creativity, this project stands as a legal precedent in the making, one that forces stakeholders to confront a new era of intellectual property, performer rights, and content regulation.
For filmmakers, tech innovators, and legal professionals alike, the legal implications of AI in media can no longer be viewed as a distant challenge. As India continues to evolve as a hub for media innovation and content creation, the questions raised by Love You—from AI authorship to performer rights and content liability—demand timely and practical legal solutions.
The legal opportunities created by AI in film and media are immense. By crafting forward-thinking policies and frameworks, India can unlock the full potential of AI-generated content while ensuring fairness and accountability. Whether it’s clarifying AI authorship or setting clear guidelines for AI’s role in creative works, now is the moment to lay down the legal groundwork for the future of digital entertainment.
For Indian lawmakers, industry professionals, and media lawyers, this is the time to lead the conversation. As the AI landscape evolves, the legal system must adapt swiftly to ensure that intellectual property rights, digital personalities, and AI-driven productions are appropriately protected. This legal precedent set by Love You will serve as a catalyst for change, pushing for clearer AI governance and more robust industry regulations in the future.
The film isn’t just a technological feat—it’s a clarion call for the legal industry to prepare for the challenges and opportunities that AI brings to entertainment. Love You might be the first, but it certainly won’t be the last. India must move quickly to ensure its legal framework supports both creativity and fairness, fostering an ecosystem where AI-enhanced creativity can thrive responsibly and sustainably.
As we continue to witness the fusion of law and technology in the entertainment world, Love You stands as a reminder that innovation often outpaces regulation—and the law must be ready to keep up.