Who Owns the Output in the Era of Generative AI?

The AI Boom Has a Copyright Problem

Generative AI isn’t just changing how we work—it’s changing what we create. From AI-generated music to photorealistic art, poetry, and even legal memos, platforms like ChatGPT, Midjourney, and DALL·E are producing content that looks, sounds, and feels convincingly human.

But as these tools become more embedded in creative industries, a thorny question keeps surfacing: who owns the stuff they make?

Is it the user who clicked “generate”? The engineers who built the model? The artists and writers whose work trained the algorithm? Or—controversially—no one at all?

At the heart of this debate is a reality many policymakers are still catching up with: the legal definitions of authorship and creativity were never built with machines in mind.

Copyright Law Meets Its Match

Traditional copyright law is crystal clear on one thing: only humans can be authors. In 2022, the U.S. Copyright Office’s Review Board rejected a registration claim for a painting created entirely by an AI, reasoning that “copyright law protects the fruits of intellectual labor founded in the creative powers of the mind”—a refusal later upheld in court (Thaler v. Perlmutter, 2023).

Put plainly: if a human didn’t create it, the law doesn’t protect it.

This creates a dilemma. AI-generated works may be original, compelling, and even transformative—but under current U.S. and EU law, they often fall into a legal no-man’s land.

As legal scholar David Schiff (2024) puts it, “We need a legal architecture that reflects the reality of collaborative creation between humans and machines.”

So… Who Gets the Credit? Four Ownership Models

As regulators play catch-up, academics and industry experts have proposed different frameworks to address AI authorship. Here are four of the leading ideas:

1. Developer Ownership

This model gives rights to the tech companies that built the generative AI systems. It’s easy to justify—after all, they created the tool.

But critics warn this could hand monopolistic control to Silicon Valley giants. “Granting ownership to developers risks creating monopolies over creative expression,” argue Bently & Sherman (2023).

2. User Ownership

Another approach gives rights to the end user who generates the output. That might seem fair—especially when the user guides the process through prompts, revisions, and intent.

Still, it raises a critical question: how much human input is enough to count as authorship?

3. No Ownership (Public Domain)

Some legal theorists propose a radical solution: AI-generated content should automatically enter the public domain. No copyright, no control.

This model maximizes access, but it could discourage investment—especially for smaller creators and companies that can’t justify building products without IP protections.

4. AI as Tool, Human as Author

Perhaps the most balanced idea? Treat AI like a tool—like a pen, a paintbrush, or Photoshop. The user is the author, as long as there’s meaningful human contribution.

“When AI enhances human creativity rather than replaces it, the user should be recognized as the author,” write Geiger and Dann (2022). But once again, the definition of “meaningful” remains blurry.

From Art to Academia: The Real-World Fallout

This debate isn’t just academic. The implications are playing out in courts, classrooms, and creative communities:

  • Getty Images v. Stability AI: Getty is suing the company behind Stable Diffusion for scraping millions of copyrighted photos to train its model—without permission.

  • Authors vs. OpenAI: Writers like George R.R. Martin have joined lawsuits claiming their books were used in training AI without consent or compensation.

  • Universities and Plagiarism: Colleges worldwide are scrambling to define what counts as original work in the age of ChatGPT.

The result? Confusion, lawsuits, and a mounting sense that we need new rules—fast.

Building a Smarter Legal Framework

Experts are calling for a refresh of intellectual property law that fits the new AI landscape. Here’s what’s on the table:

  • Define Human Contribution: Lawmakers must set clear thresholds for what counts as meaningful human authorship in AI-assisted work.

  • Licensing Training Data: Companies should be required to license copyrighted data before feeding it into AI models.

  • New IP Categories: Some scholars recommend creating a whole new class of protection just for AI-generated content—distinct from traditional copyright.

  • More Transparency: Developers should disclose their data sources and model training methods to help resolve disputes and build trust.

Rethinking Creativity Itself

We’re in the middle of a seismic shift—not just in how content is created, but in how we define creativity, authorship, and ownership. Generative AI is challenging the old rules and demanding that we draw new lines.

Kate Crawford (2021), author of Atlas of AI, puts it best: “The age of AI demands not just new technologies, but new ways of thinking about rights, responsibility, and shared creativity.”

This isn’t just about lawyers and techies. It’s about who gets to be called a creator in a world where machines can create.

What Do You Think?

Should AI-generated content be copyrighted? Should users own their prompts? Or should everything go straight into the public domain?

References

Bently, L., & Sherman, B. (2023). Intellectual Property Law (5th ed.). Oxford University Press.
Crawford, K. (2021). Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence. Yale University Press.
Geiger, M., & Dann, D. (2022). Generative AI and the Future of Authorship. Harvard Journal of Law & Technology, 35(2), 123–145.
Schiff, D. (2024). AI and the Law: Toward a Framework for Machine-Generated Works. Stanford Technology Law Review, 27(1), 45–78.
Thaler v. Perlmutter, 687 F. Supp. 3d 140 (D.D.C. 2023).