Joshua Walker
AI Legal Expert

How I Learned to Stop Worrying and Love the Bombe*

Cults, Creator Contracts, and IP Cataclysm in Silicon Valley: Architecting Cooperative AI Value-sharing Structures.

This may be the first time in history when you can lose your pants by holding onto them too tightly.1

I. Background

Being ancient, I fondly remember the debate and zeitgeist in the lead-up to the Ninth Circuit decision2 in the RIAA’s landmark lawsuit against Napster.3

In advance of oral argument, the defense attorneys for Napster happened to be visiting my law school—for talks and recruiting. I remember their robust exuberance and proclaimed faith that the fair use defense would defeat claims of contributory and vicarious infringement.

Any Napster-enabled infringement, they argued, was essentially no different than driving one neighbor over to another’s house to share vinyl records.

That they were “driving” a non-trivial portion of the US and global populace was immaterial. Enabling a chunk of humanity to share music freely was just like the odd mixtape exchange amongst sweethearts.

Yes. And Russian minefields are just minor inconveniences—an emphatic variant of hopscotch…

Scale constitutes its own quality. A zillion times a zillion digital copies is different from a twofer, or a six-pack of mixtapes slipped through a letter box at sunset.

Scale is its own quality, and it is better we understand that about Artificial Intelligence (AI)—particularly around data intake—sooner rather than later. Napster tells us so. It was a disaster both for content owners and—even more so, of course—the defendant. Actually, it was a disaster for consumers too. (But eventually, after much sorrow and wastage, consumers got the kind of service they wanted.)

When YouTube, the start-up, also faced an analogous intellectual property (IP) litigation wave—an existential threat written in Times New Roman—something strange happened. First, Google bought them. To us, the IP digerati of Silicon Valley at the time, it seemed as if Google had swallowed an IP litigation bomb.

Then something even stranger happened.

Google digested the bomb.

Not only did Google’s seemingly infinite cash-flow machine buffer, and aggressively magick away, the massed hordes of IP plaintiffs. Smarter yet: Google cut deals. At scale. With velocity. With myriad major players and content owners.

This was totally unexpected. The acquirer deserves all credit. And yet… Google had something else going for it.

Namely, History. Learning from the economic disaster that was the Napster imbroglio. The RIAA and related plaintiffs had beaten Napster (the company), yes; but the victory was pyrrhic. Consumers had tasted something new: Songs unbundled from albums. Digital convenience unbounded from clunky physical artifacts. More choice. They weren’t going back.4 So, record industry revenues cratered. Piracy decentralized and proceeded.

The video content industry evidently learned from the RIAA’s legal victory / business apocalypse. They didn’t want to shut the YouTube platform down at all. They wanted a cut of the flow.

And what a flow it was—and continues to be and to grow. Democratized, at that. At time of writing, YouTube has approximately 2.6 billion users. (For background: the population of Earth is just a smidge over eight billion, so 2.6 billion is just shy of a third of us humans.5) Their user demographics intertwine with the fondest daydreams of content producers and marketers.

The content industry did relatively better than the record industry by cutting licensing deals in the shadow of their potential claims. And, best of all, because consumers had increasing options as a result of such deals, YouTube’s content deals and their progeny both (i) increased consumer utility, by increasing options and usage alike, and (ii) decreased piracy (see “(i)”).

So…

Just as consumers tasted music unbounded in the early aughts, so have consumers now tasted creative empowerment and “universal personal interns”6 in the form of ChatGPT, its progeny, and peers. The genie will not go back in the bottle, no matter how many IP lawsuits arise; and no matter the effect of those lawsuits on unique corporate progenitors.

Just so, we can learn from, and discern between, (i) the record of the record industry and (ii) the relatively aptly-named “rich media” industry and YouTube. We can see how each dealt with disruption. We can review the corporate and revenue effects of the choices they made. And most importantly, we can see the relative benefits to consumers and markets.

II. Two Questions

This note is aimed at two immediate, practical questions:

First: What deals can and should we construct to avoid “Contentmageddon” (a Napster-like outcome to the detriment of all)? What are the potential Pareto-optimal outcomes for builders (tech) and creators (content owners)? And how would we potentially get there?

Second: What are the potential outcomes for regular people? How should our intellectual property, personhood, and data rights be designed? What is optimal?

As a practicing intellectual property and data rights attorney, I have generally avoided “should” questions—and law review articles and other notes on what the law “should” be—like the plague. My clients did not care. And unduly theoretical aspirations for the law were not practically useful in the “now” of firm work.

But there are reasons to think that “should” matters to even the most jaded litigators, in this case. First, policy, social merits, transformative value, and market economics will all play a role in the copyright infringement defense of fair use, in antitrust cases around AI and analogous advanced technologies, and/or in other doctrines at play in how we regulate AI inputs and outputs.

Second, we are at a flux point. In other words, whether you believe we’ve hit the “AI Singularity” or just potentially devolved “universal personal interns”7 to every human with a phone, we must acknowledge that the cost of creating has gone down—whether code, articles, novels, screenplays, pictures, design frameworks, etc.

We can use this crisis as an opportunity to think about fundamental IP and data rights from a blank slate.

This does not mean ignoring precedent or existing rights holders for some kind of fabulist anarchic data or IP state. (It may mean the opposite.) Rather, we have an opportunity to leverage empirical data to construct rights models from first principles.

Again, the empirical data marshaled in favor of plaintiffs and defendants in present AI cases will help determine the fate of fair use and other arguments. In other words, the presently suggested blank-slate exercise is partially required by instant doctrine, in myriad live cases relating to AI inputs and outputs—many likely precedent-setting.

A. Technologists v. Content Owners

To begin at the beginning, we have to establish the total value of material claims asserted against AI defendants, minus relevant cost of litigation. This includes a probabilistic / empirical merits analysis. If we, the courts, and the technologists can’t grok the total liability / injunction risk, compromise will not make sense for anyone—and litigation will drag on (and/or drag us down).
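For illustration only, that expected-value framing can be sketched in a few lines. Every claim label, damages figure, and probability below is an invented assumption, not an estimate of any real case:

```python
# Hypothetical sketch: expected-value framing of aggregate AI copyright exposure.
# All labels, damages, and probabilities are illustrative assumptions.

claims = [
    # (label, claimed damages in $, assumed probability plaintiff prevails)
    ("news-publisher class", 2_000_000_000, 0.35),
    ("image-library suit", 500_000_000, 0.25),
    ("authors' guild action", 1_000_000_000, 0.30),
]

litigation_cost = 150_000_000  # assumed combined defense cost

# Probability-weighted sum of claimed damages: the "merits analysis" in one line.
expected_liability = sum(damages * p for _, damages, p in claims)

# A rational defendant should prefer any settlement below this ceiling.
settlement_ceiling = expected_liability + litigation_cost

print(f"Expected liability:          ${expected_liability:,.0f}")
print(f"Rational settlement ceiling: ${settlement_ceiling:,.0f}")
```

The point of the exercise is not the invented numbers but the structure: until both sides can agree on roughly these two quantities, compromise has no anchor.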

Once we have understood what “Contentmageddon” looks like for both technology platforms and creators, we can think of the opposite. What is a YouTube-like set of deals that will better benefit consumers / users, creators / content owners (these two categories may collapse), and useful technology platforms?

Okay, but what?

B. Towards A Blank Slate, Optimized Model of IP, Personhood, Data Rights

When computer services can perfectly emulate each individual’s actual or figurative “voice,” we may have to rethink rights of legal personhood and legal agenthood for machines—and potentially strengthen IP personality rights for humans, or use common law to extend existing concepts to new technological effects. Similarly, screenwriters, actors, and creators of all kinds may need to negotiate hard against producers, yes. But that alone will fail. Producers are trying to profit-maximize, and if some block an efficient new creative tool, they may lose out to others moving into content automata at flank speed. The writers need to do more.

But what?

Silicon Valley / San Francisco cults actually give us a couple of ideas (or “precedents”).

C. An Alternative Model for AI Regulation and Profit- / IP-Rights-Sharing: Architecting Cooperative AI Value-sharing Structures

It is popular to link San Francisco, home of “AI Gulch,” not only with the counter-culture of the 1960s but with cults of yore—and of the present—in almost exactly the same set of city blocks. Cults, including dangerous cults, have been a part of San Francisco culture for a long time. It is argued that this “alternative lifestyle” way of thinking is also one of the things that powers radical new ideas about technology, including AI breakthroughs.

If that is the case, it provides one of the most powerful arguments for regulating AI from outside the Bay Area—the technologists’ heartland—because cultish, self-destructive ideas combined with the economies of scale of software compound the danger.

But there is another, more benign, cultish organization rampant in the Bay Area. Most, if not all, employees of OpenAI and every other Bay Area start-up are likely part of it—and most big-tech employees too.

It is called REI.

No, no. REI does not stand for some obscure Latin injunction. Nor does it stand for “Resonant Energy Investiture” or some 60s counter-culture meme. No. Nothing so ordinary and quotidian.

REI stands for “Recreational Equipment Incorporated”. They sell skiing, camping, and… most important of all… hiking equipment (whatever “hiking equipment” is).

Since we are all part of this cult, and proud of it, I can say with confidence that everyone involved in the IP and AI debates in the Bay Area understands it. And here’s the crux that we stupidly have ignored in architecting potential AI value-sharing structures:

REI is a cooperative.

It is basically run like a corporation. But excess profits get redistributed to the members: Us.

We should use the co-op structure to manage value for AI inputs and outputs. Who gets what? TBD. But these tools are so widespread, so impactful, and so heavily leveraged on our past individual and collective work, that cooperative structures are far more appropriate than the “winner take all” / “loser be crushed” model. Writers’ unions need to create on a whole new level, collectively. They need to create their own creative AIs, using a co-op model.
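As a sketch only, an REI-style patronage dividend for such a co-op reduces to pro-rata arithmetic. The helper function, member names, and figures below are all hypothetical illustrations, not a proposed allocation rule:

```python
# Sketch of an REI-style patronage dividend applied to an AI creators' co-op.
# The function and all names/figures are hypothetical illustrations.

def patronage_dividends(excess_profit, contributions):
    """Split the co-op's excess profit pro rata by each member's contribution."""
    total = sum(contributions.values())
    return {
        member: excess_profit * share / total
        for member, share in contributions.items()
    }

# "Contribution" could be any agreed measure: licensed works, training-data
# volume, usage of a member's style, etc. The figures here are invented.
payouts = patronage_dividends(
    excess_profit=10_000_000,
    contributions={"screenwriter A": 120.0, "novelist B": 60.0, "archive C": 20.0},
)
```

The hard design question, of course, is the one the text flags as TBD: what counts as a member’s “contribution.” The arithmetic itself is the easy part.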

Re-create. Extraordinary. Inventions.


* The “bombe” was an electromechanical device used by Alan Turing and other British cryptologists in WWII to help decrypt Nazi Enigma code systems. It was developed in the same wartime codebreaking effort as the Colossus program, which produced some of the first programmable digital computers.
1 The Author, KQED Forum, Michael Krasny (Host), with Professor Mark Lemley (National Public Radio, November 21, 2006); see https://law.stanford.edu/press/the-business-of-technology-firms/.
2 A&M Records, Inc. v. Napster, Inc., 239 F.3d 1004 (9th Cir. 2001) (“Napster Decision” or “Napster,” italicized).
3 The “RIAA” is the Recording Industry Association of America. Napster was actually sued by a group of 18 individual corporate entities (including lead plaintiff A&M Records, Inc.), all of which were members of the RIAA, as well as two songwriters.
4 Not at scale anyway. Of course, there is a relatively sizable, somewhat nostalgic / sonic aficionado market in vinyl.
5 Of course, TikTok and other video platforms have learned and are also enormous, surfing a tidal wave of user and advertising growth.
6 Kevin Kelly 2023. See, e.g., https://www.youtube.com/watch?v=hIJw72PDRSc.
7 Kevin Kelly (public description of ChatGPT 4 utilities). A notable public intellectual, Kelly is the co-founder of Wired and numerous other entities.

