In "The Coming Wave" (2023), Mustafa Suleyman and Michael Bhaskar argue that we are entering a period of exponential technological change defined by AI and synthetic biology. While the book is a broad geopolitical warning, Michael Bhaskar’s influence as a publisher is evident in the sections discussing the "Information Explosion" and Hyper-abundance.
The book is organized into four major parts. Below are detailed summaries for each, with a dedicated focus on the concept of Hyper-abundance.
Part I: Homo Technologicus
The Core Idea: Technology is not something humans do; it is what humans are.
This section argues that human evolution is inseparable from the "waves" of technology we create. From fire and stone tools to the printing press and electricity, every major wave follows a predictable pattern: it begins with a single breakthrough, undergoes rapid proliferation, becomes cheaper, and eventually becomes an invisible, foundational layer of society.
The authors explain that these waves are "containment-resistant." Once the "recipe" for a technology is out—whether it’s the design for a steam engine or the code for an LLM—it is almost impossible to put back in the bottle. This section sets the stage by showing that our current institutions (governments, laws, and publishers) are built on the logic of past waves, which moved slowly enough for society to adapt. The "Coming Wave," however, is different because it moves at digital speed while impacting the physical world of biology and atoms, creating a friction that our current systems may not survive.
Part II: The Next Wave (Including "Hyper-abundance")
The Core Idea: The new wave is defined by asymmetry, hyper-evolution, omni-use, and autonomy, and together these produce Hyper-abundance.
This is the heart of the book’s impact on the publishing and creative industries. The authors describe four features: Asymmetry (small actors wielding enormous power), Hyper-evolution (technology improving itself at accelerating speed), Omni-use (the same systems being applied to almost everything), and Autonomy (systems acting with diminishing human oversight). For publishing, their combined effect is Hyper-abundance.
Deep Dive: Hyper-abundance
For Michael Bhaskar and the publishing world, the most disruptive force is the Hyper-abundance of content. Historically, the publishing industry acted as a "gatekeeper" because physical space, attention, and the ability to produce high-quality text were scarce. AI shatters this scarcity.
• The Death of the Gatekeeper: When an AI can "extract" the essence of a library and generate millions of new "books," "articles," or "reports" instantly, the value of a single piece of content drops toward zero.
• The Noise Problem: We are moving from an era of information "scarcity" to one of absolute "surplus." In a state of hyper-abundance, the challenge for a publisher is no longer producing content, but filtering it.
• Synthetic Competition: Authors are no longer just competing with other humans; they are competing with an infinite stream of synthetic content that is "good enough" for many readers. This "Coming Wave" of text threatens to drown out human voices in a sea of algorithmic noise, fundamentally breaking the economic model of traditional publishing.
Part III: States of Failure
The Core Idea: The "Grand Bargain" between the state and the citizen is breaking down.
Suleyman and Bhaskar warn that the hyper-abundance and raw power of AI act as "Fragility Amplifiers" for the nation-state. In this section, they describe how the state relies on a near-monopoly on power and information. When AI allows a single individual to create a bioweapon, or a small group to flood a country with "hyper-realistic" disinformation, the state loses its ability to protect its citizens.
For the publishing and media industries, this means the "truth" itself becomes a casualty of the wave. If AI can generate a hyper-abundant supply of "fake" books, "fake" news, and "fake" history, the very foundations of a shared reality crumble. The authors suggest that this could lead to "States of Failure," where governments are so overwhelmed by the speed and volume of the wave that they become paralyzed, unable to regulate the technology or protect the digital commons from being polluted by synthetic junk.
Part IV: Through the Wave (The 10 Steps)
The Core Idea: Containment is the "Great Dilemma," but it must be attempted.
The book concludes by acknowledging a terrifying paradox: if we don't contain AI, it could lead to catastrophe; but if we try to stop it too forcefully, we invite stagnation and lose the benefits (like curing cancer or solving climate change).
The authors propose a "10-step plan" for containment. This includes technical safety (building "off-switches" into AI), global treaties, and a "new social contract." For the world of knowledge and publishing, this means creating "watermarks" and "authentication systems" so we can tell the difference between a human-written book and an AI-generated one. They argue that we must move from a world of "open extraction" to one of "verified creation," where the digital commons is protected by new laws that recognize the value of human agency over machine-generated abundance.
Here is how the 10 steps to containment specifically apply to authors, publishers, and the future of copyright law in 2026.
The 10 Steps to Containment for Publishing
1. Safety: Technical Guardrails for Authors
This step focuses on "Safety by Design." In publishing, this means building AI models with an inherent "respect" for copyright. Instead of models that blindly extract data, "Safety" means systems that can check whether a generated passage is a direct "memorization" of a copyrighted book before it is released. It is, in effect, a "kill switch" for plagiarism built into the model itself.
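As a rough illustration (not from the book), such a check could be as simple as comparing word n-grams of a draft output against an index of protected passages before anything is released; the corpus, window size, and threshold below are invented for the example.

# Minimal sketch of a pre-release "memorization" check: flag a generated
# passage if it reproduces long verbatim word runs from protected texts.
# Window size and threshold are illustrative, not from the book.

def ngrams(text, n=8):
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def looks_memorized(candidate, protected_texts, n=8, max_overlap=0.05):
    cand = ngrams(candidate, n)
    if not cand:
        return False
    protected = set()
    for text in protected_texts:
        protected |= ngrams(text, n)
    return len(cand & protected) / len(cand) > max_overlap

corpus = ["It was the best of times, it was the worst of times, it was the age of wisdom"]
draft = "it was the best of times, it was the worst of times, it was the age of wisdom indeed"
if looks_memorized(draft, corpus):
    print("Blocked: draft reproduces a protected passage nearly verbatim.")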
2. Audits: Transparency of the "Extraction"
Suleyman and Bhaskar advocate for mandatory third-party audits. For authors, this means AI companies would be legally required to let auditors scan their training data. In 2026, this is manifesting as Transparency Reports, where publishers can finally see if their "closed" digital commons were used to train "open" models without permission.
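A hedged sketch of what a publisher-side check against such a report might look like, assuming a machine-readable manifest of training sources exists (the manifest format and field names below are assumptions, not any real standard):

# Illustrative audit: report which of a publisher's titles appear in a
# (hypothetical) training-data manifest. Field names are invented.
import json

publisher_catalog = {"978-0-00-000000-1": "Example Novel",
                     "978-0-00-000000-2": "Example Memoir"}

manifest = json.loads("""
[{"source": "webcrawl", "isbn": null},
 {"source": "licensed-books", "isbn": "978-0-00-000000-1"}]
""")

used = [e for e in manifest if e.get("isbn") in publisher_catalog]
for e in used:
    print(f"Found in training data: {publisher_catalog[e['isbn']]} (via {e['source']})")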
3. Choke Points: Controlling the Hardware
The "Coming Wave" is powered by chips (Nvidia, etc.). The authors suggest using these hardware choke points to enforce rules. If a company is found to be mass-extracting and translating authors' works without licenses, regulators could theoretically restrict their access to the high-end computing power needed to run those models.
4. Makers: Critics as Builders
This step encourages those who are skeptical of AI to help build it. In publishing, this is seen in "Author-Led AI" projects. Instead of letting tech giants define the future, authors and publishers are building their own "Sovereign LLMs" that are trained only on licensed, high-quality literature, ensuring that the "Hyper-abundance" they produce is ethically sourced.
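A minimal sketch of the curation step such a project implies, filtering a corpus down to works whose rights holders have opted in; the metadata fields are illustrative:

# Sketch of a licensed-only training corpus: keep just the works whose
# (illustrative) license metadata explicitly permits model training.
works = [
    {"title": "Licensed Novel", "license": "training-permitted", "words": 90_000},
    {"title": "Unlicensed Essay", "license": "all-rights-reserved", "words": 8_000},
]

training_corpus = [w for w in works if w["license"] == "training-permitted"]
print(f"{len(training_corpus)} of {len(works)} works cleared for training.")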
5. Businesses: Profit with Purpose
The authors argue that AI labs must move beyond "moving fast and breaking things." For the publishing world, this means a shift in the business model: AI companies should pay a "Digital Commons Fee" or royalty pool to the creators whose works make their models "smart" in the first place.
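The mechanics of such a pool could be as simple as a pro-rata split; the figures below are invented purely to show the arithmetic:

# Toy allocation of a "Digital Commons Fee": split a fixed royalty pool
# among rights holders in proportion to the words each contributed.
pool = 1_000_000.00                                         # total annual fee (invented)
contributions = {"Author A": 250_000, "Author B": 750_000}  # words in the training corpus

total_words = sum(contributions.values())
for author, words in contributions.items():
    print(f"{author}: ${pool * words / total_words:,.2f}")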
6. Governments: The New Copyright Act
Suleyman and Bhaskar warn that governments must "survive and reform." This applies directly to the 2026 AI Act and updated copyright laws. Governments are now moving toward a "Right to Remuneration," under which "extraction" for commercial translation is no longer treated as "Fair Use" but as a derivative work that must be licensed.
7. Alliance: Global Treaties on Content
Because the internet has no borders, an AI in one country can extract a book from another and translate it for a third. The authors call for an "International Treaty on AI." This would prevent "Copyright Havens"—countries that allow AI companies to steal books with impunity—by creating a global standard for digital IP.
8. Culture: Respecting Human "Spark"
This is a call for a cultural shift. We must decide that "human-made" has a value that "synthetic-abundant" does not. Just as the "Organic" label changed the food industry, a cultural movement for "Human-Authored" content helps publishers maintain a premium market in a sea of free AI text.
9. Movements: People Power
Containment requires a "grassroots" movement. We see this today in author strikes and class-action lawsuits. Suleyman and Bhaskar argue that it is the collective pressure of "The People" (the authors and readers) that will force tech companies to respect the digital commons.
10. The Narrow Path: Balance
The final step is the realization that we cannot ban AI, nor can we let it run wild. The "Narrow Path" for publishing involves using AI for the "boring" parts (metadata, distribution) while fiercely protecting the "human" parts (the original narrative, the emotional "spark").
The 2026 Solution: Watermarking & Provenance
A key practical outcome of these 10 steps is the rise of Digital Watermarking.
• For the Author: Every original work now contains "poison pills" or invisible digital markers that allow an author to prove in court that an LLM "extracted" their work.
• For the AI: Under 2026 regulations, any AI-generated translation must carry a "Provenance Tag," informing the reader that "This book was 90% generated by a machine based on the works of [Author Name]." (A toy sketch of both mechanisms follows below.)
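As a toy illustration of both ideas (real provenance standards and statistical LLM watermarks are far more sophisticated), here is one naive way to hide an owner ID inside a text and to attach a machine-readable provenance tag; every name and field below is invented for the example.

# Hide an owner ID as invisible zero-width characters, then recover it;
# a real system would use robust, tamper-resistant watermarking instead.
import json

ZWSP, ZWNJ = "\u200b", "\u200c"  # zero-width characters, invisible to readers

def embed_watermark(text, owner_id):
    bits = "".join(f"{ord(c):08b}" for c in owner_id)
    return text + "".join(ZWSP if b == "0" else ZWNJ for b in bits)

def extract_watermark(text):
    bits = "".join("0" if c == ZWSP else "1" for c in text if c in (ZWSP, ZWNJ))
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

marked = embed_watermark("Chapter One. It was a dark and stormy night.", "AUTH-42")
assert extract_watermark(marked) == "AUTH-42"

# An illustrative provenance tag for an AI-generated translation:
provenance = {"generator": "machine", "synthetic_share": 0.9,
              "source_work": "[Author Name], original title", "license": "pending"}
print(json.dumps(provenance, indent=2))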
By following these 10 steps, the publishing industry moves from being "eaten" by the wave to "riding" it—using the efficiency of the machine to support, rather than replace, the human creator.