
From Links to Answers: The New Social Contract of the Web

  • Writer: Felipe Palavecino
  • Sep 12
  • 7 min read

Updated: Sep 17



For more than two decades, the open web was sustained by a simple but powerful bargain: search engines directed users to publishers, and in return, publishers monetized that traffic through advertising, subscriptions, or sponsorships. This implicit exchange—attention for access—enabled news organizations to fund journalism while preserving a degree of plurality in the information ecosystem.


Generative AI is now testing the limits of that contract. Tools such as ChatGPT, Perplexity, Copilot, and Gemini increasingly deliver answers instead of links. The value once embedded in click-throughs to publisher websites is collapsing, replaced by synthesized outputs that often obscure attribution. In this shift, publishers risk losing not only traffic but also visibility, authority, and bargaining power.


This is not merely a continuation of the platform era defined by social media. It is a deeper reconfiguration of how audiences discover and consume information. As the Digital News Report 2025 shows, fewer than one in three users globally now start their news journeys on websites or apps. In a world where the dominant unit of currency is no longer the link but the answer, publishers and policymakers must rethink the principles of plurality, trust, and value exchange.


A Longer Arc to Today’s “Answer Era”


Even before generative summaries, search engines were already pushing toward zero-click searches, where users solved their needs without leaving the SERP. The arrival of AI Overviews accelerates a preexisting trend: more resolution in place, less referral to origins. The recent history is not an isolated rupture but the culmination of a shift away from the click as the central currency of the web.


From Distribution to Substitution


In the first platform era, dominated by Facebook, X (formerly Twitter), and Google Search, publishers retained a measure of agency. Platforms aggregated and distributed, but the endpoint was often a referral. Even when the funnel narrowed, distribution still led to a destination.


Generative AI alters this logic. Instead of guiding audiences to sources, it substitutes the destination with a synthesized response. A Reuters Institute experiment found that chatbots frequently delivered fabricated citations, and even when attribution was correct, click-through rates fell to as low as 5%, compared with the 25–35% range typical of conventional search.


The risk is structural: if AI systems become the dominant discovery mechanism, publishers may be relegated to invisible suppliers—raw data for probabilistic engines—rather than recognized originators of journalism.


The Collapse of the Click


New research from the Reuters Institute and the University of Oxford (Beyond the Click: The New Currencies of the AI Search Era) underscores the depth of this disruption. The study documents a "Great Decoupling": while impressions in AI-driven search interfaces remain steady or grow, click-through rates (CTR) fall sharply, by roughly 34.5% in organic search and nearly 50% in paid search when AI Overview summaries intervene.


This decoupling exposes a critical flaw in the legacy web economy: the click is no longer a reliable signal of value. The unit of exchange that once linked user attention with publisher monetization is collapsing. In its place, new currencies are emerging—authority, trust, and engagement quality—metrics that better capture influence upstream of the click.


The Erosion of Brand Integrity


Brand is one of journalism’s most valuable assets. Audiences know the difference between a headline from the Financial Times and one from an anonymous blog. That distinction drives trust, subscriptions, and loyalty.


Yet when AI chatbots remix, summarize, or paraphrase news without clear attribution, brand integrity blurs. Columbia University’s Tow Center has documented cases where premium AI models served “confidently wrong” answers, sometimes attaching fabricated URLs to reinforce credibility.


Technical antidotes exist—but must be adopted. Open standards such as C2PA / Content Credentials allow publishers to attach provenance and editing credentials to every piece of content, enabling auditing, traceability, and brand signals resilient to de-branding. On the model side, approaches such as Attributable to Identified Sources (AIS) and RAG with mandatory citation point toward verifiable answers with visible references.
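
To make the idea concrete, here is a minimal sketch of what "RAG with mandatory citation" can look like, assuming a toy keyword retriever and an illustrative answer format; the function names, corpus, and URLs are hypothetical, not any vendor's API. The point is the constraint: every claim in the answer either carries a visible, clickable source or the system declines to answer.

```python
from dataclasses import dataclass

@dataclass
class SourceDoc:
    publisher: str  # brand that should surface in the answer
    url: str        # canonical link for visible attribution
    text: str       # snippet retrieved for grounding

def retrieve(query: str, corpus: list[SourceDoc], k: int = 2) -> list[SourceDoc]:
    """Toy keyword retrieval: rank documents by query-term overlap."""
    terms = set(query.lower().split())
    scored = sorted(corpus,
                    key=lambda d: len(terms & set(d.text.lower().split())),
                    reverse=True)
    # Keep only documents that actually match something.
    return [d for d in scored[:k] if terms & set(d.text.lower().split())]

def answer_with_mandatory_citation(query: str, corpus: list[SourceDoc]) -> str:
    """Compose an answer in which every claim carries a visible citation;
    if nothing relevant is retrieved, decline instead of guessing."""
    docs = retrieve(query, corpus)
    if not docs:
        return "No attributable source found; declining to answer."
    cited = [f"- {d.text} [{d.publisher}: {d.url}]" for d in docs]
    return f"Q: {query}\n" + "\n".join(cited)

# Hypothetical two-document corpus for illustration only.
corpus = [
    SourceDoc("Example Times", "https://example.com/budget-2025",
              "The 2025 budget raises public broadcasting funding by 4 percent."),
    SourceDoc("Example Herald", "https://example.org/ai-licensing",
              "Three national publishers signed inference licensing deals this year."),
]
print(answer_with_mandatory_citation("What changed in the 2025 budget?", corpus))
```

A production system would swap the keyword retriever for embeddings and a generative model, but the mandatory-citation constraint stays the same: no source, no answer.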


Power Asymmetries and the Licensing Dilemma


A handful of technology companies now control both the distribution channels and the computational infrastructure needed to scale AI. Their leverage over publishers is unprecedented.


Licensing agreements have emerged as a proposed remedy. These deals generally take three forms:


  • Training-data access: one-time payments for historical content to refine language models.

  • Inference access: ongoing permission to pull live information into AI responses.

  • Hybrid deals: covering both, often bundled with revenue-sharing schemes.


Major publishers such as The Financial Times, News Corp, and Axel Springer have already signed agreements with AI companies. Yet, as analysts note, training data quickly loses value, while inference access—crucial for real-time results—is typically undercompensated.


Smaller publishers face even greater challenges. Competition law often prevents collective bargaining, leaving them to negotiate individually with companies whose market power dwarfs theirs. The result is an uneven playing field: global brands consolidate influence, while local and regional voices risk exclusion.


The Audience Paradox


Audiences themselves are not passive in this transformation. Surveys show that younger users increasingly turn to AI chatbots for information, valuing speed, personalization, and conversational style. At the same time, they express concern about transparency, attribution, and reliability.


This paradox underscores a critical tension: users are not rejecting journalism, but the friction of accessing it. If generative AI offers a smoother experience—even at the cost of occasional inaccuracy—many will accept the trade-off.


For publishers, the lesson is stark: competing on accuracy alone may not suffice. Convenience, accessibility, and personalization are now decisive factors in audience behavior.


Policy and Regulation: Fragmented Responses


Governments are beginning to confront the collision between AI platforms and news publishers, but responses remain fragmented:


  • Australia: The News Media Bargaining Code forced platforms to negotiate licensing deals, establishing a precedent some see as adaptable to AI.

  • European Union: The AI Act and copyright reforms explore transparency obligations (including traceability for GPAI) and stricter disclosure requirements, with special attention to RAG systems that combine retrieval and generation.

  • United Kingdom: Consultations on scraping, data protection, and attribution are ongoing; the ICO urges transparency, and the CMA is assessing competitive risks of foundation models. Robots.txt is seen as insufficient.

  • United States: Litigation is mounting, but policy is fragmented. AI companies lobby to weaken copyright restrictions, framing data access as a national competitiveness issue.


This regulatory patchwork creates uncertainty. For publishers, the absence of harmonized standards means navigating a global marketplace of opaque and inconsistent rules.


Beyond Copyright: The Need for New Frameworks


Copyright alone cannot resolve the challenges of generative AI. Key issues transcend traditional IP law:


  • Attribution vs. ownership

  • Accuracy vs. liability

  • Plurality vs. homogenization


The Beyond the Click study proposes one potential remedy: a two-tiered economy of Authority and Action. In this model, AI systems would first allocate credit and value to authoritative sources in the attention/authority auction (measuring trust and influence upstream), and then enable monetization in the action auction (transactions and conversions).


This shift recognizes that authority itself must be treated as a currency, not just clicks.


Answer-space KPIs for a clickless economy (a measurement sketch follows this list):


  • Share of Source within answers (percentage of appearances and brand prominence).

  • Visible Attribution Rate (clear, clickable citations per answer).

  • Answer-Assisted Referrals (visits arriving after exposure to a branded answer).

  • Coverage Diversity Index (number and diversity of co-appearing sources).

  • Confidence/Disclosure Score (whether the UI shows sources, methods, and limits).
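
As an illustration, the sketch below computes two of these KPIs, Share of Source and Visible Attribution Rate, from a hypothetical log of AI answers; the log structure and field names are assumptions for the example, since no standard export format exists yet.

```python
# Hypothetical log of AI answers that mention news sources; field names are
# illustrative, not any platform's real export format.
answers = [
    {"sources_cited": ["Example Times", "Example Herald"], "links_visible": True},
    {"sources_cited": ["Example Herald"],                  "links_visible": False},
    {"sources_cited": [],                                  "links_visible": False},
    {"sources_cited": ["Example Times"],                   "links_visible": True},
]

def share_of_source(answers: list[dict], brand: str) -> float:
    """Share of Source: fraction of answers in which the brand appears at all."""
    return sum(brand in a["sources_cited"] for a in answers) / len(answers)

def visible_attribution_rate(answers: list[dict]) -> float:
    """Visible Attribution Rate: of the answers citing any source,
    how many expose clear, clickable links."""
    cited = [a for a in answers if a["sources_cited"]]
    return sum(a["links_visible"] for a in cited) / len(cited) if cited else 0.0

print(f"Share of Source (Example Times): {share_of_source(answers, 'Example Times'):.0%}")
print(f"Visible Attribution Rate: {visible_attribution_rate(answers):.0%}")
```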


Toward a New Social Contract


If the web is shifting from links to answers, what principles should guide the next contract between AI companies, publishers, and audiences? Four pillars stand out:


  1. Transparency in Attribution

    Every AI-generated answer that relies on journalism must provide clear, accessible citations. Attribution should not be hidden—it must surface the publisher’s authority as part of the answer itself. Standards such as C2PA and attributable generation approaches (AIS/RAG) must be promoted as technical foundations.

  2. Fair Value Exchange

    Licensing frameworks must evolve into enforceable, auditable standards. Regulators should ensure inclusion of small and medium publishers and guarantee that remuneration reflects continuous value (inference), not just one-off training deals.

  3. Plurality by Design

    Generative systems should avoid collapsing perspectives into a single narrative. Technical and policy measures must encourage diversity of sources and surfacing of viewpoints within the same answer, with public controls for bias and concentration.

  4. Authority as Currency

    New economic models must recognize that trust and authority are marketable assets. Without that recognition, journalism risks degrading into invisible input for AI pipelines.


A Playbook for Publishers


While systemic reforms will take time, publishers can act now:


  • Experiment with agent-friendly access models: Offer structured, well-documented content APIs (including licenses and rate limits) so agents/LLMs retrieve fragments with mandatory attribution; a minimal response sketch follows this list.

  • Invest in brand-as-signal: Adopt C2PA/Content Credentials, strengthen schema/metadata (bylines, beats, timestamps, locations), and use watermarks and signatures where viable to persist identity.

  • Build collective intelligence: Join sectoral alliances (regional or language-based) for licensing standards, attribution auditing, and compliance lists.

  • Educate audiences: Show, with examples, how reporting is done (method, documents, interviews), why it differs from synthetic summaries, and what signals to look for (citations, context, accountability).

  • Measure beyond the click: Report internally and externally on Answer-space KPIs and link them with subscriptions, habit, and loyalty (e.g., whether visibility in answers correlates with newsletter signups or return frequency).

  • Design answer-ready content: Create verifiable summaries, FAQ blocks, timelines, and glossaries with deep links—making it easier for engines to cite faithfully and for users to expand with a trustworthy click.
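
As a concrete illustration of the first and last items, here is a sketch of the kind of content fragment an agent-friendly API might return, carrying the answer-ready summary, licence terms, and attribution requirements alongside the text; the endpoint shape, field names, and values are all hypothetical.

```python
import json

# Hypothetical response from a publisher's content API: a citable fragment
# bundled with the attribution and licence terms an AI agent must honour.
# Every field name and value here is illustrative, not a real standard.
fragment = {
    "id": "article-2025-09-example",
    "publisher": "Example Times",
    "byline": "Jane Reporter",
    "published": "2025-09-12T08:00:00Z",
    "canonical_url": "https://example.com/articles/article-2025-09-example",
    "summary": "A verifiable, answer-ready summary of the original reporting.",
    "license": {
        "use": "inference-only",          # no training on this fragment
        "attribution_required": True,     # brand and link must surface in answers
        "rate_limit_per_day": 1000,
    },
    "provenance": {
        "standard": "C2PA",               # content credentials attached upstream
        "manifest_url": "https://example.com/credentials/article-2025-09-example",
    },
}
print(json.dumps(fragment, indent=2))
```

Bundling licence and provenance with every fragment is the design choice that matters: attribution requirements travel with the content instead of living in a contract nobody's pipeline can read.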


Scenarios for 2030: Futures of the Information Ecosystem


  • Platform Dominance

    AI companies consolidate power, offering end-to-end answers with minimal attribution. Journalism becomes invisible infrastructure; plurality erodes.

  • Regulated Equilibrium

    Governments enforce transparent attribution and fair licensing. Publishers secure predictable revenue streams. AI enhances discovery without replacing original reporting.

  • Publisher Innovation

    News organizations reclaim agency by building their own AI-driven interfaces, blending convenience with credibility. Alliances prove decisive, enabling multi-brand answers with rich citations and clear paths for readers to go deeper.


The trajectory is undecided. But the choices made today—by publishers, policymakers, and platforms—will determine whether journalism remains visible, valued, and viable in the next decade.


Conclusion: Preserving Journalism in the Age of Answers


The rise of generative AI represents more than a technological disruption. It is the renegotiation of the web’s social contract. Where once search engines rewarded publishers with traffic in exchange for content, AI platforms now extract value while returning little in visibility or revenue.


If left unchecked, this shift threatens not only the sustainability of journalism but also the diversity and accountability of the information ecosystem. To counterbalance, publishers must insist on transparent attribution, fair value exchange, plurality, and recognition of authority as currency, while innovating in distribution and user experience.


The open web was built on links. The next era will be built on answers. The challenge is to ensure that journalism remains visible, trusted, and indispensable within it.

 
 