From Spotify to ChatGPT: The case for distributed ownership of platforms built on human creativity
Our last chance for a fair internet may depend on who owns AI
Who deserves the internet’s value? Should it flow primarily to the owners of the platforms that distribute content, or to the people who create that content? Due to my background in the music business, I’ve encountered this topic a lot in the context of Spotify, Daniel Ek’s vast wealth, and his decisions to invest in war technologies. Ek is worth $9.2 billion. The world’s richest musician, Taylor Swift, is worth $1.6 billion.

The Spotify CEO’s wealth is a source of confusion for many people. The common complaint is: If he’s so rich, Spotify must be wildly profitable, so why don’t they pay artists more instead of funnelling money into killer drones? However, this logic does not explain why Ek was already vastly wealthy before Spotify ever turned a profit.
Ek’s wealth comes from Spotify’s valuation as a company. When expectations about a company’s future performance improve, its value increases. Shareholders like Ek can sell part of their shares and convert that valuation into cash. This is how a single shareholder of a market-leading music platform can make more money than the world’s most successful musician, since that wealth originates primarily from shares and not from dividends, salary, or bonuses. This is not limited to Spotify.
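The arithmetic behind share-based wealth is simple enough to sketch. The stake and valuation below are illustrative assumptions, not Spotify’s actual figures:

```python
def paper_wealth(ownership_stake: float, company_valuation: float) -> float:
    """Value of a shareholder's stake at the company's current valuation.

    Note: no profit, dividend, or salary enters this calculation at all --
    the company can be loss-making and the stake can still be worth billions.
    """
    return ownership_stake * company_valuation

# Illustrative only: a founder holding 15% of a company valued at $60bn
stake = paper_wealth(0.15, 60e9)
print(f"${stake / 1e9:.1f} billion")  # $9.0 billion
```

Selling even a small slice of such a stake converts that paper valuation into cash, which is why profitability was never a precondition for Ek’s wealth.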
Meta, TikTok, Amazon, Google, and similar companies generate revenue by aggregating the value created by many individuals. This creates value in terms of revenue, as well as speculative value in the form of shares. Individuals can set up their market stalls on newly created marketplaces and generate revenue, while the creator of the marketplace (e.g., Spotify) imposes a tax on all revenues generated.
This dynamic raises a bigger question:
If platforms capture not only the revenue but also the speculative upside generated by millions of users, shouldn’t those same users have some claim to the platform’s broader value? After all, without the content, creativity, and labour of individuals, there would be nothing to monetise or speculate on in the first place.
In comes AI.
Many, if not all, of the most well-known AI companies trained their models using data scraped from just about everywhere. People’s writing, people’s music, people’s comments, people’s photos and art, etc. In most instances, nobody got paid. This is being addressed, but the existing proposals don’t go far enough. One recent countermeasure by Cloudflare, however, piqued my interest.
Cloudflare is a company that powers a huge portion of the internet: up to one in five websites runs through its network. When a website uses Cloudflare, all visitor traffic flows through Cloudflare’s servers first, allowing it to block attacks, filter out malicious visitors, and serve content faster from locations closer to users. It also means Cloudflare can identify AI companies that attempt to scrape content from websites using bots, known as crawlers.
Cloudflare’s measure is a pay-per-crawl model that lets content creators collect pennies from these tech giants. They’re essentially saying: if you’re going to drive up people’s server bills by grabbing content from their websites and using it in your own apps, without sending traffic back to the original sites, then you’ll have to pay for it. It’s interesting to see these issues addressed at an infrastructure level.
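From the crawler’s side, the exchange can be sketched in a few lines. Pay-per-crawl builds on the long-dormant HTTP 402 “Payment Required” status code; the header name below is a hypothetical placeholder, not Cloudflare’s actual API:

```python
# Minimal sketch of a pay-per-crawl exchange, seen from the crawler.
# HTTP 402 signals that the publisher wants payment before serving content;
# "X-Crawl-Price" is an assumed header name for illustration only.

def handle_crawl_response(status: int, headers: dict) -> str:
    if status == 402:
        price = headers.get("X-Crawl-Price", "unknown")
        return f"payment required: {price} per request"
    if status == 200:
        return "content served"
    return f"blocked or error ({status})"

print(handle_crawl_response(402, {"X-Crawl-Price": "$0.01"}))
```

The point of the design is that the toll is collected at the network layer, before the content ever leaves the publisher’s site, rather than negotiated deal by deal afterwards.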
In parallel, traditional publishers and creators are pursuing their own remedies. Some are striking licensing deals with AI companies, like the Washington Post and OpenAI; others are suing for copyright infringement, like the New York Times against OpenAI, or the authors’ class action against Anthropic, which will force the prominent AI company to pay the writers of books used to train its models.
The unfortunate thing here is that this leaves publishers collecting one-time cheques, or maybe recurring ones, whilst not participating in the value being generated at the platform level.
Publishers suing or striking deals with Google and Facebook haven’t stopped their newsrooms from shrinking, as these two tech giants concentrate value and power that used to be distributed among many dozens of companies around the world.
In that same era, the music industry managed to strong-arm a deal out of Spotify that could serve as a model for the AI era: the three major labels and a body representing a large swath of independent music labels secured equity in the streaming startup as part of their licensing deals. I would not be surprised if such demands were part of the ongoing litigation against generative music startups Suno and Udio.
What I’m worried about is this:
Over the last decades, we’ve seen a few tech corporations amass unprecedented wealth and power. Currently, the oligarchs that fund and run those organisations are rubbing shoulders with an anti-democratic orange-hued world leader and his self-interested cronies. Now, the Big Mac President is threatening tariffs on any country that imposes taxes or regulations on these tech giants, which profit from their citizens’ data. It’s a wielding of power that serves to accrue more power.
The question I ask is this:
Should people and companies whose data is used to train LLMs participate in the financial success of these corporations? This way, power and wealth don’t concentrate as heavily in the hands of the few. A technology as transformative as AI, in any case, should be more distributed, especially if it’s based on the collective knowledge and data of humanity.
Of course, this question leads to many more questions:
What percentage of an AI company's valuation can be attributed to its training data?
How do data creators opt in to co-ownership, or should another form of governance and value participation be explored?
Could nationalisation be an option, where the democratic bodies and representation that exist on national levels are leveraged instead? If so, how do we compensate the people whose data has contributed to these models?
What would the political and diplomatic nature of such deals look like, given that the AI race involves people at the highest levels of government?
I actually suspect that internet-native value protocols (keywords: blockchains, web3, tokens, etc.) could form a part of the solution, or at the very least provide a great philosophical basis to address problems of inequitable value distribution and concentration. After all, it’s a domain that attracted some of the world’s smartest people who have spent years experimenting and creating technology that addresses exactly these types of problems.
As a matter of fact, when music streaming service SoundCloud was on the brink of collapse, artist and technologist Mat Dryhurst, who had also served on the advisory board of music streaming cooperative Resonate, suggested tokenising SoundCloud and letting its users (artists, labels, listeners, etc) participate in the governance and the financial stake of the company.
We’ve dragged our feet for too long in addressing these problems. A fair deal for AI may well be our last chance for a fair internet.
In the case of the class action lawsuit authors brought against Anthropic, critics warned:
“If every author in the class filed a claim, industry advocates warned, it would "financially ruin" the entire AI industry.”
A simple solution could be distributing equity instead of cash.
The creators of data shouldn't just be paid a licensing fee; they should participate in the success of those models. If companies can't afford to pay for licenses, then distribute equity.