Wow, this is a grim reality check: AI hyperscalers taking in billions in revenue while putting honest businesses like Tailwind out of work, without any form of compensation. What happened to "you wouldn't steal a car" etc.? Is it only illegal if you're not a trillion-dollar company?
I have trouble expressing how terribly unjust it feels that AI companies are stealing money from common people. I have no other way to put it.
Also: this will definitely limit the usefulness of AI. People will stop publishing valuable content for free on the internet if AI scrapers steal and monetize it.
I’m not sure this is such a reality check. I remember figuring this out maybe a month or so after November 2022, when ChatGippity first dropped. Like, if it’s a “do anything platform”, won’t the first anything be to cannibalize low-hanging anythings, followed by progressively higher-hanging anythings, until there’s no work left?
Like, play out AI: it sucks for everybody except the ones holding the steering wheel, unless we hold them accountable for the changing landscape of stake-in-civilization distribution. Spoiler: haha, we sure fucking aren’t doing that in the US.
> Like play out AI, it sucks for everybody except the ones holding the steering wheel
Not true. Models don't make their owners money by sitting there doing nothing - they only get paid when people find value in what AI produces for them. The business model of AI companies is actually almost uniquely honest compared to the rest of the software industry: they rent you a tool that produces value for you. No enshittification, no dark patterns, no taking your data hostage, no turning what should've been a product into a service. Just a straightforward exchange of money for value.
So no, it doesn't suck for everyone except them. It only sucks for existing businesses that find themselves in competition with LLMs. Which, true, is most of the software industry, but that's just what happens when a major technological breakthrough is achieved. Electricity, the Internet, and internal combustion engines did the same thing to many past industries, too.
> they only get paid when people find value in what AI is producing for them
The people "finding value in them" are other people with money to throw at businesses: investors, capital firms, boards & C-suites. I'm not sure anybody who has been laid off because their job got automated away is "finding value" in an LLM. There's a handful of scrappy people trying to pump out Claude-driven startups, but if one person can solo it, obviously a giant tech company can compete.
*blank stare* they're not taking my data hostage, but they're sure as shit taking my data
I think we just fundamentally disagree on all of this. You may be right, and I hope you are. I go back and forth on whether it's going to be a gentle transition or a miserable one. My money is on the latter.
In the sense that a dark pattern is anything designed to trick people into doing something they didn't consciously want to do, the entire AI industry is an oligarch's wet dream of a dark pattern: every day we're teeing up latent information on human-level patterns of control that I promise you LLM providers are foaming at the mouth to replicate. If you've got an effective "doing" system and an effective "orchestrating" system, that's AGI. Deployed at scale, at competitive cost, even a 1.1x improvement over the regular workforce is game over for anybody but billionaires. There will be a slow, dynamic deplatforming of regular people, followed by an extermination. Palantir is building the rat poison and the maid service.
> The people "finding value in them" are other people with money to throw at businesses: investors, capital firms, boards & c suites. I'm not sure anybody who has been laid off because their job got automated away is "finding value" in an LLM.
And the millions with ChatGPT (and other LLM) subscriptions, using it for anything from for-profit and non-profit work to hobby projects and all kinds of matters of personal life.
Contrary to a very popular belief in tech circles, AI is not only about investors. It's a real technology affecting real people in the real world.
In fact, I personally don't give a damn about investors here, and I laugh at the "AI bubble" complaints. Yes, it's a bubble, but that's totally irrelevant to the technology being useful. Investors may go bankrupt, but the technology will stay. See e.g. the history of rail in the United States - everyone who fronted capital to lay down rail lines lost their shirt, but the hardware remained, and people (including subsequent generations of businesses) put it to good use.
this whole "ai is theft" argument is just pure cope. tailwind was always just a thin abstraction over css standards and they only became the industry standard by playing the seo game and dumping docs on the open web for everyone to see. you dont get to claim theft when a model actually learns the patterns you basically forced onto the world for free to build your brand. tailwinds business model was essentially rent seeking on the fact that css is tedious to write manually and now that the marginal cost of production has dropped to near zero they are surprised they cant sell 300 dollar templates anymore.
the car comparison is honestly embarrassing for this community to even bring up lol. its not theft to recognize a pattern and its definitely not illegal for a company to do what every junior dev has been doing for years which is reading the docs and then not buying the paid stuff. adam built a business that relied on human inefficiency and now that inefficiency is gone. its not a tragedy its just a market correction. if your moat is so shallow that an llm can drain it in one pass then you didnt really have a product you just had a temporary advantage. honestly tailwind should have seen this coming a mile away but i guess its easier to blame "scrapers" than admit the ui kit gravy train is over. move on and build something that actually provides value.
It doesn't matter what your opinion of Tailwind is. It matters that they built something with clear market validation, something people were willing to pay for. AI took their lunch AND their lunch money.
I'm not going to dogpile criticism on Tailwind or Adam, whose behavior seems quite admirable, but I fundamentally agree with the thrust of the parent comment. It's unfortunate for Tailwind and anyone who was invested in the project's pre-2022 trajectory, but no one is entitled to commercial engagement by unaffiliated third parties.
Here's a similar example from my own experience:
* Last week, I used Grok and Gemini to help me prepare a set of board/committee resolutions and legal agreements that would have easily cost $5k+ in legal fees pre-2022.
* A few days ago, I started a personal blog and created a privacy policy and ToS that I might otherwise have paid lawyers money to draft (linked in my profile for the curious). Or more realistically, I'd have cut those particular corners and accepted the costs of slightly higher legal risk and reduced transparency.
* In total, I've saved into the five figures on legal over the past few years by preparing docs myself and getting only a final sign-off from counsel as needed.
One perspective would be that AI is stealing money from lawyers. My perspective is that it's saving me time, money, and risk, and therefore allowing me to allocate my scarce resources far more efficiently.
Automation inherently takes work away from humans. That's the purpose of automation. It doesn't mean automation is bad; it means we have a new opportunity to apply our collective talents toward increasingly valuable endeavors. If the market ultimately decides that it doesn't have sufficient need for continued Tailwind maintenance to fund it, all that means is that humanity believes Adam and co. will provide more value by letting it go and spending their time differently.
Laws are not the intellectual property of individuals or companies; they belong to the public. That's a fundamentally different type of content to "learn" from. I totally agree that AI can save a lot of time, but I don't agree that the creators of Tailwind should see no form of compensation.
It does not feel right to me that revenue is being taken from Tailwind and redirected to Google, OpenAI, Meta and Anthropic with zero compensation.
I'm not sure how this should be codified in law, or what the correct words are to describe it properly yet.
I see what you're getting at, but CSS is as much an open standard as the law. Public legal docs written against legal standards aren't fundamentally dissimilar to open source libraries written against technical standards.
While I am all for working out some sort of compensation scheme for the providers of model training data (even if indirect via techniques like distillation), that's a separate issue from whether or not AI's disruption of demand for certain products and services is per se harmful.
If that is the case, it's a very different claim than that AI is plagiarizing Tailwind (which was somewhat of a reach, given the permissiveness of the project's MIT license). Achieving such mass adoption would typically be considered the best case scenario for an open source project, not harm inflicted upon the project by its users or the tools that promoted it.
The problem Tailwind is running into isn't that anything has been stolen from them, as far as I can tell. It's that the market value of certain categories of expertise is dropping due to dramatically scaled up supply — which is basically good in principle, but can have all sorts of positive and negative consequences at the individual level. It's as if we suddenly had a huge glut of low-cost housing: clearly a social good on balance, but as with any market disruption there would be winners and losers.
If Tailwind's primary business is no longer as competitive as it once was, they may need to adapt or pivot. That doesn't necessarily mean that they're a victim of wrongdoing, or that they themselves did anything wrong. GenAI was simply a black swan event. As a certain captain once said, "It is possible to commit no mistakes and still lose. That is not a weakness; that is life."
You're clearly not a fan of Tailwind, and that's fair enough.
However, stating that Adam Wathan (AW) "basically forced [Tailwind] onto the world" is nonsense. People chose to adopt it because it solved a problem.
In case you're not familiar with the origins of Tailwind, AW was building a SaaS live on stream, and everyone kept asking about the little utility CSS framework he'd built for himself (rather than the short-lived SaaS).
That's how it all started. Not through a big SEO campaign, or the mysterious ability to force others to choose a CSS framework against their will, but because people saw it, and wanted to use it.