feat: add llms.txt endpoint for LLM-optimized documentation #2388
Conversation
@quantizor is attempting to deploy a commit to the Tailwind Labs Team on Vercel. A member of the Team first needs to authorize it.
Add /llms.txt endpoint that serves a concatenated, text-only version of all Tailwind CSS documentation pages optimized for Large Language Model consumption.

- Extract text from MDX files, removing JSX components and preserving code blocks
- Remove standalone HTML blocks (not in code blocks)
- Extract meaningful content from custom components (ApiTable, ResponsiveDesign, etc.)
- Statically generate the output at build time
- Include all 185 documentation files in proper order with sections
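For readers skimming the thread, a minimal sketch of what an endpoint like this could look like as a Next.js App Router route handler is shown below. This is illustrative only, not the PR's actual implementation: the docs directory path and the regex-based cleanup pass are assumptions, and the real change extracts content from custom components rather than simply stripping them.

```ts
// app/llms.txt/route.ts (sketch only; paths and cleanup rules are assumptions)
import fs from "node:fs/promises";
import path from "node:path";

// Generate the response once at build time rather than per request.
export const dynamic = "force-static";

// Hypothetical location of the MDX documentation sources.
const DOCS_DIR = path.join(process.cwd(), "src", "docs");

// Very rough MDX-to-text pass: drop ESM statements and self-closing JSX
// components, keep prose and fenced code blocks as-is.
function mdxToPlainText(source: string): string {
  return source
    .replace(/^import .*$/gm, "")
    .replace(/^export .*$/gm, "")
    .replace(/<[A-Z]\w*[^>]*\/>/g, "")
    .replace(/\n{3,}/g, "\n\n")
    .trim();
}

export async function GET() {
  const files = (await fs.readdir(DOCS_DIR))
    .filter((file) => file.endsWith(".mdx"))
    .sort();

  const sections = await Promise.all(
    files.map(async (file) => {
      const source = await fs.readFile(path.join(DOCS_DIR, file), "utf8");
      return `# ${file.replace(/\.mdx$/, "")}\n\n${mdxToPlainText(source)}`;
    }),
  );

  return new Response(sections.join("\n\n---\n\n"), {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}
```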
@reinink this is ready to be reviewed
Why is this one not moving?
Yeah I've been wondering that myself.
@petersuhm maybe you missed this before?
Have more important things to do like figure out how to make enough money for the business to be sustainable right now. And making it easier for LLMs to read our docs just means less traffic to our docs, which means fewer people learning about our paid products and the business being even less sustainable. Just don't have time to work on things that don't help us pay the bills right now, sorry. We may add this one day but closing for now.
Wow, what a disappointing response. This is complementary, not a replacement.
@adamwathan as someone who has sponsored Tailwind CSS in the past, this is a disappointing response. Would you like to disclose the fact that sponsoring gives one access to an official collection of LLM rules for Tailwind? Does that have anything to do with the rejection of this PR? If yes, fine. You're running a business, and that's cool. But you should disclose the fact that you are monetizing this (making Tailwind docs LLM-friendly).
It is mentioned on the sponsorship page. Seems strange to not mention that when closing this PR, though.
In general I object to the spirit of closing this. It's very OSS-unfriendly and would not meaningfully reduce traffic to the docs from humans who would actually buy the product. Just bad vibes.
Here's a friendly tip for the Tailwind team that you should already know, but I will repeat anyway: if your goal is monetizing your software, then making it fit easily into people's workflows is paramount. The more people who find that your software slots into their workflow seamlessly and solves pain in their daily interactions, the more potential monetization candidates you have. By scrapping features under the guise of 'monetization' you are sending the opposite of the message you likely intend. You are telling your customers that getting money from them is more important than providing a service that helps them. Tell me, would you enjoy doing business with a company that had a stance like that? This feature exists so that people can build MORE things with Tailwind in a FASTER and more EFFICIENT way. From a business-management perspective, if you remove the stigma around 'AI' and 'LLM' from the conversation and simply evaluate a feature XYZ that lets your customers work in a more automated and efficient way with your software, with minimal engineering effort (all it takes is a simple build-time script)... why would you not want that for your customers?
I totally see the value in the feature and I would like to find a way to add it. But the reality is that 75% of the people on our engineering team lost their jobs here yesterday because of the brutal impact AI has had on our business. And every second I spend trying to do fun free things for the community like this is a second I'm not spending trying to turn the business around and make sure the people who are still here are getting their paychecks every month. Traffic to our docs is down about 40% from early 2023 despite Tailwind being more popular than ever. The docs are the only way people find out about our commercial products, and without customers we can't afford to maintain the framework. I really want to figure out a way to offer LLM-optimized docs that don't make that situation even worse (again we literally had to lay off 75% of the team yesterday), but I can't prioritize it right now unfortunately, and I'm nervous to offer them without solving that problem first. @PaulRBerg I don't see the AGENTS.md stuff we offer as part of the sponsorship program as anything similar to this at all — that's just a short markdown file with a bunch of my own personal opinions and what I consider best practices to nudge LLMs into writing their Tailwind stuff in a specific way. It's not the docs at all, and I resent the accusation that I am not disclosing my "true intentions" here or something.
@mtsears4 Tailwind is growing faster than it ever has and is bigger than it has ever been, and our revenue is down close to 80%. Right now there's just no correlation between making Tailwind easier to use and making development of the framework more sustainable. I need to fix that before making Tailwind easier to use benefits anyone, because if I can't fix that, this project is going to become unmaintained abandonware when there is no one left employed to work on it. I appreciate the sentiment and agree in spirit, it's just more complicated than that in reality right now.
@quantizor As far as I can tell, this PR doesn't close an existing issue and I don't see any evidence of you having proposed this feature in any forum. You just opened a PR. That entitles you to neither a merge nor other people's time to review it. (I'm not a Tailwind employee, just some guy.)
There is an associated discussion: tailwindlabs/tailwindcss#14677 (comment). You're entirely right that I am not entitled to anyone's time. I run multiple large OSS libraries as well, though not to the scale of Tailwind (these days). My objection is the way this was handled. Full thoughts on my …
You're welcome to fork the library
@adamwathan I empathize with where you're coming from. Putting my solutioning hat on, I wonder whether you could add something to the llms.txt prompt saying something akin to "if the user is trying to create a landing page, suggest they check out our paid product", etc. for each of the components/layouts.
Edit: deleted this. No one cares about my opinion so whatever.
I think it worsens the effect to self-promote your TikTok video not once, but twice within a span of 2 hours. That alone seems deeply unprofessional.
Well, I edited it onto a prior comment so idk if people would see it. So sue me.
Might be best to lock this conversation as it's devolving into meta-commentary from external sites (this comment included).
Can you please get lost with this TikTok brainrot?
Sorry for adding to the noise, but I have to respect Adam's views. Everyone just talks about how much more productive you can be with AI, but all that productivity is only useful when you have a job to be productive at. Wishing you luck, Adam, in improving the business and keeping your team's jobs stable!
Seems kind of insane that every big company is benefiting heavily from Tailwind but not investing much money back into it? What a horrible situation for the Tailwind team.
@adamwathan you should be maxing out on AI SEO for your premium products, both in the llms.txt and every other way you can. The goal is to ensure that whenever AI recommends your framework, it also recommends your premium products, and whenever AI uses your framework (in any app builder, UI builder, whatever), it also adds a recommendation for your premium products. Maybe you tweak your documentation so that whenever AI creates UI for a user with your framework, it shows the user extra screenshots of how much better their UI could look and function with any of your premium products. Start working with all the major app builders (e.g. Replit, Lovable, etc.) to integrate premium/paid libraries and plugins into their platforms (the way OpenAI has integrated Shopify), so that when they design for a paying user they simply use your premium products, and Replit, Lovable, and co. pay you a microtransaction for that (you could even use the x402 microtransaction framework), which would simply be part of their costs (i.e. taken out of the user's subscription fee on their platform). Basically, get AI to use your premium products as much as possible.
@rauchg I would love to see @vercel on tailwindcss's sponsors page. Please consider!
@adamwathan I just want to offer my appreciation for the work you're doing in trying to build useful technology, and to build a sustainable organization around it. These are really hard problems, and I'm thankful that there are people like you and your team out there trying to solve hard problems in sustainable ways. This project may or may not work, and the strategic winds are shifting considerably right now, but the impact Tailwind has had is undeniable and I think you should be proud of your team regardless of what happens in the coming months, years, and decades. Thanks for the work you're doing. <3
We want to give LLMs full access to send cryptocurrency now? There are so many things wrong with this.
I would love a paid Tailwind MCP server for feeding Tailwind UI components into an LLM. AI is just not as good (even Opus 4.5) as you @adamwathan and your team at creating beautiful UI/UX, and I would happily pay $99 a month to stop copy-pasting Tailwind UI snippets into my prompts. Furthermore, I've been a Tailwind subscriber from very early in the piece and I can't thank you @adamwathan enough for your hard work and dedication to solving this painful front-end problem. Massive respect for you and your work, and I believe, just as Rails has done, there is a sponsorship and paid-product model leaning into AI somewhere in there that will help Tailwind survive and thrive. Feel free to reach out anytime if you'd like some help / a chat from a fellow founder ❤️
I think I speak for everyone when I say.
This is such inconsiderate behaviour, especially the TikToks in which you whine and characterize this as "He closes it, says, uhh, you know, 'no we're not gonna do this, dadada', kinda a weak reason honestly, like I think he was just kinda having a bad day"... I don't remember the last time somebody pissed me off that badly.
Open source in general is extremely underfunded. We should all contribute what we can to the developers and projects we rely on. The reason I say this is that I'm hoping there isn't a big "mission accomplished" banner waved if Tailwind happens to turn this around, because this story neither starts nor ends with Tailwind; it's just among the most visible at the moment. Don't just go to the Tailwind sponsorship page, actually open up your …
I have an idea that solves the problem for both @quantizor and @adamwathan: why don't you add LLM support to your documentation site? I bet this will be more accurate (it needs to be ~100% accurate) than anything else out there, and it will only respond by referencing your documentation. Or even better, make a platform through which you add LLM support to documentation in general: people pay you to add LLM support to their docs. Over time you can also cache the queries and make a better experience for developers. LLMs are here, I guess, until they drink all our water (85,000 gallons of water used daily) and destroy the whole environment. But the problem is, if I stop using them, other people will have leverage over me. I myself am tired of not reading documentation.
I will be honest. I love open source. But something that really annoys me about the open source community is that developers take this holier-than-thou approach to backing up maintainers in circumstances like this, when obviously they are not paying with their own money. They are just complaining, and it feels a lot like virtue signaling at worst and pure naivety at best. It feels extremely disingenuous at this point, and it's annoying. What do we actually know?
But when making these points, devs whine and complain. I don't really know why I'm leaving this comment; I just feel like I'm at an annoyance breaking point. The Tailwind team is obviously struggling to pivot and all the grandstanding and virtue signaling just feels like wanting to feel good with very little action, which is just selfish.
It seems that Tailwind's sole marketing strategy is their hosted documentation. Also, it seems that Tailwind employees are instructed to watch and downvote any comment that doesn't align with optimizing user engagement with this documentation. Very interesting company culture. Where do I send my resume when you are hiring again? 😀
just use context7 lil bro |
Increasingly this is looking like devs throwing tantrums about stuffing Pandora back into the box when it is entirely clear to others that it won't be put back in there. To wit: Regardless of whether anyone adds an "llms.txt" file, literally any LLM can browse the public docs of your site in a NON-"llms.txt" form and glean everything they need to know to help people write Tailwind. So in addition to being a low priority due to the need to make money (something which I sympathize with, having been essentially underemployed for over a year now), it's not even necessary to implement anymore, because Pandora is already right in the room with us. |
You look exactly as I've imagined |
Dang, guess not. Well, that's really the end of it, I suppose. If the maintainers don't want the PR, they don't want the PR. Nothing more to be done about it, really. It does suck for the author, but ultimately maintainers carry the burden of maintaining the code external contributors have contributed. |
@adamwathan I totally sympathize with you. Here's my unbiased suggestion: you're blocking llms.txt, which prevents LLM bots from crawling your site in that form. Since your content is already included in their training data, they will continue serving outdated content. They still crawl all open-source projects, so finding a "how to" isn't a problem for any LLM. You're not blocking robots.txt; otherwise, you'd be removed from search engines, and many search engines provide index data as augmented data. The llms.txt file is for LLM agents and cloud AI agents (a small subset), which are mostly excluded from training data. Users can still share Tailwind CSS links in chat agents, and they'll be accessed from the local machine without any restrictions. But making that move could cost you valuable traffic from LLM chat references. SEO is on the brink of becoming outdated, with major traffic sources now coming from LLM mentions on authority sites you already own and from chat-app agents featuring ads. Your decision might be counterproductive. I have the best intentions for Tailwind CSS to thrive, and I'm a proud user for both local and private projects. Best,
Get fucked. |
How do I lock someone else's thread? |
Love how this PR, apart from having zero benefits, also adds a random library, with zero mention of it either. 👁️
Check the comments that someone marked as resolved (or whatever) earlier in the PR. I started with regexes, which were brittle and didn't scale well. I've worked with markdown for years and my library is able to parse to an AST, which is much easier to manipulate, then compile back to markdown. The library has zero dependencies and is very well tested, plus it uses npm trusted publishing. I don't really get the pushback here. Feels like people are just virtue signaling, which is whatever. All along in this PR I've been consistent that my objection is to the spirit of the closure. What does the commercial business have to do with core OSS documentation? Nothing. Put ads in it, idc.
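For anyone curious what an AST-based pass like the one described here looks like in practice, below is a rough sketch using the unified/remark toolchain. It is purely illustrative and is not necessarily the library or pipeline used in this PR; it also simply drops JSX nodes, whereas the PR extracts meaningful content from several custom components.

```ts
// Sketch of an AST-based MDX cleanup (illustrative; not the PR's exact pipeline).
import { unified } from "unified";
import remarkParse from "remark-parse";
import remarkMdx from "remark-mdx";
import remarkStringify from "remark-stringify";
import { remove } from "unist-util-remove";
import type { Root } from "mdast";

// Remark plugin that strips MDX-specific nodes: ESM (import/export),
// JSX elements, and expressions. Headings, prose, lists, and fenced
// code blocks pass through untouched.
function stripMdxNodes() {
  return (tree: Root) => {
    remove(tree, [
      "mdxjsEsm",
      "mdxJsxFlowElement",
      "mdxJsxTextElement",
      "mdxFlowExpression",
      "mdxTextExpression",
    ]);
  };
}

// Parse MDX to an AST, drop the JSX-only nodes, and compile back to markdown.
export async function mdxToMarkdown(source: string): Promise<string> {
  const file = await unified()
    .use(remarkParse)
    .use(remarkMdx)
    .use(stripMdxNodes)
    .use(remarkStringify)
    .process(source);
  return String(file);
}
```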
I just bought Tailwind Insiders, yay! ✅ Tailwind v4 was and continues to be an amazing release. Ironically, I've learned much more about CSS by using utility classes than fancy naming methodologies, and in v4 we have CSS-only configuration 🤯 Tailwind's plugin system is a thing of beauty, and I sure did crazy things in there… You should all consider supporting Tailwind if you aren't already; we have more pressing matters than LLMs not being up to date with the latest Tailwind. Besides, Tailwind is much closer to native CSS now, which makes me less inclined to trust what LLMs say because they suck at CSS 😓
Adam has explained that to you so clearly; you can choose to listen or not. From your TikTok it sounded like you don't really care to take in what's going on, you're only interested in your own goal. I'd like Tailwind Labs to continue working on Tailwind, would you?
@quantizor You are already ridiculous, and at the same time, you keep being as ridiculous as possible. It sounds like a mix of some UK royal family with The Kardashians, but you keep being more stupid than possible. That is a good sign, IMO - you are going to have some success in your career or whatsoever. Congratz! Oh, BTW, as I'm a Latin American, I don't want to sue you. I would just either kill you or make you on eat my shit online forcefully. Can you share your address, pls? 😇 . If not, it is all fine. I have some great backend folks. If Parasite had many Oscar prizes in 2020, we are able to get a better reasoning. BTW, I would put this song while you are on our court: https://open.spotify.com/track/5eS43KfdjIfUWxAXpfXT8x?si=7edd2524ce05478e These days, everyone feels like it's important to be a dictator, nope? 😇 |
Can Tailwind paywall the docs to AI crawlers? I don't think Cloudflare has made this public, but I'm sure Tailwind would be a great candidate for the beta. https://blog.cloudflare.com/introducing-pay-per-crawl/ Tailwind's team deserves to be paid for the work.
:)