Why would anyone do this? Make it easier to scrape your site and lose organic traffic?
I hope we're designing them so they can't be used by AI. AI using software is like water using a straw.
I was wondering about this, but more for ten years from now, when people have physical household assistant robots. When you ask one to search for something online, it'll be the next SEO thing to have your site be easily readable by AI, because that's how people will be doing a good portion of their Google searches.

Even now I wonder what the market-share stats are. I know my first choice is for ChatGPT or Grok to scan a hundred websites for my query.
Fun to think about for sure.
> | AI Agent Requirement | Description |
> | --- | --- |
> | Semantic Markup | Use of HTML5 semantic elements to define web page structure |
> | Consistent Coding | Consistent use of coding practices throughout the web page |
> | Web Standards | Adherence to established web standards and best practices |
Right. Let’s ask AI how to make websites easier to navigate. Because that’s the top priority on the internet.
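To be fair, the "semantic markup" row in that table boils down to something concrete: using elements that name their role, so an agent can navigate by structure instead of guessing from class names. A minimal illustrative fragment:

```html
<!-- Semantic HTML5: each element describes its role in the page. -->
<header>
  <nav aria-label="Primary">
    <a href="/docs">Docs</a>
  </nav>
</header>
<main>
  <article>
    <h1>Page title</h1>
    <p>Main content an agent (or a screen reader) should read first.</p>
  </article>
</main>
<footer>Site info</footer>
```

Nothing new here; it's the same advice accessibility folks have given for years, which is part of why the AI answer reads as obvious.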
Has anyone started to redesign their UI/UX now that AI agents are about to start using the browser?
Svelte has added an AI-friendly text version of the documentation: https://svelte.dev/blog/advent-of-svelte#Day-13:-rise-of-the...
> For those of you using LLMs to help you write code — via Cursor or Copilot or Claude or Bolt or v0 or some other interface — we now publish the documentation in a selection of robot-friendly llms.txt files. This is experimental and will evolve over time, but by way of example here’s a snake game built by Sonnet 3.5 with no additional prompting.
More discussion about how Svelte maintainers are preparing for AI agents: https://youloop.leftium.com/?v=BlhgP3zADN0&a=1938&b=1985
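For context, an llms.txt file (per the llmstxt.org proposal) is just plain markdown served from the site root: an H1 title, a blockquote summary, and H2 sections containing link lists. A rough sketch of the shape, with invented placeholder contents:

```markdown
# Project Name

> One-sentence summary of the project, written for LLM consumption.

## Docs

- [Getting started](https://example.com/docs/start): install and first steps
- [API reference](https://example.com/docs/api): full reference, one page
```

The idea is that a coding assistant can pull this one file into its context window instead of crawling and cleaning dozens of HTML pages.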
Incentives are all wrong.
As a consumer, I want immediate answers from the AI, not to visit another website.

As a website owner, I want people to come to my website. I don't want to spend money sourcing data only to have it scraped by an AI company and repurposed.
---
What's the solution here? Perhaps companies can offer their data under a very explicit license: to show data sourced from my site, you must present it in this way: X, Y, Z. Something like a mini embedded version of the site?
Whatever the solution, the incentive misalignment must be addressed sooner rather than later.
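One hedged sketch of what such a license signal might look like: a robots.txt-style policy file that tells AI crawlers how sourced content must be presented. Everything here is hypothetical (the file name, the directive names, and the parser); no such standard exists today.

```python
# Hypothetical: parse a robots.txt-style "ai-license" policy file.
# All directive names are invented for illustration.

def parse_ai_license(text):
    """Parse simple 'Key: value' directive lines into a policy dict."""
    policy = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        key, _, value = line.partition(":")
        policy[key.strip().lower()] = value.strip()
    return policy

example = """
# /ai-license.txt (hypothetical)
Attribution: required
Link-Back: required
Max-Excerpt-Words: 50
Display-Format: embedded-card
"""

policy = parse_ai_license(example)
print(policy["attribution"])        # required
print(policy["max-excerpt-words"])  # 50
```

Enforcement is the hard part, of course: a plain-text policy only matters if AI companies have a legal or commercial reason to honor it, which loops back to the incentive problem above.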