Ledger – Open Support Framework

Ledger SEO System

2026-02-11 · Updated 2026-02-11

Introduction

As part of our ongoing efforts to improve the search engine optimization (SEO) of Ledger websites, we've implemented two new features: a sitemap generator, and an agents generator aimed at AI GEO (Generative Engine Optimization).


Sitemap Generator

A sitemap is a file that lists all the pages on our website, making it easier for search engines like Google, Bing, and Yahoo to crawl and index our content. The new sitemap generator automatically creates and updates a sitemap file for our website, ensuring that search engines always have access to the most up-to-date and accurate information about our site's structure and content.
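For reference, sitemaps follow the standard sitemaps.org XML format. A minimal file looks like the fragment below (the example.com URLs are placeholders, not actual Ledger pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/docs/seo</loc>
    <lastmod>2026-02-11</lastmod>
  </url>
</urlset>
```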

How it works

The sitemap generator is a script that runs on our website's server, periodically scanning our site's content and updating the sitemap file. The sitemap file is then made available to search engines through a well-known URL, allowing them to easily discover and crawl our site's pages.
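The scan-and-write step can be sketched as a small script. This is a minimal illustration, not the actual Ledger implementation: the `pages` list stands in for whatever content scan the real generator performs, and the URLs are placeholders.

```python
from xml.etree import ElementTree as ET

def build_sitemap(pages):
    """Build a sitemaps.org-format XML string from (loc, lastmod) pairs.

    `pages` is assumed to come from a content scan; the real Ledger
    script's data source is not documented in this article.
    """
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc        # page URL
        ET.SubElement(url, "lastmod").text = lastmod  # last-modified date
    return ET.tostring(urlset, encoding="unicode")

# Placeholder scan results; a real run would enumerate site content.
sitemap = build_sitemap([
    ("https://example.com/", "2026-02-11"),
    ("https://example.com/docs/seo", "2026-02-11"),
])
```

A scheduled job would write this string to the sitemap file at each run, so the published sitemap tracks the site's current content.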


Benefits

The sitemap generator provides several benefits, including:

  • Improved crawlability: By providing a comprehensive list of our site's pages, search engines can more easily discover and crawl our content, reducing the time it takes for new pages to be indexed.
  • Increased visibility: By ensuring that our site's pages are accurately listed in the sitemap, we can improve our website's visibility in search engine results, making it more likely that users will find our site when searching for relevant keywords.
  • Better organization: The sitemap generator helps to organize our site's content, making it easier for search engines to understand our site's structure and content hierarchy.

AI Agents

When an agent "looks at" an agents.json file, it is essentially reading a standardized manual or "business card" that defines how it should behave or interact with other systems.

The agents.json file serves two primary purposes on a Ledger website:

1. The "Robots.txt" for AI Agents

Just as robots.txt tells search engines how to crawl a website, agents.json provides context about your website.

2. Orchestration and Tool-Linking

Ledger websites use this file to bridge the gap between human-readable goals and machine-executable actions.
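To make the two purposes concrete, an agents.json file along these lines could describe the site and link a human-readable goal to a machine-executable action. The field names, URL, and action entry below are illustrative assumptions, not the actual schema the Ledger generator emits:

```json
{
  "name": "Ledger Support Site",
  "description": "Product documentation and support articles",
  "sitemap": "https://example.com/sitemap.xml",
  "actions": [
    {
      "name": "search_articles",
      "description": "Search support articles by keyword",
      "endpoint": "https://example.com/api/search"
    }
  ]
}
```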

How they work

The agents.json file is designed to work in conjunction with the sitemap generator, providing additional information about Ledger pages and content to AI models such as ChatGPT (OpenAI) and Gemini (Google). The agents are embedded in our website's pages through a simple script, and work by:

  • Providing metadata: page titles, descriptions, and keywords that help search engines understand the content and relevance of our site's pages.
  • Improving context: details about our site's pages, such as the topics and categories they belong to, that help search engines better understand our site's structure and content hierarchy.
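One plausible shape for the embedded metadata is a small JSON payload inside each page. The tag, attribute, and field names here are assumptions for illustration; the article does not specify what the actual embed script emits:

```html
<!-- Illustrative only: not the actual Ledger embed format. -->
<script type="application/json" data-seo-agent>
{
  "title": "Ledger SEO System",
  "description": "Sitemap and agents.json generators for Ledger sites",
  "keywords": ["seo", "sitemap", "agents.json"],
  "category": "documentation"
}
</script>
```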

Benefits

The SEO agents provide several benefits, including:

  • Improved relevance: the additional metadata and context help search engines and AI crawlers better understand your site's content and its relevance to specific search queries, improving the site's visibility.
  • Better user experience: more accurate and comprehensive information about pages means users are more likely to land on the page that actually answers their question.

Best Practice

To get the most out of this new functionality, open the admin area and select Maintenance > SEO Manager.


Support

For issues or questions, visit the Usage and Questions forum.

