
Use Granit docs with AI assistants

Stop copy-pasting code snippets into your AI assistant. Granit exposes its entire documentation as plain-text files that any LLM can ingest in one shot — so it answers with real framework knowledge, not guesswork.

ChatGPT

  1. Open chatgpt.com and start a new conversation.
  2. Click the attachment icon and upload llms-full.txt.
  3. Ask your question — ChatGPT now has the full Granit documentation as context.

For repeated use, create a Custom GPT and add https://granit-fx.dev/llms-full.txt as a knowledge file. Every conversation will start with Granit context built in.

Claude

  1. Open claude.ai and start a new conversation.
  2. Attach llms-full.txt as a file.
  3. Ask your question.

For repeated use, create a Claude Project and add the file as project knowledge — it stays available across all conversations in that project.

Claude Code

The fastest option: install the Granit MCP server as a global .NET tool so Claude Code can search the documentation in real time. No file upload, and the docs are always up to date.

  1. Install the tool:

        dotnet tool install --global Granit.Tools.Mcp

  2. Add this to your project’s .mcp.json:

        {
          "mcpServers": {
            "granit": {
              "command": "granit-tools-mcp"
            }
          }
        }
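Beyond `command`, Claude Code’s `.mcp.json` server entries also accept optional `args` and `env` keys for passing flags and environment variables to the server process. A sketch of the extended shape — note that `GRANIT_DOCS_BRANCH` is a hypothetical variable used purely for illustration, not a documented Granit option:

```json
{
  "mcpServers": {
    "granit": {
      "command": "granit-tools-mcp",
      "args": [],
      "env": {
        "GRANIT_DOCS_BRANCH": "main"
      }
    }
  }
}
```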

Claude Code now has 9 Granit tools:

Tool                  Description
docs_search           Full-text search across all docs (returns ID + snippet)
docs_get              Read full article content by ID
docs_list_patterns    List all architecture patterns
code_search           Search symbols across .NET and TypeScript
code_get_api          Inspect a type’s public API surface with signatures
code_get_graph        Project/package dependency graph
code_list_branches    Branches with a committed code index
nuget_list            List published Granit NuGet packages
nuget_get             Package versions, deps, frameworks, license
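Under the hood, Claude Code invokes these tools over MCP, which is JSON-RPC 2.0 on a stdio transport. A minimal sketch of the request an MCP client sends to call docs_search — the message envelope follows the MCP spec, but the `query` argument name is an assumption here, since the tool’s exact parameter schema isn’t shown above:

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        # "arguments" carries the tool's input; the key names inside it
        # depend on the tool's declared schema (assumed here).
        "params": {"name": tool, "arguments": arguments},
    })

# Example: ask the Granit server to search the docs for "multi-tenancy".
msg = make_tool_call(1, "docs_search", {"query": "multi-tenancy"})
print(msg)
```

You never write these messages by hand — Claude Code does — but seeing the wire format helps when debugging a server that doesn’t respond.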

Alternatively, add this to your project’s CLAUDE.md so Claude Code loads Granit context as a static file:

## Granit documentation
Full framework reference:
https://granit-fx.dev/llms-full.txt

Every conversation in the project will have access to the full module reference, patterns, and ADRs.

GitHub Copilot

Add a .github/copilot-instructions.md file to your repository so Copilot always knows about Granit:

When answering questions about the Granit framework,
refer to the full documentation at:
https://granit-fx.dev/llms-full.txt

Then ask Copilot Chat as usual:

@workspace How does Granit handle multi-tenancy?

Other LLM tools

Any LLM tool that supports the llms.txt standard will discover the documentation automatically from https://granit-fx.dev/llms.txt.
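An llms.txt file is a small markdown index: an H1 title, an optional summary, and sections of bullet links pointing at the full documentation files. A sketch of pulling those links out — the sample content below is made up for illustration; only the link format follows the llms.txt convention:

```python
import re

# Matches markdown bullet links of the form: - [title](url)
LINK = re.compile(r"-\s*\[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\)")

def extract_links(llms_txt: str) -> list[tuple[str, str]]:
    """Return (title, url) pairs from an llms.txt markdown index."""
    return [(m["title"], m["url"]) for m in LINK.finditer(llms_txt)]

# Hypothetical index content in the llms.txt shape:
sample = """# Granit
> Documentation index for the Granit framework.

## Docs
- [Full reference](https://granit-fx.dev/llms-full.txt): everything
- [Compact reference](https://granit-fx.dev/llms-small.txt): for small contexts
"""
print(extract_links(sample))
```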

For tools with smaller context windows, use the compact version: https://granit-fx.dev/llms-small.txt.
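Which file fits depends on the model’s context window. A rough way to decide, assuming the common heuristic of about 4 characters per token for English text (the real ratio varies by tokenizer and model):

```python
def estimated_tokens(text: str, chars_per_token: float = 4.0) -> int:
    """Rough token estimate; real tokenizers vary by model."""
    return int(len(text) / chars_per_token)

def pick_docs_file(full_text: str, context_budget_tokens: int) -> str:
    """Use llms-full.txt when it fits the budget, else fall back to llms-small.txt."""
    if estimated_tokens(full_text) <= context_budget_tokens:
        return "llms-full.txt"
    return "llms-small.txt"

# A 100k-character doc is roughly 25k tokens, well inside a 128k window:
print(pick_docs_file("x" * 100_000, 128_000))
```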