
Artificial Intelligence

Granit integrates AI as an optional, provider-agnostic layer that enriches existing modules without modifying them. Every AI feature follows the same principle: existing modules define extension points with null-object defaults; AI packages provide smart implementations that activate via DI composition.

| Capability | Module | What it does |
| --- | --- | --- |
| Chat & generation | Granit.AI | Workspace-managed access to IChatClient (OpenAI, Azure OpenAI, Ollama) |
| Natural language query | Querying.AI | Users type “unpaid invoices from last week” → structured filters |
| Semantic search | AI.VectorData | Find documents by meaning, build RAG knowledge bases |
| Import mapping | DataExchange.AI | Automatically map messy CSV/Excel columns to your entity properties |
| Document extraction | AI.Extraction | Turn a PDF invoice into a typed InvoiceData C# record |
| Workflow decisions | Workflow.AI | AI recommends the next transition, scores risk for auto-approval |
| Notification content | Notifications.AI | Generate message body + route to the optimal channel |
| Timeline intelligence | Timeline.AI | Summarize audit trails, detect unusual activity patterns |
| PII detection | Privacy.AI | Scan free-text fields for personal data (GDPR compliance) |
| Content moderation | Validation.AI | Detect toxic content, harassment, and prompt injection in user input |
| Blob classification | BlobStorage.AI | Tag uploaded files by category, detect PII in filenames |
| Access anomalies | Authorization.AI | Detect suspicious access patterns in permission checks |
| Log analysis | Observability.AI | Summarize error batches, surface root-cause insights |
| Image analysis | Imaging.AI | Alt-text generation, object detection, smart tagging |

Your application code depends on IChatClient (from Microsoft.Extensions.AI), not on OpenAI or Anthropic. Swap providers per environment:

| Environment | Provider | Why |
| --- | --- | --- |
| Development | Ollama | Free, local, no API key |
| Staging | OpenAI | Fast iteration, latest models |
| Production | Azure OpenAI | EU data residency, Managed Identity, DPA |
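The swap can live entirely in Program.cs. A minimal sketch: the AddGranitAIOpenAI and AddGranitAIAzureOpenAI registration extensions are assumed here by analogy with AddGranitAIOllama, and are not confirmed API names:

```csharp
// Program.cs — sketch; only AddGranitAI and AddGranitAIOllama appear in this
// guide, the other two registration methods are assumed analogues.
builder.AddGranitAI();

if (builder.Environment.IsDevelopment())
{
    builder.AddGranitAIOllama();        // free, local, no API key
}
else if (builder.Environment.IsStaging())
{
    builder.AddGranitAIOpenAI();        // fast iteration, latest models
}
else
{
    builder.AddGranitAIAzureOpenAI();   // EU data residency, Managed Identity
}
```

Because application code only ever sees IChatClient, nothing outside Program.cs changes when the provider does.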

AI packages reference existing modules, never the reverse:

```mermaid
graph LR
    DE[Granit.DataExchange] -.->|defines ISemanticMappingService| NULL[NullService]
    DEAI[Granit.DataExchange.AI] -->|implements| DE
    DEAI --> AI[Granit.AI]
    AI --> MEAI[Microsoft.Extensions.AI]

    style NULL fill:#ffebee
    style DEAI fill:#e8f5e9
```

Without the AI package installed, the module works normally. With it, AI activates.
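In code, that pattern might look like the sketch below. The ISemanticMappingService name comes from the graph; the method signature, the ColumnMapping type, and the registration helpers are illustrative assumptions, not Granit's actual API:

```csharp
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection.Extensions;

// Hypothetical shape of the extension point and its result type.
public sealed record ColumnMapping(string SourceColumn, string TargetProperty);

public interface ISemanticMappingService
{
    Task<IReadOnlyList<ColumnMapping>?> SuggestMappingsAsync(
        IReadOnlyList<string> columnNames, CancellationToken ct);
}

// Granit.DataExchange ships a null-object default: no suggestions, no AI.
internal sealed class NullSemanticMappingService : ISemanticMappingService
{
    public Task<IReadOnlyList<ColumnMapping>?> SuggestMappingsAsync(
        IReadOnlyList<string> columnNames, CancellationToken ct)
        => Task.FromResult<IReadOnlyList<ColumnMapping>?>(null);
}

// Composition: the AI package registers its implementation, while the base
// module uses TryAddSingleton — so the AI version wins when both are present.
// services.AddSingleton<ISemanticMappingService, AISemanticMappingService>();
// services.TryAddSingleton<ISemanticMappingService, NullSemanticMappingService>();
```

TryAddSingleton is what makes the override safe: the base module's null default registers only if nothing else has claimed the interface.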

| Principle | How Granit.AI implements it |
| --- | --- |
| Data minimization | AI modules send only metadata to the LLM (column names, schema), never business data |
| Audit trail | Every AI interaction is logged (tenant, user, workspace, model, tokens, timestamp) |
| Right to erasure | Workspace and usage records support soft delete; vector embeddings are deletable |
| Data residency | Azure OpenAI for EU-only; Ollama for on-premise (data never leaves your server) |
| Opt-in for sensitive data | Preview rows in import mapping require explicit IncludePreviewRows: true |
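The opt-in principle can be made concrete with a request type. A sketch, assuming a hypothetical MappingSuggestionRequest record; only the IncludePreviewRows flag comes from this guide:

```csharp
// Hypothetical request type — only IncludePreviewRows is a documented name.
public sealed record MappingSuggestionRequest
{
    public required IReadOnlyList<string> ColumnNames { get; init; }

    // Defaults to false: only column names (metadata) are sent to the LLM.
    public bool IncludePreviewRows { get; init; }
}

public static class MappingRequests
{
    // Metadata-only by default — no business data leaves the application.
    public static MappingSuggestionRequest MetadataOnly() =>
        new() { ColumnNames = ["Cust_Nm", "InvAmt", "DueDt"] };

    // Sample rows reach the prompt only after an explicit opt-in.
    public static MappingSuggestionRequest WithPreview(MappingSuggestionRequest r) =>
        r with { IncludePreviewRows = true };
}
```

The default of the flag, not the caller's diligence, is what enforces data minimization.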

AI calls are slow (anywhere from 200 ms to 15 s). Interactive features run synchronously with tight timeouts; everything else runs asynchronously via Wolverine:

| Use case | Execution | Why |
| --- | --- | --- |
| Natural language query | Synchronous (~200 ms) | User is waiting for search results |
| Import mapping | Synchronous (5-10 s) | User is in the wizard, waiting for suggestions |
| Content moderation | Synchronous (2 s, fail-open) | Must validate before persistence; a timeout never blocks |
| Access anomaly detection | Synchronous (2 s, fail-open) | Runs after the auth check; never blocks access |
| Document extraction | Async (Wolverine handler) | 5-15 s; client gets 202 Accepted + push notification |
| Blob classification | Async (Wolverine handler) | Upload should not wait for classification |
| Notification content | Async (Wolverine handler) | Generated after the notification is queued |
| Workflow auto-approval | Async (Wolverine handler) | Runs on the ApprovalRequested event |
| Timeline summarization | Async (Wolverine handler) | On-demand or scheduled, never interactive |
| PII detection (batch) | Async (Wolverine handler) | Scanning existing records at scale |
| Log analysis | Async (cron job) | Scheduled, batch analysis |
| Image analysis | Async (Wolverine handler) | Triggered after upload |
| Vector indexing | Async (Wolverine handler) | Batch operation, no user waiting |
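The async path can be sketched as a Wolverine handler. The message, service, and InvoiceData member types here are illustrative assumptions; what is real is Wolverine's convention of discovering public Handle methods and injecting registered services as parameters:

```csharp
// All names below except Wolverine's conventions are illustrative.
public sealed record InvoiceData; // fields elided for the sketch

public sealed record ExtractInvoiceRequested(Guid BlobId, Guid TenantId);

public interface IInvoiceExtractionService
{
    Task<InvoiceData> ExtractAsync(Guid blobId, CancellationToken ct);
}

public static class ExtractInvoiceHandler
{
    // Wolverine discovers Handle methods by convention; services and the
    // CancellationToken arrive as method parameters.
    public static async Task Handle(
        ExtractInvoiceRequested message,
        IInvoiceExtractionService extraction,
        CancellationToken ct)
    {
        InvoiceData invoice = await extraction.ExtractAsync(message.BlobId, ct);
        // ...persist the result and push a notification to the waiting client
    }
}

// The endpoint publishes the message and returns 202 Accepted immediately:
// await bus.PublishAsync(new ExtractInvoiceRequested(blobId, tenantId));
```

The request thread never touches the LLM; the 5-15 s extraction happens entirely inside the handler.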

The minimal setup to get AI working in your Granit application:

```csharp
// Program.cs
builder.AddGranitAI();
builder.AddGranitAIOllama(); // zero config for local dev
```

```sh
# Terminal: start Ollama
ollama pull llama3.1
ollama serve
```

Then inject IAIChatClientFactory into any service:

```csharp
public class MyService(IAIChatClientFactory chatFactory)
{
    public async Task<string> AskAsync(string question, CancellationToken ct)
    {
        IChatClient client = await chatFactory
            .CreateAsync(cancellationToken: ct)
            .ConfigureAwait(false);

        ChatResponse response = await client
            .GetResponseAsync(question, cancellationToken: ct)
            .ConfigureAwait(false);

        return response.Text;
    }
}
```

Feature guides by area:

- User experience
- Data ingestion
- Business intelligence
- Security & compliance
- Operations

For the core module reference (providers, workspaces, configuration), see Setup & Configuration.