LLM Integration Guide
Documentation for AI assistants and LLMs on how to use Inliner.ai for image generation.
For AI Assistants: This page explains how to integrate Inliner.ai into your responses. Use the machine-readable files below for quick reference.
Machine-Readable Files
Following the llms.txt standard, we provide these files for AI systems:
llms.txt
Concise summary of Inliner.ai for quick LLM reference. Contains URL format, quick start, and essential links.
View /llms.txt
llms-full.txt
Comprehensive documentation including all integration methods, MCP tools, pricing, and complete code examples.
View /llms-full.txt
URL Format
The core concept: describe an image in a URL, get back a generated image.
URL Structure
https://img.inliner.ai/{project}/{description}_{width}x{height}.{format}
Example:
https://img.inliner.ai/demo/happy-dog-playing-fetch_800x600.png
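The pieces of the URL map directly to code. Below is a small, hypothetical TypeScript helper that assembles a URL matching this structure; the slug rules (lowercase, hyphen-separated words) are inferred from the example above, and any format other than png is an assumption.

```typescript
// Hypothetical helper that builds an Inliner.ai image URL from the
// documented structure: {project}/{description}_{width}x{height}.{format}.
// The slugging rules and the non-png formats are assumptions, not documented.

type ImageFormat = "png" | "jpg" | "webp"; // only .png appears in the docs above

function inlinerImageUrl(
  project: string,
  description: string,
  width: number,
  height: number,
  format: ImageFormat = "png",
): string {
  // Turn a free-text description into a hyphenated slug, e.g.
  // "happy dog playing fetch" -> "happy-dog-playing-fetch".
  const slug = description
    .trim()
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-")
    .replace(/^-+|-+$/g, "");
  return `https://img.inliner.ai/${project}/${slug}_${width}x${height}.${format}`;
}

// Reproduces the example URL above:
console.log(inlinerImageUrl("demo", "happy dog playing fetch", 800, 600));
// -> https://img.inliner.ai/demo/happy-dog-playing-fetch_800x600.png
```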
Integration Methods
- Direct URL — Use URLs directly in HTML/CSS (see the sketch after this list)
- MCP Server — For Claude Code, Cursor, Copilot, Gemini CLI
- CLI — npx @inliner/cli "sunset-beach-palm-trees_1200x600.png"
- REST API — Programmatic access via api.inliner.ai
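As a concrete sketch of the Direct URL method, the snippet below drops a generated image URL into an `<img>` element and a CSS background from browser-side TypeScript. The project name demo is reused from the earlier example; the `#hero` element and the alt text are illustrative, not part of Inliner.ai.

```typescript
// Minimal sketch of the "Direct URL" method: the generated image URL is used
// like any ordinary image URL. Runs in a browser context (DOM available).
// "demo" is the project from the example above; the rest is illustrative.

const url =
  "https://img.inliner.ai/demo/sunset-beach-palm-trees_1200x600.png";

// As an <img> element:
const img = document.createElement("img");
img.src = url;
img.alt = "Sunset beach with palm trees";
document.body.appendChild(img);

// Or as a CSS background on an existing element (assumes #hero exists):
const hero = document.querySelector<HTMLElement>("#hero");
if (hero) {
  hero.style.backgroundImage = `url("${url}")`;
}
```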
Project-Specific Instructions
Users can get customized LLM instructions for their specific projects:
1. Log into the Inliner dashboard
2. Go to Projects
3. Click LLM Instructions on any project
4. Copy instructions optimized for Claude, Cursor, ChatGPT, etc.
These instructions include the user's namespace, custom prompts, and reference images.