AI Prompts
Ready-to-use prompts for Cursor, Claude Code, Copilot, and any AI assistant. Copy a prompt, start building.
Varity provides AI tooling to help you build faster. Copy a prompt into your AI assistant, and it already knows how to build, deploy, and debug Varity apps.
- **AI Prompts**: Ready-to-use prompts for Cursor, Claude Code, Copilot, and any AI assistant. Copy a prompt, start building.
- **Cursor Rules**: `.mdc` rule files that teach Cursor how to write Varity code. Drop them into your project for intelligent autocomplete.
- **LLM-Optimized Docs**: Machine-readable documentation for AI assistants, available in two formats: `/llms.txt` (summary) and `/llms-full.txt` (detailed).
- **MCP Server**: A Model Context Protocol server for Cursor, Claude Code, VS Code, and Windsurf. Deploy, search docs, and manage apps from your AI editor.
The fastest way to get AI assistance with Varity is to copy one of our AI Prompts into your project.
- **For Cursor:** save the prompt as `.cursor/rules/varity.mdc` in your project.
- **For Claude Code:** add the prompt to your `CLAUDE.md` file.
- **For any AI assistant:** paste the prompt directly into the conversation.
The Varity MCP server lets your AI editor create, deploy, and manage Varity apps directly.
For Cursor, add this to `.cursor/mcp.json`:

```json
{
  "mcpServers": {
    "varity": {
      "command": "npx",
      "args": ["-y", "@varity-labs/mcp"]
    }
  }
}
```

For Claude Code:

```bash
claude mcp add varity -- npx -y @varity-labs/mcp
```

Then ask your AI to “create a Varity app” or “deploy this project”: the server exposes 7 tools covering the full build-to-monetize workflow. See the MCP Server Spec for the complete tool reference.
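For VS Code, the equivalent configuration lives in `.vscode/mcp.json`. Note that VS Code uses a top-level `servers` key rather than `mcpServers`; this path and key follow VS Code's own MCP configuration format, not a Varity-specific spec:

```json
{
  "servers": {
    "varity": {
      "command": "npx",
      "args": ["-y", "@varity-labs/mcp"]
    }
  }
}
```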
Pre-configured rules that teach Cursor IDE how to write Varity code correctly.
Available rules:
Install:
```bash
mkdir -p .cursor/rules
# Copy rule content from the AI Prompts page into .cursor/rules/varity.mdc
```

See the AI Prompts page for the full rules file content.
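A Cursor rule file is Markdown with a small frontmatter header. A minimal skeleton for `varity.mdc` might look like the following; the description, globs, and bullet points here are illustrative placeholders, not the official rule content from the AI Prompts page:

```markdown
---
description: Varity app patterns (typed collections, error handling)
globs: ["**/*.ts", "**/*.tsx"]
alwaysApply: false
---

# Varity rules

- Always use typed collections from @varity-labs/sdk.
- Handle SDK errors by checking error.code before falling back.
```

Cursor applies a rule automatically when an open file matches `globs`, or always when `alwaysApply` is `true`.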
AI assistants can reference our documentation in machine-readable formats:
| File | URL | Use Case |
|---|---|---|
| `llms.txt` | https://docs.varity.so/llms.txt | Summary: what Varity is, key APIs, common patterns |
| `llms-full.txt` | https://docs.varity.so/llms-full.txt | Detailed: full documentation content for deep context |
Point any AI tool at one of these URLs for instant Varity context. Use llms.txt for quick context in chat-based LLMs (ChatGPT, Gemini), and llms-full.txt when the AI needs comprehensive documentation.
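The choice between the two files comes down to how much context the assistant can hold. A small sketch of that rule, using a hypothetical helper (`docsUrlFor` is not part of the Varity SDK; the 100k-token cutoff is an illustrative assumption):

```typescript
// Hypothetical helper: pick which machine-readable doc file to feed
// an assistant based on its context window size.
const LLMS_TXT = 'https://docs.varity.so/llms.txt';           // summary
const LLMS_FULL_TXT = 'https://docs.varity.so/llms-full.txt'; // full docs

function docsUrlFor(contextWindowTokens: number): string {
  // Small context windows get the summary; long-context models
  // can take the full documentation. The threshold is illustrative.
  return contextWindowTokens >= 100_000 ? LLMS_FULL_TXT : LLMS_TXT;
}

console.log(docsUrlFor(8_000));   // summary file for a chat-based LLM
console.log(docsUrlFor(200_000)); // full docs for a long-context model
```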
Our tools teach AI assistants Varity's core patterns. For example, assistants learn to use typed collections and to handle SDK errors by their error code:

```typescript
import { db } from '@varity-labs/sdk';
import type { Project } from '../types';

// Always use typed collections
const projects = db.collection<Project>('projects');

try {
  await projects.add(data);
} catch (error) {
  if (error.code === 'VALIDATION_ERROR') {
    // Handle specific error
  }
}
```

Varity integrates with these AI tools:

| Tool | Integration | Level |
|---|---|---|
| Cursor | MCP Server + .mdc rules + prompts | Full |
| Claude Code | MCP Server + CLAUDE.md + prompts | Full |
| VS Code (Copilot) | MCP Server + llms.txt context | Full |
| Windsurf | MCP Server + Cursor-compatible rules | Full |
| ChatGPT | llms.txt context | Basic |
| Gemini | llms.txt context | Basic |
| GitHub Copilot | llms.txt context | Basic |