
AI Tools for Varity

Varity Team · Core Contributors · Updated March 2026

Varity provides AI tooling to help you build faster. Copy a prompt into your AI assistant, and it already knows how to build, deploy, and debug Varity apps.

AI Prompts

Ready-to-use prompts for Cursor, Claude Code, Copilot, and any AI assistant. Copy a prompt, start building.

View Prompts →

Cursor Rules

.mdc rule files that teach Cursor how to write Varity code. Drop into your project for intelligent autocomplete.

LLM-Optimized Docs

Machine-readable documentation for AI assistants. Available in two formats: /llms.txt (summary) and /llms-full.txt (detailed).

MCP Server

Model Context Protocol server for Cursor, Claude Code, VS Code, and Windsurf — deploy, search docs, and manage apps from your AI editor.

View Spec →

The fastest way to get AI assistance with Varity is to copy one of our AI Prompts into your project.

For Cursor:

  1. Copy the Varity rules from our AI Prompts page
  2. Save to .cursor/rules/varity.mdc in your project
  3. Cursor now understands Varity patterns automatically
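The steps above can be sketched in the terminal; the actual rule content comes from the AI Prompts page, and the `echo` line below is only a placeholder for it:

```shell
# Create the rules directory Cursor scans for project rules
mkdir -p .cursor/rules
# Placeholder — paste the real rules from the AI Prompts page here
echo "# Varity rules go here" > .cursor/rules/varity.mdc
```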

For Claude Code:

  1. Copy the prompt content from our AI Prompts page
  2. Add to your project’s CLAUDE.md file
  3. Claude Code now has full Varity context
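As a sketch of steps 1–2, assuming you saved the copied prompt to a local file (`varity-prompt.md` is a hypothetical name), appending it to CLAUDE.md looks like:

```shell
# Stand-in for the prompt copied from the AI Prompts page
echo "You are building a Varity app with @varity-labs/sdk." > varity-prompt.md
# Append to the project's CLAUDE.md (created if missing)
touch CLAUDE.md
cat varity-prompt.md >> CLAUDE.md
```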

For any AI assistant:

  1. Copy the prompt from AI Prompts
  2. Paste at the start of your conversation
  3. Ask it to build features using Varity

The Varity MCP server lets your AI editor create, deploy, and manage Varity apps directly.

For Cursor — add to .cursor/mcp.json:

{
  "mcpServers": {
    "varity": {
      "command": "npx",
      "args": ["-y", "@varity-labs/mcp"]
    }
  }
}
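You can write and sanity-check that config from the terminal. This sketch mirrors the JSON above and uses Python's built-in `json.tool` to confirm the file parses (any JSON validator works):

```shell
mkdir -p .cursor
cat > .cursor/mcp.json <<'EOF'
{
  "mcpServers": {
    "varity": {
      "command": "npx",
      "args": ["-y", "@varity-labs/mcp"]
    }
  }
}
EOF
# Confirm the file is valid JSON before restarting Cursor
python3 -m json.tool .cursor/mcp.json > /dev/null && echo "mcp.json is valid JSON"
```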

For Claude Code:

claude mcp add varity -- npx -y @varity-labs/mcp

Then ask your AI to “create a Varity app” or “deploy this project” — it has 7 tools for the full build-to-monetize workflow. See the MCP Server Spec for the complete tool reference.

Pre-configured rules that teach Cursor IDE how to write Varity code correctly.

Available rules:

  • Varity App Development — SDK patterns, database queries, auth setup
  • Deployment — CLI commands, environment variables, hosting options
  • Authentication — Auth setup, protected routes, user data
  • Database — Collections, typed queries, CRUD hooks

Install:

mkdir -p .cursor/rules
# Copy rule content from the AI Prompts page into .cursor/rules/varity.mdc

See the AI Prompts page for the full rules file content.

AI assistants can reference our documentation in machine-readable formats:

| File | URL | Use Case |
| --- | --- | --- |
| llms.txt | https://docs.varity.so/llms.txt | Summary — what Varity is, key APIs, common patterns |
| llms-full.txt | https://docs.varity.so/llms-full.txt | Detailed — full documentation content for deep context |

Point any AI tool at one of these URLs for instant Varity context. Use llms.txt for quick context in chat-based LLMs (ChatGPT, Gemini), and llms-full.txt when the AI needs comprehensive documentation.

Our tools teach AI assistants these patterns:

AI assistants are trained to use developer-friendly language:

  • “Deploy your application” (not “upload to IPFS”)
  • “Sign in” (not “connect wallet”)
  • “Usage fees are covered” (not “gasless transactions”)
Code patterns — typed collections with explicit error handling:

import { db } from '@varity-labs/sdk';
import type { Project } from '../types';

// Always use typed collections
const projects = db.collection<Project>('projects');

try {
  await projects.add(data);
} catch (error) {
  if (error.code === 'VALIDATION_ERROR') {
    // Handle specific error
  }
}
| Tool | Integration | Level |
| --- | --- | --- |
| Cursor | MCP Server + .mdc rules + prompts | Full |
| Claude Code | MCP Server + CLAUDE.md + prompts | Full |
| VS Code (Copilot) | MCP Server + llms.txt context | Full |
| Windsurf | MCP Server + Cursor-compatible rules | Full |
| ChatGPT | llms.txt context | Basic |
| Gemini | llms.txt context | Basic |
| GitHub Copilot | llms.txt context | Basic |