The Traditional Handoff Hell
You've lived through this workflow:
- Designer creates beautiful mockups in Figma
- Designer annotates spacing, colors, fonts in a handoff doc
- Developer inspects Figma, manually translates designs to code
- Developer asks: "Is this 16px or 20px?" "Which blue is this?"
- Designer clarifies (via Slack, usually after waiting hours)
- Developer codes, designer reviews
- Designer: "The spacing is off, and that's not the right blue"
- Repeat the inspect, ask, clarify, fix loop until both sides are exhausted
Time wasted: Days or weeks per feature.
Quality: Inconsistent, drifts from design over time.
Morale: Both designers and developers frustrated.
Now imagine this:
- Designer defines design tokens in Figma (or FramingUI generates them from Figma styles)
- Developer asks AI: "Build a user profile card matching our design system"
- AI reads design tokens, generates pixel-perfect code
- Developer reviews, ships
- Done.
Time wasted: Minutes.
Quality: Matches design system 100%.
Morale: Everyone focuses on creative work, not pixel-pushing.
This isn't future speculation. This is possible today with AI + design tokens.
TL;DR
- Traditional Figma-to-code workflow is slow, error-prone, and relies on manual translation
- AI code generation (Claude, Cursor) can automate this—if it has access to your design system
- Design tokens are the missing link: structured data that both Figma and AI understand
- FramingUI bridges Figma Variables → Design Tokens → AI-readable context
- Result: AI generates production-ready code using your actual design system, no manual cleanup
The Old Workflow: Figma Inspect + Manual Translation
How It Used to Work
Designer side:
- Design components in Figma
- Use Figma styles for colors, text styles, effects
- Create handoff doc or rely on Figma Inspect
- Wait for developer questions
Developer side:
- Open Figma, click each element
- Copy CSS values from Inspect panel
- Translate to code manually:
  - `#3B82F6` → `bg-blue-500` (guessing which Tailwind class)
  - `24px` → `p-6` or `gap-6`? (ambiguous)
  - `Inter SemiBold 16px` → `font-semibold text-base`
- Hope you got it right
Why This Workflow Breaks
- Information loss: Figma styles have semantic names ("Primary Brand"), but Inspect shows raw values (`#3B82F6`)
- No single source of truth: Design lives in Figma, code lives in the repo, and they drift over time
- Manual translation errors: Developer guesses which token matches which Figma value
- Painful iterations: Every design change requires re-inspecting and re-coding
- No validation: Nothing stops developers from using `#3B82F6` instead of `--color-primary`
The AI-Powered Workflow: Tokens as the Bridge
The New Flow
┌──────────┐ ┌────────────────┐ ┌─────────┐
│ Figma │ ────────► │ Design Tokens │ ────────► │ AI │
│ Variables│ export │ (JSON) │ reads │ Tool │
└──────────┘ └────────────────┘ └─────────┘
│ │
│ ▼
│ ┌─────────────┐
└──────────────────►│ Production │
generates │ Code │
└─────────────┘
Key insight: Design tokens are the common language that Figma, AI, and code all understand.
How It Works
Step 1: Designer defines variables in Figma
Figma Variables (Color, Number, String, Boolean) map directly to design tokens:
- Color variable: `color/primary/500` → `{ "value": "#3b82f6", "type": "color" }`
- Number variable: `spacing/md` → `{ "value": "16px", "type": "dimension" }`
- String variable: `font/family/base` → `{ "value": "Inter", "type": "fontFamily" }`
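The path-to-token mapping above can be sketched in a few lines of TypeScript. This is an illustration, not FramingUI's actual exporter: each slash-separated Figma variable path becomes a nested entry in a tokens.json-style tree.

```typescript
// Illustrative sketch: turn a Figma variable path like
// "color/primary/500" into a nested design-token entry.
type TokenLeaf = { value: string; type: string };
type TokenTree = { [key: string]: TokenTree | TokenLeaf };

function setToken(tree: TokenTree, path: string, leaf: TokenLeaf): void {
  const parts = path.split("/");
  let node = tree;
  // Walk (and create) intermediate groups, then place the leaf.
  for (const part of parts.slice(0, -1)) {
    node = (node[part] ??= {}) as TokenTree;
  }
  node[parts[parts.length - 1]] = leaf;
}

const tokens: TokenTree = {};
setToken(tokens, "color/primary/500", { value: "#3b82f6", type: "color" });
setToken(tokens, "spacing/md", { value: "16px", type: "dimension" });
```

The same shape works for number and string variables; only the `type` field changes.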
Step 2: Export Figma Variables to design tokens
Use Figma's REST API or plugins like Tokens Studio:
# Figma Variables → tokens.json
npx figma-variables-export --file-id YOUR_FILE_ID --output tokens.json
Or use FramingUI's Figma plugin (coming soon):
FramingUI Figma Plugin → Click "Export Tokens" → tokens.json
Step 3: Commit tokens to repo
git add tokens.json
git commit -m "design: update tokens from Figma"
git push
Step 4: AI reads tokens via MCP
Configure FramingUI's MCP server (one-time setup):
// claude_desktop_config.json
{
"mcpServers": {
"framingui": {
"command": "npx",
"args": ["@framingui/mcp-server"],
"env": {
"TOKENS_PATH": "./tokens.json"
}
}
}
}
Step 5: Ask AI to generate code
You: "Build a user profile card with avatar, name, bio, and follow button"
Claude: [reads design tokens via MCP]
[sees color.primary, spacing.md, border.radius.lg, etc.]
[generates component using actual tokens]
Output:
export function UserProfileCard({ user }) {
return (
<div className="bg-white dark:bg-gray-900 rounded-lg p-6 shadow-md">
<div className="flex items-center gap-4">
<img
src={user.avatar}
alt={user.name}
className="w-16 h-16 rounded-full"
/>
<div className="flex-1">
<h3 className="text-lg font-semibold text-gray-900 dark:text-white">
{user.name}
</h3>
<p className="text-sm text-gray-600 dark:text-gray-400">
{user.bio}
</p>
</div>
</div>
<button className="mt-4 w-full bg-primary-500 hover:bg-primary-600 text-white font-medium py-2 px-4 rounded-md">
Follow
</button>
</div>
);
}
Notice: AI used bg-primary-500, text-gray-900, rounded-lg—all from your actual design system. No hardcoded #3B82F6 or arbitrary px-4.
Why Tokens Make AI Better
Without Tokens
Prompt: "Build a button component"
AI output:
<button className="bg-blue-500 text-white px-4 py-2 rounded">
Click me
</button>
Problems:
- `bg-blue-500` might not be your brand color
- `px-4 py-2` might not match your spacing scale
- `rounded` might not match your border radius system
You have to manually fix every value.
With Tokens
Prompt: "Build a button component using our design tokens"
AI reads tokens via MCP:
{
"component.button.primary.background": "color.primary.500",
"component.button.padding.x": "spacing.lg",
"component.button.padding.y": "spacing.md",
"component.button.border.radius": "border.radius.md"
}
AI output:
<button className="bg-primary-500 text-white px-6 py-3 rounded-md">
Click me
</button>
Result: Perfect alignment with your design system on the first try.
Token-Based Workflow Benefits
1. Single Source of Truth
Figma Variables ──► Design Tokens ──┬──► Web (CSS/Tailwind)
├──► iOS (Swift)
├──► Android (Kotlin)
└──► AI Context (MCP)
Everyone reads from the same source. No more "which version is correct?"
2. Automatic Consistency
AI can't use the wrong color—it only has access to your approved tokens:
AI: "I need a blue color"
MCP: "Here are your blues: primary-400, primary-500, primary-600"
AI: "I'll use primary-500 for the button"
No #3B82F6 leaks into the codebase.
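The query-and-answer exchange above can be pictured as a small lookup function. This is a hypothetical sketch (the real MCP server's API may differ): the AI only ever receives approved token names for the type it asks about, never arbitrary values.

```typescript
// Hypothetical token store and query, mirroring the MCP exchange above.
type Token = { name: string; value: string; type: string };

const tokens: Token[] = [
  { name: "color.primary.400", value: "#60a5fa", type: "color" },
  { name: "color.primary.500", value: "#3b82f6", type: "color" },
  { name: "color.primary.600", value: "#2563eb", type: "color" },
  { name: "spacing.md", value: "16px", type: "dimension" },
];

// "I need a blue color" resolves to a list of approved token names.
function queryTokens(type: string): string[] {
  return tokens.filter((t) => t.type === type).map((t) => t.name);
}
```

Because the answer is a set of names rather than free-form hex values, the generated code stays inside the design system by construction.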
3. Design Changes Propagate Automatically
Designer updates color.primary.500 in Figma → exports tokens → commits to Git → AI sees new value → next generated component uses updated color.
Zero manual propagation.
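The propagation step can be sketched as a build function that regenerates CSS custom properties from the token tree, so a value changed in Figma flows straight into the stylesheet on the next build. Names here are illustrative, not FramingUI's actual output format.

```typescript
// Sketch: walk a tokens.json-style tree and emit CSS custom properties.
type Leaf = { value: string; type: string };
type Tree = { [k: string]: Tree | Leaf };

function tokensToCssVars(tree: Tree, prefix = "-"): string[] {
  const out: string[] = [];
  for (const [key, child] of Object.entries(tree)) {
    const name = `${prefix}-${key}`;
    if (typeof (child as Leaf).value === "string") {
      out.push(`${name}: ${(child as Leaf).value};`); // leaf token
    } else {
      out.push(...tokensToCssVars(child as Tree, name)); // nested group
    }
  }
  return out;
}

const cssVars = tokensToCssVars({
  color: { primary: { "500": { value: "#3b82f6", type: "color" } } },
});
// cssVars[0] is "--color-primary-500: #3b82f6;"
```

Rerunning this after a token export is what makes the update hands-free: no one edits the stylesheet by hand.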
4. Fewer Clarification Rounds
Traditional:
Dev: "Is this spacing 16px or 20px?"
Designer: [checks Figma] "20px"
Dev: [updates code]
Token-based:
AI: [reads tokens] spacing.lg = 20px
AI: [generates code with spacing.lg]
Done.
5. Design System Enforcement
AI can only use tokens that exist. If a token isn't defined, AI will ask:
AI: "I need a color for destructive actions. I see color.error.500—should I use that?"
This forces intentional design decisions instead of ad-hoc values.
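Enforcement can also run after generation. A minimal guard in this spirit (an illustration, not part of FramingUI's tooling) is a check that flags raw hex colors in generated code so nothing like `#3B82F6` slips past review:

```typescript
// Flag raw hex color literals (3-8 hex digits) in a source string.
function findRawHexColors(source: string): string[] {
  return source.match(/#[0-9a-fA-F]{3,8}\b/g) ?? [];
}
```

Wired into CI or a lint rule, this turns "please use tokens" from a convention into a failing check.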
Real Workflow: Dashboard UI Generation
Let's walk through a full workflow:
Traditional Approach (Old)
Time: ~5.5 hours
Steps:
- Designer creates dashboard mockup in Figma (1 hour)
- Developer inspects Figma, takes notes (30 min)
- Developer codes sidebar, header, cards (2 hours)
- Designer reviews: "Colors and spacing don't match" (30 min)
- Developer fixes (1 hour)
- Designer approves (30 min)
Total: 5.5 hours, multiple review cycles
Token-Based AI Approach (New)
Time: ~20 minutes (after one-time setup)
Steps:
- Designer creates mockup + ensures Figma Variables are used (1 hour) [one-time setup, reused for all features]
- Export tokens: `npx figma-variables-export` (1 min)
- Developer commits tokens to repo (1 min)
- Developer prompts AI: "Generate a dashboard with sidebar, header, stats cards, and data table using our design system" (2 min)
- AI reads tokens, generates code (1 min)
- Developer reviews generated code (10 min)
- Designer reviews: matches Figma perfectly (5 min)
- Ship
Total: 20 minutes of developer time (after one-time Figma setup)
Key difference: AI does the translation work, using exact token values from Figma.
FramingUI's Approach: End-to-End Automation
FramingUI handles the entire pipeline:
1. Token Definition
{
"color": {
"primary": { "500": { "value": "#3b82f6", "type": "color" } }
},
"spacing": {
"md": { "value": "16px", "type": "dimension" }
}
}
2. Build-Time Generation
npx framingui build
Generates:
- ✅ CSS variables (`:root { --color-primary-500: #3b82f6; }`)
- ✅ Tailwind config (`colors: { primary: { 500: '#3b82f6' } }`)
- ✅ TypeScript types (`type ColorToken = 'primary-500' | ...`)
- ✅ Documentation (auto-generated token reference)
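The type-generation step can be sketched as deriving a union type from token names. This is an assumption about how such a build step might work; the real `framingui build` output may differ.

```typescript
// Derive a TypeScript union type declaration from a list of token names.
const colorTokens = ["primary-400", "primary-500", "primary-600"];

function toUnionType(name: string, members: string[]): string {
  return `type ${name} = ${members.map((m) => `'${m}'`).join(" | ")};`;
}

const decl = toUnionType("ColorToken", colorTokens);
```

Generated types like this make invalid token names a compile error rather than a runtime surprise.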
3. Runtime AI Access (MCP Server)
npx @framingui/mcp-server
AI tools (Claude Code, Cursor) query tokens in real-time:
AI → MCP Server → Tokens → AI generates code using tokens
4. Always Up-to-Date
Update tokens.json → run framingui build → AI automatically sees new values.
No manual propagation. No stale docs. No "which version is correct?"
Practical Tips for Token-Based Figma Workflow
1. Use Figma Variables for Everything
Don't mix raw values and variables in the same file. If you use variables for colors, use them for spacing too.
Bad:
Button padding: 16px (hardcoded)
Button background: {color/primary/500} (variable)
Good:
Button padding: {spacing/md} (variable)
Button background: {color/primary/500} (variable)
2. Name Variables Semantically
Bad naming:
color/blue/light → Used for primary buttons? Info badges? Links? Unclear.
Good naming:
color/primary/500 → Primary brand color
color/info/bg → Info badge background
color/link/default → Default link color
Semantic names help AI understand intent, not just values.
3. Document Token Purpose
Add descriptions to your tokens:
{
"color": {
"primary": {
"500": {
"value": "#3b82f6",
"type": "color",
"description": "Primary brand color. Use for CTAs, links, focus states."
}
}
}
}
When AI reads this, it knows when to use each token, not just what the value is.
4. Keep Tokens in Sync
Set up automation:
# In your CI/CD pipeline or git hooks
npx figma-variables-export --file-id $FIGMA_FILE_ID --output tokens.json
git diff tokens.json
If tokens changed, commit and push. This keeps design and code in sync automatically.
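A validation pass before the commit can catch a bad export early. The check below is an assumption, not a FramingUI feature: it verifies that every leaf in the exported file has both a `value` and a `type`.

```typescript
// Recursively validate a tokens.json-style tree: every leaf (an object
// with a string "value") must also declare a "type".
type Leaf = { value?: string; type?: string };

function validateTokens(node: Record<string, unknown>, path = ""): string[] {
  const errors: string[] = [];
  for (const [key, child] of Object.entries(node)) {
    const here = path ? `${path}.${key}` : key;
    const leaf = child as Leaf;
    if (typeof leaf.value === "string") {
      if (!leaf.type) errors.push(`${here}: missing "type"`);
    } else {
      errors.push(...validateTokens(child as Record<string, unknown>, here));
    }
  }
  return errors;
}
```

Running this in the same CI step and failing the build on a non-empty error list keeps malformed tokens out of the repo.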
Case Study: Building a Dashboard with AI
Let's compare both workflows side-by-side.
Scenario: SaaS Dashboard with Sidebar, Cards, Data Table
Traditional workflow:
| Step | Traditional | Time |
|---|---|---|
| 1. Design in Figma | Designer creates mockup | 2 hours |
| 2. Handoff | Designer annotates, writes specs | 1 hour |
| 3. Developer inspects | Open Figma, take notes | 30 min |
| 4. Code sidebar | Manual HTML/CSS/React | 1 hour |
| 5. Code cards | Manual HTML/CSS/React | 1 hour |
| 6. Code table | Manual HTML/CSS/React | 1.5 hours |
| 7. Designer review | "Spacing is wrong, colors off" | 30 min |
| 8. Fix issues | Developer adjusts code | 1 hour |
| 9. Final review | Designer approves | 30 min |
| Total | | 9 hours |
Token-based AI workflow:
| Step | Token-Based + AI | Time |
|---|---|---|
| 1. Design in Figma | Designer creates mockup with Variables | 2 hours |
| 2. Export tokens | npx figma-variables-export | 2 min |
| 3. Commit tokens | git add tokens.json && git commit | 1 min |
| 4. AI generation | "Build dashboard with sidebar, cards, table" | 2 min |
| 5. Developer review | Check generated code, minor tweaks | 20 min |
| 6. Designer review | Matches Figma perfectly | 10 min |
| Total | | ~2.5 hours |
Time saved: ~6.5 hours (roughly a 70% reduction)
Quality: Higher consistency, fewer bugs
Iterations: 1 review cycle instead of 2-3
How AI Understands Your Design System
Without Token Context
Prompt: "Create a card component"
AI's knowledge:
- Generic card patterns from training data
- Common Tailwind/CSS conventions
- No idea what YOUR design system looks like
Output:
<div className="bg-white p-4 rounded shadow">
<h3 className="text-xl font-bold">Title</h3>
<p className="text-gray-600 mt-2">Content</p>
</div>
Problems:
- `bg-white` might not be your card background token
- `p-4` might not match your spacing scale
- `rounded` might not match your border radius system
- `text-gray-600` might not be your secondary text color
With Token Context (MCP)
Prompt: "Create a card component using our design system"
AI reads via MCP:
{
"component.card.background": "color.surface.primary",
"component.card.padding": "spacing.lg",
"component.card.border.radius": "border.radius.md",
"component.card.shadow": "shadow.sm",
"color.text.primary": "color.gray.900",
"color.text.secondary": "color.gray.600"
}
Output:
<div className="bg-surface-primary p-6 rounded-md shadow-sm">
<h3 className="text-lg font-semibold text-gray-900">Title</h3>
<p className="text-sm text-gray-600 mt-4">Content</p>
</div>
Result: Every class maps to a token. Perfect alignment with your design system.
Setting Up the Token-Based Workflow
Prerequisites
- Figma file using Figma Variables (or styles + Tokens Studio plugin)
- FramingUI or Style Dictionary installed
- Claude Code or Cursor with MCP support
Step 1: Export Figma Variables
Option A: Figma REST API
npm install figma-variables-export
npx figma-variables-export \
--file-id YOUR_FIGMA_FILE_ID \
--token YOUR_FIGMA_TOKEN \
--output tokens.json
Option B: Tokens Studio Plugin
- Install Tokens Studio for Figma
- Define tokens in the plugin
- Export as JSON
Option C: FramingUI Figma Plugin (coming soon)
One-click export from Figma to FramingUI-compatible tokens.
Step 2: Initialize FramingUI
cd your-project
npx framingui init --tokens tokens.json
This generates:
- CSS variables
- Tailwind config
- TypeScript types
- MCP server config
Step 3: Connect AI Tool
For Claude Code:
// ~/Library/Application Support/Claude/claude_desktop_config.json
{
"mcpServers": {
"framingui": {
"command": "npx",
"args": ["@framingui/mcp-server"],
"env": {
"TOKENS_PATH": "/path/to/your-project/tokens.json"
}
}
}
}
For Cursor:
// .cursor/mcp.json
{
"mcpServers": {
"framingui": {
"command": "npx",
"args": ["@framingui/mcp-server"],
"env": {
"TOKENS_PATH": "./tokens.json"
}
}
}
}
Step 4: Generate Code with AI
Now when you prompt AI:
"Build a pricing card component with a title, price, features list, and CTA button"
AI will:
- Query design tokens via MCP
- See available colors, spacing, typography, border radius
- Generate component using your exact tokens
- Follow naming conventions from your system
No manual cleanup needed.
What This Changes
For Designers
- Design in Figma as usual, but using Variables
- Export tokens (one command or plugin click)
- Commit to repo
- Done. Code automatically matches design.
For Developers
- Pull latest tokens
- Describe component to AI
- AI generates code using tokens
- Review and ship
- No more pixel-perfect translation work.
For Teams
- Design and code stay in sync automatically
- Fewer Slack messages asking "what color is this?"
- Design changes propagate to code in minutes, not days
- Faster iteration cycles
Limitations & Gotchas
AI Isn't Perfect (Yet)
AI will generate structurally correct code using your tokens, but:
- Layout might need tweaking: AI guesses flex/grid structure from descriptions
- Accessibility requires review: AI adds ARIA labels, but double-check
- Complex interactions: State management, animations, edge cases need human review
Think of AI as a junior developer who knows your design system perfectly but needs guidance on architecture.
Tokens Don't Capture Everything
Figma has:
- Auto Layout rules (flex-grow, alignment)
- Constraints (responsive behavior)
- Component variants and properties
- Interactions and prototyping
Design tokens capture values (colors, spacing), not behavior (layout rules, interactions).
You still need to describe layout and behavior to AI:
"Create a card with flexbox layout, image on left (fixed 120px width),
content on right (flex-grow), and a button anchored to bottom-right"
Not All Figma → Token Mappings Are Automatic
Figma Variables work great for:
- ✅ Colors
- ✅ Numbers (spacing, sizing, opacity)
- ✅ Strings (font families, content)
But Figma also has:
- ⚠️ Text Styles (font family, size, weight, line-height)
- ⚠️ Effect Styles (shadows, blurs)
- ⚠️ Grid Styles
These require extra tooling (Tokens Studio plugin or custom scripts) to export as structured tokens.
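A custom script for that gap might flatten each Figma Text Style into separate typed tokens. The shape below is one possible convention (an assumption; Tokens Studio and other exporters use their own formats):

```typescript
// Flatten a Figma Text Style into individual typed design tokens.
interface FigmaTextStyle {
  fontFamily: string;
  fontSize: number;   // px
  fontWeight: number;
  lineHeight: number; // px
}

function textStyleToTokens(name: string, style: FigmaTextStyle) {
  return {
    [`font.${name}.family`]: { value: style.fontFamily, type: "fontFamily" },
    [`font.${name}.size`]: { value: `${style.fontSize}px`, type: "dimension" },
    [`font.${name}.weight`]: { value: String(style.fontWeight), type: "fontWeight" },
    [`font.${name}.lineHeight`]: { value: `${style.lineHeight}px`, type: "dimension" },
  };
}
```

Splitting composite styles this way keeps each property individually addressable by the AI, the same as any other token.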
The Future: Figma → Production in One Step
Imagine this workflow (not far off):
- Designer creates component in Figma
- AI watches Figma file for changes (via Figma Webhooks)
- AI detects new component → exports tokens → generates React code → opens PR
- Developer reviews PR, merges
- Component goes live
Human involvement: Review and approve.
Manual work: Zero.
This is where the industry is heading. FramingUI is building the infrastructure to make it happen.
Get Started
Try It Now
# Install FramingUI
npm install @framingui/core @framingui/mcp-server
# Initialize with sample tokens
npx framingui init
# Start MCP server
npx @framingui/mcp-server
Configure Claude Code or Cursor to connect, then prompt:
"Show me the available design tokens"
AI will list all your tokens with descriptions. Then try:
"Build a profile card component using these tokens"
Watch AI generate on-brand code automatically.
Conclusion
The traditional Figma-to-code workflow is slow, manual, and error-prone. AI can automate it—but only if AI has access to your design system.
Design tokens are the bridge. They give AI the context it needs to generate production-ready, on-brand code that matches Figma designs perfectly.
The workflow shift:
- Old: Designer → Figma → Handoff doc → Developer manually translates → Code
- New: Designer → Figma Variables → Tokens → AI generates code → Done
FramingUI makes this workflow effortless. Export tokens from Figma, connect AI via MCP, and let AI handle the translation work.
Your role shifts from pixel-pushing to architecture and review. That's where human creativity belongs.
Ready to eliminate design handoff hell? Try FramingUI.