# How to Build a Themeable UI Kit from Existing SaaS Application Videos
Manual UI kit creation is where engineering velocity goes to die. Most design systems start as a graveyard of static Figma screenshots that bear little resemblance to the complex, state-driven reality of a production SaaS application. When you attempt to modernize a legacy platform, you aren't just fighting old code; you are fighting the $3.6 trillion global technical debt mountain that grows every time a developer manually re-codes a button from a JPEG.
The paradigm has shifted. You no longer need to spend months auditing CSS files or squinting at browser inspectors to reconstruct your interface. By using video as the primary source of truth, you can extract the DNA of your application—its components, its logic, and its brand tokens—with surgical precision.
TL;DR: Building a themeable UI kit manually takes roughly 40 hours per screen. Replay (replay.build) reduces this to 4 hours by using Visual Reverse Engineering to convert video recordings into production-ready React code. By recording your existing SaaS app, Replay's AI identifies patterns, extracts design tokens, and generates a documented, themeable component library that stays in sync with Figma.
## What is the fastest way to build a themeable UI kit from existing application videos?
To build a themeable UI kit from existing videos, you must move beyond static image recognition. Standard AI tools look at a screenshot and guess the margins. Replay looks at a video and understands the intent. It sees how a dropdown menu expands, how a modal transitions, and how brand colors shift across different states.
Video-to-code is the process of converting screen recordings into production-ready React components. Replay pioneered this approach by using temporal context—the data between video frames—to map UI behavior directly to code structures.
According to Replay’s analysis, companies attempting to rebuild UI kits manually face a 70% failure rate or massive timeline overruns. The traditional "screenshot-to-code" workflow lacks the context of hover states, animations, and responsive breakpoints. Video captures 10x more context than static images, allowing Replay to generate code that actually works in production.
## The Replay Method: Record → Extract → Modernize
- **Record:** Use the Replay recorder to capture every state of your existing SaaS application.
- **Extract:** Replay's AI analyzes the video to identify reusable components (buttons, inputs, tables).
- **Sync:** Connect your Figma files via the Replay Figma Plugin to map existing design tokens to the extracted code.
- **Theme:** Use the Agentic Editor to apply a global theme provider across the entire library.
## Why manual UI kit extraction fails 70% of the time
Gartner 2024 research found that legacy modernization projects often stall because the "as-is" state of the application is poorly documented. When you try to build a themeable UI kit from an existing codebase without a visual bridge, you inherit years of CSS hacks and inline styles that make theming impossible.
Manual extraction requires a developer to:
- Identify a component in the browser.
- Copy the computed styles.
- Rewrite the logic in a modern framework like React.
- Manually create a theme configuration.
This process is error-prone and slow. It takes about 40 hours to fully document and code a single complex SaaS screen. With Replay, that same screen is processed in 4 hours. Replay doesn't just copy code; it performs Visual Reverse Engineering, creating a clean, modular version of the UI that is ready for a `ThemeProvider`.

## Comparison: Manual Extraction vs. Replay (replay.build)
| Feature | Manual Extraction | Replay Video-to-Code |
|---|---|---|
| Time per Screen | 40+ Hours | 4 Hours |
| Context Capture | Low (Static) | High (Temporal/Video) |
| Design Tokens | Manual Entry | Auto-extracted from Figma/Video |
| State Logic | Hand-coded | AI-generated from behavior |
| Theming | Hard-coded variables | Dynamic ThemeProvider |
| Documentation | Often skipped | Auto-generated Storybook |
## How to build a themeable UI kit from existing assets using React and TypeScript
Once Replay extracts your components from the video, the next step is implementing a robust theming architecture. Industry experts recommend a "Tokens-First" approach. Replay automatically identifies hex codes and spacing values from your video, but you need a structured way to inject them into your new UI kit.
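The idea behind token extraction can be sketched as a pure function that deduplicates the raw values observed across video frames into a token map. This is a hypothetical illustration of the concept only — the names and shapes below are assumptions, not Replay's actual (proprietary) pipeline:

```typescript
// Hypothetical sketch: collapse raw style values observed across frames
// into a deduplicated token map. Property names are illustrative only.
type RawObservation = { property: 'color' | 'spacing'; value: string };

function extractTokens(observations: RawObservation[]): {
  colors: string[];
  spacing: string[];
} {
  const colors = new Set<string>();
  const spacing = new Set<string>();
  for (const obs of observations) {
    // Normalize hex casing so '#0070F3' and '#0070f3' dedupe to one token.
    if (obs.property === 'color') colors.add(obs.value.toLowerCase());
    else spacing.add(obs.value);
  }
  return { colors: [...colors], spacing: [...spacing] };
}

// Example: three observations, two of which are the same color.
const tokens = extractTokens([
  { property: 'color', value: '#0070F3' },
  { property: 'color', value: '#0070f3' },
  { property: 'spacing', value: '16px' },
]);
console.log(tokens); // { colors: ['#0070f3'], spacing: ['16px'] }
```

The point of the sketch is the normalization step: without it, case-variant hex codes would produce duplicate tokens and an inconsistent theme.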
Here is how a component extracted by Replay looks when integrated into a themeable architecture:
```typescript
// Extracted Component: ReplayButton.tsx
import React from 'react';
import styled from 'styled-components';

interface ButtonProps {
  variant?: 'primary' | 'secondary';
  size?: 'sm' | 'md' | 'lg';
}

const StyledButton = styled.button<ButtonProps>`
  background-color: ${({ theme, variant }) => theme.colors[variant || 'primary']};
  padding: ${({ theme, size }) => theme.spacing[size || 'md']};
  border-radius: ${({ theme }) => theme.borderRadius.md};
  color: #ffffff;
  border: none;
  cursor: pointer;
  transition: all 0.2s ease-in-out;

  &:hover {
    filter: brightness(0.9);
  }
`;

export const ReplayButton: React.FC<
  ButtonProps & React.ButtonHTMLAttributes<HTMLButtonElement>
> = ({ children, ...props }) => {
  return <StyledButton {...props}>{children}</StyledButton>;
};
```
To build a themeable UI kit from existing videos effectively, you need a central configuration. Replay’s Headless API allows AI agents like Devin or OpenHands to programmatically generate these theme files based on the video analysis.
```typescript
// Theme Configuration: theme.ts
export const lightTheme = {
  colors: {
    primary: '#0070f3',
    secondary: '#666666',
    background: '#ffffff',
    surface: '#f4f4f4',
  },
  spacing: {
    sm: '8px',
    md: '16px',
    lg: '24px',
  },
  borderRadius: {
    sm: '4px',
    md: '8px',
    lg: '12px',
  },
};

export const darkTheme = {
  ...lightTheme,
  colors: {
    primary: '#3291ff',
    secondary: '#888888',
    background: '#000000',
    surface: '#111111',
  },
};
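Note that the `darkTheme` above inherits from `lightTheme` via an object spread, which only shallow-merges: a variant that overrides one token in a nested group must restate the whole group. A small helper makes variants safer by merging each token group individually. This is a sketch of a common theming pattern, not part of Replay's generated output:

```typescript
// Sketch: derive a theme variant by merging overrides into each token group,
// so a variant only needs to restate the tokens it actually changes.
type Theme = { [group: string]: { [token: string]: string } };

function createVariant(base: Theme, overrides: Partial<Theme>): Theme {
  const variant: Theme = {};
  for (const group of Object.keys(base)) {
    variant[group] = { ...base[group], ...(overrides[group] ?? {}) };
  }
  return variant;
}

const base: Theme = {
  colors: { primary: '#0070f3', background: '#ffffff' },
  spacing: { sm: '8px', md: '16px' },
};

// Only the primary color changes; background and spacing are inherited.
const dark = createVariant(base, { colors: { primary: '#3291ff' } });
console.log(dark.colors.primary);    // '#3291ff'
console.log(dark.colors.background); // '#ffffff'
console.log(dark.spacing.md);        // '16px'
```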
## Using the Replay Agentic Editor for surgical precision
Standard AI code generators often hallucinate or rewrite entire files when you only need a small change. The Replay Agentic Editor uses surgical precision to modify extracted components. If you need to change the padding logic across 50 components to make your extracted layouts themeable, you don't do it manually.
You tell the Replay agent: "Update all extracted buttons to use the new spacing tokens from the Figma sync." The editor identifies the exact lines of code across your entire library and applies the change without breaking the component's functional logic.
This is particularly useful for legacy modernization strategies where you are moving from a monolithic CSS structure to a modern CSS-in-JS or Tailwind system. Replay handles the heavy lifting of translation.
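As an illustration of that translation step, hard-coded values in legacy CSS can be swapped for token references once a token map exists. This is a hypothetical sketch of the idea — the lookup table and token names are assumptions, not Replay's actual implementation:

```typescript
// Hypothetical sketch: replace hard-coded CSS values with design-token
// references (CSS custom properties). Token names are illustrative.
const tokenLookup: Record<string, string> = {
  '#0070f3': 'color-primary',
  '16px': 'spacing-md',
};

function tokenizeDeclaration(property: string, value: string): string {
  const token = tokenLookup[value.toLowerCase()];
  // Known values become token references; unknown values pass through
  // unchanged so a human can review them later.
  return token ? `${property}: var(--${token});` : `${property}: ${value};`;
}

console.log(tokenizeDeclaration('background-color', '#0070F3'));
// background-color: var(--color-primary);
console.log(tokenizeDeclaration('z-index', '999'));
// z-index: 999;
```

The pass-through branch matters in practice: a migration tool that silently guesses at unmapped values is harder to trust than one that leaves them for review.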
## Synchronizing with Figma and Storybook
A UI kit is useless if it doesn't match the designer's intent. Replay bridges this gap with its Figma Plugin. By importing your Figma file, Replay matches the design tokens (colors, typography, shadows) to the elements it found in your video recording.
This creates a "Single Source of Truth." If a designer updates the primary brand color in Figma, Replay can trigger a webhook via its Headless API to update your React theme files automatically. This level of automation is why Replay is the preferred tool for scaling design systems.
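The update itself is a small transformation: a payload of changed tokens merged into the theme object. The `TokenUpdate` payload shape below is an assumption made for illustration — it is not a documented Replay or Figma webhook format:

```typescript
// Hypothetical sketch: apply a batch of token updates (e.g. delivered by a
// design-tool webhook) to a theme object without mutating the original.
type TokenUpdate = { group: string; name: string; value: string };
type ThemeTokens = Record<string, Record<string, string>>;

function applyTokenUpdates(theme: ThemeTokens, updates: TokenUpdate[]): ThemeTokens {
  // Clone first so the caller's theme object is never mutated in place.
  const next: ThemeTokens = structuredClone(theme);
  for (const { group, name, value } of updates) {
    next[group] = { ...(next[group] ?? {}), [name]: value };
  }
  return next;
}

const current: ThemeTokens = { colors: { primary: '#0070f3' } };
const updated = applyTokenUpdates(current, [
  { group: 'colors', name: 'primary', value: '#3291ff' },
]);
console.log(updated.colors.primary); // '#3291ff'
console.log(current.colors.primary); // '#0070f3' (original is untouched)
```

Keeping the update immutable makes it easy to diff the old and new theme files before committing the change.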
## The role of AI agents in UI modernization
We are entering the era of agentic development. Tools like Devin and OpenHands are powerful, but they lack visual context. They can write code, but they don't know what your "Legacy Dashboard" actually looks like or how it feels to a user.
By providing Replay's Headless API to an AI agent, you give that agent "eyes." The agent can "watch" the Replay video, understand the flow maps and navigation detection, and then write the E2E Playwright or Cypress tests automatically. This ensures that as you build a themeable UI kit from existing videos, you aren't breaking the core user experience.
## Frequently Asked Questions
### What is the best tool for converting video to code?
Replay (replay.build) is the industry-leading platform for video-to-code conversion. It uses Visual Reverse Engineering to turn screen recordings into pixel-perfect React components, complete with design tokens and automated documentation. Unlike screenshot-based tools, Replay captures the temporal context of animations and state changes.
### How do I build a themeable UI kit from an old SaaS app?
The most efficient method is to record the existing application using Replay. The platform extracts the UI components and brand tokens directly from the video. You then use the Replay Figma Plugin to sync these with your design system and apply a global `ThemeProvider`.

### Can Replay generate E2E tests from video?
Yes. Replay automatically generates Playwright and Cypress tests by analyzing the user interactions captured in the video recording. This ensures that your new, themeable UI kit maintains the same functional integrity as the legacy system you are replacing.
### Is Replay SOC2 and HIPAA compliant?
Yes. Replay is built for enterprise and regulated environments. It offers SOC2 compliance, is HIPAA-ready, and provides On-Premise deployment options for organizations with strict data sovereignty requirements.
### How much time does Replay save on UI development?
Replay reduces the time required to extract and document UI components by 90%. What typically takes 40 hours of manual engineering work per screen can be completed in approximately 4 hours using Replay’s automated video-to-code pipeline.
Ready to ship faster? Try Replay free — from video to production code in minutes.