
With the rise of AI, it’s natural to ask: what if you could generate a complete Dark Mode design system—tokens, documentation, and a high-fidelity prototype—from scratch? The answer: it’s possible, but not quite as seamless as you might imagine.
To understand why, let’s break the challenge down into three key steps.
Here’s how AI helped compress what normally takes days (or weeks) into minutes.
Instead of inventing colors from scratch, the AI acted like a design engineer.
It began by analysing the light-mode variables (a JSON file or even screenshots would have worked as input), then:
Identified core brand colours (Primary Purple: #5F4DBC)
Classified semantic roles (backgrounds, surfaces, text, accents)
Then it applied logic-based transformations to generate Dark Mode:
Surfaces: Inverted lightness (White → #121214)
Brand: Tuned saturation and brightness for dark environments (#5F4DBC → #816EDE)
Accessibility: Automatically validated contrast against WCAG AA standards
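The transformation and validation steps above can be sketched in a few lines of Python. The WCAG contrast math is standard; the hex values are the ones cited above, and `lighten` is a simplified stand-in for the more nuanced saturation and brightness tuning described:

```python
import colorsys

def hex_to_rgb(hex_color):
    """Parse '#RRGGBB' into three 0-1 floats."""
    h = hex_color.lstrip("#")
    return tuple(int(h[i:i + 2], 16) / 255 for i in (0, 2, 4))

def relative_luminance(rgb):
    """WCAG 2.x relative luminance of an sRGB color."""
    def linear(c):
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg_hex, bg_hex):
    """WCAG contrast ratio between two hex colors (1:1 to 21:1)."""
    l1 = relative_luminance(hex_to_rgb(fg_hex))
    l2 = relative_luminance(hex_to_rgb(bg_hex))
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

def lighten(hex_color, amount):
    """Raise HLS lightness by `amount` (0-1), clamped at 1 -- a crude
    version of tuning a brand color for dark surfaces."""
    r, g, b = hex_to_rgb(hex_color)
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    r2, g2, b2 = colorsys.hls_to_rgb(h, min(1.0, l + amount), s)
    return "#{:02X}{:02X}{:02X}".format(round(r2 * 255), round(g2 * 255), round(b2 * 255))

# The tuned brand purple on the dark surface clears the 4.5:1 AA threshold:
print(round(contrast_ratio("#816EDE", "#121214"), 2))  # ≈ 4.65
```

This is exactly the kind of mechanical check that is tedious by hand but trivial to automate, which is why the AI could validate every generated pair against WCAG AA in one pass.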

Moving design tokens between engineering and Figma usually means exporting/importing JSON or relying on plugins.
To see how much AI could streamline this workflow, I put it to the test.
In practice, standard token formats didn’t import cleanly into the existing Figma setup. The most reliable solution still came down to manual work—creating variables by hand. That’s the current reality.
However, to bridge the gap between raw token data and visual understanding, you can ask the AI to generate a design doc that mirrors the visual utility of Figma documentation.
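For illustration, a minimal Python sketch of that bridge: it reads a token file (the shape below is hypothetical, loosely modeled on the Design Tokens Community Group draft, with per-theme values) and renders it as a plain-text doc table:

```python
import json

# Hypothetical token file; only the purple and surface hex values
# come from the article, the structure is illustrative.
TOKENS = """
{
  "color": {
    "brand":   {"$value": {"light": "#5F4DBC", "dark": "#816EDE"}},
    "surface": {"$value": {"light": "#FFFFFF", "dark": "#121214"}}
  }
}
"""

def tokens_to_doc(raw):
    """Render a plain-text doc table: token | light | dark."""
    data = json.loads(raw)
    rows = ["token | light | dark"]
    for group, tokens in data.items():
        for name, body in tokens.items():
            value = body["$value"]
            rows.append(f"{group}.{name} | {value['light']} | {value['dark']}")
    return "\n".join(rows)

print(tokens_to_doc(TOKENS))
```

Even when the tokens themselves have to be recreated by hand in Figma, a generated table like this gives everyone a shared, readable source of truth to copy from.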

With the tokens in place, the next step was to see how they actually translated into the existing UI—starting from the light theme and moving toward Dark Mode.
This meant building out detailed previews, including:
Trade inputs
“Best Price” toggle
Chart lines
Layout spacing
Some custom assets—like avatars created in SVG—aren’t always easy for AI to reproduce perfectly on the first pass. That required a few iterations to resolve visual inconsistencies and fine-tune the details.
Still, one especially useful outcome was generating a practical cheat sheet that maps every UI element to its underlying token. This “variable map” can become a clear guide for engineering:
Trade Card → background surface
Text styles → semantic typography variables
Components → design tokens
Having this mapping in place makes implementation far more straightforward—and removes much of the guesswork for developers.
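A minimal sketch of such a variable map in Python (the element names, token names, and non-brand hex values are hypothetical; only the purple and surface colors come from the tokens above):

```python
# Hypothetical semantic tokens resolved to per-theme hex values.
TOKEN_VALUES = {
    "surface.card": {"light": "#FFFFFF", "dark": "#121214"},
    "text.primary": {"light": "#1A1A1E", "dark": "#F4F4F6"},  # invented values
    "accent.brand": {"light": "#5F4DBC", "dark": "#816EDE"},
}

# The "variable map": every UI element points at a semantic token,
# never at a raw hex value.
ELEMENT_TO_TOKEN = {
    "trade_card_background": "surface.card",
    "trade_card_title":      "text.primary",
    "best_price_toggle_on":  "accent.brand",
}

def resolve(element, theme):
    """Look up the concrete color an element should use in a given theme."""
    return TOKEN_VALUES[ELEMENT_TO_TOKEN[element]][theme]

print(resolve("trade_card_background", "dark"))  # -> #121214
```

Because elements reference tokens rather than colors, switching themes is a single lookup change, which is the whole point of the cheat sheet.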
The final UI result covers both light & dark themes.

The AI didn’t just write code; it became a design partner. It participated in the entire product design lifecycle:
Extracting requirements
Designing accessible systems
Working around tooling compatibility (though you’d still need manual workarounds to fix some issues)
Producing documentation
Building production-grade prototypes