How hue works
From a brand, to a skill, to every UI your AI builds.
The three steps
1. Drop a URL or screenshot
Point hue at the source. It can be a live URL, a Figma screenshot, a moodboard, or even a written brand brief. hue handles all four as inputs to the same pipeline.
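The claim above is that four different kinds of input feed one pipeline. A minimal sketch of what that normalization could look like, assuming hypothetical names (`classify_source` and the source kinds are illustrative, not hue's actual API):

```python
from pathlib import Path
from urllib.parse import urlparse

# Illustrative sketch, not hue's real code: every input resolves to one
# source kind that the same extraction pipeline consumes.
IMAGE_EXTS = {".png", ".jpg", ".jpeg", ".webp"}

def classify_source(source: str) -> str:
    """Map a raw input string to a pipeline source kind."""
    if urlparse(source).scheme in ("http", "https"):
        return "url"          # live site
    if Path(source).suffix.lower() in IMAGE_EXTS:
        return "screenshot"   # Figma export, moodboard, etc.
    return "brief"            # written brand brief

print(classify_source("https://example.com"))  # url
print(classify_source("moodboard.png"))        # screenshot
```

Whatever hue actually does internally, the point is the same: the caller never branches on input type; the pipeline does.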
2. hue captures it as a system
hue extracts the visible identity (colors, type, radii, motion, voice) and resolves it into a structured design language. The output is not a snapshot — it is a system, with semantic tokens that the model can reason about.
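What makes the output a system rather than a snapshot is that semantic tokens can reference other tokens, so a role resolves to a value instead of hard-coding it. A minimal sketch of that idea, with an assumed token naming scheme (this is not hue's documented schema):

```python
# Hypothetical token table: a snapshot would store raw hex values everywhere;
# a system binds roles to a base token, so one change ripples through.
tokens = {
    "color.brand.primary": "#ff5c39",                    # base value
    "color.button.background": "{color.brand.primary}",  # semantic alias
    "color.link": "{color.brand.primary}",
}

def resolve(name: str) -> str:
    """Follow alias references until a concrete value is reached."""
    value = tokens[name]
    while value.startswith("{") and value.endswith("}"):
        value = tokens[value[1:-1]]
    return value

print(resolve("color.button.background"))  # #ff5c39
```

Because the aliases are explicit, a model given this structure can answer questions like "what color are buttons, and why" rather than just repeating a hex code.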
3. Every UI your AI builds matches
From that point on, your assistant uses the hue-generated skill as its design context. Ask it to build a settings page, a marketing landing, or a dashboard component — the visual language is preserved.
What hue generates
For every brand, hue writes six files into the skill folder:
SKILL.md — the entry point the assistant reads
design-model.yaml — structured semantic tokens (colors, type, radii, spacing, motion)
tokens.md — human-readable token reference
component-library.html — a generated component library you can browse
landing-page.html — a marketing landing in the brand language
preview.html — a visual cheat sheet
All six are static files, all six live on disk, and all six are version-controllable.
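Since all six files are static and on disk, a build or CI step can sanity-check a generated skill folder. A small sketch, assuming only the file names listed above (the folder layout is taken from this page, not from a hue API):

```python
from pathlib import Path

# The six files this page says hue writes into a skill folder.
EXPECTED = {
    "SKILL.md", "design-model.yaml", "tokens.md",
    "component-library.html", "landing-page.html", "preview.html",
}

def missing_files(skill_dir: str) -> set[str]:
    """Return the expected files not present in the skill folder."""
    path = Path(skill_dir)
    present = {p.name for p in path.iterdir()} if path.is_dir() else set()
    return EXPECTED - present
```

An empty return set means the folder is complete; anything else names exactly what the generator (or a bad merge) dropped.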
An example output
The Halcyon brand page is one of the 17 example skills shipped with hue. Same generator, same output structure, fictional brand.
Try it
Install hue and run it against any brand of your choice. It takes about one minute.