I wanted a simple demo to show what a single prompt can produce with GPT 5.2.
The result is static/planet-demo.html: a self-contained Solar System Explorer
with a canvas renderer, a small control panel, and planet info cards.
No build step. One file.
This is not a product. It is a proof of speed: idea to interactive artifact in minutes.
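To make that concrete, here is a rough sketch of the kind of canvas loop that sits at the heart of a demo like this. It is not the generated code; the element id, the planet table, and the styling are invented for illustration.

```javascript
const canvas = document.getElementById('scene');   // hypothetical id
const ctx = canvas.getContext('2d');

// Invented sample data: orbit radius in pixels, period in Earth days.
const planets = [
  { name: 'Mercury', radius: 3, orbit: 60,  period: 88,   angle: 0 },
  { name: 'Earth',   radius: 5, orbit: 120, period: 365,  angle: 0 },
  { name: 'Jupiter', radius: 9, orbit: 220, period: 4333, angle: 0 },
];

let speed = 1, zoom = 1, paused = false;  // driven by the HUD and keyboard, wired later

function frame() {
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  const cx = canvas.width / 2, cy = canvas.height / 2;

  // Sun
  ctx.fillStyle = '#fc6';
  ctx.beginPath();
  ctx.arc(cx, cy, 14, 0, Math.PI * 2);
  ctx.fill();

  for (const p of planets) {
    if (!paused) p.angle += (speed / p.period) * 2;   // shorter period = faster orbit

    // Orbit ring
    ctx.strokeStyle = 'rgba(255,255,255,0.15)';
    ctx.beginPath();
    ctx.arc(cx, cy, p.orbit * zoom, 0, Math.PI * 2);
    ctx.stroke();

    // Planet and label; the position is cached for click hit-testing
    p.x = cx + Math.cos(p.angle) * p.orbit * zoom;
    p.y = cy + Math.sin(p.angle) * p.orbit * zoom;
    ctx.fillStyle = '#9cf';
    ctx.beginPath();
    ctx.arc(p.x, p.y, p.radius, 0, Math.PI * 2);
    ctx.fill();
    ctx.fillStyle = '#ccc';
    ctx.fillText(p.name, p.x + p.radius + 4, p.y);
  }
  requestAnimationFrame(frame);
}
requestAnimationFrame(frame);
```

The shape is the whole trick: a small data table, an animation loop that advances each orbit angle, and a few arcs and labels per frame. Caching each planet's drawn position also sets up the click-to-inspect behavior later.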
The prompt, in spirit
The ask was straightforward:
- One HTML file, no dependencies
- Interactive solar system with orbits and labels
- A small HUD with speed and zoom controls
- Click a planet to see a facts panel
- Simple keyboard shortcuts for pause and reset
That is the level of specificity that keeps the output coherent and useful.
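Those bullets translate almost one-to-one into event listeners. Here is a sketch of the HUD and keyboard wiring, continuing from the loop above; the slider ids and key bindings are assumptions, not necessarily what the generated file uses.

```javascript
const speedSlider = document.getElementById('speed'); // hypothetical <input type="range">
const zoomSlider  = document.getElementById('zoom');  // hypothetical <input type="range">

speedSlider.addEventListener('input', () => { speed = Number(speedSlider.value); });
zoomSlider.addEventListener('input',  () => { zoom  = Number(zoomSlider.value); });

document.addEventListener('keydown', (e) => {
  if (e.key === ' ') paused = !paused;          // space toggles pause
  if (e.key === 'r') {                          // r resets speed, zoom, and orbits
    speed = 1; zoom = 1; paused = false;
    for (const p of planets) p.angle = 0;
  }
});
```

The render loop only advances orbits while `paused` is false and multiplies orbit radii by `zoom`, so these handlers never have to touch the drawing code at all.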
The original prompt
Create a single-page app in a single HTML file with the following requirements:
- Name: Solar System Explorer
- Goal: Visualize planets orbiting the sun.
- Features: Click planets for info, orbit speed control, drag to rotate, and zoom in/out.
- The UI should be dark-themed and interactive.
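The click-for-info feature is the only one that needs any geometry: translate the click into canvas coordinates and find the nearest planet. A sketch, reusing the positions cached by the render loop above; the `info` element and the card contents are placeholders, not the demo's actual markup.

```javascript
const infoPanel = document.getElementById('info');  // hypothetical facts panel element

canvas.addEventListener('click', (e) => {
  const rect = canvas.getBoundingClientRect();
  const mx = e.clientX - rect.left;
  const my = e.clientY - rect.top;

  // Positions were cached in screen space by the render loop, so a simple
  // distance check is enough; a small padding keeps tiny planets clickable.
  const hit = planets.find(p => Math.hypot(mx - p.x, my - p.y) <= p.radius + 4);
  if (hit) {
    infoPanel.textContent = `${hit.name}: orbital period ~${hit.period} Earth days`;
    infoPanel.hidden = false;
  } else {
    infoPanel.hidden = true;
  }
});
```

Drag-to-rotate and smoother zoom add one more layer, since pointer deltas have to feed both the drawing transform and the hit test, but the structure stays the same: a little shared state, one render loop, and a handful of listeners.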
The demo
Open the demo in a new tab
Why this matters
LLMs compress the time between concept and usable demo. That changes how you explore ideas:
- You can test an interaction before you commit to a stack
- You can validate a visual direction without a full design cycle
- You can show stakeholders something tangible on day one
The real work still matters, though. A prompt can draft a demo, but it does not decide the product boundaries, the UX tradeoffs, or how it fits into a real system. That is where engineering judgment still earns its keep.