Reinventing the Physical Model
Physical scale models have long been the gold standard for urban planning communication—they offer an intuitive, tactile view of a city that screens can’t match. But they have a fatal flaw: they are static. Once built, they cannot change to show new data, future scenarios, or live traffic.
PARM-X (Projection Augmented Relief Models - Extended) solves this by turning the physical model into a dynamic digital canvas. By precisely projecting data onto the model’s topography, it combines the intuitive understanding of a physical object with the infinite flexibility of a digital map.
PARM-X enables researchers and practitioners to prototype ideas, communicate complex data, and explore spatial scenarios without the hours of preparation traditionally needed.
The “Three-Screen” Architecture
The core innovation of PARM-X is a unified, browser-based environment that orchestrates three distinct views in real time. Because it runs entirely in the browser rather than in complex desktop software, it can be deployed anywhere.
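One way to realize a single-origin, three-window setup like this is to have each window select its role from a query parameter. A minimal sketch, assuming a hypothetical `?view=` parameter (not PARM-X's actual routing):

```javascript
// Each browser window renders one of the three screens based on its URL,
// e.g. app.html?view=map. The parameter name and values are illustrative.
function resolveView(search) {
  const view = new URLSearchParams(search).get("view");
  // Unknown or missing values fall back to the operator cockpit.
  return ["controls", "map", "presentation"].includes(view) ? view : "controls";
}
```

The operator would open the map and presentation views in separate windows, so all three share the same origin and can exchange state directly.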
1. The Operator Cockpit (Controls)
This is the command center. An operator uses a full dashboard to:
- Ingest Data: Drag-and-drop CSVs, Shapefiles, or GeoJSONs instantly.
- Connect Live Feeds: Stream data directly from ArcGIS REST servers or OpenStreetMap queries.
- Orchestrate: Control what the audience sees with sub-millisecond precision.
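A drag-and-drop ingest pipeline needs a validation step between the dropped file and the new map layer. A minimal sketch for the GeoJSON case, with an illustrative function name (not PARM-X's actual API):

```javascript
// Normalise the text of a dropped .geojson file into a FeatureCollection,
// rejecting anything that isn't valid top-level GeoJSON.
function toFeatureCollection(text) {
  const data = JSON.parse(text);
  if (data.type === "FeatureCollection" && Array.isArray(data.features)) {
    return data;
  }
  if (data.type === "Feature") {
    // Wrap a single Feature so downstream code handles one shape of input.
    return { type: "FeatureCollection", features: [data] };
  }
  throw new Error("Not valid GeoJSON: expected a Feature or FeatureCollection");
}
```

In the browser, this would be fed from a `FileReader` (or `file.text()`) inside the drop handler; CSV and Shapefile inputs would need their own parsers.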
2. The Projection Surface (Map)
This view is the system’s “visual engine.” It provides a UI-free, perfectly calibrated map designed to be projected onto the physical model. It handles the complex geometry of warping a flat digital map into a 3D perspective to match the physical ridges and valleys of the city.
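The warp from flat map to projector view is typically computed as a homography from four corner correspondences: where each corner of the map should land on the physical model. A minimal sketch of that computation, with illustrative function names (not PARM-X's internals):

```javascript
// Compute a 3x3 homography H mapping four source points to four destination
// points, by solving the standard 8x8 linear system (bottom-right entry of
// H is fixed at 1). Points are [x, y] pairs in general position.
function computeHomography(src, dst) {
  const A = [], b = [];
  for (let i = 0; i < 4; i++) {
    const [x, y] = src[i];
    const [u, v] = dst[i];
    A.push([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.push(u);
    A.push([0, 0, 0, x, y, 1, -v * x, -v * y]); b.push(v);
  }
  // Gaussian elimination with partial pivoting.
  for (let col = 0; col < 8; col++) {
    let pivot = col;
    for (let r = col + 1; r < 8; r++) {
      if (Math.abs(A[r][col]) > Math.abs(A[pivot][col])) pivot = r;
    }
    [A[col], A[pivot]] = [A[pivot], A[col]];
    [b[col], b[pivot]] = [b[pivot], b[col]];
    for (let r = col + 1; r < 8; r++) {
      const f = A[r][col] / A[col][col];
      for (let c = col; c < 8; c++) A[r][c] -= f * A[col][c];
      b[r] -= f * b[col];
    }
  }
  // Back substitution.
  const h = new Array(8);
  for (let r = 7; r >= 0; r--) {
    let s = b[r];
    for (let c = r + 1; c < 8; c++) s -= A[r][c] * h[c];
    h[r] = s / A[r][r];
  }
  return [...h, 1]; // row-major 3x3
}

// Apply H to a point, dividing through by the projective coordinate.
function applyHomography(H, [x, y]) {
  const w = H[6] * x + H[7] * y + H[8];
  return [
    (H[0] * x + H[1] * y + H[2]) / w,
    (H[3] * x + H[4] * y + H[5]) / w,
  ];
}
```

In practice the same matrix can be handed to CSS (`matrix3d`) or WebGL so the browser performs the warp per frame; an operator only needs to drag the four corners into place once during calibration.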
3. The Audience Context (Presentation)
While the model looks beautiful, it can be hard to read specific data values on a bumpy surface. The third screen solves this by acting as a dynamic legend, showing the audience exactly what they are looking at, along with live statistics and interaction prompts.
Turning Observation into Interaction
Most city models are view-only: you can look, but you can't touch. PARM-X transforms public consultation by making the model itself interactive.
Through the “Research Projects” module, the system generates dynamic QR codes on the presentation screen. Visitors scan these to open a mobile web interface where they can:
- Draw Desire Lines: Sketch new cycle routes or paths directly onto the model.
- Vote Spatially: Drop pins to mark unsafe areas or community assets.
- See Real-Time Feedback: Watch their contributions appear instantly on the physical model, creating a powerful feedback loop between citizen and city planner.
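A spatial vote from the mobile interface can travel as a plain GeoJSON Feature, which keeps the backend and the projection map speaking the same format. A minimal sketch; the property names are assumptions, not PARM-X's actual schema:

```javascript
// Wrap a visitor's dropped pin as a GeoJSON Point Feature before sending
// it to the backend. `category` and `sessionId` are illustrative fields.
function pinToFeature(lon, lat, category, sessionId) {
  return {
    type: "Feature",
    geometry: { type: "Point", coordinates: [lon, lat] },
    properties: {
      category,                            // e.g. "unsafe" or "asset"
      sessionId,                           // ties the pin to one visitor
      createdAt: new Date().toISOString(), // server could stamp this instead
    },
  };
}
```

Because the payload is already GeoJSON, the projection surface can render incoming pins with the same layer code it uses for any other ingested dataset.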
Technical Innovation
PARM-X is built on a modern, high-performance stack designed for low latency:
- Real-Time Sync: Uses the BroadcastChannel API to synchronize state across windows instantly, with Supabase Realtime handling cross-device updates from the audience.
- Advanced Calibration: Features a custom four-point homography engine that digitally warps the map projection to match the physical model's perspective from any projector position.
- Spatial Backend: Powered by PostgreSQL/PostGIS, enabling complex spatial queries (like "find all schools within this drawn polygon") to run on the fly.
- Audio-Reactive: Leverages the Web Audio API to make the map “dance” to audio, visualizing frequency bands as opacity or color shifts for immersive storytelling.
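The audio-reactive piece reduces to a small mapping: an `AnalyserNode`'s byte frequency data (one 0–255 value per bin) is averaged over a band and normalised to a layer opacity. A minimal sketch of that mapping, with illustrative names and band indices:

```javascript
// Map one band of Web Audio frequency data to an opacity in [0, 1].
// `bins` is the Uint8Array filled by AnalyserNode.getByteFrequencyData();
// `start`/`end` pick which bins form the band (e.g. bass vs treble).
function bandToOpacity(bins, start, end) {
  let sum = 0;
  for (let i = start; i < end; i++) sum += bins[i];
  return sum / ((end - start) * 255); // average amplitude, normalised
}
```

Per animation frame, the result would be written to a layer's opacity (or fed through a colour ramp), so the projected map pulses with the music.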
Impact
Demonstrated at the University of Nottingham’s City as Lab, PARM-X has been used to explore city-scale challenges in a collaborative medium. It represents a shift from “presenting to” an audience to “exploring with” them, making complex urban data accessible, transparent, and engaging.