From data streams to semantic space:
teaching AI to understand place
An introduction to Spaxiom and INTENT
by Joe Scanlin
AI can't see the world yet
Today's AI systems (GPT, Claude, Gemini) are linguistic machines. They've read billions of words and can generate plausible text, code, and plans.
But they don't see, touch, or measure. They reconstruct the world from written traces—like describing the ocean from shipping logs.
The shift from language-based AI to experience-based AI
The Gap: We need AI that can understand physical reality directly—not through words about the world, but through the world itself.
A universal grammar for physical context
Spaxiom is a platform that translates raw sensor data into semantic events that AI agents can understand and act on.
Key Insight: Just like language models gave us semantic interoperability for human knowledge, Spaxiom gives us spatial interoperability for machine experience.
From noise to meaning
Raw radar returns millions of radio-wave reflections per second: on their own, meaningless blips.
But air traffic control screens don't show blips.
They show: Flight AA123, altitude 35,000 ft, speed 480 kts, heading toward collision zone.
Spaxiom does for buildings what ATC does for airspace
Not "sensor 47 triggered" — but "Queue formed, lobby entrance, 5 people, 3 minutes, growing."
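An event like that can be modeled as a small structured record in place of thousands of raw sensor frames. A minimal sketch (field names are illustrative, not Spaxiom's actual schema):

```python
from dataclasses import dataclass

@dataclass
class SemanticEvent:
    """One human-readable spatial event, standing in for raw sensor frames."""
    kind: str          # e.g. "queue_formed"
    location: str      # e.g. "lobby_entrance"
    count: int         # people involved
    duration_min: int  # how long the pattern has held
    trend: str         # "growing", "stable", or "shrinking"

    def describe(self) -> str:
        return (f"{self.kind} @ {self.location}: {self.count} people, "
                f"{self.duration_min} min, {self.trend}")

event = SemanticEvent("queue_formed", "lobby_entrance", 5, 3, "growing")
print(event.describe())
# → queue_formed @ lobby_entrance: 5 people, 3 min, growing
```

A record like this is what an AI agent can actually reason about, rather than "sensor 47 triggered".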
The compression pipeline
From 12 MB/sec of raw sensor data to 8 KB/sec of semantic events
The Magic: 1500× compression. What was 12 MB/sec becomes 8 KB/sec—while gaining semantic meaning.
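The quoted ratio checks out arithmetically (taking 1 MB = 1024 KB):

```python
raw_kb_per_sec = 12 * 1024   # 12 MB/s of raw sensor data, in KB/s
semantic_kb_per_sec = 8      # 8 KB/s of semantic events
ratio = raw_kb_per_sec / semantic_kb_per_sec
print(ratio)  # → 1536.0, i.e. roughly the quoted 1500×
```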
Reusable semantic templates
INTENT is a domain-specific language for defining spatial-temporal patterns.
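This section doesn't show INTENT's actual syntax, but the idea of a reusable spatial-temporal pattern can be sketched in plain Python (the pattern name, sample format, and thresholds below are made up for illustration):

```python
def queue_pattern(presence_samples, min_people=4, min_seconds=120):
    """Fire when at least `min_people` are continuously present in a
    zone for at least `min_seconds`.

    presence_samples: list of (timestamp_sec, people_count) tuples,
    ordered by time. Returns the (start, end) timestamps of the first
    matching window, or None if the pattern never holds.
    """
    start = None
    for ts, count in presence_samples:
        if count >= min_people:
            if start is None:
                start = ts            # window opens
            if ts - start >= min_seconds:
                return (start, ts)    # pattern satisfied
        else:
            start = None              # continuity broken, reset
    return None

samples = [(0, 1), (30, 5), (60, 5), (90, 6), (120, 5), (150, 5), (180, 4)]
print(queue_pattern(samples))  # → (30, 150)
```

A DSL like INTENT would let this kind of rule be declared once and reused across sites, instead of rewritten per deployment.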
Pharmaceutical shipment monitoring
The Challenge: $2.3M vaccine shipment, 4 sensor streams (temperature, GPS, accelerometer, door access), 24 hours of travel.
Four disconnected data streams become one actionable integrity score
Output: Shipment Integrity Index = 87.5
"Product viable • Minor excursion at Denver hub (23 min) • Route on schedule"
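The fusion step can be sketched as a weighted combination of per-stream scores. The weights and scores below are assumptions for illustration, not Spaxiom's actual scoring model:

```python
def shipment_integrity(temp_score, route_score, shock_score, access_score,
                       weights=(0.4, 0.2, 0.2, 0.2)):
    """Fuse four per-stream scores (each 0-100) into one integrity index.

    Temperature is weighted heaviest here because cold-chain excursions
    are the dominant spoilage risk; the weights are illustrative.
    """
    scores = (temp_score, route_score, shock_score, access_score)
    return sum(w * s for w, s in zip(weights, scores))

# A minor temperature excursion drags the index below 100:
index = shipment_integrity(temp_score=75, route_score=100,
                           shock_score=95, access_score=90)
print(round(index, 1))  # → 87.0
```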
Predictive maintenance & compliance
The Challenge: Hospital sterilization rooms must meet CDC/FDA standards. Motor degradation → airflow loss → sterility breach.
Six sensors fused into one validated sterility event
Output: SterilizationCycleComplete(status=VALID, confidence=0.97)
Or: PressureDifferentialViolation(cause=motor_degradation, severity=CRITICAL)
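The branching between those two outputs can be sketched as a simple fusion function. Thresholds, units, and field names below are illustrative assumptions, not CDC/FDA values or Spaxiom's actual logic:

```python
def sterility_event(pressure_pa, airflow_cfm, motor_vibration_g,
                    pressure_min=12.5, airflow_min=90.0, vibration_max=0.5):
    """Fuse room sensor readings into one validated semantic event.

    If the pressure differential drops below the floor, emit a CRITICAL
    violation and attribute it to the motor when vibration is elevated.
    Otherwise emit a completed-cycle event with a toy confidence score.
    """
    if pressure_pa < pressure_min:
        cause = ("motor_degradation" if motor_vibration_g > vibration_max
                 else "unknown")
        return {"event": "PressureDifferentialViolation",
                "cause": cause, "severity": "CRITICAL"}
    confidence = min(1.0, airflow_cfm / airflow_min) * 0.97  # toy formula
    return {"event": "SterilizationCycleComplete",
            "status": "VALID", "confidence": round(confidence, 2)}

print(sterility_event(pressure_pa=13.0, airflow_cfm=95.0, motor_vibration_g=0.2))
```

The point is the shape of the output: downstream agents consume one structured event, not six raw streams.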
Federated learning without sharing data
Single buildings are interesting. Networks of sites are transformative.
Imagine 80 retail stores sharing insights without sharing camera footage—or a hospital network learning from manufacturing shift patterns.
142 sites across 6 industries, 23K events/day, zero raw data sharing
The Superpower: Diversity makes the system smarter. Cross-domain learning finds solutions that single-industry networks would never discover.
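The "zero raw data sharing" property can be sketched with a toy aggregator: each site reduces its day to event-type counts, and only those aggregates cross the network (the event kinds and reduction below are illustrative, not Spaxiom's actual federation protocol):

```python
from collections import Counter

def site_summary(events):
    """Each site reduces its day to event-type counts; no footage or
    raw samples ever leave the site."""
    return Counter(e["kind"] for e in events)

def federate(summaries):
    """A central aggregator merges per-site counts into network-wide
    statistics that every site can learn from."""
    total = Counter()
    for summary in summaries:
        total += summary
    return total

store_a = site_summary([{"kind": "queue_formed"}, {"kind": "queue_formed"}])
clinic_b = site_summary([{"kind": "queue_formed"}, {"kind": "door_held_open"}])
print(federate([store_a, clinic_b]))
# → Counter({'queue_formed': 3, 'door_held_open': 1})
```

Because the shared vocabulary is semantic events rather than raw data, a retail store and a clinic can contribute to the same aggregate.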
A handful of semantic events replace thousands of raw samples. Smaller prompts, faster decisions, lower carbon per action.
Patterns are site-independent. Move an agent from a retail lobby to a clinic corridor—it still understands crowding, queuing, wayfinding, anomalies.
Structured events create explainable, replayable histories of "what the agent knew, when." No black boxes.
Bottom Line: Spaxiom makes the physical world as legible to AI as text has become—and does it with 1500× compression.
Why now matters
The Risk: Without Spaxiom, every vendor builds their own dialect. We get vendor lock-in, data silos, and fragmented intelligence.
The Vision: A universal semantic layer for physical context that agents can trust and developers can extend. That's what Spaxiom provides.
Read the Full Post
Includes 4 detailed use cases with implementation guides
Technical Paper
Full Technical Specification →
Complete DSL syntax, architecture details, and formal definitions