
The World, Made Legible

From data streams to semantic space:
teaching AI to understand place

An introduction to Spaxiom and INTENT

by Joe Scanlin

The Problem

AI can't see the world yet

Today's AI systems (GPT, Claude, Gemini) are linguistic machines. They've read billions of words and can generate plausible text, code, and plans.

But they don't see, touch, or measure. They reconstruct the world from written traces—like describing the ocean from shipping logs.

Language Era vs Experience Era

The shift from language-based AI to experience-based AI

The Gap: We need AI that can understand physical reality directly—not through words about the world, but through the world itself.

The Solution: Spaxiom

A universal grammar for physical context

Spaxiom is a platform that translates raw sensor data into semantic events that AI agents can understand and act on.

Without Spaxiom
  • Raw sensor streams
  • Brittle custom scripts
  • No cross-site learning
  • Data silos everywhere

With Spaxiom
  • Semantic events
  • Reusable patterns
  • Federated learning
  • Privacy-preserving

Key Insight: Just as language models gave us semantic interoperability for human knowledge, Spaxiom gives us spatial interoperability for machine experience.

Think: Air Traffic Control

From noise to meaning

A raw radar feed delivers millions of radio-wave reflections per second: meaningless blips.

But air traffic control screens don't show blips.

They show: Flight AA123, altitude 35,000 ft, speed 480 kts, heading toward collision zone.

Air traffic control analogy

Spaxiom does for buildings what ATC does for airspace

Not "sensor 47 triggered" — but "Queue formed, lobby entrance, 5 people, 3 minutes, growing."

How Spaxiom Works

The compression pipeline

Sensor fusion pipeline

From 12 MB/sec of raw sensor data to 8 KB/sec of semantic events

Three Layers

  • Sensors: Cameras, occupancy sensors, temperature, access control, etc.
  • Spaxiom DSL: Defines zones, fuses streams, compresses to events
  • INTENT Patterns: Reusable templates like "QueueFormed" or "DwellTimeAnomaly"

The Magic: 1500× compression. What was 12 MB/sec becomes 8 KB/sec (roughly 12,000 KB ÷ 8 KB ≈ 1500), while gaining semantic meaning.
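
Where does the compression come from? A minimal sketch in Python, assuming a hypothetical per-frame occupancy stream (the rates and field names are illustrative, not the Spaxiom DSL):

from statistics import mean

def summarize_zone(frame_counts, zone_id, window_s=60):
    """Collapse one minute of per-frame people counts into a single summary.

    At 30 fps, `frame_counts` holds 1,800 raw samples; one small dict comes out.
    The ratio grows further when the raw input is video frames (megabytes)
    rather than integer counts (bytes).
    """
    return {
        "zone": zone_id,
        "window_s": window_s,
        "peak": max(frame_counts),
        "avg": round(mean(frame_counts), 1),
        "samples_collapsed": len(frame_counts),
    }

print(summarize_zone([3] * 900 + [5] * 900, "lobby_entrance"))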

INTENT: The Pattern Language

Reusable semantic templates

INTENT is a domain-specific language for defining spatial-temporal patterns.

pattern QueueFormed:
  triggers:
    - occupancy.count > 3 in zone.entrance
    - avg(person.dwell_time) > 2 minutes
    - spatial_cluster(people) within 3 meters
  emit:
    event: "QueueFormed"
    location: zone.id
    count: occupancy.count
    avg_wait: avg(dwell_time)
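
Outside the DSL, the same triggers could be checked by ordinary code. A minimal sketch in Python, with field names and thresholds assumed for illustration (this is not the Spaxiom runtime):

def queue_formed(zone):
    """Evaluate the QueueFormed triggers against one zone snapshot.

    `zone` is assumed to carry: "id", "count" (people in the zone),
    "dwell_times" (seconds per person), and "max_spread_m" (largest
    pairwise distance between people, in meters).
    """
    dwell_avg = sum(zone["dwell_times"]) / max(len(zone["dwell_times"]), 1)
    if (zone["count"] > 3
            and dwell_avg > 120              # > 2 minutes
            and zone["max_spread_m"] <= 3):  # spatially clustered
        return {
            "event": "QueueFormed",
            "location": zone["id"],
            "count": zone["count"],
            "avg_wait": round(dwell_avg, 1),
        }
    return None

The value of the DSL is that this logic is declared once as a pattern and reused across sites, instead of being re-implemented in application code for every deployment.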

Why This Matters

  • Patterns are site-independent—works in retail, hospitals, offices
  • Patterns are composable—build complex behaviors from simple parts
  • Patterns are portable—trained once, deployed everywhere

Real Example: Cold Chain Logistics

Pharmaceutical shipment monitoring

The Challenge: $2.3M vaccine shipment, 4 sensor streams (temperature, GPS, accelerometer, door access), 24 hours of travel.

Cold chain sensor fusion

Four disconnected data streams become one actionable integrity score

Output: Shipment Integrity Index = 87.5
"Product viable • Minor excursion at Denver hub (23 min) • Route on schedule"

Real Example: Medical Sterilization

Predictive maintenance & compliance

The Challenge: Hospital sterilization rooms must meet CDC/FDA standards. Motor degradation → airflow loss → sterility breach.

Sterilization sensor fusion

Six sensors fused into one validated sterility event

Output: SterilizationCycleComplete(status=VALID, confidence=0.97)
Or: PressureDifferentialViolation(cause=motor_degradation, severity=CRITICAL)
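
A comparable sketch for the sterilization case, fusing a few of the six streams, with placeholder thresholds (not CDC/FDA limits and not Spaxiom's confidence model):

def validate_cycle(pressure_diff_pa, airflow_cfm, motor_current_a, baseline_current_a):
    """Return a validated-cycle event or a violation event from fused readings.

    The first three inputs are lists of readings over one cycle; baseline_current_a
    is the motor's expected draw when healthy. All thresholds are placeholders.
    """
    min_pressure = min(pressure_diff_pa)
    current_drift = (sum(motor_current_a) / len(motor_current_a)) / baseline_current_a

    if min_pressure < 10.0:  # differential collapsed at some point in the cycle
        cause = "motor_degradation" if current_drift > 1.15 else "unknown"
        return {"event": "PressureDifferentialViolation",
                "cause": cause, "severity": "CRITICAL"}

    # Confidence shrinks as the pressure and airflow margins shrink.
    margin = min(min_pressure / 15.0, min(airflow_cfm) / 400.0, 1.0)
    return {"event": "SterilizationCycleComplete",
            "status": "VALID",
            "confidence": round(0.9 + 0.1 * margin, 2)}

Either branch yields a structured record of why the decision was made, not just a pass/fail flag.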

The Experience Fabric

Federated learning without sharing data

Single buildings are interesting. Networks of sites are transformative.

Imagine 80 retail stores sharing insights without sharing camera footage—or a hospital network learning from manufacturing shift patterns.

Experience fabric network

142 sites across 6 industries, 23K events/day, zero raw data sharing

The Superpower: Diversity makes the system smarter. Cross-domain learning finds solutions that single-industry networks would never discover.
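
What does "sharing insights without sharing footage" look like mechanically? A minimal sketch in which each site ships only aggregated pattern statistics and the fabric merges them (field names and the update format are assumptions, not the actual protocol):

from collections import defaultdict

def merge_site_updates(updates):
    """Merge per-site pattern statistics; no raw sensor data is ever transmitted.

    Each update is computed locally at a site, e.g.
    {"pattern": "QueueFormed", "count": 41, "avg_wait": 95.0}.
    """
    merged = defaultdict(lambda: {"count": 0, "wait_sum": 0.0})
    for u in updates:
        slot = merged[u["pattern"]]
        slot["count"] += u["count"]
        slot["wait_sum"] += u["avg_wait"] * u["count"]
    return {p: {"count": s["count"],
                "avg_wait": round(s["wait_sum"] / s["count"], 1)}
            for p, s in merged.items() if s["count"]}

# Two retail sites and a clinic contribute aggregates; no camera frame leaves a building.
print(merge_site_updates([
    {"pattern": "QueueFormed", "count": 41, "avg_wait": 95.0},
    {"pattern": "QueueFormed", "count": 17, "avg_wait": 140.0},
    {"pattern": "DwellTimeAnomaly", "count": 3, "avg_wait": 610.0},
]))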

Why This Unlocks the Next AI Platform

1. Token & Energy Efficiency

A handful of semantic events replace thousands of raw samples. Smaller prompts, faster decisions, lower carbon per action.

2. Generalizable Agency

Patterns are site-independent. Move an agent from a retail lobby to a clinic corridor—it still understands crowding, queuing, wayfinding, anomalies.

3. Safety & Forensics Built In

Structured events create explainable, replayable histories of "what the agent knew, when." No black boxes.

Bottom Line: Spaxiom makes the physical world as legible to AI as text has become—and does it with 1500× compression.

The Opportunity

Why now matters

  • Sensors are exploding: From buildings to factories to cities, everything is instrumented
  • Agents are moving from chat to control: AI is shifting from answering questions to operating systems
  • Context-aware computing is becoming infrastructure: The missing layer is a common grammar for space and time

The Risk: Without Spaxiom, every vendor builds their own dialect. We get vendor lock-in, data silos, and fragmented intelligence.

The Vision: A universal semantic layer for physical context that agents can trust and developers can extend. That's what Spaxiom provides.

Learn More

Read the Full Post

The World, Made Legible →

Includes 4 detailed use cases with implementation guides

Technical Paper

Full Technical Specification →

Complete DSL syntax, architecture details, and formal definitions
