Formal Verification and Runtime Monitoring for Certified Robot Safety
Joe Scanlin
November 2025
This section demonstrates Spaxiom's application to safety-critical human-robot collaboration in industrial and collaborative robot (cobot) environments. Safety-critical systems require more than testing—they demand formal verification that safety properties hold under all possible conditions.
You'll learn how Spaxiom's typed DSL enables specification of safety invariants as temporal logic over spatial zones, compilation to timed automata for model checking (UPPAAL), runtime monitoring with emergency handlers, proof generation for theorem provers (Coq, Isabelle/HOL, Z3), and certification support for industrial safety standards (ISO 13849, IEC 61508, ISO 26262). The section closes with a complete case study showing an approximately 50% reduction in certification time using formal verification artifacts.
Industrial and collaborative robots (cobots) increasingly share space with humans. Safety standards often boil down to invariants like:
"At all times, the distance between any human and the robot's moving body must exceed dmin, or the robot must be in a safe mode."
Spaxiom can express such invariants as conditions over spatial zones and entities.
Assume:
- robot_zone: a dynamic region representing the robot's reachable volume at time t.
- human_entities: an EntitySet of humans with current positions.

Then:
from spaxiom import Condition
from spaxiom.geo import distance

def too_close():
    # True if any tracked human is within MIN_DIST (the standard's d_min) of the robot's reachable volume
    for human in human_entities:
        if distance(human.position, robot_zone.center) < MIN_DIST:
            return True
    return False

unsafe_proximity = Condition(too_close)
We can then require always-not conditions:
from spaxiom.temporal import always
safety_invariant = always(~unsafe_proximity)  # "never unsafe"
@on(unsafe_proximity)
def stop_robot():
    # handler triggered when the unsafe-proximity condition becomes true
    robot.set_mode("safe_stop")
Because Spaxiom's conditions are expressed in a well-defined subset of Python + DSL primitives, we can compile safety-critical fragments into verifiable formalisms: timed automata for model checking (UPPAAL) and proof obligations for theorem provers (Coq, Isabelle/HOL, Z3), as described later in this section.
Figure 4 (Safety Envelope Visualization): a robot arm in a workspace with three safety zones. The zone boundaries are defined as Spaxiom Zone objects and updated as the robot reconfigures.
This yields a human-readable yet formal way to specify safety envelopes, bridging the gap between standards documents and low-level control code.
Safety-critical applications (robots in human workspaces, autonomous vehicles, medical devices, industrial control systems) cannot be assured by testing alone: they demand formal verification that safety properties hold under all possible conditions.
This section describes how Spaxiom's DSL enables compilation to verifiable formalisms, automated proof generation, and certification for industrial safety standards.
Not all Spaxiom programs are verifiable. To enable formal methods, we define a safety-verifiable subset with the following restrictions:
- Sensor reads: sensor.read()
- Arithmetic: +, -, *, / (no modulo or bitwise ops)
- Comparisons: <, <=, >, >=, ==, !=
- Boolean operators: and, or, not
- Bounded temporal operators only (within(10.0, c) is allowed; eventually(infinity, c) is not)

Programs written in this subset can be automatically compiled to timed automata and model-checked, as illustrated below.
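As a brief illustration (a sketch with a hypothetical proximity sensor; only the within operator is taken from the subset definition above), a condition built from sensor reads, comparisons, boolean operators, and a bounded temporal window stays inside the verifiable subset, while an unbounded horizon does not:

from spaxiom import Sensor, Condition, within

# Hypothetical sensor; any Sensor works as long as reads stay within the subset.
gate_sensor = Sensor("gate", type="proximity")

# Sensor read + comparison: inside the verifiable subset.
gate_open = Condition(lambda: gate_sensor.read() > 0.5)

# Bounded temporal operator: "gate_open held within the last 10 s" is allowed.
recently_open = within(10.0, gate_open)

# An unbounded horizon such as eventually(infinity, gate_open) falls outside the
# subset and cannot be translated to a timed automaton.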
Timed automata are a standard formalism for real-time systems, consisting of:
- locations (discrete control states),
- real-valued clocks that advance uniformly with time,
- guards and clock resets on transitions, and
- invariants on locations that bound how long the system may remain in a state.
The Spaxiom compiler translates safety-critical conditions into UPPAAL timed automata format:
from spaxiom import Condition, inside
from spaxiom.safety import verify

# human_sensor, robot and red_zone are defined elsewhere (see the worked example below).

# Safety property: robot must not enter red zone if human present
human_present = Condition(lambda: human_sensor.read() > 0.5)
robot_in_red = Condition(lambda: inside(robot, red_zone))
safety_violation = human_present & robot_in_red

# Compile to UPPAAL timed automaton
automaton = verify.compile_to_uppaal(
    conditions=[human_present, robot_in_red, safety_violation],
    safety_property="A[] not safety_violation"  # invariance query: always not violated
)

# Output UPPAAL .xml file
automaton.save("robot_safety.xml")
The generated UPPAAL model can be model-checked against temporal-logic specifications written in UPPAAL's query language (a subset of TCTL covering invariance, reachability, liveness, and leads-to properties):
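For instance, assuming automaton.verify accepts the standard UPPAAL query forms (as in the verify call shown later), properties such as the following could be checked over the variables declared above:

# Invariance (safety): the violation condition never holds.
automaton.verify(property="A[] not safety_violation")

# Reachability (sanity check): the model can actually reach the red zone.
automaton.verify(property="E<> robot_in_red")

# Leads-to: whenever a human is present, the robot eventually leaves the red zone.
automaton.verify(property="human_present --> not robot_in_red")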
Consider a collaborative robot with safety zones defined in Section 7.1. We want to verify:
"Whenever a human is in the red zone, the robot velocity must be zero."
Spaxiom code:
from spaxiom import Sensor, Condition, Zone
from spaxiom.safety import SafetyMonitor

# Sensors and zones
human_sensor = Sensor("human_depth_cam", type="depth")
robot_velocity_sensor = Sensor("robot_vel", type="velocity")
red_zone = Zone(x=240, y=240, radius=90)  # center of workspace

# Conditions
human_in_red = Condition(
    lambda: human_sensor.read_occupancy(red_zone) > 0
)
robot_stopped = Condition(
    lambda: robot_velocity_sensor.read() < 0.01  # < 1 cm/s
)

# Safety property: human in red => robot stopped
# Encoded as: ¬(human_in_red ∧ ¬robot_stopped)
safety_ok = ~(human_in_red & ~robot_stopped)

# Create safety monitor
monitor = SafetyMonitor(
    name="robot_collision_safety",
    property=safety_ok,
    check_interval=0.01  # 100 Hz monitoring
)

# Compile to UPPAAL for formal verification
automaton = monitor.compile_to_uppaal()
automaton.verify(property="A[] safety_ok")
The UPPAAL verifier explores all possible interleavings of sensor updates, timing variations, and state transitions, proving (or disproving) the safety property.
Even with formal verification of the model, the actual implementation may have bugs (sensor failures, actuator delays, software faults). Therefore, safety-critical systems need runtime monitoring.
Spaxiom's SafetyMonitor acts as a runtime watchdog:
import time

# robot, alert and log are application-level handles provided by the deployment.
@monitor.on_violation
def emergency_stop():
    """Called if the safety property is violated."""
    robot.emergency_stop()  # hardware e-stop
    alert.send("SAFETY VIOLATION: human in red zone, robot moving")
    log.critical(f"Violation at {time.time()}: {monitor.get_state()}")
Runtime monitoring provides defense-in-depth: formal verification ensures the design is correct, runtime monitoring catches implementation bugs and sensor failures.
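To make the defense-in-depth pattern concrete, one possible deployment shape is sketched below; start(), stop(), and run_robot_controller() are hypothetical names used for illustration, not documented Spaxiom API:

# Design time: model-check the property over the compiled automaton.
automaton = monitor.compile_to_uppaal()
automaton.verify(property="A[] safety_ok")

# Deployment: run the same monitor as a watchdog beside the controller.
monitor.start()              # hypothetical: begin 100 Hz property checks
try:
    run_robot_controller()   # placeholder for the application control loop
finally:
    monitor.stop()           # hypothetical: shut the watchdog down cleanly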
For systems requiring certification (e.g., ISO 26262 for automotive, DO-178C for avionics), we can generate proof obligations in formats accepted by theorem provers (Coq, Isabelle/HOL, Z3).
Example proof obligation for the robot safety property:
∀s, t: human_in_red(s, t) ⇒ robot_velocity(s, t) < ε

where s is the system state, t is time, and ε is the safety margin on velocity (e.g., 0.01 m/s).
Spaxiom can generate these obligations automatically:
from spaxiom.safety import generate_proof_obligations

obligations = generate_proof_obligations(
    monitor=monitor,
    formalism="coq"  # or "isabelle", "z3"
)

# Output: Coq .v file with lemmas to prove
obligations.save("robot_safety_proof.v")

# The user then proves the lemmas in Coq (coqc robot_safety_proof.v) and uses
# Coq's extraction mechanism to obtain a certified executable monitor.
This workflow enables correctness by construction: the runtime monitor is extracted from a verified proof, guaranteeing it enforces the safety property.
Many industries require compliance with safety standards: ISO 13849 for safety-related parts of machinery control systems, IEC 61508 for functional safety of electrical/electronic systems, ISO 26262 for automotive, and DO-178C for avionics.
Spaxiom supports certification workflows by providing a CertificationPackage that bundles the required artifacts (requirements traceability matrix, verification report, FMEA, and verified generated code):
from spaxiom.safety import CertificationPackage

# robot_collision_monitor and estop_monitor are SafetyMonitor instances defined earlier.
package = CertificationPackage(
    standard="ISO_13849",
    target_sil="SIL_3",  # SIL 3 (IEC 61508) corresponds roughly to PL e under ISO 13849
    monitors=[robot_collision_monitor, estop_monitor]
)

# Generate certification artifacts
package.generate_requirements_matrix()  # maps safety properties to standard requirements
package.generate_verification_report()  # UPPAAL + Coq proof results
package.generate_fmea()                 # Failure Modes and Effects Analysis
package.generate_code(target="c", compiler="compcert")  # verified C code via CompCert

# Output: PDF report + source code suitable for submission to a certifying body
package.export("robot_safety_cert_package/")
Formal verification is powerful but not a panacea. Important limitations:
- Only the safety-verifiable subset of the DSL can be compiled to timed automata; programs outside it must fall back on testing and runtime monitoring.
- Verification targets the model, not the deployed implementation: sensor failures, actuator delays, and software faults can still violate properties at runtime (hence the watchdog above).
- Exhaustive model checking suffers from state-space explosion, which limits how much of the system can be verified in practice.
- Properties are currently boolean, so probabilistic requirements are out of scope (see the discussion of future work below).
For these reasons, formal verification is typically applied to critical safety kernels (e.g., collision avoidance, emergency stop logic) rather than entire agent stacks.
We collaborated with an industrial automation company to deploy Spaxiom in a certified robot workcell for automotive assembly. The workcell's safety requirements were encoded as Spaxiom safety monitor specifications:
- human_in_workspace: depth camera + floor pressure grid fusion
- robot_speed_limited: joint velocities < 250 mm/s in PFL (power- and force-limiting) mode
- emergency_stop: hardware button + software watchdog (100 Hz)
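As an illustration, the three monitor specifications might look roughly as follows in Spaxiom (a sketch only: the sensor names, types, units, and fusion logic are assumptions, not the deployed code):

from spaxiom import Sensor, Condition, Zone
from spaxiom.safety import SafetyMonitor

# Illustrative sensor handles; names, types and units are assumptions.
depth_cam = Sensor("workcell_depth_cam", type="depth")
floor_grid = Sensor("floor_pressure_grid", type="pressure")
joint_vel = Sensor("robot_joint_vel", type="velocity")   # assumed to report mm/s
estop_button = Sensor("estop_button", type="digital")

workspace = Zone(x=0, y=0, radius=2000)  # workspace extent (assumed units: mm)

# Fused human presence: either the depth camera or the floor grid reports occupancy.
human_in_workspace = Condition(
    lambda: depth_cam.read_occupancy(workspace) > 0 or floor_grid.read() > 0.5
)

# PFL speed limit and related conditions.
robot_speed_limited = Condition(lambda: joint_vel.read() < 250.0)
robot_stopped = Condition(lambda: joint_vel.read() < 0.01)
estop_pressed = Condition(lambda: estop_button.read() > 0.5)

# human present => robot within the PFL speed limit
robot_collision_monitor = SafetyMonitor(
    name="pfl_speed_limit",
    property=~(human_in_workspace & ~robot_speed_limited),
    check_interval=0.01,  # 100 Hz
)

# e-stop pressed => robot stopped, checked by a 100 Hz software watchdog
estop_monitor = SafetyMonitor(
    name="estop_watchdog",
    property=~(estop_pressed & ~robot_stopped),
    check_interval=0.01,
)

The resulting robot_collision_monitor and estop_monitor are the monitors passed to the certification package shown earlier.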
The certifying body (TÜV Rheinland) accepted the Spaxiom verification artifacts (UPPAAL models, proof obligations, and the generated certification package) as evidence of compliance, significantly reducing certification time: roughly 6 months versus the 12-18 months typical for hand-coded systems.
Current verification is boolean: properties either hold or they don't. Many real-world safety requirements are instead probabilistic, e.g., bounding the probability that an unsafe state is reached within a given time horizon.
Future work will extend Spaxiom to probabilistic model checking (PRISM, Storm) and statistical model checking (UPPAAL SMC), enabling verification of probabilistic safety properties under uncertainty (sensor noise, stochastic failures, adversarial inputs).