Seeing Everything: How We Enhance Real-Time Analysis with Foxglove
- seaxaidevloop
- Aug 12
- 3 min read
Updated: Aug 14
At SeaX AI, every second of testing counts. Our work is about understanding how the autonomous vehicle behaves in all kinds of situations, and to do that we need all the relevant data at a glance. With that in mind, we developed a visualization tool that combines multiple data sources, allowing us to analyze in real time what’s happening on track or in the simulator.
This tool integrates real images, 3D models of the vehicle and its environment, and dynamic graphs of key variables such as steering wheel angle, speeds, and accelerations. Instead of interpreting logs in isolation, we can see the full context of a maneuver as it happens — or replay it later for a detailed analysis.
To build it, we used Foxglove as the base platform, customizing its configuration to fit our exact development and validation needs. This lets us unify everything we need, from camera images and 3D models of the vehicle and its surroundings to real-time graphs of the most critical variables, in a single environment.
What do we see in the interface?
In the video, you can see how the Foxglove interface combines different panels to tell the full story of a test, in real time or replayed later for analysis. The main capabilities we use include:
Lateral error at all times – The lateral error, the side-to-side distance between the vehicle’s current position and the desired trajectory, is one of the key indicators for evaluating control performance. Seeing it in real time lets us identify deviations, correlate them with maneuvers, and adjust control parameters before problems build up (a minimal computation sketch follows this list).
Synced real camera view and 3D environment – We combine the image captured by the vehicle’s internal camera with a 3D recreation showing the exact position of the car in the environment. This gives us two complementary perspectives: what is “seen” from inside the vehicle, and where it really is in space.
Graphs of critical variables – We monitor and plot in real time parameters like steering wheel angle, longitudinal and lateral speeds, accelerations, yaw rate, and any other signal of interest (see the publishing sketch after this list). These graphs are synchronized with the real image and the 3D model, making it easier to detect the cause of abnormal behavior.
Scalability and customization – We configured Foxglove so we can easily add or remove variables and views depending on the type of test. For example, in steering control validations we focus on steering angle, its derivative, and lateral error; in acceleration and braking tests we prioritize speed, G-force, and jerk graphs.
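To make the lateral error metric above concrete, here is a minimal sketch of one way a signed cross-track error can be computed against a piecewise-linear reference path. The function and the waypoint representation are our own simplification for illustration, not SeaX AI's controller code.

```python
import numpy as np

def lateral_error(position: np.ndarray, path: np.ndarray) -> float:
    """Signed distance from `position` (x, y) to a piecewise-linear
    reference `path` given as an (N, 2) array of waypoints.
    Positive means the vehicle is to the left of the path.
    Simplified illustration only.
    """
    best, side = np.inf, 1.0
    for a, b in zip(path[:-1], path[1:]):
        seg = b - a                               # segment direction
        t = np.dot(position - a, seg) / np.dot(seg, seg)
        t = np.clip(t, 0.0, 1.0)                  # clamp onto the segment
        closest = a + t * seg
        d = float(np.linalg.norm(position - closest))
        if d < best:
            best = d
            # 2D cross product: which side of the segment are we on?
            cross = seg[0] * (position[1] - a[1]) - seg[1] * (position[0] - a[0])
            side = 1.0 if cross >= 0 else -1.0
    return side * best

# Example: 0.5 m to the left of a straight reference segment
path = np.array([[0.0, 0.0], [10.0, 0.0], [20.0, 5.0]])
print(lateral_error(np.array([5.0, 0.5]), path))  # ~= +0.5
```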
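Once computed, a signal like this is just one more ROS 2 topic for Foxglove to subscribe to. A minimal publishing sketch, assuming a plain std_msgs/Float64 message; the node name, topic, and rate are hypothetical, not SeaX AI's actual setup:

```python
import rclpy
from rclpy.node import Node
from std_msgs.msg import Float64


class SignalPublisher(Node):
    """Hypothetical node publishing one control signal for live plotting."""

    def __init__(self):
        super().__init__('signal_publisher')
        self.pub = self.create_publisher(Float64, '/control/lateral_error', 10)
        self.timer = self.create_timer(0.02, self.tick)  # publish at 50 Hz

    def tick(self):
        msg = Float64()
        msg.data = self.compute_lateral_error()  # placeholder source
        self.pub.publish(msg)

    def compute_lateral_error(self) -> float:
        # In a real system this value would come from the controller.
        return 0.0


def main():
    rclpy.init()
    rclpy.spin(SignalPublisher())


if __name__ == '__main__':
    main()
```

In a Foxglove Plot panel, the curve can then be selected by its message path (e.g. /control/lateral_error.data) alongside the camera and 3D views.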
Why is this so valuable for us?
Before having a tool like this, our workflow was much more fragmented:
Download the test logs.
Open them with a data analysis tool, usually MATLAB.
Replay the camera footage in another program.
Manually synchronize the information.
That process could take hours and was prone to interpretation errors. With Foxglove integrated with ROS 2, everything happens in a single interface, and here is the key detail: ROS 2’s ability to record everything into a bag. When we record a test, the bag stores absolutely everything, every topic, every timestamped variable, every video frame or 3D object, all perfectly synchronized, as if we were replaying a simulation.
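In practice, producing such a bag can be as simple as launching the standard recorder around the test. A minimal sketch, using only the stock ros2 bag command; the output name is hypothetical:

```python
import signal
import subprocess

# Start recording every topic into one synchronized bag
# ('test_run_bag' is a hypothetical output name).
recorder = subprocess.Popen(
    ['ros2', 'bag', 'record', '-a', '-o', 'test_run_bag']
)

# ... run the test or the simulation ...

recorder.send_signal(signal.SIGINT)  # stop cleanly so the bag is finalized

# Later, the same run can be replayed for live visualization:
# subprocess.run(['ros2', 'bag', 'play', 'test_run_bag'])
```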
This means we can rewind and replay the session with total fidelity, analyzing not only what happened, but why it happened. Foxglove reads that ROS bag and lets us choose exactly which variables to visualize and how to correlate them: from acceleration and steering angle graphs to camera footage and 3D models of the vehicle and environment.
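Beyond visualizing it in Foxglove, the same bag can also be scripted against when numbers matter more than pictures. A minimal sketch of iterating over a ROS 2 bag with the rosbag2_py API; the bag path is hypothetical, and the storage plugin is an assumption (newer ROS 2 distributions default to mcap instead of sqlite3):

```python
import rosbag2_py
from rclpy.serialization import deserialize_message
from rosidl_runtime_py.utilities import get_message

# Open a recorded bag (path and storage plugin are assumptions).
reader = rosbag2_py.SequentialReader()
reader.open(
    rosbag2_py.StorageOptions(uri='test_run_bag', storage_id='sqlite3'),
    rosbag2_py.ConverterOptions(
        input_serialization_format='cdr',
        output_serialization_format='cdr',
    ),
)

# Map each topic name to its message type so payloads can be deserialized.
type_by_topic = {t.name: t.type for t in reader.get_all_topics_and_types()}

while reader.has_next():
    topic, raw, stamp_ns = reader.read_next()
    msg = deserialize_message(raw, get_message(type_by_topic[topic]))
    print(f'{stamp_ns} {topic}: {msg}')
```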
The result: faster, deeper, more flexible analysis
This approach allows us to:
Drastically reduce the time between testing and analysis.
Detect patterns and correlations that previously went unnoticed.
Share findings quickly with the entire team, even remotely.
An accelerator for development and validation
Autonomous driving and advanced driver assistance systems aren’t developed only in code; they evolve through fast iterations of testing, analysis, and adjustment. The shorter that cycle, the faster the technology advances.
Thanks to these visualization tools, we can make data-driven decisions in minutes, validate software changes the same day, and precisely document system behavior in every scenario. This not only speeds up our work but also gives us complete traceability for each test.
In short, this tool has become a cornerstone of our development process, enabling us to visualize the present to build the future of our technology.
Interested in following our progress?
Subscribe, follow us on social media, or contact us if you represent a municipality or operator and want to explore pilot projects in a rural environment.