Sensor Fusion and Tracking Toolbox™ includes algorithms and tools for the design, simulation, and analysis of systems that fuse data from multiple sensors to maintain position, orientation, and situational awareness. Reference examples provide a starting point for implementing components of airborne, ground-based, shipborne, and underwater surveillance, navigation, and autonomous systems.
The toolbox includes multi-object trackers, sensor fusion filters, motion and sensor models, and data association algorithms that let you evaluate fusion architectures using real and synthetic data. With Sensor Fusion and Tracking Toolbox you can import and define scenarios and trajectories, stream signals, and generate synthetic data for active and passive sensors, including RF, acoustic, EO/IR, and GPS/IMU sensors. You can also evaluate system accuracy and performance with standard benchmarks, metrics, and animated plots.
For simulation acceleration or desktop prototyping, the toolbox supports C code generation.
Learn about toolbox conventions for spatial representation and coordinate systems.
Model combinations of inertial sensors and GPS.
Fuse inertial measurement unit (IMU) readings to determine orientation.
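As a minimal sketch of IMU-only orientation estimation, the `imufilter` object fuses accelerometer and gyroscope readings into a quaternion orientation estimate. The sensor data below is an assumed stationary case (gravity along +Z, no rotation), not measured data:

```matlab
% Estimate orientation from accelerometer and gyroscope readings.
Fs = 100;                          % assumed IMU sample rate (Hz)
N  = Fs;                           % one second of data
accel = repmat([0 0 9.81], N, 1);  % stationary: gravity along +Z (m/s^2)
gyro  = zeros(N, 3);               % no rotation (rad/s)

fuse = imufilter('SampleRate', Fs);
orientation = fuse(accel, gyro);   % N-by-1 quaternion array

% Convert to Euler angles (degrees) for inspection.
eulerAngles = eulerd(orientation, 'ZYX', 'frame');
```

For sensors that also include a magnetometer, the `ahrsfilter` object plays the same role and additionally estimates heading.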
Use Kalman filters to fuse IMU and GPS readings to determine pose.
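One way to sketch IMU/GPS pose fusion is with the `insfilterMARG` extended Kalman filter: IMU samples drive the prediction step at a high rate, while lower-rate GPS fixes correct the state. The sample rates, sensor values, and noise covariances below are assumed for illustration:

```matlab
% Fuse IMU and GPS readings to estimate pose with an EKF.
imuFs = 100; gpsFs = 5;            % assumed sensor rates (Hz)
filt  = insfilterMARG('IMUSampleRate', imuFs);

accel = [0 0 9.81];                % one stationary IMU sample (m/s^2)
gyro  = [0 0 0];                   % (rad/s)
mag   = [27.5 -2.4 -16.0];         % assumed local magnetic field (uT)
lla   = [42.2825 -71.343 53.0];    % assumed GPS fix: lat, lon (deg), alt (m)

for k = 1:imuFs                    % one second of IMU data
    predict(filt, accel, gyro);    % EKF prediction from IMU
    if mod(k, imuFs/gpsFs) == 0    % GPS correction arrives at 5 Hz
        fusegps(filt, lla, eye(3), [0 0 0], eye(3));
    end
end
fusemag(filt, mag, eye(3));        % magnetometer correction for heading

[position, orientation, velocity] = pose(filt);
```

The identity matrices passed to `fusegps` and `fusemag` stand in for measurement noise covariances; in practice these come from the sensor specifications.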
You can build a complete tracking simulation using the functions and objects supplied in this toolbox.
You can define a tracking simulation by using the trackingScenario object.
Simulate target detections by radar sensors.
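The pieces above can be combined into a minimal end-to-end sketch: a `trackingScenario` holds the platforms, a `fusionRadarSensor` mounted on one platform generates detections of the other, and advancing the scenario steps the simulation. The update rate, positions, and velocity below are assumed values:

```matlab
% Build a scenario with a stationary radar tower and one moving target,
% then generate synthetic radar detections.
scenario = trackingScenario('UpdateRate', 1);   % assumed 1 Hz update

tower = platform(scenario);                      % stationary at the origin
tower.Sensors = fusionRadarSensor(1, 'UpdateRate', 1);

target = platform(scenario);
target.Trajectory = kinematicTrajectory( ...
    'Position', [1000 0 0], ...                  % 1 km ahead of the radar
    'Velocity', [-50 0 0]);                      % inbound at 50 m/s

while advance(scenario)
    dets = detect(scenario);   % cell array of objectDetection objects
end
```

The detections returned by `detect` can then be passed to a multi-object tracker such as `trackerGNN` to maintain confirmed tracks.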