Software Architecture

This year we have reworked the software architecture to make maintenance easier. At the highest level, the user interacts with the Mission Planner Interface. The mission planner is responsible for orchestrating the task nodes, which in turn interface with the vision pipeline and the navigation and control systems.

Control Panel Interface

The Bumblebee 3.5 control panel displays telemetry, camera images, and sonar images, aiding the monitoring of sensor and actuator data for system analysis during tethered deployments. The control panel is also used to control the vehicle in teleoperation mode.

The software stack includes a logging system that captures telemetry, video, and log messages during both tethered and autonomous operations. The software can replay logged data and stream it into the user interfaces, allowing post-mortem analysis of the mission and evaluation of the collected data.
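The replay path can be pictured as reading timestamped records and re-publishing them at their original pace. The sketch below is illustrative only (the record format, the `publish` callback, and the `speed` parameter are assumptions, not the actual logging API):

```python
import time
from typing import Callable, Iterable

def replay_log(records: Iterable[dict], publish: Callable[[dict], None],
               speed: float = 1.0) -> int:
    """Replay timestamped log records, pacing them by their original
    inter-arrival times (scaled by `speed`). Returns the record count."""
    count = 0
    prev_t = None
    for rec in records:
        t = rec["t"]  # record timestamp in seconds (assumed field name)
        if prev_t is not None and speed > 0:
            time.sleep(max(0.0, (t - prev_t) / speed))
        publish(rec)  # stream the record into a UI, e.g. a telemetry widget
        prev_t = t
        count += 1
    return count
```

A `speed` greater than 1.0 fast-forwards through long missions, which is convenient when scrubbing for a single event during post-mortem analysis.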

Mapping Interface

Apart from the control panel interface, the Bumblebee 3.5 vehicle has a mapping interface: in teleoperation mode, the operator can simply point and click on a map and the vehicle will move to the indicated location.

The mapping interface detects key sonar objects in the area and projects them onto the map, so the operator can directly correlate each sonar object with its real-world coordinates.
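Projecting a sonar contact onto the map amounts to transforming a range/bearing detection from the vehicle frame into the world frame using the vehicle's pose. A minimal planar sketch (the function name and a 2D pose are assumptions; the real system would also account for depth and sensor mounting offsets):

```python
import math

def sonar_to_world(veh_x: float, veh_y: float, veh_yaw: float,
                   rng: float, bearing: float) -> tuple:
    """Project a sonar detection (range, bearing relative to the vehicle
    heading, both in the vehicle frame) into world-frame coordinates
    suitable for plotting on the mapping interface."""
    angle = veh_yaw + bearing  # absolute direction to the contact
    return (veh_x + rng * math.cos(angle),
            veh_y + rng * math.sin(angle))
```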

Control Systems

BumbleBee 3.5 has six degrees of freedom: surge, sway, heave, roll, pitch and yaw. Six PID (Proportional Integral Derivative) control loops are tuned to control the vehicle's underwater dynamics and allow for highly precise station-keeping. For ease of tuning, a Control System User Interface lets the user quickly retune the controllers should the vehicle's buoyancy change.
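The per-axis control described above can be sketched as a standard discrete PID loop; the class below is a generic illustration, not the vehicle's actual controller (gain values, time step, and anti-windup handling are all tuning details omitted here):

```python
class PID:
    """One PID control loop; a vehicle like this runs six, one per DOF."""

    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint: float, measurement: float) -> float:
        """Return the control effort for one time step of length dt."""
        error = setpoint - measurement
        self.integral += error * self.dt
        if self.prev_error is None:
            deriv = 0.0  # no derivative on the first sample
        else:
            deriv = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv
```

Exposing `kp`, `ki`, and `kd` as live parameters is what makes a tuning UI practical: a buoyancy change mostly shifts the steady-state heave error, which the integral term absorbs once `ki` is retuned.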

Underwater Perception and Tracking

Deep Learning

Deep learning was also implemented for detection of more complex objects. With this newly introduced feature, our AUV achieved a mean average precision of 0.933 for detection of various underwater obstacles at RoboSub.

Sensor Fusion

The AUV is equipped with two machine vision cameras and an imaging sonar for forward and downward perception. Leveraging the complementary strengths of sonar and camera, the AUV is capable of highly robust perception and tracking of objects underwater.

Navigation Suite

The navigation sensor suite consists of a 9-axis Sparton IMU, a 6-axis STIM300 IMU, a DVL, and a barometric pressure depth sensor. An error-state Kalman filter fuses these measurements to obtain much higher accuracy than any single sensor can provide independently. The AUV navigation system is capable of accurate local and global navigation.
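The core idea behind the filter is that each measurement corrects the current estimate in proportion to their relative uncertainties. A full error-state Kalman filter is well beyond a short sketch, but the scalar measurement update below illustrates the principle (a deliberately simplified stand-in, not the vehicle's actual filter):

```python
def kalman_update(x: float, p: float, z: float, r: float):
    """Scalar Kalman measurement update: fuse the current estimate
    (mean x, variance p) with a measurement z of variance r."""
    k = p / (p + r)            # Kalman gain: trust split between sources
    x_new = x + k * (z - x)    # corrected estimate
    p_new = (1.0 - k) * p      # variance always shrinks after an update
    return x_new, p_new
```

Fusing two equally uncertain depth readings, for instance, halves the variance, which is why the combined DVL/IMU/depth estimate outperforms any one sensor.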

The current navigation system was evaluated over a 176 m course, with 1.62% drift measured for its DVL/IMU sensor fusion.

Autonomous Manipulation

With a highly accurate navigation suite and robust object perception and tracking, the AUV is capable of fully autonomous manipulation of objects. We have tested various manipulators, ranging from grabber arms to marker droppers to mini projectiles; the software on the Bumblebee AUV supports each of these manipulation types.

Mission Planner

Vehicle dynamics during an autonomous mission are fully controlled by the mission planner, which directs task nodes, controls trajectories between tasks, and manages mission time.

The highly modular software architecture complements the functionality of the mission planner. The mission planner's multi-threaded structure allows simultaneous execution of mission tasks and watch states that keep track of mission and task statuses. The mission planner also manages contingency states, allowing recovery via waypoints saved during the mission.
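One way to picture the task/watch-state pairing is a task running in its own thread while a watch state waits on its completion with a time budget; on timeout, control falls through to a contingency. This is an illustrative sketch only (the function and its timeout semantics are assumptions about the design, not the planner's actual code):

```python
import threading

def run_task_with_watch(task, timeout: float) -> bool:
    """Run a mission task in its own thread while a watch state tracks
    completion. Returns True if the task finished within `timeout`,
    False if the watch state should trigger a contingency instead."""
    done = threading.Event()

    def worker():
        task()       # the task node's blocking behaviour
        done.set()   # signal the watch state

    threading.Thread(target=worker, daemon=True).start()
    return done.wait(timeout)  # watch state: success, or time's up
```

Because each watch state only holds an `Event` and a deadline, many task/watch pairs can run concurrently without the planner's main loop ever blocking on a single stalled task.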

Mission runs can be dynamically built from user input, providing an option to test task nodes independently in addition to a full mission test.