Software Architecture

Bumblebee’s software system consists of three layers. At the application layer, it provides the Human Machine Interface and the various software interfaces. At the middleware level, it comprises the mission planner, the task modules, the control systems and the computer vision suite. At the lowest level, it contains the various hardware drivers. The software system runs on Linux and uses the Robot Operating System (ROS) as its message-passing interface.
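The ROS-style publish/subscribe pattern that connects these layers can be sketched, in miniature, as a toy topic bus. This is purely illustrative: the class, method and topic names below are hypothetical and are not the actual rospy API.

```python
from collections import defaultdict

class TopicBus:
    """Toy publish/subscribe broker illustrating ROS-style message passing."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        # A node registers interest in a topic.
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # A node publishes; every subscriber's callback receives the message.
        for callback in self._subscribers[topic]:
            callback(message)

bus = TopicBus()
received = []
bus.subscribe("/nav/depth", received.append)   # a middleware node listens
bus.publish("/nav/depth", {"depth_m": 1.5})    # a driver-layer node publishes
```

In the real system, ROS provides this decoupling across processes, which is what lets the driver, middleware and application layers evolve independently.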

Control Panel Interface

The Bumblebee 3.0 control panel displays telemetry information together with camera and sonar images, aiding the monitoring of sensor and actuator data for system analysis during operational deployments when connected over the tether. The control panel is also used to control the vehicle in teleoperation mode.

The software stack includes a logging system that captures telemetry, video and log messages during both tethered and autonomous operations. The software can replay logged data and stream it into the user interfaces, allowing post-mortem analysis of a mission and evaluation of the collected data.
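The record-and-replay idea can be shown with a minimal in-memory log. This is a sketch only; the actual logging system presumably records to disk (e.g. ROS bag files), and the field names here are invented.

```python
import io
import json

class TelemetryLog:
    """Toy append-only telemetry log supporting replay of recorded entries."""
    def __init__(self):
        self._buffer = io.StringIO()

    def record(self, timestamp, payload):
        # Append one timestamped entry as a JSON line.
        self._buffer.write(json.dumps({"t": timestamp, "data": payload}) + "\n")

    def replay(self):
        # Yield entries back in recorded order for post-mortem analysis.
        for line in self._buffer.getvalue().splitlines():
            yield json.loads(line)

log = TelemetryLog()
log.record(0.0, {"yaw_deg": 90.0})
log.record(0.5, {"yaw_deg": 92.5})
entries = list(log.replay())
```

Replaying through the same interfaces used live means the operator's analysis tools need no special offline mode.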

Mapping Interface

Apart from the control panel interface, the Bumblebee 3.0 vehicle has a mapping interface: in teleoperation mode, the operator can simply point and click on a map and the vehicle will move to the indicated location.

The mapping interface detects key sonar objects in the area and projects them onto the map, so the operator can directly correlate a sonar object with its real-world coordinates.
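The point-and-click conversion amounts to a map-frame transform. A minimal sketch, assuming a north-aligned map with a known origin and scale (the function and parameter names are illustrative, not from the actual interface):

```python
def map_click_to_world(px, py, origin_x, origin_y, metres_per_pixel):
    """Convert a map-pixel click (px, py) into world-frame coordinates,
    given the map origin in metres and the map scale."""
    return (origin_x + px * metres_per_pixel,
            origin_y + py * metres_per_pixel)

# A click 200 px east and 50 px north of the origin, at 0.1 m/pixel:
goal = map_click_to_world(200, 50, origin_x=0.0, origin_y=0.0,
                          metres_per_pixel=0.1)
```

The resulting world-frame goal can then be sent to the navigation system as a waypoint.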

Control Systems

BumbleBee 3.0 has six degrees of freedom: surge, sway, heave, roll, pitch and yaw. Six PID (proportional-integral-derivative) control loops are tuned to control the vehicle’s underwater dynamics and allow for highly precise station-keeping. For ease of tuning, a Control System User Interface lets the user quickly retune the controllers should the vehicle’s buoyancy change.
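Each of the six loops follows the standard PID law. A minimal sketch of one such loop is below; the gains and the heave example are placeholders, not the tuned vehicle values.

```python
class PID:
    """Textbook PID controller: output = kp*e + ki*integral(e) + kd*de/dt."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        # No derivative term on the first sample (no previous error yet).
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# One heave (depth) loop at a 10 Hz update rate, with illustrative gains:
heave = PID(kp=2.0, ki=0.1, kd=0.5)
thrust = heave.update(setpoint=1.0, measurement=0.8, dt=0.1)
```

Exposing kp, ki and kd through the tuning interface is what makes a quick retune after a buoyancy change practical at the poolside.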

Underwater Perception and Tracking

The AUV is equipped with two machine vision cameras and an imaging sonar for forward and downward perception. Leveraging the complementary strengths of sonar and camera, the AUV is capable of highly robust perception and tracking of objects underwater.
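One simple way the two modalities complement each other: the camera gives an accurate bearing to an object while the sonar gives an accurate range, and the two combine into a relative position fix. A heavily simplified sketch (the function name and frame convention are assumptions, not the vehicle's actual perception pipeline):

```python
import math

def fuse_detection(bearing_deg, range_m):
    """Combine a camera bearing (degrees, 0 = dead ahead, positive to port)
    with a sonar range (metres) into a relative (forward, lateral) fix."""
    theta = math.radians(bearing_deg)
    return (range_m * math.cos(theta), range_m * math.sin(theta))

# An object seen dead ahead by the camera, 3 m away on the sonar:
x, y = fuse_detection(bearing_deg=0.0, range_m=3.0)
```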

Navigation Suite

The navigation sensor suite consists of a nine-axis Sparton IMU, a six-axis STIM300 IMU, a DVL and a barometric pressure depth sensor. An error-state Kalman filter fuses these measurements to obtain much higher accuracy than any single sensor can provide. The AUV navigation system is capable of performing accurate local and global navigation.
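A full error-state Kalman filter is beyond a short example, but the fusion principle it relies on, blending a motion prediction with a measurement weighted by their respective uncertainties, can be shown in one dimension. All variances below are illustrative, and this scalar filter is a drastic simplification of the vehicle's actual estimator.

```python
class ScalarKalman:
    """One-dimensional Kalman filter illustrating prediction/measurement fusion."""
    def __init__(self, x0, p0, process_var, meas_var):
        self.x, self.p = x0, p0          # state estimate and its variance
        self.q, self.r = process_var, meas_var

    def predict(self, u=0.0):
        # Propagate the state by a control/odometry input; uncertainty grows.
        self.x += u
        self.p += self.q

    def update(self, z):
        # Weight the measurement by the Kalman gain; uncertainty shrinks.
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x

# Predict 1 m of travel (e.g. from the DVL), then correct with a noisy fix:
kf = ScalarKalman(x0=0.0, p0=1.0, process_var=0.01, meas_var=0.5)
kf.predict(u=1.0)
estimate = kf.update(z=1.2)   # lands between the prediction and the measurement
```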

The current navigation system was evaluated over a 176 m course, with 1.62% drift measured from its DVL/IMU sensor fusion.
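Those figures imply an end-point position error of roughly 2.85 m:

```python
distance_m = 176.0
drift_percent = 1.62
drift_m = distance_m * drift_percent / 100.0   # 176 * 0.0162 = 2.8512 m
```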

Autonomous Manipulation

With a highly accurate navigation suite and robust object perception and tracking, the AUV is capable of fully autonomous manipulation of objects. We have tested various manipulators, ranging from grabber arms to marker droppers to mini projectiles; the software on the Bumblebee AUV supports these different types of manipulation.

Mission Planner

Vehicle dynamics during an autonomous mission are fully controlled by the mission planner, which directs the task nodes, controls the trajectory between tasks and manages mission time.

The highly modular software architecture complements the functionality of the mission planner. The mission planner’s multi-threaded structure allows simultaneous execution of mission tasks and watch states that keep track of mission and task statuses. The mission planner also manages contingency states, allowing recovery via waypoints saved during the mission.
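The contingency behaviour can be sketched as a task loop that, on failure, falls back to a waypoint saved before the task rather than aborting the mission. The task and waypoint names below are invented for illustration.

```python
def run_mission(tasks, saved_waypoints):
    """Run (name, callable) tasks in order; on failure, record a recovery
    to the waypoint saved for that task instead of aborting."""
    history = []
    for name, task in tasks:
        try:
            task()
            history.append((name, "done"))
        except RuntimeError:
            # Contingency state: return to the waypoint saved for this task.
            history.append((name, "recover@" + saved_waypoints[name]))
    return history

def ok():
    pass

def fail():
    raise RuntimeError("lost target")

history = run_mission([("gate", ok), ("buoy", fail)],
                      {"gate": "wp0", "buoy": "wp1"})
```

In the real planner, the watch states run in parallel threads and can trigger these contingencies mid-task; the sketch shows only the recovery decision itself.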

Mission runs can be dynamically built from user input, providing an option to test task nodes independently in addition to a full mission test.
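Building a run dynamically reduces to resolving a user-supplied task list against a registry of task nodes. A sketch with hypothetical task names (the real planner's input format and registry are not specified here):

```python
# Hypothetical registry mapping task names to task-node callables.
TASK_REGISTRY = {
    "gate": lambda: "pass through gate",
    "buoy": lambda: "bump buoy",
    "bin":  lambda: "drop marker in bin",
}

def build_run(spec):
    """Turn a user spec like 'gate, buoy' into an ordered list of task nodes,
    rejecting any name not in the registry."""
    names = [s.strip() for s in spec.split(",") if s.strip()]
    unknown = [n for n in names if n not in TASK_REGISTRY]
    if unknown:
        raise ValueError("unknown tasks: " + ", ".join(unknown))
    return [(n, TASK_REGISTRY[n]) for n in names]

run = build_run("gate, buoy")   # a single-task spec tests one node in isolation
```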