CUAUV's Software System
This year, CUAUV's software system saw several modifications to improve functionality and usability, along with maintenance of the existing code base. Improvements were made to the vision processing, mission, and communication infrastructure to improve reliability and adaptability in multiple environments.
Tachyon's software infrastructure distributes vehicle state across a network of computers, has adaptable infrastructure for different environments and missions, and enables rapid and reliable code design and implementation. These three features all contribute to the usability of Tachyon as a platform, as they ensure that all users can interact directly with the vehicle, that the vehicle will work regardless of where it is being deployed, and that new missions can be developed quickly as the need arises. Software functionality is implemented with daemon processes that communicate through shared memory, with sensor data/actuator output being handled by progressively higher levels of abstraction. The highest level of abstraction is mission control (for AUV mode) or user control (for ROV mode).
The mission planner sits on top of all other software subsystems to control mission execution. It is what allows a user to write complicated missions and how Tachyon is able to run the sophisticated, multi-threaded missions required by the AUVSI/ONR AUV competition. The mission planner is built upon two subsystems: a planner and a task subsystem. The planner schedules and executes the task blocks that make up a mission. This structure allows incredibly rich, dynamic missions to be written quickly, with the planner system taking care of many of the details that would otherwise have to be encoded in a more procedural system.
The mission planner is a tree-walking, multi-threaded program written in Python that instantiates each element of the user-given task list, allows tasks to add sub-tasks, and executes these sub-tasks when their turn arrives. The planner is always running in the background, ready to cull completed tasks and notify tasks further down the line that it is their turn to run. The planner also ensures that exclusive tasks (such as movement primitives) run only one at a time, and that each task is run at a regular interval so that tasks can use time-based accounting if desired.
Each task is allowed to give the planner a list of “dependencies”, which are simply aliases to shared variables. The planner monitors the requested lists of shared variables in separate threads and notifies the tasks to run when their “dependencies” change. General feedback motion-primitive tasks have been developed to abstract approaching and hovering over mission elements out of the mission-element-specific tasks. This greatly reduces code duplication and allows for the rapid development and tuning of new tasks without having to worry about the physical response of the vehicle, which is handled by these motion primitives.
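The dependency mechanism can be sketched as a condition-variable watcher: a planner thread blocks on a shared variable and notifies the dependent task when the value changes. This is a minimal, single-variable illustration with hypothetical names (`SharedVar`, `Task`), not the production planner:

```python
import threading

class SharedVar:
    """Minimal stand-in for one shared-memory variable (hypothetical API)."""
    def __init__(self, value=None):
        self._value = value
        self._cond = threading.Condition()

    def set(self, value):
        with self._cond:
            self._value = value
            self._cond.notify_all()   # wake any watching planner threads

    def get(self):
        with self._cond:
            return self._value

    def wait_for_change(self, old, timeout=None):
        with self._cond:
            self._cond.wait_for(lambda: self._value != old, timeout)
            return self._value

class Task:
    """A task declares its 'dependencies' as a list of shared variables."""
    def __init__(self, name, dependencies):
        self.name = name
        self.dependencies = dependencies
        self.runs = 0

    def on_change(self, var, value):
        self.runs += 1   # a real task would re-plan or issue motion commands

def watch(task, var):
    """One planner watcher thread: block until the dependency changes."""
    old = var.get()
    new = var.wait_for_change(old, timeout=1.0)
    task.on_change(var, new)

depth = SharedVar(0.0)
task = Task("dive", [depth])
watcher = threading.Thread(target=watch, args=(task, depth))
watcher.start()
depth.set(2.5)        # e.g. a sensor daemon updates the shared variable
watcher.join()
```

In the real system each requested variable gets its own monitoring thread, so a task with several dependencies is woken whenever any of them changes.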
HSV and CIELUV Processing on 2009 recovery object
The mission set forth by the AUVSI/ONR AUV competition includes a number of visual object recognition tasks. Tachyon’s vision system is written in C++ and uses the open source OpenCV and libdc1394 libraries. The vision system architecture allows the mission to enable and disable the various vision algorithms whenever visual data is required for a mission element. An integrated vision daemon uses multithreading to efficiently capture images from cameras, video files, and image directories and provides a modular framework for multithreaded vision processing algorithms. Images are passed to the different processing algorithms in memory using the module framework.
The majority of the machine vision modules rely on color-based segmentation, in which the input image is converted from the RGB colorspace into the HSV and CIELUV colorspaces, which are then split into their respective component channels. Each of these channels is segmented through predetermined thresholds, and the six segmented channels are recombined to form a binary image. Contours are detected in the binary image, and these contours are then run through a set of probabilistic filters and moment analyses. For some of the mission elements, other processing techniques such as Canny edge detection, Sobel gradients, polygon approximation, Hough transforms, and invariant Hu moments are used to provide better performance and avoid some of the issues with color-dependent vision code that the team has encountered in the past, including illuminant metameric failure. All of the vision modules provide the location, orientation, size, and probability for a specific mission element.
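The channel-splitting and recombination step can be illustrated with a small NumPy sketch. The production modules are written in C++ with OpenCV (e.g. `cv::inRange` and `cv::findContours`); the threshold values here are hypothetical, and the contour-detection stage is omitted:

```python
import numpy as np

# Hypothetical per-channel (min, max) thresholds, as produced by the tuning tools.
HSV_THRESH = [(20, 40), (100, 255), (100, 255)]    # H, S, V
LUV_THRESH = [(0, 255), (0, 255), (0, 255)]        # L, U, V (pass-through here)

def segment(channels, thresholds):
    """AND together the per-channel binary masks into one binary image."""
    mask = np.ones(channels[0].shape, dtype=bool)
    for ch, (lo, hi) in zip(channels, thresholds):
        mask &= (ch >= lo) & (ch <= hi)
    return mask

# Tiny synthetic 2x2 "image", already split into its component channels.
h = np.array([[30, 10], [25, 90]])
s = np.full((2, 2), 200)
v = np.full((2, 2), 180)
luv = [np.full((2, 2), 128)] * 3

# Recombine all six segmented channels into the final binary image;
# contours would then be extracted from this mask.
binary = segment([h, s, v], HSV_THRESH) & segment(luv, LUV_THRESH)
```

Only pixels that pass every channel threshold survive into the binary image, which is what makes the dual HSV/CIELUV segmentation more selective than thresholding a single colorspace.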
Vision Tuning Tools
Threshold Tuning Tool: after the user selects an area of the image, it generates suitable HSV and CIELUV threshold values for that region.
The time efficiency and ease of the tuning process for color-based segmentation (i.e. thresholds, morphology operations, etc.) have been significantly improved through the development of several custom tools. One tuning module allows a user to highlight an object in an image and uses the image and mouse input to generate and analyze intensity histograms for the six HSV and CIELUV channels to generate ideal thresholding parameters. The second module provides a GUI with sliders and frequently updated image-processing outputs, which allows for the calibration of vision tuning parameters in real time. Additionally, a generalized object recognition module and tuning utility have been developed to simplify the generation of probabilistic image descriptor data sets.
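The core of the histogram-based tool — choosing thresholds that cover most of the highlighted pixels' channel values — can be sketched as a simple percentile computation. The function name and `coverage` parameter are illustrative, not the tool's actual interface:

```python
def thresholds_from_selection(samples, coverage=0.90):
    """Return (lo, hi) thresholds spanning the central `coverage`
    fraction of the channel values sampled from a highlighted region."""
    vals = sorted(samples)
    tail = (1.0 - coverage) / 2.0          # fraction to discard at each end
    lo = vals[round(tail * (len(vals) - 1))]
    hi = vals[round((1.0 - tail) * (len(vals) - 1))]
    return lo, hi

# e.g. hue values sampled from a user-highlighted object (synthetic here)
hue_samples = list(range(0, 101))
lo, hi = thresholds_from_selection(hue_samples, coverage=0.90)
```

Running this once per channel yields the six `(lo, hi)` pairs that the segmentation stage consumes, while trimming the histogram tails keeps a few outlier pixels in the selection from blowing the thresholds wide open.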
Control Helm, used to control the vehicle and adjust PID control parameters
Depth, heading, pitch, gyro, and accelerometer data are used for control on Tachyon and are obtained from a variety of sensors. Depth is measured using an MSI UltraStable-300 pressure transducer read by a microcontroller through an external ADC. A MicroStrain 3DM-GX1 orientation sensor uses gyros and accelerometers to provide angular rate and acceleration data, and an OceanServer compass provides heading and pitch data. These data are passed to the Kalman Filter that the controller uses for velocity, pitch, and heading control.
A Kalman Filter fuses this sensor data in real time on the vehicle. This filter combines higher frequency sensor observations of acceleration and angular rate with model-based prediction and then with slower direct observations of velocity and orientation. The output of the Kalman Filter feeds into the vehicle's five PID controllers. Finally, the controller output is reconciled with the limitations of the thrusters to generate a closed-loop controller. The vehicle’s six thrusters allow for closed-loop control of five degrees of freedom: surge, sway, heave, pitch, and yaw. Velocity data from the DVL provides the option of running either open or closed loop velocity control in the surge and sway directions, enabling hovering. Tachyon's more uniform control parameters make it possible to use PID control for all five degrees of freedom.
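A single axis of the closed-loop controller can be sketched as a PID loop whose output is saturated against the thruster limits; the gains and limits below are illustrative, not the vehicle's tuned parameters:

```python
class PID:
    """One axis of control; the vehicle runs five of these
    (surge, sway, heave, pitch, yaw) on Kalman-filtered state."""
    def __init__(self, kp, ki, kd, out_limit):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_limit = out_limit          # thruster saturation bound
        self.integral = 0.0
        self.prev_error = None

    def step(self, error, dt):
        """Advance one control tick; error = setpoint - filtered estimate."""
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * deriv
        # reconcile the ideal output with the limits of the thrusters
        return max(-self.out_limit, min(self.out_limit, out))

heading = PID(kp=1.5, ki=0.0, kd=0.0, out_limit=5.0)
thrust = heading.step(2.0, 0.1)   # 3.0: purely proportional with these gains
```

The saturation step is what turns five independent PID outputs into commands the six physical thrusters can actually deliver.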
Shared State System
Shared memory system diagram
The Shared State system provides a centralized interface for communicating the state of the vehicle between all processes running on the vehicle, all of the vehicle's electronics, and all users controlling the vehicle. It is a proprietary system built upon POSIX shared memory, providing thread- and process-safe variable updating and notification. It is responsible for storing the vehicle's state in a collection of type-aware variables. These shared variables can be accessed by all of the components of the software system, which allows for simple communication among the various daemons. Additionally, using the new serial system, this state can also be shared across the vehicle's custom electronics. The firmware in our custom hardware can read and write what appear to be local variables, and these state changes are reflected across the entire submarine. This data can be read by the controller or mission software, which can in turn send new commands to the hardware by once again updating the shared state system.
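The idea can be sketched with Python's standard `multiprocessing.shared_memory`: a fixed layout maps each named, typed variable to an offset in a shared segment, which any process can attach to by name. This omits the real system's locking and change notification, and the variable layout is a hypothetical example:

```python
import struct
from multiprocessing import shared_memory

# Hypothetical layout: variable name -> (byte offset, struct format).
LAYOUT = {"depth": (0, "d"), "heading": (8, "d"), "enabled": (16, "?")}
SIZE = 17

class SharedState:
    """Type-aware variables backed by a POSIX shared-memory segment."""
    def __init__(self, name=None, create=False):
        self.shm = shared_memory.SharedMemory(name=name, create=create, size=SIZE)

    def set(self, var, value):
        off, fmt = LAYOUT[var]
        struct.pack_into(fmt, self.shm.buf, off, value)

    def get(self, var):
        off, fmt = LAYOUT[var]
        return struct.unpack_from(fmt, self.shm.buf, off)[0]

writer = SharedState(create=True)          # e.g. the sensor daemon
writer.set("depth", 3.2)
reader = SharedState(name=writer.shm.name) # another process attaches by name
d = reader.get("depth")

reader.shm.close()
writer.shm.close()
writer.shm.unlink()
```

Because every daemon attaches to the same segment, a write by one process is immediately visible to all the others without any explicit message passing.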
Unified Serial Daemon
CUAUV previously used a partially standardized serial protocol that defined lightweight packets, error handling, and device identification. Since there was no standardized instruction set for interacting with the custom boards, each board required its own individually written software driver on the main computer. To make this easier, the team designed a new serial protocol with standardized instructions. As a result, a single configurable daemon on the computer is able to communicate with all conformant boards, and changes are handled by simply editing a configuration file instead of recompiling an entire independent piece of software. The protocol and daemon also offer a variety of built-in functionality, such as board identification, heartbeat monitoring, and the ability to perform bitwise logical operations on variables. The new system allows us to expand the shared state system off the central computer and onto the custom boards. Changes to shared memory appear as local variables on these boards transparently. This has significantly reduced the development time for our custom firmware.
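A config-driven packet encoder illustrates the approach: the register map below stands in for the daemon's configuration file, so exposing a new board variable means editing a table rather than writing and recompiling a driver. The instruction code, register numbers, and checksum scheme are all hypothetical, not CUAUV's actual protocol:

```python
import struct

# Hypothetical register map, as it might be loaded from the daemon's config file.
THRUSTER_BOARD = {
    "port_speed": {"reg": 0x10, "fmt": "h"},   # signed 16-bit
    "stbd_speed": {"reg": 0x11, "fmt": "h"},
    "heartbeat":  {"reg": 0x00, "fmt": "B"},   # unsigned 8-bit
}

WRITE = 0x02   # hypothetical standardized 'write register' instruction

def encode_write(board_map, var, value):
    """Build a standardized packet: [instruction, register, payload, checksum]."""
    entry = board_map[var]
    payload = struct.pack("<" + entry["fmt"], value)     # little-endian payload
    body = bytes([WRITE, entry["reg"]]) + payload
    checksum = sum(body) & 0xFF                          # simple additive checksum
    return body + bytes([checksum])

pkt = encode_write(THRUSTER_BOARD, "port_speed", 255)
```

Since every conformant board interprets the same instruction set, one generic daemon can drive all of them; only the register table differs per board.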
Another key change on Tachyon is the addition of bootloader capabilities to all programmable electronics boards. This allows firmware to be updated in-system, directly from the vehicle's computer.
The vehicle abstraction layer (VAL) creates an abstract Vehicle object in Python by interfacing with the shared variable system. This Vehicle object not only has access to physical sensors which write data to shared memory, but can also create virtual hybrid sensors that combine data from multiple real or simulated sources. This feature allows the creation of new virtual sensors such as "water depth" by combining data from the real depth and altitude sensors. It also allows mission code to be tested using data from a simulator which writes data to virtual sensors.
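A minimal sketch of such an abstraction, with hypothetical names: physical sensors read from shared state, while a virtual sensor such as water depth is composed from them and exposed through the same interface:

```python
class Sensor:
    """Uniform read-only view of one data source (hypothetical API)."""
    def __init__(self, read):
        self._read = read

    @property
    def value(self):
        return self._read()

class Vehicle:
    """VAL-style vehicle: physical and virtual sensors look identical."""
    def __init__(self):
        self.sensors = {}

    def add_sensor(self, name, read):
        self.sensors[name] = Sensor(read)

    def add_virtual(self, name, combine, *sources):
        # A virtual sensor re-reads its source sensors on every access.
        self.sensors[name] = Sensor(
            lambda: combine(*(self.sensors[s].value for s in sources)))

vehicle = Vehicle()
state = {"depth": 1.5, "altitude": 8.5}     # stand-in for shared memory
vehicle.add_sensor("depth", lambda: state["depth"])
vehicle.add_sensor("altitude", lambda: state["altitude"])
# water depth = depth below surface + altitude above the bottom
vehicle.add_virtual("water_depth", lambda d, a: d + a, "depth", "altitude")

wd = vehicle.sensors["water_depth"].value   # 10.0 with the values above
```

Because mission code only ever sees `Sensor` objects, a simulator can substitute its own read functions and the mission runs unchanged.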