Visual Tracking on Pixhawk

== System Overview ==

The project consists of:

  • A quadrotor equipped with a Pixhawk (the drone)
  • A Linux computer connected to an HD webcam and a 3DR telemetry radio
  • Software running on the Linux computer that tracks the drone and sends its position over the telemetry link
  • Modified Pixhawk firmware with additional sensor inputs added to the ekf2 module

The quadrotor has an Aruco marker (http://docs.opencv.org/master/d9/d6d/tutorial_table_of_content_aruco.html) on its underside. The marker is detected by a Linux computer running C++ code built on the OpenCV library, which sends the position to the Pixhawk over MAVLink using the ATT_POS_MOCAP message.

== Installation Instructions ==

=== Default Pixhawk Firmware ===

For instructions on setting up the required toolchain and building the standard Pixhawk firmware, follow the tutorial at http://dev.px4.io/starting-installing.html

The Pixhawk team uses Git to synchronize their work. If you are not familiar with Git, now is a good time to learn it.

The Pixhawk firmware is a rather large project. The main part of the software is located at https://github.com/PX4/Firmware.git, but some parts reside in so-called Git submodules.

To get a general idea of how to write custom modules (programs) for the Pixhawk, I suggest reading through the examples at http://dev.px4.io/
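
As a taste of what a minimal module looks like, here is a sketch that mirrors the px4_simple_app example from those docs (the module name is made up, and the module still has to be added to the cmake build list to be compiled in):

#include <px4_config.h>
#include <px4_posix.h>

__EXPORT int hello_tracker_main(int argc, char *argv[]);

int hello_tracker_main(int argc, char *argv[])
{
	/* PX4_INFO prints to the console and the system log */
	PX4_INFO("Hello from a custom module!");
	return OK;
}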


=== Updated Pixhawk Firmware ===

The important changes in the Pixhawk firmware are located at:

<blockquote>src/lib/ecl - this is the submodule for the EKF source code (and some other modules). It lives in a submodule because it is meant as a portable estimator that can be used in other projects as well.</blockquote>

<blockquote>src/modules/ekf2 - this is a wrapper for the EKF mentioned above. It passes data from uORB topics to the filter and back.</blockquote>

To fetch the code I developed and set up the correct submodule remote for the EKF, run the following commands:

mkdir Pixhawk
cd Pixhawk
git clone https://github.com/skrogh/Firmware.git
cd Firmware
git remote add upstream https://github.com/PX4/Firmware.git
git checkout mocap-ekf2
Tools/check_submodules.sh
cd src/lib/ecl
git remote rename origin upstream
git remote add origin https://github.com/skrogh/ecl.git
git fetch origin
git checkout mocap-ekf2
cd ../../..
make px4fmu-v2_default

This will:

  • Make a new directory for the project
  • Enter it
  • Clone my fork of the repository. This includes updated wrappers for the filter and some changes to enable Mikrokopter drivers
  • Enter the repository
  • Add a remote to the original repository. This can later be used to update the software with changes made by the PX4 team
  • Switch to the mocap-ekf2 branch, where all work on motion capture input has been done
  • Run Tools/check_submodules.sh, a script that ensures submodules are up to date. We run it to fetch all submodules
  • Go into the submodule for the estimator
  • Rename the original "origin" remote to "upstream" (these are just names, but it keeps the naming consistent)
  • Add a new remote pointing to the version of this repo I made my changes in
  • Fetch those changes
  • Switch to the mocap-ekf2 branch again, since work here was also done on that branch
  • Go back to the top level
  • Build the default configuration

=== Pixhawk Setup ===

==== General Advice ====

Not all settings for the Pixhawk are stored in the firmware. A lot of parameters that affect the Pixhawk can be set either from a ground control station or via the NuttX shell.

It is recommended to connect to the NuttX shell through the serial port rather than the USB port. This will also give you debug info during boot.

When the Pixhawk boots, the script ROMFS/px4fmu_common/init.d/rcS is launched. Based on parameters and files located on the SD card, the Pixhawk starts the appropriate modules (flight controllers, estimators, sensor drivers, etc.).

The boot script does something like this:

  1. If the file etc/rc.txt exists on the SD card, it is run instead of the autostart routine (steps 2-6).
  2. If the SYS_AUTOSTART parameter is set, an (automatically generated) script is run that sets parameters associated with the chosen robot (e.g., for SYS_AUTOSTART=4001, https://github.com/PX4/Firmware/blob/master/ROMFS/px4fmu_common/init.d/4001_quad_x is run).
  3. If the file etc/config.txt exists on the SD card, this script is run, allowing you to overwrite parameters for the robot without creating your own custom configuration.
  4. Sensor drivers are started along with output drivers (PWM and/or Mikrokopter if chosen).
  5. Most SYS_AUTOSTART configurations (if not all?) set the VEHICLE_TYPE variable. This chooses which modules are started for the flight controller and estimator(s). (Note the difference between parameters and variables: parameters are saved in flash and persist across power-off; they are also available in C/C++ code.)
  6. If the file /etc/extras.txt exists, it is run as the last part of the autostart.
  7. A NuttX shell is opened over USB if no SD card is present; otherwise MAVLink is started on the USB port. (On my fork a NuttX shell will always be present on USB; to start MAVLink for a ground control connection, type mavlink start -r 800000 -d /dev/ttyACM0 -m config -x in the NuttX shell.)

==== Vision Tracking Specific Settings ====

Instead of making custom startup scripts, we reuse the existing ones and set the correct parameters.

On the SD card, create the file etc/config.txt:

# EKF2 Setup
echo "Disabling INAV"
param set INAV_ENABLED 0
echo "Disabling LPE"
param set LPE_ENABLED 0
echo "This makes the quad use ekf2"
# Mikrokopter setup
echo "Setting MikroKopter BL-ctl settings"
set OUTPUT_MODE mkblctrl
set MKBLCTRL_MODE x

This tells the Pixhawk to disable the separate attitude and position estimators and use the combined ekf2 estimator instead. Furthermore, the Mikrokopter motor driver is enabled. Set SYS_AUTOSTART=4001 for a standard x-quadrotor configuration.

In the ekf2 module, the following parameters have been added:

EKF2_USE_MOCAP (default: 0) - use motion capture data? (if 0, GPS is used)
EKF2_USE_PREDICT (default: 1) - use IMU measurements to propagate the delayed state estimate to hide delay (set to 0 for debugging)
EKF2_MO_x_NOISE - variance of the x position measurement
EKF2_MO_y_NOISE - variance of the y position measurement
EKF2_MO_z_NOISE - variance of the z position measurement
EKF2_MO_r_NOISE - variance of the roll measurement (set high; otherwise you have to align the Pixhawk with the marker very carefully)
EKF2_MO_p_NOISE - variance of the pitch measurement (set high; otherwise you have to align the Pixhawk with the marker very carefully)
EKF2_MO_h_NOISE - variance of the heading measurement
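
For reference, PX4 parameters like these are declared in the module's sources with the PARAM_DEFINE macros. A sketch of what the declarations might look like (the default values shown here are placeholders, not the values used in my fork):

#include <systemlib/param/param.h>

/* Use motion capture data? (0 = use GPS) */
PARAM_DEFINE_INT32(EKF2_USE_MOCAP, 0);

/* Variance of the mocap x position measurement [m^2] (placeholder default) */
PARAM_DEFINE_FLOAT(EKF2_MO_x_NOISE, 0.01f);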

==== Ground Station ====

QGroundControl is the recommended tool for visualizing onboard data in real time and setting parameters. It is available from http://qgroundcontrol.org/

APM Mission Planner is another common ground station. I had some issues with QGroundControl not showing all parameters, but Mission Planner did show them. Mission Planner is, however, made with a different flight stack in mind (flight controller and estimators).

==== 3DR Radio Notes ====

The radios used for telemetry are configured with the default settings. While the air rate in this mode is rather slow, it also gives much better noise immunity.

The radio settings can be set from APM Mission Planner: http://planner.ardupilot.com/wiki/common-3dr-radio-version-2/

=== Installing Tracker software ===

Build and install OpenCV 3.1 according to: http://docs.opencv.org/3.1.0/d7/d9f/tutorial_linux_install.html

Remember to build with the contributed (opencv_contrib) modules, as the aruco module lives there.

Clone the repository https://github.com/skrogh/droneTracker.git to where you want the tracker, then build it:

mkdir tracker
cd tracker
git clone https://github.com/skrogh/droneTracker.git
cd droneTracker
./getDependencies.sh
cmake .
make

getDependencies.sh might take a while, as it has to download and unpack the Eigen library.

== User Instructions ==

=== Coordinate systems ===

The tracking program can take a settings file giving the position and rotation of the camera in the world NED coordinate system. If this file is not supplied, the default is that the camera is positioned at the origin of the world coordinate system, rotated so that:

{| class="wikitable"
!colspan="2"|Axis Alignment
|-
!World Frame
!Camera Frame
|-
|North
| -y
|-
|East
| -x
|-
|Down
| -z
|}
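
Expressed as a rotation matrix, the default mapping in the table takes a point in the camera frame to world NED as follows (a minimal sketch using Eigen, which the tracker build already downloads; the function name is mine):

#include <Eigen/Dense>

// Default camera-to-world rotation from the table:
// North = -y_cam, East = -x_cam, Down = -z_cam
Eigen::Vector3d cameraToNed(const Eigen::Vector3d &p_cam)
{
    Eigen::Matrix3d R;
    R <<  0, -1,  0,   // North
         -1,  0,  0,   // East
          0,  0, -1;   // Down
    return R * p_cam;
}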

=== Aruco Markers ===

The Aruco marker package is documented at http://docs.opencv.org/master/d9/d6d/tutorial_table_of_content_aruco.html

This includes some rather nice examples (these will be in your OpenCV build folder, if you built them).

The marker used for the current setup is included in the git repo as a .png. It is ID 0 from the dictionary DICT_4X4_50. It should be printed at a size of 10 cm x 10 cm.
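
A minimal detection-and-pose sketch against this marker with OpenCV's aruco module (signatures as in the 3.x contrib docs and possibly slightly different between versions; the calibration key names are assumptions about how the yml was written):

#include <opencv2/opencv.hpp>
#include <opencv2/aruco.hpp>

int main()
{
    cv::VideoCapture cap(0);
    cv::Ptr<cv::aruco::Dictionary> dict =
        cv::aruco::getPredefinedDictionary(cv::aruco::DICT_4X4_50);

    // Calibration from the tracker's yml; key names depend on how it was saved.
    cv::Mat cameraMatrix, distCoeffs;
    cv::FileStorage fs("calib/camera.yml", cv::FileStorage::READ);
    fs["camera_matrix"] >> cameraMatrix;
    fs["distortion_coefficients"] >> distCoeffs;

    cv::Mat frame;
    while (cap.read(frame)) {
        std::vector<int> ids;
        std::vector<std::vector<cv::Point2f>> corners;
        cv::aruco::detectMarkers(frame, dict, corners, ids);
        if (!ids.empty()) {
            std::vector<cv::Vec3d> rvecs, tvecs;
            // 0.10 m marker side length, as printed above
            cv::aruco::estimatePoseSingleMarkers(corners, 0.10f, cameraMatrix,
                                                 distCoeffs, rvecs, tvecs);
            // tvecs[0] is the marker position in the camera frame [m]
        }
    }
    return 0;
}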

=== Starting Application ===

Set the correct settings on the Pixhawk according to [[Visual Tracking on Pixhawk#Vision Tracking Specific Settings]]. You can check that the ekf2 application is running with the NuttX command

ekf2 status

or you can start top in NuttX with

top

Remember to ensure that MOCAP input is enabled:

param show EKF2_USE_MOCAP

should return 1

Start the tracking application on your PC with:

./tracker -d=0 -c=calib/camera.yml

For a list of all options, start with no arguments.

Test that tracking works by looking at the local position in QGroundControl. This can be done over a USB cable; alternatively, you could add a second telemetry radio.

== Software Overview ==

=== Original ekf2 Module ===

The ekf2 module is an implementation of the filter described here: http://elinux.org/images/1/11/Application_of_Data_Fusion_to_Aerial_Robotics.pdf

The equations are explained in the Matlab files of https://github.com/PX4/ecl

The ecl/EKF is a separate library that is used in the Pixhawk codebase through a wrapper, the "ekf2 module". All the wrapper does is pass data in and out of the objects described in the library and start/stop the module thread.

The library has two main classes, EstimatorBase and Ekf.

EstimatorBase contains the state, some internal FIFO buffers for measurements, and functions to parse data into the buffers. It is also responsible for down-sampling the IMU.

Ekf extends EstimatorBase with all the functionality for actually estimating. It stores the covariance P and has functions for propagating (predicting) the EKF and fusing (updating) based on measurements. It also has an output calculator that tries to hide estimation delay and smooth out discontinuities due to updates.
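
As a structural sketch of that split (paraphrased from the description above; these are not the actual ecl declarations):

#include <cstdint>

struct imuSample {
    std::uint64_t time_us;
    float delta_ang[3];
    float delta_vel[3];
};

class EstimatorBase {
public:
    // Down-samples incoming IMU data and pushes it into the FIFO buffers.
    void setIMUData(const imuSample &imu) { /* buffer + down-sample */ }
    // Analogous setters parse mag, baro, GPS and mocap data into buffers.
protected:
    imuSample _imu_buffer[32]; // delayed-horizon FIFO (size illustrative)
};

class Ekf : public EstimatorBase {
public:
    void update() { /* predict to the delayed horizon, then fuse */ }
    void calculateOutputStates() { /* complementary filter hiding the delay */ }
private:
    float P[24][24]; // state covariance "P" (dimension illustrative)
};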

==== Handling Sensor Delay ====

To handle the different delays of the different sensors, the EKF is updated at a delayed horizon: IMU measurements used to predict the filter are put in a FIFO, and sensor inputs from the time of the oldest instance in the FIFO are used for fusion.

This works quite nicely as long as no sensor input is delayed more than the "horizon", but the whole estimate is delayed in this way. (In contrast, the ekf_att_pos_estimator module, which implements the same equations, compares the measurements to a delayed state estimate, which might lead to instability.) To overcome the delay, a complementary filter using the current IMU measurements is used (this is done in the calculateOutputStates function). This has pros and cons: while it does a good job of hiding the delay, big changes in the state estimate are not handled well, as they are smoothed out by the filter. The smoothing seems to be a desired effect though, as the controllers in the Pixhawk are P-I-Lead, so discontinuities in the state estimates would result in erratic control.
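
A toy 1-D version of this scheme might look as follows (the measurement model, gains and buffer sizes are placeholders; ecl's real implementation buffers full states and corrections):

#include <cstddef>
#include <deque>

// The EKF state lives at a delayed horizon; a complementary filter
// propagates it to "now" with the buffered IMU data and smooths the output.
struct ImuSample { double dt; double vel; }; // pretend the IMU measures velocity

struct DelayedEstimator {
    std::deque<ImuSample> fifo;  // samples newer than the delayed horizon
    std::size_t horizon = 20;    // FIFO depth = maximum sensor delay absorbed
    double x_delayed = 0.0;      // EKF position estimate at the horizon
    double x_now = 0.0;          // smooth, delay-free output
    double k = 0.1;              // complementary filter gain

    void onImu(const ImuSample &s) {
        fifo.push_back(s);
        if (fifo.size() > horizon) {
            // Advance the delayed EKF with the oldest IMU sample. Delayed
            // sensor measurements are fused here, matched to this timestamp.
            x_delayed += fifo.front().vel * fifo.front().dt;
            fifo.pop_front();
        }
        // Propagate the delayed estimate through the buffered IMU to "now"...
        double x_pred = x_delayed;
        for (const ImuSample &b : fifo) x_pred += b.vel * b.dt;
        // ...then pull the output toward it instead of jumping, so fusion
        // updates show up as smooth corrections rather than discontinuities.
        x_now += s.vel * s.dt;
        x_now += k * (x_pred - x_now);
    }
};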

==== Stepwise Fusion ====

For most sensors, the fusion algorithm assumes no correlation between measurement components (e.g., the GPS x, y, z measurements). This allows updating with each measurement individually, so no matrix has to be inverted. This is very computationally efficient, but it cannot handle the case where measurements are correlated.
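
With an uncorrelated measurement of a single state i (so H is a unit row vector), the usual gain K = P Hᵀ (H P Hᵀ + R)⁻¹ collapses to a division by a scalar. A sketch with Eigen (my own illustration, not ecl's code):

#include <Eigen/Dense>

// Fuse one scalar measurement z of state x(i) with variance R.
// (H P H^T + R) is a scalar here, so no matrix inversion is needed.
void fuseScalar(Eigen::VectorXd &x, Eigen::MatrixXd &P,
                int i, double z, double R)
{
    const double S = P(i, i) + R;              // innovation variance
    const Eigen::VectorXd K = P.col(i) / S;    // gain: P H^T / S
    const Eigen::RowVectorXd PHt = P.row(i);   // copy to avoid aliasing below
    x += K * (z - x(i));                       // state update with innovation
    P -= K * PHt;                              // covariance update: (I - K H) P
}

A three-component GPS position fix is then fused as three consecutive calls, one per axis, with P refreshed between calls.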

=== Changes to ekf2 Module ===

=== Motion Capture ===

The tracking software is very simple. It consists of:

  • An initialization procedure
  • An image server thread
  • A main tracking thread

The initialization procedure parses command line arguments, opens the MAVLink serial device, and starts the image server.

The image server is a simple thread that opens a camera device and reads images from it. The image server ensures that images grabbed by the main thread are always the most recent. This is necessary when the main algorithm cannot keep up with the FPS of the camera: the camera driver has a (usually) 5-deep FIFO, and this results in a significant delay if the FIFO ever fills up, as new images are not added to the FIFO until there is space.

Note: the image server grabs two images at a time; this ensures that even if two images were put in the FIFO while decoding the previous one, this does not add any delay. It does, however, halve the effective FPS.

The image server signals the main thread whenever a new image is ready. It also prints its current FPS.
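
Condensed to its essentials, the image server might be structured like this (C++11 threads plus OpenCV; the class and member names are mine, the grab-twice trick is as described above):

#include <opencv2/opencv.hpp>
#include <atomic>
#include <condition_variable>
#include <mutex>
#include <thread>

class ImageServer {
public:
    explicit ImageServer(int device) : cap(device) {
        worker = std::thread(&ImageServer::run, this);
    }
    ~ImageServer() {
        stop = true;
        worker.join();
    }

    // Blocks until a frame newer than the last one handed out is ready.
    cv::Mat latest() {
        std::unique_lock<std::mutex> lock(m);
        cond.wait(lock, [this] { return fresh; });
        fresh = false;
        return frame.clone();
    }

private:
    void run() {
        while (!stop) {
            cap.grab();      // drop a possibly queued frame...
            cap.grab();      // ...so the frame we decode is the newest one
            cv::Mat f;
            cap.retrieve(f); // decode only the kept frame (halves the FPS)
            {
                std::lock_guard<std::mutex> lock(m);
                frame = f;
                fresh = true;
            }
            cond.notify_one(); // signal the main (tracking) thread
        }
    }

    cv::VideoCapture cap;
    cv::Mat frame;
    std::mutex m;
    std::condition_variable cond;
    bool fresh = false;
    std::atomic<bool> stop{false};
    std::thread worker;
};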

The main thread waits for a signal from the image server, then detects all Aruco markers and uses the detected markers to find the board. The board position is rotated according to the position and orientation of the camera, and the result is sent over MAVLink. Lastly, the current image is displayed at a downscaled size along with the tracked markers.
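
On the MAVLink side, sending the result boils down to packing an ATT_POS_MOCAP message and writing it to the serial port. Roughly, using the generated C headers from the common dialect (the include path and the system/component IDs depend on your setup):

#include <common/mavlink.h> // generated MAVLink C headers; path varies
#include <unistd.h>
#include <cstdint>

// Send one mocap fix: attitude quaternion plus NED position in metres.
void sendMocap(int fd, uint64_t time_usec, const float q[4],
               float x, float y, float z)
{
    mavlink_message_t msg;
    mavlink_msg_att_pos_mocap_pack(1 /*sysid*/, 240 /*compid*/, &msg,
                                   time_usec, q, x, y, z);
    uint8_t buf[MAVLINK_MAX_PACKET_LEN];
    const uint16_t len = mavlink_msg_to_send_buffer(buf, &msg);
    write(fd, buf, len); // fd is the opened telemetry serial port
}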

== Important notes ==

=== Changes in files not concerning the EKF ===

The following file has been changed to enable the Mikrokopter BL-Ctrl motor drivers and the ultrasonic rangefinder: cmake/configs/nuttx_px4fmu-v2_default.cmake. In the process, the ekf2 module was added to the build list along with the useful tool listener. To free up some space, some unused modules were commented out of the build list.

=== To Do ===

==== Add file input for Aruco board ====

At the moment, the layout of the Aruco board is fixed to a single marker at the center of the drone. The position and size are hardcoded. This should be changed to take input from a yml file.
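
One possible shape for that file, read with OpenCV's FileStorage (the keys and layout here are invented for illustration, and Board::create is as in the newer 3.x aruco API; the real format would be fixed when implementing this):

#include <opencv2/opencv.hpp>
#include <opencv2/aruco.hpp>
#include <vector>

// Hypothetical yml layout: a "markers" list with id, side length [m] and
// the marker centre's offset from the drone origin [m].
cv::Ptr<cv::aruco::Board> loadBoard(const std::string &path,
                                    const cv::Ptr<cv::aruco::Dictionary> &dict)
{
    cv::FileStorage fs(path, cv::FileStorage::READ);
    std::vector<int> ids;
    std::vector<std::vector<cv::Point3f>> corners;
    for (const auto &m : fs["markers"]) {
        const int id = (int)m["id"];
        const float s = (float)m["size"], h = s / 2.f;
        const float cx = (float)m["cx"], cy = (float)m["cy"], cz = (float)m["cz"];
        ids.push_back(id);
        corners.push_back({{cx - h, cy + h, cz}, {cx + h, cy + h, cz},
                           {cx + h, cy - h, cz}, {cx - h, cy - h, cz}});
    }
    return cv::aruco::Board::create(corners, dict, ids);
}

Detection would then use cv::aruco::estimatePoseBoard with the loaded board in place of the hardcoded single marker.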

==== Revert to tracking red balls? ====

The choice to use Aruco markers was made because they are relatively well tested and robust. Detection is fast and reliable as long as lighting conditions are good. In strong backlighting, tracking consistency is pretty bad due to decreased contrast and glare.

The original tracking algorithm used red plastic balls at the edges of the drone. The advantage of this is that the baseline is significantly larger.

It might be beneficial to replace the Aruco tracker with tracking of balls.
