Robobot architecture

Back to [[Robobot B]]
 
== Software block overview ==

[[File:robobot-in-blocks-2024.png | 600px]]

Figure 1. The main building blocks.

=== Software building blocks ===

[[File:robobot-function-blocks.png | 600px]]

Figure 2. The main software building blocks.

==== Base control ====

The 'base control' block is the 'brain' of the robot.

The base control is an expandable skeleton software that is intended as the mission controller.
The skeleton includes basic functionality to interface the Teensy board, to interface the digital IO, and to communicate with a Python app.

The base control skeleton is written in C++.

==== Python3 block ====

Vision functions are often implemented using the Python libraries.

The provided skeleton Python app includes communication with the Base control. The interface is a simple socket connection, and the communication protocol is lines of text both ways. A line from the base control could be e.g. "aruco" or "golf" to trigger detection of ArUco codes or golf balls.
The reply back to the Base control could be "golfpos 3 1 0.34 0.05", meaning: found 3 balls, ball 1 is at position x=0.34 m (forward), y=0.05 m (left).

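As an illustration of this text protocol, the sketch below shows how the C++ side could decode such a reply line. The GolfReply struct and the parseGolfReply function are made up for this example and are not part of the provided skeleton.

 #include <sstream>
 #include <string>
 
 // Hypothetical container for one reply line from the Python app.
 struct GolfReply
 {
   int ballCount = 0;   // number of balls found
   int ballIndex = 0;   // which ball this position belongs to
   float x = 0.0f;      // forward distance in metres
   float y = 0.0f;      // distance to the left in metres
   bool valid = false;
 };
 
 // Parse a line like "golfpos 3 1 0.34 0.05" received on the socket.
 GolfReply parseGolfReply(const std::string & line)
 {
   GolfReply r;
   std::istringstream ss(line);
   std::string key;
   ss >> key;
   if (key == "golfpos" && (ss >> r.ballCount >> r.ballIndex >> r.x >> r.y))
     r.valid = true;
   return r;
 }
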
The Python3 block is optional, as the same libraries are often available in C++ and could therefore be used in the 'Base control' directly.

==== IP-disp ====

IP-disp is a silent app that is started at reboot and has two tasks:

* Detect the IP address of the Raspberry Pi and send it to the small display on the Teensy board.
* Detect if the "start" button is pressed, and if so, start the 'Base control' app.

==== Teensy PCB ====

The Teensy board is actually a baseboard used in the simpler 'Regbot' robot.
This board has most of the hardware interfaces and offers all sensor data to be streamed in a publish-subscribe protocol.
All communication is based on clear-text lines.

See details in [[Robobot circuits]].

==== Hardware Teensy ====

The Teensy handles all the sensors connected to the microprocessor and the wheel motors, as well as up to 5 servos.

The sensor data is available as subscriptions with a constant sample rate. The sample rate can be specified in units of 1 ms.
The USB connection has a bandwidth of about 4000 sensor messages per second.

All subscription requests start with the 'sub' keyword followed by the message key for the data type and the sample time in milliseconds, e.g.:

 sub svo 50

which subscribes to the servo status (for all 5 servos) every 50 ms.
The Teensy interface will add the CRC and make sure the request is received.

==== Teensy i/f ====

This is the Teensy interface, controlling the USB connection to the Teensy.

Commands to the Teensy can be sent in two ways: trusted or best effort.

Trusted commands await a confirmation from the Teensy; if this is not received within 30 ms, the command is resent, up to 3 times. If still not confirmed, the message is dropped (most likely the connection is dead, or two applications are listening to the same USB device).

The 'best effort' type is sent once only; this is the fastest communication - used e.g. for the motor voltage in a control loop. If one message is lost, a similar value will be sent at the next sample time.

Both types are further secured by a simple CRC check.

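The trusted retransmission can be summarised in a few lines of C++. The sketch below is only an outline of the logic described above; sendLine and confirmReceived are placeholder names, not the real serial handling in steensy.cpp.

 #include <chrono>
 #include <string>
 #include <thread>
 
 // Placeholders for the real serial handling (assumed names, not the skeleton API).
 void sendLine(const std::string & framedMessage);        // write one framed line to the USB port
 bool confirmReceived(const std::string & framedMessage); // true when a matching 'confirm' line has arrived
 
 // Trusted send: wait up to 30 ms for a confirmation, resend up to 3 times, then give up.
 bool sendTrusted(const std::string & framedMessage)
 {
   using namespace std::chrono;
   for (int attempt = 0; attempt <= 3; attempt++)
   { // initial transmission plus up to 3 resends
     sendLine(framedMessage);
     auto deadline = steady_clock::now() + milliseconds(30);
     while (steady_clock::now() < deadline)
     {
       if (confirmReceived(framedMessage))
         return true;
       std::this_thread::sleep_for(milliseconds(1));
     }
   }
   return false; // dropped; the connection is probably dead or the port is shared
 }
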
All received messages are sent to the service module for decoding.
All outgoing messages use the send method in this module.

All communication on the interface can be logged.

The module is coded in the steensy.h/steensy.cpp files.

The messages look like this - timestamped by the logging function:

 % teensy communication to/from Teensy
 % 1 Time (sec) from system
 % 2 (Tx) Send to Teensy
 %   (Rx) Received from Teensy
 %   (Qu N) Put in queue to Teensy, now queue size N
 % 3 Message string queued, sent or received
 1687200276.5853 Tx ;75!setid robobot
 1687200276.5853 Rx ;04hbt 47.9792 128 1581 4.64 0 7 74.1
 1687200276.5866 Rx ;57# got new name (get with 'id')
 1687200276.5867 Rx ;65confirm !setid robobot
 1687200276.5869 Tx ;80!setidx 2
 1687200276.5880 Rx ;70confirm !setidx 2
 1687200276.5881 Tx ;47!idi
 1687200276.5892 Rx ;37confirm !idi
 1687200276.5893 Rx ;57dname robobot Sofia
 1687200276.7838 Qu 1 ;01!sub enc 7
 1687200276.7839 Qu 2 ;04!sub hbt 500
 1687200276.7842 Qu 3 ;34!sub gyro0 12
 1687200276.7842 Qu 4 ;78!sub acc0 12
 1687200276.7842 Qu 5 ;37!gyrocal 0 0 0
 1687200276.7842 Tx ;01!sub enc 7
 1687200276.7853 Qu 6 ;81!sub svo 50
 1687200276.7853 Rx ;90confirm !sub enc 7

The CRC is coded as two numeric characters holding the sum of all non-control characters in the message. The sum is reduced to two digits by a modulus 99, and 1 is added to avoid '00'. The two digits are preceded by a semicolon ';'.

Messages that need a confirmation are queued, and an '!' is added after the CRC code. The '!' is included in the CRC calculation.

Messages are coded as a character line in 7-bit ASCII, i.e., no Danish characters.

All messages are terminated by a 'new line' character, '\n'.

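A minimal sketch of this framing in C++ is shown below. The function name is only for illustration, but the checksum follows the description above and reproduces the log lines, e.g. "sub svo 50" with confirmation becomes ";81!sub svo 50".

 #include <cstdio>
 #include <string>
 
 // Frame a message for the Teensy: ';' + two-digit CRC + optional '!' + payload + '\n'.
 // The CRC is the sum of all non-control characters (char values from space upward),
 // reduced by modulo 99 plus 1, so '00' never occurs.
 std::string frameTeensyMessage(const std::string & payload, bool requireConfirm)
 {
   std::string body = requireConfirm ? "!" + payload : payload;
   int sum = 0;
   for (char c : body)
     if (c >= ' ')
       sum += static_cast<unsigned char>(c);
   int crc = sum % 99 + 1;
   char prefix[4];
   std::snprintf(prefix, sizeof(prefix), ";%02d", crc);
   return prefix + body + "\n";
 }
 
 // Example: frameTeensyMessage("sub svo 50", true) returns ";81!sub svo 50\n",
 // matching the queued subscription in the log excerpt above.
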
===== Configuration options =====

In robot.ini, there are these options for the module:

 [teensy]
 device=/dev/ttyACM0
 confirm_timeout = 0.04
 log=true
 print = false

The device is in most cases /dev/ttyACM0, but can be changed here.

The timeout unit is seconds; about 40 ms is a reasonable compromise between detecting a failed communication and retransmitting too soon. Mostly, a message is confirmed within 2 ms.

Print is a debug feature that shows all communication in the terminal window. Log writes the same messages to a file; this can be used for timing analysis of the communication or other debugging.

== Software architecture ==

The software architecture is based on the old NASREM architecture, and this is the structure for the description on this page.

(The National Aeronautics and Space Administration (NASA) and the US National Institute of Standards and Technology (NIST) have developed a Standard Reference Model Telerobot Control System Architecture called NASREM. Albus, J. S. (1992), A reference model architecture for intelligent systems design.)

[[File:nasrem.png | 700px]]

Figure 3. The NASREM model divides the control software into a two-dimensional structure. The columns are software functions: sensor data processing, modelling, and behaviour control.

The rows describe abstraction levels:

* Level 1 holds the primary control of the wheels for forward velocity and turn rate. This level also maintains the robot pose (position, orientation, and velocity), based on wheel odometry.
* Level 2 is drive select, where the drive can be controlled by odometry alone (forward velocity and turn rate) or follow a line based on the line sensor. This level also includes other sensor detections, like crossing lines and distances.
* Level 3 is where the overall behaviour is decided; it includes camera-based object detections, like navigation codes and other objects.

=== Level 1; Pose and drive control ===

[[File:robobot_level_1.png | 800px]]

Figure 4. The lowest level in the control software. The encoder ticks are received from the hardware (from the Teensy board) in the sensor interface. The encoder values are then modelled into an odometry pose. The pose is used to control the wheel velocity using a PID controller.

The desired wheel velocity for each wheel is generated in the mixer from a desired linear and rotational velocity.
The heading control translates the desired rotation velocity to a desired heading and uses a PID controller to implement it.

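As a sketch of the mixer idea, the fragment below converts the commanded velocities to wheel velocities for a differential drive. The wheel base value and the sign convention are assumptions for this illustration, not values from the skeleton.

 // Mixer sketch for a differential drive: split the desired linear velocity (m/s)
 // and rotation velocity (rad/s) into left and right wheel velocities.
 struct WheelVelocities
 {
   float left;   // m/s
   float right;  // m/s
 };
 
 WheelVelocities mix(float linearVelocity, float rotationVelocity)
 {
   const float wheelBase = 0.22f;  // distance between the wheels (assumed value)
   WheelVelocities w;
   // positive rotation velocity is assumed to be a turn to the left (CCW)
   w.left  = linearVelocity - rotationVelocity * wheelBase / 2.0f;
   w.right = linearVelocity + rotationVelocity * wheelBase / 2.0f;
   return w;
 }
 
 // The heading control integrates the desired rotation velocity into a desired
 // heading and lets a PID controller generate the rotation command from the
 // heading error.
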
More [[Robobot level 1]] details of the individual blocks.

==== Encoder ====

The encoder messages hold the encoder count for each wheel, e.g.:

 enc 4294967040 259 555 0
 enc 4294967036 262 562 0
 enc 4294967030 266 572 0

The first two numbers are the encoder counts in 32-bit integer format, starting at zero at power-up. The first number (left wheel) is here counting down and is thus close to the maximum value for a 32-bit unsigned integer.

The last two numbers are debug values that will be removed at some point.

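Because the counts are 32-bit and wrap around, the tick change between two samples is best computed in unsigned arithmetic and then interpreted as a signed value, as in this small sketch (the function name is only for illustration):

 #include <cstdint>
 
 // Tick change since the previous sample. The unsigned subtraction wraps
 // correctly around the 32-bit boundary, and the cast to a signed value
 // turns a small backward movement into a small negative number.
 int32_t encoderDelta(uint32_t current, uint32_t previous)
 {
   return static_cast<int32_t>(current - previous);
 }
 
 // With the lines above: encoderDelta(4294967036u, 4294967040u) == -4,
 // i.e. the left wheel moved 4 ticks backwards between the two samples.
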
The streaming sample interval is set in the ini-file for the code, as

 [encoder]
 ; encoder sample rate is also the sample rate for velocity control
 rate_ms=7
 log=true

here set to 7 ms. Mostly, a sample rate of 5 to 15 ms is OK. If much faster than 5 ms, the reading of the incoming messages in the Teensy may get overloaded (late). Slower than 15 ms may be too slow for the control loop.

==== Pose ====

The pose is updated every time new encoder values are available.

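A minimal sketch of such a pose update for a differential drive is shown below. The ticks-per-metre and wheel-base numbers are placeholders; the real values depend on the gearing and wheel size.

 #include <cmath>
 #include <cstdint>
 
 // Odometry pose in the plane.
 struct Pose
 {
   double x = 0.0;        // metres, forward from start
   double y = 0.0;        // metres, to the left from start
   double heading = 0.0;  // radians
 };
 
 // Update the pose from the tick change of each wheel since the last sample.
 // ticksPerMetre and wheelBase are assumed example values.
 void updatePose(Pose & pose, int32_t leftTicks, int32_t rightTicks)
 {
   const double ticksPerMetre = 12000.0;
   const double wheelBase = 0.22;
   double dLeft = leftTicks / ticksPerMetre;
   double dRight = rightTicks / ticksPerMetre;
   double dDist = (dLeft + dRight) / 2.0;           // driven distance
   double dHeading = (dRight - dLeft) / wheelBase;  // change in heading
   pose.x += dDist * std::cos(pose.heading + dHeading / 2.0);
   pose.y += dDist * std::sin(pose.heading + dHeading / 2.0);
   pose.heading += dHeading;
 }
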
=== Level 2; drive select ===

[[File:robobot_level_2.png | 800px]]

Figure 5. At level 2, further sensor data is received, modelled, and used as optional control sources.

More [[robobot level 2]] details of the individual blocks.

=== Level 3; behaviour ===

[[File:robobot_level_3.png | 800px]]

Figure 6. At level 3, the drive types are used to implement more abstract behaviour, e.g. follow the tape line to the axe challenge, detect the situation where the axe is out of the way, and then continue the mission.

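Such a behaviour can be written as a small state machine on top of the level 2 drive types. The sketch below is purely illustrative; the state names and the helpers followLineStep(), axeIsOpen(), and driveForwardStep() are inventions for this example and not part of the skeleton.

 // Illustrative state machine for the axe challenge, called at a fixed rate.
 // The three helpers are placeholders for level 2 drive and detection functions.
 bool followLineStep();                  // drive on the line; true when the stop point is reached
 bool axeIsOpen();                       // true when the axe is detected to be out of the way
 bool driveForwardStep(float distance);  // drive a distance; true when done
 
 enum class AxeState { FollowLineToAxe, WaitForAxe, PassAxe, Done };
 
 void axeChallengeStep(AxeState & state)
 {
   switch (state)
   {
     case AxeState::FollowLineToAxe:
       if (followLineStep())
         state = AxeState::WaitForAxe;
       break;
     case AxeState::WaitForAxe:
       if (axeIsOpen())
         state = AxeState::PassAxe;
       break;
     case AxeState::PassAxe:
       if (driveForwardStep(0.5f))
         state = AxeState::Done;  // continue with the rest of the mission
       break;
     case AxeState::Done:
       break;
   }
 }
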
[[Robobot level 3 details]]
