Record Sensor Data
WARNING auKsys/5 has been released as stable, but this documentation has not yet been fully updated to reflect the changes since auKsys/4. If you have questions or find mistakes, you can use the support project.
This tutorial covers how to record sensor data. kDB is not a generic ROS message storage solution; it supports a limited number of message types, but it can be extended to support more. kDB currently supports recording images, 2D/3D Lidar scans, IMU, GNSS, fluid pressure, and temperature data. This tutorial covers the recording of images and 3D Lidar scans.
kDB is a geographic database and uses the WGS84 coordinate system for all coordinates. It is therefore required to provide the position of the robot on a geoposestamped topic of type ros_geographic_msgs/msg/GeoPoseStamped. The orientation should be given in the local UTM coordinate system, and the header should contain a timestamp and the frame_id of the robot body frame.
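For reference, the sketch below shows a minimal node that publishes such a message with rclpy. It is only an illustration: the import assumes the standard geographic_msgs package (adapt it if your setup uses a different one), and the topic name, frame_id and coordinate values are placeholders.
import rclpy
from rclpy.node import Node
from geographic_msgs.msg import GeoPoseStamped

class GeoPosePublisher(Node):
    def __init__(self):
        super().__init__('geopose_publisher')
        # Topic name matching the namespace used later in this tutorial (illustrative).
        self.pub = self.create_publisher(GeoPoseStamped, '/dji6/geoposestamped', 10)
        self.timer = self.create_timer(1.0, self.publish_pose)

    def publish_pose(self):
        msg = GeoPoseStamped()
        # Timestamp and robot body frame id, as required by kDB.
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.header.frame_id = 'base_link'  # example body frame name
        # Position in WGS84 (example values).
        msg.pose.position.latitude = 57.7605
        msg.pose.position.longitude = 16.6824
        msg.pose.position.altitude = 30.0
        # Orientation expressed in the local UTM coordinate system.
        msg.pose.orientation.w = 1.0
        self.pub.publish(msg)

def main():
    rclpy.init()
    rclpy.spin(GeoPosePublisher())

if __name__ == '__main__':
    main()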
Images
Store images in the database
For this part of the tutorial we are going to use data from a flight above water near Gränsö, Sweden. The data is available in a ROS2 bag, rosbag2_images_sea.tar.xz. The following commands will download it and unpack it:
curl -o rosbag2_images_sea.tar.xz "https://gitlab.liu.se/lks/tutorials_data/-/raw/main/rosbag2_images_sea.tar.xz?ref_type=heads&inline=false"
tar -xJf rosbag2_images_sea.tar.xz
rm -f rosbag2_images_sea.tar.xz
The data was acquired by a platform in the /dji6 namespace. It contains the following topics:
- /tf and /tf_static: the standard ROS transformation frames, needed to transform from the camera frame to the robot body frame.
- /dji6/camera/images: the images from the camera.
- /dji6/geoposestamped: the geo coordinate of the robot body frame, expressed as longitude/latitude and a quaternion for the orientation.
We will need to start the kDB server in the /dji6 namespace:
ros2 run ros_kdb kdb_server --ros-args -r __ns:=/dji6
In a different terminal, we can start the bag file in a paused state:
ros2 bag play -p rosbag2_images_sea/
After starting the bag file, it is possible to start recording the images with a service call:
ros2 service call /dji6/kdb_server/start_topic_recording ros_kdb_interfaces/srv/StartTopicRecording '{ dataset_uri: http://askco.re/examples#images_granso, topic: /dji6/camera/images }'
The dataset_uri is optional, but it lets you benefit from the kDBDatasets framework, which gives easier access to the data and enables transfer of the data between platforms. This URI is expected to be unique.
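If you prefer to trigger recording programmatically rather than from the command line, a small rclpy client can call the same service. The sketch below is a minimal example under the assumption that the request fields of ros_kdb_interfaces/srv/StartTopicRecording are plain strings matching the YAML keys used above (dataset_uri and topic).
import rclpy
from rclpy.node import Node
from ros_kdb_interfaces.srv import StartTopicRecording

def main():
    rclpy.init()
    node = Node('start_recording_client')
    # Client for the recording service exposed by the kDB server in the /dji6 namespace.
    client = node.create_client(StartTopicRecording, '/dji6/kdb_server/start_topic_recording')
    client.wait_for_service()
    request = StartTopicRecording.Request()
    # Assumed field names, taken from the YAML keys of the CLI call above.
    request.dataset_uri = 'http://askco.re/examples#images_granso'
    request.topic = '/dji6/camera/images'
    future = client.call_async(request)
    rclpy.spin_until_future_complete(node, future)
    node.get_logger().info(f'start_topic_recording response: {future.result()}')
    rclpy.shutdown()

if __name__ == '__main__':
    main()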
To proceed with the recording of the data, switch to the terminal with the bag file and unpause it by pressing the space key. Once the bag file finishes playing, you will need to call the stop_topic_recording service to end the recording:
ros2 service call /dji6/kdb_server/stop_topic_recording ros_kdb_interfaces/srv/StopTopicRecording '{ topic: /dji6/camera/images }'
Create the dataset metadata
Before continuing, you should follow the datasets tutorial. There is no command-line tool for adding a dataset to kDB; the best option is to use a Python (or Ruby) shell and run the following commands:
import rclpy
import knowCore
import knowGIS
import kDBDatasets
import ros_kdb
# Create a connection to the '/dji6' kDB database.
rclpy.init()
connection = ros_kdb.create_connection("/dji6")
# URI that should match the dataset_uri argument given to the start_topic_recording service call.
dataset_uri = knowCore.Uri("http://askco.re/examples#images_granso")
# The geometry corresponding to where the images have been acquired.
geometry = knowGIS.GeometryObject.fromWKT("POLYGON ((16.681876 57.760724, 16.682745 57.760318, 16.682079 57.760034, 16.681205 57.760418, 16.681876 57.760724))")
# Add the dataset to the collection of private datasets.
dss = kDBDatasets.Datasets.get(connection, knowCore.Uri("http://askco.re/graphs#private_datasets"))
ds = dss.createDataset(knowCore.Uri("http://askco.re/sensing#image_frame"), geometry, dataset_uri)
# Set the status
ds.setStatus(kDBDatasets.Dataset.Status.Completed)
# This indicates that the current connection has a copy of the dataset
ds.associate(connection.serverUri())
The following command can be used to list the datasets:
ros2 run ros_kdb datasets_info --ros-args -r __ns:=/dji6
The line about statistics should show something like - statistics: 65 data points (the actual number might vary depending on when the bag file was started). If it instead shows dataset is not local, there was an issue during recording, as no images were saved in the database.
Lidar Scan
For this part of the tutorial we are going to use data from a flight near the Gränsö Castle, Sweden. The data is available in a ROS2 bag, rosbag2_lidar_buildings.tar.xz. The following commands will download it and unpack it:
curl -o rosbag2_lidar_buildings.tar.xz "https://gitlab.liu.se/lks/tutorials_data/-/raw/main/rosbag2_lidar_buildings.tar.xz?ref_type=heads&inline=false"
tar -xJf rosbag2_lidar_buildings.tar.xz
rm -f rosbag2_lidar_buildings.tar.xz
The data was acquired by a platform in the /dji3 namespace. It contains the following topics:
- /tf and /tf_static: the standard ROS transformation frames, needed to transform from the Lidar frame to the robot body frame.
- /dji3/lidar/points: the 3D Lidar scan from a Velodyne sensor.
- /dji3/geoposestamped: the geo coordinate of the robot body frame, expressed as longitude/latitude and a quaternion for the orientation.
As an exercise, try recording the data in a database with namespace /dji3, using the dataset URI http://askco.re/examples#lidar3d_granso, the dataset type http://askco.re/sensing#lidar3d_scan, and the geometry POLYGON ((16.683034 57.760853, 16.68267 57.760638, 16.683056 57.760435, 16.683501 57.760635, 16.683034 57.760853)).
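If you get stuck on the metadata step, the sketch below adapts the Python snippet from the images section to the Lidar parameters; the recording itself uses the same start_topic_recording and stop_topic_recording service calls as before, with the /dji3/lidar/points topic and the Lidar dataset URI. Treat it as one possible solution, not the only one.
import rclpy
import knowCore
import knowGIS
import kDBDatasets
import ros_kdb
# Create a connection to the '/dji3' kDB database.
rclpy.init()
connection = ros_kdb.create_connection("/dji3")
# URI matching the dataset_uri used when recording /dji3/lidar/points.
dataset_uri = knowCore.Uri("http://askco.re/examples#lidar3d_granso")
# The area over which the Lidar data was acquired.
geometry = knowGIS.GeometryObject.fromWKT("POLYGON ((16.683034 57.760853, 16.68267 57.760638, 16.683056 57.760435, 16.683501 57.760635, 16.683034 57.760853))")
# Add the dataset to the collection of private datasets.
dss = kDBDatasets.Datasets.get(connection, knowCore.Uri("http://askco.re/graphs#private_datasets"))
ds = dss.createDataset(knowCore.Uri("http://askco.re/sensing#lidar3d_scan"), geometry, dataset_uri)
# Mark the dataset as completed and record that this server holds a copy.
ds.setStatus(kDBDatasets.Dataset.Status.Completed)
ds.associate(connection.serverUri())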
Next
- Datasets is an introduction to datasets in kDB; it covers how to create them, change their properties, and import/export datasets.
The tutorials regarding pralin/compose for processing recorded data:
- Introduction to pralin/compose and how to execute compositions from the command line or an API.
- Process images datasets: this tutorial shows how to process an existing image dataset.
- Process Lidar Datasets: this tutorial covers how to process Lidar data, to generate a point cloud and downsample it.