Introduction
This comes out of a project requirement: we need visual-inertial navigation for some development work, so the first step is algorithm testing, both in simulation and on physical hardware. After reviewing the simulation test results, we finally decided to use ORB-SLAM3 to complete the task.
- System version: Ubuntu 20.04, ROS Noetic
Contents
- Introduction
- CUDA and cuDNN installation
- ZED SDK installation
- Binocular camera, IMU calibration and joint calibration
- Algorithm operation results display
- Discussion
CUDA and cuDNN installation
- CUDA Toolkit install
- cuDNN install
ZED SDK
- ZED SDK
- ZED Python API
- ZED ROS Wrapper
Sensor Calibration
Install the calibration tools
- Install dependencies
sudo apt-get install python-setuptools python3-rosinstall ipython3 libeigen3-dev libboost-all-dev doxygen libopencv-dev ros-noetic-vision-opencv ros-noetic-image-transport-plugins ros-noetic-cmake-modules software-properties-common libpoco-dev python3-matplotlib python3-scipy python3-git python3-pip libtbb-dev liblapack-dev python3-catkin-tools libv4l-dev
sudo add-apt-repository ppa:igraph/ppa
sudo apt-get update
sudo apt-get install python3-igraph
- Create a new workspace and initialize it with catkin init (Kalibr is compiled with catkin build below, which cannot share a workspace with catkin_make). Remember to add source ~/kalibr_ws/devel/setup.bash to your ~/.bashrc; a sketch follows the code block.
mkdir -p ~/kalibr_ws/src
cd ~/kalibr_ws
catkin init
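A minimal sketch for persisting the environment setup mentioned above (it assumes the ~/kalibr_ws path used here; devel/setup.bash only appears after the first successful build):
# Append the Kalibr environment setup to ~/.bashrc so every new shell picks it up
echo "source ~/kalibr_ws/devel/setup.bash" >> ~/.bashrc
source ~/.bashrc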
- Download the source code and compile (the current master branch supports ROS Noetic)
cd ~/kalibr_ws/src
git clone https://github.com/ethz-asl/kalibr.git
catkin build -DCMAKE_BUILD_TYPE=Release -j4
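To sanity-check the build (assuming the workspace above; with no arguments the calibrator should print its usage message rather than "package not found"):
source ~/kalibr_ws/devel/setup.bash
# with no arguments this should print the usage message
rosrun kalibr kalibr_calibrate_cameras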
- Download and compile code_utils (build this in a workspace created with catkin_make; mine is ~/catkin_ws)
cd ~/catkin_ws/src
git clone https://github.com/gaowenliang/code_utils.git
sudo apt-get install libdw-dev
cd ../
catkin_make
- Download and compile imu_utils (it depends on code_utils, so compile code_utils first)
cd ~/catkin_ws/src
git clone https://github.com/gaowenliang/imu_utils.git
cd ../
catkin_make
Download the calibration board and yaml file
Download the relevant parameter files from GitHub: https://github.com/ethz-asl/kalibr/wiki/downloads#calibration-targets. Select Aprilgrid 6x6 0.8x0.8 m (A0 page) and download its pdf and yaml files.
The yaml file contents are below. The codeOffset parameter is unused and can be deleted; tagSize is the side length of a large black square, and tagSpacing is the ratio of the small gap square's side length to the large square's side length. For example, with tagSize 0.088 m and tagSpacing 0.3, the gap between tags is 0.3 × 0.088 ≈ 0.026 m.
target_type: 'aprilgrid'
tagCols: 6
tagRows: 6
tagSize: 0.088
tagSpacing: 0.3
codeOffset: 0
ZED 2: recording a bag for calibration
The parameter configuration file common.yaml is in the catkin_ws/src/zed-ros-wrapper/zed_wrapper/params folder; it configures the camera output resolution. I set resolution to 2, which corresponds to 1280×720, but the image resolution actually obtained by subscribing to the ROS topics during calibration is half of that, i.e. 640×360.
Start ZED 2:
roslaunch zed_wrapper zed2.launch
rqt_image_view # for viewing image topic
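Before recording, you can verify the publishing rates quoted in the next paragraph (a quick check, using the topic names from this setup):
rostopic hz /zed2/zed_node/left/image_rect_color   # expect ~15 Hz
rostopic hz /zed2/zed_node/imu/data_raw            # expect ~200 Hz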
The camera publishes at 15 Hz by default and the IMU at 200 Hz. Next, we record the bag for calibration.
Note
(1) When recording, the entire calibration target must be visible in the field of view of both cameras of the stereo pair; check this with rqt_image_view.
(2) Move the target thoroughly into every corner of the camera's field of view.
(3) Translate back and forth along each of the camera's three axes for three rounds, and swing back and forth around each of the three axes for three rounds.
It helps to rehearse these motions before recording for real.
rosbag record -o Kalib_data_HD720.bag /zed2/zed_node/imu/data_raw /zed2/zed_node/left/image_rect_color /zed2/zed_node/right/image_rect_color
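Note that rosbag record -o uses the given name as a prefix and appends a timestamp, so the file on disk will be named like Kalib_data_HD720_<date>.bag (placeholder shown). rosbag info is a quick way to confirm the topics and message counts:
rosbag info Kalib_data_HD720_<date>.bag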
Three topics are recorded in total: the two camera image topics and the IMU topic. Many people online say the publishing frequency should be lowered first, but that is not necessary; just record directly.
Camera Calibration
Remember to replace Kalib_data_HD720.bag (the recorded bag) and april.yaml (the calibration target file) with the paths to your actual files. I did not get any errors; if you do, I suggest re-recording. During the calibration you can visually check whether corner detection is working and whether the corner reprojection shows serious errors. The option --approx-sync 0.04 (append it to the command below if needed) synchronizes the data of the cameras; 0.04 can be adjusted up to 0.1 depending on the situation.
roscore
rosrun kalibr kalibr_calibrate_cameras --bag Kalib_data_HD720.bag --topics /zed2/zed_node/left/image_rect_color /zed2/zed_node/right/image_rect_color --models pinhole-radtan pinhole-radtan --target april.yaml
After the calibration completes, you obtain the camera intrinsics and related output files (a camchain-....yaml, a results-....txt, and a report pdf); the camchain yaml file is used for the joint calibration.
IMU Calibration
(1) Manual Calibration
rosbag record -o imu_calibration /zed2/zed_node/imu/data_raw
- Record for more than two hours, then create a launch file named ZED2_calibration.launch:
<launch>
<node pkg="imu_utils" type="imu_an" name="imu_an" output="screen">
<param name="imu_topic" type="string" value= "/zed2/zed_node/imu/data_raw"/>
<param name="imu_name" type="string" value= "ZED2"/>
<param name="data_save_path" type="string" value= "$(find imu_utils)/data/"/>
<param name="max_time_min" type="int" value= "120"/>
<param name="max_cluster" type="int" value= "200"/>
</node>
</launch>
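One caveat from my side (an assumption worth checking, not from the original tool docs): imu_utils may not create the data_save_path directory itself, so make sure it exists before launching:
# Make sure the output directory referenced by data_save_path exists
mkdir -p ~/catkin_ws/src/imu_utils/data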
- Start calibration
roslaunch imu_utils ZED2_calibration.launch
rosbag play -r 200 imu_calibration.bag
Finally, go to the calibration result file (under the data_save_path set above) and create the corresponding imu.yaml: take the average of the Acc and Gyr values from the calibration results and fill them in.
(2) Or directly use the official data to create imu.yaml:
#Accelerometers
accelerometer_noise_density: 1.4e-03 #Noise density (continuous-time)
accelerometer_random_walk: 8.0e-05 #Bias random walk
#Gyroscopes
gyroscope_noise_density: 8.6e-05 #Noise density (continuous-time)
gyroscope_random_walk: 2.2e-06 #Bias random walk
rostopic: /zed2/zed_node/imu/data_raw #the IMU ROS topic
update_rate: 200.0 #Hz (for discretization of the values above)
Binocular camera/IMU joint calibration
Note that the files Kalib_data_HD720.bag, camchain-Kalib_data_HD720.yaml, imu.yaml, and april.yaml should be specified with absolute paths wherever possible.
roscore
rosrun kalibr kalibr_calibrate_imu_camera --bag Kalib_data_HD720.bag --cam camchain-Kalib_data_HD720.yaml --imu imu.yaml --target april.yaml
Finally, the joint calibration parameters Kalib_data_HD720-imu.yaml can be obtained.
ZED 2 runs ORB-SLAM3
Compile and install ORB-SLAM3
Let me borrow the build steps from another blogger here. I have configured this on three machines, and the one recurring problem is an environment variable: the ROS examples folder of ORB_SLAM3 must be added to ROS_PACKAGE_PATH via an export line in ~/.bashrc (the path in my original note was cut off after /home/).
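As a hedged sketch (the checkout path below is a placeholder to adjust; the pattern follows the ORB-SLAM3 README):
# Make the ORB-SLAM3 ROS examples visible to rospack/rosrun (adjust the path)
echo 'export ROS_PACKAGE_PATH=${ROS_PACKAGE_PATH}:/home/<user>/ORB_SLAM3/Examples_old/ROS' >> ~/.bashrc
source ~/.bashrc
Then clone and build: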
git clone https://github.com/UZ-SLAMLab/ORB_SLAM3.git
cd ORB_SLAM3
chmod +x build.sh
./build.sh
cd Examples_old/ROS/ORB_SLAM3
mkdir build
cd build
cmake ..
make
Add the two configuration files below
cd ORB_SLAM3/Examples_old/ROS/ORB_SLAM3/src
touch zed2_stereo_inertial.cc
touch zed2_stereo_inertial.yaml
Add the following information to the zed2_stereo_inertial.cc file
#include<iostream>
#include<algorithm>
#include<fstream>
#include<chrono>
#include<vector>
#include<queue>
#include<thread>
#include<mutex>
#include<ros/ros.h>
#include<cv_bridge/cv_bridge.h>
#include<sensor_msgs/Imu.h>
#include<opencv2/core/core.hpp>
#include"../../../include/System.h"
#include"include/ImuTypes.h"
using namespace std;
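// Thread-safe queue of raw IMU messages; filled by GrabImu(), drained in ImageGrabber::SyncWithImu().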
class ImuGrabber
{
public:
ImuGrabber(){};
void GrabImu(const sensor_msgs::ImuConstPtr &imu_msg);
queue<sensor_msgs::ImuConstPtr> imuBuf;
std::mutex mBufMutex;
};
class ImageGrabber
{
public:
ImageGrabber(ORB_SLAM3::System* pSLAM, ImuGrabber *pImuGb, const bool bRect, const bool bClahe): mpSLAM(pSLAM), mpImuGb(pImuGb), do_rectify(bRect), mbClahe(bClahe){}
void GrabImageLeft(const sensor_msgs::ImageConstPtr& msg);
void GrabImageRight(const sensor_msgs::ImageConstPtr& msg);
cv::Mat GetImage(const sensor_msgs::ImageConstPtr &img_msg);
void SyncWithImu();
queue<sensor_msgs::ImageConstPtr> imgLeftBuf, imgRightBuf;
std::mutex mBufMutexLeft,mBufMutexRight;
ORB_SLAM3::System* mpSLAM;
ImuGrabber *mpImuGb;
const bool do_rectify;
cv::Mat M1l,M2l,M1r,M2r;
const bool mbClahe;
cv::Ptr<cv::CLAHE> mClahe = cv::createCLAHE(3.0, cv::Size(8, 8));
};
int main(int argc, char **argv)
{
ros::init(argc, argv, "Stereo_Inertial");
ros::NodeHandle n("~");
ros::console::set_logger_level(ROSCONSOLE_DEFAULT_NAME, ros::console::levels::Info);
bool bEqual = false;
if(argc < 4 || argc > 5)
{
cerr << endl << "Usage: rosrun ORB_SLAM3 Stereo_Inertial path_to_vocabulary path_to_settings do_rectify [do_equalize]" << endl;
ros::shutdown();
return 1;
}
std::string sbRect(argv[3]);
if(argc==5)
{
std::string sbEqual(argv[4]);
if(sbEqual == "true")
bEqual = true;
}
// Create SLAM system. It initializes all system threads and gets ready to process frames.
ORB_SLAM3::System SLAM(argv[1],argv[2],ORB_SLAM3::System::IMU_STEREO,true);
ImuGrabber imugb;
ImageGrabber igb(&SLAM,&imugb,sbRect == "true",bEqual);
if(igb.do_rectify)
{
// Load settings related to stereo calibration
cv::FileStorage fsSettings(argv[2], cv::FileStorage::READ);
if(!fsSettings.isOpened())
{
cerr << "ERROR: Wrong path to settings" << endl;
return -1;
}
cv::Mat K_l, K_r, P_l, P_r, R_l, R_r, D_l, D_r;
fsSettings["LEFT.K"] >> K_l;
fsSettings["RIGHT.K"] >> K_r;
fsSettings["LEFT.P"] >> P_l;
fsSettings["RIGHT.P"] >> P_r;
fsSettings["LEFT.R"] >> R_l;
fsSettings["RIGHT.R"] >> R_r;
fsSettings["LEFT.D"] >> D_l;
fsSettings["RIGHT.D"] >> D_r;
int rows_l = fsSettings["LEFT.height"];
int cols_l = fsSettings["LEFT.width"];
int rows_r = fsSettings["RIGHT.height"];
int cols_r = fsSettings["RIGHT.width"];
if(K_l.empty() || K_r.empty() || P_l.empty() || P_r.empty() || R_l.empty() || R_r.empty() || D_l.empty() || D_r.empty() ||
rows_l==0 || rows_r==0 || cols_l==0 || cols_r==0)
{
cerr << "ERROR: Calibration parameters to rectify stereo are missing!" << endl;
return -1;
}
cv::initUndistortRectifyMap(K_l,D_l,R_l,P_l.rowRange(0,3).colRange(0,3),cv::Size(cols_l,rows_l),CV_32F,igb.M1l,igb.M2l);
cv::initUndistortRectifyMap(K_r,D_r,R_r,P_r.rowRange(0,3).colRange(0,3),cv::Size(cols_r,rows_r),CV_32F,igb.M1r,igb.M2r);
}
// Maximum delay, 5 seconds
//ros::Subscriber sub_imu = n.subscribe("/imu", 1000, &ImuGrabber::GrabImu, &imugb);
//ros::Subscriber sub_img_left = n.subscribe("/camera/left/image_raw", 100, &ImageGrabber::GrabImageLeft,&igb);
//ros::Subscriber sub_img_right = n.subscribe("/camera/right/image_raw", 100, &ImageGrabber::GrabImageRight,&igb);
ros::Subscriber sub_imu = n.subscribe("/zed2/zed_node/imu/data_raw", 1000, &ImuGrabber::GrabImu, &imugb);
ros::Subscriber sub_img_left = n.subscribe("/zed2/zed_node/left/image_rect_color", 100, &ImageGrabber::GrabImageLeft,&igb);
ros::Subscriber sub_img_right = n.subscribe("/zed2/zed_node/right/image_rect_color", 100, &ImageGrabber::GrabImageRight,&igb);
std::thread sync_thread(&ImageGrabber::SyncWithImu,&igb);
ros::spin();
return 0;
}
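// The image callbacks keep only the latest frame: any frame still waiting in the buffer is dropped before the new one is pushed.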
void ImageGrabber::GrabImageLeft(const sensor_msgs::ImageConstPtr &img_msg)
{
mBufMutexLeft.lock();
if (!imgLeftBuf.empty())
imgLeftBuf.pop();
imgLeftBuf.push(img_msg);
mBufMutexLeft.unlock();
}
void ImageGrabber::GrabImageRight(const sensor_msgs::ImageConstPtr &img_msg)
{
mBufMutexRight.lock();
if (!imgRightBuf.empty())
imgRightBuf.pop();
imgRightBuf.push(img_msg);
mBufMutexRight.unlock();
}
cv::Mat ImageGrabber::GetImage(const sensor_msgs::ImageConstPtr &img_msg)
{
// Copy the ros image message to cv::Mat.
cv_bridge::CvImageConstPtr cv_ptr;
try
{
cv_ptr = cv_bridge::toCvShare(img_msg, sensor_msgs::image_encodings::MONO8);
}
catch (cv_bridge::Exception& e)
{
ROS_ERROR("cv_bridge exception: %s", e.what());
}
if(cv_ptr->image.type()==0)
{
return cv_ptr->image.clone();
}
else
{
std::cout << "Error type" << std::endl;
return cv_ptr->image.clone();
}
}
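// Main loop: pair left/right frames whose timestamps differ by at most maxTimeDiff, gather all IMU samples up to the left image time, then hand everything to ORB-SLAM3's TrackStereo().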
void ImageGrabber::SyncWithImu()
{
const double maxTimeDiff = 0.01;
while(1)
{
cv::Mat imLeft, imRight;
double tImLeft = 0, tImRight = 0;
if (!imgLeftBuf.empty()&&!imgRightBuf.empty()&&!mpImuGb->imuBuf.empty())
{
tImLeft = imgLeftBuf.front()->header.stamp.toSec();
tImRight = imgRightBuf.front()->header.stamp.toSec();
this->mBufMutexRight.lock();
while((tImLeft-tImRight)>maxTimeDiff && imgRightBuf.size()>1)
{
imgRightBuf.pop();
tImRight = imgRightBuf.front()->header.stamp.toSec();
}
this->mBufMutexRight.unlock();
this->mBufMutexLeft.lock();
while((tImRight-tImLeft)>maxTimeDiff && imgLeftBuf.size()>1)
{
imgLeftBuf.pop();
tImLeft = imgLeftBuf.front()->header.stamp.toSec();
}
this->mBufMutexLeft.unlock();
if((tImLeft-tImRight)>maxTimeDiff || (tImRight-tImLeft)>maxTimeDiff)
{
// std::cout << "big time difference" << std::endl;
continue;
}
if(tImLeft>mpImuGb->imuBuf.back()->header.stamp.toSec())
continue;
this->mBufMutexLeft.lock();
imLeft = GetImage(imgLeftBuf.front());
imgLeftBuf.pop();
this->mBufMutexLeft.unlock();
this->mBufMutexRight.lock();
imRight = GetImage(imgRightBuf.front());
imgRightBuf.pop();
this->mBufMutexRight.unlock();
vector<ORB_SLAM3::IMU::Point> vImuMeas;
mpImuGb->mBufMutex.lock();
if(!mpImuGb->imuBuf.empty())
{
// Load imu measurements from buffer
vImuMeas.clear();
while(!mpImuGb->imuBuf.empty() && mpImuGb->imuBuf.front()->header.stamp.toSec()<=tImLeft)
{
double t = mpImuGb->imuBuf.front()->header.stamp.toSec();
cv::Point3f acc(mpImuGb->imuBuf.front()->linear_acceleration.x, mpImuGb->imuBuf.front()->linear_acceleration.y, mpImuGb->imuBuf.front()->linear_acceleration.z);
cv::Point3f gyr(mpImuGb->imuBuf.front()->angular_velocity.x, mpImuGb->imuBuf.front()->angular_velocity.y, mpImuGb->imuBuf.front()->angular_velocity.z);
vImuMeas.push_back(ORB_SLAM3::IMU::Point(acc,gyr,t));
mpImuGb->imuBuf.pop();
}
}
mpImuGb->mBufMutex.unlock();
if(mbClahe)
{
mClahe->apply(imLeft,imLeft);
mClahe->apply(imRight,imRight);
}
if(do_rectify)
{
cv::remap(imLeft,imLeft,M1l,M2l,cv::INTER_LINEAR);
cv::remap(imRight,imRight,M1r,M2r,cv::INTER_LINEAR);
}
mpSLAM->TrackStereo(imLeft,imRight,tImLeft,vImuMeas);
std::chrono::milliseconds tSleep(1);
std::this_thread::sleep_for(tSleep);
}
}
}
void ImuGrabber::GrabImu(const sensor_msgs::ImuConstPtr &imu_msg)
{
mBufMutex.lock();
imuBuf.push(imu_msg);
mBufMutex.unlock();
return;
}
Add the following information to the zed2_stereo_inertial.yaml file (these are my camera parameters; modify them according to your own setup).
I directly use the D, K, R, and P data from the camera_info topics of the ZED's raw left and right images; you can find these topics with rostopic list. In addition, the ZED 2 baseline is 120 mm, which is consistent with Camera.bf below: 0.12 m × fx ≈ 0.12 × 256.53 ≈ 30.78.
%YAML:1.0
#--------------------------------------------------------------------------------------------
# Camera Parameters. Adjust them!
#--------------------------------------------------------------------------------------------
Camera.type: "PinHole"
# Camera calibration and distortion parameters (OpenCV) (equal for both cameras after stereo rectification)
Camera.fx: 256.5277384633465
Camera.fy: 258.08249705047217
Camera.cx: 325.50319459226085
Camera.cy: 180.96517806223522
Camera.k1: -0.020457937535071292
Camera.k2: 0.01104746035697357
Camera.p1: 0.00020521550183980535
Camera.p2: -0.0015638016748186173
Camera.width: 640
Camera.height: 360
# Camera frames per second
Camera.fps: 15.0
# stereo baseline times fx
Camera.bf: 30.7824
# Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale)
Camera.RGB: 1
# Close/Far threshold. Baseline times.
ThDepth: 40.0 # 35
# Transformation from camera 0 to body-frame (imu)
Tbc: !!opencv-matrix
rows: 4
cols: 4
dt: f
data: [ 0.011189492088057723, -0.005170435177808852, 0.999924028047573, -0.030171769239718378,
-0.9999089154286149, -0.0076051555206256005, 0.011149998021449559, 0.006834916768468505,
0.007546927400109649, -0.9999577132107034, -0.0052550620584632945, -0.018995636408175094,
0.0, 0.0, 0.0, 1.0]
# IMU noise
# obtained from the zed-examples project: tutorials/tutorial 7 - sensor data/
IMU.NoiseGyro: 8.6e-05 # 1.6968e-04
IMU.NoiseAcc: 0.0014 # 2.0000e-3
IMU.GyroWalk: 2.2e-06
IMU.AccWalk: 8.0e-05 # 3.0000e-3
IMU.Frequency: 200
#--------------------------------------------------------------------------------------------
# Stereo Rectification. Only if you need to pre-rectify the images.
# Camera.fx, .fy, etc must be the same as in LEFT.P
#--------------------------------------------------------------------------------------------
LEFT.height: 360
LEFT.width: 640
LEFT.D: !!opencv-matrix
rows: 1
cols: 5
dt: d
data: [-0.040750399231910706, 0.009019049815833569, -0.004655580036342144, -0.0006361529813148081, 0.0003129479882773012]
LEFT.K: !!opencv-matrix
rows: 3
cols: 3
dt: d
data: [264.1424865722656, 0.0, 328.0299987792969, 0.0, 263.9525146484375, 180.45175170898438, 0.0, 0.0, 1.0]
LEFT.R: !!opencv-matrix
rows: 3
cols: 3
dt: d
data: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
LEFT.Rf: !!opencv-matrix
rows: 3
cols: 3
dt: f
data: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
LEFT.P: !!opencv-matrix
rows: 3
cols: 4
dt: d
data: [264.1424865722656, 0.0, 328.0299987792969, 0.0, 0.0, 263.9525146484375, 180.45175170898438, 0.0, 0.0, 0.0, 1.0, 0.0]
RIGHT.height: 360
RIGHT.width: 640
RIGHT.D: !!opencv-matrix
rows: 1
cols: 5
dt: d
data: [-0.03843430057168007, 0.005912320222705603, -0.0034095800947397947, 6.041819870006293e-05, -0.00011238799925195053]
RIGHT.K: !!opencv-matrix
rows: 3
cols: 3
dt: d
data: [263.0425109863281, 0.0, 326.2799987792969, 0.0, 262.93499755859375, 180.3209991455078, 0.0, 0.0, 1.0]
RIGHT.R: !!opencv-matrix
rows: 3
cols: 3
dt: d
data: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
RIGHT.P: !!opencv-matrix
rows: 3
cols: 4
dt: d
data: [263.0425109863281, 0.0, 326.2799987792969, -31.668317794799805, 0.0, 262.93499755859375, 180.3209991455078, 0.0, 0.0, 0.0, 1.0, 0.0]
#--------------------------------------------------------------------------------------------
# ORB Parameters
#--------------------------------------------------------------------------------------------
# ORB Extractor: Number of features per image
ORBextractor.nFeatures: 2200
# ORB Extractor: Scale factor between levels in the scale pyramid
ORBextractor.scaleFactor: 1.2
# ORB Extractor: Number of levels in the scale pyramid
ORBextractor.nLevels: 8
# ORB Extractor: Fast threshold
# Image is divided in a grid. At each cell FAST are extracted imposing a minimum response.
# Firstly we impose iniThFAST. If no corners are detected we impose a lower value minThFAST
# You can lower these values if your images have low contrast
ORBextractor.iniThFAST: 20
ORBextractor.minThFAST: 7
#--------------------------------------------------------------------------------------------
# Viewer Parameters
#--------------------------------------------------------------------------------------------
Viewer.KeyFrameSize: 0.05
Viewer.KeyFrameLineWidth: 1
Viewer.GraphLineWidth: 0.9
Viewer.PointSize: 2
Viewer.CameraSize: 0.08
Viewer.CameraLineWidth: 3
Viewer.ViewpointX: 0
Viewer.ViewpointY: -0.7
Viewer.ViewpointZ: -1.8
Viewer.ViewpointF: 500
Edit the CMakeLists.txt file
cd ORB_SLAM3/Examples_old/ROS/ORB_SLAM3
gedit CMakeLists.txt
Add at the bottom
rosbuild_add_executable(zed2_stereo_inertial
src/zed2_stereo_inertial.cc
)
target_link_libraries(zed2_stereo_inertial
${LIBS}
)
Recompile. If the executable cannot be found afterwards, run
rospack profile
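Finally, a minimal sketch of how to run the node, based on the usage string in zed2_stereo_inertial.cc; the vocabulary and yaml paths are placeholders to adjust:
# Terminal 1: start the ZED 2 driver
roslaunch zed_wrapper zed2.launch
# Terminal 2: start the stereo-inertial node; "true" enables rectification using the LEFT./RIGHT. blocks in the yaml
rosrun ORB_SLAM3 zed2_stereo_inertial /path/to/ORB_SLAM3/Vocabulary/ORBvoc.txt /path/to/zed2_stereo_inertial.yaml true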