I. Project Background
An RGB-D camera can capture color and depth images simultaneously, which makes it well suited for 3D reconstruction and modeling of indoor environments. This technology has broad application prospects in indoor navigation, scene understanding, smart homes, augmented reality, and similar fields. On a mobile robot, the reconstructed indoor model can be converted into an occupancy grid map, which plays an important role in autonomous navigation and obstacle avoidance.
In this project, the depth and color images captured by an RGB-D camera in an indoor environment are converted into a 3D map in real time.
II. Environment
Ubuntu 18.04.6 LTS (dual boot)
ROS1 (Melodic 1.14.12)
OpenCV 3.4
Eigen3 3.1.0
PCL 1.12
Pangolin 1.0
VTK 8.2
III. Acquiring Camera Data
The open-source ROS1 driver for the MV-EB435i camera was extended (secondary development) to adjust the format of the depth and color images required by this project, the capture frame rate, and the ROS topic names.
Camera Startup
1. Obtaining the camera parameters
MV3D_RGBD_GetDeviceList() is used to enumerate the camera and obtain its parameters: the distortion coefficients [k1, k2, p1, p2, k3] and the intrinsic matrix [fx, 0.0, cx, 0.0, fy, cy, 0.0, 0.0, 1.0]. Here fx = f/dx, where the focal length f is in mm and the pixel size dx is in mm/pixel, so fx is expressed in pixels.
// nDevNum holds the number of attached devices, queried from the SDK beforehand.
std::vector<MV3D_RGBD_DEVICE_INFO> devs(nDevNum);
ASSERT_OK(MV3D_RGBD_GetDeviceList(DeviceType_USB, &devs[0], nDevNum, &nDevNum));
for (unsigned int i = 0; i < nDevNum; i++)
{
    LOG("Index[%d]. SerialNum[%s] IP[%s] name[%s].\r\n", i, devs[i].chSerialNumber, devs[i].SpecialInfo.stNetInfo.chCurrentIp, devs[i].chModelName);
}
LOG("---------------------------------------------------------------");
The camera intrinsic and extrinsic parameters are needed later in the ORB-SLAM3 yaml configuration file.
The extrinsic parameters [R | t] describe the transformation between the world frame and the camera frame.
The intrinsic matrix K projects points from the camera frame onto the image plane, and the distortion coefficients are used in the camera model's distortion-compensation function to undistort the image.
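For reference, the standard pinhole model with radial-tangential distortion (the same parameterization used by OpenCV and configured in the yaml file below) is:

\[
\mathbf{P}_c = R\,\mathbf{P}_w + \mathbf{t}, \qquad
\begin{pmatrix} u \\ v \\ 1 \end{pmatrix} \sim K \begin{pmatrix} X_c/Z_c \\ Y_c/Z_c \\ 1 \end{pmatrix}, \qquad
K = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}
\]

With normalized coordinates $x = X_c/Z_c$, $y = Y_c/Z_c$ and $r^2 = x^2 + y^2$, the distortion-compensation function maps them to

\[
\begin{aligned}
x_d &= x\,(1 + k_1 r^2 + k_2 r^4 + k_3 r^6) + 2 p_1 x y + p_2 (r^2 + 2x^2) \\
y_d &= y\,(1 + k_1 r^2 + k_2 r^4 + k_3 r^6) + p_1 (r^2 + 2y^2) + 2 p_2 x y
\end{aligned}
\]

before the intrinsic matrix K is applied.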
2. Obtaining the color and depth images
The color and depth images are obtained through the open-source ROS1 nodelet (nodeletclass), with the capture frame rate set to 30 fps.
Steps 1 and 2 above cover all of the data that needs to be obtained from the RGB-D camera; a minimal sketch of how such a driver node publishes these images is given below.
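The following sketch is purely illustrative, not the vendor driver or the project's nodelet: it shows how a ROS1 node could publish color and depth frames on /rgb/image_raw and /depth/image_raw at 30 Hz with a shared timestamp, using image_transport and cv_bridge. getColorFrame() and getDepthFrame() are hypothetical placeholders for the camera SDK calls; the topic names, frame rate, and image size match what the rest of this project expects.

#include <ros/ros.h>
#include <image_transport/image_transport.h>
#include <cv_bridge/cv_bridge.h>
#include <std_msgs/Header.h>
#include <opencv2/core/core.hpp>

// Placeholders standing in for the camera SDK: one color (8UC3, BGR) and one
// depth (16UC1, millimeters) frame per call.
cv::Mat getColorFrame() { return cv::Mat(360, 640, CV_8UC3, cv::Scalar(0, 0, 0)); }
cv::Mat getDepthFrame() { return cv::Mat(360, 640, CV_16UC1, cv::Scalar(0)); }

int main(int argc, char** argv)
{
    ros::init(argc, argv, "rgbd_driver_sketch");
    ros::NodeHandle nh;
    image_transport::ImageTransport it(nh);
    image_transport::Publisher rgb_pub   = it.advertise("/rgb/image_raw", 1);
    image_transport::Publisher depth_pub = it.advertise("/depth/image_raw", 1);

    ros::Rate rate(30);  // 30 fps, matching Camera.fps in the yaml file
    while (ros::ok())
    {
        std_msgs::Header header;
        header.stamp = ros::Time::now();   // color and depth share one timestamp

        rgb_pub.publish(cv_bridge::CvImage(header, "bgr8", getColorFrame()).toImageMsg());
        depth_pub.publish(cv_bridge::CvImage(header, "16UC1", getDepthFrame()).toImageMsg());

        ros::spinOnce();
        rate.sleep();
    }
    return 0;
}

Publishing both images with the same header timestamp is what allows the ApproximateTime synchronizer in ros_rgbd.cc (shown later) to pair them reliably.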
3. Starting the camera in ROS
Because launching the camera requires refreshing environment variables and other repetitive steps before running the launch file, an executable script run_eb435i.sh was added; executable permission is granted with chmod +x ./run_eb435i.sh. The script is shown below (it must be run with administrator privileges). The left terminal window in the figure below shows the command being run.
source /etc/profile
source ./Examples/ROS/mv_ws/devel/setup.bash
roslaunch base_nodelet nodeletclass1.launch
IV. ORB-SLAM3 Configuration
Before running the algorithm, the environment must be set up as described in Section II. Once that is done, only the algorithm's configuration files need to be prepared.
1. yaml file configuration
The first file to configure is Examples/RGB-D/MV-EB435i.yaml, which contains the camera intrinsics and distortion parameters, the dense point-cloud resolution, and the filter coefficients. The yaml configuration file is shown below.
%YAML:1.0
#--------------------------------------------------------------------------------------------
# Camera Parameters. Adjust them!
#--------------------------------------------------------------------------------------------
File.version: "1.0"
Camera.type: "PinHole"
# Camera calibration and distortion parameters (OpenCV)
# Camera intrinsics #
Camera1.fx: 320.081238
Camera1.fy: 319.866180
Camera1.cx: 307.801514
Camera1.cy: 177.549438
# Distortion coefficients #
Camera1.k1: -0.202915
Camera1.k2: 0.038161
Camera1.p1: 0.000202
Camera1.p2: -0.000017
Camera1.k3: 0.000000
# Output image size #
Camera.width: 640
Camera.height: 360
# Camera frames per second
Camera.fps: 30
# Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale)
Camera.RGB: 1
# Close/Far threshold. Baseline times.
Stereo.ThDepth: 40.0
Stereo.b: 0.07732
# Depth map values factor
RGBD.DepthMapFactor: 10.0 # 1.0 for ROS bags, 5000.0 otherwise; in ROS the depth factor is 1:1, while in standalone ORB-SLAM it has to be computed separately
#--------------------------------------------------------------------------------------------
# ORB Parameters
#--------------------------------------------------------------------------------------------
# ORB Extractor: Number of features per image
ORBextractor.nFeatures: 1000
# ORB Extractor: Scale factor between levels in the scale pyramid
ORBextractor.scaleFactor: 1.2
# ORB Extractor: Number of levels in the scale pyramid
ORBextractor.nLevels: 8
# ORB Extractor: Fast threshold
# Image is divided in a grid. At each cell FAST are extracted imposing a minimum response.
# Firstly we impose iniThFAST. If no corners are detected we impose a lower value minThFAST
# You can lower these values if your images have low contrast
ORBextractor.iniThFAST: 20
ORBextractor.minThFAST: 7
#--------------------------------------------------------------------------------------------
# Viewer Parameters
#--------------------------------------------------------------------------------------------
Viewer.KeyFrameSize: 0.05
Viewer.KeyFrameLineWidth: 1.0
Viewer.GraphLineWidth: 0.9
Viewer.PointSize: 2.0
Viewer.CameraSize: 0.08
Viewer.CameraLineWidth: 3.0
Viewer.ViewpointX: 0.0
Viewer.ViewpointY: -0.7
Viewer.ViewpointZ: -1.8
Viewer.ViewpointF: 500.0
PointCloudMapping.Resolution: 0.02 # dense point-cloud resolution (voxel size)
meank: 40 # number of neighbor points queried per point in the statistical filter
thresh: 1.5 # threshold for deciding whether a point is an outlier
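These three parameters correspond to standard PCL filters. Below is a minimal sketch of how a dense-mapping back end typically applies them: a voxel-grid downsample at the configured resolution, followed by statistical outlier removal using meank and thresh. The function name and the way the values are passed in are illustrative, not the project's actual code.

#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
#include <pcl/filters/voxel_grid.h>
#include <pcl/filters/statistical_outlier_removal.h>

// Downsample and de-noise a dense cloud using the yaml parameters:
// resolution = PointCloudMapping.Resolution, meank = meank, thresh = thresh.
pcl::PointCloud<pcl::PointXYZRGB>::Ptr FilterCloud(
    const pcl::PointCloud<pcl::PointXYZRGB>::Ptr& input,
    float resolution, int meank, double thresh)
{
    // Voxel-grid filter: keep one point per cube of side `resolution` (meters).
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr voxelized(new pcl::PointCloud<pcl::PointXYZRGB>);
    pcl::VoxelGrid<pcl::PointXYZRGB> voxel;
    voxel.setInputCloud(input);
    voxel.setLeafSize(resolution, resolution, resolution);
    voxel.filter(*voxelized);

    // Statistical outlier removal: a point is dropped if its mean distance to
    // its `meank` nearest neighbors exceeds the global mean by `thresh` stddevs.
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr filtered(new pcl::PointCloud<pcl::PointXYZRGB>);
    pcl::StatisticalOutlierRemoval<pcl::PointXYZRGB> sor;
    sor.setInputCloud(voxelized);
    sor.setMeanK(meank);
    sor.setStddevMulThresh(thresh);
    sor.filter(*filtered);

    return filtered;
}

With the values above, the map keeps one point per 2 cm voxel and discards points whose average distance to their 40 nearest neighbors is more than 1.5 standard deviations above the mean.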
2. ROS launch file configuration
When launching the algorithm, several parameters must be configured: the camera type, the vocabulary (dictionary) file, the camera configuration file (yaml), and the topics of the depth and color images.
<launch>
<node pkg="ORB_SLAM3" type="RGBD" name="RGBD" args="/home/cc/ORB_SLAM3_DENSE_LOOP1/Vocabulary/ORBvoc.txt /home/cc/ORB_SLAM3_DENSE_LOOP1/Examples/RGB-D/MV-EB435i.yaml _rgb:=/rgb/image_raw _depth:=/depth/image_raw" output="screen"/>
</launch>
The _rgb:= and _depth:= arguments in the launch file are placed on the ROS parameter server and read by ros_rgbd.cc through its private node handle. The code below is the main program of the node.
#include<iostream>
#include<algorithm>
#include<fstream>
#include<chrono>
#include<ros/ros.h>
#include <cv_bridge/cv_bridge.h>
#include <message_filters/subscriber.h>
#include <message_filters/time_synchronizer.h>
#include <message_filters/sync_policies/approximate_time.h>
#include <sensor_msgs/PointCloud2.h>
#include<pcl/point_cloud.h>
#include <pcl_conversions/pcl_conversions.h>
//#include<pcl/io/pcd_io.h>
#include<opencv2/core/core.hpp>
#include"../../../include/System.h"
using namespace std;
class ImageGrabber
{
public:
    ImageGrabber(ORB_SLAM3::System* pSLAM):mpSLAM(pSLAM){}

    void GrabRGBD(const sensor_msgs::ImageConstPtr& msgRGB,const sensor_msgs::ImageConstPtr& msgD);

    ORB_SLAM3::System* mpSLAM;
};
int main(int argc, char **argv)
{
    ros::init(argc, argv, "RGBD");
    ros::start();

    if(argc != 3)
    {
        cerr << endl << "Usage: rosrun ORB_SLAM3 RGBD path_to_vocabulary path_to_settings" << endl;
        ros::shutdown();
        return 1;
    }

    // Create SLAM system. It initializes all system threads and gets ready to process frames.
    ORB_SLAM3::System SLAM(argv[1],argv[2],ORB_SLAM3::System::RGBD,true);

    ImageGrabber igb(&SLAM);

    //ros::NodeHandle nh;
    // A private node handle ("~") is needed so the _rgb/_depth parameters can be passed in.
    ros::NodeHandle nh("~");
    std::string rgb_topic = nh.param<std::string>("rgb", "/camera/rgb/image_color");
    std::string depth_topic = nh.param<std::string>("depth", "/camera/depth/image");
    cout << "rgb: " << rgb_topic << endl;
    cout << "depth: " << depth_topic << endl;

    message_filters::Subscriber<sensor_msgs::Image> rgb_sub(nh, rgb_topic, 100);
    message_filters::Subscriber<sensor_msgs::Image> depth_sub(nh, depth_topic, 100);
    // Alternative topics, e.g. for the TUM RGB-D dataset:
    //message_filters::Subscriber<sensor_msgs::Image> rgb_sub(nh, "/camera/rgb/image_color", 100);
    //message_filters::Subscriber<sensor_msgs::Image> depth_sub(nh, "/camera/depth/image", 100);

    typedef message_filters::sync_policies::ApproximateTime<sensor_msgs::Image, sensor_msgs::Image> sync_pol;
    message_filters::Synchronizer<sync_pol> sync(sync_pol(10), rgb_sub,depth_sub);
    sync.registerCallback(boost::bind(&ImageGrabber::GrabRGBD,&igb,_1,_2));

    ros::spin();

    // Save the dense point-cloud map (method provided by this dense-mapping fork).
    SLAM.save("/home/cc/ORB_SLAM3_DENSE_LOOP1/地图.pcd");

    // Stop all threads
    SLAM.Shutdown();

    // Save camera trajectories
    SLAM.SaveKeyFrameTrajectoryTUM("/home/cc/ORB_SLAM3_DENSE_LOOP1/关键帧轨迹.txt");
    SLAM.SaveTrajectoryTUM("/home/cc/ORB_SLAM3_DENSE_LOOP1/相机轨迹.txt");

    ros::shutdown();

    return 0;
}
void ImageGrabber::GrabRGBD(const sensor_msgs::ImageConstPtr& msgRGB,const sensor_msgs::ImageConstPtr& msgD)
{
    // Copy the ros image message to cv::Mat.
    cv_bridge::CvImageConstPtr cv_ptrRGB;
    try
    {
        cv_ptrRGB = cv_bridge::toCvShare(msgRGB);
    }
    catch (cv_bridge::Exception& e)
    {
        ROS_ERROR("cv_bridge exception: %s", e.what());
        return;
    }

    cv_bridge::CvImageConstPtr cv_ptrD;
    try
    {
        cv_ptrD = cv_bridge::toCvShare(msgD);
    }
    catch (cv_bridge::Exception& e)
    {
        ROS_ERROR("cv_bridge exception: %s", e.what());
        return;
    }

    mpSLAM->TrackRGBD(cv_ptrRGB->image,cv_ptrD->image,cv_ptrRGB->header.stamp.toSec());
}
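For context, the sketch below illustrates the geometry behind the dense point-cloud mapping: every depth pixel is divided by RGBD.DepthMapFactor and back-projected through the intrinsics (fx, fy, cx, cy) into a colored 3D point in the camera frame. This is only an illustration under those assumptions, not the project's actual PointCloudMapping code; the function name and the 10 m depth cutoff are chosen here for the example.

#include <cstdint>
#include <opencv2/core/core.hpp>
#include <pcl/point_types.h>
#include <pcl/point_cloud.h>

// Back-project one RGB-D frame into a colored point cloud in the camera frame.
// color: CV_8UC3 (BGR), depth: CV_16UC1 or CV_32FC1, depthFactor: RGBD.DepthMapFactor.
pcl::PointCloud<pcl::PointXYZRGB>::Ptr FrameToCloud(
    const cv::Mat& color, const cv::Mat& depth,
    float fx, float fy, float cx, float cy, float depthFactor)
{
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZRGB>);
    for (int v = 0; v < depth.rows; ++v)
    {
        for (int u = 0; u < depth.cols; ++u)
        {
            // Depth in meters after applying the depth map factor.
            float d = (depth.type() == CV_16UC1)
                          ? depth.at<uint16_t>(v, u) / depthFactor
                          : depth.at<float>(v, u) / depthFactor;
            if (d <= 0.0f || d > 10.0f)   // skip invalid or far measurements
                continue;

            pcl::PointXYZRGB p;
            p.z = d;                       // pinhole back-projection
            p.x = (u - cx) * d / fx;
            p.y = (v - cy) * d / fy;

            const cv::Vec3b& bgr = color.at<cv::Vec3b>(v, u);
            p.b = bgr[0];
            p.g = bgr[1];
            p.r = bgr[2];
            cloud->points.push_back(p);
        }
    }
    cloud->is_dense = false;
    return cloud;
}

Each per-keyframe cloud is then transformed by the keyframe pose and merged into the global map, which is where the voxel-grid and statistical filters configured in the yaml file are applied.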
3. Startup script configuration
Using an executable script to start the program greatly reduces the hassle of opening many terminal windows and repeatedly typing long commands during testing; the commands are simply placed in the script. The contents of the executable file run_eb435i_mapping.sh are shown below.
echo "-------------------------------------------"
echo "Building ROS nodes"
echo "-------------------------------------------"
source /home/cc/ORB_SLAM3_DENSE_LOOP1/Examples/ROS/ORB_SLAM3/build/devel/setup.bash
roslaunch ORB_SLAM3 run_rgbd_mv.launch
Start ROS in one terminal window, then run run_eb435i_mapping.sh and run_eb435i.sh in two other windows to start the algorithm and the camera.
V. Indoor Environment Modeling
Code repository:
https://github.com/iCat-cc/HK_Perception
Team:
胖头鱼群
朱旭 李光兰 王海琛
## The demo video is included in the attachment ##