# ROS Wrapper for Intel® RealSense™ Devices
These are packages for using Intel RealSense cameras (D400 series, SR300 camera and T265 Tracking Module) with ROS.
This version supports Kinetic, Melodic and Noetic distributions.
For running in a ROS2 environment, please switch to the [ros2 branch](https://github.com/IntelRealSense/realsense-ros/tree/ros2). </br>
LibRealSense2 supported version: v2.50.0 (see [realsense2_camera release notes](https://github.com/IntelRealSense/realsense-ros/releases))
## Installation Instructions
### Ubuntu
#### Step 1: Install the ROS distribution
- #### Install [ROS Kinetic](http://wiki.ros.org/kinetic/Installation/Ubuntu) on Ubuntu 16.04, [ROS Melodic](http://wiki.ros.org/melodic/Installation/Ubuntu) on Ubuntu 18.04, or [ROS Noetic](http://wiki.ros.org/noetic/Installation/Ubuntu) on Ubuntu 20.04.
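After installing the distribution, remember that each new terminal needs the ROS environment sourced before any of the commands below will work. On Ubuntu this is typically:
```bash
# Source the ROS environment (replace $ROS_DISTRO with kinetic, melodic or noetic if it is not already set)
source /opt/ros/$ROS_DISTRO/setup.bash
```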
### Windows
#### Step 1: Install the ROS distribution
- #### Install [ROS Melodic or later on Windows 10](https://wiki.ros.org/Installation/Windows)
### There are 2 sources to install realsense2_camera from:
* ### Method 1: The ROS distribution:
*Ubuntu*
realsense2_camera is available as a debian package of the ROS distribution. It can be installed by typing:
```sudo apt-get install ros-$ROS_DISTRO-realsense2-camera```
This will install both realsense2_camera and its dependencies, including the librealsense2 library and matching udev rules.
Notice:
* The version of librealsense2 is almost always behind the one available in the RealSense™ official repository.
* librealsense2 is not built to use the native v4l2 driver but the less stable RS-USB protocol, because the latter is more general and works on a larger variety of platforms.
* realsense2_description is available as a separate debian package of the ROS distribution. It includes the 3D models of the devices and is necessary for running launch files that include these models (e.g. rs_d435_camera_with_model.launch). It can be installed by typing:
`sudo apt-get install ros-$ROS_DISTRO-realsense2-description`
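To check which versions the ROS repositories actually provide, and compare them against the notice above, one option is to query apt directly:
```bash
# Show the installed/candidate version of the wrapper and the librealsense2 packages it pulled in
apt-cache policy ros-$ROS_DISTRO-realsense2-camera
dpkg -l | grep librealsense2
```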
*Windows*
**Chocolatey distribution Coming soon**
* ### Method 2: The RealSense™ distribution:
> This option is demonstrated in the [.travis.yml](https://github.com/intel-ros/realsense/blob/development/.travis.yml) file. It basically summarizes the elaborate instructions in the following 2 steps:
### Step 1: Install the latest Intel® RealSense™ SDK 2.0
*Ubuntu*
Install librealsense2 debian package:
* Jetson users - use the [Jetson Installation Guide](https://github.com/IntelRealSense/librealsense/blob/master/doc/installation_jetson.md)
* Otherwise, install from [Linux Debian Installation Guide](https://github.com/IntelRealSense/librealsense/blob/master/doc/distribution_linux.md#installing-the-packages)
- In that case, treat yourself as a developer: make sure you follow the instructions to also install the librealsense2-dev and librealsense2-dkms packages.
*Windows*
Install using vcpkg
`vcpkg install realsense2:x64-windows`
#### OR
- #### Build from sources by downloading the latest [Intel® RealSense™ SDK 2.0](https://github.com/IntelRealSense/librealsense/releases/tag/v2.50.0) and follow the instructions under [Linux Installation](https://github.com/IntelRealSense/librealsense/blob/master/doc/installation.md)
### Step 2: Install Intel® RealSense™ ROS from Sources
- Create a [catkin](http://wiki.ros.org/catkin#Installing_catkin) workspace
*Ubuntu*
```bash
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/src/
```
*Windows*
```batch
mkdir c:\catkin_ws\src
cd c:\catkin_ws\src
```
- Clone the latest Intel® RealSense™ ROS from [here](https://github.com/intel-ros/realsense/releases) into 'catkin_ws/src/'
```bash
git clone https://github.com/IntelRealSense/realsense-ros.git
cd realsense-ros/
git checkout `git tag | sort -V | grep -P "^2\.\d+\.\d+" | tail -1`
cd ..
```
- Make sure all dependent packages are installed. You can check the .travis.yml file for reference.
- Specifically, make sure that the ros package *ddynamic_reconfigure* is installed. If *ddynamic_reconfigure* cannot be installed using APT or if you are using *Windows* you may clone it into your workspace 'catkin_ws/src/' from [here](https://github.com/pal-robotics/ddynamic_reconfigure/tree/kinetic-devel)
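If you do need to clone *ddynamic_reconfigure* manually, a minimal sketch of the commands (the branch name is taken from the link above) is:
```bash
# Clone ddynamic_reconfigure next to realsense-ros inside the workspace sources
cd ~/catkin_ws/src
git clone -b kinetic-devel https://github.com/pal-robotics/ddynamic_reconfigure.git
```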
```bash
catkin_init_workspace
cd ..
catkin_make clean
catkin_make -DCATKIN_ENABLE_TESTING=False -DCMAKE_BUILD_TYPE=Release
catkin_make install
```
*Ubuntu*
```bash
echo "source ~/catkin_ws/devel/setup.bash" >> ~/.bashrc
source ~/.bashrc
```
*Windows*
```batch
devel\setup.bat
```
## Usage Instructions
### Start the camera node
To start the camera node in ROS:
```bash
roslaunch realsense2_camera rs_camera.launch
```
This will stream all camera sensors and publish on the appropriate ROS topics.
Other stream resolutions and frame rates can optionally be provided as parameters to the 'rs_camera.launch' file.
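For example, a lower resolution and frame rate can be requested on the command line (the argument names below are the ones used by rs_camera.launch; adjust the values to a mode your device supports):
```bash
# Request 640x480 @ 30 fps for both the depth and color streams
roslaunch realsense2_camera rs_camera.launch \
    depth_width:=640 depth_height:=480 depth_fps:=30 \
    color_width:=640 color_height:=480 color_fps:=30
```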
### Published Topics
The published topics differ according to the device and parameters.
After running the above command with a D435i attached, the following list of topics will be available (this is a partial list; for the full one, type `rostopic list`):
- /camera/color/camera_info
- /camera/color/image_raw
- /camera/color/metadata
- /camera/depth/camera_info
- /camera/depth/image_rect_raw
- /camera/depth/metadata
- /camera/extrinsics/depth_to_color
- /camera/extrinsics/depth_to_infra1
- /camera/extrinsics/depth_to_infra2
- /camera/infra1/camera_info
- /camera/infra1/image_rect_raw
- /camera/infra2/camera_info
- /camera/infra2/image_rect_raw
- /camera/gyro/imu_info
- /camera/gyro/metadata
- /camera/gyro/sample
- /camera/accel/imu_info
- /camera/accel/metadata
- /camera/accel/sample
- /diagnostics
>When using an L515 device, the list differs slightly by adding a 4-bit confidence grade (published as a mono8 image):
>- /camera/confidence/camera_info
>- /camera/confidence/image_rect_raw
>
>It also replaces the 2 infrared topics with the single available one:
>- /camera/infra/camera_info
>- /camera/infra/image_raw
The "/camera" prefix is the default and can be changed. Check the rs_multiple_devices.launch file for an example.
If using D435 or D415, the gyro and accel topics wont be available. Likewise, other topics will be available when using T265 (see below).
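A quick way to verify that data is actually flowing is to check the publish rate of one of the image topics, or to view it directly:
```bash
# Print the publish rate of the color stream
rostopic hz /camera/color/image_raw
# Or inspect the image streams interactively
rosrun rqt_image_view rqt_image_view
```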
### Launch parameters
The following parameters are available to the wrapper (an example combining several of them follows the list):
- **serial_no**: will attach to the device with the given serial number (*serial_no*). Default: attach to an available RealSense device at random.
- **usb_port_id**: will attach to the device with the given USB port (*usb_port_id*), e.g. 4-1, 4-2 etc. Default: ignore the USB port when choosing a device.
- **device_type**: will attach to a device whose name includes the given *device_type* regular expression pattern. Default: ignore device type. For example, device_type:=d435 will match d435 and d435i, while device_type:=d435(?!i) will match d435 but not d435i.
- **rosbag_filename**: will publish topics from the given rosbag file.
- **initial_reset**: on occasion the device is not closed properly and, due to firmware issues, needs to be reset. If set to true, the device will reset prior to usage.
- **reconnect_timeout**: when the driver cannot connect to the device, it will try to reconnect after this timeout (in seconds).
- **align_depth**: if set to true, will publish additional topics for the "aligned depth to color" image: ```/camera/aligned_depth_to_color/image_raw```, ```/camera/aligned_depth_to_color/camera_info```.</br>
The pointcloud, if enabled, will be built based on the aligned_depth_to_color image.</br>
- **filters**: any of the following options, separated by commas:</br>
  - ```colorizer```: will color the depth image. On the depth topic an RGB image will be published instead of the 16-bit depth values.
  - ```pointcloud```: will add a pointcloud topic, ```/camera/depth/color/points```.
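As an illustration, several of the parameters above can be combined in a single launch command (the serial number is a placeholder for your device's):
```bash
# Attach to a specific device, publish aligned depth and a pointcloud
roslaunch realsense2_camera rs_camera.launch \
    serial_no:=<serial number> align_depth:=true filters:=pointcloud
```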