
ROS Capstone Design: Turtlebot3 autorace

1.

ROS Capstone Design: Turtlebot3 autorace
Prof.: Jangwoo Kwon
TAs: Baydadaev Shokhrukh, Usmankhujaev Saidrasul

2.

Content
Prerequisites
Install auto race packages
Camera calibration
Check calibration results
Lane detection

3.

Prerequisites
What do you need for autonomous driving?
TurtleBot3 Burger
The basic model for running the AutoRace packages for autonomous driving on ROS. The provided source code, the AutoRace packages, is written for the TurtleBot3 Burger.
Remote PC
It communicates with the single-board computer (SBC) on the TurtleBot3. A laptop, desktop, or any other device running ROS 1 will work.
Raspberry Pi camera module with a camera mount
You can use a different camera module if ROS supports it.
The provided camera calibration source code is based on the (Fisheye Lens) module.
AutoRace tracks and objects
Download the 3D CAD files for the AutoRace track, traffic signs, traffic lights, and other objects at ROBOTIS-GIT/autorace.
Download the referee system at ROBOTIS-GIT/autorace_referee.

4.

Install Turtlebot3 packages
Install the AutoRace 2020 meta package on Remote PC
$ cd ~/catkin_ws/src/
$ git clone -b noetic-devel https://github.com/ROBOTIS-GIT/turtlebot3.git
$ git clone -b noetic-devel https://github.com/ROBOTIS-GIT/turtlebot3_simulations.git
$ git clone -b noetic-devel https://github.com/ROBOTIS-GIT/turtlebot3_autorace_2020.git
$ cd ~/catkin_ws && catkin_make

5.

Camera calibration
Extrinsic camera calibration
1. Open a new terminal on Remote PC and run Gazebo
$ roslaunch turtlebot3_gazebo turtlebot3_autorace_2020.launch
2. Open a new terminal and launch the intrinsic camera calibration node
$ roslaunch turtlebot3_autorace_camera intrinsic_camera_calibration.launch
3. Open a new terminal and launch the extrinsic camera calibration node
$ roslaunch turtlebot3_autorace_camera extrinsic_camera_calibration.launch mode:=calibration
4. Execute rqt in a new terminal on Remote PC
$ rqt
5. Select Plugins > Visualization > Image View and create two image view windows.
6. Select the /camera/image_extrinsic_calib/compressed topic in one window and /camera/image_projected in the other.
7. Execute rqt_reconfigure on Remote PC
$ rosrun rqt_reconfigure rqt_reconfigure

6.

Camera calibration (cont.)
8. Adjust the /camera/image_projection and /camera/image_compensation_projection parameters.
Change the /camera/image_projection parameter values; they affect the /camera/image_extrinsic_calib/compressed topic.
Extrinsic camera calibration modifies the perspective of the image inside the red trapezoid.
After that, overwrite each value in the YAML files
in turtlebot3_autorace_camera/calibration/extrinsic_calibration/. This will save the current calibration parameters
so that they can be loaded later.
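The exact keys in those YAML files are defined by the package itself, so check the files shipped in turtlebot3_autorace_camera/calibration/extrinsic_calibration/ before editing. As an illustrative sketch only (the key names and values below are hypothetical), a projection calibration file has roughly this shape:

```yaml
# Hypothetical projection YAML: corners of the red trapezoid used to build
# the ground-projected (bird's-eye) view. Values depend on your camera mount.
camera:
  extrinsic:
    projection:
      top_x: 72
      top_y: 4
      bottom_x: 115
      bottom_y: 120
```

Saving the tuned values here means the calibration survives a restart instead of living only in rqt_reconfigure.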

7.

Check calibration results
After completing the calibration, follow the steps below on Remote PC.
1. Close all terminals
2. Open a new terminal and run the AutoRace Gazebo simulation
$ roslaunch turtlebot3_gazebo turtlebot3_autorace_2020.launch
3. Open a new terminal and launch the intrinsic calibration node
$ roslaunch turtlebot3_autorace_camera intrinsic_camera_calibration.launch
4. Open a new terminal and launch the extrinsic calibration node
$ roslaunch turtlebot3_autorace_camera extrinsic_camera_calibration.launch
5. Launch the rqt image viewer
$ rqt_image_view

8.

Lane follower (lane_follow.py)
#!/usr/bin/env python
import rospy
import cv2
import numpy as np
from sensor_msgs.msg import Image
from cv_bridge import CvBridge
from geometry_msgs.msg import Twist

9.

Lane follower (lane_follow.py)
# Initialize ROS node and publisher
rospy.init_node('lane_follower')
pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
bridge = CvBridge()
# Define color thresholds for lane detection
low_yellow = np.array([20, 100, 100])
high_yellow = np.array([30, 255, 255])
# Define command speed and turning angles
linear_speed = 0.2
angular_speed = 0.5
# Define image size and region of interest
IMAGE_WIDTH = 640
IMAGE_HEIGHT = 480
ROI_TOP = 350
ROI_BOTTOM = IMAGE_HEIGHT
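The yellow thresholds above use OpenCV's HSV convention: hue in 0..179, saturation and value in 0..255. As a quick sanity check that the range is sensible, pure yellow maps to (30, 255, 255), which sits just inside the upper bound; a minimal sketch using only the standard library:

```python
import colorsys

# Pure yellow in RGB, normalized to [0, 1]
h, s, v = colorsys.rgb_to_hsv(1.0, 1.0, 0.0)

# Convert to OpenCV's HSV scale: H in 0..179, S and V in 0..255
h_cv, s_cv, v_cv = round(h * 180), round(s * 255), round(v * 255)

# (30, 255, 255) falls within low_yellow=[20,100,100] .. high_yellow=[30,255,255]
in_range = 20 <= h_cv <= 30 and 100 <= s_cv <= 255 and 100 <= v_cv <= 255
print(h_cv, s_cv, v_cv, in_range)  # 30 255 255 True
```

On a real track, lighting shifts the detected hue and saturation, which is why the lower bounds are left well below the ideal values.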

10.

Lane follower (lane_follow.py)
def image_callback(img_msg):
    # Convert ROS image message to OpenCV image
    img = bridge.imgmsg_to_cv2(img_msg, "bgr8")
    # Apply color thresholding to detect the yellow lane
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, low_yellow, high_yellow)
    masked_image = cv2.bitwise_and(img, img, mask=mask)
    # Crop the image to the region of interest
    roi = masked_image[ROI_TOP:ROI_BOTTOM, 0:IMAGE_WIDTH]
    # Convert the cropped image to grayscale and blur it
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (5, 5), 0)
    # Apply Canny edge detection
    edges = cv2.Canny(blur, 50, 150)
    # Find lane line segments using the probabilistic Hough transform
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, 50, minLineLength=30, maxLineGap=50)

11.

Lane follower (lane_follow.py)
    # Calculate the center of the lane and steer TurtleBot3 towards it
    if lines is not None:
        x_sum = 0
        count = 0
        for line in lines:
            x1, y1, x2, y2 = line[0]
            x_sum += x1 + x2
            count += 2
            cv2.line(roi, (x1, y1), (x2, y2), (0, 255, 0), 2)
        center_x = int(x_sum / count)
        error = center_x - IMAGE_WIDTH / 2
        angular_z = -error / 100.0
        cmd_vel = Twist()
        cmd_vel.linear.x = linear_speed
        cmd_vel.angular.z = angular_z * angular_speed
        pub.publish(cmd_vel)
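The steering rule above is plain arithmetic, so it can be checked offline without ROS or OpenCV. This sketch uses hypothetical segment endpoints (the kind of tuples cv2.HoughLinesP would return) for a lane whose center sits 100 px right of the image center:

```python
# Hypothetical line segments (x1, y1, x2, y2) detected in the ROI
lines = [(200, 40, 280, 10), (560, 40, 640, 10)]

IMAGE_WIDTH = 640
angular_speed = 0.5

# Same averaging as in the callback: mean x over all segment endpoints
x_sum = sum(x1 + x2 for x1, _, x2, _ in lines)
count = 2 * len(lines)
center_x = int(x_sum / count)               # (200+280+560+640)/4 = 420

error = center_x - IMAGE_WIDTH / 2          # 100.0: lane center is right of image center
angular_z = -error / 100.0 * angular_speed  # -0.5: negative z turns clockwise (right)
print(center_x, error, angular_z)           # 420 100.0 -0.5
```

Note the sign convention: in ROS a positive angular.z is counter-clockwise, so a lane center to the right of the image center produces a negative command, turning the robot toward the lane.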

12.

Lane follower (lane_follow.py)
    # Display the image
    cv2.imshow('Image', roi)
    cv2.waitKey(1)

# Subscribe to the camera image
rospy.Subscriber('/camera/rgb/image_raw', Image, image_callback)
# Run the node
rospy.spin()

13.

References
1. ROS official website https://www.ros.org/, tutorials http://wiki.ros.org/ROS/Tutorials
2. TurtleBot3 official e-Manual https://emanual.robotis.com/docs/en/platform/turtlebot3/overview/
3. ROS tutorial playlist https://www.youtube.com/playlist?list=PLLSegLrePWgIbIrA4iehUQimpvIXdd9Q
4. 'The Construct', a popular ROS community www.theconstructsim.com
5. Exploring ROS with a 2-wheeled robot https://www.theconstructsim.com/ros-projects-exploring-ros-using-2-wheeled-robot-part-1/#part13
6. 'The Construct' TurtleBot3 page https://www.theconstructsim.com/turtlebot3/