Field exploration robots have gained significant attention in recent years due to their potential applications in domains such as agriculture, environmental monitoring, and autonomous driving research. These robots are designed to operate autonomously in diverse environments, collecting data and performing tasks that would otherwise be labor-intensive or hazardous for humans. This project focuses on developing a ROS-powered robotic vehicle designed for field exploration. The vehicle can traverse varied ground surfaces autonomously and can be used for tasks such as autonomous field mapping, agricultural plant monitoring, and testing autonomous driving algorithms.
The primary goal of this project is to develop a ROS2-enabled four-wheel-drive vehicle with Ackermann steering, capable of waypoint navigation and obstacle avoidance. The vehicle carries various sensors and an onboard computer to perform Simultaneous Localization and Mapping (SLAM). The project encompasses hardware development, sensor integration, firmware and algorithm development, and extensive testing and validation to ensure the vehicle's performance and reliability in real-world scenarios.
Requirements:

- Ubuntu 24.04 LTS (Noble) or later
- ROS2 Jazzy Jalisco
- Raspberry Pi 5, BMX160 IMU, YDLidar T-mini Pro, ESP32, 7V LiPo battery
- Arduino IDE, Python 3.12
Install ROS2 Jazzy. After installing, run these commands once:

```
source /opt/ros/jazzy/setup.bash
echo "source /opt/ros/jazzy/setup.bash" >> ~/.bashrc
source ~/.bashrc
```
Create a workspace, clone the repository into its src folder, then build and source it:

```
mkdir -p ~/ros2_ws/src
cd ~/ros2_ws/src                # clone the repo inside the src folder
git clone https://github.com/TahsinOP/eYSIP-24_Field_Exploration_ROS_Vehicles.git
cd ~/ros2_ws
colcon build --symlink-install
source install/local_setup.bash
```
To avoid sourcing every time you open a new terminal instance, run this once:

```
echo "source ~/ros2_ws/install/setup.bash" >> ~/.bashrc
source ~/.bashrc                # source .bashrc as we have made changes
```
Download the Ubuntu 24.04 image for Raspberry Pi from the official Ubuntu website, and use Balena Etcher or Rpi Imager to flash it to a 32GB or 64GB SD card. Then create a workspace on the Pi, clone the repository, and build and source it:

```
mkdir -p ~/pi_ws/src
cd ~/pi_ws/src                  # clone the repo inside the src folder
git clone https://github.com/TahsinOP/eYSIP-24_Field_Exploration_ROS_Vehicles.git
cd ~/pi_ws
colcon build --symlink-install
echo "source ~/pi_ws/install/setup.bash" >> ~/.bashrc
source ~/.bashrc                # source .bashrc as we have made changes
```
Add your user to the dialout group so it can access serial devices:

```
sudo usermod -aG dialout $USER
```
Install the Python SMBus library used for I2C communication with the IMU:

```
sudo apt update
sudo apt install python3-smbus
```
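Before launching the ROS node, it can be worth confirming that the Pi sees the IMU on the I2C bus at all. The following is a minimal probe sketch, assuming the BMX160's default I2C address 0x68 and the register layout from its datasheet (chip ID at 0x00, CMD at 0x7E, accelerometer data from 0x12); verify these against the datasheet for your board:

```python
# bmx160_probe.py -- minimal I2C sanity check for the BMX160.
# Register addresses below are assumptions taken from the BMX160
# datasheet; verify them for your breakout board.
import time
import smbus

BUS_ID = 1        # /dev/i2c-1 on the Raspberry Pi
IMU_ADDR = 0x68   # default BMX160 address (0x69 if SDO is pulled high)

bus = smbus.SMBus(BUS_ID)

# Chip ID register (0x00) should read 0xD8 for a BMX160
chip_id = bus.read_byte_data(IMU_ADDR, 0x00)
print(f"Chip ID: {chip_id:#04x}")

# Wake the accelerometer: CMD register 0x7E, 0x11 = accel normal mode
bus.write_byte_data(IMU_ADDR, 0x7E, 0x11)
time.sleep(0.05)

def to_int16(lo: int, hi: int) -> int:
    """Combine two bytes (LSB first) into a signed 16-bit value."""
    value = (hi << 8) | lo
    return value - 65536 if value & 0x8000 else value

# Burst-read the six accelerometer data bytes (X/Y/Z, LSB first) from 0x12
raw = bus.read_i2c_block_data(IMU_ADDR, 0x12, 6)
ax, ay, az = (to_int16(raw[i], raw[i + 1]) for i in (0, 2, 4))
print(f"Raw accel: x={ax} y={ay} z={az}")
```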
Launch the IMU node and check the data stream:

```
ros2 launch imu_pkg imu_rviz_robot.launch.py
ros2 topic echo /imu_raw
```
Ensure the YDLidar is connected to a USB port and powered on, then launch the driver:

```
ros2 launch ydlidar_ros2_driver ydlidar_launch.py
```
```
ros2 launch rcar_viz display.launch.py
```

This will launch RViz2 with a robot model and the fixed frame set to "odom".
```
ros2 run imu_pose_estimation estimator
```

Alternatively, set the fixed frame to "base_link" in RViz to see laser data on the /scan topic.
- Low-level control: five-pin servo control using PID for precise steering (see the PID sketch below), and DC motor control for throttle.
- A PCB board was developed to ensure optimum power supply to the ESP32, motor drivers, and Raspberry Pi.
- A two-layered chassis was built with mounts for all hardware components, including the LIDAR.
- The BMX160 IMU was interfaced and calibrated to publish accurate orientation and acceleration data on a ROS2 network.
- The YDLidar T-mini Pro was interfaced with the Raspberry Pi 5 on ROS2 Jazzy to publish laser data on the /scan topic.

Figure 1: Prototype Vehicle

Figure 2: Control System Design
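The steering servo loop runs as firmware on the ESP32; as an illustration only of the PID update involved (not the firmware's actual code, and with placeholder gains), it boils down to:

```python
# pid_sketch.py -- generic PID update of the kind used for servo angle
# tracking. Gains and limits are illustrative placeholders, not the
# project's tuned values.
class PID:
    def __init__(self, kp, ki, kd, out_min, out_max):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        # Derivative of the error; zero on the first call
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        # Clamp to actuator limits (also acts as crude anti-windup)
        return max(self.out_min, min(self.out_max, out))

# Example: nudge the steering servo toward a 15-degree setpoint
pid = PID(kp=0.8, ki=0.1, kd=0.05, out_min=-90, out_max=90)
command = pid.update(setpoint=15.0, measurement=12.0, dt=0.02)
print(f"servo correction: {command:.2f} deg")
```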
- ESP32: receives the steering angle and PWM values via UART serial from the Raspberry Pi 5 (see the sketch after this list).
- Servo motor: sets the steering angle of the vehicle.
- DC motor: sets the throttle of the vehicle.
- Pure-pursuit controller: calculates the angle for steering based on the current position and the goal.
- BMX160 IMU: connected to the GPIO pins on the Raspberry Pi 5; publishes on the /imu_data topic.
- YDLidar: connected over USB; provides LiDAR data for SLAM on the /scan topic.
- Odometry/localization: fuses IMU and Vicon-derived car velocity (corresponding to PWM) data to provide an accurate estimate of the vehicle's position and velocity; publishes on the /odom topic; uses EKF (Extended Kalman Filter) and AMCL (Adaptive Monte Carlo Localization).
- SLAM: builds a map of the environment from the /scan and /odom topics.
- Planner: performs path planning and navigation; publishes on the /plan topic.
- Goal node: sets goals for the vehicle; publishes waypoints to the /goal_topic.
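For reference, commands can be pushed from the Pi to the ESP32 with pyserial. The `<angle>,<pwm>\n` line format below is a hypothetical framing for illustration; check the firmware in this repository for the packet format it actually parses:

```python
# send_cmd.py -- send steering angle and throttle PWM to the ESP32 over
# UART. The "<angle>,<pwm>\n" framing is a hypothetical example; the
# firmware in this repo may expect something different.
import serial

# /dev/ttyUSB0 and 115200 baud are typical defaults; adjust to your wiring
ser = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)

def send_command(angle_deg: float, pwm: int) -> None:
    """Transmit one steering/throttle command as an ASCII line."""
    packet = f"{angle_deg:.1f},{pwm}\n"
    ser.write(packet.encode("ascii"))

send_command(10.0, 40)   # gentle right turn at a safe PWM (25-50 range)
```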
Achieving complete low-level control of the car through keyboard commands, using ROS2 for communication over a shared network, is a significant milestone. This setup allows for real-time teleoperation, enabling users to control the car remotely using simple keyboard inputs. By implementing teleoperation nodes in ROS2, commands can be sent to the robot, providing a responsive and reliable control system.
Implementation Steps:
How to Run the Teleop Node:
```
ssh arms@192.168.0.171          # change the IP address and Pi name accordingly
ros2 run teleop_bot sub
ros2 run teleop_bot pub
```
If the Pi is short on compute, run the publisher on the remote PC using the same command stated above. You can now control the car using the WASD keys:
- W - Move Forward
- A - Turn Left
- S - Move Backward
- D - Turn Right
- H - Halt
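For orientation, a stripped-down keyboard publisher of this kind is sketched below. The /key_cmd topic and std_msgs/String message are assumptions made for the sketch; the real teleop_bot nodes may use a different interface:

```python
# teleop_sketch.py -- minimal keyboard teleop publisher (illustrative only).
# Assumes a std_msgs/String command on /key_cmd; the actual teleop_bot
# package may use a different topic and message type.
import sys
import termios
import tty

import rclpy
from rclpy.node import Node
from std_msgs.msg import String

KEYS = {"w", "a", "s", "d", "h"}  # forward, left, back, right, halt

def get_key() -> str:
    """Read one raw keypress from stdin."""
    fd = sys.stdin.fileno()
    old = termios.tcgetattr(fd)
    try:
        tty.setraw(fd)
        ch = sys.stdin.read(1)
    finally:
        termios.tcsetattr(fd, termios.TCSADRAIN, old)
    return ch.lower()

def main():
    rclpy.init()
    node = Node("teleop_sketch")
    pub = node.create_publisher(String, "/key_cmd", 10)
    try:
        while rclpy.ok():
            key = get_key()
            if key == "\x03":      # Ctrl-C exits
                break
            if key in KEYS:
                pub.publish(String(data=key))
    finally:
        node.destroy_node()
        rclpy.shutdown()

if __name__ == "__main__":
    main()
```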
Note: The throttle value has been kept constant for the time being; you can change it in the subscriber script in the teleop_bot package. Also, stay within the 25-50 PWM range for safe teleoperation.

To run the odometry node:
```
ssh arms@192.168.0.171          # change the IP address and Pi name accordingly
ros2 run localization imu_odometry
```
This will give you the (x, y, yaw) data of the vehicle and also publish it on the /odom topic.
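Conceptually, the node dead-reckons in the plane: yaw comes from the IMU, and position is integrated from the estimated forward speed (which corresponds to the commanded PWM). A minimal sketch of that update, under those assumptions:

```python
# dead_reckoning.py -- planar odometry update of the kind imu_odometry
# performs. Illustrative only: the real node's state layout and velocity
# model may differ.
import math

def odom_step(x, y, yaw, v, dt):
    """Advance (x, y) given the IMU yaw and an estimated forward speed v."""
    x += v * math.cos(yaw) * dt
    y += v * math.sin(yaw) * dt
    return x, y

# Example: 0.5 m/s forward at a 30-degree heading for one 50 ms tick
x, y = odom_step(0.0, 0.0, math.radians(30), 0.5, 0.05)
print(f"x={x:.3f} m, y={y:.3f} m")
```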
The Goal_Node publishes waypoints, which are fed into the pure-pursuit algorithm. The algorithm uses odometry data from the /odom topic, calculates the linear and angular errors, and proceeds toward the waypoints effectively (note: the tuning parameter is lookahead_distance).
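The geometric core of pure pursuit is compact: express the waypoint roughly lookahead_distance ahead in the vehicle frame, compute the arc curvature through it, and convert that to a steering angle. The sketch below illustrates this; the wheelbase value and the Ackermann conversion are assumptions, not necessarily what the navigation package implements:

```python
# pure_pursuit_sketch.py -- geometric pure-pursuit steering (illustrative).
# The wheelbase value and Ackermann conversion are assumptions; the node
# in this repo may compute the command differently.
import math

LOOKAHEAD = 0.5   # m, the lookahead_distance tuning parameter noted above
WHEELBASE = 0.3   # m, front-to-rear axle distance (assumed)

def steering_angle(x, y, yaw, gx, gy):
    """Steering angle that arcs the vehicle through the goal point (gx, gy)."""
    # Express the goal point in the vehicle frame (rotate by -yaw)
    dx, dy = gx - x, gy - y
    x_v = math.cos(yaw) * dx + math.sin(yaw) * dy
    y_v = -math.sin(yaw) * dx + math.cos(yaw) * dy
    ld = math.hypot(x_v, y_v)          # actual distance to the waypoint
    curvature = 2.0 * y_v / (ld ** 2)  # pure-pursuit arc curvature
    return math.atan(WHEELBASE * curvature)  # Ackermann steering angle

# Example: goal half a meter ahead and slightly to the left
print(math.degrees(steering_angle(0, 0, 0, LOOKAHEAD, 0.1)))
```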
To run waypoint navigation:
```
ssh arms@192.168.0.171          # change the IP address and Pi name accordingly
ros2 run navigation GoalNode
ros2 run navigation pure_pursuit
```
Hit enter in the goal-node terminal after the pure-pursuit controller has started.
Upgrading Hardware for Outdoor Use: Enhance the hardware to function robustly in outdoor environments and incorporate GPS for improved outdoor localization.
Developing a Mission Planner Stack: Create a mission planner stack capable of generating and updating paths for the vehicle based on environmental variables, enabling more dynamic and adaptive route planning.
Implementing State-of-the-Art Algorithms: Integrate deep learning and reinforcement learning based algorithms to achieve more adaptive and robust navigation, enabling the vehicle to better handle diverse and unpredictable conditions. Upgrade the algorithms, currently designed for static settings, to better handle dynamic environments, both indoor and outdoor.
Enhanced Sensor Integration: Incorporate additional sensors such as RGB-depth cameras and ultrasonic sensors to improve environmental perception, and implement advanced algorithms such as ORB-SLAM.