3D terrain mapping system

In this 3D terrain mapping system, sensor data is first collected during the UAV survey and then processed offline to build a 3D map of the forest floor. The implementation relies mainly on the Robot Operating System (ROS). The software used for the implementation is therefore divided into two tasks: data collection and offline data processing.

Data collection

Data collection involves gathering data from the Velodyne LIDAR, Aceinna IMU, Emlid Reach M+ and Pixhawk 4. The sensor data is recorded with the rosbag tool. The system needs to be set up with the appropriate ROS drivers so that the sensor data is received into the ROS architecture. The figure below illustrates the architecture of the sensor setup used for data collection.

[Figure: architecture of the sensor setup used for data collection]

To record the data using the ROS bag tool:

rosbag record -a
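
After a flight, the recorded bag can be inspected with the rosbag Python API to confirm that every sensor topic was actually logged. The snippet below is only a sketch; the bag filename is a placeholder.

# check_bag.py - list the topics and message counts stored in a recorded bag
import rosbag

bag = rosbag.Bag("survey_flight.bag")  # placeholder filename
info = bag.get_type_and_topic_info()
for topic, meta in info.topics.items():
    print("%-40s %-35s %6d msgs" % (topic, meta.msg_type, meta.message_count))
bag.close()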

Velodyne LIDAR

Download the Velodyne ROS driver using:

git clone git@github.com:ros-drivers/velodyne.git

To run the Velodyne sensor ROS driver:

roslaunch velodyne_pointcloud VLP16_points.launch
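
To check that point clouds are arriving, a minimal rospy subscriber can report the size of each incoming scan. This is only a sketch; /velodyne_points is the driver's default topic name and is assumed here.

# velodyne_check.py - print the number of points in each incoming scan
import rospy
from sensor_msgs.msg import PointCloud2
from sensor_msgs import point_cloud2

def callback(cloud):
    n = sum(1 for _ in point_cloud2.read_points(cloud, skip_nans=True))
    rospy.loginfo("received scan with %d points", n)

rospy.init_node("velodyne_check")
rospy.Subscriber("/velodyne_points", PointCloud2, callback)
rospy.spin()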

Aceinna IMU300ZI

Get the Aceinna ROS package, updated for the requirements of this research, from this repository. The ros_openimu package is forked from

https://github.com/ROS-Aceinna/ros_openimu

To run the IMU ROS driver:

roslaunch ros_openimu openimu_driver.launch 
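
A similar quick check for the IMU is to estimate the incoming message rate. The topic name /imu below is an assumption and should be replaced by whatever topic the openimu_driver launch file actually publishes.

# imu_rate_check.py - estimate the IMU message rate over a short window
import rospy
from sensor_msgs.msg import Imu

count = [0]

def callback(msg):
    count[0] += 1

rospy.init_node("imu_rate_check")
rospy.Subscriber("/imu", Imu, callback)  # assumed topic name, adjust to the driver's output
rospy.sleep(5.0)
rospy.loginfo("received %.1f Imu messages per second", count[0] / 5.0)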

Emlid M+

The NMEA sentences and GNSS UTC time from the Emlid Reach M+ can be logged using the nmea_navsat_driver package. This driver is forked from

 https://github.com/ros-drivers/nmea_navsat_driver

To run the driver:

roslaunch nmea_navsat_driver nmea_serial_driver.launch
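
The driver publishes GNSS fixes as sensor_msgs/NavSatFix messages; a short subscriber can confirm that positions are being received. The topic is assumed here to be fix, possibly under a namespace depending on the launch file.

# gnss_check.py - log latitude/longitude/altitude from the NMEA driver
import rospy
from sensor_msgs.msg import NavSatFix

def callback(msg):
    rospy.loginfo("lat %.7f  lon %.7f  alt %.2f", msg.latitude, msg.longitude, msg.altitude)

rospy.init_node("gnss_check")
rospy.Subscriber("fix", NavSatFix, callback)  # assumed topic name
rospy.spin()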

MAVROS

The Pixhawk flight controller data is logged in the ROS architecture to record the camera trigger GPS time and the corresponding GNSS coordinates.

Download the MAVROS library using:

git clone git@github.com:mavlink/mavros.git

To launch the MAVROS library:

roslaunch mavros px4.launch
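
With MAVROS running, the global position and camera trigger timestamps can be inspected together. The topic names below are assumptions based on the default /mavros namespace, and the camera trigger topic depends on which MAVROS plugins are enabled, so adjust them as needed.

# trigger_log.py - log GNSS fixes and camera trigger stamps from MAVROS
import rospy
from sensor_msgs.msg import NavSatFix
from mavros_msgs.msg import CamIMUStamp

def on_fix(msg):
    rospy.loginfo("GNSS  lat %.7f  lon %.7f  alt %.2f", msg.latitude, msg.longitude, msg.altitude)

def on_trigger(msg):
    rospy.loginfo("camera trigger frame %d at %.6f s", msg.frame_seq_id, msg.frame_stamp.to_sec())

rospy.init_node("trigger_log")
rospy.Subscriber("/mavros/global_position/global", NavSatFix, on_fix)
rospy.Subscriber("/mavros/cam_imu_sync/cam_imu_stamp", CamIMUStamp, on_trigger)  # assumed plugin topic
rospy.spin()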

Offline Data processing

In this step, the data collected from all the sensors is processed to generate a 3D point cloud. The process involves estimating the optimal pose of the UAV through sensor fusion and using it to georeference the LIDAR point cloud. Due to technical difficulties in collecting sensor data from the UAV, two public datasets are used for the offline data processing.

The two datasets are:

Hong Kong dataset UrbanNav-HK-Data20190428: https://github.com/weisongwen/UrbanNavDataset

Winterwheat Path A: https://vision.eng.au.dk/future-cropping/uav_lidar/

The ROS bags collected in the data collection stage, or the dataset bags, are played with:

rosbag play xxx.bag

Sensor Fusion

Sensor fusion is implemented using the ethzasl_msf multi-sensor fusion ROS package, which is based on an Extended Kalman Filter (EKF). The sensors fused in the current implementation are GNSS and IMU, so the position_sensor node of msf_updates is used.

Get the modified ethzasl_msf package, with the topic configuration adapted to the Hong Kong dataset, from this repository. It is forked from https://github.com/ethz-asl/ethzasl_msf.

The instructions to run this package are:

  • Pause the bag file once the position_sensor node receives the initial readings.
  • Select "core_init_filter" in the parameter server (via rqt_reconfigure) to initialize the EKF with those readings.
  • Unpause the bag file so the filter keeps running.

To run the package:

roslaunch msf_updates position_sensor.launch
rosrun rqt_reconfigure rqt_reconfigure

To convert the position data, published in meters in the ENU frame on the "msf_core/pose" topic, to geodetic coordinates (latitude, longitude, altitude):

python3 enu_to_geodetic.py 
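
The script itself is not reproduced here, but the conversion it performs can be sketched with pymap3d, assuming the ENU origin is the first GNSS fix of the dataset (the origin coordinates below are placeholders):

# Sketch of the ENU -> geodetic conversion performed by enu_to_geodetic.py
import pymap3d

# ENU origin: latitude, longitude (deg) and ellipsoidal height (m) of the first GNSS fix (placeholder values)
lat0, lon0, h0 = 22.3000000, 114.1800000, 3.0

def enu_to_geodetic(e, n, u):
    """Convert a position in meters (ENU frame) to latitude, longitude, altitude."""
    return pymap3d.enu2geodetic(e, n, u, lat0, lon0, h0)

# Example: a pose estimate 10 m east, 5 m north and 2 m above the origin
print(enu_to_geodetic(10.0, 5.0, 2.0))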

Georeferencing LIDAR point cloud

The point clouds in the LIDAR frame are transformed to the UAV base_link/IMU link and then to the mapping frame using the equation below:

p_map = T_map_baselink * T_baselink_lidar * p_lidar

[Figure: coordinate frames used for georeferencing the LIDAR point cloud]
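
The same chain of transforms can be written out with homogeneous matrices. The sketch below uses placeholder extrinsics and pose values, not the calibration of either dataset:

# Georeference a LIDAR point: lidar frame -> base_link/IMU frame -> mapping frame
import numpy as np

def homogeneous(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Placeholder extrinsics: LIDAR mounted without rotation, 10 cm below base_link
T_base_lidar = homogeneous(np.eye(3), [0.0, 0.0, -0.10])
# Placeholder UAV pose in the mapping frame (this is what the fused EKF estimate provides)
T_map_base = homogeneous(np.eye(3), [12.0, 4.0, 30.0])

p_lidar = np.array([1.0, 2.0, 0.5, 1.0])           # a LIDAR return in homogeneous coordinates
p_map = T_map_base.dot(T_base_lidar).dot(p_lidar)  # georeferenced point in the mapping frame
print(p_map[:3])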

The Livox high precision mapping ROS package is used to map the LIDAR point cloud. The updated package can be downloaded from this repository, which is forked from https://github.com/Livox-SDK/livox_high_precision_mapping

The transformation matrix between imu_link and base_link in the Winterwheat dataset is used to transform UAV data from imu_link to base_link using the ROS TF library.

rosrun tf2_ros static_transform_publisher 0 0 0 0 0 0 1 map base_link
rosrun tf2_ros static_transform_publisher -0.039 -0.008 -0.294 -0.7071 4.32978e-7 0.707106 4.32978e-17 base_link imu_link
python transform.py
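
Note that static_transform_publisher takes the rotation as a quaternion in x y z w order. transform.py is not reproduced here; a sketch of applying the base_link/imu_link transform with the same translation and quaternion as above could look like this:

# Sketch: map a point expressed in imu_link into base_link using the static transform above
import numpy as np
import tf.transformations as tft

# Translation and quaternion (x, y, z, w) from the static_transform_publisher call (parent base_link, child imu_link)
t = [-0.039, -0.008, -0.294]
q = [-0.7071, 4.32978e-7, 0.707106, 4.32978e-17]

T_base_imu = tft.quaternion_matrix(q)  # 4x4 homogeneous rotation built from the quaternion
T_base_imu[:3, 3] = t                  # insert the translation

p_imu = np.array([0.0, 0.0, 0.0, 1.0])  # the IMU origin in homogeneous coordinates
p_base = T_base_imu.dot(p_imu)          # the same point expressed in base_link
print(p_base[:3])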

Then run the mapping application:

roslaunch livox_mapping livox_mapping.launch

Results

The figure below shows the trajectory predicted by fusing IMU and GPS, plotted against the ground-truth trajectory.

[Figure: predicted trajectory vs. ground-truth trajectory]

The figure below shows the ground-truth LIDAR point cloud from the Winter wheat dataset.

[Figure: ground-truth LIDAR point cloud from the Winter wheat dataset]

The figure below shows the LIDAR point cloud generated by the mapping algorithm.

[Figure: LIDAR point cloud generated by the mapping algorithm]

- "漢字路" 한글한자자동변환 서비스는 교육부 고전문헌국역지원사업의 지원으로 구축되었습니다.
- "漢字路" 한글한자자동변환 서비스는 전통문화연구회 "울산대학교한국어처리연구실 옥철영(IT융합전공)교수팀"에서 개발한 한글한자자동변환기를 바탕하여 지속적으로 공동 연구 개발하고 있는 서비스입니다.
- 현재 고유명사(인명, 지명등)을 비롯한 여러 변환오류가 있으며 이를 해결하고자 많은 연구 개발을 진행하고자 하고 있습니다. 이를 인지하시고 다른 곳에서 인용시 한자 변환 결과를 한번 더 검토하시고 사용해 주시기 바랍니다.
- 변환오류 및 건의,문의사항은 juntong@juntong.or.kr로 메일로 보내주시면 감사하겠습니다. .
Copyright ⓒ 2020 By '전통문화연구회(傳統文化硏究會)' All Rights reserved.
 한국   대만   중국   일본