WildOcc: Traversability analysis based on occupancy prediction in unstructured environments
Affiliation:

1. Shanghai Jiao Tong University; 2. Key Laboratory of Digital Earth Science


    Abstract:

    To navigate autonomously in unstructured environments, unmanned vehicles must analyze the traversability of the surrounding terrain. Existing methods rely on LiDAR or vision, but LiDAR systems suffer from sparse point clouds and high cost, while traditional vision approaches fail to capture and represent the three-dimensional structure of the scene. To address these challenges, this paper introduces WildOcc, the first traversability-analysis method for unstructured environments based on occupancy prediction. WildOcc extracts multi-scale features from monocular RGB images, projects 3D occupancy labels onto the images, and applies a road attention mechanism to query and fuse point-wise information into 3D features, which a decoder and a semantic segmentation head then convert into traversable regions. To estimate three-dimensional traversability accurately, WildOcc is supervised with 3D occupancy labels; because point-cloud data in unstructured environments are sparse, this paper also designs a data augmentation module, Dense Label Generate (DLG), that produces dense occupancy labels and thereby improves supervision quality. Building on the DLG module, this paper constructs the first dataset usable for occupancy prediction in unstructured environments. Comprehensive experiments on this dataset show that, relative to occupancy prediction methods designed for structured environments, the DLG module improves mIoU by 0.7%, and together with WildOcc improves mIoU by 1%, effectively increasing prediction accuracy and robustness.
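    The abstract describes DLG as densifying sparse point-cloud supervision into dense occupancy labels. The paper does not give the algorithm, but a common way to densify labels is to aggregate several sparse frames in a shared coordinate system and voxelize the result; the sketch below illustrates that idea with pure Python, and all function names, parameters, and the multi-frame aggregation strategy are assumptions for illustration, not the authors' actual DLG implementation.

```python
# Hedged sketch: turning sparse per-frame 3D points into denser voxel
# occupancy labels by aggregating frames. Voxel size, the min_hits filter,
# and the aggregation scheme are illustrative assumptions.
from collections import defaultdict

def voxelize(points, voxel_size=0.2):
    """Map 3D points (x, y, z) to the set of integer voxel indices they hit."""
    return {(int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
            for x, y, z in points}

def dense_labels(frames, voxel_size=0.2, min_hits=1):
    """Aggregate sparse frames (already in one world frame) and keep voxels
    observed in at least min_hits frames as occupied."""
    hits = defaultdict(int)
    for points in frames:
        for v in voxelize(points, voxel_size):
            hits[v] += 1
    return {v for v, n in hits.items() if n >= min_hits}

# Usage: two frames confirm the voxel at the origin; a single stray point
# seen once is filtered out when min_hits=2.
frames = [[(0.05, 0.05, 0.05)], [(0.10, 0.10, 0.10)], [(1.0, 0.0, 0.0)]]
labels = dense_labels(frames, voxel_size=0.2, min_hits=2)
```

    Requiring a voxel to be observed in multiple frames is one simple way to densify reliable surfaces while suppressing single-frame noise; the real module presumably also assigns semantic classes to the voxels.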

History
  • Received: July 09, 2024
  • Revised: October 17, 2024
  • Accepted: October 17, 2024
