Remote Sensing Image Scene Classification Application Based on Deep Learning Feature Fusion
DOI:
Author:
Affiliation:

North University of China

CLC Number:

Fund Project:

The National Natural Science Foundation of China (General Program, Key Program, Major Research Plan)

    Abstract:

    To address the inability of traditional hand-crafted feature methods to effectively extract the deep information of a whole image, a new scene classification method based on deep learning feature fusion is proposed. First, the gray-level co-occurrence matrix (GLCM) and local binary pattern (LBP) are used to extract shallow texture features and local texture features with related spatial characteristics. Second, deep image features are extracted with an AlexNet transfer learning network: the last fully connected layer is removed and a 256-dimensional fully connected layer is added as the feature output. The two kinds of features are then adaptively fused, and the remote sensing images are classified by a support vector machine optimized with a grid search algorithm (GS-SVM). Experiments on the 21 target classes of the public UC Merced dataset and the 7 target classes of the RSSCN7 dataset show average accuracies over five runs of 94.77% and 93.79%, respectively. Comparison with the classification accuracy of other methods in the references under the same data conditions shows that the proposed method effectively improves the classification accuracy of remote sensing image scenes.
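    The shallow-feature stage of the pipeline can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: it computes a toy horizontal-offset GLCM (reduced to contrast and energy statistics) and a basic 8-neighbour LBP histogram, then concatenates the two vectors as a stand-in for the adaptive fusion step. The quantization level, neighbourhood, and statistics are illustrative assumptions; in the described method the fused vector would also include the 256-dimensional AlexNet output and would be fed to a grid-search-tuned SVM.

    ```python
    import numpy as np

    def glcm_features(img, levels=8):
        # Quantize the image to `levels` gray levels, build a co-occurrence
        # matrix for horizontal neighbours, and derive contrast and energy.
        q = (img.astype(np.float64) / 256.0 * levels).astype(int)
        glcm = np.zeros((levels, levels))
        for i in range(q.shape[0]):
            for j in range(q.shape[1] - 1):
                glcm[q[i, j], q[i, j + 1]] += 1
        glcm /= glcm.sum()
        di, dj = np.meshgrid(np.arange(levels), np.arange(levels), indexing="ij")
        contrast = ((di - dj) ** 2 * glcm).sum()   # local intensity variation
        energy = (glcm ** 2).sum()                 # textural uniformity
        return np.array([contrast, energy])

    def lbp_histogram(img):
        # 8-neighbour LBP: each neighbour >= centre contributes one bit of an
        # 8-bit code; the image is summarized as a normalized 256-bin histogram.
        center = img[1:-1, 1:-1]
        codes = np.zeros(center.shape, dtype=int)
        offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                   (1, 1), (1, 0), (1, -1), (0, -1)]
        for bit, (di, dj) in enumerate(offsets):
            neigh = img[1 + di:img.shape[0] - 1 + di,
                        1 + dj:img.shape[1] - 1 + dj]
            codes |= (neigh >= center).astype(int) << bit
        hist, _ = np.histogram(codes, bins=256, range=(0, 256))
        return hist / hist.sum()

    # Toy grayscale "scene" patch in place of a remote sensing image.
    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)

    # Simple concatenation as a placeholder for the adaptive fusion step.
    fused = np.concatenate([glcm_features(img), lbp_histogram(img)])
    print(fused.shape)
    ```

    In a full reproduction, `fused` would additionally be concatenated with the 256-dimensional deep feature vector and passed to something like scikit-learn's `GridSearchCV` over an RBF `SVC` (searching `C` and `gamma`) to realize the GS-SVM classifier.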

History
  • Received: March 22, 2022
  • Revised: May 05, 2022
  • Accepted: May 09, 2022
