Dense Multi-Scale Convolutional Network for Plant Segmentation
Authors: Thi Hoang Yen Tran, Tran Dang Khoa Phan
Published in: IEEE Access; Volume: 11; Pages: 82640-82651; Year: 2023
Field: Engineering; Type: Scientific article; Category: International
ABSTRACT
Plant segmentation is a critical task in precision agriculture, as it underpins crop management and weed treatment. Plants can exhibit very large scale changes, which presents a great challenge for accurate crop/weed segmentation. Recent works have shown that multi-scale features are useful for segmenting objects at different scales. In this work, we propose a Dense Multi-Scale Convolutional Network (DMSCN) for pixel-wise crop/weed segmentation. Our network has an encoder-decoder structure. The encoder comprises a Dense Convolutional Network (DCN) and a Dense Multi-Scale Atrous Pooling (DMSAP) module. The DCN is composed of standard and atrous convolutions with dense connections. This architecture allows the encoder to increase the density of feature maps while avoiding the signal decimation caused by dimension reduction. The proposed DMSAP connects a set of standard and atrous convolutional layers with different dilation rates in a densely cascaded manner, enabling it to capture features with dense scale sampling and a large receptive field. A simple yet effective decoder refines the segmentation results by combining high- and low-level features from the encoder. Extensive experiments are performed on four crop/weed datasets, one of which was collected and annotated by us. We conduct an ablation study to show the contributions of the different modules of DMSCN. A comparative study demonstrates the advantages of our model over previous methods in terms of accuracy and complexity.
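To illustrate the densely cascaded atrous design described in the abstract, below is a minimal PyTorch sketch of a dense multi-scale atrous block. The class name DenseAtrousBlock, the dilation rates (1, 2, 4, 8), the growth width of 64 channels, and the layer count are illustrative assumptions, not the published DMSAP configuration.

# A minimal sketch of a densely cascaded atrous block in the spirit of DMSAP.
# Dilation rates, growth width, and layer count are assumptions for illustration.
import torch
import torch.nn as nn

class DenseAtrousBlock(nn.Module):
    """Cascade of atrous convolutions with dense connections: each layer
    consumes the concatenation of the block input and all earlier outputs."""

    def __init__(self, in_channels, growth=64, dilations=(1, 2, 4, 8)):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for d in dilations:
            self.layers.append(nn.Sequential(
                nn.Conv2d(channels, growth, kernel_size=3,
                          padding=d, dilation=d, bias=False),
                nn.BatchNorm2d(growth),
                nn.ReLU(inplace=True),
            ))
            channels += growth  # dense connectivity widens the next layer's input

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            out = layer(torch.cat(features, dim=1))
            features.append(out)
        return torch.cat(features, dim=1)

# Usage: a feature map from an encoder backbone (shape is illustrative).
feat = torch.randn(1, 256, 32, 32)
block = DenseAtrousBlock(in_channels=256)
print(block(feat).shape)  # torch.Size([1, 512, 32, 32]): 256 + 4 layers x 64

Because each layer sees every earlier output, later layers aggregate all previously sampled scales, which is what yields the dense scale sampling and large effective receptive field the abstract attributes to DMSAP.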