THE EFFECT OF CONTRAST ENHANCEMENT ON EPIPHYTE SEGMENTATION USING GENERATIVE NETWORK

V. S. Sajith Variyar, V. Sowmya, R. Sivanpillai, G. Brown
{"title":"THE EFFECT OF CONTRAST ENHANCEMENT ON EPIPHYTE SEGMENTATION USING GENERATIVE NETWORK","authors":"V. S. Sajith Variyar, V. Sowmya, R. Sivanpillai, G. Brown","doi":"10.5194/isprs-archives-xlviii-m-3-2023-219-2023","DOIUrl":null,"url":null,"abstract":"Abstract. The performance of the deep learning-based image segmentation is highly dependent on two major factors as follows: 1) The organization and structure of the architecture used to train the model and 2) The quality of input data used to train the model. The input image quality and the variety of training samples are highly influencing the features derived by the deep learning filters for segmentation. This study focus on the effect of image quality of a natural dataset of epiphytes captured using Unmanned Aerial Vehicles (UAV), while segmenting the epiphytes from other background vegetation. The dataset used in this work is highly challenging in terms of pixel overlap between target and background to be segmented, the occupancy of target in the image and shadows from nearby vegetation. The proposed study used four different contrast enhancement techniques to improve the image quality of low contrast images from the epiphyte dataset. The enhanced dataset with four different methods were used to train five different segmentation models. The segmentation performances of four different models are reported using structural similarity index (SSIM) and intersection over union (IoU) score. The study shows that the epiphyte segmentation performance is highly influenced by the input image quality and recommendations are given based on four different techniques for experts to work with segmentation with natural datasets like epiphytes. The study also reported that the occupancy of the target epiphyte and vegetation highly influence the performance of the segmentation model.\n","PeriodicalId":30634,"journal":{"name":"The International Archives of the Photogrammetry Remote Sensing and Spatial Information Sciences","volume":" ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2023-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"The International Archives of the Photogrammetry Remote Sensing and Spatial Information Sciences","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5194/isprs-archives-xlviii-m-3-2023-219-2023","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"Social Sciences","Score":null,"Total":0}

Abstract

Abstract. The performance of deep learning-based image segmentation depends on two major factors: 1) the organization and structure of the architecture used to train the model, and 2) the quality of the input data used to train it. The input image quality and the variety of training samples strongly influence the features derived by the deep learning filters for segmentation. This study focuses on the effect of image quality when segmenting epiphytes from background vegetation in a natural dataset captured using Unmanned Aerial Vehicles (UAV). The dataset used in this work is highly challenging in terms of the pixel overlap between the target and the background, the occupancy of the target in the image, and shadows cast by nearby vegetation. The study applied four different contrast enhancement techniques to improve the quality of low-contrast images from the epiphyte dataset. The datasets enhanced with the four methods were used to train five different segmentation models. The segmentation performance of the models is reported using the structural similarity index (SSIM) and the intersection over union (IoU) score. The study shows that epiphyte segmentation performance is highly influenced by input image quality, and recommendations based on the four techniques are given for experts working on segmentation with natural datasets such as epiphytes. The study also found that the occupancy of the target epiphyte and surrounding vegetation strongly influences the performance of the segmentation model.
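
A minimal sketch of the two ideas described in the abstract: enhancing the contrast of a low-contrast UAV frame before segmentation, and scoring a predicted epiphyte mask against a reference mask with IoU and SSIM. CLAHE is used here only as one representative contrast-enhancement technique, since the paper's four specific methods are not named in this excerpt; the file names are hypothetical and the snippet is illustrative rather than the authors' pipeline.

import cv2
import numpy as np
from skimage.metrics import structural_similarity as ssim


def enhance_contrast_clahe(bgr_image: np.ndarray) -> np.ndarray:
    """Apply CLAHE to the luminance channel of a BGR image (one possible contrast enhancement)."""
    lab = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    l_eq = clahe.apply(l)
    return cv2.cvtColor(cv2.merge((l_eq, a, b)), cv2.COLOR_LAB2BGR)


def iou_score(pred_mask: np.ndarray, true_mask: np.ndarray) -> float:
    """Intersection over union of two binary masks."""
    pred = pred_mask.astype(bool)
    true = true_mask.astype(bool)
    union = np.logical_or(pred, true).sum()
    if union == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return float(np.logical_and(pred, true).sum()) / float(union)


def ssim_score(pred_mask: np.ndarray, true_mask: np.ndarray) -> float:
    """Structural similarity between a predicted mask and a reference mask."""
    return ssim(pred_mask.astype(np.float32), true_mask.astype(np.float32), data_range=1.0)


if __name__ == "__main__":
    # Hypothetical file name, for illustration only.
    image = cv2.imread("epiphyte_uav_frame.png")
    enhanced = enhance_contrast_clahe(image)
    cv2.imwrite("epiphyte_uav_frame_clahe.png", enhanced)

The enhanced images would then be fed to the segmentation models, and the resulting masks compared to ground truth with iou_score and ssim_score, mirroring the two metrics reported in the paper.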