2022 14th International Conference on Machine Learning and Computing (ICMLC) — Latest Publications

A Transmittance Optimization-based Framework for Image Dehazing on Multi-rotor Drones Imaging
2022 14th International Conference on Machine Learning and Computing (ICMLC) Pub Date : 2022-02-18 DOI: 10.1145/3529836.3529921
Zonglin Li
Abstract: The imaging quality of images collected by multi-rotor drones determines their practical value. However, current image-enhancement dehazing methods suffer from dependence on depth information, high computational complexity, and artefacts in the restored results. In this paper, based on the dark channel prior model, a tolerance mechanism is introduced into the transmittance estimation step. A total variation (TV) model constrained by the ℓ1-norm is used to refine the transmittance estimate. In addition, to reduce the algorithm's computational cost, down-sampling is used to shrink the original image, the transmittance of the small-resolution image is computed, and the transmittance of the original image is then generated by interpolation. Experimental results verify the effectiveness of the algorithm.
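The downsample-then-interpolate transmittance pipeline described above can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the `omega`, `patch`, and `scale` parameters are assumed values, nearest-neighbor upsampling stands in for the paper's interpolation, and the ℓ1-TV refinement step is omitted.

```python
import numpy as np

def dark_channel(img, patch=7):
    """Per-pixel min over the RGB channels, then a local min filter of size `patch`."""
    d = img.min(axis=2)
    pad = patch // 2
    padded = np.pad(d, pad, mode='edge')
    h, w = d.shape
    out = np.empty_like(d)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + patch, j:j + patch].min()
    return out

def estimate_transmittance(img, airlight, omega=0.95, patch=7, scale=4):
    """Estimate t on a downsampled image, then upsample back to full size
    (nearest-neighbor here, standing in for the paper's interpolation)."""
    small = img[::scale, ::scale]                       # down-sample
    t_small = 1.0 - omega * dark_channel(small / airlight, patch)
    t = np.repeat(np.repeat(t_small, scale, 0), scale, 1)
    h, w = img.shape[:2]
    return np.clip(t[:h, :w], 0.1, 1.0)                 # lower bound acts as a tolerance floor

def dehaze(img, omega=0.95):
    airlight = img.reshape(-1, 3).max(axis=0)           # crude per-channel airlight pick
    t = estimate_transmittance(img, airlight, omega)
    return (img - airlight) / t[..., None] + airlight   # invert the haze model I = J*t + A*(1-t)
```

Because the dark channel is computed only on the small image, the expensive local-min filter runs on roughly 1/scale² of the pixels, which is the source of the speedup the abstract claims.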
Citations: 0
AI for Closed-Loop Control Systems: New Opportunities for Modeling, Designing, and Tuning Control Systems
2022 14th International Conference on Machine Learning and Computing (ICMLC) Pub Date : 2022-01-18 DOI: 10.1145/3529836.3529952
Julius Schöning, A. Riechmann, H. Pfisterer
Abstract: Control systems, particularly closed-loop control systems (CLCS), are widely used in production machines, vehicles, and robots. CLCS are needed to actively align the actual values of a process with given reference or set values in real time with very high precision. Yet artificial intelligence (AI) is rarely used to model, design, optimize, and tune CLCS. This paper highlights potential AI-empowered and AI-based control system designs and design procedures, gathering new opportunities and research directions in the field of control system engineering. It illustrates which building blocks within the standard block diagram of CLCS can be replaced by AI, i.e., artificial neural networks (ANNs). With processes subject to real-time constraints and functional safety in mind, it discusses whether AI-based controller blocks can cope with these demands. The paper concludes by weighing the pros and cons of AI-empowered and AI-based CLCS designs, and gives possible research directions for introducing AI into the domain of control system engineering.
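To make the "replaceable building block" idea concrete, here is a minimal closed-loop simulation in which the controller is an injectable callable — the block the paper proposes swapping for an ANN. The first-order plant, gains, and time step are illustrative assumptions, not values from the paper.

```python
# Minimal closed-loop control simulation. The `controller` callable is the
# block diagram component the paper discusses replacing with an ANN.

def p_controller(error, kp=2.0):
    """Classical proportional controller: u = kp * e."""
    return kp * error

def simulate(setpoint=1.0, steps=200, dt=0.01, controller=p_controller):
    """First-order plant dx/dt = -x + u, driven toward `setpoint` by the
    given controller, integrated with explicit Euler steps."""
    x = 0.0
    for _ in range(steps):
        u = controller(setpoint - x)    # controller block (swappable)
        x += dt * (-x + u)              # plant dynamics
    return x

# An ANN-based controller would plug in through the same interface, e.g.:
#   simulate(controller=lambda e: ann.predict(e))
```

With kp = 2 the loop settles near the steady state kp/(1 + kp) = 2/3 of the setpoint, illustrating the steady-state error of pure proportional control; a learned controller would have to meet the same real-time and functional-safety demands the paper raises.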
Citations: 8
Hierarchical Neural Network Approaches for Long Document Classification
2022 14th International Conference on Machine Learning and Computing (ICMLC) Pub Date : 2022-01-18 DOI: 10.1145/3529836.3529935
Snehal Khandve, Vedangi Wagh, Apurva Wani, Isha Joshi, Raviraj Joshi
Abstract: Text classification algorithms investigate the intricate relationships between words or phrases and attempt to deduce a document's interpretation. In the last few years, these algorithms have progressed tremendously. Transformer architectures and sentence encoders have proven to give superior results on natural language processing tasks, but a major limitation of these architectures is that they apply only to text no longer than a few hundred words. In this paper, we explore hierarchical transfer-learning approaches for long document classification. We employ the pre-trained Universal Sentence Encoder (USE) and Bidirectional Encoder Representations from Transformers (BERT) in a hierarchical setup to capture better representations efficiently. Our proposed models are conceptually simple: we divide the input into chunks and pass them through the base USE or BERT model; the output representation of each chunk is then propagated through a shallow neural network comprising LSTMs or CNNs that classifies the text. These extensions are evaluated on six benchmark datasets. We show that USE + CNN/LSTM outperforms its stand-alone baseline, whereas BERT + CNN/LSTM performs on par with its stand-alone counterpart. However, hierarchical BERT models remain desirable because they avoid the quadratic complexity of BERT's attention mechanism. Alongside the hierarchical approaches, this work also compares deep learning models such as USE, BERT, HAN, Longformer, and BigBird for long document classification. The Longformer approach consistently performs well on most of the datasets.
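The chunk-then-encode pipeline from the abstract can be sketched as below. This is a toy stand-in, not the authors' model: a deterministic hash-based embedding replaces BERT/USE, mean pooling plus a linear head replaces the shallow LSTM/CNN, and the chunk size and dimensions are illustrative.

```python
import numpy as np

CHUNK_LEN, DIM = 4, 8   # tiny illustrative sizes (real chunks span hundreds of tokens)

def chunk(tokens, size=CHUNK_LEN):
    """Split a token list into fixed-size chunks; the last chunk may be shorter."""
    return [tokens[i:i + size] for i in range(0, len(tokens), size)]

def stub_encoder(tokens, dim=DIM):
    """Stand-in for BERT/USE: deterministic per-token embeddings (seeded by a
    token hash), mean-pooled into a single chunk vector."""
    vecs = [np.random.default_rng(abs(hash(t)) % 2**32).standard_normal(dim)
            for t in tokens]
    return np.mean(vecs, axis=0)

def classify(document_tokens, weights, bias):
    """Hierarchical pipeline: encode each chunk independently, pool the chunk
    vectors, then apply a linear head (stand-in for the shallow LSTM/CNN)."""
    chunk_reps = np.stack([stub_encoder(c) for c in chunk(document_tokens)])
    doc_rep = chunk_reps.mean(axis=0)            # pool over chunks
    return int(np.argmax(doc_rep @ weights + bias))
```

The point of the structure is that the base encoder only ever sees a chunk at a time, so a document of n tokens costs O(n/size) encoder calls rather than one quadratic-attention pass over all n tokens.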
Citations: 5
Contrastive Pre-training for Imbalanced Corporate Credit Ratings
2022 14th International Conference on Machine Learning and Computing (ICMLC) Pub Date : 2021-02-18 DOI: 10.1145/3529836.3529911
Bojing Feng, Wenfang Xue
Abstract: A corporate credit rating reflects the level of a company's credit and plays a crucial role in modern financial risk control. But real-world credit rating data usually exhibit long-tail distributions, i.e., a heavy class-imbalance problem that greatly challenges corporate credit rating systems. To tackle this, inspired by recent advances in pre-training techniques for self-supervised representation learning, we propose a novel framework named Contrastive Pre-training for Corporate Credit Rating (CP4CCR), which uses self-supervision to overcome the class imbalance. Specifically, in the first phase we perform contrastive self-supervised pre-training without label information, aiming to learn a better class-agnostic initialization. Two self-supervised tasks are developed within CP4CCR: (i) Feature Masking (FM) and (ii) Feature Swapping (FS). In the second phase, any standard corporate credit rating model can be trained from the initialization given by the pre-trained network. Extensive experiments on a real public-listed corporate rating dataset show that CP4CCR improves the performance of standard corporate credit rating models, especially for classes with few samples.
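The two self-supervised tasks named in the abstract can be sketched as simple view-generating augmentations on a feature vector. The mask ratio, swap count, and exact mechanics are assumptions for illustration — the abstract does not specify the paper's formulation.

```python
import numpy as np

def feature_masking(x, ratio=0.2, rng=None):
    """FM: zero out a random fraction (`ratio`) of features to form an augmented view."""
    rng = rng or np.random.default_rng()
    keep = rng.random(x.shape) >= ratio     # keep ~(1 - ratio) of the features
    return x * keep

def feature_swapping(x, n_swaps=2, rng=None):
    """FS: exchange randomly chosen pairs of feature positions."""
    rng = rng or np.random.default_rng()
    x = x.copy()
    for _ in range(n_swaps):
        i, j = rng.choice(len(x), size=2, replace=False)
        x[i], x[j] = x[j], x[i]
    return x

# In contrastive pre-training, two augmented views of the same sample form a
# positive pair, while views of different samples serve as negatives — no
# rating labels are needed, which is how the framework sidesteps the imbalance.
```

Because neither augmentation consumes labels, the minority rating classes contribute to representation learning in proportion to their raw sample count rather than being drowned out by a label-driven loss.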
Citations: 2