Natural Language Processing using Convolutional Neural Network
K. Varshitha, Chinni Guna Kumari, Muppala Hasvitha, Shaik Fiza, Amarendra K, V. Rachapudi
2023 7th International Conference on Computing Methodologies and Communication (ICCMC), 2023-02-23. DOI: 10.1109/ICCMC56507.2023.10083608
Abstract
Convolutional neural networks (CNNs) are multi-layer neural networks used to learn hierarchical features from data. In recent years, CNNs have driven notable advances in the architectures and computation underlying Natural Language Processing (NLP). The Word2vec technique introduces word embeddings, dense representations of words in a lower-dimensional vector space, which improve the performance of a variety of NLP applications. Two prominent approaches for learning these embeddings are Continuous Bag-of-Words (CBOW) and Skip-Gram.
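As a concrete illustration (not taken from the paper itself), the sketch below trains Word2vec embeddings with the gensim library and contrasts the two objectives named in the abstract: the sg flag selects CBOW (sg=0) or Skip-Gram (sg=1). The toy corpus and hyperparameter values are assumptions chosen only for demonstration.

```python
# Minimal Word2vec sketch: CBOW vs. Skip-Gram with gensim.
# Toy corpus and hyperparameters are illustrative assumptions.
from gensim.models import Word2Vec

corpus = [
    ["convolutional", "networks", "learn", "hierarchical", "features"],
    ["word", "embeddings", "map", "words", "to", "dense", "vectors"],
    ["skip", "gram", "predicts", "context", "words", "from", "a", "target", "word"],
    ["cbow", "predicts", "a", "target", "word", "from", "its", "context"],
]

# sg=0: CBOW, predict the target word from its surrounding context words.
# sg=1: Skip-Gram, predict the surrounding context words from the target word.
cbow_model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=0)
skipgram_model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1)

# Each word is now a dense 50-dimensional vector; related words land nearby.
print(cbow_model.wv["embeddings"].shape)            # (50,)
print(skipgram_model.wv.most_similar("word", topn=3))
```

In practice, Skip-Gram tends to work better for rare words and small corpora, while CBOW trains faster and performs well on frequent words; either set of learned vectors can then be fed into a CNN as the input embedding layer for downstream NLP tasks.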