{"title":"l-LSTM:一种基于随机梯度下降优化的大数据序列标注的双向LSTM","authors":"Nancy Victor, Daphne Lopez","doi":"10.4018/ijghpc.2020070101","DOIUrl":null,"url":null,"abstract":"The volume of data in diverse data formats from various data sources has led the way for a new drift in the digital world, Big Data. This article proposes sl-LSTM (sequence labelling LSTM), a neural network architecture that combines the effectiveness of typical LSTM models to perform sequence labeling tasks. This is a bi-directional LSTM which uses stochastic gradient descent optimization and combines two features of the existing LSTM variants: coupled input-forget gates for reducing the computational complexity and peephole connections that allow all gates to inspect the current cell state. The model is tested on different datasets and the results show that the integration of various neural network models can further improve the efficiency of approach for identifying sensitive information in Big data.","PeriodicalId":43565,"journal":{"name":"International Journal of Grid and High Performance Computing","volume":"19 1","pages":"1-16"},"PeriodicalIF":0.6000,"publicationDate":"2020-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"sl-LSTM: A Bi-Directional LSTM With Stochastic Gradient Descent Optimization for Sequence Labeling Tasks in Big Data\",\"authors\":\"Nancy Victor, Daphne Lopez\",\"doi\":\"10.4018/ijghpc.2020070101\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The volume of data in diverse data formats from various data sources has led the way for a new drift in the digital world, Big Data. This article proposes sl-LSTM (sequence labelling LSTM), a neural network architecture that combines the effectiveness of typical LSTM models to perform sequence labeling tasks. This is a bi-directional LSTM which uses stochastic gradient descent optimization and combines two features of the existing LSTM variants: coupled input-forget gates for reducing the computational complexity and peephole connections that allow all gates to inspect the current cell state. 
The model is tested on different datasets and the results show that the integration of various neural network models can further improve the efficiency of approach for identifying sensitive information in Big data.\",\"PeriodicalId\":43565,\"journal\":{\"name\":\"International Journal of Grid and High Performance Computing\",\"volume\":\"19 1\",\"pages\":\"1-16\"},\"PeriodicalIF\":0.6000,\"publicationDate\":\"2020-07-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Grid and High Performance Computing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.4018/ijghpc.2020070101\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"COMPUTER SCIENCE, THEORY & METHODS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Grid and High Performance Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.4018/ijghpc.2020070101","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"COMPUTER SCIENCE, THEORY & METHODS","Score":null,"Total":0}
sl-LSTM: A Bi-Directional LSTM With Stochastic Gradient Descent Optimization for Sequence Labeling Tasks in Big Data
The growing volume of data, arriving in diverse formats from varied sources, has driven a new shift in the digital world: Big Data. This article proposes sl-LSTM (sequence labelling LSTM), a neural network architecture that combines the strengths of existing LSTM variants to perform sequence labeling tasks. It is a bi-directional LSTM trained with stochastic gradient descent optimization that integrates two features of existing LSTM variants: coupled input-forget gates, which reduce computational complexity, and peephole connections, which allow the gates to inspect the current cell state. The model is tested on several datasets, and the results show that integrating these neural network components can further improve the efficiency of the approach for identifying sensitive information in Big Data.
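To make the described cell concrete, the following is a minimal sketch (not the authors' code) of one time step of an LSTM cell combining the two variants named in the abstract: a coupled input-forget gate (the input gate is computed as 1 minus the forget gate) and peephole connections that let the gates read the cell state. All parameter names, sizes, and initializations here are illustrative assumptions.

```python
# Illustrative sketch of a CIFG + peephole LSTM cell step; assumptions, not the paper's implementation.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class CIFGPeepholeCell:
    def __init__(self, input_size, hidden_size, rng=None):
        rng = rng or np.random.default_rng(0)
        h, d, s = hidden_size, input_size, 0.1
        # Forget gate (input gate is coupled: i = 1 - f), output gate, and candidate update.
        self.Wf = rng.normal(0, s, (h, d + h)); self.pf = rng.normal(0, s, h); self.bf = np.zeros(h)
        self.Wo = rng.normal(0, s, (h, d + h)); self.po = rng.normal(0, s, h); self.bo = np.zeros(h)
        self.Wg = rng.normal(0, s, (h, d + h)); self.bg = np.zeros(h)

    def step(self, x, h_prev, c_prev):
        z = np.concatenate([x, h_prev])
        # Peephole connection: the forget gate also inspects the previous cell state.
        f = sigmoid(self.Wf @ z + self.pf * c_prev + self.bf)
        g = np.tanh(self.Wg @ z + self.bg)
        # Coupled input-forget gate: the input gate is (1 - f), so one gate is saved.
        c = f * c_prev + (1.0 - f) * g
        # Peephole connection: the output gate inspects the updated cell state.
        o = sigmoid(self.Wo @ z + self.po * c + self.bo)
        h = o * np.tanh(c)
        return h, c
```

In a bi-directional setup such as the one the abstract describes, two such cells would scan the sequence in opposite directions and their hidden states would be concatenated at each position before the labeling layer; the parameters would then be updated with stochastic gradient descent.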