{"title":"Proceedings of the 2020 5th International Conference on Machine Learning Technologies","authors":"","doi":"10.1145/3409073","DOIUrl":"https://doi.org/10.1145/3409073","url":null,"abstract":"","PeriodicalId":229746,"journal":{"name":"Proceedings of the 2020 5th International Conference on Machine Learning Technologies","volume":"os-4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-06-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127759851","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A method of intrusion detection based on Attention-LSTM neural network","authors":"Shuaichuang Yang, Minsheng Tan, Shiying Xia, Fangju Liu","doi":"10.1145/3409073.3409096","DOIUrl":"https://doi.org/10.1145/3409073.3409096","url":null,"abstract":"Recently, network attacks of increasingly complex types have occurred more frequently than before, and traditional detection algorithms cannot meet current needs. For this reason, an intrusion detection method based on an Attention-Long Short-Term Memory (LSTM) neural network is proposed. This method leverages the attention mechanism to address the inability of intrusion detection models to focus on key attributes. At the same time, it exploits the memory function of the LSTM network and its powerful ability to learn from sequential data. Finally, the KDD-CUP99 data sets are used to test the performance of Attention-LSTM. The experimental results show that the proposed algorithm is efficient. Compared with the classical Convolutional Neural Network (CNN), Recurrent Neural Network (RNN), and LSTM algorithms, the method not only improves the accuracy and precision of network intrusion detection but also decreases the false alarm rate. It provides a design basis and technical support for future intrusion detection technology.","PeriodicalId":229746,"journal":{"name":"Proceedings of the 2020 5th International Conference on Machine Learning Technologies","volume":"114 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-06-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123357468","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Speeding Up Deep Convolutional Neural Networks Based on Tucker-CP Decomposition","authors":"Dechun Song, Peiyong Zhang, Feiteng Li","doi":"10.1145/3409073.3409094","DOIUrl":"https://doi.org/10.1145/3409073.3409094","url":null,"abstract":"Convolutional neural networks (CNNs) have achieved great success in computer vision tasks, but their computational complexity is huge, which makes them run slowly, especially when computational resources are limited. In this paper, we propose a scheme based on tensor decomposition to accelerate CNNs. First, the Tucker method is used to decompose the convolution kernel into a small core tensor containing the key information and two factor matrices reflecting the linear relationships along the third and fourth dimensions of the convolution kernel, respectively. Then the CP (CANDECOMP/PARAFAC) method is used to decompose the core tensor into several rank-1 tensors. This scheme removes the linear redundancy in convolution kernels and greatly speeds up CNNs while maintaining high classification accuracy. The scheme is used to decompose all the convolutional layers in AlexNet, and the accelerated model is trained and tested on ImageNet. The results show that our scheme achieves a whole-model speedup of 4x with merely a 1.9% increase in top-5 error for AlexNet.","PeriodicalId":229746,"journal":{"name":"Proceedings of the 2020 5th International Conference on Machine Learning Technologies","volume":"42 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-06-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128921071","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An intelligent teaching assistant system using deep learning technologies","authors":"Zheyu Zhou","doi":"10.1145/3409073.3409079","DOIUrl":"https://doi.org/10.1145/3409073.3409079","url":null,"abstract":"In this paper, we describe an intelligent teaching assistant system that uses deep learning technologies. Little prior work has been done on building intelligent assistants for teachers, making our work novel. The main challenge in this area is detecting student distraction across varied surroundings. We introduce a creative way to embed human prior knowledge of distraction judgment. Our system takes multiple sequential images as input, runs them through a prior-feature extraction component that uses image segmentation and face detection technologies based on deep learning, then through a distraction detection component that uses AlexNet for classification, and finally outputs an evaluation of the online lesson. Our system achieves 85.8% precision on student distraction detection, and the evaluation it generates can serve as an indication that notifies the teacher when specific teaching methods should be adopted to enhance lesson effectiveness.","PeriodicalId":229746,"journal":{"name":"Proceedings of the 2020 5th International Conference on Machine Learning Technologies","volume":"59 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-06-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123119445","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An LSTM-Based Method for Detection and Classification of Sensor Anomalies","authors":"A. Verner, Sumitra Mukherjee","doi":"10.1145/3409073.3409089","DOIUrl":"https://doi.org/10.1145/3409073.3409089","url":null,"abstract":"Most existing machine learning (ML) based solutions for anomaly detection in sensory data rely on carefully hand-crafted features. This approach has a fundamental limitation, since it is often application-specific and requires considerable effort from human domain experts. Deep learning models have been demonstrated to abstract relevant high-level features from raw data, and long short-term memory (LSTM) recurrent neural networks have proven effective in complex time-series prediction problems. In this paper, we propose an LSTM-based method for anomaly detection in sensory data. We systematically investigate its effectiveness on raw time series of real medical sensor measurements and show that it achieves the same level of performance as traditional ML models operating on carefully designed feature vectors. The proposed method achieved micro-, macro-, and weighted-averaged precision, recall, and F1-scores above 0.99.","PeriodicalId":229746,"journal":{"name":"Proceedings of the 2020 5th International Conference on Machine Learning Technologies","volume":"104 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-06-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132651595","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}