{"title":"Analysis and comparison of various deep learning models to implement suspicious activity recognition in CCTV surveillance","authors":"Dhruv Saluja, Harsh Kukreja, Akash Saini, Devanshi Tegwal, Preeti Nagrath, Jude Hemanth","doi":"10.3233/idt-230469","DOIUrl":"https://doi.org/10.3233/idt-230469","url":null,"abstract":"The paper aims to analyze and compare various deep learning (DL) algorithms in order to develop a Suspicious Activity Recognition (SAR) system for closed-circuit television (CCTV) surveillance. Automated systems for detecting and classifying suspicious activities are crucial as technology’s role in safety and security expands. This paper addresses these challenges by creating a robust SAR system using machine learning techniques. It analyzes and compares evaluation metrics such as Precision, Recall, F1 Score, and Accuracy using various deep learning methods (convolutional neural network (CNN), Long short-term memory (LSTM) – Visual Geometry Group 16 (VGG16), LSTM – ResNet50, LSTM – EfficientNetB0, LSTM – InceptionNetV3, LSTM – DenseNet121, and Long-term Recurrent Convolutional Network (LRCN)). The proposed system improves threat identification, vandalism deterrence, fight prevention, and video surveillance. It aids emergency response by accurately classifying suspicious activities from CCTV footage, reducing reliance on human security personnel and addressing limitations in manual monitoring. The objectives of the paper include analyzing existing works, extracting features from CCTV videos, training robust deep learning models, evaluating algorithms, and improving accuracy. The conclusion highlights the superior performance of the LSTM-DenseNet121 algorithm, achieving an overall accuracy of 91.17% in detecting suspicious activities. This enhances security monitoring capabilities and reduces response time. Limitations of the system include subjectivity, contextual understanding, occlusion, false alarms, and privacy concerns. Future improvements involve real-time object tracking, collaboration with law enforcement agencies, and performance optimization. Ongoing research is necessary to overcome limitations and enhance the effectiveness of CCTV surveillance.","PeriodicalId":43932,"journal":{"name":"Intelligent Decision Technologies-Netherlands","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2023-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135514254","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Severity prediction in COVID-19 patients using clinical markers and explainable artificial intelligence: A stacked ensemble machine learning approach","authors":"Krishnaraj Chadaga, Srikanth Prabhu, Niranjana Sampathila, Rajagopala Chadaga","doi":"10.3233/idt-230320","DOIUrl":"https://doi.org/10.3233/idt-230320","url":null,"abstract":"The recent COVID-19 pandemic had wreaked havoc worldwide, causing a massive strain on already-struggling healthcare infrastructure. Vaccines have been rolled out and seem effective in preventing a bad prognosis. However, a small part of the population (elderly and people with comorbidities) continues to succumb to this deadly virus. Due to a lack of available resources, appropriate triaging and treatment planning are vital to improving outcomes for patients with COVID-19. Assessing whether a patient requires the hospital’s Intensive Care Unit (ICU) is very important since these units are not available for every patient. In this research, we automate this assessment with stacked ensemble machine learning models that predict ICU admission based on general patient laboratory data. We have built an explainable decision support model which automatically scores the COVID-19 severity for individual patients. Data from 1925 COVID-19 positive patients, sourced from three top-tier Brazilian hospitals, were used to design the model. Pearson’s correlation and mutual information were utilized for feature selection, and the top 24 features were chosen as input for the model. The final stacked model could provide decision support on whether an admitted COVID-19 patient would require the ICU or not, with an accuracy of 88%. Explainable Artificial Intelligence (EAI) was used to undertake system-level insight discovery and investigate various clinical variables’ impact on decision-making. It was found that the most critical factors were respiratory rate, temperature, blood pressure, lactate dehydrogenase, hemoglobin, and age. Healthcare facilities can use the proposed approach to categorize COVID-19 patients and prevent COVID-19 fatalities.","PeriodicalId":43932,"journal":{"name":"Intelligent Decision Technologies-Netherlands","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2023-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135514255","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Data analytics methods to measure service quality: A systematic review","authors":"Georgia Gkioka, Thimios Bothos, Babis Magoutas, Gregoris Mentzas","doi":"10.3233/idt-230363","DOIUrl":"https://doi.org/10.3233/idt-230363","url":null,"abstract":"The volume of user generated content (UGC) regarding the quality of provided services has increased exponentially. Meanwhile, research on how to leverage this data using data-driven methods to systematically measure service quality is rather limited. Several works have employed Data Analytics (DA) techniques on UGC and shown that using such data to measure service quality is promising and efficient. The purpose of this study is to provide insights into the studies which use Data Analytics techniques to measure service quality in different sectors, identify gaps in the literature and propose future directions. This study performs a systematic literature review (SLR) of Data Analytics (DA) techniques to measure service quality in various sectors. This paper focuses on the type of data, the approaches used, and the evaluation techniques found in these studies. The study derives a new categorization of the Data Analytics methods used in measuring service quality, distinguishes the most used data sources and provides insights regarding methods and data sources used per industry. Finally, the paper concludes by identifying gaps in the literature and proposes future research directions aiming to provide practitioners and academia with guidance on implementing DA for service quality assessment, complementary to traditional survey-based methods.","PeriodicalId":43932,"journal":{"name":"Intelligent Decision Technologies-Netherlands","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2023-10-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"136214023","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Data storage query and traceability method of electronic certificate based on cloud computing and blockchain","authors":"Huanying Chen, Bo Wei, Zhaoji Huang","doi":"10.3233/idt-230152","DOIUrl":"https://doi.org/10.3233/idt-230152","url":null,"abstract":"In the age of big data, electronic data has developed rapidly and gradually replaced traditional paper documents. In daily life, all kinds of data are saved in the form of electronic documents. In this regard, people have strengthened the development of electronic depository system. Electronic storage refers to the storage of actual events in the form of electronic data through information technology to prove the time and content of events. Its application scenarios are very extensive such as electronic contracts, online transactions and intellectual property rights. However, due to the vulnerability of electronic data, the existing electronic data depository system has certain security risks, and its content is very easy to be tampered with and destroyed, resulting in the loss of depository information. Due to the complexity of the operation of the existing electronic data depository system, some users are likely to reduce the authenticity of the depository information due to the non-standard operation. In order to solve the problems existing in the current electronic data storage system, this paper designed an electronic data storage system based on cloud computing and blockchain technology. The data storage of cloud computing and blockchain was decentralized, and its content cannot be tampered with. It can effectively ensure the integrity and security of electronic information, which is more suitable for the needs of electronic storage scenarios. This paper first introduced the development of electronic data depository system and cloud computing, and optimized the electronic data depository system through the task scheduling model of cloud computing. Finally, the feasibility of the system was verified through experiments. The data showed that the functional efficiency of the system in the electronic data sampling point storage function, the upload of documents to be stored, the download of stored documents, the view of stored information function and the file storage and certificate comparison verification function has reached 0.843, 0.821, 0.798, 0.862 and 0.812 respectively. The final function indexes of each function of the traditional electronic data depository system were 0.619, 0.594, 0.618, 0.597 and 0.622 respectively. This data shows that the electronic data storage system based on cloud computing and blockchain modeling can effectively manage electronic data and facilitate relevant personnel to verify electronic data.","PeriodicalId":43932,"journal":{"name":"Intelligent Decision Technologies-Netherlands","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2023-10-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135789676","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Stock market prediction based on sentiment analysis using deep long short-term memory optimized with namib beetle henry optimization","authors":"Nital Adikane, V. Nirmalrani","doi":"10.3233/idt-230191","DOIUrl":"https://doi.org/10.3233/idt-230191","url":null,"abstract":"Stock price prediction is a recent hot subject with enormous promise and difficulties. Stock prices are volatile and exceedingly challenging to predict accurately due to factors like investment sentiment and market rumors etc. The development of effective models for accurate prediction is extremely tricky due to the complexity of stockdata. Long Short-Term Memory (LSTM) discovers patterns and insights that weren’t previously visible, and they can be leveraged to make incredibly accurate predictions. Therefore, to perform an accurate prediction of the next-day trend, in this research manuscript, a novel method called Updated Deep LSTM (UDLSTM) with namib Beetle Henry optimization (BH-UDLSTM) is proposed on historical stock market data and sentiment analysis data. The UDLSTMmodel has improved prediction performance, which is more stable during training, and increases data accuracy. Hybridization of namib beetle and henry gas algorithm with the UDLSTM further enhances the prediction accuracy with minimum error by excellent balance of exploration and exploitation. BH-UDLSTM is then evaluated with several existing methods and it is proved that the introduced approach predicts the stock price accurately (92.45%) than the state-of-the-art.","PeriodicalId":43932,"journal":{"name":"Intelligent Decision Technologies-Netherlands","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2023-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135886767","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Design of laser image recognition system based on high performance computing of spatiotemporal data","authors":"Zongfu Wu, Fazhong Hou","doi":"10.3233/idt-230161","DOIUrl":"https://doi.org/10.3233/idt-230161","url":null,"abstract":"Due to the large scale and spatiotemporal dispersion of 3D (three-dimensional) point cloud data, current object recognition and semantic annotation methods still face issues of high computational complexity and slow data processing speed, resulting in data processing requiring much longer time than collection. This article studied the FPFH (Fast Point Feature Histograms) description method for local spatial features of point cloud data, achieving efficient extraction of local spatial features of point cloud data; This article investigated the robustness of point cloud data under different sample densities and noise environments. This article utilized the time delay of laser emission and reception signals to achieve distance measurement. Based on this, the measured object is continuously scanned to obtain the distance between the measured object and the measurement point. This article referred to the existing three-dimensional coordinate conversion method to obtain a two-dimensional lattice after three-dimensional position conversion. Based on the basic requirements of point cloud data processing, this article adopted a modular approach, with core functional modules such as input and output of point cloud data, visualization of point clouds, filtering of point clouds, extraction of key points of point clouds, feature extraction of point clouds, registration of point clouds, and data acquisition of point clouds. This can achieve efficient and convenient human-computer interaction for point clouds. This article used a laser image recognition system to screen potential objects, with a success rate of 85% and an accuracy rate of 82%. The laser image recognition system based on spatiotemporal data used in this article has high accuracy.","PeriodicalId":43932,"journal":{"name":"Intelligent Decision Technologies-Netherlands","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2023-09-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"136025567","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"HT-WSO: A hybrid meta-heuristic approach-aided multi-objective constraints for energy efficient routing in WBANs","authors":"Bhagya Lakshmi A, Sasirekha K, Nagendiran S, Ani Minisha R, Mary Shiba C, Varun C.M, Sajitha L.P, Vimala Josphine C","doi":"10.3233/idt-220295","DOIUrl":"https://doi.org/10.3233/idt-220295","url":null,"abstract":"Generally, Wireless Body Area Networks (WBANs) are regarded as the collection of small sensor devices that are effectively implanted or embedded into the human body. Moreover, the nodes included in the WBAN have large resource constraints. Hence, reliable and energy-efficient data transmission plays a significant role in the implementation and in constructing of most of the merging applications. Regarded to complicated channel environment, limited power supply, as well as varying link connectivity has made the construction of WBANs routing protocol become difficult. In order to provide the routing protocol in a high energy-efficient manner, a new approach is suggested using hybrid meta-heuristic development. Initially, all the sensor nodes in WBAN are considered for experimentation. In general, the WBAN is comprised of mobile nodes as well as fixed sensor nodes. Since the existing models are ineffective to achieve high energy efficiency, the new routing protocol is developed by proposing the Hybrid Tunicate-Whale Swarm Optimization (HT-WSO) algorithm. Subsequently, the proposed work considers the multiple constraints for deriving the objective function. The network efficiency is analyzed using the objective function that is formulated by distance, hop count, energy, path loss, and load and packet loss ratio. To attain the optimum value, the HT-WSO derived from Tunicate Swarm Algorithm (TSA) and Whale Optimization Algorithm (WOA) is employed. In the end, the ability of the working model is estimated by diverse parameters and compared with existing traditional approaches. The simulation outcome of the designed method achieves 13.3%, 23.5%, 25.7%, and 27.7% improved performance than DHOA, Jaya, TSA, and WOA. Thus, the results illustrate that the recommended protocol attains better energy efficiency over WBANs.","PeriodicalId":43932,"journal":{"name":"Intelligent Decision Technologies-Netherlands","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2023-09-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"136025566","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Taobao transaction data mining based on time series evaluation under the background of big data","authors":"Yanmin Zhang","doi":"10.3233/idt-230111","DOIUrl":"https://doi.org/10.3233/idt-230111","url":null,"abstract":"With the emergence of e-commerce, more and more people conduct transactions through the Internet, thus resulting in a large number of transaction data. Data mining is to decompose a large amount of data according to data rules, and analyze network transaction data, so as to provide necessary digital links for companies to analyze the market and develop business. Although time series data mining is smaller than other types of data mining, it is also an important issue of data mining. In the real world, the correlation between data and time is very common. The study of time series model plays a very important role in data mining. Due to different purposes, Taobao data analysis is also different. In addition to statistics, at present, the in-depth research and analysis of Taobao data are relatively insufficient, and the analysis of Taobao transaction data based on time series is rare. In order to improve the accuracy of Taobao transaction data mining and better formulate Taobao marketing strategy, this paper used time series data mining technology to mine Taobao transaction data. This paper first introduced the role of Taobao transaction data mining, and then described the calculation method of time series data mining, including the re-description of time series and the similarity measurement of time series. Finally, through a series of processes such as data collection, data processing and data feature extraction, the data mining model for Taobao transaction was established, and two data prediction evaluation indicators, namely prediction accuracy and entropy, were proposed. The experimental part verified the effect of Taobao transaction data mining. The experimental results showed that the data mining model moment had good data prediction accuracy and entropy. The average data prediction accuracy was 94.26%, and the data mining ability was strong.","PeriodicalId":43932,"journal":{"name":"Intelligent Decision Technologies-Netherlands","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2023-09-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"136025568","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Machine learning and financial big data control using IoT","authors":"Jian Xiao","doi":"10.3233/idt-230156","DOIUrl":"https://doi.org/10.3233/idt-230156","url":null,"abstract":"Machine learning algorithms have been widely used in risk prediction management systems for financial data. Early warning and control of financial risks are important areas of corporate investment decision-making, which can effectively reduce investment risks and ensure companies’ stable development. With the development of the Internet of Things, enterprises’ financial information is obtained through various intelligent devices in the enterprise financial system. Big data provides high-quality services for the economy and society in the high-tech era of information. However, the amount of financial data is large, complex and variable, so the analysis of financial data has huge difficulties, and with the in-depth application of machine learning algorithms, its shortcomings are gradually exposed. To this end, this paper collects the financial data of a listed group from 2005 to 2020, and conducts data preprocessing and Feature selection, including removing missing values, Outlier and unrelated items. Next, these data are divided into a training set and a testing set, where the training set data is used for model training and the testing set data is used to evaluate the performance of the model. Three methods are used to build and compare data control models, which are based on machine learning algorithm, based on deep learning network and the model based on artificial intelligence and Big data technology proposed in this paper. In terms of risk event prediction comparison, this paper selects two indicators to measure the performance of the model: accuracy and Mean squared error (MSE). Accuracy reflects the predictive ability of the model, which is the proportion of all correctly predicted samples to the total sample size. Mean squared error is used to evaluate the accuracy and error of the model, that is, the square of the Average absolute deviation between the predicted value and the true value. In this paper, the prediction results of the three methods are compared with the actual values, and their accuracy and Mean squared error are obtained and compared. The experimental results show that the model based on artificial intelligence and Big data technology proposed in this paper has higher accuracy and smaller Mean squared error than the other two models, and can achieve 90% accuracy in risk event prediction, which proves that it has higher ability in controlling financial data risk.","PeriodicalId":43932,"journal":{"name":"Intelligent Decision Technologies-Netherlands","volume":null,"pages":null},"PeriodicalIF":1.0,"publicationDate":"2023-08-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"85905821","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Modeling dynamic social networks using concept of neighborhood theory","authors":"Subrata Paul, C. Koner, Anirban Mitra","doi":"10.3233/idt-220138","DOIUrl":"https://doi.org/10.3233/idt-220138","url":null,"abstract":"Dynamic social network analysis basically deals with the study of how the nodes and edges and associations among them within the network alter with time, thereby forming a special category of social network. Geometrical analysis has been done on various occasions, but there is a difference in the approximate distances of nodes. Snapshots for social networks are taken at each time slot and then are bound for these studies. The paper will discuss an efficient way of modeling dynamic social networks with the concept of neighborhood theory of cellular automata. So far, no model that uses the concept of neighborhood has been proposed to the best of our knowledge and the literature survey. Besides cellular automata that has been important tool in various applications has remained unexplored in the area of modelling. To this extent the paper, is the 1st attempt in modelling the social network that is evolving in nature. A link prediction algorithm based on some basic graph theory concepts has also been additionally proposed for the emergence of new nodes within the network. Theoretical and programming simulations have been explained in support to the model. Finally, the paper will discuss the model with a real-life scenario.","PeriodicalId":43932,"journal":{"name":"Intelligent Decision Technologies-Netherlands","volume":null,"pages":null},"PeriodicalIF":1.0,"publicationDate":"2023-08-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83696696","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}