IEEE Sensors Letters: Latest Publications

A High-Fidelity, Low-Cost Visuotactile Sensor for Rolling Tactile Perception
IF 2.2
IEEE Sensors Letters Pub Date : 2024-10-09 DOI: 10.1109/LSENS.2024.3477913
Lintao Xie;Guitao Yu;Tianhong Tong;Yang He;Dongtai Liang
{"title":"A High-Fidelity, Low-Cost Visuotactile Sensor for Rolling Tactile Perception","authors":"Lintao Xie;Guitao Yu;Tianhong Tong;Yang He;Dongtai Liang","doi":"10.1109/LSENS.2024.3477913","DOIUrl":"https://doi.org/10.1109/LSENS.2024.3477913","url":null,"abstract":"In this letter, a low-cost but high-fidelity rolling tactile system is proposed for distinguishing patterns on curved surfaces, including an improved vision-based tactile sensor (VBTS) and a novel lightweight processing framework. The proposed VBTS contains a modular ring-shaped illumination configuration and an improved sensing elastomer, which is easy to fabricate without complex processing and costs only 16.95 USD in total. To achieve real-time data processing of rolling tactile images, inspired by event-based cameras, an efficient processing framework is introduced based on computer graphics, which can integrate sparse rolling tactile images into complete high-fidelity images for the final classification. To evaluate the effectiveness of the proposed system, a classification model is trained using a dataset generated by 13 cylinders with similar textures, where the identification accuracy of validation is up to 98.3%. Then, we test each cylinder sample for three rolling tactile perceptions and achieve 100% identification accuracy within 1.2 s on average, indicating a promising prospect of the proposed perception system for real-time application.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"8 11","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2024-10-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142579206","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
An Efficient and Scalable Internet of Things Framework for Smart Farming
IF 2.2
IEEE Sensors Letters Pub Date : 2024-10-09 DOI: 10.1109/LSENS.2024.3476940
Imad Jawhar;Samar Sindian;Sara Shreif;Mahmoud Ezzdine;Bilal Hammoud
{"title":"An Efficient and Scalable Internet of Things Framework for Smart Farming","authors":"Imad Jawhar;Samar Sindian;Sara Shreif;Mahmoud Ezzdine;Bilal Hammoud","doi":"10.1109/LSENS.2024.3476940","DOIUrl":"https://doi.org/10.1109/LSENS.2024.3476940","url":null,"abstract":"Internet of Things (IoT) advancements have provided significant benefits to the agriculture sector in rationing water usage and monitoring the growth of vegetation. This article presents an efficient and scalable IoT framework for smart farming. It is based on a wireless sensor actuator network (WSAN) that logs the farm's environmental parameters into a network control center for processing and monitoring. Furthermore, a new addressing scheme for the WSAN nodes is proposed, which features the scalability of the proposed solution. To test and evaluate the architecture's performance, simulations are conducted to measure water consumption and time to network failure. Results confirm the efficiency and the reliability of the proposed scalable network as a proof of concept of the proposed work.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"8 11","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2024-10-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142450990","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
A Separable Spatial–Temporal Graph Learning Approach for Skeleton-Based Action Recognition
IF 2.2
IEEE Sensors Letters Pub Date : 2024-10-07 DOI: 10.1109/LSENS.2024.3475515
Hui Zheng;Ye-Sheng Zhao;Bo Zhang;Guo-Qiang Shang
{"title":"A Separable Spatial–Temporal Graph Learning Approach for Skeleton-Based Action Recognition","authors":"Hui Zheng;Ye-Sheng Zhao;Bo Zhang;Guo-Qiang Shang","doi":"10.1109/LSENS.2024.3475515","DOIUrl":"https://doi.org/10.1109/LSENS.2024.3475515","url":null,"abstract":"With the popularization of sensors and the development of pose estimation algorithms, a skeleton-based action recognition task has gradually become mainstream in human action recognition tasks. The key to solving skeleton-based action recognition task is to extract feature representations that can accurately outline the characteristics of human actions from sensor data. In this letter, we propose a separable spatial-temporal graph learning approach, which is composed of independent spatial and temporal graph networks. In the spatial graph network, spectral-based graph convolutional network is selected to mine spatial features of each moment. In the temporal graph network, a global-local attention mechanism is embedded to excavate interdependence at different times. Extensive experiments are carried out on the NTU-RGB+D and NTU-RGB+D 120 datasets, and the results show that our proposed method outperforms several other baselines.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"8 11","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2024-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142517898","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Slosh Measuring Sensor System for Liquid-Carrying Robots
IF 2.2
IEEE Sensors Letters Pub Date : 2024-10-02 DOI: 10.1109/LSENS.2024.3473688
Luke J. Weaver;S. M. Bhagya P. Samarakoon;M. A. Viraj J. Muthugala;Mohan Rajesh Elara;Zaki S. Saldi
{"title":"Slosh Measuring Sensor System for Liquid-Carrying Robots","authors":"Luke J. Weaver;S. M. Bhagya P. Samarakoon;M. A. Viraj J. Muthugala;Mohan Rajesh Elara;Zaki S. Saldi","doi":"10.1109/LSENS.2024.3473688","DOIUrl":"https://doi.org/10.1109/LSENS.2024.3473688","url":null,"abstract":"Liquid-carrying robots require slosh suppression methods to improve their performance. To design these systems requires effective slosh measurement. State-of-the-Art slosh estimation methods have limitations, which include solely handling unidirectional motion or relying on theoretical models. This letter proposes a novel sensor array for measuring sloshing in liquid-carrying mobile robots. The proposed system offers two key contributions: first, it enables comprehensive measurement and visualization of sloshing during omnidirectional movements, and second, it provides a compact and seamless integration into mobile robots, enabling them to mitigate the adverse effects of sloshing. The sensor system has been developed using 14 time-of-flight range sensors. The range sensors are connected to an Arduino Mega through I\u0000<inline-formula><tex-math>$^{2}$</tex-math></inline-formula>\u0000 C communication. A 3-D visualization method has also been developed to visualize the sloshing. The sensor array was integrated into a mobile robot for validation. Utilizing the 3-D visualization method, the sensor system can reconstruct the liquid surface with a sampling rate of 17.8 Hz. The experimental results confirm that the proposed sensor system effectively measures sloshing during omnidirectional movement of the robot.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"8 11","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2024-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142518063","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
A Portable and Flexible On-Road Sensing System for Traffic Monitoring
IF 2.2
IEEE Sensors Letters Pub Date : 2024-10-02 DOI: 10.1109/LSENS.2024.3473306
Naveen Kumar Gajingam;Sourav Karmakar;Aftab M. Hussain
{"title":"A Portable and Flexible On-Road Sensing System for Traffic Monitoring","authors":"Naveen Kumar Gajingam;Sourav Karmakar;Aftab M. Hussain","doi":"10.1109/LSENS.2024.3473306","DOIUrl":"https://doi.org/10.1109/LSENS.2024.3473306","url":null,"abstract":"With an increasing number of vehicles on the road every day, intelligent traffic monitoring and control is essential. This entails development of cost-effective, scalable, and easy-to-install monitoring systems. In this letter, a versatile piezoresistance-based cost-effective on-road sensor system is presented to estimate vehicle speed and vehicle wheelbase length. The system consists of a velostat thin film sensing element placed on the road, with read out circuits and control electronics located at the sidewalk. The system measures the speed of a vehicle with 90.4% accuracy, and the length of the wheelbase with 94.3% accuracy. The wheelbase length can be used to classify the vehicle type. Our experiments show that the system is reliable, as the sensor output returns to the initial values after each vehicle passes. The utilization of flexible piezoresistive sensors makes this system convenient to deploy in different applications where basic traffic activity monitoring is required with speed, count, and classification estimation of vehicles.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"8 11","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2024-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142452725","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Machine Learning-Based Low-Cost Colorimetric Sensor for pH and Free-Chlorine Measurement
IF 2.2
IEEE Sensors Letters Pub Date : 2024-10-02 DOI: 10.1109/LSENS.2024.3473530
Chetanya Goyal;Shreya Malkurthi;Kirthi Vignan Reddy Yellakonda;Aftab M. Hussain
{"title":"Machine Learning-Based Low-Cost Colorimetric Sensor for pH and Free-Chlorine Measurement","authors":"Chetanya Goyal;Shreya Malkurthi;Kirthi Vignan Reddy Yellakonda;Aftab M. Hussain","doi":"10.1109/LSENS.2024.3473530","DOIUrl":"https://doi.org/10.1109/LSENS.2024.3473530","url":null,"abstract":"Free-chlorine concentration monitoring is of importance in public and industrial water supplies. Current colorimetric methods, which include test strips, spectrophotometric kits, etc. either lack precision or are expensive and labor intensive. In this study, we present a fully automated, cost-effective method of measurement of free chlorine concentration in real -time. The setup includes an automatic powder dispenser, an automatic liquid dispenser, a sample chamber, and an LED-light-dependent resistor sensor pair. The liquid sample is mixed with a coloring reagent and its color is measured using the sensor pair. Different regression algorithms were trained on the sensor data and tuned to predict the corresponding free-chlorine concentration with maximum accuracy. The system eliminates the need for color matching, reduces the time taken per test, and can be used to predict concentrations of multiple analytes, including ammonia-nitrogen, dissolved oxygen, etc., by adding corresponding colorimetry agents. This allows for a fully automated, real-time water testing system.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"8 11","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2024-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142452694","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Advancing General Sensor Data Synthesis by Integrating LLMs and Domain-Specific Generative Models
IF 2.2
IEEE Sensors Letters Pub Date : 2024-09-30 DOI: 10.1109/LSENS.2024.3470748
Xiaomao Zhou;Qingmin Jia;Yujiao Hu
{"title":"Advancing General Sensor Data Synthesis by Integrating LLMs and Domain-Specific Generative Models","authors":"Xiaomao Zhou;Qingmin Jia;Yujiao Hu","doi":"10.1109/LSENS.2024.3470748","DOIUrl":"https://doi.org/10.1109/LSENS.2024.3470748","url":null,"abstract":"Synthetic data has become essential in machine learning and data science, addressing real-world data limitations such as scarcity, privacy, and cost. While existing generative models are effective in synthesizing various sensor data, they struggle with performance and generalization. This letter introduces a large language model (LLM)-driven framework that leverages LLMs and domain-specific generative models (DGMs) for general sensor data synthesis. Specifically, our method employs LLMs as the core to analyze data generation tasks, decompose complex tasks into manageable subtasks, and delegate each to the most suitable DGM, thereby automatically constructing customized data generation pipelines. Meanwhile, the integration of reinforcement learning (RL) is promising to enhance the framework's ability to optimally utilize DGMs, resulting in data generation with superior quality and control flexibility. Experimental results demonstrate the effectiveness of LLMs in understanding diverse tasks and in facilitating general sensor data synthesis through collaborative interactions with diverse DGMs.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"8 11","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2024-09-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142443004","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
µWSense: A Self-Sustainable Microwave-Powered Battery-Less Wireless Sensor Node for Temperature and Humidity Monitoring
IF 2.2
IEEE Sensors Letters Pub Date : 2024-09-25 DOI: 10.1109/LSENS.2024.3468808
Vikas Kumar Malav;Ashwani Sharma
{"title":"$mu$WSense: A Self-Sustainable Microwave-Powered Battery-Less Wireless Sensor Node for Temperature and Humidity Monitoring","authors":"Vikas Kumar Malav;Ashwani Sharma","doi":"10.1109/LSENS.2024.3468808","DOIUrl":"https://doi.org/10.1109/LSENS.2024.3468808","url":null,"abstract":"To realize a green Internet of Things (IoT) sensor network, batteryless wireless sensor nodes (WSNs) are required. This self-sustainability is achieved via energy harvesting from conventional renewable sources, such as solar and wind, which rely on the weather and are highly expensive. Alternatively, the microwave-based wireless power transfer technique is demonstrated previously, however, only for sensing operations without including the IoT. In this letter, a \u0000<inline-formula><tex-math>$mu$</tex-math></inline-formula>\u0000Wave-powered WSN (\u0000<inline-formula><tex-math>$mu$</tex-math></inline-formula>\u0000Wsense) hardware is demonstrated to realize true batteryless IoT sensing applications. The \u0000<inline-formula><tex-math>$mu$</tex-math></inline-formula>\u0000Wsense consists of a rectenna array (\u0000<inline-formula><tex-math>$mu$</tex-math></inline-formula>\u0000wave receiver), power management unit, and BLE module. The \u0000<inline-formula><tex-math>$mu$</tex-math></inline-formula>\u0000 wave receiver is designed at 5.2 GHz to power \u0000<inline-formula><tex-math>$mu$</tex-math></inline-formula>\u0000Wsense at a maximum measured transfer range of 2 m with a maximum real-time sensing interval of 75 s. The minimum harvested power \u0000<inline-formula><tex-math>$-16.59$</tex-math></inline-formula>\u0000 dBm is sufficient to operate the \u0000<inline-formula><tex-math>$mu$</tex-math></inline-formula>\u0000Wsense.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"8 11","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2024-09-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142438516","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Multiclass Object Classification Using Ultra-Low Resolution Time-of-Flight Sensors
IF 2.2
IEEE Sensors Letters Pub Date : 2024-09-24 DOI: 10.1109/LSENS.2024.3467165
Andrea Fasolino;Paola Vitolo;Rosalba Liguori;Luigi Di Benedetto;Alfredo Rubino;Danilo Pau;Gian Domenico Licciardo
{"title":"Multiclass Object Classification Using Ultra-Low Resolution Time-of-Flight Sensors","authors":"Andrea Fasolino;Paola Vitolo;Rosalba Liguori;Luigi Di Benedetto;Alfredo Rubino;Danilo Pau;Gian Domenico Licciardo","doi":"10.1109/LSENS.2024.3467165","DOIUrl":"https://doi.org/10.1109/LSENS.2024.3467165","url":null,"abstract":"Time-of-Flight (ToF) sensors are generally used in combination with red–blue–green sensors in image processing for adding the 3-D to 2-D scenes. Because of their low lateral resolution and contrast, they are scarcely used in object detection or classification. In this work, we demonstrate that ultra-low resolution (URL) ToF sensors with 8×8 pixels can be successfully used as stand-alone sensors for multiclass object detection even if combined with machine learning (ML) models, which can be implemented in a very compact and low-power custom circuit. Specifically, addressing an STMicroelectronics VL53L8CX 8×8 pixel ToF sensor, the designed ToF+ML system is capable to classify up to 10 classes with an overall mean accuracy of 90.21%. The resulting hardware architecture, prototyped on an AMD Xilinx Artix-7 field programmable gate array (FPGA), achieves an energy per inference consumption of 65.6 nJ and a power consumption of 1.095 \u0000<inline-formula><tex-math>$mu text{W}$</tex-math></inline-formula>\u0000 at the maximum output data rate of the sensor. These values are lower than the typical energy and power consumption of the sensor, enabling real-time postprocessing of depth images with significantly better performance than the state-of-the-art in the literature.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"8 10","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2024-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10689573","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142431892","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Physical Sensor Driven Approach for Optimizing Positive Airway Pressure Therapy
IF 2.2
IEEE Sensors Letters Pub Date : 2024-09-20 DOI: 10.1109/LSENS.2024.3464673
Delpha J;Priyanka Kokil;Subramaniyan S;Jayanthi T
{"title":"Physical Sensor Driven Approach for Optimizing Positive Airway Pressure Therapy","authors":"Delpha J;Priyanka Kokil;Subramaniyan S;Jayanthi T","doi":"10.1109/LSENS.2024.3464673","DOIUrl":"https://doi.org/10.1109/LSENS.2024.3464673","url":null,"abstract":"Obstructive sleep apnea (OSA) is a sleep disorder for which continuous positive airway pressure (CPAP) therapy is an effective treatment. In this study, a novel method to control the pressure in the positive airway pressure (PAP) device is proposed, which, in return, reduces the need to keep the subject at high pressure throughout extended period of their sleep without hindering the efficacy of the therapy. A standard CPAP/Bi-PAP (bilevel positive airway pressure) titration study is compared and verified with the predicted pressure values. Also, the relationship and correlation between weight, age, \u0000<inline-formula><tex-math>$SpO_{2}$</tex-math></inline-formula>\u0000, oxygen desaturation index, and the maximum pressure required for PAP therapy are also analyzed. Thus, it is affirmed that the PAP therapy compliance can be improved by sustaining the essential pressure and avoiding extended high-pressure intervals during therapy, unless they are absolutely required.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"8 10","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2024-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142376949","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0