Latest Publications in Drones

A Live Detecting System for Strain Clamps of Transmission Lines Based on Dual UAVs’ Cooperation
Drones Pub Date: 2024-07-19 DOI: 10.3390/drones8070333
Zhiwei Jia, Yongkang Ouyang, Chao Feng, Shaosheng Fan, Zheng Liu, Chenhao Sun
Abstract: Strain clamps are critical components of high-voltage overhead transmission lines, and detecting their defects is an important part of regular line inspection. A dual UAV (unmanned aerial vehicle) system was proposed to detect strain clamps in multiple split-phase conductors. The main UAV was equipped with a digital radiography (DR) imaging device, a mechanical arm, and an edge intelligence module with visual sensors. The slave UAV was equipped with a digital imaging board and visual sensors. A workflow was proposed for this dual UAV system; its main procedures are target detection and distance detection of the strain clamps, as well as defect detection of strain clamps in DR images. To satisfy the demands of UAV-borne, real-time deployment, the improved YOLOv8-TR algorithm was proposed for strain clamp detection (mAP@50 of 60.9%), and the KD-ResRPA algorithm was used for defect detection in DR images (average AUCROC of 82.7% across three datasets). Field experiments validated the suitability of our dual UAV-based system for the live detection of strain clamps in double split-phase conductors, demonstrating its potential for practical application in live detecting systems.
Citations: 0
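The paper’s YOLOv8-TR and KD-ResRPA models are not reproducible from the abstract above; as a rough illustration of the on-board detection step only, the sketch below runs a stock YOLOv8 model through the ultralytics package as a stand-in. The weight file, confidence threshold, and image path are placeholders, not values from the paper.

```python
# Hypothetical stand-in for the on-board strain-clamp detector: a stock
# YOLOv8 model via the ultralytics package, not the paper's YOLOv8-TR.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")                              # placeholder weights
results = model.predict("clamp_image.jpg", conf=0.25)   # placeholder image path

for box in results[0].boxes:
    x1, y1, x2, y2 = box.xyxy[0].tolist()               # corner coordinates in pixels
    print(f"class={int(box.cls)} conf={float(box.conf):.2f} "
          f"bbox=({x1:.0f},{y1:.0f},{x2:.0f},{y2:.0f})")
```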
Vision-Based Anti-UAV Detection Based on YOLOv7-GS in Complex Backgrounds
Drones Pub Date: 2024-07-18 DOI: 10.3390/drones8070331
Chunjuan Bo, Yuntao Wei, Xiujia Wang, Zhan Shi, Ying Xiao
Abstract: Unauthorized unmanned aerial vehicles (UAVs) pose threats to public safety and individual privacy. Traditional object-detection approaches often fall short when applied to anti-UAV tasks. To address this issue, we propose the YOLOv7-GS model, designed specifically for identifying small UAVs in complex, low-altitude environments. This research primarily aims to improve the model’s detection capability for small UAVs against complex backgrounds. Enhancements were applied to the YOLOv7-tiny model, including adjustments to the sizes of the prior boxes, incorporation of the InceptionNeXt module at the end of the neck section, and introduction of the SPPFCSPC-SR and Get-and-Send modules. These modifications help preserve details of small UAVs and heighten the model’s focus on them. The YOLOv7-GS model achieves commendable results on the DUT Anti-UAV and Amateur Unmanned Air Vehicle Detection datasets and performs competitively against other mainstream algorithms.
Citations: 0
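The abstract above mentions adjusting the sizes of the prior (anchor) boxes for small UAVs but does not describe how. One common approach is k-means clustering of the ground-truth box dimensions; the sketch below shows that idea with made-up box sizes. A proper YOLO anchor search would typically use an IoU-based distance rather than plain Euclidean k-means.

```python
# One common way to re-derive YOLO prior-box sizes: k-means over the
# width/height of the ground-truth boxes (example values are made up).
import numpy as np
from sklearn.cluster import KMeans

# (width, height) of labelled UAV boxes in pixels -- placeholder data
wh = np.array([[12, 9], [15, 11], [20, 14], [8, 6], [30, 22],
               [11, 8], [18, 13], [25, 17], [9, 7], [14, 10]], dtype=float)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(wh)
anchors = sorted(kmeans.cluster_centers_.tolist(), key=lambda a: a[0] * a[1])
print("candidate prior-box sizes (w, h):", np.round(anchors, 1))
```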
YOMO-Runwaynet: A Lightweight Fixed-Wing Aircraft Runway Detection Algorithm Combining YOLO and MobileRunwaynet
Drones Pub Date: 2024-07-18 DOI: 10.3390/drones8070330
Wei Dai, Zhengjun Zhai, Dezhong Wang, Zhaozi Zu, Siyuan Shen, Xinlei Lv, Sheng Lu, Lei Wang
Abstract: Runway detection for fixed-wing aircraft is a hot topic in the field of aircraft visual navigation. High accuracy, high fault tolerance, and lightweight design are the core requirements of runway feature detection. This paper addresses these needs by proposing a lightweight runway feature detection algorithm named YOMO-Runwaynet, designed for edge devices. The algorithm features a lightweight network architecture that follows the YOMO inference framework, combining the advantages of YOLO and MobileNetV3 in feature extraction and operational speed. Firstly, a lightweight attention module is introduced into MnasNet, and the improved MobileNetV3 is employed as the backbone network to enhance feature extraction efficiency. Then, PANet and SPPnet are incorporated to aggregate features from multiple effective feature layers. Subsequently, to reduce latency and improve efficiency, YOMO-Runwaynet generates a single optimal prediction for each object, eliminating the need for non-maximum suppression (NMS). Finally, experimental results on embedded devices demonstrate that YOMO-Runwaynet achieves a detection accuracy of over 89.5% on the ATD (Aerovista Runway Dataset), with a pixel error rate of less than 0.003 for runway keypoint detection and an inference speed exceeding 90.9 FPS. These results indicate that YOMO-Runwaynet offers high accuracy and real-time performance, providing effective support for the visual navigation of fixed-wing aircraft.
Citations: 0
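The abstract above states that YOMO-Runwaynet emits a single optimal prediction per object so that NMS can be skipped; the exact head design is not given. As a rough contrast with the usual NMS post-processing, the sketch below simply keeps the single highest-scoring candidate per class, using placeholder detections.

```python
# Minimal illustration of an NMS-free selection rule: keep only the single
# highest-confidence candidate per class instead of running NMS.
import numpy as np

# candidate detections: [x1, y1, x2, y2, score, class_id] -- placeholder values
dets = np.array([
    [100, 200, 500, 260, 0.91, 0],   # runway
    [105, 205, 495, 255, 0.83, 0],   # overlapping runway candidate
    [300, 220, 340, 240, 0.60, 1],   # threshold marking
])

best = {}
for d in dets:
    cls = int(d[5])
    if cls not in best or d[4] > best[cls][4]:
        best[cls] = d

for cls, d in best.items():
    print(f"class {cls}: score {d[4]:.2f}, box {d[:4].tolist()}")
```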
An All-Time Detection Algorithm for UAV Images in Urban Low Altitude
Drones Pub Date: 2024-07-18 DOI: 10.3390/drones8070332
Yuzhuo Huang, Jingyi Qu, Haoyu Wang, Jun Yang
Abstract: With the rapid development of urban air traffic, Unmanned Aerial Vehicles (UAVs) are gradually being widely used in cities. Since UAVs are prohibited over important places in Urban Air Mobility (UAM), such as government sites and airports, it is important to develop air–ground non-cooperative UAV surveillance for air security around the clock. In this paper, we propose an all-time UAV detection algorithm based on visible images during the day and infrared images at night. We construct a UAV dataset with urban visible backgrounds (UAV–visible) and a UAV dataset with urban infrared backgrounds (UAV–infrared). In the daytime, visible images are less reliable for UAV detection in foggy environments; we therefore incorporate a defogging algorithm into the detection network, which ensures undistorted image output for UAV detection once defogging is performed. At night, infrared images are characterized by low resolution, unclear object contours, and complex backgrounds. We integrate attention and the transformation of spatial feature maps into depth feature maps to detect small UAVs in images. The all-time detection algorithm is trained separately on the two datasets, achieving 96.3% and 94.7% mAP50 on the UAV–visible and UAV–infrared datasets and performing real-time object detection with inference speeds of 40.16 FPS and 28.57 FPS, respectively.
Citations: 0
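The abstract above does not say which defogging algorithm precedes the daytime detector. As one common single-image choice, the sketch below implements a compact dark-channel-prior dehazing step (He et al.); it is an assumption for illustration, not the paper’s method, and omits the usual transmission-map refinement.

```python
# Compact dark-channel-prior defogging sketch (He et al.), shown only as one
# common pre-processing choice; not the defogging algorithm used in the paper.
import cv2
import numpy as np

def defog_dark_channel(img_bgr, patch=15, omega=0.95, t0=0.1):
    img = img_bgr.astype(np.float64) / 255.0
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (patch, patch))
    # dark channel: per-pixel channel minimum, then a local minimum filter
    dark = cv2.erode(img.min(axis=2), kernel)
    # atmospheric light: mean colour of the brightest 0.1% dark-channel pixels
    n = max(1, int(dark.size * 0.001))
    idx = np.unravel_index(np.argsort(dark, axis=None)[-n:], dark.shape)
    A = img[idx].mean(axis=0)
    # transmission estimate and scene-radiance recovery
    t = 1.0 - omega * cv2.erode((img / A).min(axis=2), kernel)
    t = np.clip(t, t0, 1.0)[..., None]
    J = (img - A) / t + A
    return np.clip(J * 255.0, 0, 255).astype(np.uint8)

# usage: clear = defog_dark_channel(cv2.imread("foggy_uav_frame.jpg"))
```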
Control and Application of Tree Obstacle-Clearing Coaxial Octocopter with Flexible Suspension Saw
Drones Pub Date: 2024-07-17 DOI: 10.3390/drones8070328
Luwei Liao, Zhong Yang, Haoze Zhuo, Nuo Xu, Wei Wang, Kun Tao, Jiabing Liang, Qiuyan Zhang
Abstract: To address the challenges of clearing tree obstacles along power transmission lines, the control and application of a novel Tree-Obstacle Clearing Coaxial Octocopter with Flexible Suspension Saw (TOCCO-FSS) were investigated. Firstly, the overall scheme of the TOCCO-FSS was designed and its dynamics were modeled using the Lagrange equations. Secondly, to address the interference encountered during operation, a contact-operation model was established to estimate the uncertainties and external disturbances arising during contact. Further, a Non-Singular Terminal Sliding-Mode Active Disturbance Rejection Control (NTSM-ADRC) method was developed based on the mathematical model of the TOCCO-FSS. Finally, the performance of the controller was verified through simulations and physical experiments. The results demonstrate that the design, control, and application of the complete TOCCO-FSS system are effective.
Citations: 0
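The NTSM-ADRC controller in the abstract above cannot be reconstructed from this summary. The sketch below shows only the active-disturbance-rejection core on a toy double-integrator plant: a linear extended state observer (ESO) estimates the lumped disturbance, which is then cancelled in the control law. The gains, disturbance, and the PD feedback standing in for the terminal sliding-mode surface are all assumptions for illustration.

```python
# Toy ADRC loop: a linear ESO estimates a lumped disturbance on a
# double-integrator plant, and the control law cancels it (made-up gains).
import numpy as np

def simulate(dt=0.001, T=5.0, b0=1.0):
    n = int(T / dt)
    x = np.array([0.0, 0.0])          # true position, velocity
    z = np.array([0.0, 0.0, 0.0])     # ESO states: position, velocity, disturbance
    wo = 30.0                          # observer bandwidth (placeholder)
    beta = np.array([3 * wo, 3 * wo**2, wo**3])
    kp, kd = 25.0, 10.0                # PD gains standing in for the NTSM surface
    ref, u = 1.0, 0.0
    log = []
    for k in range(n):
        f = 0.5 * np.sin(np.pi * k * dt)          # unknown disturbance
        x = x + dt * np.array([x[1], f + b0 * u])  # plant: x'' = f + b0*u
        y = x[0]
        e_obs = y - z[0]                           # ESO update
        z = z + dt * np.array([z[1] + beta[0] * e_obs,
                               z[2] + beta[1] * e_obs + b0 * u,
                               beta[2] * e_obs])
        u0 = kp * (ref - z[0]) - kd * z[1]         # nominal feedback
        u = (u0 - z[2]) / b0                       # disturbance rejection
        log.append((k * dt, x[0]))
    return np.array(log)

if __name__ == "__main__":
    print("final position:", simulate()[-1, 1])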
Real-Time Registration of Unmanned Aerial Vehicle Hyperspectral Remote Sensing Images Using an Acousto-Optic Tunable Filter Spectrometer
Drones Pub Date: 2024-07-17 DOI: 10.3390/drones8070329
Hong Liu, Bingliang Hu, Xingsong Hou, Tao Yu, Zhoufeng Zhang, Xiao Liu, Jiacheng Liu, Xueji Wang
Abstract: Differences in field of view may occur during unmanned aerial remote sensing imaging with acousto-optic tunable filter (AOTF) spectral imagers that use zoom lenses. These differences may stem from image size deformation caused by the zoom lens, image drift caused by AOTF wavelength switching, and drone platform jitter, and they can be addressed through hyperspectral image registration. This article proposes a new coarse-to-fine remote sensing image registration framework based on features and optical flow theory and compares its performance with that of existing registration algorithms on the same dataset. The proposed method increases the structural similarity index by a factor of 5.2, reduces the root mean square error by a factor of 3.1, and increases the mutual information by a factor of 1.9. To meet the real-time processing requirements of the AOTF spectrometer in remote sensing, a development environment using VS2023+CUDA+OPENCV was established to improve the demons registration algorithm. The CPU+GPU (central processing unit + graphics processing unit) registration algorithm achieved a speedup of roughly 30 times over the CPU-only implementation. Finally, the real-time registration of spectral data during flight was verified. The proposed method demonstrates that AOTF hyperspectral imagers can be used in real-time remote sensing applications on unmanned aerial vehicles.
Citations: 0
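The abstract above reports registration quality in terms of structural similarity, root mean square error, and mutual information. As a small illustration of how such metrics can be computed for a registered band against a reference band, the sketch below uses scikit-image and NumPy; the image paths are placeholders, and normalized mutual information is used as a stand-in for the paper’s mutual-information measure.

```python
# Quality metrics commonly used to compare a registered band with the
# reference band: SSIM, RMSE, and (normalized) mutual information.
import numpy as np
from skimage import io
from skimage.metrics import structural_similarity, normalized_mutual_information

ref = io.imread("band_reference.png", as_gray=True).astype(np.float64)   # placeholder
reg = io.imread("band_registered.png", as_gray=True).astype(np.float64)  # placeholder

ssim = structural_similarity(ref, reg, data_range=ref.max() - ref.min())
rmse = np.sqrt(np.mean((ref - reg) ** 2))
nmi = normalized_mutual_information(ref, reg)

print(f"SSIM={ssim:.3f}  RMSE={rmse:.3f}  NMI={nmi:.3f}")
```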
Control of Helicopter Using Virtual Swashplate
Drones Pub Date: 2024-07-16 DOI: 10.3390/drones8070327
J. Flores, Sergio Salazar, I. González-Hernández, Yukio Rosales, Rogelio Lozano, Eduardo Salazar, Benjamin Nicolas
Abstract: This article presents a virtual swashplate mechanism for a mini helicopter in the classic configuration. The propeller bases are part of a passive mechanism driven by main-rotor torque modulation; this mechanism generates a synchronous and opposite change in the propellers’ angle of attack, which tilts the thrust vector. This approach controls the six degrees of freedom of the aircraft using two rotors. The main rotor controls vertical displacement and uses torque modulation and swing-hinged propellers to generate the pitch and roll moments and horizontal displacement, while the yaw moment is controlled by the tail rotor. The dynamic model is obtained using the Newton–Euler approach, and robust control algorithms are proposed. Experimental results show the performance of the proposed virtual swashplate in real-time outdoor hover flights.
Citations: 0
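The abstract above builds on the idea that tilting the main-rotor thrust vector produces the pitch and roll moments. As a much-simplified geometric illustration (not the paper’s Newton–Euler model), the sketch below maps assumed tilt angles of the thrust vector, applied at a hub offset above the centre of gravity, to body-frame forces and moments.

```python
# Simplified geometry only: body-frame force and moment produced by tilting
# the main-rotor thrust vector, applied at a hub located a height h above
# the centre of gravity (all parameters are hypothetical).
import numpy as np

def thrust_vector_moments(T, tilt_roll, tilt_pitch, h):
    """T: thrust magnitude [N]; tilt angles [rad]; h: hub height above CoG [m]."""
    f = T * np.array([np.sin(tilt_pitch),
                      -np.sin(tilt_roll) * np.cos(tilt_pitch),
                      np.cos(tilt_roll) * np.cos(tilt_pitch)])
    r = np.array([0.0, 0.0, h])        # hub position relative to the CoG
    m = np.cross(r, f)                  # roll/pitch moments from the offset thrust
    return f, m

# usage: force, moment = thrust_vector_moments(T=20.0, tilt_roll=0.05,
#                                              tilt_pitch=0.03, h=0.25)
```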
Multi-UAVs Tracking Non-Cooperative Target Using Constrained Iterative Linear Quadratic Gaussian
Drones Pub Date: 2024-07-15 DOI: 10.3390/drones8070326
Can Zhang, Yidi Wang, Wei Zheng
Abstract: This study considers the problem of controlling multiple unmanned aerial vehicles (UAVs) to consistently track a non-cooperative ground target with uncertain motion in a hostile environment containing obstacles. An active information acquisition (AIA) problem is formulated to minimize the uncertainty of the target-tracking task, with the uncertain motion of the target represented as a Wiener process. First, the configuration of the UAV swarm is optimized, considering collision avoidance, the horizontal field of view (HFOV), and the communication radius, to calculate the reference trajectories of the UAVs. Next, a novel algorithm called Constrained Iterative Linear Quadratic Gaussian (CILQG) is introduced to track the reference trajectories. The target’s uncertain state and the UAV states are described as beliefs. The CILQG algorithm uses the unscented transform to propagate the belief about the UAVs’ motion while also accounting for the impact of navigation errors on the target-tracking process. The estimation error of the target position is under 4 m, the reference-trajectory tracking error is under 3 m, and the estimation error remains unchanged even in the presence of obstacles. The approach therefore handles the uncertainties involved effectively and ensures accurate tracking of the target.
Citations: 0
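The abstract above describes propagating beliefs with the unscented transform under a Wiener-process target model. As a rough illustration of those two ingredients only (not the CILQG controller), the sketch below generates standard unscented sigma points for a 2-D position belief and propagates them through one Wiener-process step; all numbers are placeholders.

```python
# Standard unscented-transform sigma points for a 2-D belief, propagated
# through one step of a Wiener-process (random-walk) target model.
import numpy as np

def sigma_points(mean, cov, alpha=0.1, beta=2.0, kappa=0.0):
    n = mean.size
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)
    pts = [mean] + [mean + S[:, i] for i in range(n)] + [mean - S[:, i] for i in range(n)]
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1 - alpha**2 + beta)
    return np.array(pts), wm, wc

mu = np.array([10.0, -3.0])           # target position belief (placeholder)
P = np.diag([4.0, 4.0])
q, dt = 0.5, 1.0                      # Wiener-process intensity and time step

pts, wm, wc = sigma_points(mu, P)
# Wiener process: the mean is unchanged, uncertainty grows by q*dt per axis
mu_pred = wm @ pts
P_pred = (pts - mu_pred).T @ np.diag(wc) @ (pts - mu_pred) + q * dt * np.eye(2)
print("predicted mean:", mu_pred, "\npredicted covariance:\n", P_pred)
```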
Message Passing Detectors for UAV-Based Uplink Grant-Free NOMA Systems
Drones Pub Date: 2024-07-14 DOI: 10.3390/drones8070325
Yi Song, Yiwen Zhu, Kun Chen-Hu, Xinhua Lu, Peng Sun, Zhongyong Wang
Abstract: Utilizing unmanned aerial vehicles (UAVs) as mobile access points or base stations has emerged as a promising solution to the excessive traffic demands in wireless networks. This paper investigates improving detector performance at UAV base stations (UAV-BSs) in an uplink grant-free non-orthogonal multiple access (GF-NOMA) system by exploiting the temporal correlation of the activity states (AS) of the different user equipments (UEs). The Bernoulli Gaussian-Markov chain (BG-MC) probability model is used to exploit both the sparsity and the slowly changing characteristics of each UE’s AS. The GAMP Bernoulli Gaussian-Markov chain (GAMP-BG-MC) algorithm is proposed to improve detector performance by using bidirectional message passing between neighboring time slots to fully exploit the temporally correlated AS of the UEs. Furthermore, the parameters of the BG-MC model can be updated adaptively during the estimation procedure without knowledge of the system statistics. Simulation results show that the proposed algorithm improves detection accuracy compared with existing methods while keeping the same order of complexity.
Citations: 0
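The GAMP-BG-MC detector itself is beyond a short sketch, so the code below only simulates the Bernoulli Gaussian-Markov chain activity model the abstract above relies on: each UE’s activity follows a two-state Markov chain across time slots, and an active UE transmits a complex Gaussian symbol. The transition probabilities and dimensions are made-up values chosen so the chain’s stationary activity probability matches the target.

```python
# Simulating the Bernoulli Gaussian-Markov chain (BG-MC) activity model:
# a two-state Markov chain per UE across slots, Gaussian symbols when active.
import numpy as np

rng = np.random.default_rng(0)
n_ue, n_slots = 8, 20
p_active = 0.2          # stationary activity probability (placeholder)
p_stay_active = 0.8     # P(active -> active), models the slowly changing AS
# chosen so the chain's stationary distribution equals p_active
p_become_active = p_active * (1 - p_stay_active) / (1 - p_active)

active = rng.random(n_ue) < p_active
symbols = np.zeros((n_ue, n_slots), dtype=complex)
for t in range(n_slots):
    u = rng.random(n_ue)
    active = np.where(active, u < p_stay_active, u < p_become_active)
    n_act = int(active.sum())
    symbols[active, t] = (rng.standard_normal(n_act)
                          + 1j * rng.standard_normal(n_act)) / np.sqrt(2)

print("activity pattern (rows = UEs, columns = slots):")
print((np.abs(symbols) > 0).astype(int))
```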
Event-Triggered Collaborative Fault Diagnosis for UAV–UGV Systems
Drones Pub Date: 2024-07-13 DOI: 10.3390/drones8070324
Runze Li, Bin Jiang, Yan Zong, N. Lu, Li Guo
Abstract: Heterogeneous unmanned systems composed of unmanned aerial vehicles (UAVs) and unmanned ground vehicles (UGVs) have been broadly applied in many domains, and collaborative fault diagnosis (CFD) among UAVs and UGVs has become a key technology in these systems. However, collaborative fault diagnosis in unmanned systems faces the challenges of dynamic environments and limited communication bandwidth. This paper proposes an event-triggered collaborative fault diagnosis framework for the UAV–UGV system. The framework aims to achieve autonomous fault monitoring and cooperative diagnosis among unmanned systems, thus enhancing system security and reliability. Firstly, we propose a fault-trigger mechanism based on broad learning systems (BLS), which uses sensor data to accurately detect and identify faults. Then, under the dynamic event-triggering mechanism, the network communication topology between the UAV–UGV system and the BLS is used to achieve cooperative fault diagnosis. To validate the effectiveness of the proposed scheme, we conduct experiments on a software-in-the-loop (SIL) simulation platform. The experimental results demonstrate that our method achieves high diagnosis accuracy for the UAV–UGV system.
Citations: 0
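The BLS-based diagnosis network in the abstract above is not reproducible from this summary. The sketch below illustrates only the event-triggering idea that motivates it: a vehicle broadcasts its diagnostic residual to the collaborating agents only when the value deviates from the last transmitted one by more than a threshold, saving communication bandwidth. The residual signal, threshold, and injected fault are all made up.

```python
# Minimal event-triggering rule: transmit a diagnostic residual only when it
# deviates from the last transmitted value by more than a threshold.
import numpy as np

rng = np.random.default_rng(1)
residuals = np.abs(rng.normal(0.0, 0.05, 100))   # placeholder residual signal
residuals[60:] += 0.4                            # injected fault at step 60

threshold = 0.1
last_sent = residuals[0]
events = []
for k, r in enumerate(residuals):
    if abs(r - last_sent) > threshold:           # trigger condition
        events.append(k)                         # transmit and update memory
        last_sent = r

print(f"transmissions: {len(events)} of {len(residuals)} samples")
print("first triggers after the fault:", [k for k in events if k >= 60][:3])
```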