Computer Networks: Latest Articles

GWPF: Communication-efficient federated learning with Gradient-Wise Parameter Freezing
IF 4.4 · CAS Tier 2 · Computer Science
Computer Networks · Pub Date: 2024-11-04 · DOI: 10.1016/j.comnet.2024.110886
Communication bottleneck is a critical challenge in federated learning. Parameter freezing, which uses fine-grained parameters as aggregation objects, has emerged as a popular approach, but existing methods lack a thawing strategy, thaw parameters too late and too inflexibly, and underutilize the updates of frozen parameters. To address these issues, we propose Gradient-Wise Parameter Freezing (GWPF), a mechanism that controls the frozen period of each parameter through paired freezing and thawing strategies. GWPF globally freezes parameters with insignificant gradients and excludes frozen parameters from global updates during the frozen period, reducing communication overhead and accelerating training. The thawing strategy, based on global decisions by the server in collaboration with the clients, leverages per-round feedback on the locally accumulated gradients of frozen parameters, balancing communication reduction against model accuracy. We provide theoretical analysis and a convergence guarantee for non-convex objectives. Extensive experiments confirm that the mechanism achieves a speedup of up to 4.52x in time-to-accuracy, reduces communication overhead by up to 48.73%, and improves final model accuracy by up to 2.01% compared to APF, the fastest existing method. The code for GWPF is available at https://github.com/Dora233/GWPF.
Citations: 0
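The freeze/thaw cycle the abstract describes can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation: the thresholds and the exact thaw rule are assumptions based on the abstract's description of accumulating the local gradients of frozen parameters.

```python
def gwpf_round(grads, frozen, accum, freeze_thresh, thaw_thresh):
    """One illustrative round of gradient-wise parameter freezing.

    grads:  this round's gradient per parameter (list of floats)
    frozen: set of indices of currently frozen parameters
    accum:  dict index -> locally accumulated gradient while frozen
    Hypothetical thresholds stand in for GWPF's freezing/thawing decisions.
    Returns the updated (frozen, accum) and the indices actually communicated.
    """
    for i, g in enumerate(grads):
        if i in frozen:
            # Frozen parameters keep accumulating local gradients; once the
            # accumulated update is significant again, they are thawed.
            accum[i] = accum.get(i, 0.0) + g
            if abs(accum[i]) > thaw_thresh:
                frozen.discard(i)
                del accum[i]
        elif abs(g) < freeze_thresh:
            # Parameters with insignificant gradients are globally frozen and
            # excluded from upload during their frozen period.
            frozen.add(i)
            accum[i] = 0.0
    communicated = [i for i in range(len(grads)) if i not in frozen]
    return frozen, accum, communicated
```

Only the `communicated` indices would be sent to the server, which is where the communication savings come from.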
Slice admission control in 5G wireless communication with multi-dimensional state space and distributed action space: A sequential twin actor-critic approach
IF 4.4 · CAS Tier 2 · Computer Science
Computer Networks · Pub Date: 2024-10-29 · DOI: 10.1016/j.comnet.2024.110878
Network slicing represents a paradigm shift in how resources are allocated to different 5G network functions through network function virtualization. It enables logical resource allocation that can accommodate the anticipated surge in network resource requirements, harnessing automatic processing, scheduling, and orchestration for efficient management. To manage network resources under heavy demand, slice providers need to combine artificial intelligence with slice admission control strategies: while 5G resources can be allocated to maintain a slice, the logical allocation and the real-time state of the network must be continuously examined and adjusted if network resilience is to be maintained. The complex task of leveraging slice admission control to maintain 5G network resilience has not been fully investigated. To tackle this problem, we propose a machine learning approach that jointly optimizes slice admission control and resource allocation so as to maintain network resilience. Among machine learning methods, reinforcement learning (RL) is particularly well suited: by allocating resources based on real-time demand and network conditions, it supports the robust, autonomous decisions that effective admission control requires. We propose a new technique, the sequential twin actor-critic (STAC). Simulations show that STAC improves network resilience through enhanced admission probability and overall utility.
Citations: 0
Quantitative analysis of segmented satellite network architectures: A maritime surveillance case study
IF 4.4 · CAS Tier 2 · Computer Science
Computer Networks · Pub Date: 2024-10-28 · DOI: 10.1016/j.comnet.2024.110874
This paper presents an in-depth trade-off analysis of a Swarm Satellite Constellation (SSC) mission for Earth observation that leverages the Segmented Architecture (SA), a concept designed by the Argentinian space agency CONAE within the New Space philosophy. The architecture features a networked constellation of small, cooperative satellites intended to enhance mission flexibility, reliability, coverage, and cost-effectiveness. Despite its promise, SA poses challenges in the mission design and definition phases due to the complex interplay between distributed space systems, technological innovation, and geographical landscapes. Our study introduces a quantitative analysis framework integrated with Ansys' Systems Tool Kit (STK). The resulting software tool models the critical components, including the ground and space segments, orbital dynamics, coverage, onboard processing, and communication links. We focus on a hypothetical SARE mission to detect illicit maritime activity near Argentina's Exclusive Economic Zone (EEZ). This case study constitutes an archetypal mission that elucidates the architecture's benefits and complexities, addressing swarm coverage, contact dynamics, and data-handling strategies. The results contribute to discussions on practical trade-offs in current and future segmented satellite architectures with multiple mission objectives.
Citations: 0
Machine learning-driven integration of terrestrial and non-terrestrial networks for enhanced 6G connectivity
IF 4.4 · CAS Tier 2 · Computer Science
Computer Networks · Pub Date: 2024-10-28 · DOI: 10.1016/j.comnet.2024.110875
Non-terrestrial networks (NTNs) are essential for achieving the persistent-connectivity goal of sixth-generation (6G) networks, especially in areas lacking terrestrial infrastructure. However, integrating NTNs with terrestrial networks presents several challenges. The dynamic and complex nature of NTN communication scenarios makes traditional model-based approaches to resource allocation and parameter optimization computationally intensive and often impractical. Machine learning (ML)-based solutions are critical here because they can efficiently identify patterns in dynamic, multi-dimensional data, offering better performance at lower complexity. ML algorithms can be categorized by learning style (supervised, unsupervised, and reinforcement learning) and by architecture (centralized, decentralized, and distributed). Each approach has advantages and limitations in different contexts, making it crucial to select the most suitable ML strategy for each scenario in the integration of terrestrial and non-terrestrial networks (TNTNs). This paper reviews the TNTN integration architectures outlined by the 3rd Generation Partnership Project (3GPP), surveys existing ML-based work, and discusses suitable ML learning styles and architectures for various TNTN scenarios. It then examines the capabilities and challenges of different ML approaches through a case study of a specific scenario.
Citations: 0
Evaluating integration methods of a quantum random number generator in OpenSSL for TLS
IF 4.4 · CAS Tier 2 · Computer Science
Computer Networks · Pub Date: 2024-10-25 · DOI: 10.1016/j.comnet.2024.110877
The rapid advancement of quantum computing poses a significant threat to conventional cryptography. While post-quantum cryptography (PQC) is the prevailing approach to fortifying cryptographic systems, the coexistence of quantum and classical computing paradigms presents an opportunity to leverage the strengths of both technologies: in particular, Quantum Random Number Generators (QRNGs), considered True Random Number Generators (TRNGs), open up the possibility of hybrid systems. In this paper we evaluate both aspects. On one hand, we use a hybrid TLS (Transport Layer Security) protocol that builds on the Internet's most widely used secure protocol and integrates PQC algorithms; on the other, we evaluate two approaches to integrating a QRNG, the Quantis PCIe-240M, into OpenSSL 3.0 for use by TLS. Both approaches are compared on an Nginx web server that uses OpenSSL's implementation of TLS 1.3 for secure web communication. Our findings highlight the importance of optimizing the integration method: while direct integration can incur performance penalties specific to the method and hardware used, alternative methods demonstrate the potential for efficient QRNG deployment in cryptographic systems.
Citations: 0
Traffic evolution in Software Defined Networks
IF 4.4 · CAS Tier 2 · Computer Science
Computer Networks · Pub Date: 2024-10-24 · DOI: 10.1016/j.comnet.2024.110852
Software-Defined Networking (SDN) offers unprecedented traffic engineering possibilities thanks to optimal centralized decision making. However, network traffic evolves over time, changing the underlying optimization problem, and re-solving the model frequently to track traffic evolution causes floods of control messages, traffic re-routing, and synchronization problems. This paper addresses graceful traffic evolution in SDNs: minimizing rule installations and modifications while optimizing the global objectives of minimizing the Maximum Link Utilization (MLU) and the Maximum Switch Table Space Utilization (MSTU). The problem is formulated as a multi-objective Mixed Integer Linear Program (MILP), and a proof of NP-hardness is provided. We then reformulate it as a single-objective problem and propose two greedy algorithms to solve it, MIRA-Im and MIRA-Im with Conflict Detection, with experiments showing their effectiveness against previous state-of-the-art proposals. Simulation results show significant improvements for MIRA-Im with Conflict Detection, especially in the number of installed rules (gains of up to 80% at the highest number of flows) and flow table space utilization (gains of up to 55% at the highest number of flows), compared to MIRA-Im and other algorithms in the literature, while the other metrics remain essentially stable.
Citations: 0
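Of the two global objectives, MLU is the simpler to state: each link's load divided by its capacity, maximized over all links. As a point of reference, assuming each flow is already routed on a single path (an assumption for illustration; the paper's MILP decides the routing), the metric reduces to:

```python
def max_link_utilization(capacities, flows):
    """Maximum Link Utilization: per-link load over capacity, maximized
    over all links. A lower MLU leaves headroom for traffic evolution.

    capacities: dict link -> capacity
    flows:      iterable of (path, demand), path being a sequence of links
    """
    load = {link: 0.0 for link in capacities}
    for path, demand in flows:
        for link in path:
            load[link] += demand
    return max(load[link] / cap for link, cap in capacities.items())
```

The multi-objective formulation trades this quantity off against table-space usage (MSTU) and the number of rule changes.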
Concordit: A credit-based incentive mechanism for permissioned redactable blockchain
IF 4.4 · CAS Tier 2 · Computer Science
Computer Networks · Pub Date: 2024-10-23 · DOI: 10.1016/j.comnet.2024.110848
Malicious attacks and the introduction of illegal data put blockchains at risk, and blockchain governance is gaining increasing attention. Redactable blockchain technology has become a mainstream solution for blockchain governance. However, current redactable blockchains suffer from a low completion rate for redaction tasks, primarily due to the absence of an effective incentive mechanism for participants. This gap underscores the urgent need for robust incentive mechanisms in redactable blockchains: mechanisms that motivate and guide entities to participate and to perform the desired behaviors through awards and punishments. This paper proposes Concordit, the first deployable credit-based incentive mechanism for redactable blockchains. Its purpose is to encourage submitters to submit legal redaction requests, modifiers to perform legal redaction operations, and verifiers to behave consistently with the consensus algorithm. In the context of permissioned blockchains, Concordit uses a credit-value system for awards and punishments. We apply a game theory-based mechanism to analyze and model the behavior utilities of participants in the redactable blockchain, and we evaluate the credibility of nodes by combining their static initial credit values with dynamic behavior-related credit values. The system prioritizes high-credibility nodes as participants, thereby raising the completion rate of redaction tasks. Finally, an implementation and performance evaluation of the Concordit incentive mechanism demonstrate its effectiveness and practicality.
Citations: 0
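The credit bookkeeping the abstract describes can be sketched minimally: award legal behavior, punish illegal behavior, and prioritize high-credibility nodes as participants. The reward and penalty magnitudes and the credit bounds below are hypothetical; the paper derives incentives from its game-theoretic model.

```python
def update_credit(credit, behaved_legally, reward=1.0, penalty=2.0,
                  lo=0.0, hi=10.0):
    """Award or punish one observed behavior, clamping credit into [lo, hi]."""
    credit += reward if behaved_legally else -penalty
    return max(lo, min(hi, credit))

def select_participants(credits, k):
    """Prioritize the k most credible nodes as redaction participants."""
    return sorted(credits, key=credits.get, reverse=True)[:k]
```

A node's starting value would play the role of the static initial credit, with `update_credit` supplying the dynamic behavior-related part.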
Joint intelligent optimizing economic dispatch and electric vehicles charging in 5G vehicular networks
IF 4.4 · CAS Tier 2 · Computer Science
Computer Networks · Pub Date: 2024-10-23 · DOI: 10.1016/j.comnet.2024.110872
With the rapid development of 5G networks in recent years, the road traffic network composed of vehicles with different energy sources has grown increasingly complex, and the problems of environmental pollution and road congestion have become more serious. Electric vehicles are favored for their environmental and energy-saving characteristics. However, improper charging dispatch leaves excess energy at charging stations and disturbs both the power grid and road traffic, causing, for example, energy shortages and lower traffic throughput. Designing a charging strategy that maximizes users' charging satisfaction while consuming as much of the charging stations' energy as possible is therefore a challenge; such a strategy should also incorporate economic power dispatch to reduce generation costs and polluting gas emissions. Supported by 5G's high bandwidth and low latency, this paper designs an intelligent charging model that reflects charging satisfaction indirectly through time cost, energy consumption cost, charging cost, and the user's range anxiety, while consuming the remaining energy of the charging station as far as possible. To cope with the uncertainty of wind and photovoltaic generation, we propose a two-stage economic dispatch model that improves the accuracy of power dispatch and reduces generation costs and carbon emissions. Because the traffic environment and energy demand are highly variable, we employ deep reinforcement learning based on proximal policy optimization to realize both electric vehicle charging dispatch and charging station power dispatch. Numerical results show the efficiency of the proposed strategy for electric vehicle charging in terms of convergence speed.
Citations: 0
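The dispatch policies above are trained with proximal policy optimization (PPO), whose core is the clipped surrogate objective. A standalone sketch of that formula for a single sample (the clip range eps=0.2 is PPO's common default, not necessarily the paper's setting):

```python
def ppo_clip_objective(ratio, advantage, eps=0.2):
    """PPO's clipped surrogate for one (state, action) sample:
        L = min(ratio * A, clip(ratio, 1 - eps, 1 + eps) * A)
    ratio = pi_new(a|s) / pi_old(a|s); the advantage A estimates how much
    better the action was than the current policy's average behavior.
    Clipping removes the incentive to move the policy far from pi_old.
    """
    clipped = max(1.0 - eps, min(1.0 + eps, ratio))
    return min(ratio * advantage, clipped * advantage)
```

Taking the minimum makes the objective a pessimistic bound: large policy updates are never rewarded, which is what keeps PPO stable in the highly variable traffic and demand environment the paper targets.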
Topology sensing of FANET under missing data
IF 4.4 · CAS Tier 2 · Computer Science
Computer Networks · Pub Date: 2024-10-23 · DOI: 10.1016/j.comnet.2024.110856
The topological structure of a flying ad hoc network (FANET) is crucial for understanding, explaining, and predicting the behavior of unmanned aerial vehicle (UAV) swarms. Most studies of topology sensing assume perfect observations and complete datasets; in practice, the non-cooperatively received signal dataset commonly contains missing data, degrading the performance of existing algorithms. We investigate topology sensing of FANETs from external observations and propose a topology sensing method that tolerates missing data, introducing link-prediction methods to correct the inferred topology. First, we model the network's communication event sequences with multi-dimensional Hawkes processes. Then, because a binary decision threshold is difficult to determine and to adapt to the application scenario, we propose an extended multi-dimensional Hawkes model suited to FANETs and infer the topology by maximum likelihood estimation. Finally, to counter the low inference accuracy caused by missing data, we perform community detection on the observed network and combine the detected communities with the inference results to construct a mixed connection-probability matrix, based on which we correct the topology. The analysis shows that the proposed topology sensing method is robust against missing data, making it an effective solution to this problem.
Citations: 0
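The Hawkes machinery at the core of the method is compact: a node's event rate is a base rate plus exponentially decaying excitation from past events, and a large estimated excitation weight from node j to node i is evidence of a link j -> i. An illustrative conditional intensity (the decay rate, weights, and node names below are hypothetical, and the paper's extended model adds more structure):

```python
import math

def hawkes_intensity(t, mu_i, alpha_to_i, beta, history):
    """lambda_i(t) = mu_i + sum over past events (t_j, src) with t_j < t of
                     alpha_to_i[src] * exp(-beta * (t - t_j))

    mu_i:       base (spontaneous) transmission rate of node i
    alpha_to_i: dict source node -> excitation it exerts on node i
    beta:       exponential decay rate of the excitation
    history:    list of (t_j, source) past communication events
    """
    rate = mu_i
    for t_j, src in history:
        if t_j < t:
            rate += alpha_to_i.get(src, 0.0) * math.exp(-beta * (t - t_j))
    return rate
```

Maximum likelihood estimation fits `mu`, `alpha`, and `beta` to the observed event sequence; thresholding (or, here, the proposed extension that avoids a fixed threshold) then turns the alpha matrix into an inferred topology.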
Mathematical analysis of busy tone in full-duplex optical MAC for hidden node mitigation
IF 4.4 · CAS Tier 2 · Computer Science
Computer Networks · Pub Date: 2024-10-23 · DOI: 10.1016/j.comnet.2024.110870
This article provides a mathematical analysis of the carrier sense multiple access with collision avoidance (CSMA/CA) medium access control (MAC) protocol of IEEE 802.15.7 optical wireless communication (OWC). Prior works have analyzed OWC CSMA/CA with Markov models, but their results deviate from simulation. We address this by improving the Markov model calculations, reducing the throughput deviation to a mere 0.2% and nearly matching the simulation results. Furthermore, we address the hidden node problem of OWC networks. The literature solves this problem with various full-duplex communication methods, such as bi-directional data transmission and the busy tone signal; the latter is employed in our previous work on full-duplex optical MAC (FD-OMAC). These techniques increase the coverage area of the nodes by using an access point (AP) as a relay node. However, the AP's response is delayed by its processing time, causing unexpected network behavior, and the quantitative effect of this delay, which is critical for optimizing OWC networks, remains unexplored. We bridge this gap by extending the proposed Markov analysis to model CSMA/CA together with the aforementioned full-duplex techniques. This work equips readers with mathematical insights for future OWC MAC layer enhancements.
Citations: 0
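For readers unfamiliar with this family of Markov analyses, the classical backbone such models refine is a two-equation fixed point relating a node's per-slot transmission probability tau to the conditional collision probability p. The sketch below is that well-known baseline (Bianchi-style, with damping added for numerical convergence), not the paper's improved calculation for IEEE 802.15.7:

```python
def csma_ca_fixed_point(n, w, m, iters=5000, damping=0.5):
    """Solve the classical slotted CSMA/CA fixed point for n nodes,
    minimum contention window w, and m backoff stages:
        p   = 1 - (1 - tau)^(n-1)
        tau = 2(1 - 2p) / ((1 - 2p)(w + 1) + p * w * (1 - (2p)^m))
    Damped iteration keeps the oscillating update from overshooting.
    """
    tau = 0.1
    for _ in range(iters):
        p = 1.0 - (1.0 - tau) ** (n - 1)
        new_tau = (2.0 * (1.0 - 2.0 * p)) / (
            (1.0 - 2.0 * p) * (w + 1) + p * w * (1.0 - (2.0 * p) ** m))
        tau = (1.0 - damping) * tau + damping * new_tau
    return tau, p
```

From the converged (tau, p) pair, throughput follows from the per-slot success, collision, and idle probabilities; it is exactly these downstream calculations that the paper refines to close the gap with simulation.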