Proceedings of the International Conference on Neuromorphic Systems 2022: Latest Publications

Neuro-symbolic computing with spiking neural networks
Proceedings of the International Conference on Neuromorphic Systems 2022. Pub Date: 2022-07-27. DOI: 10.1145/3546790.3546824
Authors: D. Dold, J. Garrido, Victor Caceres Chian, Marcel Hildebrandt, T. Runkler
Abstract: Knowledge graphs are an expressive and widely used data structure due to their ability to integrate data from different domains in a sensible and machine-readable way. Thus, they can be used to model a variety of systems such as molecules and social networks. However, it remains an open question how symbolic reasoning could be realized in spiking systems and, therefore, how spiking neural networks could be applied to such graph data. Here, we extend previous work on spike-based graph algorithms by demonstrating how symbolic and multi-relational information can be encoded using spiking neurons, allowing reasoning over symbolic structures like knowledge graphs with spiking neural networks. The introduced framework is enabled by combining the graph embedding paradigm and the recent progress in training spiking neural networks using error backpropagation. The presented methods are applicable to a variety of spiking neuron models and can be trained end-to-end in combination with other differentiable network architectures, which we demonstrate by implementing a spiking relational graph neural network.
Citations: 2
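The abstract describes encoding multi-relational graph data as spike-based embeddings. As a rough illustration of the graph-embedding side of that idea, here is a minimal NumPy sketch of a TransE-style score over spike-time vectors; the dimension, the margin loss, and all names are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 16           # embedding dimension (number of neurons per entity)
T = 1.0          # coding window: each neuron fires once within [0, T]

# Spike-time embeddings for entities and per-dimension time shifts for relations.
entity_times = {e: rng.uniform(0, T, D) for e in ["alice", "bob", "paris"]}
relation_delays = {r: rng.uniform(-T, T, D) for r in ["knows", "lives_in"]}

def score(subj, rel, obj):
    """Plausibility of a triple: small distance between the subject's spike
    times shifted by the relation delay and the object's spike times."""
    diff = entity_times[subj] + relation_delays[rel] - entity_times[obj]
    return -np.linalg.norm(diff, ord=1)

def margin_loss(pos, neg, margin=0.5):
    """Margin ranking loss: push true triples above corrupted ones."""
    return max(0.0, margin - score(*pos) + score(*neg))

loss = margin_loss(("alice", "knows", "bob"), ("alice", "knows", "paris"))
print(f"toy margin loss: {loss:.3f}")
```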
Learning to Parameterize a Stochastic Process Using Neuromorphic Data Generation
Proceedings of the International Conference on Neuromorphic Systems 2022. Pub Date: 2022-07-27. DOI: 10.1145/3546790.3546797
Authors: William M. Severa, J. D. Smith, J. Aimone, R. Lehoucq
Abstract: Deep learning is consistently becoming more integrated into scientific computing workflows. These high-performance methods allow for data-driven discoveries enabling, among other tasks, classification, feature extraction, and regression. In this paper, we present a unique approach to solving an inverse problem (determining the initial parameters of a system from observed data) using not only deep learning-powered AI but also simulation data generated using neuromorphic, brain-inspired hardware. We find this approach to be both scalable and energy efficient, capable of leveraging future advancements in both AI algorithms and neuromorphic hardware. Many high-performing deep learning approaches require large amounts of training data, and while great progress is being made on new techniques, current methods suggest that data-heavy approaches are still best suited for maintaining the generalization critical to an inverse problem. However, that data comes at a cost, often in the form of expensive high-fidelity numerical simulations. Instead, we make use of recent advances in spiking neural networks and neural-inspired computing, using Intel's Loihi to compute hundreds of thousands of random walk trajectories. Statistics from these random walkers effectively simulate certain classes of physical processes. Moreover, the use of neuromorphic architectures allows these trajectories to be generated quickly and at drastically lower energy cost. The generated data can then be fed into a deep learning regression network modified to incorporate certain known physical properties. We find the resulting networks can determine the initial parameters and their uncertainties, and we explore various factors that impact their performance.
Citations: 0
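To make the pipeline concrete, the sketch below mimics it on a CPU: plain NumPy random walkers stand in for the Loihi-generated trajectories, a few summary statistics stand in for the simulated observables, and a least-squares fit stands in for the deep regression network. The one-parameter diffusion process and the chosen statistics are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

def walker_statistics(diffusion, n_walkers=5000, n_steps=100, dt=0.01):
    """Stand-in for the neuromorphic step: simulate unbiased random walkers
    (on Loihi these trajectories would come from stochastic spiking circuits)
    and summarize them with a few moments of the final positions."""
    steps = rng.normal(0.0, np.sqrt(2 * diffusion * dt), (n_walkers, n_steps))
    final = steps.sum(axis=1)
    return np.array([final.mean(), final.var(), np.abs(final).mean()])

# Build a small training set of (statistics -> diffusion coefficient) pairs
# and fit a linear regressor as a stand-in for the deep regression network.
diffusions = rng.uniform(0.1, 2.0, 200)
X = np.stack([walker_statistics(d) for d in diffusions])
w, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], diffusions, rcond=None)

test_stats = walker_statistics(0.7)
estimate = np.r_[test_stats, 1.0] @ w
print(f"true D = 0.7, estimated D = {estimate:.2f}")
```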
An Event-driven Recurrent Spiking Neural Network Architecture for Efficient Inference on FPGA
Proceedings of the International Conference on Neuromorphic Systems 2022. Pub Date: 2022-07-27. DOI: 10.1145/3546790.3546802
Authors: Anand Sankaran, Paul Detterer, Kalpana Kannan, Nikolaos S. Alachiotis, Federico Corradi
Abstract: Spiking Neural Network (SNN) architectures are promising candidates for executing machine intelligence at the edge while meeting strict energy and cost reduction constraints in several application areas. To this end, we propose a new digital architecture compatible with Recurrent Spiking Neural Networks (RSNNs) trained using the PyTorch framework and Back-Propagation-Through-Time (BPTT) for optimizing the weights and the neuron's parameters. Our architecture offers high software-to-hardware fidelity, providing high accuracy and a low number of spikes, and it targets efficient and low-cost implementations in Field Programmable Gate Arrays (FPGAs). We introduce a new time-discretization technique that uses request-acknowledge cycles between layers, allowing each layer's execution time to depend only on the number of spikes. As a result, we achieve between 1.7x and 30x lower resource utilization and between 11x and 61x fewer spikes per inference than previous SNN implementations in FPGAs that rely on on-chip memory to store spike-time information and weight values. We demonstrate our approach using two benchmarks: MNIST digit recognition and a realistic radar and image sensory fusion task for cropland classification. Our results demonstrate that we can exploit the trade-off between accuracy, latency, and resource utilization at design time. Moreover, the use of low-cost FPGA platforms enables the deployment of several applications while satisfying the strict constraints of edge machine learning devices.
Citations: 2
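A hedged sketch of the event-driven idea, in Python rather than RTL: within each discretized step, the membrane potentials are touched only once per incoming spike event, so the work per step scales with the spike count rather than the layer size. The neuron model, sizes, and update order below are illustrative and not the paper's hardware design.

```python
import numpy as np

rng = np.random.default_rng(2)
n_in, n_rec = 8, 4
w_in = rng.normal(0, 0.5, (n_in, n_rec))
w_rec = rng.normal(0, 0.3, (n_rec, n_rec))
v = np.zeros(n_rec)          # membrane potentials of the recurrent layer
decay, v_th = 0.9, 1.0

def step(input_spike_ids, rec_spike_ids):
    """One discretized time step: work is proportional to the number of
    incoming spike events, mirroring the request-acknowledge scheme."""
    global v
    for i in input_spike_ids:   # event-driven: only active inputs touch v
        v += w_in[i]
    for j in rec_spike_ids:     # recurrent spikes from the previous step
        v += w_rec[j]
    v *= decay                   # leak applied once per step
    out = np.where(v >= v_th)[0]
    v[out] = 0.0                 # reset neurons that fired
    return out

rec_spikes = np.array([], dtype=int)
for t in range(5):
    in_spikes = np.where(rng.random(n_in) < 0.2)[0]
    rec_spikes = step(in_spikes, rec_spikes)
    print(f"t={t}: recurrent spikes {rec_spikes.tolist()}")
```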
Reducing the Spike Rate in Deep Spiking Neural Networks
Proceedings of the International Conference on Neuromorphic Systems 2022. Pub Date: 2022-07-27. DOI: 10.1145/3546790.3546798
Authors: R. Fontanini, D. Esseni, M. Loghi
Abstract: One objective of Spiking Neural Networks is very efficient computation in terms of energy consumption. To achieve this target, a small spike rate is very beneficial given the event-driven nature of such computation. However, as the network becomes deeper, the spike rate tends to increase without any improvement in the final results. On the other hand, introducing a penalty on the excess of spikes can often lead the network to a configuration where many neurons are silent, resulting in a drop in computational efficacy. In this paper, we propose a learning strategy that keeps the spike rate under control by (i) changing the loss function to penalize the spikes a neuron generates after its first one, and (ii) using a two-phase training procedure that avoids silent neurons during training.
Citations: 0
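A minimal sketch of the two ideas in the abstract, with assumed names and an assumed penalty weight: the penalty counts only the spikes each neuron emits after its first one, and it is switched on only in the second training phase.

```python
import numpy as np

def excess_spike_penalty(spike_counts, lam=1e-3):
    """Penalize only the spikes a neuron emits after its first one, so a
    neuron is never pushed to be completely silent."""
    return lam * np.maximum(spike_counts - 1, 0).sum()

def total_loss(task_loss, spike_counts, phase):
    """Two-phase schedule: phase 1 optimizes the task alone (so an early
    penalty cannot silence neurons), phase 2 adds the spike-rate penalty."""
    penalty = excess_spike_penalty(spike_counts) if phase == 2 else 0.0
    return task_loss + penalty

counts = np.array([0, 1, 4, 7, 2])        # spikes emitted by each neuron
print(total_loss(0.35, counts, phase=1))  # 0.35
print(total_loss(0.35, counts, phase=2))  # 0.35 + 1e-3 * (3 + 6 + 1)
```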
Fine-tuning Deep Reinforcement Learning Policies with r-STDP for Domain Adaptation
Proceedings of the International Conference on Neuromorphic Systems 2022. Pub Date: 2022-07-27. DOI: 10.1145/3546790.3546804
Authors: Mahmoud Akl, Yulia Sandamirskaya, Deniz Ergene, Florian Walter, Alois Knoll
Abstract: Using deep reinforcement learning policies that are trained in simulation on real robotic platforms requires fine-tuning due to discrepancies between simulated and real environments. Multiple methods like domain randomization and system identification have been suggested to overcome this problem. However, sim-to-real transfer remains an open problem in robotics and deep reinforcement learning. In this paper, we present a spiking neural network (SNN) alternative for dealing with the sim-to-real problem. In particular, we train SNNs with backpropagation using surrogate gradients and the Deep Q-Network (DQN) algorithm to solve two classical control reinforcement learning tasks. The performance of the trained DQNs degrades when evaluated on randomized versions of the environments used during training. To compensate for the drop in performance, we apply the biologically plausible reward-modulated spike timing dependent plasticity (r-STDP) learning rule. Our results show that r-STDP can be successfully utilized to restore the network's ability to solve the task. Furthermore, since r-STDP can be directly implemented on neuromorphic hardware, we believe it provides a promising neuromorphic solution to the sim-to-real problem.
Citations: 1
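For readers unfamiliar with r-STDP, the toy sketch below shows the general mechanism: an STDP-style eligibility trace accumulates per synapse and is converted into a weight change only when a reward signal arrives. The constants and class layout are illustrative, not the authors' implementation.

```python
import numpy as np

class RSTDPSynapse:
    """Reward-modulated STDP: a standard STDP eligibility trace is kept per
    synapse and only turned into a weight change when a reward arrives."""
    def __init__(self, w, a_plus=0.01, a_minus=0.012, tau=20.0):
        self.w, self.a_plus, self.a_minus, self.tau = w, a_plus, a_minus, tau
        self.pre_trace = self.post_trace = self.eligibility = 0.0

    def step(self, pre_spike, post_spike, dt=1.0):
        self.pre_trace += -self.pre_trace / self.tau * dt + pre_spike
        self.post_trace += -self.post_trace / self.tau * dt + post_spike
        # Pre-before-post potentiates, post-before-pre depresses the trace.
        self.eligibility += (self.a_plus * self.pre_trace * post_spike
                             - self.a_minus * self.post_trace * pre_spike)

    def apply_reward(self, reward, lr=0.1):
        self.w += lr * reward * self.eligibility
        self.eligibility = 0.0

syn = RSTDPSynapse(w=0.5)
for t in range(50):  # repeated pre-then-post pairings build up eligibility
    syn.step(pre_spike=float(t % 5 == 0), post_spike=float(t % 5 == 1))
syn.apply_reward(reward=+1.0)
print(f"weight after rewarded pre->post pairing: {syn.w:.3f}")
```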
A Neuromorphic Algorithm for Radiation Anomaly Detection
Proceedings of the International Conference on Neuromorphic Systems 2022. Pub Date: 2022-07-27. DOI: 10.1145/3546790.3546815
Authors: James M. Ghawaly, Aaron R. Young, Daniel E. Archer, Nick Prins, Brett Witherspoon, Catherine D. Schuman
Abstract: In this work, we present initial results on the development of a neuromorphic spiking neural network for performing gamma-ray radiation anomaly detection, the first known application of neuromorphic computing to the radiation detection domain. Neuromorphic computing seeks to enable future autonomous systems to obtain machine learning-level performance without the typically high power consumption. The detection of anomalous radioactive sources in an urban environment is challenging, largely due to the highly dynamic nature of background radiation. For this evaluation, the spiking neural network is trained and evaluated on the Urban Source Search challenge dataset, a synthetic dataset whose development was funded by the United States Department of Energy. The network's weights and architecture are trained using an evolutionary optimization approach. A preliminary performance evaluation of the spiking neural network indicates significant improvements in source detection sensitivity when compared to an established gross count rate-based algorithm, while meeting ANSI standards for false alarm rate. The SNN achieved half the sensitivity of a different, more complex spectral analysis algorithm from the literature, leaving room for future research and development.
Citations: 3
Evaluating Encoding and Decoding Approaches for Spiking Neuromorphic Systems
Proceedings of the International Conference on Neuromorphic Systems 2022. Pub Date: 2022-07-27. DOI: 10.1145/3546790.3546792
Authors: Catherine D. Schuman, Charles Rizzo, John McDonald-Carmack, Nicholas D. Skuda, J. Plank
Abstract: A challenge associated with effectively using spiking neuromorphic systems is how to communicate data to and from the neuromorphic implementation. Unless a neuromorphic or event-based sensing system is used, data has to be converted into spikes to be processed as input by the neuromorphic system. The output spikes produced by the neuromorphic system have to be turned back into a value or decision. There are a variety of commonly used input encoding approaches, such as rate coding, temporal coding, and population coding, as well as several commonly used output approaches, such as voting or first-to-spike. However, it is not clear which is the most appropriate approach to use or whether the choice of encoding or decoding approach has a significant impact on performance. In this work, we evaluate the performance of several encoding and decoding approaches on classification, regression, and control tasks. We show that the choice of encoding and decoding approaches significantly impacts performance on these tasks, and we make recommendations on how to select the appropriate encoding and decoding approaches for real-world applications.
Citations: 8
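The encoding and decoding schemes compared in the paper are easy to state in code. The sketch below shows one possible rate encoder, one latency (temporal) encoder, and a first-to-spike decoder; the window length and the convention that larger values spike earlier are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
T = 100  # time steps in the coding window

def rate_encode(x):
    """Rate coding: a value in [0, 1] becomes a Bernoulli spike train with that rate."""
    return (rng.random(T) < x).astype(int)

def latency_encode(x):
    """Temporal (latency) coding: larger values spike earlier, exactly once."""
    train = np.zeros(T, dtype=int)
    train[int(round((1.0 - x) * (T - 1)))] = 1
    return train

def first_to_spike_decode(output_trains):
    """Decoding: the class whose output neuron fires first wins."""
    first_times = [np.argmax(tr) if tr.any() else T for tr in output_trains]
    return int(np.argmin(first_times))

x = 0.8
print("rate-coded spike count:", rate_encode(x).sum())            # roughly 80 of 100 bins
print("latency-coded spike time:", np.argmax(latency_encode(x)))  # an early bin
print("winner:", first_to_spike_decode([latency_encode(0.3), latency_encode(0.9)]))
```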
Think Fast: Time Control in Varying Paradigms of Spiking Neural Networks
Proceedings of the International Conference on Neuromorphic Systems 2022. Pub Date: 2022-07-27. DOI: 10.1145/3546790.3546814
Authors: Steven C. Nesbit, Andrew O'Brien, Jocelyn Rego, Gavin Parpart, Carlos Gonzalez, Garrett T. Kenyon, Edward Kim, T. Stewart, Y. Watkins
Abstract: The state-of-the-art in machine learning has been achieved primarily by deep learning artificial neural networks. These networks are powerful but biologically implausible and energy intensive. In parallel, a new paradigm of neural network is being researched that can alleviate some of the computational and energy issues. These networks, spiking neural networks (SNNs), have transformative potential if the community is able to bridge the gap between deep learning and SNNs. However, SNNs are notoriously difficult to train and lack precision in their communication. In an effort to overcome these limitations and retain the benefits of the learning process in deep learning, we investigate novel ways to translate between them. We construct several network designs with varying degrees of biological plausibility. We then test our designs on an image classification task and demonstrate that our designs allow for a customized tradeoff between biological plausibility, power efficiency, inference time, and accuracy.
Citations: 0
Sparse Vector Binding on Spiking Neuromorphic Hardware Using Synaptic Delays
Proceedings of the International Conference on Neuromorphic Systems 2022. Pub Date: 2022-07-27. DOI: 10.1145/3546790.3546820
Authors: Alpha Renner, Yulia Sandamirskaya, F. Sommer, E. P. Frady
Abstract: Vector Symbolic Architectures (VSA) were first proposed as connectionist models for symbolic reasoning, leveraging parallel and in-memory computing in brains and neuromorphic hardware that enables low-power, low-latency applications. Symbols are defined in VSAs as points/vectors in a high-dimensional neural state-space. For spiking neuromorphic hardware (and brains), sparse representations are of particular interest, as they minimize the number of costly spikes. Furthermore, sparse representations can be efficiently stored in simple Hebbian auto-associative memories, which provide error correction in VSAs. However, the binding of spatially sparse representations is computationally expensive because it is not local to corresponding pairs of neurons as in VSAs with dense vectors. Here, we present the first implementation of a sparse VSA on spiking neuromorphic hardware, specifically Intel's neuromorphic research chip Loihi. To reduce the cost of binding, a delay line and coincidence detection are used, trading space for time. We show as proof of principle that our network on Loihi can perform the binding operation of a classical analogical reasoning task, and we discuss the cost of different sparse binding operations. The proposed binding mechanism can be used as a building block for VSA-based architectures on neuromorphic hardware.
Citations: 8
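At the level of indices, binding sparse block codes reduces to modular addition within each block, which is what a delay line plus coincidence detection can realize in spike timing. A toy sketch under that assumption (block count, block size, and all names are illustrative, not the Loihi circuit):

```python
import numpy as np

B, L = 4, 16   # B blocks of L neurons each; one active neuron per block

def random_symbol(rng):
    """A sparse block code: exactly one active index per block."""
    return rng.integers(0, L, B)

def bind(a, b):
    """Binding by index addition modulo the block size.  In spike timing this
    corresponds to delaying one factor by the other's index on a delay line
    and letting a coincidence detector fire at index (a + b) mod L."""
    return (a + b) % L

def unbind(c, a):
    """Unbinding with the inverse delay recovers the other factor."""
    return (c - a) % L

rng = np.random.default_rng(4)
role, filler = random_symbol(rng), random_symbol(rng)
bound = bind(role, filler)
assert np.array_equal(unbind(bound, role), filler)
print("bound indices per block:", bound)
```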
Optimizing Recurrent Spiking Neural Networks with Small Time Constants for Temporal Tasks
Proceedings of the International Conference on Neuromorphic Systems 2022. Pub Date: 2022-07-27. DOI: 10.1145/3546790.3546796
Authors: Yuan Zeng, Edward Jeffs, T. Stewart, Y. Berdichevsky, Xiaochen Guo
Abstract: Recurrent spiking neural networks (RSNNs) are a frequently studied model for understanding biological neural networks as well as for developing energy-efficient neuromorphic systems. Deep learning optimization approaches, such as backpropagation through time (BPTT) equipped with surrogate gradients, can be used to efficiently optimize RSNNs. Including dynamic properties of biological neurons in the neuron model may improve a network's temporal learning capability. Earlier work considers only spike frequency adaptation, with a large adaptation time constant that may be unsuitable for neuromorphic implementation. Besides adaptation, synapses are an important structure for information transfer between neurons, and their dynamics may influence network performance. In this work, a Leaky Integrate and Fire neuron model with dynamic synapses and spike frequency adaptation is used for temporal tasks. A step-by-step experiment is designed to understand the impact of recurrent connections, the synapse model, and the adaptation model on network accuracy. For each step, a hyper-parameter tuning tool is used to find the best set of neuron parameters. In addition, the influence of the synapse and adaptation time constants is studied. Results suggest that dynamic synapses are more efficient than adaptation in improving the network's learning capability. When adaptation and the synapse model are incorporated together, the network can achieve accuracy similar to state-of-the-art RSNN works while requiring fewer neurons and smaller time constants.
Citations: 0
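A minimal sketch of the kind of neuron model the study varies: a leaky integrate-and-fire unit with an exponential synaptic current and a spike-frequency-adaptation variable that raises the effective threshold. All time constants and the adaptation strength below are illustrative placeholders, not the tuned values from the paper.

```python
import numpy as np

def run_lif(input_spikes, w=1.2, dt=1.0,
            tau_mem=20.0, tau_syn=5.0, tau_adapt=50.0,
            v_th=1.0, beta=0.2):
    """LIF neuron with a dynamic (exponential) synapse and spike-frequency
    adaptation: each output spike raises an adaptive threshold component
    that decays with its own, relatively small, time constant."""
    v = syn = adapt = 0.0
    out = []
    for s in input_spikes:
        syn += -syn / tau_syn * dt + w * s   # synaptic current dynamics
        v += (-v / tau_mem + syn) * dt       # leaky membrane integration
        spike = v >= v_th + beta * adapt     # adaptive effective threshold
        if spike:
            v = 0.0
            adapt += 1.0
        adapt -= adapt / tau_adapt * dt      # adaptation variable decays
        out.append(int(spike))
    return np.array(out)

spikes_in = (np.random.default_rng(5).random(200) < 0.3).astype(float)
print("output spike count:", run_lif(spikes_in).sum())
```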