Evolutionary Computation: Latest Publications

BUSTLE: A Versatile Tool for the Evolutionary Learning of STL Specifications from Data
IF 4.6 | CAS Q2 | Computer Science
Evolutionary Computation | Pub Date: 2025-03-15 | DOI: 10.1162/evco_a_00347
Federico Pigozzi, Laura Nenzi, Eric Medvet
Abstract: Describing the properties of complex systems that evolve over time is a crucial requirement for monitoring and understanding them. Signal Temporal Logic (STL) is a framework that has proved effective for this aim because it is expressive and allows stating properties as human-readable formulae. Crafting STL formulae that fit a particular system is, however, a difficult task. For this reason, a few approaches have recently been proposed for the automatic learning of STL formulae from observations of the system. In this paper, we propose BUSTLE (Bi-level Universal STL Evolver), an approach based on evolutionary computation for learning STL formulae from data. BUSTLE advances the state of the art because it (i) applies to a broader class of problems, in terms of what is known about the state of the system during its observation, and (ii) generates both the structure and the parameter values of the formulae, employing a bi-level search mechanism (global for the structure, local for the parameters). We consider two cases where (a) observations of the system in both anomalous and regular states are available, or (b) only observations of the regular state are available. We experimentally evaluate BUSTLE on problem instances corresponding to the two cases and compare it against previous approaches. We show that the evolved STL formulae are effective and human-readable: the versatility of BUSTLE does not come at the cost of lower effectiveness.
Evolutionary Computation, vol. 33, no. 1, pp. 91-114.
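As an illustration of the bi-level idea described in the abstract (a global search over formula structure and a local search over its numeric parameters), the following is a minimal, hypothetical Python sketch. The tiny STL fragment (only "globally x < c" and "eventually x < c" atoms), the grid-based inner search, and the classification-style scoring are simplifications for illustration only, not BUSTLE's actual representation or operators.

```python
import random

# Quantitative robustness of two simple STL atoms over a sampled signal x[0..T]:
#   G(x < c): min over time of (c - x[t])   ("always below c")
#   F(x < c): max over time of (c - x[t])   ("eventually below c")
ATOMS = {
    "G": lambda x, c: min(c - v for v in x),
    "F": lambda x, c: max(c - v for v in x),
}

def inner_fit_threshold(op, regular, anomalous, grid):
    """Local (lower-level) search: choose the threshold c that best separates
    regular traces (robustness > 0) from anomalous ones (robustness <= 0)."""
    def score(c):
        ok = sum(ATOMS[op](x, c) > 0 for x in regular)
        ko = sum(ATOMS[op](x, c) <= 0 for x in anomalous)
        return ok + ko
    best_c = max(grid, key=score)
    return best_c, score(best_c)

def outer_search(regular, anomalous, grid, iters=50, seed=0):
    """Global (upper-level) search: here only a random choice of the atom type;
    BUSTLE evolves full formula trees instead."""
    rng = random.Random(seed)
    best = None
    for _ in range(iters):
        op = rng.choice(list(ATOMS))
        c, s = inner_fit_threshold(op, regular, anomalous, grid)
        if best is None or s > best[2]:
            best = (op, c, s)
    return best

if __name__ == "__main__":
    regular = [[0.1, 0.2, 0.3], [0.2, 0.1, 0.4]]     # traces satisfying the property
    anomalous = [[0.9, 1.1, 0.8], [1.2, 0.7, 0.9]]   # traces violating it
    grid = [i / 10 for i in range(-10, 21)]
    op, c, s = outer_search(regular, anomalous, grid)
    print(f"best formula: {op}(x < {c:.1f}), separates {s} of {len(regular) + len(anomalous)} traces")
```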
Citations: 0
OneMax Is Not the Easiest Function for Fitness Improvements
IF 4.6 | CAS Q2 | Computer Science
Evolutionary Computation | Pub Date: 2025-03-15 | DOI: 10.1162/evco_a_00348
Marc Kaufmann, Maxime Larcher, Johannes Lengler, Xun Zou
Abstract: We study the (1:s+1) success rule for controlling the population size of the (1,λ)-EA. It was shown by Hevia Fajardo and Sudholt that this parameter control mechanism can run into problems for large s if the fitness landscape is too easy. They conjectured that this problem is worst for the OneMax benchmark, since in some well-established sense OneMax is known to be the easiest fitness landscape. In this paper, we disprove this conjecture. We show that there exist s and ɛ such that the self-adjusting (1,λ)-EA with the (1:s+1)-rule optimizes OneMax efficiently when started with ɛn zero-bits, but does not find the optimum in polynomial time on Dynamic BinVal. Hence, we show that there are landscapes where the problem of the (1:s+1)-rule for controlling the population size of the (1,λ)-EA is more severe than for OneMax. The key insight is that, while OneMax is the easiest function for decreasing the distance to the optimum, it is not the easiest fitness landscape with respect to finding fitness-improving steps.
Evolutionary Computation, vol. 33, no. 1, pp. 27-54.
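For readers unfamiliar with the mechanism under study, the sketch below is a minimal Python implementation of a self-adjusting (1,λ)-EA with the (1:s+1) success rule on OneMax: after a generation that improves the parent's fitness, λ is divided by F; otherwise it is multiplied by F^(1/s). The concrete values of F and s, and details such as rounding λ, are illustrative assumptions.

```python
import random

def onemax(x):
    """OneMax: number of one-bits in the string."""
    return sum(x)

def mutate(x, rng):
    """Standard bit mutation: flip each bit independently with probability 1/n."""
    n = len(x)
    return [b ^ (rng.random() < 1.0 / n) for b in x]

def self_adjusting_one_comma_lambda_ea(n=100, s=1.0, F=1.5, max_evals=200_000, seed=0):
    """(1,λ)-EA whose offspring population size λ is controlled by the
    (1:s+1) success rule: on a fitness improvement λ is divided by F,
    otherwise λ is multiplied by F**(1/s)."""
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(n)]
    lam = 1.0
    evals = 0
    while onemax(parent) < n and evals < max_evals:
        offspring = [mutate(parent, rng) for _ in range(max(1, round(lam)))]
        evals += len(offspring)
        best = max(offspring, key=onemax)
        if onemax(best) > onemax(parent):
            lam = max(1.0, lam / F)          # success: shrink λ
        else:
            lam = lam * F ** (1.0 / s)       # failure: grow λ
        parent = best                        # comma selection: the parent is always replaced
    return onemax(parent), evals

if __name__ == "__main__":
    fitness, evals = self_adjusting_one_comma_lambda_ea()
    print(f"final fitness {fitness}, evaluations used {evals}")
```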
Citations: 0
Synthesising Diverse and Discriminatory Sets of Instances Using Novelty Search in Combinatorial Domains
IF 4.6 | CAS Q2 | Computer Science
Evolutionary Computation | Pub Date: 2025-03-15 | DOI: 10.1162/evco_a_00350
Alejandro Marrero, Eduardo Segredo, Coromoto León, Emma Hart
Abstract: Gathering sufficient instance data to either train algorithm-selection models or understand algorithm footprints within an instance space can be challenging. We propose an approach to generating synthetic instances that are tailored to perform well with respect to a target algorithm belonging to a predefined portfolio but are also diverse with respect to their features. Our approach uses a novelty search algorithm with a linearly weighted fitness function that balances novelty and performance to generate a large set of diverse and discriminatory instances in a single run of the algorithm. We consider two definitions of novelty: (1) with respect to discriminatory performance within a portfolio of solvers; (2) with respect to the features of the evolved instances. We evaluate the proposed method with respect to its ability to generate diverse and discriminatory instances in two domains (knapsack and bin-packing), comparing it to another well-known quality diversity method, Multi-dimensional Archive of Phenotypic Elites (MAP-Elites), and to an evolutionary algorithm that only evolves for discriminatory behaviour. The results demonstrate that the novelty search method outperforms its competitors in terms of coverage of the space and its ability to generate instances that are diverse regarding the relative size of the "performance gap" between the target solver and the remaining solvers in the portfolio. Moreover, for the knapsack domain, we also show that we are able to generate novel instances in regions of an instance space not covered by existing benchmarks using a portfolio of state-of-the-art solvers. Finally, we demonstrate that the method is robust to different portfolios of solvers (stochastic approaches, deterministic heuristics, and state-of-the-art methods), thereby providing further evidence of its generality.
Evolutionary Computation, vol. 33, no. 1, pp. 55-90.
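A minimal sketch of the kind of linearly weighted score such a novelty-search generator can use is given below. The feature space, the k-nearest-neighbour novelty measure, the runtime-gap performance measure, and the weight alpha are illustrative assumptions rather than the exact formulation used in the paper.

```python
import math

def novelty(features, archive, k=3):
    """Novelty of an instance: mean Euclidean distance to its k nearest
    neighbours among the feature vectors already stored in the archive."""
    if not archive:
        return float("inf")
    dists = sorted(math.dist(features, other) for other in archive)
    nearest = dists[:k]
    return sum(nearest) / len(nearest)

def performance_gap(target_runtime, other_runtimes):
    """Discriminatory performance: how much faster the target solver is than
    the best competitor on this instance (larger is better, minimisation)."""
    return min(other_runtimes) - target_runtime

def weighted_fitness(features, archive, target_runtime, other_runtimes, alpha=0.5):
    """Linearly weighted fitness balancing novelty and discriminatory performance."""
    return alpha * novelty(features, archive) + (1 - alpha) * performance_gap(
        target_runtime, other_runtimes
    )

if __name__ == "__main__":
    archive = [[0.2, 0.4], [0.8, 0.1]]
    score = weighted_fitness([0.5, 0.5], archive, target_runtime=3.0,
                             other_runtimes=[4.5, 6.0], alpha=0.5)
    print(f"weighted fitness: {score:.3f}")
```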
Citations: 0
Drift Analysis with Fitness Levels for Elitist Evolutionary Algorithms
IF 4.6 | CAS Q2 | Computer Science
Evolutionary Computation | Pub Date: 2025-03-15 | DOI: 10.1162/evco_a_00349
Jun He, Yuren Zhou
Abstract: The fitness level method is a popular tool for analyzing the hitting time of elitist evolutionary algorithms. Its idea is to divide the search space into multiple fitness levels and estimate lower and upper bounds on the hitting time using transition probabilities between fitness levels. However, the lower bound generated by this method is often loose. An open question regarding the fitness level method is what the tightest lower and upper time bounds are that can be constructed from transition probabilities between fitness levels. To answer this question, we combine drift analysis with fitness levels and define the tightest bound problem as a constrained multiobjective optimization problem subject to fitness levels. The tightest metric bounds by fitness levels are constructed and proven for the first time. Then linear bounds are derived from the metric bounds, and a framework is established that can be used to develop different fitness level methods for different types of linear bounds. The framework is generic and promising, as it can be used to draw tight time bounds on fitness landscapes both with and without shortcuts. This is demonstrated on the example of the (1+1) EA maximizing the TwoMax1 function.
Evolutionary Computation, vol. 33, no. 1, pp. 1-25.
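As background for the bounds discussed above, the classical fitness level argument for the upper bound can be stated as follows; this is the standard textbook form, not the tightest bounds derived in the paper.

```latex
% Classical fitness level method (upper bound), stated for an elitist EA.
% Partition the search space into levels A_1 <_f A_2 <_f ... <_f A_m,
% where A_m contains only optimal solutions. If, from any point in level A_i,
% the probability of reaching a higher level in one step is at least p_i, then
% the expected hitting time of the optimum satisfies
\[
  \mathbb{E}[T] \;\le\; \sum_{i=1}^{m-1} \frac{1}{p_i}.
\]
% A matching lower-bound construction additionally needs the probabilities of
% jumping several levels at once, which is what the transition-probability
% framework in the paper makes precise.
```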
Citations: 0
A Layered Learning Approach to Scaling in Learning Classifier Systems for Boolean Problems
IF 4.6 | CAS Q2 | Computer Science
Evolutionary Computation | Pub Date: 2025-03-15 | DOI: 10.1162/evco_a_00351
Isidro M. Alvarez, Trung B. Nguyen, Will N. Browne, Mengjie Zhang
Abstract: Evolutionary Computation (EC) often throws away learned knowledge as it is reset for each new problem addressed. Conversely, humans can learn from small-scale problems, retain this knowledge (plus functionality), and then successfully reuse it in larger-scale and/or related problems. Linking solutions to problems has been achieved through layered learning, where an experimenter sets a series of simpler related problems to solve a more complex task. Recent work on Learning Classifier Systems (LCSs) has shown that knowledge reuse through the adoption of Code Fragments, GP-like tree-based programs, is plausible. However, random reuse is inefficient. Thus, the research question is how an LCS can adopt a layered-learning framework such that increasingly complex problems can be solved efficiently. An LCS (named XCSCF*) has been developed to include the base axioms necessary for learning, refined methods for transfer learning, and learning recast as a decomposition into a series of subordinate problems. These subordinate problems can be set as a curriculum by a teacher, but this does not mean that an agent can learn from it, especially if it only extracts over-fitted knowledge of each problem rather than the underlying scalable patterns and functions. Results show that, from a conventional tabula rasa with only a vague notion of which subordinate problems might be relevant, XCSCF* captures the general logic behind the tested domains and can therefore solve any n-bit Multiplexer, n-bit Carry-one, n-bit Majority-on, and n-bit Even-parity problems. This work demonstrates a step towards continual learning, as learned knowledge is effectively reused in subsequent problems.
Evolutionary Computation, vol. 33, no. 1, pp. 115-140.
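As a concrete reference for one of the benchmark families mentioned above, the sketch below implements the n-bit multiplexer: the first k bits form an address selecting which of the remaining 2^k data bits is returned. This is only the benchmark function, not the XCSCF* learner itself.

```python
def multiplexer(bits):
    """n-bit multiplexer, n = k + 2**k: the first k bits are an address
    selecting one of the 2**k data bits, whose value is the output."""
    n = len(bits)
    k = 0
    while k + 2 ** k < n:
        k += 1
    if k + 2 ** k != n:
        raise ValueError(f"invalid length {n}: expected k + 2**k bits")
    address = int("".join(map(str, bits[:k])), 2)
    return bits[k + address]

if __name__ == "__main__":
    # 6-bit multiplexer (k = 2): address '10' selects data bit 2 (0-indexed).
    example = [1, 0, 0, 1, 1, 0]   # address bits = [1, 0], data bits = [0, 1, 1, 0]
    print(multiplexer(example))    # -> 1
```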
Citations: 0
Solving Many-objective Optimization Problems based on PF Shape Classification and Vector Angle Selection
IF 4.6 | CAS Q2 | Computer Science
Evolutionary Computation | Pub Date: 2025-03-10 | DOI: 10.1162/evco_a_00373
Y T Wu, F Z Ge, D B Chen, L Shi
Abstract: Most many-objective optimization algorithms (MaOEAs) adopt a pre-assumed Pareto front (PF) shape, instead of the true PF shape, to balance convergence and diversity in high-dimensional objective space, resulting in insufficient selection pressure and poor performance. To address these shortcomings, we propose MaOEA-PV, based on PF shape classification and vector angle selection. The three innovations of this paper are as follows: (I) a new method for PF classification; (II) a new fitness function that combines convergence and diversity indicators, thereby enhancing the quality of parents during mating selection; and (III) the selection of individuals exhibiting the best convergence to add to the population, overcoming the lack of selection pressure during environmental selection. Subsequently, the max-min vector angle strategy is employed: the solutions with the highest diversity and the least convergence are selected based on the max and min vector angles, respectively, which balances convergence and diversity. The performance of the algorithm is compared with that of five state-of-the-art MaOEAs on 41 test problems and 5 real-world problems comprising as many as 15 objectives. The experimental results demonstrate the competitive and effective nature of the proposed algorithm.
Evolutionary Computation, pp. 1-42.
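To make the vector-angle terminology concrete, the sketch below computes angles between objective vectors and performs a greedy max-min angle selection for diversity. It is a simplified illustration assuming minimisation, not the full MaOEA-PV environmental selection.

```python
import math

def angle(u, v):
    """Angle (radians) between two objective vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    cos = max(-1.0, min(1.0, dot / (nu * nv)))
    return math.acos(cos)

def max_min_angle_selection(objs, k):
    """Greedy diversity selection: repeatedly add the solution whose smallest
    angle to the already selected ones is largest (max-min vector angle)."""
    selected = [0]                      # seed with an arbitrary first solution
    while len(selected) < k:
        candidates = [i for i in range(len(objs)) if i not in selected]
        best = max(candidates,
                   key=lambda i: min(angle(objs[i], objs[j]) for j in selected))
        selected.append(best)
    return selected

if __name__ == "__main__":
    objs = [[1.0, 0.1], [0.9, 0.2], [0.1, 1.0], [0.5, 0.5]]
    print(max_min_angle_selection(objs, 3))   # spreads the picks across the front
```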
Citations: 0
MO-SMAC: Multi-objective Sequential Model-based Algorithm Configuration
IF 4.6 | CAS Q2 | Computer Science
Evolutionary Computation | Pub Date: 2025-03-10 | DOI: 10.1162/evco_a_00371
Jeroen G Rook, Carolin Benjamins, Jakob Bossek, Heike Trautmann, Holger H Hoos, Marius Lindauer
Abstract: Automated algorithm configuration aims at finding well-performing parameter configurations for a given problem, and it has proven to be effective within many AI domains, including evolutionary computation. Initially, the focus was on excelling in one performance objective, but, in reality, most tasks have a variety of (conflicting) objectives. The surging demand for trustworthy and resource-efficient AI systems makes this multi-objective perspective even more prevalent. We propose a new general-purpose multi-objective automated algorithm configurator by extending the widely used SMAC framework. Instead of finding a single configuration, we search for a non-dominated set that approximates the actual Pareto set. We propose a pure multi-objective Bayesian optimisation approach for obtaining promising configurations, using the predicted hypervolume improvement as the acquisition function. We also present a novel intensification procedure to efficiently handle the selection of configurations in a multi-objective context. Our approach is empirically validated and compared across various configuration scenarios in four AI domains, demonstrating superiority over baseline methods, competitiveness with MO-ParamILS on individual scenarios, and the best overall performance.
Evolutionary Computation, pp. 1-25.
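To illustrate the acquisition idea mentioned above, the sketch below computes the exact hypervolume of a two-objective front (minimisation, with a reference point) and the hypervolume improvement of adding a candidate configuration. MO-SMAC uses a model-predicted version of this quantity; the surrogate model itself is not shown.

```python
def hypervolume_2d(front, ref):
    """Hypervolume dominated by a set of 2-objective points (minimisation)
    with respect to the reference point `ref`."""
    # Keep only non-dominated points, sorted by the first objective.
    pts = sorted(front)
    nd = []
    best_f2 = float("inf")
    for f1, f2 in pts:
        if f2 < best_f2:
            nd.append((f1, f2))
            best_f2 = f2
    # Sum the horizontal slabs between consecutive f2 values.
    hv = 0.0
    prev_f2 = ref[1]
    for f1, f2 in nd:
        hv += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return hv

def hypervolume_improvement(front, candidate, ref):
    """Gain in dominated hypervolume if `candidate` were added to the front."""
    return hypervolume_2d(front + [candidate], ref) - hypervolume_2d(front, ref)

if __name__ == "__main__":
    front = [(0.2, 0.8), (0.5, 0.4), (0.9, 0.1)]
    ref = (1.0, 1.0)
    print(hypervolume_improvement(front, (0.3, 0.3), ref))   # -> 0.14
```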
Citations: 0
Beyond Landscape Analysis: DynamoRep Features For Capturing Algorithm-Problem Interaction In Single-Objective Continuous Optimization
IF 4.6 | CAS Q2 | Computer Science
Evolutionary Computation | Pub Date: 2025-03-07 | DOI: 10.1162/evco_a_00370
Gjorgjina Cenikj, Gašper Petelin, Carola Doerr, Peter Korošec, Tome Eftimov
Abstract: The representation of optimization problems and algorithms in terms of numerical features is a well-established tool for comparing optimization problem instances, for analyzing the behavior of optimization algorithms, and the quality of existing problem benchmarks, as well as for automated per-instance algorithm selection and configuration approaches. Extending purely problem-centered feature collections, our recently proposed DynamoRep features provide a simple and inexpensive representation of the algorithm-problem interaction during the optimization process. In this paper, we conduct a comprehensive analysis of the predictive power of the DynamoRep features for the problem classification, algorithm selection, and algorithm classification tasks. In particular, the features are evaluated for the classification of problem instances into problem classes from the BBOB (Black Box Optimization Benchmarking) suite, selecting the best algorithm to solve a given problem from a portfolio of three algorithms (Differential Evolution, Evolutionary Strategy, and Particle Swarm Optimization), as well as distinguishing these algorithms based on their trajectories. We show that, despite being much cheaper to compute, they can yield results comparable to those using state-of-the-art Exploratory Landscape Analysis features.
Evolutionary Computation, pp. 1-28.
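A rough sketch of the kind of trajectory-based representation described above: per-generation descriptive statistics (min, max, mean, standard deviation) of the candidate solutions and of their fitness values, concatenated over the first generations of a run. The exact statistics and the number of generations used here are assumptions for illustration, not necessarily the published DynamoRep definition.

```python
import numpy as np

def trajectory_features(populations, fitnesses):
    """Concatenate simple descriptive statistics of each generation's
    population (per decision variable) and its fitness values.

    populations: list of (pop_size, dim) arrays, one per generation
    fitnesses:   list of (pop_size,) arrays, one per generation
    """
    stats = (np.min, np.max, np.mean, np.std)
    feats = []
    for pop, fit in zip(populations, fitnesses):
        for stat in stats:
            feats.extend(stat(pop, axis=0))   # dim values per statistic
            feats.append(stat(fit))           # one value per statistic
    return np.array(feats)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Fake 3 generations of a population of 10 points in 2-D with sphere fitness.
    pops = [rng.normal(size=(10, 2)) / (g + 1) for g in range(3)]
    fits = [np.sum(p ** 2, axis=1) for p in pops]
    f = trajectory_features(pops, fits)
    print(f.shape)   # 3 generations * 4 statistics * (2 variables + 1 fitness) = (36,)
```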
Citations: 0
P-NP instance decomposition based on the Fourier transform for solving the Linear Ordering Problem
IF 4.6 | CAS Q2 | Computer Science
Evolutionary Computation | Pub Date: 2025-02-20 | DOI: 10.1162/evco_a_00368
Xabier Benavides, Leticia Hernando, Josu Ceberio, Jose A Lozano
Abstract: The Fourier transform over finite groups has proved to be a useful tool for analyzing combinatorial optimization problems. However, few heuristic and meta-heuristic algorithms have been proposed in the literature that utilize the information provided by this technique to guide the search process. In this work, we attempt to address this research gap by considering the case study of the Linear Ordering Problem (LOP). Based on the Fourier transform, we propose an instance decomposition strategy that divides any LOP instance into the sum of two LOP instances associated with a P and an NP-Hard optimization problem. By linearly aggregating the instances obtained from the decomposition, it is possible to create artificial instances with modified proportions of the P and NP-Hard components. Conducted experiments show that increasing the weight of the P component leads to a less rugged fitness landscape suitable for local search-based optimization. We take advantage of this phenomenon by presenting a new meta-heuristic algorithm called P-Descent Search (PDS). The proposed method first optimizes a surrogate instance with a high proportion of the P component, and then gradually increases the weight of the NP-Hard component until the original instance is reached. The multi-start version of PDS shows a promising and predictable performance that appears to be correlated with specific characteristics of the problem, which could open the door to automatic tuning of its hyper-parameters.
Evolutionary Computation, pp. 1-28.
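For context, the sketch below shows the Linear Ordering Problem objective (the sum of matrix entries that end up above the diagonal after reordering rows and columns by a permutation) and the kind of linear aggregation of two instance matrices mentioned in the abstract. The actual Fourier-based decomposition into P and NP-Hard components is not reproduced here; the matrices B_easy and B_hard are placeholders.

```python
import numpy as np

def lop_value(B, sigma):
    """LOP objective: sum of B[sigma[i], sigma[j]] over all pairs i < j,
    i.e. the entries lying above the diagonal after reordering by sigma."""
    n = len(sigma)
    return sum(B[sigma[i], sigma[j]] for i in range(n) for j in range(i + 1, n))

def blend_instances(B_easy, B_hard, w):
    """Artificial instance as a convex combination of two component instances,
    shifting the proportion of the 'easy' and 'hard' parts (0 <= w <= 1)."""
    return w * B_easy + (1 - w) * B_hard

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    B_easy, B_hard = rng.random((5, 5)), rng.random((5, 5))
    B = blend_instances(B_easy, B_hard, w=0.7)
    sigma = list(range(5))               # identity ordering as a trivial example
    print(lop_value(B, sigma))
```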
Citations: 0
On the use of the Doubly Stochastic Matrix models for the Quadratic Assignment Problem
IF 4.6 | CAS Q2 | Computer Science
Evolutionary Computation | Pub Date: 2025-02-20 | DOI: 10.1162/evco_a_00369
Valentino Santucci, Josu Ceberio
Abstract: Permutation problems have captured the attention of the combinatorial optimization community for decades due to the challenge they pose. Although their solutions are naturally encoded as permutations, in each problem the information to be used to optimize them can vary substantially. In this article, we consider the Quadratic Assignment Problem (QAP) as a case study and propose using Doubly Stochastic Matrices (DSMs) under the framework of Estimation of Distribution Algorithms. To that end, we design efficient learning and sampling schemes that enable an effective iterative update of the probability model. Experiments conducted on commonly adopted QAP benchmarks show doubly stochastic matrices to be preferable to four other models for permutations, both in terms of effectiveness and computational efficiency. Moreover, additional analyses performed on the structure of the QAP and the Linear Ordering Problem (LOP) show that DSMs are well suited to assignment problems, but also exhibit interesting capabilities for ordering problems such as the LOP. The article concludes with a description of the potential uses of DSMs for other optimization paradigms, such as genetic algorithms or model-based gradient search.
Evolutionary Computation, pp. 1-30.
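The sketch below shows the QAP objective and one simple way to sample a permutation from a doubly stochastic matrix (row-by-row sampling over the columns still available, with renormalisation). The learning and sampling schemes proposed in the paper may differ, so treat this purely as an illustration of the model class.

```python
import numpy as np

def qap_cost(A, B, perm):
    """QAP objective: sum over i, j of A[i, j] * B[perm[i], perm[j]]."""
    P = np.asarray(perm)
    return float(np.sum(A * B[np.ix_(P, P)]))

def sample_permutation(dsm, rng):
    """Sample a permutation from a doubly stochastic matrix by drawing,
    row by row, a not-yet-used column with probability proportional to
    the (renormalised) entries of that row."""
    n = dsm.shape[0]
    available = list(range(n))
    perm = []
    for i in range(n):
        probs = dsm[i, available]
        probs = probs / probs.sum()
        choice = rng.choice(len(available), p=probs)
        perm.append(available.pop(choice))
    return perm

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    n = 4
    A, B = rng.random((n, n)), rng.random((n, n))
    dsm = np.full((n, n), 1.0 / n)          # uniform DSM -> uniform random permutations
    perm = sample_permutation(dsm, rng)
    print(perm, qap_cost(A, B, perm))
```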
Citations: 0