{"title":"Principles and Strategies of Tabu Search","authors":"F. Glover, M. Laguna, R. Martí","doi":"10.1201/9781351236423-21","DOIUrl":"https://doi.org/10.1201/9781351236423-21","url":null,"abstract":"Tabu Search is a meta-heuristic that guides a local heuristic search procedure to explore the solution space beyond local optimality. One of the main components of Tabu Search is its use of adaptive memory, which creates a more flexible search behavior. Memory-based strategies are therefore the hallmark of tabu search approaches, founded on a quest for “integrating principles,” by which alternative forms of memory are appropriately combined with effective strategies for exploiting them. A novel finding is that such principles are sometimes sufficiently potent to yield effective problem solving behavior in their own right, with negligible reliance on memory. Over a wide range of problem settings, however, strategic use of memory can make dramatic differences in the ability to solve problems. Pure and hybrid Tabu Search approaches have set new records in finding better solutions to problems in production planning and scheduling, resource allocation, network design, routing, financial analysis, telecommunications, portfolio planning, supply chain management, agent-based modeling, business process design, forecasting, machine learning, data mining, biocomputation, molecular design, forest management and resource planning, among many other areas.","PeriodicalId":262519,"journal":{"name":"Handbook of Approximation Algorithms and Metaheuristics","volume":"313 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-05-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122244292","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Practical Algorithms for Two-Dimensional Packing of General Shapes","authors":"Yannan Hu, H. Hashimoto, S. Imahori, M. Yagiura","doi":"10.1201/9781351236423-33","DOIUrl":"https://doi.org/10.1201/9781351236423-33","url":null,"abstract":"","PeriodicalId":262519,"journal":{"name":"Handbook of Approximation Algorithms and Metaheuristics","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-05-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133829731","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Reactive Search: Machine Learning for Memory-Based Heuristics","authors":"R. Battiti, M. Brunato","doi":"10.1201/9781351236423-19","DOIUrl":"https://doi.org/10.1201/9781351236423-19","url":null,"abstract":"Most state-of-the-art heuristics are characterized by a certain number of choices and free parameters, whose appropriate setting is a subject that raises issues of research methodology. In some cases, these parameters are tuned through a feedback loop that includes the user as a crucial learning component: depending on preliminary algorithm tests some parameter values are changed by the user, and different options are tested until acceptable results are obtained. Therefore, the quality of results is not automatically transferred to different instances and the feedback loop can require a lengthy \"trial and error\" process every time the algorithm has to be tuned for a new application. Parameter tuning is therefore a crucial issue both in the scientific development and in the practical use of heuristics. In some cases the role of the user as an intelligent (learning) part makes the reproducibility of heuristic results difficult and, as a consequence, the competitiveness of alternative techniques depends in a crucial way on the user's capabilities. Reactive Search advocates the use of simple sub-symbolic machine learning to automate the parameter tuning process and make it an integral (and fully documented) part of the algorithm. If learning is performed on line, task-dependent and local properties of the configuration space can be used by the algorithm to determine the appropriate balance between diversification (looking for better solutions in other zones of the configuration space) and intensification (exploring more intensively a small but promising part of the configuration space). In this way a single algorithm maintains the flexibility to deal with related problems through an internal feedback loop that considers the previous history of the search. In the following, we shall call reaction the act of modifying some algorithm parameters in response to the search algorithm's behavior during its execution, rather than between runs. Therefore, a reactive heuristic is a technique with the ability of tuning some important parameters during execution by means of a machine learning mechanism. It is important to notice that such heuristics are intrinsically history-dependent; thus, the practical success of this approach in some cases raises the need of a sounder theoretical foundation of non-Markovian search techniques.","PeriodicalId":262519,"journal":{"name":"Handbook of Approximation Algorithms and Metaheuristics","volume":"119 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114496020","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Approximation Algorithms for the Selection of Robust Tag SNPs","authors":"Kui Zhang, K. Chao, Yao-Ting Huang, Ting Chen","doi":"10.1201/9781420010749.ch77","DOIUrl":"https://doi.org/10.1201/9781420010749.ch77","url":null,"abstract":"Recent studies have shown that the chromosomal recombination only takes places at some narrow hotspots. Within the chromosomal region between these hotspots (called haplotype block), little or even no recombination occurs, and a small subset of SNPs (called tag SNPs) is sufficient to capture the haplotype pattern of the block. In reality, the tag SNPs may be genotyped as missing data, and we may fail to distinguish two distinct haplotypes due to the ambiguity caused by missing data. In this paper, we formulate this problem as finding a set of SNPs (called robust tag SNPs) which is able to tolerate missing data. To find robust tag SNPs, we propose two greedy and one LP-relaxation algorithms which give solutions of ((m+1)lnfrac{K(K-1)}{2}), (ln((m+1)frac{K(K-1)}{2})), and O(mln K) approximation respectively, where m is the number of SNPs allowed for missing data and K is the number of patterns in the block.","PeriodicalId":262519,"journal":{"name":"Handbook of Approximation Algorithms and Metaheuristics","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2004-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131046742","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Very Large-Scale Neighborhood Search","authors":"Özlem Ergun, Abraham P. Punnen, J. Orlin, R. Ahuja","doi":"10.1201/9781420010749.ch20","DOIUrl":"https://doi.org/10.1201/9781420010749.ch20","url":null,"abstract":"Many optimization problems that model the essential issues of important real-world decision making are computationally intractable. Therefore, a practical approach for solving such problems is to employ heuristic techniques that find nearly optimal solutions within a reasonable amount of computation time. Improvement algorithms generally start with a feasible solution and iteratively try to obtain a better solution. Neighborhood search algorithms, which are alternatively called local search algorithms, are a wide class of improvement algorithms where at each iteration an improving solution is found by searching a “neighborhood” of the current solution. A critical issue in the design of a neighborhood search algorithm is defining what solutions constitute the neighborhood of a solution. As a rule of thumb, the larger the neighborhood, the better is the quality of the locally optimal solutions, including the final solution selected upon termination. Similarly, the larger the neighborhood, the longer it takes to search the neighborhood. Thus, a larger neighborhood does not necessarily produce a more effective heuristic unless one can search the larger neighborhood efficiently. This article concentrates on neighborhood search algorithms where the size of the neighborhood is “very large” with respect to the size of the input data and the neighborhood can be searched efficiently. We survey three broad classes of very large-scale neighborhood (VLSN) search algorithms: variable-depth methods in which large neighborhoods are searched heuristically, large neighborhoods that are searched by solving a constrained minimum–cost flow problem, and other situations that give rise to efficiently searchable large neighborhoods. \u0000 \u0000 \u0000Keywords: \u0000 \u0000very large-scale neighborhood search; \u0000cyclic-exchange neighborhood; \u0000variable-depth neighborhood; \u0000multiexchange neighborhood; \u0000heuristics","PeriodicalId":262519,"journal":{"name":"Handbook of Approximation Algorithms and Metaheuristics","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2000-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128419628","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Stochastic Local Search","authors":"H. Hoos, T. Stützle","doi":"10.1201/9781420010749.ch19","DOIUrl":"https://doi.org/10.1201/9781420010749.ch19","url":null,"abstract":"Interestingly, stochastic local search that you really wait for now is coming. It's significant to wait for the representative and beneficial books to read. Every book that is provided in better way and utterance will be expected by many peoples. Even you are a good reader or not, feeling to read this book will always appear when you find it. But, when you feel hard to find it as yours, what to do? Borrow to your friends and don't know when to give back it to her or him.","PeriodicalId":262519,"journal":{"name":"Handbook of Approximation Algorithms and Metaheuristics","volume":"62 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116952065","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Restriction Methods","authors":"T. Gonzalez","doi":"10.1201/9781420010749.ch3","DOIUrl":"https://doi.org/10.1201/9781420010749.ch3","url":null,"abstract":"","PeriodicalId":262519,"journal":{"name":"Handbook of Approximation Algorithms and Metaheuristics","volume":"35 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116005923","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"LP Rounding and Extensions","authors":"Ramesh Krishnamurti, D. Gaur","doi":"10.1201/9781420010749.ch7","DOIUrl":"https://doi.org/10.1201/9781420010749.ch7","url":null,"abstract":"","PeriodicalId":262519,"journal":{"name":"Handbook of Approximation Algorithms and Metaheuristics","volume":"53 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122699262","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Greedy Methods","authors":"S. Khuller, B. Raghavachari, N. Young","doi":"10.1201/9781420010749.ch4","DOIUrl":"https://doi.org/10.1201/9781420010749.ch4","url":null,"abstract":"","PeriodicalId":262519,"journal":{"name":"Handbook of Approximation Algorithms and Metaheuristics","volume":"309 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133484066","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Neural Networks","authors":"H. Siegelmann, B. Dasgupta, Derong Liu","doi":"10.1201/9781420010749.ch22","DOIUrl":"https://doi.org/10.1201/9781420010749.ch22","url":null,"abstract":"Artificial neural networks have been proposed as a tool for machine learning (e.g., see [23, 41, 47, 52]) and many results have been obtained regarding their application to practical problems in robotics control, vision, pattern recognition, grammatical inferences and other areas (e.g., see [8, 19, 29, 61]). In these roles, a neural network is trained to recognize complex associations between inputs and outputs that were presented during a supervised training cycle. These associations are incorporated into the weights of the network, which encode ∗Supported in part by NSF grants CCR-0206795, CCR-0208749 and IIS-0346973.","PeriodicalId":262519,"journal":{"name":"Handbook of Approximation Algorithms and Metaheuristics","volume":"63 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131689119","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}