A comparison of heuristics for list schedules using the Box-method and P-method for random digraph generation

S. Al-Sharaeh, B. Wells
{"title":"A comparison of heuristics for list schedules using the Box-method and P-method for random digraph generation","authors":"S. Al-Sharaeh, B. Wells","doi":"10.1109/SSST.1996.493549","DOIUrl":null,"url":null,"abstract":"It is not uncommon to evaluate the effectiveness of competing parallel processing scheduling, mapping, and allocation heuristics by applying a common set of randomly-generated task systems and comparing the performance of the resulting allocations in a statistical manner with one another. Although much research has been performed using this paradigm the authors believe that often the results of such experiments have been extrapolated beyond their range of applicability and provide little insight into determining the best heuristic for a given type of real-world problem. This paper presents evidence to support this assertion by analyzing the results of from the mathematical literature (i.e. the P-method and the Box method) to create a large set of directed graphs which are then used (along with a set of digraphs which were derived from real-world problems) to evaluate four classical list-based scheduling methodologies (the HLFET, HLFNET, SCFET, and SCFNET). The difference of the effective ranking of these methodologies from those presented by other researchers illustrate how the built-in biases associated with random techniques can affect how one views the relative effectiveness of each of these heuristics.","PeriodicalId":135973,"journal":{"name":"Proceedings of 28th Southeastern Symposium on System Theory","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"1996-03-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"27","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of 28th Southeastern Symposium on System Theory","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SSST.1996.493549","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 27

Abstract

It is not uncommon to evaluate the effectiveness of competing parallel-processing scheduling, mapping, and allocation heuristics by applying them to a common set of randomly generated task systems and statistically comparing the performance of the resulting allocations. Although much research has been performed using this paradigm, the authors believe that the results of such experiments are often extrapolated beyond their range of applicability and provide little insight into determining the best heuristic for a given type of real-world problem. This paper presents evidence to support this assertion by using two random digraph generation techniques from the mathematical literature (i.e., the P-method and the Box-method) to create a large set of directed graphs, which are then used (along with a set of digraphs derived from real-world problems) to evaluate four classical list-based scheduling methodologies (HLFET, HLFNET, SCFET, and SCFNET). The differences between the effective rankings of these methodologies and those presented by other researchers illustrate how the built-in biases associated with random generation techniques can affect how one views the relative effectiveness of each of these heuristics.
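For readers unfamiliar with the generation techniques named above, the sketch below illustrates the P-method as it is commonly formulated in the random digraph literature: nodes are numbered 0..n-1 and each forward edge (i, j) with i < j is included independently with a fixed probability p, which guarantees acyclicity. This is a minimal illustrative sketch under that assumption, not necessarily the exact variant used in the paper; the function name and parameters are placeholders.

```python
import random

def p_method_dag(n, p, seed=None):
    """Generate a random DAG on n nodes using one common formulation of the
    P-method: for every ordered pair (i, j) with i < j, include the edge
    i -> j independently with probability p.  Restricting edges to run from
    lower to higher node indices guarantees the result is acyclic."""
    rng = random.Random(seed)
    succ = {i: [] for i in range(n)}     # successor lists, one per node
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                succ[i].append(j)
    return succ

# Example: a 10-node task graph with edge probability 0.3
if __name__ == "__main__":
    g = p_method_dag(10, 0.3, seed=42)
    for u, successors in g.items():
        print(u, "->", successors)
```

Note that the edge probability p directly controls graph density, which is one source of the structural bias the paper discusses: graphs drawn this way tend to differ systematically (e.g., in depth and fan-out) from digraphs derived from real-world problems.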
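Similarly, the following is a minimal sketch of HLFET-style (Highest Levels First with Estimated Times) list scheduling, one of the four heuristics evaluated. It assumes homogeneous processors and zero communication cost, assumptions the paper's experiments may not share; the helper names and data layout are illustrative only.

```python
def static_levels(succ, cost):
    """Level of a node = its cost plus the largest level among its
    successors, i.e. the length of the longest remaining path to an exit."""
    level = {}
    def walk(u):
        if u not in level:
            level[u] = cost[u] + max((walk(v) for v in succ[u]), default=0)
        return level[u]
    for u in succ:
        walk(u)
    return level

def hlfet(succ, cost, num_procs):
    """Sketch of HLFET list scheduling (simplified: no communication delays).
    Ready tasks are ordered by descending static level; each is assigned to
    the processor that becomes idle earliest."""
    level = static_levels(succ, cost)
    preds = {u: [] for u in succ}
    for u in succ:
        for v in succ[u]:
            preds[v].append(u)
    ready = sorted((-level[u], u) for u in succ if not preds[u])
    proc_free = [0] * num_procs          # time each processor becomes idle
    finish = {}                          # finish time of each scheduled task
    schedule = []
    while ready:
        _, u = ready.pop(0)              # highest static level first
        p = min(range(num_procs), key=proc_free.__getitem__)
        start = max([proc_free[p]] + [finish[w] for w in preds[u]])
        finish[u] = start + cost[u]
        proc_free[p] = finish[u]
        schedule.append((u, p, start, finish[u]))
        for v in succ[u]:                # release successors whose preds are all done
            if all(w in finish for w in preds[v]):
                ready.append((-level[v], v))
                ready.sort()
    return schedule

# Tiny example: a fork-join graph with two processors.
succ = {0: [1, 2], 1: [3], 2: [3], 3: []}
cost = {0: 2, 1: 3, 2: 1, 3: 2}
print(hlfet(succ, cost, num_procs=2))
```

The "NET" variants (HLFNET, SCFNET) differ in that node levels are computed without the estimated execution times, and the "SCF" variants prioritize by smallest co-level rather than highest level; the same skeleton applies with a different priority key.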