{"title":"Exact PPS sampling with bounded sample size","authors":"Brian Hentschel , Peter J. Haas , Yuanyuan Tian","doi":"10.1016/j.ipl.2023.106382","DOIUrl":null,"url":null,"abstract":"<div><p>Probability proportional to size (PPS) sampling schemes with a target sample size aim to produce a sample comprising a specified number <em>n</em> of items while ensuring that each item in the population appears in the sample with a probability proportional to its specified “weight” (also called its “size”). These two objectives, however, cannot always be achieved simultaneously. Existing PPS schemes prioritize control of the sample size, violating the PPS property if necessary. We provide a new PPS scheme, called EB-PPS, that allows a different trade-off: EB-PPS enforces the PPS property at all times while ensuring that the sample size never exceeds the target value <em>n</em>. The sample size is exactly equal to <em>n</em> if possible, and otherwise has maximal expected value and minimal variance. Thus we bound the sample size, thereby avoiding storage overflows and helping to control the time required for analytics over the sample, while allowing the user complete control over the sample contents. In the context of training classifiers at scale under imbalanced loss functions, we show that such control yields superior classifiers. The method is both simple to implement and efficient, being a one-pass streaming algorithm with an amortized processing time of <span><math><mi>O</mi><mo>(</mo><mn>1</mn><mo>)</mo></math></span> per item, which makes it computationally preferable even in cases where both EB-PPS and prior algorithms can ensure the PPS property and a target sample size simultaneously.</p></div>","PeriodicalId":56290,"journal":{"name":"Information Processing Letters","volume":"182 ","pages":"Article 106382"},"PeriodicalIF":0.7000,"publicationDate":"2023-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Information Processing Letters","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S002001902300025X","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 1
Abstract
Probability proportional to size (PPS) sampling schemes with a target sample size aim to produce a sample comprising a specified number n of items while ensuring that each item in the population appears in the sample with a probability proportional to its specified “weight” (also called its “size”). These two objectives, however, cannot always be achieved simultaneously. Existing PPS schemes prioritize control of the sample size, violating the PPS property if necessary. We provide a new PPS scheme, called EB-PPS, that allows a different trade-off: EB-PPS enforces the PPS property at all times while ensuring that the sample size never exceeds the target value n. The sample size is exactly equal to n if possible, and otherwise has maximal expected value and minimal variance. Thus we bound the sample size, thereby avoiding storage overflows and helping to control the time required for analytics over the sample, while allowing the user complete control over the sample contents. In the context of training classifiers at scale under imbalanced loss functions, we show that such control yields superior classifiers. The method is both simple to implement and efficient, being a one-pass streaming algorithm with an amortized processing time of O(1) per item, which makes it computationally preferable even in cases where both EB-PPS and prior algorithms can ensure the PPS property and a target sample size simultaneously.
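To make the trade-off described above concrete, the following Python sketch illustrates the baseline that a bounded-sample-size scheme improves on. It is illustration only, not the paper's EB-PPS algorithm: the function name and setup are hypothetical, all weights are assumed known in advance (no streaming), and each item is included independently with probability proportional to its weight, capped so that no probability exceeds 1 and only the expected sample size stays at or below the target n.

```python
# A minimal sketch for illustration -- NOT the paper's EB-PPS algorithm.
# Independent (Bernoulli) PPS sampling with all weights known up front:
# item i is included with probability c * w_i, where c is the largest
# constant with c * w_i <= 1 for every i and c * sum(w) <= n.
import random


def bernoulli_pps_sample(weights, n, rng=None):
    """Hypothetical helper: return indices of items sampled with
    probability exactly proportional to their weights."""
    rng = rng if rng is not None else random.Random()
    total = sum(weights)
    w_max = max(weights)
    # Cap the proportionality constant so every probability is <= 1
    # and the *expected* sample size is at most n.
    c = min(n / total, 1.0 / w_max)
    return [i for i, w in enumerate(weights) if rng.random() < c * w]


if __name__ == "__main__":
    weights = [5.0, 1.0, 1.0, 1.0, 2.0]
    print(bernoulli_pps_sample(weights, n=3, rng=random.Random(42)))
    # Inclusion probabilities here are 0.2 * w_i, i.e. exactly PPS,
    # but only the expected sample size (2.0) is bounded by n = 3;
    # the realized sample size is random.
```

By contrast, EB-PPS (per the abstract) keeps the inclusion probabilities exactly proportional to the weights while guaranteeing that the realized sample size never exceeds n, and it does so in a single streaming pass with O(1) amortized time per item; its internal mechanism is given in the full paper and is not reproduced in this sketch.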
About the journal:
Information Processing Letters invites submission of original research articles that focus on fundamental aspects of information processing and computing. This naturally includes work in the broadly understood field of theoretical computer science, although papers in all areas of scientific inquiry will be considered, provided that they describe research contributions credibly motivated by applications to computing and involve rigorous methodology. High quality experimental papers that address topics of sufficiently broad interest may also be considered.
Since its inception in 1971, Information Processing Letters has served as a forum for timely dissemination of short, concise and focused research contributions. Continuing with this tradition, and to expedite the reviewing process, manuscripts are generally limited in length to nine pages when they appear in print.