{"title":"Convex Combinations in Judgment Aggregation","authors":"Johannes G. Jaspersen","doi":"10.2139/ssrn.3732274","DOIUrl":null,"url":null,"abstract":"Judgments are the basis for almost all decisions. They often come from different models and multiple experts. This information is typically aggregated using simple averages, which leads to the well-known shared information problem. A weighted average of the individual judgments based on empirically estimated sophisticated weights is commonly discarded in practice, because the sophisticated weights have large estimation errors. In this paper, we explore mixture weights, which are convex combinations of sophisticated and naive weights. We show analytically that if the data generation process is stable, there always exists a mixture weight which aggregates judgments better than the naive weights. We thus offer a path to alleviate the shared information problem. In contrast to other proposed solutions, we do not require any control over the judgment process. We demonstrate the utility of mixture weights in numerical analyses and in two empirical applications. We also offer heuristic selection algorithms for the correct mixture weight and analyze them in our numerical and empirical settings.","PeriodicalId":11495,"journal":{"name":"Econometric Modeling: Capital Markets - Forecasting eJournal","volume":"72 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2021-01-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Econometric Modeling: Capital Markets - Forecasting eJournal","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2139/ssrn.3732274","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Judgments are the basis for almost all decisions, and they often come from different models and multiple experts. This information is typically aggregated using simple averages, which leads to the well-known shared information problem. In practice, weighted averages based on empirically estimated sophisticated weights are commonly discarded because the sophisticated weights carry large estimation errors. In this paper, we explore mixture weights, which are convex combinations of sophisticated and naive weights. We show analytically that if the data-generating process is stable, there always exists a mixture weight that aggregates judgments better than the naive weights. We thus offer a path to alleviating the shared information problem. In contrast to other proposed solutions, we do not require any control over the judgment process. We demonstrate the utility of mixture weights in numerical analyses and in two empirical applications. We also offer heuristic algorithms for selecting the mixture weight and analyze them in our numerical and empirical settings.
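To make the central idea concrete, below is a minimal sketch of a mixture-weight aggregation in Python. The abstract does not specify how the sophisticated weights are estimated; the sketch assumes a common covariance-based (Bates-Granger-style) estimate from past judgment errors, takes the naive weights to be the equal-weight average, and uses `lam` as hypothetical notation for the convex-combination parameter. It illustrates the construction only, not the paper's analytical results or selection heuristics.

```python
# Sketch of mixture weights: a convex combination of "sophisticated" and
# naive (equal) weights used to aggregate individual judgments.
# Assumptions not stated in the abstract: sophisticated weights are
# minimum-variance weights from an estimated error covariance matrix.
import numpy as np

def sophisticated_weights(errors):
    """Covariance-based weights from a (T x k) matrix of past judgment errors."""
    sigma = np.cov(errors, rowvar=False)      # k x k error covariance estimate
    inv = np.linalg.pinv(sigma)               # pseudo-inverse for numerical stability
    ones = np.ones(sigma.shape[0])
    return inv @ ones / (ones @ inv @ ones)   # minimum-variance combination weights

def mixture_weights(w_soph, lam):
    """Convex combination of sophisticated and naive (equal) weights, 0 <= lam <= 1."""
    w_naive = np.full(len(w_soph), 1.0 / len(w_soph))
    return lam * w_soph + (1.0 - lam) * w_naive

def aggregate(judgments, weights):
    """Weighted average of k individual judgments."""
    return float(np.dot(weights, judgments))

# Hypothetical example: three judges, weights estimated from simulated past errors.
rng = np.random.default_rng(0)
past_errors = rng.normal(size=(50, 3))
w_mix = mixture_weights(sophisticated_weights(past_errors), lam=0.3)
print(aggregate(np.array([10.0, 12.0, 11.0]), w_mix))
```

With `lam=0` the scheme reduces to the simple average and with `lam=1` to the purely estimated weights; intermediate values trade off the shared information problem against estimation error, which is the region the paper studies.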