{"title":"Transfer Ranking in Finance: Applications to Cross-Sectional Momentum with Data Scarcity","authors":"Daniel Poh, Stephen Roberts, Stefan Zohren","doi":"10.52354/jsi.3.1.ii","DOIUrl":null,"url":null,"abstract":"Modern cross-sectional strategies incorporating sophisticated neural architectures outperform their traditional counterparts when applied to mature assets with long histories. However, deploying them on instruments with limited samples generally produces over-fitted models with degraded performance. In this paper, we introduce Fused Encoder Networks -- a hybrid parameter-sharing transfer ranking model which fuses information extracted using an encoder-attention module from a source dataset with a similar but separate module operating on a smaller target dataset of interest. This mitigates the issue of models with poor generalisability. Additionally, the self-attention mechanism enables interactions among instruments to be accounted for, both at the loss-level during model training and at inference time. Focusing on momentum applied to the top ten cryptocurrencies by market capitalisation as a demonstrative use-case, our model outperforms state-of-the-art benchmarks on most measures and significantly boosts the Sharpe ratio. It continues outperforming baselines even after accounting for the high transaction costs associated with trading cryptocurrencies.","PeriodicalId":400870,"journal":{"name":"Journal of Systematic Investing","volume":"25 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-05-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Systematic Investing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.52354/jsi.3.1.ii","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Modern cross-sectional strategies incorporating sophisticated neural architectures outperform their traditional counterparts when applied to mature assets with long histories. However, deploying them on instruments with limited samples generally produces over-fitted models with degraded performance. In this paper, we introduce Fused Encoder Networks -- a hybrid parameter-sharing transfer ranking model that fuses information extracted by an encoder-attention module from a source dataset with a similar but separate module operating on a smaller target dataset of interest. This mitigates the problem of poor generalisability. Additionally, the self-attention mechanism enables interactions among instruments to be accounted for, both at the loss level during model training and at inference time. Focusing on momentum applied to the top ten cryptocurrencies by market capitalisation as a demonstrative use-case, our model outperforms state-of-the-art benchmarks on most measures and significantly boosts the Sharpe ratio. It continues to outperform baselines even after accounting for the high transaction costs associated with trading cryptocurrencies.
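The abstract only sketches the architecture at a high level. As a rough illustration, below is a minimal PyTorch sketch of the fused-encoder idea: two transformer-encoder branches, one intended to be trained on the large source dataset and one on the smaller target dataset, whose representations are fused before each instrument in the cross-section is mapped to a rank score. Every concrete choice here (the shared embedding, concatenation as the fusion step, layer sizes, the linear scoring head) is an illustrative assumption, not the authors' implementation.

```python
# Hypothetical sketch of a Fused Encoder Network as described in the abstract.
# The paper does not specify these hyperparameters or the fusion operator;
# all concrete choices below are assumptions for illustration only.
import torch
import torch.nn as nn


class FusedEncoderNetwork(nn.Module):
    def __init__(self, n_features: int, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        # Shared input embedding: one possible reading of "hybrid
        # parameter-sharing" (assumption).
        self.embed = nn.Linear(n_features, d_model)

        def make_layer() -> nn.TransformerEncoderLayer:
            return nn.TransformerEncoderLayer(
                d_model, n_heads, dim_feedforward=128, batch_first=True
            )

        # Separate encoder-attention branches: the source branch is assumed
        # to be pre-trained on the large source dataset, the target branch
        # on the smaller target dataset of interest.
        self.source_encoder = nn.TransformerEncoder(make_layer(), num_layers=2)
        self.target_encoder = nn.TransformerEncoder(make_layer(), num_layers=2)
        # Fuse the two representations (here: concatenation, an assumption)
        # and map each instrument to a scalar rank score.
        self.score = nn.Linear(2 * d_model, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_instruments, n_features). Self-attention across the
        # instrument dimension lets instruments interact with one another,
        # both during training and at inference time.
        h = self.embed(x)
        fused = torch.cat(
            [self.source_encoder(h), self.target_encoder(h)], dim=-1
        )
        # Returns (batch, n_instruments): higher score -> higher rank.
        return self.score(fused).squeeze(-1)


# Usage: score a cross-section of the top ten cryptocurrencies,
# each described by 8 (hypothetical) momentum features.
model = FusedEncoderNetwork(n_features=8)
scores = model(torch.randn(1, 10, 8))
```

In a cross-sectional momentum strategy, scores like these would be used to rank the instruments each period, going long the top of the ranking and short the bottom; the paper additionally trains with a ranking loss over the cross-section, which is omitted from this sketch.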