Transfer Ranking in Finance: Applications to Cross-Sectional Momentum with Data Scarcity

Daniel Poh, Stephen Roberts, Stefan Zohren
{"title":"Transfer Ranking in Finance: Applications to Cross-Sectional Momentum with Data Scarcity","authors":"Daniel Poh, Stephen Roberts, Stefan Zohren","doi":"10.52354/jsi.3.1.ii","DOIUrl":null,"url":null,"abstract":"Modern cross-sectional strategies incorporating sophisticated neural architectures outperform their traditional counterparts when applied to mature assets with long histories. However, deploying them on instruments with limited samples generally produces over-fitted models with degraded performance. In this paper, we introduce Fused Encoder Networks -- a hybrid parameter-sharing transfer ranking model which fuses information extracted using an encoder-attention module from a source dataset with a similar but separate module operating on a smaller target dataset of interest. This mitigates the issue of models with poor generalisability. Additionally, the self-attention mechanism enables interactions among instruments to be accounted for, both at the loss-level during model training and at inference time. Focusing on momentum applied to the top ten cryptocurrencies by market capitalisation as a demonstrative use-case, our model outperforms state-of-the-art benchmarks on most measures and significantly boosts the Sharpe ratio. It continues outperforming baselines even after accounting for the high transaction costs associated with trading cryptocurrencies.","PeriodicalId":400870,"journal":{"name":"Journal of Systematic Investing","volume":"25 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-05-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Systematic Investing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.52354/jsi.3.1.ii","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Modern cross-sectional strategies incorporating sophisticated neural architectures outperform their traditional counterparts when applied to mature assets with long histories. However, deploying them on instruments with limited samples generally produces over-fitted models with degraded performance. In this paper, we introduce Fused Encoder Networks -- a hybrid parameter-sharing transfer ranking model which fuses information extracted using an encoder-attention module from a source dataset with a similar but separate module operating on a smaller target dataset of interest. This mitigates the issue of models with poor generalisability. Additionally, the self-attention mechanism enables interactions among instruments to be accounted for, both at the loss-level during model training and at inference time. Focusing on momentum applied to the top ten cryptocurrencies by market capitalisation as a demonstrative use-case, our model outperforms state-of-the-art benchmarks on most measures and significantly boosts the Sharpe ratio. It continues outperforming baselines even after accounting for the high transaction costs associated with trading cryptocurrencies.
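To make the fusion idea concrete, the following is a minimal PyTorch sketch of the components named in the abstract: two encoder-attention branches whose instrument-level representations are fused into per-instrument ranking scores, trained with a listwise (cross-sectional) ranking loss. The class names `EncoderAttentionBranch` and `FusedEncoderNetwork`, the dimensions, and the ListMLE-style loss are all illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only -- module names, dimensions, and the ListMLE-style
# loss are assumptions for demonstration, not the paper's exact architecture.
import torch
import torch.nn as nn


class EncoderAttentionBranch(nn.Module):
    """One encoder-attention branch: embeds per-instrument features and lets
    instruments in the cross-section attend to one another via self-attention."""

    def __init__(self, n_features: int, d_model: int = 32, n_heads: int = 4):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_instruments, n_features) -> (batch, n_instruments, d_model)
        return self.encoder(self.embed(x))


class FusedEncoderNetwork(nn.Module):
    """Hypothetical fusion: concatenate the representations of a source-data
    branch and a target-data branch, then map to one ranking score per
    instrument. In the paper the source branch would be trained on the large
    source dataset and shared; here both branches simply process the target
    cross-section for illustration."""

    def __init__(self, n_features: int, d_model: int = 32):
        super().__init__()
        self.source_branch = EncoderAttentionBranch(n_features, d_model)
        self.target_branch = EncoderAttentionBranch(n_features, d_model)
        self.score_head = nn.Linear(2 * d_model, 1)

    def forward(self, x_target: torch.Tensor) -> torch.Tensor:
        fused = torch.cat(
            [self.source_branch(x_target), self.target_branch(x_target)], dim=-1
        )
        return self.score_head(fused).squeeze(-1)  # (batch, n_instruments)


def listwise_ranking_loss(scores: torch.Tensor, returns: torch.Tensor) -> torch.Tensor:
    """ListMLE-style surrogate: maximise the likelihood of the instrument
    ordering implied by realised forward returns. One common choice of
    cross-sectional ranking loss; the paper's exact loss may differ."""
    order = returns.argsort(dim=-1, descending=True)
    ordered = scores.gather(-1, order)
    # negative log-likelihood of the permutation under a Plackett-Luce model
    log_cumsum = torch.logcumsumexp(ordered.flip(-1), dim=-1).flip(-1)
    return (log_cumsum - ordered).sum(-1).mean()


if __name__ == "__main__":
    batch, n_instruments, n_features = 8, 10, 5  # e.g. top-10 cryptocurrencies
    model = FusedEncoderNetwork(n_features)
    x = torch.randn(batch, n_instruments, n_features)    # momentum-style features
    fwd_returns = torch.randn(batch, n_instruments)      # realised forward returns
    loss = listwise_ranking_loss(model(x), fwd_returns)
    loss.backward()
    print(f"illustrative ranking loss: {loss.item():.4f}")
```

Scoring the whole cross-section at once is what lets self-attention model interactions among instruments both in the loss during training and when ranking at inference time, as the abstract describes.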