Approximation properties relative to continuous scale space for hybrid discretizations of Gaussian derivative operators

Tony Lindeberg
arXiv:2405.05095 · arXiv - CS - Numerical Analysis · published 2024-05-08
Citations: 0

Abstract

This paper presents an analysis of properties of two hybrid discretization methods for Gaussian derivatives, based on convolutions with either the normalized sampled Gaussian kernel or the integrated Gaussian kernel followed by central differences. The motivation for studying these discretization methods is that in situations when multiple spatial derivatives of different order are needed at the same scale level, they can be computed significantly more efficiently compared to more direct derivative approximations based on explicit convolutions with either sampled Gaussian kernels or integrated Gaussian kernels. While these computational benefits do also hold for the genuinely discrete approach for computing discrete analogues of Gaussian derivatives, based on convolution with the discrete analogue of the Gaussian kernel followed by central differences, the underlying mathematical primitives for the discrete analogue of the Gaussian kernel, in terms of modified Bessel functions of integer order, may not be available in certain frameworks for image processing, such as when performing deep learning based on scale-parameterized filters in terms of Gaussian derivatives, with learning of the scale levels. In this paper, we present a characterization of the properties of these hybrid discretization methods, in terms of quantitative performance measures concerning the amount of spatial smoothing that they imply, as well as the relative consistency of scale estimates obtained from scale-invariant feature detectors with automatic scale selection, with an emphasis on the behaviour for very small values of the scale parameter, which may differ significantly from corresponding results obtained from the fully continuous scale-space theory, as well as between different types of discretization methods.
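The two ingredients described above (a smoothing convolution with a discretized Gaussian kernel, followed by small central-difference masks to obtain derivatives of several orders from a single smoothed signal) can be sketched as follows in Python. This is an illustrative reconstruction from the abstract, not the paper's reference implementation: the 1-D setting, the truncation of the kernels at roughly four standard deviations, and the unit-sum renormalization of the sampled kernel are assumptions made here for concreteness.

```python
import numpy as np
from scipy.special import ive  # exponentially scaled modified Bessel function I_n


def sampled_gaussian(sigma, radius=None):
    """Normalized sampled Gaussian: sample the continuous Gaussian at the
    integers, then renormalize to unit sum (truncation radius is an
    assumption; the paper may use a different criterion)."""
    if radius is None:
        radius = max(1, int(np.ceil(4 * sigma)))
    x = np.arange(-radius, radius + 1)
    g = np.exp(-x**2 / (2 * sigma**2))
    return g / g.sum()


def discrete_gaussian(t, radius=None):
    """Discrete analogue of the Gaussian, T(n; t) = exp(-t) I_n(t), expressed
    via modified Bessel functions of integer order as mentioned in the
    abstract; ive(n, t) computes exp(-t) * I_n(t) directly."""
    if radius is None:
        radius = max(1, int(np.ceil(4 * np.sqrt(t))))
    n = np.arange(-radius, radius + 1)
    return ive(np.abs(n), t)


def hybrid_derivatives(signal, sigma, orders=(1, 2)):
    """Hybrid discretization: one smoothing convolution, then cheap
    central-difference masks for each requested derivative order, instead of
    one full Gaussian-derivative convolution per order."""
    smoothed = np.convolve(signal, sampled_gaussian(sigma), mode="same")
    masks = {
        1: np.array([0.5, 0.0, -0.5]),   # (f[n+1] - f[n-1]) / 2 under np.convolve
        2: np.array([1.0, -2.0, 1.0]),   # f[n+1] - 2 f[n] + f[n-1]
    }
    return {k: np.convolve(smoothed, masks[k], mode="same") for k in orders}
```

The efficiency argument from the abstract is visible in `hybrid_derivatives`: the expensive length-`2*radius+1` convolution is performed once, and each additional derivative order costs only a length-3 mask. Swapping `sampled_gaussian` for `discrete_gaussian` gives the genuinely discrete variant, whose Bessel-function primitives may be unavailable in some deep-learning frameworks, which is the motivation the abstract gives for the hybrid schemes.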