{"title":"时间序列的碎片化不是异常,而是常态","authors":"M. M. Eliseykin, V. F. Ochkov","doi":"10.3103/S0747923925700240","DOIUrl":null,"url":null,"abstract":"<p>Contemporary technologies have simplified and made accessible the collection and processing of data from scientific observations. Current solutions enable rapid preliminary processing of collected data, cleansing it of outliers related to measurement errors and filling gaps in fragmented time series. However, the ease with which this is achieved presents risks of indiscriminate use of such solutions. Consequently, data indicating real physical anomalies may be discarded, and the amalgamation of time series fragments might result in data that does not correspond to the observed processes and phenomena. Such a situation was justified in the past when there was a scarcity of computational resources. Discarding inherently unreliable values and filling gaps simplified and accelerated data analysis. Now, with sufficient computational power available, it is possible to begin searching for patterns in what was previously considered observational error and discarded. Moreover, the volume of accumulated data may allow for the consideration of fragments of time series as parts of a regular process, without filling the gaps with artificial data created based on our assumptions about the nature of the observed processes and phenomena. This raises the question of the necessity to adapt the approaches used in collecting and analyzing observational results to the possibilities afforded by new computational tools.</p>","PeriodicalId":45174,"journal":{"name":"Seismic Instruments","volume":"61 2","pages":"153 - 156"},"PeriodicalIF":0.3000,"publicationDate":"2025-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Fragmentation of Time Series Is Not an Anomaly, but the Norm\",\"authors\":\"M. M. Eliseykin, V. F. Ochkov\",\"doi\":\"10.3103/S0747923925700240\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Contemporary technologies have simplified and made accessible the collection and processing of data from scientific observations. Current solutions enable rapid preliminary processing of collected data, cleansing it of outliers related to measurement errors and filling gaps in fragmented time series. However, the ease with which this is achieved presents risks of indiscriminate use of such solutions. Consequently, data indicating real physical anomalies may be discarded, and the amalgamation of time series fragments might result in data that does not correspond to the observed processes and phenomena. Such a situation was justified in the past when there was a scarcity of computational resources. Discarding inherently unreliable values and filling gaps simplified and accelerated data analysis. Now, with sufficient computational power available, it is possible to begin searching for patterns in what was previously considered observational error and discarded. Moreover, the volume of accumulated data may allow for the consideration of fragments of time series as parts of a regular process, without filling the gaps with artificial data created based on our assumptions about the nature of the observed processes and phenomena. 
This raises the question of the necessity to adapt the approaches used in collecting and analyzing observational results to the possibilities afforded by new computational tools.</p>\",\"PeriodicalId\":45174,\"journal\":{\"name\":\"Seismic Instruments\",\"volume\":\"61 2\",\"pages\":\"153 - 156\"},\"PeriodicalIF\":0.3000,\"publicationDate\":\"2025-07-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Seismic Instruments\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://link.springer.com/article/10.3103/S0747923925700240\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"GEOCHEMISTRY & GEOPHYSICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Seismic Instruments","FirstCategoryId":"1085","ListUrlMain":"https://link.springer.com/article/10.3103/S0747923925700240","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"GEOCHEMISTRY & GEOPHYSICS","Score":null,"Total":0}
Fragmentation of Time Series Is Not an Anomaly, but the Norm
Contemporary technologies have made the collection and processing of scientific observational data simple and accessible. Current solutions enable rapid preliminary processing of collected data: cleansing it of outliers attributed to measurement error and filling the gaps in fragmented time series. However, the very ease of doing so carries the risk that such solutions will be applied indiscriminately. As a result, data indicating real physical anomalies may be discarded, and the stitching together of time-series fragments may produce data that do not correspond to the observed processes and phenomena. This practice was justified in the past, when computational resources were scarce: discarding inherently unreliable values and filling gaps simplified and accelerated data analysis. Now that sufficient computational power is available, it is possible to begin searching for patterns in what was previously written off as observational error and discarded. Moreover, the volume of accumulated data may allow fragments of a time series to be treated as parts of a regular process, without filling the gaps with artificial data generated from our assumptions about the nature of the observed processes and phenomena. This raises the question of whether the approaches used to collect and analyze observational results should be adapted to the possibilities afforded by new computational tools.
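The contrast the abstract draws can be made concrete with a short sketch. The following Python/pandas example is not from the paper; the series, the gap, the spike, and the clipping threshold are all hypothetical. It places the conventional pipeline (clip outliers, interpolate across gaps) next to a fragment-aware alternative that splits the series at its gaps and keeps each contiguous run as its own record.

```python
import numpy as np
import pandas as pd

# Hypothetical hourly sensor series; all names and values are illustrative.
rng = np.random.default_rng(0)
idx = pd.date_range("2025-01-01", periods=200, freq="h")
values = rng.normal(0.0, 1.0, size=len(idx))
values[60:90] = np.nan  # a 30-hour gap from an instrument outage
values[140] = 8.0       # a spike: measurement error, or a real anomaly?
series = pd.Series(values, index=idx)

# Conventional pipeline: clip "outliers" and interpolate across gaps.
# Fast and convenient, but the spike is silently discarded and the gap
# is filled with values that reflect our assumptions, not the process.
cleaned = series.where(series.abs() < 4.0).interpolate(method="time")

# Fragment-aware alternative: split the series at its gaps and keep each
# contiguous run as its own record, analyzed on its own terms.
fragment_id = series.isna().cumsum()
fragments = [
    frag.dropna()
    for _, frag in series.groupby(fragment_id)
    if frag.notna().any()
]

for frag in fragments:
    print(f"{frag.index[0]} .. {frag.index[-1]}: n={len(frag)}, "
          f"max={frag.max():+.2f}")
```

Under the fragment-aware treatment, the spike survives as a candidate physical anomaly and the 30-hour gap remains a gap, rather than becoming 30 hours of invented data.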
Journal Description:
Seismic Instruments is a journal devoted to the description of geophysical instruments used in seismic research. In addition to covering the instruments themselves, which register seismic waves, substantial space is given to instrumental and methodological problems of geophysical monitoring, to methods used in the search for earthquake precursors, to the study of earthquake nucleation processes, and to the monitoring of natural and technogenic processes. Papers describe the construction, working elements, and technical characteristics of the instruments, as well as results of their deployment and the interpretation of those results. Attention is also paid to seismic monitoring data and to the quality analysis of earthquake catalogs.