{"title":"Hypothesis Testing in Non-Linear Models Exemplified by the Planar Coordinate Transformations","authors":"R. Lehmann, M. Lösler","doi":"10.1515/jogs-2018-0009","DOIUrl":"https://doi.org/10.1515/jogs-2018-0009","url":null,"abstract":"Abstract In geodesy, hypothesis testing is applied to a wide area of applications e.g. outlier detection, deformation analysis or, more generally, model optimisation. Due to the possible far-reaching consequences of a decision, high statistical test power of such a hypothesis test is needed. The Neyman-Pearson lemma states that under strict assumptions the often-applied likelihood ratio test has highest statistical test power and may thus fulfill the requirement. The application, however, is made more difficult as most of the decision problems are non-linear and, thus, the probability density function of the parameters does not belong to the well-known set of statistical test distributions. Moreover, the statistical test power may change, if linear approximations of the likelihood ratio test are applied. The influence of the non-linearity on hypothesis testing is investigated and exemplified by the planar coordinate transformations. Whereas several mathematical equivalent expressions are conceivable to evaluate the rotation parameter of the transformation, the decisions and, thus, the probabilities of type 1 and 2 decision errors of the related hypothesis testing are unequal to each other. 
Based on Monte Carlo integration, the effective decision errors are estimated and used as a basis for the evaluation of linear and non-linear equivalents.","PeriodicalId":44569,"journal":{"name":"Journal of Geodetic Science","volume":null,"pages":null},"PeriodicalIF":1.3,"publicationDate":"2018-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"75060911","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evaluation of recent combined global geopotential models in Brazil","authors":"E. Nicacio, R. Dalazoana, S. Freitas","doi":"10.1515/jogs-2018-0008","DOIUrl":"https://doi.org/10.1515/jogs-2018-0008","url":null,"abstract":"Abstract The aim of this paper is to present a quantitative analysis of the adequacy of the main currently existing combined Global Geopotential Models (GGMs) for modeling normal-geoid heights throughout Brazil. As major advances have been reached since mid-2016 in the combined GGMs elaboration and development, the main objective of this analysis is to verify if, in fact, the most recent models present superior or equivalent performance to the most performant previous models. The analysis was based on comparisons between normal-geoid height values obtained fromGNSS/leveling solutions and values calculated from GGMs XGM2016, GOCO05C, EIGEN-6C4 and EGM2008, according to different geopotential functionals - geoid height and height anomaly - and in different degrees of development, always through the relative method. This procedure was applied to 997 stations which carry information of both ellipsoidal and normal-orthometric heights, located all over Brazil. 
As a main result, superior performance of the recent combined GGMs, GOCO05C and XGM2016, was observed in comparison with the older models, EIGEN-6C4 and EGM2008, when all of them are developed up to degree 720, the maximum degree of the recent models; an approximate equality of results was observed when all of the models are used at their individual maximum degrees.","PeriodicalId":44569,"journal":{"name":"Journal of Geodetic Science","volume":null,"pages":null},"PeriodicalIF":1.3,"publicationDate":"2018-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"73195857","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The effect of regional sea level atmospheric pressure on sea level variations at globally distributed tide gauge stations with long records","authors":"H. Iz","doi":"10.1515/jogs-2018-0007","DOIUrl":"https://doi.org/10.1515/jogs-2018-0007","url":null,"abstract":"Abstract This study provides additional information about the impact of atmospheric pressure on sea level variations. The observed regularity in sea level atmospheric pressure depends mainly on the latitude and verified to be dominantly random closer to the equator. It was demonstrated that almost all the annual and semiannual sea level variations at 27 globally distributed tide gauge stations can be attributed to the regional/local atmospheric forcing as an inverted barometric effect. Statistically significant non-linearities were detected in the regional atmospheric pressure series, which in turn impacted other sea level variations as compounders in tandem with the lunar nodal forcing, generating lunar sub-harmonics with multidecadal periods. It was shown that random component of regional atmospheric pressure tends to cluster at monthly intervals. The clusters are likely to be caused by the intraannual seasonal atmospheric temperature changes,which may also act as random beats in generating sub-harmonics observed in sea level changes as another mechanism. 
This study also affirmed that there are no statistically significant secular trends in the progression of regional atmospheric pressures; hence, there was no contribution by atmospheric pressure to the sea level trends during the 20th century. Meanwhile, the estimated nonuniform scale factors of the inverted barometer effects suggest that the sea level atmospheric pressure will bias the sea level trends inferred from satellite altimetry measurements if their impact is accounted for as corrections without proper scaling.","PeriodicalId":44569,"journal":{"name":"Journal of Geodetic Science","volume":null,"pages":null},"PeriodicalIF":1.3,"publicationDate":"2018-05-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"82487930","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A case study on displacement analysis of Vasa warship","authors":"M. Eshagh, Filippa Johansson, L. Karlsson, M. Horemuz","doi":"10.1515/jogs-2018-0006","DOIUrl":"https://doi.org/10.1515/jogs-2018-0006","url":null,"abstract":"Abstract Monitoring deformation of man-made structures is very important to prevent them from a risk of collapse and save lives. Such a process is also used for monitoring change in historical objects, which are deforming continuously with time. An example of this is the Vasa warship, which was under water for about 300 years. The ship was raised from the bottom of the sea and is kept in the Vasa museum in Stockholm. A geodetic network with points on the museum building and the ship’s body has been established and measured for 12 years for monitoring the ship’s deformation. The coordinate time series of each point on the ship and their uncertainties have been estimated epoch-wisely. In this paper, our goal is to statistically analyse the ship’s hull movements. By fitting a quadratic polynomial to the coordinate time series of each point of the hull, its acceleration and velocity are estimated. In addition, their significance is tested by comparing them with their respective estimated errors after the fitting. Our numerical investigations show that the backside of the ship, having highest elevation and slope, has moved vertically faster than the other places by a velocity and an acceleration of about 2 mm/year and 0.1 mm/year2, respectively and this part of the ship is the weakest with a higher risk of collapse. The central parts of the ship are more stable as the ship hull is almost vertical and closer to the floor. 
Generally, the hull is moving towards its port side and downwards.","PeriodicalId":44569,"journal":{"name":"Journal of Geodetic Science","volume":null,"pages":null},"PeriodicalIF":1.3,"publicationDate":"2018-04-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"82234619","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"First results of the Nordic and Baltic GNSS Analysis Centre","authors":"S. Lahtinen, Häkli Pasi, L. Jivall, C. Kempe, K. Kollo, Ksenija Kosenko, P. Pihlak, Dalia Prizginiene, O. Tangen, M. Weber, E. Paršeliūnas, Rimvydas Baniulis, Karolis Galinauskas","doi":"10.1515/jogs-2018-0005","DOIUrl":"https://doi.org/10.1515/jogs-2018-0005","url":null,"abstract":"Abstract The Nordic Geodetic Commission (NKG) has launched a joint NKG GNSS Analysis Centre that aims to routinely produce high qualityGNSS solutions for the common needs of the NKG and the Nordic and Baltic countries. A consistent and densified velocity field is needed for the constraining of the gla-cial isostatic adjustment (GIA) modelling that is a key component of maintaining the national reference frame realisations in the area. We described the methods of the NKG GNSS Analysis Centre including the defined processing setup for the local analysis centres (LAC) and for the combination centres.We analysed the results of the first 2.5 years (2014.5-2016). The results showed that different subnets were consistent with the combined solution within 1-2 mm level. We observed the so called network effect affecting our reference frame alignment. However, the accuracy of the reference frame alignment was on a few millimetre level in the area of the main interest (Nordic and Baltic Countries). 
The NKG GNSS AC was declared fully operational in April 2017.","PeriodicalId":44569,"journal":{"name":"Journal of Geodetic Science","volume":null,"pages":null},"PeriodicalIF":1.3,"publicationDate":"2018-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"84471729","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"On the topographic bias and density distribution in modelling the geoid and orthometric heights","authors":"L. Sjöberg","doi":"10.1515/jogs-2018-0004","DOIUrl":"https://doi.org/10.1515/jogs-2018-0004","url":null,"abstract":"Abstract It is well known that the success in precise determinations of the gravimetric geoid height (N) and the orthometric height (H) rely on the knowledge of the topographic mass distribution. We show that the residual topographic bias due to an imprecise information on the topographic density is practically the same for N and H, but with opposite signs. This result is demonstrated both for the Helmert orthometric height and for a more precise orthometric height derived by analytical continuation of the external geopotential to the geoid. This result leads to the conclusion that precise gravimetric geoid heights cannot be validated by GNSS-levelling geoid heights in mountainous regions for the errors caused by the incorrect modelling of the topographic mass distribution, because this uncertainty is hidden in the difference between the two geoid estimators.","PeriodicalId":44569,"journal":{"name":"Journal of Geodetic Science","volume":null,"pages":null},"PeriodicalIF":1.3,"publicationDate":"2018-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"73304799","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A numerical test of the topographic bias","authors":"L. Sjöberg, M. Joud","doi":"10.1515/jogs-2018-0002","DOIUrl":"https://doi.org/10.1515/jogs-2018-0002","url":null,"abstract":"Abstract In 1962 A. Bjerhammar introduced the method of analytical continuation in physical geodesy, implying that surface gravity anomalies are downward continued into the topographic masses down to an internal sphere (the Bjerhammar sphere). The method also includes analytical upward continuation of the potential to the surface of the Earth to obtain the quasigeoid. One can show that also the common remove-compute-restore technique for geoid determination includes an analytical continuation as long as the complete density distribution of the topography is not known. The analytical continuation implies that the downward continued gravity anomaly and/or potential are/is in error by the so-called topographic bias, which was postulated by a simple formula of L E Sjöberg in 2007. Here we will numerically test the postulated formula by comparing it with the bias obtained by analytical downward continuation of the external potential of a homogeneous ellipsoid to an inner sphere. 
The result shows that the postulated formula holds: At the equator of the ellipsoid, where the external potential is downward continued 21 km, the computed and postulated topographic biases agree to less than a millimetre (when the potential is scaled to the unit of metre).","PeriodicalId":44569,"journal":{"name":"Journal of Geodetic Science","volume":null,"pages":null},"PeriodicalIF":1.3,"publicationDate":"2018-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"86841923","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Bayesian statistics and Monte Carlo methods","authors":"K. Koch","doi":"10.1515/jogs-2018-0003","DOIUrl":"https://doi.org/10.1515/jogs-2018-0003","url":null,"abstract":"Abstract The Bayesian approach allows an intuitive way to derive the methods of statistics. Probability is defined as a measure of the plausibility of statements or propositions. Three rules are sufficient to obtain the laws of probability. If the statements refer to the numerical values of variables, the so-called random variables, univariate and multivariate distributions follow. They lead to the point estimation by which unknown quantities, i.e. unknown parameters, are computed from measurements. The unknown parameters are random variables, they are fixed quantities in traditional statistics which is not founded on Bayes’ theorem. Bayesian statistics therefore recommends itself for Monte Carlo methods, which generate random variates from given distributions. Monte Carlo methods, of course, can also be applied in traditional statistics. The unknown parameters, are introduced as functions of the measurements, and the Monte Carlo methods give the covariance matrix and the expectation of these functions. A confidence region is derived where the unknown parameters are situated with a given probability. Following a method of traditional statistics, hypotheses are tested by determining whether a value for an unknown parameter lies inside or outside the confidence region. The error propagation of a random vector by the Monte Carlo methods is presented as an application. If the random vector results from a nonlinearly transformed vector, its covariance matrix and its expectation follow from the Monte Carlo estimate. This saves a considerable amount of derivatives to be computed, and errors of the linearization are avoided. The Monte Carlo method is therefore efficient. 
If the functions of the measurements are given by a sum of two or more random vectors with different multivariate distributions, the resulting distribution is generally not known. The Monte Carlo methods are then needed to obtain the covariance matrix and the expectation of the sum.","PeriodicalId":44569,"journal":{"name":"Journal of Geodetic Science","volume":null,"pages":null},"PeriodicalIF":1.3,"publicationDate":"2018-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"84446788","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Modeling the North American vertical datum of 1988 errors in the conterminous United States","authors":"X. Li","doi":"10.1515/jogs-2018-0001","DOIUrl":"https://doi.org/10.1515/jogs-2018-0001","url":null,"abstract":"Abstract A large systematic difference (ranging from −20 cm to +130 cm) was found between NAVD 88 (North AmericanVertical Datum of 1988) and the pure gravimetric geoid models. This difference not only makes it very difficult to augment the local geoid model by directly using the vast NAVD 88 network with state-of-the-art technologies recently developed in geodesy, but also limits the ability of researchers to effectively demonstrate the geoid model improvements on the NAVD 88 network. Here, both conventional regression analyses based on various predefined basis functions such as polynomials, B-splines, and Legendre functions and the Latent Variable Analysis (LVA) such as the Factor Analysis (FA) are used to analyze the systematic difference. Besides giving a mathematical model, the regression results do not reveal a great deal about the physical reasons that caused the large differences in NAVD 88, which may be of interest to various researchers. Furthermore, there is still a significant amount of no-Gaussian signals left in the residuals of the conventional regression models. On the other side, the FA method not only provides a better not of the data, but also offers possible explanations of the error sources. Without requiring extra hypothesis tests on the model coefficients, the results from FA are more efficient in terms of capturing the systematic difference. Furthermore, without using a covariance model, a novel interpolating method based on the relationship between the loading matrix and the factor scores is developed for predictive purposes. 
The prediction error analysis shows that about 3-7 cm precision is expected in NAVD 88 after removing the systematic difference.","PeriodicalId":44569,"journal":{"name":"Journal of Geodetic Science","volume":null,"pages":null},"PeriodicalIF":1.3,"publicationDate":"2018-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79218502","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Performance of GPS sidereal filters during a satellite outage","authors":"C. Atkins, M. Ziebart","doi":"10.1515/jogs-2017-0015","DOIUrl":"https://doi.org/10.1515/jogs-2017-0015","url":null,"abstract":"Abstract Sidereal filtering is the name of a technique used to reduce the effect of multipath interference on a GPS position time series associated with a static or quasi-static antenna. This article assesses the impact of a GPS satellite outage on the performance of a sidereal filter. Two different types of sidereal filter are tested: a position-domain sidereal filter (PDSF) and an observation-domain sidereal filter (ODSF). A satellite outage is simulated at two static receivers with contrasting antenna types and multipath environments. At both stations, the ODSF is more effective than a PDSF at removing multipath error over averaging intervals under around 200 seconds in length whether there is an outage or not. However, difference in the performance of the two types of sidereal filter was much more significant at the station more prone to multipath interference. The results are particularly relevant for applications where high-rate precise point positioning (PPP) is used for monitoring: If a PDSF is applied, then errors due to highfrequency multipath interference may still alias into the resulting position time series if a satellite outage occurs and possibly increasing the false alarm rate. 
In contrast, an ODSF is likely to perform better in such circumstances.","PeriodicalId":44569,"journal":{"name":"Journal of Geodetic Science","volume":null,"pages":null},"PeriodicalIF":1.3,"publicationDate":"2017-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89831863","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}