Wavefront reconstruction with the cone sensor
R. Clare, B. Engler, S. Weddell
2020 35th International Conference on Image and Vision Computing New Zealand (IVCNZ)
DOI: 10.1109/IVCNZ51579.2020.9290735
Published: 2020-11-25
Wavefronts of light from celestial objects are aberrated by Earth's evolving atmosphere, distorting the images captured by ground-based telescopes. The slope of the wavefront phase can be estimated by a pyramid wavefront sensor, which subdivides the complex field at the focal plane of the telescope, producing four images of the aperture. The cone wavefront sensor extends the pyramid sensor to an infinite number of sides, and produces an annulus of intensity rather than four images. We propose and compare the following methods for reconstructing the wavefront from the cone sensor's intensity measurements: (1) use the entire aperture image; (2) use only the pixels inside the intensity annulus; (3) create a map of slopes by subtracting the slice of the annulus 180 degrees opposite; (4) create x and y slopes by cutting out pseudo-apertures around the annulus; and (5) apply the inverse Radon transform to the intensity annulus converted to polar coordinates. We find via numerical simulation with atmospheric phase screens that methods (1) and (2) provide the best wavefront estimate, methods (3) and (4) yield the smallest interaction matrices, while method (5) allows direct reconstruction without an interaction matrix.
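The opposite-slice idea of method (3), and the interaction-matrix reconstruction that methods (1)-(4) rely on, can be illustrated with a minimal NumPy sketch. This is a hedged illustration only: the azimuthal sampling, the placeholder intensities, and the random interaction matrix `D` are all assumptions for demonstration, not values or code from the paper.

```python
import numpy as np

# Method (3) sketch: in polar coordinates, the cone sensor's intensity
# annulus yields a slope-like signal by subtracting, for each azimuthal
# sample, the sample 180 degrees opposite. Sizes here are illustrative.
n_theta = 8                                              # azimuthal samples around the annulus
intensity = np.array([3., 5., 7., 9., 1., 2., 4., 6.])   # placeholder annulus intensities

opposite = np.roll(intensity, n_theta // 2)              # the sample 180 degrees away
slopes = intensity - opposite                            # antisymmetric slope signal

# Interaction-matrix sketch: a (hypothetical) matrix D maps modal wavefront
# coefficients to slope measurements; the modes are recovered by least
# squares via the Moore-Penrose pseudoinverse, as in standard AO pipelines.
rng = np.random.default_rng(0)
D = rng.standard_normal((n_theta, 3))                    # 3 illustrative modes
coeffs = np.linalg.pinv(D) @ slopes                      # modal wavefront estimate
```

By construction, subtracting a half-turn roll makes the slope signal antisymmetric, so its sum over the annulus is zero; the pseudoinverse step is the generic least-squares reconstruction that any of the interaction-matrix methods would use once the slope (or intensity) vector is formed.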