{"title":"From the radiance reflected by a scene to a digital picture : a compact model based simulator for image sensor design","authors":"Alix de Gouvello, L. Soulier, A. Dupret","doi":"10.2352/issn.2470-1173.2020.7.iss-329","DOIUrl":"https://doi.org/10.2352/issn.2470-1173.2020.7.iss-329","url":null,"abstract":"\u0000 In order to explore the design space of a new, potentially unconventional, sensor or to optimize sensor characteristics for a given computer vision application, an image acquisition process simulator has been designed. Its aim is to be simple and modular, yet complete and accurate\u0000 enough to match the physical phenomena involved. The approach has been described in this paper to highlight the different steps of the acquisition process and to explain the implementation choices and the hypotheses that were made. The simulator has been tested on images of point sources,\u0000 on simulated test patterns and on real high definition pictures and has proven realistic.\u0000","PeriodicalId":121190,"journal":{"name":"Imaging Sensors and Systems","volume":"359 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-01-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115958861","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Soft-prototyping imaging systems for oral cancer screening","authors":"J. Farrell, Zheng Lyu, Zhenyi Liu, H. Blasinski, Zhihao Xu, J. Rong, Feng Xiao, B. Wandell","doi":"10.2352/issn.2470-1173.2020.7.iss-212","DOIUrl":"https://doi.org/10.2352/issn.2470-1173.2020.7.iss-212","url":null,"abstract":"We are using image systems simulation technology to design a digital camera for measuring fluorescent signals; a first application is oral cancer screening. We validate the simulations by creating a camera model that accurately predicts measured RGB values for any spectral radiance.\u0000 Then we use the excitationemission spectra for different biological fluorophores to predict measurements of fluorescence of oral mucosal tissue under several different illuminations. The simulations and measurements are useful for (a) designing cameras that measure tissue fluorescence and\u0000 (b) clarifying which fluorophores may be diagnostic in identifying precancerous tissue.","PeriodicalId":121190,"journal":{"name":"Imaging Sensors and Systems","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-01-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129353881","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Event threshold modulation in dynamic vision spiking imagers for data throughput reduction","authors":"L. Cubero, A. Peizerat, D. Morche, G. Sicard","doi":"10.2352/issn.2470-1173.2020.7.iss-145","DOIUrl":"https://doi.org/10.2352/issn.2470-1173.2020.7.iss-145","url":null,"abstract":"\u0000 Dynamic vision sensors are growing in popularity for Computer Vision and moving scenes: its output is a stream of events reflecting temporal lighting changes, instead of absolute values. One of its advantages is fast detection of events, which are asynchronously read as spikes. However,\u0000 high event data throughput implies an increasing workload for the read-out. That can lead to data loss or to prohibitively large power consumption for constrained devices. This work presents a scheme to reduce data throughput by using near pixel pre-processing: less events codifying temporal\u0000 change and intensity slope magnitude are generated. Our simulated example depicts a data throughput reduction down to 14 %, in the case of the most aggressive version of our approach.\u0000","PeriodicalId":121190,"journal":{"name":"Imaging Sensors and Systems","volume":"72 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-01-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130277375","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"CAOS Smart Camera-based Robust Low Contrast Image Recovery over 90 dB Scene Linear Dynamic Range","authors":"N. Riza, M. Mazhar","doi":"10.2352/issn.2470-1173.2020.7.iss-226","DOIUrl":"https://doi.org/10.2352/issn.2470-1173.2020.7.iss-226","url":null,"abstract":"\u0000 Experimentally demonstrated for the first time is Coded Access Optical Sensor (CAOS) camera empowered robust and true white light High Dynamic Range (HDR) scene low contrast target image recovery over the full linear dynamic range. The 90 dB linear HDR scene uses a 16 element custom\u0000 designed test target with low contrast 6 dB step scaled irradiances. Such camera performance is highly sought after in catastrophic failure avoidance mission critical HDR scenarios with embedded low contrast targets.\u0000","PeriodicalId":121190,"journal":{"name":"Imaging Sensors and Systems","volume":"103 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-01-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121144665","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Anisotropic Subsurface Scattering Acquisition Through a Light Field Based Apparatus","authors":"Yurii Piadyk, Y. Lockerman, Cláudio T. Silva","doi":"10.2352/issn.2470-1173.2020.7.iss-225","DOIUrl":"https://doi.org/10.2352/issn.2470-1173.2020.7.iss-225","url":null,"abstract":"\u0000 Subsurface scattering gives a distinct look to many everyday objects. However, until now, systems to acquire subsurface scattering have assumed that the subsurface displacement and angle of scattering are completely independent of the angle of incident. While this independence substantially\u0000 simplifies the acquisition and rendering of materials where it holds true, it makes the acquisition of other materials impossible. In this paper, we demonstrate a system that can quickly acquire the full anisotropic subsurface scattering at a given point. Unlike many existing commercial acquisition\u0000 systems, our system can be assembled from off-the-shelf optical component and 3D printed/cut parts, making it accessible at a low price. We validate our device by measuring and fitting a dipole model for material exhibiting isotropic subsurface scattering as well as comparing real-world appearance\u0000 with rendering of anisotropic material under incident laser beam illumination.\u0000","PeriodicalId":121190,"journal":{"name":"Imaging Sensors and Systems","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-01-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131107440","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Characterization of camera shake","authors":"H. Dietz, W. V. Davis, P. Eberhart","doi":"10.2352/issn.2470-1173.2020.7.iss-228","DOIUrl":"https://doi.org/10.2352/issn.2470-1173.2020.7.iss-228","url":null,"abstract":"\u0000 In the early days of photography, emulsions were not very sensitive to light and lenses had relatively small apertures, so long exposures were needed and cameras were generally mounted on solid, stationary, supports. However, in modern use, cameras are nearly always hand-held –\u0000 and this introduces shake. Vibrations also are introduced by the complex moving systems within a camera and lens.\u0000 \u0000 Although many cameras now incorporate mechanisms for minimizing the detrimental impact of shake, and there is a standard test procedure to measure effectiveness of such\u0000 measures, there is surprisingly little published on the characterization of camera shake itself. The current work describes how inexpensive shake measurement hardware can be built, proposes a testing methodology for characterizing shake, and summarizes preliminary results obtained by measuring\u0000 shake under a variety of conditions.\u0000","PeriodicalId":121190,"journal":{"name":"Imaging Sensors and Systems","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-01-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120846001","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Expanding dynamic range in a single-shot image through a sparse grid of low exposure pixels","authors":"Leon Eisemann, J. Froehlich, Axel Hartz, J. Maucher","doi":"10.2352/issn.2470-1173.2020.7.iss-229","DOIUrl":"https://doi.org/10.2352/issn.2470-1173.2020.7.iss-229","url":null,"abstract":"\u0000 Camera sensors are physically restricted in the amount of luminance which can be captured at once. To achieve a higher dynamic range, multiple exposures are typically combined. This method comes with several disadvantages, like temporal or alignment aliasing. Hence, we propose a\u0000 method to preserve high luminance information in a single-shot image. By introducing a grid of highlight preserving pixels, which equals 1% of the total amount of pixels, we are able to sustain information directly incamera for later processing. To provide evidence, that this number of pixels\u0000 is enough for gaining additional dynamic range, we use a U-Net for reconstruction. For training, we make use of the HDR+ dataset, which we augment to simulate our proposed grid. We demonstrate that our approach can preserve high luminance information, which can be used for a visually convincing\u0000 reconstruction, close to the ground truth.\u0000","PeriodicalId":121190,"journal":{"name":"Imaging Sensors and Systems","volume":"33 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-01-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125081062","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Non-Uniform Integration of TDCI Captures","authors":"P. Eberhart, H. Dietz","doi":"10.2352/issn.2470-1173.2020.7.iss-330","DOIUrl":"https://doi.org/10.2352/issn.2470-1173.2020.7.iss-330","url":null,"abstract":"\u0000 TDCI (Time Domain Continuous Imaging) is a system for image capture and representation in which scene appearance is modeled as a set of continuous waveforms recording the changes in incident light at each pixel over time. Several of the advantages of TDCI are related to the ability\u0000 to set exposure parameters after-the-fact, rather than at the time of capture. These exposure parameters can be far more complicated than are physically realizable in a conventional camera, or reasonable to design without the ability to repeatedly expose the same scene. Previous TDCI experiments\u0000 have performed relatively traditional integration; this work explores a pair of related exposure behavior enabled by TDCI - the non-uniform integration of incident light into an image along the axes of both the time and space.\u0000 \u0000 This paper details a proof-of-concept implementation\u0000 which ingests video frames and re-exposes images from the resulting sampled light with user-specified spatially and temporally nonuniform gain.\u0000","PeriodicalId":121190,"journal":{"name":"Imaging Sensors and Systems","volume":"39 Suppl 1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-01-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132556179","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Calibration Empowered Minimalistic Multi-Exposure Image Processing Technique for Camera Linear Dynamic Range Extension","authors":"N. Riza, Nazim Ashraf","doi":"10.2352/issn.2470-1173.2020.7.iss-213","DOIUrl":"https://doi.org/10.2352/issn.2470-1173.2020.7.iss-213","url":null,"abstract":"\u0000 Proposed for the first time is a novel calibration empowered minimalistic multi-exposure image processing technique using measured sensor pixel voltage output and exposure time factor limits for robust camera linear dynamic range extension. The technique exploits the best linear\u0000 response region of an overall nonlinear response image sensor to robustly recover via minimal count multi-exposure image fusion, the true and precise scaled High Dynamic Range (HDR) irradiance map. CMOS sensor-based experiments using a measured Low Dynamic Range (LDR) 44 dB linear region for\u0000 the technique with a minimum of 2 multi-exposure images provides robust recovery of 78 dB HDR low contrast highly calibrated test targets.\u0000","PeriodicalId":121190,"journal":{"name":"Imaging Sensors and Systems","volume":"41 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-01-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125686539","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Sun tracker sensor for attitude control of space navigation systems","authors":"Antonio de la Calle-Martos, Rubén Gómez-Merchán, J. A. Leñero-Bardallo, Á. Rodríguez-Vázquez","doi":"10.2352/issn.2470-1173.2020.7.iss-231","DOIUrl":"https://doi.org/10.2352/issn.2470-1173.2020.7.iss-231","url":null,"abstract":"We report a sun tracker sensor for attitude control of space navigation systems. The sensor exploits the concept of asynchronous operation previously devised by the authors for those devices. Asynchronous luminance sensors optimize sun trackers operation because only illuminated\u0000 pixels are readout and can transmit data. This approach outperforms classic frame-based sun trackers in terms of bandwidth consumption, latency, and power consumption. The new sensor under study has been optimized for operation and interaction with other attitude control systems when it is\u0000 embarked. The sensor power consumption is quite reduced. To save power, its pixels enter automatically in standby mode after gauging illumination levels. The device operates with only 0.45V. The pixel matrix has been devised to optionally be directly powered by energy harvesting systems based\u0000 on photovoltaic diodes connected to a storage capacitor without a DC-DC converter.","PeriodicalId":121190,"journal":{"name":"Imaging Sensors and Systems","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-01-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129131227","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}