{"title":"Journal of Display Technology publication information","authors":"","doi":"10.1109/JDT.2016.2626167","DOIUrl":"https://doi.org/10.1109/JDT.2016.2626167","url":null,"abstract":"Provides a listing of the editorial board, current staff, committee members and society officers.","PeriodicalId":15588,"journal":{"name":"Journal of Display Technology","volume":"12 11","pages":"C2-C2"},"PeriodicalIF":0.0,"publicationDate":"2016-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1109/JDT.2016.2626167","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"50281351","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Journal of Display Technology information for authors","authors":"","doi":"10.1109/JDT.2016.2626171","DOIUrl":"https://doi.org/10.1109/JDT.2016.2626171","url":null,"abstract":"These instructions give guidelines for preparing papers for this publication. Presents information for authors publishing in this journal.","PeriodicalId":15588,"journal":{"name":"Journal of Display Technology","volume":"12 11","pages":"C3-C3"},"PeriodicalIF":0.0,"publicationDate":"2016-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1109/JDT.2016.2626171","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"50281345","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"IEEE Copyright Form","authors":"","doi":"10.1109/JDT.2017.2653263","DOIUrl":"https://doi.org/10.1109/JDT.2017.2653263","url":null,"abstract":"Presents the copyright information for this issue of the publication.","PeriodicalId":15588,"journal":{"name":"Journal of Display Technology","volume":"12 11","pages":"1490-1491"},"PeriodicalIF":0.0,"publicationDate":"2016-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1109/JDT.2017.2653263","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"50399953","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Mono-Type TFT Logic Architectures for Low Power Systems on Panel Applications","authors":"Yifan Yang;Sungsik Lee;David Holburn;Arokia Nathan","doi":"10.1109/JDT.2016.2623651","DOIUrl":"https://doi.org/10.1109/JDT.2016.2623651","url":null,"abstract":"This paper introduces novel 7-T pseudo-CMOS inverter circuit architectures for enhancement mode and 6-T pseudo-CMOS architectures for depletion mode. The designs are built around a single type of TFT and consume less power than existing 4-T pseudo-CMOS circuits. In addition, they provide steep transfer curves, along with embedded control for compensating device parameter variations. An analysis of the transient behavior of the various circuit architectures is presented, providing quantitative insight into capacitive loading while taking into account the effects of overlap capacitances.","PeriodicalId":15588,"journal":{"name":"Journal of Display Technology","volume":"12 12","pages":"1528-1538"},"PeriodicalIF":0.0,"publicationDate":"2016-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1109/JDT.2016.2623651","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"50306564","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Optimized Backlight Power Saving Algorithm Using Joint Power-PSNR Characteristics Among Multiple Frames","authors":"Ting-Lan Lin;Kun-Hsien Tung;Guan-Jie Fang;Po-Yi Wu","doi":"10.1109/JDT.2016.2622063","DOIUrl":"https://doi.org/10.1109/JDT.2016.2622063","url":null,"abstract":"In this paper, an optimized backlight power saving algorithm using joint power-peak signal-to-noise ratio (PSNR) characteristics among multiple frames is proposed. In existing state-of-the-art methods, the optimal clipping-point decision is made with consideration of the current frame only, and the target frame PSNR is set the same for all frames; therefore, those methods all share the same power-PSNR performance. In the proposed work, we show that better power-PSNR performance can be achieved by assigning a different target frame PSNR to each frame. In the proposed algorithm, the decisions for the clipping points associated with all frames are made at once. The inefficiency ratio of each clipping point is evaluated and incorporated into the algorithm design. An optimization problem is formulated, and a solution-search procedure is proposed. Compared with state-of-the-art methods, the proposed work performs better in terms of both power and PSNR; the BD-PSNR and the BD-Power of the proposed work against all other methods are better for every tested video; the BD-PSNR can be as high as 19.80 dB on average, and the BD-Power can be as low as -26.25% on average.","PeriodicalId":15588,"journal":{"name":"Journal of Display Technology","volume":"12 12","pages":"1506-1513"},"PeriodicalIF":0.0,"publicationDate":"2016-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1109/JDT.2016.2622063","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"50306580","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Improving the Extraction Efficiency of Planar GaN-Based Blue Light-Emitting Diodes via Optimizing Indium Tin Oxide Nanodisc Arrays","authors":"Zhanxu Chen;Wenjie Liu;Wei Wan;Gengyan Chen;Baijun Zhang;Chongjun Jin","doi":"10.1109/JDT.2016.2621138","DOIUrl":"https://doi.org/10.1109/JDT.2016.2621138","url":null,"abstract":"We report an enhancement in the light emission of a GaN-based blue light-emitting diode (LED) with periodic indium tin oxide (ITO) nanodisc arrays. The periodic nanodisc arrays were designed on the basis of Bragg diffraction and the maximum transmittance of a bilayer-film model on a GaN epitaxial layer. In the experiment, the periodic ITO nanodisc arrays were fabricated by polystyrene nanosphere lithography. The electroluminescence intensity of the LEDs with periodic ITO nanodisc arrays is higher than that of a conventional planar GaN-based LED, with the greatest enhancement increasing the intensity by a factor of 2.08 at an operating current of 20 mA. Simulation results based on the three-dimensional finite-difference time-domain method show that the periodic ITO nanodisc arrays improve the light-extraction efficiency of the LED by increasing the critical angle of light output. This method would be valuable for the fabrication of high-efficiency planar GaN-based LEDs.","PeriodicalId":15588,"journal":{"name":"Journal of Display Technology","volume":"12 12","pages":"1588-1593"},"PeriodicalIF":0.0,"publicationDate":"2016-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1109/JDT.2016.2621138","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"50409447","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Configurable Controller for High-Resolution LED Display Systems","authors":"Joonho Hyun;Suk-Ju Kang;Young Hwan Kim","doi":"10.1109/JDT.2016.2621009","DOIUrl":"https://doi.org/10.1109/JDT.2016.2621009","url":null,"abstract":"Light-emitting diode (LED) display systems are implemented in various resolutions and aspect ratios to satisfy various display requirements. The requisite resolution and aspect ratio can be obtained by combining the appropriate LED panels. However, using different types of LED panels requires using different types of LED controllers to drive them, which constitutes a rather burdensome design constraint. In addition, a driving circuit inside an LED panel may be modified to improve the display quality by adopting a new driving scheme or new LED elements. Furthermore, it may sometimes be necessary to create a completely new LED panel. All of these situations require the design of new LED controllers. In this paper, we therefore propose a configurable LED controller that can be used with various types of LED panels without requiring hardware modification or redesign. The proposed LED controller consists of an application-specific instruction-set processor (ASIP) and custom hardware. The ASIP is software programmable, and the custom hardware is employed to obtain a high operating speed. Moreover, the function of the custom hardware can be configured through the ASIP software program to generate the address sequence needed to read the image bit-plane memory and, consequently, to accommodate the target LED panel. Therefore, the proposed LED controller provides the advantages of flexibility and high-speed operation. The proposed LED controller was implemented on a Xilinx Virtex-5, and it successfully drove LED panels having 64 × 96, 96 × 80, and 128 × 64 pixels at refresh rates of 3125, 2232, and 1736 Hz, respectively. An analysis of the operational capability indicates that, at the same refresh rate, the field-programmable gate array prototype of the proposed controller drives 1.44 times more pixels per unit time than a state-of-the-art commercial controller.","PeriodicalId":15588,"journal":{"name":"Journal of Display Technology","volume":"12 12","pages":"1594-1601"},"PeriodicalIF":0.0,"publicationDate":"2016-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1109/JDT.2016.2621009","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"50409448","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Shaping the Spatial and Temporal Noise Characteristics of Driving Signals for Driving AMOLED Display","authors":"Yik-Hing Fung;Yuk-Hee Chan","doi":"10.1109/JDT.2016.2618607","DOIUrl":"https://doi.org/10.1109/JDT.2016.2618607","url":null,"abstract":"In this paper, a digital driving technique for generating binary driving signals to display a digital video on an active-matrix organic light-emitting diode (AMOLED) panel is proposed. The ideal spatial and temporal noise characteristics of the driving signal for an AMOLED panel to display high-quality video are first suggested. Based on the suggested characteristics, a digital driving technique is proposed to generate the driving signals for an AMOLED panel. Simulation results show that the proposed driving technique can generate driving signals having the desired spatial and temporal noise characteristics and, accordingly, allows an AMOLED panel to display video of better quality, in terms of various measures, than that achieved with conventional driving techniques.","PeriodicalId":15588,"journal":{"name":"Journal of Display Technology","volume":"12 12","pages":"1652-1663"},"PeriodicalIF":0.0,"publicationDate":"2016-10-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1109/JDT.2016.2618607","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"50409444","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Overlapped-Area Removal-Based Image Interpolation for Head-Mounted Displays","authors":"Yong-Deok Ahn;Suk-Ju Kang","doi":"10.1109/JDT.2016.2617889","DOIUrl":"https://doi.org/10.1109/JDT.2016.2617889","url":null,"abstract":"In this paper, an overlapped-area removal-based image interpolation algorithm is proposed to significantly decrease the computational complexity of generating the barrel-distorted image for head-mounted display systems. Existing systems require high computational complexity to generate an output sequence with a high frame rate. To reduce the time complexity, the proposed algorithm uses a low-resolution image as input and increases the overall resolution based on an overlapped-area removal technique. Specifically, in the analysis process, it generates the coordinate mapping table after analyzing the overlapped area. In the synthesis process, the proposed algorithm performs the overlapped-area removal to reduce the computational complexity, and then it interpolates the input image. The simulation results show that the proposed algorithm largely maintains the image quality while greatly reducing the computation time. Specifically, it reduces the average computation time by up to 36.32% while improving the peak signal-to-noise ratio by up to 3.72 dB, compared to the benchmark algorithm.","PeriodicalId":15588,"journal":{"name":"Journal of Display Technology","volume":"12 12","pages":"1770-1776"},"PeriodicalIF":0.0,"publicationDate":"2016-10-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1109/JDT.2016.2617889","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"50305760","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Experimental investigation of facial expressions associated with visual discomfort: Feasibility study toward an objective measurement of visual discomfort based on facial expression","authors":"Seong-il Lee;Seung Ho Lee;Konstantinos N. Plataniotis;Yong Man Ro","doi":"10.1109/JDT.2016.2616419","DOIUrl":"https://doi.org/10.1109/JDT.2016.2616419","url":null,"abstract":"This paper aims to investigate facial expressions associated with visual discomfort induced by excessive screen disparities of stereoscopic three-dimensional (S3D) contents. For this purpose, we constructed a novel facial expression database regarding visual discomfort. Each viewer's face was captured while viewing realistic stereoscopic stimuli with screen disparities varying from 0° to 4.66°. The database consisted of face videos and associated comfort scores obtained by self-reporting, and it may be the only publicly available database of facial expressions associated with visual discomfort. Using the database, for the quantitative investigation, the facial expressions associated with visual discomfort were compared with basic emotional expressions that are well defined and universal. As a result, we observed that the emotional expression of \"stressed\" (i.e., anger or disgust) was highly correlated with the perceived visual discomfort (Pearson correlation coefficient: 0.91). Furthermore, the feasibility of measuring discomfort using facial expressions obtained while viewing S3D contents was verified. Experimental results showed that discomfort measurement using facial expression recognition could achieve a feasible performance (classification accuracy of 81.42%).","PeriodicalId":15588,"journal":{"name":"Journal of Display Technology","volume":"12 12","pages":"1785-1797"},"PeriodicalIF":0.0,"publicationDate":"2016-10-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1109/JDT.2016.2616419","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"50283227","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}