{"title":"X̅ Charts with Variable Control and Warning Limits","authors":"Shashibhushan B. Mahadik","doi":"10.1515/eqc-2013-0019","DOIUrl":"https://doi.org/10.1515/eqc-2013-0019","url":null,"abstract":"Abstract Adaptive control charts deliver significantly better performances than that of static control charts. However, the frequent switches between sampling interval lengths and/or between sample sizes can be a complicating factor in the administration of these charts. This factor is totally absent for the adaptive chart proposed in this paper. The proposed chart is an X̅ chart with variable control and warning limits (VCWL). Expressions for the performance measures for this chart are developed. The methods presented are general and can be applied to other Shewhart control charts. The performances of VCWL X̅ charts are compared numerically with that of static X̅ charts. It is observed that the idea of VCWL significantly improves the performances of X̅ charts in detecting small to moderate shifts in the process mean without affecting that in detecting large shifts.","PeriodicalId":360039,"journal":{"name":"Economic Quality Control","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122906487","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Negative Binomial Marshall–Olkin Rayleigh Distribution and Its Applications","authors":"K. K. Jose, Remya Sivadas","doi":"10.1515/eqc-2015-0009","DOIUrl":"https://doi.org/10.1515/eqc-2015-0009","url":null,"abstract":"Abstract A generalization of the Marshall–Olkin family of distributions is developed using negative binomial compounding instead of geometric compounding where addition is replaced by minimum of a random number of observations X1,X2,...,XN. Here, we consider the Rayleigh distribution and extend it to obtain a Negative Binomial Marshall–Olkin Rayleigh Distribution. Various properties of the new family are investigated. Maximum likelihood estimates are obtained. The use of the model in lifetime modeling is established by fitting it to a real data set on remission times of bladder cancer patients. Also we try to develop a reliability test plan for acceptance or rejection of a lot of products submitted for inspection with lifetimes governed by this distribution.","PeriodicalId":360039,"journal":{"name":"Economic Quality Control","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121180069","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Bayesian Analysis of the Brown–Proschan Model","authors":"Dinh Tuan Nguyen, Y. Dijoux, M. Fouladirad","doi":"10.1515/eqc-2015-6002","DOIUrl":"https://doi.org/10.1515/eqc-2015-6002","url":null,"abstract":"Abstract The paper presents a Bayesian approach of the Brown–Proschan imperfect maintenance model. The initial failure rate is assumed to follow a Weibull distribution. A discussion of the choice of informative and non-informative prior distributions is provided. The implementation of the posterior distributions requires the Metropolis-within-Gibbs algorithm. A study on the quality of the estimators of the model obtained from Bayesian and frequentist inference is proposed. An application to real data is finally developed.","PeriodicalId":360039,"journal":{"name":"Economic Quality Control","volume":"22 15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116553958","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An Application of EM Test for the Bayesian Change Point Problem","authors":"A. Variyath, C. Vasudevan","doi":"10.1515/eqc-2013-0013","DOIUrl":"https://doi.org/10.1515/eqc-2013-0013","url":null,"abstract":"Abstract In any manufacturing process, identification of changes in the process conditions is of great interest. Recently, a Bayesian approach for the identification of the change in process mean was proposed assuming that the response of interest follow an exponential family distribution. In this approach, the expectation – maximization (EM) algorithm was used for estimating the process parameters. In general, the EM algorithm is computationally intensive and the optimality depends on the initial values of the parameters chosen. We extend the idea of the EM test for homogeneity to extend this Bayesian approach to the change point problem. Our simulations studies show that the developed EM test procedure converges at a faster rate than the original EM approach. Our studies also show that the EM test with binomial prior distribution leads to solutions very close to the true values. We have applied our approach to two case examples.","PeriodicalId":360039,"journal":{"name":"Economic Quality Control","volume":"56 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128381758","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"More Equal and Poorer, or Richer but More Unequal?","authors":"F. Greselin","doi":"10.1515/eqc-2014-0011","DOIUrl":"https://doi.org/10.1515/eqc-2014-0011","url":null,"abstract":"Abstract After a hundred years of contributions, the debate about how to measure inequality is still open. We provide a brief review of the literature, showing that inequality has been assessed through a relative approach, from Gini's pioneering article [Atti del Reale Istituto Veneto di Scienze, Lettere ed Arti 73 (1914), no. 2, 1203–1248]. Analyzing historical census data for Flint and other American cities, we observe how mean values of income in population subgroups capture the shape of the distributions of income and their comparisons state the overall situation of inequality. Namely, we adopted the approach introduced in [Statistica & Applicazioni 5 (2007), no. 1, 3–27] to assess inequality. Our first findings show that prosperity is distributed unevenly across America's metropolitan areas. More interestingly, unbalanced wealth can be related to other concomitant facts [The New Geography of Jobs, Houghton Mifflin Harcourt, New York, 2012], such as population growth, income growth, unemployment rates and women participation to the labor force. Gaps between more and less educated areas were modest 40 years ago, but they have become quite large nowadays [Cities and skills, technical report, National Bureau of Economic Research, 1994].","PeriodicalId":360039,"journal":{"name":"Economic Quality Control","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122993398","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Double Acceptance Sampling Plan for Time-Truncated Life Tests Based on Half Normal Distribution","authors":"A. Al-Omari, Amjad D. Al-Nasser, Fatima S. Gogah","doi":"10.1515/eqc-2016-0004","DOIUrl":"https://doi.org/10.1515/eqc-2016-0004","url":null,"abstract":"Abstract In this work, we investigate a double acceptance sampling plan (DASP) based on truncated life tests when the lifetime of a product follows the half normal distribution. By fixing the consumer’s confidence level, the minimum sample sizes of the first and second samples needful to assert the specified mean life are calculated. The operating characteristic values and the minimum ratios of the mean life to the specified life are also analyzed. Several important tables are provided and a numerical example is given to illustrate the results.","PeriodicalId":360039,"journal":{"name":"Economic Quality Control","volume":"86 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114829597","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Control Chart for Autocorrelated Processes with Heavy Tailed Distributions","authors":"Thaga Keoagile","doi":"10.1515/EQC.2008.197","DOIUrl":"https://doi.org/10.1515/EQC.2008.197","url":null,"abstract":"Standard control charts are constructed under the assumption that the observations taken from the process of interest are independent over time; however, in practice the observations in many cases are actually correlated. This paper considers the problem of monitoring a process in which the observations can be represented as a first-order autoregressive model following a heavy tailed distribution. We propose a chart based on computing the control limits using the process mean and the standard error of the least absolute deviation for the case when the process quality characteristics follows a heavy tailed t-distribution. This chart has narrow control limits since the standard error of the least absolute deviation is smaller than that of the ordinary least square estimator in the case of heavy tailed distributions.","PeriodicalId":360039,"journal":{"name":"Economic Quality Control","volume":"66 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124857534","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Likelihood Inference for the Lifetime Performance Index under Progressive Type-II Censoring","authors":"B. Laumen, E. Cramer","doi":"10.1515/eqc-2015-0008","DOIUrl":"https://doi.org/10.1515/eqc-2015-0008","url":null,"abstract":"Abstract Likelihood inference for the lifetime performance index CL is discussed in the presence of progressive censoring. We illustrate that many results available in the literature can be traced back to the exponential case using appropriate transformations of the data. Further, we present procedures to compute the MLE of CL under the assumption of gamma distributed lifetimes. The results are illustrated by a simulation study and applied to a data set.","PeriodicalId":360039,"journal":{"name":"Economic Quality Control","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125039196","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Transmuted Erlang-Truncated Exponential Distribution","authors":"I. Okorie, Anthony C. Akpanta, Johnson Ohakwe","doi":"10.1515/eqc-2016-0008","DOIUrl":"https://doi.org/10.1515/eqc-2016-0008","url":null,"abstract":"Abstract This article introduces a new lifetime distribution called the transmuted Erlang-truncated exponential (TETE) distribution. This new distribution generalizes the two parameter Erlang-truncated exponential (ETE) distribution. Closed form expressions for some of its distributional and reliability properties are provided. The method of maximum likelihood estimation was proposed for estimating the parameters of the TETE distribution. The hazard rate function of the TETE distribution can be constant, increasing or decreasing depending on the value of the transmutation parameter Φ [ - 1 , 1 ] ${Phi[-1,1]}$ ; this property makes it more reasonable for modelling complex lifetime data sets than the ETE distribution that exhibits only a constant hazard rate function. The goodness of fit of the TETE distribution in analyzing real life time data was investigated by comparing its fit with that provided by the ETE distribution and the results show that the TETE distribution is a better candidate for the data. The stability of the TETE distribution parameters was established through a simulation study.","PeriodicalId":360039,"journal":{"name":"Economic Quality Control","volume":"69 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130022255","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Bayesian View on Detecting Drifts by Nonparametric Methods","authors":"Steland Ansgar","doi":"10.1515/EQC.2002.177","DOIUrl":"https://doi.org/10.1515/EQC.2002.177","url":null,"abstract":"We study a nonparametric sequential detection procedure, which aims at detecting the first time point where a drift term appears in a stationary process, from a Bayesian perspective. The approach is based on a nonparametric model for the drift, a nonparametric kernel smoother which is used to define the stopping rule, and a performance measure which determines for each smoothing kernel and each given drift the asymptotic accuracy of the method. We look at this approach by parameterizing the drift and putting a prior distribution on the parameter vector. We are able to identify the optimal prior distribution which minimizes the expected performance measure. Consequently, we can judge whether a certain prior distribution yields good or even optimal asymptotic detection. We consider several important special cases where the optimal prior can be calculated explicitly.","PeriodicalId":360039,"journal":{"name":"Economic Quality Control","volume":"115 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114945193","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}