{"title":"Failure rate estimation in a dynamic environment","authors":"Gouno Evans, Guérineau Lise","doi":"10.1515/EQC-2015-6001","DOIUrl":"https://doi.org/10.1515/EQC-2015-6001","url":null,"abstract":"We present a method to assess the reliability of a material operating in a dynamic environment. The dynamic environment is represented as a sequence of shocks governed by a self-exciting point process. The time-to-failure of the material is assumed to have a piecewise exponential distribution. A Cox model is integrated to take into account the effect of the stress. Maximum likelihood estimates of the model parameters are obtained and their properties are studied through simulated data. An application on field data is displayed. Hypothesis testing procedures for environment effect are suggested.","PeriodicalId":360039,"journal":{"name":"Economic Quality Control","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130615694","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Forecasting Stock Market Trends","authors":"G. N. Nedeltcheva","doi":"10.1515/eqc-2015-6003","DOIUrl":"https://doi.org/10.1515/eqc-2015-6003","url":null,"abstract":"Abstract Forecasting is a difficult area of management. In this article, we deal with macroforecasting. The development and assessment of econometric methods for use in empirical finance and macroeconomics, with special emphasis on problems of prediction, is very important. Stock market analysis, also known as technical analysis, is the process of deriving patterns from price movement. In the literature, different methods have been applied in order to predict stock market returns. These methods can be grouped in four major categories: technical analysis methods, fundamental analysis methods, traditional time series forecasting, and machine learning methods. Technical analysts, known as chartists, attempt to predict the market by tracing patterns that come from the study of charts that describe historic data of the market. This study examines the effectiveness of technical analysis on US stocks for long range and shorter term.","PeriodicalId":360039,"journal":{"name":"Economic Quality Control","volume":"22 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131650555","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Proceedings of Flint International Statistics Conference Kettering University, June 24–28, 2014","authors":"B. Dimitrov, L. Gawarecki","doi":"10.1515/eqc-2014-0009","DOIUrl":"https://doi.org/10.1515/eqc-2014-0009","url":null,"abstract":"This special issue of Economic Quality Control (EQC) contains selected talks presented at the Flint International Statistics Conference (FISC) held at Kettering University in Flint, Michigan. This was a distinct international event for Kettering and for the institutions in Flint. More than 40 researchers gathered at FISC. Therewere participants fromSweden, France, Germany, Bulgaria, Italy,UnitedKingdom, Spain, SouthAfrica, Canada, Cyprus, Barbados, Georgia, Moldova, and USA. The title of the conference “Flint: One City, One Hundred Years under Variability” reveals its central subject: Statistical Methods and Studies of Historical Data. More speci cally, the participants discussed challenging problems in studies of multiple parallel series of historical data on various objects in the areas of public life, industrial and service development. Historical data are often comprised by huge worldwide arrays of BIG DATA and require not just speci c statistical methods for excerption of useful information and learning, but also development of speci c tools for data mining and reducing the data dimensions. Hence, to be properly addressed, many of the existing questions require collaboration between Computer Science and Statistics in data manipulations. The participants at FISC enjoyed discussions on a broad variety of topics including algorithmic and numerical data techniques, modeling using traditional analytical methodology, and applied data analysis. For the proceedings of FISC we selected a total of 23 articles classi ed in two categories: (i) Statistical Methodology, to be published in Economic Quality Control (De Gruyter). (ii) Computational Extensive Analysis, to be published in Serdica Journal of Computing (Bulgarian Academy of Sciences). This Special Issue of EQC contains six of the selected eleven articles in the category Statistical Methodology. The remaining paper will appear in the next issue. All manuscripts passed the scrutiny of a peer review process. We are proud to present to the attention of the EQC audience the following papers. ∙ Urban Planning for Change: Data and Projections in City of Flint Master Plans (1920, 1960 & 2013), by D. Walling, City Mayor of Flint, Michigan. Dayne Walling, the City Mayor of Flint, was the opening keynote speaker of the conference. We could not expect better support for our conference. In his article Mayor Walling analyzes three comprehensive historical master plans in order to assess available data, make projections and propose recommendations, focusing on historical statistical methods and urban planning, particularly as it relates to spatial data. The three plans are considered in three main sections: planning for population growth (1920), planning for regional rationalization (1960), and planning for a exible future (2013). Population projections, residential density patterns, and economic and employment data are reviewed and compared against the planning recommendations and realities. 
We value his visi","PeriodicalId":360039,"journal":{"name":"Economic Quality Control","volume":"51 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114678668","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Leverage Effect for Volatility with Generalized Laplace Error","authors":"F. Javed, K. Podgórski","doi":"10.1515/eqc-2014-0015","DOIUrl":"https://doi.org/10.1515/eqc-2014-0015","url":null,"abstract":"Abstract We propose a new model that accounts for the asymmetric response of volatility to positive (`good news') and negative (`bad news') shocks in economic time series – the so-called leverage effect. In the past, asymmetric powers of errors in the conditionally heteroskedastic models have been used to capture this effect. Our model is using the gamma difference representation of the generalized Laplace distributions that efficiently models the asymmetry. It has one additional natural parameter, the shape, that is used instead of power in the asymmetric power models to capture the strength of a long-lasting effect of shocks. Some fundamental properties of the model are provided including the formula for covariances and an explicit form for the conditional distribution of `bad' and `good' news processes given the past – the property that is important for statistical fitting of the model. Relevant features of volatility models are illustrated using S&P 500 historical data.","PeriodicalId":360039,"journal":{"name":"Economic Quality Control","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115070188","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Discrete Pareto Distributions","authors":"Amrutha Buddana, T. Kozubowski","doi":"10.1515/eqc-2014-0014","DOIUrl":"https://doi.org/10.1515/eqc-2014-0014","url":null,"abstract":"Abstract We review several common discretization schemes and study a particular class of power-tail probability distributions on integers, obtained by discretizing continuous Pareto II (Lomax) distribution through one of them. Our results include expressions for the density and cumulative distribution functions, probability generating function, moments and related parameters, stability and divisibility properties, stochastic representations, and limiting distributions of random sums with discrete-Pareto number of terms. We also briefly discuss issues of simulation and estimation and extensions to multivariate setting.","PeriodicalId":360039,"journal":{"name":"Economic Quality Control","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121137617","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Inferences of the Lifetime Performance Index with Lomax Distribution Based on Progressive Type-II Censored Data","authors":"M. A. Mahmoud, R. M. El-Sagheer, A. Soliman, A. Abd Ellah","doi":"10.1515/eqc-2014-0005","DOIUrl":"https://doi.org/10.1515/eqc-2014-0005","url":null,"abstract":"Abstract Effective management and the assessment of quality performance of products is important in modern enterprises. Often, the business performance is measured using the lifetime performance index CL to evaluate the potential of a process, where L is a lower specification limit. In this paper the maximum likelihood estimator (MLE) of CL is derived based on progressive Type II sampling and assuming the Lomax distribution. The MLE of CL is then utilized to develop a new hypothesis testing procedure for given value of L. Moreover, we develop the Bayes estimator of CL assuming the conjugate prior distribution and applying the squared-error loss function. The Bayes estimator of CL is then utilized to develop a credible interval again for given L. Finally, we propose a Bayesian test to assess the lifetime performance of products and give two examples and a Monte Carlo simulation to assess and compare the two ML-approach with the Bayes-approach with respect to the lifetime performance index CL.","PeriodicalId":360039,"journal":{"name":"Economic Quality Control","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128034291","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Posterior Control Chart for Process Average under Conjugate Prior Distribution","authors":"Sharada V. Bhat, Kailas D. Gokhale","doi":"10.1515/eqc-2014-0003","DOIUrl":"https://doi.org/10.1515/eqc-2014-0003","url":null,"abstract":"Abstract Control charts for the process average play an important role in process control. In the presence of prior information, using Bayesian approach, one can construct posterior control chart for process average (X̅ posterior control chart). The control limits of the proposed chart are derived under the assumption that the process average has a conjugate prior distribution. Both cases – variance is known and variance is unknown – are discussed. When the variance is unknown, the control limits are constructed using Unbiased Estimator (UE) and Maximum Likelihood Estimator (MLE) of the variance. The power and Average Run Length (ARL) of the proposed chart are obtained. The newly constructed X̅ posterior control chart is compared with a few other X̅ charts described in the literature.","PeriodicalId":360039,"journal":{"name":"Economic Quality Control","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133246360","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Bayesian Reliability Sampling Plans under the Conditions of Rayleigh-Inverse-Rayleigh Distribution","authors":"S. Kalaiselvi, A. Loganathan, R. Vijayaraghavan","doi":"10.1515/eqc-2014-0004","DOIUrl":"https://doi.org/10.1515/eqc-2014-0004","url":null,"abstract":"Abstract Reliability sampling plans are used to take decisions on the disposition of lots based on life testing of products. Such plans are developed taking into the consideration of relevant probability distributions of the lifetimes of the products under testing. When the quality of products varies over lots, then a predictive distribution of the lifetime should be used to design sampling plans. In this paper, designing of reliability single sampling plan based on the predictive distribution of the lifetime is considered. It is assumed that sampling inspection is carried out through life testing of products with hybrid censoring. The predictive distribution is obtained assuming that the probability distribution of the lifetime of the product is Rayleigh and the process parameter has an inverse-Rayleigh prior. Plan parameters are determined using hypergeometric, binomial and Poisson probabilities, providing protection to both producer as well as consumer.","PeriodicalId":360039,"journal":{"name":"Economic Quality Control","volume":"245 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114580751","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Assessment of Traditional Demerits and a New Ordinal Alternative","authors":"J. Chimka, Qilu Wang","doi":"10.1515/eqc-2013-0014","DOIUrl":"https://doi.org/10.1515/eqc-2013-0014","url":null,"abstract":"Abstract Demerits control is the traditional tool for monitoring defects of different severity with a single chart. It requires arbitrary assignment of numerical values to the ordinal scale and therefore is theoretically flawed. We assess the error rates of demerits control compared to an alternative based on the proportional odds model only to find that it is not more powerful than traditional demerits.","PeriodicalId":360039,"journal":{"name":"Economic Quality Control","volume":"36 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114620976","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Bivariate Confluent Hypergeometric Series Distribution and Some of Its Properties","authors":"C. Kumar","doi":"10.1515/eqc-2013-0009","DOIUrl":"https://doi.org/10.1515/eqc-2013-0009","url":null,"abstract":"Abstract In this paper we develop a bivariate version of the confluent hypergeometric series distribution through its probability generating function and study some of its properties by deriving its probability mass function, factorial moments, probability generating functions of its marginal and conditional distributions and recursion formulae for probabilities, raw moments and factorial moments. Further certain mixtures and limiting cases of this distribution are also obtained.","PeriodicalId":360039,"journal":{"name":"Economic Quality Control","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132436428","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}