
2003 Quality & Productivity Research Conference

IBM T. J. Watson Research Ctr., Yorktown Heights, NY

May 21-23, 2003


Contributed Paper Sessions (with Abstracts and Papers)



1. Industrial Applications

Session Chair: Andre Pinho, University of Wisconsin


1. "The Six Sigma Approach to Reduce the Product Damages in the Warehouse", Nihal M. Erginel, Anadolu University, Turkey, and Nimetullah Burnak, Osmangazi University, Turkey. Paper

Abstract: The most common end-customer complaint about defects concerns product damage in the warehouse where the products are packaged, stored, and shipped. Six sigma methodology is applied to eliminate the complaints about product damage. The factors assumed to influence the occurrence of damage are identified using a process map and a cause-and-effect diagram, and their priorities are assigned via a cause-and-effect matrix. It is concluded that product damage can occur in either of two phases. The first phase is packaging, for which packaging materials and the packaging method are considered. The second phase is transportation, for which handling equipment is analyzed.

DOE is conducted separately for each of the phases, and the results are discussed.


2. "Improvement of Heating Component Production Process Using 6 Sigma Methodology", Berna ATA, Korel Elektronik, Eskisehir, Turkey, and Nimetullah Burnak, Osmangazi University, Turkey.

Abstract: Product improvement and productive manufacturing are two of the key concerns of companies that want to compete; companies must consider both productivity and efficiency. Under severe competitive conditions, increasing market share requires reducing or eliminating variation in the production process and improving product quality. 6 sigma methodology is a useful tool for improving the production process. In this study, the 6 sigma improvement of a heating component used for defrosting in a cooling product is discussed, covering the steps from problem definition to the results obtained, such as the type of fiberglass, the diameter of the wire, etc.

3. "Assessing Quality of Asphalt Paving Jobs to Determine Contractor Pay", Robin C. Wurl and James R. Lundy, Oregon State University. Paper

Abstract: A quality assessment procedure for asphalt mix production is developed for the Oregon Department of Transportation to link contractor pay to expected field performance. The methodology, based on a loss function, encourages the production of asphalt mixes that conform to specifications with minimum variability. The mix has multiple quality characteristics, not all of which are equally important, and the specifications may change during the lifetime of a job.
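The abstract does not state the specific loss function; a common choice for pay-factor schemes of this kind is the quadratic (Taguchi-type) loss, sketched here only as background:

    \[
    L(y) = k\,(y - T)^2, \qquad \mathrm{E}[L(Y)] = k\left[\sigma^2 + (\mu - T)^2\right],
    \]

so that expected loss, and hence any pay adjustment built on it, penalizes both an off-target mean and excess variability.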

4. "Estimating the efficiency of collaborative problem-solving, with applications to chip design", Mary Y. Lanzerotti - Wisniewski, E. Yashchin, R.L. Franch, D.P. Conrady, G. Fiorenza, I.C. Noyan, IBM Research
Paper

Abstract: We present a statistical framework to address questions that arise in general problems involving collaboration of several contributors. One instance of this problem occurs in the complex process of designing ultralarge-scale-integration semiconductor chips. In cases involving complex designs, the computer-aided design tools are unable to create designs that satisfy specified project criteria, and a number of questions arise about how to measure the effectiveness of systematic external intervention that is implemented with some supplemental algorithm. As an example, we apply the statistical framework to the problem of routing a functional unit of the IBM POWER4 microprocessor.


2. Process Control I

Session Chair: Norma Leyva-Estrada, Iowa State University


5. "Using Statistical Process Control To Monitor Active Managers", Thomas K. Philips, Paradigm Asset Management, Emmanuel Yashchin, IBM Research, David M. Stein, Parametric Portfolio Associates Paper

Abstract: Investors who are invested in (or bear responsibility for) many active portfolios face a resource allocation problem: to which products should they direct their attention and scrutiny? Ideally they will focus on portfolios that appear to be in trouble, but these are not easily identified using classical methods of performance evaluation. In fact, it is often claimed that it takes forty years to determine whether an active portfolio outperforms its benchmark. The claim is fallacious. In this article, we show how a CUSUM process control scheme can be used to reliably detect flat-to-the-benchmark performance in forty months, and underperformance faster still. By rapidly detecting underperformance, the CUSUM allows investors to focus their attention on potential problems before they have a serious impact on the performance of the overall portfolio. The CUSUM procedure has proved to be robust to the distribution of excess returns, allowing its use without modification in almost any asset class, including equities, fixed income, currencies, and hedge funds, and it is currently being used to monitor over $500 billion in actively managed assets.
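As background, a minimal one-sided CUSUM on standardized monthly excess returns might look like the sketch below; the reference value k and decision limit h are illustrative placeholders, not the calibration used by the authors.

    # Illustrative one-sided CUSUM on monthly excess returns (not the authors'
    # calibrated scheme; the reference value k and decision limit h are placeholders).
    import random

    def cusum_underperformance(excess_returns, k=0.25, h=5.0):
        """Signal when cumulative evidence of underperformance exceeds h.

        excess_returns: sequence of (portfolio - benchmark) returns, in
        standardized units (e.g., divided by an estimate of tracking error).
        """
        s = 0.0
        for t, x in enumerate(excess_returns, start=1):
            # Accumulate evidence that the mean excess return is below -k.
            s = max(0.0, s - x - k)
            if s > h:
                return t  # month at which the chart signals
        return None  # no signal

    # Example: a manager whose true mean excess return is slightly negative.
    random.seed(1)
    returns = [random.gauss(-0.1, 1.0) for _ in range(120)]
    print(cusum_underperformance(returns))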

6. "Phase I Monitoring of Nonlinear Profiles", James D. Williams, William H. Woodall and Jeffrey B. Birch, Virginia Polytechnic Institute & State University. Paper

Abstract: In many quality control applications, a single measurement is insufficient to characterize the quality of a produced item. In an increasing number of cases, a profile (or signature) is required, often consisting of several measurements of the item across time or space. Such profiles can frequently be modeled using a linear or a nonlinear regression model. Recent research has developed multivariate T2 control charts for monitoring the coefficients in a simple linear regression model of a profile. However, little work has been done to address the monitoring of profiles that can be represented by a parametric nonlinear regression model. Here we extend the use of the T2 control chart to monitor the coefficients of the nonlinear regression fits to the profile data. We give several general approaches to the formulation of the T2 statistics and the associated upper control limits in Phase I applications. Finally, these approaches are illustrated using the vertical board density profile data used by Walker and Wright (JQT, 2002).
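A minimal Phase I sketch of the general idea, assuming a hypothetical exponential profile model and the ordinary sample-covariance form of T2 (the paper considers several formulations and control limits), is given below.

    # Fit the same nonlinear model to each profile and compute a Hotelling T^2
    # statistic on the estimated coefficients.  The model form and the
    # sample-covariance T^2 are illustrative, not the paper's specific choices.
    import numpy as np
    from scipy.optimize import curve_fit

    def model(x, a, b, c):
        # Hypothetical profile model: an exponential-plus-offset curve.
        return a + b * np.exp(-c * x)

    def phase1_t2(x, profiles):
        """profiles: array of shape (m, len(x)), one row per sampled item."""
        betas = np.array([curve_fit(model, x, y, p0=[1.0, 1.0, 1.0])[0]
                          for y in profiles])
        bbar = betas.mean(axis=0)
        Sinv = np.linalg.inv(np.cov(betas, rowvar=False))
        return np.array([(b - bbar) @ Sinv @ (b - bbar) for b in betas])

    # Simulated example: 20 profiles from the same curve plus noise.
    rng = np.random.default_rng(0)
    x = np.linspace(0.1, 5.0, 30)
    profiles = np.array([model(x, 2.0, 1.5, 0.8) + rng.normal(0, 0.05, x.size)
                         for _ in range(20)])
    print(np.round(phase1_t2(x, profiles), 2))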

7. "On the design of Bayesian Control Charts using Markov Decision Processes", Bianca M. Colosimo, Politecnico di Milano (Italy)

Abstract: Following the pioneering work of Girshick and Rubin (1952), several studies showed that traditional non-Bayesian control charts are not optimal from an economic point of view. Despite the "significant theoretical value of Girshick and Rubin's model" (Montgomery, 2001), this approach has received little attention because of the computational complexity required to derive the optimal control rule.
During the last decade, Bayesian control charts have received renewed attention in the framework of adaptive control charts (Tagaras 1994, 1996, 1998), and dynamic programming was presented as a viable way to overcome the computational difficulties. Following the work of Smallwood and Sondik (1973), this paper re-addresses the problem of designing a Bayesian control chart using algorithms recently developed in the framework of Partially Observable Markov Decision Processes (POMDPs).

8. "Development of a Process Control Scheme for Reduction in Weight Variation of Capsules", Prasun Das, Indian Statistical Institute.

Abstract: The study aimed to investigate the different sources of weight variation of capsules and thereby suggest a suitable control scheme to improve productivity in a pharmaceutical plant. Data from the capsule filling process were analysed under a mixed-effect cross-nested model. The process capabilities were estimated using Clements' method, followed by the development of a routine setting and control procedure. In one case, Taguchi's two-step optimization procedure was also suggested.


3. Design of Experiments

Session Chair: Alexandra Kapatou, University of Michigan


9. "Construction of Optimal Constrained Permutation Mixture Experiment Designs", Ben Torsney and Yousif A. Jaha, University of Glasgow. Paper

Abstract: We construct 'D-optimal' constrained permutation mixture experiment designs. Design points are permutations of a single set of (non-negative) proportions summing to 1, thereby meeting the conditions satisfied by 'mixture' variables. Two types of constraint on the proportions are considered: order constraints (for finding local maxima), and common lower and/or upper bound constraints. In both cases the constrained optimisation problem can be transformed to one in which optimisation is with respect to a new set of proportions or convex weights. A multiplicative algorithm is used to optimise the D-criterion over the proportions under a Scheffe model. Results extend to blocking and other models.
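For background, a generic multiplicative-algorithm iteration for D-optimal design weights over a fixed candidate set is sketched below; the candidate set (permutations of one vector of proportions under a first-degree Scheffe model) is illustrative and does not reproduce the authors' handling of order or bound constraints.

    # Generic multiplicative algorithm for D-optimal weights on a candidate set.
    import numpy as np
    from itertools import permutations

    def d_optimal_weights(F, iters=500):
        """F: candidate design matrix, one row f(x_i) per candidate point."""
        n, p = F.shape
        w = np.full(n, 1.0 / n)
        for _ in range(iters):
            M = F.T @ (w[:, None] * F)                            # information matrix M(w)
            d = np.einsum('ij,jk,ik->i', F, np.linalg.inv(M), F)  # variance function d_i(w)
            w = w * d / (w @ d)                                   # multiplicative update
        return w

    # Example: 3-component first-degree Scheffe model over permutations of (0.6, 0.3, 0.1).
    F = np.array(sorted(set(permutations((0.6, 0.3, 0.1)))))
    print(np.round(d_optimal_weights(F), 3))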

10. "New Results about Randomization and Split-Plotting", James M Lucas. Paper

Abstract: New results include the fact that the Kiefer-Wolfowitz equivalence theorem does not hold for split-plot experiments: the D- and G-criteria give different designs. Computer approaches to design must recognize this. New design examples will be given. This is joint work with Peter Goos.

I will also discuss "SUPER-EFFICIENT" experiments and give catalogues of optimum blocking for one and two hard-to-change factors. This is joint work with Frank Anbari, Derek Webb and John Borkowski (who is attending and presenting designs generated using a genetic algorithm). Examples of biases from running RNR (Randomized Not Reset) experiments (in which the factor is not set to a neutral level and then reset when successive runs have the same level) will be presented. This is joint work with Jeetu Ganju.


11. "Post-Fractionated Strip-Block Designs: A Tool for Robustness Applications and Multistage Processes", Carla A. Vivacqua, University of Wisconsin, S¿ren Bisgaard, University of Massachussets, Harold J. Steudel, University of Wisconsin Paper

Abstract: This paper presents a novel experimental arrangement, called post-fractionated strip-block design, which represents a cost-effective method to gather knowledge and fast responses for guiding the design of robust products while reducing product development expenses. It can also be applied in the improvement of multistage processes and in studies involving hard-to-change factors.


12. "Computer Experiments: Designs to Achieve Multiple Objectives", Leslie M. Moore, Los Alamos National Laboratory

Abstract: Orthogonal arrays, or highly fractionated factorial designs, are suggested for computer experiments in which goals may include sensitivity analyses or response surface modeling. Latin hypercube samples, possibly selected by a space-filling criterion, are commonly used when Gaussian spatial processes are the modeling paradigm of choice or uncertainty analysis is the objective. Designs with more than 2 or 3 levels per input, or with densely covered 1- or 2-dimensional projections, are also desirable. Competing experiment objectives will be discussed, and experiments are suggested that combine designs with different properties.
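As a point of reference, a minimal Latin hypercube sample in d inputs can be generated as follows (a generic construction, not tied to any particular space-filling criterion discussed in the talk):

    # Latin hypercube sample: each input's range is cut into n equal strata and
    # each stratum is sampled exactly once, with the strata matched across
    # inputs by independent random permutations.
    import numpy as np

    def latin_hypercube(n, d, rng=None):
        rng = np.random.default_rng(rng)
        u = rng.uniform(size=(n, d))            # random position within each stratum
        perms = np.array([rng.permutation(n) for _ in range(d)]).T
        return (perms + u) / n                  # points in [0, 1)^d

    print(latin_hypercube(5, 2, rng=0))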



4. Process Control II

Session Chair: Daniel R. Jeske, Lucent


13. "Control Charts for Binomial Proportions", John Aleong, University of Vermont

Abstract: The problem of the control chart for the binomial proportion (the p-chart) is revisited. The p-chart is based on the standard Wald confidence interval. The erratic behavior of the coverage probability of the Wald confidence interval is discussed by Blyth and Still (1983), Agresti and Coull (1998), Santner (1998) and others. Recent results by Brown, Cai, and DasGupta (2002, 2001, 2000) have shown theoretically, using Edgeworth expansions and simulations, the eccentric behavior of the Wald confidence interval for various values of p and n. Using the coverage probabilities and expected lengths of the intervals, they compared the Wald confidence interval with other intervals. Brown et al. recommended the Wilson interval (Wilson 1927) and the Jeffreys prior interval for small n, and the Agresti-Coull (1998) interval for large n. These results have consequences for the p-charts in current use. P-charts based on the results of Brown et al. will be presented and illustrated, with recommendations.
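To illustrate the difference the interval choice makes for chart limits, the sketch below computes 3-sigma-style limits from the Wald and Wilson intervals for a given subgroup size; the plug-in p-hat and z = 3 are assumptions of the sketch, not the authors' recommendations.

    # Compare p-chart limits from the Wald and Wilson intervals.
    from math import sqrt

    def wald_limits(phat, n, z=3.0):
        half = z * sqrt(phat * (1 - phat) / n)
        return max(0.0, phat - half), min(1.0, phat + half)

    def wilson_limits(phat, n, z=3.0):
        center = (phat + z**2 / (2 * n)) / (1 + z**2 / n)
        half = (z / (1 + z**2 / n)) * sqrt(phat * (1 - phat) / n + z**2 / (4 * n**2))
        return max(0.0, center - half), min(1.0, center + half)

    print(wald_limits(0.05, 50))    # Wald lower limit is truncated at 0
    print(wilson_limits(0.05, 50))  # Wilson limits stay inside (0, 1)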

14. "A Transition Matrix Representation of The Algorithmic Statistical Process Control Procedure with Bounded Adjustments and Monitoring", Changsoon Park, Chung-Ang University Paper

Abstract: In processes where both an irremovable disturbance and a special cause are assumed to exist, procedures for adjustment and monitoring are necessary to control the process level close to target. Such a combined procedure for adjustment and monitoring is termed an algorithmic statistical process control (ASPC) procedure.
A transition matrix representation is developed to derive the properties of the ASPC procedure under an IMA(0,1,1) disturbance model. States of the transition matrix are constructed by classifying the ranges of the two control statistics, the predicted deviation and the EWMA chart statistic, into a certain number of subintervals, with the values in each subinterval represented by a discrete value. The properties of the ASPC procedure are then derived by operations on the transition matrix. Each element of the transition matrix, i.e., the transition probability from a prior state to a posterior state, is calculated according to the given states and the process level.
This technique can be easily applied to the ASPC procedure with repeated adjustments instead of bounded adjustments.
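For reference, the IMA(0,1,1) disturbance and its minimum mean square error one-step-ahead forecast (an EWMA) are commonly written as

    \[
    z_t = z_{t-1} + \varepsilon_t - \theta\,\varepsilon_{t-1}, \qquad
    \hat{z}_{t+1} = \lambda z_t + (1-\lambda)\hat{z}_t, \quad \lambda = 1-\theta,
    \]

which is the standard background for bounded-adjustment ASPC schemes; the paper's specific state construction is as described above.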

15. "Statistical Quality Control Techniques for Low Volume and Short-Run Production", Tyler Mangin and Canan Bilen, North Dakota State University Paper

Abstract: The trend in advanced manufacturing industries has been to shift from mass production to low volume production in order to meet customer demand for smaller, more frequent deliveries. This has resulted in a need to develop statistical quality control techniques that are effective in low volume manufacturing environments. These techniques must address critical issues specific to the low volume environment, including charting multiple product types processed on a single machine, the frequency and effect of machine setups on process variability, data scarcity, and process adjustment decisions. The application of existing control charting methodologies in a low volume production environment will be addressed, with emphasis on electronics assembly and manufacturing.


5. Statistical Methods I

Session Chair: J.D. Williams, Virginia Tech


16. "Bayesian Inference for PVF Frailty Models ", Madhuja Mallick and Nalini Ravishanker, University of Connecticut Paper

Abstract: In this article, we describe inference for multivariate lifetime data using a conditional proportional hazards model with a power variance family (PVF) frailty distribution and a piecewise exponential baseline hazard with a correlated prior process. The likelihood function is derived as the joint density of tilted positive stable random variables. Inference is carried out in the Bayesian framework, using Markov chain Monte Carlo techniques. We illustrate our approach on data involving recurrent infections due to insertion of a catheter in patients on portable dialysis machines.
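The conditional proportional hazards formulation with a shared frailty is commonly written as

    \[
    \lambda_{ij}(t \mid \omega_i) = \omega_i\,\lambda_0(t)\,\exp(\mathbf{x}_{ij}'\boldsymbol{\beta}), \qquad \omega_i \sim \mathrm{PVF},
    \]

where the PVF family includes the gamma and positive stable frailties as special or limiting cases; the notation here is generic background and is not taken from the paper.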

17. "A Method for Estimating Mean Shift Caused by Variance of Input Parameters", Daniel D. Frey, MIT.

Abstract: When the response of an engineering system is a function of randomly varying input parameters, the expected value of the response may shift from the nominal response. This shift in expected value is important because it is especially prominent in the neighborhood of an optimum. This paper will present a way to estimate the shift by sampling the response at 4n+1 points in the space of the n randomly varying inputs.
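The reasoning behind the shift can be seen from a second-order Taylor expansion (shown here as background, with independent inputs assumed; the paper's 4n+1-point estimator itself is not reproduced):

    \[
    \mathrm{E}[f(X)] \;\approx\; f(\mu) + \frac{1}{2}\sum_{i=1}^{n}\sigma_i^{2}\,
    \left.\frac{\partial^{2} f}{\partial x_i^{2}}\right|_{x=\mu},
    \]

so the expected response is pulled away from the nominal value f(mu) wherever the response surface has curvature, which is exactly the situation near an optimum.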

18. "Analysis of Repairable Systems: MTBF Versus MCF", David Trinidade, Sun Microsystems.

6. Statistical Methods II

Session Chair: Carla Vivacqua, University of Wisconsin

19. "Estimating Sensitivity Of Process Capability Modeled By A Transfer Function", Alan Bowman and Josef Schmee, Graduate Management Institute, Union College

Abstract: Assume that the output variable Y of a process with a number of inputs (X1, X2,..., Xn) is subject to specification limits and that the inputs can be represented as random variables. The probability that Y falls within its specification limits is a measure of process capability. In this paper, we demonstrate a one-pass Monte Carlo simulation method that allows the estimation of the sensitivity of the process capability to each parameter of the input variables. The results can be used in improving system performance by directing the analyst to those parameters for which small changes result in the largest change in capability. The paper outlines the algorithm, demonstrates it on three problems and provides some intuition as to why it works.
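One standard way to obtain derivative estimates from a single set of simulation draws is the score-function (likelihood-ratio) identity for independent normal inputs; whether this matches the authors' one-pass method is not stated in the abstract, and the transfer function g() below is hypothetical.

    # Score-function sensitivity of a capability probability from one Monte Carlo pass:
    #   d/d(mu_i) P(LSL <= Y <= USL) = E[ 1{LSL <= Y <= USL} * (X_i - mu_i)/sigma_i^2 ].
    import numpy as np

    def capability_and_sensitivity(g, mu, sigma, lsl, usl, n=200_000, seed=0):
        rng = np.random.default_rng(seed)
        X = rng.normal(mu, sigma, size=(n, len(mu)))
        y = g(X)
        inside = ((y >= lsl) & (y <= usl)).astype(float)
        capability = inside.mean()
        scores = (X - mu) / np.asarray(sigma) ** 2
        sens = (inside[:, None] * scores).mean(axis=0)   # d capability / d mu_i
        return capability, sens

    g = lambda X: X[:, 0] * X[:, 1] + 0.5 * X[:, 2]      # hypothetical transfer function
    print(capability_and_sensitivity(g, mu=[1.0, 2.0, 0.0], sigma=[0.1, 0.2, 0.3],
                                     lsl=1.5, usl=2.5))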

20. "Follow-up Experiments to Remove Confounding Between Location and Dispersion Effects in Unreplicated Two-Level Factorial Designs", AndrŽ L. S. de Pinho, University of Wisconsin, S¿ren Bisgaard, University of Massachussets, Harold J. Steudel, University of Wisconsin Paper

Abstract: The objective of this paper is to present a methodology that allows us to select the minimum necessary number of trials to gather new information to help resolve the ambiguity among concurrent models. We extend the discrimination criterion of Meyer, Steinberg and Box (1996) by allowing a non-homogeneous variance scenario. The results are exemplified on Montgomery's (1990) injection molding experiment.

21. "Two New Mixture Models for Living with Collinearity but Removing its Influence ", John A. Cornell, University of Florida

Abstract: Fitting equations to mixture data collected from highly constrained regions has challenged modelers for the past 35 years. Collinearity among the terms in the models, which results in imprecise coefficient estimates, is one of the problems encountered. Recently, two new model forms have been introduced in which the terms are scaled so as to remove the influence of collinearity. The benefits of fitting the new models are illustrated using two numerical examples.


22. "Analyzing Supersaturated Designs Using Biased Estimation", Adnan Bashir and James R. Simpson, Florida State University Paper

Abstract: A designed experiment that investigates m factors in n runs, where m > n-1, is referred to as a supersaturated design. Supersaturated designs have recently received increased attention because of their use in investigating many factors with few experiments. Stepwise regression is the most widely used method to analyze data from supersaturated designs, but it often fails to provide proper models because of the multicollinearity inherent in supersaturated designs. A new biased estimation technique for analyzing data from supersaturated designs is proposed. The technique combines ridge regression estimation with a modified best-subset variable selection procedure to select the significant factors in the model. Designed experiments are developed for different configurations of factor settings in the true model, with the observed average Type I and Type II error rates as performance measures. The designed experiments are applied to both stepwise regression and the proposed method, and an analysis is conducted of the different factors affecting Type I and Type II errors. A comparison of results shows that the proposed method performs better than the stepwise method.
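A minimal sketch of ridge estimation on supersaturated-design data (m factors, n < m + 1 runs), where ordinary least squares is not available, is shown below; the ridge constant and the simple ranking rule are placeholders, not the authors' modified best-subset procedure.

    # Ridge estimates (X'X + lam*I)^{-1} X'y on a toy supersaturated design.
    import numpy as np

    def ridge_coefficients(X, y, lam=1.0):
        p = X.shape[1]
        return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

    # Toy example: 14 runs, 23 two-level factors, 3 active effects.
    rng = np.random.default_rng(2)
    X = rng.choice([-1.0, 1.0], size=(14, 23))
    beta = np.zeros(23)
    beta[[0, 5, 11]] = [4.0, -3.0, 2.5]
    y = X @ beta + rng.normal(0, 1, 14)
    b = ridge_coefficients(X, y, lam=2.0)
    print(np.argsort(-np.abs(b))[:5])   # factors ranked by |ridge estimate|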