Poster Session Abstracts

A Case Study of Combining SPC and EPC in Multistage Manufacturing Processes

Zhu Yada, National University of Singapore

Bonding wire fabrication is a typical multistage manufacturing process. Ensuring wire quality requires managing the process parameters of one or more stages, which presents a great challenge for process modeling, optimization, and fault diagnosis. In industry practice, critical wire properties, such as elongation and breaking load, are monitored with statistical process control (SPC) tools, e.g., control charts and process capability indices. However, such applications have limited diagnostic capability: operating personnel must know the process physics well and have sufficient statistical knowledge to identify an assignable cause when needed. Since in-process variables such as annealing temperature and time can be adjusted to manipulate product properties, the concept of engineering process control (EPC) can be applied. This paper presents a case study integrating SPC and EPC to improve bonding wire quality and the associated manufacturing process capability. Data collected from industry demonstrate the effectiveness of such a scheme. Limitations of applying SPC and EPC in practice are also discussed.

Sequential Search Algorithms for Optimal Nonregular Fractional Factorial Design

Aijun Zhang, University of Michigan

The generalized minimum aberration is a popular criterion for assessing nonregular fractional factorial designs. It is defined to sequentially minimize the generalized word-length pattern, and each step is essentially an integer programming problem with a nonlinear objective. In this paper we derive lower bounds and optimality conditions from low to high orders. Our solution is based on a relaxation-and-strengthening technique: the relaxed problem is solved by Lagrangian analysis, and the solution is then strengthened using an interesting property of integers. The theoretical results are applied to search for new designs with generalized minimum aberration.

Orthogonal-Maximum Latin Hypercube Designs

Ying Hung, Georgia Institute of Technology

A randomly generated Latin hypercube design (LHD) can be quite structured: the variables may be highly correlated or the design may not have good space-filling properties. There are procedures for finding good LHDs by minimizing correlations or by maximizing the inter-site distances. In this article we show that these two criteria need not agree with each other. In fact, maximization of inter-site distances can result in LHDs whose variables are highly correlated, and vice versa. Therefore, we propose a multi-objective optimization approach that finds good LHDs by combining correlation and distance performance measures. We also propose a new exchange algorithm for efficiently generating such designs. Several examples are presented to show that the new algorithm is fast and that the optimal designs are good in terms of both correlations and distances.
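The two performance measures discussed above are straightforward to compute. The following minimal Python sketch (illustrative only; the design size, the number of random starts, and the simple weighted combination are hypothetical stand-ins for the paper's multi-objective approach and exchange algorithm) evaluates random LHDs under a combined criterion:

```python
import numpy as np

def random_lhd(n, k, rng):
    # Each column is a random permutation of the levels 1..n
    return np.column_stack([rng.permutation(n) + 1 for _ in range(k)])

def max_abs_correlation(D):
    # Largest absolute pairwise correlation among the design columns
    C = np.corrcoef(D, rowvar=False)
    k = C.shape[0]
    return max(abs(C[i, j]) for i in range(k) for j in range(i + 1, k))

def min_intersite_distance(D):
    # Smallest Euclidean distance between any two design points
    n = D.shape[0]
    return min(np.linalg.norm(D[i] - D[j])
               for i in range(n) for j in range(i + 1, n))

def combined_criterion(D, w=0.5):
    # To be minimized: small correlation AND large minimum distance
    return w * max_abs_correlation(D) - (1 - w) * min_intersite_distance(D)

rng = np.random.default_rng(0)
# crude random search over 200 candidate 10-run, 3-factor LHDs
best = min((random_lhd(10, 3, rng) for _ in range(200)),
           key=combined_criterion)
```

In the article's setting a dedicated exchange algorithm replaces this random search, but the combined objective plays the same role.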

On Detecting a Rate Increase Using a Bernoulli-Based Scan Statistic

Michael Joner, Jr., William Woodall, and Marion Reynolds, Jr., Virginia Tech

Scan statistics are most often used by public health specialists to detect a cluster of data points that indicates an unusually large number of events. This is most often done retrospectively. Some have proposed using the scan statistic in prospective monitoring to detect an increase in the rate of some event; it is clearly desirable to detect such an increase as soon as possible after it occurs. Industrial professionals generally use other methods, such as the p-chart or the Bernoulli CUSUM chart, in these circumstances. We compare the performance of a control chart based on the prospective scan statistic with that of the standard quality control methods. We also discuss some of the issues involved in implementing control charts based on the scan statistic.
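A prospective scan-statistic chart amounts to signaling when a moving window of recent Bernoulli trials contains too many events. A minimal sketch, with a hypothetical window length and threshold (in practice these would be chosen to achieve a desired in-control average run length, which is part of what the paper evaluates):

```python
import random
from collections import deque

def scan_monitor(outcomes, window=50, threshold=8):
    # Signal at the first trial where the count of events in the most
    # recent `window` Bernoulli outcomes reaches `threshold`.
    recent = deque(maxlen=window)
    for t, x in enumerate(outcomes, start=1):
        recent.append(x)
        if sum(recent) >= threshold:
            return t        # first signal time
    return None             # no signal within the stream

# simulated stream: in-control rate 0.01, shifting to 0.20 after trial 100
random.seed(1)
stream = [int(random.random() < 0.01) for _ in range(100)] + \
         [int(random.random() < 0.20) for _ in range(200)]
signal_time = scan_monitor(stream, window=50, threshold=8)
```

The detection delay (signal time minus 100 here) is the quantity one would compare against a p-chart or Bernoulli CUSUM.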

Deriving Optimal Conditions for Large-Scale Controlled Synthesis of Nanostructures Using Statistical Methods

Tirthankar Dasgupta, Georgia Institute of Technology

In this paper, an effort is made to systematically investigate the process conditions that ensure large-scale synthesis of different types of nanostructures. Through a designed experiment and rigorous statistical analysis of the experimental data, models linking the probabilities of obtaining specific morphologies to the process variables are developed. A new iterative algorithm for fitting a multinomial generalized linear model (GLM) is proposed and used. The optimum process conditions that maximize these probabilities and, at the same time, make the synthesis process less sensitive to variation of the process variables around their set values are derived from the fitted models using Monte Carlo simulations.

Analysis of Optimization Experiments

James Delaney and V.R. Joseph, Georgia Institute of Technology

Traditionally, experimental data is analyzed to find statistically significant effects and then the model containing only those effects is used for optimization. This can lead to inconclusive results if some important effects are not included in the model. In this article, we propose a strategy for analyzing experiments by focusing on the goal of optimization rather than variable selection. The usefulness of the strategy is illustrated with real experiments.

Some Cautions on Applying the EM Algorithm to a Quality Assessment Application

Lorrie L. Hoffman, Armstrong Atlantic State University

The problem of handling missing data began to be studied extensively in the late 1970s. The mechanism of solution is inherently multivariate, with the EM algorithm being one potential approach. Only about a decade ago did journals targeted at quality assessment begin writing of future innovations in multivariate applications; thus, in a quality engineering environment, addressing "missingness" in data collection and analysis is a rather new endeavor. Several articles using techniques like EM have recently appeared in the literature. Via an example dealing with proportions of defects, we explore an application of the EM algorithm that allows the use of more data than just the complete pairs. We illustrate the importance of the concept of "missing at random" and its effect on proper convergence to the maximum likelihood estimates.
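The idea of using more data than the complete pairs can be illustrated with a toy EM-style iteration. This sketch is not the paper's defect-proportion application: the bivariate-normal setup, sample size, and all parameter values are assumptions for illustration. Under a missing-at-random (MAR) assumption, missing responses are imputed by their conditional expectation given the observed variable, and the estimates are refit until convergence:

```python
import numpy as np

def em_means(x, y, iters=50):
    # EM-style estimates of the means of (X, Y) when some y-values are
    # missing (np.nan), assuming MAR.  All x observations contribute,
    # not just the complete pairs.  (Covariance estimation would need an
    # additional conditional-variance correction, omitted in this sketch.)
    miss = np.isnan(y)
    y_fill = np.where(miss, y[~miss].mean(), y)   # crude starting fill
    for _ in range(iters):
        # regression of y on x from the current completed data
        C = np.cov(x, y_fill)
        b = C[0, 1] / C[0, 0]
        a = y_fill.mean() - b * x.mean()
        # E-step: replace missing y by its conditional expectation a + b*x
        y_fill = np.where(miss, a + b * x, y)
        # (M-step for the means is the plain average of the completed data)
    return x.mean(), y_fill.mean()

# simulated data: true means are 5.0 for x and 6.0 for y
rng = np.random.default_rng(0)
x = rng.normal(5.0, 1.0, 200)
y = 2.0 + 0.8 * x + rng.normal(0.0, 0.5, 200)
y[rng.random(200) < 0.3] = np.nan                 # ~30% of y missing at random
mx, my = em_means(x, y)
```

If the missingness instead depended on the unobserved y-values (not MAR), this iteration would converge to biased estimates, which is the caution the abstract raises.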

Testing for Change in Rayleigh Distribution with Staggered Entry

Dong-Yun Kim, Illinois State University and Michigan State University

In this paper we present a likelihood-ratio-based procedure for testing for a parameter change in the Rayleigh distribution when the test subjects enter the test sequentially at random times and are subject to Type I censoring. We establish the convergence of the profile log-likelihood ratio process to a Gaussian process and determine the power of the test via Monte Carlo simulation. We also illustrate the method using real data.

On Some Alternative Start-up Demonstration Tests

William Griffith and Michelle DePoy Smith, University of Kentucky

Hahn and Gage (1983) investigated a start-up demonstration test based on consecutive successful start-ups. Balakrishnan and Chan (2000) introduced a modification of this which allowed for early termination and rejection of the equipment if a pre-specified number of failures occurred prior to the required number of consecutive successes. Smith and Griffith (2003, 2005) and Martin (2004) have also studied this test. In this paper we discuss other start-up demonstration tests based on alternative or additional criteria and consider the possibility that levels of success may be distinguished, so that fewer consecutive successes may be required if higher performance levels are demonstrated. We also consider the possibility of time truncation as an additional termination criterion.

Applying Statistical Process Control to Earned Value Management

Barbara Thibadeau, Oak Ridge National Laboratory

Earned value measurement is a key element of project management that enables a quantitative assessment of project performance. Two key values are the cost and schedule performance indices. These normalized values enhance the analysis of a project’s performance and enable comparison between and within projects. Additionally, these values are ideal candidates for assessment utilizing statistical process control techniques. This research applies statistical process control techniques to performance data from an actual project and shows that, when the data is normally distributed, this approach provides project managers with information not available from the current project performance reporting suite.

Change Point Methods for Monitoring Polynomial Profiles

Shilpa Gupta, Arizona State University

Nonlinear curves occur very commonly in continuous processes, yet optimal monitoring techniques for efficiently tracking process behavior using only the characteristic profiles are lacking. A control chart scheme based on the change point method is proposed for low-order polynomial profiles. We use the likelihood ratio test to detect a sustained step shift in the process. The test statistic is plotted on a Shewhart-like chart with control limits derived from asymptotic distribution theory. Further, the test statistic is factored to reflect the variation due to the individual parameters, to aid in interpreting an out-of-control signal. Both retrospective and prospective analyses are proposed and illustrated through examples.

GIS-Based Banking Branch Performance Evaluation through DEA and Regression Analysis

Wenjun Yin, IBM Research Laboratory, Jia Chen, IBM China Research Laboratory and Chinese Academy of Sciences, Jin Dong, IBM Research Laboratory

Banking branches, especially in emerging countries, serve as the most important channel for delivering financial products and services. Many banks have kept adjusting and reinvigorating their branch networks to improve overall profitability cost-effectively. As a first step, the performance of each branch should be assessed thoroughly in order to identify key gaps and then guide future branch transformation. However, current research efforts and industrial practices have mainly focused on evaluating branch performance by applying statistical models or data mining techniques to banks' internal business data. With limited linkage to the external data around branches (e.g., geographic environment or demographic information), these approaches often give a poor understanding of branch market potential and cannot quantify credible gaps for the branches to be transformed. In emerging countries such as China, these traditional branch performance analysis techniques are further challenged because qualified geographic and demographic data are still under construction.

In this paper, a GIS (geographic information system) based banking branch performance evaluation model is proposed through a crossover of statistical methods (e.g., multivariate regression) and DEA (data envelopment analysis). First, a comparative analysis is carried out to verify the inherent consistency between the statistical methods and DEA. The crossover model is then proposed from a practical point of view, in which DEA sensitivity analysis is used to select the primary regression variables, while the regression predictions are fed back to verify low or high branch performance from the DEA models. For a real case from one of the biggest banks in China, the performance evaluation model is studied for about 40 branches, with both branch financial data and external GIS data as input. Both the best practice and the real performance gaps are successfully calculated; as a result, a total improvement of 30% on the current branch network is estimated from filling those gaps.
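As a rough illustration of the DEA component only (not the paper's model: the toy branch data, the input/output choices, and the use of the input-oriented CCR multiplier formulation are all assumptions), an efficiency score can be computed with one linear program per branch:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, o):
    # Input-oriented CCR efficiency of branch o (multiplier form):
    # maximize u.y_o subject to v.x_o = 1, u.y_j - v.x_j <= 0 for all j,
    # u, v >= 0.  X: (n, m) inputs, Y: (n, s) outputs for n branches.
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[o], np.zeros(m)])          # linprog minimizes
    A_ub = np.hstack([Y, -X])                         # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]])[None]  # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun                                   # efficiency in (0, 1]

# toy data: 5 branches, 2 inputs (staff, cost), 1 output (revenue)
X = np.array([[3., 5.], [4., 4.], [6., 8.], [5., 9.], [2., 6.]])
Y = np.array([[10.], [12.], [11.], [9.], [8.]])
scores = [dea_ccr_efficiency(X, Y, o) for o in range(5)]
```

In the crossover model described above, such scores would be cross-checked against regression predictions built on the GIS-derived external variables.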

Reduction of Luer Taper Distortion in the 27 Gauge Spinal Needle Hubs – Multivariate Optimization Using the First Principal Component

Shankang Qu, BD Company

Current diameter measurements of the Luer taper angle for the 27 Gauge spinal anesthesia needle hubs do not meet the calculated Luer taper diameter specifications at the four depths, causing a distortion in the part. In theory, this distortion also affects the circularity of these diameters. This problem may be the root cause of a leak observed in the actual spinal anesthesia procedure. Although we do not yet know accurately how to correlate this leak with the distortion, a preliminary decision was made to focus on correcting the taper diameter distortion, as this fix can only improve the observed leakage problem. The most critical KPIVs affecting the CTQ measurements of the Luer taper diameters and the circularity at these various depths were found in the vision molding process. A multivariate optimization method using the first principal component is applied in a multi-stage DOE to reduce the nine response variables to one. A Cpk evaluation is carried out in the follow-up stability analysis.
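Collapsing several correlated responses onto the first principal component, as described above, can be sketched as follows. This is a generic illustration with simulated data (the sample size, number of responses, and latent-factor structure are assumptions), not the needle-hub measurements:

```python
import numpy as np

def first_pc_score(Y):
    # Standardize the responses, then project onto the first principal
    # component, collapsing the multivariate responses into one score.
    Z = (Y - Y.mean(axis=0)) / Y.std(axis=0, ddof=1)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    scores = Z @ Vt[0]                       # first-PC score per run
    var_explained = s[0] ** 2 / (s ** 2).sum()
    return scores, var_explained

# simulated runs: nine responses driven by one common distortion factor
rng = np.random.default_rng(0)
latent = rng.normal(size=(40, 1))
Y = latent @ rng.normal(size=(1, 9)) + 0.2 * rng.normal(size=(40, 9))
scores, var_explained = first_pc_score(Y)
```

The single score can then serve as the DOE response; checking the variance explained is a useful safeguard that one component really does summarize the nine measurements.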

Multivariate Statistical Approach Applied to NMR-Based Metabolic Profiling

Hyun-Woo Chu, University of Tennessee, Seong Bum Kin, University of Texas, Dean P. Jones, Emory University, Myong Jeong (Presenter), The University of Tennessee

Metabolomics approaches with proton NMR have advanced rapidly in recent years for studying the dynamic, time-dependent profile of metabolic responses to pathophysiological stimuli or genetic modification in an integrated biological system. This work presents the use of linear feature selection methods for identifying informative NMR spectral regions that contribute to the distinction of spectral profiles generated under different experimental conditions. Blood was collected from four subjects at designated time points over 12 days in the Emory General Clinical Research Center. Unsupervised learning methods such as principal components analysis (PCA) facilitate the visualization of complicated metabolic changes in response to sulfur amino acid (SAA) deficiency. In addition, kernel and OSC techniques are adopted to improve visualization and classification. The presented feature selection methods, along with the kernel and OSC techniques, are illustrated using real NMR spectral data in which the analytical objective is to identify the spectral regions that characterize metabolic patterns in response to SAA intake in human plasma. The approach has potential for extracting useful information about metabolic responses to SAA deficiency and contributes in many ways to the understanding of SAA-deficiency-related disease development.