QPRC 2016

Failure Analysis of Data Integrity Risks for Big Data


Jeff Robinson

AQI


This presentation reports on a real-world case study applying Six Sigma principles to achieve high integrity for a large digital archive. To ensure the preservation of a data repository containing more than five petabytes of irreplaceable digital objects, a detailed failure model was developed that accounted for extremely rare events. The model was then used to evaluate alternative strategies for preventing corruption or loss of data. Detailed Monte Carlo simulations were performed to build an accurate predictive model. The results were surprising: the most statistically significant potential causes of data loss were not what the sponsors and investigators had initially envisioned. Six Sigma techniques employed in the study included failure analysis, FMEA, Monte Carlo analysis, and root cause analysis. The presentation is relevant to business leaders and quality professionals interested in quantifying and proactively reducing data integrity risk in industries heavily dependent on big data, such as finance, healthcare, education, and government.
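As a rough, illustrative sketch of the kind of rare-event Monte Carlo analysis the abstract describes, the Python program below estimates the probability that a replicated object is lost because every copy fails before repair completes. All parameters (replica count, failure rate, repair time, horizon, trial count) are hypothetical and chosen only for illustration; the study's actual failure model is not reproduced here.

import random

# Hypothetical parameters -- NOT from the study; chosen only for illustration.
N_REPLICAS = 3          # independent copies kept of each object
ANNUAL_FAIL_P = 0.02    # probability a device holding a copy fails in a year
REPAIR_DAYS = 7         # days to detect a failure and rebuild the copy
YEARS = 10              # archive horizon being evaluated
TRIALS = 5_000          # Monte Carlo trials (kept modest for runtime)

def object_lost_once(rng: random.Random) -> bool:
    """Simulate one object over the horizon at day granularity; return True
    if at any point all replicas are down simultaneously (data loss)."""
    daily_fail_p = ANNUAL_FAIL_P / 365.0
    # days_to_repair[i] > 0 means replica i has failed and is being rebuilt
    days_to_repair = [0] * N_REPLICAS
    for _ in range(YEARS * 365):
        for i in range(N_REPLICAS):
            if days_to_repair[i] > 0:
                days_to_repair[i] -= 1           # rebuild in progress
            elif rng.random() < daily_fail_p:
                days_to_repair[i] = REPAIR_DAYS  # replica just failed
        if all(d > 0 for d in days_to_repair):
            return True                          # no intact copy remains
    return False

def estimate_loss_probability(seed: int = 42) -> float:
    rng = random.Random(seed)
    losses = sum(object_lost_once(rng) for _ in range(TRIALS))
    return losses / TRIALS

if __name__ == "__main__":
    p = estimate_loss_probability()
    print(f"Estimated per-object loss probability over {YEARS} years: {p:.2e}")

Because the events of interest are extremely rare, a production study would typically pair such a simulation with variance-reduction techniques (for example, importance sampling) rather than the naive sampling shown here.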

Bio:
Dr. Robinson has more than 25 years of experience in Information Technology, manufacturing, and automation. A frequent lecturer and author, he holds degrees in Physics and Electrical Engineering, an MBA, and a Ph.D. in Information Systems. He is a Six Sigma Master Black Belt and holds CMMI, ITIL, Digital Forensics, and PMP certifications. He has taught graduate and undergraduate courses in Business and Technology since 1989 and has worked at companies including Motorola, American Express, Rockwell International, Hughes Aircraft, and Medtronic, in addition to consulting in Six Sigma, process improvement, and quality.