Program

    Minitab Insights
    Conference 2017

    Chicago, IL | 11-12 September, 2017

    Build Skills. Exchange Ideas. Develop Community.

    2017 Presentations

    Steve Bailey
    Steven P. Bailey, LLC
    Experimental Strategies for Factor Screening and Process Optimization
    This session presents a two-stage Strategy of Experimentation that has been taught and used successfully at DuPont for over 50 years. The strategy involves using two-level screening designs – Plackett-Burman or fractional factorial designs – to identify the important process factors, followed by using response surface designs – central composite or Box-Behnken designs – to identify the optimal settings for the critical few factors. More recently, a class of three-level Definitive Screening Designs for screening and optimizing continuous factors was introduced, and the approach was later extended to include categorical factors at two levels. These experimental strategies will be reviewed and compared, using a 7-factor workshop case study from DuPont's Black Belt and Master Black Belt (BB and MBB) training program for illustration.
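
    For readers who want a feel for the screening stage outside Minitab, here is a minimal Python sketch that constructs the textbook 8-run 2^(7-4) resolution III fractional factorial (generators D=AB, E=AC, F=BC, G=ABC); the generic factor labels are placeholders, not the factors in the DuPont workshop case study.

        from itertools import product

        # Textbook 2^(7-4) resolution III screening design: 8 runs for 7 two-level factors.
        # Base factors A, B, C take all +/-1 combinations; D, E, F, G come from the
        # generators D = AB, E = AC, F = BC, G = ABC.
        runs = []
        for A, B, C in product((-1, 1), repeat=3):
            runs.append((A, B, C, A * B, A * C, B * C, A * B * C))

        print(" A  B  C  D  E  F  G")
        for run in runs:
            print(" ".join(f"{v:2d}" for v in run))
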
    Erin O’Shaughnessy
    Minitab
    Fast Track Your Analysis: Automate Repetitive Tasks with Macros
    Minitab macros can save you a significant amount of time by automating repetitive tasks. The simplest form of Minitab macro is the Exec macro. However, an Exec macro may not be enough to fully automate your task. Global macros are easy to write and are more powerful than Exec macros. This presentation assumes that you have an understanding of Exec macros. By the end of this session, you will understand the structure of Global macros and be able to convert an Exec macro into a Global macro and add control statements, such as DO loops and IF statements, to make your macro more flexible and powerful.
    Greg Eisenbach
    Berry Global
    DOE: A Practical Theory Guide for the Non-Statistician
    Design of Experiments is often used as an all-encompassing term for something that is done when normal problem-solving methods don't work. In reality, Design of Experiments is a collection of tools used together to help understand how things work. In this discussion, we'll look at the tools, where they came from, and what they tell you. We'll also look at the theory behind the numbers, not enough to be statisticians but enough to impart a better understanding of how the system works so you'll be better equipped to apply the tools to your own situation.
    Scott Sterbenz
    Ford Motor Company
    Industry Examples of Binary Logistic Regression
    Even with objective measurements, the performance of a product or process is sometimes best evaluated with a binary response. Examples include whether a customer returns a product for warranty service, whether the product or process functions as intended, or whether a part evaluation passes or fails. Six Sigma teams at both Ford Motor Company and the United States Bowling Congress have used the principles of Binary Logistic Regression in both research and manufacturing situations where the response is binary in nature. Binary Logistic Regression has enabled these teams to properly set acceptance limits on the engineering predictive measurables for binary responses and accurately predict the probability of event occurrence, customer satisfaction, and throughput. In addition to the basic theory of Binary Logistic Regression, three case studies will be presented. The first focuses on the use of Binary Logistic Regression in manufacturing to reduce rework; the second details its use in product development to increase customer satisfaction and reduce warranty claims; the third describes its use in determining the boundary of the pocket in the sport of bowling.
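
    As a rough illustration of the method (not the Ford or USBC analyses themselves), the sketch below fits a binary logistic regression with Python's statsmodels on simulated data; the predictor name, coefficients, and sample size are all hypothetical.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        pull_force = rng.normal(50, 10, 200)                    # continuous engineering measurable
        p_event = 1 / (1 + np.exp(-(-10 + 0.18 * pull_force)))  # assumed true log-odds model
        event = (rng.random(200) < p_event).astype(int)         # binary response (1 = event occurs)

        X = sm.add_constant(pull_force)
        model = sm.Logit(event, X).fit(disp=False)

        print(model.params)                            # intercept and slope on the log-odds scale
        print(model.predict(np.array([[1.0, 55.0]])))  # predicted event probability at force = 55
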
    Dr. Ken Feldman
    Dr Pepper Snapple Group
    Measure for Measure: Do Both Vendor and Customer Need to Do Inspections?
    This presentation and subsequent analysis highlight a common problem encountered at all types of organizations. Should you continue to do incoming inspection if the vendor is doing it before shipping? Should you do interim inspections on the line and then verify them at a central QC lab? In this session, we will explore a case study in which a beverage company purchases plastic blow-molded bottles from a vendor for carbonated liquid refreshment beverages, and demonstrate how to use orthogonal regression to decide whether two inspection machines are equivalent.
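
    A minimal sketch of the idea in Python, assuming hypothetical paired readings from the vendor's and the customer's inspection machines (scipy's odr module performs the orthogonal fit); equivalence is then judged by whether the slope is near 1 and the intercept near 0.

        import numpy as np
        from scipy import odr

        rng = np.random.default_rng(7)
        true_value = rng.normal(120, 8, 60)             # hypothetical true bottle measurements
        vendor = true_value + rng.normal(0, 1.5, 60)    # vendor machine readings
        customer = true_value + rng.normal(0, 1.5, 60)  # incoming inspection readings

        linear = odr.Model(lambda beta, x: beta[0] + beta[1] * x)
        fit = odr.ODR(odr.RealData(vendor, customer), linear, beta0=[0.0, 1.0]).run()

        intercept, slope = fit.beta
        print(f"intercept = {intercept:.3f} (se {fit.sd_beta[0]:.3f}), "
              f"slope = {slope:.3f} (se {fit.sd_beta[1]:.3f})")
        # If the slope is consistent with 1 and the intercept with 0, the two
        # machines can be treated as equivalent over this range.
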
    Jenn Atlas
    Minitab
    Maximize the Power of Companion: Monte Carlo Simulation and Process Optimization
    For both new product design and existing product improvement, a good approach to ensuring products meet their intended specifications is to study the process inputs and develop an equation that describes your process (y=f(x)). You can take those equations further by considering the uncertainty in the inputs, which results in a more realistic view of the response. Companion makes it easy for you to perform Monte Carlo Simulation to predict process capability and to identify the best strategy for determining optimal process settings using parameter optimization and sensitivity analysis.
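
    The core idea can be sketched in a few lines of Python (Companion itself is not required); the transfer equation, input distributions, and spec limits below are hypothetical stand-ins.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000
        temp = rng.normal(200, 3, n)         # input 1: temperature, with its uncertainty
        press = rng.normal(30, 1.5, n)       # input 2: pressure, with its uncertainty

        y = 0.4 * temp + 2.0 * press - 15.0  # transfer equation y = f(x) from a fitted model
        lsl, usl = 120.0, 140.0              # hypothetical specification limits

        print(f"mean = {y.mean():.2f}, sd = {y.std(ddof=1):.2f}")
        print(f"estimated fraction out of spec = {np.mean((y < lsl) | (y > usl)):.4f}")
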
    Dennis Corbin
    Minitab
    Ease of Visualizing Data: Telling the Story with Powerful and Accurate Graphical Output
    With many audiences averse to numbers and an increasing need for quickly digestible results, pictures are worth more than ever! Some stakeholders might require only a high-level overview and an understanding of how results relate to key metrics (e.g. “the bottom line”), while other stakeholders, such as Quality Improvement Managers, likely require more in-depth analyses. This presentation focuses on how to adequately present your technical and statistical findings to people with varying backgrounds. Some of Minitab’s most commonly used graphical tools will be presented, such as histograms and main effects and interaction plots, in addition to techniques for constructing, interacting with, and adding adequate information to graphs. Common patterns of interest within specific graphs, and what these patterns imply in terms of process control, will also be discussed. Join us to see how Minitab can make your life easier when compiling reports.
    Fred Eberle
    Hi-Lex Controls, Inc.
    Mechanical Design Life and Capability from Weibull Failure Analysis
    This session will illustrate the process of Weibull failure analysis. In engineering, we usually want to create a predictive model of component failure to guide us in our design intent, or to do confirmation testing to validate the integrity of the actual invention. Fatigue failure is investigated because most gearing and machine design components encounter dynamic stresses that are linear, rotational, or steady state, or that possess a varying duty cycle. Such components are intended to have a specific design life. Knowing when a failure is likely to occur, or having the ability to estimate a mean time to failure, may be the most important analysis in mechanical and electrical engineering. Nothing could be truer in applications where the economics of failure are significant.
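
    As a small illustration of the fitting step, the Python sketch below fits a 2-parameter Weibull to simulated cycles-to-failure data with scipy and reports the B10 life and mean time to failure; the data and parameter values are invented, not results from the talk.

        import numpy as np
        from scipy.stats import weibull_min

        # Simulated cycles-to-failure for 40 components (shape 2.3, scale 150,000 cycles).
        cycles = weibull_min.rvs(2.3, loc=0, scale=150_000, size=40, random_state=3)

        shape, loc, scale = weibull_min.fit(cycles, floc=0)       # 2-parameter fit (location fixed at 0)
        b10 = weibull_min.ppf(0.10, shape, loc=loc, scale=scale)  # life at which 10% have failed
        mttf = weibull_min.mean(shape, loc=loc, scale=scale)      # mean cycles to failure

        print(f"shape = {shape:.2f}, scale = {scale:,.0f} cycles")
        print(f"B10 life = {b10:,.0f} cycles, MTTF = {mttf:,.0f} cycles")
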
    Joel Smith
    Dr Pepper Snapple Group
    The Five Y’s: Rethinking the Right Response

    Data-driven improvement typically follows a familiar pattern: Define a Y metric, identify potential X variables, obtain data, and model Y=f(X) to determine optimal conditions. When that pattern was used to investigate critical factors in maintaining product availability at retailers, things broke down quickly in spite of having a large set of high-quality data. So what should a practitioner do when modeling the X’s results in “Why?” instead of Y?

    In this case, the Y metric for the project turned out not to be the right Y to model – a lesson all Minitab users can learn from. But discovering the right Y was not as easy as it first seemed, and the presenter ultimately had to keep digging deeper until the right metric was uncovered. Adding to the challenge, asking “Why?” uncovered that the true controllable metric was censored – meaning that for many observations only a range of possible values was known. Fortunately, Minitab comes armed with the tools for analyzing such data, and in the end Y=f(x) saved the day once again!

    Attendees will take away practical advice for selecting an appropriate Y metric, giving a model a healthy reality check, and dealing with censored data.

    Dr. Diane Evans
    Rose-Hulman Institute of Technology
    A Pinch or a Dash? An Attribute Agreement Analysis Classroom Lab
    In order to be health conscious when preparing or eating meals, it’s important not to overuse certain ingredients, such as sugar or salt. If a recipe calls for a pinch of salt, few people have the necessary tools to measure a pinch. If you want to get very technical and scientific about tiny measurements, a pinch is generally defined as 1/16 teaspoon, while a dash is “somewhere between 1/16 and a scant 1/8 teaspoon.” Without measuring tools, can people really distinguish between a dash and a pinch, or a drop and a smidgen? This presentation is a hands-on classroom Attribute Agreement Analysis lab to determine if appraisers can categorize the amounts of cornmeal in tiny plastic bags by sight alone.
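
    One common way to summarize such a lab numerically is an agreement statistic such as Fleiss' kappa; the Python sketch below shows the calculation on an invented ratings matrix (the categories, appraiser count, and values are hypothetical, not data from the classroom exercise).

        import numpy as np
        from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

        # Rows = bags of cornmeal, columns = appraisers; values are the category each
        # appraiser assigned by sight (0 = drop, 1 = smidgen, 2 = pinch, 3 = dash).
        ratings = np.array([
            [2, 2, 3, 2],
            [3, 3, 3, 2],
            [0, 0, 1, 0],
            [1, 1, 1, 1],
            [2, 3, 3, 3],
            [0, 1, 0, 0],
        ])

        counts, _ = aggregate_raters(ratings)   # subjects x categories count table
        print("Fleiss' kappa:", round(fleiss_kappa(counts), 3))
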
    Doug Gorman
    Minitab
    Meet and Exceed Customer Needs Through Design for Six Sigma
    As organizations mature in their continuous improvement journey, Six Sigma thinking must pervade all aspects of product and process development. Companion brings together Design for Six Sigma (DFSS) methodologies, Quality Function Deployment (QFD), and engineering decomposition to link product and process parameters to customer needs.
    Dr. David Kashmer
    The Surgical Lab, LLC
    Specific Tools to Take You from Volume to Value in Healthcare
    Value-based reimbursement is here in healthcare. This talk shares specific tools, tips, and techniques to help transition from volume-based (fee-for-service) to value-based reimbursement. Discussion centers on specific stories of quality improvement – including how to improve physician buy-in – and introduces a novel measure for quantifying the value of a healthcare process. Come join us to consider how specific quality improvement tools take us from volume to value!
    Thomas Rust
    Autoliv
    Visualizing Your Analysis and the 7 Basic Quality Tools
    Inferential statistical tests are powerful tools that help identify sources of variation and understand our processes. But between p-values, normality tests, ANOVAs, DOEs, t-tests, and a lot more alphabet soup, they can be confusing and very daunting. Most analyses should start with – or can even be completely understood with – 5 basic graphs (5 of the 7 Basic Quality Tools). These can easily be created in Minitab and allow for a very visual analysis of most data. With a few simple modifications – such as confidence intervals – these tools can be even more powerful with little additional complication. We will also see how some simple automation can make replicating these graphs in your own custom way achievable.
    Cheryl Pammer
    Minitab
    Using Normal and Non-Normal Tolerance Intervals to Assess Process Performance
    A tolerance interval is a range of measurements in which a given proportion of the process population is likely to fall. The most common use of these intervals is to compare them with the product’s specification limits to make sure that a certain proportion of the product population will remain within specification with a high degree of confidence. Because tolerance interval calculations are very sensitive to the assumed distribution, it is important to understand the distribution of your data before choosing the appropriate Tolerance Interval tool in Minitab. Through the use of case studies covering a variety of applications, you will learn strategies for creating tolerance intervals that you can successfully apply to your own normal or non-normal data.
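
    For orientation, the Python sketch below computes a two-sided tolerance interval for normal data using Howe's approximation to the k factor; the data are simulated, and Minitab's exact k factors will differ slightly from this approximation.

        import numpy as np
        from scipy.stats import chi2, norm

        rng = np.random.default_rng(5)
        x = rng.normal(10.0, 0.2, 50)                  # hypothetical measurements
        n, coverage, confidence = len(x), 0.95, 0.95

        # Howe's approximation to the two-sided normal tolerance factor k.
        z = norm.ppf((1 + coverage) / 2)
        k = np.sqrt((n - 1) * (1 + 1 / n) * z**2 / chi2.ppf(1 - confidence, n - 1))

        lower = x.mean() - k * x.std(ddof=1)
        upper = x.mean() + k * x.std(ddof=1)
        print(f"{coverage:.0%} coverage / {confidence:.0%} confidence interval: ({lower:.3f}, {upper:.3f})")
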
    Gerald Phillips
    Phillips Statistical Consulting, LLC
    One Failure Mode or Two?
    Do you know how to identify whether your data demonstrates multiple populations or failure modes? This tutorial session will provide examples of how to identify and estimate the reliability of data with multiple populations. If all transformations and distributions fail to fit your data, then the likely root cause is multiple populations or failure modes, which frequently occur in reliability testing to failure. For example, performing a pull test to failure on a solvent bond junction may result in the material breaking or the bond separating, resulting in at least two populations. Minitab’s Individual Distribution Identification tool will be used to generate 14 probability plots to diagnose whether the data is 1) normal, 2) skewed, or 3) composed of multiple populations. Parametric Distribution Analysis will then be used to provide a product reliability statement for multiple failure modes of the form: One is 95% confident that at least R% of the product meets the specification requirements.
    Dr. Scott Kowalski
    Minitab
    Are Your DOE Screening Designs Definitive?
    Definitive screening designs are one of the exciting new features in Minitab 18. Screening designs are a common DOE tool used in the initial stages of experimentation to identify the vital few important factors from the many potential ones. Common screening designs are fractional factorials and Plackett-Burman designs. A new approach to screening factors is to use a definitive screening design. This talk will introduce definitive screening designs as well as how to use Minitab 18 to construct the design and analyze the resulting data.
    SPC: Beyond X-bar and R Charts
    Statistical Process Control (SPC) is a powerful tool for exploring and controlling processes. It has been a proven tool for years and many companies use X-bar and R control charts to stay competitive. However, there are many control charts that can be effectively used in SPC. This talk will introduce some of the other available control charts in Minitab and provide examples that show where they can be used in practice.
    Richard Titus
    Titus Consulting
    The Forgotten MSA’s: Linearity and Bias, Type I Gage, and Attribute Agreement Analysis
    Crayola manufactures over 2.2 billion crayons and 500 million markers per year at their Lehigh Valley facilities. Since 2007, over 100 Green and Black Belts have been trained and certified, and Measurement Systems Analysis (MSA) tools and techniques have been utilized as an essential and integral part of their Lean Six Sigma projects. Typically, Gage R&R studies have been employed on projects to ensure the measurements could be trusted or to identify where they needed improvement. It seems that Linearity and Bias, Type I Gage, and Attribute Agreement Analysis are the forgotten MSAs. This presentation will highlight the importance, application, and learnings based on the use of these important measurement system assessment tools.
    Michael Rusak
    Air Products and Chemicals
    Pommy Grewal
    Air Products and Chemicals
    Improving Hydrogen Plant Performance by Refining Measurement Detection Confidence
    The efficient operation of large scale chemical processing plants requires operational data to adjust performance. This presentation will describe how statistical approaches were used to help identify measurement discrepancies and increase the reliability of on-line measurement instruments for several hydrogen production facilities. Minitab’s Gage R&R tools provided quantitative estimates on data reliability and instrument behavior over the required operating range. We will discuss how these insights and information were key to identifying the best measuring device for manufacturing operations and instrument data integrity maintenance protocols.
    Rip Stauffer
    Management Science and Innovation, Inc.
    Testing for Differences using Stages: Exploiting Homogeneity in Time-Series Data
    Many Six Sigma Black Belt and Business Statistics classes push t-tests, z-tests and other tests of hypotheses as the default method for determining the differences between various input variables, and for checking the results of improvement actions. Many of these texts and courses also relegate process control charts to the Control phase in DMAIC. However, because process improvement projects almost always use time-series data for primary input and output variables, process behavior or control charts offer not only the best estimate of baseline performance, but they are also more powerful (and sensitive) tools for detecting differences and changes as they happen. Minitab's control charting functions make it easy to detect differences and to divide the time series into segments, for comparing production lines, shifts, operators or performance before and after. This makes them great tools for stratification in the Measure phase, for validation runs after experiments in the Analyze phase, and to very quickly detect the effects of actions in the Improve phase. In this presentation, we will examine the theory and assumptions underlying commonly-used tests and charts, discuss the importance of homogeneity in baselines, demonstrate data stratification, and explore the concept of rational subgrouping (and its relationship to data stratification).
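
    To make the staging idea concrete, here is a minimal Python sketch of an individuals chart computed separately for a "before" and an "after" stage, using the standard moving-range estimate of sigma (MR-bar / 1.128); the data are simulated to show a step change, not a real baseline.

        import numpy as np

        rng = np.random.default_rng(11)
        before = rng.normal(50, 2, 30)      # baseline stage
        after = rng.normal(47, 2, 30)       # stage after an improvement action

        def i_chart_limits(x):
            mr_bar = np.mean(np.abs(np.diff(x)))  # average moving range
            sigma = mr_bar / 1.128                # d2 constant for moving ranges of size 2
            return x.mean() - 3 * sigma, x.mean(), x.mean() + 3 * sigma

        for name, stage in (("before", before), ("after", after)):
            lcl, cl, ucl = i_chart_limits(stage)
            print(f"{name:>6}: LCL = {lcl:.2f}, CL = {cl:.2f}, UCL = {ucl:.2f}")
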
    Bonnie Stone
    Minitab
    Flow Diagrams, Swim Lanes, and Value Stream Maps, Oh My!

    Process mapping is an easy-to-use method that allows people to analyze and agree on the most efficient paths for improving or re-engineering processes. Process maps aid in identifying redundant and inefficient tasks, poor handoffs, and unclear decisions. Process maps can be used with virtually any process, but selecting the best process map tool is often not clear, leading to delays and rework for the process improvement team.

    This session will present guidelines for using process mapping as an improvement tool by describing:

    • The benefits of process mapping
    • Three types of process maps that can be used to document, analyze, and improve processes
    • When to select which process map tool
    • How to read the process map to uncover improvement opportunities
    • Tips to construct a future state process map

    This presentation will demonstrate process mapping tools using Companion software and will be beneficial to all attendees who are involved in process improvement.

    Andrew McDermott
    Lutron Electronics
    A 5-cent Solution to a Million Dollar Solder Problem

    This presentation will summarize the approach taken by a team that improved internal vision inspection yields from 65 to over 98 percent. A project was initiated to resolve customer returns due to open solder connections. The project team ultimately discovered a root cause unknown to the electronics industry. The tools used for determining the root cause were not new and ranged from simple graphical analysis to more complex statistical tools such as DOEs.

    In this presentation, we will also discuss the difficulties the team encountered in implementing a solution when something new and unknown is discovered, summarize the data-driven techniques used, and provide tips and lessons learned during the team’s journey.

    Lauren Migliore
    Crayola, LLC
    Inside Crayola's Lean Six Sigma Matrix: A New Process Engineer's Journey

    My journey with Six Sigma at Crayola started with a project focused on improving product cost in the watercolor line. The Green Belt training, project mentoring, and senior management project reviews, combined with great support from operations personnel, helped ensure project success. Through measurement and data analysis, I learned the importance of developing a clearly defined problem statement. The problem statement was not provided and had to be discovered. Working with a team through the Measure phase of the project to develop a clearly defined charter statement is a skill that I will use throughout my career at Crayola. Directly observing the process and speaking to operators was the starting point for brainstorming practical theories when entering the Measure phase. Spending time speaking to the operators and observing the current state of the process drove all of the improvement efforts. Understanding the statistical methods and software tools allowed all decisions to be data driven. As a new engineer, the ability to make decisions based solely on the data made changes easier to communicate to all impacted by the decision.

    Currently I am enrolled in Black Belt training and certification, with projects ranging from supplier quality assessments and capability analyses to designed experiments and measurement systems analyses. My projects have had great involvement and participation from operations personnel. We have worked to analyze top contributors to process downtime through Pareto chart analysis, designed and conducted various experiments, and built the capability of raw material suppliers to reduce process variation. All of these components are essential elements of Crayola’s Lean Six Sigma Matrix and support the ongoing efforts to reduce cost, sustain excellent quality, and continually improve customer service levels.

    Charles Harrison
    Salford Systems
    Introduction to CART Decision Trees

    In this session, we will introduce Classification and Regression Trees (CART). We will provide a detailed description of the CART algorithm, discuss advantages of using CART, learn how to interpret the model, and discuss how you can use CART to better understand your data and potentially improve your models. Nearly all of the concepts discussed will be explained via visual or animated examples.
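
    For readers who want to experiment before the session, the sketch below fits and prints a small classification tree with scikit-learn (whose tree implementation is an optimized CART variant) on the built-in iris data; the talk's own datasets are not shown here.

        from sklearn.datasets import load_iris
        from sklearn.tree import DecisionTreeClassifier, export_text

        data = load_iris()
        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(data.data, data.target)

        # Reading the printed rules mirrors how a CART model is interpreted:
        # a sequence of binary splits on the predictors ending in a predicted class.
        print(export_text(tree, feature_names=list(data.feature_names)))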

    Amy AlSahsah
    Greene, Tweed and Company
    Effectively Manage Your CI Program

    Managing your CI program can be difficult but it doesn't have to be. We will discuss practical tips and tricks for managing your program for the best results and moving your company forward. In addition to project management, we will also discuss how we manage strategic initiatives and deal with global and site issues.
