• Alma Brainy

MSA, what is so tough? (Part 1)

It is surprising that MSA remains one of the least understood topics among automotive suppliers. One of the fastest-growing concerns in MSA is the interpretation of its results, and attribute MSA now sits at the top of that concern Pareto. In this entry (which comes in 4 parts) we will start by explaining how to analyze the results of an attribute MSA.


The Cross Tab Method

The best-known method for performing an attribute MSA is the Cross Tab Method. This method analyzes the distribution of data from two or more categorical variables. The result is presented in a matrix as shown below:-

A, B and C represent three different appraisers

First Step:

You will need to analyse the table above by counting how often the appraisers agreed and how often they disagreed for each set of evaluations. "1" represents agreement while "0" represents disagreement. The analysis can then be summarised in the table below; take note that the number of observations is 150 for each appraiser:-
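The counting in this first step can be sketched in a few lines of Python. The arrays below are hypothetical (the actual 150 judgements come from the table above, which is not reproduced here); 1 marks an accepted part and 0 a rejected part for each appraiser:

```python
# Hypothetical example: two appraisers judging the same 10 parts
# (1 = accept, 0 = reject). The real study uses 150 parts per appraiser.
a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
b = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1]

# Count how often the two appraisers gave the same judgement
agreed = sum(1 for x, y in zip(a, b) if x == y)
disagreed = len(a) - agreed

print(agreed, disagreed)  # 8 2
```

Repeating this count for each appraiser pair gives the agreement/disagreement summary in table III-D-1.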

Table III-D-1


Second Step:

Next is to estimate the expected data distribution. Out of the total 150 observations, how many times did they agree with each other and how many times was there a dispute in their judgement? If you analyse table III-C-1 correctly, you'll find A rejected the parts 47 times while B rejected them 50 times.


PA = 47/150 = 0.313

PB = 50/150 = 0.333


The probability that A and B both reject the same part (assuming their judgements are independent) is given by:

PA x PB = 0.104


The expected number of parts that A and B agree are bad is calculated per the formula below:-

150 x (PAxPB) = 15.7
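The arithmetic for this second step can be checked with a short Python sketch using the figures quoted above (47 rejects by A, 50 by B, out of 150 observations each):

```python
n = 150          # parts evaluated by each appraiser
rejects_a = 47   # times appraiser A rejected
rejects_b = 50   # times appraiser B rejected

p_a = rejects_a / n   # probability A rejects a part
p_b = rejects_b / n   # probability B rejects a part

# Expected number of parts both A and B reject, assuming independence
expected_both_reject = n * p_a * p_b

print(round(p_a, 3), round(p_b, 3), round(expected_both_reject, 1))
# 0.313 0.333 15.7
```

The same calculation applied to the "accept" counts (and to the other cells) fills in the full expected cross-tab.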



Third Step:

The same approach is then applied between A & C and B & C.





Fourth Step:

To determine the level of agreement, you need to calculate the Kappa value. Kappa ranges from -1 to 1, with 1 representing perfect agreement between appraisers.


Kappa is a measure of interrater agreement. The formula for Kappa is given below:


kappa = (Po - Pe) / (1 - Pe)


where

Po = sum of the observed proportions in the diagonal cells

Pe = sum of the expected proportions in the diagonal cells


A Kappa greater than 0.75 indicates good agreement and the measurement technique is acceptable, while a measurement system with Kappa below 0.4 shall be rejected.
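Putting the formula into code makes the acceptance decision easy to automate. The proportions below are hypothetical, purely to illustrate the calculation; in practice Po and Pe come from the observed and expected cross-tab diagonals built in the earlier steps:

```python
def kappa(p_observed: float, p_expected: float) -> float:
    """Kappa from observed and expected agreement proportions:
    (Po - Pe) / (1 - Pe)."""
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical diagonal sums for one appraiser pair
po = 0.90   # observed proportion of agreement
pe = 0.55   # expected (chance) proportion of agreement

k = kappa(po, pe)
print(round(k, 3))  # 0.778 -> above 0.75, so agreement is acceptable
```

Note that kappa discounts the agreement expected by pure chance, which is why raw percent agreement alone is not enough to accept a measurement system.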


In the next entry we will look in more detail at how table III-D-1 is constructed.
