MiC Quality BI-MONTHLY UPDATE
APRIL 2008

Our Visits: Toronto and Cornell University

In this issue:
:: Special: 5% Discount
:: Face to Face Visits: Toronto and Cornell
:: Six Sigma Glossary: Cohen's Kappa

To unsubscribe, please use the contact form and select "Unsubscribe from Updates".

:: Special: 5% Discount

Enroll in any course by the end of May and receive a 5% DISCOUNT by quoting DM05 in the Promotional Code field of the enrollment form.

:: Face to Face Visits: Toronto and Cornell

We have recently returned from our face to face visits, where we met some of our students and business customers. These visits help us understand how we can improve our existing online courses, as well as generating ideas for new ones.

Our trip started in Toronto where we visited a company that specializes in precision silicone rubber products. We saw how they were using Measurement Systems Analysis to carry out precise measurements of small flexible components.

Near Toronto we visited a Fortune 500 company that manufactures power plants. This visit gave us an insight into how Advanced Statistics can improve the reliability of heat exchanger tubes.

In Ithaca, NY, we visited Cornell University, where we were privileged to see the cutting-edge technology being used in the NanoScale Science and Technology Facility. This helped us understand how our Design of Experiments course can help with the exacting business of producing features that are less than 10 nanometers in size.

The visits are invaluable in showing us how we can improve our courses, and it is a great pleasure to actually meet some of you face to face.

:: Six Sigma Glossary: Cohen's Kappa

Cohen's kappa is used to compare the degree of consensus between raters (inspectors) in, for example, Measurement Systems Analysis. It uses a contingency table approach.

Two raters inspect 150 parts independently and make the following determinations:

                 Bret
               Reject   Accept   Total
Alice Reject     20       19       39
      Accept      1      110      111
      Total      21      129      150

The expected values in each cell would be:

                 Bret
               Reject   Accept   Total
Alice Reject    5.46    33.54      39
      Accept   15.54    95.46     111
      Total    21      129        150

These are the values that would give the same row and column totals if the determinations had been made by pure chance. Each expected value is calculated from:

(row total x column total) / overall total
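
For example, the expected count for the cell where both raters reject is (39 x 21) / 150 = 5.46, and for the cell where both accept it is (111 x 129) / 150 = 95.46.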

The Kappa statistic is calculated from:

Kappa = (Actual - Expected) / (Trials - Expected)
      = (130 - 100.92) / (150 - 100.92)
      = 0.593

where:

Actual   = the number of times the appraisers agreed (110 + 20 = 130)
Expected = the number of times they would have agreed by chance (5.46 + 95.46 = 100.92)
Trials   = the total number of parts inspected (150)

In practice the value of Kappa lies between 0 and 1 (negative values are possible if the raters agree less often than chance would predict).

If the ratings were made purely by chance, with neither rater exercising any real judgment, the value would be zero. If the raters were in perfect agreement, the number of agreements would equal the number of trials and Kappa would be 1.
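
As an illustration (not part of the course materials), here is a minimal Python sketch that reproduces the calculation above; the table of counts is the one from the worked example:

    # Cohen's kappa from a 2x2 contingency table.
    # Rows are Alice's ratings, columns are Bret's (Reject, Accept).
    table = [
        [20, 19],   # Alice: Reject
        [1, 110],   # Alice: Accept
    ]

    trials = sum(sum(row) for row in table)          # 150 parts inspected
    row_totals = [sum(row) for row in table]         # [39, 111]
    col_totals = [sum(col) for col in zip(*table)]   # [21, 129]

    # Actual agreements: the diagonal cells (both Reject, or both Accept).
    actual = sum(table[i][i] for i in range(len(table)))    # 130

    # Expected agreements by chance:
    # (row total x column total) / overall total, summed over the diagonal.
    expected = sum(row_totals[i] * col_totals[i] / trials
                   for i in range(len(table)))              # 100.92

    kappa = (actual - expected) / (trials - expected)
    print(f"Kappa = {kappa:.3f}")                           # Kappa = 0.593

The same approach extends directly to contingency tables with more than two rating categories.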

To learn more, enroll in the MSA course

MIC QUALITY ONLINE COURSES
:: Six Sigma Primer
:: Statistical Process Control (SPC), Advanced Statistical Process Control
:: Design of Experiments (DOE), Advanced Design Of Experiments
:: Primer in Statistics, Advanced Statistics
:: Measurement Systems Analysis (MSA) / Gage R&R
:: FREE Excel Primer
:: FREE Sample Module "Introduction to Statistics"
:: ALL COURSES
FREE Statistics Reference Booklet
FREE Six Sigma Summary Booklet
DOWNLOAD Brochure for All Courses


FREE SIX SIGMA TRAINING RESOURCES
:: 500+ term Six Sigma Glossary (includes all terms in the ASQ Six Sigma Black Belt (SSBB) Body of Knowledge)
:: Sigma Calculator
:: Book Reviews

:: Reference Tables
:: Next Update

The next update will come out in June.
