What is Measurement System Analysis?
Measurement System Analysis (MSA) is a scientific method used to evaluate the capability, accuracy, and stability of a measurement system. It quantifies the variation introduced by the measurement process itself, ensuring that collected data reliably represents the true characteristics of the product or process being measured.
Key Objectives of MSA
- Assess measurement system variation
- Quantify measurement error components
- Determine if the system is adequate for intended use
- Identify sources of measurement variation
- Provide basis for measurement system improvement
5 Key Characteristics of a Good Measurement System
1. Accuracy
Closeness of agreement between measured value and true value (lack of bias).
2. Precision
Closeness of agreement between repeated measurements (repeatability and reproducibility).
3. Stability
Consistency of measurements over time (lack of drift).
4. Linearity
Consistency of accuracy across the operating range.
5. Resolution
Ability to detect small differences (discrimination).
Components of Measurement Variation
Total observed variation consists of two main components:
σ²total = σ²part + σ²measurement
Measurement system variation itself has two main components, repeatability and reproducibility:
σ²measurement = σ²repeatability + σ²reproducibility

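As a quick numerical illustration of the decomposition (the variance values below are hypothetical), the share of observed variation consumed by the measurement system follows directly:

```python
# Hypothetical variance components, in squared measurement units
var_part = 0.90             # true part-to-part variation
var_repeatability = 0.04    # equipment variation
var_reproducibility = 0.06  # appraiser variation

var_measurement = var_repeatability + var_reproducibility  # 0.10
var_total = var_part + var_measurement                     # 1.00

# Measurement system's share of the total study standard deviation
pct_grr = 100 * (var_measurement / var_total) ** 0.5
print(f"%GR&R (of total study variation): {pct_grr:.1f}%")  # ~31.6%
```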
Gage Repeatability and Reproducibility (GR&R)
GR&R is the most common MSA technique for variable (continuous) data.
GR&R Study Components
- Repeatability (Equipment Variation): Variation when one operator measures the same part multiple times with the same gage
- Reproducibility (Appraiser Variation): Variation when different operators measure the same parts with the same gage
- Part-to-Part Variation: Actual variation between parts
Conducting a GR&R Study
- Select 10 parts representing the entire process range
- Choose 2-3 operators who normally use the gage
- Have each operator measure each part 2-3 times in random order
- Record all measurements carefully
- Analyze results using appropriate statistical methods (see the analysis sketch after this list)
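A minimal analysis sketch, assuming the crossed parts × operators × trials design described above and the ANOVA method of estimating variance components; the function name and array layout are illustrative, not a standard API:

```python
import numpy as np

def gage_rr_anova(x):
    """Crossed GR&R variance components from x[part, operator, trial]."""
    p, o, r = x.shape
    grand = x.mean()
    part_means = x.mean(axis=(1, 2))
    op_means = x.mean(axis=(0, 2))
    cell_means = x.mean(axis=2)

    # Sums of squares for the two-way random-effects model
    ss_part = o * r * np.sum((part_means - grand) ** 2)
    ss_op = p * r * np.sum((op_means - grand) ** 2)
    ss_cells = r * np.sum((cell_means - grand) ** 2)
    ss_inter = ss_cells - ss_part - ss_op
    ss_error = np.sum((x - grand) ** 2) - ss_cells

    # Mean squares
    ms_part = ss_part / (p - 1)
    ms_op = ss_op / (o - 1)
    ms_inter = ss_inter / ((p - 1) * (o - 1))
    ms_error = ss_error / (p * o * (r - 1))

    # Variance components (negative estimates truncated to zero)
    var_repeat = ms_error
    var_inter = max((ms_inter - ms_error) / r, 0.0)
    var_oper = max((ms_op - ms_inter) / (p * r), 0.0)
    var_part = max((ms_part - ms_inter) / (o * r), 0.0)

    var_reprod = var_oper + var_inter
    var_grr = var_repeat + var_reprod
    var_total = var_grr + var_part
    return {
        "repeatability": var_repeat,
        "reproducibility": var_reprod,
        "GRR": var_grr,
        "part_to_part": var_part,
        "%GRR_study_variation": 100 * np.sqrt(var_grr / var_total),
    }
```

Feeding it a 10 × 3 × 3 array of readings yields the variance components and the %GR&R figure interpreted in the next subsection.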
Interpreting GR&R Results
Results are typically expressed as % of tolerance or % of process variation (a calculation sketch follows the table):
| %GR&R | Interpretation | Action Required |
| --- | --- | --- |
| < 10% | Excellent measurement system | None |
| 10% - 30% | Marginally acceptable | May be acceptable depending on the application, criticality, and cost of improvement |
| > 30% | Unacceptable measurement system | Must be improved before use |
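As a small companion sketch (illustrative helper, not a standard function), GR&R can be expressed both ways and checked against the thresholds above. The 6σ spread is used here, although some older references use 5.15σ; sigma_total is the total variation observed in the study, and a historical process standard deviation could be substituted.

```python
def interpret_grr(sigma_grr, sigma_total, tolerance):
    """Express GR&R as %Tolerance and %Study Variation, then classify each."""
    pct_tolerance = 100 * 6 * sigma_grr / tolerance  # 6-sigma GR&R spread vs. tolerance width
    pct_study_var = 100 * sigma_grr / sigma_total    # share of total study standard deviation

    def classify(pct):
        if pct < 10:
            return "Excellent"
        if pct <= 30:
            return "Marginally acceptable"
        return "Unacceptable"

    return {
        "%Tolerance": (pct_tolerance, classify(pct_tolerance)),
        "%StudyVariation": (pct_study_var, classify(pct_study_var)),
    }
```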

Important GR&R Considerations
- For destructive testing, use a nested GR&R design (each part can be measured only once)
- Ensure parts cover the full range of production
- Operators should be unaware of previous measurements
- Use actual production measurement procedures
- Consider both %Tolerance and %Process Variation
Other MSA Methods
1. Bias Studies
Evaluate the difference between observed measurements and reference values:
Bias = Average(Measurements) - Reference Value
Conduct by measuring a reference standard multiple times.
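A short illustration of the calculation (readings and reference value are hypothetical, and SciPy is assumed to be available); a one-sample t-test then indicates whether the observed bias is statistically significant:

```python
import numpy as np
from scipy import stats

reference_value = 10.000  # certified value of the reference standard
readings = np.array([10.012, 10.008, 9.998, 10.015, 10.006,
                     10.011, 10.003, 10.009, 10.007, 10.010])

bias = readings.mean() - reference_value
t_stat, p_value = stats.ttest_1samp(readings, reference_value)
print(f"Bias = {bias:+.4f}, p-value = {p_value:.3f}")  # small p-value suggests significant bias
```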
2. Linearity Studies
Assess whether bias remains constant across the measurement range:
- Select 5+ parts covering the measurement range
- Determine reference values for each
- Measure each part multiple times
- Plot bias vs. reference values and fit a regression line (see the sketch after this list)
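A minimal sketch of the analysis step, assuming the per-part average bias has already been computed (the reference values and biases below are hypothetical): regress bias on reference value; a slope close to zero indicates consistent accuracy across the range.

```python
import numpy as np

reference = np.array([2.0, 4.0, 6.0, 8.0, 10.0])             # reference value per part
mean_bias = np.array([0.012, 0.009, 0.004, -0.003, -0.008])  # average bias per part

# Least-squares fit of bias against reference value
slope, intercept = np.polyfit(reference, mean_bias, 1)
print(f"bias ≈ {slope:.4f} * reference + {intercept:.4f}")
# A slope meaningfully different from zero signals a linearity problem.
```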
3. Stability Studies
Evaluate measurement system performance over time:
- Measure a master or control part periodically
- Use control charts to monitor measurements
- Look for trends, shifts, or excessive variation (see the monitoring sketch after this list)
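A sketch of the monitoring step, assuming periodic single readings of a master part tracked on an individuals (I-MR) control chart; the readings are hypothetical:

```python
import numpy as np

master_readings = np.array([5.01, 5.02, 5.00, 5.03, 5.01, 4.99,
                            5.02, 5.00, 5.04, 5.01, 5.02, 5.00])

center = master_readings.mean()
moving_range = np.abs(np.diff(master_readings))  # ranges between consecutive readings
mr_bar = moving_range.mean()

# Individuals chart limits: center +/- 2.66 * average moving range (2.66 = 3 / d2, d2 = 1.128)
ucl = center + 2.66 * mr_bar
lcl = center - 2.66 * mr_bar
signals = (master_readings > ucl) | (master_readings < lcl)
print(f"CL = {center:.3f}, UCL = {ucl:.3f}, LCL = {lcl:.3f}, out-of-control points = {signals.sum()}")
```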
Attribute MSA (Discrete Data)
For pass/fail or categorical measurement systems:
| Method | Description | Acceptance Criteria |
| --- | --- | --- |
| Attribute Agreement Analysis | Assesses consistency of categorical judgments | Kappa > 0.75 (excellent) |
| Signal Detection Theory | Evaluates sensitivity to defect detection | d' > 2 (good discrimination) |
| Effectiveness Analysis | Measures correct classification rate | > 90% effectiveness |
Conducting an Attribute MSA
- Select 20-30 parts with known reference values (some good, some bad)
- Have 2-3 appraisers evaluate each part 2-3 times
- Calculate agreement statistics such as kappa and effectiveness (see the calculation sketch after this list)
- Analyze false alarms and missed defects
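A minimal sketch of the agreement statistics for one appraiser against the reference (the label sequences are hypothetical): Cohen's kappa corrects raw agreement for chance, and effectiveness is simply the correct-classification rate.

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa between two categorical label sequences."""
    a, b = np.asarray(a), np.asarray(b)
    labels = np.unique(np.concatenate([a, b]))
    p_observed = np.mean(a == b)
    # Chance agreement from each rater's marginal label frequencies
    p_chance = sum(np.mean(a == label) * np.mean(b == label) for label in labels)
    return (p_observed - p_chance) / (1 - p_chance)

reference = ["pass", "fail", "pass", "pass", "fail", "pass", "fail", "pass"]
appraiser = ["pass", "fail", "pass", "fail", "fail", "pass", "fail", "pass"]

kappa = cohens_kappa(reference, appraiser)
effectiveness = np.mean(np.array(reference) == np.array(appraiser))
print(f"kappa = {kappa:.2f}, effectiveness = {effectiveness:.0%}")
```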
MSA Best Practices
- Perform MSA before capability studies or SPC implementation
- Include all measurement systems used for quality decisions
- Use actual production parts covering the full range
- Involve normal production operators
- Follow standard measurement procedures
- Document all MSA studies thoroughly
- Revalidate after gage maintenance or process changes
- Consider both short-term and long-term variation
Common MSA Mistakes to Avoid
- Using perfect or identical parts in the study
- Not randomizing the measurement order
- Ignoring operator training effects
- Using incorrect tolerance or process variation
- Not addressing poor measurement systems promptly
- Forgetting to include environmental factors
- Assuming calibration ensures measurement capability
MSA Software Tools
Common software used for MSA analysis:
- Minitab
- JMP
- QI Macros
- Excel templates (AIAG format)
- MES-integrated MSA modules
- Custom statistical software
Automated MSA Benefits
- Reduces calculation errors
- Standardizes reporting formats
- Provides visualizations of results
- Enables trend analysis over time
- Facilitates enterprise-wide comparisons
MSA in Industry Standards
MSA requirements in major quality standards:
| Standard | MSA Requirements | Reference Manual |
| --- | --- | --- |
| IATF 16949 | Required for measurement systems referenced in the control plan | AIAG MSA Manual, 4th Edition |
| ISO 9001 | Implied requirement for valid measurement results | ISO 10012 |
| VDA 6.3 | Required for automotive suppliers | VDA Volume 5 |
| AS9100 | Required for aerospace measurements | AS9103 |