Sigma Level

Sigma level is a statistical measure of process capability that indicates how many standard deviations fit between the process mean and the nearest specification limit.

Definition

Sigma level is a statistical measure that indicates how well a process performs relative to customer specifications. It represents the number of standard deviations that fit between the process average and the nearest specification limit. Higher sigma levels mean fewer defects: a 3-sigma process produces about 66,800 defects per million opportunities, while a 6-sigma process produces only 3.4. This metric provides a universal language for comparing process quality across different industries and applications.
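As a sketch of the arithmetic, the conversion from sigma level to DPMO is the upper-tail area of a normal distribution, offset by the conventional 1.5-sigma allowance for long-term drift in the process mean (which is why a 6-sigma process is quoted at 3.4 DPMO rather than a far smaller number). A minimal Python sketch, assuming a normally distributed output; dpmo_from_sigma is an illustrative name, not a library function:

    from statistics import NormalDist

    def dpmo_from_sigma(sigma_level: float, shift: float = 1.5) -> float:
        # Long-term z-score at the nearest spec limit: the short-term
        # sigma level minus the conventional 1.5-sigma drift allowance.
        z = sigma_level - shift
        # Fraction of output beyond the limit, scaled to a million
        return (1.0 - NormalDist().cdf(z)) * 1_000_000

    print(f"3 sigma: {dpmo_from_sigma(3):,.0f} DPMO")  # ~66,807
    print(f"6 sigma: {dpmo_from_sigma(6):.1f} DPMO")   # ~3.4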

Examples

An automotive supplier measures shaft diameters with a specification of 10.0 ± 0.1 mm. Their process has a mean of 10.0 mm and a standard deviation of 0.02 mm. Five standard deviations (0.1 / 0.02) fit between the mean and each specification limit, so the process operates at approximately 5 sigma, or about 233 DPMO.
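A minimal Python check of this example, assuming a centered, normally distributed process (the variable names are illustrative):

    from statistics import NormalDist

    usl, lsl = 10.1, 9.9   # specification limits, mm
    mean, sd = 10.0, 0.02  # process mean and standard deviation, mm

    # Short-term sigma level: standard deviations from the mean
    # to the nearest specification limit
    sigma_level = min(usl - mean, mean - lsl) / sd
    print(f"sigma level: {sigma_level:.1f}")  # 5.0

    # Long-term DPMO under the conventional 1.5-sigma shift
    dpmo = (1.0 - NormalDist().cdf(sigma_level - 1.5)) * 1_000_000
    print(f"DPMO: {dpmo:.0f}")  # ~233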

Key Points

  • 1 sigma = 690,000 DPMO (31% yield) — unacceptable for most processes
  • 3 sigma = 66,800 DPMO (93.3% yield) — average process performance
  • 4 sigma = 6,210 DPMO (99.4% yield) — good performance
  • 6 sigma = 3.4 DPMO (99.99966% yield) — world-class performance
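These figures follow the Six Sigma convention of subtracting a 1.5-sigma allowance for long-term drift before taking the normal tail area; without that shift, 6 sigma would correspond to roughly 0.001 DPMO. The table above can be reproduced in a few lines of Python under that assumption:

    from statistics import NormalDist

    # Each DPMO figure is the upper-tail area of a standard normal
    # distribution at (sigma level - 1.5), scaled to one million.
    for sigma in range(1, 7):
        dpmo = (1.0 - NormalDist().cdf(sigma - 1.5)) * 1_000_000
        print(f"{sigma} sigma: {dpmo:>11,.1f} DPMO "
              f"({100 - dpmo / 10_000:.5f}% yield)")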

Common Misconceptions

"Every process should target 6 sigma." In reality, the appropriate sigma level depends on criticality and cost. A billing process might acceptably run at 4 sigma, while a medication-dispensing process might require 6 sigma. The key is matching capability to customer needs.

"Sigma level is just about manufacturing tolerances." In fact, sigma level applies to any measurable output with a specification: response times, error rates, fill rates, on-time delivery. Any process with a defined "good" versus "defect" outcome can be measured in sigma terms.
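For those good-versus-defect processes, the sigma level is usually estimated from observed defect counts rather than from a measured distribution. A minimal sketch, assuming the standard DPMO formula and the 1.5-sigma shift; the function name and billing figures are purely illustrative:

    from statistics import NormalDist

    def sigma_from_defects(defects: int, units: int, opportunities: int) -> float:
        # Defects per million opportunities, converted back to a
        # short-term sigma level via the inverse normal CDF.
        dpmo = defects / (units * opportunities) * 1_000_000
        return NormalDist().inv_cdf(1.0 - dpmo / 1_000_000) + 1.5

    # e.g. 12 billing errors across 5,000 invoices, with 3 error
    # opportunities per invoice (illustrative numbers)
    print(f"{sigma_from_defects(12, 5_000, 3):.2f} sigma")  # ~4.66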