
PFMEA Risk Analysis

FMEA Expert
Updated on September 6, 2025

4 min read

🧭 What you’re doing in Step 5

You already set Severity (S) in Step 4 from the end-user effect. Now you will:

  1. Rate Occurrence (O) — “How likely is the cause to happen given our prevention controls?”
  2. Rate Detection (D) — “How likely is it we’ll catch the failure before ship given our detection controls?”
  3. Use the AIAG-VDA Action Priority (AP) table to classify each row as High / Medium / Low priority for action.

Rule of thumb: Improve O with prevention, improve D with detection. Don’t mix them.


📊 Practical rating rubric (evidence-based)

Occurrence (O) — rate the cause frequency (1 = remote, 10 = very high)

Use hard evidence wherever possible:

  • Prevention design strength: poka-yoke, recipe lock, hard-stops, error-proof fixtures
  • Capability: Cp/Cpk (or Pp/Ppk) vs. spec; stability (SPC)
  • Historical data: FPY, defect ppm, audit findings, MTBF/PM compliance
| O range | Typical evidence & interpretation (examples) |
|---|---|
| 1–2 | Robust prevention + historical proof (Cpk ≥ 1.67, mistake-proof design, no escapes in 12 months) |
| 3–5 | Good controls but not bulletproof (Cpk ≈ 1.33, controlled via SPC/PM; rare issues) |
| 6–8 | Weak or manual prevention; recurring issues weekly; Cpk < 1.00 or unstable |
| 9–10 | New/untuned process, no prevention, frequent daily issues |
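The O bands above can be sketched as a small lookup. This is a teaching heuristic built only from the thresholds in the table (the Cpk cut-offs, mistake-proofing, and prevention flags are assumed inputs), not an official AIAG-VDA rule:

```python
def occurrence_band(cpk, mistake_proofed=False, no_prevention=False):
    """Map prevention evidence to an Occurrence (O) band.

    Teaching heuristic from the rubric above -- always confirm the
    final rating against your customer's official table.
    """
    if no_prevention or cpk is None:
        return (9, 10)          # new/untuned process, no prevention
    if mistake_proofed and cpk >= 1.67:
        return (1, 2)           # robust prevention + historical proof
    if cpk >= 1.33:
        return (3, 5)           # good controls but not bulletproof
    return (6, 8)               # weak/manual prevention, Cpk < 1.33 or unstable
```

For example, a mistake-proofed station with Cpk = 1.8 lands in the 1–2 band, while a new process with no prevention defaults to 9–10.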

Detection (D) — rate the chance of NOT detecting (1 = almost certain to detect, 10 = almost impossible to detect)

Consider coverage, timing, automation, and MSA:

  • Coverage: 100% vs. sample; inline vs. offline
  • Timing: early station vs. final EoL vs. no test
  • Automation: interlocks, curve/limits auto-check vs. manual visual
  • MSA quality: GR&R, masters, calibration, false-accept rate
| D range | Typical evidence & interpretation (examples) |
|---|---|
| 1–2 | Automated 100% detection at/near source with reliable interlock & proven MSA; cannot pass if bad |
| 3–5 | Automated 100% or strong test but later in flow / relies on trend/curve checks; solid MSA |
| 6–8 | Sample checks, manual visual, late/indirect test; MSA marginal |
| 9–10 | No control, or the failure is latent/undetectable prior to ship |
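The D bands can be sketched the same way. The three inputs (coverage, placement, MSA status) are a deliberate simplification of the factors listed above, not a complete rubric:

```python
def detection_band(coverage, at_source=False, msa_proven=False):
    """Map detection evidence to a Detection (D) band.

    coverage: "auto100" (automated 100%), "sample" (sample/manual), "none".
    Teaching heuristic only -- defer to the official rating table.
    """
    if coverage == "none":
        return (9, 10)          # no control, or latent before ship
    if coverage == "auto100" and at_source and msa_proven:
        return (1, 2)           # 100% at/near source, proven MSA
    if coverage == "auto100" and msa_proven:
        return (3, 5)           # strong test, but later in the flow
    return (6, 8)               # sampled, manual, or marginal MSA
```

Note how an automated 100% check only earns the 1–2 band when it is both at/near the source and backed by proven MSA; drop either condition and the band weakens.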

Prevention affects O (e.g., hard-stop, recipe lock). Detection affects D (e.g., LVDT 100%, leak bench results, MES gates). A “preventing” poka-yoke is not a detection control.


🔺 Action Priority (AP) — deciding what to fix first

Use the AIAG-VDA AP table (S, O, D) to mark each row H / M / L:

  • AP = High (H): Action is required (or you must strongly justify why not).
  • AP = Medium (M): Consider action; justify if none taken.
  • AP = Low (L): No action typically required.

Quick heuristics (teaching aid):

  • If S ≥ 9, AP tends to be High unless both O and D are very low (≈1–2).
  • If D ≥ 7 on any safety/regulatory row (S ≥ 8), AP likely High.
  • Strong, early 100% detection (D ≈ 2–3) can move AP to Medium/Low when S < 9.

(Use your customer’s official AP table for final calls.)
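The three heuristics can be written down as a rough classifier. To be clear, this encodes only the rules of thumb above, not the official AIAG-VDA AP table, which must govern the final call:

```python
def ap_heuristic(s, o, d):
    """Rough High/Medium/Low call from the teaching heuristics above."""
    if s >= 9 and not (o <= 2 and d <= 2):
        return "H"              # high severity without very low O and D
    if s >= 8 and d >= 7:
        return "H"              # safety/regulatory row with weak detection
    if s < 9 and d <= 3:
        return "M/L"            # strong, early 100% detection relaxes AP
    return "M"
```

Running it against worked-example row 1 below (S=10, O=4, D=3) returns "H", matching the table; the official AP table may still differ on borderline rows.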


🧪 Worked example — rating O/D and AP for key chains

S values came from Lesson 5.4. Here we justify O and D from the current controls (no future actions yet).

| # | Station | Failure Mode (from 5.4) | S | O | D | AP | Why (O/D justification) |
|---|---|---|---|---|---|---|---|
| 1 | OP05 Press-fit | Depth out of spec / signature out | 10 | 4 | 3 | H | Hard-stop + recipe + LVDT 100% + SPC → O=4; inline 100% detection D=3. S=10 keeps AP High. |
| 2 | OP05 Press-fit | Spline/impeller crack (over-force) | 8 | 3 | 4 | M | Force window + signature catch most issues → O=3, D=4 (pattern-based). |
| 3 | OP06 Seal | Mis-orientation / lip damage | 8 | 3 | 2 | M/L | Guided fixture + 100% vision gives strong detection (D=2). With O=3, AP often Low/Medium. |
| 4 | OP06 Seal | Shaft roughness high (Ra) | 8 | 4 | 6 | H | Prevention via COA + periodic audit (O=4), but detection mostly late (OP10/12) (D=6). |
| 5 | OP07 ESD | ESD event uncontrolled | 9 | 5 | 8 | H | Manual checks & shift audits only; latent failure → weak detection (D=8). Occurrence not rare (O=5) without continuous monitor. |
| 6 | OP07 Potting | Mass low / voids / under-cure | 7 | 4 | 7 | H | Recipe lock exists (O=4), but no guaranteed 100% in-station verification → late/latent D=7. |
| 7 | OP09 Torque | Under-torque | 9 | 3 | 3 | M | DC tool with 100% trace & socket ID (D=3) and good calibration (O=3). |
| 8 | OP09 Torque | Over-torque / cross-thread | 8 | 3 | 4 | M | Strategy + trace (D=4). Some risk from thread starts → O=3. |
| 9 | OP10 Pre-leak | Bench recipe mis-set / fixture leak | 8 | 4 | 4 | H/M | Recipe lock & daily master help, but fixture wear & set-up risk persist (O=4, D=4). Many customers still expect H closure here. |
| 10 | OP12 Final test | Test bypass / recipe wrong | 10 | 2 | 8 | H | Role/recipe control lowers O, but if bypass occurs there’s no back-stop → D=8. |
| 11 | OP12 Final flow | Meter mis-cal / clogged filter | 10 | 3 | 6 | H | Cal matrix exists (O=3). False-accept risk until master/dual-check tightened → D=6. |
| 12 | OP01 Kitting | Wrong impeller variant | 10 | 3 | 3 | M | 100% scan & kit verify; still O=3 due to human/label risks; D=3 with MES gate. |
| 13 | Interface (MES) | “No scan → no progress” gate disabled | 10 | 2 | 9 | H | Rare override (O=2) but catastrophic if it happens (D=9). |
| 14 | OP08 Connector | Mis-seat / poor crimp | 7 | 4 | 5 | M | PM + vision/pull sample: O=4, D=5 (sampled). Strengthening either will move it to Low. |

Use this table as a pattern and repeat it for your remaining rows. When in doubt, err on the conservative side for D if the evidence is weak (e.g., no GR&R, no master-part routine, or only sampled manual checks).


📌 Building defensible ratings — the “evidence ladder”

When auditors/customers ask “Why O=3? Why D=4?”, point to:

For O (prevention strength)

  • Fixture design reviews, poka-yoke photos, recipe lock screenshots
  • SPC stability/Cpk reports, PM compliance logs, supplier COAs with incoming capability
  • Pilot/Run@Rate defect Pareto (frequency)

For D (detection coverage & quality)

  • 100% vs. sample rationale; station placement (early vs. final)
  • MSA studies (GR&R %, ndc), master part schedules/results, calibration certificates
  • Curve storage, automated limit checks, interlocks, MES gate logs
  • False-accept/false-reject data (where available)

🧾 Risk Analysis worksheet (columns to keep)

  • Function → Failure Mode → Effects (line & end user) → S
  • Cause → O (with evidence note)
  • Current detection → D (with evidence note)
  • AP (H/M/L) → Decision (Action? Yes/No + rationale)

(You’ll add owners/dates and re-ratings in Step 6.)
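One way to keep those columns together in a script or export is a simple record type. The field names here are illustrative, chosen to mirror the column list above, not mandated by the standard:

```python
from dataclasses import dataclass

@dataclass
class RiskRow:
    """One row of the Step 5 Risk Analysis worksheet (illustrative fields)."""
    function: str
    failure_mode: str
    effects: str            # line & end-user effects
    severity: int           # S, carried over from Failure Analysis
    cause: str
    occurrence: int         # O
    o_evidence: str         # evidence note backing O
    detection_control: str  # current detection control
    detection: int          # D
    d_evidence: str         # evidence note backing D
    ap: str                 # "H" / "M" / "L"
    decision: str           # Action? Yes/No + rationale
```

Keeping the evidence notes as first-class fields makes the “Why O=3? Why D=4?” audit question answerable row by row.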


✅ Outputs of Step 5

  • ✅ Each PFMEA row has defensible O & D ratings tied to real evidence.
  • ✅ AP marked to drive prioritization (H first, then M).
  • ✅ A short list of must-fix chains ready for Step 6 (Optimization).

⚠️ Common pitfalls (and quick fixes)

| Pitfall | Fix |
|---|---|
| Using RPN instead of AP | AIAG-VDA uses AP. Keep RPN out of decisions. |
| Rating by opinion | Attach evidence (SPC, MSA, PM, masters, audits). |
| Giving D too much credit for prevention | Remember: prevention → O; only checks/tests → D. |
| Late detection accepted as “good” | Final EoL alone is weaker than in-station detection. |
| Not separating false-accept risk | Use masters, calibration, and curve plausibility checks. |

🔗 What’s next

Proceed to Lesson 5.6 — Step 6: Optimization (Actions & Re-evaluation). We’ll convert AP=High/Medium rows into concrete actions, assign owners/dates, and re-rate O/D based on evidence after implementation.


🧠 Pro Tip

If you can’t prove a control works (with data), rate it as if it doesn’t. Let actions in Step 6 earn you the lower O/D later.

© 2025 Quality Assist

Powered by Quality Assist