AI & Intelligence

How Computer Vision Replaced Manual Inspection for 2,500+ Part Variants

November 2025 · 6 min read

The problem: inspecting thousands of variants at aerospace tolerances

Quality inspection in aerospace manufacturing operates under constraints that differ fundamentally from most other industries. The tolerance for defect escape — a defective component reaching service — is functionally zero. Inspection standards are defined by regulatory frameworks and customer specifications that carry legal weight. And the cost of a recall or in-service failure is not merely financial: it is reputational and potentially catastrophic.

A leading aerospace manufacturer produces an unusually wide range of precision components. Across machining, casting, forging, and composites operations, the facility produces more than 2,500 distinct part variants. Each variant has different geometry, different surface finish requirements, different dimensional tolerances, and different known failure modes. Some defects are structural — cracks, voids, inclusions — visible under the right conditions. Others are dimensional — a feature that is nominally correct but outside tolerance by a few microns.

The inspection department was staffed with experienced inspectors working under controlled conditions. The challenge was not competence — it was scalability and consistency. An inspector working the fourth hour of an eight-hour shift is demonstrably less alert than an inspector working the first. An inspector familiar with one part family may have a different mental model of acceptable surface condition than a colleague who specialises in another family. Inspection criteria that exist in engineering documents do not always translate uniformly into human judgement calls made under production pressure.

The defect escape rate was not alarming in absolute terms, but for aerospace it was higher than acceptable. Escaped defects were found at downstream operations, during final inspection, or — in the most serious cases — by the customer. Each escape generated a non-conformance report, a root cause investigation, and a corrective action. The administrative burden of managing non-conformances was itself a significant overhead. More importantly, the data generated by non-conformances pointed to a systematic problem: certain defect types were consistently being missed by visual inspection because they were genuinely difficult for human inspectors to detect reliably.

2,500+ part variants covered · faster inspection throughput · 24% fewer defect escapes

The solution: cloud-based CV as an augmentation layer

The system deployed was a cloud-based computer vision platform designed specifically for multi-variant manufacturing inspection. The design philosophy was explicitly one of augmentation, not replacement. The system flags defects and anomalies; human auditors make the final disposition decision. This distinction shaped almost every aspect of the technical architecture and the operational workflow.

At each inspection station, parts are presented to a calibrated imaging array — camera configuration varies by part family, with some parts requiring multi-angle imaging to capture all inspection surfaces. The imaging is triggered automatically when a part is placed in the inspection fixture, typically by the operator who has just completed the preceding operation. Cycle time for imaging and initial analysis is under three seconds for most part variants.

The CV system performs two parallel analyses. The first is defect detection: the model identifies surface anomalies — cracks, porosity, scratches, burrs, inclusions, and other surface-condition defects — and marks them with bounding box annotations on the captured image. The confidence score for each detection is displayed alongside the annotation. The second analysis is dimensional verification: key features — hole diameters, edge radii, datum surface conditions — are measured from the image and compared against the inspection plan for that part variant. Features outside tolerance are flagged with the measured value and the drawing tolerance.

Parts that pass both analyses with no flags are classified as OK and released to the next operation. Parts with one or more flags — regardless of whether the flag is a defect detection or a dimensional exceedance — are classified as NOT-OK and routed to the review board queue.

"The system catches things that our inspectors miss — not because our inspectors are bad, but because some defect types are genuinely at the edge of human visual detection. That is precisely where machine vision adds value."

Handling 2,500+ variants: the training challenge

The most significant technical challenge in this deployment was not the computer vision algorithms — those are mature technology — but the variant management problem. Training a CV model to detect defects on a single part family requires a labelled dataset of acceptable and defective parts. Doing this for 2,500+ variants, many of which have low production volumes and therefore limited training data, required a different approach than a single-product automated inspection system.

The platform addresses this through a combination of transfer learning and active learning. A foundation model trained on a broad dataset of aerospace inspection images provides baseline capability across material types and defect categories. Part-specific models are fine-tuned from this foundation using the available labelled data for each variant. For new part numbers with no inspection history, the foundation model provides an initial capability that is monitored closely and refined as inspection data accumulates.
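The per-variant model selection this implies can be sketched as a simple registry lookup. The threshold, model identifiers, and monitoring flag below are all assumptions for illustration, not the platform's actual values:

```python
FOUNDATION_MODEL = "aerospace-foundation-v1"  # hypothetical model identifier
MIN_LABELLED_EXAMPLES = 200                   # assumed fine-tuning threshold

def select_model(variant_id: str, labelled_counts: dict[str, int]) -> dict:
    """Pick the inspection model for a part variant."""
    n = labelled_counts.get(variant_id, 0)
    if n >= MIN_LABELLED_EXAMPLES:
        # Enough variant-specific history: use the fine-tuned checkpoint.
        return {"model": f"{FOUNDATION_MODEL}/ft-{variant_id}", "monitor": False}
    # New or low-volume variant: fall back to the foundation model and
    # monitor closely while inspection data accumulates.
    return {"model": FOUNDATION_MODEL, "monitor": True}
```

The practical consequence is that a brand-new part number is inspectable on day one, just with a wider human safety net until its own labelled history reaches the fine-tuning threshold.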

The active learning loop is critical to sustaining model quality. Every time a human auditor overrides a system flag — either confirming a defect that the system classified as OK, or clearing a flag that the system raised on a non-defective part — that decision feeds back into the training pipeline. The model learns from the audit decisions continuously. False positive rates on mature part variants have declined steadily over the deployment period as the models have accumulated inspection history.
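One way to picture the feedback loop: only auditor decisions that disagree with the system's classification carry new training signal, since agreements are already well represented in the training set. A sketch under that assumption, with hypothetical function and queue names:

```python
import json
import queue

# Samples waiting to be folded into the next training run.
retrain_queue: "queue.Queue[str]" = queue.Queue()

def record_override(image_id: str, system_label: str, auditor_label: str) -> None:
    """Feed an audit decision back into the training pipeline."""
    if system_label == auditor_label:
        # Agreement: no new information for the model.
        return
    # Disagreement: the auditor's disposition becomes ground truth.
    sample = {"image": image_id,
              "predicted": system_label,
              "ground_truth": auditor_label}
    retrain_queue.put(json.dumps(sample))
```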

Unknown defects — defects that the model was not trained to detect because they had not previously been observed on that part variant — are handled through the anomaly detection component of the analysis. The system flags any surface region that departs significantly from the normal surface condition of that part variant, even if it does not match a known defect pattern. These anomaly flags are reviewed by the most experienced inspectors on the audit team. Several instances of novel defect modes were identified through this mechanism before they appeared in non-conformance reports from downstream operations.
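The anomaly component can be thought of as a statistical outlier test against the variant's normal surface condition rather than a classifier of known defects. A minimal sketch, assuming each image region is summarised by a single texture score and the baseline is a sample of scores from conforming parts (both are simplifications of whatever representation the platform actually uses):

```python
from statistics import mean, stdev

def anomaly_regions(region_scores: dict[str, float],
                    baseline: list[float],
                    z_threshold: float = 3.0) -> list[str]:
    """Flag regions that depart significantly from the variant's
    normal surface condition, even without a known defect match."""
    mu, sigma = mean(baseline), stdev(baseline)
    return [name for name, score in region_scores.items()
            if abs(score - mu) > z_threshold * sigma]
```

Flags produced this way carry no defect class, which is exactly why they are routed to the most experienced auditors rather than dispositioned like known patterns.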

The human-AI review workflow

The review board is a structured workflow, not an ad hoc inspection queue. Flagged parts arrive at review stations with the full imaging data attached: the original captured images, the annotated defect and dimensional flags, the confidence scores, and the part's inspection history. Auditors can navigate through multiple angles of each flag, zoom into the annotated region, and compare the flagged feature against the reference image for that part variant.

The disposition options are deliberate in their specificity. An auditor can mark a part as Accept — the flag was a false positive, the part is conforming. They can mark it as Rework — the defect is present but can be corrected by a defined rework process. Or they can mark it as Scrap — the defect renders the part unfit for use. Each disposition requires the auditor to select a reason code and, for rework or scrap decisions, the specific defect classification from a structured list.

This structured disposition data is not just an audit trail — it is training data. Rework and scrap decisions annotated with defect classifications feed directly back into the model training pipeline. The system learns the specific language of defect classification used by the quality team, not just the visual patterns associated with defects.

Critically, the system does not allow an automated scrap or rework decision. Every NOT-OK part is reviewed by a human auditor before disposition. This constraint was a non-negotiable requirement from the quality team — and it is the right one. In aerospace manufacturing, the accountability for component disposition sits with qualified personnel, not with an algorithm. The CV system is an inspection tool, not an inspection authority.

Dimensional verification: replacing gauges with images

The dimensional verification component of the system has proved to be as valuable as defect detection, and it is the less obvious capability to those who think of computer vision primarily in terms of surface defect detection.

Traditional dimensional inspection uses physical gauges — pin gauges, plug gauges, CMM probes — that are time-consuming to set up, require calibration management, and can only measure features that the gauge physically contacts. Photogrammetric measurement from calibrated camera arrays can measure multiple features simultaneously from a single image capture, without physical contact, in under two seconds.

The CV system's dimensional analysis does not replace CMM measurement for critical features that require the highest accuracy. It operates as a 100% in-line check for features where image-based measurement provides sufficient accuracy — typically ±0.05mm or better for a well-calibrated setup. CMM measurement is reserved for the final stage on critical characteristics. The combination means that dimensional non-conformances are identified at the source operation rather than at final inspection, reducing rework costs and preventing the building of further value onto a part that will ultimately be rejected.
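At its core, an image-based dimensional check reduces to converting a pixel-space measurement through the fixture's calibration scale and testing it against the drawing tolerance. A minimal sketch with a symmetric tolerance band; the calibration factor and feature values are illustrative:

```python
def measure_feature(pixel_length: float, mm_per_pixel: float) -> float:
    """Convert a pixel-space measurement to millimetres using the
    calibration scale for this camera and fixture."""
    return pixel_length * mm_per_pixel

def in_tolerance(measured_mm: float, nominal_mm: float, tol_mm: float) -> bool:
    # Symmetric band, e.g. a hole diameter of 6.00 +/- 0.05 mm.
    return abs(measured_mm - nominal_mm) <= tol_mm
```

A feature that fails this check is flagged with both the measured value and the drawing tolerance, which is what lets the source operation act on the exceedance immediately instead of discovering it at final inspection.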

Results: consistency at scale

After twelve months of full operation, the headline outcomes were those summarised above: coverage of more than 2,500 part variants, faster inspection throughput, and a 24% reduction in defect escapes.

The deployment did not reduce the inspection headcount. That was never the intent, and it would have been the wrong metric. The inspection team is now doing higher-value work: reviewing the ambiguous cases, investigating novel defect patterns, and contributing to model training through structured disposition decisions. The volume of work the team can handle has increased; the nature of the work has shifted toward tasks that genuinely require expert human judgement.

The broader implication for manufacturers considering AI in quality operations is that the frame of "automation replacing inspection" misses the actual value proposition. Computer vision adds the most value precisely where human inspection is least reliable — in detecting subtle defect modes at high throughput, consistently across shifts and fatigue states. In aerospace, where the cost of a missed defect is catastrophic, that reliability differential is the entire argument.
