Computer-Vision Fault Inspection of Yarn Packages Based on Interpretable Deep Learning

Student thesis: PhD

Abstract

The textile manufacturing industry faces critical challenges in quality control, particularly in yarn package inspection. Manual processes remain prevalent due to challenges including the high-speed nature of production, the diversity of fault classes, the difficulty of validating and trusting automated inspection, and the distributed deployment of inspection nodes. Current inspection practice results in inconsistent quality assessment, significant material waste, and negative environmental impact. This thesis addresses these challenges by developing a comprehensive framework for computer-vision-based fault inspection of yarn packages in textile manufacturing using interpretable deep learning approaches. The research progresses systematically through interconnected approaches. The Vision-based Inspection System for Closed-loop Analysis and Recycling (VISCAR) establishes the foundation for sustainable manufacturing practices. VISCAR incorporates the Scalable Processing Engine for Edge Decision (SPEED) for computational efficiency, and the Multi-scale Fault Inspection Network (MFIN) for robust feature extraction across diverse fault classes. Building upon this framework, the Hierarchical Extraction and Tree-based Interpretable Network (HETIN) introduces prototype-driven reasoning and dual-direction feature-fault correspondence. HETIN provides transparent explanations for classification decisions while maintaining high detection accuracy. The research culminates in a Federated Edge Computing Architecture (FECA). FECA addresses statistical heterogeneity, class imbalance, and catastrophic forgetting in distributed manufacturing environments through innovative components including Alternating Federated Edge Learning (AFEL), Prototype-based Imbalance Calibration (PIC), and Variational Consolidation for Adaptive Retention (VCAR).
Experimental validation demonstrates that these integrated approaches significantly improve fault inspection accuracy, computational efficiency, and model interpretability while enabling effective deployment across distributed manufacturing facilities. HETIN achieves a state-of-the-art accuracy of 96.99% with a mean Intersection over Union (IoU) of 74.55%, outperforming conventional baselines. In terms of efficiency, the SPEED framework is shown to reduce peak processing latency by over 90% during simulated production overloads. Furthermore, FECA proves its effectiveness in distributed settings by improving model performance on specialised nodes by up to 5.5% over a generic global model, boosting F1 scores on severely imbalanced data by over 20 percentage points, and preventing a catastrophic 12% accuracy drop during continuous learning.
Date of Award: 19 Dec 2025
Original language: English
Awarding Institution
  • The University of Manchester
Supervisors: Hugh Gong (Main Supervisor) & Anura Fernando (Co-Supervisor)

Keywords

  • Computer Vision
  • Interpretable Deep Learning
  • Fault Inspection
  • Textile Quality
