ISSN: 2476-2075

Optometry: Open Access


Abstract

Revolutionizing Fashion Accessibility: Object Detection for Clothing Defect Detection in the Visually Impaired

Chris Lievens

The fashion industry plays a significant role in society, but it presents unique challenges for individuals with visual impairments. Detecting defects in clothing is a crucial task that allows these individuals to maintain their self-confidence and independence. This article reviews the use of object detection technology to identify clothing defects for blind people. We explore the current state of the art, the challenges, and potential future directions for this technology, emphasizing its impact on the lives of visually impaired individuals. Blind people often encounter difficulties in managing their clothing, particularly in identifying defects such as stains or holes. With the progress of computer vision, it is crucial to minimize these limitations as much as possible and assist blind people in selecting appropriate clothing. The objective of this paper is therefore to use object detection technology to detect and classify stains on garments. The methodology used to optimize the defect detection system was based on three main components: (i) enlarging the dataset with new defects, illumination conditions, and backgrounds; (ii) introducing data augmentation; and (iii) introducing defect classification. The authors compared and evaluated three different YOLOv5 models. The results of this study demonstrate that the proposed approach is effective and suitable for a range of challenging defect detection conditions, achieving high average precision (AP) values and paving the way for a mobile application accessible to the blind community.
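The abstract mentions data augmentation for varied illumination conditions as one of the three optimization components. As a minimal sketch of what such photometric augmentation could look like (the authors' exact pipeline is not specified in the abstract; the function name and parameters below are illustrative assumptions), a random brightness/contrast jitter on a garment image patch might be implemented as follows:

```python
import numpy as np

def augment_illumination(image, brightness=0.2, contrast=0.2, rng=None):
    """Apply random brightness/contrast jitter to a uint8 RGB image.

    Hypothetical photometric augmentation of the kind described in the
    abstract for simulating varied illumination conditions; not the
    authors' actual implementation.
    """
    if rng is None:
        rng = np.random.default_rng()
    img = image.astype(np.float32) / 255.0
    c = 1.0 + rng.uniform(-contrast, contrast)   # random contrast scale
    b = rng.uniform(-brightness, brightness)     # random brightness shift
    out = np.clip(img * c + b, 0.0, 1.0)
    return (out * 255).astype(np.uint8)

# Example: augment a dummy 64x64 "garment patch".
patch = np.full((64, 64, 3), 128, dtype=np.uint8)
aug = augment_illumination(patch, rng=np.random.default_rng(0))
print(aug.shape, aug.dtype)
```

Augmented copies produced this way would typically be added to the training set alongside the originals so that a detector such as YOLOv5 sees stains under a wider range of lighting than the raw dataset provides.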
