{"id":4590,"date":"2025-07-14T18:57:46","date_gmt":"2025-07-14T18:57:46","guid":{"rendered":"https:\/\/scian.cl\/scientific-image-analysis\/?p=4590"},"modified":"2025-07-14T18:57:46","modified_gmt":"2025-07-14T18:57:46","slug":"automatic-counting-of-dynamic-technical-actions-using-computer-vision-in-the-biomechanical-risk-assessment-of-work-related-musculoskeletal-disorders-of-the-upper-limb","status":"publish","type":"post","link":"https:\/\/scian.cl\/scientific-image-analysis\/automatic-counting-of-dynamic-technical-actions-using-computer-vision-in-the-biomechanical-risk-assessment-of-work-related-musculoskeletal-disorders-of-the-upper-limb\/","title":{"rendered":"Automatic Counting of Dynamic Technical Actions using Computer Vision in the Biomechanical Risk Assessment of Work-Related Musculoskeletal Disorders of the Upper Limb"},"content":{"rendered":"\n<p>In\u00a0<em>2024 International Symposium on 3D Analysis of Human Movement (3DAHM)<\/em><\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Abstract:<\/h2>\n\n\n\n<p>Work-related musculoskeletal disorders constitute a huge cost to economies, with 30\u201340% of affected workers in the EU and US reporting these conditions over the last two decades. Human observation methods used for biomechanical risk assessment are inherently subjective. Inertial sensors can measure posture and motion variables for ergonomic assessments, but their operational use is limited due to the need for specialized knowledge, calibration, and potential to interfere with worker&#8217;s movement and performance. Computer vision offers a non-invasive alternative to assess human motion variables, yet literature on using this method for counting \u201cDynamic Technical Actions\u201d remains scarce. This study adapts and validates an algorithm initially designed for inertial sensors, on videos of repetitive tasks and 3D joint coordinates from the University of Washington Indoor Object Manipulation Dataset. 
The algorithm automatically counts \u201cDynamic Technical Actions\u201d and determines the \u201cFrequency Factor\u201d of the Occupational Repetitive Action method using Mediapipe Pose, a ready-to-use computer vision solution from Google. Its performance was compared against the \u201cConsensus Human Count\u201d of dynamic technical actions obtained from three trained evaluators; the comparison tested 50 different threshold combinations and applied statistical analyses. The adapted algorithm, with an amplitude threshold of 15\u00b0 and a temporal threshold of 0.7 s, produced counts and frequency factors that were statistically indistinguishable from the human consensus, with errors below the standard deviations of the human counts. The adapted algorithm offers a promising tool for risk assessment, potentially improving the reproducibility of ergonomic evaluations using computer vision.<\/p>\n\n\n\n<p><a href=\"https:\/\/doi.org\/10.1109\/3DAHM62677.2024.10920871\" target=\"_blank\" rel=\"noreferrer noopener\">10.1109\/3DAHM62677.2024.10920871<\/a><\/p>\n\n\n\n<p><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Contreras, D. V., H\u00e4rtel, S., &#038; Zeman, V. C. (2024, December). Automatic Counting of Dynamic Technical Actions using Computer Vision in the Biomechanical Risk Assessment of Work-Related Musculoskeletal Disorders of the Upper Limb. In 2024 International Symposium on 3D Analysis of Human Movement (3DAHM) (pp. 1-6). 
IEEE<\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_kadence_starter_templates_imported_post":false,"_kad_post_transparent":"","_kad_post_title":"","_kad_post_layout":"","_kad_post_sidebar_id":"","_kad_post_content_style":"","_kad_post_vertical_padding":"","_kad_post_feature":"","_kad_post_feature_position":"","_kad_post_header":false,"_kad_post_footer":false,"_kad_post_classname":"","footnotes":""},"categories":[7,74],"tags":[],"class_list":["post-4590","post","type-post","status-publish","format-standard","hentry","category-publications","category-publications-2024"],"_links":{"self":[{"href":"https:\/\/scian.cl\/scientific-image-analysis\/wp-json\/wp\/v2\/posts\/4590","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/scian.cl\/scientific-image-analysis\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/scian.cl\/scientific-image-analysis\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/scian.cl\/scientific-image-analysis\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"https:\/\/scian.cl\/scientific-image-analysis\/wp-json\/wp\/v2\/comments?post=4590"}],"version-history":[{"count":1,"href":"https:\/\/scian.cl\/scientific-image-analysis\/wp-json\/wp\/v2\/posts\/4590\/revisions"}],"predecessor-version":[{"id":4591,"href":"https:\/\/scian.cl\/scientific-image-analysis\/wp-json\/wp\/v2\/posts\/4590\/revisions\/4591"}],"wp:attachment":[{"href":"https:\/\/scian.cl\/scientific-image-analysis\/wp-json\/wp\/v2\/media?parent=4590"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/scian.cl\/scientific-image-analysis\/wp-json\/wp\/v2\/categories?post=4590"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/scian.cl\/scientific-image-analysis\/wp-json\/wp\/v2\/tags?post=4590"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}