
A comparative study of in-field motion capture approaches for body kinematics measurement in construction

Published online by Cambridge University Press:  20 December 2017

JoonOh Seo*
Affiliation:
Department of Building and Real Estate, Hong Kong Polytechnic University, Hong Kong
Abdullatif Alwasel
Affiliation:
Department of Systems Design Engineering, University of Waterloo, Waterloo, ON, Canada
SangHyun Lee
Affiliation:
Department of Civil and Environmental Engineering, University of Michigan, Ann Arbor, MI, USA
Eihab M. Abdel-Rahman
Affiliation:
Department of Systems Design Engineering, University of Waterloo, Waterloo, ON, Canada
Carl Haas
Affiliation:
Department of Civil and Environmental Engineering, University of Waterloo, Waterloo, ON, Canada
*Corresponding author. E-mail: joonoh.seo@polyu.edu.hk

Summary

Due to physically demanding tasks in construction, workers are exposed to significant safety and health risks. Measuring and evaluating body kinematics while workers perform tasks helps to identify the fundamental causes of excessive physical demands, enabling practitioners to implement appropriate interventions to reduce them. Recently, non-invasive or minimally invasive motion capture approaches such as vision-based motion capture systems and angular measurement sensors have emerged, which can be used for in-field kinematics measurement with minimal interference to on-going work. Given that these approaches have pros and cons for kinematic measurement depending on the adopted sensors and algorithms, an in-depth understanding of the performance of each approach will support better decisions for their adoption in construction. With this background, the authors evaluate the performance of vision-based approaches (RGB-D sensor-, stereovision camera-, and multiple camera-based) and an angular measurement sensor-based approach (i.e., an optical encoder) for measuring body angles through experimental testing. Specifically, body angles measured by these approaches were compared with the ones obtained from a marker-based motion capture system with measurement errors of less than 0.1 mm. The results showed that the vision-based approaches have about 5–10 degrees of error in body angles, while the angular measurement sensor-based approach measured body angles with about 3 degrees of error during diverse tasks. The results indicate that, in general, these approaches are applicable to diverse ergonomic methods for identifying potential safety and health risks, such as rough postural assessment, time-and-motion study, or trajectory analysis, where some errors in motion data would not significantly sacrifice their reliability.
Combined with relatively accurate angular measurement sensors, vision-based motion capture approaches also have great potential to enable in-depth physical demand analysis, such as biomechanical analysis requiring full-body motion data, even though further improvement of accuracy is necessary. Additionally, an understanding of workers' body kinematics would enable ergonomic mechanical design for automated machines and assistive robots that help to reduce physical demands while supporting workers' capabilities.
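As an illustration of the kind of comparison described in the summary, a body (joint) angle can be computed from three tracked 3D joint positions and then compared against a reference system over a trial via root-mean-square error. This is a minimal sketch with hypothetical joint coordinates, not the study's actual processing pipeline:

```python
import math

def joint_angle(a, b, c):
    """Angle (degrees) at joint b formed by 3D points a-b-c,
    e.g. the elbow angle from shoulder, elbow, and wrist positions."""
    ba = [a[i] - b[i] for i in range(3)]
    bc = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(ba, bc))
    norm_ba = math.sqrt(sum(x * x for x in ba))
    norm_bc = math.sqrt(sum(x * x for x in bc))
    return math.degrees(math.acos(dot / (norm_ba * norm_bc)))

def rmse(measured, reference):
    """Root-mean-square error between two angle time series (degrees)."""
    n = len(measured)
    return math.sqrt(sum((m - r) ** 2 for m, r in zip(measured, reference)) / n)

# Hypothetical example: elbow bent at a right angle
shoulder = (0.0, 0.3, 0.0)
elbow = (0.0, 0.0, 0.0)
wrist = (0.25, 0.0, 0.0)
angle = joint_angle(shoulder, elbow, wrist)  # 90.0 degrees

# Compare a (made-up) vision-based angle series with a marker-based reference
vision_angles = [88.0, 91.5, 94.0]
reference_angles = [90.0, 90.0, 90.0]
error = rmse(vision_angles, reference_angles)
```

Error statistics of this form (per-joint angle error against a sub-millimetre marker-based reference) are what underlie summary figures like the 5–10 degree and 3 degree error levels reported above.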

Type
Articles
Copyright
Copyright © Cambridge University Press 2017 

