Using Machine Learning to Improve Ergonomics in Warehouse Jobs

Sept. 10, 2019
"Robots and humans could have an active collaboration, where a robot can say, 'I see that you are picking up these heavy objects from the top shelf and I think you may be doing that a lot of times. Let me help you.'"

Machine learning is everywhere. And for those working in warehouses, that's a good thing.

Researchers at the University of Washington have used machine learning to develop a new system that can monitor factory and warehouse workers and tell them how risky their behaviors are in real time.

The algorithm divides up a series of activities — such as lifting a box off a high shelf, carrying it to a table and setting it down — into individual actions and then calculates a risk score associated with each action. 

The ability to calculate this risk matters to an industry that saw the highest number of the roughly 350,000 total incidents of workers taking sick leave due to injuries affecting muscles, nerves, ligaments or tendons, such as carpal tunnel syndrome, reported by the U.S. Bureau of Labor Statistics in 2017.

"Right now workers can do a self-assessment where they fill out their daily tasks on a table to estimate how risky their activities are," said senior author Ashis Banerjee, an assistant professor in both the industrial & systems engineering and mechanical engineering departments at the UW. 

"But that's time-consuming, and it's hard for people to see how it's directly benefiting them. Now we have made this whole process fully automated. Our plan is to put it in a smartphone app so that workers can even monitor themselves and get immediate feedback."

For these self-assessments, people currently use a snapshot of a task being performed. The position of each joint gets a score, and the sum of all the scores determines how risky that pose is. But workers usually perform a series of motions for a specific task, and the researchers wanted their algorithm to be able to compute an overall score for the entire action.
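As a rough illustration of that snapshot-style scoring, here is a minimal sketch that sums per-joint scores into a single pose score. The joint names, angle bands and score values are placeholders for illustration, not an actual ergonomic assessment table.

# Illustrative sketch of snapshot-style pose scoring: each joint's deviation
# from a neutral posture maps to a small integer score, and the sum of those
# scores indicates how risky the pose is. All values here are placeholders.

def joint_score(angle_degrees):
    """Map one joint's deviation from neutral to a small integer score."""
    # Hypothetical scoring bands, for demonstration only.
    if angle_degrees < 20:
        return 1
    if angle_degrees < 45:
        return 2
    if angle_degrees < 90:
        return 3
    return 4

def pose_risk(joint_angles):
    """Sum per-joint scores for one snapshot of a worker's posture."""
    return sum(joint_score(angle) for angle in joint_angles.values())

snapshot = {"trunk_flexion": 60, "neck_flexion": 25, "upper_arm": 95, "knee": 10}
print(pose_risk(snapshot))  # 3 + 2 + 4 + 1 = 10; higher totals mean a riskier pose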

Moving to video is more accurate, but it requires a new way to add up the scores. To train and test the algorithm, the team created a dataset containing 20 three-minute videos of people doing 17 activities that are common in warehouses or factories.

"One of the tasks we had people do was pick up a box from a rack and place it on a table," said first author Behnoosh Parsa, , a UW mechanical engineering doctoral student. "We wanted to capture different scenarios, so sometimes they would have to stretch their arms, twist their bodies or bend to pick something up."

The researchers captured their dataset using a Microsoft Kinect camera, which recorded 3D videos that allowed them to map out what was happening to the participants' joints during each task.

Using the Kinect data, the algorithm first learned to compute risk scores for each video frame. Then it progressed to identifying when a task started and ended so that it could calculate a risk score for an entire action.
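A minimal sketch of that two-stage idea, assuming per-frame skeleton data like the Kinect provides: score every frame, then split the frame sequence at action boundaries and report one score per action. The frame_risk feature, the hand-supplied boundary indices and the peak-score aggregation are assumptions for illustration, not the team's trained models.

import numpy as np

# Sketch of the two-stage idea: (1) a per-frame risk score from joint data,
# (2) split the frame sequence into actions and aggregate each segment.
# frame_risk() and the segmentation inputs stand in for the learned models.

def frame_risk(joint_positions):
    """Placeholder per-frame score; the real system uses a learned model."""
    # joint_positions: (num_joints, 3) array of 3D coordinates from the Kinect.
    trunk_bend = abs(joint_positions[0, 2] - joint_positions[1, 2])  # toy feature
    return float(trunk_bend)

def action_scores(frame_scores, boundaries):
    """Aggregate frame scores into one score per action segment."""
    # boundaries: frame indices where one action ends and the next begins.
    segments = np.split(np.asarray(frame_scores), boundaries)
    return [seg.max() for seg in segments]  # peak risk within each action

# Example: 300 frames of synthetic skeleton data (25 joints), two boundaries.
frames = np.random.rand(300, 25, 3)
scores = [frame_risk(f) for f in frames]
print(action_scores(scores, boundaries=[120, 210]))  # one score per action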

The algorithm labeled three actions in the dataset as risky behaviors: picking up a box from a high shelf, and placing either a box or a rod onto a high shelf.

Now the team is developing an app that factory workers and supervisors can use to monitor the risks of their daily actions in real time. The app will provide warnings for moderately risky actions and alerts for high-risk actions.
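The app's feedback logic might look something like the sketch below, which maps an action's risk score to an on-screen message. The cutoff values and wording are hypothetical; the article does not specify the thresholds the UW app will use.

# Hypothetical mapping from an action's risk score to app feedback.
# The cutoff values below are illustrative only.

MODERATE_RISK = 5.0   # assumed threshold for a warning
HIGH_RISK = 8.0       # assumed threshold for an alert

def feedback(action_score):
    if action_score >= HIGH_RISK:
        return "ALERT: high-risk action detected"
    if action_score >= MODERATE_RISK:
        return "Warning: moderately risky action"
    return "OK"

for score in (3.2, 6.1, 9.4):
    print(score, feedback(score))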

Eventually, the researchers want robots in warehouses or factories to be able to use the algorithm to help keep workers healthy. To see how well the algorithm could work in a hypothetical warehouse, the researchers had a robot monitor two participants performing the same activities. Within three seconds of the end of each activity, the robot showed a score on its display.

"Factories and warehouses have used automation for several decades. Now that people are starting to work in settings where robots are used, we have a unique opportunity to split up the work so that the robots are doing the risky jobs," Banerjee said. "Robots and humans could have an active collaboration, where a robot can say, 'I see that you are picking up these heavy objects from the top shelf and I think you may be doing that a lot of times. Let me help you.'"
