The ergofox measures posture directly at the workplace. A holistic analysis concept, combined with individual tips and exercises, helps eliminate harmful strain and promotes a healthier sitting posture in the long term.
The integrated depth sensor measures distances in three-dimensional space and from them computes a depth image of the measurement environment. This depth image, a set of 3D point data, serves as the input to our algorithms for recognizing your sitting posture.
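To illustrate the idea of turning a depth image into point data, here is a minimal sketch of standard pinhole-camera back-projection. The intrinsic parameters (`fx`, `fy`, `cx`, `cy`) are illustrative placeholder values, not the ergofox sensor's real calibration, and the function name is hypothetical.

```python
# Minimal sketch: back-project a depth image into 3D points ("point data").
# fx/fy/cx/cy are assumed example intrinsics, not the real sensor calibration.

def depth_to_points(depth, fx=365.0, fy=365.0, cx=None, cy=None):
    """Convert a depth image (rows of distances in metres) to (x, y, z) points."""
    rows, cols = len(depth), len(depth[0])
    cx = cols / 2.0 if cx is None else cx  # default principal point: image centre
    cy = rows / 2.0 if cy is None else cy
    points = []
    for v in range(rows):
        for u in range(cols):
            z = depth[v][u]
            if z <= 0:  # no valid sensor return at this pixel
                continue
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points
```

Each valid pixel thus becomes one 3D point; the resulting point cloud is what posture-recognition algorithms operate on.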
Based on the depth data supplied by the sensor, the built-in single-board computer calculates selected body points in real time, directly on the device. These are transmitted to us in Hamburg as encrypted 3D coordinates (x/y/z) via an integrated SIM card and visualized as a sitting posture by trained algorithms.
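As a rough sketch of what such an upload payload could look like, the snippet below packs named body points as x/y/z coordinates. The keypoint names and JSON structure are assumptions for illustration only; the actual transport encryption (e.g. TLS over the cellular link) is out of scope here.

```python
# Hedged sketch of a coordinate payload; keypoint names are assumed, and
# real encryption of the payload in transit is not shown.
import json

KEYPOINTS = ["head", "neck", "shoulder_left", "shoulder_right", "spine"]  # illustrative

def build_payload(points):
    """Pack one (x, y, z) tuple per named body point into a JSON string."""
    if len(points) != len(KEYPOINTS):
        raise ValueError("expected one 3D point per keypoint")
    coords = {name: {"x": p[0], "y": p[1], "z": p[2]}
              for name, p in zip(KEYPOINTS, points)}
    return json.dumps({"keypoints": coords})
```

Sending only a handful of abstract coordinates rather than the full depth image keeps the transmitted data small and avoids uploading anything resembling a camera picture.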
Our posture analysis is also suitable for the home office, helping all employees stay healthy when working from home. Sitting posture can thus be measured, analyzed, and improved even at workstations without an external monitor.
"Do I sit up straight or do I push my head forward? From an ergonomic perspective, that's an important difference!"
"The ergofox is innovative and individual. Solutions like this are exactly what we are looking for in our occupational health management (BGM)."
"A great added value for our employees and super easy to use."
Would you like to support your team and their health, and are you looking for an innovative, flexible solution?
With the ergofox you can address every employee in your company individually, providing competent support to prevent postural damage and promote a sustainably healthy sitting posture at the workplace. Depending on company size, you can use one or more ergofox devices; once a measurement is complete, the device is simply passed on to the next person. Up to 80 employees in your company can benefit from a single ergofox within a year.
The ergofox is the successful result of a three-year EU research project in cooperation with the Technical University of Vienna. The project was supported and funded by: