Utah's Foremost Platform for Undergraduate Research Presentation
2014 Abstracts

Learning Hand-to-Mouth Movements via Triaxial Accelerometers

Stephen Clarkson, Brigham Young University

Health

While there is an abundance of mobile health apps for weight management on the market today, almost all focus entirely on net caloric intake (calories consumed minus calories burned through exercise). Recording daily caloric intake can be cumbersome, inefficient, and inaccurate. One emerging suggestion in the health field for reaching weight goals is to objectively record hand-to-mouth movements (HTMMs) during meals throughout the day. This method focuses entirely on portion control, and if any improvements are to be made in this area, an effective method of activity recognition must be developed. In this paper, we report our efforts to classify HTMMs and non-HTMMs in order to automate counting the number of HTMMs during meals throughout the day. We also report on the performance of several base-level classifiers, such as k-NN, Naive Bayes, and Decision Trees, as well as meta-level classifiers (Voting, Bagging, and Boosting).
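The abstract does not specify features or implementation details, but the general approach it describes — training base-level classifiers on windows of triaxial accelerometer data and combining them with a meta-level classifier — can be sketched roughly as follows. This is a minimal illustration with synthetic data, not the authors' actual pipeline; the feature choice (per-axis mean and standard deviation per window) and the simulated HTMM signal profile are assumptions for demonstration only.

```python
# Hypothetical sketch: distinguishing HTMM vs. non-HTMM windows of
# triaxial accelerometer data with the classifiers named in the abstract.
# The data here is synthetic; real work would extract features from
# labeled accelerometer recordings.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Assumed features per window: mean and std of each axis (x, y, z).
# HTMM windows (label 1) are simulated with a different profile than
# non-HTMM windows (label 0); the specific values are illustrative.
n = 400
htmm = rng.normal(loc=[0.2, -0.8, 0.1, 0.5, 0.6, 0.4], scale=0.3, size=(n, 6))
other = rng.normal(loc=[0.0, 0.0, 0.9, 0.2, 0.2, 0.2], scale=0.3, size=(n, 6))
X = np.vstack([htmm, other])
y = np.array([1] * n + [0] * n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Base-level classifiers (k-NN, Naive Bayes, Decision Tree) combined by
# majority vote -- one of the meta-level strategies the abstract mentions.
clf = VotingClassifier([
    ("knn", KNeighborsClassifier(n_neighbors=5)),
    ("nb", GaussianNB()),
    ("tree", DecisionTreeClassifier(random_state=0)),
])
clf.fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
print(f"voting accuracy: {accuracy:.2f}")
```

Bagging and Boosting could be substituted via scikit-learn's `BaggingClassifier` or `AdaBoostClassifier` in the same way; on real accelerometer data, performance would depend heavily on windowing and feature extraction choices.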