Conference (international): Improving the Accuracy of Walking Context Recognition Using Step Rate
Takuzo Ikuta (Ritsumeikan University), Kota Tsubouchi, Nobuhiko Nishio (Ritsumeikan University)
The eleventh edition of the International Conference on Indoor Positioning and Indoor Navigation (IPIN 2021)
Demand for the recognition of purchasing behavior has been increasing because it is expected to be valuable for marketing. One method of recognizing purchasing behavior classifies customer behavior into three states: "moving", "searching" (for a product), and "observing" (i.e., examining the label, etc.) on the basis of acceleration and angular velocity data obtained from a smartphone's sensors. The method can recognize a user's purchasing behavior without relying on external infrastructure such as cameras or radio frequency identifiers (RFIDs). However, its recognition accuracy is low, falling below 30% in the searching state in particular. In this study, we focus on the fact that the number of steps per unit time (step rate) differs between the states. That is, the customer's step rate becomes lower as the state changes from moving to observing, because it is difficult to examine products while on the move. In addition, we propose a two-stage classifier based on these differences in walking motion. Through the step-rate features and the two-stage classifier, we aim to improve the accuracy of behavior recognition. For the evaluation, we asked subjects to walk freely and shop for products in a supermarket and recorded them on video. We compared the classification accuracies of three feature sets: the base features, the step-rate features, and the total features combining the two. Our evaluation indicated that the accuracy score of the step-rate features improved by 12.7% and that of the total features improved by 18.2% relative to the base features. In addition, the two-stage classifier using the total features improved the accuracy score by 2.3% compared with the one-stage classifier. By further subdividing purchasing behaviors, for example by adding a "crouching" motion, we expect to further improve accuracy and recognize purchasing behaviors in more detail.
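The step-rate feature at the core of the abstract can be sketched as a simple peak count over a window of accelerometer samples. The peak-counting scheme and the `thresh` parameter below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def estimate_step_rate(acc, fs, thresh=1.0):
    """Estimate the step rate (steps per second) from a window of 3-axis
    accelerometer samples by counting peaks in the acceleration magnitude.

    acc    : (N, 3) array of accelerations in m/s^2
    fs     : sampling rate in Hz
    thresh : peak threshold in m/s^2 (hypothetical tuning value)
    """
    mag = np.linalg.norm(acc, axis=1)   # magnitude is orientation-independent
    mag = mag - mag.mean()              # remove the constant gravity offset
    # a sample counts as a step peak if it exceeds the threshold and is a
    # local maximum relative to its immediate neighbours
    peaks = (mag[1:-1] > thresh) & (mag[1:-1] >= mag[:-2]) & (mag[1:-1] >= mag[2:])
    return peaks.sum() / (len(mag) / fs)
```

A lower estimated rate would then point toward the searching or observing states, and a higher one toward the moving state.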
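The two-stage structure could be sketched as follows, assuming stage one separates "moving" from the slower states by step rate and stage two splits the remainder into "searching" versus "observing" by residual motion; both the staging rule and the thresholds are hypothetical placeholders, not the paper's classifier:

```python
def classify_two_stage(step_rate, acc_variance,
                       rate_thresh=1.5, var_thresh=0.5):
    """Hypothetical sketch of a two-stage walking-context classifier.

    step_rate    : estimated steps per second for the window
    acc_variance : variance of acceleration magnitude in the window
    Thresholds are illustrative values, not those from the paper.
    """
    # stage 1: a brisk step rate indicates the customer is in transit
    if step_rate > rate_thresh:
        return "moving"
    # stage 2: residual body motion at a low step rate suggests scanning
    # shelves (searching); near-stillness suggests examining one product
    return "searching" if acc_variance > var_thresh else "observing"
```

In practice each stage would be a trained classifier over the feature sets rather than fixed thresholds; the sketch only shows how the decision is split across two stages.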