Equine daily behavior is a key welfare indicator, offering insights into how environmental and training conditions influence health and well-being. Continuous direct behavior observation, however, is labor-intensive and impractical for large-scale studies. While advances in wearable sensors and deep learning have revolutionized human and animal activity recognition, automated wearable sensor systems for recognizing a diverse repertoire of equine daily behaviors remain limited.
We propose a hierarchical deep learning framework that combines a Time-Distributed Residual LSTM-CNN, which extracts local spatiotemporal features from short subsegments of sensor data, with a bidirectional LSTM (BiLSTM), which captures long-term temporal dependencies.
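The sketch below illustrates one plausible realization of this hierarchy in PyTorch: each subsegment is encoded by a time-distributed residual CNN followed by a local LSTM, and a BiLSTM then models dependencies across subsegments. Layer widths, kernel sizes, and the 10 × 50 subsegment layout are illustrative assumptions, not the hyperparameters reported in the study.

```python
# A minimal PyTorch sketch of the hierarchical architecture described above.
# All sizes are assumptions for illustration.
import torch
import torch.nn as nn


class ResidualConvBlock(nn.Module):
    """1-D convolutional block with a skip connection (assumed design)."""

    def __init__(self, channels: int) -> None:
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm1d(channels),
            nn.ReLU(),
            nn.Conv1d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm1d(channels),
        )
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(x + self.conv(x))


class HierarchicalBehaviorModel(nn.Module):
    """Time-distributed residual CNN-LSTM per subsegment, BiLSTM across subsegments."""

    def __init__(self, n_channels: int = 6, n_classes: int = 15, hidden: int = 64) -> None:
        super().__init__()
        self.embed = nn.Conv1d(n_channels, hidden, kernel_size=3, padding=1)
        self.res_block = ResidualConvBlock(hidden)
        self.local_lstm = nn.LSTM(hidden, hidden, batch_first=True)   # per-subsegment summary
        self.bilstm = nn.LSTM(hidden, hidden, batch_first=True,
                              bidirectional=True)                     # long-term dependencies
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_subsegments, subsegment_len, n_channels)
        b, s, t, c = x.shape
        x = x.reshape(b * s, t, c).transpose(1, 2)   # fold subsegments into the batch
        x = self.res_block(self.embed(x))            # local convolutional features
        x = x.transpose(1, 2)                        # (b*s, time, hidden)
        _, (h, _) = self.local_lstm(x)               # final hidden state per subsegment
        feats = h[-1].reshape(b, s, -1)              # (batch, subsegments, hidden)
        out, _ = self.bilstm(feats)
        return self.head(out[:, -1])                 # logits over the behavior classes


# Example: 6-channel (accelerometer + gyroscope) windows, 10 subsegments of 50 samples.
model = HierarchicalBehaviorModel()
logits = model(torch.randn(4, 10, 50, 6))            # -> shape (4, 15)
```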
Our model was validated using approximately 60 h of tri-axial accelerometer and gyroscope data collected from 10 horses wearing collar-mounted sensors; fifteen daily behaviors were labeled based on video recordings. The model achieved an overall classification accuracy of > 93 % in 10-fold cross-validation and > 85 % in leave-one-subject-out cross-validation. Classification performance was significantly affected by housing conditions and the resulting variation in behavior frequencies across the dataset.
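For reference, the snippet below sketches how a leave-one-subject-out split can be realized with scikit-learn's LeaveOneGroupOut, grouping windows by horse ID so that each fold tests on an animal unseen during training. The arrays X, y, and groups are synthetic stand-ins, and the training loop is only a placeholder.

```python
# Sketch of leave-one-subject-out cross-validation; X, y, and groups are
# synthetic stand-ins for the real windowed sensor data, labels, and horse IDs.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut

X = np.random.randn(200, 10, 50, 6)          # (windows, subsegments, samples, channels)
y = np.random.randint(0, 15, size=200)       # 15 behavior classes
groups = np.random.randint(0, 10, size=200)  # horse ID per window (10 horses)

logo = LeaveOneGroupOut()
for fold, (train_idx, test_idx) in enumerate(logo.split(X, y, groups)):
    # A real training/evaluation loop would go here; we only report the split.
    held_out = set(groups[test_idx])
    print(f"fold {fold}: held-out horse {held_out}, "
          f"{len(train_idx)} train / {len(test_idx)} test windows")
```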
This study provides a validated framework for sensor-based automatic behavior recognition in horses, capable of capturing both local spatiotemporal features and long-term temporal dependencies in raw sensor data. The proposed framework enables scalable and reliable monitoring of equine daily behaviors and contributes to the development of automated, data-driven approaches to equine welfare assessment.