Königsweg 63
14163 Berlin
+49 30 838 62676
gefluegelkrankheiten@vetmed.fu-berlin.de
This study investigates the use of deep learning, specifically convolutional neural networks (CNNs), and transfer learning for detecting signs of discomfort in chickens through image analysis.
We present a comprehensive framework that includes data preparation, model training, and evaluation using transfer learning with pre-trained CNN models such as EfficientNet and MobileNet. The methodology covers image extraction from video footage, followed by preprocessing and augmentation to improve dataset diversity and robustness.
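As a minimal sketch of such a transfer-learning setup, the following assumes a TensorFlow/Keras pipeline with a frozen EfficientNetB0 backbone, simple augmentation layers, and a binary "stressed vs. not stressed" head; the input size, layer choices, and hyperparameters are illustrative assumptions, not the exact configuration used in the study.

```python
import tensorflow as tf

IMG_SIZE = (224, 224)  # assumed input resolution

# Simple augmentation to improve dataset diversity and robustness.
augment = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),
    tf.keras.layers.RandomRotation(0.1),
    tf.keras.layers.RandomZoom(0.1),
])

# Pre-trained backbone with ImageNet weights, frozen for transfer learning.
base = tf.keras.applications.EfficientNetB0(
    include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,)
)
base.trainable = False

inputs = tf.keras.Input(shape=IMG_SIZE + (3,))
x = augment(inputs)
x = base(x, training=False)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
x = tf.keras.layers.Dropout(0.2)(x)
outputs = tf.keras.layers.Dense(1, activation="sigmoid")(x)  # stressed vs. not stressed

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```

Swapping in MobileNet (e.g. tf.keras.applications.MobileNetV2) only changes the backbone line; the rest of the pipeline stays the same.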
Model performance was evaluated using cross-validation on the original dataset and validation on two separate datasets, with metrics such as accuracy, sensitivity, and specificity. The CNN results were compared with human observers' stress ratings of the same chicken images, obtained using the Stressed Chicken Scale.
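For reference, accuracy, sensitivity, and specificity for a binary "stressed vs. not stressed" classifier can be derived from the confusion matrix; the sketch below uses scikit-learn, and the array names y_true and y_pred are illustrative assumptions.

```python
from sklearn.metrics import accuracy_score, confusion_matrix

def evaluate_binary(y_true, y_pred):
    # Confusion matrix with "stressed" as the positive class (label 1).
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "sensitivity": tp / (tp + fn),  # true positive rate
        "specificity": tn / (tn + fp),  # true negative rate
    }
```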
We found that AI can detect discomfort in individual chickens in side-view images with performance comparable to that of human observers. Certain CNN models, in particular EfficientNet variants, achieved high performance in identifying signs of stress in chickens. These results highlight the potential of deep learning for automated animal welfare monitoring.
To enhance model interpretability, we used Grad-CAM (gradient-weighted class activation mapping), which provides valuable insights into the decision-making process of the models. We found that the AI “looks” at specific body parts of the chickens when making decisions.
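A minimal Grad-CAM sketch in Keras is shown below, assuming a model like the one sketched earlier; the name last_conv_layer_name and the single-output "stressed" score are assumptions for illustration, not the study's exact implementation.

```python
import numpy as np
import tensorflow as tf

def grad_cam_heatmap(model, image, last_conv_layer_name):
    # Model mapping the input image to the last conv activations and the prediction.
    grad_model = tf.keras.Model(
        model.inputs,
        [model.get_layer(last_conv_layer_name).output, model.output],
    )
    with tf.GradientTape() as tape:
        conv_out, preds = grad_model(image[tf.newaxis, ...])
        score = preds[:, 0]  # probability of the "stressed" class

    # Gradients of the score w.r.t. the conv feature maps, pooled per channel.
    grads = tape.gradient(score, conv_out)
    weights = tf.reduce_mean(grads, axis=(1, 2))

    # Weighted sum of feature maps, ReLU, and normalisation to [0, 1].
    heatmap = tf.reduce_sum(weights[:, tf.newaxis, tf.newaxis, :] * conv_out, axis=-1)
    heatmap = tf.nn.relu(heatmap)[0]
    return (heatmap / (tf.reduce_max(heatmap) + 1e-8)).numpy()
```

The resulting heatmap can be resized to the image dimensions and overlaid on the original photo to visualise which body regions drive the model's decision.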
This research contributes to the development of innovative, non-invasive methods for monitoring chicken welfare, and may provide the foundation for a useful tool for early detection of stress and discomfort indicators in chickens at the individual animal level.