
Department of Veterinary Medicine



    Publication Database

    Recognizing stressed chicken signs: A comparison using the Happy Chicken Tool and the Stressed Chicken Scale (2025)

    Type
    Journal article / scientific contribution
    Authors
    Schlegel-Pape, Larissa (WE 15)
    Opitz, Robert
    Henning, Marko
    Cruz, Cristina Ortiz
    Kleine, Anne S.
    Gebhardt-Henrich, Sabine G.
    Mielke, Hans
    Fischer-Tenhagen, Carola
    Source
    Poultry Science
    Pages: 106141
    ISSN: 0032-5791
    Language
    English
    Links
    URL (full text): https://linkinghub.elsevier.com/retrieve/pii/S0032579125013811
    DOI: 10.1016/j.psj.2025.106141
    Contact
    Farm Animal Clinic: Poultry Section

    Königsweg 63
    14163 Berlin
    +49 30 838 62676
    gefluegelkrankheiten@vetmed.fu-berlin.de

    Abstract

    This study investigates the use of deep learning, specifically convolutional neural networks (CNNs), and transfer learning for detecting signs of discomfort in chickens through image analysis.
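In transfer learning, a CNN pre-trained on a large dataset (such as EfficientNet or MobileNet) is reused as a fixed feature extractor, and only a small classifier head is fitted to the new task. The paper does not publish its training code, so the following is a minimal illustrative sketch, assuming backbone features have already been extracted; the toy data, the `train_head` function, and all hyperparameters are hypothetical stand-ins.

```python
import numpy as np

def train_head(features, labels, lr=0.1, epochs=200):
    """Fit a logistic-regression 'head' on frozen backbone features.

    In transfer learning, the pre-trained CNN acts as a fixed feature
    extractor; only this small head is trained on the new binary
    stressed / not-stressed labels. Features here are stand-ins.
    """
    n, d = features.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        z = features @ w + b
        p = 1.0 / (1.0 + np.exp(-z))   # sigmoid probability of "stressed"
        grad = p - labels              # dL/dz for the log loss
        w -= lr * features.T @ grad / n
        b -= lr * grad.mean()
    return w, b

# Toy, well-separated data standing in for backbone feature vectors
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1, 0.3, (20, 4)), rng.normal(1, 0.3, (20, 4))])
y = np.array([0] * 20 + [1] * 20)
w, b = train_head(X, y)
preds = (X @ w + b > 0).astype(int)
```

Because the backbone stays frozen, only a few hundred parameters are learned, which is why transfer learning works with comparatively small, labeled image sets.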
    We present a comprehensive framework that includes data preparation, model training, and evaluation using transfer learning with pre-trained CNN models such as EfficientNet and MobileNet. The methodology includes image extraction from video footage, followed by preprocessing and augmentation to improve dataset diversity and robustness.
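The exact augmentation pipeline used in the study is not specified in the abstract; a minimal sketch of two common choices for side-view animal images, random horizontal flips and brightness jitter, might look like this (the `augment` function and its parameters are illustrative assumptions):

```python
import numpy as np

def augment(image, rng):
    """Apply simple, typical augmentations to an H x W x 3 uint8 image.

    Horizontal flips and brightness jitter are illustrative choices,
    not necessarily the augmentations used in the study.
    """
    out = image.astype(np.float32)
    if rng.random() < 0.5:           # random horizontal flip
        out = out[:, ::-1, :]
    factor = rng.uniform(0.8, 1.2)   # random brightness jitter
    out = np.clip(out * factor, 0, 255)
    return out.astype(np.uint8)

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(224, 224, 3), dtype=np.uint8)
aug = augment(img, rng)
```

Applying such label-preserving transforms at training time effectively enlarges the dataset and makes the model less sensitive to pose and lighting variation.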
    Model performance was evaluated using cross-validation on the original dataset and validation on two separate datasets, with metrics such as accuracy, sensitivity, and specificity. Results of the CNNs were compared to human observers' stress ratings on the same image datasets of chickens using the Stressed Chicken Scale.
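For a binary stressed / not-stressed classifier, the reported metrics follow directly from the confusion matrix: accuracy is the fraction of correct predictions, sensitivity the fraction of truly stressed chickens detected, and specificity the fraction of non-stressed chickens correctly passed. A small self-contained sketch (labels and predictions here are made-up toy values):

```python
def confusion_metrics(y_true, y_pred):
    """Accuracy, sensitivity, and specificity from binary labels,
    with 1 = stressed and 0 = not stressed."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0  # true positive rate
    specificity = tn / (tn + fp) if (tn + fp) else 0.0  # true negative rate
    return accuracy, sensitivity, specificity

acc, sens, spec = confusion_metrics([1, 1, 0, 0, 1, 0], [1, 0, 0, 1, 1, 0])
# 2 TP, 2 TN, 1 FP, 1 FN -> accuracy 4/6, sensitivity 2/3, specificity 2/3
```

Reporting sensitivity and specificity separately matters here because missing a stressed bird (low sensitivity) and flagging a healthy one (low specificity) carry different welfare costs.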
    We found that AI can detect discomfort in individual chickens in side-view images with a performance comparable to that of human observers. Certain CNN models, in particular variants of EfficientNet, achieved high performance in identifying stress signs in chickens. These results highlight the potential of deep learning for automated animal welfare monitoring.
    To enhance model interpretability, we used Grad-CAM (gradient-weighted class activation mapping), which provides valuable insights into the decision-making process of the models. We found that the AI “looks” at specific body parts of the chickens when making decisions.
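The core Grad-CAM computation is framework-independent: each channel of the last convolutional layer's feature maps is weighted by the spatially averaged gradient of the class score with respect to that channel, the weighted maps are summed, and a ReLU keeps only positive evidence. A minimal numpy sketch of that arithmetic (the activations and gradients here are random stand-ins for what a CNN would produce):

```python
import numpy as np

def grad_cam(activations, gradients):
    """Core Grad-CAM computation.

    activations: (C, H, W) feature maps from the last conv layer
    gradients:   (C, H, W) gradients of the class score w.r.t. those maps
    Returns an (H, W) heat map normalized to [0, 1].
    """
    weights = gradients.mean(axis=(1, 2))             # alpha_k: pooled gradients
    cam = np.tensordot(weights, activations, axes=1)  # weighted sum over channels
    cam = np.maximum(cam, 0)                          # ReLU: keep positive evidence
    if cam.max() > 0:
        cam = cam / cam.max()                         # normalize for visualization
    return cam

rng = np.random.default_rng(0)
acts = rng.random((8, 7, 7)).astype(np.float32)       # stand-in feature maps
grads = rng.standard_normal((8, 7, 7)).astype(np.float32)
heatmap = grad_cam(acts, grads)
```

Upsampled and overlaid on the input image, such a heat map shows which regions, for instance specific body parts of a chicken, drove the model's stress classification.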
    This research contributes to the development of innovative, non-invasive methods for monitoring chicken welfare and may provide the foundation for a useful tool for early detection of stress and discomfort indicators in chickens at the individual animal level.