
Fachbereich Veterinärmedizin


    Towards an automated surveillance of well-being in mice using deep learning (2019)

    Type
    Talk
    Authors
    Andresen, Niek
    Wöllhaf, Manuel
    Hohlbaum, Katharina (WE 11)
    Lewejohann, Lars (WE 11)
    Hellwich, Olaf
    Thöne-Reineke, Christa (WE 11)
    Belik, Vitaly (WE 16)
    Congress
    19th Annual Congress of EUSAAT
    Linz, Austria, 10.–13.10.2019
    Source
    ALTEX proceedings
    Volume: 8
    Issue: 1
    Pages: 3
    ISSN: 2194-0479
    Language
    English
    Links
    URL (full text): http://www.altex.ch/resources/altex_Linz2019_full.pdf
    Contact
    Institut für Veterinär-Epidemiologie und Biometrie

    Königsweg 67
    14163 Berlin
    +49 30 838 56034
    epi@vetmed.fu-berlin.de

    Abstract

    Appropriate refinement methods can only be applied if we are aware that the well-being of an animal is compromised [1]. Therefore, tools to assess pain, suffering, and distress in laboratory animals are in high demand. In recent years, coding systems for analyzing facial expressions of pain have been developed for various animal species, including mice, the most commonly used laboratory animals. The so-called Mouse Grimace Scale (MGS) is accurate and reliable [2] and has become a valuable tool for assessing the well-being of mice. However, applying the MGS is very time-consuming because human scorers must be thoroughly trained and must be present to generate live scores. The presence of a human is disadvantageous for well-being assessment, since mice are prey animals and often hide signs of weakness, injury, and pain in the presence of humans [3]. Alternatively, images/videos can be acquired and scored retrospectively, which does not necessarily require the presence of humans. However, if retrospectively obtained MGS scores indicate impaired well-being, there is no chance to intervene and apply refinement measures at the right moment. Furthermore, the well-being of a mouse can only be assessed during periods in which the animals are monitored and humans evaluate their status.
    Given the great effort and limitations of manual MGS scoring, it is crucial to find a way to automatically monitor the well-being of a mouse [4]. Since facial expression analysis has been shown to be useful in mice, we focused on facial expressions as a first step and aimed to develop automated facial expression recognition software for mice [4]. For this approach, we used a data set of images of C57BL/6JRj mice that had been generated in previous experiments after anesthesia (with isoflurane or ketamine/xylazine) and surgery (castration, under isoflurane and meloxicam) [4]. Images were taken in an observation cage (22 × 29 × 39 cm, three white walls, one clear wall) [4]. Since the mice moved freely in the observation cage, the images contain natural variation in perspective and background [4]. On the one hand, this makes data analysis more challenging; on the other hand, our data set reflects realistic conditions as they would be obtainable without human intervention [4].

    Images of the data set were divided into two categories, 1) impaired well-being and 2) unimpaired well-being, in order to train a binary classifier [4]. Three convolutional neural network architectures (two pre-trained state-of-the-art deep CNNs, ResNet50 and InceptionV3, and one CNN without pre-training) were used and achieved an accuracy of up to 99% for the two categories [4]. The result depended on the treatment of the mice. The decision-making process of the CNN architectures was mainly based on the facial expressions of the mouse [4].
    Our semi-automated pipeline provides a first step towards the long-term goal of developing fully automated surveillance (“smart environment”) for mice [4].
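    The classification setup described in the abstract — a pre-trained deep CNN adapted to the two well-being categories — can be sketched roughly as follows. This is a minimal illustration of the ResNet50 variant using PyTorch, not the authors' actual pipeline; the batch size, image size, and class labels are assumptions for demonstration (weights are left uninitialized here to avoid a download, whereas the study used pre-trained networks).

    ```python
    import torch
    import torch.nn as nn
    from torchvision import models

    # Start from a ResNet50 backbone (the study used a pre-trained one;
    # weights=None here only to keep the sketch self-contained).
    model = models.resnet50(weights=None)

    # Replace the final fully connected layer with a two-class head:
    # 0 = "well-being impaired", 1 = "well-being unimpaired" (assumed labels).
    model.fc = nn.Linear(model.fc.in_features, 2)
    model.eval()

    # Forward pass on a dummy batch of four 224x224 RGB images,
    # standing in for frames captured in the observation cage.
    x = torch.randn(4, 3, 224, 224)
    with torch.no_grad():
        logits = model(x)              # shape: (4, 2)
        probs = logits.softmax(dim=1)  # per-image class probabilities
    ```

    In a transfer-learning setup of this kind, the backbone weights would be initialized from ImageNet pre-training and fine-tuned on the labeled mouse images with a standard cross-entropy loss.
    
    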