Prematurely born infants receive special care in the Neonatal Intensive Care Unit (NICU), where various physiological parameters, such as heart rate, oxygen saturation and temperature, are continuously monitored. However, there is no system for monitoring and interpreting their facial expressions, the most prominent indicator of discomfort. Clinically, pain and discomfort are considered subjective experiences and are typically measured by patient self-report. Healthy adults can indicate the intensity, location and duration of their pain, but infants cannot communicate verbally and are therefore unable to self-report. This inability to report pain reliably leaves neonates vulnerable to under-recognition and to under- or over-treatment. Nurses and parents are then responsible for assessing the infant's discomfort and taking action whenever treatment is needed.
The assessment of pain in infants is considered one of the most challenging problems in neonatology. There are many reasons to recognize pain in infants. Most significantly, pain is a major indication of infant illness. Furthermore, the quality of the care that an infant receives depends largely on the quality of the pain assessment. It is important to recognize and treat pain, since persistent unrelieved pain can cause severe complications, such as nervous-system changes and delayed development. Behavioral pain indicators include crying, changes in body movements and changes in facial expressions. Physiological changes involve increases in heart rate, respiratory rate and blood pressure, as well as changes in the blood levels of oxygen and carbon dioxide. Many pain assessment tools have been created to assist healthcare professionals in identifying and quantifying pain and discomfort, such as COMFORT, PIPP (Premature Infant Pain Profile), BIIP (Behavioral Indicators of Infant Pain) and MIPS (Modified Infant Pain Scale). However, there is currently no broadly accepted tool for assessing neonatal pain; the main controversy surrounding such tools is the subjectivity of the observer. Moreover, despite the significance of pain recognition, most neonatal intensive care units lack the resources to monitor pain continuously, and manual assessment is time-consuming.
In this project, we investigate an experimental video monitoring system for automatic discomfort detection in infants’ faces based on the analysis of their facial expressions. The proposed system uses appearance modeling and/or specific facial features to robustly track both the global motion of the newborn’s face and its inner features. The system detects discomfort by classifying the feature representations of the face on a frame-by-frame basis with a Support Vector Machine (SVM). Several aspects increase the performance of the system, such as the extraction of several histogram-based texture descriptors to improve the appearance representations/models. These features are fused into a robust classifier, and the temporal behavior and stability of the discomfort detection are improved by applying an averaging filter to the classification outputs. In addition, the system can monitor the infant’s expressions when it is left unattended and provides an objective judgment of discomfort.
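The frame-by-frame pipeline described above can be sketched in a few lines. The sketch below is illustrative only, not the project's actual implementation: it assumes a simple intensity histogram as a stand-in for the histogram-based texture descriptors, a scikit-learn SVM as the classifier, and a moving-average filter over the per-frame outputs for temporal smoothing; the synthetic "bright vs. dark patch" data is purely a toy example.

```python
import numpy as np
from sklearn.svm import SVC

def histogram_features(face_patch, bins=16):
    """Intensity histogram of a grayscale face patch (toy stand-in for
    the histogram-based texture descriptors mentioned in the text)."""
    hist, _ = np.histogram(face_patch, bins=bins, range=(0, 256), density=True)
    return hist

def smooth_scores(scores, window=5):
    """Moving-average filter over per-frame classifier outputs, to
    stabilise the temporal behavior of the detection."""
    kernel = np.ones(window) / window
    return np.convolve(scores, kernel, mode="same")

# Toy training data: bright patches stand in for "discomfort" frames.
rng = np.random.default_rng(0)
comfort = [rng.integers(0, 100, (32, 32)) for _ in range(30)]
discomfort = [rng.integers(150, 256, (32, 32)) for _ in range(30)]
X = np.array([histogram_features(p) for p in comfort + discomfort])
y = np.array([0] * 30 + [1] * 30)  # 0 = comfort, 1 = discomfort

clf = SVC().fit(X, y)

# Frame-by-frame classification of a short "video", then smoothing.
frames = comfort[:5] + discomfort[:5]
raw = clf.predict(np.array([histogram_features(f) for f in frames]))
smoothed = smooth_scores(raw.astype(float))
```

The averaging filter turns the hard per-frame decisions into a smoother discomfort trace, so a single misclassified frame does not flip the overall assessment.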
This project is a cooperation between MMC Hospital Veldhoven and Eindhoven University of Technology (SPS-VCA group, faculty EE). Involved researchers: Ir. Ronald Saeijs and MSc. Cheng Li.