PyAFAR: Python-based Automated Facial Action Recognition library for use in Infants and Adults

(FG 2024 & ACII 2023)

1University of Pittsburgh   2Utrecht University   3University of Miami   4CGI Technologies and Solutions Inc  

Abstract


PyAFAR is a Python-based, open-source facial action unit detection library for use with adults and infants. For adults, convolutional neural networks were trained on BP4D+; for infants, Histogram of Oriented Gradients (HOG) features from the MIAMI and CLOCK databases were used with Light Gradient Boosting Machines. In adults, action unit (AU) occurrence and intensity detection are enabled for 12 AUs, selected on the criterion that they occurred more than 5% of the time in the training data, which included BP4D. Because a 5% base rate was the minimum for which reliability could be measured with confidence, AUs with lower prevalence were not included. AU intensity estimation is enabled for 5 of these AUs. In infants, AU occurrence detection is enabled for 9 AUs that are involved in the expression of positive and negative affect. For both adults and infants, facial landmark and head pose tracking are enabled as well. For adults, multiple persons within a video may be tracked. The library is designed for ease of use: the models are available for fine-tuning and further training, and PyAFAR may be easily incorporated into user Python code.

Pipeline of PyAFAR.

PyAFAR has separate modules for facial feature extraction and tracking, face normalization, and AU prediction:

It uses the MediaPipe library for facial landmarks. Tracking across frames is performed using FaceNet. The Perspective-n-Point (PnP) method is used to estimate head pose (roll, pitch, and yaw).
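As a sketch of the head-pose step: PnP solvers (such as OpenCV's `cv2.solvePnP`, followed by `cv2.Rodrigues`) yield a 3x3 rotation matrix, from which roll, pitch, and yaw can be recovered. The conversion below is a minimal NumPy illustration of that last step under a common Tait-Bryan convention; it is not PyAFAR's exact implementation, and the axis conventions are assumptions.

```python
import numpy as np

def rotation_to_euler(R):
    """Convert a 3x3 rotation matrix to (pitch, yaw, roll) in degrees.

    Assumes a common XYZ (Tait-Bryan) convention; PnP solvers such as
    cv2.solvePnP return a rotation vector that cv2.Rodrigues converts
    into a matrix like this one.
    """
    sy = np.hypot(R[0, 0], R[1, 0])
    if sy > 1e-6:  # non-degenerate case
        pitch = np.degrees(np.arctan2(R[2, 1], R[2, 2]))
        yaw = np.degrees(np.arctan2(-R[2, 0], sy))
        roll = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    else:  # gimbal lock: roll is not uniquely defined
        pitch = np.degrees(np.arctan2(-R[1, 2], R[1, 1]))
        yaw = np.degrees(np.arctan2(-R[2, 0], sy))
        roll = 0.0
    return pitch, yaw, roll

# Sanity check: a pure 30-degree rotation about the y-axis (yaw).
theta = np.radians(30.0)
R_yaw = np.array([
    [np.cos(theta), 0.0, np.sin(theta)],
    [0.0,           1.0, 0.0],
    [-np.sin(theta), 0.0, np.cos(theta)],
])
pitch, yaw, roll = rotation_to_euler(R_yaw)
```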

The landmark predictions are used to normalize faces using the dlib library.
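Landmark-based normalization of this kind is typically a similarity transform (rotation, uniform scale, and translation) that maps detected landmarks, such as the eye centers, to canonical positions in a fixed-size crop. The sketch below shows that idea in plain NumPy; the canonical eye positions and crop size are illustrative assumptions, not PyAFAR's or dlib's exact settings.

```python
import numpy as np

def eye_alignment_transform(left_eye, right_eye,
                            out_size=224, eye_y=0.35, eye_dist_frac=0.5):
    """Compute a 2x3 similarity transform mapping the detected eye
    centers to canonical positions in an out_size x out_size crop.

    The canonical positions (eyes at 35% height, half the image apart)
    are illustrative defaults only.
    """
    left_eye = np.asarray(left_eye, dtype=float)
    right_eye = np.asarray(right_eye, dtype=float)
    # Desired eye centers in the output image.
    dst_left = np.array([out_size * (0.5 - eye_dist_frac / 2), out_size * eye_y])
    dst_right = np.array([out_size * (0.5 + eye_dist_frac / 2), out_size * eye_y])

    d_src = right_eye - left_eye
    d_dst = dst_right - dst_left
    scale = np.linalg.norm(d_dst) / np.linalg.norm(d_src)
    angle = np.arctan2(d_dst[1], d_dst[0]) - np.arctan2(d_src[1], d_src[0])
    c, s = scale * np.cos(angle), scale * np.sin(angle)
    M = np.array([[c, -s, 0.0], [s, c, 0.0]])
    # Translate so the source left eye lands on the destination left eye.
    M[:, 2] = dst_left - M[:, :2] @ left_eye
    return M

# Example: eyes detected at (80, 120) and (160, 120) in the input frame.
M = eye_alignment_transform((80, 120), (160, 120))
mapped_left = M[:, :2] @ np.array([80.0, 120.0]) + M[:, 2]
```

In practice the resulting 2x3 matrix would be handed to an image-warping routine (e.g., `cv2.warpAffine`) to produce the normalized face crop.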

Normalized faces are used for AU predictions (occurrence and intensity). Separate detection modules for occurrence are available for adults and infants. Intensity predictions are available for adults only.

PyAFAR can output frame-level predictions in CSV and JSON formats, enabling easy reading on the platforms used by both computational and domain experts.
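For a sense of what frame-level output in these two formats looks like, the snippet below writes hypothetical per-frame AU occurrence probabilities with Python's standard `csv` and `json` modules. The column names and values are illustrative, not PyAFAR's exact output schema.

```python
import csv
import json

# Hypothetical frame-level predictions: one record per video frame,
# with AU occurrence probabilities (illustrative schema only).
frames = [
    {"frame": 0, "AU1": 0.02, "AU6": 0.91, "AU12": 0.88},
    {"frame": 1, "AU1": 0.03, "AU6": 0.93, "AU12": 0.97},
]

# CSV: one row per frame, AUs as columns.
with open("predictions.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(frames[0].keys()))
    writer.writeheader()
    writer.writerows(frames)

# JSON: the same records as a list of objects.
with open("predictions.json", "w") as f:
    json.dump(frames, f, indent=2)

# Either file round-trips back into the same records.
with open("predictions.json") as f:
    loaded = json.load(f)
```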


Demo

For installation instructions, click here; for usage instructions, click here.

Interested in learning more about the Facial Action Coding System (FACS) used for action unit annotation? Visit this interactive application.


How to use



Citations

  @inproceedings{hindujapyafar,
  title={PyAFAR: Python-based Automated Facial Action Recognition library for use in Infants and Adults},
  author={Hinduja, Saurabh and Ertugrul, Itir Onal and Bilalpur, Maneesh and Messinger, Daniel S and Cohn, Jeffrey F},
  booktitle={International Conference on Affective Computing and Intelligent Interaction},
  year={2023},
  organization={IEEE}
  }

  @inproceedings{ertugrulpyafar,
  title={Expanding PyAFAR: A Novel Privacy-Preserving Infant AU Detector},
  author={Ertugrul, Itir Onal and Hinduja, Saurabh and Bilalpur, Maneesh and Messinger, Daniel S and Cohn, Jeffrey F},
  booktitle={International Conference on Automatic Face and Gesture Recognition},
  year={2024},
  organization={IEEE}
  }


Acknowledgements

The development of PyAFAR was supported in part by NIH awards R01MH096951, R01GM105004, and UH3NS100549.


License

PyAFAR is freely available for non-commercial use and may be redistributed under these conditions. Please see the complete licensing information. Interested in a commercial license? Please contact Jeffrey Cohn.