

Identity-Privacy Protection for Facial-rPPG Based Smart Health Research
  • Mingliang Chen,
  • Ashira Jayaweera,
  • Chau-Wai Wong,
  • Min Wu
Mingliang Chen
University of Maryland

Corresponding Author: [email protected]


Abstract

Camera-based remote photoplethysmography (rPPG) has shown promise for contact-free cardiac monitoring and other smart health applications. rPPG typically requires facial videos as the source input, which raises identity-privacy concerns: facial videos are sensitive and contain subjects' identifiable appearance features. Coupled with the health information that rPPG techniques can reveal, this compounded sensitivity has been a major obstacle to sharing facial rPPG video datasets in the research community and thereby advancing the field. This paper investigates a suite of anonymization transforms that remove the identifiable appearance features from facial videos while retaining the physiological signals needed for rPPG analysis. After the transformation, the facial videos are de-identified and can be shared publicly with little risk of identity-privacy leakage. The proposed algorithm offers tunable options to balance physiological fidelity against identity-protecting strength, meeting different levels of privacy requirements. A human subject study was carried out to assess, both qualitatively and quantitatively, the perceived strength and efficacy of these anonymization techniques in de-identifying the facial videos while maintaining the physiological signals.
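To make the core idea concrete, the following is a minimal toy sketch, not the paper's actual algorithm: block-wise pixelation is one simple anonymization transform that destroys fine facial detail yet exactly preserves spatially averaged color per block, and the frame-level spatial average is the quantity that basic rPPG pipelines track over time. All function names, parameters, and the synthetic "pulse" signal below are illustrative assumptions.

```python
import numpy as np

def pixelate(frame, block=16):
    """Replace each block x block patch with its mean color.
    Identity-revealing spatial detail is removed, but block-average
    color (the carrier of the rPPG signal) is preserved exactly."""
    h, w, c = frame.shape
    H, W = h // block * block, w // block * block
    f = frame[:H, :W].reshape(H // block, block, W // block, block, c)
    means = f.mean(axis=(1, 3))                       # per-block average color
    return np.repeat(np.repeat(means, block, axis=0), block, axis=1)

# Synthetic "video": a static random face pattern plus a tiny global
# green-channel pulse (~1.2 Hz, i.e. ~72 bpm at a 30 fps frame rate).
rng = np.random.default_rng(0)
face = rng.uniform(0.3, 0.7, size=(64, 64, 3))
T = 100
pulse = 0.01 * np.sin(2 * np.pi * 1.2 * np.arange(T) / 30.0)
video = face[None] + pulse[:, None, None, None] * np.array([0.0, 1.0, 0.0])

# Anonymize every frame, then extract the spatially averaged green trace.
anon = np.stack([pixelate(f) for f in video])
trace = anon[..., 1].mean(axis=(1, 2))
orig_trace = video[..., 1].mean(axis=(1, 2))

# The temporal physiological trace survives the anonymization.
print(np.allclose(trace, orig_trace, atol=1e-8))
```

Because pixelation averages within blocks, the whole-frame spatial mean of each frame is unchanged, so the temporal rPPG trace is intact even though the face is unrecognizable. The tunable trade-off the abstract describes corresponds here, loosely, to the `block` size: larger blocks mean stronger de-identification.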