WASHINGTON – A new artificial intelligence tool named FaceAge is transforming how doctors evaluate patient health by turning selfies into biological age estimates that could influence life-saving decisions, especially for cancer patients.

Described in The Lancet Digital Health, FaceAge is a deep learning algorithm trained on nearly 59,000 photographs of presumed-healthy adults aged 60 and older. By comparing facial features against markers of biological aging, it predicted cancer survival better than chronological age alone. On average, cancer patients appeared biologically nearly five years older than their actual age.

Co-developer Dr. Raymond Mak, an oncologist at Mass General Brigham in Boston, says the tool could serve as a biomarker to help tailor the intensity of cancer treatment to a patient’s actual physiological condition rather than simply their age.

In one example, a fit 75-year-old whose FaceAge reads 65 might tolerate aggressive therapy better than a frail 60-year-old with a biological age of 70. This could impact decisions across medical fields, including heart surgery and end-of-life care.

FaceAge also outperformed physicians in predicting six-month survival odds for terminal cancer patients. While doctors on their own performed only slightly better than chance, their accuracy improved significantly when they were given FaceAge data.

The model, which weighs subtle cues such as facial muscle tone more heavily than obvious traits like gray hair, even confirmed a bit of pop-culture lore by accurately assessing actor Paul Rudd’s youthful look.

Although early results show little evidence of racial bias, the developers are now training a broader version of the tool and studying how external factors such as lighting or makeup could affect results. Ethical concerns remain, particularly the risk that such tools could be misused by insurers or employers.

Researchers aim to launch a public version of FaceAge for research participation, while clinical use will require further validation.
