Using techniques astronomers employ to measure the shapes of galaxies, a team of researchers at the University of Hull, led by Master's student Adejumoke Owolabi, has found that deepfaked humans don't show the same consistency in reflections across both eyes, which lets the team tell whether a portrait of a person was AI-generated.
“We detect the reflections in an automated way and run their morphological features through the CAS [concentration, asymmetry, smoothness] and Gini indices to compare similarity between left and right eyeballs. The findings show that deepfakes have some differences between the pair.
It’s important to note that this is not a silver bullet for detecting fake images. There are false positives and false negatives; it’s not going to get everything. But this method provides us with a basis, a plan of attack, in the arms race to detect deepfakes.”
Kevin Pimbblet, professor of astrophysics, University of Hull
The Gini coefficient measures how light is distributed across the pixels of an image, whether of a galaxy or an eyeball: it orders the pixels by brightness and compares the result to a perfectly even distribution.
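As a rough illustration of the idea (not the team's actual pipeline), the Gini index as commonly defined in galaxy-morphology work can be computed in a few lines: sort the pixel brightnesses, then take a weighted sum that is 0 for a perfectly even patch and approaches 1 when all the light sits in a single pixel. The function and example patches below are illustrative assumptions, not code from the study.

```python
import numpy as np

def gini(pixels):
    """Gini coefficient of pixel brightness values.

    0 = light spread perfectly evenly; values near 1 = light
    concentrated in very few pixels (one common form used in
    galaxy-morphology studies).
    """
    x = np.sort(np.abs(np.asarray(pixels, dtype=float).ravel()))
    n = x.size
    i = np.arange(1, n + 1)
    return np.sum((2 * i - n - 1) * x) / (np.abs(x.mean()) * n * (n - 1))

# A uniform patch: light is spread evenly, so Gini is 0.
print(gini(np.ones((8, 8))))       # 0.0

# A single bright pixel in a dark patch: maximal concentration, Gini is 1.
patch = np.zeros((8, 8))
patch[4, 4] = 255.0
print(gini(patch))                 # 1.0
```

Comparing this statistic (alongside the CAS measures the quote mentions) for the reflection in each eye is what gives a similarity score between the left and right eyeballs.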
Owolabi and her colleagues presented their work at the Royal Astronomical Society's recent National Astronomy Meeting in Hull.