Detecting Deepfakes with Deep Learning and Gabor Filters

Jameel, Wildan J. and Kadhem, Suhad M. and Abbas, Ayad R. (2022) Detecting Deepfakes with Deep Learning and Gabor Filters. ARO-THE SCIENTIFIC JOURNAL OF KOYA UNIVERSITY, 10 (1). pp. 18-22. ISSN 2410-9355

Text (PDF File): ARO.10917-Vol10.No1.2022.ISSUE18-PP 18-22.pdf
Available under License Creative Commons Attribution Non-commercial Share Alike.

Download (1MB)
The proliferation of editing programs based on artificial-intelligence techniques has contributed to the emergence of deepfake technology. Deepfakes fabricate and falsify facts by making a person appear to perform actions or say words that he never performed or said, so developing an algorithm for deepfake detection that discriminates real from fake media is very important. Convolutional neural networks (CNNs) are among the most powerful classifiers, but choosing the nature of the data fed to these networks is extremely important. For this reason, we capture fine texture details of the input frames using 16 Gabor filters in different orientations and feed these responses to a binary CNN classifier instead of the raw red-green-blue color information. The purpose of this paper is to give the reader a deeper view of (1) enhancing the efficiency of distinguishing fake facial images from real ones by developing a novel model based on deep learning and Gabor filters and (2) how deep learning (a CNN), when combined with a forensic tool (Gabor filters), contributes to the detection of deepfakes. Our experiments show that training accuracy reaches about 98.06% and validation accuracy about 97.50%. Compared with state-of-the-art methods, the proposed model is more efficient.
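The preprocessing step described above — filtering each frame with a bank of 16 Gabor filters at evenly spaced orientations before classification — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the kernel size and frequency parameters below are assumed defaults chosen for the example, and a real pipeline would stack the 16 response maps as input channels to the CNN (e.g., a VGG16-style binary classifier).

```python
import numpy as np

def gabor_kernel(ksize=9, sigma=2.0, theta=0.0, lambd=4.0, gamma=0.5, psi=0.0):
    """Real-valued Gabor kernel: a Gaussian envelope times a cosine carrier
    oriented at angle theta. Parameter values here are illustrative."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates by theta so the carrier wave runs along x'.
    x_t = x * np.cos(theta) + y * np.sin(theta)
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(x_t**2 + (gamma * y_t) ** 2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * x_t / lambd + psi)
    return envelope * carrier

def gabor_bank(n_orientations=16, **kwargs):
    """Bank of kernels at evenly spaced orientations in [0, pi),
    matching the 16-direction setup described in the abstract."""
    return [gabor_kernel(theta=i * np.pi / n_orientations, **kwargs)
            for i in range(n_orientations)]

def filter_frame(gray, kernels):
    """Apply every kernel to a grayscale frame (valid cross-correlation,
    the usual convention for symmetric Gabor banks) and stack the
    response maps along the last axis."""
    win = np.lib.stride_tricks.sliding_window_view(gray, kernels[0].shape)
    return np.stack([np.einsum('ijkl,kl->ij', win, k) for k in kernels],
                    axis=-1)

# Example: a 32x32 frame filtered with a 16-orientation bank yields a
# (24, 24, 16) texture-response tensor for a 9x9 kernel.
bank = gabor_bank(16)
frame = np.random.default_rng(0).random((32, 32))
responses = filter_frame(frame, bank)
```

The resulting multi-channel response tensor emphasizes fine, orientation-specific texture rather than color, which is the property the paper exploits to expose blending artifacts in fake faces.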

Item Type: Article
Uncontrolled Keywords: Deepfake detection, deep learning, Gabor filter, VGG16
Subjects: T Technology > TR Photography
Divisions: ARO-The Scientific Journal of Koya University > VOL 10, NO1 (2022)
Depositing User: Dr Salah Ismaeel Yahya
Date Deposited: 05 Aug 2022 12:04
Last Modified: 05 Aug 2022 12:04
