JES. Journal of Engineering Sciences

Face Recognition from Small Datasets using Kernel Selection of Gabor Features

Article 3, Volume 48, No 6, November and December 2020, Pages 1051-1071
Document Type: Research Paper
DOI: 10.21608/jesaun.2020.42513.1013
Authors
Alyaa Aly Gadelrab; Yasser Farouk Mohamed; Moumen Taha El-Melegy
Electrical Engineering, Assiut, Egypt
Abstract
Recent advances in face recognition are mostly based on deep methods that require large datasets for training. This paper presents a novel method that combines Gabor features, feature selection, and kernel selection to achieve comparable performance on smaller datasets.

The paper compares different feature selection methods in this context.
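
The abstract does not spell out the Gabor feature extraction step, so here is a minimal sketch of the usual filter-bank approach it builds on, assuming an illustrative bank of 5 scales and 8 orientations with 31x31 kernels and simple downsampling of the response maps; these parameter values and helper names are placeholders, not values taken from the paper.

# A minimal sketch (not the authors' exact pipeline) of extracting features
# from a face image with a bank of Gabor kernels. The kernel parameters
# (5 scales x 8 orientations, 31x31 windows) and the downsampling step are
# illustrative assumptions.
import numpy as np
from scipy.signal import convolve2d


def gabor_kernel(size, sigma, theta, lam, gamma=0.5, psi=0.0):
    """Real part of a 2-D Gabor kernel of shape (size, size)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_t = x * np.cos(theta) + y * np.sin(theta)
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(x_t**2 + (gamma * y_t)**2) / (2.0 * sigma**2))
    carrier = np.cos(2.0 * np.pi * x_t / lam + psi)
    return envelope * carrier


def build_gabor_bank(n_scales=5, n_orientations=8, size=31):
    """Return a list of Gabor kernels covering several scales and orientations."""
    bank = []
    for s in range(n_scales):
        sigma = 2.0 * (1.4 ** s)   # envelope width grows with scale
        lam = 4.0 * (1.4 ** s)     # wavelength tied to the scale
        for o in range(n_orientations):
            theta = o * np.pi / n_orientations
            bank.append(gabor_kernel(size, sigma, theta, lam))
    return bank


def gabor_features(image, bank, step=8):
    """Convolve with every kernel and keep a downsampled magnitude map per
    kernel, so features stay grouped by the kernel that produced them."""
    per_kernel = []
    for k in bank:
        response = convolve2d(image, k, mode="same", boundary="symm")
        per_kernel.append(np.abs(response)[::step, ::step].ravel())
    return per_kernel   # list of per-kernel feature vectors


if __name__ == "__main__":
    face = np.random.rand(64, 64)        # stand-in for a grayscale face crop
    bank = build_gabor_bank()
    groups = gabor_features(face, bank)
    print(len(groups), groups[0].shape)  # 40 kernels, one feature vector each
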
The problem tackled in this paper is achieving accurate face recognition with limited computational resources. By "limited" computational resources we mean low computational power (i.e., memory and CPU operations) during both system training and evaluation. Note that we are not competing with deep learning systems in terms of accuracy; rather, we provide a middle ground between fast hand-crafted feature extraction and learning-based deep methods in terms of both speed and accuracy.
To achieve this goal, we propose "kernel selection" as the main method to reduce the dimensionality of the classification problem faced by the final classifier in the face recognition (FR) system. Kernel selection is the process of eliminating the Gabor kernels that are less important for classification while preserving the achievable level of accuracy. It differs from traditional feature selection in that it measures the value of complete kernels, each consisting of several features, as a group. Because of this structured nature, kernel selection has the advantage that the discarded Gabor kernels never need to be evaluated at all, reducing the computational cost of the system compared with traditional feature selection methods.
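
Since the abstract describes kernel selection only in prose, the sketch below illustrates the idea under stated assumptions: features arrive grouped by the Gabor kernel that produced them (as in the sketch above), each kernel is scored as a whole group, and only the top-ranked kernels are kept, so the discarded kernels never have to be evaluated on a test image. The group score used here (summed per-feature mutual information) and the 1-NN classifier are illustrative choices, not necessarily the paper's.

# A hedged sketch of kernel selection: rank whole Gabor kernels (groups of
# features) by an importance score and keep only the top-ranked kernels.
# The group score (summed mutual information per kernel) is an assumption
# for illustration; the paper's exact criterion may differ.
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.neighbors import KNeighborsClassifier


def select_kernels(X_groups, y, n_keep):
    """X_groups: list of arrays, one (n_samples, n_feats_k) block per Gabor
    kernel. Returns indices of the n_keep highest-scoring kernels."""
    scores = []
    for block in X_groups:
        mi = mutual_info_classif(block, y, random_state=0)
        scores.append(mi.sum())          # score the kernel as a whole group
    order = np.argsort(scores)[::-1]
    return sorted(order[:n_keep])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_samples, n_kernels, feats_per_kernel = 120, 40, 64
    y = rng.integers(0, 10, size=n_samples)            # 10 subject labels
    X_groups = [rng.random((n_samples, feats_per_kernel))
                for _ in range(n_kernels)]             # stand-in Gabor features

    keep = select_kernels(X_groups, y, n_keep=10)
    X_reduced = np.hstack([X_groups[i] for i in keep])

    # Only the kept kernels would need to be convolved with a test image,
    # which is where the saving over per-feature selection comes from.
    clf = KNeighborsClassifier(n_neighbors=1).fit(X_reduced, y)
    print("kept kernels:", keep, "reduced dim:", X_reduced.shape[1])
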
Keywords
Kernel selection; feature selection; face recognition
Main Subjects
Electrical Engineering, Computer Engineering, and Electrical Power and Machines Engineering.