Gender classification using facial images and basis pursuit

Rahman Khorsandi, Mohamed Abdel-Mottaleb

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution

2 Scopus citations


In many social interactions, it is important to correctly recognize gender. Researchers have addressed this task using facial images, ear images, and gait. In this paper, we present an approach for gender classification from facial images based on sparse representation and Basis Pursuit. In sparse representation, the training data are used to build a dictionary from extracted features. Classification is achieved by representing the extracted features of a test sample over the dictionary; for this purpose, Basis Pursuit finds the best representation by minimizing the ℓ1 norm. In this work, Gabor filters are used for feature extraction. Experiments are conducted on the FERET data set, and the results are compared with other work in this area. The results show improvement in gender classification over existing methods.
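The classification scheme the abstract describes can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it uses synthetic feature vectors in place of Gabor features from FERET, and solves the Basis Pursuit problem (minimize the ℓ1 norm of the coefficients subject to exactly reconstructing the test vector) as a linear program via SciPy's `linprog`, which is one standard way to pose it. The dictionary columns, class labels, and templates are all hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, b):
    """Solve min ||x||_1 subject to A @ x = b as a linear program.

    Splitting x = u - v with u, v >= 0 makes the l1 objective linear:
    minimize sum(u) + sum(v) subject to A @ (u - v) = b.
    """
    m, n = A.shape
    c = np.ones(2 * n)
    A_eq = np.hstack([A, -A])
    res = linprog(c, A_eq=A_eq, b_eq=b,
                  bounds=[(0, None)] * (2 * n), method="highs")
    assert res.success, res.message
    return res.x[:n] - res.x[n:]

def classify(D, labels, y):
    """Sparse-representation classification: express y over the training
    dictionary D, then pick the class whose coefficients alone leave the
    smallest reconstruction residual."""
    x = basis_pursuit(D, y)
    residuals = {}
    for c in np.unique(labels):
        x_c = np.where(labels == c, x, 0.0)   # keep only class-c coefficients
        residuals[c] = np.linalg.norm(y - D @ x_c)
    return min(residuals, key=residuals.get)

# Synthetic stand-in for Gabor feature vectors: each class clusters
# around its own template (hypothetical data, not FERET).
rng = np.random.default_rng(0)
n_feat, per_class = 8, 10
t_male, t_female = rng.normal(size=n_feat), rng.normal(size=n_feat)
male = t_male[:, None] + 0.1 * rng.normal(size=(n_feat, per_class))
female = t_female[:, None] + 0.1 * rng.normal(size=(n_feat, per_class))
D = np.hstack([male, female])
D /= np.linalg.norm(D, axis=0)               # unit-norm dictionary atoms
labels = np.array(["male"] * per_class + ["female"] * per_class)

probe = t_male + 0.1 * rng.normal(size=n_feat)
print(classify(D, labels, probe))
```

Because the probe is drawn around the "male" template, its sparse representation should concentrate on that class's atoms, and the residual test should typically label it accordingly. Note that the dictionary must be underdetermined (more atoms than feature dimensions) for the equality-constrained Basis Pursuit formulation to be feasible.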

Original language: English (US)
Title of host publication: Computer Analysis of Images and Patterns - 15th International Conference, CAIP 2013, Proceedings
Number of pages: 8
Edition: PART 1
State: Published - Sep 26 2013
Event: 15th International Conference on Computer Analysis of Images and Patterns, CAIP 2013 - York, United Kingdom
Duration: Aug 27 2013 - Aug 29 2013

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 1
Volume: 8047 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Other: 15th International Conference on Computer Analysis of Images and Patterns, CAIP 2013
Country/Territory: United Kingdom


Keywords

  • Basis Pursuit
  • Facial Images
  • Gabor Wavelets
  • Gender Classification
  • Sparse Representation

ASJC Scopus subject areas

  • Computer Science (all)
  • Theoretical Computer Science


