The effects of regularization on learning facial expressions with convolutional neural networks

Contributors:
  • Villa, Alessandro E.P.
  • Masulli, Paolo
  • Pons Rivero, Antonio J.
Publisher/Corporate body:
Springer International Publishing
Year of publication:
2016
Media type:
Text
Keywords:
  • Batch normalization
  • Convolutional neural network
  • Dropout
  • Facial expression recognition
  • Max pooling dropout
  • Regularization
Description:
  • Convolutional neural networks (CNNs) have become effective instruments in facial expression recognition. Very good results can be achieved with deep CNNs possessing many layers and providing a good internal representation of the learned data. On the other hand, due to their potentially high complexity, CNNs are prone to overfitting; as a result, regularization techniques are needed to improve performance and minimize overfitting. However, it is not yet clear how these regularization techniques affect the learned representation of faces. In this paper we examine the effects of novel regularization techniques on the training and performance of CNNs and their learned features. We train a CNN using dropout, max pooling dropout, batch normalization, and different combinations of these three. We show that a combination of these methods can have a large impact on the performance of a CNN, almost halving its validation error. A visualization technique is applied to the CNNs to highlight their activations for different inputs, illustrating a significant difference between a standard CNN and a regularized CNN.
License:
  • info:eu-repo/semantics/closedAccess
Source system:
Research Information System of UHH (Universität Hamburg)

Internal metadata
Source record
oai:www.edit.fis.uni-hamburg.de:publications/aee5c0b4-405a-4ab4-88ee-a834391b398c
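
Illustrative sketch (not part of the original record): the description above combines batch normalization, dropout, and max pooling dropout in one CNN. The following Python/PyTorch code is a minimal sketch of how such a regularized network could be assembled; it is not the authors' architecture, and the layer sizes, dropout rates, the 48x48 grayscale input, and the seven expression classes are assumptions made for illustration only. Max pooling dropout is approximated here as element-wise dropout applied immediately before the max-pooling operation.

# Minimal sketch of a CNN combining the three regularizers named in the abstract:
# batch normalization, standard dropout, and max pooling dropout.
# All hyperparameters and shapes are illustrative assumptions.
import torch
import torch.nn as nn


class RegularizedExpressionCNN(nn.Module):
    def __init__(self, num_classes: int = 7, p_pool: float = 0.25, p_fc: float = 0.5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1),
            nn.BatchNorm2d(32),            # batch normalization after convolution
            nn.ReLU(inplace=True),
            nn.Dropout(p_pool),            # "max pooling dropout": drop units before pooling
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.BatchNorm2d(64),
            nn.ReLU(inplace=True),
            nn.Dropout(p_pool),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 12 * 12, 128),  # assumes 48x48 grayscale input images
            nn.ReLU(inplace=True),
            nn.Dropout(p_fc),              # standard dropout on the fully connected layer
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


if __name__ == "__main__":
    model = RegularizedExpressionCNN()
    logits = model(torch.randn(4, 1, 48, 48))  # batch of 4 grayscale 48x48 faces
    print(logits.shape)                        # torch.Size([4, 7])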