Lip motion analysis can significantly enhance a security system based on face recognition. Examining speaker-dependent lip movements and visually determining the spoken content can prevent attacks that use photographs or pre-recorded video sequences. Our system operates under active near-infrared illumination. We investigate a new type of feature, local binary patterns, to model lip motions with hidden Markov models. We evaluate the classification accuracy on the TUNIR database, which we have made publicly available for the future comparison of competing approaches.
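To make the feature extraction concrete, the following is a minimal sketch of the basic 8-neighbour local binary pattern (LBP) operator on a single 3x3 pixel patch. It illustrates the general LBP idea only; it is not the authors' implementation, and the neighbour ordering and thresholding convention are illustrative assumptions.

```python
def lbp_code(patch):
    """Compute the basic 8-neighbour LBP code for a 3x3 grayscale patch
    (a list of 3 rows of 3 intensity values).

    Each neighbour is thresholded against the centre pixel and the
    resulting bits are packed, clockwise from the top-left, into one
    byte (0-255). Note: the bit ordering here is an illustrative
    choice, not necessarily the convention used in the paper.
    """
    c = patch[1][1]
    # Neighbours in clockwise order, starting at the top-left pixel.
    neighbours = [patch[0][0], patch[0][1], patch[0][2],
                  patch[1][2], patch[2][2], patch[2][1],
                  patch[2][0], patch[1][0]]
    code = 0
    for bit, n in enumerate(neighbours):
        if n >= c:           # neighbour at least as bright as the centre
            code |= 1 << bit  # set the corresponding bit
    return code

# Example: a patch that is brighter on its right edge than at the centre.
patch = [[10, 20, 90],
         [10, 50, 90],
         [10, 20, 90]]
print(lbp_code(patch))  # -> 28 (bits 2, 3, 4 set: the three right-side pixels)
```

In practice, such codes are computed at every pixel of a mouth region, and their histogram serves as the observation vector fed to the hidden Markov model.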