TY - CONF
AU - Volkmar Frinken
AU - Francisco Zamora
AU - Salvador España
AU - Maria Jose Castro
AU - Andreas Fischer
AU - Horst Bunke
A2 - ICPR
PY - 2012//
TI - Long-Short Term Memory Neural Networks Language Modeling for Handwriting Recognition
BT - 21st International Conference on Pattern Recognition
SP - 701
EP - 704
N2 - Unconstrained handwritten text recognition systems maximize the combination of two separate probability scores. The first is the observation probability, which indicates how well the returned word sequence matches the input image. The second is the probability reflecting how likely a word sequence is according to a language model. Current state-of-the-art recognition systems use statistical language models in the form of bigram word probabilities. This paper proposes to model the target language by means of a recurrent neural network with long-short term memory cells. Because the network is recurrent, the considered context is not limited to a fixed size, especially as the memory cells are designed to deal with long-term dependencies. In a set of experiments conducted on the IAM off-line database, we show the superiority of the proposed language model over statistical n-gram models.
SN - 1051-4651
SN - 978-1-4673-2216-4
L1 - http://refbase.cvc.uab.es/files/FZE2012.pdf
N1 - DAG
ID - Volkmar Frinken2012
ER -