When Errors Become the Rule: Twenty Years with Transformation-Based Learning

Research output: Contribution to journal › Article


Transformation-based learning (TBL) is a machine learning method, particularly suited to sequential classification, invented by Eric Brill [Brill 1993b, 1995a]. It is widely used within computational linguistics and natural language processing, but surprisingly little in other areas.

TBL is a simple yet flexible paradigm, which achieves competitive or even state-of-the-art performance in several areas and does not overtrain easily. It is especially successful at catching local, fixed-distance dependencies and seamlessly exploits information from heterogeneous discrete feature types. The learned representation—an ordered list of transformation rules—is compact and efficient, with clear semantics. Individual rules are interpretable and often meaningful to humans.
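The core of the learned representation described above, an ordered list of transformation rules applied to an initial tagging, can be sketched in a few lines. The following is a minimal illustration with invented rules and tags (not rules actually learned by Brill's system); the condition functions check only the local, fixed-distance context the paragraph mentions.

```python
def apply_rules(words, tags, rules):
    """Apply an ordered list of transformation rules to an initial tagging.

    Each rule is (from_tag, to_tag, condition), where condition(words, tags, i)
    inspects the local context of position i. Rules fire in learned order, so
    later rules see the corrections made by earlier ones.
    """
    tags = list(tags)
    for from_tag, to_tag, condition in rules:
        for i in range(len(words)):
            if tags[i] == from_tag and condition(words, tags, i):
                tags[i] = to_tag
    return tags

# Toy example: start from a baseline tagging (e.g. each word's most frequent
# tag), then let a rule correct an error in context.
words = ["the", "can", "rusts"]
initial = ["DET", "VERB", "VERB"]   # baseline mis-tags "can" as VERB

rules = [
    # Hypothetical rule: "Change VERB to NOUN if the previous tag is DET."
    ("VERB", "NOUN", lambda w, t, i: i > 0 and t[i - 1] == "DET"),
]

print(apply_rules(words, initial, rules))  # ['DET', 'NOUN', 'VERB']
```

Each rule is directly readable as a correction of a specific kind of error, which is what makes the rule list interpretable to humans.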

The present article offers a survey of the most important theoretical work on TBL, addressing a perceived gap in the literature. Because the method should also be useful outside computational linguistics and natural language processing, a chief aim is to provide an informal but relatively comprehensive introduction, accessible to readers from other specialities.


Research areas and keywords

Subject classification (UKÄ)

  • General Language Studies and Linguistics


Keywords

  • Artificial intelligence, Knowledge Representation Formalisms and Methods, Computational Linguistics, Natural Language Processing, Rule learning
Original language: English
Pages (from-to): 50-50:51
Journal: ACM Computing Surveys
Issue number: 4
Publication status: Published - 2014
Publication category: Research

Bibliographic note

The information about affiliations in this record was updated in December 2015. The record was previously connected to the following departments: Linguistics and Phonetics (015010003)