[[analysis-tokenizers]]
== Tokenizers

Tokenizers are used to break a string down into a stream of terms
or tokens. A simple tokenizer might split the string up into terms
wherever it encounters whitespace or punctuation.

Elasticsearch has a number of built-in tokenizers which can be
used to build <<analysis-custom-analyzer,custom analyzers>>.
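For example, the `_analyze` API can be used to try a tokenizer
directly and inspect the tokens it produces. The following is a
minimal sketch that runs the `standard` tokenizer over a sample
sentence; the JSON-body request form shown here may differ in
older versions of Elasticsearch:

[source,js]
--------------------------------------------------
POST _analyze
{
  "tokenizer": "standard",
  "text": "The QUICK brown fox."
}
--------------------------------------------------

The response lists the individual tokens (`The`, `QUICK`, `brown`,
`fox`, with the trailing punctuation stripped) along with their
positions and character offsets.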
include::tokenizers/standard-tokenizer.asciidoc[]

include::tokenizers/edgengram-tokenizer.asciidoc[]

include::tokenizers/keyword-tokenizer.asciidoc[]

include::tokenizers/letter-tokenizer.asciidoc[]

include::tokenizers/lowercase-tokenizer.asciidoc[]

include::tokenizers/ngram-tokenizer.asciidoc[]

include::tokenizers/whitespace-tokenizer.asciidoc[]

include::tokenizers/pattern-tokenizer.asciidoc[]

include::tokenizers/uaxurlemail-tokenizer.asciidoc[]

include::tokenizers/pathhierarchy-tokenizer.asciidoc[]