Elasticsearch lets you customize your analyzer. The first step is to define the analyzer in the index settings; you then reference it in the mappings. You can define the analyzer either directly on an index or in an index template that applies to all indices matching its index pattern. Recall that an analyzer has exactly one tokenizer and, optionally, any number of character filters and token filters. Let's create a custom analyzer to extract the tokens we will use in the next chapter, built from the following components (a sketch of the index settings follows the list):
- tokenizer: Use the char_group tokenizer to split on separators such as whitespace, digits, punctuation (except hyphens), end-of-line characters, and symbols.
- token filters: Use the pattern_replace, lowercase, stemmer, stop, length, and unique filters.
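Here is a minimal sketch of what such an analyzer could look like in the index settings. The index name, the explicit list of separator characters, the replacement pattern, and the length bounds are all assumptions for illustration; adjust them to the tokens you actually want to extract. Note that the char_group tokenizer cannot exclude a single character from the `punctuation` class, so to keep hyphens inside tokens the punctuation characters are listed individually.

```json
PUT /my-index
{
  "settings": {
    "analysis": {
      "tokenizer": {
        "my_char_group": {
          "type": "char_group",
          "tokenize_on_chars": [
            "whitespace", "digit", "symbol", "\n",
            ".", ",", ";", ":", "!", "?", "(", ")", "[", "]", "\"", "'"
          ]
        }
      },
      "filter": {
        "my_pattern_replace": {
          "type": "pattern_replace",
          "pattern": "[^A-Za-z-]",
          "replacement": ""
        },
        "my_stemmer": {
          "type": "stemmer",
          "language": "english"
        },
        "my_length": {
          "type": "length",
          "min": 2,
          "max": 255
        }
      },
      "analyzer": {
        "my_custom_analyzer": {
          "type": "custom",
          "tokenizer": "my_char_group",
          "filter": [
            "my_pattern_replace",
            "lowercase",
            "my_stemmer",
            "stop",
            "my_length",
            "unique"
          ]
        }
      }
    }
  }
}
```

You can verify the output of the analyzer with the `_analyze` API (for example, `POST /my-index/_analyze` with `"analyzer": "my_custom_analyzer"` and a sample `"text"` value) before wiring it into the mappings.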
Since the description text will be indexed differently, we need to...