All Glossary Terms

Tokenisation

Token

Tokenisation is the process by which LLMs break text into tokens, the small units they actually process. Token limits determine how much text fits in a model's context window, and they drive both cost and whether a given piece of content can be processed at all.
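The idea can be sketched with a deliberately simplified tokenizer. Real LLM tokenizers use learned subword vocabularies (for example byte-pair encoding); the fixed-length chunking below is a hypothetical stand-in used only to show how text becomes a countable sequence of tokens that can be compared against a limit.

```python
def toy_tokenize(text, max_piece=4):
    # Simplified illustration: split on whitespace, then cut each word
    # into chunks of at most max_piece characters. Real tokenizers use
    # learned subword merges, not fixed-length slices.
    tokens = []
    for word in text.split():
        for i in range(0, len(word), max_piece):
            tokens.append(word[i:i + max_piece])
    return tokens

tokens = toy_tokenize("Tokenisation splits text into units")
print(tokens)      # e.g. ['Toke', 'nisa', 'tion', 'spli', 'ts', ...]
print(len(tokens)) # the count you would compare against a context-window limit
```

Counting tokens rather than characters is why two texts of the same character length can cost different amounts, and why a long document may exceed a model's context window.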

Technical Foundations