Tokenization

TERM
Tokenization
DEFINITION
Breaking text into smaller units called tokens (words, subwords, or characters) that an AI model can map to numeric IDs and process
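
A minimal sketch of the idea in Python, using a simple word-and-punctuation splitter (the function name and regex here are illustrative; production models typically use subword schemes such as byte-pair encoding):

    import re

    def tokenize(text: str) -> list[str]:
        # Split into word tokens and single punctuation marks.
        # Real tokenizers usually work on subword units instead.
        return re.findall(r"\w+|[^\w\s]", text)

    tokens = tokenize("Tokenization breaks text into tokens!")
    print(tokens)  # ['Tokenization', 'breaks', 'text', 'into', 'tokens', '!']

    # Models then map each token to an integer ID via a vocabulary
    # (built here from the sample text purely for illustration):
    vocab = {tok: i for i, tok in enumerate(sorted(set(tokens)))}
    print([vocab[tok] for tok in tokens])

The second step is why tokenization matters: the model never sees raw text, only the sequence of token IDs produced by this lookup.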