Dagster Data Engineering Glossary:
Tokenization
The process of splitting input text into smaller units, called tokens (typically words, subwords, or characters), used in natural language processing to analyze and understand the structure of the text.
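As a minimal illustration, the sketch below tokenizes a sentence into word tokens with a simple regular expression from Python's standard library; the `tokenize` function and the regex pattern are assumptions chosen for clarity, and production pipelines would more commonly use a library such as NLTK or spaCy.

```python
import re


def tokenize(text: str) -> list[str]:
    # Lowercase the text, then keep alphanumeric runs (optionally with an
    # internal apostrophe, e.g. "don't") as individual tokens; punctuation
    # and whitespace are dropped.
    return re.findall(r"[A-Za-z0-9]+(?:'[A-Za-z]+)?", text.lower())


sentence = "Tokenization converts raw text into smaller units called tokens."
print(tokenize(sentence))
# ['tokenization', 'converts', 'raw', 'text', 'into', 'smaller', 'units', 'called', 'tokens']
```

More sophisticated tokenizers (for example, the subword tokenizers used by modern language models) apply the same idea but learn their token boundaries from data rather than relying on fixed rules.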