Tokenization

The process of converting text into tokens, which can then be used as input for machine learning models.
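For illustration only, here is a minimal word-level tokenizer sketch in Python; the function names and the simple word/punctuation splitting rule are assumptions for this example, and production systems typically rely on more sophisticated subword schemes such as byte-pair encoding.

```python
import re

def tokenize(text: str) -> list[str]:
    # Lowercase the text and split it into word and punctuation tokens.
    return re.findall(r"\w+|[^\w\s]", text.lower())

def build_vocab(tokens: list[str]) -> dict[str, int]:
    # Assign each unique token an integer ID, reserving 0 for unknown tokens.
    vocab = {"<unk>": 0}
    for tok in tokens:
        vocab.setdefault(tok, len(vocab))
    return vocab

def encode(text: str, vocab: dict[str, int]) -> list[int]:
    # Convert text into the sequence of token IDs a model would consume.
    return [vocab.get(tok, vocab["<unk>"]) for tok in tokenize(text)]

corpus = "Tokenization converts text into tokens."
vocab = build_vocab(tokenize(corpus))
print(tokenize(corpus))                 # ['tokenization', 'converts', 'text', 'into', 'tokens', '.']
print(encode("Text into tokens!", vocab))  # unseen tokens map to the <unk> ID
```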