

A token is a unit of text, such as a word, part of a word, or a symbol, that has been extracted from a larger dataset during the tokenization process. Tokens are the basic building blocks used for further processing and analysis, such as in language modeling and text classification tasks. In blockchain and cryptocurrency, a token represents a unit of value issued by a project or organization and can be traded or used within a specific ecosystem.
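As a minimal sketch of the tokenization process described above, the following Python snippet splits text into word and symbol tokens using a regular expression. This is an illustrative simplification: production language models typically use subword tokenizers (e.g., BPE or WordPiece) that can split a single word into multiple tokens.

```python
import re

def tokenize(text):
    # Match runs of word characters, or single non-space symbols.
    # A real LLM tokenizer would instead map text to subword units
    # drawn from a learned vocabulary.
    return re.findall(r"\w+|[^\w\s]", text)

tokens = tokenize("Tokens are the building blocks of NLP!")
print(tokens)
# → ['Tokens', 'are', 'the', 'building', 'blocks', 'of', 'NLP', '!']
```

Each resulting token can then be mapped to a numeric ID for downstream tasks such as language modeling or text classification.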