What are tokens?

AI language models process information in tokens. A token can represent a word, part of a word, a phrase, or a symbol. Different models tokenize text in different ways, but in AI Dungeon, one token is about 4 characters.
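The 4-characters-per-token rule of thumb can be turned into a quick estimator. This is a rough sketch, not AI Dungeon's actual tokenizer; the function name and the example string are illustrative.

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~4 characters/token heuristic."""
    return max(1, round(len(text) / 4))

# "The dragon guards the gate." is 27 characters -> roughly 7 tokens.
print(estimate_tokens("The dragon guards the gate."))
```

Real tokenizers split on learned subword units, so actual counts will vary, but this heuristic is useful for budgeting context length.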

To generate an output, AI models analyze text inputs as tokens and then predict the most likely sequence of tokens to return to the user. The AI-generated response is automatically translated from tokens back into human-readable text. You can picture this process working like a really sophisticated autocomplete.
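The "sophisticated autocomplete" idea can be sketched as a toy next-token predictor. The probability table below is entirely invented for illustration; real models compute these probabilities with a neural network over a vocabulary of thousands of tokens.

```python
# Toy "model": invented probabilities of the next token given the previous one.
next_token_probs = {
    "the": {"dragon": 0.5, "knight": 0.3, "cave": 0.2},
    "dragon": {"roars": 0.6, "sleeps": 0.4},
}

def predict_next(token: str) -> str:
    """Return the most likely next token, like a one-step autocomplete."""
    candidates = next_token_probs.get(token, {})
    return max(candidates, key=candidates.get) if candidates else "<end>"

print(predict_next("the"))     # most likely continuation of "the"
print(predict_next("dragon"))  # most likely continuation of "dragon"
```

A real model repeats this prediction step token by token, feeding each chosen token back in as context, until the response is complete.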

Use this resource to understand how words are tokenized →

© Latitude 2023