What are tokens?

A token is a compact representation of a word, part of a word, common phrase, or symbol. AI models use tokens as a way to conserve memory and computing power.

AI language models generate a response by first analyzing text (the input) and then predicting the most likely sequence of words to return to the user (the output). You can picture this process working like a sophisticated autocomplete.

AI can only hold so much in its memory, so tokenizing words allows the AI to consider more content at once. It’s sort of like how people shorten words to stay within Twitter’s character limit when tweeting. “What” changes to “w/”, “As far as I know” changes to “AFAIK”, and “fabulous” might change to “fab”.

In the same way, tokens convey the same meaning as the words you type, but in a format that requires less storage and computing power. This means you can send more “meaning” to the AI with each input, because your words are “tokenized”, or translated into tokens.

The AI generates its response (the output) in tokens, and those are translated back to human readable text.
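To make this concrete, here is a toy sketch of the idea in Python. This is not how any real model tokenizes text (real tokenizers have vocabularies of tens of thousands of learned word-pieces); the tiny `VOCAB` below is invented purely for illustration. It shows the round trip described above: text is encoded into integer token IDs, and IDs are decoded back into text pieces.

```python
# Toy illustration only -- a made-up five-entry vocabulary mapping words
# and word-pieces to integer token IDs. Real tokenizers learn vocabularies
# of tens of thousands of pieces from data.
VOCAB = {"what": 0, "are": 1, "token": 2, "s": 3, "?": 4}
ID_TO_PIECE = {i: piece for piece, i in VOCAB.items()}

def encode(text):
    """Greedily split each word into the longest pieces found in VOCAB."""
    ids = []
    for word in text.lower().split():
        while word:
            for end in range(len(word), 0, -1):
                piece = word[:end]
                if piece in VOCAB:
                    ids.append(VOCAB[piece])
                    word = word[end:]
                    break
            else:
                raise ValueError(f"no token for {word!r}")
    return ids

def decode(ids):
    """Join token IDs back into text (this toy version drops spaces)."""
    return "".join(ID_TO_PIECE[i] for i in ids)

ids = encode("What are tokens ?")
print(ids)          # "tokens" has no entry, so it splits into "token" + "s"
print(decode(ids))  # pieces joined back together, spacing not preserved
```

Notice that “tokens” is not in the vocabulary, so it gets split into two tokens (“token” + “s”), just as real models break rare words into smaller pieces.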


© Latitude 2022