-
Code mode for MCP servers and LLMs
Code mode for MCP servers means the LLM writes and runs code that calls the proper MCP method, instead of calling the tool directly with the whole context. The call becomes much smaller: no overhead is passed, just the basics required to invoke the right MCP method. Just as you would do in code, a method or a function gets proper parameters, everything is validated and… BAM ! We return a context that the LLM uses in further stages. Anthropic wrote… This looks really nice, but only for huge models with a 1M-token context. We need to remember that this is not possible on any kind…
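A minimal sketch of the idea, with an entirely hypothetical `get_weather` tool: in a direct tool call, the full schema plus arguments travel through the model's context, while in code mode the model emits one small, validated function call.

```python
import json

# Hypothetical MCP tool schema (illustration only, not a real server's API).
TOOL_SCHEMA = {
    "name": "get_weather",
    "description": "Return current weather for a city",
    "inputSchema": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name"},
            "units": {"type": "string", "enum": ["metric", "imperial"]},
        },
        "required": ["city"],
    },
}

def call_tool(city: str, units: str = "metric") -> dict:
    """Stub standing in for the actual MCP method invocation."""
    return {"city": city, "units": units, "temp": 21}

# Direct call: schema + arguments are serialized into the model's context.
direct_payload = json.dumps({"schema": TOOL_SCHEMA, "arguments": {"city": "Warsaw"}})

# Code mode: the model writes one line of code; only the basics are passed.
code_mode_payload = 'call_tool("Warsaw")'

print(len(direct_payload), len(code_mode_payload))
```

The size gap grows with every extra tool in context, which is why the savings matter most for servers exposing many methods.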
-
Tokenization and embedding of song lyrics “We will, we will…”
I know you know how it ends, but have you wondered what an LLM would say ? Let us find out. I want to ask Claude Sonnet 4.5 about the embeddings, tokenization and the probability of figuring out the lyrics for “We will, we will…”. Prompt 🙂 Can you show me the tokenization, embeddings, metadata and probabilities for “We will, we will …” Tokenization: any kind of machine learning uses numbers backstage. The text is split into tokens, each mapped to an ID ( example values ): Token ID 1234, Token Text We…
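The split-then-map step can be sketched with a toy tokenizer. Real LLM tokenizers use byte-pair encoding and the IDs below are made up for illustration, but the principle is the same: each piece of text gets a numeric ID, and a repeated piece gets the same ID.

```python
# Toy word-level tokenizer: real BPE tokenizers split differently,
# and the starting ID of 1000 is an arbitrary assumption.
text = "We will, we will"
vocab: dict[str, int] = {}
tokens = []
for word in text.replace(",", " ,").split():
    tok_id = vocab.setdefault(word, 1000 + len(vocab))  # assign next free ID
    tokens.append((tok_id, word))
print(tokens)
```

Note that "will" appears twice and both occurrences map to the same ID, while "We" and "we" get different IDs because tokenizers are case-sensitive.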
-
What is a token in AI? Prompt examples
What is a token in AI ? – A piece of text, usually a word, that we send to the LLM. The same goes for the response ( roughly, number of words ≈ number of tokens ). How to use it ? Since we pay for it, rather cautiously. We want to send the minimum and get the maximum out of every request. Pretty much the basics of economics. Below are a couple of prompts I used to analyze my token usage. Some are obviously hallucinations, but on the other hand we get a pretty decent breakdown of all the data I sent for that coding…
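Before firing off a prompt, a rough estimate of its token count and cost helps with that economics. A minimal sketch, using the common rule of thumb of ~4 characters per English token and a hypothetical price of $3 per million input tokens (check your provider's actual pricing):

```python
def rough_token_count(text: str) -> int:
    # Rule of thumb: ~4 characters per token for English text.
    # A real tokenizer gives exact counts; this is only an estimate.
    return max(1, len(text) // 4)

# Hypothetical pricing for illustration only: $3 per 1M input tokens.
PRICE_PER_TOKEN = 3 / 1_000_000

prompt = "Summarize my token usage for this coding session."
tokens = rough_token_count(prompt)
cost = tokens * PRICE_PER_TOKEN
print(f"~{tokens} tokens, ~${cost:.8f}")
```

For exact numbers, use the provider's own tokenizer or the usage fields returned in the API response.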






