/TEM-puh-ruh-chur/
A setting (0 to 1+) that controls how random or creative an AI's responses are. Low temperature = predictable and focused. High temperature = creative and varied.
Temperature is a parameter that controls the randomness of AI output. Under the hood, the model's raw scores (logits) are divided by the temperature before being converted to probabilities, so low values sharpen the distribution and high values flatten it. At temperature 0, the model always picks the most probable next word: deterministic, reliable, but often repetitive output. At temperature 1, it samples from the unscaled probability distribution, producing creative, varied, but sometimes chaotic results.
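To make the mechanism concrete, here is a minimal sketch of temperature sampling over a toy list of logits. This is an illustration of the general technique, not any particular model's implementation; the function name and example values are made up for this sketch.

```python
import math
import random

def sample_with_temperature(logits, temperature, rng=random):
    """Pick a token index from raw scores (logits), scaled by temperature."""
    if temperature == 0:
        # Greedy decoding: always return the index of the highest logit.
        return max(range(len(logits)), key=lambda i: logits[i])
    # Divide logits by temperature: <1 sharpens, >1 flattens the distribution.
    scaled = [score / temperature for score in logits]
    # Softmax (subtracting the max for numerical stability).
    peak = max(scaled)
    exps = [math.exp(s - peak) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one index according to those probabilities.
    r = rng.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return i
    return len(probs) - 1

# Toy logits for three candidate tokens: the first is strongly favored.
logits = [2.0, 1.0, 0.1]
sample_with_temperature(logits, 0)     # greedy: always index 0
sample_with_temperature(logits, 1.0)   # usually 0, sometimes 1 or 2
```

At a very low temperature (say 0.01) the scaled gap between logits becomes huge, so nearly all probability mass lands on the top token, which is why low-temperature output feels so predictable.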
Think of it like a dial between 'accountant' and 'jazz musician.' For code generation, data extraction, and factual Q&A, you want low temperature (0-0.3). For creative writing, brainstorming, and exploring ideas, you want higher temperature (0.7-1.0). Most AI interfaces default to around 0.7.
The word is borrowed from thermodynamics — at high temperatures, particles move chaotically; at low temperatures, they're stable and predictable. Same principle with AI tokens.
When you need to control the creativity-vs-reliability tradeoff in AI output. Essential for any production AI system.
Temperature is the single most impactful parameter most people never touch. Knowing when to adjust it separates hobbyists from professionals.
Hot = wild and unpredictable. Cold = stable and precise. Just like molecules in physics.