TCW

EXPLORE

Random fragments from the AI knowledge graph. Refresh for more.

HISTORY·Google Brain, 2017

The original Transformer paper, 'Attention Is All You Need', was presented at NeurIPS 2017 (then called NIPS) and went on to become one of the most cited AI papers of the decade.

SCALE·OpenAI, 2020

GPT-3 has 175 billion parameters. If each parameter were a grain of sand, the pile would fill roughly 7 dump trucks.
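
A quick back-of-the-envelope check of that comparison. The grain volume (~0.5 mm³) and truck capacity (~12 m³) are assumed figures, not from the source:

```python
# Back-of-the-envelope check: 175 billion parameters as grains of sand.
# Assumed figures: ~0.5 mm^3 per grain, ~12 m^3 per dump-truck bed.
PARAMS = 175e9       # GPT-3 parameter count
GRAIN_MM3 = 0.5      # assumed volume of one grain of sand, mm^3
TRUCK_M3 = 12.0      # assumed capacity of one dump truck, m^3

total_m3 = PARAMS * GRAIN_MM3 / 1e9  # 1 m^3 = 1e9 mm^3
trucks = total_m3 / TRUCK_M3
print(f"{total_m3:.0f} m^3 of sand, about {trucks:.1f} dump trucks")
# 88 m^3 of sand, about 7.3 dump trucks
```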

CONCEPT·Common AI terminology

The term 'hallucination' in AI was borrowed from psychiatry. LLMs don't see things that aren't there — they confidently generate plausible-sounding fiction.

HISTORY·Bahdanau et al., 2014

The attention mechanism was introduced for neural machine translation by Bahdanau et al. in 2014, years before transformers adopted it. The inspiration was human visual attention: the way your eyes focus on relevant parts of a scene while ignoring the rest.
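
The mechanism itself is compact. A minimal NumPy sketch of scaled dot-product attention, the variant transformers use (shapes and random inputs here are purely illustrative):

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                    # query-key similarity
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax: where to "look"
    return weights @ V                               # weighted blend of values

rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))  # 4 tokens, dim 8
print(attention(Q, K, V).shape)  # (4, 8): each output mixes all values
```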

SCALE·IEA estimates, 2025

A single ChatGPT query is estimated to use roughly 10x the energy of a Google search. At scale, AI inference is becoming a significant share of global data-center electricity demand.
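
For a sense of what 10x means in aggregate, a rough sanity check using commonly cited estimates (~0.3 Wh per search, ~3 Wh per ChatGPT query, 1 billion queries a day). All three inputs are assumptions, and the real figures are debated:

```python
# Rough scale check. All inputs are assumed estimates, not measurements.
SEARCH_WH = 0.3        # commonly cited energy per Google search, Wh
QUERY_WH = 3.0         # commonly cited energy per ChatGPT query, Wh
QUERIES_PER_DAY = 1e9  # hypothetical daily query volume

ratio = QUERY_WH / SEARCH_WH
daily_mwh = QUERIES_PER_DAY * QUERY_WH / 1e6  # Wh -> MWh
print(f"about {ratio:.0f}x per query; {daily_mwh:,.0f} MWh/day at 1B queries")
# about 10x per query; 3,000 MWh/day at 1B queries
```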

CONCEPT·Signal processing → NLP crossover

The 'context window' metaphor comes from signal processing, where a window function selects a section of a signal for analysis. In LLMs, it's how much text the model can 'see' at once.
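
The two senses of 'window' line up neatly in code. A minimal sketch, with all sizes made up for illustration: a window function selecting a slice of a signal, next to a context window selecting a slice of tokens:

```python
import numpy as np

# Signal processing: a window function selects a section of a signal.
signal = np.sin(np.linspace(0, 20, 1000))
start, width = 200, 256
windowed = signal[start:start + width] * np.hanning(width)  # tapered slice

# LLMs: the context window limits how many tokens the model "sees" at once.
tokens = list(range(5000))          # stand-in for a tokenized document
CONTEXT_WINDOW = 4096
visible = tokens[-CONTEXT_WINDOW:]  # common choice: keep the most recent tokens

print(len(windowed), len(visible))  # 256 4096
```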
