The original Transformer paper, 'Attention Is All You Need' (NeurIPS 2017), went on to become one of the most cited AI papers of the decade.
Random fragments from the AI knowledge graph. Refresh for more.
GPT-3 has 175 billion parameters. If each parameter were a grain of sand, the pile would fill roughly seven dump trucks.
The term 'hallucination' in AI was borrowed from psychiatry. LLMs don't see things that aren't there — they confidently generate plausible-sounding fiction.
The attention mechanism in transformers was inspired by human visual attention — how your eyes focus on relevant parts of a scene while ignoring the rest.
A single ChatGPT query uses roughly 10x more energy than a Google search. At scale, AI inference is becoming a significant portion of global compute.
The 'context window' metaphor comes from signal processing, where a window function selects a section of a signal for analysis. In LLMs, it's how much text the model can 'see' at once.