---
id: ins_wysiati-overconfidence
operator: Daniel Kahneman
operator_role: Nobel laureate; Princeton emeritus; co-founder of behavioral economics
source_url: https://en.wikipedia.org/wiki/Thinking,_Fast_and_Slow
source_type: book
source_title: "Thinking, Fast and Slow — WYSIATI"
source_date: 2011-10-25
captured_date: 2026-05-05
domain: [research-discovery, strategy, leadership]
lifecycle: [strategy-bets, risk-quality, customer-research]
maturity: foundational
artifact_class: framework
score: { originality: 4, specificity: 4, evidence: 4, transferability: 5, source: 5 }
tier: A
related: [ins_system1-system2-thinking]
raw_ref: raw/expert-content/experts/daniel-kahneman.md
---

# The less you know, the more confident you are: WYSIATI builds the cleanest stories from the thinnest data

## Claim

What You See Is All There Is: humans build the best possible coherent story from whatever limited information is in front of them, with no awareness of what is missing. Confidence in the story rises as the data thins, because thin data produces fewer contradictions to explain; overconfidence is therefore inversely correlated with actual knowledge.

## Mechanism

System 1 is a coherence-seeking machine. It does not represent uncertainty as "I don't know"; it represents it as "the story I have built is right." Missing information is invisible to System 1 because, by definition, nothing in working memory represents it. The more data you actually gather, the more contradictions surface and the harder coherence becomes, which feels like declining confidence even though the picture is now closer to the truth (the toy model below illustrates this inversion). The fix is to explicitly enumerate what you do not know before committing to a position, since System 2 has to be invoked to populate the empty slots.

## Conditions

Holds when:

- The decision is being made under information scarcity (early-stage research, novel competitive landscapes, first customer conversations).
- The decision-maker has authority and is rewarded for confidence.
- No structured process forces the unknowns to be listed.

Fails when:

- The domain has well-known calibration tools (insurance actuarial models, weather forecasting) that surface the unknowns automatically.
- The team practices red-teaming or pre-mortems as routine, which forcibly populate the missing-info slots.
- The decision-maker is genuinely an expert in the domain and recognises the boundary of their own knowledge (this expertise paradox is itself a trained System-2 skill).

## Evidence

> "you build the best possible story from whatever limited information you have, with no awareness of what you do not know, which produces overconfidence that is inversely correlated with actual knowledge" · synthesized from Kahneman's published work; see `raw/expert-content/experts/daniel-kahneman.md` line 16.

## Signals

- Strategy decks that confidently characterise a competitor's roadmap from 2-3 LinkedIn posts.
- ICP definitions written before any switch interviews, with high specificity and zero counter-examples.
- Forecasts that get more confident as the team prepares the deck, not less.

## Counter-evidence

Confidence-as-leadership-signal is sometimes load-bearing for execution morale even when it is epistemically wrong; founders who pause to enumerate unknowns can stall their teams. The cure (red-teaming) has its own failure mode: process theater in which unknowns are listed but ignored.
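## Toy model

Not from the source: a minimal simulation sketch of the claimed inversion, assuming a single yes/no question, independent evidence items that are each wrong with probability `NOISE`, and "felt confidence" proxied by the majority share of the reports seen so far. All three modelling choices are illustrative assumptions, not Kahneman's. With one report there is nothing to contradict, so felt confidence is maximal while accuracy is at its floor; as reports accumulate, contradictions surface, felt confidence drops, and accuracy rises.

```python
import random

random.seed(0)

TRUTH = 1      # ground-truth answer to a yes/no question (arbitrary)
NOISE = 0.3    # probability that any single evidence item is wrong (arbitrary)

def sample_evidence(k):
    """k independent reports; each flips the truth with probability NOISE."""
    return [TRUTH if random.random() > NOISE else 1 - TRUTH for _ in range(k)]

def felt_confidence(reports):
    """System-1 proxy for coherence: the share of reports agreeing with the
    majority verdict. Fewer reports mean fewer visible contradictions and
    therefore higher coherence, regardless of how much is actually known."""
    majority = max(set(reports), key=reports.count)
    return reports.count(majority) / len(reports)

def trial(k, runs=10_000):
    """Average felt confidence and accuracy of the majority verdict over
    many independent draws of k evidence items."""
    conf = acc = 0.0
    for _ in range(runs):
        reports = sample_evidence(k)
        majority = max(set(reports), key=reports.count)
        conf += felt_confidence(reports)
        acc += (majority == TRUTH)
    return conf / runs, acc / runs

# Odd k avoids majority ties.
for k in (1, 3, 9, 27):
    c, a = trial(k)
    print(f"evidence items={k:2d}  felt confidence={c:.2f}  accuracy={a:.2f}")
```

Under these assumptions, felt confidence falls from 1.00 at one evidence item toward roughly 0.71 at twenty-seven, while accuracy climbs from about 0.70 toward 0.98: the thinnest data yields the cleanest story and the worst knowledge.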
## Cross-references

- `ins_system1-system2-thinking`: WYSIATI is the System-1 mechanism that produces overconfidence; the System-2 corrective is to enumerate unknowns explicitly.