Humans Can Learn from Transformers and Agents


Idea 1: Transformer Architecture & Multi-Pass Thinking

Half of any solution is knowing what the problem is. Transformers address this with Query, Key, Value attention.

Analogy: Librarian + Book + Contents

The transformer attention mechanism teaches us to think in three passes.

Scaled Dot-Product Self-Attention

• Query: What am I looking for?
• Key: Where should I look?
• Value: What content do I extract?
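Those three questions map directly onto the standard scaled dot-product attention formula, softmax(QK^T / sqrt(d_k)) V. A minimal NumPy sketch (function and variable names are my own, not from the article):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    # "Where should I look?" - compare the query against every key
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable row-wise softmax turns scores into weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # "What content do I extract?" - a weighted mix of the values
    return weights @ V
```

The query decides what to search for, the dot product with each key decides where to look, and the softmax-weighted sum of values decides what content comes back.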

Human Application

• Pass 1: Identify - Who you are, what you need (your query)
• Pass 2: Locate - What's important? Where to focus? (the keys)
• Pass 3: Extract - Organize and structure the content (the values)

Dialogic Approach to Knowledge

In therapy or self-reflection:

  • 50% is finding what matters to you (the query)
  • 50% is the "what" and "how" of the content itself

Key insight: Multiple passes help us overcome cognitive fixation and limited working memory, much as repeated attention passes let transformers re-focus within a limited context window.