Saturday, July 05, 2025

Scholars Discover a Universal Law of Memory

Amazing stuff! A unification of human memory and mathematics?

Human memory modeled as a random tree, with multiple key points visited along the way as a story is retrieved from memory.

"... [led] a team of IAS scholars to a surprising discovery about the individualized content and organization of our memories: they are united by a common mathematical structure.

In a study investigating how people recall stories, the team discovered a general law that governs the way we store and compress meaningful memories. Although memories align with an individualized sense of what matters, our brains use a common architecture to organize our recollections, nesting information in a hierarchical structure that helps us navigate them efficiently.

Even more striking is the team’s finding that these hierarchical structures give rise to a universal pattern in the way people encode and communicate what they remember. For the first time, scientists are offering evidence that humans compress information into a limited number of “building blocks” that represent detailed stories. 

These units of information are capacious, able to summarize ever greater swaths of information as a narrative grows. But no matter how long or short the original story—whether it is War and Peace or a quick jolt of horror from Stephen King—people will still use the same finite set of units to relay their memory of the narrative.  ...

Physics meets neuroscience
A specialist in statistical mechanics, he applies methods originating in the probabilistic explanation of complex physical systems to neuroscience. ...

This model, described as a “random tree ensemble,” highlights above all the efficiency of our biology. Our working memory—the system our brains use to hold, sort, and retrieve information for active processing—is, by nature, limited. The model displays how our brains optimize for this limitation, using hierarchical networks that cascade downward in a tree-like formation to nest details inside meaningful events.

To recall a story or a series of autobiographical events, we travel down this tree, which accounts for the complex nature of narrative memory. ...

The IAS team modeled subjects’ recollections using linguistic clauses as fundamental building blocks (similar to how physics models the collective behavior of atoms) and analyzed their structural organization statistically. 

The resulting model shows that the average length of our recall of a story does not increase in proportion with the length of the story itself. Instead, as a story becomes longer and more detailed, our brains compress the narrative into resourceful summaries. 

The model also makes an important new prediction: as narratives grow longer, a universal, scale-invariant limit emerges. 

Whether we give people [subjects of study] a novel or a short story, the average distribution of their summarization and compression levels would be exactly the same ... “The chance a clause from the recall will summarize 10% of the story is independent of the story’s length, for example. Our brains seem to have evolved to consolidate information in this universal way to make the most of our physiology’s advantages and limitations.” ..."
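The compression behavior described above can be illustrated with a toy simulation. This is only a sketch under my own assumptions (random 2-4-way splits of a narrative into segments, and a working-memory limit of two expanded branches per node), not the IAS team's actual model: recall length grows sublinearly with narrative length because deeper subtrees collapse into single summary sentences.

```python
import random

def build_tree(n, rng):
    # Toy narrative tree: recursively split a segment of n clauses into
    # 2-4 random parts; a leaf (returned as 1) is a single clause.
    # The splitting rule is an assumption, not the paper's exact ensemble.
    if n == 1:
        return 1
    k = min(n, rng.randint(2, 4))
    cuts = sorted(rng.sample(range(1, n), k - 1))
    sizes = [b - a for a, b in zip([0] + cuts, cuts + [n])]
    return [build_tree(s, rng) for s in sizes]

def recall_length(tree, w, rng):
    # Working-memory limit w: descend into at most w children; every
    # unexpanded child subtree is emitted as a single summary sentence.
    if tree == 1:
        return 1
    kids = list(tree)
    rng.shuffle(kids)
    return (sum(recall_length(t, w, rng) for t in kids[:w])
            + len(kids[w:]))

rng = random.Random(0)
avg_recall = {}
for n_story in (50, 200, 800, 3200):
    trials = [recall_length(build_tree(n_story, rng), 2, rng)
              for _ in range(50)]
    avg_recall[n_story] = sum(trials) / len(trials)
    print(n_story, round(avg_recall[n_story], 1))
```

In this toy, the ratio of recall length to story length shrinks as stories get longer, echoing the press release's point that summaries become "resourceful" rather than proportionally longer.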

From the abstract:
"Traditional studies of memory for meaningful narratives focus on specific stories and their semantic structures but do not address common quantitative features of recall across different narratives.
We introduce a statistical ensemble of random trees to represent narratives as hierarchies of key points, where each node is a compressed representation of its descendant leaves, which are the original narrative segments.
Recall from this hierarchical representation is constrained by working memory capacity. Our analytical solution aligns with observations from large-scale narrative recall experiments.
Specifically, our model explains that
(1) average recall length increases sublinearly with narrative length and
(2) individuals summarize increasingly longer narrative segments in each recall sentence.
Additionally, the theory predicts that for sufficiently long narratives, a universal, scale-invariant limit emerges, where the fraction of a narrative summarized by a single recall sentence follows a distribution independent of narrative length."
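The abstract's key quantity, the fraction of a narrative summarized by a single recall sentence, can be computed in the same toy setting. Again a sketch with assumed splitting and working-memory rules, not the paper's analytical solution; it only shows how the statistic is defined: each recall sentence covers either one clause (1/N) or a whole unexpanded subtree.

```python
import random

def build_tree(n, rng):
    # Toy tree over n clauses: random 2-4-way splits (an assumption,
    # not the paper's exact random-tree ensemble); a leaf is 1.
    if n == 1:
        return 1
    k = min(n, rng.randint(2, 4))
    cuts = sorted(rng.sample(range(1, n), k - 1))
    return [build_tree(b - a, rng) for a, b in zip([0] + cuts, cuts + [n])]

def n_leaves(tree):
    return 1 if tree == 1 else sum(n_leaves(t) for t in tree)

def recall_fractions(tree, n_total, w, rng):
    # Fraction of the whole narrative covered by each recall sentence:
    # an expanded leaf covers 1/n_total; an unexpanded subtree is
    # summarized by one sentence covering all of its leaves.
    if tree == 1:
        return [1 / n_total]
    kids = list(tree)
    rng.shuffle(kids)
    out = []
    for t in kids[:w]:
        out.extend(recall_fractions(t, n_total, w, rng))
    for t in kids[w:]:
        out.append(n_leaves(t) / n_total)
    return out

rng = random.Random(1)
n_story = 1000
fractions = recall_fractions(build_tree(n_story, rng), n_story, 2, rng)
# Every clause is covered exactly once, so the fractions sum to 1.
print(len(fractions), max(fractions))
```

Collecting these fractions across many simulated stories of different lengths is how one would test the predicted length-independent distribution empirically.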

IAS Scholars Discover a Universal Law of Memory - Ideas | Institute for Advanced Study



Fig. 1 Ensemble of random trees.
(a) Schematics of memory retrieval from a random hierarchical representation. ...
(b) Mean recalled length 𝐶 as a function of the encoded length 𝑁. ... 
(c) Distribution of the chunk size at the 𝐷th level 𝑛(𝐷) (𝐷 = 4) given root size 𝑛(1) = 𝑁; the range between tick marks on the 𝑦 axis corresponds to [0, 1].
(d) Distribution of compression ratios scaled by 𝑁 as a function of the compression ratios divided by 𝑁. Simulations of different 𝑁’s are shown in different shades of green. The red dashed line is the asymptotic scaling function from Eq. (7).



The random tree of memory, sketched on a blackboard

