Breaking the Amnesia Cycle in Large Sequence Models
Nested Learning reframes neural models as multi-loop systems updating at different frequencies, revealing that depth stacking hides gradient mechanics and limits continual learning. It interprets optimizers like Momentum and Adam as associative gradient memories and introduces CMS for incremental abstraction. The HOPE module combines self-modification, multi-clock updates, and deep contextual compression, offering a white-box path beyond static backbones for long-context and continual-learning tasks.

Juan Manuel Ortiz de Zarate
1 day ago · 9 min read


Measuring Controversy in Social Networks through NLP
Discover how NLP tools can identify and analyze contentious topics.

Juan Manuel Ortiz de Zarate
Jul 6, 2024 · 10 min read