Topology’s Continuous Learning Model (CLM) is a natural-language program that accumulates knowledge, experience, and skills over time, much as humans do.
LLMs have the following problems:
Topology’s CLM:
<aside> <img src="https://prod-files-secure.s3.us-west-2.amazonaws.com/8d5858df-1063-4797-a4e7-227579d3406e/785dfe0d-7ed0-4c79-9b87-b0719f6206e0/TopologyLogoBig.png" alt="Topology logo" width="40px" /> Have questions about our docs? Chat with the CLM! It’s already learned all of this so you don’t have to.
</aside>
Continuous learning is enabled by a novel algorithm whose performance improves with scale.
Topology has built a ‘hippocampus’ for transformer-based language models. In the human brain, the frontal lobes support language production, while the hippocampus organizes memories and supports learning; damage to the hippocampus causes amnesia.
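The docs don’t specify how the ‘hippocampus’ is implemented. Purely as an illustration of the idea of an external memory alongside a language model, here is a toy store that writes text entries and recalls the most relevant one for a query using bag-of-words cosine similarity (all names, such as `MemoryStore`, are hypothetical, not Topology’s API):

```python
from collections import Counter
import math


class MemoryStore:
    """Toy 'hippocampus': stores text entries and recalls the ones
    most similar to a query (bag-of-words cosine similarity)."""

    def __init__(self):
        self.entries = []  # list of (text, token-count vector)

    @staticmethod
    def _vectorize(text):
        # Represent text as a sparse bag-of-words count vector.
        return Counter(text.lower().split())

    @staticmethod
    def _cosine(a, b):
        dot = sum(a[t] * b[t] for t in a if t in b)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    def write(self, text):
        """Consolidate a new experience into memory."""
        self.entries.append((text, self._vectorize(text)))

    def retrieve(self, query, k=1):
        """Recall the k stored entries most relevant to the query."""
        qv = self._vectorize(query)
        ranked = sorted(self.entries,
                        key=lambda e: self._cosine(qv, e[1]),
                        reverse=True)
        return [text for text, _ in ranked[:k]]


memory = MemoryStore()
memory.write("The CLM docs describe continuous learning.")
memory.write("Transformers process tokens in parallel.")
print(memory.retrieve("How does continuous learning work?", k=1))
# → ['The CLM docs describe continuous learning.']
```

A real system would use learned embeddings rather than word counts, but the write/retrieve loop sketches how a model could accumulate and recall experience outside its weights.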