Comparative Analysis of Active Inference in Hebbian Networks and Cognitive Computing Frameworks
Abstract
This study integrates Hebbian learning and Q-learning within a unified cognitive framework to facilitate efficient decision-making in dynamic environments. By merging these learning paradigms, we emulate human cognitive processes and analyze how various cognitive mechanisms can enhance agent behavior. We compare our approach to the "Active Inference in Hebbian Learning Networks" study, which employs Hebbian learning within active inference (AIF) frameworks for controlling dynamic agents. Their study uses two Hebbian ensembles: a posterior network for inferring latent states from observations and a state transition network for predicting future states based on current state-action pairs. Experimental results in the Mountain Car environment demonstrate that Hebbian AIF outperforms Q-learning, highlighting the efficiency of Hebbian learning without replay buffers.
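To make the mechanics of that comparison concrete, the sketch below illustrates, in highly simplified form, how two Hebbian ensembles can be updated online from a single transition without a replay buffer: one associating observations with latent state codes, and one associating state-action pairs with successor states. The array sizes, learning rate, winner-take-all readout, and function names are illustrative assumptions, not the referenced authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_update(W, pre, post, lr=0.05):
    """Plain Hebbian rule: weights between co-active pre- and post-synaptic units grow."""
    return W + lr * np.outer(post, pre)

# Illustrative sizes only; Mountain Car has a 2-D observation, the rest are assumptions.
n_obs, n_state, n_act = 2, 16, 3

W_post = rng.normal(scale=0.1, size=(n_state, n_obs))   # posterior: observation -> latent state
W_trans = np.zeros((n_state, n_state + n_act))           # transition: (state, action) -> next state

def infer_state(W, obs):
    """Winner-take-all readout of the posterior ensemble (an illustrative coding choice)."""
    code = np.zeros(n_state)
    code[np.argmax(W @ obs)] = 1.0
    return code

# One online interaction step, with dummy data standing in for the environment.
obs, next_obs = rng.random(n_obs), rng.random(n_obs)
action = np.eye(n_act)[1]                                 # one-hot encoded action

s = infer_state(W_post, obs)
s_next = infer_state(W_post, next_obs)

# No replay buffer: both ensembles are updated from the current transition alone.
W_post = hebbian_update(W_post, obs, s)
W_trans = hebbian_update(W_trans, np.concatenate([s, action]), s_next)
```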
In our approach, Hebbian learning is applied for memory encoding within a cognitive model, enhancing connections between frequently co-activated nodes and transforming sensory input into a storable format. Q-learning is implemented as a reinforcement learning mechanism using a traditional table-based method, integrated with memory retrieval and attentional selection.
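As an illustration of how these two components fit together, the sketch below pairs an outer-product Hebbian rule for memory encoding (strengthening connections between co-activated nodes) with the standard table-based Q-learning update. The node counts, learning rates, and helper names are assumptions made for the example rather than details of our implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# --- Hebbian memory encoding: nodes that fire together wire together ---
n_nodes = 32
memory_W = np.zeros((n_nodes, n_nodes))

def encode_memory(activation, lr=0.1):
    """Strengthen connections between co-activated nodes (outer-product Hebbian rule)."""
    global memory_W
    memory_W += lr * np.outer(activation, activation)
    np.fill_diagonal(memory_W, 0.0)   # no self-connections

def retrieve_memory(cue):
    """One-step associative recall: pattern completion through the learned weights."""
    return memory_W @ cue

# --- Tabular Q-learning (standard update rule) ---
n_states, n_actions = 100, 4
Q = np.zeros((n_states, n_actions))

def q_update(s, a, r, s_next, alpha=0.1, gamma=0.95):
    """Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))"""
    Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])

# Example step with dummy values:
encode_memory(rng.random(n_nodes))
q_update(s=3, a=1, r=1.0, s_next=7)
```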
Our system architecture integrates multiple cognitive mechanisms, including memory systems, reinforcement learning, and attentional processes, among others, with the aim of achieving adaptive intelligence and efficient decision-making based on feedback and learning.
We present experimental results that establish a strong early foundation, based on the effectiveness of the agent's performance in a randomized maze environment with dynamic objects. Performance in this environment suggests that aspects of this approach may improve computational efficiency in learning and adaptation. We integrate a computational function based on Robert Worden's Requirement Equation, referred to here as the Worden RE Subsystem. We also discuss the impact of distress dynamics on memory encoding and retrieval, highlighting the importance of distress states in cognitive processes.
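The following minimal sketch shows one way a scalar distress level could modulate Hebbian memory encoding, so that high-distress episodes leave stronger traces than neutral ones. The linear gain and the specific modulation scheme are assumptions made for illustration, not the formulation used in the paper, and the Worden RE Subsystem is not reproduced here.

```python
import numpy as np

def distress_modulated_lr(base_lr, distress, gain=1.5):
    """Scale the Hebbian encoding rate by the current distress level in [0, 1].
    The linear gain is an illustrative assumption, not the paper's formulation."""
    return base_lr * (1.0 + gain * distress)

def encode_with_distress(W, activation, distress, base_lr=0.1):
    """Distress-weighted Hebbian encoding: salient (high-distress) episodes
    leave stronger memory traces than neutral ones."""
    lr = distress_modulated_lr(base_lr, distress)
    W = W + lr * np.outer(activation, activation)
    np.fill_diagonal(W, 0.0)
    return W

# Example: the same pattern is encoded more strongly under high distress.
pattern = np.random.default_rng(2).random(8)
W_calm = encode_with_distress(np.zeros((8, 8)), pattern, distress=0.1)
W_alarm = encode_with_distress(np.zeros((8, 8)), pattern, distress=0.9)
assert W_alarm.sum() > W_calm.sum()
```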
The paper concludes by emphasizing the distinctions and similarities between our approach and the referenced study, highlighting the importance of unsupervised learning and biological plausibility. By incorporating distress states, belief dynamics, and other operations into the learning process, our model aims to provide a holistic representation of engineered active inference and to enhance the overall performance and decision-making capabilities of cognitive computing models in complex, real-world environments.
Files (1.9 MB)

Name | Size
---|---
Comparative Analysis of Active Inference in Hebbian Networks and Cognitive Computing Frameworks.docx (md5:06d4345dba57f1f3797556e6f01f729f) | 1.9 MB
Additional details
Identifiers
- arXiv: arXiv:2306.05053

Dates
- Available: 2024-07
Software
- Repository URL: https://github.com/Infrabenji/cognitive-computing-frameworks
- Programming language: Python
- Development Status: Active
References
- Nelson, B. (2024). Comparative Analysis of Active Inference in Hebbian Networks and Cognitive Computing Frameworks. Zenodo. http://doi.org/10.5281/zenodo.12562484