Sleep: Periodic knowledge compression in PRIMUS

Jacob Billings
Project Owner


Overview

This proposal develops a “sleep consolidation” paradigm for knowledge graph refinement that leverages topological data analysis to systematically optimize PRIMUS cognitive architecture performance. Drawing from neuroscientific principles of memory consolidation during sleep, our approach transforms discrete symbolic knowledge into continuous geometric representations, applies persistent homology analysis to identify natural clustering patterns, then translates discoveries back into optimized MeTTa atom structures within the Distributed Atomspace (DAS).

RFP Guidelines

Neural-symbolic DNN architectures

Complete & Awarded
  • Type: SingularityNET RFP
  • Total RFP Funding: $160,000 USD
  • Proposals: 17
  • Awarded Projects: 1
SingularityNET
Apr. 14, 2025

This RFP invites proposals to explore and demonstrate the use of neural-symbolic deep neural networks (DNNs), such as PyNeuraLogic and Kolmogorov Arnold Networks (KANs), for experiential learning and/or higher-order reasoning. The goal is to investigate how these architectures can embed logic rules derived from experiential systems like AIRIS or user-supplied higher-order logic, and apply them to improve reasoning in graph neural networks (GNNs), LLMs, or other DNNs. Bids are expected to range from $40,000 - $100,000.

Proposal Description

Our Team

Jacob Billings, Ph.D. (Computational Neuroscience, 15+ years complex systems/network analytics). Fluran (AI/Social Scientist, LangChain/Neo4j/GraphRAG expertise). Victor (Product Management, Silicon Valley experience). Combined expertise in topological data analysis, neural-symbolic integration, and scalable system design for knowledge graph optimization within AGI architectures.

Project details

This proposal develops a "sleep consolidation" paradigm for knowledge graph refinement. Much as the brain consolidates the day's experiences during sleep by rebalancing the biochemistry of its many axons and dendrites, PRIMUS may benefit from periodic sortition and compression cycles for its Distributed Atomspace (DAS).

Theoretically, a topological approach richly describes the structure of the Atomspace. Much as a set of 1-dimensional simplices (edges) completely describes a graph, a set of higher-order n-dimensional simplices (n-connected nodes) may produce a complete descriptor of the metagraph. If so, then standard techniques from topology may be used to enumerate salient features of metagraphs. Not only can we interpret the metagraph's native structure topologically, but we may also be able to imbue salience into metagraphs by periodically refining their topological structure.

Here, we take a principled approach to topological refinement. One way of inferring latent topological structure is to analyze statistical models built from observations of the network. An excellent tool for building such models from metagraphs is PyNeuraLogic. By extending logic programming with numerical parameters, PyNeuraLogic turns knowledge-graph-like stores into differentiable geometric objects capable of learning deep statistical structure through iterative backpropagation of error gradients. The learned numeric relationships between metagraph nodes can then be analyzed topologically to identify a condensed encoding of the database's durable information, while also highlighting unexpected deviations from those durable patterns.
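To make the "complete descriptor" idea above concrete, the following pure-Python sketch downward-closes a toy hypergraph into its full set of simplices: a k-node hyperedge contributes every non-empty subset of its nodes as a face. The function name and toy data are ours for illustration only, not part of any DAS or MeTTa API.

```python
from itertools import combinations

def hypergraph_to_simplices(hyperedges):
    """Downward-close each hyperedge: a k-node hyperedge contributes
    every non-empty node subset (all faces up to dimension k-1)."""
    simplices = set()
    for edge in hyperedges:
        nodes = sorted(edge)
        for k in range(1, len(nodes) + 1):
            for face in combinations(nodes, k):
                simplices.add(face)
    return simplices

# Toy metagraph: one ternary link and one binary link
complex_ = hypergraph_to_simplices([{"a", "b", "c"}, {"c", "d"}])
```

The resulting set of simplices is exactly the simplicial complex on which filtrations and homology computations operate downstream.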

**Core Innovation: Cyclical Knowledge Graph Optimization**

1. **Knowledge State Extraction**: Convert current DAS MeTTa graphs into hypergraph representations suitable for geometric analysis
2. **Dense Representation Learning**: Use PyNeuraLogic to learn continuous embeddings capturing both explicit relationships and latent semantic patterns
3. **Topological Structure Discovery**: Apply Vietoris-Rips filtrations and persistent homology to identify robust structural features across multiple scales
4. **Hierarchical Atom Generation**: Transform topological discoveries into new hierarchical MeTTa atom structures preserving both local relationships and global organization
5. **DAS Integration & Performance Assessment**: Integrate consolidated knowledge with comprehensive benchmarking across PRIMUS components (PLN, ECAN, MOSES)
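Step 3 above can be illustrated at the 0-dimensional level with nothing more than a union-find over the Vietoris-Rips edge filtration: every point is born at scale 0, and a component dies at the minimum-spanning-tree edge length that merges it into another. This is a stdlib sketch for intuition only; the actual toolkit would use a library such as GUDHI or Ripser and compute higher-dimensional homology as well.

```python
import math

def h0_barcodes(points):
    """0-dimensional persistent homology of the Vietoris-Rips
    filtration: process edges by increasing length (Kruskal-style);
    each union records the death scale of one connected component."""
    n = len(points)
    edges = sorted(
        (math.dist(points[i], points[j]), i, j)
        for i in range(n) for j in range(i + 1, n)
    )
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    deaths = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            deaths.append(w)  # one component dies at this scale
    # n-1 finite bars plus one infinite bar for the last component
    return [(0.0, d) for d in sorted(deaths)] + [(0.0, math.inf)]

bars = h0_barcodes([(0, 0), (1, 0), (5, 0)])
```

The long finite bars are the "robust structural features" the pipeline promotes into hierarchical atoms, while short bars are treated as noise.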

**Direct PRIMUS Enhancement**

Our approach specifically targets measurable improvements in PRIMUS cognitive components:

- **PLN Optimization**: Topological clustering provides natural uncertainty regions for probabilistic reasoning, enabling more efficient inference through pre-computed confidence boundaries
- **ECAN Enhancement**: Geometric attention allocation uses topological centrality measures to identify high-importance knowledge regions, improving the precision of resource allocation
- **MOSES Integration**: Evolutionary search operates within topologically constrained spaces, using discovered hierarchies to guide program synthesis toward semantically meaningful solutions
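As a minimal sketch of what "topological centrality" for attention allocation could mean, the following ranks nodes of a toy adjacency structure by eccentricity (maximum hop distance to any other node): low eccentricity marks a topologically central region that would receive more attention. The ECAN integration itself is a deliverable; all names and data here are hypothetical.

```python
from collections import deque

def eccentricity_ranking(adj):
    """Rank nodes most-central-first by eccentricity, i.e. the
    maximum BFS hop distance from the node to any other node."""
    def bfs_ecc(src):
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        return max(dist.values())
    return sorted(adj, key=bfs_ecc)

# Star graph: the hub is the natural attention focus
adj = {"hub": ["x", "y", "z"], "x": ["hub"], "y": ["hub"], "z": ["hub"]}
ranking = eccentricity_ranking(adj)
```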

**Technical Architecture**

The system integrates seamlessly with existing SingularityNET infrastructure:
- Native MeTTa type system extensions for geometric operations
- MORK billion-atom processing capabilities for large-scale topological computation
- PyNeuraLogic integration for hypergraph embedding learning
- Comprehensive performance monitoring and validation frameworks

**Expected Impact**

This work directly addresses PRIMUS scalability challenges while providing:
- Measurable reasoning performance improvements across PLN, ECAN, and MOSES
- Automated knowledge graph quality enhancement without manual intervention
- Mathematical rigor through topological analysis ensuring robust structural discoveries
- Open-source tools enabling broader AGI research community adoption

The consolidation paradigm represents a novel approach to knowledge graph evolution that leverages both symbolic reasoning strengths and geometric analysis power, creating synergistic improvements in AGI cognitive architecture performance.

Open Source Licensing

MIT - Massachusetts Institute of Technology License

Links and references

- Research Profile: https://scholar.google.com/citations?user=Lmi2sg8AAAAJ/
- Professional Background: https://www.linkedin.com/in/jacob-billings/  
- ORCID: https://orcid.org/0000-0002-8186-6126
- Recent Publication: "Nature Prefers Sustainable Structures: Implications for Large-Scale Political Self-Organization"
- Topological Methods Research: "Simplicial and Topological Descriptions of Human Brain Dynamics," Network Neuroscience


  • Total Milestones: 2
  • Total Budget: $40,000 USD
  • Last Updated: 27 May 2025

Milestone 1 - Topological Foundation & PyNeuraLogic Integration

Description

Establish core topological analysis infrastructure integrated with PyNeuraLogic for hypergraph embedding learning. Develop MeTTa language extensions supporting geometric operations and create computational topology toolkit optimized for knowledge graph analysis. Key activities include implementing Vietoris-Rips complex computation, persistent homology extraction with barcode analysis, PyNeuraLogic integration for dense representation learning, and performance optimization for large-scale topological computation leveraging MORK's billion-atom processing capabilities. Create comprehensive testing framework validating topological feature significance through statistical analysis and bootstrap sampling methods.
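The bootstrap validation mentioned above can be sketched as a one-sided resampling test: how often does the maximum lifetime in a resampled set of short "noise" bars reach the observed bar's length? This formulation is our illustrative assumption, not the milestone's actual framework, and the numbers are toy data.

```python
import random

def bootstrap_p_value(null_lifetimes, observed, n_boot=1000, seed=0):
    """One-sided bootstrap p-value: the fraction of resampled null
    maxima meeting or exceeding the observed persistence lifetime,
    with a +1 correction so the estimate is never exactly zero."""
    rng = random.Random(seed)
    k = len(null_lifetimes)
    hits = sum(
        max(rng.choices(null_lifetimes, k=k)) >= observed
        for _ in range(n_boot)
    )
    return (hits + 1) / (n_boot + 1)

# Short noise bars versus one long candidate feature
p = bootstrap_p_value([0.02, 0.05, 0.03, 0.04, 0.01], observed=1.7)
```

A small p flags the observed bar as a statistically significant topological feature worth promoting during consolidation.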

Deliverables

- PyNeuraLogic integration library for knowledge graph hypergraph embedding learning
- Computational topology toolkit including Vietoris-Rips complex generation and persistent homology computation
- Performance benchmarking suite demonstrating scalability to million-node knowledge graphs
- Statistical validation framework for topological feature significance testing
- Comprehensive API documentation and developer guides
- Automated test suite with 85%+ coverage
- Integration demonstration with existing MORK infrastructure

Budget

$20,000 USD

Success Criterion

Milestone succeeds when system can process knowledge graphs with 1M+ nodes, compute persistent homology with sub-minute response times, generate statistically significant topological features validated through bootstrap analysis, demonstrate seamless PyNeuraLogic integration for embedding learning, and show successful MeTTa expression evaluation for geometric operations with comprehensive test coverage.

Milestone 2 - PRIMUS Cognitive Architecture Integration

Description

Implement direct integration with PRIMUS cognitive components (PLN, ECAN, MOSES) using topological analysis results to enhance reasoning performance. Develop specialized interfaces respecting each subsystem's computational requirements while providing enhanced knowledge organization. Create PLN uncertainty regions based on topological clustering, implement ECAN attention allocation using geometric centrality measures, and develop MOSES evolutionary search within topologically-constrained spaces. Establish comprehensive benchmarking framework measuring reasoning performance improvements across all PRIMUS components with automated A/B testing capabilities.
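One minimal way to realize the pre/post A/B comparison is a paired sign test over matched task scores; the concrete benchmarking framework is itself a deliverable, so this stdlib sketch (with invented scores) only illustrates the statistical shape of the test.

```python
from math import comb

def sign_test_p(pre, post):
    """One-sided paired sign test: probability of seeing at least
    this many post > pre improvements if each matched task were
    equally likely to improve or degrade (ties are discarded)."""
    wins = sum(a > b for a, b in zip(post, pre))
    n = sum(a != b for a, b in zip(post, pre))
    return sum(comb(n, k) for k in range(wins, n + 1)) / 2 ** n

# Hypothetical reasoning-accuracy scores on 8 matched tasks
pre = [0.61, 0.58, 0.64, 0.60, 0.57, 0.63, 0.59, 0.62]
post = [0.68, 0.66, 0.70, 0.67, 0.65, 0.71, 0.66, 0.69]
p = sign_test_p(pre, post)
```

With all eight tasks improving, the one-sided p-value is 1/256, comfortably below a 0.05 significance threshold.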

Deliverables

- PLN integration with topological uncertainty region computation and confidence boundary optimization
- ECAN attention allocation enhancement using topological centrality and geometric importance measures
- MOSES evolutionary search integration with topologically constrained optimization spaces
- Automated benchmarking suite measuring reasoning performance across PLN inference accuracy, MOSES convergence rates, and ECAN attention efficiency
- A/B testing framework comparing pre/post-consolidation performance on standardized reasoning tasks
- Performance optimization specifically targeting PRIMUS component integration points
- Comprehensive documentation of cognitive architecture enhancement mechanisms
- Statistical analysis reports demonstrating measurable reasoning improvements

Budget

$20,000 USD

Success Criterion

Milestone succeeds when PLN demonstrates 10%+ improvement in inference accuracy, ECAN shows 10%+ improvement in attention allocation efficiency, MOSES exhibits 10%+ faster convergence on program synthesis tasks, A/B testing confirms statistical significance of improvements, and integration maintains full compatibility with existing PRIMUS architecture while demonstrating sustained performance gains across multiple evaluation cycles.
