Empathetic AI

Expert Rating 3.6
Rakesh Jakati
Project Owner

Overview

We propose developing Empathetic AI Companions: AI-driven virtual agents capable of engaging with users based on real-time emotional states detected via EEG. Using MeTTa scripting, these companions will dynamically adapt their behavior, responding empathetically to emotions such as happiness, sadness, focus, and stress. Leveraging our pre-existing EEG emotion classification model, developed under a DF4 grant, this project will demonstrate MeTTa’s potential for real-time interaction and adaptive AI systems. While designed for versatile applications, these companions can integrate seamlessly into platforms like Sophiaverse, enhancing emotional engagement and interaction.

RFP Guidelines

Develop interesting demos in MeTTa

Complete & Awarded
  • Type: SingularityNET RFP
  • Total RFP Funding: $100,000 USD
  • Proposals: 21
  • Awarded Projects: 4
SingularityNET
Aug. 12, 2024

Create educational and/or useful demos using SingularityNET's own MeTTa programming language. This RFP aims to drive broader community adoption of MeTTa and engagement within our ecosystem, and to demonstrate and expand the utility of MeTTa. Researchers must maintain their demos for a minimum of one year.

Proposal Description

Project details

Brain Computer Interfaces

BCI technology enables direct communication between the brain and external devices, transforming neural activity into actionable signals for control or feedback. By leveraging advanced signal processing, machine learning, and neuroscience, BCIs empower users with tools for cognitive improvement, enhanced wellness, and seamless interaction between neural and digital systems.

  1. Real-Time Emotional Insights: Continuous monitoring of states like happiness, sadness, and focus.
  2. Unmatched Accuracy: A deeper understanding of internal emotions, beyond visible expressions.

This project harnesses the power of BCIs to create Empathetic AI Companions: virtual agents designed to listen and adapt conversationally based on emotional states detected in real time. These companions focus on empathetic listening rather than problem-solving or advice, fostering trust and user comfort.

Sadness Detection

When the AI detects patterns in the EEG signals associated with sadness, it approaches the user with a gentle and neutral tone. The goal is to create a space where the user feels comfortable to express themselves without feeling judged or pressured.

What the AI Says:
“It seems like you might be feeling a little down. I’m here if you’d like to talk about it.”
This statement acknowledges the emotion without assuming its cause, giving the user an open invitation to share if they feel ready.

Why This Works:

The wording is neutral and empathetic, avoiding intrusive questions or making assumptions. It signals that the AI is available to listen, allowing the user to decide whether or not to engage.
It focuses on being present, ensuring the interaction feels supportive rather than directive.


Happiness Detection

When EEG signals suggest the user is feeling happy, the AI responds in a way that reinforces and celebrates their positive emotional state. It encourages the user to reflect on the moment, amplifying their joy.

What the AI Says:
“You seem happy! Would you like to share what’s been making you feel this way?”
This approach invites the user to explore and express their happiness further, creating a positive and engaging interaction.

The AI reflects the user’s emotion, affirming their current state and encouraging them to stay in the moment.
By asking an open-ended question, the AI allows the user to decide how much they want to share.
The response reinforces the positive emotion, helping the user feel more connected and acknowledged.
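The two detection scenarios above amount to a mapping from classified emotional states to open-ended, non-directive prompts. The sketch below illustrates that mapping in Python; it is a hypothetical stand-in for the MeTTa scripts the proposal describes, and all names here are illustrative, not the project's actual API.

```python
# Illustrative sketch (not the proposal's actual MeTTa code): mapping
# detected emotional states to empathetic, open-ended opening lines.
EMPATHETIC_OPENERS = {
    "sadness": "It seems like you might be feeling a little down. "
               "I'm here if you'd like to talk about it.",
    "happiness": "You seem happy! Would you like to share what's been "
                 "making you feel this way?",
}

def opener_for(state: str) -> str:
    """Return a non-directive prompt for a detected state, falling back
    to a neutral check-in for states without a dedicated opener."""
    return EMPATHETIC_OPENERS.get(
        state, "I'm here with you. How are you feeling right now?")
```

Keeping the openers in a plain mapping mirrors the design goal stated above: the wording stays neutral and invitational, and adding a new emotional state does not require touching the dispatch logic.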

AI systems today often rely on external cues to interpret emotions, such as text sentiment, voice tone, or facial-expression analysis. These outward signals can mask a user's true internal state, which is precisely what EEG-based detection aims to capture.

Technology Enabling Adaptive Conversations

At the heart of these companions lies MeTTa scripting, which enables dynamic, real-time behavior adaptation:

  • Emotion Classification Integration:
    MeTTa handles the communication with APIs that classify emotions (happiness, sadness, focus) based on EEG patterns, seamlessly interpreting the results for the AI system.

  • Dynamic Behavioral Adaptation:
    MeTTa enables the AI companion to adjust its conversational responses in real time, based on the classified emotional state, ensuring interactions remain relevant and engaging.

  • Streamlined Workflow:
    MeTTa simplifies the complex process of combining EEG data, API results, and adaptive AI behaviors into a cohesive, real-time system.

  • Scalable and Flexible:
    MeTTa’s modular nature allows easy expansion to new emotions or use cases, showcasing its potential for dynamic, human-centered AI applications.
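To make these bullets concrete, here is a minimal Python sketch of the dispatch pattern they describe: classified emotional states routed through a small handler registry. This is a hypothetical illustration, not the proposal's MeTTa implementation; function and handler names are invented.

```python
# Hypothetical sketch of emotion-driven dispatch. Each emotional state
# registers its own behavior; the dispatch loop never changes when a
# new emotion is added (the "modular and scalable" property above).
from typing import Callable, Dict

HANDLERS: Dict[str, Callable[[], str]] = {}

def on_emotion(state: str):
    """Decorator registering a behavior for a classified emotional state."""
    def register(fn: Callable[[], str]) -> Callable[[], str]:
        HANDLERS[state] = fn
        return fn
    return register

@on_emotion("sadness")
def gentle_check_in() -> str:
    return "It seems like you might be feeling a little down."

@on_emotion("happiness")
def celebrate() -> str:
    return "You seem happy! Would you like to share why?"

def dispatch(state: str) -> str:
    """Route a classified state (e.g. from an EEG emotion API) to its
    behavior, with a neutral fallback for unhandled states."""
    handler = HANDLERS.get(state, lambda: "I'm here with you.")
    return handler()
```

In the proposed system the `dispatch` role is played by MeTTa scripts consuming the EEG classification API's output; the registry shape is what makes "easy expansion to new emotions" a one-function change.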

Open Source Licensing

Apache License

Open Source Components

  • Integration Framework: Codebase for integrating the emotion classification model with virtual companions using MeTTa scripting. Includes API integration scripts and MeTTa scripts for adaptive responses.
  • Empathetic Response System: Scripts for mapping emotional states (happiness, sadness, focus, stress) to corresponding AI behaviors.

Proprietary Components

  • Emotion Classification Model: EEG-based pre-trained algorithms and data processing pipelines.
  • Data Handling and Security Systems: Closed components for managing sensitive user data securely.

Proposal Video

Not Available Yet

Check back later during the Feedback & Selection period for the RFP that this proposal is applied to.

  • Total Milestones

    3

  • Total Budget

    $25,000 USD

  • Last Updated

    8 Dec 2024

Milestone 1 - MeTTa Framework Development

Description

This milestone focuses on the development of foundational MeTTa scripts to adapt AI behavior based on emotional states. We will create modular, reusable scripts that map emotions (happiness, sadness, focus, and stress) to AI responses. This phase will involve initial prototyping of the adaptive response system, ensuring its readiness for integration with EEG emotion detection systems.

  1. Design and develop MeTTa scripts for predefined emotional states.
  2. Prototype core functionality demonstrating AI response generation based on static emotional inputs.
  3. Create modular frameworks that allow future integration with dynamic data sources like EEG.
  4. Begin drafting documentation to describe script functionality and modularity for developers.

Deliverables

  1. Functional MeTTa scripts capable of simulating AI behavior for predefined emotional states.
  2. Modular framework for adaptable AI behavior.
  3. Draft documentation detailing the script architecture and use cases.

Budget

$6,000 USD

Success Criteria

  1. Developed MeTTa scripts simulate predefined emotional states (happiness, sadness, focus, stress) with appropriate AI responses.
  2. Core functionality is successfully prototyped, showcasing accurate AI response generation based on static emotional inputs.
  3. Initial draft documentation clearly describes the script architecture, modularity, and use cases, making it accessible to developers.

Milestone 2 - EEG Emotion API Integration

Description

This milestone involves integrating the existing EEG-based emotion classification model with the MeTTa framework. The primary goal is to enable real-time emotional data to dynamically drive the behavior of AI companions. Rigorous testing and validation will ensure seamless communication between the emotion model and MeTTa scripts, while the modular design will allow easy replication and customization.

  1. Develop a robust data pipeline connecting the EEG emotion classification API with MeTTa.
  2. Implement triggers in MeTTa scripts to adapt AI behavior dynamically based on real-time emotional inputs.
  3. Validate system performance through end-to-end testing of real-time emotional adaptations.
  4. Refine the integration framework based on test results, ensuring smooth interaction and minimal latency.

Deliverables

  1. Integrated data pipeline for real-time emotional state inputs.
  2. Validated system demonstrating real-time adaptability based on EEG data.
  3. Modular integration examples showcasing behavior triggering.
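As a hypothetical illustration of the "behavior triggering" deliverable (sketched in Python rather than MeTTa), one simple approach is to re-adapt only when the classified state actually changes, so noisy frame-by-frame predictions do not cause the companion to flip-flop; the class name and structure here are invented for illustration.

```python
# Hypothetical edge-triggered adapter: fire a behavior change only on
# transitions between classified emotional states, not on every frame.
class EmotionTrigger:
    def __init__(self):
        self._last = None  # no state observed yet

    def update(self, state: str) -> bool:
        """Return True when the emotional state changed since the last
        classification, i.e. when the companion should re-adapt."""
        changed = state != self._last
        self._last = state
        return changed

# Example: a stream of per-window classifications reduced to transitions.
trigger = EmotionTrigger()
stream = ["focus", "focus", "sadness", "sadness", "happiness"]
events = [s for s in stream if trigger.update(s)]
# events keeps only the transitions: focus, sadness, happiness
```

Filtering to transitions is one way to meet the "smooth interaction and minimal latency" goal of this milestone: the companion reacts promptly to genuine state changes while ignoring repeated identical classifications.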

Budget

$10,000 USD

Success Criteria

  1. MeTTa scripts dynamically adapt AI behavior based on real-time emotional state changes.
  2. The system successfully passes end-to-end testing, demonstrating smooth interaction, minimal latency, and accurate emotional response adaptation.
  3. Real-time adaptability to various EEG-detected emotional states is successfully demonstrated.

Milestone 3 - System Deployment and Open Source Release

Description

The final milestone focuses on deploying the complete emotion-adaptive AI system on a SingularityNET Hyperon instance. This includes final testing, demonstration, and the release of open-source components, ensuring the community can replicate and extend the project. Comprehensive documentation will provide step-by-step instructions for setup, customization, and integration into other applications.

  1. Deploy the fully functional AI system on Hyperon, including real-time emotional adaptation.
  2. Conduct final testing and validation to ensure robustness and scalability.
  3. Prepare open-source documentation with detailed guides, examples, and best practices.
  4. Release modular MeTTa scripts and integration frameworks in a public repository.
  5. Create demonstration scenarios showcasing the AI’s ability to respond empathetically to emotional states.

Deliverables

  1. Deployed AI companion system demonstrating real-time emotion adaptation.
  2. Open-source repository with clean, well-commented MeTTa scripts and modular integration examples.
  3. Comprehensive documentation covering setup, usage, and customization for developers.
  4. Demo scenarios to encourage adoption and replication.

Budget

$9,000 USD

Success Criteria

  1. The emotion-adaptive AI system is deployed on SingularityNET Hyperon, functioning reliably with real-time emotional adaptation.
  2. The public repository contains clean, well-documented, and modular MeTTa scripts along with integration frameworks.
  3. Demonstration scenarios effectively showcase the AI’s empathetic responses to emotional states, encouraging community adoption and replication.


Expert Ratings

Reviews & Ratings

Group Expert Rating (Final)

Overall

3.6

  • Compliance with RFP requirements 4.0
  • Solution details and team expertise 4.3
  • Value for money 3.3
  • Expert Review 1

    Overall

    4.0

    • Compliance with RFP requirements 5.0
    • Solution details and team expertise 4.0
    • Value for money 0.0
    Cool team and project

    Innovative proposal demonstrating MeTTa’s real-time adaptability with emotion-driven AI companions using EEG data. Highlights MeTTa’s modular scripting and dynamic behavior capabilities, aligning well with RFP goals. Leverages pre-existing EEG models for practical application in Sophiaverse and beyond. Clear milestones and deliverables, but reliance on proprietary components limits open-source impact. Team demonstrates credibility via prior DF4 grant work. Strong potential to showcase MeTTa’s utility in adaptive AI systems.

  • Expert Review 2

    Overall

    3.0

    • Compliance with RFP requirements 3.0
    • Solution details and team expertise 3.0
    • Value for money 0.0

  • Expert Review 3

    Overall

    4.0

    • Compliance with RFP requirements 4.0
    • Solution details and team expertise 3.0
    • Value for money 0.0
This would be a cool demo, though the way to achieve it would be slightly different from what the proposer seems to think

I think this would be doable using Metta-Motto as a sort of glue to connect together different NNs doing all the actual work, and using Atomspace as a sort of long-term and working memory. Done like this it would be a pretty wizzy Metta-Motto demo, and could be done with the EEG component as one optional input, so it could be played with by folks without that hardware as well.
