Content Knowledge Graph

Arslan Tariq
RFP Owner

  • Total Pool Funding: $120,000
  • Target Round: Round 4
  • Type: Community
  • Status: Submitted

Project name

Content Knowledge Graph

Short summary

The goal is to create a foundational KG that not only structures Deep Funding data for immediate needs but also sets the groundwork for scalable expansion to encompass the entire SingularityNET ecosystem.

Longer description

Context and Background: 

The SingularityNET (SNET) ecosystem generates a large and diverse range of data, including technical documentation, educational materials, community interactions, proposals, votes and more. Currently, this wealth of information is scattered and siloed, limiting its usability.  

To address this, the RFP aims to fund the initial iteration of a Knowledge Graph (KG), focusing on Deep Funding-related data.

The purpose of the KG is to structure the data with human-provided and automated metadata, create and expose relations, and keep track of the data's trustworthiness. The KG will serve as the key database enabling multiple data applications (e.g. UIs, web apps, bots) that access the data in a structured way and present it in a context-sensitive manner for a variety of use cases.

Collaboration:

This RFP will be followed by a subsequent RFP for applications that use the Knowledge Graph.

Multiple applications (teams working on applications with frontend/user stories) will be funded through this future, separate RFP. The winning teams (Knowledge Graph team from this RFP and Application teams from the RFP for KG Applications) are expected to collaborate. The details are described below under non-functional requirements.

Examples of suggested Applications for the Applications RFP:

  • Automated review of incoming proposals (on both drafts and published ones), with integration into SingularityNET voting portal.
  • Categorisation and filtering of proposals (tagging can use predefined categories or emerge bottom-up).
  • Proposal builder (AI helper, separate UI from voting platform): think about extra aspects and compare with other proposals.
  • General Q&A avatar on existing proposals and rules of DF round x.

RFP Expected Outcomes:
    1. Initial Knowledge Graph Deployment: A functional KG that represents all Deep Funding content, structured for scalability and future integration across the ecosystem.
    2. Data Ingestion: Ability to ingest data from the main portal via an API offered by DF (must have) as well as other sources (nice to have) and to store it in the KG. 
    3. Thorough Documentation: provide comprehensive documentation detailing the structure and functionality of the KG to allow other teams to work with and to work on the KG.
    4. Collaboration with Application projects: Coordinate with and integrate support for the teams awarded in the Data Applications RFP. This includes making the necessary APIs available for connecting data pipelines to the KG and searching the data in the KG.
    5. Maintenance & Hosting Plan: Accompany this with a maintenance plan to ensure ongoing support, hosting, and basic updates for 1 year (included in the maximum award).
    6. Integration with SingularityNET's AI platform as a 'knowledge node': A minimum of 10% of the budget is suggested to be reserved for this purpose.

Functional Requirements

Description of the functions and features expected from the solution.

Must Have:

Initial KG functionality: 

  • Search functionality: Ability for 3rd party applications to search the Knowledge Graph.
  • Data ingestion functionality: Ability for 3rd party applications to add data to the Knowledge Graph.
  • Data management functionality: Ability for admin to delete and/or edit data.
  • Tagging functionality: Improve the organization of the content by ensuring that the different entities are appropriately tagged and clustered with rich metadata.
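As an illustration of how the four must-have functions might fit together, here is a minimal in-memory sketch; the class and method names are assumptions, and a real deployment would back this with a graph database rather than a dict:

```python
import itertools
from typing import Any

class InMemoryKG:
    """Illustrative sketch of the four must-have functions (names assumed)."""

    def __init__(self) -> None:
        self._nodes: dict[str, dict[str, Any]] = {}
        self._ids = itertools.count(1)

    def ingest(self, data: dict[str, Any]) -> str:
        """Data ingestion: a 3rd-party app adds an object and gets back an id."""
        node_id = f"node:{next(self._ids)}"
        self._nodes[node_id] = {**data, "tags": []}
        return node_id

    def search(self, term: str) -> list[str]:
        """Search: return ids of nodes whose values mention the term."""
        return [nid for nid, node in self._nodes.items()
                if any(term in str(v) for v in node.values())]

    def tag(self, node_id: str, tags: list[str]) -> None:
        """Tagging: attach metadata tags for clustering and filtering."""
        self._nodes[node_id]["tags"].extend(tags)

    def delete(self, node_id: str) -> None:
        """Data management: admin-only removal of an object."""
        del self._nodes[node_id]
```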

Good to Have:

  • Interface for edge list data import.
  • Automated categorization system of data objects.
  • Permission management for data ingestion, management, and search functionalities. 

Nice to Have:

Community Collaboration Features

  • Tools and interfaces that allow for community-driven data contributions, corrections, and annotations, fostering an engaged and participatory ecosystem.
  • A feedback loop or rating system that allows users to report inaccuracies or suggest improvements to the KG content.

 

Technical Requirements

Specific technical needs or specifications that proposals must meet.

Must Have:

Initial Knowledge Graph Deployment:

  • Deploy a scalable KG infrastructure capable of representing Deep Funding content, including:
    • entities (data objects) such as documentation, proposers, proposals, milestones, voters, and votes;
    • labelled relationships between participants in the network.
  • The KG should be designed with future ecosystem-wide content integration in mind, allowing for scalability (compatible with nested and interlinked subgraphs) and the handling of a growing volume of data types and sources.
  • Reliable hosting solution for the KG for 1 year. A well-defined plan to host with a SNET partner such as Ocean or ICP would be a bonus.
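A minimal sketch of the entity/relationship model described above; the `kind` and `label` vocabularies are illustrative assumptions, since the actual ontology is to be co-defined with the Application teams:

```python
from dataclasses import dataclass, field
from typing import Any

@dataclass
class Entity:
    """One data object in the KG; `kind` values are illustrative."""
    id: str
    kind: str                      # e.g. "proposal", "proposer", "milestone", "vote"
    properties: dict[str, Any] = field(default_factory=dict)

@dataclass
class Relation:
    """A labelled, directed edge between two entities."""
    source: str                    # Entity id
    label: str                     # e.g. "submitted_by", "has_milestone", "voted_on"
    target: str                    # Entity id

# A proposal linked to its proposer:
alice = Entity(id="person:alice", kind="proposer")
proposal = Entity(id="proposal:42", kind="proposal",
                  properties={"title": "Content Knowledge Graph"})
edge = Relation(source="proposal:42", label="submitted_by", target="person:alice")
```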

Data Ingestion Process:

  • Ability to add data to, and delete data from, the KG in JSON-LD format, compatible with data from the Deep Funding voting portal and flexible enough to add new sources as required.
  • Include data cleaning mechanisms to ensure the accuracy and quality of the data ingested into the KG (to be tested with export data from the test round).
  • API offering search functionality over the data in the KG. Report on the subset of GraphQL (https://spec.graphql.org/October2021) covered.
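To make the ingestion and search requirements concrete, the sketch below shows what a JSON-LD node and a GraphQL search query might look like; the @context vocabulary, IDs, and field names are assumptions for illustration, not a confirmed DF schema:

```python
import json

# Hypothetical JSON-LD document for one proposal node.
proposal = {
    "@context": {
        "title": "https://schema.org/name",
        "submittedBy": "https://schema.org/creator",
    },
    "@id": "https://deepfunding.ai/proposal/42",
    "@type": "Proposal",
    "title": "Content Knowledge Graph",
    "submittedBy": {"@id": "https://deepfunding.ai/person/alice"},
}

# The kind of GraphQL query a 3rd-party application might send to the
# search API; the field names are assumptions, not a confirmed schema.
search_query = """
query {
  proposals(filter: {tag: "knowledge-graph"}) {
    id
    title
    submittedBy { name }
  }
}
"""

payload = json.dumps(proposal)   # what would travel over the ingestion API
```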

Good to Have:

  • Semantic and Social Network data structure to understand what has been said (Semantic) and who said what (Social Network).
  • Automated Data Pipelines for real-time data ingestion from a broader array of sources, ensuring data relevance by integrating live updates to the KG.
  • Data Enrichment Tools: Integrate NLP and ML tools for semantic analysis, automated categorization, tagging, and enrichment to uncover hidden relationships and enhance metadata.
  • Protected data and access management.

Nice to Have:

  • Metadata entry for reputation scores for each data object (trustworthiness).
  • Metadata entry for flagging (3rd-party application warnings on trustworthiness).
  • Auditable log of interactions with and modifications to the KG.
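These three nice-to-haves could be combined along the following lines; the field names and score scale are assumptions for illustration only:

```python
import datetime

def audit_entry(action: str, node_id: str, actor: str) -> dict:
    """One record for an append-only audit log; field names are hypothetical."""
    return {
        "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "action": action,            # e.g. "ingest", "edit", "delete"
        "node": node_id,
        "actor": actor,
    }

# Hypothetical per-object trust metadata: a reputation score plus
# flags raised by 3rd-party applications.
trust_meta = {
    "reputation": 0.85,              # aggregate trustworthiness score (0..1 assumed)
    "flags": [
        {"app": "review-bot", "reason": "unverified source"},
    ],
}

audit_log = [audit_entry("ingest", "proposal:42", "admin")]
```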

Non-Functional Requirements

Include aspects like documentation, maintenance, and compliance with standards.

Must Have:

Thorough Documentation:

  • Provide comprehensive documentation detailing the KG's structure, functionality, data models, and interaction interfaces (APIs/UI), enabling effective collaboration and development by other teams.
  • Include guidelines for data ingestion, querying practices, and interface customization.

Collaboration and Support:

  • Collaboration process with Application teams to co-define the ontology (must contain elementary words) and to ensure that the ontology design aligns with the requirements of the user interface. This may involve discussing how entities and relationships will be presented and navigated within the Front-End(s).
    • Understand what the data presentation needs and ensure the ingested data is structured and labelled appropriately to support front-end functionalities such as querying and visualization.
    • Availability for at least 6 meetings (1 hour each) with the Application teams, plus 8 hours for monitoring async communication channels during the Applications RFP submission period and afterwards with the selected teams, to answer questions, coordinate requirements, and provide support. Any needs exceeding this amount should be budgeted by the Application teams.

  • Maintenance plan for ongoing support and updates of the Knowledge Graph for 1 year.
    • Mattermost channels for customer support to Application teams (monitored weekly) and availability for at least 2 calls a month.
    • Fixing of any bugs or gaps in the technical functioning of the KG and APIs.
    • At least 30 hours of technical support available to expand the KG’s ontology and/or support the integration of new data sources (available to Application teams on a first-come, first-served basis). The hourly rate for these hours is to be presented in the proposal for the KG RFP; however, payment for these hours should not be budgeted in the KG proposal. It is to be budgeted in the Application RFP proposals and paid to the selected KG team by the selected Application team(s).

  • Integration with SingularityNET's AI platform as a 'knowledge node': 
    • A minimum of 10% of the budget is suggested to be reserved for this purpose.
    • The SingularityNET foundation team will provide guidance and support to the selected team.

Good to Have:

  • Training materials for users and admins.
  • Clear budget outlining the allocation for collaboration (coordination and support), build, and maintenance.

Nice to Have:

  • Agile methodologies to coordinate tasks and collaborate with Application teams.
  • Ontology that includes URIs, Images, Text Blocks, etc.

Available resources apart from funding

Our AI platform is quickly evolving, especially in the area of providing API-based 'knowledge nodes'. This is a bit of a moving target, but the internal SingularityNET team will be available to help with onboarding and expert advice where needed.

Estimated Complexity

70

Lead time

Maximum 6 months from kick-off (not counting the maintenance plan).

Can multiple proposals be awarded?

No

Max award per proposal

$120,000

Purpose; what problem should be solved by this RFP?

The SingularityNET (SNET) ecosystem generates a large and diverse range of data, including technical documentation, educational materials, community interactions, proposals, votes and more. Currently, this wealth of information is scattered and siloed, limiting its usability.  

To address this, the RFP aims to fund the initial iteration of a Knowledge Graph (KG), focusing on Deep Funding-related data.

The purpose of the KG is to structure the data with human-provided and automated metadata, create and expose relations, and keep track of the data's trustworthiness. The KG will serve as the key database enabling multiple data applications (e.g. UIs, web apps, bots) that access the data in a structured way and present it in a context-sensitive manner for a variety of use cases.

Description of main assessment criteria

Proposals will be evaluated on the following criteria:

  1. Alignment with requirements and objectives: does the proposal meet the requirements and advance the objectives of the RFP?
  2. Pre-existing infrastructure & risk mitigation: is the team starting from zero, or has it already built part or all of the proposal, and are other risk mitigation factors at play?
  3. Team Competence: does the team have relevant experience?
  4. Cost: does the proposal offer good value for money?
