NEURAL SEARCH

Luke Mahoney (MLabs)
Project Owner


Funding Awarded

$150,000 USD

Expert Review
0
Community
4 (13)

Status

  • Overall Status

    🛠️ In Progress

  • Funding Transferred

    $0 USD

  • Max Funding Amount

    $150,000 USD

Funding Schedule

View Milestones
Milestone Release 1
$37,500 USD Pending TBD
Milestone Release 2
$40,000 USD Pending TBD
Milestone Release 3
$37,500 USD Pending TBD
Milestone Release 4
$35,000 USD Pending TBD

Project AI Services

No Service Available

Overview

Neural Search is a new way to incorporate Monte Carlo Tree Search into neural networks. Unlike similar techniques, Neural Search builds the exploration and optimization of the search into the layers of the network. The search itself is learnable, and provides us with a means to develop guided search algorithms trained on data. We have demonstrated that the algorithm can learn robust solutions to search problems. We propose further development of the approach for large-scale problems, and to demonstrate the effectiveness of the algorithm by training it to play Go, a standard achieved by AlphaGo.

The total for the project is $150K over 11 months, and it will be resourced by three senior MLabs members.

Proposal Description

How Our Project Will Contribute To The Growth Of The Decentralized AI Platform

We will provide a novel and efficient technique to create a performant Go engine for users of the SNET platform to play against.

Our Team

Our team excels in AI innovation and research, leveraging decades of combined experience in deep learning, generative AI, and reinforcement learning. Led by industry veterans, we have a proven track record in developing robust algorithms for real-world applications. Furthermore, our team members bring a wide variety of deep expertise, enabling us to deliver innovative and impactful solutions in a cost-effective manner.

View Team

AI services (New or Existing)

Go using Neural Search

Type

New AI service

Purpose

To offer players of Go an SNET service.

AI inputs

User input.

AI outputs

Go board moves.

Company Name (if applicable)

MLabs LTD

The core problem we are aiming to solve

Search dominated AI before the Deep Learning revolution. But current Deep Learning architectures cannot incorporate search as an intrinsic part of their models.

Our specific solution to this problem

Search is still used as a driver in many Deep Learning applications, such as Natural Language Processing, to evaluate multiple options for sentence completions, or to drive Reinforcement Learning models by running trained deep learning models multiple times and sampling from their action space. This is the approach used to generate playouts in AlphaZero and similar systems for playing strategic games.

In these applications the model relies on a quite separate search process: search has not been trainable, nor has it been introduced as a component in neural network architectures. The search space, the transition functions, and (to some extent) the fitness functions need to be designed extrinsically and are applied at the top level of deep learning architectures. This usually requires running the full forward pass through the network as many times as there are branches in the search.
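For concreteness, a minimal sketch of this conventional pattern (illustrative only; model and expand stand in for a trained network and a hand-designed transition function, and are not part of our design):

def external_beam_search(model, expand, x, beam_width, max_depth):
    # Conventional pattern: the search loop lives outside the network, so the
    # trained model must be evaluated once for every candidate branch.
    candidates = [x]
    for _ in range(max_depth):
        scored = []
        for state in candidates:
            for next_state in expand(state):          # hand-designed transition
                value = model(next_state)             # full forward pass per branch
                scored.append((value, next_state))
        scored.sort(key=lambda pair: float(pair[0]), reverse=True)
        candidates = [state for _, state in scored[:beam_width]]
    return candidates[0]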

Project details

Neural Search Innovation

We have developed a new deep learning component called Neural Search, capable of learning to search in the latent space of a neural network. It can discover a useful transition function between states, and a fitness function to evaluate the best state in the search.

Having learnable search inside the latent space of a neural network has multiple benefits:

  1. The neural network can evaluate multiple branches of computation and choose the optimum one, without running the full forward pass multiple times

  2. The neural network can use fewer parameters: instead of having to encode all possible branches in its parameters, it can learn a transition function and a corresponding fitness function, and then compute possible sequences of states on the fly

  3. The neural network can reason: it can learn to generate options and evaluate them

The Neural Search component is trainable via gradient descent and can be plugged into any deep learning architecture as a function between layer vectors of the same size (latent states).

Outline of Architecture

We assume a generic encoder-decoder architecture, consisting of the following steps:

h = Encoder(x)
h' = NeuralSearch(h)
y = Decoder(h')

Neural Search is defined via three components (a code sketch of all three follows after this list):

  1. Transition Function: a neural network taking a state as input and generating a parametrized probability distribution over the next states. For example: it could return the element-wise mean and standard deviation of a Gaussian distribution, the element-wise lower and upper bounds of a uniform distribution, or the probabilities of a multinomial distribution

  2. Sampling Function: a function that takes as input the parameters of a distribution over states and generates a fixed number of sampled states

  3. Fitness Function: a function that takes a state and computes a scalar value defining the fitness of the state. Higher values correspond to higher fitness
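For illustration, one possible realization of these three components in PyTorch-style code. The Gaussian parametrization and layer sizes are assumptions chosen for exposition, not the final design:

import torch
import torch.nn as nn

class GaussianTransition(nn.Module):
    # Transition Function: maps states to the parameters of a distribution over
    # next states (here an element-wise Gaussian, one of the options above).
    def __init__(self, state_dim):
        super().__init__()
        self.net = nn.Linear(state_dim, 2 * state_dim)

    def forward(self, states):                        # (n_states, state_dim)
        mean, log_std = self.net(states).chunk(2, dim=-1)
        return mean, log_std.exp()

def gaussian_sampling(dist_params, n_samples):
    # Sampling Function: draws a fixed number of next states from the distribution.
    mean, std = dist_params
    return torch.distributions.Normal(mean, std).sample((n_samples,))

class Fitness(nn.Module):
    # Fitness Function: scores each state with a scalar; higher is fitter.
    def __init__(self, state_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(state_dim, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, states):
        return self.net(states).squeeze(-1)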

Outline of Algorithm

Neural Search executes a randomized beam search over the state space by generating at each step a number of candidates and choosing/sampling the best states to retain based on the fitness function.

During training, we sample from the generated next_states using their fitness scores. A temperature hyperparameter is used to control how uniform or greedy the choices for the candidates are.

Finally, when the maximum search depth is reached, the fitness function is used to compute an attention score for each state in the beam, and the module returns a convex combination of them.

class NeuralSearch:

    # Learned components
    transition_function: callable   # state(s) -> parameters of a distribution over next states
    sampling_function: callable     # (distribution parameters, n) -> n sampled next states
    fitness_function: callable      # state(s) -> scalar fitness per state (higher is better)

    # Search hyperparameters
    beam_width: int                 # states retained after each step
    max_width: int                  # candidate states sampled at each step
    max_depth: int                  # number of search steps

    temperature: float = 1.0        # how uniform or greedy training-time sampling is
    training: bool = True           # stochastic selection during training, greedy at inference

    def forward(self, x):
        return self.search(x)

    def search(self, x):
        candidates = [x]
        for _ in range(self.max_depth):
            next_dist = self.transition_function(candidates)
            next_states = self.sampling_function(next_dist, self.max_width)
            next_states_fitness = self.fitness_function(next_states)

            if self.training:
                # Stochastic beam: sample states in proportion to their
                # temperature-scaled fitness scores
                candidates = sample_categorical(
                    next_states,
                    softmax(next_states_fitness / self.temperature),
                    n_samples=self.beam_width,
                )
            else:
                # Greedy beam: keep the beam_width fittest states
                candidates = topk(
                    next_states,
                    next_states_fitness,
                    k=self.beam_width,
                )

        # Attention over the final beam: fitness-weighted convex combination
        return softmax(self.fitness_function(candidates)) @ candidates
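The helpers sample_categorical, topk and softmax are left abstract above. Under the assumption that the beam is held as a single tensor of shape (n_states, state_dim) with one fitness score per state, a minimal PyTorch-style reading is:

import torch

def softmax(scores):
    return torch.softmax(scores, dim=-1)

def sample_categorical(states, probs, n_samples):
    # Training-time selection: stochastic beam drawn in proportion to fitness
    idx = torch.multinomial(probs, n_samples, replacement=True)
    return states[idx]

def topk(states, fitness, k):
    # Inference-time selection: keep the k fittest states
    idx = torch.topk(fitness, k).indices
    return states[idx]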

 

Deterministic and Stochastic Transitions

We have thus far used a transition function which defines a probability distribution over the following states, and mandates the use of sampling to get those next states. This turns the transition and sampling functions into a one-to-many mapping.

We have selected this approach as opposed to the alternative formulation of using a one-to-many network directly as the transition function. This is because a one-to-many network is only feasible for a small number of following states, and requires many more parameters to solve the same task.
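As a contrast sketch (an illustrative assumption, not part of our design), a direct one-to-many transition must emit all of its successor states at once, so its parameter count grows with the branching factor and that branching factor is frozen at training time:

import torch.nn as nn

class OneToManyTransition(nn.Module):
    # Rejected alternative: emits n_next successors directly, so the output layer
    # grows linearly with n_next and n_next cannot be changed after training.
    def __init__(self, state_dim, n_next):
        super().__init__()
        self.state_dim, self.n_next = state_dim, n_next
        self.net = nn.Linear(state_dim, n_next * state_dim)

    def forward(self, state):                          # (batch, state_dim)
        return self.net(state).view(-1, self.n_next, self.state_dim)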

A further benefit of sampling is that we can dynamically change the number of following states without affecting the network architecture, and possibly without the need for retraining, since all the samples are independently drawn from a single probability distribution. This can allow us to increase the amount of search that the network does for a specific sub-task. It therefore has the ability to quickly race through the obvious parts of a problem, and slow down to think harder about the difficult parts - all under internal network control.

The one key drawback of a randomized transition function is that the whole network is non-deterministic. We have not found this to be a problem thus far; indeed, other neural network algorithms with randomized steps, such as Variational AutoEncoders, seem to work well, even with constraints on the distribution parameters.
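If gradients need to flow through the random draw itself, one standard option (an assumption on our part, in the same spirit as the VAE comparison above) is the reparameterization trick:

import torch

def reparameterized_sampling(dist_params, n_samples):
    # Express the sample as a deterministic function of the distribution
    # parameters plus independent noise, so mean and std receive gradients.
    mean, std = dist_params
    eps = torch.randn(n_samples, *mean.shape)
    return mean + std * eps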

Our Feasibility Analysis

We have trained our Neural Search architecture on a number of small scale problems to establish proof of concept. We are satisfied that the innovation can successfully learn the components of a search module within a neural network. To evaluate feasibility we have trained Neural Search networks to perform both classification and puzzle solving tasks, in addition to improving the inference efficiency of GPT models.

In the classification applications:

  • Since beam search applies the same function at each step, the final path from input to output was very similar to a recurrent neural network. Stabilization measures were needed to successfully train the model. We found that adding Layer Normalization at each step appropriately controlled the process (a minimal sketch follows after this list)

  • The broader the width of the search, the better the classification accuracy. This meant that the neural search was not collapsing to a single deterministic path, and the more exploration it performed the better the final results, for the same number of parameters and the same search depth
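For illustration, a minimal sketch of this stabilization (our own illustrative reading in PyTorch style, not the code used in the experiments):

import torch.nn as nn

class NormalizedSearchStep(nn.Module):
    # Because the same transition is applied at every search step, the unrolled
    # computation behaves like an RNN; normalizing the candidate states after
    # each step keeps their scale under control during training.
    def __init__(self, transition_function, sampling_function, state_dim):
        super().__init__()
        self.transition_function = transition_function
        self.sampling_function = sampling_function
        self.norm = nn.LayerNorm(state_dim)

    def forward(self, candidates, max_width):
        next_dist = self.transition_function(candidates)
        next_states = self.sampling_function(next_dist, max_width)
        return self.norm(next_states)   # Layer Normalization after every step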

In the maze solving tasks:

  • We used Reinforcement Learning to provide feedback of successful strategies, and were able to learn the transition and fitness functions without the need for labeled training data

  • Neural Search was able to solve mazes using fewer parameters than a dense MLP trained on the same task. For larger problems this more parsimonious architecture has profound implications for generalization and inference efficiency

For GPT models:

  • Adding a search module to each transformer block allowed us to reduce the total number of transformer blocks from 6 to 4 (with 25% fewer parameters), while preserving the same accuracy on the validation set. Training time was unaffected (an illustrative wiring sketch follows after this list)

  • Using Neural Search as part of the transformer model deterred overfitting, and opened up the capability for a far richer set of sequence selection behaviors. Learning such behaviors could allow GPT-based natural language generation to be more finely controlled
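For illustration only, one place a search module could sit inside a transformer block. The exact placement used in our experiments is not reproduced here; d_model, n_heads and the residual wiring are assumptions for exposition:

import torch.nn as nn

class SearchTransformerBlock(nn.Module):
    # Hypothetical wiring: the search module maps the block's latent states to
    # latent states of the same size, between attention and the feed-forward MLP.
    def __init__(self, d_model, n_heads, neural_search):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.search = neural_search
        self.norm2 = nn.LayerNorm(d_model)
        self.mlp = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                                 nn.Linear(4 * d_model, d_model))

    def forward(self, h):
        a, _ = self.attn(h, h, h)
        h = self.norm1(h + a)
        h = h + self.search(h)           # search over the block's latent space
        return self.norm2(h + self.mlp(h))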

We are satisfied that Neural Search provides a robust algorithmic approach to such problems, and will scale to applications with search trees which are infeasible to solve using traditional approaches and deterministic search algorithms. Furthermore, we have clear evidence that Neural Search can solve such large scale problems more efficiently than architectures which use Deep Learning networks in conjunction with bespoke search modules (e.g. Monte Carlo Tree Search).

 

Competition and USPs

We have composed a feasibility analysis (see above) and researched similar applications in the SNET marketplace. Given the history of AI applications and consumers on SNET, we believe our application will be successful on surface metrics such as API calls, while representing a novel technical approach.

Links and references

Our Rigel project from DFR3, currently on track with 2 milestones already delivered.

https://proposals.deepfunding.ai/graduated/accepted/e5b9301b-f1e3-4198-b6f3-66ee3484b311?id=weaknesses&commentId=1694189367893

Our Website

https://mlabs.city/

Proposal Video

DF Spotlight Day - DFR4 - MLabs - Neural Search

7 June 2024
  • Total Milestones

    4

  • Total Budget

    $150,000 USD

  • Last Updated

    7 Aug 2024

Milestone 1 - API Calls & Hostings

Status
😐 Not Started
Description

This milestone represents the required reservation of 25% of your total requested budget for API calls or hosting costs. Because it is required we have prefilled it for you and it cannot be removed or adapted.

Deliverables

You can use this amount for payment of API calls on our platform. Use it to call other services, or use it as a marketing instrument to have other parties try out your service. Alternatively you can use it to pay for hosting and computing costs.

Budget

$37,500 USD

Link URL

Milestone 2 - Baseline System Development

Status
🧐 In Progress
Description

This work task comprises the development of the baseline reinforcement learning loop for the neural search algorithm and the AlphaGo-style model we will compare our results with. Risk is low - we have already tested the basic approach to reinforcement learning but need to implement a full-scale version which is tightly integrated with the neural search. We will train a neural network to play Go using the AlphaGo architecture.

Deliverables

Tested implementation of neural search with reinforcement learning, and a working version of an AlphaGo-style neural network.

Budget

$40,000 USD

Link URL

Milestone 3 - Implement Neural Search Basic Model

Status
😐 Not Started
Description

This work task comprises the development of the Neural Search model to play Go using the reinforcement learning framework developed in the previous work package. The algorithm will be used to train a number of neural networks with different architectures. The sensitivity to hyperparameters will be evaluated, and a systematic approach to design established.

Deliverables

Tested implementation of neural search for playing Go.

Budget

$37,500 USD

Link URL

Milestone 4 - Implement Neural Search Advanced Model

Status
😐 Not Started
Description

This work task comprises the further development of the Neural Search model to play Go. Lessons learned from the previous work package will inform changes to the underlying neural search algorithm and how it is architected into the solution. The setting of hyperparameters will be further investigated.

Deliverables

Tested implementation of advanced version of neural search for playing Go.

Budget

$35,000 USD

Link URL

Join the Discussion (0)

Reviews & Rating

New reviews and ratings are disabled for Awarded Projects


13 ratings
  • 0
    HenriqC
    Jun 9, 2024 | 1:10 PM

    Overall

    5

    • Feasibility 4
    • Viability 5
    • Desirability 5
    • Usefulness 5
    Very promising novel architectural solution

    Feasibility

    Technically a relatively complex project. It is very useful that the proposal draws the picture of the architecture against the past and current context. The differences and improvements to related solutions are logically explained. Proposal itself provides comprehensive information and sufficient details to get a grasp of what is meant to be done. It is a great foundation that the basics of the solution are already prototyped and there is some evidence on the potential performance. The building blocks of the system are well known and solid. Of course, the ultimate capability becomes obvious only after the implementation.

     

    Viability

    MLabs has implemented really complex projects and earned a solid reputation in the space. The proposal gives sufficient information about time expectations, human resources allocated to the project and financial sustainability. The company itself is well resourced from the perspectives of both mental and financial capital. The proposing entity has not yet (as far as I know) launched services on the marketplace and even though there is no reason to expect any difficulties it is good to note that many new projects have faced delays due to technical integration challenges. All things considered, with this team the project is very viable.

     

    Desirability

    Many long-existing AI algorithms have a lot of potential, arising not only from new hardware but also from new insightful combinations such as presented in this proposal. Go has become one of the kinds of industry standards to demonstrate the performance of different AI architectures and implementations. It gives some very objective performance metrics and grabs attention of industry participants. According to my knowledge, this is a novel solution bringing something new to the table by using sound and tested building blocks.

     

    Usefulness

    This is a way to unite powerful methods that have previously been used more or less independently. Potential real world applications beyond Go may be plenty. It would also be great for the SingularityNET marketplace to introduce this design solution to the industry. In the best case scenario, the successful service will spawn several new solutions based on the design choices pioneered in this project.

     

    Luke Mahoney (MLabs)
    Jun 10, 2024 | 12:25 PM
    Project Owner

    Thank you for your positive review

  • 0
    Devbasrahtop
    May 24, 2024 | 2:32 AM

    Overall

    5

    • Feasibility 5
    • Viability 4
    • Desirability 5
    • Usefulness 5
    Neural Search for AI

    Overall 

    By directly integrating search algorithms into neural network designs, MLabs LTD's Neural Search solution solves a major gap in the present AI landscape and is both unique and technically sound. The project is a great contender for funding because it is ambitious and supported by a highly skilled crew. Nevertheless, a more thorough examination of competitors and risk mitigation techniques might be beneficial for the project.

    Feasibility 

    Technically, the project is doable and has defined deliverables and milestones. A full grasp of the problems and potential solutions is demonstrated by the in-depth description of the Neural Search architecture and its possible uses. Then, if the plan included more precise information about potential risks and how to mitigate them, it would be stronger.

    Viability

    The viability of the project is high given the extensive experience of the team in AI and machine learning. The budget and timeline seem reasonable for the scope of work. More detailed information on the market potential and adoption strategies would enhance this rating.

    Desirability

    The desirability of the project is very high, as it promises to bring a novel approach to AI by embedding search capabilities within neural networks. This can significantly enhance the performance and efficiency of AI models, making it a valuable addition to the SingularityNET ecosystem.

    Usefulness

    The project has great potential, especially for applications like strategic planning and gaming that call for intricate decision-making and optimization. The suggested Go engine will provide an engaging example of the technology's potential.

    Luke Mahoney (MLabs)
    May 27, 2024 | 7:21 AM
    Project Owner

    Devbasrahtop, thank you for the thoughtful review.

  • 0
    Ayo OluAyoola
    May 25, 2024 | 3:31 PM

    Overall

    5

    • Feasibility 4
    • Viability 4
    • Desirability 5
    • Usefulness 5
    AI for Making Searches Smarter

    Usefulness
    This project offers a new and improved way to combine searching techniques with neural networks. It can help AI systems make better decisions more quickly, which is useful for various applications like playing complex games (e.g., Go), processing natural language, and solving puzzles. By making search an integrated part of the neural network, it streamlines processes and can significantly enhance AI performance.

    Desirability
    The ability to create smarter, more efficient AI systems is highly desirable. This project promises to make AI systems more capable and versatile, which is appealing to both developers and users. The potential to improve AI performance in popular applications, like strategic gaming and language processing, makes it an attractive prospect in the AI community.

    Viability
    The project appears viable, given the expertise of the team and the solid proof-of-concept results. They have already demonstrated the effectiveness of Neural Search on small-scale problems, which is a positive indicator. However, there is always some uncertainty when scaling up to larger, more complex problems. The project's success will depend on continued development and testing.

    Feasibility
    The team has outlined a clear plan and has the necessary skills and experience. They have considered potential challenges, like the need for training stabilisation measures, and proposed solutions. The feasibility is rated slightly lower due to the inherent risks of developing and scaling new AI technologies, but the groundwork and initial successes are obvious

    Summary
    The Neural Search project is promising, with significant potential to advance AI capabilities. It addresses a clear need in the field and offers a novel solution that could lead to more efficient and intelligent AI systems. The project is well-planned and backed by a capable team, making it a strong candidate for success.

    Luke Mahoney (MLabs)
    May 27, 2024 | 7:23 AM
    Project Owner

    Thank you for considering our project, Ayo.

  • 0
    Tu Nguyen
    May 22, 2024 | 9:40 AM

    Overall

    5

    • Feasibility 5
    • Viability 5
    • Desirability 3
    • Usefulness 5
    NEURAL SEARCH

    MLabs is a well-known team in the community. The problem they solve is that current Deep Learning architectures cannot incorporate search as an intrinsic part of their models. They have developed a new deep learning component called Neural Search capable of learning to search in the latent space of a neural network. The information they give is quite detailed.
    They should determine the start and end times of important milestones, which would make the proposal more detailed. They should also allocate more detailed budgets per milestone and define detailed indicators to measure outputs.

    Luke Mahoney (MLabs)
    May 27, 2024 | 7:12 AM
    Project Owner

    Thanks for the review, Tu.

  • 0
    ZeroTwo
    Jun 9, 2024 | 11:28 AM

    Overall

    4

    • Feasibility 4
    • Viability 4
    • Desirability 3
    • Usefulness 4
    Neural Search

    This proposal has a new approach called Neural Search which is an important technological breakthrough in incorporating Monte Carlo Tree Search into neural networks, promising to improve performance in applications that require complex searches. In addition, the team proposing this project has extensive experience in the field of innovation and artificial intelligence research, giving them confidence that they can produce innovative and useful solutions. Promising initial trial results also provide evidence that this approach has the potential to be an effective solution in a variety of contexts.

    Testing conducted primarily on small problems limits our understanding of how this approach will perform at larger scale. Additionally, implementing Neural Search requires deep technical understanding and careful parameter tuning. It is worth conducting large-scale testing, developing the tools and guidelines needed for implementation, and carrying out further research on the use of manually controlled parameters.

    Overall, this proposal promises to be a significant contributor in the development of decentralized AI platforms. By addressing the challenges faced through appropriate solutions, this proposal has the potential to pave the way for new solutions in the development of search algorithms integrated with neural networks.

  • 0
    Max1524
    Jun 8, 2024 | 8:47 AM

    Overall

    4

    • Feasibility 4
    • Viability 4
    • Desirability 4
    • Usefulness 4
    Technical analysis is more clear in the milestone

    Budget and costs are distributed relatively evenly across 4 milestones. This makes it quite difficult for me to choose the key milestone as the quintessence of this proposal. Perhaps milestones 3 and 4 have the most technology integration. In fact, every milestone is important, but the prominence of milestones 3 and 4 is clearly visible. I want the team to explain more about the technology aspect of these two milestones.

  • 0
    Joseph Gastoni
    May 21, 2024 | 3:40 AM

    Overall

    4

    • Feasibility 4
    • Viability 4
    • Desirability 4
    • Usefulness 5
    an ideation phase for a conversational AI system

    This proposal focuses on an ideation phase for a conversational AI system to manage complex projects in sustainable communities. Here's an analysis of its viability for securing funding for a full proposal:

    Strengths:

    • Real-world testing environment: Testing the solution on a working farm (Tyddyn Teg) offers valuable data and user feedback.
    • Focus on a specific need: Sustainable communities face unique challenges in project management, making the solution relevant.
    • Experienced team: A team with diverse expertise in AI, agriculture, and user interfaces strengthens the proposal.

    Weaknesses:

    • Uncertainty of feasible solution: The success hinges on identifying a practical solution within resource constraints during the ideation phase.
    • Long-term plan unclear: The proposal lacks details on how to maintain, scale, and ensure data security beyond the initial project.
    • API integration details missing: The specifics of integrating the conversational AI with SingularityNET's platform need clarification.

    Recommendations for Strengthening Viability:

    • Develop a clear minimum viable product (MVP) for the ideation phase. Focus on core functionalities that can be realistically achieved with available resources.
    • Outline a plan for data collection, storage, and security during the voyage. Address potential privacy concerns in a community setting.
    • Specify how the developed APIs will integrate with SingularityNET. Explain the value proposition for the platform and its users.
    • Define success metrics for the ideation phase. Outline clear criteria for determining the feasibility and impact of the proposed solution.

    By addressing these weaknesses and incorporating these recommendations, the proposal can increase its chances of securing funding for the full project.

    Additional Considerations:

    • The proposal could benefit from emphasizing the replicability and scalability of the solution beyond the specific farm environment.
    • Highlighting the potential cost-savings and efficiency gains for sustainable communities can strengthen the proposal's appeal.

  • 0
    Nicolad2008
    Jun 7, 2024 | 6:41 AM

    Overall

    4

    • Feasibility 4
    • Viability 4
    • Desirability 4
    • Usefulness 4
    Neural Search: potential and reality

    The project aims to build the exploration and optimization of search directly into the neural network layers, which opens a new approach in the field of artificial intelligence. However, the project faces great challenges in applying this method to large-scale problems and in reaching the standard that AlphaGo has established in this field. Training the algorithm on large amounts of data with complex computation remains a significant challenge. Therefore, although the development potential is very large, the project needs to demonstrate more efficiency and broader applicability to truly succeed.

    Luke Mahoney (MLabs)
    Jun 10, 2024 | 12:22 PM
    Project Owner

    Thank you

  • 1
    Nicolas Rodriguez
    May 22, 2024 | 8:28 PM

    Overall

    4

    • Feasibility 5
    • Viability 4
    • Desirability 5
    • Usefulness 5
    A Paradigm Shift in AI Search Capabilities

    Hi Luke, interesting Git.

    I'll give you a detailed review of the project

    Feasibility

    Your project demonstrates exceptional feasibility. The comprehensive feasibility analysis you provided, along with the successful proof of concept in classification, puzzle solving, and GPT model enhancement, strongly supports the technical viability of your approach. The clear outline of your architecture and algorithm further reinforces your preparedness to execute this project effectively.

    Viability

    The viability of your project is high. The potential applications of Neural Search in various domains, such as natural language processing and reinforcement learning, indicate a significant market opportunity. While the market for Go-playing AI might be niche, the broader implications of your technology could attract considerable interest from researchers and industry professionals alike.

    Desirability

    Your project is highly desirable. The ability to integrate search directly into neural network architectures is a groundbreaking concept with the potential to revolutionize how AI models are designed and trained. Your approach addresses the limitations of current deep learning architectures by offering a more efficient and elegant way to incorporate search capabilities.

    Usefulness

    The usefulness of your project is undeniable. By enabling neural networks to learn and optimize search processes internally, you're opening up new possibilities for developing more intelligent and adaptable AI systems. The potential for Neural Search to improve performance, reduce computational costs, and enhance reasoning capabilities across various applications makes it a highly valuable tool for the AI community.

    In summary, your Neural Search project is exceptionally feasible, highly viable, extremely desirable, and incredibly useful. It has the potential to make a significant contribution to the field of AI and drive innovation in numerous domains. I wholeheartedly encourage you to pursue this project, as it represents a significant leap forward in AI research and development.

    I am the designer of Transcendence Platform. We made a proposal for AI-enabled holograms, aiming to create a holographic virtual assistant for various applications, such as customer care, emotional support, advertising, and entertainment. By combining advanced AI with holographic technology, we enable natural and empathetic interactions that enhance the user experience. I invite you to watch it and give me your opinion. Thank you very much. Check it out here: https://deepfunding.ai/proposal/revolutionizing-assistance-3d-holographic-ai/

    Luke Mahoney (MLabs)
    May 27, 2024 | 7:14 AM
    Project Owner

    Nicolas, thank you for your thoughtful and positive review. Will check out your proposal. Best.

  • 0
    digiRuds
    May 29, 2024 | 9:59 PM

    Overall

    4

    • Feasibility 4
    • Viability 4
    • Desirability 4
    • Usefulness 4
    The Future of AI Search

    I think it is feasible, but the team needs to watch out for and fine-tune the Neural Search components, handle complex search spaces, and optimize computational resources for large-scale problems, making sure these do not affect the project.
    The team seems to be qualified for this.

    Luke Mahoney (MLabs)
    Jun 10, 2024 | 12:21 PM
    Project Owner

    Thank you for the feedback.

  • 0
    ThanhTrixie
    Jun 9, 2024 | 1:28 PM

    Overall

    3

    • Feasibility 3
    • Viability 3
    • Desirability 3
    • Usefulness 4
    Names, roles and responsibilities of each member

    I want to know clearly the specific division of tasks for each member, but I cannot see that. This is also a relatively important factor in feasibility. In addition, I was a bit disappointed when I saw the team members' profiles; they do not show anything worth talking about.

    Luke Mahoney (MLabs)
    Jun 10, 2024 | 1:40 PM
    Project Owner

    TrucTrixie, thanks for the review.

    Ibrahim Abdelghany and Mark Bedworth worked together on the early conceptual stages and will specify and finalize the design with input from Mark Florisson. Ibrahim will lead the implementation and will be assisted by a new team member, Banashree Sarma. Banashree is an AI researcher currently pursuing her PhD at the Indian Institute of Technology.

  • 0
    BlackCoffee
    Jun 10, 2024 | 12:21 AM

    Overall

    3

    • Feasibility 3
    • Viability 3
    • Desirability 3
    • Usefulness 3
    Currently, member identities are not transparent

    I know MLabs and what they have achieved. Obviously, MLabs has a certain reputation in the community, which contributes a significant plus point for feasibility. But that does not mean the team should present the members' names so lightly. This sketchiness makes it impossible for newcomers to fully appreciate the team's transparency (no specific record names any member; one account is pending).

    Luke Mahoney (MLabs)
    Jun 10, 2024 | 12:48 PM
    Project Owner

    Mark Florisson is a well-known Python contributor who has made significant contributions to high-performance, compiler-related libraries such as Numba and PyPy.

    Mark Florisson on GitHub

    As mentioned on our team page, Ibrahim Abdelghany has over 10 years of industry and research experience.

    Ibrahim Abdelghany on GitHub

    Mark Bedworth has over 40 years of experience, but unfortunately not much of a public profile.

  • 0
    pindiyaa
    May 28, 2024 | 5:41 AM

    Overall

    2

    • Feasibility 4
    • Viability 2
    • Desirability 2
    • Usefulness 3
    MLABS Neural Search from alphago

    I think a similar project has already been done by AlphaGo; the milestone has already been achieved by AlphaGo, so why do you want to create this one since it was already done and available? That is my question. However, the general feasibility, viability and usefulness are good, and I don't think it needs to be redone.
    Good luck with the voting.

    Luke Mahoney (MLabs)
    May 29, 2024 | 2:46 PM
    Project Owner

    Thank you for your comment. This project is meant to highlight the capability of neural search compared with state-of-the-art algorithms. Alpha-Go is one such algorithm that we will compete with to show the benefit of neural search in reasoning tasks.

    The state-of-the-art is to use neural networks to evaluate branches and to use an external tree search algorithm to explore possible paths. With the tree search on the outside, the neural network needs to be run once for every path being considered. Neural search internalises the tree search so that the network can learn the tree, the fitness of particular moves, and heuristics for selecting promising strategies for exploration. It does so in a single forward pass.

    Neural search is a novel addition to the fundamentals of neural networks which improves training capability, inference speed and quality of search. It applies to a very broad range of problems. We have chosen to pit it against AlphaGo to demonstrate its ability to finesse an existing benchmark which is considered to be at the pinnacle of what can be achieved with neural network-driven search.

Summary

Overall Community

4

from 13 reviews
  • 5 stars: 4
  • 4 stars: 6
  • 3 stars: 2
  • 2 stars: 1
  • 1 star: 0

Feasibility

4.1

from 13 reviews

Viability

3.8

from 13 reviews

Desirability

3.8

from 13 reviews

Usefulness

4.3

from 13 reviews

Get Involved

Contribute your talents by joining your dream team and project. Visit the job board at Freelance DAO for opportunities today!

View Job Board