louisbrulenaudet posted an update May 27
I've just open-sourced RAGoon, a small utility I use to integrate knowledge from the web into LLM inference, combining Groq's inference speed with plain Google Search results ⚡

RAGoon is a Python library, available on PyPI, that aims to improve the performance of language models by providing contextually relevant information through retrieval-based querying, parallel web scraping, and data augmentation. It integrates several APIs (OpenAI, Groq), letting users retrieve information from the web, enrich it with domain-specific knowledge, and feed it to language models for more informed responses.
from groq import Groq
# from openai import OpenAI
from ragoon import RAGoon

# Initialize RAGoon instance
ragoon = RAGoon(
    google_api_key="your_google_api_key",
    google_cx="your_google_cx",
    completion_client=Groq(api_key="your_groq_api_key")
)

# Search and get results
query = "I want to do a left join in python polars"
results = ragoon.search(
    query=query,
    completion_model="llama3-70b-8192",
)

# Print list of results
print(results)
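To give an idea of what the "parallel web scraping" step involves, here is a minimal, hypothetical sketch using Python's standard library; fetch_page is a stand-in for a real HTTP fetch plus text extraction, not RAGoon's actual implementation.

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_page(url: str) -> str:
    # Placeholder: a real version would issue an HTTP GET
    # and extract the page's text content.
    return f"<text of {url}>"

def fetch_all(urls: list[str], max_workers: int = 8) -> list[str]:
    # Fetch pages concurrently, preserving the input order of URLs.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(fetch_page, urls))

pages = fetch_all(["https://example.com/a", "https://example.com/b"])
```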

For the time being, the project remains simple, but it can easily be integrated into a larger RAG pipeline.
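As one hedged illustration of that integration, the snippets returned by a search could be assembled into a context-augmented prompt before the completion call; build_prompt and the snippet format below are illustrative assumptions, not part of RAGoon's API.

```python
def build_prompt(query: str, snippets: list[str]) -> str:
    # Number each retrieved snippet and prepend it as context to the question.
    context = "\n\n".join(f"[{i + 1}] {s}" for i, s in enumerate(snippets))
    return (
        "Answer the question using the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {query}"
    )

prompt = build_prompt(
    "I want to do a left join in python polars",
    ["Polars supports joins via DataFrame.join(..., how='left')."],
)
print(prompt)
```

The resulting string can then be passed as the user message to whichever completion client (OpenAI, Groq) the pipeline uses.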

Link to GitHub: https://github.com/louisbrulenaudet/ragoon