MVP of an idea using multiple local LLM models to simulate and play D&D
Local RAG researcher agent built using Langgraph, DeepSeek R1 and Ollama
Handy tool to measure the performance and efficiency of LLM workloads.
Ollama with Let's Encrypt Using Docker Compose
AURORA (Artificial Unified Responsive Optimized Reasoning Agent) uses lobes and web research for RAG-based memory and learning.
Effortlessly rename files using local AI - No tokens, No API. Based on Ollama.
Issue report classification demo with SetFit and Ollama for NASA's Flight System software repositories
This repo collects numerous use cases for the open-source Ollama.
End-to-end Gen AI app using DeepSeek-R1 with LangChain and Ollama. Released in January 2025, DeepSeek-R1 is based on DeepSeek-V3 and focuses on advanced reasoning tasks, competing directly with OpenAI's o1 model on performance while maintaining a significantly lower cost structure.
A compact LLM research tool for rapid experimentation, powered by open source!
Automatically install and run a public API service for Ollama with any model from the library.
A Streamlit and Ollama application for producing condensed (summarized) text using open LLM models.
Run Ollama models anywhere easily
CLI tool for chatting with LLMs running via Ollama, written with modern Python tooling. The tool focuses on enhancing an nvim-based coding workflow at no cost.
Welcome to the Llama-3 Chatbot project! This chatbot allows you to interact with the Llama-3 model via a simple command-line interface. Type your messages, and receive responses from Llama-3.
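A chatbot like this can be sketched in a few lines with the `ollama` Python package. This is a hedged illustration under stated assumptions, not the project's actual code: it assumes the `ollama` package is installed, `ollama serve` is running locally, and "llama3" is the pulled model tag.

```python
def build_history(history, role, content):
    """Return a new conversation history with one message appended."""
    return history + [{"role": role, "content": content}]


def chat_once(history, user_input, model="llama3"):
    """Send the conversation plus the new user turn to a local Ollama
    server and return the updated history including the assistant reply."""
    import ollama  # imported lazily so the pure helpers work without a server

    history = build_history(history, "user", user_input)
    response = ollama.chat(model=model, messages=history)
    return build_history(history, "assistant", response["message"]["content"])


if __name__ == "__main__":
    # Simple REPL: type messages, get model replies; "exit" quits.
    history = []
    while True:
        text = input("you> ")
        if text.strip().lower() in {"exit", "quit"}:
            break
        history = chat_once(history, text)
        print(f"model> {history[-1]['content']}")
```

Keeping the full message list and resending it on every turn is what gives the model conversational memory; Ollama itself is stateless across `chat` calls.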
Ollama Web UI is a simple yet powerful web-based interface for interacting with large language models. It offers chat history, voice commands, voice output, model download and management, conversation saving, terminal access, multi-model chat, and more—all in one streamlined platform.
Ollamate is an AI assistant using Ollama to run your favorite local LLMs.
An offline intelligent inventory management system that uses large language models (LLMs) to process natural-language queries and manage inventory data through MongoDB.