Priveedly: A django-based content reader and recommender for personal and private use
We're on a mission to build a superior, user-owned alternative to Big Tech AI systems.
The Private AI Setup Dream Guide for Demos automates the installation of the software needed for a local private AI setup, utilizing AI models (LLMs and diffusion models) for use cases such as general assistance, business ideas, coding, image generation, systems administration, marketing, planning, and more.
SnapDoc AI processes everything on-device, ensuring your sensitive information never leaves your control. It supports on-device voice and text processing for use within organizations.
An advanced, fully local, and GPU-accelerated RAG pipeline. Features a sophisticated LLM-based preprocessing engine, state-of-the-art Parent Document Retriever with RAG Fusion, and a modular, Hydra-configurable architecture. Built with LangChain, Ollama, and ChromaDB for 100% private, high-performance document Q&A.
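For a rough idea of the retrieve-then-generate loop such a stack runs, here is a minimal sketch using the chromadb and ollama Python clients; the collection name, sample documents, and model name are placeholders, not taken from the project.

```python
# Minimal local RAG sketch: store and embed documents with ChromaDB,
# retrieve the closest chunks, then answer with a locally served Ollama model.
# Collection name, documents, and model name are illustrative placeholders.
import chromadb
import ollama

client = chromadb.Client()
collection = client.create_collection(name="docs")

# Index a few documents (ChromaDB embeds them with its default embedding function).
collection.add(
    ids=["doc1", "doc2"],
    documents=[
        "Parent document retrieval returns larger context around matched chunks.",
        "ChromaDB stores embeddings locally, so retrieval stays private.",
    ],
)

def answer(question: str) -> str:
    # Retrieve the most relevant chunks for the question.
    hits = collection.query(query_texts=[question], n_results=2)
    context = "\n".join(hits["documents"][0])
    # Ask the local model to answer, grounded in the retrieved context.
    reply = ollama.chat(
        model="llama3",  # any model pulled into the local Ollama instance
        messages=[{
            "role": "user",
            "content": f"Answer using this context:\n{context}\n\nQuestion: {question}",
        }],
    )
    return reply["message"]["content"]

print(answer("How does ChromaDB keep retrieval private?"))
```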
Ongoing research on applying homomorphic encryption and federated learning to electric utility infrastructure defect detection with an object detection model in a Private AI framework.
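Federated learning in this kind of setting keeps images at each site and shares only model updates; below is a toy federated-averaging (FedAvg) step in plain numpy. The client weights and dataset sizes are fabricated placeholders, and a real deployment like the one described would additionally protect the aggregated updates with homomorphic encryption.

```python
# Toy federated averaging (FedAvg) step in plain numpy.
# Client weights and sample counts are made-up placeholders; in the described
# research the shared updates would also be homomorphically encrypted so the
# aggregator never sees raw client weights.
import numpy as np

def fedavg(client_weights, client_sizes):
    """Weighted average of per-client model weights by local dataset size."""
    total = sum(client_sizes)
    stacked = np.stack(client_weights)               # shape: (n_clients, n_params)
    coeffs = np.array(client_sizes) / total          # each client's contribution
    return (coeffs[:, None] * stacked).sum(axis=0)   # shape: (n_params,)

# Three simulated utility sites, each with a locally trained weight vector.
weights = [np.random.randn(4) for _ in range(3)]
sizes = [120, 300, 80]   # number of labeled defect images per site
global_weights = fedavg(weights, sizes)
print(global_weights)
```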
IRISStar is an Android app for interfacing with GGUF / llama.cpp models locally.
🔒 100% Private RAG Stack with EmbeddingGemma, SQLite-vec & Ollama - Zero Cost, Offline Capable
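A compact sketch of how such a stack can pair Ollama embeddings with sqlite-vec for offline vector search; the model name "embeddinggemma" and the 768-dimension table are assumptions to adjust to whatever embedding model is pulled locally.

```python
# Minimal local vector-store sketch: Ollama embeddings stored in sqlite-vec.
# The "embeddinggemma" model name and 768-dim column are assumptions; change
# both to match the embedding model actually available in the local Ollama.
import sqlite3
import sqlite_vec
import ollama

def embed(text: str) -> list[float]:
    # Ask the local Ollama server for an embedding vector.
    return ollama.embeddings(model="embeddinggemma", prompt=text)["embedding"]

db = sqlite3.connect("rag.db")
db.enable_load_extension(True)
sqlite_vec.load(db)              # load the sqlite-vec extension
db.enable_load_extension(False)

db.execute("CREATE VIRTUAL TABLE IF NOT EXISTS chunks USING vec0(embedding float[768])")

docs = ["Local RAG keeps documents on disk.", "sqlite-vec adds vector search to SQLite."]
for i, doc in enumerate(docs, start=1):
    db.execute(
        "INSERT INTO chunks(rowid, embedding) VALUES (?, ?)",
        (i, sqlite_vec.serialize_float32(embed(doc))),
    )
db.commit()

# Nearest-neighbour search for a query, entirely offline.
query = sqlite_vec.serialize_float32(embed("How is the data stored?"))
rows = db.execute(
    "SELECT rowid, distance FROM chunks WHERE embedding MATCH ? ORDER BY distance LIMIT 2",
    (query,),
).fetchall()
print(rows)
```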
This project provides a streamlined interface for interacting with the Ollama API using Spring Boot and WebFlux.
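The Ollama REST API that such a wrapper fronts is a plain HTTP interface; the snippet below is an illustrative Python call to the /api/chat endpoint the service would proxy, not the project's own Spring Boot/WebFlux code, and the model name is a placeholder.

```python
# Direct call to the local Ollama REST API that a Spring WebFlux wrapper would proxy.
# Illustrative Python sketch only; the model name is a placeholder for whatever
# model is pulled into the local Ollama instance.
import json
import urllib.request

payload = json.dumps({
    "model": "llama3",
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    "stream": False,          # return a single JSON response instead of a stream
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/chat",      # default Ollama endpoint
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    body = json.loads(resp.read())

print(body["message"]["content"])
```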
Offline AI journaling app that gives insights based on your entries and runs locally with no cloud or data sharing.
Local private AI assistant powered by FastAPI, Streamlit, FAISS, and TinyLlama with document search and chat capabilities.
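As a rough sketch of the FAISS side of such a stack, the snippet below builds an exact L2 index and runs a top-k search; the vectors are random placeholders standing in for real document and query embeddings.

```python
# Minimal FAISS similarity-search sketch. The vectors are random placeholders;
# in a real assistant they would come from the embedding model used to encode
# documents and queries.
import faiss
import numpy as np

dim = 384                                   # embedding dimensionality (assumed)
rng = np.random.default_rng(0)

doc_vectors = rng.random((100, dim), dtype=np.float32)   # 100 fake document embeddings
index = faiss.IndexFlatL2(dim)              # exact L2 nearest-neighbour index
index.add(doc_vectors)

query = rng.random((1, dim), dtype=np.float32)
distances, ids = index.search(query, 5)     # top-5 closest documents
print(ids[0], distances[0])
```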
Distributed Deep Learning
Self-hosted ChatGPT-style platform with local models and maximum privacy. LLaMA, Mistral, everything on your own server.
OpenMined 30DaysOfFLCode Challenge
🤖 Automates local private AI setups for demos, showcasing models for diverse tasks such as coding, image generation, and business planning.
Lightweight web UI for llama.cpp with dynamic model switching, chat history & markdown support. No GPU required. Perfect for local AI development.