
🧮 localAGI 🧮


Update - archived

For several reasons, I have left AGiXT development and will also be leaving GitHub.

Although I am no longer active in FOSS or offering integrations, I will leave the pages and repos up as a reference.

Have fun all!

Update - work is paused. GitHub is a b****.

No, they don't answer support tickets. My account is probably done. You can reach me by mail, but I intend to remove everything in the near future.

And I am a PRO user.


Fulltime nerd. Passionate developer. DevOp at heart.

That's me. :bowtie: Building AGI on local hardware.

Building containers for effectively running a local artificial general intelligence. 🦾

You want to run your own inference with ease? Good, you're awake.

Contact: Find me on AGiXT Discord Server or open an issue here.

🐋 Docker Hub 🐋

🧗‍♀️ Motivation 🧗

After entering the AI space at the beginning of May 2023, I wanted to try out all the cool software available. Local development setups have always been tricky, and I struggled to install environments for different projects with different combinations of library versions, etc.

After discovering josh-XT/AGiXT - and getting a bit™ euphoric - I started boxing AGiXT into a Docker container using a GitHub workflow.

localAGI/AGiXT-docker quickly spawned localAGI/AI-pipeline - and I started reusing the pipeline for different projects.

🎯 Goal 🎯

Having reproducible software environments to spin up services on demand for testing and sky-netting. Setting up and streamlining Docker containers for quick, user-friendly usage.

🚀 CUDA enabled. 🖥️ BLAS enabled. 😏 Conda-less. 🧅 Matrix builds. 🏢 Multiarch builds. 🧒 🧑 🧓 For everyone.

🌺 Sharing is caring 🌺

With strong expertise in Docker and GitHub workflows, I want to test and follow AI-related projects in a comfortable manner.

Working on the AI Pipeline to share best practices across several repositories and projects.

🌟 If you like any of my work, leave a star! Thank you! 🌟

State of work

PAUSED, SEE TOP. WILL BE REMOVED SOON. I'LL GIVE GITHUB ONE MORE WEEK TO ANSWER. 🤖

The following projects are built using the AI pipeline.

My maintenance is focused on build stability and availability of service containers. >200h of work. 50.000h of experience.

🧠 Services for running inference

🔥 your CUDA card from 🐳 Docker containers

| Service | Release | Models | API | Original Repo |
| --- | --- | --- | --- | --- |
| FastChat | | T5, HF | OpenAI | lm-sys/FastChat |
| oobabooga | | HF, GGML, GPTQ | oobabooga | oobabooga/text-generation-webui |
| llama-cpp-python | | GGML | OpenAI | abetlen/llama-cpp-python |
| llama.cpp | | GGML | ? | ggerganov/llama.cpp |
| gpt4all | | see backend | ? | nomic-ai/gpt4all |
| gpt4all-ui | | GPTJ, MPT, GGML...? | ? | nomic-ai/gpt4all-ui |
| stablediffusion2 | WIP | | | |
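
As a sketch of how one of these service containers might be started - note that the image name, tag, ports, and paths below are placeholders, not the actual localAGI/AI-pipeline artifacts (check the Docker Hub page for the real ones) - a llama-cpp-python-style server exposing an OpenAI-compatible API could be declared in a compose file like this:

```yaml
# Hypothetical docker-compose.yml - image name, tag, and paths are
# placeholders for illustration only.
services:
  llama-cpp-python:
    image: localagi/llama-cpp-python:latest   # placeholder image/tag
    ports:
      - "8000:8000"              # OpenAI-compatible API endpoint
    volumes:
      - ./models:/models         # mount local GGML model files
    environment:
      - MODEL=/models/model.ggml # placeholder model path
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia     # CUDA-enabled build needs GPU access
              count: 1
              capabilities: [gpu]
```

The GPU reservation block is only needed for the CUDA-enabled images; CPU/BLAS builds can drop it.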

🎩 Services for using inference

| Service | Release | Original Repo |
| --- | --- | --- |
| AGiXT | 🕺 integrated upstream | josh-XT/AGiXT |
| AGiXT-Frontend | ✔️ | JamesonRGrieve/Agent-LLM-Frontend |
| gpt-code-ui | | ricklamers/gpt-code-ui |
| gpt4free | | xtekky/gpt4free |

🦿 CLI tools and packages

for quantization, conversion, CLI inference, etc.

| Tool | Release | Model types | Model quantisations | Original Repo |
| --- | --- | --- | --- | --- |
| llama.cpp | | Llama | HF, GGML | ggerganov/llama.cpp |
| ggml | | Llama | GGML | ggerganov/ggml |
| GPTQ-for-Llama | | Llama | GPTQ [cuda old, cuda new] | oobabooga/GPTQ-for-Llama, qwopqwop200/GPTQ-for-Llama |
| AutoGPTQ | | Llama | GPTQ [triton] | PanQiWei/AutoGPTQ |
| starcoder.cpp | | RNN | | bigcode-project/starcoder.cpp |

Requests

Any? Contact me (currently on the AGiXT Discord)

Things to consider

  • conda (Anaconda) is commercial software, and its license restricts commercial use. We try to omit it in our builds, but it remains your responsibility.
  • NVIDIA images are subject to a license. Make sure you read it.
  • The streamlit app collects heavy analytics even when running locally. This includes events for every page load and form submission, metadata on queries (such as length), and browser and client information including host IPs. These are all transmitted to a third-party analytics service, Segment.com.
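
If the analytics are a concern, Streamlit itself offers a documented opt-out for its usage statistics via its config file. Note this disables Streamlit's own telemetry only; whether a given app adds further tracking (as described above) is up to that app:

```toml
# ~/.streamlit/config.toml - opt out of Streamlit's usage statistics.
[browser]
gatherUsageStats = false
```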

Popular repositories

  1. gpt-code-ui-docker

    Docker builds for https://github.com/ricklamers/gpt-code-ui

  2. gpt4all-docker

  3. gpt4free-docker

    Docker builds for https://github.com/xtekky/gpt4free

  4. FastChat-docker

    Docker builds for FastChat

  5. gpt4all-ui-docker

    Docker builds for https://github.com/nomic-ai/gpt4all-ui

  6. localAGI

    About Me - github.com/localAGI