
Conversation

@shatfield4 (Collaborator)

Pull Request Type

  • ✨ feat
  • 🐛 fix
  • ♻️ refactor
  • 💄 style
  • 🔨 chore
  • 📝 docs

Relevant Issues

resolves #759

What is in this change?

Describe the changes in this PR that are impactful to the repo.

  • Add documentation showing how to set Ollama environment variables so that it works with the Dockerized version of AnythingLLM (a rough sketch of this kind of setup follows below)

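As a rough illustration (assumed values, not taken verbatim from the merged documentation): the core of the setup is that Ollama must listen on an address the AnythingLLM container can actually reach, typically via Ollama's `OLLAMA_HOST` environment variable, with AnythingLLM then pointed at the Docker host instead of `localhost`.

```sh
# Rough sketch with assumed values; not the merged documentation verbatim.
# On the host machine: have Ollama bind to all interfaces so it is reachable
# from inside the AnythingLLM container, not only from 127.0.0.1.
export OLLAMA_HOST=0.0.0.0
ollama serve

# In AnythingLLM's LLM preference settings, point the Ollama base URL at the
# Docker host rather than localhost, for example:
#   http://host.docker.internal:11434   (Docker Desktop on macOS/Windows)
#   http://172.17.0.1:11434             (default bridge gateway on Linux)
```
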
Additional Information

Add any other context about the Pull Request here that was not captured above.

Developer Validations

  • I ran yarn lint from the root of the repo & committed changes
  • Relevant documentation has been updated
  • I have tested my code functionality
  • Docker build succeeds locally

Go to `http://localhost:3001` and you are now using AnythingLLM! All your data and progress will persist between
container rebuilds or pulls from Docker Hub.

> [!TIP]

If you look at the bottom, there is already a Common questions and fixes section, and this should go there.

It also might be worth having all Q/A questions be an HTML embed in the Markdown to make the document not so long.

Or maybe we add a README.md into the aiProviders/ollama folder that has this exact info and link to it from this page in the QA section.

@timothycarambat (Member) left a comment

We should really try to keep documentation as short as we can, or at least keep single pages from going on and on about information that isn't totally relevant to the page it is on.

For example, this page is about Docker. The Ollama connection is relevant, but maybe we can put all Ollama stuff into a single place.

@timothycarambat merged commit e99c74a into master Feb 22, 2024
@timothycarambat deleted the 759-feat-description-for-configuring-ollama-llm-to-work-out-of-the-box-with-docker-container branch February 22, 2024 02:42
cabwds pushed a commit to cabwds/anything-llm that referenced this pull request Jul 3, 2025
…ockerized version of AnythingLLM (Mintplex-Labs#774)

* update HOW_TO_USE_DOCKER to help with Ollama setup using docker

* update HOW_TO_USE_DOCKER

* styles update

* create separate README for ollama and link to it in HOW_TO_USE_DOCKER

* styling update


Development

Successfully merging this pull request may close these issues.

[FEAT]: Description for configuring Ollama LLM to work out of the box with docker container
