
[BUG]: Crash when trying to embed a document using Milvus #2159

@Milor123

Description


How are you running AnythingLLM?

Docker (local)

What happened?

I've searched the docs but found nothing here: https://docs.useanything.com/setup/vector-database-configuration/local/milvus

I'm running Milvus from a Docker image, and I can access its web panel from my PC, but AnythingLLM throws an error when it tries to embed a document.

My Milvus version is https://github.com/milvus-io/milvus/releases/tag/v2.4.9

  • milvus-standalone-docker-compose-gpu.yml

I run it with podman-compose.

Milvus console output:

[minio]      | Status:         1 Online, 0 Offline. 
[minio]      | API: http://10.89.0.6:9000  http://127.0.0.1:9000     
[minio]      | Console: http://10.89.0.6:9001 http://127.0.0.1:9001   
[minio]      | 

Error from podman logs:

[collector] info: Collector hot directory and tmp storage wiped!
[collector] info: Document processor app listening on port 8888
Environment variables loaded from .env
Prisma schema loaded from prisma/schema.prisma

✔ Generated Prisma Client (v5.3.1) to ./node_modules/@prisma/client in 128ms

Start using Prisma Client in Node.js (See: https://pris.ly/d/client)

import { PrismaClient } from '@prisma/client'
const prisma = new PrismaClient()

or start using Prisma Client at the edge (See: https://pris.ly/d/accelerate)

import { PrismaClient } from '@prisma/client/edge'
const prisma = new PrismaClient()


See other ways of importing Prisma Client: http://pris.ly/d/importing-client

Environment variables loaded from .env
Prisma schema loaded from prisma/schema.prisma
Datasource "db": SQLite database "anythingllm.db" at "file:../storage/anythingllm.db"

21 migrations found in prisma/migrations


No pending migrations to apply.
[backend] info: [EncryptionManager] Loaded existing key & salt for encrypting arbitrary data.
[backend] info: [TELEMETRY ENABLED] Anonymous Telemetry enabled. Telemetry helps Mintplex Labs Inc improve AnythingLLM.
[backend] info: prisma:info Starting a sqlite pool with 29 connections.
[backend] info: [TELEMETRY SENT] {"event":"server_boot","distinctId":"7eecf98b-ccfd-40c5-a83b-5885cdac43b6","properties":{"commit":"--","runtime":"docker"}}
[backend] info: [CommunicationKey] RSA key pair generated for signed payloads within AnythingLLM services.
[backend] info: [EncryptionManager] Loaded existing key & salt for encrypting arbitrary data.
[backend] info: Primary server in HTTP mode listening on port 3001
[backend] info: [BackgroundWorkerService] Feature is not enabled and will not be started.
[backend] info: [MetaGenerator] fetching custome meta tag settings...
[backend] error: LMStudio:listModels Connection error.
[backend] info: [Event Logged] - update_vector_db
[backend] error: LMStudio:listModels Connection error.
[backend] info: Adding new vectorized document into namespace normal-es
[backend] info: [RecursiveSplitter] Will split with {"chunkSize":128,"chunkOverlap":20}
[backend] info: Chunks created from document: 317
[backend] info: [LMStudioEmbedder] fetch failed
[backend] error: addDocumentToNamespace LMStudio service could not be reached. Is LMStudio running?
[backend] error: Failed to vectorize TopfoodGuntry.txt
[backend] info: Adding new vectorized document into namespace normal-es
[backend] info: [RecursiveSplitter] Will split with {"chunkSize":128,"chunkOverlap":20}
[backend] info: Chunks created from document: 725
[backend] info: [LMStudioEmbedder] fetch failed
[backend] error: addDocumentToNamespace LMStudio service could not be reached. Is LMStudio running?
[backend] error: Failed to vectorize interpretacion_suenos_V2.md
[backend] info: [TELEMETRY SENT] {"event":"documents_embedded_in_workspace","distinctId":"7eecf98b-ccfd-40c5-a83b-5885cdac43b6","properties":{"LLMSelection":"lmstudio","Embedder":"lmstudio","VectorDbSelection":"milvus","TTSSelection":"native","runtime":"docker"}}
[backend] info: [Event Logged] - workspace_documents_added
[backend] info: Adding new vectorized document into namespace normal-es
[backend] info: [RecursiveSplitter] Will split with {"chunkSize":128,"chunkOverlap":20}
[backend] info: Chunks created from document: 386
[backend] info: [LMStudioEmbedder] Embedding 386 chunks of text with jina/jina-embeddings-v2-base-es/jina-embeddings-v2-base-es.gguf.
[backend] error: addDocumentToNamespace 13 INTERNAL: Received RST_STREAM with code 2 triggered by internal client error: Protocol error
[backend] error: Failed to vectorize Libretranslate.md
node:internal/process/promises:288
            triggerUncaughtException(err, true /* fromPromise */);
            ^

Error: 13 INTERNAL: Received RST_STREAM with code 2 triggered by internal client error: Protocol error
    at callErrorFromStatus (/app/server/node_modules/@grpc/grpc-js/build/src/call.js:31:19)
    at Object.onReceiveStatus (/app/server/node_modules/@grpc/grpc-js/build/src/client.js:193:76)
    at /app/server/node_modules/@grpc/grpc-js/build/src/call-interface.js:78:35
    at Object.onReceiveStatus (/app/server/node_modules/@zilliz/milvus2-sdk-node/dist/milvus/utils/Grpc.js:146:25)
    at Object.onReceiveStatus (/app/server/node_modules/@grpc/grpc-js/build/src/client-interceptors.js:360:141)
    at Object.onReceiveStatus (/app/server/node_modules/@grpc/grpc-js/build/src/client-interceptors.js:323:181)
    at /app/server/node_modules/@grpc/grpc-js/build/src/resolving-call.js:129:78
    at process.processTicksAndRejections (node:internal/process/task_queues:77:11)
for call at
    at ServiceClientImpl.makeUnaryRequest (/app/server/node_modules/@grpc/grpc-js/build/src/client.js:161:32)
    at ServiceClientImpl.<anonymous> (/app/server/node_modules/@grpc/grpc-js/build/src/make-client.js:105:19)
    at /app/server/node_modules/@zilliz/milvus2-sdk-node/dist/milvus/utils/Function.js:32:31
    at new Promise (<anonymous>)
    at /app/server/node_modules/@zilliz/milvus2-sdk-node/dist/milvus/utils/Function.js:29:16
    at Generator.next (<anonymous>)
    at fulfilled (/app/server/node_modules/@zilliz/milvus2-sdk-node/dist/milvus/utils/Function.js:5:58)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5) {
  code: 13,
  details: 'Received RST_STREAM with code 2 triggered by internal client error: Protocol error',
  metadata: Metadata { internalRepr: Map(0) {}, options: {} }
}

Node.js v18.20.4
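For context on the error above: gRPC status 13 with "Received RST_STREAM with code 2" can occur when a single request payload is larger than the connection will accept, and the log shows 386 chunks being embedded and inserted for one document. A minimal, hypothetical workaround sketch (these names are illustrative, not AnythingLLM's actual code) would be to split the embedded chunks into smaller insert batches so each gRPC message stays small:

```javascript
// Hypothetical helper: split an array of embedded chunk records into
// fixed-size batches so each Milvus insert call sends a smaller gRPC
// message instead of one large payload for the whole document.
function toBatches(records, batchSize = 100) {
  const batches = [];
  for (let i = 0; i < records.length; i += batchSize) {
    batches.push(records.slice(i, i + batchSize));
  }
  return batches;
}
```

With a batch size of 100, the 386 chunks from Libretranslate.md would be sent as 4 insert calls instead of 1. This is only a sketch of the idea; whether payload size is actually the cause here is an assumption.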

Thank you very much, guys!

Are there known steps to reproduce?

  • Upload a file — works fine.
  • Search for the file and move it into the workspace — OK.
  • Click "Save and Embed" — the Docker image crashes.
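The crash itself happens because the stack trace above shows the Milvus insert promise rejecting with no catch handler, so Node 18 calls `triggerUncaughtException` and the container exits. A minimal sketch of the general fix (hypothetical wrapper, not AnythingLLM's actual code) is to await the insert inside try/catch so one failed document is logged instead of killing the server:

```javascript
// Hypothetical wrapper: run an async insert and capture any rejection
// instead of letting it bubble up as an unhandled promise rejection,
// which would terminate the Node process.
async function safeInsert(insertFn) {
  try {
    return { ok: true, result: await insertFn() };
  } catch (err) {
    console.error('addDocumentToNamespace failed:', err.message);
    return { ok: false, error: err.message };
  }
}
```

This only keeps the server alive; the underlying Milvus/gRPC protocol error would still need its own fix.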

Labels

possible bug: Bug was reported but is not confirmed or is unable to be replicated.