
[Bug]: with ollama and --enable-tool-use-shim kubectl-ai is not working well #422

@jainpratik163

Description


Environment (please complete the following):

  • OS: Red Hat Enterprise Linux release 8.4 (Ootpa)
  • kubectl-ai version (run kubectl-ai version): 0.0.18
  • LLM provider: Ollama
  • LLM model: llama3.2

Describe the bug
I tried to run `kubectl-ai --llm-provider ollama --model llama3.2 --enable-tool-use-shim`. The command took a very long time and then failed with this error:

Error: parsing ReAct response "json\n{\n "thought": "Running kubectl get pods command",\n "action": {\n "name": "kubectl",\n "reason": "To check the current running pods in the cluster.",\n "command": "get pods"\n }\n}\n\n\nOutput:\n\nNo resources found.\n```": parsing JSON "{ "thought": "Running kubectl get pods command", "action": { "name": "kubectl", "reason": "To check the current running pods in the cluster.", "command": \
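A likely explanation (my guess from the error text, not confirmed against the kubectl-ai source) is that llama3.2 wraps its JSON reply in a markdown ```json fence, and the shim's strict JSON parser rejects the fenced string. A minimal Python sketch of a more tolerant parser; `parse_react_response` is a hypothetical helper, not a kubectl-ai function:

```python
import json
import re

def parse_react_response(raw: str) -> dict:
    """Strip an optional ```json ... ``` markdown fence, then parse.

    Hypothetical helper illustrating why a strict json.loads() fails
    here: llama3.2 tends to fence its JSON output.
    """
    text = raw.strip()
    # Remove a leading ```json (or bare ```) fence and its closing ```.
    match = re.match(r"^```(?:json)?\s*(.*?)\s*```$", text, re.DOTALL)
    if match:
        text = match.group(1)
    return json.loads(text)

# A fenced response like the one in the error message now parses cleanly.
raw = ('```json\n{"thought": "Running kubectl get pods command", '
       '"action": {"name": "kubectl", "command": "get pods"}}\n```')
parsed = parse_react_response(raw)
print(parsed["action"]["command"])  # get pods
```

With a stricter parser the same string raises `json.JSONDecodeError` on the leading backticks, which matches the `parsing JSON` failure shown above.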

After that I ran `kubectl-ai --llm-provider ollama --model llama3.2` and gave this simple prompt: `list pods in the default namespace`.
It keeps asking for approval again and again, even after I answer Yes:

    The following commands require your approval to run:

      • get pods --namespace=default -o jsonpath='{.items[*].metadata.name}'

    Do you want to proceed ?

      1. Yes
      2. Yes, and don't ask me again
      3. No

    Enter your choice: Yes
    Running: get pods --namespace=default -o jsonpath='{.items[*].metadata.name}'

    The following commands require your approval to run:

      • get pods --namespace=default -o jsonpath='{.items[*].metadata.name}'

    Do you want to proceed ?

The approval prompt reappears after every "Yes" and the loop never completes.

**To Reproduce**
Steps to reproduce the behavior:
1. Run `kubectl-ai --llm-provider ollama --model llama3.2`
2. Give the prompt `list pods in the default namespace`
3. Observe the approval prompt repeating endlessly

**Expected behavior**
The command should run once after approval and return the pod list.

Labels: bug