@iam-veeramalla
Last active March 16, 2026 16:17
Claude Code integration with Ollama to use local models

Run Claude with the power of Local LLMs using Ollama

Install Ollama

curl -fsSL https://ollama.com/install.sh | sh

Pull the Model

ollama pull glm-4.7-flash # or gpt-oss:20b (for better performance)

Install Claude

curl -fsSL https://claude.ai/install.sh | bash

Run Claude with Ollama

ollama launch claude --model glm-4.7-flash # or: ollama launch claude --model gpt-oss:20b
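Before launching, it can help to confirm both tools are installed and the model is available. A minimal sanity-check sketch, assuming `ollama` and `claude` are on your PATH (the `launch` subcommand only exists in recent Ollama releases, so the version check matters):

```shell
# Confirm the Ollama binary is recent enough to know the `launch` subcommand
ollama --version

# Confirm the model was pulled and is available locally
ollama list

# Confirm the Claude CLI installed correctly
claude --version
```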

@chinthakindi-saikumar

Here are the clear steps, @vinoth-6.

1. Install Ollama
   Open CMD/terminal and run: curl -fsSL https://ollama.com/install.sh | sh
2. Pull the model
   Pull a model based on your system configuration: ollama pull glm-4.7-flash # or gpt-oss:20b (for better performance), or ollama pull gemma:2b
   Optional: run ollama run gemma:2b to work with the model locally.
3. Install Claude
   macOS, Linux, WSL: curl -fsSL https://claude.ai/install.sh | bash
   Windows CMD: curl -fsSL https://claude.ai/install.cmd -o install.cmd && install.cmd && del install.cmd
4. Run Claude with Ollama
   ollama launch claude --model glm-4.7-flash # or: ollama launch claude --model gpt-oss:20b

@Nestleai

I've followed this guide and everything works except Claude: on the first test command, "write a python function to reverse a script", it has been spinning for 10+ minutes. Bear in mind I'm running the qwen2.5-coder:7b model. Could this mean the model isn't compatible with my hardware?

@NathanLewis

I tried it, but Claude can't see the local filesystem, not even the files in the directory I run it from.

@eshwarvijay

eshwar: ollama launch claude --model qwen3-coder-next:latest
Error: unknown command "launch" for "ollama"

@JenilSavalia

eshwar: ollama launch claude --model qwen3-coder-next:latest Error: unknown command "launch" for "ollama"

same error

@naimroslan

eshwar: ollama launch claude --model qwen3-coder-next:latest Error: unknown command "launch" for "ollama"

Ran into this same issue. I just updated Ollama and it works fine now.
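For anyone else hitting `Error: unknown command "launch" for "ollama"`: a sketch of the upgrade, assuming the official install script upgrades an existing installation in place (on desktop installs, the "Restart to upgrade" menu option should achieve the same):

```shell
# Re-run the same install script used initially; it replaces the old binary
curl -fsSL https://ollama.com/install.sh | sh

# The reported version should now be a release that includes `launch`
ollama --version
```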

@yangboz

yangboz commented Mar 5, 2026

eshwar: ollama launch claude --model qwen3-coder-next:latest Error: unknown command "launch" for "ollama"

Man, how do I fix Error: unknown command "launch" for "ollama"? Thanks.

@abhayIII

abhayIII commented Mar 5, 2026

eshwar: ollama launch claude --model qwen3-coder-next:latest Error: unknown command "launch" for "ollama"

man , how about Error: unknown command "launch" for "ollama" ? thanks.

The "Restart to upgrade" option in Ollama made it work for me. After you install Claude, restart Ollama once, then run the command.

@DevendraBhoraniya

eshwar: ollama launch claude --model qwen3-coder-next:latest Error: unknown command "launch" for "ollama"

Try updating Ollama; it worked for me.


@seshubabubatchu

Hey guys, I tried the same but with a different model, qwen2.5-coder:7b.
I tried it with the Continue extension in VS Code and also with Claude. Why is it giving output only in JSON format?

Is this something related to the model?

@DevjitSikdar

Why am I just getting output in my terminal? Why can't Claude Code access my files?

@ShubhmPatil

curl https://claude.ai/install.cmd
This command worked for me to install Claude on Windows.

@sanjaykadavarath

Thank you @iam-veeramalla

@DevendraBhoraniya

Hey guys I tried the same but with different model qwen2.5-coder:7b I tried with the continue extension in vs code and also with the Claude as well, why this is giving the output in only json format

is this something related to the model?

I am getting the same problem. If you find a solution, please tell me.

@nipodemus

I tried it but claude can't see the local filesystem not even the files in the directory I run it from.

I had the same problem. Check /permissions; by default almost everything usable is denied. I added Bash to the Allow rules, for example, and file access started working.
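The same allow rules can also be set in a project-level settings file instead of the interactive /permissions menu. A minimal sketch of `.claude/settings.json`, assuming Claude Code's documented permissions rule format; the tool names shown (`Bash`, `Read`, `Edit`) are examples, not a recommended set:

```json
{
  "permissions": {
    "allow": [
      "Bash",
      "Read",
      "Edit"
    ]
  }
}
```

With `Bash` in the allow list, shell-based file access should work without per-command prompts.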

@medake

medake commented Mar 16, 2026

Hi @iam-veeramalla @ShubhmPatil @chinthakindi-saikumar - I am running Windows with PowerShell and set up Ollama with the local model qwen3.5:9b. I am not able to create, read, or write files in the project directory. After a lot of searching, I found that Claude's agent capabilities (filesystem tools, project scanning) exist only in the Claude Code CLI, which requires Anthropic authentication and billing. Under Ollama-only execution, Claude behaves as a plain LLM without tool access, so automated codebase analysis is not possible. How did it work in your case?
