curl -fsSL https://ollama.com/install.sh | sh
ollama pull glm-4.7-flash # or gpt-oss:20b (for better performance)
curl -fsSL https://claude.ai/install.sh | bash
ollama launch claude --model glm-4.7-flash # or: ollama launch claude --model gpt-oss:20b
Hi @iam-veeramalla @ShubhmPatil @chinthakindi-saikumar - I am running Windows with PowerShell and set up Ollama with the local model qwen3.5:9b, but I am not able to create, read, or write files in the project directory. After a lot of searching, I found that Claude's agent capabilities (filesystem tools, project scanning) exist only in the Claude Code CLI, which requires Anthropic authentication and billing. Under Ollama-only execution, Claude behaves as a plain LLM without tool access, so automated codebase analysis is not possible. How did it work in your case?
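For reference, here is a minimal sketch of what I mean by "plain LLM without tool access": querying Ollama's local REST API directly (its documented default endpoint on port 11434; the model name is just whichever one you pulled). The model can answer prompts this way, but nothing in this path gives it filesystem tools, so it cannot actually read or write project files.

```shell
# Send a plain completion request to the local Ollama server.
# If the server is not running, print a fallback message instead of failing.
curl -s http://localhost:11434/api/generate -d '{
  "model": "glm-4.7-flash",
  "prompt": "Summarize this repository.",
  "stream": false
}' || echo "Ollama server is not running on localhost:11434"
```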