Mausy5043 / local_ai_copilot_pycharm.md
Created September 21, 2025 12:33 — forked from valosekj/local_ai_copilot_pycharm.md
# Setting Up a Local AI Co-Pilot for PyCharm with Continue and Ollama

Steps:

  1. Install the Continue plugin:

    PyCharm --> Settings --> Plugins --> search for "Continue" --> Install

  2. Download and install Ollama
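
    On Linux, Ollama publishes an install script; on macOS, the downloadable app or Homebrew can be used instead. A minimal sketch, assuming a Unix-like shell (the `brew` line additionally assumes Homebrew is installed):

    ```shell
    # Linux: official install script from ollama.com
    curl -fsSL https://ollama.com/install.sh | sh

    # macOS alternative, assuming Homebrew is available
    brew install ollama

    # verify the installation
    ollama --version
    ```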

  3. Download an LLM that can run locally, for example one of the open-source Granite models from IBM:
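
    A model is pulled via Ollama's CLI. A minimal sketch, assuming the `granite-code` tag published on the Ollama registry; the exact model name and size (8B here) are a choice, not prescribed by this guide:

    ```shell
    # pull an IBM Granite code model (8B variant) from the Ollama registry
    ollama pull granite-code:8b

    # confirm the model is available locally
    ollama list

    # quick smoke test: run a one-off prompt against the model
    ollama run granite-code:8b "Write a Python hello world"
    ```

    Smaller variants (e.g. 3B) trade answer quality for lower memory use, which matters on laptops without a discrete GPU.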