
@WarningImHack3r
Last active November 27, 2025 18:19
Offline installation of local language models of IntelliJ 2024.1+

Important

I haven't used this myself since 2024.2. This gist should however still be fully valid, apart from the missing newer data.
Supported languages seem to have changed since 2024.3, but the instructions should still mostly work.
I'll keep updating the instructions based on the comments on this gist if things change, but refer to the comments for the actual data people have provided. Don't hesitate to comment to contribute!

Introduction

IntelliJ 2024.1 introduced Full Line Code Completion, available as a bundled plugin in your IDE.

Under Settings > Editor > General > Inline Completion, you can enable the models and download them.

For 2024.1, the settings path is Settings > Editor > General > Code Completion, under Machine Learning-Assisted Completion.

Offline installation

If you work on an offline machine, there is no official way to install those models. But after digging into the source code and inspecting the logs with @Kilehynn, we managed to find a way.

Available models

By default, the Java and Kotlin models are already installed. "CSS-like" and "JavaScript/TypeScript" are the same model internally (called "WS", for WebStorm), and "Go" is a separate one.

Python, PHP, and Ruby models are also available, but we don't use them, so we didn't hunt down their URLs and UUIDs.

Installation of new models

To begin, close IntelliJ and head over to your IntelliJ system directory.

  1. Move into the full-line directory, then into models. Here, you'll see two UUIDs, one for the Java model and the other for the Kotlin model. models.xml is used by the IDE as a list of installed models.
  2. Each model is structured in the following way:
    1. Its folder is a UUID (like the existing models), common to every installation. See info.md to figure out the one for your model.
    2. It contains 3 flcc.* files (flcc.bpe, flcc.json, flcc.model), which, AFAIK, tell the model how to work.
    3. It contains a ready.flag file, an empty file probably indicating to the IDE that the model was successfully downloaded.
    4. Finally, a full-line-inference.zip_extracted directory, containing the executable model as well as its signature in a sign file.
  3. For each model, download the flcc.* files (and the model.xml if it exists) by downloading https://download-cdn.jetbrains.com/resources/ml/full-line/models/[Name]/[Version]-native-llama-bundle/local-model-[Name]-[Version]-native-llama-bundle.jar and extracting it like you would a .zip file; since 2024.2, the files sit inside a [Name]-[Version]-native-llama-bundle folder within the archive. (Replace the bracketed values in the URL with the ones from the info.md table. Steps 3-5 are also sketched as a script below the XML template.)
  4. For each model, create an empty ready.flag file.
  5. Get the model runtime by copying the full-line-inference.zip_extracted folder from the Java or Kotlin model into each new model. Yes, it's the same for all models. If you want to download it manually yourself, you can use the URL https://download-cdn.jetbrains.com/resources/ml/full-line/servers/[FLI_version]/[ARCH]/full-line-inference.zip (the 2024.1 URL also required the OS: https://download-cdn.jetbrains.com/resources/ml/full-line/servers/[FLI_version]/[OS]/[ARCH]/full-line-inference.zip ; tested OS and arch values are in the OS_arch.md file).
  6. Finally, add your new model(s) to the models.xml file like so:
<models version="1">
  ...
  <model>
    <version>[Version]-native-llama-bundle</version>
    <size>[Size]</size>
    <languages>
      <language>[one language per line]</language>
      ...
    </languages>
    <binary>flcc.model</binary>
    <bpe>flcc.bpe</bpe>
    <config>flcc.json</config>
    <native>
      <archive>full-line-inference.zip</archive>
      <version>1.0.84</version> <!-- see second table in `info.md` -->
    </native>
    <changelog>[Some text]</changelog>
  </model>
  ...
</models>

A full example is available in the attached models_windows_example.xml. (IntelliJ doesn't seem to care about size or changelog; you can probably leave them empty or reuse the same values for every installation.)
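
For convenience, here's a rough Python sketch of steps 3-5 for a single model, to run on a machine with internet access before copying the result to the offline one. It is untested and makes assumptions: the NAME/VERSION/UUID values are the 2024.1 "ws" entries from the table below, and MODELS_DIR is the default Linux system-directory location; substitute your own values. Step 6 (editing models.xml) stays manual, per the template above.

```python
# Rough sketch of steps 3-5 for one model (values: 2024.1 "ws" entry).
# Untested; NAME/VERSION/UUID and MODELS_DIR are assumptions -- replace them
# with the values for your IDE version (see the info.md tables below).
import shutil
import urllib.request
import zipfile
from pathlib import Path

NAME = "ws"                                    # [Name] column
VERSION = "0.1.14"                             # [Version] column
UUID = "80f4ff53-d098-3630-a26c-390896efcda0"  # [UUID] column

# IntelliJ system directory; this is the Linux default, adjust for your OS.
MODELS_DIR = Path.home() / ".cache/JetBrains/IntelliJIdea2024.1/full-line/models"

# Target layout per model (see step 2):
#   <UUID>/
#     flcc.bpe, flcc.json, flcc.model
#     ready.flag
#     full-line-inference.zip_extracted/
model_dir = MODELS_DIR / UUID
model_dir.mkdir(parents=True, exist_ok=True)

# Step 3: the .jar is a plain zip; download it and pull out the flcc.* files
# (and model.xml if present). 2024.2+ jars nest them in a
# "[Name]-[Version]-native-llama-bundle/" folder, hence the basename check.
bundle = f"{NAME}-{VERSION}-native-llama-bundle"
jar_url = ("https://download-cdn.jetbrains.com/resources/ml/full-line/models/"
           f"{NAME}/{VERSION}-native-llama-bundle/local-model-{bundle}.jar")
jar_path, _ = urllib.request.urlretrieve(jar_url)
with zipfile.ZipFile(jar_path) as jar:
    for member in jar.namelist():
        name = Path(member).name
        if name.startswith("flcc.") or name == "model.xml":
            (model_dir / name).write_bytes(jar.read(member))

# Step 4: empty flag file marking the model as fully downloaded.
(model_dir / "ready.flag").touch()

# Step 5: reuse the inference runtime from an already-installed model
# (here the bundled 2024.1 Java model; it's identical for every model).
JAVA_UUID = "d7d819ae-0fe0-3f70-8073-544e7d22c593"
shutil.copytree(MODELS_DIR / JAVA_UUID / "full-line-inference.zip_extracted",
                model_dir / "full-line-inference.zip_extracted",
                dirs_exist_ok=True)
```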

Finally, restart IntelliJ and check the settings: the installed languages should all be checked!
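
If they aren't, a quick sanity check of the directory layout can save some log digging. A minimal sketch, assuming the same hypothetical MODELS_DIR as in the script above:

```python
# Sanity-check the full-line/models layout against the structure from step 2.
# MODELS_DIR is an assumption; point it at your IntelliJ system directory.
import xml.etree.ElementTree as ET
from pathlib import Path

MODELS_DIR = Path.home() / ".cache/JetBrains/IntelliJIdea2024.1/full-line/models"

# Versions registered in models.xml (the IDE deletes unregistered folders).
root = ET.parse(MODELS_DIR / "models.xml").getroot()
print("registered:", [m.findtext("version") for m in root.iter("model")])

EXPECTED = ("flcc.bpe", "flcc.json", "flcc.model", "ready.flag")
for model_dir in MODELS_DIR.iterdir():
    if not model_dir.is_dir():
        continue  # skip models.xml itself
    missing = [f for f in EXPECTED if not (model_dir / f).exists()]
    if not (model_dir / "full-line-inference.zip_extracted").is_dir():
        missing.append("full-line-inference.zip_extracted/")
    print(model_dir.name, "->", missing or "OK")
```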

Disclaimer: despite all our efforts, some info might be missing or inaccurate; the overall approach should still work, though.

Data per language (pre-2024.3)

| Language | UUID | Languages | Name | Version | Size (Windows) | Size (Linux) | Size (macOS) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Java (bundled) | d7d819ae-0fe0-3f70-8073-544e7d22c593 (2024.1)<br>c48ca7c9-912d-3052-86e4-97bf4264fcb1 (2024.1.2)<br>21caac6d-7218-3c39-bb75-af39a95d066f (2024.2) | java | - | 0.1.42 (2024.1)<br>0.1.45 (2024.1.2)<br>0.1.49 (2024.2) | 68857785 | - | - |
| Kotlin (bundled) | 8f33c61d-b8f7-3d5d-8ceb-306e4f1e7ac7 (2024.1)<br>aaa291af-1208-3303-b829-495940d54ed1 (2024.1.2)<br>850b40ad-c692-3f36-8319-eeb4275a91d2 (2024.2) | kotlin | - | 0.1.130 (2024.1)<br>0.1.157 (2024.1.2)<br>0.1.163 (2024.2) | 68858294 | - | - |
| Go | 74dfc84b-71f0-3063-aba8-be97f18ae361 (2024.1)<br>15c1005b-5a0a-3ed6-9899-e6a0a70d1fd0 (2024.2) | golang | go | 0.1.8 (2024.1)<br>0.1.14 (2024.2) | 68855458 | - | - |
| WebStorm | 80f4ff53-d098-3630-a26c-390896efcda0 (2024.1)<br>a728070c-0f59-34eb-8eba-6b9ae5aa7481 (2024.2) | css, javascript & ws | ws | 0.1.14 (2024.1)<br>0.1.20 (2024.1.2)<br>0.1.25 (2024.2) | 68852246 | - | - |
| Python | - | - | - | - | - | - | - |
| PHP | - | - | - | - | - | - | - |
| Ruby | - | - | - | - | - | - | - |

Data per language (post-2024.3)

See comments

full-line-inference.zip versions

| IDE version | full-line-inference.zip version |
| --- | --- |
| 2024.1 | 1.0.84 |
| 2024.1.2 | 1.0.103 |
| 2024.2 | 1.2.112 |
| 2025.1 | 2.4.164 |
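
If you fetch the runtime manually (step 5 above), the [FLI_version] placeholder comes from this table. A small illustration of assembling the URLs, using the windows/x86_64 values that were tested per OS_arch.md:

```python
# Build the full-line-inference.zip URL from the version table above.
BASE = "https://download-cdn.jetbrains.com/resources/ml/full-line/servers"
FLI_VERSION = "1.2.112"  # 2024.2 row
ARCH = "x86_64"

# 2024.2+ layout: no OS segment.
print(f"{BASE}/{FLI_VERSION}/{ARCH}/full-line-inference.zip")

# 2024.1 layout additionally required the OS segment.
OS = "windows"
print(f"{BASE}/1.0.84/{OS}/{ARCH}/full-line-inference.zip")
```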
models_windows_example.xml:

<models version="1">
  <model>
    <version>0.1.42-native-llama-bundle</version>
    <size>68857785</size>
    <languages>
      <language>java</language>
    </languages>
    <binary>flcc.model</binary>
    <bpe>flcc.bpe</bpe>
    <config>flcc.json</config>
    <native>
      <archive>full-line-inference.zip</archive>
      <version>1.0.84</version>
    </native>
    <changelog>- LLaMA model for Java trained on split with_cg + weights averaging (last 6)</changelog>
  </model>
  <model>
    <version>0.1.130-native-llama-bundle</version>
    <size>68858294</size>
    <languages>
      <language>kotlin</language>
    </languages>
    <binary>flcc.model</binary>
    <bpe>flcc.bpe</bpe>
    <config>flcc.json</config>
    <native>
      <archive>full-line-inference.zip</archive>
      <version>1.0.84</version>
    </native>
    <changelog>- Updated LLaMA model for Kotlin trained on split with CG (no finetune)</changelog>
  </model>
  <model>
    <version>0.1.14-native-llama-bundle</version>
    <size>68852246</size>
    <languages>
      <language>css</language>
      <language>javascript</language>
      <language>ws</language>
    </languages>
    <binary>flcc.model</binary>
    <bpe>flcc.bpe</bpe>
    <config>flcc.json</config>
    <native>
      <archive>full-line-inference.zip</archive>
      <version>1.0.84</version>
    </native>
    <changelog>- Init LLaMA model for all ws languages with cg</changelog>
  </model>
  <model>
    <version>0.1.8-native-llama-bundle</version>
    <size>68855458</size>
    <languages>
      <language>go</language>
    </languages>
    <binary>flcc.model</binary>
    <bpe>flcc.bpe</bpe>
    <config>flcc.json</config>
    <native>
      <archive>full-line-inference.zip</archive>
      <version>1.0.84</version>
    </native>
    <changelog>- LLaMA model without vendors</changelog>
  </model>
</models>

Available OS: windows, linux (probably darwin for macOS, not tested though)
Available arch: x86_64 (others not tested)

Note: you can try to mix and match values from the lists above; there's no guarantee of success, though it should very likely work.

@WarningImHack3r (Author)

Interesting, thanks for the info! I don't currently have any model.xml in my downloaded models, and it works fine with just the models.xml.

@psurkov commented Apr 11, 2024

It's okay, I don't remember all the details of how it works :)

@mathiasbn commented Jun 14, 2024

Hmm, seems the UUIDs changed. My WebStorm model got cleared at some point, and now when I run the tutorial it just clears the model.
I noticed that the two existing ones (Java and Kotlin) are now c48ca7c9-912d-3052-86e4-97bf4264fcb1 and aaa291af-1208-3303-b829-495940d54ed1.
This gist is linked by JetBrains devs in YouTrack :), so perhaps someone has solved this, or knows how to update the above info.md?

@WarningImHack3r (Author)

Hmm. Seems the UUIDs changed.

What version are you running? This was for 2024.1, and as Peter mentioned some changes would be coming in the next releases.

My WebStorm got cleared at some point. Now when run the tutorial it just clears the model.

That's likely because you didn't correctly register the (new) IDs in the main models.xml file.

I noticed that the two existing (Java and Kotlin) are now c48ca7c9-912d-3052-86e4-97bf4264fcb1 and aaa291af-1208-3303-b829-495940d54ed1

What IDEA version exactly?

This gist is linked by Jetbrains devs in youtrack :), so perhaps someone solved this?

Yep, I saw that :) but no, I'm the only one involved in this; I didn't get any help from the JB team, probably because they're counting on implementing a manual install themselves.

or know how to update the above info.md

Just provide all the info I need and I'll happily update the files! (I haven't upgraded the IDE I need this tutorial for since writing it, so I'll need to re-read all of this a bit to get back into it.)

@mathiasbn

Thanks a lot!
I was trying to see if I or somebody else could solve it without you having to be involved every time... but I didn't try that hard :)
Just to be clear, it did work. But at some point I noticed my TypeScript completion had degraded, and when I looked, the folder 80f4ff53-d098-3630-a26c-390896efcda0 had been deleted. I tried to follow the guide again a couple of times, but got the same result.

My IDE is:

  • IntelliJ IDEA 2024.1.2 (Ultimate Edition) - Build #IU-241.17011.79, built on May 22, 2024
  • models.xml looks like this:
<models version="1">
  <model>
    <version>0.1.45-native-llama-bundle</version>
    <size>68857817</size>
    <languages>
      <language>java</language>
    </languages>
    <binary>flcc.model</binary>
    <bpe>flcc.bpe</bpe>
    <config>flcc.json</config>
    <native>
      <archive>full-line-inference.zip</archive>
      <version>1.1.103</version>
    </native>
    <changelog>- LLaMA model for Java trained on split with_cg + weights averaging (last 6) and updated metadata headers</changelog>
  </model>
  <model>
    <version>0.1.157-native-llama-bundle</version>
    <size>68858358</size>
    <languages>
      <language>kotlin</language>
    </languages>
    <binary>flcc.model</binary>
    <bpe>flcc.bpe</bpe>
    <config>flcc.json</config>
    <native>
      <archive>full-line-inference.zip</archive>
      <version>1.1.103</version>
    </native>
    <changelog>- Updated LLaMA model for Kotlin trained on split with CG (no finetune) and updated metadata headers</changelog>
  </model>
</models>

Let me know if you need more info

@WarningImHack3r (Author)

@mathiasbn yep, you seem to have forgotten step 6 of the tutorial: register your completion within the main models.xml file.
Just add your <model> after the existing ones, and you should be good to go!
IJ deletes "unknown"/unregistered folders if they're not listed inside that file; that's why you're seeing the issue :)

@mathiasbn

I am sorry, I should have added some more comments. I just pasted the models.xml as it gets reverted to (i.e., as it started out) in my version of IDEA. I thought that might be helpful, but I should have said so.

Just to make sure I don't waste your time (too much ;) ), I went through the guide a couple more times. Here are my comments:

1) In the full-line/models folder I got two model directories, along with a models.xml as pasted above:

  • c48ca7c9-912d-3052-86e4-97bf4264fcb1
  • aaa291af-1208-3303-b829-495940d54ed1

But I ignored those differences and carried on.

2) Still the same.

There is a model.xml file though, which just contains the appropriate <model> sub-tag from models.xml.

3,4,5) All good

6) Some differences as posted above

In the models.xml in my IDE, models/model/native/version is 1.1.103 for both java and kotlin, not 1.0.84 as in your example.

I have tried putting both 103 and 84; neither works. (My guess is it should be 103, because it's copy-pasted from one of the two others, which are 103.)

models/model/version is bumped slightly:
Java: <version>0.1.45-native-llama-bundle</version>
Kotlin: <version>0.1.157-native-llama-bundle</version>

I found another version, 0.1.20, of ws:
https://download-cdn.jetbrains.com/resources/ml/full-line/models/ws/0.1.20-native-llama-bundle/local-model-ws-0.1.20-native-llama-bundle.jar
This one also includes a model.xml (see 2) above), so I just copied its content into models.xml.
It still doesn't work.

So my point is still that it seems the UUIDs have changed to:

  • c48ca7c9-912d-3052-86e4-97bf4264fcb1 (kotlin)
  • aaa291af-1208-3303-b829-495940d54ed1 (java)

Probably the WebStorm UUID also changed.

Thanks a lot for all your work!

@WarningImHack3r (Author)

@mathiasbn don't worry, I'm fine with helping you out; I'd be happy to help more people by updating my guide! :)
I also might have read what you said earlier too quickly, my bad about that!!

Alright, so first off, yeah, it makes sense that UUIDs, versions, changelogs, and even model sizes differ between 2024.1 and 2024.1.2: JB has updated its models throughout the IDE updates. Having data different from mine is fine; the process is the only thing that matters for you here :)
I'm still running 2024.1, so I haven't bothered updating the info here since; that's why your info will be precious for updating it later on!

However, if following the exact same steps as me (even with diverging data and output due to the version diffs) no longer works, I'm not sure what's going wrong... Try looking at the IDE logs to figure out if it's erroring somewhere, or ping Peter from the earlier discussion here to see if they can help you!

If you figure it out, I'd be more than happy to update my instructions with your findings!

On my side, I'll probably upgrade when 2024.2 gets released (which shouldn't be far away), so I'll definitely take another look here when I do so!

@WarningImHack3r (Author)

@mathiasbn (@psurkov) Data updated with 2024.2 and the info you gave me for 2024.1.2. (We're still figuring out the last details; I'll continue to update accordingly.)

@mathiasbn

You still rock for working on this! But at least in 2024.2.1, JS/TS was bundled at installation, so now it's only a problem when updating; I'm not sure, though, whether the models are updated more often than the IDE itself.

@jahrom98

Thank you for your helpful notes, @WarningImHack3r.
I'm on version 2024.2.3, and it seems the UUIDs and models just changed. After downloading the local models from IntelliJ on a system with an internet connection, I got some new UUIDs which are different from the ones you listed.

@WarningImHack3r (Author)

I got version 2024.2.3 and […] i got some new UUID's which are different from what you said.

Yeah, that's likely; my last update was for 2024.2(.0). If you have new UUIDs, don't hesitate to give them to me so I can update the tables!

@enavarrocu

2024.3.2.2
141223a3-aff0-397e-a896-9e5518ca1f21 html
73982f58-babf-3eb7-baac-32ff61fa21ef css,javascript,html,ws

@marcel-goldammer commented Jul 9, 2025

This is the models.xml of my WebStorm 2025.1:

<models version="1">
  <model>
    <version>0.1.12-native-llama-bundle</version>
    <size>68853725</size>
    <languages>
      <language>html</language>
    </languages>
    <binary>flcc.model</binary>
    <bpe>flcc.bpe</bpe>
    <config>flcc.json</config>
    <native>
      <archive>full-line-inference.zip</archive>
      <version>2.4.164</version>
    </native>
    <changelog>- Init LLaMA model for HTML</changelog>
  </model>
  <model>
    <version>0.1.41-native-llama-bundle</version>
    <size>68852342</size>
    <languages>
      <language>css</language>
      <language>javascript</language>
      <language>html</language>
      <language>ws</language>
    </languages>
    <binary>flcc.model</binary>
    <bpe>flcc.bpe</bpe>
    <config>flcc.json</config>
    <native>
      <archive>full-line-inference.zip</archive>
      <version>2.4.164</version>
    </native>
    <changelog>- LLaMA model for all ws languages with cg and updated metadata headers</changelog>
  </model>
</models>

UUIDs:

  • ws: 6b923ccc-53b5-366f-8b16-580baab428b
  • html: d75057fe-7724-304f-9a0a-067d93f4911

But when I added these to my offline IDE, the checkboxes in the settings were not checked, only disabled. And when I click the Download link, my models in the full-line directory get removed again.

@WarningImHack3r any idea why this happens? Is this some new behavior in version 2025?

@WarningImHack3r (Author)

@marcel-goldammer hi, I'm unaware of the current version's new behaviors, as I'm not using the models on my offline IDE anymore.
I added a disclaimer about that at the top of my gist; sorry for not being able to help!
Can you update your message or create a new one with your solution if you happen to find one?

Note: my friend told me she had to download the models from an online IDE for 2025.1, as the old models weren't compatible with the new version; maybe that can help you!

@lemonde21

2025.2

FLI v2.4.164
ws : 6b923ccc-53b5-366f-8b16-580baab428b3
kotlin : 0905e9b0-0606-368a-b34d-122e77024c3a
html : d75057fe-7724-304f-9a0a-067d93f49118
java : f360ae9c-236b-3451-a2ff-a2545e1a3ba1

models.xml

<models version="1">
  <model>
    <version>0.1.87-native-llama-bundle</version>
    <size>68857815</size>
    <languages>
      <language>java</language>
    </languages>
    <binary>flcc.model</binary>
    <bpe>flcc.bpe</bpe>
    <config>flcc.json</config>
    <native>
      <archive>full-line-inference.zip</archive>
      <version>2.4.164</version>
    </native>
    <changelog>- Llama model for Java
    - filter_formatted for empty files
    - keep indents in dataset (v2 - formatting with merged empty lines into one)</changelog>
  </model>
  <model>
    <version>0.1.247-native-llama-bundle</version>
    <size>68858390</size>
    <languages>
      <language>kotlin</language>
    </languages>
    <binary>flcc.model</binary>
    <bpe>flcc.bpe</bpe>
    <config>flcc.json</config>
    <native>
      <archive>full-line-inference.zip</archive>
      <version>2.4.164</version>
    </native>
    <changelog>- Updated LLaMA model for Kotlin trained on split with CG (no finetune) and updated metadata headers</changelog>
  </model>
  <model>
    <version>0.1.12-native-llama-bundle</version>
    <size>68853725</size>
    <languages>
      <language>html</language>
    </languages>
    <binary>flcc.model</binary>
    <bpe>flcc.bpe</bpe>
    <config>flcc.json</config>
    <native>
      <archive>full-line-inference.zip</archive>
      <version>2.4.164</version>
    </native>
    <changelog>- Init LLaMA model for HTML</changelog>
  </model>
  <model>
    <version>0.1.41-native-llama-bundle</version>
    <size>68852342</size>
    <languages>
      <language>css</language>
      <language>javascript</language>
      <language>html</language>
      <language>ws</language>
    </languages>
    <binary>flcc.model</binary>
    <bpe>flcc.bpe</bpe>
    <config>flcc.json</config>
    <native>
      <archive>full-line-inference.zip</archive>
      <version>2.4.164</version>
    </native>
    <changelog>- LLaMA model for all ws languages with cg and updated metadata headers</changelog>
  </model>
</models>

@bestv5 commented Nov 24, 2025

Starting with the 2025 versions, this process is no longer necessary. IDEA no longer includes the HTML, JS, and CSS models by default; instead, simply download the corresponding version of WebStorm, extract it, then copy full-line-model-ws.zip and full-line-model-html.zip from its plugins/fullLine directory into IDEA's plugins/fullLine directory. This method should also work for version 2024.3.

@WarningImHack3r (Author)

@bestv5 it sounds a bit tedious to download a full IDE only to grab a bundled zip file from it. Aren't the zips available online, similarly to what we found out in the instructions above?

@bestv5 commented Nov 27, 2025

@bestv5 it sounds a bit tedious to download a full IDE only to grab a bundled zip file from it. Aren't the zips available online, similarly to what we found out in the instructions above?

@WarningImHack3r Actually, it's feasible. I compared the contents of local-model-ws-0.1.20-native-llama-bundle.jar and WebStorm/plugins/fullLine/ws-0.1.38-native-llama-bundle.zip (see Figure 1 and Figure 2) and found that it can be obtained through your method. The steps are as follows:

  1. Download local-model-[Name]-[Version]-native-llama-bundle.jar and extract it.
  2. Compress the extracted ws-0.1.20-native-llama-bundle directory into full-line-model-ws.zip.
  3. Place it in the WebStorm/plugins/fullLine directory.

There's no need to download full-line-inference.zip separately. Not sure if this method is simpler, though.

(Figure 1 and Figure 2: screenshots of the two archives' contents; images not reproduced.)
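
For reference, a rough script for the repackaging described above. It is untested, and keeping the bundle folder as the zip's top-level entry is an assumption based on the steps:

```python
# Repackage a model bundle .jar into the plugins/fullLine zip layout
# described above. Untested sketch; the ws/0.1.20 values are from this thread.
import urllib.request
import zipfile
from pathlib import Path

NAME, VERSION = "ws", "0.1.20"
bundle = f"{NAME}-{VERSION}-native-llama-bundle"
jar_url = ("https://download-cdn.jetbrains.com/resources/ml/full-line/models/"
           f"{NAME}/{VERSION}-native-llama-bundle/local-model-{bundle}.jar")

jar_path, _ = urllib.request.urlretrieve(jar_url)
out = Path(f"full-line-model-{NAME}.zip")  # goes into <IDE>/plugins/fullLine

# Copy every file under "<bundle>/" into the new zip, keeping the folder
# as the top-level entry (assumed from "compress the directory" above).
with zipfile.ZipFile(jar_path) as jar, zipfile.ZipFile(out, "w") as dst:
    for member in jar.namelist():
        if member.startswith(bundle + "/") and not member.endswith("/"):
            dst.writestr(member, jar.read(member))
print("wrote", out)
```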

@WarningImHack3r (Author)

@bestv5 good to know! Thanks for this info.
