Supported Models
We support the following models through our models API wrapper (found in src/models). The names listed below are the values you can pass to --llm when running src/iris.py. You are free to instantiate models in your own way or to add to the existing library. Some models require your own API key or a license agreement on HuggingFace.
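As a minimal sketch (not the project's canonical invocation), the snippet below shows a model name from the lists being passed verbatim to --llm; the remaining arguments depend on your analysis setup and are left as a placeholder.

```python
import subprocess
import sys

# Any name from the lists below can be passed verbatim to --llm.
llm_name = "qwen2.5-coder-7b"

# Placeholder for the remaining arguments (query, project, run id, ...) that
# your run requires; --llm is the only flag documented in this section.
other_args = []

subprocess.run(
    [sys.executable, "src/iris.py", "--llm", llm_name, *other_args],
    check=True,
)
```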
List of Models
Codegen
- codegen-16b-multi
- codegen25-7b-instruct
- codegen25-7b-multi
Codellama
Standard Models
- codellama-70b-instruct
- codellama-34b
- codellama-34b-python
- codellama-34b-instruct
- codellama-13b-instruct
- codellama-7b-instruct
CodeT5p
- codet5p-16b-instruct
- codet5p-16b
- codet5p-6b
- codet5p-2b
DeepSeek
- deepseekcoder-33b
- deepseekcoder-7b
- deepseekcoder-v2-15b
Gemini
All Gemini models are supported, including those listed below; other model names can be found in the Gemini API documentation.
- gemini-1.5-pro
- gemini-1.5-flash
- gemini-pro
- gemini-pro-vision
- gemini-1.0-pro-vision
Gemma
- gemma-7b
- gemma-7b-it
- gemma-2b
- gemma-2b-it
- codegemma-7b-it
- gemma-2-27b
- gemma-2-9b
GPT
- gpt-4
- gpt-3.5
- gpt-4-1106
- gpt-4-0613
LLaMA
LLaMA-2
- llama-2-7b-chat
- llama-2-13b-chat
- llama-2-70b-chat
- llama-2-7b
- llama-2-13b
- llama-2-70b
LLaMA-3
- llama-3-8b
- llama-3.1-8b
- llama-3-70b
- llama-3.1-70b
- llama-3-70b-tai
Mistral
- mistral-7b-instruct
- mixtral-8x7b-instruct
- mixtral-8x7b
- mixtral-8x22b
- mistral-codestral-22b
Qwen
- qwen2.5-coder-7b
- qwen2.5-coder-1.5b
- qwen2.5-14b
- qwen2.5-32b
- qwen2.5-72b
StarCoder
- starcoder
- starcoder2-15b
WizardLM
WizardCoder
- wizardcoder-15b
- wizardcoder-34b-python
- wizardcoder-13b-python
WizardLM Base
- wizardlm-70b
- wizardlm-13b
- wizardlm-30b
Ollama
You need to install the ollama package manually; a quick sanity-check sketch follows the list below.
- qwen2.5-coder:latest
- qwen2.5:32b
- llama3.2:latest
- deepseek-r1:32b
- deepseek-r1:latest
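The tag strings above are passed to --llm as-is. As a hedged sanity check, independent of the wrapper in src/models, the sketch below uses the manually installed ollama package directly to confirm that a local Ollama server is reachable and that the chosen model tag has been pulled.

```python
import ollama

# Assumes the Ollama server is running locally and the tag has been pulled,
# e.g. with `ollama pull llama3.2`; any tag from the list above works here.
response = ollama.chat(
    model="llama3.2:latest",
    messages=[{"role": "user", "content": "Reply with the single word: ready"}],
)
print(response["message"]["content"])
```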