Last updated: Feb 2025

iTerm2, the popular macOS terminal emulator, has introduced a powerful AI plugin that can enhance your command-line experience. By integrating with local AI models using Ollama, you can get intelligent command suggestions, code completions, and more—all without sending your data to external servers or leaving your terminal.
In this guide, we’ll walk you through the steps to enable and use the iTerm2 AI plugin locally with Ollama’s qwen2.5:7b model.
iTerm2 introduced two AI features:
1. Command Generator: ask the AI to generate a command for a specific task, as shown in the GIF above.
2. Codecierge: ask the AI about a task and it will suggest commands with explanations. Once you run a command, the AI analyses the result and suggests the remaining commands.
First, let's configure the backend to enable AI integration using the steps below.
Prerequisites
- iTerm2 installed on your Mac. Download here.
- Ollama installed. Follow the installation instructions on the Ollama website.
- The qwen2.5:7b model (or any other model of your choice), downloaded via Ollama. Download here.
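If you haven't downloaded the model yet, it can be pulled from the command line. This is a sketch: it assumes the ollama CLI is already on your PATH and falls back to a hint if it isn't.

```shell
# Pull the model used in this guide; print a hint if ollama isn't installed.
if command -v ollama >/dev/null 2>&1; then
  ollama pull qwen2.5:7b
else
  echo "install ollama first: https://ollama.com"
fi
```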
Step 1: Open iTerm2 Settings
Launch iTerm2. Press Command + , or, from the menu bar, navigate to:
iTerm2 > Settings
Step 2: Access the AI Tab
In the settings window, click on the AI tab located in the sidebar.

Step 3: Install the AI Plugin
Click the Install the plugin button. If you have already installed the plugin, the General tab will look like the screenshot below.

Step 4: Download and Run the Plugin

A download will start for the AI plugin. Once downloaded:
- Unzip the plugin file.
- Run the plugin installer.
Running the installer will launch a new application required for the AI features.

Step 5: Move the Plugin to Background
After installation, the plugin application may remain open. Click the close button to move it to the background; the plugin will continue to run and enable AI features in iTerm2.
Step 6: Verify Available Models in Ollama
To ensure that the LLM is available, run the following command in your terminal:
ollama list
You should see an output similar to:
> ollama list
NAME              ID              SIZE      MODIFIED
smollm2:latest    cef4a1e09247    1.8 GB    4 days ago
qwen2.5:7b        845dbda0ea48    4.7 GB    10 days ago
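If you script your setup, the same check can be automated. The sketch below parses the `ollama list` output in the same way, using the model name from this guide:

```shell
# Verify that the model from this guide appears in `ollama list`.
MODEL="qwen2.5:7b"
if ! command -v ollama >/dev/null 2>&1; then
  echo "ollama not installed"
elif ollama list | awk 'NR>1 {print $1}' | grep -qx "$MODEL"; then
  echo "model present"
else
  echo "model missing; run: ollama pull $MODEL"
fi
```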
Step 7: Configure AI Settings

Return to the AI tab in iTerm2 settings to configure the AI plugin. Click the “Configure AI Models Manually” button and enter the following configuration:
- API: Llama
- API URL: http://127.0.0.1:11434/api/chat
- Model: qwen2.5:7b (or any other model of your choice)
Explanation of Settings
- API URL: the local endpoint where Ollama serves the AI model.
- Model: specifies the AI model to use, in this case qwen2.5:7b. You can go for a smaller model like smollm2:latest. Download here.
- Tokens: sets the maximum number of tokens (words or word pieces) the AI model can generate.
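To sanity-check the endpoint outside iTerm2, you can hand-build a request in Ollama's /api/chat format. This is a sketch: the exact payload iTerm2 sends may differ, but the shape below matches Ollama's chat API.

```shell
# Build a minimal chat request and pretty-print it to confirm it is valid JSON.
# With Ollama running, you could send it with:
#   curl http://127.0.0.1:11434/api/chat -d @payload.json
cat > payload.json <<'EOF'
{
  "model": "qwen2.5:7b",
  "messages": [
    {"role": "user", "content": "find top 10 files by size"}
  ],
  "stream": false
}
EOF
python3 -m json.tool payload.json
```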
In the AI Prompt tab, add the following prompt (see the image below for reference):

Interpret this prompt:
\(ai.prompt)
Generate terminal commands for macOS without explanations. Output only the command, that can be directly executed in the terminal, without any descriptions or explanations. It is Paramount that you ONLY return the command without enclosing it in a Markdown code block.
In the Features tab, select the features you want to enable. With the settings configured, you’re ready to use the AI features.
Command Generator
Press Command + Y to launch the AI Command Generator UI within iTerm2. In the AI chat interface, you can ask for command suggestions in plain English (or any supported language).
For example:
- “find top 10 files by size”
- “create a new Git branch and switch to it.”
Press Shift + Return to submit the request. The AI will generate the appropriate command. Edit the command if needed, then press Shift + Return again to run it.
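For the first example above, the generated command might look something like this (one plausible answer; the model's exact output will vary):

```shell
# List the 10 largest files under the current directory, biggest first.
find . -type f -exec du -k {} + | sort -rn | head -10
```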
Codecierge
To access this feature, click Toolbelt > Show Toolbelt, then select the Codecierge option. A chat window appears on the right. You can chat with the LLM about any task, and it will give you the steps to complete it.

After each command you run, the AI will analyse the output and update the chat window with the remaining commands.
AI Chat
To access the AI Chat feature, click Session > Open AI Chat… or press Shift + Control + Command + Y. You can also click the “brain” icon in the terminal, as shown in the image below.

You can also chat about the terminal output, for example to understand what a command's output means.
Conclusion
By integrating iTerm2’s AI plugin with Ollama’s local AI models, you can enhance your productivity while maintaining privacy. The AI features can help you construct complex commands, understand unfamiliar ones, and streamline your workflow.
Feel free to explore different models available in Ollama and adjust the settings to suit your needs.
Troubleshooting
Connection issues: check the Ollama troubleshooting steps here.
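A quick way to rule out connection problems is to probe the server directly. This is a sketch: Ollama's root endpoint answers when the server is up, and curl exits non-zero when nothing is listening on the port.

```shell
# Probe the local Ollama server on its default port.
if curl -s --max-time 2 http://127.0.0.1:11434/ >/dev/null; then
  echo "ollama server reachable"
else
  echo "ollama server not reachable; start it with: ollama serve"
fi
```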
Additional Resources
Date: November 2024
Akash Gupta
Senior VoIP Engineer and AI Enthusiast

AI and VoIP Blog
Thank you for visiting the blog. Hit the subscribe button to receive the next post right in your inbox. If you find this article helpful, don't forget to share your feedback in the comments and hit the like button. This helps me know which topics resonate with you, allowing me to create more content that keeps you informed.
Thank you for reading, and stay tuned for more insights and guides!
