AI and VoIP Blog

VOIP | AI | Cloud | Kamailio | Open Source


Maximize Productivity with iTerm2's AI Features Using Ollama, 100% Free

Last updated: Feb 2025

iTerm2 command generator AI in action

iTerm2, the popular macOS terminal emulator, has introduced a powerful AI plugin that can enhance your command-line experience. By integrating with local AI models using Ollama, you can get intelligent command suggestions, code completions, and more—all without sending your data to external servers or leaving your terminal.

In this guide, we’ll walk you through the steps to enable and use the iTerm2 AI plugin locally with Ollama’s qwen2.5:7b model.

iTerm2 introduced two AI features:
1. Command Generator: Ask the AI to generate a command for a specific task, as shown in the GIF above.
2. Codecierge: Ask the AI about a task and it will respond with commands and explanations. After you run each command, the AI analyzes its output and suggests the remaining steps.


First, let's configure the backend to enable AI integration using the steps below.

Prerequisites

  • iTerm2 installed on your macOS. Download here.
  • Ollama installed. Follow the installation instructions on the Ollama website.
  • qwen2.5:7b AI model, or any other model of your choice, downloaded via Ollama. Download here.
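With Ollama installed, the model can be pulled straight from the terminal. A minimal sketch (the guard around the `ollama` binary is just a safety check, not part of the normal workflow):

```shell
# Pull the model used in this guide (~4.7 GB); swap in any other model name.
if command -v ollama >/dev/null 2>&1; then
  ollama pull qwen2.5:7b
else
  echo "Ollama is not installed - see the Ollama website for instructions"
fi
```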

Step 1: Open iTerm2 Settings

Launch iTerm2. Press Command + , or, from the menu bar, navigate to:

iTerm2 > Settings

Step 2: Access the AI Tab

In the settings window, click on the AI tab located in the sidebar.

Figure 1: Accessing the AI tab in iTerm2 Settings.

Step 3: Install the AI Plugin

Click the Install the plugin button. If you have already installed the plugin, the General tab will look like the screenshot below.

iTerm2 AI configuration

Step 4: Download and Run the Plugin

Installing the AI plugin.

A download will start for the AI plugin. Once downloaded:

  1. Unzip the plugin file.
  2. Run the plugin installer.

Running the installer will launch a new application required for the AI features.

Running the plugin installer.

Step 5: Move the Plugin to Background

After installation, the plugin application may remain open. You can click the close button to move it to the background; the plugin will continue to run and enable AI features in iTerm2.

Step 6: Verify Available Models in Ollama

To ensure that the LLM is available, run the following command in your terminal:

ollama list

You should see an output similar to:

> ollama list
NAME              ID              SIZE      MODIFIED
smollm2:latest    cef4a1e09247    1.8 GB    4 days ago
qwen2.5:7b       845dbda0ea48    4.7 GB    10 days ago
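The same check also works over Ollama's REST API, which is the server iTerm2 will talk to. A quick sketch, assuming Ollama is running on its default port:

```shell
# List installed models via the REST API (model names appear under "models").
# Falls back to a hint if the server is not running.
curl -s http://127.0.0.1:11434/api/tags || echo "Ollama is not running - start it with: ollama serve"
```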

Step 7: Configure AI Settings

iTerm2 AI configuration

Return to the AI tab in iTerm2 settings to configure the AI plugin. Click on the “Configure AI Models Manually” button and enter the following configurations:

  • API: Llama
  • API URL: http://127.0.0.1:11434/api/chat
  • Model: qwen2.5:7b, or any other model of your choice

Explanation of Settings

  • API URL: This is the local endpoint where Ollama serves the AI model.
  • Model: Specifies the AI model to use, in this case qwen2.5:7b. You can also go for a smaller model like smollm2:latest. Download here
  • Tokens: Sets the maximum number of tokens (words or word pieces) the AI model can generate.
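Before wiring the endpoint into iTerm2, you can confirm it responds by sending it a minimal chat request by hand. A sketch, assuming Ollama is running and qwen2.5:7b has been pulled:

```shell
# POST a one-turn chat to the same endpoint configured above.
# "stream": false returns a single JSON object instead of a token stream.
curl -s http://127.0.0.1:11434/api/chat -d '{
  "model": "qwen2.5:7b",
  "messages": [{"role": "user", "content": "print hello in bash"}],
  "stream": false
}' || echo "Ollama is not reachable on 127.0.0.1:11434"
```

If this returns a JSON response with a "message" field, iTerm2 will be able to reach the model with the same settings.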

In the AI prompt tab, add the following prompt; see the image below for reference:

Example AI prompt

Interpret this prompt:
\(ai.prompt)
Generate terminal commands for macOS without explanations. Output only the command that can be directly executed in the terminal, without any descriptions or explanations. It is paramount that you ONLY return the command without enclosing it in a Markdown code block.

In the Features tab select the configuration you want to enable. With the settings configured, you’re ready to use the AI features.

Command Generator

Press Command + Y to launch the AI command generator UI within iTerm2. In the AI chat interface, you can now ask for command suggestions in plain English (or any supported language).
For example:

  • “find top 10 files by size”
  • “create a new Git branch and switch to it.”

Press Shift + Return to submit the request. The AI will generate the appropriate command. Update the command if needed, then press Shift + Return again to run it.
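For the first prompt above, the generated command typically looks something like this (the exact answer varies by model; this is just one plausible suggestion):

```shell
# "find top 10 files by size" in the current directory:
# du lists sizes, sort orders them largest first, head keeps the top 10.
du -ah . 2>/dev/null | sort -rh | head -n 10
```

Because the command is editable before you run it, you can adjust the path or the count first.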

Codecierge

To access this feature, click Toolbelt > Show Toolbelt and select the Codecierge option. You should now see a chat window on the right, where you can chat with the LLM about any task and get step-by-step commands to complete it.

After each command you run, the AI will analyze its output and update the chat window with the remaining commands.

AI Chat

To access the AI Chat feature, click Session > Open AI Chat… or press Shift + Control + Command + Y. You can also click the “brain” icon in the terminal, as shown in the image below.

iTerm2 AI chat feature

You can also chat about your terminal output, for example to make sense of what a command returned.

Conclusion

By integrating iTerm2’s AI plugin with Ollama’s local AI models, you can enhance your productivity while maintaining privacy. The AI features can help you construct complex commands, understand unfamiliar ones, and streamline your workflow.

Feel free to explore different models available in Ollama and adjust the settings to suit your needs.

Troubleshooting

Connection Issues: Check the Ollama troubleshooting steps here.

Date: November 2024

Akash Gupta
Senior VoIP Engineer and AI Enthusiast


