docs: add reference to docs.ollama.com (#12800)

docs/integrations/jetbrains.mdx (new file, 47 lines)

@@ -0,0 +1,47 @@
---
title: JetBrains
---

<Note>This example uses **IntelliJ**; the same steps apply to other JetBrains IDEs (e.g., PyCharm).</Note>

## Install

Install [IntelliJ](https://www.jetbrains.com/idea/).

## Usage with Ollama

<Note>
To use **Ollama**, you will need a [JetBrains AI Subscription](https://www.jetbrains.com/ai-ides/buy/?section=personal&billing=yearly).
</Note>

1. In IntelliJ, click the **chat icon** located in the right sidebar

<div style={{ display: 'flex', justifyContent: 'center' }}>
  <img
    src="/images/intellij-chat-sidebar.png"
    alt="IntelliJ sidebar chat"
    width="50%"
  />
</div>

2. Select the **current model** in the sidebar, then click **Set up Local Models**

<div style={{ display: 'flex', justifyContent: 'center' }}>
  <img
    src="/images/intellij-current-model.png"
    alt="IntelliJ model selector in the bottom right corner"
    width="50%"
  />
</div>

3. Under **Third Party AI Providers**, choose **Ollama**
4. Confirm the **Host URL** is `http://localhost:11434`, then click **OK** (a quick way to check that the server is reachable at this URL is sketched below)
5. Once connected, select a model under **Local models by Ollama**

<div style={{ display: 'flex', justifyContent: 'center' }}>
  <img
    src="/images/intellij-local-models.png"
    alt="Local models by Ollama in IntelliJ"
    width="50%"
  />
</div>
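
If the IDE cannot connect, it can help to confirm that Ollama is actually listening on the Host URL from step 4. The snippet below is a minimal sketch (Python standard library only) that hits the server root and Ollama's `/api/tags` endpoint to list locally pulled models; the host and timeout values are assumptions for a default local install.

```python
# Minimal connectivity check for the Host URL used in step 4.
# Assumes Ollama is running locally on its default port (11434);
# /api/tags is Ollama's endpoint for listing locally available models.
import json
from urllib.error import URLError
from urllib.request import urlopen

OLLAMA_HOST = "http://localhost:11434"

try:
    # The root endpoint responds with a short status message when the server is up.
    with urlopen(OLLAMA_HOST, timeout=5) as resp:
        print(resp.read().decode())  # e.g. "Ollama is running"

    # List models already pulled locally; these are what IntelliJ offers
    # under "Local models by Ollama".
    with urlopen(f"{OLLAMA_HOST}/api/tags", timeout=5) as resp:
        models = json.load(resp).get("models", [])

    if models:
        for model in models:
            print(model["name"])
    else:
        print("No local models found; pull one first with the Ollama CLI.")
except URLError as err:
    print(f"Could not reach Ollama at {OLLAMA_HOST}: {err}")
```

If no models are listed, pull one with the Ollama CLI (`ollama pull <model>`) before selecting it in IntelliJ.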