<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>AI Providers - DearDiary</title>
<link rel="stylesheet" href="../css/styles.css">
<link rel="stylesheet" href="../css/docs.css">
</head>
<body>
<nav class="navbar">
<div class="nav-container">
<a href="../" class="logo">
<svg width="32" height="32" viewBox="0 0 100 100"><defs><linearGradient id="g" x1="0%" y1="0%" x2="100%" y2="100%"><stop offset="0%" style="stop-color:#6d28d9"/><stop offset="100%" style="stop-color:#4c1d95"/></linearGradient></defs><rect width="100" height="100" rx="20" fill="url(#g)"/><path d="M25 25 L75 25 L75 80 L25 80 Z" fill="none" stroke="white" stroke-width="3"/></svg>
DearDiary
</a>
<div class="nav-links">
<a href="../">Home</a>
<a href="../docs/">Docs</a>
<a href="../blog/">Blog</a>
<a href="${GIT_URL}" target="_blank">Git Repository</a>
</div>
</div>
</nav>
<div class="docs-layout">
<aside class="docs-sidebar">
<div class="sidebar-section">
<h3>Getting Started</h3>
<a href="installation.html">Installation</a>
<a href="quick-start.html">Quick Start</a>
</div>
<div class="sidebar-section">
<h3>Configuration</h3>
<a href="configuration.html">Environment</a>
<a href="api.html">API Reference</a>
</div>
<div class="sidebar-section">
<h3>Features</h3>
<a href="events.html">Events</a>
<a href="ai-providers.html" class="active">AI Providers</a>
<a href="export-import.html">Export & Import</a>
</div>
</aside>
<main class="docs-content">
<h1>AI Providers</h1>
<p>DearDiary supports multiple AI providers for diary generation. Choose the one that best fits your needs.</p>
<h2>Supported Providers</h2>
<table>
<tr>
<th>Provider</th>
<th>Default Model</th>
<th>Type</th>
</tr>
<tr>
<td><strong>Groq</strong> (Recommended)</td>
<td>llama-3.3-70b-versatile</td>
<td>Cloud</td>
</tr>
<tr>
<td>OpenAI</td>
<td>gpt-4o</td>
<td>Cloud</td>
</tr>
<tr>
<td>Anthropic</td>
<td>claude-3-5-sonnet-latest</td>
<td>Cloud</td>
</tr>
<tr>
<td>Ollama</td>
<td>varies</td>
<td>Local</td>
</tr>
<tr>
<td>LM Studio</td>
<td>varies</td>
<td>Local</td>
</tr>
</table>
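<p>The table above can be expressed as a small configuration map. This is an illustrative sketch only; the keys and structure are assumptions, not DearDiary's actual settings schema:</p>

```python
# Illustrative map of the providers in the table above.
# Keys and structure are assumptions for demonstration only.
PROVIDERS = {
    "groq":      {"default_model": "llama-3.3-70b-versatile", "type": "cloud"},
    "openai":    {"default_model": "gpt-4o",                   "type": "cloud"},
    "anthropic": {"default_model": "claude-3-5-sonnet-latest", "type": "cloud"},
    "ollama":    {"default_model": None,                       "type": "local"},  # varies
    "lmstudio":  {"default_model": None,                       "type": "local"},  # varies
}

def default_model(provider: str):
    """Return the default model for a provider, or None if it varies."""
    return PROVIDERS[provider]["default_model"]
```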
<h2>Groq (Recommended)</h2>
<p>Groq offers fast inference and has a free tier available.</p>
<ol>
<li>Get an API key from <a href="https://console.groq.com" target="_blank">console.groq.com</a></li>
<li>Enter the API key in Settings</li>
<li>Select Groq as your provider</li>
<li>Test the connection and save</li>
</ol>
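<p>Once the key is saved, requests go to Groq's OpenAI-compatible chat completions endpoint. The sketch below builds (but does not send) such a request; the function name and prompt are illustrative, and DearDiary handles this internally:</p>

```python
import json
import urllib.request

def groq_chat_request(api_key: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) a chat completion request for Groq's
    OpenAI-compatible API. Illustrative sketch only."""
    body = json.dumps({
        "model": "llama-3.3-70b-versatile",
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        "https://api.groq.com/openai/v1/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = groq_chat_request("gsk_example", "Summarize my day.")
```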
<h2>OpenAI</h2>
<ol>
<li>Get an API key from <a href="https://platform.openai.com" target="_blank">platform.openai.com</a></li>
<li>Optionally select a specific model:
<ul>
<li><code>gpt-4o</code> - Most capable</li>
<li><code>gpt-4o-mini</code> - Faster, cheaper</li>
</ul>
</li>
</ol>
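<p>The model choice above is just a setting; one way to think about the trade-off, as a sketch (the helper name and flag are hypothetical, not part of DearDiary):</p>

```python
def pick_openai_model(prefer_cheap: bool = False) -> str:
    """Choose between the two OpenAI models listed above.
    Hypothetical helper for illustration: gpt-4o is the more capable
    default, gpt-4o-mini trades capability for speed and cost."""
    return "gpt-4o-mini" if prefer_cheap else "gpt-4o"
```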
<h2>Anthropic</h2>
<ol>
<li>Get an API key from <a href="https://console.anthropic.com" target="_blank">console.anthropic.com</a></li>
<li>Select Claude model:
<ul>
<li><code>claude-3-5-sonnet-latest</code> - Recommended</li>
<li><code>claude-3-opus-latest</code> - Most capable</li>
</ul>
</li>
</ol>
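<p>Anthropic's Messages API differs from the OpenAI-style providers: authentication uses an <code>x-api-key</code> header rather than a bearer token, a version header is required, and the request body must include <code>max_tokens</code>. A sketch that builds (but does not send) such a request, for illustration only:</p>

```python
import json
import urllib.request

def anthropic_messages_request(api_key: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) a request for Anthropic's Messages API.
    Note the auth header scheme differs from OpenAI-style providers.
    Illustrative sketch only."""
    body = json.dumps({
        "model": "claude-3-5-sonnet-latest",
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        "https://api.anthropic.com/v1/messages",
        data=body,
        headers={
            "x-api-key": api_key,            # not "Authorization: Bearer ..."
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
        method="POST",
    )

req = anthropic_messages_request("sk-ant-example", "Summarize my day.")
```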
<h2>Local Models (Ollama)</h2>
<p>Run models on your own machine for complete privacy.</p>
<ol>
<li>Install <a href="https://ollama.ai" target="_blank">Ollama</a></li>
<li>Pull a model: <code>ollama pull llama3.2</code></li>
<li>Set base URL: <code>http://localhost:11434/v1</code></li>
<li>Enter model name (e.g., <code>llama3.2</code>)</li>
</ol>
<h2>Local Models (LM Studio)</h2>
<ol>
<li>Download <a href="https://lmstudio.ai" target="_blank">LM Studio</a></li>
<li>Download a model</li>
<li>Start the local server (click "Start Server")</li>
<li>Set base URL: <code>http://localhost:1234/v1</code></li>
</ol>
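<p>Both local providers expose the same OpenAI-compatible surface, so only the base URL differs (port 11434 for Ollama, 1234 for LM Studio). A sketch of deriving the model-listing endpoint from either base URL (the helper name is hypothetical):</p>

```python
def models_endpoint(base_url: str) -> str:
    """Join an OpenAI-compatible base URL with the model-listing path.
    Works for Ollama and LM Studio alike. Illustrative sketch only."""
    return base_url.rstrip("/") + "/models"
```

<p>Requesting this endpoint is a quick way to confirm the local server is reachable and to see which model names it will accept.</p>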
</main>
</div>
</body>
</html>