From 05d798342941e773b31df6ccf3d40de7b989bee9 Mon Sep 17 00:00:00 2001
From: Sweaterdog
Date: Sat, 1 Mar 2025 13:50:38 -0800
Subject: [PATCH] Update README.md

Updated the README to fix Ollama usage, changed the example model from
llama3 to a more capable model, qwen2.5, and noted for Ollama users that
any local model can be used as long as its name starts with `ollama/`.
---
 README.md | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 136540f..cbe1197 100644
--- a/README.md
+++ b/README.md
@@ -42,7 +42,7 @@ You can configure the agent's name, model, and prompts in their profile like `an
 | Google | `GEMINI_API_KEY` | `gemini-2.0-flash` | [docs](https://ai.google.dev/gemini-api/docs/models/gemini) |
 | Anthropic | `ANTHROPIC_API_KEY` | `claude-3-5-haiku-20241022` | [docs](https://docs.anthropic.com/claude/docs/models-overview) |
 | Replicate | `REPLICATE_API_KEY` | `replicate/meta/meta-llama-3-70b-instruct` | [docs](https://replicate.com/collections/language-models) |
-| Ollama (local) | n/a | `llama3` | [docs](https://ollama.com/library) |
+| Ollama (local) | n/a | `ollama/qwen2.5` | [docs](https://ollama.com/library) |
 | Groq | `GROQCLOUD_API_KEY` | `groq/mixtral-8x7b-32768` | [docs](https://console.groq.com/docs/models) |
 | Hugging Face | `HUGGINGFACE_API_KEY` | `huggingface/deepseek-ai/DeepSeek-R1-Distill-Qwen-32B` | [docs](https://huggingface.co/models) |
 | Novita AI | `NOVITA_API_KEY` | `gryphe/mythomax-l2-13b` | [docs](https://novita.ai/model-api/product/llm-api?utm_source=github_mindcraft&utm_medium=github_readme&utm_campaign=link) |
@@ -55,7 +55,8 @@ You can configure the agent's name, model, and prompts in their profile like `an
 | Hyperbolic | `HYPERBOLIC_API_KEY` | `hb:deepseek-ai/DeepSeek-V3` | [docs](https://docs.hyperbolic.xyz/docs/getting-started) |
 
 If you use Ollama, to install the models used by default (generation and embedding), execute the following terminal command:
-`ollama pull llama3 && ollama pull nomic-embed-text`
+`ollama pull qwen2.5 && ollama pull nomic-embed-text`
+Note that you can use any local model, such as `qwen2.5`, `llama3`, or fine-tuned models like `sweaterdog/andy-3.6`. No matter which model you choose, it is recommended to put `ollama/` before the model name so Mindcraft can handle it properly.
 
 ### Online Servers
 To connect to online servers your bot will need an official Microsoft/Minecraft account. You can use your own personal one, but will need another account if you want to connect too and play with it. To connect, change these lines in `settings.js`:
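
For reference, the `ollama/` prefix this patch documents would also appear in an agent profile. Below is a minimal sketch of such a profile, assuming a JSON file like the `andy.json` mentioned in the README with `name` and `model` fields; the actual schema may include additional keys (prompts, embedding settings, etc.), so check the repository's example profiles.

```json
{
  "name": "andy",
  "model": "ollama/qwen2.5"
}
```

Run `ollama pull qwen2.5 && ollama pull nomic-embed-text` beforehand so the generation and embedding models are available locally before starting the bot.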