Setting Up the Agent Environment
Set up your LLM provider to run the pattern examples. We recommend starting with Ollama - it's free, runs locally, and requires no API keys.
Prerequisites
- .NET 8.0 SDK or later (download from dotnet.microsoft.com)
- Git - to clone the patterns repository
- An LLM provider - see options below
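Before picking a provider, it can help to confirm the toolchain is in place (a quick check, assuming you work from a terminal with both tools on your PATH):

```bash
# Both commands should print a version; dotnet must report 8.0 or later
dotnet --version
git --version
```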
Set up your LLM provider
Choose a provider to get started. Setup instructions for each option follow below.
- Ollama (recommended) - run open-source LLMs locally. No API keys, no costs, works offline.
- OpenAI - use OpenAI's API directly with models like GPT-4o.
- Azure OpenAI - enterprise deployments with Azure compliance and security.
- OpenRouter - access 100+ models from OpenAI, Anthropic, Google, and more through one API.
Option 1: Ollama (Recommended)
Ollama lets you run open-source LLMs locally on your machine. No API keys, no costs, works offline.
Step 1: Install Ollama
Download and install from ollama.com
Step 2: Pull a Model
```bash
ollama pull llama3.2
```
For smaller machines, try phi3 or llama3.2:1b.
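If you want to confirm the model actually responds before running any pattern, you can call Ollama's local REST API directly (a minimal check, assuming curl is available and Ollama is listening on its default port 11434):

```bash
# Request a single non-streaming completion from the model you just pulled
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3.2", "prompt": "Say hello in one sentence.", "stream": false}'
```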
Step 3: Run a Pattern
```bash
# No environment variables needed - Ollama is the default!
cd patterns/01-prompt-chaining/src/PromptChaining
dotnet run
```
Environment Variables (Optional)
```bash
# These are the defaults, only set if you need different values
export OLLAMA_ENDPOINT=http://localhost:11434
export OLLAMA_MODEL=llama3.2
```
Option 2: OpenAI
Use OpenAI's API directly with models like GPT-4o, on pay-as-you-go pricing.
Step 1: Get an API Key
Create an account at platform.openai.com and generate an API key.
Step 2: Set Environment Variables
```powershell
# PowerShell
$env:LLM_PROVIDER = "openai"
$env:OPENAI_API_KEY = "sk-..."
$env:OPENAI_MODEL = "gpt-4o-mini"  # optional, this is the default
```
```bash
# Bash
export LLM_PROVIDER=openai
export OPENAI_API_KEY=sk-...
export OPENAI_MODEL=gpt-4o-mini  # optional
```
Step 3: Run a Pattern
```bash
cd patterns/01-prompt-chaining/src/PromptChaining
dotnet run
```
Option 3: Azure OpenAI
For enterprise deployments with Azure compliance and security features.
Prerequisites
- Azure subscription
- Azure OpenAI resource with a deployed model
- Azure CLI installed and logged in (az login)
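If you are unsure whether your CLI session is still signed in, a quick check (assuming the Azure CLI is on your PATH):

```bash
# Prints the currently selected subscription; fails if you need to run az login again
az account show
```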
Set Environment Variables
```powershell
# PowerShell
$env:LLM_PROVIDER = "azure"
$env:AZURE_OPENAI_ENDPOINT = "https://your-resource.openai.azure.com/"
$env:AZURE_OPENAI_DEPLOYMENT_NAME = "gpt-4o-mini"
```
```bash
# Bash
export LLM_PROVIDER=azure
export AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
export AZURE_OPENAI_DEPLOYMENT_NAME=gpt-4o-mini
```
Run a Pattern
```bash
cd patterns/01-prompt-chaining/src/PromptChaining
dotnet run
```
Option 4: OpenRouter
OpenRouter provides pay-as-you-go access to 100+ models from multiple providers (OpenAI, Anthropic, Google, Meta, Mistral, and more) through a single API. Great for trying different models without managing multiple API keys.
Step 1: Get an API Key
Create an account at openrouter.ai and generate an API key from the Keys page.
Step 2: Set Environment Variables
```powershell
# PowerShell
$env:LLM_PROVIDER = "openrouter"
$env:OPENROUTER_API_KEY = "sk-or-..."
$env:OPENROUTER_MODEL = "openai/gpt-4o-mini"  # optional, this is the default
```
```bash
# Bash
export LLM_PROVIDER=openrouter
export OPENROUTER_API_KEY=sk-or-...
export OPENROUTER_MODEL=openai/gpt-4o-mini  # optional
```
Popular Models
- openai/gpt-4o-mini - fast and affordable GPT-4o
- anthropic/claude-3.5-sonnet - Anthropic's Claude 3.5
- google/gemini-pro-1.5 - Google's Gemini Pro
- meta-llama/llama-3.1-70b-instruct - Meta's Llama 3.1
See openrouter.ai/models for the full list.
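Before running a pattern, you can sanity-check the key and model name against OpenRouter's OpenAI-compatible chat completions endpoint (a minimal request, assuming curl and the environment variables set above):

```bash
# A JSON completion in the response means the key and model name work end to end
curl https://openrouter.ai/api/v1/chat/completions \
  -H "Authorization: Bearer $OPENROUTER_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "openai/gpt-4o-mini", "messages": [{"role": "user", "content": "Say hello"}]}'
```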
Step 3: Run a Pattern
```bash
cd patterns/01-prompt-chaining/src/PromptChaining
dotnet run
```
Troubleshooting
Ollama: "connection refused"
Make sure Ollama is running. On Windows/Mac, check the system tray. On Linux, run ollama serve.
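To check whether the server is reachable at all (assuming the default endpoint), you can list the locally installed models:

```bash
# A connection error here means the Ollama server itself isn't running
curl http://localhost:11434/api/tags
```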
Ollama: Model not found
Pull the model first: ollama pull llama3.2
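To see which models are already installed on this machine:

```bash
# Lists every model that has been pulled locally
ollama list
```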
OpenAI: 401 Unauthorized
Check that your API key is correct and has available credits.
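You can test the key on its own, outside the pattern code (assuming curl and the OPENAI_API_KEY variable set earlier):

```bash
# Lists the models available to your key; a 401 here confirms the key itself is the problem
curl https://api.openai.com/v1/models -H "Authorization: Bearer $OPENAI_API_KEY"
```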
Azure: Authentication failed
Run az login to authenticate with Azure CLI.
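If az login succeeds but the pattern still fails, you can check that the CLI can obtain a token for the Azure OpenAI service (assuming your account has access to the resource):

```bash
# Requests an access token for the Cognitive Services scope used by Azure OpenAI
az account get-access-token --resource https://cognitiveservices.azure.com
```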
OpenRouter: 401 Unauthorized
Check that your API key is correct. Keys start with sk-or-.
OpenRouter: Model not found
Check the model name at openrouter.ai/models. Models use the provider/model-name format.
Next Steps
Now that your environment is set up, try the first pattern:
Start with Prompt Chaining →