Run LLMs locally

Use Ollama to run LLMs on your own machine: https://klu.ai/glossary/ollama

Install Ollama from https://ollama.com/download

Run the llama2 model (the first run downloads the model weights, which can take a while):

ollama run llama2
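Besides the interactive CLI, a running Ollama instance also serves a REST API on localhost port 11434 by default, so you can query the model from code. A minimal Python sketch using only the standard library (assuming Ollama is running locally and the llama2 model has been pulled):

```python
import json
import urllib.request

# Ollama's local REST API endpoint (default port is 11434).
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for the local Ollama API."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


def generate(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the response text."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Requires `ollama run llama2` (or `ollama pull llama2`) to have been
    # executed first so the model is available locally.
    print(generate("llama2", "Why is the sky blue?"))
```

Setting "stream": False returns the full response in one JSON object; omit it to receive a stream of partial responses instead.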