The default LLM is Llama 2, run locally via Ollama. You'll need to install the Ollama desktop app and run the two commands below: the first downloads and loads the model, and the second starts the Ollama server with `OLLAMA_ORIGINS` set so that cross-origin requests from this site are allowed, bound to port 11435 via `OLLAMA_HOST`:
```bash
$ ollama run severian/anima
$ OLLAMA_ORIGINS=https://anima-pdf-chat.vercel.app OLLAMA_HOST=127.0.0.1:11435 ollama serve
```
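Once `ollama serve` is listening on port 11435, the site talks to it over Ollama's REST API. Below is a minimal sketch in TypeScript of what such a request looks like, assuming the standard `/api/generate` endpoint; the `askAnima` helper and its prompt are illustrative, not the app's actual client code.

```typescript
// Minimal sketch: query the locally served model over Ollama's REST API.
// Assumes `ollama serve` is listening on 127.0.0.1:11435 as configured above.
async function askAnima(prompt: string): Promise<string> {
  const res = await fetch("http://127.0.0.1:11435/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "severian/anima", // the model pulled by `ollama run` above
      prompt,                  // illustrative prompt; the real app builds its own
      stream: false,           // return a single JSON object instead of a stream
    }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  return data.response; // /api/generate returns the completion in `response`
}
```

If requests from the site are rejected, double-check that `OLLAMA_ORIGINS` exactly matches the site's origin, since Ollama enforces CORS against that allowlist.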