Ask Anything Ollama • EC2
Ask the model anything you want. Keep prompts short for faster replies on CPU. Your requests are sent to Ollama through a local proxy on the same origin, so there's no CORS pain.
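
Under the hood this is a plain same-origin fetch; a minimal sketch, assuming the proxy forwards the standard Ollama /api/generate endpoint (the model name is just an example):

    // Sketch: POST a prompt to the local proxy, which forwards it to
    // Ollama on 127.0.0.1:11434. Same-origin request, so no CORS.
    async function ask(prompt: string): Promise<string> {
      const res = await fetch("/api/generate", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          model: "tinyllama", // any model you've pulled on the instance
          prompt,
          stream: false,      // one JSON reply instead of a token stream
        }),
      });
      if (!res.ok) throw new Error(`Proxy returned ${res.status}`);
      const data = await res.json();
      return data.response;   // Ollama puts the completion text here
    }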

Settings

Tip: Open an SSH tunnel from your machine so the Ollama API is reachable locally at 127.0.0.1:11434.
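
A typical tunnel command, assuming an Ubuntu EC2 host (swap in your own user and hostname):

    ssh -N -L 11434:127.0.0.1:11434 ubuntu@<your-ec2-host>

-L forwards local port 11434 to the Ollama port on the instance; -N keeps the session open without starting a remote shell.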
Theme: “Ask the model anything you want”
If the model feels slow, switch to a smaller model like tinyllama or lower the max-tokens limit.
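
If the app's max-tokens setting is passed through to the API, it corresponds to Ollama's num_predict option; a sketch of a capped request body (values are illustrative):

    // Sketch: cap reply length via Ollama's num_predict option.
    const body = JSON.stringify({
      model: "tinyllama",            // smaller model = faster on CPU
      prompt: "Explain SSH tunneling in one sentence.",
      stream: false,
      options: { num_predict: 128 }, // roughly "max tokens" for the reply
    });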
Enter = send • Shift+Enter = newline • Click “New chat” to reset context
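
A sketch of how that key handling is typically wired up (the #prompt id and send() helper are assumptions, not this page's actual code):

    // Sketch: Enter sends, Shift+Enter falls through to insert a newline.
    declare function send(prompt: string): void; // assumed to exist on the page
    const input = document.querySelector<HTMLTextAreaElement>("#prompt")!;
    input.addEventListener("keydown", (e: KeyboardEvent) => {
      if (e.key === "Enter" && !e.shiftKey) {
        e.preventDefault(); // keep the textarea from adding a newline
        send(input.value);  // hand the prompt off to the sender
      }
    });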