Post 1990

What do you think of my LLM Chat app so far? Here are some of the features already included (and more are coming):

- Chat with AI models – local inference via Ollama
- Reasoning support – view the model's thinking process (DeepSeek-R1, Qwen-QwQ, etc.)
- Vision models – analyze images with llava, bakllava, moondream
- Image generation – local GGUF models with GPU acceleration (CUDA)
- Fullscreen images – click generated images to view them in fullscreen
- Image attachments – file picker or clipboard paste (Ctrl+V)
- DeepSearch – web search with tool use
- Inference stats – token counts, speed, duration (like Ollama's verbose mode)
- Regenerate – re-run any AI response
- Copy – one-click copy of AI responses

Reply: Right now it's pretty fast. Image generation is a bit slow, but that's my computer not being good enough.
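For anyone curious how the "local inference via Ollama" part typically works: the app's own client code isn't shown here, but a minimal sketch against Ollama's documented `/api/chat` HTTP endpoint (default port 11434) looks like this. The model name and prompt below are just placeholders, and the non-streaming response shape follows Ollama's API docs.

```python
import json
import urllib.request

# Ollama's default local endpoint for chat completions
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_payload(model, messages, stream=False):
    """Build the JSON body for Ollama's /api/chat endpoint."""
    return {"model": model, "messages": messages, "stream": stream}

def chat(model, prompt):
    """Send one user message to a locally running Ollama server
    and return the assistant's reply text (non-streaming)."""
    payload = build_chat_payload(
        model, [{"role": "user", "content": prompt}]
    )
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # With stream=False, the full reply arrives in message.content
    return body["message"]["content"]

# Usage (requires a running Ollama server with the model pulled):
#   reply = chat("deepseek-r1", "Why is the sky blue?")
```

Streaming (the default in Ollama) would instead return one JSON object per line, which is what lets a chat UI render tokens as they arrive.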