ChattyUI
Run open-source LLMs locally in the browser using WebGPU
2024-06-07

Open-source, feature-rich Gemini/ChatGPT-like interface for running open-source models (Gemma, Mistral, Llama 3, etc.) locally in the browser using WebGPU. No server-side processing - your data never leaves your PC!
ChattyUI is an open-source, browser-based platform for running powerful open-source language models like Gemma, Mistral, and Llama 3 locally using WebGPU. All inference happens entirely within your browser, so your data never leaves your device - no server-side processing is required, giving you enhanced privacy and control. The platform features a user-friendly, Gemini/ChatGPT-like interface, making it accessible to a wide range of users. ChattyUI also supports models optimized for lower VRAM requirements, so it remains usable on devices with limited resources. For anyone seeking privacy and flexibility, ChattyUI brings modern AI capabilities directly to the browser.
Open Source
Artificial Intelligence
GitHub
Tech