Camel Chat

Camel Chat is a feature-rich Flutter application designed to provide a seamless interface for communicating with large language models (LLMs) served via an Ollama server. It offers a user-friendly way to interact with open-source AI models on your own hardware.

Features

  • Connect to Ollama Servers: Easily connect to any Ollama server with optional basic HTTP authentication.
  • Multiple Model Support: Chat with any model available on your Ollama server.
  • Complete Chat History: View and manage your conversation history.
  • Dark Mode Support: Switch between light and dark themes for comfortable viewing.
  • Custom System Prompts: Define system prompts to set the AI’s behaviour and context.
  • Export Conversations: Export your chats as markdown files for sharing or archiving.
  • Chat Organisation: Auto-generated meaningful titles for your conversations.
  • Responsive UI: Works seamlessly on both mobile and desktop devices.
  • Code Formatting: Proper rendering and formatting of code blocks in responses.
  • Local Storage: All your conversations are stored locally for privacy.
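The export feature above is essentially a transformation from a stored message list to a Markdown document. A minimal sketch in Python of what such an export can look like (the message structure and the function name here are hypothetical illustrations, not Camel Chat's actual code; the role/content shape mirrors what Ollama's chat API uses):

```python
# Hypothetical sketch of exporting a chat to Markdown. The message
# format below is an assumption, not Camel Chat's internal model.

def chat_to_markdown(title, messages):
    """Render a conversation as a Markdown document.

    `messages` is a list of {"role": ..., "content": ...} dicts,
    mirroring the shape used by Ollama's chat API.
    """
    lines = [f"# {title}", ""]
    for msg in messages:
        # Label each turn with its speaker, then the message body.
        lines.append(f"**{msg['role'].capitalize()}:**")
        lines.append("")
        lines.append(msg["content"])
        lines.append("")
    return "\n".join(lines)

if __name__ == "__main__":
    doc = chat_to_markdown("Demo", [
        {"role": "user", "content": "Hello"},
        {"role": "assistant", "content": "Hi! How can I help?"},
    ])
    print(doc)
```

Because code blocks are preserved verbatim in the message content, fenced code in responses survives the round trip into the exported file.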

Getting Started

Prerequisites

  • A running Ollama server (local or remote).

Installation

Android

Download and install the APK from the releases page.

Linux

Choose one of the following packages from the releases page:

  • Debian/Ubuntu: Download and install the .deb package.
  • Fedora/RHEL: Download and install the .rpm package.
  • Arch: Download and install the .zst package.
  • Other distributions: Download the AppImage, make it executable and run it.

Setting Up Your Ollama Server

  1. Install Ollama from https://ollama.com/.
  2. Pull the models you want to use (e.g., ollama pull gemma3).
  3. Run the Ollama server.
  4. Connect Camel Chat to your server by entering the URL (e.g., http://localhost:11434/).
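Steps 3–4 amount to pointing the app at Ollama's HTTP API; a quick way to confirm the server is reachable is a GET request to /api/tags, which lists the models you have pulled. The sketch below shows how a client might build its request headers, including the optional basic HTTP authentication mentioned under Features (the helper function itself is illustrative, not Camel Chat's code):

```python
import base64

# Illustrative sketch: build HTTP headers for talking to an Ollama
# server, optionally attaching standard basic auth. The header names
# are standard HTTP; the function is a hypothetical example.

def build_headers(username=None, password=None):
    headers = {"Content-Type": "application/json"}
    if username is not None and password is not None:
        # RFC 7617 basic auth: base64-encode "user:password".
        token = base64.b64encode(f"{username}:{password}".encode()).decode()
        headers["Authorization"] = f"Basic {token}"
    return headers
```

For a quick manual check without any client at all, `curl http://localhost:11434/api/tags` should return a JSON list of your pulled models.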

Roadmap

Here are some features and improvements planned for future releases:

  • Stream Responses: Implement streaming responses for more interactive conversations.
  • File Attachments: Upload and process files during conversations.
  • Chat Statistics: View usage statistics and performance metrics.
  • Release on Flathub
  • Windows & macOS Support
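The first roadmap item maps directly onto Ollama's API: with streaming enabled, /api/chat returns newline-delimited JSON, where each chunk carries a partial "message" and the final chunk sets "done" to true. A minimal sketch of assembling such a stream (the sample chunks are illustrative, not captured server output):

```python
import json

# Sketch of assembling a streamed Ollama chat reply from
# newline-delimited JSON chunks, as /api/chat emits with streaming on.

def assemble_stream(ndjson_lines):
    """Concatenate the content fragments from a streamed chat reply."""
    parts = []
    for line in ndjson_lines:
        if not line.strip():
            continue
        chunk = json.loads(line)
        parts.append(chunk.get("message", {}).get("content", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

sample = [
    '{"message": {"role": "assistant", "content": "Hel"}, "done": false}',
    '{"message": {"role": "assistant", "content": "lo!"}, "done": true}',
]
print(assemble_stream(sample))  # -> Hello!
```

In a real client the fragments would be appended to the UI as they arrive rather than joined at the end, which is what makes the conversation feel interactive.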
Comments

  • nutbutter@discuss.tchncs.de (OP) · 2 days ago
      My app is nothing compared to the features of Open WebUI. I just wanted to make a simple native app. Honestly, I made this just because I wanted to see if I could build something like that.

      Also, Open WebUI is slightly complex for someone who is not into self-hosting. My app is for someone who just installs Ollama on their laptop or any computer and exposes it on the local network.

  • Smee@poeng.link · 2 days ago

    Looks interesting, but I can't get it to run on Debian 12; neither the .deb nor the AppImage works.

      • Smee@poeng.link · 3 hours ago
        .deb

        $ /opt/camelchat/camelchat

        /opt/camelchat/camelchat: symbol lookup error: /opt/camelchat/camelchat: undefined symbol: g_once_init_enter_pointer

        AppImage

        Set as executable.

        $ ./Camel-Chat-0.2.0-x86_64.AppImage

        /tmp/.mount_Camel-7OCAAq/camelchat: symbol lookup error: /tmp/.mount_Camel-7OCAAq/camelchat: undefined symbol: g_once_init_enter_pointer


        From a similar issue with a different app, it seems to be a GLib problem: the binary requires GLib 2.80+ (which introduced g_once_init_enter_pointer), while Debian 12 ships 2.74.6-2+deb12u5.


        Android

        Works perfectly!

      • Smee@poeng.link · 2 days ago

        I did run the AppImage in a terminal and got an error about something missing, but the AppImage comes with everything bundled, right? Might be an issue on my end, I suppose.

        I’ll redo it and paste the error message tomorrow.