

by Petr Kovář
Light theme screenshot

Work with Ollama from a GUI

You need Ollama running on your localhost with at least one model available. Once Ollama is running, a model can be pulled either from Follamac or from the command line. From the command line, type something like: ollama pull llama3. If you wish to pull from Follamac, enter llama3 into the "Model name to pull" input box and click the PULL button.
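The command-line route can be sketched as a short shell snippet (a minimal example, assuming the ollama CLI from ollama.com is installed and on your PATH):

```shell
# Pull the llama3 model so Follamac can use it.
# Falls back to a hint if the ollama CLI is not installed.
if command -v ollama >/dev/null 2>&1; then
  ollama pull llama3   # download the model weights
  ollama list          # confirm llama3 now appears locally
else
  echo "ollama CLI not found; install it from https://ollama.com first"
fi
```

After the pull completes, the model shows up in Follamac's model selection as well, since both talk to the same local Ollama instance.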

Follamac is a desktop application that provides a convenient way to work with Ollama and large language models (LLMs). It offers these features:

  • pulling/deleting models
  • sending prompts to Ollama (chat or generate)
  • selecting the role for a message in chat mode, plus the option to send a system message in generate mode
  • basic options (temperature, threads)
  • basic info about selected model
  • code highlighting
  • multiple chats
  • editing/deleting chats
  • editing/deleting messages
  • copying code or a whole message to the clipboard
  • light and dark theme (defaults to the system setting)
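As a rough illustration of the two prompt modes listed above: Ollama's HTTP API (on its default port 11434) exposes a generate endpoint that takes a single prompt plus an optional system message, and a chat endpoint that takes role-tagged messages. The JSON below is a sketch based on Ollama's public API, not something Follamac itself exposes:

```shell
# Guard: only talk to the server if it is actually reachable.
if curl -sf http://localhost:11434/api/tags >/dev/null 2>&1; then
  # generate mode: one prompt, optional system message
  curl -s http://localhost:11434/api/generate -d '{
    "model": "llama3",
    "system": "You are a concise assistant.",
    "prompt": "Why is the sky blue?",
    "stream": false
  }'
  # chat mode: a list of role-tagged messages
  curl -s http://localhost:11434/api/chat -d '{
    "model": "llama3",
    "messages": [
      {"role": "system", "content": "You are a concise assistant."},
      {"role": "user", "content": "Why is the sky blue?"}
    ],
    "stream": false
  }'
else
  echo "Ollama server not reachable on localhost:11434"
fi
```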

Changes in version 0.1.5

about 2 months ago
(Built about 1 month ago)
  • No changelog provided
  • Community built

    This app is developed in the open by an international community, and released under the MIT License.
    Get involved
Installed size: ~301.04 MiB
Download size: 108.8 MiB
Available architectures: x86_64, aarch64