
Mods does not respect ollama base-url setting #311

Closed
ActuallyAuggie opened this issue Aug 3, 2024 · 3 comments

Comments

@ActuallyAuggie

Mods seems to use the default ollama base-url (http://127.0.0.1:11434/api) no matter what the config line is set to.

Setup
MX Linux 23 ahs, xfce4-term, bash

To Reproduce
Change the ollama base-url line to anything; it can be either a valid remote server or something nonsensical. The request will always go to the local machine.

Changing the base-url of, for instance, the openai config line does have the intended effect.
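For context, the line in question is apis.ollama.base-url in the mods config (a minimal sketch; the remote host below is a placeholder, not from the report):

```yaml
apis:
  ollama:
    # Reported bug: changing this value has no effect;
    # requests still go to the default 127.0.0.1:11434.
    base-url: http://my-remote-host:11434/api
```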


caustiq commented Aug 13, 2024

Same here.


jahanson commented Aug 20, 2024

This is the culprit, in mods/mods.go at line 315 (commit 3dc0a94):

ccfg.BaseURL = api.BaseURL

It should be:

occfg.BaseURL = api.BaseURL
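To illustrate the bug class with a runnable sketch (the clientConfig type and applyOllamaBaseURL helper are hypothetical stand-ins, not the actual mods source; only the ccfg/occfg assignment mirrors the snippet above): the buggy line writes the user-configured URL onto the wrong client config, so the ollama client keeps its compiled-in default.

```go
package main

import "fmt"

// Hypothetical stand-in for the two client config structs in mods.go.
type clientConfig struct{ BaseURL string }

// applyOllamaBaseURL copies the user-configured base URL onto a client
// config. fixed=false reproduces the buggy line (wrong receiver);
// fixed=true shows the proposed correction.
func applyOllamaBaseURL(fixed bool, ccfg, occfg *clientConfig, configured string) {
	if fixed {
		occfg.BaseURL = configured // proposed fix: update the ollama config
	} else {
		ccfg.BaseURL = configured // buggy line: updates the other config instead
	}
}

func main() {
	ccfg := &clientConfig{BaseURL: "https://api.openai.com/v1"}
	occfg := &clientConfig{BaseURL: "http://127.0.0.1:11434/api"} // ollama default

	applyOllamaBaseURL(false, ccfg, occfg, "http://remote:11434/api")
	fmt.Println(occfg.BaseURL) // still the local default: requests go to 127.0.0.1

	applyOllamaBaseURL(true, ccfg, occfg, "http://remote:11434/api")
	fmt.Println(occfg.BaseURL) // now the configured remote URL
}
```

This matches the observed symptom: the setting is read, but it never reaches the ollama client, so every request hits 127.0.0.1.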

cerisara added a commit to cerisara/mods that referenced this issue Aug 25, 2024
@mecattaf

My mods config is as follows:

default-model: llama3:8b
format-text:
  markdown: '{{ index .Config.FormatText "markdown" }}'
  json: '{{ index .Config.FormatText "json" }}'
roles:
  "default": []
  "summarize_git_diff":
    - https://raw.githubusercontent.com/danielmiessler/fabric/main/patterns/summarize_git_diff/system.md
format: false
role: "default"
raw: false
quiet: false
temp: 1.0
topp: 1.0
no-limit: false
word-wrap: 80
include-prompt-args: false
include-prompt: 0
max-retries: 5
fanciness: 10
status-text: Generating
max-input-chars: 12250
apis:
  ollama:
    base-url: http://localhost:11434/api
    models:
      "llama3:8b":
        aliases: ["llama3"]
        max-input-chars: 650000

When I run a prompt through mods, I get the following in the ollama serve debug log:

[GIN] 2024/08/25 - 12:26:32 | 404 |     415.694µs |       127.0.0.1 | POST     "/api/chat"
[GIN] 2024/08/25 - 12:26:32 | 404 |     111.536µs |       127.0.0.1 | POST     "/api/chat"
[GIN] 2024/08/25 - 12:26:32 | 404 |     180.034µs |       127.0.0.1 | POST     "/api/chat"
[GIN] 2024/08/25 - 12:26:33 | 404 |     109.918µs |       127.0.0.1 | POST     "/api/chat"
[GIN] 2024/08/25 - 12:26:35 | 404 |      112.41µs |       127.0.0.1 | POST     "/api/chat"
[GIN] 2024/08/25 - 12:26:45 | 404 |     165.582µs |       127.0.0.1 | POST     "/api/chat"
[GIN] 2024/08/25 - 12:26:45 | 404 |      230.76µs |       127.0.0.1 | POST     "/api/chat"
[GIN] 2024/08/25 - 12:26:45 | 404 |     171.141µs |       127.0.0.1 | POST     "/api/chat"
[GIN] 2024/08/25 - 12:26:46 | 404 |     151.101µs |       127.0.0.1 | POST     "/api/chat"
[GIN] 2024/08/25 - 12:26:48 | 404 |     171.085µs |       127.0.0.1 | POST     "/api/chat"                          

Will the associated PR solve my issue?
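One way to narrow this down is to hit the chat endpoint directly, bypassing mods (endpoint and payload fields follow the standard ollama REST API; the model name is taken from the config above):

```shell
# POST to the same endpoint mods is expected to call.
# A 404 here as well would implicate the server or model name rather than mods.
curl -s http://localhost:11434/api/chat \
  -H 'Content-Type: application/json' \
  -d '{"model": "llama3:8b", "messages": [{"role": "user", "content": "hello"}], "stream": false}'
```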
