
Agent always answers {"{" #9419

@Haukinger

Description

LocalAI version:

localagi image id sha256:a86012ecd9bdf2234771bf13fa5d2c566bba1d7fb04cf3c5fbe82f5bfd9a11c8
localai image id sha256:b22cdc29e51cf1b6152e5c9cc6c99ef935017fd3877226755d7ecd6e6822a6e9

Environment, CPU architecture, OS, and Version:

Windows 10 LTSC, Docker Desktop 4.48

Describe the bug

When chatting with the agent, it always replies with {"{"

To Reproduce

Set up a new LocalAGI CUDA instance with docker compose up, create an agent, and ask it anything.

Expected behavior

I'd expect a more "verbose" answer :-)

Logs

Apr 18 21:38:23 DEBUG LLM result(processed) result="{"arguments": {"message": "" caller={caller.file="/build/pkg/functions/parse.go" caller.L=264 }

Apr 18 21:38:23 DEBUG LLM result result="{"arguments": {"message": "Hello" caller={caller.file="/build/pkg/functions/parse.go" caller.L=256 }

Apr 18 21:38:23 DEBUG LLM result(processed) result="{"arguments": {"message": "Hello" caller={caller.file="/build/pkg/functions/parse.go" caller.L=264 }

Apr 18 21:38:23 DEBUG LLM result result="{"arguments": {"message": "Hello!"" caller={caller.file="/build/pkg/functions/parse.go" caller.L=256 }

Apr 18 21:38:23 DEBUG LLM result(processed) result="{"arguments": {"message": "Hello!"" caller={caller.file="/build/pkg/functions/parse.go" caller.L=264 }

Apr 18 21:38:23 DEBUG LLM result result="{"arguments": {"message": "Hello!"}" caller={caller.file="/build/pkg/functions/parse.go" caller.L=256 }

Apr 18 21:38:23 DEBUG LLM result(processed) result="{"arguments": {"message": "Hello!"}" caller={caller.file="/build/pkg/functions/parse.go" caller.L=264 }

Apr 18 21:38:23 DEBUG LLM result result="{"arguments": {"message": "Hello!"} ," caller={caller.file="/build/pkg/functions/parse.go" caller.L=256 }

Apr 18 21:38:23 DEBUG LLM result(processed) result="{"arguments": {"message": "Hello!"} ," caller={caller.file="/build/pkg/functions/parse.go" caller.L=264 }

Apr 18 21:38:23 DEBUG LLM result result="{"arguments": {"message": "Hello!"} , "" caller={caller.file="/build/pkg/functions/parse.go" caller.L=256 }

Apr 18 21:38:23 DEBUG LLM result(processed) result="{"arguments": {"message": "Hello!"} , "" caller={caller.file="/build/pkg/functions/parse.go" caller.L=264 }

Apr 18 21:38:23 DEBUG LLM result result="{"arguments": {"message": "Hello!"} , "name" caller={caller.file="/build/pkg/functions/parse.go" caller.L=256 }

Apr 18 21:38:23 DEBUG LLM result(processed) result="{"arguments": {"message": "Hello!"} , "name" caller={caller.file="/build/pkg/functions/parse.go" caller.L=264 }

Apr 18 21:38:23 DEBUG LLM result result="{"arguments": {"message": "Hello!"} , "name":" caller={caller.file="/build/pkg/functions/parse.go" caller.L=256 }

Apr 18 21:38:23 DEBUG LLM result(processed) result="{"arguments": {"message": "Hello!"} , "name":" caller={caller.file="/build/pkg/functions/parse.go" caller.L=264 }

Apr 18 21:38:23 DEBUG LLM result result="{"arguments": {"message": "Hello!"} , "name": "" caller={caller.file="/build/pkg/functions/parse.go" caller.L=256 }

Apr 18 21:38:23 DEBUG LLM result(processed) result="{"arguments": {"message": "Hello!"} , "name": "" caller={caller.file="/build/pkg/functions/parse.go" caller.L=264 }

Apr 18 21:38:23 DEBUG LLM result result="{"arguments": {"message": "Hello!"} , "name": "answer" caller={caller.file="/build/pkg/functions/parse.go" caller.L=256 }

Apr 18 21:38:23 DEBUG LLM result(processed) result="{"arguments": {"message": "Hello!"} , "name": "answer" caller={caller.file="/build/pkg/functions/parse.go" caller.L=264 }

Apr 18 21:38:23 DEBUG LLM result result="{"arguments": {"message": "Hello!"} , "name": "answer"}" caller={caller.file="/build/pkg/functions/parse.go" caller.L=256 }

Apr 18 21:38:23 DEBUG LLM result(processed) result="{"arguments": {"message": "Hello!"} , "name": "answer"}" caller={caller.file="/build/pkg/functions/parse.go" caller.L=264 }

Apr 18 21:38:23 DEBUG GRPC stderr id="gemma-3-12b-it-127.0.0.1:41785" line="slot print_timing: id 0 | task 43 | " caller={caller.file="/build/pkg/model/process.go" caller.L=187 }

Apr 18 21:38:23 DEBUG GRPC stderr id="gemma-3-12b-it-127.0.0.1:41785" line="prompt eval time = 288.96 ms / 212 tokens ( 1.36 ms per token, 733.67 tokens per second)" caller={caller.file="/build/pkg/model/process.go" caller.L=187 }

Apr 18 21:38:23 DEBUG GRPC stderr id="gemma-3-12b-it-127.0.0.1:41785" line=" eval time = 466.06 ms / 18 tokens ( 25.89 ms per token, 38.62 tokens per second)" caller={caller.file="/build/pkg/model/process.go" caller.L=187 }

Apr 18 21:38:23 DEBUG GRPC stderr id="gemma-3-12b-it-127.0.0.1:41785" line=" total time = 755.02 ms / 230 tokens" caller={caller.file="/build/pkg/model/process.go" caller.L=187 }

Apr 18 21:38:23 DEBUG GRPC stderr id="gemma-3-12b-it-127.0.0.1:41785" line="slot release: id 0 | task 43 | stop processing: n_tokens = 229, truncated = 0" caller={caller.file="/build/pkg/model/process.go" caller.L=187 }

Apr 18 21:38:23 DEBUG GRPC stderr id="gemma-3-12b-it-127.0.0.1:41785" line="srv update_slots: all slots are idle" caller={caller.file="/build/pkg/model/process.go" caller.L=187 }

Apr 18 21:38:23 DEBUG LLM result result="{"arguments": {"message": "Hello!"} , "name": "answer"}" caller={caller.file="/build/pkg/functions/parse.go" caller.L=256 }

Apr 18 21:38:23 DEBUG LLM result(processed) result="{"arguments": {"message": "Hello!"} , "name": "answer"}" caller={caller.file="/build/pkg/functions/parse.go" caller.L=264 }

Apr 18 21:38:23 DEBUG [ChatDeltas] streaming completed, accumulated deltas from C++ autoparser total_deltas=18 caller={caller.file="/build/core/backend/llm.go" caller.L=253 }

Apr 18 21:38:23 DEBUG [ChatDeltas] received deltas from backend total_deltas=18 content_chunks=18 reasoning_chunks=0 tool_call_chunks=0 caller={caller.file="/build/pkg/functions/chat_deltas.go" caller.L=31 }

Apr 18 21:38:23 DEBUG [ChatDeltas] deltas present but no tool calls found, falling back to text parsing caller={caller.file="/build/pkg/functions/chat_deltas.go" caller.L=67 }

Apr 18 21:38:23 DEBUG [ChatDeltas] no pre-parsed tool calls, falling back to Go-side text parsing caller={caller.file="/build/core/http/endpoints/openai/chat.go" caller.L=358 }

Apr 18 21:38:23 DEBUG ParseTextContent result="{"arguments": {"message": "Hello!"} , "name": "answer"}" caller={caller.file="/build/pkg/functions/parse.go" caller.L=270 }

Apr 18 21:38:23 DEBUG CaptureLLMResult config=[] caller={caller.file="/build/pkg/functions/parse.go" caller.L=271 }

Apr 18 21:38:23 DEBUG LLM result result="{"arguments": {"message": "Hello!"} , "name": "answer"}" caller={caller.file="/build/pkg/functions/parse.go" caller.L=256 }

Apr 18 21:38:23 DEBUG LLM result(processed) result="{"arguments": {"message": "Hello!"} , "name": "answer"}" caller={caller.file="/build/pkg/functions/parse.go" caller.L=264 }

Apr 18 21:38:23 DEBUG LLM result result="{"arguments": {"message": "Hello!"} , "name": "answer"}" caller={caller.file="/build/pkg/functions/parse.go" caller.L=866 }

Apr 18 21:38:23 DEBUG LLM result(function cleanup) result="{"arguments": {"message": "Hello!"} , "name": "answer"}" caller={caller.file="/build/pkg/functions/parse.go" caller.L=874 }
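For reference, the final result string in the log is itself well-formed JSON containing the actual answer. A minimal Go sketch (illustrative only, not LocalAI's parse.go; the funcCall type and field names are assumptions) showing that the logged payload decodes cleanly into the tool name and the message text the agent should have displayed instead of the raw prefix:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Assumed shape of the logged function-call payload; not LocalAI's actual types.
type funcCall struct {
	Name      string          `json:"name"`
	Arguments json.RawMessage `json:"arguments"`
}

func main() {
	// Final "LLM result" string from the log above.
	raw := `{"arguments": {"message": "Hello!"} , "name": "answer"}`

	var fc funcCall
	if err := json.Unmarshal([]byte(raw), &fc); err != nil {
		fmt.Println("parse error:", err)
		return
	}

	var args struct {
		Message string `json:"message"`
	}
	if err := json.Unmarshal(fc.Arguments, &args); err != nil {
		fmt.Println("args parse error:", err)
		return
	}

	// Expected output: name=answer message=Hello!
	fmt.Printf("name=%s message=%s\n", fc.Name, args.Message)
}
```

So the payload itself parses fine; per the [ChatDeltas] lines above, the streamed deltas are treated as plain content chunks ("deltas present but no tool calls found, falling back to text parsing"), and only the leading characters of the raw JSON end up shown as the reply.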

Additional context
