Merged
1 change: 1 addition & 0 deletions .pipelines/templates/test-cs-steps.yml
@@ -20,6 +20,7 @@ steps:
$testDataDir = "$(Build.SourcesDirectory)/test-data-shared"
Write-Host "##vso[task.setvariable variable=repoRoot]$repoRoot"
Write-Host "##vso[task.setvariable variable=testDataDir]$testDataDir"
+Write-Host "##vso[task.setvariable variable=FOUNDRY_TESTING_MODE]1"

- task: UseDotNet@2
displayName: 'Use .NET 9 SDK'
1 change: 1 addition & 0 deletions .pipelines/templates/test-js-steps.yml
@@ -20,6 +20,7 @@ steps:
$testDataDir = "$(Build.SourcesDirectory)/test-data-shared"
Write-Host "##vso[task.setvariable variable=repoRoot]$repoRoot"
Write-Host "##vso[task.setvariable variable=testDataDir]$testDataDir"
+Write-Host "##vso[task.setvariable variable=FOUNDRY_TESTING_MODE]1"

- ${{ if eq(parameters.isWinML, true) }}:
- task: PowerShell@2
1 change: 1 addition & 0 deletions .pipelines/templates/test-python-steps.yml
@@ -20,6 +20,7 @@ steps:
$testDataDir = "$(Build.SourcesDirectory)/test-data-shared"
Write-Host "##vso[task.setvariable variable=repoRoot]$repoRoot"
Write-Host "##vso[task.setvariable variable=testDataDir]$testDataDir"
+Write-Host "##vso[task.setvariable variable=FOUNDRY_TESTING_MODE]1"

- ${{ if eq(parameters.isWinML, true) }}:
- task: PowerShell@2
1 change: 1 addition & 0 deletions .pipelines/templates/test-rust-steps.yml
@@ -18,6 +18,7 @@ steps:
$testDataDir = "$(Build.SourcesDirectory)/test-data-shared"
Write-Host "##vso[task.setvariable variable=repoRoot]$repoRoot"
Write-Host "##vso[task.setvariable variable=testDataDir]$testDataDir"
+Write-Host "##vso[task.setvariable variable=FOUNDRY_TESTING_MODE]1"

- ${{ if eq(parameters.isWinML, true) }}:
- task: PowerShell@2
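All four pipeline templates set the same variable before the test steps run. A hypothetical sketch of how test code might gate behavior on it; the variable name comes from the diff, while the helper function and usage are illustrative:

```python
import os

def is_foundry_testing_mode() -> bool:
    # The pipeline templates above set FOUNDRY_TESTING_MODE to "1";
    # treat anything else (or an unset variable) as disabled.
    return os.environ.get("FOUNDRY_TESTING_MODE") == "1"

os.environ["FOUNDRY_TESTING_MODE"] = "1"
print(is_foundry_testing_mode())  # True once the variable is set
```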
1 change: 1 addition & 0 deletions samples/cs/tool-calling-foundry-local-sdk/Program.cs
@@ -141,6 +141,7 @@ await model.DownloadAsync(progress =>
var response = new ChatMessage
{
Role = "tool",
+ToolCallId = chunk!.Choices[0].Message.ToolCalls![0].Id,
Content = result.ToString(),
};
messages.Add(response);
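The added `ToolCallId` follows the OpenAI chat convention: a `tool` message must echo the `id` of the tool call it answers so the model can match results to requests. A minimal sketch with mock message dictionaries; the field names mirror the OpenAI wire format, and the id and values are invented for illustration:

```python
# Mock assistant turn containing one tool call (OpenAI-style shape).
assistant_turn = {
    "role": "assistant",
    "tool_calls": [{
        "id": "call_001",  # invented id, for illustration only
        "type": "function",
        "function": {"name": "multiply", "arguments": '{"a": 7, "b": 6}'},
    }],
}

# The tool result must carry tool_call_id so it links back to the call.
tool_turn = {
    "role": "tool",
    "tool_call_id": assistant_turn["tool_calls"][0]["id"],
    "content": "42",
}

print(tool_turn["tool_call_id"])  # call_001
```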
2 changes: 1 addition & 1 deletion samples/js/chat-and-audio-foundry-local/src/app.js
@@ -95,7 +95,7 @@ async function main() {
},
{ role: "user", content: transcription.text },
])) {
-const content = chunk.choices?.[0]?.message?.content;
+const content = chunk.choices?.[0]?.delta?.content;
if (content) {
process.stdout.write(content);
}
2 changes: 1 addition & 1 deletion samples/js/native-chat-completions/app.js
@@ -84,7 +84,7 @@ console.log('\nTesting streaming completion...');
for await (const chunk of chatClient.completeStreamingChat(
[{ role: 'user', content: 'Write a short poem about programming.' }]
)) {
-const content = chunk.choices?.[0]?.message?.content;
+const content = chunk.choices?.[0]?.delta?.content;
if (content) {
process.stdout.write(content);
}
6 changes: 3 additions & 3 deletions samples/js/tutorial-chat-assistant/app.js
@@ -73,13 +73,13 @@ while (true) {
// Stream the response token by token
process.stdout.write('Assistant: ');
let fullResponse = '';
-await chatClient.completeStreamingChat(messages, (chunk) => {
-const content = chunk.choices?.[0]?.message?.content;
+for await (const chunk of chatClient.completeStreamingChat(messages)) {
+const content = chunk.choices?.[0]?.delta?.content;
if (content) {
process.stdout.write(content);
fullResponse += content;
}
-});
+}
console.log('\n');
// </streaming>

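The three JavaScript fixes above are the same correction: streaming chunks deliver incremental text under `delta`, while `message` only appears on non-streaming responses. A sketch of the accumulation loop using mock chunks shaped like OpenAI-style streaming output (the chunk contents are invented):

```python
# Mock streaming chunks; only "delta" carries the incremental tokens.
chunks = [
    {"choices": [{"delta": {"content": "Hello"}}]},
    {"choices": [{"delta": {"content": ", world"}}]},
    {"choices": [{"delta": {}}]},  # a final chunk may omit "content"
]

full_response = ""
for chunk in chunks:
    content = chunk["choices"][0]["delta"].get("content")
    if content:
        full_response += content

print(full_response)  # Hello, world
```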
5 changes: 2 additions & 3 deletions samples/python/native-chat-completions/src/app.py
@@ -1,11 +1,10 @@
# <complete_code>
# <imports>
-import asyncio
from foundry_local_sdk import Configuration, FoundryLocalManager
# </imports>


-async def main():
+def main():
# <init>
# Initialize the Foundry Local SDK
config = Configuration(app_name="foundry_local_samples")
@@ -64,5 +63,5 @@ def ep_progress(ep_name: str, percent: float):


if __name__ == "__main__":
-asyncio.run(main())
+main()
# </complete_code>
5 changes: 2 additions & 3 deletions samples/python/tool-calling/src/app.py
@@ -1,6 +1,5 @@
# <complete_code>
# <imports>
-import asyncio
import json
from foundry_local_sdk import Configuration, FoundryLocalManager
# </imports>
@@ -130,7 +129,7 @@ def process_tool_calls(messages, response, client):


# <init>
-async def main():
+def main():
# Initialize the Foundry Local SDK
config = Configuration(app_name="foundry_local_samples")
FoundryLocalManager.initialize(config)
@@ -192,5 +191,5 @@ def ep_progress(ep_name: str, percent: float):


if __name__ == "__main__":
-asyncio.run(main())
+main()
# </complete_code>
7 changes: 3 additions & 4 deletions samples/python/tutorial-chat-assistant/src/app.py
@@ -1,11 +1,10 @@
# <complete_code>
# <imports>
-import asyncio
from foundry_local_sdk import Configuration, FoundryLocalManager
# </imports>


-async def main():
+def main():
# <init>
# Initialize the Foundry Local SDK
config = Configuration(app_name="foundry_local_samples")
@@ -64,7 +63,7 @@ def ep_progress(ep_name: str, percent: float):
print("Assistant: ", end="", flush=True)
full_response = ""
for chunk in client.complete_streaming_chat(messages):
-content = chunk.choices[0].message.content
+content = chunk.choices[0].delta.content
if content:
print(content, end="", flush=True)
full_response += content
@@ -81,5 +80,5 @@ def ep_progress(ep_name: str, percent: float):


if __name__ == "__main__":
-asyncio.run(main())
+main()
# </complete_code>
15 changes: 7 additions & 8 deletions samples/python/tutorial-document-summarizer/src/app.py
@@ -1,13 +1,12 @@
# <complete_code>
# <imports>
-import asyncio
import sys
from pathlib import Path
from foundry_local_sdk import Configuration, FoundryLocalManager
# </imports>


-async def summarize_file(client, file_path, system_prompt):
+def summarize_file(client, file_path, system_prompt):
"""Summarize a single file and print the result."""
content = Path(file_path).read_text(encoding="utf-8")
messages = [
@@ -18,7 +17,7 @@ async def summarize_file(client, file_path, system_prompt):
print(response.choices[0].message.content)


-async def summarize_directory(client, directory, system_prompt):
+def summarize_directory(client, directory, system_prompt):
"""Summarize all .txt files in a directory."""
txt_files = sorted(Path(directory).glob("*.txt"))

@@ -28,11 +27,11 @@ async def summarize_directory(client, directory, system_prompt):

for txt_file in txt_files:
print(f"--- {txt_file.name} ---")
-await summarize_file(client, txt_file, system_prompt)
+summarize_file(client, txt_file, system_prompt)
print()


-async def main():
+def main():
# <init>
# Initialize the Foundry Local SDK
config = Configuration(app_name="foundry_local_samples")
@@ -76,10 +75,10 @@ def ep_progress(ep_name: str, percent: float):
# </file_reading>

if target_path.is_dir():
-await summarize_directory(client, target_path, system_prompt)
+summarize_directory(client, target_path, system_prompt)
else:
print(f"--- {target_path.name} ---")
-await summarize_file(client, target_path, system_prompt)
+summarize_file(client, target_path, system_prompt)
# </summarization>

# Clean up
@@ -88,5 +87,5 @@ def ep_progress(ep_name: str, percent: float):


if __name__ == "__main__":
-asyncio.run(main())
+main()
# </complete_code>
5 changes: 2 additions & 3 deletions samples/python/tutorial-tool-calling/src/app.py
@@ -1,6 +1,5 @@
# <complete_code>
# <imports>
-import asyncio
import json
from foundry_local_sdk import Configuration, FoundryLocalManager
# </imports>
@@ -130,7 +129,7 @@ def process_tool_calls(messages, response, client):


# <init>
-async def main():
+def main():
# Initialize the Foundry Local SDK
config = Configuration(app_name="foundry_local_samples")
FoundryLocalManager.initialize(config)
@@ -197,5 +196,5 @@ def ep_progress(ep_name: str, percent: float):


if __name__ == "__main__":
-asyncio.run(main())
+main()
# </complete_code>
5 changes: 2 additions & 3 deletions samples/python/tutorial-voice-to-text/src/app.py
@@ -1,11 +1,10 @@
# <complete_code>
# <imports>
-import asyncio
from foundry_local_sdk import Configuration, FoundryLocalManager
# </imports>


-async def main():
+def main():
# <init>
# Initialize the Foundry Local SDK
config = Configuration(app_name="foundry_local_samples")
@@ -88,5 +87,5 @@ def ep_progress(ep_name: str, percent: float):


if __name__ == "__main__":
-asyncio.run(main())
+main()
# </complete_code>
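Every Python sample above is converted the same way: `async def main()` becomes `def main()`, the `await`s disappear, `import asyncio` is dropped, and `asyncio.run(main())` becomes a plain call. A before/after sketch of the entrypoint pattern, with the SDK call stubbed out for illustration:

```python
# Before: async entrypoint
#   async def main():
#       response = await client.complete_chat(messages)
#   asyncio.run(main())

# After: synchronous entrypoint, as in the updated samples.
def complete_chat(messages):
    # Stand-in for the SDK's synchronous chat call.
    return f"echo: {messages[-1]['content']}"

def main():
    messages = [{"role": "user", "content": "hi"}]
    return complete_chat(messages)

if __name__ == "__main__":
    main()  # no event loop required
```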
8 changes: 6 additions & 2 deletions sdk/cs/test/FoundryLocal.Tests/ChatCompletionsTests.cs
@@ -196,8 +196,10 @@ public async Task DirectTool_NoStreaming_Succeeds()
await Assert.That(response.Choices[0].Message.ToolCalls?[0].FunctionCall?.Arguments).IsEqualTo(expectedArguments);

// Add the response from invoking the tool call to the conversation and check if the model can continue correctly
+var toolCallId = response.Choices[0].Message.ToolCalls?[0].Id;
+await Assert.That(toolCallId).IsNotNull();
var toolCallResponse = "7 x 6 = 42.";
-messages.Add(new ChatMessage { Role = "tool", Content = toolCallResponse });
+messages.Add(new ChatMessage { Role = "tool", ToolCallId = toolCallId, Content = toolCallResponse });

// Prompt the model to continue the conversation after the tool call
messages.Add(new ChatMessage { Role = "system", Content = "Respond only with the answer generated by the tool." });
@@ -300,8 +302,10 @@ public async Task DirectTool_Streaming_Succeeds()
await Assert.That(toolCallResponse?.Choices[0].Message.ToolCalls?[0].FunctionCall?.Arguments).IsEqualTo(expectedArguments);

// Add the response from invoking the tool call to the conversation and check if the model can continue correctly
+var toolCallId = toolCallResponse?.Choices[0].Message.ToolCalls?[0].Id;
+await Assert.That(toolCallId).IsNotNull();
var toolResponse = "7 x 6 = 42.";
-messages.Add(new ChatMessage { Role = "tool", Content = toolResponse });
+messages.Add(new ChatMessage { Role = "tool", ToolCallId = toolCallId, Content = toolResponse });

// Prompt the model to continue the conversation after the tool call
messages.Add(new ChatMessage { Role = "system", Content = "Respond only with the answer generated by the tool." });
5 changes: 4 additions & 1 deletion sdk/js/src/openai/chatClient.ts
@@ -167,12 +167,15 @@
if (typeof tool.type !== 'string' || tool.type.trim() === '') {
throw new Error('Each tool must have a "type" property that is a non-empty string.');
}
-if (typeof tool.function !== 'object' || tool.function.description.trim() === '') {
+if (!tool.function || typeof tool.function !== 'object') {
throw new Error('Each tool must have a "function" property that is a non-empty object.');
}
if (typeof tool.function.name !== 'string' || tool.function.name.trim() === '') {
throw new Error('Each tool\'s function must have a "name" property that is a non-empty string.');
}
+if (tool.function.description !== undefined && typeof tool.function.description !== 'string') {
+throw new Error('Each tool\'s function "description", if provided, must be a string.');
+}
}
}

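The change fixes a crash in the old check, which dereferenced `tool.function.description` unconditionally and threw for tools that omit a description or the `function` object entirely. Ported to Python for illustration, the tightened logic is: `function` must be an object, `name` a non-empty string, and `description` (now explicitly optional) a string when present. The function and messages are a sketch, not the SDK's actual validator:

```python
def validate_tool(tool: dict) -> None:
    # "function" must be a present, non-empty object.
    fn = tool.get("function")
    if not isinstance(fn, dict):
        raise ValueError('tool "function" must be an object')
    # "name" is required and must be a non-empty string.
    name = fn.get("name")
    if not isinstance(name, str) or not name.strip():
        raise ValueError('function "name" must be a non-empty string')
    # "description" is optional, but must be a string when provided.
    desc = fn.get("description")
    if desc is not None and not isinstance(desc, str):
        raise ValueError('function "description", if provided, must be a string')

# A tool without a description is now accepted.
validate_tool({"type": "function", "function": {"name": "multiply"}})
print("valid")
```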
2 changes: 1 addition & 1 deletion sdk/python/test/README.md
@@ -4,7 +4,7 @@ This test suite mirrors the structure of the JS (`sdk_v2/js/test/`) and C# (`sdk

## Prerequisites

-1. **Python 3.10+** (tested with 3.12/3.13)
+1. **Python 3.11+** (tested with 3.12/3.13)
2. **SDK installed in editable mode** from the `sdk/python` directory:
```bash
pip install -e .
37 changes: 5 additions & 32 deletions sdk/rust/docs/api.md
@@ -12,7 +12,6 @@
- [Model Catalog](#model-catalog)
- [Catalog](#catalog)
- [Model](#model)
- [ModelVariant](#modelvariant)
- [OpenAI Clients](#openai-clients)
- [ChatClient](#chatclient)
- [ChatCompletionStream](#chatcompletionstream)
@@ -131,15 +130,15 @@ pub struct Catalog { /* private fields */ }
| `update_models` | `async fn update_models(&self) -> Result<(), FoundryLocalError>` | Refresh catalog if cache expired or invalidated. |
| `get_models` | `async fn get_models(&self) -> Result<Vec<Arc<Model>>, FoundryLocalError>` | Return all known models. |
| `get_model` | `async fn get_model(&self, alias: &str) -> Result<Arc<Model>, FoundryLocalError>` | Look up a model by alias. |
-| `get_model_variant` | `async fn get_model_variant(&self, id: &str) -> Result<Arc<ModelVariant>, FoundryLocalError>` | Look up a variant by unique id. |
-| `get_cached_models` | `async fn get_cached_models(&self) -> Result<Vec<Arc<ModelVariant>>, FoundryLocalError>` | Return only variants cached on disk. |
-| `get_loaded_models` | `async fn get_loaded_models(&self) -> Result<Vec<Arc<ModelVariant>>, FoundryLocalError>` | Return model variants currently loaded in memory. |
+| `get_model_variant` | `async fn get_model_variant(&self, id: &str) -> Result<Arc<Model>, FoundryLocalError>` | Look up a variant by unique id. |
+| `get_cached_models` | `async fn get_cached_models(&self) -> Result<Vec<Arc<Model>>, FoundryLocalError>` | Return only variants cached on disk. |
+| `get_loaded_models` | `async fn get_loaded_models(&self) -> Result<Vec<Arc<Model>>, FoundryLocalError>` | Return model variants currently loaded in memory. |

---

### Model

-Groups one or more `ModelVariant`s sharing the same alias. By default, the cached variant is selected.
+Groups one or more variants sharing the same alias. By default, the cached variant is selected.

```rust
pub struct Model { /* private fields */ }
```

@@ -149,8 +148,7 @@
|--------|-----------|-------------|
| `alias` | `fn alias(&self) -> &str` | Alias shared by all variants. |
| `id` | `fn id(&self) -> &str` | Unique identifier of the selected variant. |
-| `variants` | `fn variants(&self) -> &[Arc<ModelVariant>]` | All variants in this model. |
-| `selected_variant` | `fn selected_variant(&self) -> &ModelVariant` | Currently selected variant. |
+| `variants` | `fn variants(&self) -> Vec<Arc<Model>>` | All variants in this model. |
| `select_variant` | `fn select_variant(&self, variant: &Model) -> Result<(), FoundryLocalError>` | Select a variant from `variants()`. |
| `select_variant_by_id` | `fn select_variant_by_id(&self, id: &str) -> Result<(), FoundryLocalError>` | Select a variant by its unique id string. |
| `is_cached` | `async fn is_cached(&self) -> Result<bool, FoundryLocalError>` | Whether the selected variant is cached on disk. |
@@ -165,31 +163,6 @@

---

-### ModelVariant
-
-A single model variant — one specific id within an alias group.
-
-```rust
-pub struct ModelVariant { /* private fields */ }
-```
-
-| Method | Signature | Description |
-|--------|-----------|-------------|
-| `info` | `fn info(&self) -> &ModelInfo` | Full metadata for this variant. |
-| `id` | `fn id(&self) -> &str` | Unique identifier. |
-| `alias` | `fn alias(&self) -> &str` | Alias shared with sibling variants. |
-| `is_cached` | `async fn is_cached(&self) -> Result<bool, FoundryLocalError>` | Whether cached locally. ⚠️ Full IPC per call — prefer `Catalog::get_cached_models()` for batch use. |
-| `is_loaded` | `async fn is_loaded(&self) -> Result<bool, FoundryLocalError>` | Whether currently loaded in memory. |
-| `download` | `async fn download<F>(&self, progress: Option<F>) -> Result<(), FoundryLocalError>` | Download the variant. `F: FnMut(f64) + Send + 'static` — receives progress as a percentage (0.0–100.0). |
-| `path` | `async fn path(&self) -> Result<PathBuf, FoundryLocalError>` | Local file-system path. |
-| `load` | `async fn load(&self) -> Result<(), FoundryLocalError>` | Load into memory. |
-| `unload` | `async fn unload(&self) -> Result<String, FoundryLocalError>` | Unload from memory. |
-| `remove_from_cache` | `async fn remove_from_cache(&self) -> Result<String, FoundryLocalError>` | Remove from local cache. |
-| `create_chat_client` | `fn create_chat_client(&self) -> ChatClient` | Create a ChatClient bound to this variant. |
-| `create_audio_client` | `fn create_audio_client(&self) -> AudioClient` | Create an AudioClient bound to this variant. |
-
----

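The documentation change above folds the separate `ModelVariant` type into `Model`: variants are themselves `Model` values sharing an alias, and one of them is the selected variant. A hypothetical Python mock of that consolidated shape, mirroring the `alias`/`id`/`variants`/`select_variant_by_id` signatures in the table (the class and model ids are invented for illustration):

```python
class Model:
    """Mock of the consolidated Model: variants are Models sharing an alias."""

    def __init__(self, alias, model_id, variants=None):
        self._alias, self._id = alias, model_id
        self._variants = variants or [self]   # a lone Model is its own variant
        self._selected = self._variants[0]    # default selection

    def alias(self):
        return self._alias

    def id(self):
        # Reports the id of the currently selected variant.
        return self._selected._id

    def variants(self):
        return list(self._variants)

    def select_variant_by_id(self, vid):
        for v in self._variants:
            if v._id == vid:
                self._selected = v
                return
        raise KeyError(vid)

cpu = Model("qwen", "qwen-cpu")
gpu = Model("qwen", "qwen-gpu")
m = Model("qwen", "qwen-cpu", [cpu, gpu])
m.select_variant_by_id("qwen-gpu")
print(m.id())  # qwen-gpu
```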
## OpenAI Clients

### ChatClient