This guide will walk you through the process of setting up and using tool calls (also known as tool calling or function calling) with Large Language Models (LLMs) using the ESP32_AI_Connect library. We'll use the `tool_calling_demo.ino` sketch stored in the examples folder as our reference implementation, explaining each component in detail so you can understand how to integrate AI tool calling capabilities into your ESP32 projects.
## Prerequisites
Before you begin, make sure you have:

- Arduino IDE installed with ESP32 board support
- ESP32_AI_Connect library installed
- WiFi connectivity
- An API key for your chosen AI platform (ensure the model you choose supports tool calls)
- Basic understanding of JSON and Arduino programming
## Step 1: Enable Tool Calls in Configuration
First, ensure that tool calls support is enabled in the library configuration file:

```cpp
// --- Tool Calls Support ---
// Uncomment the following line to enable tool calls (tool calling) support
// This will add tcToolSetup and tcChat methods to the library
// If you don't need tool calls, keep this commented out to save memory
```
The AI client is initialized with the following parameters:

- The platform identifier (`"openai"`, `"gemini"`, `"claude"`, `"deepseek"`, etc.)
- Your API key
- The model name (e.g., `"gpt-3.5-turbo"` or `"gpt-4"`)
- Optional custom endpoint (if you're using a custom API endpoint)
## Step 5: Define Your Tools
Tool calls require defining the tool(s) that the AI can call. Each tool is defined as a JSON object that specifies the function name, description, and parameters:
```cpp
// --- Define Tools for Tool Calling ---
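// A tool definition is a JSON object passed to the library as a string. The
// example below is a hypothetical "getWeather" tool written against the
// OpenAI function-calling schema; adjust the name, description, and
// parameters to match your own functions.
const char* weatherTool = R"({
  "type": "function",
  "function": {
    "name": "getWeather",
    "description": "Get the current weather for a city",
    "parameters": {
      "type": "object",
      "properties": {
        "city": { "type": "string", "description": "City name" }
      },
      "required": ["city"]
    }
  }
})";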
```

These optional methods allow you to:

Each setter also has a corresponding getter method to retrieve the current value.
Different AI platforms support different `tool_choice` parameters. The table below shows the allowed values for each platform:
| Tool Choice Mode | OpenAI API | Gemini API | Anthropic Claude API |
|------------------|------------|------------|----------------------|
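As an illustration of how such a value appears on the wire, an OpenAI-style chat completion request carries the setting in a top-level `tool_choice` field. The fragment below uses values from the OpenAI Chat Completions API (`"none"`, `"auto"`, `"required"`, or a named-function object); other platforms use their own field names and allowed values, so treat this as a sketch rather than a universal format.

```json
{
  "model": "gpt-3.5-turbo",
  "messages": [
    { "role": "user", "content": "What's the weather in Paris?" }
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "getWeather",
        "description": "Get the current weather for a city",
        "parameters": {
          "type": "object",
          "properties": {
            "city": { "type": "string" }
          },
          "required": ["city"]
        }
      }
    }
  ],
  "tool_choice": "auto"
}
```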
This article is a follow-up to the previous guide "Tool Calls Implementation Basics". If you haven't read that article yet, please do so before continuing, as this guide builds upon the concepts introduced there.