Commit 25fd3bc

hideya committed: Update to ver 0.3.4
1 parent 91eb6a7 commit 25fd3bc

File tree: 7 files changed, +412 additions, −302 deletions


.env.template

Lines changed: 2 additions & 0 deletions

```diff
@@ -4,6 +4,8 @@ ANTHROPIC_API_KEY=sk-ant-...
 OPENAI_API_KEY=sk-proj-...
 # https://aistudio.google.com/apikey
 GOOGLE_API_KEY=AI...
+# https://console.x.ai
+XAI_API_KEY=xai-...

 # BRAVE_API_KEY=BSA...
 # GITHUB_PERSONAL_ACCESS_TOKEN=github_pat_...
```
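The new `XAI_API_KEY` follows the same pattern as the existing keys. As a minimal TypeScript sketch of how a client might report which provider keys are configured (the helper name and the provider-to-variable map are illustrative, not this client's actual code; loading `.env` into the environment, e.g. via `dotenv`, is assumed to have already happened):

```typescript
// Illustrative mapping from model_provider names to the environment
// variables listed in .env.template (xai is the one added in ver 0.3.4).
const PROVIDER_KEYS: Record<string, string> = {
  anthropic: "ANTHROPIC_API_KEY",
  openai: "OPENAI_API_KEY",
  google_genai: "GOOGLE_API_KEY",
  xai: "XAI_API_KEY",
};

// Return the providers whose API key is present and non-empty.
function configuredProviders(env: Record<string, string | undefined>): string[] {
  return Object.entries(PROVIDER_KEYS)
    .filter(([, key]) => Boolean(env[key]))
    .map(([provider]) => provider);
}

// Example: only OpenAI and xAI keys are set (placeholder values).
console.log(
  configuredProviders({ OPENAI_API_KEY: "sk-proj-x", XAI_API_KEY: "xai-x" }),
);
// → [ 'openai', 'xai' ]
```

A check like this lets the CLI fail fast with a clear message when the configured `model_provider` has no matching key.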

README.md

Lines changed: 16 additions & 6 deletions

````diff
@@ -1,4 +1,4 @@
-# Simple MCP Client to Explore MCP Servers / TypeScript [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://github.com/hideya/mcp-langchain-client-ts/blob/main/LICENSE) [![npm version](https://img.shields.io/npm/v/@h1deya/mcp-client-cli.svg)](https://www.npmjs.com/package/@h1deya/mcp-client-cli)
+# Simple MCP Client to Explore MCP Servers [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://github.com/hideya/mcp-langchain-client-ts/blob/main/LICENSE) [![npm version](https://img.shields.io/npm/v/@h1deya/mcp-client-cli.svg)](https://www.npmjs.com/package/@h1deya/mcp-client-cli)


 **Quickly test and explore MCP servers from the command line!**
@@ -21,11 +21,12 @@ A Python equivalent of this utility is available [here](https://pypi.org/project
 - Node.js 18+
 - [optional] [`uv` (`uvx`)](https://docs.astral.sh/uv/getting-started/installation/)
   installed to run Python-based local (stdio) MCP servers
-- LLM API keys from
+- LLM API key(s) from
   [OpenAI](https://platform.openai.com/api-keys),
   [Anthropic](https://console.anthropic.com/settings/keys),
+  [Google AI Studio (for GenAI/Gemini)](https://aistudio.google.com/apikey),
   and/or
-  [Google AI Studio (for GenAI/Gemini)](https://aistudio.google.com/apikey)
+  [xAI](https://console.x.ai/),
   as needed

 ## Quick Start
@@ -51,6 +52,8 @@ A Python equivalent of this utility is available [here](https://pypi.org/project
     // "model": "claude-3-5-haiku-latest",
     // "model_provider": "google_genai",
     // "model": "gemini-2.5-flash",
+    // "model_provider": "xai",
+    // "model": "grok-3-mini",
   },

   "mcp_servers": {
@@ -71,7 +74,8 @@ A Python equivalent of this utility is available [here](https://pypi.org/project
 ```bash
 echo "ANTHROPIC_API_KEY=sk-ant-...
 OPENAI_API_KEY=sk-proj-...
-GOOGLE_API_KEY=AI..." > .env
+GOOGLE_API_KEY=AI...
+XAI_API_KEY=xai-..." > .env

 code .env
 ```
@@ -163,7 +167,7 @@ Create a `llm_mcp_config.json5` file:
 {
   "llm": {
     "model_provider": "openai",
-    "model": "gpt-4.1-nano",
+    "model": "gpt-4o-mini",
     // model: "o4-mini",
   },

@@ -177,7 +181,13 @@ Create a `llm_mcp_config.json5` file:
   // "model_provider": "google_genai",
   // "model": "gemini-2.5-flash",
   // // "model": "gemini-2.5-pro",
-  // }
+  // },
+
+  // "llm": {
+  //   "model_provider": "xai",
+  //   "model": "grok-3-mini",
+  //   // "model": "grok-4",
+  // },

   "example_queries": [
     "Tell me how LLMs work in a few sentences",
````
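The `echo` one-liner in the README overwrites any existing `.env`. A hedged alternative for users who already have one is to append only the missing key (the value here is a placeholder, exactly as in the template):

```shell
# Add XAI_API_KEY to .env only if it is not already present.
# Re-running this is a no-op, so an existing .env is never clobbered.
touch .env
grep -q '^XAI_API_KEY=' .env || echo 'XAI_API_KEY=xai-...' >> .env
```

Repeating the `grep || echo` line for each provider key gives an idempotent setup step.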

README_DEV.md

Lines changed: 12 additions & 5 deletions

````diff
@@ -13,7 +13,7 @@ When testing LLM and MCP servers, their settings can be conveniently configured
 // "model_provider": "anthropic",
 // "model": "claude-3-5-haiku-latest",
 // "model_provider": "google_genai",
-// "model": "gemini-2.0-flash",
+// "model": "gemini-2.5-flash",
 },

 "mcp_servers": {
@@ -69,12 +69,15 @@ A Python version of this MCP client is available
 - npm 7+ (`npx`) to run Node.js-based MCP servers
 - [optional] [`uv` (`uvx`)](https://docs.astral.sh/uv/getting-started/installation/)
   installed to run Python-based MCP servers
-- API keys from [Anthropic](https://console.anthropic.com/settings/keys),
-  [OpenAI](https://platform.openai.com/api-keys), and/or
-  [Google GenAI](https://aistudio.google.com/apikey)
-  as needed.
+- LLM API keys from
+  [OpenAI](https://platform.openai.com/api-keys),
+  [Anthropic](https://console.anthropic.com/settings/keys),
+  and/or
+  [Google AI Studio (for GenAI/Gemini)](https://aistudio.google.com/apikey)
+  as needed

 ## Setup
+
 1. Clone the repository:
 ```bash
 git clone https://github.com/hideya/mcp-client-langchain-ts.git
@@ -126,3 +129,7 @@ See commandline options:
 ```bash
 npm run start:h
 ```
+
+At the prompt, you can simply press Enter to use example queries that perform MCP server tool invocations.
+
+Example queries can be configured in `llm_mcp_config.json5`
````

llm_mcp_config.json5

Lines changed: 10 additions & 1 deletion

```diff
@@ -19,7 +19,7 @@
   // // https://platform.openai.com/docs/pricing
   // // https://platform.openai.com/settings/organization/billing/overview
   // "model_provider": "openai",
-  // "model": "gpt-4.1-nano",
+  // "model": "gpt-4o-mini",
   // // "model": "o4-mini",
   // // "temperature": 0.0, // 'temperature' is not supported with "o4-mini"
   // // "max_completion_tokens": 10000, // Use 'max_completion_tokens' instead of 'max_tokens'
@@ -35,6 +35,15 @@
   // // "max_tokens": 10000,
   },

+  // "llm": {
+  //   // https://console.x.ai/
+  //   "model_provider": "xai",
+  //   "model": "grok-3-mini",
+  //   // "model": "grok-4",
+  //   // "temperature": 0.0,
+  //   // "max_tokens": 10000,
+  // },
+
   "example_queries": [
     "Read the news headlines on bbc.com",
     "Read and briefly summarize the LICENSE file",
```
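The config keeps alternative `"llm"` blocks commented out so only one is active at a time; every block boils down to a `model_provider` plus a `model`, with optional tuning fields. A minimal TypeScript sketch of validating such a block after parsing (the interface and function are illustrative assumptions, not this client's actual code):

```typescript
// Illustrative shape of the active "llm" block. Field names follow
// llm_mcp_config.json5; validation logic is a sketch only.
interface LlmConfig {
  model_provider: "openai" | "anthropic" | "google_genai" | "xai";
  model: string;
  temperature?: number;
  max_tokens?: number;
}

function validateLlm(raw: Record<string, unknown>): LlmConfig {
  const providers = ["openai", "anthropic", "google_genai", "xai"];
  if (!providers.includes(raw.model_provider as string)) {
    throw new Error(`unknown model_provider: ${raw.model_provider}`);
  }
  if (typeof raw.model !== "string" || raw.model.length === 0) {
    throw new Error("model must be a non-empty string");
  }
  return raw as unknown as LlmConfig;
}

// The new xAI block from the config, uncommented:
const llm = validateLlm({ model_provider: "xai", model: "grok-3-mini" });
console.log(llm.model); // → grok-3-mini
```

Rejecting unknown providers up front gives a clearer error than a failed API call later.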

0 commit comments