# Simple CLI MCP Client Using LangChain / TypeScript [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://github.com/hideya/mcp-langchain-client-ts/blob/main/LICENSE)

This is a simple [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) client
intended for trying out MCP servers via a command-line interface.

The LLM and MCP server settings can be conveniently configured via a configuration file such as the following:

```json5
{
  "llm": {
    "provider": "openai", "model": "gpt-5-mini",
    // "provider": "anthropic", "model": "claude-3-5-haiku-latest",
    // "provider": "google_genai", "model": "gemini-2.5-flash",
  },

  "mcp_servers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    },

    "weather": {
      "command": "npx",
      "args": ["-y", "@h1deya/mcp-server-weather"]
    },

    // Auto-detection: tries Streamable HTTP first, falls back to SSE
    "remote-mcp-server": {
      "url": "https://${SERVER_HOST}:${SERVER_PORT}/..."
    },

    // Example of authentication via Authorization header
    "github": {
      "type": "http",  // recommended to specify the protocol explicitly when authentication is used
      "url": "https://api.githubcopilot.com/mcp/",
      "headers": {
        "Authorization": "Bearer ${GITHUB_PERSONAL_ACCESS_TOKEN}"
      }
    },
  }
}
```

It leverages the [LangChain ReAct Agent](https://github.com/langchain-ai/react-agent-js) and
the utility function `convertMcpToLangchainTools()` from
[`@h1deya/langchain-mcp-tools`](https://www.npmjs.com/package/@h1deya/langchain-mcp-tools).
This function initializes the specified MCP servers in parallel
and converts their available tools into an array of LangChain-compatible tools
([`StructuredTool[]`](https://api.js.langchain.com/classes/_langchain_core.tools.StructuredTool.html)).
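
The sketch below shows roughly how these pieces can be wired together. It is only an illustration, not the exact code of this client: the `{ tools, cleanup }` return shape follows the `@h1deya/langchain-mcp-tools` documentation, and the model and server choices simply mirror the configuration example above.

```typescript
import { ChatOpenAI } from "@langchain/openai";
import { HumanMessage } from "@langchain/core/messages";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { convertMcpToLangchainTools } from "@h1deya/langchain-mcp-tools";

async function main(): Promise<void> {
  // Start the configured MCP servers (in parallel) and collect their tools
  // as LangChain-compatible StructuredTool instances.
  const { tools, cleanup } = await convertMcpToLangchainTools({
    fetch: { command: "uvx", args: ["mcp-server-fetch"] },
    weather: { command: "npx", args: ["-y", "@h1deya/mcp-server-weather"] },
  });

  // Hand the converted tools to a LangChain ReAct agent.
  const llm = new ChatOpenAI({ model: "gpt-5-mini" });
  const agent = createReactAgent({ llm, tools });

  const result = await agent.invoke({
    messages: [new HumanMessage("Are there any weather alerts in California?")],
  });
  console.log(result.messages.at(-1)?.content);

  // Shut down the MCP server sessions when done.
  await cleanup();
}

main().catch(console.error);
```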

This client supports both local (stdio) MCP servers and
remote (Streamable HTTP / SSE / WebSocket) MCP servers,
which are accessed via a URL plus optional headers, e.g. for authentication.

This client supports only the text results of MCP tool calls and disregards other result types.

To make debugging MCP servers easier, this client prints local (stdio) MCP server logs to the console.

LLMs from Anthropic, OpenAI, and Google (GenAI) are currently supported.

A Python version of this MCP client is available
[here](https://github.com/hideya/mcp-client-langchain-py).

## Building and Running from the Source

### Prerequisites

- Node.js 18+
- npm 7+ (`npx`) to run Node.js-based MCP servers
- [optional] [`uv` (`uvx`)](https://docs.astral.sh/uv/getting-started/installation/)
  installed to run Python-based MCP servers
- LLM API keys from
  [OpenAI](https://platform.openai.com/api-keys),
  [Anthropic](https://console.anthropic.com/settings/keys),
  and/or
  [Google AI Studio (for GenAI/Gemini)](https://aistudio.google.com/apikey)
  as needed
- git

### Setup

1. Clone this repository and `cd` into it.

2. Install dependencies:

    ```bash
    npm install
    ```

3. Set up API keys:

    ```bash
    cp .env.template .env
    ```

    - Update `.env` as needed.
    - `.gitignore` is configured to ignore `.env`
      to prevent accidental commits of the credentials.

4. Configure the LLM and MCP server settings in `llm_mcp_config.json5` as needed.

    - [The configuration file format](https://github.com/hideya/mcp-client-langchain-ts/blob/main/llm_mcp_config.json5)
      for MCP servers follows the same structure as
      [Claude for Desktop](https://modelcontextprotocol.io/quickstart/user),
      with one difference: the key name `mcpServers` has been changed
      to `mcp_servers` to follow the snake_case convention
      commonly used in JSON configuration files.
    - The file format is [JSON5](https://json5.org/),
      where comments and trailing commas are allowed.
    - The format is further extended to replace `${...}` notations
      with the values of the corresponding environment variables
      (see the sketch after this list).
    - Keep all the credentials and private info in the `.env` file
      and refer to them with `${...}` notation as needed.
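
As a rough illustration of the `${...}` expansion described above, a small helper along the following lines could load the config file and substitute environment variables. This is only a sketch, not necessarily how this client implements it; the `loadConfig` name is hypothetical, and it assumes the `dotenv` and `json5` npm packages.

```typescript
// Illustrative only: expand "${VAR}" placeholders in a JSON5 config file
// using environment variables (e.g. loaded from `.env` via dotenv).
import "dotenv/config";
import { readFileSync } from "node:fs";
import JSON5 from "json5";

export function loadConfig(path: string): unknown {
  const raw = readFileSync(path, "utf-8");
  // Replace each ${NAME} with process.env.NAME (empty string if unset).
  const expanded = raw.replace(/\$\{(\w+)\}/g, (_, name) => process.env[name] ?? "");
  return JSON5.parse(expanded);
}

// Example usage: const config = loadConfig("llm_mcp_config.json5");
```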

### Usage

Run the app:
```bash
npm run start
```

See command-line options:
```bash
npm run start:h
```

At the prompt, you can simply press Enter to use example queries that perform MCP server tool invocations.

Example queries can be configured in `llm_mcp_config.json5`.