Enable AI-powered industrial automation through Model Context Protocol integration.
The MCP (Model Context Protocol) Tool Connector bridges FrameworX solutions with AI language models, enabling intelligent automation assistance while maintaining industrial-grade safety and determinism. This connector exposes your solution's data and functionality as structured tools that AI models can invoke with parameters, not just generate text responses.
| Property | Value |
|---|---|
| Name | MCP Tool |
| Protocol | Model Context Protocol (MCP) |
| Interface | stdio/HTTP |
| Runtime | .NET 8.0 |
| AI Models | Claude |
| Configuration | Scripts → Classes |
According to the Claude documentation, there are three ways to connect to the MCP server:
The following configuration uses stdio, which requires the MCP server to run on the same machine as the LLM. Therefore, if your solution is hosted on a remote machine, you must install the product in both environments: the local machine (where the LLM runs) and the remote machine (where the runtime is hosted).
Configure your AI client (e.g., Claude Desktop) to connect to the MCP server:
{
"mcpServers": {
"YourSolution": {
"command": "C:\\FrameworX\\fx-10\\net8.0\\TMCPServerStdio\\TMCPServerStdio.exe",
"args": ["/host:127.0.0.1", "/port:3101"],
"transport": "stdio"
}
}
}
If your solution is Windows-only, the command should be "<ProductPath>\\fx-10\\TMCPServerStdio\\TMCPServerStdio.exe".
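As a concrete sketch of the Windows-only variant (assuming the product is installed under C:\FrameworX; the solution name and runtime address are the same placeholders used in the example above):

```json
{
  "mcpServers": {
    "YourSolution": {
      "command": "C:\\FrameworX\\fx-10\\TMCPServerStdio\\TMCPServerStdio.exe",
      "args": ["/host:127.0.0.1", "/port:3101"],
      "transport": "stdio"
    }
  }
}
```

Note the only difference from the earlier configuration is the absence of the net8.0 folder in the command path.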
This setup was tested only with GitHub Copilot. It is expected to work with other LLMs as well, but they have not been tested.
Program: TMCPServerHttp
Install Path: <ProductPath>\fx-10\net8.0\TMCPServerHttp\ (.NET 8.0 Runtime required to execute TMCPServerHttp.dll)
Arguments in the LLM settings file, e.g., .mcp.json (GitHub Copilot) or claude_desktop_config.json (Claude AI); the location may vary depending on the LLM:
Url: <url>: Endpoint containing Host:Port. Ex: http://localhost:3000, https://192.168.20.2:3000
When using an HTTP/SSL certificate, the URL must use "https".
Headers: X_API_KEY (optional), used for authentication
Arguments in TMCPServerHttp.json (located in C:\Users\Public\Documents\FrameworX\MachineSettings):
X_API_KEY: (optional) Used for authentication. It is a string password that a client must supply when connecting to the MCP server.
/CertFileName:<filename>: Certificate file name. Used for HTTP/SSL. Works together with "CertPass".
/CertPass:<pass>: Certificate password. Used for HTTP/SSL. Works together with "CertFileName".
/CertHash:<hash>: Certificate thumbprint. Used for HTTP/SSL. Located in the certificate store (StoreName.My, StoreLocation.LocalMachine).
/ListenPort:<port>: Port to listen on (endpoint). Default is 3000.
Runtime section:
/Host:<host>: Host/IP where the runtime is running. Default is localhost.
/Port:<port>: Port where the runtime is running. Default is 3101.
/UserName:<name>: User name used for the connection to the runtime.
/Password:<pass>: User password used for the connection to the runtime.
To connect to several runtimes and/or listen on several ports, call TMCPServerHttp with the argument "/instance:<number>". When an instance number is used, the corresponding setting names must include it.
Example: TMCPServerHttp /instance:2 => X_API_KEY2, ListenPort2, Runtime2, etc.
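As a sketch of an instanced configuration (assuming the instanced keys simply append the instance number to the names shown above; the port values here are placeholders), TMCPServerHttp.json for /instance:2 might contain:

```json
{
  "appSettings": {
    "X_API_KEY2": "",
    "ListenPort2": "3001",
    "Runtime2": {
      "Host": "localhost",
      "Port": "3102",
      "Username": "guest",
      "Password": ""
    }
  }
}
```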
{
"servers": {
"FrameworX Runtime - HeaderLayout": {
"type": "http",
"url": "http://localhost:3000",
"headers": {
"X-API-KEY": ""
}
}
}
}
{
"appSettings": {
"X_API_KEY": "",
"CertFileName": "",
"CertPass": "",
"CertHash": "",
"ListenPort": "3000",
"Runtime": {
"Host": "localhost",
"Port": "3101",
"Username": "guest",
"Password": ""
}
}
}
SSE configuration was not tested.
When the MCP server is running, three methods are available by default:
You can add many more by creating custom methods. To do so, follow the steps below:
- Description is the question you will ask the LLM.
- MethodName is what the LLM will call. A space is inserted before each capital letter in the LLM interface. Example: GetTankLevel will appear as "Get Tank Level".
- ParametersDescription explains what is expected for each parameter.
- Parameters define the inputs.
- Content defines the behavior or response of the method.
- Result is what the LLM receives in order to generate the answer, which it then converts into a more human-readable response.
[McpServerTool, Description("<Description>")]
public string <MethodName>(
[Description("<ParametersDescription>")] <Parameters>)
{
<Content>
return <Result>;
}
Example:
[McpServerTool, Description("Get current tank level")]
public string GetTankLevel(
[Description("Tank identifier")] string tankId = "4")
{
return @Tag[$"Tank_{tankId}_Level"].ToString();
}
After every configuration change or after shutting down and restarting the solution, you must fully restart the Desktop LLM. This means not only closing the application window, but also terminating the process (either from the system tray icon or via the Task Manager). This is a requirement of the LLM.
Once configured, the AI can access your solution data:
- General questions
- Alarm
- Historian
- Custom Methods (methods you created)
Build your first MCP Tool and connect it to Claude AI:
How to Build an MCP Tool - Step-by-step guide for creating and deploying MCP Tools
Explore a complete working example with MQTT integration:
SolarPanels MCP Demo - Full solution demonstrating MCP Tools with solar panel monitoring
[McpServerTool, Description("Safe data retrieval")]
public string GetData(string tagName)
{
try
{
if (!@Tag.Exists(tagName))
return $"Error: Tag {tagName} not found";
return @Tag[tagName].ToString();
}
catch (Exception ex)
{
@Info.Trace($"MCP Error: {ex.Message}");
return "Error: Unable to retrieve data";
}
}
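MCP clients exchange JSON-RPC 2.0 messages with the server; the LLM client normally builds these for you. Purely as an illustration, the sketch below constructs the `tools/call` request a client would send over the HTTP transport to invoke the GetData tool shown above. The endpoint and header values are assumptions matching the earlier configuration; the tag name is a placeholder.

```python
import json

# Assumed values, matching the HTTP configuration shown earlier on this page.
MCP_URL = "http://localhost:3000"
HEADERS = {
    "Content-Type": "application/json",
    "X-API-KEY": "",  # optional; must match X_API_KEY in TMCPServerHttp.json
}

def build_tools_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 'tools/call' request body as defined by MCP."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Invoke the GetData tool defined above; "Tank_4_Level" is a placeholder tag.
body = build_tools_call("GetData", {"tagName": "Tank_4_Level"})
print(body)
```

Posting this body to the configured URL (with the X-API-KEY header if authentication is enabled) returns a JSON-RPC response whose result carries the string your method returned.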
- MCP Server not starting
- AI cannot access methods (check Runtime → Startup)
- Data not updating