Enable AI-powered industrial automation through Model Context Protocol integration.

  • Name: MCPServer
  • Version: 1.0.0.0
  • Interface: TCP/IP
  • Configuration:
    • Scripts / Classes


Overview

The MCP (Model Context Protocol) Tool Connector bridges FrameworX solutions with AI language models, enabling intelligent automation assistance while maintaining industrial-grade safety and determinism. This connector exposes your solution's data and functionality as structured tools that AI models can invoke with parameters, not just generate text responses.

Key Capabilities

  • Structured AI Integration - Expose specific methods and functions to AI models with type-safe parameters
  • Real-time Data Access - Query live tag values, historian data, and alarm states through AI
  • Bi-directional Communication - AI can both read from and write to your industrial systems (with appropriate safeguards)
  • Platform Agnostic - Works with Claude, GPT, and other MCP-compatible AI models


Integration Architecture

[Diagram: MCP Protocol Architecture — the AI model (Claude / GPT) communicates via the Model Context Protocol with the FrameworX solution, which exposes structured methods: tag data access, historian queries, alarm monitoring, and real-time events.]



Technical Specifications

  • Name: MCP Tool
  • Protocol: Model Context Protocol (MCP)
  • Interface: stdio/HTTP
  • Runtime: .NET 8.0
  • AI Models: Claude
  • Configuration: Scripts → Classes

Prerequisites

  • FrameworX 10.1 or later
  • .NET 8.0 runtime
  • Claude Desktop or compatible MCP client
  • Network connectivity

Configuration Workflow

According to the Claude documentation, there are three ways to connect to the MCP server:

  1. stdio
  2. HTTP
  3. SSE 

stdio

The following configuration uses stdio, which requires the MCP server to run on the same machine as the LLM. Therefore, if your solution is hosted on a remote machine, you must install the product in both environments: the local machine (where the LLM runs) and the remote machine (where the runtime is hosted).

Configure your AI client (e.g., Claude Desktop) to connect to the MCP server:

{
  "mcpServers": {
    "YourSolution": {
      "command": "C:\\FrameworX\\fx-10\\net8.0\\TMCPServerStdio\\TMCPServerStdio.exe",
      "args": ["/host:127.0.0.1", "/port:3101"],
      "transport": "stdio"
    }
  }
}

If your solution is Windows-only, the command should be "<ProductPath>\\fx-10\\TMCPServerStdio\\TMCPServerStdio.exe".
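For a Windows-only installation, only the command path changes in the Claude Desktop entry. A sketch, keeping <ProductPath> as a placeholder for your actual install location:

{
  "mcpServers": {
    "YourSolution": {
      "command": "<ProductPath>\\fx-10\\TMCPServerStdio\\TMCPServerStdio.exe",
      "args": ["/host:127.0.0.1", "/port:3101"],
      "transport": "stdio"
    }
  }
}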

HTTP

The HTTP transport was tested only with GitHub Copilot. We expect it to work with other LLMs as well, but they have not been tested.

Program: TMCPServerHttp

Install Path: <ProductPath>\fx-10\net8.0\TMCPServerHttp\ (NET 8.0 Runtime required to execute TMCPServerHttp.dll)

LLM Setting 

Arguments in the LLM settings file, for example .mcp.json (GitHub Copilot) or claude_desktop_config.json (Claude AI); the location may vary depending on the LLM:

  1. Url:<url>: Endpoint containing Host:Port. Ex: http://localhost:3000, https://192.168.20.2:3000
    When using an HTTP/SSL certificate, the URL must include "https".

  2. Headers: X_API_KEY (optional), used for authentication

MCP Server Setting

Arguments in TMCPServerHttp.json (located in C:\Users\Public\Documents\FrameworX\MachineSettings):

  1. X_API_KEY: (optional) Used for authentication. A string password that the client must supply when connecting to the MCP server.

  2. /CertFileName:<filename>: Certificate file name. Used for HTTP/SSL; works together with "CertPass".

  3. /CertPass:<pass>: Certificate password. Used for HTTP/SSL; works together with "CertFileName".

  4. /CertHash:<hash>: Certificate thumbprint. Used for HTTP/SSL. The certificate must be installed in the Windows certificate store (StoreName.My, StoreLocation.LocalMachine).

  5. /ListenPort:<port>: Port to listen on (endpoint). Default is 3000.

  6. Runtime section:

    1. /Host:<host>: Host/IP where the runtime is running. Default is localhost.

    2. /Port:<port>: Port where the runtime is running. Default is 3101.

    3. /UserName:<name>: User name used for the connection to the runtime.

    4. /Password:<pass>: User password used for the connection to the runtime.

Multiple connections

To connect to several runtimes and/or listen on several ports, call TMCPServerHttp with the argument "/instance:<number>".

  1. The setting names in TMCPServerHttp.json must include the instance number as a suffix.

  2. Example: TMCPServerHttp /instance:2 => X_API_KEY2, ListenPort2, Runtime2, etc.
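As a sketch, the TMCPServerHttp.json settings for /instance:2 might look like this, assuming the instance number is appended directly to each setting name as in the example above (the port values are illustrative):

{
  "appSettings": {
    "X_API_KEY2": "",
    "ListenPort2": "3001",
    "Runtime2": {
      "Host": "localhost",
      "Port": "3102",
      "Username": "guest",
      "Password": ""
    }
  }
}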


.mcp.json (GitHub Copilot)
{
   "servers": {
    "FrameworX Runtime - HeaderLayout": {
      "type": "http",
      "url": "http://localhost:3000",
      "headers": {
        "X-API-KEY": ""
      }
    }
  }
}
TMCPServerHttp.json
{
  "appSettings": {
    "X_API_KEY": "",
    "CertFileName": "",
    "CertPass": "",
    "CertHash": "",
    "ListenPort": "3000",
    "Runtime": {
      "Host": "localhost",
      "Port": "3101",
      "Username": "guest",
      "Password": ""
    }
  }
}
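For reference, a hypothetical variant of TMCPServerHttp.json with HTTP/SSL and an API key enabled might look like the following (the certificate file name, certificate password, and key value are placeholders; remember that the client URL must then use "https"):

{
  "appSettings": {
    "X_API_KEY": "my-secret-key",
    "CertFileName": "server-cert.pfx",
    "CertPass": "cert-password",
    "CertHash": "",
    "ListenPort": "3000",
    "Runtime": {
      "Host": "localhost",
      "Port": "3101",
      "Username": "guest",
      "Password": ""
    }
  }
}

Alternatively, leave CertFileName and CertPass empty and set CertHash to the thumbprint of a certificate installed in the Windows certificate store.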

SSE

SSE configuration was not tested.



Available Methods

When the MCP server is running, three methods are available by default:

  1. Get Value
  2. Get Historian
  3. Get Alarms

You can add many more by creating custom methods. To create one, follow the steps below:

  1. Navigate to Scripts → Classes
  2. Create a new class and select the MCP Tool type
  3. Define methods using MCP decorators, where:
  • Description is the question you will ask the LLM.

  • MethodName is what the LLM will call. A space is inserted before each capital letter in the LLM interface.

    • Example: GetTankLevel will appear as "Get Tank Level".

  • ParametersDescription explains what is expected for each parameter.

  • Parameters define the inputs.

  • Content defines the behavior or response of the method.

  • Result is what the LLM will receive in order to generate the answer, which it will then convert into a more human-readable response.

[McpServerTool, Description("<Description>")]
public string <MethodName>(
    [Description("<ParametersDescription>")] <Parameters>)
{
    <Content>
    return <Result>;
}

Example:

[McpServerTool, Description("Get current tank level")]
public string GetTankLevel(
    [Description("Tank identifier")] string tankId = "4")
{
    return @Tag[$"Tank_{tankId}_Level"].ToString();
}


After every configuration change or after shutting down and restarting the solution, you must fully restart the Desktop LLM. This means not only closing the application window, but also terminating the process (either from the system tray icon or via the Task Manager). This is a requirement of the LLM.

How to verify that the setup is correct

  1. Click the LLM connector; you should see the <SolutionName> name.
  2. Go to Settings > Developer and check that your solution is in a good state (running). (Depends on the LLM.)

Query Through AI

Once configured, the AI can access your solution data:

General questions

  • What is the Server.SystemMonitor.Uptime of BottlingLine MCP Demo?
  • Get value of Tag.Machine1.Temperature1

Alarm

  • Is there any active alarm, or has there been one recently? Any acknowledged?
  • Retrieve the alarms again, and check how many acknowledged alarms we have, and how many are not acknowledged.

Historian

  • Retrieve the historical Brewery/BrewHouse/Temperature values from today.
  • Show a trend visualization (only the trend) with the historical values of Brewery/BrewHouse/Temperature from today in JSX format
  • Get all the historical values from today and return the highest value and the time it happened

Custom Methods (methods you created)

  • Get the last anomalies from the Bottling Line
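A query like this one assumes a matching custom method exists in Scripts → Classes. A minimal sketch following the template above (the GetLastAnomalies method name and the BottlingLine.LastAnomalies tag are hypothetical):

[McpServerTool, Description("Get the last anomalies from the Bottling Line")]
public string GetLastAnomalies()
{
    // Hypothetical tag holding the most recent anomaly records
    return @Tag.BottlingLine.LastAnomalies.ToString();
}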



Getting Started

Quick Start Tutorial

Build your first MCP Tool and connect it to Claude AI:

How to Build an MCP Tool - Step-by-step guide for creating and deploying MCP Tools

Example Implementation

Explore a complete working example with MQTT integration:

SolarPanels MCP Demo - Full solution demonstrating MCP Tools with solar panel monitoring

Best Practices

Security Considerations

  • Authentication - Implement proper authentication for MCP endpoints
  • Authorization - Define clear permission levels for AI operations
  • Validation - Always validate AI-generated parameters before execution
  • Audit Trail - Log all AI-initiated actions for compliance

Performance Optimization

  • Cache frequently accessed data
  • Implement rate limiting for AI queries
  • Use asynchronous methods for long-running operations
  • Monitor resource usage of MCP server process
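To illustrate the asynchronous recommendation, a long-running query can be offloaded so it does not block the MCP server. This sketch assumes the MCP tool attribute accepts async methods in your FrameworX version (verify before relying on it); the method and tag names are hypothetical:

[McpServerTool, Description("Run a long historian aggregation without blocking")]
public async Task<string> GetDailyAverageAsync(
    [Description("Tag name to aggregate")] string tagName)
{
    // Offload the long-running read/aggregation to a background thread (placeholder logic)
    return await Task.Run(() => @Tag[tagName].ToString());
}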

Error Handling

[McpServerTool, Description("Safe data retrieval")]
public string GetData(string tagName)
{
    try
    {
        if (!@Tag.Exists(tagName))
            return $"Error: Tag {tagName} not found";
        
        return @Tag[tagName].ToString();
    }
    catch (Exception ex)
    {
        @Info.Trace($"MCP Error: {ex.Message}");
        return "Error: Unable to retrieve data";
    }
}

Troubleshooting

MCP Server not starting

  • Verify .NET 8.0 runtime installation
  • Check firewall settings for configured port
  • Confirm TMCPServerStdio.exe path is correct

AI cannot access methods

  • Ensure MCP decorators are properly applied
  • Verify solution is running (Runtime → Startup)
  • Check AI client configuration matches server settings
  • Verify that .NET 8.0 is installed on your computer
  • Restart your LLM completely after running the solution

Data not updating

  • Confirm tags are properly configured
  • Verify real-time database connectivity
  • Check MCP method error handling
