DRAFT v10.1.5. Pre-release draft for content review. Do not link from public material. The final page replaces this draft once 10.1.5 ships.



  • Sector: Historian & Databases
  • Group: SQL Databases
  • Connector: MongoDB Database
  • Name: MongoDB



New in 10.1.5. MongoDB is a new Dataset provider for Find, Aggregate, Count, Insert, Update, and Delete operations on MongoDB document databases.

  • Name: MongoDB
  • Version: 1.0.0.0
  • Protocol: Proprietary (MongoDB Wire Protocol)
  • Interface: TCP/IP
  • Runtime: .NET Framework 4.8 (Windows) and .NET 10 (cross-platform)
  • Configuration:
    • Datasets / DBs
  • Minimum server version: MongoDB 6.0 (recommended 7.0 LTS)


Overview

Use MongoDB with the Datasets module to read and write document data from a FrameworX solution. The provider exposes three query shapes through the standard Dataset Query object and binds full document collections through Dataset Table for CRUD work.

Steps to connect:

  1. Install a MongoDB server and create a database.
  2. Create a Database Connection with the MongoDB provider.
  3. Configure the connection string and test the connection.
  4. Create Dataset Queries for Find, Aggregate, or Count.
  5. Create Dataset Tables to bind collections for insert, update, and delete.
  6. Call the queries and tables from scripts, displays, or reports.

Installation

Install a MongoDB server (6.0 or newer, 7.0 LTS recommended), create a database and user, then move to the FrameworX configuration below.

Install the MongoDB server

Follow the official MongoDB installation guide for your platform at mongodb.com/docs/manual/installation. Summary by target:

  • Windows: download the MongoDB Community Server MSI and run the installer. The default install registers mongod as a Windows service on port 27017.
  • Linux: install the distro package per the vendor instructions (for example, apt install mongodb-org on Debian or Ubuntu with the MongoDB repository configured). Start the service with systemctl start mongod.
  • macOS: brew tap mongodb/brew && brew install mongodb-community, then brew services start mongodb-community.
  • Docker: docker run -d --name fx-mongo -p 27017:27017 mongo:7.

Install the mongosh command-line shell alongside the server. Most platform installers include it by default.

Verify the server is reachable

Run the command below to confirm the server responds:

mongosh "mongodb://localhost:27017" --quiet --eval "db.adminCommand('ping').ok"

A successful ping returns 1.

Create the database and user

Run these commands in mongosh to create a target database, a starter collection, and a user for FrameworX to authenticate as:

use plant_data

db.readings.insertOne({ plant: "Plant01", value: 0, ts: new Date() })

db.createUser({
  user: "fxuser",
  pwd: "<choose-a-strong-password>",
  roles: [ { role: "readWrite", db: "plant_data" } ]
})

The user is stored against the plant_data database above. Use plant_data as the Auth Source value in the FrameworX connection string in the next section, or use admin when the user lives in the admin database.

Platform compatibility

The MongoDB connector targets netstandard2.0 and runs unchanged on both .NET Framework 4.8 (Designer and Windows Runtime) and .NET 10 (cross-platform Runtime on Windows, Linux, and containers). The vendored MongoDB driver ships in both flavors with the FrameworX installation. The mongod server is OS-native and has no .NET dependency.


Configuration

Follow the steps below to connect FrameworX to a MongoDB server.

  1. Access Datasets / DBs.
  2. Click the plus icon to create a new Database Connection.
  3. In the Name field, enter a name for the connection, for example MongoDB1.
  4. Choose MongoDB Data Provider as the Provider.
  5. Click OK.
  6. In the data grid, click the Connection String column of the newly created row.
  7. Configure the connection. The Designer dialog shows structured fields for Server, Port, User, Password, Database, Auth Source, and TLS. For replica sets and Atlas clusters, use the Advanced Connection URI field with a mongodb:// or mongodb+srv:// URI, which overrides the structured fields.
  8. Click Test to verify the connection with the MongoDB server.

Connection string options

  • Server (required): Host name or IP of the MongoDB server. Example: localhost
  • Port (optional): TCP port. Default is 27017. Example: 27017
  • User (optional): Database user. Leave empty for unauthenticated access. Example: appuser
  • Password (optional): User password. Stored encrypted in the solution file.
  • Database (required): Target database name. Example: plant_data
  • Auth Source (optional): SCRAM authentication database. Default is admin. Example: admin
  • TLS (optional): Set to true for TLS/SSL connections. Example: true
  • Connection URI (optional): Full MongoDB URI. Overrides the structured fields. Use for replica sets and mongodb+srv:// Atlas URIs. Example: mongodb+srv://user:pwd@cluster0.example.net/?retryWrites=true
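The structured fields compose into a standard MongoDB connection URI. As a rough illustration of how the fields map onto the URI format (the helper name and the exact string the provider emits are assumptions, not the provider's documented output):

```python
from urllib.parse import quote

def build_mongo_uri(server, database, port=27017, user=None, password=None,
                    auth_source="admin", tls=False):
    """Compose a mongodb:// URI from the structured connection fields.
    Illustrative sketch only; not the provider's actual implementation."""
    cred = ""
    if user:
        # User names and passwords must be percent-encoded inside the URI.
        cred = f"{quote(user, safe='')}:{quote(password or '', safe='')}@"
    options = [f"authSource={auth_source}"] if user else []
    if tls:
        options.append("tls=true")
    query = "?" + "&".join(options) if options else ""
    return f"mongodb://{cred}{server}:{port}/{database}{query}"
```

For example, build_mongo_uri("localhost", "plant_data", user="fxuser", password="p w", auth_source="plant_data") yields mongodb://fxuser:p%20w@localhost:27017/plant_data?authSource=plant_data, matching the Auth Source guidance in the user-creation section above.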

Query shapes

MongoDB queries use BSON filter documents and aggregation pipelines rather than SQL statements. The provider routes the Command Text of a Dataset Query to the correct driver operation based on the input shape:

  • A plain collection name (no braces or brackets) runs Find: returns all documents in the collection, with columns auto-detected from the first document.
  • A JSON object starting with { runs Find or Count: the provider reads the collection, filter, sort, and limit fields; if the object has a count key, it runs Count and returns a single-row table.
  • A JSON array starting with [ runs Aggregate: the text is treated as an aggregation pipeline. This shape requires a default collection set on the Dataset Query.
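The dispatch rules above can be sketched as a small routing function. This is an illustrative reconstruction of the described behavior, not the provider's actual source; the function name and return shape are assumptions:

```python
import json

def route_command(command_text, default_collection=None):
    """Decide which driver operation a Dataset Query's Command Text maps to.
    Returns (operation, collection, spec); illustrative sketch only."""
    text = command_text.strip()
    if text.startswith("["):
        # Aggregation pipeline: the collection comes from the Dataset Query.
        if default_collection is None:
            raise ValueError("Aggregate requires a default collection")
        return ("Aggregate", default_collection, json.loads(text))
    if text.startswith("{"):
        spec = json.loads(text)
        if "count" in spec:
            # Count returns a single-row table in the provider.
            return ("Count", spec["count"], spec.get("filter", {}))
        return ("Find", spec["collection"], spec.get("filter", {}))
    # Plain collection name: Find over every document.
    return ("Find", text, {})
```

For instance, route_command("readings") resolves to a Find over the whole collection, while route_command('{"count": "readings"}') resolves to a Count.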


Configuration Example

This example shows how to read the latest production readings from a MongoDB collection, run an aggregation, and write a new record. It uses objects from the Datasets, Unified Namespace, and Scripts modules, created in the steps below.

In Datasets / Queries, create the queries below. For full details on query configuration, see Datasets Queries Reference.

  • QueryFindReadings
    • DB: the MongoDB connection created above.
    • Command Text:
      { "collection": "readings", "filter": { "plant": "Plant01" }, "sort": { "ts": -1 }, "limit": 10 }
  • QueryAggregateHourly
    • DB: the MongoDB connection.
    • Default collection: readings.
    • Command Text (aggregation pipeline):
      [
        { "$match": { "plant": "Plant01" } },
        { "$group": { "_id": { "$dateTrunc": { "date": "$ts", "unit": "hour" } }, "avg": { "$avg": "$value" } } },
        { "$sort": { "_id": 1 } }
      ]
  • QueryCount
    • Command Text:
      { "count": "readings", "filter": { "quality": "good" } }

In Datasets / Tables, create a table named TableReadings bound to the readings collection for insert and update work. A Dataset Table configured on a MongoDB collection uses the document _id as the primary key. Updates apply $set on the tracked columns and preserve any fields outside the tracked column list, matching the behavior of SQL Dataset Tables backed by a CommandBuilder.
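The update behavior described above, keying on _id and applying $set only to tracked columns so untracked fields survive, can be sketched as follows. The helper name and document shapes are illustrative assumptions, not the provider's API:

```python
def build_update(doc_id, changed, tracked_columns):
    """Build a MongoDB UpdateOne filter and update document that sets only
    the tracked columns. Illustrative sketch of the described behavior."""
    set_fields = {k: v for k, v in changed.items() if k in tracked_columns}
    if not set_fields:
        raise ValueError("no tracked columns changed")
    # $set touches only the listed fields; any other fields already on the
    # stored document are left untouched by the update.
    return {"_id": doc_id}, {"$set": set_fields}
```

A change to an untracked field is simply dropped from the update document, which is what preserves fields outside the tracked column list.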

In Unified Namespace / Tags, create the tags used in the example:

  • QueryResult: receives the output of the Find and Aggregate queries.
  • TriggerFind, TriggerAggregate, TriggerInsert: script task triggers.
  • LatestPlantCode, LatestValue: values written back to MongoDB.

In Scripts / Tasks, create the tasks below, each triggered by the matching tag:

  • Find task

    @Tag.QueryResult = @Dataset.Query.QueryFindReadings.SelectCommand();
    @Info.Trace("Find OK: " + @Tag.QueryResult);
  • Aggregate task

    @Tag.QueryResult = @Dataset.Query.QueryAggregateHourly.SelectCommand();
    @Info.Trace("Aggregate OK: " + @Tag.QueryResult);
  • Insert task (via Dataset Table)

    @Dataset.Table.TableReadings.AddRow();
    @Dataset.Table.TableReadings.Row["plant"] = @Tag.LatestPlantCode;
    @Dataset.Table.TableReadings.Row["value"] = @Tag.LatestValue;
    @Dataset.Table.TableReadings.Row["ts"] = DateTime.UtcNow;
    int i = @Dataset.Table.TableReadings.Save();
    @Info.Trace("Insert OK: " + i);

After you finish the configuration and create the scripts, run the solution and trigger each task. Values arrive in the MongoDB readings collection and the Find and Aggregate results populate the QueryResult tag.

Notes on writes

  • The Save call on a Dataset Table dispatches to MongoDB UpdateOne with $set on tracked columns. Fields outside the tracked column list are preserved on update.
  • Batched changes use BulkWrite with the same tracking rules.
  • Authentication supports SCRAM-SHA-256, TLS, replica sets, and mongodb+srv:// URIs.
