Semantic Core for Sensor Telemetry Ingestion for Digital Twins

Digital twin platforms for smart cities must continuously receive diverse data from sensors, gateways, and services. In practice, these data are heterogeneous in indicator names, measurement units, timing conventions, and object identification, which makes integrations expensive and fragile and complicates subsequent verification. In this paper, we propose a minimal semantic core for first-stage telemetry ingestion in the DTwin platform, in which semantics serve as operational rules applied during data ingestion. The core includes a machine-readable model of entities and relationships, dictionaries of metrics and measurement units, a unified event format separated into a stable envelope and a payload, formal validation against data schemas, a mapping table for transforming raw fields into standardized [name, value, unit] measurements, and an ingestion service that canonicalizes each event record and enforces integrity through a SHA-256 cryptographic hash. The implementation ensures that correct events are ingested, incorrect ones are rejected without being recorded, and verification is reproducible through control examples, a testing protocol, and evidence snapshots. In smart city settings, such a telemetry ingestion foundation can support reliable monitoring of municipal buildings and infrastructure, including energy efficiency, indoor environmental quality, and data-driven operational decision-making.
The proposed approach thus provides a foundation for the stable integration of heterogeneous sensor data into digital twins and for further scaling of the platform.
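The ingestion steps described above — envelope validation, mapping of raw fields into standardized [name, value, unit] measurements, canonicalization of the event record, and SHA-256 integrity hashing — can be sketched as follows. This is a minimal illustration, not the DTwin platform's implementation: the field names, the mapping table, and the `normalize` helper are all assumptions made for the example.

```python
import hashlib
import json

# Hypothetical mapping table: raw field name -> (canonical metric name, unit, scale).
MAPPING = {
    "tmp_c": ("air_temperature", "degC", 1.0),
    "hum": ("relative_humidity", "percent", 1.0),
}

# Hypothetical stable-envelope fields; the real schema would be richer.
REQUIRED_ENVELOPE = {"event_id", "entity_id", "timestamp"}

def normalize(raw: dict) -> dict:
    """Validate the envelope, map raw payload fields to standardized
    [name, value, unit] measurements, and attach a SHA-256 hash computed
    over the canonical (sorted-key, compact) JSON form of the event."""
    if not REQUIRED_ENVELOPE <= raw.keys():
        # Invalid events are rejected without being recorded.
        raise ValueError("invalid event: missing envelope fields")
    measurements = []
    for field, value in raw.get("payload", {}).items():
        if field not in MAPPING:
            raise ValueError(f"invalid event: unknown field {field!r}")
        name, unit, scale = MAPPING[field]
        measurements.append({"name": name, "value": value * scale, "unit": unit})
    event = {
        "envelope": {k: raw[k] for k in sorted(REQUIRED_ENVELOPE)},
        "measurements": measurements,
    }
    # Canonicalization: sorted keys and compact separators make the hash
    # reproducible for semantically identical events.
    canonical = json.dumps(event, sort_keys=True, separators=(",", ":"))
    event["sha256"] = hashlib.sha256(canonical.encode("utf-8")).hexdigest()
    return event

ok = normalize({"event_id": "e1", "entity_id": "building-42",
                "timestamp": "2024-01-01T00:00:00Z",
                "payload": {"tmp_c": 21.5}})
print(ok["measurements"][0])  # standardized measurement record
print(ok["sha256"])           # integrity hash of the canonical form

try:
    normalize({"event_id": "e2", "payload": {"tmp_c": 21.5}})
except ValueError as err:
    print(err)                # rejected before any recording takes place
```

Because the hash is taken over the canonical serialization rather than the raw bytes, two ingestions of the same logical event yield the same digest, which is what makes the reproducible verification described in the abstract possible.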
