The Foundation

fw-libs — Core Libraries

The shared infrastructure every FiWorks component builds on

Shared C Libraries

Before any Generic Enabler gets rewritten, the foundation must be solid. The fw-libs are a set of C libraries that provide the common infrastructure every FiWorks component needs. The core libraries — fwBase, fwAlloc, fwJson, fwHash, and fwTrace — are battle-hardened in Orion-LD 1.x. The remaining libraries are new, purpose-built for the FiWorks platform. Thanks to them, every GE rewrite starts roughly half done: the HTTP, JSON, memory, logging, and @context plumbing is already written.

fwBase
  What it does: Base types & utilities — bool, string ops, time functions, error stack, FLOG callback infrastructure.
  Why it matters: Every fw-lib depends on fwBase — it's the foundation of the foundation.

fwAlloc
  What it does: Bump allocator — O(1) alloc, bulk free of the entire request context, no GC, no fragmentation.
  Why it matters: Replaces the JVM GC and the Node.js V8 GC — in every GE.

fwJson
  What it does: JSON parser — in-place and zero-copy, null-terminates in the read buffer, zero allocation for string values, produces an FtNode tree.
  Why it matters: Replaces Jackson, body-parser, and JSON.parse — in every GE.

fwHash
  What it does: Hash tables — branch-prediction-optimized, open addressing, Robin Hood probing.
  Why it matters: Subscriptions, device registries, token caches, session stores.

fwProm
  What it does: Prometheus metrics — lock-free atomics, allocation-free updates, counters/gauges/histograms, on-demand rendering.
  Why it matters: Built-in observability for every GE, no sidecar needed.

fwTrace
  What it does: Structured logging — near-zero overhead, 6400 trace levels, callback-based library logging.
  Why it matters: Replaces Morgan, SLF4J, and Winston — in every GE.

fwHttp
  What it does: HTTP server — epoll edge-triggered, zero-copy parsing, pre-allocated connection pool (1024 connections), SO_REUSEPORT multi-process scaling.
  Why it matters: Replaces Express, libmicrohttpd, Vert.x, and Spring — in every GE.

fwJsonld
  What it does: JSON-LD @context engine — downloads remote contexts (fwHttp client), parses them into dual fwHash tables (name→URI, URI→name), thread-safe context cache, full expansion/compaction of entities, prefix expansion, @vocab fallback, recursive context resolution, core context versioning (v1.0–v1.9).
  Why it matters: Every GE that touches NGSI-LD data needs @context handling; without fwJsonld, each GE would reimplement it. Designed for full JSON-LD compliance (no strcmp shortcuts — everything goes through fwHash).

fwNgsild
  What it does: NGSI-LD validation & rendering — entity/attribute/subscription/registration validation, attribute type detection, GeoJSON/DateTime/URI checking, normalized/concise/simplified format conversion.
  Why it matters: Shared validation layer for any GE that processes NGSI-LD payloads.

fwNgsiv2
  What it does: NGSIv2 validation and normalization — payload validation (a different attribute structure from NGSI-LD), query string parsing (q, mq, georel, geometry, coords), representation modes (keyValues, values, unique), NGSIv2 ↔ FtNode normalization.
  Why it matters: The NGSIv2 counterpart to fwNgsild — thinner (no @context semantic layer) but needed for the FiWorks NGSIv2 Broker. Enables NGSIv2 ↔ NGSI-LD translation when both brokers share the same kdb.
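The bump-allocation idea behind fwAlloc can be sketched in a few lines of C: a single offset advances through a fixed arena, and freeing the whole request context is one pointer reset. This is an illustrative toy, not the real fwAlloc API — the names (Arena, arenaAlloc, arenaReset) are assumptions:

```c
#include <assert.h>
#include <stddef.h>

/* Toy bump allocator in the spirit of fwAlloc (names are invented) */
typedef struct
{
  char*  buf;     /* arena backing memory       */
  size_t size;    /* total arena size in bytes  */
  size_t offset;  /* current bump position      */
} Arena;

static void arenaInit(Arena* a, char* buf, size_t size)
{
  a->buf    = buf;
  a->size   = size;
  a->offset = 0;
}

/* O(1) allocation: round the offset up to 8-byte alignment and advance it */
static void* arenaAlloc(Arena* a, size_t bytes)
{
  size_t aligned = (a->offset + 7) & ~(size_t) 7;

  if (aligned + bytes > a->size)
    return NULL;  /* arena exhausted */

  a->offset = aligned + bytes;
  return a->buf + aligned;
}

/* Bulk free: one assignment releases everything allocated so far */
static void arenaReset(Arena* a)
{
  a->offset = 0;
}
```

Per-request arenas are what make "no GC, no fragmentation" possible: the broker allocates freely while serving a request, then resets the arena when the response is sent.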
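fwJson's zero-copy trick — null-terminating string values directly in the read buffer — can be shown with a minimal sketch. The function name and signature are invented for illustration, and escape sequences are ignored for brevity; the real fwJson also handles numbers, nesting, and FtNode tree building:

```c
#include <assert.h>
#include <string.h>

/* Given a buffer position at an opening quote, null-terminate the string
 * value in place and return a pointer straight into the buffer.
 * Zero allocations, zero copies.  (Illustrative only, not the fwJson API.) */
static char* jsonStringInPlace(char* at, char** restP)
{
  if (*at != '"')
    return NULL;

  char* start = ++at;  /* first character after the opening quote */

  while ((*at != '"') && (*at != '\0'))
    ++at;

  if (*at == '\0')     /* unterminated string */
    return NULL;

  *at = '\0';          /* overwrite the closing quote inside the buffer */

  if (restP != NULL)
    *restP = at + 1;   /* parsing continues right after the string */

  return start;
}
```

Because the value lives in the request buffer itself, its lifetime matches the request context — a natural fit with the bump-allocated, free-everything-at-once model.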
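The lock-free, allocation-free update path described for fwProm boils down to C11 atomics on pre-allocated counter slots, with rendering done on demand. A hedged sketch with invented names:

```c
#include <assert.h>
#include <stdatomic.h>
#include <stdio.h>
#include <string.h>

/* Pre-allocated metric slot; the hot path never allocates or locks.
 * (Illustrative only, not the fwProm API.) */
typedef struct
{
  const char*           name;
  _Atomic unsigned long value;
} Counter;

static void counterInc(Counter* c, unsigned long delta)
{
  /* relaxed ordering is enough for a monotonically increasing metric */
  atomic_fetch_add_explicit(&c->value, delta, memory_order_relaxed);
}

/* On-demand rendering into a caller-supplied buffer, Prometheus text format */
static int counterRender(Counter* c, char* buf, size_t size)
{
  unsigned long v = atomic_load_explicit(&c->value, memory_order_relaxed);
  return snprintf(buf, size, "%s %lu\n", c->name, v);
}
```

Keeping the scrape-side formatting out of the update path is what lets every GE expose metrics without a sidecar: worker threads only ever touch an atomic add.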

Dependency Architecture

The libraries form a clean layered stack. Each layer depends only on layers below it — no circular dependencies, no diamond problems:

[Diagram: fw-libs dependency layers]

Test Infrastructure

fwTest
  What it does: Smart test runner (bash + Python) — directive-based test files, REGEX pattern matching for dynamic output, SORT blocks for non-deterministic order, GUI diff viewer, index/range selection, retest-failures mode.
  Why it matters: The standard test harness for every fw-lib and every FiWorks application.
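To make "directive-based test files" concrete, here is a hypothetical sketch of what such a file could look like. Only the REGEX and SORT directives are named above; the section markers and everything else here are invented for illustration and do not describe the real fwTest file format:

```
# Hypothetical fwTest test file (layout is an assumption)

--NAME--
entity creation returns 201

--SHELL--
curl -s -w "%{http_code}" localhost:1026/ngsi-ld/v1/entities -d @entity.json

--REGEX--
# createdAt differs between runs, so match it with a pattern
"createdAt": ".*"

--SORT--
# attribute order in the response is non-deterministic; compare sorted
```

The point of REGEX and SORT directives is to keep tests deterministic even when the output contains timestamps, generated IDs, or unordered collections.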

This is not a framework — it's a toolkit. Each library compiles in under a second and depends on nothing beyond libc and the fw-libs layers below it. A GE rewrite pulls in exactly what it needs.

With fw-libs in place, the question for each GE is not “how do we handle HTTP/JSON/memory/logging/@context?” but “what domain-specific logic remains?” For Wilma, it's JWT verification and proxying. For Keyrock, it's OAuth2 flows and user CRUD. For the IoT agents, it's protocol decoding. Even the JSON-LD @context engine — the semantic layer that makes NGSI-LD interoperable — is a shared library, not reimplemented per GE. The plumbing is done.

The AI Accelerator

With a Claude Max subscription acting as a software factory, what was once a multi-year team effort becomes a focused sprint. AI handles the boilerplate — protocol implementations, data structure plumbing, test scaffolding — while the human architect focuses on design decisions and domain-specific optimization. Each GE rewrite includes a development effort estimate assuming this model.