R.CAUDLE · Riverman Position · Doctrine Rev 01 · 2026.05.16
On this page
  1. § 01 — Ownership is capability
  2. § 02 — The fractal
  3. § 03 — The substrate distinction
  4. § 04 — Where this leads

Position · The working doctrine

It's not a policy problem. It's an engineering problem.

That's where I work.

Sovereignty isn't something somebody gives you. It's something you build. It means owning the infrastructure, training your own engineers, writing your own code, and being able to audit what's running on your networks.

If you can't do it, you don't own it. And if you don't own it, it's not sovereign.

By River Caudle

§ 01 — Ownership is capability

Sovereignty is built, not granted.

The word gets used loosely. A nation can declare digital sovereignty in a press release; a plant can hang a "secured by" sticker on a switch. Neither is the thing. Sovereignty is the operational capability to do the work — design the network, write the firmware, read the traffic, replace the part. Anything short of that is dependence with branding.

What ownership requires

  • Infrastructure — the iron, the wire, the rack. Owned and physically accessible.
  • Engineers — trained inside the organization, not rented by the hour.
  • Code — auditable, modifiable, and not phoning home.
  • Visibility — packet-level insight into what's running, and what's talking to whom.
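The visibility requirement above has a concrete shape: knowing, per device, what is talking to whom, and flagging anything outside the expected baseline. A minimal sketch of that audit, using hypothetical flow records and an illustrative allowlist (all names and tuples here are invented for the example, not from any real plant):

```python
from collections import Counter

# Hypothetical flow records: (source, destination, dest_port) tuples,
# as you might export from a SPAN port or flow collector. Illustrative only.
FLOWS = [
    ("plc-01", "historian", 502),
    ("plc-01", "historian", 502),
    ("hmi-03", "plc-01", 502),
    ("plc-02", "203.0.113.7", 443),  # unexpected external destination
]

# The allowlist is the audit baseline: every pair a device is *supposed*
# to talk to. Anything outside it is a device phoning home.
ALLOWED = {("plc-01", "historian"), ("hmi-03", "plc-01")}

def audit(flows, allowed):
    """Count traffic per (src, dst) pair; flag pairs not in the baseline."""
    talkers = Counter((src, dst) for src, dst, _port in flows)
    unknown = {pair: n for pair, n in talkers.items() if pair not in allowed}
    return talkers, unknown

talkers, unknown = audit(FLOWS, ALLOWED)
for (src, dst), n in sorted(unknown.items()):
    print(f"UNKNOWN: {src} -> {dst} ({n} flow(s))")
```

The point of the sketch is the baseline, not the parser: without an owned, auditable allowlist, the dashboard shows traffic but can't say which of it is a problem.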

What you've actually got, otherwise

  • A subscription — to the infrastructure someone else owns.
  • A vendor — whose engineers do the work, on their schedule.
  • A black box — whose firmware can be updated without your knowledge.
  • A dashboard — that shows you what the vendor wants you to see.

"If you can't do it, you don't own it. And if you don't own it, it's not sovereign."

§ 02 — The fractal

Same unit at every scale.

Digital sovereignty at the national level and operational technology independence at the plant level are the same problem at different scales. Different vocabulary, identical structure. Who owns the stack? Who can see the traffic? Who do the devices call home to? Who can alter the firmware? The questions don't change between a continent and a control room — the answers just have more zeros.

National scale

  • Cloud providers as foreign infrastructure
  • Standards bodies as policy levers
  • Supply chains as attack surface
  • Trade controls as engineering constraints

Plant scale

  • SCADA vendors as foreign infrastructure
  • Standards documents as design levers
  • Vendor firmware as attack surface
  • Licensing terms as engineering constraints

§ 03 — The substrate distinction

Control systems act on physics, not on information.

This is where most OT security writing falls apart. The plant's physical substrate is governed by a different priority ranking than the information layer that watches it: two governance models inside one architecture. The fractal from § 02 still holds — same unit at every scale — but within each unit, the control layer and the information layer answer to different rankings.

OT governance

  • Safety — does it hurt anyone?
  • Reliability — does it stay running?
  • Performance — does it meet the spec?

In that order. Always.

IT governance

  • Confidentiality — is the data exposed?
  • Integrity — has the data been altered?
  • Availability — can the data be reached?

The CIA triad. Different problem.
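"In that order. Always." is a lexicographic rule: a higher safety score beats any gain on the axes below it, with no trading across axes. A small sketch of how the same decision ranks differently under the two orderings, using invented change requests and illustrative 0–10 scores (none of this comes from a real standard):

```python
# Hypothetical change requests, scored 0-10 on each governance axis.
# Names and scores are illustrative assumptions for the sketch.
CHANGES = {
    "patch-now": {
        "safety": 3, "reliability": 4, "performance": 9,
        "confidentiality": 9, "integrity": 9, "availability": 2,
    },
    "defer-to-outage": {
        "safety": 9, "reliability": 9, "performance": 5,
        "confidentiality": 4, "integrity": 6, "availability": 8,
    },
}

OT_ORDER = ("safety", "reliability", "performance")
CIA_ORDER = ("confidentiality", "integrity", "availability")

def rank(changes, order):
    """Best-first lexicographic sort: compare on the first axis,
    fall through to the next only on a tie."""
    return sorted(changes,
                  key=lambda c: tuple(changes[c][axis] for axis in order),
                  reverse=True)

print("OT ranking: ", rank(CHANGES, OT_ORDER))
print("CIA ranking:", rank(CHANGES, CIA_ORDER))
```

Under the OT ordering, "defer-to-outage" wins on safety before performance is ever consulted; under CIA, "patch-now" wins on confidentiality. Same architecture, opposite answers — which is the whole point of keeping the two governance models distinct.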

§ 04 — Where this leads

The work is the doctrine.

Position is only useful if it shows up in how networks get built. Each of the frameworks below is this doctrine, operationalized — applied to a slice of the work where decisions actually get made.

"It's not a policy problem. It's an engineering problem. That's where I work."

Position · River Caudle · MMXXVI