Protocols

Protocols are the digital sensors of the Blankstate ecosystem: deterministic instruments for measuring interaction. They define what to observe, how to measure it, and why it matters.

Protocol Constraints

Every protocol must satisfy the following invariants; any protocol that violates them is not a Blankstate protocol. A minimal sketch of this contract as an interface follows the list.

Deterministic

Same input + protocol = Same output. Always.

Zero-Shot

No training required per use case.

Self-Supervised

Protocol defines its own measurement criteria.

Glass-Box

Every finding traceable to source.

Real-Time Capable

Sub-second analysis for streaming.

Edge-Ready

Runs locally, federated, or in the cloud.
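
The invariants above read naturally as an interface contract. The sketch below is a minimal illustration with hypothetical names (ProtocolContract, Finding) and a dict-shaped UIR; it is an assumption about the general shape, not the actual Blankstate API.

```python
# A minimal sketch of the contract implied by the invariants above.
# All names here (ProtocolContract, Finding) are illustrative
# assumptions, not the actual Blankstate API.
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass(frozen=True)
class Finding:
    nuance_id: str        # which nuance fired
    score: float          # weighted score for this detection
    evidence_refs: tuple  # glass-box: pointers back to the source


class ProtocolContract(ABC):
    """Interface every protocol is assumed to honor."""

    @abstractmethod
    def analyze(self, uir: dict) -> list[Finding]:
        """Deterministic: the same UIR always yields the same findings.
        Zero-shot: no per-use-case training happens here.
        Self-supervised: scoring criteria live inside the protocol itself."""

    @abstractmethod
    def explain(self, finding: Finding) -> str:
        """Glass-box: return the evidence chain behind a finding."""
```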

How Protocols Work

1. Define What to Observe

Protocols contain nuances—specific patterns, concepts, or behaviors to detect. Each nuance has semantic indicators and scoring weights.
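
As a concrete illustration of a nuance, here is a minimal sketch; the field names (indicators, weight) and the example nuance are assumptions about the general shape, not the real schema.

```python
# Hypothetical shape of a nuance: a named pattern with semantic
# indicators and a scoring weight. Field names are illustrative only.
from dataclasses import dataclass


@dataclass(frozen=True)
class Nuance:
    nuance_id: str
    description: str
    indicators: tuple[str, ...]   # semantic cues to detect
    weight: float = 1.0           # contribution to the protocol score


commitment_shift = Nuance(
    nuance_id="commitment_shift",
    description="A participant moves from exploring options to committing.",
    indicators=("we will", "agreed", "let's lock this in"),
    weight=0.8,
)
```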

2. Process Through the Engine

The IBF Engine ingests interaction data and converts it to a Unified Interaction Representation (UIR) before protocol analysis. No raw content is stored—only structured signals.
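
The ordering matters: conversion to the UIR happens first, and only structured signals travel onward. The sketch below shows that flow with invented names (to_uir, analyze); it is illustrative, not the IBF Engine itself.

```python
# Illustrative pipeline ordering: ingest -> UIR -> protocol analysis.
# All names are assumptions; the point is that raw content never
# reaches storage, only structured signals do.
def to_uir(raw_interaction: str) -> dict:
    """Stand-in for modality conversion: returns structured signals only."""
    return {
        "semantic_content": raw_interaction.split(),  # placeholder abstraction
        "temporal_structure": [],
        "actants": [],
        "reconstruction_chain": ["ingest", "to_uir"],
    }


def analyze(raw_interaction: str, protocols: list) -> list:
    uir = to_uir(raw_interaction)  # raw content stops here
    findings = []
    for protocol in protocols:
        findings.extend(protocol.analyze(uir))  # deterministic protocol pass
    return findings  # structured, traceable results only
```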

3. Produce Traceable Results

Every score, every detection, every insight links back to the source. You can always ask "why?" and get an evidence-based answer. That's the glass-box guarantee.
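
One way to picture the glass-box guarantee is as an evidence chain attached to every score. The structure below is a hypothetical illustration; the field names are assumptions, not the actual result schema.

```python
# Hypothetical evidence chain behind a single result. Nothing is
# reported without a pointer back to its source.
from dataclasses import dataclass


@dataclass(frozen=True)
class Evidence:
    source_ref: str     # e.g. document page, audio timestamp, log line
    excerpt_hash: str   # fingerprint of the supporting span (raw content is not stored)


@dataclass(frozen=True)
class TraceableResult:
    protocol_id: str
    nuance_id: str
    score: float
    evidence: tuple[Evidence, ...]   # answering "why?" means walking this chain
```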

Interaction Types

Protocols measure interactions across all relationship types (a small illustrative encoding follows the list).

Human ↔ Human

Conversations, meetings, negotiations, interviews, written exchanges

Human ↔ System

Voice assistants, chatbots, UI interactions, forms, navigation

System ↔ System

API calls, event streams, log sequences, data pipelines

Human ↔ Environment

Sensor data, biometrics, IoT interactions, physical-digital bridging
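
Treated as data, the four relationship types reduce to pairs of actant kinds. The encoding below is a small illustrative sketch, not part of the product.

```python
# Illustrative encoding of the relationship types as actant-kind pairs.
from enum import Enum


class ActantKind(Enum):
    HUMAN = "human"
    SYSTEM = "system"
    ENVIRONMENT = "environment"


INTERACTION_TYPES = {
    (ActantKind.HUMAN, ActantKind.HUMAN): "conversations, meetings, negotiations",
    (ActantKind.HUMAN, ActantKind.SYSTEM): "voice assistants, chatbots, UI interactions",
    (ActantKind.SYSTEM, ActantKind.SYSTEM): "API calls, event streams, log sequences",
    (ActantKind.HUMAN, ActantKind.ENVIRONMENT): "sensor data, biometrics, IoT interactions",
}
```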

Universal Modality Coverage

Protocols are modality-agnostic. All modalities convert to a Unified Interaction Representation (UIR) before protocol analysis. One protocol. Any modality. Same measurement.

Each modality carries a status tag: Implemented, Partial, or Roadmap. A routing sketch follows the table.

Modality | Data Types | Notes
--- | --- | ---
Text Documents | .pdf, .docx, .pptx, .rtf, .txt | Full extraction with page/slide segmentation
Structured Data | .xlsx, .csv, JSON, XML | Sheet/table parsing with structure preservation
Scanned Documents | Image-based PDFs, photos of documents | OCR extraction via advanced handler
Audio | Calls, meetings, voice notes, podcasts | Transcription with speaker diarization and timing
Images | Embedded images, screenshots, diagrams | VLM description; OCR for text-in-image
Video | Screen recordings, video meetings | Frame extraction, visual context
System Logs | Application logs, error traces, audit trails | Text-based logs supported; event semanticization planned
Event Streams | API request/response, webhooks, pub/sub | Real-time ingestion pipeline
Behavioral Signals | Click streams, navigation paths, form submissions | Session reconstruction
Sensor Data | IoT telemetry, biometrics, environmental | Signal abstraction
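
One way to read the table above is as a routing map: each modality has an extraction handler, and every handler emits the same UIR. The registry below is a hypothetical sketch; the handler names are invented for illustration.

```python
# Hypothetical routing from modality to extraction handler. Every
# handler produces the same UIR, which is what keeps protocols
# modality-agnostic. Handler names are invented for illustration.
MODALITY_HANDLERS = {
    "text_document": "page_and_slide_extractor",
    "structured_data": "sheet_and_table_parser",
    "scanned_document": "ocr_extractor",
    "audio": "transcriber_with_diarization",
    "image": "vlm_describer_with_ocr",
    "video": "frame_extractor",
    "system_logs": "log_text_parser",
    "event_stream": "realtime_ingestor",
    "behavioral_signals": "session_reconstructor",
    "sensor_data": "signal_abstractor",
}


def handler_for(modality: str) -> str:
    """Look up the extraction handler responsible for a modality."""
    return MODALITY_HANDLERS[modality]
```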

Abstraction Principle

Regardless of source modality, the protocol engine receives the same four elements (a minimal sketch follows the list):

Semantic content
Temporal structure
Actant involvement
Reconstruction chain
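
A minimal sketch of that shape, assuming hypothetical field names and types, might look like this:

```python
# Minimal sketch of what every modality reduces to before protocol
# analysis. Field names mirror the four elements above; the class
# itself is an illustrative assumption, not the real UIR schema.
from dataclasses import dataclass


@dataclass(frozen=True)
class UnifiedInteractionRepresentation:
    semantic_content: tuple[str, ...]      # what was expressed
    temporal_structure: tuple[float, ...]  # when, and in what order
    actants: tuple[str, ...]               # who or what was involved
    reconstruction_chain: tuple[str, ...]  # how this UIR was derived from the source
```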

Current Release

Protocol 1.0 (Stable)

Document + audio transcription with entity extraction
