Examples
Complete, runnable examples demonstrating common patterns and use cases for Cerulion Graph Editor. Each example includes step-by-step instructions, code samples, and expected output.
Prerequisites
Graph Editor Installed
Cerulion Graph Editor installed and running. See Installation.
Basic Knowledge
Understanding of nodes, topics, and schemas. See Quickstart if needed.
Language Familiarity
Basic knowledge of Rust, Python, or C++ (examples show multiple languages).
Time Available
Each example takes 20-30 minutes to complete.
Temperature Pipeline
A complete end-to-end example building a temperature monitoring pipeline from scratch.
Problem Statement
Build a system that:
- Reads temperature data from a sensor
- Converts temperature from Celsius to Fahrenheit
- Logs the converted temperature
- Runs continuously, processing new readings as they arrive
Solution
Create a simple pipeline: Publisher → Processor → Subscriber
Step-by-Step
1. Create schema
Create a TemperatureReading schema:
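As a rough illustration only, a schema carrying a temperature value and a timestamp could correspond to a Python type like the sketch below; the field names value and timestamp are assumptions, not the framework's generated output.

```python
from dataclasses import dataclass

@dataclass
class TemperatureReading:
    """Illustrative stand-in for the generated TemperatureReading type."""
    value: float      # assumed field: the temperature reading
    timestamp: float  # assumed field: Unix time the reading was taken
```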
2. Create publisher node
Create a Temperature Publisher node with one output port of type TemperatureReading. Write code:
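A minimal Python sketch of the publishing logic, reusing the illustrative TemperatureReading type above; read_sensor() is a hypothetical stand-in for a real sensor driver, and the actual output-port wiring comes from the node template the Graph Editor generates.

```python
import time

def read_sensor() -> float:
    """Hypothetical sensor driver; returns degrees Celsius."""
    return 22.5

def next_reading() -> TemperatureReading:
    # Package the raw Celsius value with the current time; the generated
    # node template would publish this on the output port.
    return TemperatureReading(value=read_sensor(), timestamp=time.time())
```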
3. Create processor node
Create a Celsius to Fahrenheit node with input and output ports of type TemperatureReading. Write conversion code:
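A minimal sketch of the conversion logic in Python, independent of the framework's port API:

```python
def celsius_to_fahrenheit(reading: TemperatureReading) -> TemperatureReading:
    # Standard conversion: °F = °C × 9/5 + 32.
    return TemperatureReading(
        value=reading.value * 9.0 / 5.0 + 32.0,
        timestamp=reading.timestamp,
    )
```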
4. Create subscriber node
Create a Temperature Logger node with one input port of type TemperatureReading. Write logging code:
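A sketch of the logging logic in Python, formatted to match the expected output shown below; the input-port callback wiring is left to the generated node template.

```python
def log_reading(reading: TemperatureReading) -> None:
    # Print in the format shown under Expected Output.
    print(f"Temperature: {reading.value:.2f}°F (timestamp: {reading.timestamp})")
```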
5. Connect and run
- Connect Temperature Publisher → Celsius to Fahrenheit → Temperature Logger
- Generate and run
Expected Output
You should see:
Temperature: 72.50°F (timestamp: ...)
Key Takeaways
- Schemas define data structure for type safety
- Nodes process data with single responsibility
- Topics are automatically created when connecting nodes
- Framework handles all communication details
Fan-Out Pattern
Distribute data from one publisher to multiple subscribers.
Problem Statement
You have one temperature sensor and need to send data to:
- A logger
- An alert system
- A statistics calculator
Solution
Connect one output port to multiple input ports:
Step-by-Step
1. Create schema and publisher
Create the TemperatureReading schema and Temperature Publisher node (same as the temperature pipeline example).
2. Create three subscriber nodes
Create three nodes:
- Temperature Logger - Logs readings
- Temperature Alert - Alerts on high temperature
- Temperature Statistics - Calculates averages
Each has one input port of type TemperatureReading.
3. Write subscriber code
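The Temperature Logger can reuse the logging sketch from the temperature pipeline example. The Python sketch below illustrates possible callback logic for the other two subscribers; the alert threshold and the running-average approach are assumptions for illustration only, and the input-port wiring comes from the generated node templates.

```python
ALERT_THRESHOLD_C = 35.0  # assumed alert threshold in degrees Celsius

def on_alert_reading(reading: TemperatureReading) -> None:
    # Temperature Alert: warn when a reading exceeds the threshold.
    if reading.value > ALERT_THRESHOLD_C:
        print(f"ALERT: {reading.value:.2f}°C exceeds {ALERT_THRESHOLD_C}°C")

class TemperatureStatistics:
    """Temperature Statistics: keeps a running average of received readings."""

    def __init__(self) -> None:
        self.count = 0
        self.total = 0.0

    def on_reading(self, reading: TemperatureReading) -> None:
        # Update running totals and report the average so far.
        self.count += 1
        self.total += reading.value
        print(f"Average: {self.total / self.count:.2f}°C over {self.count} readings")
```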
4. Connect in fan-out pattern
Connect Temperature Publisher output to all three subscriber inputs. You should see three connections from one output port.
Understanding Fan-Out
- One-to-many - One publisher, many subscribers
- Independent processing - Subscribers don’t affect each other
- Same data - All subscribers receive identical data
- Automatic distribution - Framework handles distribution
Use Cases
- Broadcasting data to multiple consumers
- Parallel processing of the same data
- Multiple systems monitoring the same stream
- Redundancy and backup systems
Fan-In Join Pattern
Combine data from multiple publishers into a single subscriber using a join node.
Problem Statement
You have multiple temperature sensors and need to:
- Collect readings from Sensor A, B, and C
- Combine them into a single average temperature
- Process the combined result
Solution
Use multiple publishers connected to a join node:
Step-by-Step
1. Create schema with sensor ID
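As an illustrative sketch, such a schema might correspond to a Python type like the following; the type name and field names are assumptions, not the framework's generated output.

```python
from dataclasses import dataclass

@dataclass
class SensorTemperatureReading:
    """Illustrative stand-in for a reading tagged with its source sensor."""
    sensor_id: str    # assumed field: "A", "B", or "C"
    value: float      # assumed field: temperature in degrees Celsius
    timestamp: float  # assumed field: Unix time of the reading
```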
2. Create three sensor publishers
Create Sensor A, Sensor B, and Sensor C nodes, each with one output port. Write code for each:
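A Python sketch of the shared publishing logic, parameterized by sensor ID and reusing the illustrative type above; the simulated values are placeholders, and publishing on the output port is handled by the generated node template.

```python
import random
import time

def make_reading(sensor_id: str) -> SensorTemperatureReading:
    # Each sensor node (A, B, C) calls this with its own ID and publishes
    # the result on its output port.
    simulated_celsius = 20.0 + random.uniform(-2.0, 2.0)  # placeholder value
    return SensorTemperatureReading(
        sensor_id=sensor_id,
        value=simulated_celsius,
        timestamp=time.time(),
    )
```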
3. Create join node
Create a Temperature Join node with:
- Three input ports: sensor_a_in, sensor_b_in, sensor_c_in
- One output port: combined_out
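One way the join's combination logic could work is to cache the latest reading from each sensor and emit an average once all three have reported, as in the Python sketch below; this buffering strategy is an assumption for illustration, not the framework's built-in behavior.

```python
from typing import Dict, Optional

class TemperatureJoin:
    """Caches the latest reading per sensor and emits a combined average."""

    def __init__(self) -> None:
        self.latest: Dict[str, SensorTemperatureReading] = {}

    def on_input(self, reading: SensorTemperatureReading) -> Optional[float]:
        self.latest[reading.sensor_id] = reading
        if len(self.latest) < 3:
            # Wait until sensors A, B, and C have all reported at least once.
            return None
        # Average of the most recent reading from each sensor; the generated
        # node template would publish this on combined_out.
        return sum(r.value for r in self.latest.values()) / len(self.latest)
```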
4. Connect in fan-in pattern
Connect all sensors to the join node, then the join node to the processor.
You should see three connections converging into the join node.
Understanding Fan-In Join
- Many-to-one - Many publishers, one subscriber (via join)
- Synchronization - Join node coordinates multiple inputs
- Combination logic - Join node merges data appropriately
- Order independence - Handles data arriving in any order
Use Cases
- Data aggregation from multiple sources
- Sensor fusion (merging multiple sensor readings)
- Load balancing (collecting results from parallel workers)
- Synchronizing multiple data streams
Multi-Language Pipeline
Build pipelines using Rust, Python, and C++ nodes in the same graph.
Problem Statement
Build a system that:
- Reads data in Python (easy data manipulation)
- Processes in Rust (high performance)
- Logs in C++ (system integration)
- All nodes work together seamlessly
Solution
Build a pipeline where each language handles what it’s best at:
Step-by-Step
1. Create schemas
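The RawData and ProcessedData schemas are language-agnostic, so each language sees its own generated types. As an illustrative Python view only, with assumed field names:

```python
from dataclasses import dataclass

@dataclass
class RawData:
    """Illustrative stand-in for the generated RawData type."""
    payload: str  # assumed field: unparsed input record

@dataclass
class ProcessedData:
    """Illustrative stand-in for the generated ProcessedData type."""
    result: float  # assumed field: value extracted from the raw payload
```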
2. Create Python reader node
Create a Data Reader node (Python) with one output port of type RawData. Write Python code:
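A minimal Python sketch of the reading logic, reusing the illustrative RawData type above; the data source is a placeholder, and publishing on the output port is handled by the generated node template.

```python
def read_raw() -> RawData:
    # Placeholder data source: a real reader might pull from a file, socket,
    # or device and wrap each record as RawData for the output port.
    return RawData(payload="42.0")
```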
3. Create Rust processor node
Create a Data Processor node (Rust) with input RawData and output ProcessedData. Write Rust code:
4. Create C++ logger node
Create a System Logger node (C++) with one input port of type ProcessedData. Write C++ code:
5. Connect nodes
Connect: Data Reader (Python) → Data Processor (Rust) → System Logger (C++)
The framework automatically handles serialization and type conversion between languages. No glue code needed.
Understanding Multi-Language Support
- Schemas are language-agnostic - Defined in YAML, work for all languages
- Automatic code generation - Framework generates types for each language
- Binary compatibility - Data structures are compatible across languages
- Framework handles communication - Serialization and transport are automatic
Language Selection Guide
- Python - Data parsing, string manipulation, rapid prototyping
- Rust - High performance, memory safety, concurrent processing
- C++ - System integration, low-level control, existing C++ libraries
Key Benefits
Use Best Language
Use each language for what it’s best at. No need to compromise.
Reuse Existing Code
Integrate existing libraries from any language without rewriting.
Team Flexibility
Team members can work in their preferred language.
Performance
Use Rust/C++ for performance-critical parts, Python for rapid development.