Data Model
Starting small, let's first focus on the most important part of any domain: the data. For this, Hiveio leverages a few standards to provide a schema-based serialization solution built on JSON. JSON has been the universal standard for data exchange for some time. Combined with a flexible and robust schema solution using the JSON Schema specification, Hiveio provides a serialization solution comparable to the likes of Protobuf, Avro, or Thrift.
JSON Serialization
Hiveio uses the JSON Schema and Flux Standard Action specifications to automatically serialize and validate your data Models for network transport. Combined, they become the standard format for transferring data to nearly any device or system. Complete with versioning and a schema registry, Hiveio provides a universal application framework that can seamlessly run in a variety of client and server environments.
- JSON Schema
- The JSON Schema specification allows us to define a transport schema to validate incoming data. It can also be used to document your API with a definition that clients can operate against. The specification allows schemas to be hosted, giving you the ability to serve them from a static web server acting as a schema registry. You can use this to version your transport schemas and host them internally, externally, or both if you prefer.
- Flux Standard Action
- The Flux Standard Action specification allows us to define a lightweight network data payload used to build your services. Minimally, this provides our network payload structure with explicit support for typed data definitions. Payload objects are defined and validated by their associated JSON Schemas, as shown in the sketch after this list.
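For illustration, here is a minimal sketch of how a JSON Schema and a Flux Standard Action payload fit together. The schema, its `$id`, the action type, and the use of the Ajv validator are assumptions made for this example, not part of the Hiveio framework itself.

```typescript
import Ajv from "ajv"

// Hypothetical transport schema for a "Post" Model; the $id, title, and
// properties below are illustrative only.
const postSchema = {
  $id: "https://schemas.example.com/post/1.0.0.json",
  title: "Post",
  type: "object",
  properties: {
    text: { type: "string", maxLength: 280 }
  },
  required: ["text"]
}

// A Flux Standard Action wrapping the payload for network transport.
const action = {
  type: "CreatePost",
  payload: { text: "hello world" },
  meta: { schema: postSchema.$id }
}

// Validate the payload against its associated JSON Schema. Ajv is used here
// for illustration; Hiveio may wire validation up differently internally.
const ajv = new Ajv()
const validate = ajv.compile(postSchema)

if (!validate(action.payload)) {
  console.error(validate.errors)
}
```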
These Models would then be packaged up and built into your client-side domain logic or into the growing list of Docker images supporting the infrastructure layer of the Hiveio framework.
Schema Registry
A schema registry can easily be achieved by combining concepts from the JSON Schema specification with a static file server that serves the JSON schemas. Using the $id keyword, you can specify the URI where a schema is hosted. You could use Node.js to host the files, or even Nginx or Apache Web Server to serve the static JSON schemas.
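As a rough sketch, the static file server below serves schemas from a local directory so that a schema whose $id is `https://schemas.example.com/post/1.0.0.json` resolves to `./schemas/post/1.0.0.json`. The directory layout, port, and hostname are assumptions for this example, not a prescribed Hiveio setup.

```typescript
import { createServer } from "node:http"
import { readFile } from "node:fs/promises"
import { join, normalize } from "node:path"

// Directory containing the versioned JSON Schemas (illustrative layout).
const SCHEMA_DIR = join(process.cwd(), "schemas")

createServer(async (req, res) => {
  try {
    // Normalize the request path to keep lookups inside the schema directory.
    const path = normalize(req.url ?? "/").replace(/^(\.\.[/\\])+/, "")
    const body = await readFile(join(SCHEMA_DIR, path), "utf8")
    res.writeHead(200, { "Content-Type": "application/schema+json" })
    res.end(body)
  } catch {
    res.writeHead(404).end()
  }
}).listen(8080)
```

Nginx or Apache could accomplish the same with a single location or alias directive pointing at the schema directory.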
Due to time constraints, we have decided not to build and maintain a registry solution specific to the Hiveio framework at this time, but either hosting approach should be straightforward. Furthermore, hosting schemas may not be necessary at all, depending on the complexity of the domain and/or the team(s) involved in building and maintaining your solutions.