Use JSON schemas to validate the layout and types of the YAML config.
author    Philippe Proulx <eeppeliteloop@gmail.com>
          Fri, 22 May 2020 15:06:31 +0000 (11:06 -0400)
committer Philippe Proulx <eeppeliteloop@gmail.com>
          Fri, 29 May 2020 19:23:10 +0000 (15:23 -0400)
This patch changes `config_parse.py` so as to use JSON schemas
(<https://json-schema.org/>) to validate many aspects of a barectf YAML
configuration file instead of having redundant, manual checks.

JSON schemas live in the `barectf` package's `schemas` directory. An
instance of the new `_SchemaValidator` class finds all the YAML files in
this directory and, for each of them:

1. Loads it (YAML to Python dictionary).
2. Gets its ID (`$id` property).
3. Adds the schema (1.) to a local schema store using its ID (2.).

Then `_SchemaValidator.validate()` uses this schema store to build a JSON
schema reference resolver and validator from the `jsonschema` package
(<https://pypi.org/project/jsonschema/>).
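
The store construction and short ID lookup can be sketched as follows (a
stdlib-only illustration: the inline dictionaries stand in for schemas
loaded from the YAML files, using the `https://barectf.org/schemas/….json`
ID pattern which `_SchemaValidator` expects):

```python
# Stand-ins for schemas loaded from the `schemas` directory (step 1.);
# the real ones come from yaml.load() on each `*.yaml` file.
loaded_schemas = [
    {'$id': 'https://barectf.org/schemas/config/config-min.json'},
    {'$id': 'https://barectf.org/schemas/2/config/config.json'},
]

store = {}

for schema in loaded_schemas:
    # get the schema's ID (step 2.) and add it to the store (step 3.)
    assert '$id' in schema
    schema_id = schema['$id']
    assert schema_id not in store
    store[schema_id] = schema

def schema_from_short_id(short_id):
    # a short ID is the part between `schemas/` and `.json` in the URI
    return store['https://barectf.org/schemas/{}.json'.format(short_id)]
```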

The `jsonschema` dependency is added to `pyproject.toml` and
`poetry.lock`. We need a version which supports JSON Schema draft 7
because we need the conditional keywords (`if`/`then`).
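
For illustration, a draft 7 conditional could look like this (a
hypothetical excerpt, not one of the actual schema files): when a field
type object's `class` property is `enum`, a `value-type` property is also
required.

```yaml
if:
  properties:
    class:
      const: enum
  required:
    - class
then:
  required:
    - value-type
```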

Because the barectf YAML configuration file format supports features
which do not exist natively in YAML (inclusions, field type aliases,
field type inheritance, and log level aliases), we can't have a single
JSON schema for the "raw" configuration file. There are actually five
validation steps, each with its own JSON schema or schemas:

1. Make sure the configuration object is minimally valid, that is, it's
   an object and has a `version` property with a supported value.

   Schema: `config/config-min.yaml`.

2. Make sure the configuration object is valid for the inclusion phase.

   Those schemas only validate that the metadata/stream/event objects
   are objects and that any `$include` property is valid.

   Each time the YAML configuration parser loads a partial YAML file for
   inclusion, it validates the resulting object using the corresponding
   schema depending on what's being included.

   Knowing those objects are valid as such is enough to process
   inclusions without caring about the resulting configuration object's
   validity.

   Schemas:

   * `2/config/clock-pre-include.yaml`
   * `2/config/config-pre-include.yaml`
   * `2/config/event-pre-include.yaml`
   * `2/config/include-prop.yaml`
   * `2/config/metadata-pre-include.yaml`
   * `2/config/stream-pre-include.yaml`
   * `2/config/trace-pre-include.yaml`
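
The inclusion phase itself can be sketched like this (a hypothetical,
heavily simplified version: the real parser searches include
directories, can ignore missing files, and picks the pre-include schema
matching the kind of object being included):

```python
def process_includes(obj, load_partial, validate):
    # the `$include` property, if present, names partial files to merge
    for file_name in obj.pop('$include', []):
        partial = load_partial(file_name)  # partial YAML file -> dict
        validate(partial)                  # pre-include schema validation
        process_includes(partial, load_partial, validate)

        # the including object's own properties win over included ones
        for key, value in partial.items():
            obj.setdefault(key, value)

    return obj

# hypothetical partial file providing a default clock frequency
partials = {'base-clock.yaml': {'freq': 1000000000}}
clock = process_includes({'$include': ['base-clock.yaml'], 'freq': 8000000},
                         partials.__getitem__, lambda obj: None)
```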

3. Make sure the configuration object is valid for the field type
   expansion phase.

   This schema digs into compound field types recursively to make sure
   any field type is either a string (a field type alias) or an object
   having either a `class` property or an `$inherit`/`inherit` property
   (the latter two being mutually exclusive).

   Knowing the configuration object is valid as such is enough to expand
   field types without caring about the resulting configuration object's
   validity. So, for example, a resulting, expanded field type could be:

       class: string
       size: 16
       value-type:
         class: array
       meow: mix

   We just don't care at this point, as long as field types are
   "complete".

   Schema: `2/config/config-pre-field-type-expansion.yaml`.
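
Alias and inheritance expansion can be sketched like this (hypothetical
and simplified; the alias name, the `class` value, and the flat merge
are made up for illustration, and the real parser recurses through
compound field types):

```python
import copy

# hypothetical field type alias definitions
aliases = {
    'u16': {'class': 'int', 'size': 16},
}

def expand_field_type(ft_node):
    # a string is a field type alias: replace it with the aliased type
    if type(ft_node) is str:
        return copy.deepcopy(aliases[ft_node])

    base_name = ft_node.pop('$inherit', None) or ft_node.pop('inherit', None)

    if base_name is not None:
        # merge: the base type's properties, overridden by this node's
        base = expand_field_type(base_name)
        base.update(ft_node)
        return base

    return ft_node

# the `u16` alias, inherited with an overridden size
expanded = expand_field_type({'$inherit': 'u16', 'size': 32})
```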

4. Make sure the configuration object is valid for the log level
   expansion phase.

   This schema validates the `$log-levels` property of the metadata
   object as well as the `log-level` property of any event object.

   Knowing the configuration object is valid as such is enough to expand
   log levels, that is, to replace log level strings with their numeric
   value, without caring about the resulting configuration object's
   validity.

   Schema: `2/config/config-pre-log-level-expansion.yaml`.
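
The expansion itself amounts to a dictionary lookup; a hypothetical
sketch (made-up level names and values):

```python
# from the metadata object's `$log-levels` property (hypothetical values)
log_levels = {'WARNING': 4, 'ERROR': 3}
event = {'name': 'my-event', 'log-level': 'ERROR'}

# replace a log level string with its numeric value
if type(event['log-level']) is str:
    event['log-level'] = log_levels[event['log-level']]
```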

5. Make sure the configuration object is valid.

   This validates the final, effective configuration object which, at
   this point, does not contain any:

   * `$include` properties.
   * Strings as field types.
   * `$inherit`/`inherit` properties (field type objects).
   * `$log-levels` property (metadata object).
   * Strings for the `log-level` properties (event objects).

   Also, this is the configuration object which the `--dump-config`
   option prints now.

   Schemas:

   * `2/config/byte-order-prop.yaml`
   * `2/config/config.yaml`
   * `2/config/field-type.yaml`
   * `2/config/uuid-prop.yaml`

After step 5, there's not much left to validate in `config_parse.py` itself:

* Make sure referred clock type objects (in `property-mappings` properties
  of integer field type objects) exist.

* Make sure identifiers are valid (the schemas do not always validate
  that they exclude CTF keywords for technical reasons).

* Make sure alignment values are valid (powers of two).

* Make sure there's only one default stream type name and that it
  exists.

* Make sure the values of enumeration field type members fit within the
  range determined by the value (integer) field type's size.

* Everything that remains in `_BarectfMetadataValidator`.

* Everything in `_MetadataSpecialFieldsValidator`, which is untouched:
  that's not something we validate with JSON schemas. We possibly could,
  but it might stretch the use case.

`_SchemaValidator.validate()` catches a `jsonschema` exception and
converts it to a barectf `ConfigParseError` exception to avoid leaking
`jsonschema` objects (implementation detail) from barectf calls. I made
an effort to make the error object as readable as possible, for example
converting the instance path to context object names, but there's room
for improvement here.

No functional change intended, except for the modified raised
`ConfigParseError` objects. Tests are not changed and pass.

Signed-off-by: Philippe Proulx <eeppeliteloop@gmail.com>
17 files changed:
barectf/config_parse.py
barectf/schemas/2/config/byte-order-prop.yaml [new file with mode: 0644]
barectf/schemas/2/config/clock-pre-include.yaml [new file with mode: 0644]
barectf/schemas/2/config/config-pre-field-type-expansion.yaml [new file with mode: 0644]
barectf/schemas/2/config/config-pre-include.yaml [new file with mode: 0644]
barectf/schemas/2/config/config-pre-log-level-expansion.yaml [new file with mode: 0644]
barectf/schemas/2/config/config.yaml [new file with mode: 0644]
barectf/schemas/2/config/event-pre-include.yaml [new file with mode: 0644]
barectf/schemas/2/config/field-type.yaml [new file with mode: 0644]
barectf/schemas/2/config/include-prop.yaml [new file with mode: 0644]
barectf/schemas/2/config/metadata-pre-include.yaml [new file with mode: 0644]
barectf/schemas/2/config/stream-pre-include.yaml [new file with mode: 0644]
barectf/schemas/2/config/trace-pre-include.yaml [new file with mode: 0644]
barectf/schemas/2/config/uuid-prop.yaml [new file with mode: 0644]
barectf/schemas/config/config-min.yaml [new file with mode: 0644]
poetry.lock
pyproject.toml

index c81889d0dd443f9f4527982e3981a2e270fc7a3a..3f553c8e5b4d770f42f8b85179dfbc00a7ce3863 100644 (file)
 
 from barectf import metadata
 from barectf import config
+import pkg_resources
 import collections
+import jsonschema
 import datetime
 import barectf
+import os.path
 import enum
 import yaml
 import uuid
@@ -96,25 +99,10 @@ class _Integer(_PseudoObj):
         super().__init__()
         self.size = None
         self.byte_order = None
-        self.reset_align()
-        self.reset_signed()
-        self.reset_base()
-        self.reset_encoding()
-        self.reset_property_mappings()
-
-    def reset_align(self):
         self.align = None
-
-    def reset_signed(self):
         self.signed = False
-
-    def reset_base(self):
         self.base = 10
-
-    def reset_encoding(self):
         self.encoding = metadata.Encoding.NONE
-
-    def reset_property_mappings(self):
         self.property_mappings = []
 
     @property
@@ -140,9 +128,6 @@ class _FloatingPoint(_PseudoObj):
         self.exp_size = None
         self.mant_size = None
         self.byte_order = None
-        self.reset_align()
-
-    def reset_align(self):
         self.align = 8
 
     @property
@@ -160,13 +145,6 @@ class _Enum(_PseudoObj):
         self.value_type = None
         self.members = collections.OrderedDict()
 
-    @property
-    def last_value(self):
-        if len(self.members) == 0:
-            return
-
-        return list(self.members.values())[-1][1]
-
     @property
     def real_align(self):
         return self.value_type.real_align
@@ -178,12 +156,11 @@ class _Enum(_PseudoObj):
 class _String(_PseudoObj):
     def __init__(self):
         super().__init__()
-        self.reset_encoding()
-
-    def reset_encoding(self):
         self.encoding = metadata.Encoding.UTF8
 
-    real_align = 8
+    @property
+    def real_align(self):
+        return 8
 
     def _to_public(self):
         return metadata.String(self.encoding)
@@ -206,13 +183,7 @@ class _Array(_PseudoObj):
 class _Struct(_PseudoObj):
     def __init__(self):
         super().__init__()
-        self.reset_min_align()
-        self.reset_fields()
-
-    def reset_min_align(self):
         self.min_align = 1
-
-    def reset_fields(self):
         self.fields = collections.OrderedDict()
 
     @property
@@ -249,41 +220,14 @@ class _Trace(_PseudoObj):
 class _Clock(_PseudoObj):
     def __init__(self):
         super().__init__()
-        self.reset_name()
-        self.reset_uuid()
-        self.reset_description()
-        self.reset_freq()
-        self.reset_error_cycles()
-        self.reset_offset_seconds()
-        self.reset_offset_cycles()
-        self.reset_absolute()
-        self.reset_return_ctype()
-
-    def reset_name(self):
         self.name = None
-
-    def reset_uuid(self):
         self.uuid = None
-
-    def reset_description(self):
         self.description = None
-
-    def reset_freq(self):
         self.freq = int(1e9)
-
-    def reset_error_cycles(self):
         self.error_cycles = 0
-
-    def reset_offset_seconds(self):
         self.offset_seconds = 0
-
-    def reset_offset_cycles(self):
         self.offset_cycles = 0
-
-    def reset_absolute(self):
         self.absolute = False
-
-    def reset_return_ctype(self):
         self.return_ctype = 'uint32_t'
 
     def _to_public(self):
@@ -374,28 +318,110 @@ class _Metadata(_PseudoObj):
                                  self.default_stream_name)
 
 
-def _is_assoc_array_prop(node):
-    return isinstance(node, dict)
+# This JSON schema reference resolver only serves to detect when it
+# needs to resolve a remote URI.
+#
+# This must never happen in barectf because all our schemas are local;
+# it would mean a programming or schema error.
+class _RefResolver(jsonschema.RefResolver):
+    def resolve_remote(self, uri):
+        # this must never happen: all our schemas are local
+        raise RuntimeError('Missing local schema with URI "{}"'.format(uri))
+
+
+# Schema validator which considers all the schemas found in the barectf
+# package's `schemas` directory.
+#
+# The only public method is validate() which accepts an instance to
+# validate as well as a schema short ID.
+class _SchemaValidator:
+    def __init__(self):
+        subdirs = ['config', os.path.join('2', 'config')]
+        schemas_dir = pkg_resources.resource_filename(__name__, 'schemas')
+        self._store = {}
+
+        for subdir in subdirs:
+            dir = os.path.join(schemas_dir, subdir)
+
+            for file_name in os.listdir(dir):
+                if not file_name.endswith('.yaml'):
+                    continue
+
+                with open(os.path.join(dir, file_name)) as f:
+                    schema = yaml.load(f, Loader=yaml.SafeLoader)
+
+                assert '$id' in schema
+                schema_id = schema['$id']
+                assert schema_id not in self._store
+                self._store[schema_id] = schema
+
+    @staticmethod
+    def _dict_from_ordered_dict(o_dict):
+        dct = {}
+
+        for k, v in o_dict.items():
+            new_v = v
+
+            if type(v) is collections.OrderedDict:
+                new_v = _SchemaValidator._dict_from_ordered_dict(v)
 
+            dct[k] = new_v
 
-def _is_array_prop(node):
-    return isinstance(node, list)
+        return dct
 
+    def _validate(self, instance, schema_short_id):
+        # retrieve full schema ID from short ID
+        schema_id = 'https://barectf.org/schemas/{}.json'.format(schema_short_id)
+        assert schema_id in self._store
 
-def _is_int_prop(node):
-    return type(node) is int
+        # retrieve full schema
+        schema = self._store[schema_id]
 
+        # Create a reference resolver for this schema using this
+        # validator's schema store.
+        resolver = _RefResolver(base_uri=schema_id, referrer=schema,
+                                store=self._store)
 
-def _is_str_prop(node):
-    return type(node) is str
+        # create a JSON schema validator using this reference resolver
+        validator = jsonschema.Draft7Validator(schema, resolver=resolver)
 
+        # Validate the instance, converting its
+        # `collections.OrderedDict` objects to `dict` objects so as to
+        # make any error message easier to read (because
+        # validator.validate() below uses str() for error messages, and
+        # collections.OrderedDict.__str__() is bulky).
+        validator.validate(self._dict_from_ordered_dict(instance))
 
-def _is_bool_prop(node):
-    return type(node) is bool
+    # Validates `instance` using the schema having the short ID
+    # `schema_short_id`.
+    #
+    # A schema short ID is the part between `schemas/` and `.json` in
+    # its URI.
+    #
+    # Raises a `ConfigParseError` object, hiding any `jsonschema`
+    # exception, on validation failure.
+    def validate(self, instance, schema_short_id):
+        try:
+            self._validate(instance, schema_short_id)
+        except jsonschema.ValidationError as exc:
+            # convert to barectf `ConfigParseError` exception
+            contexts = ['Configuration object']
+            contexts += ['"{}" property'.format(p) for p in exc.absolute_path]
+            schema_ctx = ''
+
+            if len(exc.context) > 0:
+                msgs = '; '.join([e.message for e in exc.context])
+                schema_ctx = ': {}'.format(msgs)
 
+            new_exc = ConfigParseError(contexts.pop(),
+                                       '{}{} (from schema "{}")'.format(exc.message,
+                                                                        schema_ctx,
+                                                                        schema_short_id))
 
-def _is_valid_alignment(align):
-    return ((align & (align - 1)) == 0) and align > 0
+            for ctx in reversed(contexts):
+                new_exc.append_ctx(ctx)
+
+            raise new_exc
 
 
 def _byte_order_str_to_bo(bo_str):
@@ -418,42 +444,37 @@ def _encoding_str_to_encoding(encoding_str):
         return metadata.Encoding.NONE
 
 
-_re_iden = re.compile(r'^[a-zA-Z][a-zA-Z0-9_]*$')
-_ctf_keywords = set([
-    'align',
-    'callsite',
-    'clock',
-    'enum',
-    'env',
-    'event',
-    'floating_point',
-    'integer',
-    'stream',
-    'string',
-    'struct',
-    'trace',
-    'typealias',
-    'typedef',
-    'variant',
-])
-
+def _validate_identifier(iden, ctx_obj_name, prop):
+    assert type(iden) is str
+    ctf_keywords = {
+        'align',
+        'callsite',
+        'clock',
+        'enum',
+        'env',
+        'event',
+        'floating_point',
+        'integer',
+        'stream',
+        'string',
+        'struct',
+        'trace',
+        'typealias',
+        'typedef',
+        'variant',
+    }
 
-def _is_valid_identifier(iden):
-    if not _re_iden.match(iden):
-        return False
+    if iden in ctf_keywords:
+        fmt = 'Invalid {} (not a valid identifier): "{}"'
+        raise ConfigParseError(ctx_obj_name, fmt.format(prop, iden))
 
-    if _re_iden in _ctf_keywords:
-        return False
 
-    return True
+def _validate_alignment(align, ctx_obj_name):
+    assert align >= 1
 
-
-def _get_first_unknown_prop(node, known_props):
-    for prop_name in node:
-        if prop_name in known_props:
-            continue
-
-        return prop_name
+    if (align & (align - 1)) != 0:
+        raise ConfigParseError(ctx_obj_name,
+                               'Invalid alignment: {}'.format(align))
 
 
 def _append_error_ctx(exc, obj_name, msg=None):
@@ -461,43 +482,37 @@ def _append_error_ctx(exc, obj_name, msg=None):
     raise
 
 
+# Entities.
+#
+# Order of values is important here.
+@enum.unique
+class _Entity(enum.IntEnum):
+    TRACE_PACKET_HEADER = 0
+    STREAM_PACKET_CONTEXT = 1
+    STREAM_EVENT_HEADER = 2
+    STREAM_EVENT_CONTEXT = 3
+    EVENT_CONTEXT = 4
+    EVENT_PAYLOAD = 5
+
+
 # This validator validates the configured metadata for barectf specific
 # needs.
 #
 # barectf needs:
 #
-#   * all header/contexts are at least byte-aligned
-#   * all integer and floating point number sizes to be <= 64
-#   * no inner structures or arrays
+# * All header/contexts are at least byte-aligned.
+# * No nested structures or arrays.
 class _BarectfMetadataValidator:
     def __init__(self):
         self._type_to_validate_type_func = {
-            _Integer: self._validate_int_type,
-            _FloatingPoint: self._validate_float_type,
-            _Enum: self._validate_enum_type,
-            _String: self._validate_string_type,
             _Struct: self._validate_struct_type,
             _Array: self._validate_array_type,
         }
 
-    def _validate_int_type(self, t, entity_root):
-        if t.size > 64:
-            raise ConfigParseError('Integer type', 'Size must be lesser than or equal to 64 bits')
-
-    def _validate_float_type(self, t, entity_root):
-        if t.exp_size + t.mant_size > 64:
-            raise ConfigParseError('Floating point number type', 'Size must be lesser than or equal to 64 bits')
-
-    def _validate_enum_type(self, t, entity_root):
-        if t.value_type.size > 64:
-            raise ConfigParseError('Enumeration type', 'Integer type\'s size must be lesser than or equal to 64 bits')
-
-    def _validate_string_type(self, t, entity_root):
-        pass
-
     def _validate_struct_type(self, t, entity_root):
         if not entity_root:
-            raise ConfigParseError('Structure type', 'Inner structure types are not supported as of this version')
+            raise ConfigParseError('Structure type',
+                                   'Inner structure types are not supported as of this version')
 
         for field_name, field_type in t.fields.items():
             if entity_root and self._cur_entity is _Entity.TRACE_PACKET_HEADER:
@@ -508,13 +523,16 @@ class _BarectfMetadataValidator:
             try:
                 self._validate_type(field_type, False)
             except ConfigParseError as exc:
-                _append_error_ctx(exc, 'Structure type\' field "{}"'.format(field_name))
+                _append_error_ctx(exc, 'Structure type\'s field "{}"'.format(field_name))
 
     def _validate_array_type(self, t, entity_root):
         raise ConfigParseError('Array type', 'Not supported as of this version')
 
     def _validate_type(self, t, entity_root):
-        self._type_to_validate_type_func[type(t)](t, entity_root)
+        func = self._type_to_validate_type_func.get(type(t))
+
+        if func is not None:
+            func(t, entity_root)
 
     def _validate_entity(self, t):
         if t is None:
@@ -522,11 +540,10 @@ class _BarectfMetadataValidator:
 
         # make sure entity is byte-aligned
         if t.real_align < 8:
-            raise ConfigParseError('Root type', 'Alignment must be at least byte-aligned')
+            raise ConfigParseError('Root type',
+                                   'Alignment must be at least 8')
 
-        # make sure entity is a structure
-        if type(t) is not _Struct:
-            raise ConfigParseError('Root type', 'Expecting a structure type')
+        assert type(t) is _Struct
 
         # validate types
         self._validate_type(t, True)
@@ -540,9 +557,7 @@ class _BarectfMetadataValidator:
             _append_error_ctx(exc, 'Trace', 'Invalid packet header type')
 
         for stream_name, stream in meta.streams.items():
-            if not _is_valid_identifier(stream_name):
-                raise ConfigParseError('Trace', 'Stream name "{}" is not a valid C identifier'.format(stream_name))
-
+            _validate_identifier(stream_name, 'Trace', 'stream name')
             self._cur_entity = _Entity.STREAM_PACKET_CONTEXT
 
             try:
@@ -569,9 +584,9 @@ class _BarectfMetadataValidator:
 
             try:
                 for ev_name, ev in stream.events.items():
-                    if not _is_valid_identifier(ev_name):
-                        raise ConfigParseError('Stream "{}"'.format(stream_name),
-                                               'Event name "{}" is not a valid C identifier'.format(ev_name))
+                    _validate_identifier(ev_name,
+                                         'Stream "{}"'.format(stream_name),
+                                         'event name')
 
                     self._cur_entity = _Entity.EVENT_CONTEXT
 
@@ -597,7 +612,9 @@ class _BarectfMetadataValidator:
     def _validate_default_stream(self, meta):
         if meta.default_stream_name:
             if meta.default_stream_name not in meta.streams.keys():
-                raise ConfigParseError('barectf metadata', 'Default stream name ("{}") does not exist'.format(meta.default_stream_name))
+                fmt = 'Default stream name ("{}") does not exist'
+                raise ConfigParseError('barectf metadata',
+                                       fmt.format(meta.default_stream_name))
 
     def validate(self, meta):
         self._validate_entities_and_names(meta)
@@ -605,8 +622,10 @@ class _BarectfMetadataValidator:
 
 
 # This validator validates special fields of trace, stream, and event
-# types. For example, if checks that the "stream_id" field exists in the
-# trace packet header if there's more than one stream, and much more.
+# types.
+#
+# For example, it checks that the "stream_id" field exists in the trace
+# packet header if there's more than one stream, and much more.
 class _MetadataSpecialFieldsValidator:
     def _validate_trace_packet_header_type(self, t):
         # needs "stream_id" field?
@@ -866,195 +885,6 @@ class _MetadataSpecialFieldsValidator:
                 _append_error_ctx(exc, 'Stream "{}"'.format(stream.name), 'Invalid')
 
 
-# Entities. Order of values is important here.
-@enum.unique
-class _Entity(enum.IntEnum):
-    TRACE_PACKET_HEADER = 0
-    STREAM_PACKET_CONTEXT = 1
-    STREAM_EVENT_HEADER = 2
-    STREAM_EVENT_CONTEXT = 3
-    EVENT_CONTEXT = 4
-    EVENT_PAYLOAD = 5
-
-
-# Since type inheritance allows types to be only partially defined at
-# any place in the configuration, this validator validates that actual
-# trace, stream, and event types are all complete and valid. Therefore
-# an invalid, but unusued type alias is accepted.
-class _MetadataTypesHistologyValidator:
-    def __init__(self):
-        self._type_to_validate_type_histology_func = {
-            _Integer: self._validate_integer_histology,
-            _FloatingPoint: self._validate_float_histology,
-            _Enum: self._validate_enum_histology,
-            _String: self._validate_string_histology,
-            _Struct: self._validate_struct_histology,
-            _Array: self._validate_array_histology,
-        }
-
-    def _validate_integer_histology(self, t):
-        # size is set
-        if t.size is None:
-            raise ConfigParseError('Integer type', 'Missing size')
-
-    def _validate_float_histology(self, t):
-        # exponent digits is set
-        if t.exp_size is None:
-            raise ConfigParseError('Floating point number type',
-                                   'Missing exponent size')
-
-        # mantissa digits is set
-        if t.mant_size is None:
-            raise ConfigParseError('Floating point number type',
-                                   'Missing mantissa size')
-
-        # exponent and mantissa sum is a multiple of 8
-        if (t.exp_size + t.mant_size) % 8 != 0:
-            raise ConfigParseError('Floating point number type',
-                                   'Mantissa and exponent sizes sum must be a multiple of 8')
-
-    def _validate_enum_histology(self, t):
-        # integer type is set
-        if t.value_type is None:
-            raise ConfigParseError('Enumeration type', 'Missing value type')
-
-        # there's at least one member
-        if not t.members:
-            raise ConfigParseError('Enumeration type', 'At least one member required')
-
-        # no overlapping values and all values are valid considering
-        # the value type
-        ranges = []
-
-        if t.value_type.signed:
-            value_min = -(1 << t.value_type.size - 1)
-            value_max = (1 << (t.value_type.size - 1)) - 1
-        else:
-            value_min = 0
-            value_max = (1 << t.value_type.size) - 1
-
-        for label, value in t.members.items():
-            for rg in ranges:
-                if value[0] <= rg[1] and rg[0] <= value[1]:
-                    raise ConfigParseError('Enumeration type\'s member "{}"',
-                                           'Overlaps another member'.format(label))
-
-            name_fmt = 'Enumeration type\'s member "{}"'
-            msg_fmt = 'Value {} is outside the value type range [{}, {}]'
-
-            if value[0] < value_min or value[0] > value_max:
-                raise ConfigParseError(name_fmt.format(label),
-                                       msg_fmt.format(value[0], value_min, value_max))
-
-            if value[1] < value_min or value[1] > value_max:
-                raise ConfigParseError(name_fmt.format(label),
-                                       msg_fmt.format(value[0], value_min, value_max))
-
-            ranges.append(value)
-
-    def _validate_string_histology(self, t):
-        # always valid
-        pass
-
-    def _validate_struct_histology(self, t):
-        # all fields are valid
-        for field_name, field_type in t.fields.items():
-            try:
-                self._validate_type_histology(field_type)
-            except ConfigParseError as exc:
-                _append_error_ctx(exc, 'Structure type\'s field "{}"'.format(field_name))
-
-    def _validate_array_histology(self, t):
-        # length is set
-        if t.length is None:
-            raise ConfigParseError('Array type', 'Missing length')
-
-        # element type is set
-        if t.element_type is None:
-            raise ConfigParseError('Array type', 'Missing element type')
-
-        # element type is valid
-        try:
-            self._validate_type_histology(t.element_type)
-        except ConfigParseError as exc:
-            _append_error_ctx(exc, 'Array type', 'Invalid element type')
-
-    def _validate_type_histology(self, t):
-        if t is None:
-            return
-
-        self._type_to_validate_type_histology_func[type(t)](t)
-
-    def _validate_entity_type_histology(self, t):
-        if t is None:
-            return
-
-        if type(t) is not _Struct:
-            raise ConfigParseError('Root type', 'Expecting a structure type')
-
-        self._validate_type_histology(t)
-
-    def _validate_event_types_histology(self, ev):
-        ev_name = ev.name
-
-        # validate event context type
-        try:
-            self._validate_entity_type_histology(ev.context_type)
-        except ConfigParseError as exc:
-            _append_error_ctx(exc, 'Event "{}"'.format(ev.name),
-                              'Invalid context type')
-
-        # validate event payload type
-        try:
-            self._validate_entity_type_histology(ev.payload_type)
-        except ConfigParseError as exc:
-            _append_error_ctx(exc, 'Event "{}"'.format(ev.name),
-                              'Invalid payload type')
-
-    def _validate_stream_types_histology(self, stream):
-        stream_name = stream.name
-
-        # validate stream packet context type
-        try:
-            self._validate_entity_type_histology(stream.packet_context_type)
-        except ConfigParseError as exc:
-            _append_error_ctx(exc, 'Stream "{}"'.format(stream_name),
-                              'Invalid packet context type')
-
-        # validate stream event header type
-        try:
-            self._validate_entity_type_histology(stream.event_header_type)
-        except ConfigParseError as exc:
-            _append_error_ctx(exc, 'Stream "{}"'.format(stream_name),
-                              'Invalid event header type')
-
-        # validate stream event context type
-        try:
-            self._validate_entity_type_histology(stream.event_context_type)
-        except ConfigParseError as exc:
-            _append_error_ctx(exc, 'Stream "{}"'.format(stream_name),
-                              'Invalid event context type')
-
-        # validate events
-        for ev in stream.events.values():
-            try:
-                self._validate_event_types_histology(ev)
-            except ConfigParseError as exc:
-                _append_error_ctx(exc, 'Stream "{}"'.format(stream_name),
-                                  'Invalid event')
-
-    def validate(self, meta):
-        # validate trace packet header type
-        try:
-            self._validate_entity_type_histology(meta.trace.packet_header_type)
-        except ConfigParseError as exc:
-            _append_error_ctx(exc, 'Metadata\'s trace', 'Invalid packet header type')
-
-        # validate streams
-        for stream in meta.streams.values():
-            self._validate_stream_types_histology(stream)
-
-
 class _YamlConfigParser:
     def __init__(self, include_dirs, ignore_include_not_found, dump_config):
         self._class_name_to_create_type_func = {
@@ -1071,441 +901,146 @@ class _YamlConfigParser:
             'structure': self._create_struct,
             'array': self._create_array,
         }
-        self._type_to_create_type_func = {
-            _Integer: self._create_integer,
-            _FloatingPoint: self._create_float,
-            _Enum: self._create_enum,
-            _String: self._create_string,
-            _Struct: self._create_struct,
-            _Array: self._create_array,
-        }
         self._include_dirs = include_dirs
         self._ignore_include_not_found = ignore_include_not_found
         self._dump_config = dump_config
+        self._schema_validator = _SchemaValidator()
 
     def _set_byte_order(self, metadata_node):
-        if 'trace' not in metadata_node:
-            raise ConfigParseError('Metadata', 'Missing "trace" property')
-
-        trace_node = metadata_node['trace']
-
-        if not _is_assoc_array_prop(trace_node):
-            raise ConfigParseError('Metadata\'s "trace" property',
-                                   'Must be an associative array')
-
-        if 'byte-order' not in trace_node:
-            raise ConfigParseError('Metadata\'s "trace" property',
-                                   'Missing "byte-order" property')
-
-        bo_node = trace_node['byte-order']
-
-        if not _is_str_prop(bo_node):
-            raise ConfigParseError('Metadata\'s "trace" property',
-                                   '"byte-order" property must be a string ("le" or "be")')
-
-        self._bo = _byte_order_str_to_bo(bo_node)
-
-        if self._bo is None:
-            raise ConfigParseError('Metadata\'s "trace" property',
-                                   'Invalid "byte-order" property: must be "le" or "be"')
-
-    def _lookup_type_alias(self, name):
-        if name in self._tas:
-            return copy.deepcopy(self._tas[name])
+        self._bo = _byte_order_str_to_bo(metadata_node['trace']['byte-order'])
+        assert self._bo is not None
 
     def _set_int_clock_prop_mapping(self, int_obj, prop_mapping_node):
-        unk_prop = _get_first_unknown_prop(prop_mapping_node, ['type', 'name', 'property'])
-
-        if unk_prop:
-            raise ConfigParseError('Integer type\'s clock property mapping',
-                                   'Unknown property: "{}"'.format(unk_prop))
-
-        if 'name' not in prop_mapping_node:
-            raise ConfigParseError('Integer type\'s clock property mapping',
-                                   'Missing "name" property')
-
-        if 'property' not in prop_mapping_node:
-            raise ConfigParseError('Integer type\'s clock property mapping',
-                                   'Missing "property" property')
-
         clock_name = prop_mapping_node['name']
-        prop = prop_mapping_node['property']
-
-        if not _is_str_prop(clock_name):
-            raise ConfigParseError('Integer type\'s clock property mapping',
-                                   '"name" property must be a string')
-
-        if not _is_str_prop(prop):
-            raise ConfigParseError('Integer type\'s clock property mapping',
-                                   '"property" property must be a string')
+        clock = self._clocks.get(clock_name)
 
-        if clock_name not in self._clocks:
+        if clock is None:
             raise ConfigParseError('Integer type\'s clock property mapping',
                                    'Invalid clock name "{}"'.format(clock_name))
 
-        if prop != 'value':
-            raise ConfigParseError('Integer type\'s clock property mapping',
-                                   'Invalid "property" property: "{}"'.format(prop))
-
-        mapped_clock = self._clocks[clock_name]
         prop_mapping = _PropertyMapping()
-        prop_mapping.object = mapped_clock
-        prop_mapping.prop = prop
+        prop_mapping.object = clock
+        prop_mapping.prop = 'value'
         int_obj.property_mappings.append(prop_mapping)
 
-    def _get_first_unknown_type_prop(self, type_node, known_props):
-        kp = known_props + ['inherit', 'class']
-
-        if self._version >= 201:
-            kp.append('$inherit')
-
-        return _get_first_unknown_prop(type_node, kp)
-
-    def _create_integer(self, obj, node):
-        if obj is None:
-            # create integer object
-            obj = _Integer()
-
-        unk_prop = self._get_first_unknown_type_prop(node, [
-            'size',
-            'align',
-            'signed',
-            'byte-order',
-            'base',
-            'encoding',
-            'property-mappings',
-        ])
-
-        if unk_prop:
-            raise ConfigParseError('Integer type',
-                                   'Unknown property: "{}"'.format(unk_prop))
+    def _create_integer(self, node):
+        obj = _Integer()
 
         # size
-        if 'size' in node:
-            size = node['size']
-
-            if not _is_int_prop(size):
-                raise ConfigParseError('Integer type',
-                                       '"size" property of integer type object must be an integer')
-
-            if size < 1:
-                raise ConfigParseError('Integer type',
-                                       'Invalid integer size: {}'.format(size))
-
-            obj.size = size
+        obj.size = node['size']
 
         # align
-        if 'align' in node:
-            align = node['align']
-
-            if align is None:
-                obj.reset_align()
-            else:
-                if not _is_int_prop(align):
-                    raise ConfigParseError('Integer type',
-                                           '"align" property of integer type object must be an integer')
-
-                if not _is_valid_alignment(align):
-                    raise ConfigParseError('Integer type',
-                                           'Invalid alignment: {}'.format(align))
+        align_node = node.get('align')
 
-                obj.align = align
+        if align_node is not None:
+            _validate_alignment(align_node, 'Integer type')
+            obj.align = align_node
 
         # signed
-        if 'signed' in node:
-            signed = node['signed']
-
-            if signed is None:
-                obj.reset_signed()
-            else:
-                if not _is_bool_prop(signed):
-                    raise ConfigParseError('Integer type',
-                                           '"signed" property of integer type object must be a boolean')
+        signed_node = node.get('signed')
 
-                obj.signed = signed
+        if signed_node is not None:
+            obj.signed = signed_node
 
         # byte order
-        if 'byte-order' in node:
-            byte_order = node['byte-order']
+        obj.byte_order = self._bo
+        bo_node = node.get('byte-order')
 
-            if byte_order is None:
-                obj.byte_order = self._bo
-            else:
-                if not _is_str_prop(byte_order):
-                    raise ConfigParseError('Integer type',
-                                           '"byte-order" property of integer type object must be a string ("le" or "be")')
-
-                byte_order = _byte_order_str_to_bo(byte_order)
-
-                if byte_order is None:
-                    raise ConfigParseError('Integer type',
-                                           'Invalid "byte-order" property in integer type object')
-
-                obj.byte_order = byte_order
-        else:
-            obj.byte_order = self._bo
+        if bo_node is not None:
+            obj.byte_order = _byte_order_str_to_bo(bo_node)
 
         # base
-        if 'base' in node:
-            base = node['base']
-
-            if base is None:
-                obj.reset_base()
+        base_node = node.get('base')
+
+        if base_node is not None:
+            if base_node == 'bin':
+                obj.base = 2
+            elif base_node == 'oct':
+                obj.base = 8
+            elif base_node == 'dec':
+                obj.base = 10
             else:
-                if not _is_str_prop(base):
-                    raise ConfigParseError('Integer type',
-                                           '"base" property of integer type object must be a string ("bin", "oct", "dec", or "hex")')
-
-                if base == 'bin':
-                    base = 2
-                elif base == 'oct':
-                    base = 8
-                elif base == 'dec':
-                    base = 10
-                elif base == 'hex':
-                    base = 16
-                else:
-                    raise ConfigParseError('Integer type',
-                                           'Unknown "base" property value: "{}" ("bin", "oct", "dec", and "hex" are accepted)'.format(base))
-
-                obj.base = base
+                assert base_node == 'hex'
+                obj.base = 16
 
         # encoding
-        if 'encoding' in node:
-            encoding = node['encoding']
+        encoding_node = node.get('encoding')
 
-            if encoding is None:
-                obj.reset_encoding()
-            else:
-                if not _is_str_prop(encoding):
-                    raise ConfigParseError('Integer type',
-                                           '"encoding" property of integer type object must be a string ("none", "ascii", or "utf-8")')
-
-                encoding = _encoding_str_to_encoding(encoding)
-
-                if encoding is None:
-                    raise ConfigParseError('Integer type',
-                                           'Invalid "encoding" property in integer type object')
-
-                obj.encoding = encoding
+        if encoding_node is not None:
+            obj.encoding = _encoding_str_to_encoding(encoding_node)
 
         # property mappings
-        if 'property-mappings' in node:
-            prop_mappings = node['property-mappings']
-
-            if prop_mappings is None:
-                obj.reset_property_mappings()
-            else:
-                if not _is_array_prop(prop_mappings):
-                    raise ConfigParseError('Integer type',
-                                           '"property-mappings" property of integer type object must be an array')
-
-                if len(prop_mappings) > 1:
-                    raise ConfigParseError('Integer type',
-                                           'Length of "property-mappings" array in integer type object must be 1')
-
-                for index, prop_mapping in enumerate(prop_mappings):
-                    if not _is_assoc_array_prop(prop_mapping):
-                        raise ConfigParseError('Integer type',
-                                               'Elements of "property-mappings" property of integer type object must be associative arrays')
+        pm_node = node.get('property-mappings')
 
-                    if 'type' not in prop_mapping:
-                        raise ConfigParseError('Integer type',
-                                               'Missing "type" property in integer type object\'s "property-mappings" array\'s element #{}'.format(index))
-
-                    prop_type = prop_mapping['type']
-
-                    if not _is_str_prop(prop_type):
-                        raise ConfigParseError('Integer type',
-                                               '"type" property of integer type object\'s "property-mappings" array\'s element #{} must be a string'.format(index))
-
-                    if prop_type == 'clock':
-                        self._set_int_clock_prop_mapping(obj, prop_mapping)
-                    else:
-                        raise ConfigParseError('Integer type',
-                                               'Unknown property mapping type "{}" in integer type object\'s "property-mappings" array\'s element #{}'.format(prop_type, index))
+        if pm_node is not None:
+            assert len(pm_node) == 1
+            self._set_int_clock_prop_mapping(obj, pm_node[0])
 
         return obj
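The `base` handling above maps the schema-validated strings `bin`/`oct`/`dec`/`hex` to the numeric bases 2/8/10/16 with an `if`/`elif` chain. The same mapping can be sketched as a table lookup (the dict and function names are illustrative, not from the patch):

```python
# The JSON schema guarantees the `base` property, when present, is one
# of these four strings, so a plain lookup needs no error handling.
_BASES = {'bin': 2, 'oct': 8, 'dec': 10, 'hex': 16}


def base_from_node(base_node, default=10):
    # Absent property: keep the integer type's default base.
    if base_node is None:
        return default

    return _BASES[base_node]
```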
 
-    def _create_float(self, obj, node):
-        if obj is None:
-            # create floating point number object
-            obj = _FloatingPoint()
-
-        unk_prop = self._get_first_unknown_type_prop(node, [
-            'size',
-            'align',
-            'byte-order',
-        ])
-
-        if unk_prop:
-            raise ConfigParseError('Floating point number type',
-                                   'Unknown property: "{}"'.format(unk_prop))
+    def _create_float(self, node):
+        obj = _FloatingPoint()
 
         # size
-        if 'size' in node:
-            size = node['size']
-
-            if not _is_assoc_array_prop(size):
-                raise ConfigParseError('Floating point number type',
-                                       '"size" property must be an associative array')
-
-            unk_prop = _get_first_unknown_prop(size, ['exp', 'mant'])
-
-            if unk_prop:
-                raise ConfigParseError('Floating point number type\'s "size" property',
-                                       'Unknown property: "{}"'.format(unk_prop))
-
-            if 'exp' in size:
-                exp = size['exp']
-
-                if not _is_int_prop(exp):
-                    raise ConfigParseError('Floating point number type\'s "size" property',
-                                           '"exp" property must be an integer')
-
-                if exp < 1:
-                    raise ConfigParseError('Floating point number type\'s "size" property',
-                                           'Invalid exponent size: {}')
-
-                obj.exp_size = exp
-
-            if 'mant' in size:
-                mant = size['mant']
-
-                if not _is_int_prop(mant):
-                    raise ConfigParseError('Floating point number type\'s "size" property',
-                                           '"mant" property must be an integer')
-
-                if mant < 1:
-                    raise ConfigParseError('Floating point number type\'s "size" property',
-                                           'Invalid mantissa size: {}')
-
-                obj.mant_size = mant
+        size_node = node['size']
+        obj.exp_size = size_node['exp']
+        obj.mant_size = size_node['mant']
 
         # align
-        if 'align' in node:
-            align = node['align']
-
-            if align is None:
-                obj.reset_align()
-            else:
-                if not _is_int_prop(align):
-                    raise ConfigParseError('Floating point number type',
-                                           '"align" property must be an integer')
-
-                if not _is_valid_alignment(align):
-                    raise ConfigParseError('Floating point number type',
-                                           'Invalid alignment: {}'.format(align))
+        align_node = node.get('align')
 
-                obj.align = align
+        if align_node is not None:
+            _validate_alignment(align_node, 'Floating point number type')
+            obj.align = align_node
 
         # byte order
-        if 'byte-order' in node:
-            byte_order = node['byte-order']
+        obj.byte_order = self._bo
+        bo_node = node.get('byte-order')
 
-            if byte_order is None:
-                obj.byte_order = self._bo
-            else:
-                if not _is_str_prop(byte_order):
-                    raise ConfigParseError('Floating point number type',
-                                           '"byte-order" property must be a string ("le" or "be")')
-
-                byte_order = _byte_order_str_to_bo(byte_order)
-
-                if byte_order is None:
-                    raise ConfigParseError('Floating point number type',
-                                           'Invalid "byte-order" property')
-        else:
-            obj.byte_order = self._bo
+        if bo_node is not None:
+            obj.byte_order = _byte_order_str_to_bo(bo_node)
 
         return obj
 
-    def _create_enum(self, obj, node):
-        if obj is None:
-            # create enumeration object
-            obj = _Enum()
-
-        unk_prop = self._get_first_unknown_type_prop(node, [
-            'value-type',
-            'members',
-        ])
-
-        if unk_prop:
-            raise ConfigParseError('Enumeration type',
-                                   'Unknown property: "{}"'.format(unk_prop))
+    def _create_enum(self, node):
+        obj = _Enum()
 
         # value type
-        if 'value-type' in node:
-            value_type_node = node['value-type']
-
-            try:
-                obj.value_type = self._create_type(value_type_node)
-            except ConfigParseError as exc:
-                _append_error_ctx(exc, 'Enumeration type', 'Cannot create integer type')
+        try:
+            obj.value_type = self._create_type(node['value-type'])
+        except ConfigParseError as exc:
+            _append_error_ctx(exc, 'Enumeration type',
+                              'Cannot create integer type')
 
         # members
-        if 'members' in node:
-            members_node = node['members']
-
-            if not _is_array_prop(members_node):
-                raise ConfigParseError('Enumeration type',
-                                       '"members" property must be an array')
+        members_node = node.get('members')
 
-            cur = 0
-            last_value = obj.last_value
-
-            if last_value is None:
-                cur = 0
+        if members_node is not None:
+            if obj.value_type.signed:
+                value_min = -(1 << (obj.value_type.size - 1))
+                value_max = (1 << (obj.value_type.size - 1)) - 1
             else:
-                cur = last_value + 1
+                value_min = 0
+                value_max = (1 << obj.value_type.size) - 1
 
-            for index, m_node in enumerate(members_node):
-                if not _is_str_prop(m_node) and not _is_assoc_array_prop(m_node):
-                    raise ConfigParseError('Enumeration type',
-                                           'Invalid member #{}: expecting a string or an associative array'.format(index))
+            cur = 0
 
-                if _is_str_prop(m_node):
+            for m_node in members_node:
+                if type(m_node) is str:
                     label = m_node
                     value = (cur, cur)
                     cur += 1
                 else:
-                    unk_prop = _get_first_unknown_prop(m_node, [
-                        'label',
-                        'value',
-                    ])
-
-                    if unk_prop:
-                        raise ConfigParseError('Enumeration type',
-                                               'Unknown member object property: "{}"'.format(unk_prop))
-
-                    if 'label' not in m_node:
-                        raise ConfigParseError('Enumeration type',
-                                               'Missing "label" property in member #{}'.format(index))
-
+                    assert type(m_node) is collections.OrderedDict
                     label = m_node['label']
-
-                    if not _is_str_prop(label):
-                        raise ConfigParseError('Enumeration type',
-                                               '"label" property of member #{} must be a string'.format(index))
-
-                    if 'value' not in m_node:
-                        raise ConfigParseError('Enumeration type',
-                                               'Missing "value" property in member ("{}")'.format(label))
-
                     value = m_node['value']
 
-                    if not _is_int_prop(value) and not _is_array_prop(value):
-                        raise ConfigParseError('Enumeration type',
-                                               'Invalid member ("{}"): expecting an integer or an array'.format(label))
-
-                    if _is_int_prop(value):
+                    if type(value) is int:
                         cur = value + 1
                         value = (value, value)
                     else:
-                        if len(value) != 2:
-                            raise ConfigParseError('Enumeration type',
-                                                   'Invalid member ("{}"): range must have exactly two items'.format(label))
-
+                        assert type(value) is list
+                        assert len(value) == 2
                         mn = value[0]
                         mx = value[1]
 
@@ -1516,445 +1051,151 @@ class _YamlConfigParser:
                         value = (mn, mx)
                         cur = mx + 1
 
-                obj.members[label] = value
-
-        return obj
-
-    def _create_string(self, obj, node):
-        if obj is None:
-            # create string object
-            obj = _String()
+                name_fmt = 'Enumeration type\'s member "{}"'
+                msg_fmt = 'Value {} is outside the value type range [{}, {}]'
 
-        unk_prop = self._get_first_unknown_type_prop(node, [
-            'encoding',
-        ])
+                if value[0] < value_min or value[0] > value_max:
+                    raise ConfigParseError(name_fmt.format(label),
+                                           msg_fmt.format(value[0],
+                                                          value_min,
+                                                          value_max))
 
-        if unk_prop:
-            raise ConfigParseError('String type',
-                                   'Unknown object property: "{}"'.format(unk_prop))
+                if value[1] < value_min or value[1] > value_max:
+                    raise ConfigParseError(name_fmt.format(label),
+                                           msg_fmt.format(value[1],
+                                                          value_min,
+                                                          value_max))
 
-        # encoding
-        if 'encoding' in node:
-            encoding = node['encoding']
+                obj.members[label] = value
 
-            if encoding is None:
-                obj.reset_encoding()
-            else:
-                if not _is_str_prop(encoding):
-                    raise ConfigParseError('String type',
-                                           '"encoding" property of must be a string ("none", "ascii", or "utf-8")')
+        return obj
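The `value_min`/`value_max` computation in `_create_enum()` above determines the valid range of an N-bit value type: for a signed 8-bit type that is [-128, 127], for an unsigned 8-bit type [0, 255]. The same arithmetic as a standalone sketch (the helper name is illustrative):

```python
def int_type_range(size, signed):
    # Valid value range of an N-bit integer field type, matching the
    # value_min/value_max computation in _create_enum().
    if signed:
        return -(1 << (size - 1)), (1 << (size - 1)) - 1

    return 0, (1 << size) - 1
```

Each enumeration member's range `(value[0], value[1])` must fall within these bounds, which is what the two `ConfigParseError` checks above enforce.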
 
-                encoding = _encoding_str_to_encoding(encoding)
+    def _create_string(self, node):
+        obj = _String()
 
-                if encoding is None:
-                    raise ConfigParseError('String type',
-                                           'Invalid "encoding" property')
+        # encoding
+        encoding_node = node.get('encoding')
 
-                obj.encoding = encoding
+        if encoding_node is not None:
+            obj.encoding = _encoding_str_to_encoding(encoding_node)
 
         return obj
 
-    def _create_struct(self, obj, node):
-        if obj is None:
-            # create structure object
-            obj = _Struct()
-
-        unk_prop = self._get_first_unknown_type_prop(node, [
-            'min-align',
-            'fields',
-        ])
-
-        if unk_prop:
-            raise ConfigParseError('Structure type',
-                                   'Unknown object property: "{}"'.format(unk_prop))
+    def _create_struct(self, node):
+        obj = _Struct()
 
         # minimum alignment
-        if 'min-align' in node:
-            min_align = node['min-align']
+        min_align_node = node.get('min-align')
 
-            if min_align is None:
-                obj.reset_min_align()
-            else:
-                if not _is_int_prop(min_align):
-                    raise ConfigParseError('Structure type',
-                                           '"min-align" property must be an integer')
-
-                if not _is_valid_alignment(min_align):
-                    raise ConfigParseError('Structure type',
-                                           'Invalid minimum alignment: {}'.format(min_align))
-
-                obj.min_align = min_align
+        if min_align_node is not None:
+            _validate_alignment(min_align_node, 'Structure type')
+            obj.min_align = min_align_node
 
         # fields
-        if 'fields' in node:
-            fields = node['fields']
-
-            if fields is None:
-                obj.reset_fields()
-            else:
-                if not _is_assoc_array_prop(fields):
-                    raise ConfigParseError('Structure type',
-                                           '"fields" property must be an associative array')
+        fields_node = node.get('fields')
 
-                for field_name, field_node in fields.items():
-                    if not _is_valid_identifier(field_name):
-                        raise ConfigParseError('Structure type',
-                                               '"{}" is not a valid field name'.format(field_name))
+        if fields_node is not None:
+            for field_name, field_node in fields_node.items():
+                _validate_identifier(field_name, 'Structure type', 'field name')
 
-                    try:
-                        obj.fields[field_name] = self._create_type(field_node)
-                    except ConfigParseError as exc:
-                        _append_error_ctx(exc, 'Structure type',
-                                          'Cannot create field "{}"'.format(field_name))
+                try:
+                    obj.fields[field_name] = self._create_type(field_node)
+                except ConfigParseError as exc:
+                    _append_error_ctx(exc, 'Structure type',
+                                      'Cannot create field "{}"'.format(field_name))
 
         return obj
 
-    def _create_array(self, obj, node):
-        if obj is None:
-            # create array object
-            obj = _Array()
-
-        unk_prop = self._get_first_unknown_type_prop(node, [
-            'length',
-            'element-type',
-        ])
-
-        if unk_prop:
-            raise ConfigParseError('Array type',
-                                   'Unknown property: "{}"'.format(unk_prop))
+    def _create_array(self, node):
+        obj = _Array()
 
         # length
-        if 'length' in node:
-            length = node['length']
+        obj.length = node['length']
 
-            if not _is_int_prop(length):
-                raise ConfigParseError('Array type',
-                                       '"length" property must be an integer')
+        # element type
+        try:
+            obj.element_type = self._create_type(node['element-type'])
+        except ConfigParseError as exc:
+            _append_error_ctx(exc, 'Array type', 'Cannot create element type')
 
-            if type(length) is int and length < 0:
-                raise ConfigParseError('Array type',
-                                       'Invalid length: {}'.format(length))
+        return obj
 
-            obj.length = length
-
-        # element type
-        if 'element-type' in node:
-            element_type_node = node['element-type']
-
-            try:
-                obj.element_type = self._create_type(node['element-type'])
-            except ConfigParseError as exc:
-                _append_error_ctx(exc, 'Array type', 'Cannot create element type')
-
-        return obj
-
-    def _create_type(self, type_node):
-        if type(type_node) is str:
-            t = self._lookup_type_alias(type_node)
-
-            if t is None:
-                raise ConfigParseError('Type',
-                                       'Unknown type alias "{}"'.format(type_node))
-
-            return t
-
-        if not _is_assoc_array_prop(type_node):
-            raise ConfigParseError('Type',
-                                   'Expecting associative arrays or string (type alias name)')
-
-        # inherit:
-        #   v2.0:  "inherit"
-        #   v2.1+: "$inherit"
-        inherit_node = None
-
-        if self._version >= 200:
-            if 'inherit' in type_node:
-                inherit_prop = 'inherit'
-                inherit_node = type_node[inherit_prop]
-
-        if self._version >= 201:
-            if '$inherit' in type_node:
-                if inherit_node is not None:
-                    raise ConfigParseError('Type',
-                                           'Cannot specify both "inherit" and "$inherit" properties of type object: prefer "$inherit"')
-
-                inherit_prop = '$inherit'
-                inherit_node = type_node[inherit_prop]
-
-        if inherit_node is not None and 'class' in type_node:
-            raise ConfigParseError('Type',
-                                   'Cannot specify both "{}" and "class" properties in type object'.format(inherit_prop))
-
-        if inherit_node is not None:
-            if not _is_str_prop(inherit_node):
-                raise ConfigParseError('Type',
-                                       '"{}" property of type object must be a string'.format(inherit_prop))
-
-            base = self._lookup_type_alias(inherit_node)
-
-            if base is None:
-                raise ConfigParseError('Type',
-                                       'Cannot inherit from type alias "{}": type alias does not exist at this point'.format(inherit_node))
-
-            func = self._type_to_create_type_func[type(base)]
-        else:
-            if 'class' not in type_node:
-                raise ConfigParseError('Type',
-                                       'Does not inherit, therefore must have a "class" property')
-
-            class_name = type_node['class']
-
-            if type(class_name) is not str:
-                raise ConfigParseError('Type', '"class" property must be a string')
-
-            if class_name not in self._class_name_to_create_type_func:
-                raise ConfigParseError('Type',
-                                       'Unknown class "{}"'.format(class_name))
-
-            base = None
-            func = self._class_name_to_create_type_func[class_name]
-
-        return func(base, type_node)
-
-    def _register_type_aliases(self, metadata_node):
-        self._tas = dict()
-
-        if 'type-aliases' not in metadata_node:
-            return
-
-        ta_node = metadata_node['type-aliases']
-
-        if ta_node is None:
-            return
-
-        if not _is_assoc_array_prop(ta_node):
-            raise ConfigParseError('Metadata',
-                                   '"type-aliases" property must be an associative array')
-
-        for ta_name, ta_type in ta_node.items():
-            if ta_name in self._tas:
-                raise ConfigParseError('Metadata',
-                                       'Duplicate type alias "{}"'.format(ta_name))
-
-            try:
-                t = self._create_type(ta_type)
-            except ConfigParseError as exc:
-                _append_error_ctx(exc, 'Metadata',
-                                  'Cannot create type alias "{}"'.format(ta_name))
-
-            self._tas[ta_name] = t
+    def _create_type(self, type_node):
+        return self._class_name_to_create_type_func[type_node['class']](type_node)
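`_create_type()` is now a single dictionary dispatch: the JSON schema has already guaranteed that the `class` property exists and names a known field type class, so the removed lookup-and-error checks are unnecessary. A minimal sketch of the pattern with hypothetical handlers:

```python
def make_create_type(handlers):
    # handlers: class name -> function building a type object from the
    # (already schema-validated) type node.
    def create_type(type_node):
        return handlers[type_node['class']](type_node)

    return create_type


# Illustrative handlers; the real parser maps class names to methods
# such as _create_integer and _create_struct.
create_type = make_create_type({
    'int': lambda node: ('int', node['size']),
    'string': lambda node: ('string',),
})
```

A missing or unknown `class` would surface as a `KeyError` here, but by construction the schema validation pass rejects such a node before this code runs.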
 
     def _create_clock(self, node):
         # create clock object
         clock = _Clock()
 
-        if not _is_assoc_array_prop(node):
-            raise ConfigParseError('Metadata',
-                                   'Clock objects must be associative arrays')
-
-        known_props = [
-            'uuid',
-            'description',
-            'freq',
-            'error-cycles',
-            'offset',
-            'absolute',
-            'return-ctype',
-        ]
-
-        if self._version >= 201:
-            known_props.append('$return-ctype')
-
-        unk_prop = _get_first_unknown_prop(node, known_props)
-
-        if unk_prop:
-            raise ConfigParseError('Clock',
-                                   'Unknown property: "{}"'.format(unk_prop))
-
         # UUID
-        if 'uuid' in node:
-            uuidp = node['uuid']
-
-            if uuidp is None:
-                clock.reset_uuid()
-            else:
-                if not _is_str_prop(uuidp):
-                    raise ConfigParseError('Clock',
-                                           '"uuid" property must be a string')
+        uuid_node = node.get('uuid')
 
-                try:
-                    uuidp = uuid.UUID(uuidp)
-                except:
-                    raise ConfigParseError('Clock', 'Malformed UUID: "{}"'.format(uuidp))
-
-                clock.uuid = uuidp
+        if uuid_node is not None:
+            try:
+                clock.uuid = uuid.UUID(uuid_node)
+            except ValueError:
+                raise ConfigParseError('Clock', 'Malformed UUID: "{}"'.format(uuid_node))
 
         # description
-        if 'description' in node:
-            desc = node['description']
-
-            if desc is None:
-                clock.reset_description()
-            else:
-                if not _is_str_prop(desc):
-                    raise ConfigParseError('Clock',
-                                           '"description" property must be a string')
+        descr_node = node.get('description')
 
-                clock.description = desc
+        if descr_node is not None:
+            clock.description = descr_node
 
         # frequency
-        if 'freq' in node:
-            freq = node['freq']
+        freq_node = node.get('freq')
 
-            if freq is None:
-                clock.reset_freq()
-            else:
-                if not _is_int_prop(freq):
-                    raise ConfigParseError('Clock',
-                                           '"freq" property must be an integer')
-
-                if freq < 1:
-                    raise ConfigParseError('Clock',
-                                           'Invalid frequency: {}'.format(freq))
-
-                clock.freq = freq
+        if freq_node is not None:
+            clock.freq = freq_node
 
         # error cycles
-        if 'error-cycles' in node:
-            error_cycles = node['error-cycles']
-
-            if error_cycles is None:
-                clock.reset_error_cycles()
-            else:
-                if not _is_int_prop(error_cycles):
-                    raise ConfigParseError('Clock',
-                                           '"error-cycles" property must be an integer')
+        error_cycles_node = node.get('error-cycles')
 
-                if error_cycles < 0:
-                    raise ConfigParseError('Clock',
-                                           'Invalid error cycles: {}'.format(error_cycles))
-
-                clock.error_cycles = error_cycles
+        if error_cycles_node is not None:
+            clock.error_cycles = error_cycles_node
 
         # offset
-        if 'offset' in node:
-            offset = node['offset']
-
-            if offset is None:
-                clock.reset_offset_seconds()
-                clock.reset_offset_cycles()
-            else:
-                if not _is_assoc_array_prop(offset):
-                    raise ConfigParseError('Clock',
-                                           '"offset" property must be an associative array')
+        offset_node = node.get('offset')
 
-                unk_prop = _get_first_unknown_prop(offset, ['cycles', 'seconds'])
+        if offset_node is not None:
+            # cycles
+            offset_cycles_node = offset_node.get('cycles')
 
-                if unk_prop:
-                    raise ConfigParseError('Clock',
-                                           'Unknown offset property: "{}"'.format(unk_prop))
+            if offset_cycles_node is not None:
+                clock.offset_cycles = offset_cycles_node
 
-                # cycles
-                if 'cycles' in offset:
-                    offset_cycles = offset['cycles']
+            # seconds
+            offset_seconds_node = offset_node.get('seconds')
 
-                    if offset_cycles is None:
-                        clock.reset_offset_cycles()
-                    else:
-                        if not _is_int_prop(offset_cycles):
-                            raise ConfigParseError('Clock\'s "offset" property',
-                                                   '"cycles" property must be an integer')
-
-                        if offset_cycles < 0:
-                            raise ConfigParseError('Clock\'s "offset" property',
-                                                   'Invalid cycles: {}'.format(offset_cycles))
-
-                        clock.offset_cycles = offset_cycles
-
-                # seconds
-                if 'seconds' in offset:
-                    offset_seconds = offset['seconds']
-
-                    if offset_seconds is None:
-                        clock.reset_offset_seconds()
-                    else:
-                        if not _is_int_prop(offset_seconds):
-                            raise ConfigParseError('Clock\'s "offset" property',
-                                                   '"seconds" property must be an integer')
-
-                        if offset_seconds < 0:
-                            raise ConfigParseError('Clock\'s "offset" property',
-                                                   'Invalid seconds: {}'.format(offset_seconds))
-
-                        clock.offset_seconds = offset_seconds
+            if offset_seconds_node is not None:
+                clock.offset_seconds = offset_seconds_node
 
         # absolute
-        if 'absolute' in node:
-            absolute = node['absolute']
-
-            if absolute is None:
-                clock.reset_absolute()
-            else:
-                if not _is_bool_prop(absolute):
-                    raise ConfigParseError('Clock',
-                                           '"absolute" property must be a boolean')
-
-                clock.absolute = absolute
-
-        # return C type:
-        #   v2.0:  "return-ctype"
-        #   v2.1+: "$return-ctype"
-        return_ctype_node = None
+        absolute_node = node.get('absolute')
 
-        if self._version >= 200:
-            if 'return-ctype' in node:
-                return_ctype_prop = 'return-ctype'
-                return_ctype_node = node[return_ctype_prop]
+        if absolute_node is not None:
+            clock.absolute = absolute_node
 
-        if self._version >= 201:
-            if '$return-ctype' in node:
-                if return_ctype_node is not None:
-                    raise ConfigParseError('Clock',
-                                           'Cannot specify both "return-ctype" and "$return-ctype" properties: prefer "$return-ctype"')
+        return_ctype_node = node.get('$return-ctype')
 
-                return_ctype_prop = '$return-ctype'
-                return_ctype_node = node[return_ctype_prop]
+        if return_ctype_node is None:
+            return_ctype_node = node.get('return-ctype')
 
         if return_ctype_node is not None:
-            if return_ctype_node is None:
-                clock.reset_return_ctype()
-            else:
-                if not _is_str_prop(return_ctype_node):
-                    raise ConfigParseError('Clock',
-                                           '"{}" property of must be a string'.format(return_ctype_prop))
-
-                clock.return_ctype = return_ctype_node
+            clock.return_ctype = return_ctype_node
 
         return clock
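The `$return-ctype`/`return-ctype` lookup above reduces to a simple fallback chain because `dict.get()` returns `None` for a missing key. A standalone sketch with a hypothetical clock node (property names follow the barectf YAML format; the 2.1+ `$return-ctype` spelling wins when both are present):

```python
# Hypothetical clock node: only the legacy 2.0 spelling is present.
node = {'return-ctype': 'uint64_t'}

# Prefer the 2.1+ `$return-ctype` property; fall back to the 2.0
# `return-ctype` spelling when it's absent.
return_ctype = node.get('$return-ctype')

if return_ctype is None:
    return_ctype = node.get('return-ctype')
```
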
 
     def _register_clocks(self, metadata_node):
         self._clocks = collections.OrderedDict()
-
-        if 'clocks' not in metadata_node:
-            return
-
-        clocks_node = metadata_node['clocks']
+        clocks_node = metadata_node.get('clocks')
 
         if clocks_node is None:
             return
 
-        if not _is_assoc_array_prop(clocks_node):
-            raise ConfigParseError('Metadata',
-                                   '"clocks" property must be an associative array')
-
         for clock_name, clock_node in clocks_node.items():
-            if not _is_valid_identifier(clock_name):
-                raise ConfigParseError('Metadata',
-                                       'Invalid clock name: "{}"'.format(clock_name))
-
-            if clock_name in self._clocks:
-                raise ConfigParseError('Metadata',
-                                       'Duplicate clock "{}"'.format(clock_name))
+            _validate_identifier(clock_name, 'Metadata', 'clock name')
+            assert clock_name not in self._clocks
 
             try:
                 clock = self._create_clock(clock_node)
@@ -1966,320 +1207,142 @@ class _YamlConfigParser:
             self._clocks[clock_name] = clock
 
     def _create_env(self, metadata_node):
-        env = collections.OrderedDict()
-
-        if 'env' not in metadata_node:
-            return env
-
-        env_node = metadata_node['env']
+        env_node = metadata_node.get('env')
 
         if env_node is None:
-            return env
-
-        if not _is_assoc_array_prop(env_node):
-            raise ConfigParseError('Metadata',
-                                   '"env" property must be an associative array')
+            return collections.OrderedDict()
 
         for env_name, env_value in env_node.items():
-            if env_name in env:
-                raise ConfigParseError('Metadata',
-                                       'Duplicate environment variable "{}"'.format(env_name))
-
-            if not _is_valid_identifier(env_name):
-                raise ConfigParseError('Metadata',
-                                       'Invalid environment variable name: "{}"'.format(env_name))
-
-            if not _is_int_prop(env_value) and not _is_str_prop(env_value):
-                raise ConfigParseError('Metadata',
-                                       'Invalid environment variable value ("{}"): expecting integer or string'.format(env_name))
-
-            env[env_name] = env_value
-
-        return env
-
-    def _register_log_levels(self, metadata_node):
-        self._log_levels = dict()
+            _validate_identifier(env_name, 'Metadata',
+                                 'environment variable name')
 
-        # log levels:
-        #   v2.0:  "log-levels"
-        #   v2.1+: "$log-levels"
-        log_levels_node = None
-
-        if self._version >= 200:
-            if 'log-levels' in metadata_node:
-                log_levels_prop = 'log-levels'
-                log_levels_node = metadata_node[log_levels_prop]
-
-        if self._version >= 201:
-            if '$log-levels' in metadata_node:
-                if log_levels_node is not None:
-                    raise ConfigParseError('Metadata',
-                                           'Cannot specify both "log-levels" and "$log-levels" properties of metadata object: prefer "$log-levels"')
-
-                log_levels_prop = '$log-levels'
-                log_levels_node = metadata_node[log_levels_prop]
-
-        if log_levels_node is None:
-            return
-
-        if not _is_assoc_array_prop(log_levels_node):
-            raise ConfigParseError('Metadata',
-                                   '"{}" property (metadata) must be an associative array'.format(log_levels_prop))
-
-        for ll_name, ll_value in log_levels_node.items():
-            if ll_name in self._log_levels:
-                raise ConfigParseError('"{}" property"'.format(log_levels_prop),
-                                       'Duplicate entry "{}"'.format(ll_name))
-
-            if not _is_int_prop(ll_value):
-                raise ConfigParseError('"{}" property"'.format(log_levels_prop),
-                                       'Invalid entry ("{}"): expecting an integer'.format(ll_name))
-
-            if ll_value < 0:
-                raise ConfigParseError('"{}" property"'.format(log_levels_prop),
-                                       'Invalid entry ("{}"): value must be positive'.format(ll_name))
-
-            self._log_levels[ll_name] = ll_value
+        return copy.deepcopy(env_node)
 
     def _create_trace(self, metadata_node):
         # create trace object
         trace = _Trace()
 
-        if 'trace' not in metadata_node:
-            raise ConfigParseError('Metadata', 'Missing "trace" property')
-
         trace_node = metadata_node['trace']
 
-        if not _is_assoc_array_prop(trace_node):
-            raise ConfigParseError('Metadata',
-                                   '"trace" property must be an associative array')
-
-        unk_prop = _get_first_unknown_prop(trace_node, [
-            'byte-order',
-            'uuid',
-            'packet-header-type',
-        ])
-
-        if unk_prop:
-            raise ConfigParseError('Trace',
-                                   'Unknown property: "{}"'.format(unk_prop))
-
         # set byte order (already parsed)
         trace.byte_order = self._bo
 
         # UUID
-        if 'uuid' in trace_node and trace_node['uuid'] is not None:
-            uuidp = trace_node['uuid']
-
-            if not _is_str_prop(uuidp):
-                raise ConfigParseError('Trace',
-                                       '"uuid" property must be a string')
+        uuid_node = trace_node.get('uuid')
 
-            if uuidp == 'auto':
-                uuidp = uuid.uuid1()
+        if uuid_node is not None:
+            if uuid_node == 'auto':
+                trace.uuid = uuid.uuid1()
             else:
                 try:
-                    uuidp = uuid.UUID(uuidp)
+                    trace.uuid = uuid.UUID(uuid_node)
                 except:
                     raise ConfigParseError('Trace',
-                                           'Malformed UUID: "{}"'.format(uuidp))
-
-            trace.uuid = uuidp
+                                           'Malformed UUID: "{}"'.format(uuid_node))
 
         # packet header type
-        if 'packet-header-type' in trace_node and trace_node['packet-header-type'] is not None:
+        pht_node = trace_node.get('packet-header-type')
+
+        if pht_node is not None:
             try:
-                ph_type = self._create_type(trace_node['packet-header-type'])
+                trace.packet_header_type = self._create_type(pht_node)
             except ConfigParseError as exc:
                 _append_error_ctx(exc, 'Trace',
                                   'Cannot create packet header type')
 
-            trace.packet_header_type = ph_type
-
         return trace
 
-    def _lookup_log_level(self, ll):
-        if _is_int_prop(ll):
-            return ll
-        elif _is_str_prop(ll) and ll in self._log_levels:
-            return self._log_levels[ll]
-
     def _create_event(self, event_node):
+        # create event object
         event = _Event()
 
-        if not _is_assoc_array_prop(event_node):
-            raise ConfigParseError('Event', 'Expecting associative array')
-
-        unk_prop = _get_first_unknown_prop(event_node, [
-            'log-level',
-            'context-type',
-            'payload-type',
-        ])
-
-        if unk_prop:
-            raise ConfigParseError('Event',
-                                   'Unknown property: "{}"'.format(unk_prop))
-
-        if 'log-level' in event_node and event_node['log-level'] is not None:
-            ll_node = event_node['log-level']
-
-            if _is_str_prop(ll_node):
-                ll_value = self._lookup_log_level(event_node['log-level'])
-
-                if ll_value is None:
-                    raise ConfigParseError('Event\'s "log-level" property',
-                                           'Cannot find log level "{}"'.format(ll_node))
-
-                ll = metadata.LogLevel(event_node['log-level'], ll_value)
-            elif _is_int_prop(ll_node):
-                if ll_node < 0:
-                    raise ConfigParseError('Event\'s "log-level" property',
-                                           'Invalid value {}: value must be positive'.format(ll_node))
-
-                ll = metadata.LogLevel(None, ll_node)
-            else:
-                raise ConfigParseError('Event\'s "log-level" property',
-                                       'Must be either a string or an integer')
+        log_level_node = event_node.get('log-level')
 
-            event.log_level = ll
+        if log_level_node is not None:
+            assert type(log_level_node) is int
+            event.log_level = metadata.LogLevel(None, log_level_node)
 
-        if 'context-type' in event_node and event_node['context-type'] is not None:
-            ctx_type_node = event_node['context-type']
+        ct_node = event_node.get('context-type')
 
+        if ct_node is not None:
             try:
-                t = self._create_type(event_node['context-type'])
+                event.context_type = self._create_type(ct_node)
             except ConfigParseError as exc:
                 _append_error_ctx(exc, 'Event',
                                   'Cannot create context type object')
 
-            event.context_type = t
+        pt_node = event_node.get('payload-type')
 
-        if 'payload-type' in event_node and event_node['payload-type'] is not None:
+        if pt_node is not None:
             try:
-                t = self._create_type(event_node['payload-type'])
+                event.payload_type = self._create_type(pt_node)
             except ConfigParseError as exc:
                 _append_error_ctx(exc, 'Event',
                                   'Cannot create payload type object')
 
-            event.payload_type = t
-
         return event
 
     def _create_stream(self, stream_name, stream_node):
+        # create stream object
         stream = _Stream()
 
-        if not _is_assoc_array_prop(stream_node):
-            raise ConfigParseError('Stream objects must be associative arrays')
-
-        known_props = [
-            'packet-context-type',
-            'event-header-type',
-            'event-context-type',
-            'events',
-        ]
-
-        if self._version >= 202:
-            known_props.append('$default')
-
-        unk_prop = _get_first_unknown_prop(stream_node, known_props)
-
-        if unk_prop:
-            add = ''
-
-            if unk_prop == '$default':
-                add = ' (use version 2.2 or greater)'
-
-            raise ConfigParseError('Stream',
-                                   'Unknown property{}: "{}"'.format(add, unk_prop))
+        pct_node = stream_node.get('packet-context-type')
 
-        if 'packet-context-type' in stream_node and stream_node['packet-context-type'] is not None:
+        if pct_node is not None:
             try:
-                t = self._create_type(stream_node['packet-context-type'])
+                stream.packet_context_type = self._create_type(pct_node)
             except ConfigParseError as exc:
                 _append_error_ctx(exc, 'Stream',
                                   'Cannot create packet context type object')
 
-            stream.packet_context_type = t
+        eht_node = stream_node.get('event-header-type')
 
-        if 'event-header-type' in stream_node and stream_node['event-header-type'] is not None:
+        if eht_node is not None:
             try:
-                t = self._create_type(stream_node['event-header-type'])
+                stream.event_header_type = self._create_type(eht_node)
             except ConfigParseError as exc:
                 _append_error_ctx(exc, 'Stream',
                                   'Cannot create event header type object')
 
-            stream.event_header_type = t
+        ect_node = stream_node.get('event-context-type')
 
-        if 'event-context-type' in stream_node and stream_node['event-context-type'] is not None:
+        if ect_node is not None:
             try:
-                t = self._create_type(stream_node['event-context-type'])
+                stream.event_context_type = self._create_type(ect_node)
             except ConfigParseError as exc:
                 _append_error_ctx(exc, 'Stream',
                                   'Cannot create event context type object')
 
-            stream.event_context_type = t
-
-        if 'events' not in stream_node:
-            raise ConfigParseError('Stream', 'Missing "events" property')
-
-        events = stream_node['events']
-
-        if events is not None:
-            if not _is_assoc_array_prop(events):
-                raise ConfigParseError('Stream',
-                                       '"events" property must be an associative array')
-
-            if not events:
-                raise ConfigParseError('Stream', 'At least one event is needed')
-
-            cur_id = 0
+        events_node = stream_node['events']
+        cur_id = 0
 
-            for ev_name, ev_node in events.items():
-                try:
-                    ev = self._create_event(ev_node)
-                except ConfigParseError as exc:
-                    _append_error_ctx(exc, 'Stream',
-                                      'Cannot create event "{}"'.format(ev_name))
+        for ev_name, ev_node in events_node.items():
+            try:
+                ev = self._create_event(ev_node)
+            except ConfigParseError as exc:
+                _append_error_ctx(exc, 'Stream',
+                                  'Cannot create event "{}"'.format(ev_name))
 
-                ev.id = cur_id
-                ev.name = ev_name
-                stream.events[ev_name] = ev
-                cur_id += 1
+            ev.id = cur_id
+            ev.name = ev_name
+            stream.events[ev_name] = ev
+            cur_id += 1
 
-        if '$default' in stream_node and stream_node['$default'] is not None:
-            default_node = stream_node['$default']
+        default_node = stream_node.get('$default')
 
-            if not _is_bool_prop(default_node):
+        if default_node is not None:
+            if self._meta.default_stream_name is not None and self._meta.default_stream_name != stream_name:
+                fmt = 'Cannot specify more than one default stream (default stream already set to "{}")'
                 raise ConfigParseError('Stream',
-                                       'Invalid "$default" property: expecting a boolean')
+                                       fmt.format(self._meta.default_stream_name))
 
-            if default_node:
-                if self._meta.default_stream_name is not None and self._meta.default_stream_name != stream_name:
-                    fmt = 'Cannot specify more than one default stream (default stream already set to "{}")'
-                    raise ConfigParseError('Stream',
-                                           fmt.format(self._meta.default_stream_name))
-
-                self._meta.default_stream_name = stream_name
+            self._meta.default_stream_name = stream_name
 
         return stream
 
     def _create_streams(self, metadata_node):
         streams = collections.OrderedDict()
-
-        if 'streams' not in metadata_node:
-            raise ConfigParseError('Metadata',
-                                   'Missing "streams" property')
-
         streams_node = metadata_node['streams']
-
-        if not _is_assoc_array_prop(streams_node):
-            raise ConfigParseError('Metadata',
-                                   '"streams" property must be an associative array')
-
-        if not streams_node:
-            raise ConfigParseError('Metadata\'s "streams" property',
-                                   'At least one stream is needed')
-
         cur_id = 0
 
         for stream_name, stream_node in streams_node.items():
@@ -2290,7 +1353,7 @@ class _YamlConfigParser:
                                   'Cannot create stream "{}"'.format(stream_name))
 
             stream.id = cur_id
-            stream.name = str(stream_name)
+            stream.name = stream_name
             streams[stream_name] = stream
             cur_id += 1
 
@@ -2298,166 +1361,47 @@ class _YamlConfigParser:
 
     def _create_metadata(self, root):
         self._meta = _Metadata()
-
-        if 'metadata' not in root:
-            raise ConfigParseError('Configuration',
-                                   'Missing "metadata" property')
-
         metadata_node = root['metadata']
 
-        if not _is_assoc_array_prop(metadata_node):
-            raise ConfigParseError('Configuration\'s "metadata" property',
-                                   'Must be an associative array')
-
-        known_props = [
-            'type-aliases',
-            'log-levels',
-            'trace',
-            'env',
-            'clocks',
-            'streams',
-        ]
-
-        if self._version >= 201:
-            known_props.append('$log-levels')
-
-        if self._version >= 202:
-            known_props.append('$default-stream')
-
-        unk_prop = _get_first_unknown_prop(metadata_node, known_props)
-
-        if unk_prop:
-            add = ''
-
-            if unk_prop == '$include':
-                add = ' (use version 2.1 or greater)'
-
-            if unk_prop == '$default-stream':
-                add = ' (use version 2.2 or greater)'
-
-            raise ConfigParseError('Metadata',
-                                   'Unknown property{}: "{}"'.format(add, unk_prop))
-
         if '$default-stream' in metadata_node and metadata_node['$default-stream'] is not None:
             default_stream_node = metadata_node['$default-stream']
-
-            if not _is_str_prop(default_stream_node):
-                raise ConfigParseError('Metadata\'s "$default-stream" property',
-                                       'Expecting a string')
-
             self._meta.default_stream_name = default_stream_node
 
         self._set_byte_order(metadata_node)
         self._register_clocks(metadata_node)
         self._meta.clocks = self._clocks
-        self._register_type_aliases(metadata_node)
         self._meta.env = self._create_env(metadata_node)
         self._meta.trace = self._create_trace(metadata_node)
-        self._register_log_levels(metadata_node)
         self._meta.streams = self._create_streams(metadata_node)
 
         # validate metadata
         try:
-            validator = _MetadataTypesHistologyValidator()
-            validator.validate(self._meta)
-            validator = _MetadataSpecialFieldsValidator()
-            validator.validate(self._meta)
+            _MetadataSpecialFieldsValidator().validate(self._meta)
         except ConfigParseError as exc:
             _append_error_ctx(exc, 'Metadata')
 
         try:
-            validator = _BarectfMetadataValidator()
-            validator.validate(self._meta)
+            _BarectfMetadataValidator().validate(self._meta)
         except ConfigParseError as exc:
             _append_error_ctx(exc, 'barectf metadata')
 
         return self._meta
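The manual checks removed throughout `_create_metadata()` and its helpers are possible because a JSON schema (draft 7, via the `jsonschema` package) validates the configuration object before parsing. A hedged sketch of that approach — the schema below is illustrative only, not barectf's actual `config/config-min.yaml`:

```python
import jsonschema

# Illustrative draft-7 schema in the spirit of `config/config-min.yaml`:
# the object must have a `version` property with a supported value.
config_min_schema = {
    '$schema': 'http://json-schema.org/draft-07/schema#',
    'type': 'object',
    'properties': {
        'version': {'type': 'string', 'enum': ['2.0', '2.1', '2.2']},
    },
    'required': ['version'],
}

# Passes silently for a valid object...
jsonschema.validate({'version': '2.2'}, config_min_schema)

# ...and raises jsonschema.ValidationError otherwise.
try:
    jsonschema.validate({'version': 3}, config_min_schema)
    validated = True
except jsonschema.ValidationError:
    validated = False
```

Once such a schema has run, the parser can index and `get()` properties without re-checking their existence or types.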
 
-    def _get_version(self, root):
-        if 'version' not in root:
-            raise ConfigParseError('Configuration',
-                                   'Missing "version" property')
-
-        version_node = root['version']
-
-        if not _is_str_prop(version_node):
-            raise ConfigParseError('Configuration\'s "version" property',
-                                   'Must be a string')
-
-        version_node = version_node.strip()
-
-        if version_node not in ['2.0', '2.1', '2.2']:
-            raise ConfigParseError('Configuration',
-                                   'Unsupported version ({}): versions 2.0, 2.1, and 2.2 are supported'.format(version_node))
-
-        # convert version string to comparable version integer
-        parts = version_node.split('.')
-        version = int(parts[0]) * 100 + int(parts[1])
-
-        return version
-
-    def _get_prefix(self, root):
-        def_prefix = 'barectf_'
-
-        if 'prefix' not in root:
-            return def_prefix
-
-        prefix_node = root['prefix']
-
-        if prefix_node is None:
-            return def_prefix
-
-        if not _is_str_prop(prefix_node):
-            raise ConfigParseError('Configuration\'s "prefix" property',
-                                   'Must be a string')
-
-        if not _is_valid_identifier(prefix_node):
-            raise ConfigParseError('Configuration\'s "prefix" property',
-                                   'Must be a valid C identifier')
-
-        return prefix_node
-
-    def _get_options(self, root):
-        if 'options' not in root:
-            return config.ConfigOptions()
-
-        options_node = root['options']
-
-        if not _is_assoc_array_prop(options_node):
-            raise ConfigParseError('Configuration\'s "options" property',
-                                   'Must be an associative array')
-
-        known_props = [
-            'gen-prefix-def',
-            'gen-default-stream-def',
-        ]
-        unk_prop = _get_first_unknown_prop(options_node, known_props)
-
-        if unk_prop:
-            raise ConfigParseError('Configuration\'s "options" property',
-                                   'Unknown property: "{}"'.format(unk_prop))
+    def _get_prefix(self, config_node):
+        prefix = config_node.get('prefix', 'barectf_')
+        _validate_identifier(prefix, '"prefix" property', 'prefix')
+        return prefix
 
-        gen_prefix_def = None
+    def _get_options(self, config_node):
+        gen_prefix_def = False
+        gen_default_stream_def = False
+        options_node = config_node.get('options')
 
-        if 'gen-prefix-def' in options_node and options_node['gen-prefix-def'] is not None:
-            gen_prefix_def_node = options_node['gen-prefix-def']
-
-            if not _is_bool_prop(gen_prefix_def_node):
-                raise ConfigParseError('Configuration\'s "options" property',
-                                       'Invalid option "gen-prefix-def": expecting a boolean')
-
-            gen_prefix_def = gen_prefix_def_node
-
-        gen_default_stream_def = None
-
-        if 'gen-default-stream-def' in options_node and options_node['gen-default-stream-def'] is not None:
-            gen_default_stream_def_node = options_node['gen-default-stream-def']
-
-            if not _is_bool_prop(gen_default_stream_def_node):
-                raise ConfigParseError('Configuration\'s "options" property',
-                                       'Invalid option "gen-default-stream-def": expecting a boolean')
-
-            gen_default_stream_def = gen_default_stream_def_node
+        if options_node is not None:
+            gen_prefix_def = options_node.get('gen-prefix-def',
+                                              gen_prefix_def)
+            gen_default_stream_def = options_node.get('gen-default-stream-def',
+                                                      gen_default_stream_def)
 
         return config.ConfigOptions(gen_prefix_def, gen_default_stream_def)
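The pattern in `_get_options()` above — `dict.get()` with a default instead of the former existence and boolean checks — can be sketched standalone (hypothetical free function; the real method wraps the result in `config.ConfigOptions`):

```python
def get_options(config_node):
    # Defaults apply when `options` or an individual option is absent;
    # the JSON schema already guarantees that present values are booleans.
    gen_prefix_def = False
    gen_default_stream_def = False
    options_node = config_node.get('options')

    if options_node is not None:
        gen_prefix_def = options_node.get('gen-prefix-def', gen_prefix_def)
        gen_default_stream_def = options_node.get('gen-default-stream-def',
                                                  gen_default_stream_def)

    return gen_prefix_def, gen_default_stream_def
```
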
 
@@ -2469,8 +1413,10 @@ class _YamlConfigParser:
 
     def _load_include(self, yaml_path):
         for inc_dir in self._include_dirs:
-            # current include dir + file name path
-            # note: os.path.join() only takes the last arg if it's absolute
+            # Current inclusion dir + file name path.
+            #
+            # Note: os.path.join() only takes the last argument if it's
+            # absolute.
             inc_path = os.path.join(inc_dir, yaml_path)
 
             # real path (symbolic links resolved)
@@ -2480,13 +1426,14 @@ class _YamlConfigParser:
             norm_path = os.path.normpath(real_path)
 
             if not os.path.isfile(norm_path):
-                # file does not exist: skip
+                # file doesn't exist: skip
                 continue
 
             if norm_path in self._include_stack:
                 base_path = self._get_last_include_file()
-                raise ConfigParseError('In "{}"',
-                                       'Cannot recursively include file "{}"'.format(base_path, norm_path))
+                raise ConfigParseError('In "{}"'.format(base_path),
+                                       'Cannot recursively include file "{}"'.format(norm_path))
 
             self._include_stack.append(norm_path)
 
@@ -2496,37 +1443,31 @@ class _YamlConfigParser:
         if not self._ignore_include_not_found:
             base_path = self._get_last_include_file()
-            raise ConfigParseError('In "{}"',
-                                   'Cannot include file "{}": file not found in include directories'.format(base_path, yaml_path))
-
-        return None
+            raise ConfigParseError('In "{}"'.format(base_path),
+                                   'Cannot include file "{}": file not found in include directories'.format(yaml_path))
 
     def _get_include_paths(self, include_node):
         if include_node is None:
+            # none
             return []
 
-        if _is_str_prop(include_node):
+        if type(include_node) is str:
+            # wrap as array
             return [include_node]
 
-        if _is_array_prop(include_node):
-            for include_path in include_node:
-                if not _is_str_prop(include_path):
-                    raise ConfigParseError('"$include" property',
-                                           'Expecting array of strings')
-
-            return include_node
-
-        raise ConfigParseError('"$include" property',
-                               'Expecting string or array of strings')
+        # already an array
+        assert type(include_node) is list
+        return include_node
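The `$include` normalization above accepts `null`, a string, or an array of strings (anything else is rejected earlier by the JSON schema, hence the `assert`). A standalone sketch:

```python
def get_include_paths(include_node):
    # Normalize the `$include` property: absent → no paths, a string →
    # a single path, a list → already in the right shape.
    if include_node is None:
        return []

    if type(include_node) is str:
        return [include_node]

    # The JSON schema has ruled out anything but a list at this point.
    assert type(include_node) is list
    return include_node
```
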
 
     def _update_node(self, base_node, overlay_node):
         for olay_key, olay_value in overlay_node.items():
             if olay_key in base_node:
                 base_value = base_node[olay_key]
 
-                if _is_assoc_array_prop(olay_value) and _is_assoc_array_prop(base_value):
+                if type(olay_value) is collections.OrderedDict and type(base_value) is collections.OrderedDict:
                     # merge dictionaries
                     self._update_node(base_value, olay_value)
-                elif _is_array_prop(olay_value) and _is_array_prop(base_value):
+                elif type(olay_value) is list and type(base_value) is list:
                     # append extension array items to base items
                     base_value += olay_value
                 else:
@@ -2535,70 +1476,73 @@ class _YamlConfigParser:
             else:
                 base_node[olay_key] = olay_value
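The `_update_node()` method above defines the overlay-merge semantics used by both inclusion processing and field type inheritance: nested dictionaries merge recursively, lists concatenate, and any other overlay value replaces the base value. A standalone sketch of the same rules (property names below are invented for illustration; the real parser operates on `collections.OrderedDict` nodes):

```python
def update_node(base_node, overlay_node):
    # Merge `overlay_node` into `base_node` in place: nested
    # dictionaries merge recursively, lists are concatenated, and any
    # other overlay value simply replaces the base value.
    for key, olay_value in overlay_node.items():
        if key in base_node:
            base_value = base_node[key]

            if isinstance(olay_value, dict) and isinstance(base_value, dict):
                # merge dictionaries
                update_node(base_value, olay_value)
            elif isinstance(olay_value, list) and isinstance(base_value, list):
                # append overlay array items to base items
                base_value += olay_value
            else:
                # overlay value overrides base value
                base_node[key] = olay_value
        else:
            base_node[key] = olay_value


base = {'trace': {'byte-order': 'le'}, 'dirs': ['a']}
overlay = {'trace': {'uuid': 'auto'}, 'dirs': ['b'], 'version': '2.2'}
update_node(base, overlay)
# `base` now holds the merged node.
```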
 
-    def _process_node_include(self, last_overlay_node, name,
+    def _process_node_include(self, last_overlay_node,
                               process_base_include_cb,
                               process_children_include_cb=None):
-        if not _is_assoc_array_prop(last_overlay_node):
-            raise ConfigParseError('"$include" property',
-                                   '{} objects must be associative arrays'.format(name))
-
         # process children inclusions first
-        if process_children_include_cb:
+        if process_children_include_cb is not None:
             process_children_include_cb(last_overlay_node)
 
-        if '$include' in last_overlay_node:
-            include_node = last_overlay_node['$include']
+        incl_prop_name = '$include'
+
+        if incl_prop_name in last_overlay_node:
+            include_node = last_overlay_node[incl_prop_name]
         else:
-            # no includes!
+            # no inclusions!
             return last_overlay_node
 
         include_paths = self._get_include_paths(include_node)
         cur_base_path = self._get_last_include_file()
         base_node = None
 
-        # keep the include paths and remove the include property
+        # keep the inclusion paths and remove the `$include` property
         include_paths = copy.deepcopy(include_paths)
-        del last_overlay_node['$include']
+        del last_overlay_node[incl_prop_name]
 
         for include_path in include_paths:
             # load raw YAML from included file
             overlay_node = self._load_include(include_path)
 
             if overlay_node is None:
-                # cannot find include file, but we're ignoring those
-                # errors, otherwise _load_include() itself raises
-                # a config error
+                # Cannot find inclusion file, but we're ignoring those
+                # errors, otherwise _load_include() itself raises a
+                # config error.
                 continue
 
-            # recursively process includes
+            # recursively process inclusions
             try:
                 overlay_node = process_base_include_cb(overlay_node)
             except ConfigParseError as exc:
                 _append_error_ctx(exc, 'In "{}"'.format(cur_base_path))
 
-            # pop include stack now that we're done including
+            # pop inclusion stack now that we're done including
             del self._include_stack[-1]
 
-            # at this point, base_node is fully resolved (does not
-            # contain any include property)
+            # At this point, `base_node` is fully resolved (does not
+            # contain any `$include` property).
             if base_node is None:
                 base_node = overlay_node
             else:
                 self._update_node(base_node, overlay_node)
 
-        # finally, we update the latest base node with our last overlay
-        # node
+        # Finally, update the latest base node with our last overlay
+        # node.
         if base_node is None:
-            # nothing was included, which is possible when we're
-            # ignoring include errors
+            # Nothing was included, which is possible when we're
+            # ignoring inclusion errors.
             return last_overlay_node
 
         self._update_node(base_node, last_overlay_node)
-
         return base_node
 
     def _process_event_include(self, event_node):
-        return self._process_node_include(event_node, 'event',
+        # Make sure the event object is valid for the inclusion
+        # processing stage.
+        self._schema_validator.validate(event_node,
+                                        '2/config/event-pre-include')
+
+        # process inclusions
+        return self._process_node_include(event_node,
                                           self._process_event_include)
 
     def _process_stream_include(self, stream_node):
@@ -2606,31 +1550,37 @@ class _YamlConfigParser:
             if 'events' in stream_node:
                 events_node = stream_node['events']
 
-                if not _is_assoc_array_prop(events_node):
-                    raise ConfigParseError('"$include" property',
-                                           '"events" property must be an associative array')
-
-                events_node_keys = list(events_node.keys())
+                for key in list(events_node):
+                    events_node[key] = self._process_event_include(events_node[key])
 
-                for key in events_node_keys:
-                    event_node = events_node[key]
+        # Make sure the stream object is valid for the inclusion
+        # processing stage.
+        self._schema_validator.validate(stream_node,
+                                        '2/config/stream-pre-include')
 
-                    try:
-                        events_node[key] = self._process_event_include(event_node)
-                    except ConfigParseError as exc:
-                        _append_error_ctx(exc, '"$include" property',
-                                          'Cannot process includes of event object "{}"'.format(key))
-
-        return self._process_node_include(stream_node, 'stream',
+        # process inclusions
+        return self._process_node_include(stream_node,
                                           self._process_stream_include,
                                           process_children_include)
 
     def _process_trace_include(self, trace_node):
-        return self._process_node_include(trace_node, 'trace',
+        # Make sure the trace object is valid for the inclusion
+        # processing stage.
+        self._schema_validator.validate(trace_node,
+                                        '2/config/trace-pre-include')
+
+        # process inclusions
+        return self._process_node_include(trace_node,
                                           self._process_trace_include)
 
     def _process_clock_include(self, clock_node):
-        return self._process_node_include(clock_node, 'clock',
+        # Make sure the clock object is valid for the inclusion
+        # processing stage.
+        self._schema_validator.validate(clock_node,
+                                        '2/config/clock-pre-include')
+
+        # process inclusions
+        return self._process_node_include(clock_node,
                                           self._process_clock_include)
 
     def _process_metadata_include(self, metadata_node):
@@ -2641,64 +1591,256 @@ class _YamlConfigParser:
             if 'clocks' in metadata_node:
                 clocks_node = metadata_node['clocks']
 
-                if not _is_assoc_array_prop(clocks_node):
-                    raise ConfigParseError('"$include" property',
-                                           '"clocks" property must be an associative array')
-
-                clocks_node_keys = list(clocks_node.keys())
-
-                for key in clocks_node_keys:
-                    clock_node = clocks_node[key]
-
-                    try:
-                        clocks_node[key] = self._process_clock_include(clock_node)
-                    except ConfigParseError as exc:
-                        _append_error_ctx(exc, '"$include" property',
-                                          'Cannot process includes of clock object "{}"'.format(key))
+                for key in list(clocks_node):
+                    clocks_node[key] = self._process_clock_include(clocks_node[key])
 
             if 'streams' in metadata_node:
                 streams_node = metadata_node['streams']
 
-                if not _is_assoc_array_prop(streams_node):
-                    raise ConfigParseError('"$include" property',
-                                           '"streams" property must be an associative array')
-
-                streams_node_keys = list(streams_node.keys())
+                for key in list(streams_node):
+                    streams_node[key] = self._process_stream_include(streams_node[key])
 
-                for key in streams_node_keys:
-                    stream_node = streams_node[key]
+        # Make sure the metadata object is valid for the inclusion
+        # processing stage.
+        self._schema_validator.validate(metadata_node,
+                                        '2/config/metadata-pre-include')
 
-                    try:
-                        streams_node[key] = self._process_stream_include(stream_node)
-                    except ConfigParseError as exc:
-                        _append_error_ctx(exc, '"$include" property',
-                                          'Cannot process includes of stream object "{}"'.format(key))
-
-        return self._process_node_include(metadata_node, 'metadata',
+        # process inclusions
+        return self._process_node_include(metadata_node,
                                           self._process_metadata_include,
                                           process_children_include)
 
-    def _process_root_includes(self, root):
-        # The following config objects support includes:
+    def _process_config_includes(self, config_node):
+        # Process inclusions in this order:
+        #
+        # 1. Clock objects, event objects, and the trace object (the
+        #    order among those is not important).
+        #
+        # 2. Stream objects.
+        #
+        # 3. Metadata object.
         #
-        #   * Metadata object
-        #   * Trace object
-        #   * Stream object
-        #   * Event object
+        # This is because:
         #
-        # We need to process the event includes first, then the stream
-        # includes, then the trace includes, and finally the metadata
-        # includes.
+        # * A metadata object can include clock objects, a trace object,
+        #   stream objects, and event objects (indirectly).
         #
-        # In each object, only one of the $include and $include-replace
-        # special properties is allowed.
+        # * A stream object can include event objects.
         #
-        # We keep a stack of absolute paths to included files to detect
-        # recursion.
-        if 'metadata' in root:
-            root['metadata'] = self._process_metadata_include(root['metadata'])
+        # We keep a stack of absolute paths to included files
+        # (`self._include_stack`) to detect recursion.
+        #
+        # First, make sure the configuration object itself is valid for
+        # the inclusion processing stage.
+        self._schema_validator.validate(config_node,
+                                        '2/config/config-pre-include')
+
+        # Process metadata object inclusions.
+        #
+        # self._process_metadata_include() returns a new (or the same)
+        # metadata object without any `$include` property in it,
+        # recursively.
+        config_node['metadata'] = self._process_metadata_include(config_node['metadata'])
+
+        return config_node
 
-        return root
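To illustrate what `_process_config_includes()` and its helpers resolve, here is a hypothetical pair of YAML fragments (file and property names invented). After inclusion processing, the `$include` property is gone and the included file's properties are merged beneath the including object's own properties:

```yaml
# stream-base.yaml (hypothetical inclusion file, found in one of the
# configured include directories)
packet-context-type:
  class: struct
  fields:
    timestamp_begin: uint64

# Main configuration (excerpt): this stream object includes
# `stream-base.yaml`; its own properties overlay the included ones.
streams:
  default:
    $include:
      - stream-base.yaml
    events:
      my_event:
        payload-type: my_struct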
+    def _expand_field_type_aliases(self, metadata_node, type_aliases_node):
+        def resolve_field_type_aliases(parent_node, key, from_descr,
+                                       alias_set=None):
+            if key not in parent_node:
+                return
+
+            # This set holds all the aliases we need to expand,
+            # recursively. This is used to detect cycles.
+            if alias_set is None:
+                alias_set = set()
+
+            node = parent_node[key]
+
+            if node is None:
+                return
+
+            if type(node) is str:
+                alias = node
+
+                if alias not in resolved_aliases:
+                    # Only check for a field type alias cycle when we
+                    # didn't resolve the alias yet, as a given node can
+                    # refer to the same field type alias more than once.
+                    if alias in alias_set:
+                        fmt = 'Cycle detected during the resolution of type alias "{}"'
+                        raise ConfigParseError(from_descr, fmt.format(alias))
+
+                    # try to load field type alias node named `alias`
+                    if alias not in type_aliases_node:
+                        raise ConfigParseError(from_descr,
+                                               'Type alias "{}" does not exist'.format(alias))
+
+                    # resolve it
+                    alias_set.add(alias)
+                    resolve_field_type_aliases(type_aliases_node, alias,
+                                               from_descr, alias_set)
+                    resolved_aliases.add(alias)
+
+                parent_node[key] = copy.deepcopy(type_aliases_node[node])
+                return
+
+            # traverse, resolving field type aliases as needed
+            for pkey in ['$inherit', 'inherit', 'value-type', 'element-type']:
+                resolve_field_type_aliases(node, pkey, from_descr, alias_set)
+
+            # structure field type fields
+            pkey = 'fields'
+
+            if pkey in node:
+                assert type(node[pkey]) is collections.OrderedDict
+
+                for field_name in node[pkey]:
+                    resolve_field_type_aliases(node[pkey], field_name,
+                                               from_descr, alias_set)
+
+        def resolve_field_type_aliases_from(parent_node, key, parent_node_type_name,
+                                            parent_node_name=None):
+            from_descr = '"{}" property of {}'.format(key,
+                                                      parent_node_type_name)
+
+            if parent_node_name is not None:
+                from_descr += ' "{}"'.format(parent_node_name)
+
+            resolve_field_type_aliases(parent_node, key, from_descr)
+
+        # set of resolved field type aliases
+        resolved_aliases = set()
+
+        # expand field type aliases within trace, streams, and events now
+        resolve_field_type_aliases_from(metadata_node['trace'],
+                                        'packet-header-type', 'trace')
+
+        for stream_name, stream in metadata_node['streams'].items():
+            resolve_field_type_aliases_from(stream, 'packet-context-type',
+                                            'stream', stream_name)
+            resolve_field_type_aliases_from(stream, 'event-header-type',
+                                            'stream', stream_name)
+            resolve_field_type_aliases_from(stream, 'event-context-type',
+                                            'stream', stream_name)
+
+            try:
+                for event_name, event in stream['events'].items():
+                    resolve_field_type_aliases_from(event, 'context-type', 'event',
+                                                    event_name)
+                    resolve_field_type_aliases_from(event, 'payload-type', 'event',
+                                                    event_name)
+            except ConfigParseError as exc:
+                _append_error_ctx(exc, 'Stream "{}"'.format(stream_name))
+
+        # we don't need the `type-aliases` node anymore
+        del metadata_node['type-aliases']
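The alias-resolution pass above uses a per-traversal set of visited names to detect reference cycles. A minimal standalone sketch of that idea (the alias table below is invented):

```python
def resolve_alias(aliases, name, seen=None):
    # Resolve the alias `name` within the `aliases` mapping, following
    # string references until a concrete (dict) definition is found.
    # `seen` accumulates the names visited so far to detect cycles.
    if seen is None:
        seen = set()

    if name in seen:
        raise ValueError('Cycle detected during the resolution of '
                         'type alias "{}"'.format(name))

    seen.add(name)
    node = aliases[name]

    if isinstance(node, str):
        # the alias refers to another alias: keep resolving
        return resolve_alias(aliases, node, seen)

    return node


aliases = {
    'byte': {'class': 'int', 'size': 8},
    'octet': 'byte',        # alias of an alias: resolves transitively
    'loop-a': 'loop-b',     # cycle: raises during resolution
    'loop-b': 'loop-a',
}
```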
+
+    def _expand_field_type_inheritance(self, metadata_node):
+        def apply_inheritance(parent_node, key):
+            if key not in parent_node:
+                return
+
+            node = parent_node[key]
+
+            if node is None:
+                return
+
+            # process children first
+            for pkey in ['$inherit', 'inherit', 'value-type', 'element-type']:
+                apply_inheritance(node, pkey)
+
+            # structure field type fields
+            pkey = 'fields'
+
+            if pkey in node:
+                assert type(node[pkey]) is collections.OrderedDict
+
+                for field_name in node[pkey]:
+                    apply_inheritance(node[pkey], field_name)
+
+            # apply inheritance of this node
+            if 'inherit' in node:
+                # barectf 2.1: `inherit` property was renamed to `$inherit`
+                assert '$inherit' not in node
+                node['$inherit'] = node['inherit']
+                del node['inherit']
+
+            inherit_key = '$inherit'
+
+            if inherit_key in node:
+                assert type(node[inherit_key]) is collections.OrderedDict
+
+                # apply inheritance below
+                apply_inheritance(node, inherit_key)
+
+                # `node` is an overlay on the `$inherit` node
+                base_node = node[inherit_key]
+                del node[inherit_key]
+                self._update_node(base_node, node)
+
+                # set updated base node as this node
+                parent_node[key] = base_node
+
+        apply_inheritance(metadata_node['trace'], 'packet-header-type')
+
+        for stream in metadata_node['streams'].values():
+            apply_inheritance(stream, 'packet-context-type')
+            apply_inheritance(stream, 'event-header-type')
+            apply_inheritance(stream, 'event-context-type')
+
+            for event in stream['events'].values():
+                apply_inheritance(event, 'context-type')
+                apply_inheritance(event, 'payload-type')
+
+    def _expand_field_types(self, metadata_node):
+        type_aliases_node = metadata_node.get('type-aliases')
+
+        if type_aliases_node is None:
+            # If there's no `type-aliases` node, then there are no
+            # field type aliases and therefore no possible inheritance.
+            return
+
+        # first, expand field type aliases
+        self._expand_field_type_aliases(metadata_node, type_aliases_node)
+
+        # next, apply inheritance to create effective field types
+        self._expand_field_type_inheritance(metadata_node)
+
+    def _expand_log_levels(self, metadata_node):
+        if 'log-levels' in metadata_node:
+            # barectf 2.1: `log-levels` property was renamed to `$log-levels`
+            assert '$log-levels' not in metadata_node
+            metadata_node['$log-levels'] = metadata_node['log-levels']
+            del metadata_node['log-levels']
+
+        log_levels_key = '$log-levels'
+        log_levels_node = metadata_node.get(log_levels_key)
+
+        if log_levels_node is None:
+            # no log level aliases
+            return
+
+        # not needed anymore
+        del metadata_node[log_levels_key]
+
+        for stream_name, stream in metadata_node['streams'].items():
+            try:
+                for event_name, event in stream['events'].items():
+                    prop_name = 'log-level'
+                    ll_node = event.get(prop_name)
+
+                    if ll_node is None:
+                        continue
+
+                    if type(ll_node) is str:
+                        if ll_node not in log_levels_node:
+                            raise ConfigParseError('Event "{}"'.format(event_name),
+                                                   'Log level "{}" does not exist'.format(ll_node))
+
+                        event[prop_name] = log_levels_node[ll_node]
+            except ConfigParseError as exc:
+                _append_error_ctx(exc, 'Stream "{}"'.format(stream_name))
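The log level expansion above replaces string log levels with the numeric values defined by the `$log-levels` mapping; numeric values pass through unchanged. A standalone sketch of the same transformation (level names and values below are invented):

```python
def expand_log_levels(log_levels, events):
    # Replace each event's string `log-level` property with its numeric
    # value from the `log_levels` mapping; numeric values are left as
    # is, and events without a log level are skipped.
    for name, event in events.items():
        ll = event.get('log-level')

        if isinstance(ll, str):
            if ll not in log_levels:
                raise ValueError('Event "{}": log level "{}" does not '
                                 'exist'.format(name, ll))

            event['log-level'] = log_levels[ll]


log_levels = {'WARNING': 4, 'INFO': 6}
events = {
    'ev_a': {'log-level': 'INFO'},  # string: replaced with 6
    'ev_b': {'log-level': 2},       # already numeric: left untouched
    'ev_c': {},                     # no log level: skipped
}
expand_log_levels(log_levels, events)
```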
 
     def _yaml_ordered_dump(self, node, **kwds):
         class ODumper(yaml.Dumper):
@@ -2710,6 +1852,7 @@ class _YamlConfigParser:
 
         ODumper.add_representer(collections.OrderedDict, dict_representer)
 
+        # Python -> YAML
         return yaml.dump(node, Dumper=ODumper, **kwds)
 
     def _yaml_ordered_load(self, yaml_path):
@@ -2733,10 +1876,10 @@ class _YamlConfigParser:
                                    'Cannot open file "{}"'.format(yaml_path))
         except ConfigParseError as exc:
             _append_error_ctx(exc, 'Configuration',
-                              'Unknown error while trying to load file "{}"'.format(yaml_path))
+                                   'Unknown error while trying to load file "{}"'.format(yaml_path))
 
          # loaded node must be an associative array
-        if not _is_assoc_array_prop(node):
+        if type(node) is not collections.OrderedDict:
             raise ConfigParseError('Configuration',
                                    'Root of YAML file "{}" must be an associative array'.format(yaml_path))
 
@@ -2750,54 +1893,80 @@ class _YamlConfigParser:
         self._reset()
         self._root_yaml_path = yaml_path
 
+        # load the configuration object as is from the root YAML file
         try:
-            root = self._yaml_ordered_load(yaml_path)
+            config_node = self._yaml_ordered_load(yaml_path)
         except ConfigParseError as exc:
             _append_error_ctx(exc, 'Configuration',
-                              'Cannot parse YAML file "{}"'.format(yaml_path))
+                                   'Cannot parse YAML file "{}"'.format(yaml_path))
 
-        if not _is_assoc_array_prop(root):
-            raise ConfigParseError('Configuration',
-                                   'Must be an associative array')
-
-        # get the config version
-        self._version = self._get_version(root)
-
-        known_props = [
-            'version',
-            'prefix',
-            'metadata',
-        ]
+        # Make sure the configuration object is minimally valid, that
+        # is, it contains a valid `version` property.
+        #
+        # This step does not validate the whole configuration object
+        # yet because we don't have an effective configuration object;
+        # we still need to:
+        #
+        # * Process inclusions.
+        # * Expand field types (inheritance and aliases).
+        self._schema_validator.validate(config_node, 'config/config-min')
 
-        if self._version >= 202:
-            known_props.append('options')
+        # Process configuration object inclusions.
+        #
+        # self._process_config_includes() returns a new (or the same)
+        # configuration object without any `$include` property in it,
+        # recursively.
+        config_node = self._process_config_includes(config_node)
 
-        unk_prop = _get_first_unknown_prop(root, known_props)
+        # Make sure that the current configuration object is valid
+        # considering field types are not expanded yet.
+        self._schema_validator.validate(config_node,
+                                        '2/config/config-pre-field-type-expansion')
 
-        if unk_prop:
-            add = ''
+        # Expand field types.
+        #
+        # This process:
+        #
+        # 1. Replaces field type aliases with "effective" field
+        #    types, recursively.
+        #
+        #    After this step, the `type-aliases` property of the
+        #    `metadata` node is gone.
+        #
+        # 2. Applies inheritance following the `$inherit`/`inherit`
+        #    properties.
+        #
+        #    After this step, field type objects do not contain
+        #    `$inherit` or `inherit` properties.
+        #
+        # This is done blindly, in that the process _doesn't_ validate
+        # field type objects at this point.
+        self._expand_field_types(config_node['metadata'])
 
-            if unk_prop == 'options':
-                add = ' (use version 2.2 or greater)'
+        # Make sure that the current configuration object is valid
+        # considering log levels are not expanded yet.
+        self._schema_validator.validate(config_node,
+                                        '2/config/config-pre-log-level-expansion')
 
-            raise ConfigParseError('Configuration',
-                                   'Unknown property{}: "{}"'.format(add, unk_prop))
+        # Expand log levels, that is, replace log level strings with
+        # their equivalent numeric values.
+        self._expand_log_levels(config_node['metadata'])
 
-        # process includes if supported
-        if self._version >= 201:
-            root = self._process_root_includes(root)
+        # validate the whole, effective configuration object
+        self._schema_validator.validate(config_node, '2/config/config')
 
         # dump config if required
         if self._dump_config:
-            print(self._yaml_ordered_dump(root, indent=2,
+            print(self._yaml_ordered_dump(config_node, indent=2,
                                           default_flow_style=False))
 
-        # get prefix and metadata
-        prefix = self._get_prefix(root)
-        meta = self._create_metadata(root)
-        opts = self._get_options(root)
+        # get prefix, options, and metadata pseudo-object
+        prefix = self._get_prefix(config_node)
+        opts = self._get_options(config_node)
+        pseudo_meta = self._create_metadata(config_node)
 
-        return config.Config(meta.to_public(), prefix, opts)
+        # create public configuration
+        return config.Config(pseudo_meta.to_public(), prefix, opts)
 
 
 def _from_file(path, include_dirs, ignore_include_not_found, dump_config):
@@ -2807,4 +1976,4 @@ def _from_file(path, include_dirs, ignore_include_not_found, dump_config):
         return parser.parse(path)
     except ConfigParseError as exc:
         _append_error_ctx(exc, 'Configuration',
-                          'Cannot create configuration from YAML file "{}"'.format(path))
+                               'Cannot create configuration from YAML file "{}"'.format(path))
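As described in the commit message, each `_schema_validator.validate()` call above resolves schema IDs against a local store built from the package's `schemas` directory. A sketch of how such a validator might be assembled with the `jsonschema` package (the store contents and IDs below are invented; the real schemas live under `barectf/schemas` and use `https://barectf.org/schemas/...` IDs):

```python
import jsonschema

# Toy schema store keyed by `$id`: the real code loads every YAML file
# under the package's `schemas` directory and registers each schema
# under its own `$id` property.
store = {
    'https://example.org/schemas/config-min.json': {
        '$schema': 'http://json-schema.org/draft-07/schema#',
        '$id': 'https://example.org/schemas/config-min.json',
        'type': 'object',
        'properties': {'version': {'enum': ['2.0', '2.1', '2.2']}},
        'required': ['version'],
    },
}


def validate(instance, schema_id):
    # Build a reference resolver backed by the local store so that
    # `$ref` properties resolve without any network access, then
    # validate `instance` against the requested schema.
    schema = store[schema_id]
    resolver = jsonschema.RefResolver(base_uri=schema_id, referrer=schema,
                                      store=store)
    validator = jsonschema.Draft7Validator(schema, resolver=resolver)
    validator.validate(instance)


validate({'version': '2.2'}, 'https://example.org/schemas/config-min.json')
```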
diff --git a/barectf/schemas/2/config/byte-order-prop.yaml b/barectf/schemas/2/config/byte-order-prop.yaml
new file mode 100644 (file)
index 0000000..27c5fdd
--- /dev/null
@@ -0,0 +1,30 @@
+# The MIT License (MIT)
+#
+# Copyright (c) 2020 Philippe Proulx <pproulx@efficios.com>
+#
+# Permission is hereby granted, free of charge, to any person obtaining
+# a copy of this software and associated documentation files (the
+# "Software"), to deal in the Software without restriction, including
+# without limitation the rights to use, copy, modify, merge, publish,
+# distribute, sublicense, and/or sell copies of the Software, and to
+# permit persons to whom the Software is furnished to do so, subject to
+# the following conditions:
+#
+# The above copyright notice and this permission notice shall be
+# included in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
+# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
+# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+$schema: http://json-schema.org/draft-07/schema#
+$id: https://barectf.org/schemas/2/config/byte-order-prop.json
+title: Byte order property value
+type: string
+enum:
+  - le
+  - be
diff --git a/barectf/schemas/2/config/clock-pre-include.yaml b/barectf/schemas/2/config/clock-pre-include.yaml
new file mode 100644 (file)
index 0000000..d99ab28
--- /dev/null
@@ -0,0 +1,30 @@
+# The MIT License (MIT)
+#
+# Copyright (c) 2020 Philippe Proulx <pproulx@efficios.com>
+#
+# Permission is hereby granted, free of charge, to any person obtaining
+# a copy of this software and associated documentation files (the
+# "Software"), to deal in the Software without restriction, including
+# without limitation the rights to use, copy, modify, merge, publish,
+# distribute, sublicense, and/or sell copies of the Software, and to
+# permit persons to whom the Software is furnished to do so, subject to
+# the following conditions:
+#
+# The above copyright notice and this permission notice shall be
+# included in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
+# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
+# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+$schema: http://json-schema.org/draft-07/schema#
+$id: https://barectf.org/schemas/2/config/clock-pre-include.json
+title: Clock object before inclusions
+type: object
+properties:
+  $include:
+    $ref: https://barectf.org/schemas/2/config/include-prop.json
diff --git a/barectf/schemas/2/config/config-pre-field-type-expansion.yaml b/barectf/schemas/2/config/config-pre-field-type-expansion.yaml
new file mode 100644 (file)
index 0000000..756139a
--- /dev/null
@@ -0,0 +1,109 @@
+# The MIT License (MIT)
+#
+# Copyright (c) 2020 Philippe Proulx <pproulx@efficios.com>
+#
+# Permission is hereby granted, free of charge, to any person obtaining
+# a copy of this software and associated documentation files (the
+# "Software"), to deal in the Software without restriction, including
+# without limitation the rights to use, copy, modify, merge, publish,
+# distribute, sublicense, and/or sell copies of the Software, and to
+# permit persons to whom the Software is furnished to do so, subject to
+# the following conditions:
+#
+# The above copyright notice and this permission notice shall be
+# included in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
+# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
+# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+$schema: http://json-schema.org/draft-07/schema#
+$id: https://barectf.org/schemas/2/config/config-pre-field-type-expansion.json
+title: Configuration object before field type expansions
+definitions:
+  partial-field-type:
+    title: Partial field type object
+    oneOf:
+      - type: string
+      - type: object
+        allOf:
+          - oneOf:
+            - properties:
+                class:
+                  type: string
+              required:
+                - class
+            - properties:
+                inherit:
+                  type: string
+              required:
+                - inherit
+            - properties:
+                $inherit:
+                  type: string
+              required:
+                - $inherit
+          - properties:
+              value-type:
+                $ref: '#/definitions/partial-field-type'
+              element-type:
+                $ref: '#/definitions/partial-field-type'
+              fields:
+                type: object
+                patternProperties:
+                  '':
+                    $ref: '#/definitions/partial-field-type'
+      - type: 'null'
+type: object
+properties:
+  metadata:
+    title: Metadata object before field type expansions
+    type: object
+    properties:
+      type-aliases:
+        title: Type aliases object before field type expansions
+        type: object
+        patternProperties:
+          '':
+            $ref: '#/definitions/partial-field-type'
+      trace:
+        title: Trace object before field type expansions
+        type: object
+        properties:
+          packet-header-type:
+            $ref: '#/definitions/partial-field-type'
+      streams:
+        title: Streams object before field type expansions
+        type: object
+        patternProperties:
+          '':
+            title: Stream object before field type expansions
+            type: object
+            properties:
+              packet-context-type:
+                $ref: '#/definitions/partial-field-type'
+              event-header-type:
+                $ref: '#/definitions/partial-field-type'
+              event-context-type:
+                $ref: '#/definitions/partial-field-type'
+              events:
+                type: object
+                patternProperties:
+                  '':
+                    type: object
+                    properties:
+                      context-type:
+                        $ref: '#/definitions/partial-field-type'
+                      payload-type:
+                        $ref: '#/definitions/partial-field-type'
+            required:
+              - events
+    required:
+      - trace
+      - streams
+required:
+  - metadata
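For reference, the partial field type definition above admits each of the following forms in a raw (pre-expansion) configuration; the alias and property names below are hypothetical:

```yaml
# A field type alias name (plain string)
payload-type: my-int-alias

# An object using `$inherit` (the `inherit` and `class` forms are also
# accepted, but the three properties are mutually exclusive per the oneOf)
payload-type:
  $inherit: base-payload
  fields:
    msg: string-alias   # nested partial field types validate recursively

# An explicitly absent field type
payload-type: null
```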
diff --git a/barectf/schemas/2/config/config-pre-include.yaml b/barectf/schemas/2/config/config-pre-include.yaml
new file mode 100644 (file)
index 0000000..afec6e1
--- /dev/null
@@ -0,0 +1,32 @@
+# The MIT License (MIT)
+#
+# Copyright (c) 2020 Philippe Proulx <pproulx@efficios.com>
+#
+# Permission is hereby granted, free of charge, to any person obtaining
+# a copy of this software and associated documentation files (the
+# "Software"), to deal in the Software without restriction, including
+# without limitation the rights to use, copy, modify, merge, publish,
+# distribute, sublicense, and/or sell copies of the Software, and to
+# permit persons to whom the Software is furnished to do so, subject to
+# the following conditions:
+#
+# The above copyright notice and this permission notice shall be
+# included in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
+# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
+# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+$schema: http://json-schema.org/draft-07/schema#
+$id: https://barectf.org/schemas/2/config/config-pre-include.json
+title: Configuration object before inclusions
+type: object
+properties:
+  metadata:
+    $ref: https://barectf.org/schemas/2/config/metadata-pre-include.json
+required:
+  - metadata
diff --git a/barectf/schemas/2/config/config-pre-log-level-expansion.yaml b/barectf/schemas/2/config/config-pre-log-level-expansion.yaml
new file mode 100644 (file)
index 0000000..5d4734a
--- /dev/null
@@ -0,0 +1,84 @@
+# The MIT License (MIT)
+#
+# Copyright (c) 2020 Philippe Proulx <pproulx@efficios.com>
+#
+# Permission is hereby granted, free of charge, to any person obtaining
+# a copy of this software and associated documentation files (the
+# "Software"), to deal in the Software without restriction, including
+# without limitation the rights to use, copy, modify, merge, publish,
+# distribute, sublicense, and/or sell copies of the Software, and to
+# permit persons to whom the Software is furnished to do so, subject to
+# the following conditions:
+#
+# The above copyright notice and this permission notice shall be
+# included in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
+# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
+# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+$schema: http://json-schema.org/draft-07/schema#
+$id: https://barectf.org/schemas/2/config/config-pre-log-level-expansion.json
+title: Configuration object before log level expansions
+definitions:
+  opt-log-levels:
+    title: Log level values object
+    oneOf:
+      - type: object
+        patternProperties:
+          '':
+            type: integer
+            minimum: 0
+      - type: 'null'
+type: object
+properties:
+  metadata:
+    title: Metadata object before log level expansions
+    type: object
+    oneOf:
+      - required:
+          - $log-levels
+      - required:
+          - log-levels
+      - allOf:
+          - not:
+              required:
+                - $log-levels
+          - not:
+              required:
+                - log-levels
+    properties:
+      $log-levels:
+        $ref: '#/definitions/opt-log-levels'
+      log-levels:
+        $ref: '#/definitions/opt-log-levels'
+      streams:
+        title: Streams object before log level expansions
+        type: object
+        patternProperties:
+          '':
+            title: Stream object before log level expansions
+            type: object
+            properties:
+              events:
+                type: object
+                patternProperties:
+                  '':
+                    type: object
+                    properties:
+                      log-level:
+                        oneOf:
+                          - type: string
+                          - type: integer
+                            minimum: 0
+                          - type: 'null'
+            required:
+              - events
+    required:
+      - streams
+required:
+  - metadata
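To illustrate, the following raw metadata object (hypothetical names) satisfies this schema: string log level names are still allowed at this stage and are only resolved to integers during the expansion step.

```yaml
metadata:
  $log-levels:             # `log-levels` is also accepted, but not both
    WARNING: 4
    ERROR: 3
  streams:
    default:
      events:
        my_event:
          log-level: WARNING   # string alias; expanded to 4 later
```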
diff --git a/barectf/schemas/2/config/config.yaml b/barectf/schemas/2/config/config.yaml
new file mode 100644 (file)
index 0000000..0f07e44
--- /dev/null
@@ -0,0 +1,230 @@
+# The MIT License (MIT)
+#
+# Copyright (c) 2020 Philippe Proulx <pproulx@efficios.com>
+#
+# Permission is hereby granted, free of charge, to any person obtaining
+# a copy of this software and associated documentation files (the
+# "Software"), to deal in the Software without restriction, including
+# without limitation the rights to use, copy, modify, merge, publish,
+# distribute, sublicense, and/or sell copies of the Software, and to
+# permit persons to whom the Software is furnished to do so, subject to
+# the following conditions:
+#
+# The above copyright notice and this permission notice shall be
+# included in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
+# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
+# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+$schema: http://json-schema.org/draft-07/schema#
+$id: https://barectf.org/schemas/2/config/config.json
+title: Effective configuration object
+definitions:
+  opt-bool:
+    oneOf:
+      - type: boolean
+      - type: 'null'
+  opt-string:
+    oneOf:
+      - type: string
+      - type: 'null'
+  opt-int-min-0:
+    oneOf:
+      - type: integer
+        minimum: 0
+      - type: 'null'
+  opt-field-type:
+    oneOf:
+      - $ref: https://barectf.org/schemas/2/config/field-type.json
+      - type: 'null'
+  opt-struct-field-type:
+    oneOf:
+      - $ref: https://barectf.org/schemas/2/config/field-type.json#/definitions/struct-field-type
+      - type: 'null'
+  trace:
+    title: Trace object
+    type: object
+    properties:
+      byte-order:
+        $ref: https://barectf.org/schemas/2/config/byte-order-prop.json
+      uuid:
+        oneOf:
+          - $ref: https://barectf.org/schemas/2/config/uuid-prop.json
+          - type: string
+            const: auto
+          - type: 'null'
+      packet-header-type:
+        $ref: '#/definitions/opt-struct-field-type'
+    required:
+      - byte-order
+    additionalProperties: false
+  clock:
+    title: Clock object
+    type: object
+    oneOf:
+      - required:
+          - $return-ctype
+      - required:
+          - return-ctype
+      - allOf:
+          - not:
+              required:
+                - $return-ctype
+          - not:
+              required:
+                - return-ctype
+    properties:
+      uuid:
+        oneOf:
+          - $ref: https://barectf.org/schemas/2/config/uuid-prop.json
+          - type: 'null'
+      description:
+        $ref: '#/definitions/opt-string'
+      freq:
+        oneOf:
+          - type: integer
+            minimum: 1
+          - type: 'null'
+      error-cycles:
+        $ref: '#/definitions/opt-int-min-0'
+      offset:
+        oneOf:
+          - type: object
+            properties:
+              cycles:
+                $ref: '#/definitions/opt-int-min-0'
+              seconds:
+                $ref: '#/definitions/opt-int-min-0'
+            additionalProperties: false
+          - type: 'null'
+      absolute:
+        $ref: '#/definitions/opt-bool'
+      return-ctype:
+        $ref: '#/definitions/opt-string'
+      $return-ctype:
+        $ref: '#/definitions/opt-string'
+    additionalProperties: false
+  $default-stream:
+    oneOf:
+      - type: string
+        pattern: '^[A-Za-z_][A-Za-z0-9_]*$'
+      - type: 'null'
+  stream:
+    title: Stream object
+    type: object
+    properties:
+      $default:
+        $ref: '#/definitions/opt-bool'
+      packet-context-type:
+        $ref: '#/definitions/opt-struct-field-type'
+      event-header-type:
+        $ref: '#/definitions/opt-struct-field-type'
+      event-context-type:
+        $ref: '#/definitions/opt-struct-field-type'
+      events:
+        title: Events object
+        type: object
+        patternProperties:
+          '^[A-Za-z_][A-Za-z0-9_]*$':
+            $ref: '#/definitions/event'
+        additionalProperties: false
+        minProperties: 1
+    required:
+      - events
+    additionalProperties: false
+  event:
+    title: Event object
+    type: object
+    properties:
+      log-level:
+        $ref: '#/definitions/opt-int-min-0'
+      context-type:
+        $ref: '#/definitions/opt-struct-field-type'
+      payload-type:
+        $ref: '#/definitions/opt-struct-field-type'
+    additionalProperties: false
+type: object
+properties:
+  version:
+    type: string
+    enum:
+      - '2.0'
+      - '2.1'
+      - '2.2'
+  prefix:
+    type: string
+    allOf:
+      - pattern: '^[A-Za-z_][A-Za-z0-9_]*$'
+      - not:
+          enum:
+            - align
+            - callsite
+            - clock
+            - enum
+            - env
+            - event
+            - floating_point
+            - integer
+            - stream
+            - string
+            - struct
+            - trace
+            - typealias
+            - typedef
+            - variant
+  options:
+    title: Configuration options object
+    type: object
+    properties:
+      gen-prefix-def:
+        type: boolean
+      gen-default-stream-def:
+        type: boolean
+    additionalProperties: false
+  metadata:
+    title: Metadata object
+    type: object
+    properties:
+      trace:
+        $ref: '#/definitions/trace'
+      env:
+        title: Environment variables
+        oneOf:
+          - type: object
+            patternProperties:
+              '^[A-Za-z_][A-Za-z0-9_]*$':
+                oneOf:
+                  - type: string
+                  - type: integer
+            additionalProperties: false
+          - type: 'null'
+      clocks:
+        title: Clocks object
+        type: object
+        patternProperties:
+          '^[A-Za-z_][A-Za-z0-9_]*$':
+            $ref: '#/definitions/clock'
+        additionalProperties: false
+      $default-stream:
+        $ref: '#/definitions/$default-stream'
+      streams:
+        title: Streams object
+        type: object
+        patternProperties:
+          '^[A-Za-z_][A-Za-z0-9_]*$':
+            $ref: '#/definitions/stream'
+        additionalProperties: false
+        minProperties: 1
+    required:
+      - trace
+      - streams
+    additionalProperties: false
+required:
+  - version
+  - metadata
+additionalProperties: false
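As a sketch, here is a minimal configuration object that this effective schema accepts, assuming `le` is among the values allowed by the referenced `byte-order-prop.json` schema (which is not part of this excerpt); all other names are hypothetical:

```yaml
version: '2.2'
metadata:
  trace:
    byte-order: le
  streams:
    my_stream:
      events:
        my_event:
          payload-type:
            class: struct
            fields:
              msg:
                class: string
```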
diff --git a/barectf/schemas/2/config/event-pre-include.yaml b/barectf/schemas/2/config/event-pre-include.yaml
new file mode 100644 (file)
index 0000000..d4d76ca
--- /dev/null
@@ -0,0 +1,30 @@
+# The MIT License (MIT)
+#
+# Copyright (c) 2020 Philippe Proulx <pproulx@efficios.com>
+#
+# Permission is hereby granted, free of charge, to any person obtaining
+# a copy of this software and associated documentation files (the
+# "Software"), to deal in the Software without restriction, including
+# without limitation the rights to use, copy, modify, merge, publish,
+# distribute, sublicense, and/or sell copies of the Software, and to
+# permit persons to whom the Software is furnished to do so, subject to
+# the following conditions:
+#
+# The above copyright notice and this permission notice shall be
+# included in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
+# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
+# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+$schema: http://json-schema.org/draft-07/schema#
+$id: https://barectf.org/schemas/2/config/event-pre-include.json
+title: Event object before inclusions
+type: object
+properties:
+  $include:
+    $ref: https://barectf.org/schemas/2/config/include-prop.json
diff --git a/barectf/schemas/2/config/field-type.yaml b/barectf/schemas/2/config/field-type.yaml
new file mode 100644 (file)
index 0000000..26d5c5a
--- /dev/null
@@ -0,0 +1,312 @@
+# The MIT License (MIT)
+#
+# Copyright (c) 2020 Philippe Proulx <pproulx@efficios.com>
+#
+# Permission is hereby granted, free of charge, to any person obtaining
+# a copy of this software and associated documentation files (the
+# "Software"), to deal in the Software without restriction, including
+# without limitation the rights to use, copy, modify, merge, publish,
+# distribute, sublicense, and/or sell copies of the Software, and to
+# permit persons to whom the Software is furnished to do so, subject to
+# the following conditions:
+#
+# The above copyright notice and this permission notice shall be
+# included in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
+# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
+# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+$schema: http://json-schema.org/draft-07/schema#
+$id: https://barectf.org/schemas/2/config/field-type.json
+title: Effective field type object
+definitions:
+  byte-order-prop:
+    title: Byte order property value
+    oneOf:
+      - $ref: https://barectf.org/schemas/2/config/byte-order-prop.json
+      - type: 'null'
+  align-prop:
+    title: Alignment property value
+    oneOf:
+      - type: integer
+        minimum: 1
+      - type: 'null'
+  encoding-prop:
+    title: Encoding property value
+    oneOf:
+      - type: string
+        enum:
+        - utf8
+        - UTF8
+        - utf-8
+        - UTF-8
+        - Utf-8
+        - ascii
+        - Ascii
+        - ASCII
+        - none
+        - None
+        - NONE
+      - type: 'null'
+  int-field-type-class-prop:
+    type: string
+    enum:
+      - int
+      - integer
+  int-field-type:
+    title: Integer field type object
+    type: object
+    properties:
+      class:
+        $ref: '#/definitions/int-field-type-class-prop'
+      size:
+        type: integer
+        minimum: 1
+        maximum: 64
+      signed:
+        oneOf:
+          - type: boolean
+          - type: 'null'
+      align:
+        $ref: '#/definitions/align-prop'
+      byte-order:
+        $ref: '#/definitions/byte-order-prop'
+      base:
+        oneOf:
+          - type: string
+            enum:
+              - bin
+              - oct
+              - dec
+              - hex
+          - type: 'null'
+      encoding:
+        $ref: '#/definitions/encoding-prop'
+      property-mappings:
+        oneOf:
+          - type: array
+            items:
+              type: object
+              properties:
+                type:
+                  type: string
+                  const: clock
+                name:
+                  type: string
+                  pattern: '^[A-Za-z_][A-Za-z0-9_]*$'
+                property:
+                  type: string
+                  const: value
+              required:
+                - type
+                - name
+                - property
+              additionalProperties: false
+            minItems: 1
+            maxItems: 1
+          - type: 'null'
+    required:
+      - class
+      - size
+    additionalProperties: false
+  float-field-type-class-prop:
+    type: string
+    enum:
+      - flt
+      - float
+      - floating-point
+  float-field-type:
+    title: Floating point number field type object
+    type: object
+    properties:
+      class:
+        $ref: '#/definitions/float-field-type-class-prop'
+      size:
+        type: object
+        properties:
+          exp:
+            type: integer
+          mant:
+            type: integer
+        oneOf:
+          - properties:
+              exp:
+                const: 8
+              mant:
+                const: 24
+          - properties:
+              exp:
+                const: 11
+              mant:
+                const: 53
+        required:
+          - exp
+          - mant
+        additionalProperties: false
+      align:
+        $ref: '#/definitions/align-prop'
+      byte-order:
+        $ref: '#/definitions/byte-order-prop'
+    required:
+      - class
+      - size
+    additionalProperties: false
+  enum-field-type-class-prop:
+    type: string
+    enum:
+      - enum
+      - enumeration
+  enum-field-type:
+    title: Enumeration field type object
+    type: object
+    properties:
+      class:
+        $ref: '#/definitions/enum-field-type-class-prop'
+      value-type:
+        $ref: '#/definitions/int-field-type'
+      members:
+        type: array
+        items:
+          anyOf:
+            - type: string
+            - type: object
+              properties:
+                label:
+                  type: string
+                value:
+                  oneOf:
+                    - type: integer
+                    - type: array
+                      items:
+                        type: integer
+                      minItems: 2
+                      maxItems: 2
+              required:
+                - label
+                - value
+              additionalProperties: false
+    required:
+      - class
+      - value-type
+    additionalProperties: false
+  string-field-type-class-prop:
+    type: string
+    enum:
+      - str
+      - string
+  string-field-type:
+    title: String field type object
+    type: object
+    properties:
+      class:
+        $ref: '#/definitions/string-field-type-class-prop'
+      encoding:
+        $ref: '#/definitions/encoding-prop'
+    required:
+      - class
+    additionalProperties: false
+  array-field-type-class-prop:
+    type: string
+    const: array
+  array-field-type:
+    title: Array field type object
+    type: object
+    properties:
+      class:
+        $ref: '#/definitions/array-field-type-class-prop'
+      element-type:
+        $ref: '#/definitions/field-type'
+      length:
+        type: integer
+        minimum: 0
+    required:
+      - class
+      - element-type
+      - length
+    additionalProperties: false
+  struct-field-type-class-prop:
+    type: string
+    enum:
+      - struct
+      - structure
+  struct-field-type:
+    title: Structure field type object
+    type: object
+    properties:
+      class:
+        $ref: '#/definitions/struct-field-type-class-prop'
+      min-align:
+        $ref: '#/definitions/align-prop'
+      fields:
+        oneOf:
+          - type: object
+            patternProperties:
+              '^[A-Za-z_][A-Za-z0-9_]*$':
+                $ref: '#/definitions/field-type'
+            additionalProperties: false
+          - type: 'null'
+    required:
+      - class
+    additionalProperties: false
+  field-type:
+    type: object
+    properties:
+      class:
+        enum:
+          - int
+          - integer
+          - flt
+          - float
+          - floating-point
+          - enum
+          - enumeration
+          - str
+          - string
+          - array
+          - struct
+          - structure
+    allOf:
+      - if:
+          properties:
+            class:
+              $ref: '#/definitions/int-field-type-class-prop'
+        then:
+          $ref: '#/definitions/int-field-type'
+      - if:
+          properties:
+            class:
+              $ref: '#/definitions/float-field-type-class-prop'
+        then:
+          $ref: '#/definitions/float-field-type'
+      - if:
+          properties:
+            class:
+              $ref: '#/definitions/enum-field-type-class-prop'
+        then:
+          $ref: '#/definitions/enum-field-type'
+      - if:
+          properties:
+            class:
+              $ref: '#/definitions/string-field-type-class-prop'
+        then:
+          $ref: '#/definitions/string-field-type'
+      - if:
+          properties:
+            class:
+              $ref: '#/definitions/array-field-type-class-prop'
+        then:
+          $ref: '#/definitions/array-field-type'
+      - if:
+          properties:
+            class:
+              $ref: '#/definitions/struct-field-type-class-prop'
+        then:
+          $ref: '#/definitions/struct-field-type'
+    required:
+      - class
+$ref: '#/definitions/field-type'
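The trailing `$ref` makes the whole document validate a single field type object, with the `if`/`then` pairs dispatching on the `class` property. For example, this integer field type object (hypothetical values) is routed to and accepted by the `int-field-type` subschema:

```yaml
class: int
size: 32
signed: false
align: 8
base: hex
```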
diff --git a/barectf/schemas/2/config/include-prop.yaml b/barectf/schemas/2/config/include-prop.yaml
new file mode 100644 (file)
index 0000000..b2b8869
--- /dev/null
@@ -0,0 +1,32 @@
+# The MIT License (MIT)
+#
+# Copyright (c) 2020 Philippe Proulx <pproulx@efficios.com>
+#
+# Permission is hereby granted, free of charge, to any person obtaining
+# a copy of this software and associated documentation files (the
+# "Software"), to deal in the Software without restriction, including
+# without limitation the rights to use, copy, modify, merge, publish,
+# distribute, sublicense, and/or sell copies of the Software, and to
+# permit persons to whom the Software is furnished to do so, subject to
+# the following conditions:
+#
+# The above copyright notice and this permission notice shall be
+# included in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
+# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
+# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+$schema: http://json-schema.org/draft-07/schema#
+$id: https://barectf.org/schemas/2/config/include-prop.json
+title: Inclusion property value
+oneOf:
+  - type: string
+  - type: array
+    items:
+      type: string
+    minItems: 1
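In other words, an `$include` property may name a single partial file or a non-empty ordered list of them (file names hypothetical):

```yaml
$include: base.yaml
# or
$include:
  - base.yaml
  - project.yaml
```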
diff --git a/barectf/schemas/2/config/metadata-pre-include.yaml b/barectf/schemas/2/config/metadata-pre-include.yaml
new file mode 100644 (file)
index 0000000..863f2bd
--- /dev/null
@@ -0,0 +1,44 @@
+# The MIT License (MIT)
+#
+# Copyright (c) 2020 Philippe Proulx <pproulx@efficios.com>
+#
+# Permission is hereby granted, free of charge, to any person obtaining
+# a copy of this software and associated documentation files (the
+# "Software"), to deal in the Software without restriction, including
+# without limitation the rights to use, copy, modify, merge, publish,
+# distribute, sublicense, and/or sell copies of the Software, and to
+# permit persons to whom the Software is furnished to do so, subject to
+# the following conditions:
+#
+# The above copyright notice and this permission notice shall be
+# included in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
+# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
+# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+$schema: http://json-schema.org/draft-07/schema#
+$id: https://barectf.org/schemas/2/config/metadata-pre-include.json
+title: Metadata object before inclusions
+type: object
+properties:
+  $include:
+    $ref: https://barectf.org/schemas/2/config/include-prop.json
+  clocks:
+    title: Clocks object before inclusions
+    type: object
+    patternProperties:
+      '':
+        $ref: https://barectf.org/schemas/2/config/clock-pre-include.json
+  trace:
+    $ref: https://barectf.org/schemas/2/config/trace-pre-include.json
+  streams:
+    title: Streams object before inclusions
+    type: object
+    patternProperties:
+      '':
+        $ref: https://barectf.org/schemas/2/config/stream-pre-include.json
diff --git a/barectf/schemas/2/config/stream-pre-include.yaml b/barectf/schemas/2/config/stream-pre-include.yaml
new file mode 100644 (file)
index 0000000..b3c2c29
--- /dev/null
@@ -0,0 +1,36 @@
+# The MIT License (MIT)
+#
+# Copyright (c) 2020 Philippe Proulx <pproulx@efficios.com>
+#
+# Permission is hereby granted, free of charge, to any person obtaining
+# a copy of this software and associated documentation files (the
+# "Software"), to deal in the Software without restriction, including
+# without limitation the rights to use, copy, modify, merge, publish,
+# distribute, sublicense, and/or sell copies of the Software, and to
+# permit persons to whom the Software is furnished to do so, subject to
+# the following conditions:
+#
+# The above copyright notice and this permission notice shall be
+# included in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
+# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
+# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+$schema: http://json-schema.org/draft-07/schema#
+$id: https://barectf.org/schemas/2/config/stream-pre-include.json
+title: Stream object before inclusions
+type: object
+properties:
+  $include:
+    $ref: https://barectf.org/schemas/2/config/include-prop.json
+  events:
+    title: Events object before inclusions
+    type: object
+    patternProperties:
+      '':
+        $ref: https://barectf.org/schemas/2/config/event-pre-include.json
diff --git a/barectf/schemas/2/config/trace-pre-include.yaml b/barectf/schemas/2/config/trace-pre-include.yaml
new file mode 100644 (file)
index 0000000..e77a96a
--- /dev/null
@@ -0,0 +1,30 @@
+# The MIT License (MIT)
+#
+# Copyright (c) 2020 Philippe Proulx <pproulx@efficios.com>
+#
+# Permission is hereby granted, free of charge, to any person obtaining
+# a copy of this software and associated documentation files (the
+# "Software"), to deal in the Software without restriction, including
+# without limitation the rights to use, copy, modify, merge, publish,
+# distribute, sublicense, and/or sell copies of the Software, and to
+# permit persons to whom the Software is furnished to do so, subject to
+# the following conditions:
+#
+# The above copyright notice and this permission notice shall be
+# included in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
+# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
+# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+$schema: http://json-schema.org/draft-07/schema#
+$id: https://barectf.org/schemas/2/config/trace-pre-include.json
+title: Trace object before inclusions
+type: object
+properties:
+  $include:
+    $ref: https://barectf.org/schemas/2/config/include-prop.json
diff --git a/barectf/schemas/2/config/uuid-prop.yaml b/barectf/schemas/2/config/uuid-prop.yaml
new file mode 100644 (file)
index 0000000..25bbe21
--- /dev/null
@@ -0,0 +1,28 @@
+# The MIT License (MIT)
+#
+# Copyright (c) 2020 Philippe Proulx <pproulx@efficios.com>
+#
+# Permission is hereby granted, free of charge, to any person obtaining
+# a copy of this software and associated documentation files (the
+# "Software"), to deal in the Software without restriction, including
+# without limitation the rights to use, copy, modify, merge, publish,
+# distribute, sublicense, and/or sell copies of the Software, and to
+# permit persons to whom the Software is furnished to do so, subject to
+# the following conditions:
+#
+# The above copyright notice and this permission notice shall be
+# included in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
+# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
+# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+$schema: http://json-schema.org/draft-07/schema#
+$id: https://barectf.org/schemas/2/config/uuid-prop.json
+title: UUID property value
+type: string
+pattern: '^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$'
diff --git a/barectf/schemas/config/config-min.yaml b/barectf/schemas/config/config-min.yaml
new file mode 100644 (file)
index 0000000..d06b327
--- /dev/null
@@ -0,0 +1,36 @@
+# The MIT License (MIT)
+#
+# Copyright (c) 2020 Philippe Proulx <pproulx@efficios.com>
+#
+# Permission is hereby granted, free of charge, to any person obtaining
+# a copy of this software and associated documentation files (the
+# "Software"), to deal in the Software without restriction, including
+# without limitation the rights to use, copy, modify, merge, publish,
+# distribute, sublicense, and/or sell copies of the Software, and to
+# permit persons to whom the Software is furnished to do so, subject to
+# the following conditions:
+#
+# The above copyright notice and this permission notice shall be
+# included in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
+# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
+# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+$schema: http://json-schema.org/draft-07/schema#
+$id: https://barectf.org/schemas/config/config-min.json
+title: Minimal configuration object
+type: object
+properties:
+  version:
+    type: string
+    enum:
+      - '2.0'
+      - '2.1'
+      - '2.2'
+required:
+  - version
index 762625a4b8410ddf91a8cbdd572e7cd2ee5deec6..116f8d6a9a7335706da61f737f6075fc44b63554 100644 (file)
@@ -1,3 +1,66 @@
+[[package]]
+category = "main"
+description = "Classes Without Boilerplate"
+name = "attrs"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+version = "19.3.0"
+
+[package.extras]
+azure-pipelines = ["coverage", "hypothesis", "pympler", "pytest (>=4.3.0)", "six", "zope.interface", "pytest-azurepipelines"]
+dev = ["coverage", "hypothesis", "pympler", "pytest (>=4.3.0)", "six", "zope.interface", "sphinx", "pre-commit"]
+docs = ["sphinx", "zope.interface"]
+tests = ["coverage", "hypothesis", "pympler", "pytest (>=4.3.0)", "six", "zope.interface"]
+
+[[package]]
+category = "main"
+description = "Read metadata from Python packages"
+marker = "python_version < \"3.8\""
+name = "importlib-metadata"
+optional = false
+python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,>=2.7"
+version = "1.6.0"
+
+[package.dependencies]
+zipp = ">=0.5"
+
+[package.extras]
+docs = ["sphinx", "rst.linker"]
+testing = ["packaging", "importlib-resources"]
+
+[[package]]
+category = "main"
+description = "An implementation of JSON Schema validation for Python"
+name = "jsonschema"
+optional = false
+python-versions = "*"
+version = "3.2.0"
+
+[package.dependencies]
+attrs = ">=17.4.0"
+pyrsistent = ">=0.14.0"
+setuptools = "*"
+six = ">=1.11.0"
+
+[package.dependencies.importlib-metadata]
+python = "<3.8"
+version = "*"
+
+[package.extras]
+format = ["idna", "jsonpointer (>1.13)", "rfc3987", "strict-rfc3339", "webcolors"]
+format_nongpl = ["idna", "jsonpointer (>1.13)", "webcolors", "rfc3986-validator (>0.1.0)", "rfc3339-validator"]
+
+[[package]]
+category = "main"
+description = "Persistent/Functional/Immutable data structures"
+name = "pyrsistent"
+optional = false
+python-versions = "*"
+version = "0.16.0"
+
+[package.dependencies]
+six = "*"
+
 [[package]]
 category = "main"
 description = "YAML parser and emitter for Python"
@@ -6,6 +69,14 @@ optional = false
 python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
 version = "5.3.1"
 
+[[package]]
+category = "main"
+description = "Python 2 and 3 compatibility utilities"
+name = "six"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*"
+version = "1.15.0"
+
 [[package]]
 category = "main"
 description = "ANSII Color formatting for output in terminal."
@@ -14,11 +85,39 @@ optional = false
 python-versions = "*"
 version = "1.1.0"
 
+[[package]]
+category = "main"
+description = "Backport of pathlib-compatible object wrapper for zip files"
+marker = "python_version < \"3.8\""
+name = "zipp"
+optional = false
+python-versions = ">=2.7"
+version = "1.2.0"
+
+[package.extras]
+docs = ["sphinx", "jaraco.packaging (>=3.2)", "rst.linker (>=1.9)"]
+testing = ["pathlib2", "unittest2", "jaraco.itertools", "func-timeout"]
+
 [metadata]
-content-hash = "093137bc1e41045c348580df35966d4e72d29639853f4b642366f856ba62aae7"
+content-hash = "1b6983518d1480ad0ddbedeaf22f78990a3f1a9975067cecd00b00d493171609"
 python-versions = '^3.5'
 
 [metadata.files]
+attrs = [
+    {file = "attrs-19.3.0-py2.py3-none-any.whl", hash = "sha256:08a96c641c3a74e44eb59afb61a24f2cb9f4d7188748e76ba4bb5edfa3cb7d1c"},
+    {file = "attrs-19.3.0.tar.gz", hash = "sha256:f7b7ce16570fe9965acd6d30101a28f62fb4a7f9e926b3bbc9b61f8b04247e72"},
+]
+importlib-metadata = [
+    {file = "importlib_metadata-1.6.0-py2.py3-none-any.whl", hash = "sha256:2a688cbaa90e0cc587f1df48bdc97a6eadccdcd9c35fb3f976a09e3b5016d90f"},
+    {file = "importlib_metadata-1.6.0.tar.gz", hash = "sha256:34513a8a0c4962bc66d35b359558fd8a5e10cd472d37aec5f66858addef32c1e"},
+]
+jsonschema = [
+    {file = "jsonschema-3.2.0-py2.py3-none-any.whl", hash = "sha256:4e5b3cf8216f577bee9ce139cbe72eca3ea4f292ec60928ff24758ce626cd163"},
+    {file = "jsonschema-3.2.0.tar.gz", hash = "sha256:c8a85b28d377cc7737e46e2d9f2b4f44ee3c0e1deac6bf46ddefc7187d30797a"},
+]
+pyrsistent = [
+    {file = "pyrsistent-0.16.0.tar.gz", hash = "sha256:28669905fe725965daa16184933676547c5bb40a5153055a8dee2a4bd7933ad3"},
+]
 pyyaml = [
     {file = "PyYAML-5.3.1-cp27-cp27m-win32.whl", hash = "sha256:74809a57b329d6cc0fdccee6318f44b9b8649961fa73144a98735b0aaf029f1f"},
     {file = "PyYAML-5.3.1-cp27-cp27m-win_amd64.whl", hash = "sha256:240097ff019d7c70a4922b6869d8a86407758333f02203e0fc6ff79c5dcede76"},
@@ -32,6 +131,14 @@ pyyaml = [
     {file = "PyYAML-5.3.1-cp38-cp38-win_amd64.whl", hash = "sha256:95f71d2af0ff4227885f7a6605c37fd53d3a106fcab511b8860ecca9fcf400ee"},
     {file = "PyYAML-5.3.1.tar.gz", hash = "sha256:b8eac752c5e14d3eca0e6dd9199cd627518cb5ec06add0de9d32baeee6fe645d"},
 ]
+six = [
+    {file = "six-1.15.0-py2.py3-none-any.whl", hash = "sha256:8b74bedcbbbaca38ff6d7491d76f2b06b3592611af620f8426e82dddb04a5ced"},
+    {file = "six-1.15.0.tar.gz", hash = "sha256:30639c035cdb23534cd4aa2dd52c3bf48f06e5f4a941509c8bafd8ce11080259"},
+]
 termcolor = [
     {file = "termcolor-1.1.0.tar.gz", hash = "sha256:1d6d69ce66211143803fbc56652b41d73b4a400a2891d7bf7a1cdf4c02de613b"},
 ]
+zipp = [
+    {file = "zipp-1.2.0-py2.py3-none-any.whl", hash = "sha256:e0d9e63797e483a30d27e09fffd308c59a700d365ec34e93cc100844168bf921"},
+    {file = "zipp-1.2.0.tar.gz", hash = "sha256:c70410551488251b0fee67b460fb9a536af8d6f9f008ad10ac51f615b6a521b1"},
+]
index 47a85b4908b62d044120c4a0a42a1c8a60c2f727..e2910804ca98527c6346d575882560f42f415a69 100644 (file)
@@ -57,6 +57,7 @@ packages = [{include = 'barectf'}]
 python = '^3.5'
 termcolor = '^1.1'
 pyyaml = '^5.3'
+jsonschema = '^3.2'
 setuptools = '*'
 
 [tool.poetry.scripts]