
Data semantics is the study of the meaning and use of specific pieces of data in computer programming and other fields that employ data. When studying a language, semantics refers to what individual words mean, both on their own and when combined into phrases or sentences. In data semantics, the focus is on how a data object represents a concept or object in the real world. "Decentralized Semantics" applies data semantics to the age of distributed ledger technology (DLT) solutions that continue to shape the current decentralisation movement.

Mission and Scope

To define a data capture architecture consisting of immutable schema bases and interoperable overlays for Internet-scale deployment.

Conveners

  • Paul Knowles (Human Colossus Foundation)
  • Robert Mitwicki (Human Colossus Foundation)

Interested Members (add your name if you may be interested in joining this proposed WG)

  • Philippe Page
  • Haydar Majeed
  • Rieks Joosten
  • Lucy Yang
  • Sal D’Agostino
  • Zan McNaught
  • Mike Bennett
  • Colin Wallis
  • John Walker
  • Mark Lizar
  • Burak Serdar
  • Scott Whitmire
  • Dennis Landi
  • Darrell O’Donnell
  • Kalyan Kulkarni
  • Zeljko Milinovic
  • Crt Ahlin
  • Davide Calvi
  • James Hazard
  • Michael Corning
  • Carsten Stöcker
  • Juan Caballero
  • Jim St.Clair
  • Eric Drury
  • Nathan George
  • Jan Lindquist
  • Vipin Bharathan
  • Kamlesh Nagware
  • Vinod Panicker
  • Steven Milstein
  • Vitor Jesus
  • Iain Henderson
  • Casandra Grundstrom

Description

Decentralized Semantics

The post-millennial generation has witnessed an explosion of captured data points, which has sparked profound possibilities in both Artificial Intelligence (AI) and Internet of Things (IoT) solutions. It has also prompted the collective realization that society's current technological infrastructure is simply not equipped to fully support de-identification, or to entice corporations to break down internal data silos, streamline data harmonization processes, and ultimately resolve worldwide data duplication and storage resource issues. Developing and deploying the right data capture architecture will improve the quality of externally pooled data for future AI and IoT solutions.

Overlays Capture Architecture (OCA)

OCA is an architecture that presents a schema as a multi-dimensional object consisting of a stable schema base and interoperable overlays. Overlays are task-oriented linked data objects that provide additional extensions, coloration, and functionality to the schema base. This degree of object separation enables issuers to make custom edits to the overlays rather than to the schema base itself. In other words, multiple parties can interact with and contribute to the schema structure without changing the schema base definition. With schema base definitions remaining stable and in their purest form, a common immutable base object is maintained throughout the capture process, which enables data standardization.
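The separation described above can be sketched in a few lines of Python. This is an illustrative model only, not the official OCA serialization: the attribute names, overlay fields, and the SHA-256 digest used to bind an overlay to its schema base are assumptions chosen to show the principle that overlays can be edited while the base object, and therefore its digest, stays immutable.

```python
import hashlib
import json

def digest(obj):
    """Content digest of a canonically serialized object (illustrative,
    not the official OCA digest scheme)."""
    canonical = json.dumps(obj, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Stable schema base: attribute names and types only (hypothetical attributes).
schema_base = {
    "type": "schema_base",
    "attributes": {"given_name": "Text", "birth_date": "Date"},
}
base_digest = digest(schema_base)

# Task-oriented overlay: a label overlay linked to the base by its digest.
label_overlay = {
    "type": "label_overlay",
    "schema_base": base_digest,
    "language": "en",
    "labels": {"given_name": "Given name", "birth_date": "Date of birth"},
}

# An issuer edits the overlay; the schema base and its digest are unchanged,
# so every party still shares the same common immutable base object.
label_overlay["labels"]["birth_date"] = "Birth date"
assert digest(schema_base) == base_digest
```

Under this model, a French-language label overlay or a format overlay could be added by a different party, each pointing at the same `base_digest`, without any coordination over the schema base itself.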

OCA facilitates a unified data language so that harmonized data can be pooled for improved data science, statistics, analytics and other meaningful services.



