Heavy data

Heavy data is an information engineering term coined by American computer scientist Alexander Nicholi in his essay The Superintegration Trifecta. It refers to a piece of data's intrinsic suitability for aggregation, i.e. for being comprehensively summed together in the same form from various sources across a network such as the World Wide Web. It stands in contrast to light data, which is not so suited to aggregation.
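The essay does not prescribe any particular implementation, but the property can be illustrated with a minimal sketch. Assuming hypothetical sources that each report tallies in the same form (a mapping from key to count), aggregating heavy data amounts to a plain sum over sources; light data, such as free-form prose, admits no such comprehensive merge:

```python
from collections import Counter

# Hypothetical per-source tallies. Each source reports its data in
# the same form (key -> count), which is what makes it "heavy":
# the pieces can be comprehensively summed together.
source_a = Counter({"articles": 120, "edits": 340})
source_b = Counter({"articles": 75, "edits": 410, "users": 12})
source_c = Counter({"edits": 95, "users": 4})

# Aggregation is a plain sum over all sources; no source is treated
# as more authoritative than another.
aggregate = source_a + source_b + source_c
print(aggregate)  # Counter({'edits': 845, 'articles': 195, 'users': 16})
```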

Heavy data is the data of focus in the ib-AP, where it is aggregated in a decentralised fashion without any imposition as to the truth of that data.