Heavy data

'''Heavy data''' is an information engineering term coined by American computer scientist Alexander Nicholi in his essay ''The Superintegration Trifecta''. It refers to a piece of data's intrinsic suitability for ''aggregation'', i.e. for being comprehensively summed together in the same form from various sources across a network such as the World Wide Web. It stands in contrast to ''light data'', which is not so suited to aggregation.
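
The distinction can be made concrete with a minimal sketch. This is illustrative only: the sources, keys, and counts below are hypothetical, and the essay itself prescribes no code. Heavy data shares one form across sources, so aggregating it reduces to summation; light data carries no shared form to sum over.

<syntaxhighlight lang="python">
from collections import Counter

# Hypothetical per-source tallies: "heavy" data shares one form (item -> count),
# so aggregation is just summation, regardless of which source supplied it.
source_a = Counter({"rust": 120, "python": 300})
source_b = Counter({"python": 150, "go": 80})
source_c = Counter({"rust": 40})

# Comprehensive aggregation across the network: same form in, same form out.
aggregate = source_a + source_b + source_c
print(aggregate)  # Counter({'python': 450, 'rust': 160, 'go': 80})

# "Light" data, by contrast, has no shared form to sum over:
review_a = "I found Rust pleasant, but the borrow checker took getting used to."
review_b = "Python's ecosystem is unmatched for quick prototypes."
# There is no meaningful '+' here beyond concatenation; aggregating these
# would require interpretation first, which is exactly what heavy data avoids.
</syntaxhighlight>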

Heavy data is the data of focus in the ib-AP, where it is aggregated in a decentralised fashion without any imposition as to the truth of that data.
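
As a further illustrative sketch, and not a description of the ib-AP's actual mechanics: aggregation by summation is commutative and associative, so independent nodes can merge tallies in any order and still converge on the same result, with no central coordinator and no judgement applied to the data itself.

<syntaxhighlight lang="python">
from collections import Counter
from itertools import permutations

# Three hypothetical peers' tallies; the keys and counts are made up.
tallies = [Counter({"x": 1, "y": 2}), Counter({"y": 3}), Counter({"x": 5, "z": 1})]

results = set()
for order in permutations(tallies):
    merged = Counter()
    for tally in order:      # any peer may merge in any order
        merged += tally
    results.add(frozenset(merged.items()))

assert len(results) == 1     # every merge order yields the same aggregate
print(dict(merged))          # {'x': 6, 'y': 5, 'z': 1} (key order may vary)
</syntaxhighlight>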

== References ==
* [//www.nichfury.com/p/superintegration ''The Superintegration Trifecta'']

[[Category:Computing theory]][[Category:Computing terminology]]