Definition

A semimeasure is a generalization of a probability measure, used primarily in algorithmic information theory and the theory of algorithmic complexity. Unlike a measure, a semimeasure is only required to satisfy a relaxed, super-additive form of the additivity axiom: the value assigned to a whole may exceed the sum of the values assigned to the parts of a partition of it. Concretely, a continuous semimeasure on the space of infinite binary sequences is a function mu from finite binary strings to [0, 1] such that mu(e) <= 1 (where e is the empty string) and, for every string x, mu(x) >= mu(x0) + mu(x1). A discrete semimeasure is a function m on strings with total mass sum_x m(x) <= 1. Every probability measure is a semimeasure, with the inequalities tightened to equalities, so a semimeasure is a strictly weaker notion than a full measure.
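The continuous-semimeasure conditions above can be checked mechanically on a toy example. The sketch below assumes the definition given in this section; the particular function mu(x) = 4^(-|x|) is a hypothetical example chosen for illustration, not a standard object.

```python
# Toy check of the continuous-semimeasure axioms on finite binary strings.
# mu(x) = 4 ** (-len(x)) is a hypothetical example: it satisfies
# mu(x) >= mu(x0) + mu(x1) because 4**-(n+1) + 4**-(n+1) = 0.5 * 4**-n.

def mu(x: str) -> float:
    """A toy semimeasure on finite binary strings."""
    return 4.0 ** (-len(x))

def is_semimeasure_at(x: str) -> bool:
    """Check the super-additivity inequality mu(x) >= mu(x0) + mu(x1)."""
    return mu(x) >= mu(x + "0") + mu(x + "1")

# Axiom 1: the empty string gets mass at most 1.
assert mu("") <= 1.0

# Axiom 2: check the inequality on all strings up to length 4.
strings = [""]
for _ in range(4):
    strings = [s + b for s in strings for b in "01"]
assert all(is_semimeasure_at(s) for s in [""] + strings)

# The "missing" mass mu(x) - mu(x0) - mu(x1) is the semimeasure's
# defect; for a proper measure it would be exactly zero everywhere.
print(mu("") - mu("0") - mu("1"))  # 0.5 here, so mu is not a measure
```

Replacing 4 with 2 in mu makes the inequality an equality, turning the toy semimeasure into the uniform measure on binary sequences.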