This paper develops an information-theoretic framework for algorithmic complexity under regular identifiable fibering. The central question is: when a decoder is given information about the fiber label in a fibered geometric set, how much can the residual description length be reduced, and when does this reduction fail to bring dimension below the ambient rate? We formulate a directional compression principle, proposing that sets admitting regular, identifiable fiber decompositions should remain informationally incompressible at ambient dimension, unless the fiber structure is degenerate or adaptively chosen. The principle is phrased in the language of algorithmic dimension and the point-to-set principle of Lutz and Lutz, which translates pointwise Kolmogorov complexity into Hausdorff dimension. We prove an exact analytical result: under effectively bi-Lipschitz, identifiable, and computable fibering, the complexity of a point splits additively as the sum of fiber-label complexity and along-fiber residual complexity, up to logarithmic overhead, via the chain rule for Kolmogorov complexity. The Kakeya conjecture (asserting that sets containing a unit segment in every direction have Hausdorff dimension n) motivates the framework. The conjecture was recently resolved in R^3 by Wang and Zahl; it remains open in dimension n >= 4, precisely because adaptive fiber selection undermines the naive conditional split in the general case. We isolate this adaptive-fibering obstruction as the key difficulty and propose a formal research program connecting geometric measure theory, algorithmic complexity, and information-theoretic compression.
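The additive split claimed above can be sketched as a worked equation; the notation here (a precision parameter r, a fiber-label map ℓ, and precision-r complexity K_r) is assumed for illustration and may differ from the paper's own. By the chain rule for Kolmogorov complexity together with symmetry of information, one expects, for a point x in a regularly and identifiably fibered set,

\begin{align*}
K_r(x) &= K_r(\ell(x)) + K_r(x \mid \ell(x)) + O(\log r),
\end{align*}

where $K_r(\ell(x))$ measures the cost of describing the fiber label to precision $r$ and $K_r(x \mid \ell(x))$ is the residual along-fiber description length. Identifiability ensures $\ell(x)$ is computable from $x$ at matching precision, so the label term is not double-counted; the effectively bi-Lipschitz hypothesis keeps precision scales comparable on both sides, absorbing the distortion into the $O(\log r)$ overhead. Dividing by $r$ and taking limits then relates the pointwise split to dimension via the point-to-set principle. Adaptive fiber selection breaks the sketch at the first step: when the fiber through $x$ is chosen using information about $x$ itself, $K_r(\ell(x))$ can be far smaller than the label's unconditional description length, and the naive split fails.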