
Predictive complexity and information. (English) Zbl 1101.68617

Summary: The notions of predictive complexity and of the corresponding amount of information are considered. Predictive complexity is a generalization of Kolmogorov complexity that bounds the ability of any algorithm to predict the elements of a sequence of outcomes. We consider predictive complexity for a wide class of bounded loss functions that generalize the square-loss function. Relations between the unconditional predictive complexity \(KG(x)\) and the conditional predictive complexity \(KG(x|y)\) are studied. We define an algorithm with an “expanding property”: with positive probability it transforms sequences of a given predictive complexity into sequences of essentially greater predictive complexity. A notion of the amount of predictive information \(IG(y:x)\) is studied. We show that this information is noncommutative in a very strong sense and present asymptotic relations between the values \(IG(y:x)\), \(IG(x:y)\), \(KG(x)\) and \(KG(y)\).
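To illustrate the loss framework behind these definitions, the following sketch (not taken from the paper; the names `laplace_predictor` and `cumulative_square_loss` are illustrative) computes the cumulative square loss of a simple computable prediction strategy on a binary sequence. Up to an additive constant, the loss of any computable strategy upper-bounds the square-loss predictive complexity \(KG(x)\) of the sequence, so regular sequences admit small upper bounds while less predictable ones do not.

```python
# A minimal sketch of online prediction under square loss: at each step the
# strategy outputs p in [0,1] before seeing the next bit x_t in {0,1} and
# suffers loss (x_t - p)^2. The total loss of this (or any) computable
# strategy gives an upper bound on the square-loss predictive complexity.

def laplace_predictor(history):
    """Predict the next bit as the smoothed frequency of 1s seen so far."""
    return (sum(history) + 1) / (len(history) + 2)

def cumulative_square_loss(x, predict=laplace_predictor):
    """Total square loss sum_t (x_t - p_t)^2 incurred on the sequence x."""
    loss = 0.0
    for t, outcome in enumerate(x):
        p = predict(x[:t])          # prediction made before seeing x_t
        loss += (outcome - p) ** 2  # square loss for this round
    return loss

# A highly regular sequence incurs little loss, while an alternating
# sequence keeps this predictor near 1/2 and costs about 1/4 per round.
regular = [1] * 20
alternating = [0, 1] * 10
print(cumulative_square_loss(regular) < cumulative_square_loss(alternating))
```

The Laplace-style predictor here is only one computable strategy; the paper's results concern bounds that hold for every algorithmic strategy under a wide class of bounded loss functions.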

MSC:

68Q30 Algorithmic information theory (Kolmogorov complexity, etc.)
68T05 Learning and adaptive systems in artificial intelligence
Full Text: DOI

References:

[1] Cesa-Bianchi, N.; Freund, Y.; Helmbold, D. P.; Haussler, D.; Schapire, R. E.; Warmuth, M. K., How to use expert advice, J. ACM, 44, 427-485 (1997) · Zbl 0890.68066
[2] Cormen, T. H.; Leiserson, C. E.; Rivest, R. L., Introduction to Algorithms (1990), McGraw-Hill: McGraw-Hill New York · Zbl 1158.68538
[3] D. Haussler, J. Kivinen, M.K. Warmuth, Tight worst-case loss bounds for predicting with expert advice, Technical Report UCSC-CRL-94-36, University of California at Santa Cruz, revised December 1994. Short version in P. Vitányi (Ed.), Computational Learning Theory, Lecture Notes in Computer Science, vol. 904, Springer, Berlin, 1995, pp. 69-83.
[4] Li, M.; Vitányi, P., An Introduction to Kolmogorov Complexity and its Applications (1997), Springer: Springer New York · Zbl 0866.68051
[5] Rogers, H., Theory of Recursive Functions and Effective Computability (1967), McGraw-Hill: McGraw-Hill New York · Zbl 0183.01401
[6] Vovk, V., Aggregating strategies, (Fulk, M.; Case, J., Proceedings of the 3rd Annual Workshop on Computational Learning Theory (1990), Morgan Kaufmann: Morgan Kaufmann San Mateo, CA), 371-383
[7] Vovk, V., A game of prediction with expert advice, J. Comput. System Sci., 56, 153-173 (1998) · Zbl 0945.68528
[8] V. Vovk, C.J.H.C. Watkins, Universal portfolio selection, Proceedings of the 11th Annual Conference on Computational Learning Theory, 1998, pp. 12-23.
[9] Vovk, V.; Gammerman, A., Complexity estimation principle, Comput. J., 42, N4, 318-322 (1999) · Zbl 0937.68063
[10] V’yugin, V. V., Does snooping help? Theoret. Comput. Sci., 276, 407-415 (2002) · Zbl 1002.68072
[11] Vyugin, M. V.; V’yugin, V. V., On complexity of easy predictable sequences, Inform. Comput., 178, 241-252 (2002) · Zbl 1012.68090
[12] M.V. Vyugin, V.V. V’yugin, Predictive complexity and information, Proceedings of the 15th International Conference on Computational Learning Theory—COLT’02, Lecture Notes in Artificial Intelligence, vol. 2375, Springer, Berlin, 2002, pp. 90-104. · Zbl 1050.68080
[13] K. Yamanishi, Randomized approximate aggregating strategies and their applications to prediction and discrimination, in: Proceedings of the Eighth Annual ACM Conference on Computational Learning Theory, Assoc. Comput. Machinery, New York, 1995, pp. 83-90.
[14] Zvonkin, A. K.; Levin, L. A., The complexity of finite objects and the algorithmic concepts of information and randomness, Russ. Math. Surv., 25, 83-124 (1970) · Zbl 0222.02027