Zhelezniak H. Entropy functionals and related Fredholm integral equations with additional singularity.


Thesis for the degree of Doctor of Philosophy (PhD)

State registration number

0823U100082

Applicant for

Specialization

  • 111 - Mathematics and Statistics. Mathematics

12-01-2023

Specialized Academic Board

ДФ 26.001.354

Taras Shevchenko National University of Kyiv

Essay

The thesis is devoted to the search for extrema of entropy-type functionals constructed for a set of probability measures corresponding to two independent Wiener processes with drift, as well as to a linear combination of an independent Wiener process and a fractional Brownian motion. The main problem of finding extrema of entropy-type functionals reduces to solving a Fredholm integral equation of the second kind whose kernel has an additional singularity, that is, a weakly singular kernel whose numerator also has points of discontinuity. Most methods developed for the numerical solution of Fredholm integral equations of the second kind with a weakly singular kernel assume a continuous function in the numerator. The dissertation proves a theorem on the approximation of the solution of an integral equation whose kernel contains an additional singularity by solutions of integral equations whose kernels are weakly singular but have continuous numerators.

Equations of this type, with a kernel that has a more complex representation, also arise in statistical estimation problems, namely in the construction of the maximum likelihood estimate of the drift parameter in a model with two fractional Brownian motions. For the numerical solution of such equations, the same modified numerical method was applied, and two alternative estimates of the drift parameter were additionally constructed.

Entropy functionals, the main object of study in the thesis, are inextricably linked with the concept of entropy, a basic notion in thermodynamics, information theory, and finance, and a fundamental quantity in statistics and machine learning. Entropy has a vast number of applications, for example in astronomy, cryptography, signal processing, image analysis, and bioinformatics. Shannon's work, which introduced a statistical definition of entropy, pioneered the theory of information entropy. The well-known one-parameter generalization of Shannon's entropy, the Rényi entropy, is widely used in statistical physics, quantum mechanics, communication theory, and data processing. In finance, the principle of minimum cross-entropy, formulated by Kullback and Leibler, has been intensively applied, in particular to the comparison of option prices and to utility maximization with a random time horizon.
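To make the reduction above concrete, here is a minimal, purely illustrative Python sketch (not the modified method developed in the dissertation) of a Nyström-type product-integration scheme for a Fredholm equation of the second kind with a weakly singular kernel: the factor |t - s|^(-alpha) is integrated exactly over each grid cell, while the numerator is frozen at the cell midpoint. The specific equation, the jump in the numerator, the way that jump is smoothed, and all function names and test values are assumptions made for this example only.

```python
# Illustrative sketch (not the method from the thesis): solve
#     x(t) = f(t) + \int_0^1 H(t, s) |t - s|^{-alpha} x(s) ds,  0 < alpha < 1,
# by Nystrom-type product integration on a uniform grid.  The weakly singular
# factor is integrated exactly on each cell; the numerator H, which may have a
# jump (the "additional singularity"), is frozen at the cell midpoint.
import numpy as np


def cell_integral(c, a, b, alpha):
    r"""Exact value of \int_a^b |c - s|^{-alpha} ds for 0 < alpha < 1."""
    p = 1.0 - alpha
    if c <= a:
        return ((b - c) ** p - (a - c) ** p) / p
    if c >= b:
        return ((c - a) ** p - (c - b) ** p) / p
    return ((c - a) ** p + (b - c) ** p) / p


def solve_fredholm2(H, f, alpha, n=200):
    r"""Solve x = f + \int_0^1 H(t,s)|t-s|^{-alpha} x(s) ds at cell midpoints."""
    h = 1.0 / n
    t = (np.arange(n) + 0.5) * h                  # collocation points
    W = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            a, b = j * h, (j + 1) * h
            # weight = numerator frozen at (t_i, t_j) times the exact
            # integral of the weak singularity over cell j
            W[i, j] = H(t[i], t[j]) * cell_integral(t[i], a, b, alpha)
    x = np.linalg.solve(np.eye(n) - W, f(t))
    return t, x


if __name__ == "__main__":
    alpha = 0.5

    # Numerator with a jump at s = 0.5 (discontinuous, as in the abstract).
    def H_jump(t, s):
        return 0.2 if s < 0.5 else 0.3

    # Continuous approximation of the jump, smoothed over a width eps.
    def H_smooth(eps):
        return lambda t, s: 0.2 + 0.1 * np.clip((s - 0.5 + eps) / (2 * eps), 0.0, 1.0)

    f = lambda t: np.ones_like(t)

    t, x_jump = solve_fredholm2(H_jump, f, alpha)
    for eps in (0.1, 0.03, 0.01):
        _, x_eps = solve_fredholm2(H_smooth(eps), f, alpha)
        print(f"eps={eps:g}: max |x_eps - x_jump| = {np.max(np.abs(x_eps - x_jump)):.3e}")
```

The final loop only informally mirrors the kind of approximation result described above: as the smoothing width shrinks, the solutions computed for the continuous numerators approach the solution for the kernel with the jump.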
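For reference, the classical definitions of the entropies mentioned above, for a discrete distribution p = (p_1, ..., p_n) and a reference distribution q (standard formulas, not notation taken from the thesis), are:

H(p) = -\sum_i p_i \log p_i,
\qquad
H_\alpha(p) = \frac{1}{1-\alpha} \log \sum_i p_i^{\alpha}, \quad \alpha > 0,\ \alpha \neq 1,
\qquad
D_{\mathrm{KL}}(p \,\|\, q) = \sum_i p_i \log \frac{p_i}{q_i}.

Here H_\alpha(p) \to H(p) as \alpha \to 1, which is the sense in which the Rényi entropy is a one-parameter generalization of Shannon's entropy, and minimizing D_{\mathrm{KL}} subject to constraints is the Kullback-Leibler minimum cross-entropy principle.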
