Drefs, Jakob (2022) Evolutionary Variational Optimization for Probabilistic Unsupervised Learning. PhD, Universitätsbibliothek Oldenburg.


Full text (23 MB)
Official URL: https://plus.orbis-oldenburg.de/permalink/f/11df5a...

Abstract

Probabilistic generative modeling is a powerful machine learning paradigm suitable for a variety of tasks such as missing-value estimation, denoising, clustering, outlier detection, and compression. A key factor determining the efficiency and effectiveness of algorithms based on generative models is the approximate inference scheme used internally. This work focuses on inference and learning in generative models with binary latents and studies a novel approach that combines variational optimization of truncated posteriors with evolutionary algorithms. In contrast to related Gaussian and factored (a.k.a. mean-field) variational approximations, truncated variational distributions can flexibly adapt to complex posterior landscapes with multiple modes and correlations. By applying evolutionary algorithms as an integral part of the variational optimization scheme, the approach becomes generically applicable without requiring model-specific analytical derivations ('black box' inference). Unlike related 'black box' inference schemes, the approach studied here involves neither sampling approximations nor gradient-based optimization of variational parameters, thereby circumventing auxiliary mechanisms such as continuous relaxations of discrete distributions and the variance reduction techniques required by stochastic gradient estimators. Compared to previous truncated variational approximation schemes, the approach studied here has the distinguishing feature of guaranteeing a monotonic increase of a lower bound on the log-likelihood.
The studies presented in this thesis apply the novel evolutionary variational optimization to a variety of data models with a wide range of characteristics, including binary and binary-continuous priors, binary and continuous noise models, shallow and deep model architectures, linear and non-linear latent interaction models (with learnable and non-learnable non-linearities), and global and latent-specific variance encodings. For deep generative models, the evolutionary variational optimization is combined with automatic differentiation tools for model parameter optimization, demonstrating the suitability of the investigated approach for 'black box' learning and inference. The numerical experiments focus on image patch modeling and provide extensive performance evaluations on established denoising and inpainting benchmarks, allowing comparison to a broad range of related methods. These investigations reveal that evolutionary variational optimization of expressive data models such as spike-and-slab sparse coding or variational autoencoders yields competitive performance in benchmark settings where only a single noisy image is available for training. Analyses of the learned data representations show that the models use comparably sparse encodings in these cases. Experiments with SARS-CoV-2 microscopy imaging data illustrate potential contributions the developed methods may deliver in applications, for instance by improving the visualization of image details and thereby easing image interpretation. In general, this work highlights the importance of effective procedures for learning and inference in generative models and opens the door to a variety of possible future research directions.
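The combination of truncated posteriors with evolutionary variation described in the abstract can be sketched in a few lines. The following is a minimal illustration only, assuming a simple binary sparse coding model with Gaussian noise; the model choice, function names, and the single bit-flip variation operator are hypothetical stand-ins, not the implementation developed in the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_joint(s, y, W, sigma, pi_h):
    # Hypothetical binary sparse coding model:
    #   p(s) = prod_h Bernoulli(s_h; pi_h),  p(y|s) = N(y; W s, sigma^2 I)
    log_prior = np.sum(s * np.log(pi_h) + (1 - s) * np.log(1 - pi_h))
    resid = y - W @ s
    log_lik = (-0.5 * resid @ resid / sigma**2
               - 0.5 * y.size * np.log(2 * np.pi * sigma**2))
    return log_prior + log_lik

def mutate(states, n_children=2, p_flip=0.2):
    # One simple variation operator: bit-flip mutations of each parent state.
    children = []
    for s in states:
        for _ in range(n_children):
            flips = rng.random(s.size) < p_flip
            children.append(np.where(flips, 1 - s, s))
    return children

def truncated_e_step(y, states, W, sigma, pi_h, K):
    # Enlarge the candidate set with evolutionary variation, then keep the K
    # binary states with the highest joint p(s, y). Because the parents stay
    # in the pool, the sum of joints over the selected set (and hence the
    # truncated lower bound) can never decrease.
    pool = {tuple(int(v) for v in s) for s in states}
    pool |= {tuple(int(v) for v in c) for c in mutate(states)}
    pool = [np.array(s) for s in pool]
    lj = np.array([log_joint(s, y, W, sigma, pi_h) for s in pool])
    keep = np.argsort(lj)[-K:]
    new_states = [pool[i] for i in keep]
    # Truncated posterior q(s): joints normalized over the retained set only.
    w = np.exp(lj[keep] - lj[keep].max())
    return new_states, w / w.sum()
```

Because selection always retains the best K states from a pool that contains the previous states, iterating this E-step yields the monotone lower-bound increase highlighted in the abstract; no gradients of variational parameters and no sampling-based estimators are involved.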

Item Type: Thesis (PhD)
Additional Information: For the usage licenses of elements of this dissertation that have already been published elsewhere, see the notes in the respective chapters. For the elements already published with Springer Nature: Reproduced with permission from Springer Nature. The Springer Nature content included in this chapter is not covered by the open access license under which this thesis is published. The right to reuse the content requires the explicit permission of Springer Nature.
Uncontrolled Keywords: Unsupervised Learning; Generative Models; Variational Methods; Evolutionary Algorithms; Sparse Coding; Variational Autoencoder; Denoising; Inpainting
Subjects: Generalities, computers, information > Computer science, internet
Divisions: Faculty of Mathematics and Science > Institute of Physics (IfP)
Date Deposited: 17 Jul 2023 08:47
Last Modified: 17 Jul 2023 08:53
URI: https://oops.uni-oldenburg.de/id/eprint/5814
URN: urn:nbn:de:gbv:715-oops-58959
DOI:
License:
