MARC View
LDR 00000nmm u2200205 4500
001 000000330609
005 20241101160100
008 181129s2017 ||| | | | eng d
020 ▼a 9780438091412
035 ▼a (MiAaPQ)AAI10871362
035 ▼a (MiAaPQ)OhioLINK:osu1511901271093727
040 ▼a MiAaPQ ▼c MiAaPQ ▼d 248032
0491 ▼f DP
0820 ▼a 004
1001 ▼a Roychowdhury, Anirban.
24510 ▼a Robust and Scalable Algorithms for Bayesian Nonparametric Machine Learning.
260 ▼a [S.l.] : ▼b The Ohio State University, ▼c 2017
260 1 ▼a Ann Arbor : ▼b ProQuest Dissertations & Theses, ▼c 2017
300 ▼a 188 p.
500 ▼a Source: Dissertation Abstracts International, Volume: 79-10(E), Section: B.
500 ▼a Adviser: Srinivasan Parthasarathy.
5021 ▼a Thesis (Ph.D.)--The Ohio State University, 2017.
520 ▼a Bayesian nonparametric techniques provide a rich set of tools for modeling complex probabilistic machine learning problems. However, the richness comes at the cost of significant complexity of learning and inference for large-scale datasets, in a
520 ▼a First, we develop fast inference algorithms for sequential models with Bayesian nonparametric priors using small-variance asymptotics, an emerging technique for obtaining scalable combinatorial algorithms from rich probabilistic models. We deriv
520 ▼a We start the second section with a novel stick-breaking definition of a certain class of Bayesian nonparametric priors called gamma processes (GP), using their characterization as completely random measures and the attendant Poisson process machinery
520 ▼a In the third section, we use concepts from statistical physics to develop a robust Monte Carlo sampler that efficiently traverses the parameter space. Built on the Hamiltonian Monte Carlo framework, our sampler uses a modified Nosé-Poincaré Hamiltonian
520 ▼a We continue with an L-BFGS optimization algorithm on Riemannian manifolds that uses stochastic variance reduction techniques for fast convergence with constant step sizes, without resorting to standard line search methods, and provide a new convergence
520 ▼a We finish with a novel technique for learning the mass matrices in Monte Carlo samplers obtained from discretized dynamics that preserve some energy function, by using existing dynamics in the sampling step of a Monte Carlo EM framework, and lea
590 ▼a School code: 0168.
650 4 ▼a Computer science.
690 ▼a 0984
71020 ▼a The Ohio State University. ▼b Computer Science and Engineering.
7730 ▼t Dissertation Abstracts International ▼g 79-10B(E).
773 ▼t Dissertation Abstracts International
790 ▼a 0168
791 ▼a Ph.D.
792 ▼a 2017
793 ▼a English
85640 ▼u http://www.riss.kr/pdu/ddodLink.do?id=T15000214 ▼n KERIS
980 ▼a 201812 ▼f 2019
990 ▼a Administrator ▼b Administrator