LDR | | 01651nmm uu200385 4500 |
001 | | 000000333838 |
005 | | 20240805174631 |
008 | | 181129s2018 |||||||||||||||||c||eng d |
020 | |
▼a 9780438049475 |
035 | |
▼a (MiAaPQ)AAI10823658 |
035 | |
▼a (MiAaPQ)princeton:12599 |
040 | |
▼a MiAaPQ
▼c MiAaPQ
▼d 248032 |
082 | 0 |
▼a 519 |
100 | 1 |
▼a Wang, Yao. |
245 | 10 |
▼a Estimation Error for Regression and Optimal Convergence Rate. |
260 | |
▼a [S.l.] :
▼b Princeton University,
▼c 2018 |
260 | 1 |
▼a Ann Arbor :
▼b ProQuest Dissertations & Theses,
▼c 2018 |
300 | |
▼a 62 p. |
500 | |
▼a Source: Dissertation Abstracts International, Volume: 79-10(E), Section: B. |
500 | |
▼a Adviser: Weinan E. |
502 | 1 |
▼a Thesis (Ph.D.)--Princeton University, 2018. |
520 | |
▼a In this thesis, we study the optimal convergence rate for the universal estimation error. Let F be the excess loss class associated with the hypothesis space and n the size of the data set; we prove that if the Fat-shattering dimension satisf |
520 | |
▼a In practice, training may explore only a certain subspace of F, so it is useful to bound the complexity of the explored subspace rather than that of the whole class F. This is done for the gradient descent method. |
590 | |
▼a School code: 0181. |
650 | 4 |
▼a Applied mathematics. |
690 | |
▼a 0364 |
710 | 20 |
▼a Princeton University.
▼b Mathematics. |
773 | 0 |
▼t Dissertation Abstracts International
▼g 79-10B(E). |
773 | |
▼t Dissertation Abstracts International |
790 | |
▼a 0181 |
791 | |
▼a Ph.D. |
792 | |
▼a 2018 |
793 | |
▼a English |
856 | 40 |
▼u http://www.riss.kr/pdu/ddodLink.do?id=T14998591
▼n KERIS |
980 | |
▼a 201812
▼f 2019 |
990 | |
▼a Administrator |