LDR | | 02042nmm uu200421 4500 |
001 | | 000000334202 |
005 | | 20240805175330 |
008 | | 181129s2018 |||||||||||||||||c||eng d |
020 | |
▼a 9780438178175 |
035 | |
▼a (MiAaPQ)AAI10828817 |
035 | |
▼a (MiAaPQ)washington:18951 |
040 | |
▼a MiAaPQ
▼c MiAaPQ
▼d 248032 |
082 | 0 |
▼a 004 |
100 | 1 |
▼a Jaech, Aaron. |
245 | 10 |
▼a Low-Rank RNN Adaptation for Context-Aware Language Modeling. |
260 | |
▼a [S.l.] :
▼b University of Washington,
▼c 2018 |
260 | 1 |
▼a Ann Arbor :
▼b ProQuest Dissertations & Theses,
▼c 2018 |
300 | |
▼a 124 p. |
500 | |
▼a Source: Dissertation Abstracts International, Volume: 79-12(E), Section: B. |
500 | |
▼a Adviser: Mari Ostendorf. |
502 | 1 |
▼a Thesis (Ph.D.)--University of Washington, 2018. |
520 | |
▼a A long-standing weakness of statistical language models is that their performance drastically degrades if they are used on data that varies even slightly from the data on which they were trained. In practice, applications require the use of adap |
520 | |
▼a The current standard approach to recurrent neural network language model adaptation is to apply a simple linear shift to the recurrent and/or output layer bias vector. Although this is helpful, it does not go far enough. This thesis introduces a |
520 | |
▼a In our experiments on several different datasets and multiple types of context, the increased adaptation of the recurrent layer is always helpful, as measured by perplexity, the standard metric for evaluating language models. We also demonstrate impact |
590 | |
▼a School code: 0250. |
650 | 4 |
▼a Computer science. |
650 | 4 |
▼a Statistics. |
690 | |
▼a 0984 |
690 | |
▼a 0463 |
710 | 20 |
▼a University of Washington.
▼b Electrical Engineering. |
773 | 0 |
▼t Dissertation Abstracts International
▼g 79-12B(E). |
773 | |
▼t Dissertation Abstracts International |
790 | |
▼a 0250 |
791 | |
▼a Ph.D. |
792 | |
▼a 2018 |
793 | |
▼a English |
856 | 40 |
▼u http://www.riss.kr/pdu/ddodLink.do?id=T14999220
▼n KERIS |
980 | |
▼a 201812
▼f 2019 |
990 | |
▼a Administrator |