Log-linear models, extensions, and applications / edited by Aleksandr Aravkin [and six others].
2018
QA278 .L64 2018eb
Details
ISBN
9780262351607 (electronic bk.)
0262351609 (electronic bk.)
9780262039505
Published
Cambridge : MIT Press, 2018.
Language
English
Description
1 online resource (214 pages).
Dewey Decimal Classification
519.5/35
Summary
Log-linear models play a key role in modern big data and machine learning applications. From simple binary classification models through partition functions, conditional random fields, and neural nets, log-linear structure is closely related to performance in certain applications and influences the fitting techniques used to train models. This volume covers recent advances in training models with log-linear structure, covering the underlying geometry, optimization techniques, and multiple applications. The first chapter shows readers the inner workings of machine learning, providing insights into the geometry of log-linear and neural net models. The other chapters range from introductory material to optimization techniques to involved use cases. The book, which grew out of a NIPS workshop, is suitable for graduate students doing research in machine learning, in particular deep learning, variable selection, and applications to speech recognition. The contributors come from academia and industry, allowing readers to view the field from both perspectives.
Access Note
Access limited to authorized users.
Source of Description
OCLC-licensed vendor bibliographic record.