Think! Evidence

Bayesian Statistics as an Alternative to Gradient Descent in Sequence Learning


dc.creator R. Spiegel
dc.date 2007-09-01T00:00:00Z
dc.date.accessioned 2015-07-20T22:16:20Z
dc.date.available 2015-07-20T22:16:20Z
dc.identifier 1863-0383
dc.identifier https://doaj.org/article/c6f02dbe98f54ab48dcbee517cc0c4a3
dc.identifier.uri http://evidence.thinkportal.org/handle/123456789/18399
dc.description Recurrent neural networks are frequently applied to simulate sequence learning tasks such as language processing and sensory-motor learning. For this purpose, they often use a truncated gradient descent (error-correcting) learning algorithm. To converge on a solution congruent with a target set of sequences, many iterations of sequence presentations and weight adjustments are typically needed. Moreover, there is no guarantee of finding the global minimum of error in the multidimensional error landscape that results from the discrepancy between the target values and the network’s predictions. This paper presents a new approach that infers the global error minimum right from the start and then uses this information to reverse-engineer the weights. As a consequence, learning is sped up tremendously, while computationally expensive iterative training trials can be skipped. Technology applications in established and emerging industries are discussed. (An illustrative sketch contrasting the two approaches appears below this record.)
dc.language English
dc.publisher International Association of Online Engineering (IAOE)
dc.relation http://www.online-journals.org/index.php/i-jet/article/view/100/75
dc.relation https://doaj.org/toc/1863-0383
dc.rights CC BY
dc.source International Journal of Emerging Technologies in Learning (iJET), Vol 2, Iss 3 (2007)
dc.subject Gaussian processes
dc.subject Error-correction
dc.subject Bayes theorem
dc.subject Sequential learning
dc.subject Recurrent neural networks
dc.subject Technology (General)
dc.subject T1-995
dc.subject Technology
dc.subject T
dc.subject DOAJ:Technology (General)
dc.subject DOAJ:Technology and Engineering
dc.subject Theory and practice of education
dc.subject LB5-3640
dc.subject Education
dc.subject L
dc.subject DOAJ:Education
dc.subject DOAJ:Social Sciences
dc.title Bayesian Statistics as an Alternative to Gradient Descent in Sequence Learning
dc.type article
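
Illustrative sketch. The abstract contrasts iterative, truncated gradient descent with a Bayesian route to the error minimum. The code below is not the paper's algorithm (the full text is linked under dc.relation); it is a minimal sketch showing, on a toy sequence-prediction task, how a Gaussian-prior (Bayesian) posterior mean for the weights can be obtained in a single linear solve, whereas gradient descent needs many iterations. The task setup, window size, learning rate, and the precision parameters alpha and beta are assumptions made purely for this illustration.

# Minimal sketch (illustrative assumptions, not the paper's method):
# iterative gradient descent vs. a closed-form Bayesian weight estimate
# for a toy sequence-prediction task.
import numpy as np

rng = np.random.default_rng(0)

# Toy sequence task: predict x[t+1] from the previous 3 samples of a noisy sine wave.
T = 200
x = np.sin(np.linspace(0, 8 * np.pi, T)) + 0.05 * rng.standard_normal(T)
window = 3
X = np.stack([x[i:i + window] for i in range(T - window)])  # inputs
y = x[window:]                                              # targets

# --- Iterative gradient descent (error-correcting) ---------------------------
w_gd = np.zeros(window)
lr = 0.01
for epoch in range(500):                   # many passes over the data
    grad = X.T @ (X @ w_gd - y) / len(y)   # gradient of the mean squared error
    w_gd -= lr * grad

# --- Bayesian alternative: closed-form posterior mean ------------------------
# With a Gaussian prior w ~ N(0, alpha^-1 I) and Gaussian noise of precision beta,
# the posterior mean of the weights follows from one linear solve, so the
# minimum-error solution is inferred directly rather than iterated toward.
alpha, beta = 1e-3, 1.0                    # assumed prior and noise precisions
A = alpha * np.eye(window) + beta * X.T @ X
w_bayes = beta * np.linalg.solve(A, X.T @ y)

print("gradient-descent MSE:      ", np.mean((X @ w_gd - y) ** 2))
print("Bayesian closed-form MSE:  ", np.mean((X @ w_bayes - y) ** 2))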


Files in this item

There are no files associated with this item.