    Training recurrent networks by Evolino.

    Description

    Title : Training recurrent networks by Evolino.
    Author(s) : Jürgen Schmidhuber, Daan Wierstra, Matteo Gagliolo, Faustino Gomez
    Abstract : In recent years, gradient-based LSTM recurrent neural networks (RNNs) solved many previously RNN-unlearnable tasks. Sometimes, however, gradient information is of little use for training RNNs, due to numerous local minima. For such cases, we present a novel method: EVOlution of systems with LINear Outputs (Evolino). Evolino evolves weights to the nonlinear, hidden nodes of RNNs while computing optimal linear mappings from hidden state to output, using methods such as pseudo-inverse-based linear regression. If we instead use quadratic programming to maximize the margin, we obtain the first evolutionary recurrent support vector machines. We show that Evolino-based LSTM can solve tasks that Echo State nets (Jaeger, 2004a) cannot and achieves higher accuracy in certain continuous function generation tasks than conventional gradient descent RNNs, including gradient-based LSTM.
    Keywords : animals, artificial intelligence, humans, memory, models, neurological, neural networks (computer), neurons, neurons physiology, nonlinear dynamics, serial learning, serial learning physiology, signal processing, computer assisted

    Subject : unspecified
    Area : Other
    Language : English
    Year : 2007

    Affiliations : CoMo, VUB, Brussels
    Journal : Neural Computation
    Volume : 19
    Issue : 3
    Publisher : MIT Press
    Pages : 757-779
    Url : http://www.ncbi.nlm.nih.gov/pubmed/17298232
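
    The abstract's central computation, fitting the linear readout of an evolved RNN by pseudo-inverse-based linear regression, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the helper run_hidden (returning the hidden activations of one evolved network) and the array shapes are assumptions.

# Minimal sketch of Evolino's analytic output step (illustrative only).
import numpy as np

def optimal_output_weights(hidden_states, targets):
    """Least-squares linear readout via the Moore-Penrose pseudo-inverse.

    hidden_states : (T, H) nonlinear hidden activations collected while
                    running the evolved RNN over the training sequence.
    targets       : (T, D) desired outputs.
    Returns the (H, D) output matrix minimising squared error.
    """
    return np.linalg.pinv(hidden_states) @ targets

def evaluate_individual(run_hidden, inputs, targets):
    """Fitness of one evolved set of hidden weights: collect hidden
    states, fit the readout analytically, and score the residual error."""
    H = run_hidden(inputs)                    # (T, H) hidden activations
    W = optimal_output_weights(H, targets)    # analytic linear mapping
    residual = H @ W - targets
    return float(np.mean(residual ** 2))      # lower is better

    Replacing the least-squares fit with a margin-maximising quadratic program would yield the recurrent support vector machine variant mentioned in the abstract.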
