Using a Multi-Layer Recurrent Neural Network for language models

February 22, 2018 - Reading time: ~1 minute

Here is another example of how to use a multi-layer recurrent neural network (the RNN package) designed for character-level language models. The network was trained on more than 165,000 real titles of bills submitted to Congress, collected from CONGRESS.GOV. Training was performed on a GPU, and the trained RNN was then used to generate "fake" titles. Use this link to see whether you can tell which bill title is real and which was created by the RNN.
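The generation step described above (sampling one character at a time from a recurrent network) can be sketched as follows. This is a minimal illustration, not the actual trained model: the weights are random, and the vocabulary and layer sizes are assumptions chosen only to show how the sampling loop feeds each predicted character back in as the next input.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = list("abcdefghijklmnopqrstuvwxyz .")  # assumed character set
V, H = len(vocab), 32                         # vocab size, hidden size

# Randomly initialized RNN parameters; a trained model would learn these.
Wxh = rng.normal(0, 0.1, (H, V))
Whh = rng.normal(0, 0.1, (H, H))
Why = rng.normal(0, 0.1, (V, H))
bh, by = np.zeros(H), np.zeros(V)

def sample(seed_char, n):
    """Generate n characters, feeding each sampled character back as input."""
    x = np.zeros(V)
    x[vocab.index(seed_char)] = 1
    h = np.zeros(H)
    out = []
    for _ in range(n):
        h = np.tanh(Wxh @ x + Whh @ h + bh)      # recurrent state update
        p = np.exp(Why @ h + by)
        p /= p.sum()                             # softmax over next character
        idx = rng.choice(V, p=p)
        out.append(vocab[idx])
        x = np.zeros(V)
        x[idx] = 1                               # next input is the sample
    return "".join(out)

title = sample("a", 40)
print(title)
```

With trained weights, the same loop produces plausible-looking bill titles; with random weights, as here, it simply emits noise, but the mechanics of character-level generation are identical.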

This example of the RNN package is provided by Jahred Adelman (NIU).
