MIT Twitterbot postdoc: "The basis of my approach is similar to existing work that can simulate Shakespeare."

Here is the citation:
"The Unreasonable Effectiveness of Recurrent Neural Networks"

May 21, 2015

(Scroll down to the Shakespeare example:)


"It looks like we can learn to spell English words. But how about if there is more structure and style in the data? To examine this I downloaded all the works of Shakespeare and concatenated them into a single (4.4MB) file. We can now afford to train a larger network, in this case lets try a 3-layer RNN with 512 hidden nodes on each layer. After we train the network for a few hours we obtain samples such as: (please refer to above link)"