Quote from helpiminabox »
Are there any speed improvements to the new framework?

Yes, absolutely. The new language model library is based on this code on GitHub. It's a direct successor to char-rnn, and if you scroll down to the bottom of the readme, there's some pretty detailed benchmark information showing how much better it is.
My clone of the code can be found here. Be sure to look at the 'dev' branch; master is currently tied up with a pull request. The neural net code is the same, but I've developed additional code that allows much more sophisticated training input and sampling output on Linux. I'll add a tutorial explaining how the new functionality works once I'm ready for a semi-official release together with mtgencode, hopefully this weekend.