  • posted a message on Generating Magic cards using deep, recurrent neural networks
    Yet another interesting paper was released recently: this one, from Alex Graves, introduces a new type of LSTM that provides the benefits of extra memory without increasing the number of parameters. It seems quite promising.
    Posted in: Custom Card Creation
  • posted a message on Generating Magic cards using deep, recurrent neural networks
    Quote from mwchase »
    Dangit. I'm sure I remember someone in this thread mentioning something about a system for asking questions that structurally analyzed the sentence, then used it to put together a tree of neural nets, where each one is meant to address part of the question in some way. (Or maybe I saw it elsewhere and assumed it was here?)
    Hmmm, the closest paper I can think of is this one on dynamic memory networks. Does that sound right?
    Posted in: Custom Card Creation
  • posted a message on Generating Magic cards using deep, recurrent neural networks
    Google released this paper comparing top language models. It's an interesting read.
    Posted in: Custom Card Creation
  • posted a message on Generating Magic cards using deep, recurrent neural networks
    Quote from Talcos »
    I had to do some rewiring of the image code to get it to behave as I wanted, and in the process I ran into a crippling bug in the underlying machine learning library. It was keeping me from making further progress with the image generation stuff for several days. Fortunately, a fix came out for the bug just the other day (just in time!). But things still aren't working the way they should.

    The good news is that other people have been putting out implementations of the algorithms using other libraries (the one I was trying to use was written in Theano...
    What was the bug and what was the fix? As an avid Theano user, I'd find knowing this a huge help!
    Posted in: Custom Card Creation
  • posted a message on Generating Magic cards using deep, recurrent neural networks
    A nice long overview of deep learning by Hinton, Bengio, and LeCun can be found here.
    Posted in: Custom Card Creation
  • posted a message on Generating Magic cards using deep, recurrent neural networks
    Quote from Talcos »
    Quote from Tiir719 »

    EDIT 2: In order to use that specific library, we would have to check and make sure that every word in the mtg corpus has a GloVe vector, though it shouldn't be a difficult thing to work around regardless. A cool thing about this library is that the word vectors are also being updated during training so that they are fine-tuned for our task


    What do you mean? I figured we'd train a fresh GloVe model based on the mtg corpus rather than using a pre-trained model, just like we did with word2vec.
    A common practice in NLP is to initialize with GloVe or word2vec vectors, since they're already trained on a huge corpus. From there we could continue training our vectors solely on the mtg corpus to fine-tune them if we'd like. Since the vectors can keep training during the training of our net, that middle fine-tuning step may be superfluous. However, given how much the mtg corpus differs from regular English, initializing to pre-trained GloVe/word2vec might not benefit us as much as it usually would.
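    To make that concrete, here's a minimal sketch in plain numpy of seeding an embedding matrix from pre-trained vectors and leaving it trainable. The toy words and vectors stand in for a real GloVe file and the mtg vocabulary; none of this is our actual code.

        import numpy as np

        EMBED_DIM = 4  # toy size; real GloVe vectors are 50-300 dimensions

        # In practice this dict would be parsed from a pretrained GloVe text file
        # (one "word v1 v2 ..." entry per line); hard-coded here so the sketch runs.
        glove = {
            "creature": np.array([0.1, 0.2, 0.3, 0.4], dtype="float32"),
            "flying":   np.array([0.5, 0.1, 0.0, 0.2], dtype="float32"),
        }

        def build_embedding_matrix(vocab, pretrained, dim, seed=0):
            """Seed rows from pretrained vectors where available, random otherwise
            (mtg-specific words like 'tromple' won't be covered). The matrix is then
            handed to the net as a trainable parameter, so the vectors keep
            fine-tuning on the mtg corpus during training."""
            rng = np.random.RandomState(seed)
            W = rng.uniform(-0.05, 0.05, (len(vocab), dim)).astype("float32")
            for i, word in enumerate(vocab):
                if word in pretrained:
                    W[i] = pretrained[word]
            return W

        vocab = ["creature", "flying", "tromple"]
        embedding_W = build_embedding_matrix(vocab, glove, EMBED_DIM)
        print(embedding_W.shape)  # (3, 4): one row per vocabulary word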
    Posted in: Custom Card Creation
  • posted a message on Generating Magic cards using deep, recurrent neural networks
    Quote from maplesmall »
    Weren't we talking a while back about generating cards word-by-word rather than letter-by-letter? I found someone who's already implemented this in a modified version of Karpathy's code and used it to generate clickbait titles (21 MTG Cards Generated By Machines That May Shock You!). Any chance we could use this for our purposes?
    There are pros and cons to doing it word-by-word compared to character-by-character models. Opening up to word-by-word would allow for some natural language processing tricks which may be useful. However, like Talcos said, it would mean losing the ability for the net to create and use novel words (no more tromple). I definitely think it's worth a try to see how it differs.

    Quote from maplesmall »
    Couldn't we still prime the word-level network with words?
    We could still prime with words, but only with words that exist in the corpus.

    EDIT: I see Talcos beat me to the punch on answering your questions!

    EDIT 2: In order to use that specific library, we would have to check and make sure that every word in the mtg corpus has a GloVe vector, though it shouldn't be a difficult thing to work around regardless. A cool thing about this library is that the word vectors are also being updated during training so that they are fine-tuned for our task
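    On that coverage point, the check itself would be cheap. A rough sketch, with a toy sentence and word set standing in for the real mtg corpus and the GloVe vocabulary:

        import re

        def glove_coverage(corpus_text, glove_words):
            """Report how many corpus tokens have a pretrained vector and which don't,
            so we know how many words would need random initialization instead."""
            tokens = set(re.findall(r"[a-z']+", corpus_text.lower()))
            missing = sorted(tokens - glove_words)
            return len(tokens) - len(missing), missing

        # Toy stand-ins; glove_words would really be the key set of the parsed GloVe file
        corpus = "Flying creatures with trample and banding"
        glove_words = {"flying", "creatures", "with", "trample", "and"}

        covered, missing = glove_coverage(corpus, glove_words)
        print(covered, missing)  # 5 ['banding']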
    Posted in: Custom Card Creation
  • posted a message on Generating Magic cards using deep, recurrent neural networks
    Quote from Talcos »
    First, congratulations for the new job and for your engagement!

    As for your idea, that's a very interesting way of approaching the problem. So, as a generative model, would it be like sketching out the idea for the card first, and then making a second pass to hone in on the fine details? If so, that'd be fun because it'd force the second net to work within the constraints set by the first net.
    Thank you very much! One exciting thing about the new job is that half of what I'll be doing is natural language processing, so this project will help my work and my work will help this project.

    And yes, that's the idea. It's (very roughly) similar to the idea in this paper, where they train a net to locate people's joints in images. First they make a "coarse" prediction with a convnet, then take some of the hidden states from that net and use them in a "precise" convnet. The "coarse" net gets the general location (in our case, the general idea for each of the fields), and then the "precise" net has a much easier time of zeroing in (in our case, making sure all the fields agree). As for specifics: for training, both nets try to predict the card (though the second net is weighted higher than the first), and for generating, we run the first net normally until it generates a card, then run the second net normally (making sure it gets the same first character as before) until it generates a card. For "whispering" to the net, we just whisper in both parts.
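    For anyone trying to picture the wiring, here's a minimal plain-numpy sketch of the two-pass structure with toy sizes and vanilla RNN cells; it only illustrates the shape of the idea, not any actual project code.

        import numpy as np

        rng = np.random.RandomState(0)
        IN, HID = 8, 16                              # toy sizes
        p = lambda *s: (rng.randn(*s) * 0.1).astype("float32")

        # Coarse net: reads the card and makes a rough prediction of each field.
        W1_xh, W1_hh = p(IN, HID), p(HID, HID)
        # Precise net: also sees the coarse net's hidden state at each step,
        # so it can clean up the rough sketch and make the fields agree.
        W2_xh, W2_hh, W2_ch = p(IN, HID), p(HID, HID), p(HID, HID)

        x_seq = rng.randn(20, IN).astype("float32")  # stand-in for one encoded card

        h1 = np.zeros(HID, "float32")
        coarse_states = []
        for x in x_seq:                              # pass 1: coarse net alone
            h1 = np.tanh(x @ W1_xh + h1 @ W1_hh)
            coarse_states.append(h1)

        h2 = np.zeros(HID, "float32")
        for x, c in zip(x_seq, coarse_states):       # pass 2: precise net, conditioned
            h2 = np.tanh(x @ W2_xh + h2 @ W2_hh + c @ W2_ch)

        # In training, both nets would predict the card (the precise one weighted higher);
        # in generation, the coarse net samples a card first, then the precise net refines it.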
    Posted in: Custom Card Creation
  • posted a message on Generating Magic cards using deep, recurrent neural networks
    Lately I've been super busy with big (but exciting!) changes, including a new job and getting engaged, but pretty much every free chance I get I've been thinking about the problem/idea of training a net to read the card forwards and backwards while still being generative. Recently, though, I've been taking a new approach, and I wanted to see what you guys think. Let's say you're reading a book which has the following sentence: "The hen house is red". As you read it, each word changes your understanding. You read "The hen" and get an image of a hen, because at this point you think it's a noun. However, once you get to "The hen house" you realize "hen" is an adjective, and so the meaning is different. If by the time you finish the sentence you're confused, you wouldn't read it backwards ("red is house hen The", ah, now it makes perfect sense), but instead you'd read it forwards again while keeping in mind what you thought the first time you read it.

    So basically the idea is this: Take the normal net (net1) structure and do a forward pass. Then do the same thing with a different net (net2), except the hidden states in net2 take the final hidden state from net1 as additional input. Thoughts?
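    In code, the wiring I'm picturing would look something like this (a toy plain-numpy forward pass with vanilla RNN cells, purely to show the structure):

        import numpy as np

        rng = np.random.RandomState(1)
        IN, HID = 8, 16                              # toy sizes
        p = lambda *s: (rng.randn(*s) * 0.1).astype("float32")

        W1_xh, W1_hh = p(IN, HID), p(HID, HID)                        # net1: ordinary forward read
        W2_xh, W2_hh, W2_ctx = p(IN, HID), p(HID, HID), p(HID, HID)   # net2: the re-read

        x_seq = rng.randn(20, IN).astype("float32")  # stand-in for one encoded card

        h1 = np.zeros(HID, "float32")
        for x in x_seq:                              # first read-through
            h1 = np.tanh(x @ W1_xh + h1 @ W1_hh)
        context = h1                                 # "what you thought the first time"

        h2 = np.zeros(HID, "float32")
        for x in x_seq:                              # second read-through, same direction,
            h2 = np.tanh(x @ W2_xh + h2 @ W2_hh + context @ W2_ctx)  # with that context mixed in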
    Posted in: Custom Card Creation
  • posted a message on Generating Magic cards using deep, recurrent neural networks
    Quote from Talcos »
    The results don't have to be perfect because we can apply a style transfer afterwards; we can smooth over minor irregularities when we bring the image in line with the style we want.
    Just speculating here, but I'm pretty sure we could do the image creation and style transfer at the same time. One could guide the image creation to match a specified style (if we were hallucinating the content) or content (if we were hallucinating the style).
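    Roughly what I have in mind is a single objective in the spirit of the style-transfer work: one term matching content features, one matching style statistics, optimized together. A toy numpy sketch, with random arrays standing in for convnet activations and alpha/beta chosen arbitrarily:

        import numpy as np

        def gram(features):
            """Style statistics: the Gram matrix of a layer's feature maps."""
            C, H, W = features.shape
            F = features.reshape(C, H * W)
            return F @ F.T / (C * H * W)

        def combined_loss(gen_feats, content_feats, style_feats, alpha=1.0, beta=1e3):
            """One objective that pulls the generated image toward the content of one
            source and the style (Gram statistics) of another, so hallucination and
            style transfer happen in a single optimization instead of two passes."""
            content_loss = np.mean((gen_feats - content_feats) ** 2)
            style_loss = np.mean((gram(gen_feats) - gram(style_feats)) ** 2)
            return alpha * content_loss + beta * style_loss

        # Toy arrays standing in for convnet activations of the three images
        rng = np.random.RandomState(0)
        gen, content, style = (rng.randn(4, 8, 8) for _ in range(3))
        print(combined_loss(gen, content, style))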
    Posted in: Custom Card Creation
  • posted a message on Generating Magic cards using deep, recurrent neural networks
    Quote from mwchase »
    Ugh. I'm getting results, technically, but Theano is really jacking the iteration time up.[...]Switch to float32? Too long to tell.
    Theano can only run on the GPU if you're using float32. Are you using any Theano flags when you run python/ipython? Theano will use your CPU by default otherwise.
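    For reference, this is the sort of invocation I mean (the script name is just a placeholder; the same settings can also go in ~/.theanorc):

        # Run as:  THEANO_FLAGS='device=gpu,floatX=float32' python train.py
        # Without floatX=float32, expressions end up in float64 and run on the CPU.
        import theano
        print(theano.config.device, theano.config.floatX)  # sanity-check what Theano picked up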
    Posted in: Custom Card Creation
  • posted a message on Generating Magic cards using deep, recurrent neural networks
    Anyway, I came here today to say that I came up with this idea of using the algorithm you guys have used, to apply in that card game called Hearthstone.
    There is a popular reddit post here on that very topic that is a fun read.

    As you have said, Hearthstone has a cripplingly tiny card pool for this kind of approach (681 if you count tokens and non-collectibles). It just isn't enough for there to be noticeable trends (more so now, thanks to the addition of cards that are strictly better than previous cards). That's not to say it won't work; it's just a much more difficult problem. As an avid Hearthstone player, I hope you get it to succeed!
    Posted in: Custom Card Creation
  • posted a message on Generating Magic cards using deep, recurrent neural networks
    Quote from mwchase »
    I could certainly try it out.
    I'd recommend it to anyone with Python knowledge who is interested in jumping into machine learning. It has a bit of a steep learning curve, but I've had a ton of luck with it. There are also many really great libraries built on top of it that make it more user-friendly, my favorite being Lasagne.
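    Just to give a flavor of what Lasagne looks like on top of Theano (the layer sizes here are arbitrary, not from any real model):

        import theano
        import theano.tensor as T
        import lasagne

        x = T.matrix('x')
        # Lasagne layers build the underlying Theano graph for you
        l_in = lasagne.layers.InputLayer(shape=(None, 100), input_var=x)
        l_hid = lasagne.layers.DenseLayer(l_in, num_units=64,
                                          nonlinearity=lasagne.nonlinearities.rectify)
        l_out = lasagne.layers.DenseLayer(l_hid, num_units=10,
                                          nonlinearity=lasagne.nonlinearities.softmax)

        prediction = lasagne.layers.get_output(l_out)   # symbolic expression
        predict_fn = theano.function([x], prediction)   # compiled, callable function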
    Posted in: Custom Card Creation
  • posted a message on Generating Magic cards using deep, recurrent neural networks
    mwchase, I know this is a bit late, but is what you're doing similar to Theano? Symbolic math, graph optimization, etc.?
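    By that I mean the kind of thing below, where you build a symbolic expression and Theano optimizes and compiles the graph for you:

        import theano
        import theano.tensor as T

        x = T.dscalar('x')                    # symbolic scalar, no value yet
        y = x ** 2 + 3 * x                    # builds a computation graph; nothing runs here
        dy_dx = T.grad(y, x)                  # symbolic differentiation of that graph
        f = theano.function([x], [y, dy_dx])  # graph optimization + compilation happen here
        print(f(2.0))                         # [array(10.0), array(7.0)]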
    Posted in: Custom Card Creation
  • posted a message on Generating Magic cards using deep, recurrent neural networks
    Quote from Talcos »
    EDIT(2): Rerunning things. But I have a suspicion that something is wrong. I think it may have something to do with the fact that I try doing some weight sharing and those weights get unshared when I do a cast. But we'll see.
    Which weights are you sharing?
    Posted in: Custom Card Creation