It's just that I don't want to have my Wikipedia page years from now say "Talcos was instrumental in the development of murderbots, which overtook malaria as the leading cause of infant mortality in West Africa in the early 2030s." Not my cup of tea.
On the bright side, malaria is no longer a problem for West African infants. So your Wikipedia page could read: "Talcos's murderbots were instrumental in reducing deaths from malaria among infants in West Africa."
Currently, we have a website dedicated to generating cards (croxis's). Is anyone working on a site for generating sets?
Not yet, as far as I know. However, thanks to maplesmall, we can now import dumps of cards directly into MSE, which makes it much easier to construct sets by hand. I think it may be worthwhile for us to return to the topic once we can automate set construction entirely, instead of the semi-automated approach I implemented. It's definitely doable on the balancing and tweaking side of things; we just need theme construction, and there are several works out there that have tackled similar problems with varying degrees of success.
I know I brought up a grim subject, but I literally laughed out loud at that. Thank you.
There are actually several matters that I'm not at liberty to discuss here, but what I can say is that there are people out there who can put a cheery spin on just about anything. It's a phenomenal thing to witness.
@Hardcast I'm asking this before I reinvent the wheel on my side: I guess you'll want to be able to change your input format and still have statistics and scripts (to detect errors or boring cards) working too? But what will your format for that be?
I'm taking a break from active development; don't expect me to put up anything new for a little while. Though I do have some big dumps coming out of the latest randomized field order network, I'll try to post MSE sets of those (and hopefully with the card similarity / creativity numbers enabled).
The code is designed so that it's easy to change the format. For example, if you want to tweak the ordering of the fields, all you have to do is modify the array that controls what order they go in. As long as you're consistent within your code, it should be easy to reuse most of what's there with any variation of the format. If anybody out there is doing fun experiments based on mtgencode that they'd like to share, I'd happily merge pull requests to put tools or alternative formatters into 'scripts'. Some form of checker / validator should eventually live in there for sure.
As for mtg-rnn, the customized batcher is feature complete as far as randomizing the training set in a nice way, but I haven't implemented one-card-at-a-time mode yet. That's anticipated to be very slow, so I don't really have much motivation to work on it until I feel like running code on my GPU for a solid week.
I am also considering some more significant changes to the format concerning the way I handle card text, but again, I won't implement that until I have some dedicated time to test.
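To make the field-order point concrete, here's a minimal sketch in Python. The field names, separator, and `encode_card` function are illustrative stand-ins, not mtgencode's actual internals:

```python
# Hypothetical sketch of a configurable field order, in the spirit of the
# array described above. Field names and the '|' separator are
# illustrative stand-ins, not mtgencode's actual internals.
FIELD_ORDER = ['name', 'supertypes', 'types', 'subtypes',
               'rarity', 'pt', 'cost', 'text']

def encode_card(card, field_order=FIELD_ORDER, sep='|'):
    """Join a card's fields in the configured order; missing fields
    become empty segments so positions stay consistent."""
    return sep + sep.join(str(card.get(f, '')) for f in field_order) + sep

card = {'name': 'giant spider', 'types': 'creature', 'subtypes': 'spider',
        'cost': '{3}{G}', 'pt': '2/4', 'text': 'reach'}
print(encode_card(card))
# -> |giant spider||creature|spider||2/4|{3}{G}|reach|
```

Reordering the format is then just reordering one list, which is why staying consistent within your own code is the only real requirement.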
I've been thinking about this lately - I'm intrigued by the possibility of a generate-filter-regenerate approach using multiple networks.
The simplest approach would be to start with a maximally stable network for initial generation, probably one that doesn't include cardnames and uses whatever fixed field order we can get the most consistent cards out of. Then we filter for cards that we like to fill the set skeleton, and simply abuse the random field order network to name them with priming. If the filters were sufficiently good, we could do other clever things like regenerate manacosts or types that the filter believed were inappropriate. Then all we need is some art.
I'd love to write a driver that coordinates this across mtgencode and mtg-rnn. The biggest development roadblocks are need for design skeletons and effective filtering scripts.
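A driver like that could be sketched roughly as follows. Every name here is a placeholder; neither mtgencode nor mtg-rnn currently exposes a `generate()`/`accept()` interface like this:

```python
# Rough sketch of the generate-filter-regenerate driver described above.
# Every name here is a placeholder: neither mtgencode nor mtg-rnn
# currently exposes generate()/accept() functions like these.

def fill_skeleton(skeleton, generate, accept, max_attempts=1000):
    """Fill each slot in a design skeleton with the first generated card
    the slot's filter accepts; give up after max_attempts total draws."""
    filled, attempts = {}, 0
    for slot, wanted in skeleton.items():
        while attempts < max_attempts:
            attempts += 1
            card = generate()
            if accept(card, wanted):
                filled[slot] = card
                break
    return filled

# Toy stand-ins: "generation" walks a canned list, and the filter just
# checks a color field.
def make_generator(cards):
    it = iter(cards)
    return lambda: next(it)

skeleton = {'common_white_1': 'W', 'common_blue_1': 'U'}
pool = [{'color': 'U'}, {'color': 'W'}, {'color': 'U'}]
result = fill_skeleton(skeleton, make_generator(pool),
                       lambda card, want: card['color'] == want)
```

A real `accept()` is where all the interesting work would live (manacost plausibility, curve slots, and so on); a second pass could then hand the survivors to the random-field-order network for naming.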
Very interesting ideas. Just so you know, there's a possibility that we might get a workshop or conference paper out of this down the road. At the beginning I hadn't given the idea much thought; we were mostly just taking existing ideas and fine-tuning them for our applications, which comes across as more of an engineering task than a research one. But as we move forward, we start getting into lots of unexplored, untamed territory, and Magic seems like a perfectly good vehicle for that exploration. Interestingly enough, there actually have been many worthwhile publications in the fields of computer science, mathematics, sociology, and economics (to name a few) that center on Magic: The Gathering. After all, it seems that virtually everyone in the scientific community plays, has played, or knows people who play Magic.
It's just something to think about. It would look good on your resume, it would bring honor to your people, and you might even get invited to speak somewhere. After all, conference publications are how I've gotten to see so much of Europe in spite of my modest income. Well, that and I have a knack for showmanship. After my last talk in Spain, an Italian researcher came up to me and said "I have no idea who you are, but I could listen to you talk for hours! We must do lunch!" And then I spent the next hour wowing him with my encyclopedic knowledge of his people's ancient culture (SPQR 4 lyfe!). After that came some very fruitful discussions about important research topics. The point is that travel has given me the opportunity to cultivate networks of alliances across continents and cultures, and it would benefit you to do the same.
I say you and not me because I'll be locked in my lab at least until the end of Spring reenacting scenes from The Miracle Worker with my machines ("Water. W. A. T. E. R!"). That, and you have a funded position with travel grants, both departmental and from your graduate school. I'd be fine with doing most of the authorship, as that's my bread and butter, but a lot of conferences expect representatives of accepted publications to show up. Of course, if that's not possible for you, there are plenty of workshops out there too.
That goes for anyone else here in a similar position (with time and resources to spare). And that's if any of this pans out in the way that we'd like.
EDIT: I should clarify and say that I'm not trying to spring anything on any of y'all. I'm just speaking in hypotheticals. But if we do happen to stumble upon something worthwhile, that could translate into a workshop paper in the Spring. lol
---
By the way, I hope that I haven't given any of you here any false impressions about the kind of life that people like me lead. Yes, I get to travel, shake hands with important people, and occasionally get to pass through restricted areas guarded by men with guns to see all kinds of cool stuff. At the same time, people like me live on near-poverty wages, are heavily dependent on federal funding (the availability of which is subject to the whims of politicians who rarely understand science), and most of what we do amounts to teaching classes and churning out publications.
Croxis's site hasn't actually worked right in several days. It's down now. Yesterday it would render cards but not generate new ones.
Well, I'm sure it'll be back up before too long. Right now he might be busy teaching his first classes of the semester, and may have overlooked the fact that his server went down. It happens.
---
EDIT: By the way, I'll be looking into the whole neural turing machine business and I'll be sure to let you know if it shows any promising signs for Magic card generation.
"Okay, let's scale the auto-encoder up to some larger data."
...
"Huh. That's kind of slow."
...
"Huh. The fan's kind of loud."
At this point, it would probably be prudent to start converting this stuff to Numpy. I'll paste a dump of this run, though. It should finish soon.
The basic idea of this net is to try to prove out a particular encoding scheme for one field. Basically, if we can get an encoding and a net size for every field on a card, DAMMIT, I forgot to make the query step work nice again. Anyway, if we can get a fixed-length encoding that fosters the right kind of creativity for individual fields, we can then just concatenate the encoded fields and decode the result. Or something. I'm kind of throwing darts.
Just got the results. I knew the error was high, but I didn't quite expect...
The only name it learned was "Assembly-Worker". Everything else autoencoded to an uncertain value representing either 'Aheaha' or 'D`Hane'.
I've been pretty busy with work lately and am having trouble connecting to the computer that is running my experiments, so I haven't been able to do much testing on my nets just yet. I just wanted to share the differences between the experiments I'm running and char-rnn, because I think some people were asking.
char-rnn uses RMSprop (see lecture 6e) for parameter updates, whereas I'm using Adam. In addition, each element in a batch (I'm using batch size 64 for bigger nets, 128 for smaller nets) is one card, so there's no bleed or splicing or anything. I'm also using peephole connections for the LSTM layers, which char-rnn may or may not be doing. I believe that's all the differences so far, though I'm playing around with a lot more net architectures than char-rnn allows.
Card generation is slow, but only because I've been too lazy to make the change needed to speed it up. On the plus side, I can generate cards a batch at a time instead of one at a time.
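To illustrate the one-card-per-batch-element idea (as opposed to char-rnn's continuous spliced character stream), a padding batcher might look something like this; the padding token and right-pad layout are my assumptions, not Tiir's actual code:

```python
# Sketch of the "each batch element is one card" idea, as opposed to
# char-rnn's continuous spliced stream. The padding token and right-pad
# layout are assumptions, not the actual implementation.

def batch_cards(cards, pad='\0'):
    """Right-pad every encoded card to the batch's max length so the
    batch is rectangular, with no bleed between cards."""
    width = max(len(c) for c in cards)
    return [c + pad * (width - len(c)) for c in cards]

batch = batch_cards(['|creature|spider|', '|instant|'])
```

The payoff is that every row of the batch starts at a card boundary, so the net never trains across a splice between two unrelated cards.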
Well, that's something.
That is something! Dart throwing is good.
For the record, names are likely to be the hardest to find an efficient encoding for, because there's not a lot of repetition. For things like creature types, on the other hand, you might have much better luck.
---
By the way, I think that it's probably feasible to incorporate a neural Turing machine implementation like this one into our usual scripts.
The controller, the part that we communicate with, is an LSTM implementation as we're used to seeing. The Turing machine part (the permanent storage) is hidden in the undercarriage. What should happen is that the LSTM network uses the TM to record pertinent information about the card that it's studying, and its short-term memory can then be used to remember where it stored things in memory and its current position in the card, that sort of thing.
Ideally, if we trained the network properly, that could solve our X/kicker/etc. issues, because the network is able to leave memos for its future self to read, such as "The X remains undefined!".
Now, there are some unknowns here, like whether clutter is going to be an issue with the permanent storage. Unlike the LSTM, which can just forget things naturally, the permanent storage unit is... well, just that. The LSTM controller would have to deliberately erase stale memos, or it would risk carrying them over to future cards. Now, it might do that, it might not. Alternatively, we can purge the permanent storage ourselves when we hit the end of a card.
At least, that's my understanding of it so far. Still looking into the subject for my research.
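For reference, the erase-then-add write that an NTM-style memory uses (following Graves et al.'s formulation) can be sketched in a few lines of numpy. The sizes and values here are toy numbers; in a real NTM the controller LSTM would emit the attention weights and erase/add vectors itself:

```python
# Sketch of the erase-then-add write used by an NTM-style external
# memory, following Graves et al. (2014). Toy sizes; a real controller
# LSTM would emit the attention weights and erase/add vectors itself.
import numpy as np

def ntm_write(M, w, e, a):
    """M: (N, W) memory; w: (N,) attention over rows; e, a: (W,) erase
    and add vectors. Attended rows are partially erased, then a is
    blended in proportionally to the attention."""
    M = M * (1.0 - np.outer(w, e))  # erase where attention and e overlap
    return M + np.outer(w, a)       # add the new content there

M = np.zeros((4, 3))
memo = np.array([1.0, 2.0, 3.0])            # e.g. "the X is undefined!"
w = np.array([0.0, 1.0, 0.0, 0.0])          # attend to row 1 only
M = ntm_write(M, w, e=np.ones(3), a=memo)   # leave the memo
# Purging storage at end-of-card, as suggested above, is then just a
# full-attention erase with nothing added back:
M_cleared = ntm_write(M, w=np.ones(4), e=np.ones(3), a=np.zeros(3))
```

Note how the manual end-of-card purge is just another write, which is why clearing the memory ourselves is a cheap fallback if the controller never learns to erase its own stale memos.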
Thank you for the update! And no, that's perfectly fine, I understand that you're probably very busy with work.
I have yet to read the paper on ADAM, though I think I may have heard of it. I'll definitely give that a look.
And I don't think we're using peephole connections in our implementation; it's simpler than that. But I'm sure those modifications could be made.
Okay. After some blind flailing with error addressing (what I have ended up with cannot be efficient), I have numpy. I can now generate entirely new nonsense, much faster. Current status: 'Aaedhap', and '```Biege'. I think it's time to give the hidden layer more neurons.
I mistyped earlier. This is all just on creature types.
Okay, currently, I have the hidden layer at half the size of the data vectors. It's probably overfitting the training data, but that's still information.
I think I need to tune my training parameters... Guessing some numbers, taking the iteration count way down... 'Loasw $#E%Pj%P', '!10B.J+A0Cigxmc'
Okay, let's bump up the iterations a little...
...
After varying the training parameters largely at random, it looks like my approach is susceptible to local minima. ... I will now attempt to stochastically derive the identity function. 'Skvue"$Q "$` %"', '`0D "$!$3Sbvew'. The hell of it is that my error checking code (independent of the algorithm itself) says that the net has a low error in this state. The large number of neurons must be throwing it off.
ETA: With many iterations, it goes from 'Yovue 0', 'Ibvsu' to 'Yovue', 'Ibvwu'. This might represent an improvement.
ETA: A-ha! The trick was to adjust the training parameters way down. We can then get the identity in 200 iterations.
I love the little things I see, like when the decoding is a homophone of the original input (jellyfish vs jellifish). I've noticed things like this before with the network we've been working with, where the machine comes into conflict with the quirks of English orthography. Like the network, when coming up with names for cards, is comfortable spelling "necromancy" as "necromancie", which is closer to French orthography but is also preserved in the plural as necromancies, a word that does not occur in the input corpus. In general, any rule set that attempts to describe how words are spelled in English cannot be both concise and correct.
By the way, I modified the training script to use ADAM instead of RMSprop, though I didn't make any other changes like adding in peephole connections. I have a few cycles to spare this evening so I figured I'd check and see what difference it makes to performance when all else is kept the same (just for fun). I'll let y'all know if anything comes of it.
Lots of the replacements are a result of the network getting one bit flipped.
>>> ord('i') ^ ord('y')
16
Tada.
Anyway, I'm now messing with the idea of using my encoding to do mashups between creature types. Doing the big network experiments now. I don't expect them to bear fruit, since large networks can represent the encoding twice. Once I get some results from that, I'll take it back to around 80 or 90 HN.
What parameters did you use? After enough training, ADAM and RMSprop should converge to about the same loss, but from everything I've tried--both in work experiments and mtg experiments--it speeds up training drastically.
I used the default learning rate, 0.9 for the first beta, 0.999 for the second beta, with an epsilon of 1 * 10^-8. In short, the default values (that the Torch implementation provides).
And yes, after looking over the paper, faster convergence is exactly what I'm hoping for. I just wanted to see how well it behaved ceteris paribus before considering more drastic architectural changes.
EDIT: Yes, it converges very quickly. I like. Excellent suggestion.
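For anyone following along, a single Adam step with those default hyperparameters looks roughly like this in numpy. This mirrors the published update rule from Kingma & Ba for illustration; it is not Torch's actual optim.adam source:

```python
# One Adam step with the defaults quoted above (lr=0.001, beta1=0.9,
# beta2=0.999, eps=1e-8), per Kingma & Ba. This mirrors the published
# update rule for illustration; it is not Torch's optim.adam source.
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """Update biased moment estimates, bias-correct them, then take a
    per-parameter scaled step. t is the 1-based step count."""
    m = b1 * m + (1 - b1) * grad           # first moment (mean of grads)
    v = b2 * v + (1 - b2) * grad ** 2      # second moment (uncentered)
    m_hat = m / (1 - b1 ** t)              # bias correction
    v_hat = v / (1 - b2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

theta, m, v = np.array([1.0, -1.0]), np.zeros(2), np.zeros(2)
theta, m, v = adam_step(theta, np.array([0.5, -0.5]), m, v, t=1)
# On the first step the bias correction cancels, so each parameter moves
# by roughly lr in the direction opposite its gradient.
```

The per-parameter scaling by the second-moment estimate is what tends to buy the faster early convergence compared to RMSprop's single running average.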
I think I've got enough tech lined up to start extending this to other fields. ... Right after I refactor all of the code. The name stuff is really script-y, and the autoencoder, while mostly vectorized and pretty fast, is doing a bunch of loopy casts to make the dimensions line up.
How complicated are the changes to make the training script use ADAM? I could possibly add that as an option for mtg-rnn.
Obscenely easy, in fact.
Instead of:
local _, loss = optim.rmsprop(feval, params, optim_state)
you just have
local _, loss = optim.adam(feval, params, optim_state)
But ADAM has some extra parameters that you can optionally pass to it that you couldn't with RMSprop, and we might consider exposing those options to the user.
EDIT: Peepholes, the more I think about it, seem like a fun idea to try. I would need to figure out how to solder them onto our existing implementation, of course. As you can see from the attached graph, which I lovingly borrowed from Gers et al. 2002, they allow us to see the current state of the cell even if the gates are closed. Anyway, this evidently proved to be helpful for motor control and rhythm detection tasks.
Tiir, correct me if I'm wrong, but is the idea that all cells on the same layer have peephole connections to the other cells on that layer? Just making sure I understand how this needs to work.
EDIT: Speaking of rhythm detection tasks, I wonder how well this will work for jazz music. It takes special training to listen for the notes that the musician isn't playing. That gives me fun ideas.
EDIT(2): Ohh, I think I see what would happen. Is it that neighboring cells are more likely to fall into agreement with each other, reducing the chance of concept drift? That seems like a possible outcome. Worth experimenting with, in any case.
This is getting a little ridiculous. It turns out that, in general, the network learns better if I just force the training parameters down, and it doesn't learn appreciably slower.
Also, looking at the merge results again, I suddenly noticed that they're basically "left wins". There's no obvious reason for that to be the case, but there it is. Left must just be stronger with these parameters and seed.
they allow us to see the current state of the cell even if the gates are closed
[...]
Tiir, correct me if I'm wrong, but is the idea that all cells on the same layer have peephole connections to the other cells on that layer? Just making sure I understand how this needs to work.
Interesting idea, but not quite. The peephole connections are extra sets of weights from the cell of a unit to the inputs of that unit's input/output/forget gates. The idea being that how much the cell takes in/forgets/sends out takes into consideration what it's currently remembering. After all, it's easier to decide whether or not you should forget something if you first know what it is you'd be forgetting. For example, a unit remembering that it saw X in the cost wouldn't want to forget anything until it used the X later. Similar logic as the forget gate for the input and output gates.
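A rough numpy sketch of one LSTM step with peepholes wired in that way (extra diagonal weight vectors from the cell state to each gate, no gate biases; the shapes and random initialization here are arbitrary toy choices, not anyone's actual net):

```python
# Sketch of one LSTM step with peephole connections as just described:
# each gate gets an extra (diagonal) weight vector from the cell state.
# No gate biases, toy shapes, random weights; purely illustrative.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_peephole_step(x, h, c, W, U, p):
    """W/U: per-gate input/recurrent weights; p: per-gate peephole
    weights. Input and forget gates peek at the previous cell state;
    the output gate peeks at the new one (Gers et al.)."""
    i = sigmoid(W['i'] @ x + U['i'] @ h + p['i'] * c)      # input gate
    f = sigmoid(W['f'] @ x + U['f'] @ h + p['f'] * c)      # forget gate
    g = np.tanh(W['g'] @ x + U['g'] @ h)                   # candidate
    c_new = f * c + i * g
    o = sigmoid(W['o'] @ x + U['o'] @ h + p['o'] * c_new)  # output gate
    return o * np.tanh(c_new), c_new

n, d = 3, 2
rng = np.random.default_rng(0)
W = {k: rng.normal(size=(n, d)) for k in 'ifgo'}
U = {k: rng.normal(size=(n, n)) for k in 'ifgo'}
p = {k: rng.normal(size=n) for k in 'ifo'}
h, c = lstm_peephole_step(rng.normal(size=d), np.zeros(n), np.zeros(n), W, U, p)
```

The `p[...] * c` terms are the peepholes: elementwise, so each unit only sees its own cell, exactly the "decide what to forget by first seeing what you'd be forgetting" idea.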
Ohh, gotcha. I was under the impression that we were talking about some kind of cross-chatter mechanism. Like if all of a cell's friends smoked cigarettes, then it seems like a very appealing idea to that cell. But then I started to wonder about the computational costs because now I have to worry about information moving in multiple directions and that makes unrolling everything more complicated. What you're suggesting seems much more reasonable.
EDIT: And relatively easy to implement. Looking into that. Karpathy noted that they were "useless", and chose not to implement them. And yet, from what you've been telling me, they most definitely have a use. I'm interested to see how things turn out.
EDIT(2): Almost.. got it.. almost. May have made a typo, which is common for me after a long day, lol.
EDIT(3): And we are a go! Might have made a mistake. Might not have. I suppose I'll know soon enough. Right now I'm testing ADAM + peephole connections without biases on the gates.
I enjoy reading all the details and ideas being bounced around here. Though I am a programmer, I am not well versed in Python scripting or the enigmatic Lua. #NotMyLanguage. Being unfamiliar with these languages, is there any particular reason (aside from library support) that Python/Lua were chosen for coding a network? For example, I heard somewhere that if you're doing a lot of string manipulation, you should look into using Perl, since it is "more suited" to that task.
Edit: I think a clearer example would be asking Vader to choke someone instead of asking a stormtrooper to do it. Sure, a stormtrooper can choke someone, but not as coolly and efficiently as Vader.
On the bright side, you'll have the coolest "old-man stories" when you're (legally?) old enough to tell one. Especially if you end up with far more interesting stories due to jumbled memories ("During the time I was working with the government, I got to pass through cool stuff with guns guarding an area with restricted men.")
Lua has good cross-platform support because it's built on plain and simple ANSI C. It's an extremely lightweight, dynamically typed language, but also one that supports features like first-class functions and garbage collection. I think the sophisticated but flexible memory management and speed are deciding factors for it being the language Torch is built on.
I can't speak to Python's use as a language for ML library development, but it is one of my favorites and is my go-to language for any programs of small to moderate size. It's also my favorite language to teach apprentice codesmiths, so it has that going for it as well. Meanwhile, I tend to do anything bulky in a language like C++/Java.
In my day-to-day work, I use Lua, Python, C, C++, and Java.
I also periodically make use of Fortran, Prolog, and Haskell.
And then there's a slew of formal (non-programming) design languages that are too numerous for me to count but that end up playing a role in variety of projects that I undertake.
Rofl, I never thought of it quite like that before.
EDIT: I made a small mistake with the peephole connections. It worked, but not the way I meant for it to, and it slowed things down rather than speeding them up (my bad). I'll return to that in the morning when I have time, fix it, and then try running the training script again.
EDIT(2): Rerunning things. But I have a suspicion that something is wrong. I think it may have something to do with the fact that I try doing some weight sharing and those weights get unshared when I do a cast. But we'll see.
I also periodically make use of Fortran, Prolog, and Haskell.
Wow, those are three languages I rarely see mentioned in the same breath. Except perhaps to finish the sentence “The following languages share almost nothing in common:”
I'm messing with finding parameters to create "perfect fits" to the training data. The idea, then, is that we can use the network as a coding scheme, to render a creature type field (which would naively encode to 120 neurons) into perhaps 80. Using that as a jumping-off point, we can then do things like look for further dependencies, or combine creature types at the encoded level. To do this, I'm going to need to add serialization code at some point; since this is all hand-rolled (except for numpy), I'm currently regenerating the network from seed every time I want to do anything.
Intuitively speaking, it doesn't make sense to go above 90 neurons, because the alphabet I'm using simply doesn't have a high enough information density to justify it. The trick to finding low-error networks appears to be really low feedback (as I've said before) and a few thousand generations. Don't quote me on this, but I suspect the sweet spot for feedback is for the first-order factor to be within an order of magnitude of 0.5 divided by the number of input neurons, and the second-order factor to be any number significantly smaller than the first-order factor.
Does anyone have any intuitions about where it would be interesting to go from here? Once I can create encoders like this, I can, like I said, try to encode the encoding (which should reveal higher-order dependencies), give the network fancy merging capabilities ("create the average of these two encoded vectors"; will this look like anything? I don't know!), or simply move on to other fields and see if my scheme generalizes.
For now, I'm going to bump up the iteration count on this 80 HN net until it learns that it really shouldn't be "Harpi". NOTE FROM THE FUTURE: I done goofed, see spoiler.
I wanted the net to learn words both as they start and as they end. So, I've got a possibly-not-exhaustive list of creature types, and the maximum length of any of them happens to be 15. The encoding scheme is just "pad the input to max length on the left, pad it to max length on the right, and concatenate the two". The encoder runs on them, and to get the merged output, each half of the output is decoded separately, the THERE WAS A TYPO IN THE MERGE CODE GEEZ regions corresponding to the decoded output are summed together, and that sum is decoded. The point of all this is to have a network that responds to commonalities in both Fox and Rhox, Horror and Horse.
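The double-padding scheme can be sketched like so. MAXLEN comes from the post; the pad character and function name are my own stand-ins, and the character-to-neuron step is left out:

```python
# Sketch of the double-padded encoding just described. MAXLEN comes from
# the post; the pad character and function name are stand-ins, and the
# character-to-neuron step is left out.
MAXLEN = 15  # longest creature type in the list, per the post

def double_pad(word, maxlen=MAXLEN, pad=' '):
    """One right-aligned copy plus one left-aligned copy, so the net
    sees both how a word ends and how it starts."""
    assert len(word) <= maxlen
    return word.rjust(maxlen, pad) + word.ljust(maxlen, pad)

# 'Fox' and 'Rhox' now line up in the right-aligned half (shared tail),
# while 'Horror' and 'Horse' line up in the left-aligned half.
print(double_pad('Fox'))
```

With both alignments present, shared tails and shared heads each land on fixed input positions, which is exactly what lets the net pick up on the Fox/Rhox and Horror/Horse commonalities.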
Thinking on this stuff has me thinking about a few changes to make. One is, instead of specifying a single first-order weight that gets applied directly to the deltas, I should really either modify the weight or average the deltas. That would save me from having to do mental arithmetic on long decimals to avoid chaos and local minima, and it would also make a given set of parameters more portable between networks of different sizes. Of course, actually making this change would involve figuring out what the heck math I'm actually having this stuff do...
ETA: got a good network with 81 HN. Not totally convinced that this isn't partially due to the extra 481 weights perturbing the random number distribution among the matrices (I'm doing most of this with the same seed), but hey, now I've got a perfectly fitted auto-encoder. It also handles a bit of data not in the training set, and some of my early "merge" experimentation.
ETA: "I can't see how to optimize the inner loop any more," I said. "Installing NumPyPy will be quick and painless," I said. "Let's see what kind of performance I get," I said. Ugh. A 6x slowdown. It must not be running long enough for PyPy to warm up. Either that, or NumPyPy isn't a good fit for this code. But it was a nice orthogonal experiment.
My LinkedIn profile... thing (I have one of those now!).
My research team's webpage.
The mtg-rnn repo and the mtg-encode repo.
I'm taking a break from active development; don't expect me to put up anything new for a little while. I do have some big dumps coming out of the latest randomized-field-order network, though; I'll try to post MSE sets of those (hopefully with the card similarity / creativity numbers enabled).
The code is designed so that it's easy to change the format. For example, if you want to tweak the ordering of the fields, all you have to do is modify the array that controls what order they go in. As long as you're consistent within your code, it should be easy to reuse most of what's there with any variation of the format. If anybody out there is doing fun experiments based on mtgencode that they'd like to share, I'd happily merge pull requests to put tools or alternative formatters into 'scripts'. Some form of checker / validator should eventually live in there for sure.
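For illustration, the pattern might look like this; the array and field names here are hypothetical, not mtgencode's actual identifiers:

```python
# Hypothetical sketch of a field-order-driven encoder; the real array
# and field names in mtgencode differ.
FIELD_ORDER = ['name', 'supertypes', 'types', 'subtypes',
               'cost', 'pt', 'text']

def encode_card(card, field_order=FIELD_ORDER, sep='|'):
    # emit the fields in the configured order; reordering the format
    # is just a matter of editing FIELD_ORDER
    return sep + sep.join(str(card.get(f, '')) for f in field_order) + sep
```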
As for mtg-rnn, the customized batcher is feature complete as far as randomizing the training set in a nice way, but I haven't implemented one-card-at-a-time mode yet. That's anticipated to be very slow, so I don't really have much motivation to work on it until I feel like running code on my GPU for a solid week.
I am also considering some more significant changes to the format concerning the way I handle card text, but again, I won't implement that until I have some dedicated time to test.
EDIT:
I've been thinking about this lately - I'm intrigued by the possibility of a generate-filter-regenerate approach using multiple networks.
The simplest approach would be to start with a maximally stable network for initial generation, probably one that doesn't include cardnames and uses whatever fixed field order we can get the most consistent cards out of. Then we filter for cards that we like to fill the set skeleton, and simply abuse the random field order network to name them with priming. If the filters were sufficiently good, we could do other clever things, like regenerating manacosts or types that the filter believed were inappropriate. Then all we need is some art.
I'd love to write a driver that coordinates this across mtgencode and mtg-rnn. The biggest development roadblocks are the need for design skeletons and for effective filtering scripts.
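A minimal sketch of that driver loop, where every callable is a hypothetical stand-in for the two networks and the filtering scripts:

```python
# generate-filter-regenerate sketch; generate, passes_filter, and
# name_with_priming are hypothetical stand-ins, not real mtg-rnn or
# mtgencode functions.
def build_set(generate, passes_filter, name_with_priming, skeleton_size):
    keepers = []
    while len(keepers) < skeleton_size:
        card = generate()            # stable fixed-field-order net
        if passes_filter(card):      # keep only skeleton-worthy cards
            keepers.append(card)
    # the random-field-order net names the survivors via priming
    return [name_with_priming(card) for card in keepers]
```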
Haha, you and me both.
Very interesting ideas. Just so you know, there's a possibility that we might get a workshop or conference paper out of this down the road. At the beginning I hadn't given the idea much thought; we were mostly just taking existing ideas and fine-tuning them for our applications, which comes across as more of an engineering task than a research one. But as we move forward, we're getting into lots of unexplored, untamed territory, and Magic seems like a perfectly good vehicle for that exploration. Interestingly enough, there have actually been many worthwhile publications in computer science, mathematics, sociology, and economics (to name a few fields) that center on Magic: The Gathering. After all, it seems that virtually everyone in the scientific community plays, has played, or knows people who play Magic.
It's just something to think about. It would look good on your resume, it would bring honor to your people, and you might even get invited to speak somewhere. After all, conference publications are how I've gotten to see so much of Europe in spite of my modest income. Well, that and I have a knack for showmanship. After my last talk in Spain, an Italian researcher came up to me and said "I have no idea who you are, but I could listen to you talk for hours! We must do lunch!" And then I spent the next hour wowing him with my encyclopedic knowledge of his people's ancient culture (SPQR 4 lyfe!). After that came some very fruitful discussions about important research topics. The point is that travel has given me the opportunity to cultivate networks of alliances across continents and cultures, and it would benefit you to do the same.
I say you and not me because I'll be locked in my lab at least until the end of Spring reenacting scenes from The Miracle Worker with my machines ("Water. W. A. T. E. R!"). That and you have a funded position with travel grants, both departmental and from your graduate school. I'd be fine with doing most of the authorship, as that's my bread and butter, but a lot of conferences expect representatives of accepted publications to show up. Of course, if that's not possible for you, there are plenty of workshops out there too.
That goes for anyone else here in a similar position (with time and resources to spare). And that's if any of this pans out in the way that we'd like.
EDIT: I should clarify and say that I'm not trying to spring anything on any of y'all. I'm just speaking in hypotheticals. But if we do happen to stumble upon something worthwhile, that could translate into a workshop paper in the Spring. lol
---
By the way, I hope that I haven't given any of you here any false impressions about the kind of life that people like me lead. Yes, I get to travel, shake hands with important people, and occasionally get to pass through restricted areas guarded by men with guns to see all kinds of cool stuff. At the same time, people like me live on near-poverty wages, are heavily dependent on federal funding (the availability of which is subject to the whims of politicians who rarely understand science), and most of what we do amounts to teaching classes and churning out publications.
Well, I'm sure it'll be back up before too long. Right now he might be busy teaching his first classes of the semester, and may have overlooked the fact that his server went down. It happens.
---
EDIT: By the way, I'll be looking into the whole neural Turing machine business and I'll be sure to let you know if it shows any promising signs for Magic card generation.
...How long before we can pass the Turing test for the 'bot impersonating MaRo?
I̟̥͍̠ͅn̩͉̣͍̬͚ͅ ̬̬͖t̯̹̞̺͖͓̯̤h̘͍̬e͙̯͈̖̼̮ ̭̬f̺̲̲̪i͙͉̟̩̰r̪̝͚͈̝̥͍̝̲s̼̻͇̘̳͔ͅt̲̺̳̗̜̪̙ ̳̺̥̻͚̗ͅm̜̜̟̰͈͓͎͇o̝̖̮̝͇m̯̻̞̼̫̗͓̤e̩̯̬̮̩n͎̱̪̲̹͖t͇̖s̰̮ͅ,̤̲͙̻̭̻̯̹̰ ̖t̫̙̺̯͖͚̯ͅh͙̯̦̳̗̰̟e͖̪͉̼̯ ̪͕g̞̣͔a̗̦t̬̬͓͙̫̖̭̻e̩̻̯ ̜̖̦̖̤̭͙̬t̞̹̥̪͎͉ͅo͕͚͍͇̲͇͓̺ ̭̬͙͈̣̻t͈͍͙͓̫̖͙̩h̪̬̖̙e̗͈ ̗̬̟̞̺̤͉̯ͅa̦̯͚̙̜̮f͉͙̲̣̞̼t̪̤̞̣͚e̲͉̳̥r͇̪̙͚͓l̥̞̞͎̹̯̹ͅi͓̬f̮̥̬̞͈ͅe͎ ̟̩̤̳̠̯̩̯o̮̘̲p̟͚̣̞͉͓e͍̩̣n͔̼͕͚̜e̬̱d̼̘͎̖̹͍̮̠,͖̺̭̱̮ ̣̲͖̬̪̭̥a̪͚n̟̲̝̤̤̞̗d̘̱̗͇̮͕̳͕͔ ͖̞͉͎t̹̙͎h̰̱͉̗e̪̞̱̝̹̩ͅ ̠̱̩̭̦p̯̙e͓o̳͚̰̯̺̱̰͔̘p̬͎̱̣̼̩͇l̗̟̖͚̠e̱͉͔̱̦̬̟̙ ̖͚̪͔̼̦w̺̖̤̱e͖̗̻̦͓̖̘̜r̭̥e͔̹̫̱͕̦̰͕ ̗͔̠p̠̗͍͍̱̳̠r̰͔͎̰o͉̥͓̰͚̥s̟͚̹̱͔̣t͉̙̳̖͖̪̮r̥̘̥͙̹a͉̟̫̟̳̠̟̭t͈̜̰͈͎e̞̣̭̲̬ ͚̗̯̟͙i͍͖̰̘̦͖͉ṇ̮̻̯̦̲̩͍ ̦̮͚̫̤t͉͖̫͕ͅͅh͙̮̻̘̣̮̼e͕̺ ͙l͕̠͎̰̥i̲͓͉̲g̫̳̟͈͇̖h̠̦̖t͓̯͎̗ ̳̪̘̟̙̩̦o̫̲f̙͔̰̙̠ ̹̪̗͇̯t͖̼̼͉͖̬h̹͇̩e͚̖̺̤͉̹͕̪ ͚͓̭̝̺G͎̗̯̩o̫̯̮̟̮̳̘d̜̲͙̠-̩̳̯̲̗̜P̹̘̥͉̝h͍͈̗̖̝ͅa͍̗̮̼̗r̜̖͇̙̺a̭̺͔̞̳͈o̪̣͓̯̬͙̯̰̗h̖̦͈̥̯͔.͇̣̙̝
...
"Huh. That's kind of slow."
...
"Huh. The fan's kind of loud."
At this point, it would probably be prudent to start converting this stuff to NumPy. I'll paste a dump of this run, though. It should finish soon.
The basic idea of this net is to try to prove out a particular encoding scheme for one field. Basically, if we can get an encoding and a net size for every field on a card, DAMMIT, I forgot to make the query step work nice again. Anyway, if we can get a fixed-length encoding that fosters the right kind of creativity for individual fields, we can then just concatenate the encoded fields and decode the result. Or something. I'm kind of throwing darts.
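The concatenate-and-slice idea could be sketched like so (another dart-throw of my own, with identity stand-ins where the real per-field encoders would go):

```python
import numpy as np

# Sketch of per-field encoding: each field gets a fixed-length code,
# the codes are concatenated, and decoding slices the vector apart
# again. The encoders/decoders here are hypothetical stand-ins.
def encode_fields(fields, encoders):
    return np.concatenate([enc(f) for f, enc in zip(fields, encoders)])

def decode_fields(vec, sizes, decoders):
    out, start = [], 0
    for size, dec in zip(sizes, decoders):
        out.append(dec(vec[start:start + size]))
        start += size
    return out
```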
Just got the results. I knew the error was high, but I didn't quite expect...
The only name it learned was "Assembly-Worker". Everything else autoencoded to an uncertain value representing either 'Aheaha' or 'D`Hane'.
Well, that's something.
char-rnn uses RMSprop (see lecture 6e) for parameter updates whereas I'm using Adam. In addition, each element in a batch (using batch size 64 for bigger nets, 128 for smaller nets) is one card, so there's no bleed or splicing or anything. I'm also using peephole connections for the lstm layers, which char-rnn may or may not be doing. I believe that's all the differences so far, though I'm playing around with a lot more net architectures than char-rnn allows.
Card generation is slow, but only because I've been too lazy to make the change needed to speed it up. On the plus side, I can generate cards a batch at a time instead of one at a time.
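For reference, the Adam update (Kingma and Ba, 2014) amounts to the following; RMSprop keeps only the second-moment accumulator and skips the bias correction:

```python
import numpy as np

# One Adam step, following the paper's notation (Kingma & Ba, 2014).
def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * grad          # first-moment estimate
    v = b2 * v + (1 - b2) * grad ** 2     # second-moment estimate
    m_hat = m / (1 - b1 ** t)             # bias corrections
    v_hat = v / (1 - b2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```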
Haha, sorry about that. I'll try to keep things positive.
And that's a very good question, I'm not entirely sure.
That is something! Dart throwing is good.
For the record, names are likely to be the hardest to find an efficient encoding for, because there's not a lot of repetition. For things like creature types, on the other hand, you might have much better luck.
---
By the way, I think that it's probably feasible to incorporate a neural Turing machine implementation like this one into our usual scripts.
The controller, the part that we communicate with, is an LSTM implementation as we're used to seeing. The Turing machine part (the permanent storage) is hidden in the undercarriage. What should happen is that the LSTM network uses the TM to record pertinent information about the card that it's studying, and its short-term memory can then be used to remember where it stored things in memory and its current position in the card, that sort of thing.
Ideally, if we trained the network properly, that could solve our X/kicker/etc. issues, because the network is able to leave memos for its future self to read, such as "The X remains undefined!".
Now, there are some unknowns here, like whether clutter is going to be an issue with the permanent storage. Unlike the LSTM, which can just forget things naturally, the permanent storage unit is... well, just that: permanent. The LSTM controller would have to deliberately erase stale memos or it would risk carrying them over to future cards. Now, it might do that, it might not. Alternatively, we can purge the permanent storage ourselves when we hit the end of a card.
At least, that's my understanding of it so far. Still looking into the subject for my research.
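The memory access in the NTM paper (Graves et al., 2014) is just soft attention over a matrix, which is easy to sketch:

```python
import numpy as np

# NTM-style memory access (Graves et al., 2014): the controller reads
# and writes through an attention weighting w over memory rows.
def ntm_read(memory, w):
    # r = sum_i w[i] * memory[i]
    return w @ memory

def ntm_write(memory, w, erase, add):
    # each row is partially erased, then the add vector is blended in
    memory = memory * (1 - np.outer(w, erase))
    return memory + np.outer(w, add)
```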
Thank you for the update! And no, that's perfectly fine, I understand that you're probably very busy with work.
I have yet to read the paper on Adam, though I think I may have heard of it. I'll definitely give that a look.
And I don't think we're using peephole connections in our implementation; it's simpler than that. But I'm sure those modifications could be made.
I mistyped earlier. This is all just on creature types.
Okay, currently, I have the hidden layer at half the size of the data vectors. It's probably overfitting the training data, but that's still information.
...
Assembly-Worker: 'Askzmblxlscp+Db', 'Acqfibfy%}Jrkcs'
Everything else: 'Askzibdp`!!`', '!`1B(B&H Mz`Ics'
I think I need to tune my training parameters... Guessing some numbers, taking the iteration count way down... 'Loasw $#E%Pj%P', '!10B.J+A0Cigxmc'
Okay, let's bump up the iterations a little...
...
After varying the training parameters largely at random, it looks like my approach is susceptible to local minima. ... I will now attempt to stochastically derive the identity function. 'Skvue"$Q "$` %"', '`0D "$!$3Sbvew'. The hell of it is that my error checking code (independent of the algorithm itself) says that the net has a low error in this state. The large number of neurons must be throwing it off.
ETA: With many iterations, it goes from 'Yovue 0', 'Ibvsu' to 'Yovue', 'Ibvwu'. This might represent an improvement.
ETA: A-ha! The trick was to adjust the training parameters way down. We can then get the identity in 200 iterations.
Amphin ('Amphin', 'Amphin')
Angel ('Angel', 'Angel')
Antelope ('Antelope', 'Antelope')
Ape ('Ape', 'Ape')
Archon ('Archon', 'Archon')
Assembly-Worker ('Assembly-Worker', 'Assembly-Worker')
Atog ('Atog', 'Atog')
Aurochs ('Aurochs', 'Aurochs')
Avatar ('Avatar', 'Avatar')
Aven ('Aven', 'Aven')
Badger ('Badger', 'Badger')
Basilisk ('Basilisk', 'Basilisk')
Bat ('Bat', 'Bat')
Bear ('Bear', 'Bear')
Beast ('Beast', 'Beast')
Beeble ('Beeble', 'Beeble')
Bird ('Bird', 'Bird')
Blinkmoth ('Blinkmoth', 'Blinkmoth')
Boar ('Boar', 'Boar')
Bringer ('Bringer', 'Bringer')
Brushwagg ('Brushwagg', 'Brushwagg')
Camarid ('Camarid', 'Camarid')
Camel ('Camel', 'Camel')
Caribou ('Caribou', 'Caribou')
Carrier ('Carrier', 'Carrier')
Cat ('Cat', 'Cat')
Centaur ('Centaur', 'Centaur')
Cephalid ('Cephalid', 'Cephalid')
Chimera ('Chimera', 'Chimera')
Cockatrice ('Cockatrice', 'Cockatrice')
Construct ('Construct', 'Construct')
Crab ('Crab', 'Crab')
Crocodile ('Crocodile', 'Crocodile')
Cyclops ('Cyclops', 'Cyclops')
Dauthi ('Dauthi', 'Dauthi')
Demon ('Demon', 'Demon')
Devil ('Devil', 'Devil')
Djinn ('Djinn', 'Djinn')
Dragon ('Dragon', 'Dragon')
Drake ('Drake', 'Drake')
Dreadnought ('Dreadnought', '`Readnought')
Drone ('Drone', 'Drone')
Dryad ('Dryad', 'Dryad')
Dwarf ('Dwarf', 'Dwarf')
Eldrazi ('Eldrazi', 'Eldrazi')
Elemental ('Elemental', 'Elemental')
Elephant ('Elephant', 'Elephant')
Elf ('Elf', 'Elf')
Elk ('Elk', 'Elk')
Eye ('Eye', 'Eye')
Faerie ('Faerie', 'Faerie')
Ferret ('Ferret', 'Ferret')
Fish ('Fish', 'Fish')
Fox ('Fox', 'Fox')
Frog ('Frog', 'Frog')
Fungus ('Fungus', 'Fungus')
Gargoyle ('Gargoyle', 'Gargoyle')
Germ ('Germ', 'Germ')
Giant ('Giant', 'Giant')
Gnome ('Gnome', 'Gnome')
Goat ('Goat', 'Goat')
Goblin ('Goblin', 'Goblin')
God ('God', 'God')
Golem ('Golem', 'Golem')
Gorgon ('Gorgon', 'Gorgon')
Gremlin ('Gremlin', 'Gremlin')
Griffin ('Griffin', 'Griffin')
Hag ('Hag', 'Hag')
Harpy ('Harpy', 'Harpy')
Hellion ('Hellion', 'Hellion')
Hippo ('Hippo', 'Hippo')
Hippogriff ('Hippogriff', 'Hi`Pogriff')
Homarid ('Homarid', 'Homarid')
Homunculus ('Homunculus', 'Homunculus')
Horror ('Horror', 'Horror')
Horse ('Horse', 'Horse')
Hound ('Hound', 'Hound')
Human ('Human', 'Human')
Hydra ('Hydra', 'Hydra')
Hyena ('Hyena', 'Hyena')
Illusion ('Illusion', 'Illusion')
Imp ('Imp', 'Imp')
Incarnation ('Incarnatiol', 'Incarnation')
Insect ('Insect', 'Insect')
Jellyfish ('Jellyfish', 'Jellyfish')
Juggernaut ('Juggernaut', 'Juggernaut')
Kavu ('Kavu', 'Kavu')
Kirin ('Kirin', 'Kirin')
Kithkin ('Kithkin', 'Kithkin')
Kitsune ('Kitsune', 'Kitsune')
Kobold ('Kobold', 'Kobold')
Kor ('Kor', 'Kor')
Kraken ('Kraken', 'Kraken')
Lamia ('Lamia', 'Lamia')
Lammasu ('Lammasu', 'Lammasu')
Leech ('Leech', 'Leech')
Leonin ('Leonin', 'Leonin')
Leviathan ('Leviathan', 'Leviathan')
Lhurgoyf ('Lhurgoyf', 'Lhurgoyf')
Licid ('Licid', 'Licid')
Lizard ('Lizard', 'Lizard')
Loxodon ('Loxodon', 'Loxodon')
Manticore ('Manticore', 'Manticore')
Masticore ('Masticore', 'Masticore')
Merfolk ('Merfolk', 'Merfolk')
Metathran ('Metathran', 'Metathran')
Minotaur ('Minotaur', 'Minotaur')
Mongoose ('Mongoose', 'Mongoose')
Moonfolk ('Moonfolk', 'Moonfolk')
Mutant ('Mutant', 'Mutant')
Myr ('Myr', 'Myr')
Naga ('Naga', 'Naga')
Nantuko ('Nantuko', 'Nantuko')
Nautilus ('Nautilus', 'Nautilus')
Nephilim ('Nephilim', 'Nephilim')
Nezumi ('Nezumi', 'Nezumi')
Nightmare ('Nightmare', 'Nightmare')
Nightstalker ('Nightstalker', 'Nightstalker')
Noggle ('Noggle', 'Noggle')
Nymph ('Nymph', 'Nymph')
Octopus ('Octopus', 'Octopus')
Ogre ('Ogre', 'Ogre')
Ooze ('Ooze', 'Ooze')
Orb ('Orb', 'Orb')
Orc ('Orc', 'Orc')
Orgg ('Orgg', 'Orgg')
Orochi ('Orochi', 'Orochi')
Ouphe ('Ouphe', 'Ouphe')
Ox ('Ox`', 'Ox')
Oyster ('Oyster', 'Oyster')
Pegasus ('Pegasus', 'Pegasus')
Pentavite ('Pentavite', 'Pentavite')
Pest ('Pest', 'Pest')
Phelddagrif ('Phelddagrid', 'Phelddagrif')
Phoenix ('Phoenix', 'Phoenix')
Phyrexian ('Phyrexian', 'Phyrexian')
Plant ('Plant', 'Plant')
Prism ('Prism', 'Prism')
Rabbit ('Rabbit', 'Rabbit')
Rakshasas ('Rakshasas', 'Rakshasas')
Rat ('Rat', 'Rat')
Reflection ('Reflection', 'Beflection')
Rhino ('Rhino', 'Rhino')
Rhox ('Rhox', 'Rhox')
Sable ('Sable', 'Sable')
Salamander ('Salamander', 'Calamander')
Sand ('Sand', 'Sand')
Saproling ('Saproling', 'Saproling')
Satyr ('Satyr', 'Satyr')
Scarecrow ('Scarecrow', 'Scarecrow')
Scorpion ('Scorpion', 'Scorpion')
Serpent ('Serpent', 'Serpent')
Shade ('Shade', 'Shade')
Shapeshifter ('Shapeshifter', 'Shapeshifter')
Sheep ('Sheep', 'Sheep')
Siren ('Siren', 'Siren')
Skeleton ('Skeleton', 'Skeleton')
Slith ('Slith', 'Slith')
Sliver ('Sliver', 'Sliver')
Slug ('Slug', 'Slug')
Snake ('Snake', 'Snake')
Soltari ('Soltari', 'Soltari')
Spawn ('Spawn', 'Spawn')
Specter ('Specter', 'Specter')
Sphinx ('Sphinx', 'Sphinx')
Spider ('Spider', 'Spider')
Spike ('Spike', 'Spike')
Spirit ('Spirit', 'Spirit')
Sponge ('Sponge', 'Sponge')
Squid ('Squid', 'Squid')
Squirrel ('Squirrel', 'Squirrel')
Starfish ('Starfish', 'Starfish')
Surrakar ('Surrakar', 'Surrakar')
Tetravite ('Tetravite', 'Tetravite')
Thalakos ('Thalakos', 'Thalakos')
Thopter ('Thopter', 'Thopter')
Thrull ('Thrull', 'Thrull')
Treefolk ('Treefolk', 'Treefolk')
Triskelavite ('Triskelavite', 'Triskelavite')
Troll ('Troll', 'Troll')
Turtle ('Turtle', 'Turtle')
Unicorn ('Unicorn', 'Unicorn')
Vampire ('Vampire', 'Vampire')
Vedalken ('Vedalken', 'Vedalken')
Viashino ('Viashino', 'Viashino')
Volver ('Volver', 'Volver')
Wall ('Wall', 'Wall')
Weird ('Weird', 'Weird')
Werewolf ('Werewolf', 'Werewolf')
Whale ('Whale', 'Whale')
Wolf ('Wolf', 'Wolf')
Wolfir ('Wolfir', 'Wolfir')
Wolverine ('Wolverine', 'Wolverine')
Wombat ('Wombat', 'Wombat')
Worm ('Worm', 'Worm')
Wraith ('Wraith', 'Wraith')
Wurm ('Wurm', 'Wurm')
Yeti ('Yeti', 'Yeti')
Zombie ('Zombie', 'Zombie')
Zubera ('Zubera', 'Zubera')
Treefolf ('Treefolb', 'Treefolf')
ETA: A really constrained network is fun because it gives me nonsense quickly.
Amphin ('Caarm`', '`Qbafd')
Angel ('Dieel', 'Aielm')
Antelope ('Dieemil`', 'Dadalmoo')
Ape ('Caa', 'Are')
Archon ('Caarm`', '`Qbafd')
Assembly-Worker ('Assembly-Worker', 'Assembly-Worker')
Atog ('Ggfd', 'Bene')
Aurochs ('Aaapaas', 'Saa`Iwe')
Avatar ('Caarm`', '`Qbafd')
Aven ('Ggfd', 'Bene')
Badger ('Caarm`', '`Qbafd')
Basilisk ('Aaapaas', 'Saa`Iwe')
Bat ('Caa', 'Are')
Bear ('Car`', 'Bare')
Beast ('Caa`E', 'R`Afd')
Beeble ('Deegmd', 'Fagele')
Bird ('Car`', 'Bare')
Blinkmoth ('Dieemil`', 'Dadalmoo')
Boar ('Car`', 'Bare')
Bringer ('Aaapaas', 'Saa`Iwe')
Brushwagg ('Caapawaae', 'Aausabige')
Camarid ('Aaapaas', 'Saa`Iwe')
Camel ('Dieel', 'Aielm')
Caribou ('Aaapaas', 'Saa`Iwe')
Carrier ('Aaapaas', 'Saa`Iwe')
Cat ('Caa', 'Are')
Centaur ('Aaapaas', 'Saa`Iwe')
Cephalid ('Aaapaas', 'Saa`Iwe')
Chimera ('Aaapaas', 'Saa`Iwe')
Cockatrice ('Caapawaae', 'Aausabige')
Construct ('Caapawaae', 'Aausabige')
Crab ('Car`', 'Bare')
Crocodile ('Caapawaae', 'Aausabige')
Cyclops ('Aaapaas', 'Saa`Iwe')
Dauthi ('Deegmd', 'Fagele')
Demon ('Dieel', 'Aielm')
Devil ('Dieel', 'Aielm')
Djinn ('Dieel', 'Aielm')
Dragon ('Deegmd', 'Fagele')
Drake ('Caa`E', 'R`Afd')
Dreadnought ('Deeeigleek', 'Bemdoafmeg')
Drone ('Dieel', 'Aielm')
Dryad ('Caa`E', 'R`Afd')
Dwarf ('Caa`E', 'R`Afd')
Eldrazi ('Aaapaas', 'Saa`Iwe')
Elemental ('Deeeigleek', 'Bemdoafmeg')
Elephant ('Dieemil`', 'Dadalmoo')
Elf ('Gie$', 'Ene')
Elk ('Gie$', 'Ene')
Eye ('Gie$', 'Ene')
Faerie ('Deegmd', 'Fagele')
Ferret ('Caarm`', '`Qbafd')
Fish ('Car`', 'Bare')
Fox ('Gie$', 'Ene')
Frog ('Ggfd', 'Bene')
Fungus ('Deegmd', 'Fagele')
Gargoyle ('Dieemil`', 'Dadalmoo')
Germ ('Ggfd', 'Bene')
Giant ('Caa`E', 'R`Afd')
Gnome ('Dieel', 'Aielm')
Goat ('Ggfd', 'Bene')
Goblin ('Deegmd', 'Fagele')
God ('Gie$', 'Ene')
Golem ('Dieel', 'Aielm')
Gorgon ('Deegmd', 'Fagele')
Gremlin ('Dieemil`', 'Dadalmoo')
Griffin ('Dieemil`', 'Dadalmoo')
Hag ('Caa', 'Are')
Harpy ('Caa`E', 'R`Afd')
Hellion ('Dieemil`', 'Dadalmoo')
Hippo ('Caa`E', 'R`Afd')
Hippogriff ('Caapawaae', 'Aausabige')
Homarid ('Dieemil`', 'Dadalmoo')
Homunculus ('Deeeigleek', 'Bemdoafmeg')
Horror ('Caarm`', '`Qbafd')
Horse ('Caa`E', 'R`Afd')
Hound ('Dieel', 'Aielm')
Human ('Dieel', 'Aielm')
Hydra ('Caa`E', 'R`Afd')
Hyena ('Caa`E', 'R`Afd')
Illusion ('Dieemil`', 'Dadalmoo')
Imp ('Caa', 'Are')
Incarnation ('Deeeigleek', 'Bemdoafmeg')
Insect ('Caarm`', '`Qbafd')
Jellyfish ('Caapawaae', 'Aausabige')
Juggernaut ('Caapawaae', 'Aausabige')
Kavu ('Car`', 'Bare')
Kirin ('Caa`E', 'R`Afd')
Kithkin ('Aaapaas', 'Saa`Iwe')
Kitsune ('Aaapaas', 'Saa`Iwe')
Kobold ('Deegmd', 'Fagele')
Kor ('Caa', 'Are')
Kraken ('Caarm`', '`Qbafd')
Lamia ('Dieel', 'Aielm')
Lammasu ('Aaapaas', 'Saa`Iwe')
Leech ('Dieel', 'Aielm')
Leonin ('Deegmd', 'Fagele')
Leviathan ('Caapawaae', 'Aausabige')
Lhurgoyf ('Dieemil`', 'Dadalmoo')
Licid ('Dieel', 'Aielm')
Lizard ('Daegmd', 'Fagele')
Loxodon ('Dieemil`', 'Dadalmoo')
Manticore ('Caapawaae', 'Aausabige')
Masticore ('Caapawaae', 'Aausabige')
Merfolk ('Dieemil`', 'Dadalmoo')
Metathran ('Caapawaae', 'Aausabige')
Minotaur ('Dieemil`', 'Dadalmoo')
Mongoose ('Dieemil`', 'Dadalmoo')
Moonfolk ('Dieemil`', 'Dadalmoo')
Mutant ('Caarm`', '`Qbafd')
Myr ('Caa', 'Are')
Naga ('Ggfd', 'Befe')
Nantuko ('Dieemil`', 'Dadalmoo')
Nautilus ('Dieemil`', 'Dadalmoo')
Nephilim ('Dieemil`', 'Dadalmoo')
Nezumi ('Deegmd', 'Fagele')
Nightmare ('Caapawaae', 'Aausabige')
Nightstalker ('Deeeigleek', 'Bemdoafmeg')
Noggle ('Deegmd', 'Fagele')
Nymph ('Dieel', 'Aielm')
Octopus ('Aaapaas', 'Saa`Iwe')
Ogre ('Ggfd', 'Bene')
Ooze ('Ggfd', 'Bene')
Orb ('Caa', 'Are')
Orc ('Caa', 'Are')
Orgg ('Ggfd', 'Bene')
Orochi ('Deegmd', 'Fagele')
Ouphe ('Caa`E', 'R`Afd')
Ox ('Caa', 'Are')
Oyster ('Caarm`', '`Qbafd')
Pegasus ('Aaapaas', 'Saa`Iwe')
Pentavite ('Caapawaae', 'Aausabige')
Pest ('Car`', 'Bare')
Phelddagrif ('Deeeigleek', 'Bemdoafmeg')
Phoenix ('Dieemil`', 'Dadalmoo')
Phyrexian ('Caapawaae', 'Aausabige')
Plant ('Caa`E', 'R`Afd')
Prism ('Caa`E', 'R`Afd')
Rabbit ('Caarm`', '`Qbafd')
Rakshasas ('Caapawaae', 'Aausabige')
Rat ('Caa', 'Are')
Reflection ('Deeeigleek', 'Bemdoafmeg')
Rhino ('Caa`E', 'R`Afd')
Rhox ('Car`', 'Bare')
Sable ('Caa`E', 'R`Afd')
Salamander ('Deeeigleek', 'Bemdoafmeg')
Sand ('Car`', 'Bare')
Saproling ('Caapawaae', 'Aausabige')
Satyr ('Caa`E', 'R`Afd')
Scarecrow ('Caapawaae', 'Aausabige')
Scorpion ('Aaapaas', 'Saa`Iwe')
Serpent ('Aaapaas', 'Saa`Iwe')
Shade ('Caa`E', 'R`Afd')
Shapeshifter ('Caapawaae', 'Aausabige')
Sheep ('Caa`E', 'R`Afd')
Siren ('Caa`E', 'R`Afd')
Skeleton ('Dieemil`', 'Dadalmoo')
Slith ('Caa`E', 'R`Afd')
Sliver ('Caarm`', '`Qbafd')
Slug ('Cab`', 'Bare')
Snake ('Caa`E', 'R`Afd')
Soltari ('Aaapaas', 'Saa`Iwe')
Spawn ('Caa`E', 'R`Afd')
Specter ('Aaapaas', 'Saa`Iwe')
Sphinx ('Caarm`', '`Qbafd')
Spider ('Caarm`', '`Qbafd')
Spike ('Caa`E', 'R`Afd')
Spirit ('Caarm`', '`Qbafd')
Sponge ('Caarm`', '`Qbafd')
Squid ('Caa`E', 'R`Afd')
Squirrel ('Aaapaas', 'Saa`Iwe')
Starfish ('Aaapaas', 'Saa`Iwe')
Surrakar ('Aaapaas', 'Saa`Iwe')
Tetravite ('Caapawaae', 'Aausabige')
Thalakos ('Dieemil`', 'Dadalmoo')
Thopter ('Aaapaas', 'Saa`Iwe')
Thrull ('Deegmd', 'Fagele')
Treefolk ('Dieemil`', 'Dadalmoo')
Triskelavite ('Deeeigleek', 'Bemdoafmeg')
Troll ('Dieel', 'Aielm')
Turtle ('Caarm`', '`Qbafd')
Unicorn ('Dieemil`', 'Dadalmoo')
Vampire ('Aaapaas', 'Saa`Iwe')
Vedalken ('Dieemil`', 'Dadalmoo')
Viashino ('Dieemil`', 'Dadalmoo')
Volver ('Deegmd', 'Fagele')
Wall ('Ggfd', 'Bene')
Weird ('Caa`E', 'R`Afd')
Werewolf ('Dieemil`', 'Dadalmoo')
Whale ('Caa`E', 'R`Afd')
Wolf ('Ggfd', 'Bene')
Wolfir ('Deegmd', 'Fagele')
Wolverine ('Caapawaae', 'Aausabige')
Wombat ('Deegmd', 'Fagele')
Worm ('Ggfd', 'Bene')
Wraith ('Caarm`', '`Qbafd')
Wurm ('Ggfd', 'Befe')
Yeti ('Car`', 'Bare')
Zombie ('Deegmd', 'Fagele')
Zubera ('Caarm`', '`Qbafd')
Treefolf ('Dieemil`', 'Dadalmoo')
ETA: 80 HN at 2000 iterations. Going to walk it back to 200 for comparison.
Amphin ('Amphin', 'Amphin')
Angel ('Angel', 'Angel')
Antelope ('Antelope', 'Antelope')
Ape ('Ape', 'Ape')
Archon ('Archon', 'Archon')
Assembly-Worker ('Ascembly-Worker', 'Assembly-Worker')
Atog ('Atog', 'Atog')
Aurochs ('Aurochs', 'Aurochs')
Avatar ('Avatar', 'Avatar')
Aven ('Aven', 'Aven')
Badger ('Badger', 'Badger')
Basilisk ('Basilisk', 'Basilisk')
Bat ('Bat', 'Bat')
Bear ('Bear', 'Bear')
Beast ('Beast', 'Beast')
Beeble ('Beeble', 'Beeble')
Bird ('Bird', 'Bird')
Blinkmoth ('Blinkmoth', 'Blinkmoth')
Boar ('Boar', 'Boar')
Bringer ('Bringer', 'Bringer')
Brushwagg ('Brushwagg', 'Brushwagg')
Camarid ('Camarid', 'Camarid')
Camel ('Camel', 'Camel')
Caribou ('Caribou', 'Caribou')
Carrier ('Carrier', 'Carrier')
Cat ('Cat', 'Cat')
Centaur ('Centaur', 'Centaur')
Cephalid ('Cephalid', 'Cephalid')
Chimera ('Chimera', 'Chimera')
Cockatrice ('Cockatrice', 'Cockatrice')
Construct ('Construct', 'Construct')
Crab ('Crab', 'Crab')
Crocodile ('Crocodile', 'Crocodile')
Cyclops ('Cyclops', 'Cyclops')
Dauthi ('Dauthi', 'Dauthi')
Demon ('Demon', 'Demon')
Devil ('Devil', 'Devil')
Djinn ('Djinn', 'Djinn')
Dragon ('Dragon', 'Dragon')
Drake ('Drake', 'Drake')
Dreadnought ('Dreadnought', 'Dreadnought')
Drone ('Drone', 'Drone')
Dryad ('Dryad', 'Dryad')
Dwarf ('Dwarf', 'Dwarf')
Eldrazi ('Eldrazi', 'Eldrazi')
Elemental ('Elemental', 'Elemental')
Elephant ('Elephant', 'Elephant')
Elf ('Elf', 'Elf')
Elk ('Elk', 'Elk')
Eye ('Eye', 'Eye')
Faerie ('Faerie', 'Faerie')
Ferret ('Ferret', 'Ferret')
Fish ('Fish', 'Fish')
Fox ('Fox', 'Fox')
Frog ('Frog', 'Frog')
Fungus ('Fungus', 'Fungus')
Gargoyle ('Gargoyle', 'Gargoyle')
Germ ('Germ', 'Germ')
Giant ('Giant', 'Giant')
Gnome ('Gnome', 'Gnome')
Goat ('Goat', 'Goat')
Goblin ('Goblin', 'Goblin')
God ('God', 'God')
Golem ('Golem', 'Golem')
Gorgon ('Gorgon', 'Gorgon')
Gremlin ('Gremlin', 'Gremlin')
Griffin ('Griffin', 'Griffin')
Hag ('Hag', 'Hag')
Harpy ('Harpy', 'Harpy')
Hellion ('Hellion', 'Hellion')
Hippo ('Hippo', 'Hippo')
Hippogriff ('Hippogriff', 'Hippogriff')
Homarid ('Homarid', 'Homarid')
Homunculus ('Homunculus', 'Homunculus')
Horror ('Horror', 'Horror')
Horse ('Horse', 'Horse')
Hound ('Hound', 'Hound')
Human ('Human', 'Human')
Hydra ('Hydra', 'Hydra')
Hyena ('Hyena', 'Hyena')
Illusion ('Illusion', 'Illusion')
Imp ('Imp', 'Imp')
Incarnation ('Incarnation', 'Incarnation')
Insect ('Insect', 'Insect')
Jellyfish ('Jellyfish', 'Jellyfish')
Juggernaut ('Juggernaut', 'Jeggernaut')
Kavu ('Kavu', 'Kavu')
Kirin ('Kirin', 'Kirin')
Kithkin ('Kithkin', 'Kithkin')
Kitsune ('Kitsune', 'Kitsune')
Kobold ('Kobold', 'Kobold')
Kor ('Kor', 'Kor')
Kraken ('Kraken', 'Kraken')
Lamia ('Lamia', 'Lamia')
Lammasu ('Lammasu', 'Lammasu')
Leech ('Leech', 'Leech')
Leonin ('Leonin', 'Leonin')
Leviathan ('Leviathan', 'Leviathan')
Lhurgoyf ('Lhurgoyf', 'Lhurgoyf')
Licid ('Licid', 'Licid')
Lizard ('Lizard', 'Lizard')
Loxodon ('Loxodon', 'Loxodon')
Manticore ('Manticore', 'Manticore')
Masticore ('Masticore', 'Masticore')
Merfolk ('Merfolk', 'Merfolk')
Metathran ('Metathran', 'Metathran')
Minotaur ('Minotaur', 'Minotaur')
Mongoose ('Mongoose', 'Mongoose')
Moonfolk ('Moonfolk', 'Moonfolk')
Mutant ('Mutant', 'Mutant')
Myr ('Myr', 'Myr')
Naga ('Naga', 'Naga')
Nantuko ('Nantuko', 'Nantuko')
Nautilus ('Nautilus', 'Nautilus')
Nephilim ('Nephilim', 'Nephilim')
Nezumi ('Nezumi', 'Nezumi')
Nightmare ('Nightmare', 'Nightmare')
Nightstalker ('Nightstalker', 'Nightstalker')
Noggle ('Noggle', 'Noggle')
Nymph ('Nymph', 'Nymph')
Octopus ('Octopus', 'Octopus')
Ogre ('Ogre', 'Ogre')
Ooze ('Ooze', 'Ooze')
Orb ('Orb', 'Orb')
Orc ('Orc', 'Orc')
Orgg ('Orgg', 'Orgg')
Orochi ('Orochi', 'Orochi')
Ouphe ('Ouphe', 'Ouphe')
Ox ('Ox', 'Ox')
Oyster ('Oyster', 'Oyster')
Pegasus ('Pegasus', 'Pegasus')
Pentavite ('Pentavite', 'Pentavite')
Pest ('Pest', 'Pest')
Phelddagrif ('Phelddagrif', 'Phelddagrif')
Phoenix ('Phoenix', 'Phoenix')
Phyrexian ('Phyrexian', 'Phyrexian')
Plant ('Plant', 'Plant')
Prism ('Prism', 'Prism')
Rabbit ('Rabbit', 'Rabbit')
Rakshasas ('Rakshasas', 'Rakshasas')
Rat ('Rat', 'Rat')
Reflection ('Reflection', 'Reflection')
Rhino ('Rhino', 'Rhino')
Rhox ('Rhox', 'Rhox')
Sable ('Sable', 'Sable')
Salamander ('Salamander', 'Salamander')
Sand ('Sand', 'Sand')
Saproling ('Saproling', 'Saproling')
Satyr ('Satyr', 'Satyr')
Scarecrow ('Scarecrow', 'Scarecrow')
Scorpion ('Scorpion', 'Scorpion')
Serpent ('Serpent', 'Serpent')
Shade ('Shade', 'Shade')
Shapeshifter ('Shapeshifter', 'Shapeshifter')
Sheep ('Sheep', 'Sheep')
Siren ('Siren', 'Siren')
Skeleton ('Skeleton', 'Skeleton')
Slith ('Slith', 'Slith')
Sliver ('Sliver', 'Sliver')
Slug ('Slug', 'Slug')
Snake ('Snake', 'Snake')
Soltari ('Soltari', 'Soltari')
Spawn ('Spawn', 'Spawn')
Specter ('Specter', 'Specter')
Sphinx ('Sphinx', 'Sphinx')
Spider ('Spider', 'Spider')
Spike ('Spike', 'Spike')
Spirit ('Spirit', 'Spirit')
Sponge ('Sponge', 'Sponge')
Squid ('Squid', 'Squid')
Squirrel ('Squirrel', 'Squirrel')
Starfish ('Starfish', 'Starfish')
Surrakar ('Surrakar', 'Surrakar')
Tetravite ('Tetravite', 'Tetravite')
Thalakos ('Thalakos', 'Thalakos')
Thopter ('Thopter', 'Thopter')
Thrull ('Thrull', 'Thrull')
Treefolk ('Treefolk', 'Treefolk')
Triskelavite ('Triskelavite', 'Triskelavite')
Troll ('Troll', 'Troll')
Turtle ('Turtle', 'Turtle')
Unicorn ('Unicorn', 'Unicorn')
Vampire ('Vampire', 'Vampire')
Vedalken ('Vedalken', 'Vedalken')
Viashino ('Viashino', 'Viashino')
Volver ('Volver', 'Volver')
Wall ('Wall', 'Wall')
Weird ('Weird', 'Weird')
Werewolf ('Werewolf', 'Werewolf')
Whale ('Whale', 'Whale')
Wolf ('Wolf', 'Wolf')
Wolfir ('Wolfir', 'Wolfir')
Wolverine ('Wolverine', 'Wolverine')
Wombat ('Wombat', 'Wombat')
Worm ('Worm', 'Worm')
Wraith ('Wraith', 'Wraith')
Wurm ('Wurm', 'Wurm')
Yeti ('Yeti', 'Yeti')
Zombie ('Zombie', 'Zombie')
Zubera ('Zubera', 'Zubera')
Treefolf ('Treefolo', 'Treefolg')
Amphin ('Amphin', 'Amphin')
Angel ('Angel', 'Angel')
Antelope ('Antelope', 'Antelope')
Ape ('Ape', 'Ape')
Archon ('Archon', 'Archon')
Assembly-Worker ('Ascembly-Worker', 'Assembly-Worker')
Atog ('Atog', 'Atog')
Aurochs ('Aurocls', 'Aurochs')
Avatar ('Avatar', 'Avatar')
Aven ('Aven', 'Aven')
Badger ('Badger', 'Badger')
Basilisk ('Basilisk', 'Basilisk')
Bat ('Bat', 'Bat')
Bear ('Bear', 'Bear')
Beast ('Beast', 'Beast')
Beeble ('Beeble', 'Beeble')
Bird ('Bird', 'Bird')
Blinkmoth ('Blinkmoth', 'Blinkmoth')
Boar ('Boar', 'Boar')
Bringer ('Bringer', 'Bringer')
Brushwagg ('Brushwagg', 'Brushwagg')
Camarid ('Camarid', 'Camarid')
Camel ('Camel', 'Camel')
Caribou ('Caribou', 'Caribou')
Carrier ('Carrier', 'Carrier')
Cat ('Cat', 'Cat')
Centaur ('Centaur', 'Centaur')
Cephalid ('Cephalid', 'Cephalid')
Chimera ('Chimera', 'Chimera')
Cockatrice ('Cockatrice', 'Bockatrice')
Construct ('Construct', 'Construct')
Crab ('Crab', 'Crab')
Crocodile ('Crocodile', 'Crocodile')
Cyclops ('Cyclops', 'Cyclops')
Dauthi ('Dauthi', 'Dauthi')
Demon ('Demon', 'Demon')
Devil ('Devil', 'Devil')
Djinn ('Djinn', 'Djinn')
Dragon ('Dragon', 'Dragon')
Drake ('Drake', 'Drake')
Dreadnought ('Dreadnmught', 'Dreadnought')
Drone ('Drone', 'Drone')
Dryad ('Dryad', 'Dryad')
Dwarf ('Dwarf', 'Dwarf')
Eldrazi ('Eldrazi', 'Eldrazi')
Elemental ('Elemental', 'Elemental')
Elephant ('Elephant', 'Elephant')
Elf ('Elf', 'Elf')
Elk ('Elk', 'Elk')
Eye ('Eye', 'Eye')
Faerie ('Faerie', 'Faerie')
Ferret ('Ferret', 'Ferret')
Fish ('Fish', 'Fish')
Fox ('Fox', 'Fox')
Frog ('Frog', 'Frog')
Fungus ('Fungus', 'Fungus')
Gargoyle ('Gargo}Le', 'Gargoyle')
Germ ('Germ', 'Germ')
Giant ('Giant', 'Giant')
Gnome ('Gnome', 'Gnome')
Goat ('Goat', 'Goat')
Goblin ('Goblin', 'Goblin')
God ('God', 'God')
Golem ('Golem', 'Golem')
Gorgon ('Gorgon', 'Gorgon')
Gremlin ('Gremlin', 'Gremlin')
Griffin ('Griffin', 'Griffin')
Hag ('Hag', 'Hag')
Harpy ('Harpi', 'Harpy')
Hellion ('Hellion', 'Hellion')
Hippo ('Hippo', 'Hippo')
Hippogriff ('Hippogriff', 'Hippogriff')
Homarid ('Homarid', 'Homarid')
Homunculus ('Homunculus', 'Homunculus')
Horror ('Horror', 'Horror')
Horse ('Horse', 'Horse')
Hound ('Hound', 'Hound')
Human ('Human', 'Human')
Hydra ('Hydra', 'Hydra')
Hyena ('Hyena', 'Hyena')
Illusion ('Illusion', 'Illusion')
Imp ('Imp', 'Imp')
Incarnation ('Incarnation', 'Incarnation')
Insect ('Insect', 'Insect')
Jellyfish ('Jellyfish', 'Jellyfish')
Juggernaut ('Juggernaut', 'Jeggernaut')
Kavu ('Kavu', 'Kavu')
Kirin ('Kirin', 'Kirin')
Kithkin ('Kithkin', 'Kithkin')
Kitsune ('Kitsune', 'Kitsune')
Kobold ('Kobold', 'Kobold')
Kor ('Kor', 'Kor')
Kraken ('Kraken', 'Kraken')
Lamia ('Lamia', 'Lamia')
Lammasu ('Lammasu', 'Lammasu')
Leech ('Leech', 'Leech')
Leonin ('Leonin', 'Leonin')
Leviathan ('Leviathan', 'Leviathan')
Lhurgoyf ('Lhurgoyf', 'Lhurgoif')
Licid ('Licid', 'Licid')
Lizard ('Lizard', 'Lizard')
Loxodon ('Loxodon', 'Loxodon')
Manticore ('Manticore', 'Manticore')
Masticore ('Masticore', 'Masticore')
Merfolk ('Merfolk', 'Merfolk')
Metathran ('Metathran', 'Metathran')
Minotaur ('Minotaur', 'Minotaur')
Mongoose ('Mongoose', 'Mongoose')
Moonfolk ('Moonfolk', 'Moonfolk')
Mutant ('Mutant', 'Mutant')
Myr ('Myr', 'Myr')
Naga ('Naga', 'Naga')
Nantuko ('Nantuko', 'Nantuko')
Nautilus ('Nautilus', 'Nautilus')
Nephilim ('Nephilim', 'Nephilim')
Nezumi ('Nezumi', 'Nezumi')
Nightmare ('Nightmare', 'Nightmare')
Nightstalker ('Nightstalker', 'Nightstalker')
Noggle ('Noggle', 'Noggle')
Nymph ('Nymph', 'Nymph')
Octopus ('Octopus', 'Octopus')
Ogre ('Ogre', 'Ogre')
Ooze ('Ooze', 'Ooze')
Orb ('Orb', 'Orb')
Orc ('Orc', 'Orc')
Orgg ('Orgg', 'Orgg')
Orochi ('Orochi', 'Orochi')
Ouphe ('Ouphe', 'Ouphe')
Ox ('Ox', 'Ox')
Oyster ('Oyster', 'Oyster')
Pegasus ('Pegasus', 'Pegasus')
Pentavite ('Pentavite', 'Pentavite')
Pest ('Pest', 'Pest')
Phelddagrif ('Phelddagrif', 'Phelddagrif')
Phoenix ('Phoenix', 'Phoenix')
Phyrexian ('Phyrexian', 'Phyrexian')
Plant ('Plant', 'Plant')
Prism ('Prism', 'Prism')
Rabbit ('Rabbit', 'Rabbit')
Rakshasas ('Rakshasas', 'Rakshasas')
Rat ('Rat', 'Rat')
Reflection ('Reflection', 'Reflection')
Rhino ('Rhino', 'Rhino')
Rhox ('Rhox', 'Rhox')
Sable ('Sable', 'Sable')
Salamander ('Salamander', 'Salamander')
Sand ('Sand', 'Sand')
Saproling ('Saproling', 'Saproling')
Satyr ('Satyr', 'Satyr')
Scarecrow ('Scarecrow', 'Scarecrow')
Scorpion ('Scorpion', 'Scorpion')
Serpent ('Serpent', 'Serpent')
Shade ('Shade', 'Shade')
Shapeshifter ('Shapeshifter', 'Shapeshifter')
Sheep ('Sheep', 'Sheep')
Siren ('Siren', 'Siren')
Skeleton ('Skeleton', 'Skeleton')
Slith ('Slith', 'Slith')
Sliver ('Sliver', 'Sliver')
Slug ('Slug', 'Slug')
Snake ('Snake', 'Snake')
Soltari ('Soltari', 'Soltari')
Spawn ('Spawn', 'Spawn')
Specter ('Spectar', 'Specter')
Sphinx ('Sphinx', 'Sphinx')
Spider ('Spider', 'Spider')
Spike ('Spike', 'Spike')
Spirit ('Spirit', 'Spirit')
Sponge ('Sponge', 'Sponoe')
Squid ('Squid', 'Squid')
Squirrel ('Squirrel', 'Squirrel')
Starfish ('Starfish', 'Starfish')
Surrakar ('Surrakar', 'Surrakar')
Tetravite ('Tetravite', 'Tetravite')
Thalakos ('Thalakos', 'Thalakos')
Thopter ('Thoptar', 'Thopter')
Thrull ('Thrull', 'Thrull')
Treefolk ('Treefolk', 'Treefolk')
Triskelavite ('Triskelavite', 'Triskelavite')
Troll ('Troll', 'Troll')
Turtle ('Turtle', 'Turtle')
Unicorn ('Unicorn', 'Unicorn')
Vampire ('Vampire', 'Vampire')
Vedalken ('Vedalken', 'Vedalken')
Viashino ('Viashino', 'Viashino')
Volver ('Volver', 'Volver')
Wall ('Wall', 'Wall')
Weird ('Weird', 'Weird')
Werewolf ('Werewolf', 'Werewolf')
Whale ('Whale', 'Whale')
Wolf ('Wolf', 'Wolf')
Wolfir ('Wolfir', 'Wolfir')
Wolverine ('Wolverine', 'Wolverine')
Wombat ('Wombat', 'Wombat')
Worm ('Worm', 'Worm')
Wraith ('Wraith', 'Wraith')
Wurm ('Wurm', 'Wurm')
Yeti ('Yeti', 'Yeti')
Zombie ('Zombie', 'Zombie')
Zubera ('Zubera', 'Zubera')
Treefolf ('Treefolo', 'Treefolo')
Amphin ('`Mpakn', 'Ioqhin')
Angel ('Dngel', 'Angml')
Antelope ('Gotgmoie', 'Aotuooxe')
Ape ('Vpe', 'Aid')
Archon ('Aracon', 'Ercoon')
Assembly-Worker ('Ikseebly-Worker', 'Assemb`Y)Wkrkmr')
Atog ('Aveo', 'Avon')
Aurochs ('Carc`Lq', 'Aarkfit')
Avatar ('St`Per', 'Arqpmr')
Aven ('Aveo', 'Avon')
Badger ('R`Mfep', 'Ralfep')
Basilisk ('Bacinisk', 'Bashhisk')
Bat ('Rat', '`At')
Bear ('Jegr', 'Beap')
Beast ('Bees`', '`Eaqu')
Beeble ('Feebie', 'Faebie')
Bird ('Gmpd', 'Baze')
Blinkmoth ('Fhmgkcod`', 'Bhhnilodk')
Boar ('Jegr', 'Beap')
Bringer ('Brmggqp', 'Bbangas')
Brushwagg ('Crecigife', 'Bses`Gkcg')
Camarid ('Bmmazid', 'Bima`Il')
Camel ('Vamml', 'C`Mml')
Caribou ('Carc`Lq', 'Aarkfit')
Carrier ('Caprqes', 'Capc`Ap')
Cat ('Rat', '`At')
Centaur ('Belpqqr', 'Belp`Er')
Cephalid ('Cat`Hmid', 'Cea`Ahid')
Chimera ('Cjigerc', 'Ckhogrc')
Cockatrice ('Ccacersicd', 'Ccckavricd')
Construct ('Kodsurwct', 'Cooavrect')
Crab ('Brcc', 'Crab')
Crocodile ('Bricofile', 'Ccobefibe')
Cyclops ('Bxibopq', 'Cpqlopq')
Dauthi ('Dewehi', 'Vaivhi')
Demon ('Domon', 'Femon')
Devil ('Devil', 'Fevim')
Djinn ('Djkon', 'Djinn')
Dragon ('Aracon', 'Ercoon')
Drake ('Draad', 'Dricd')
Dreadnought ('Dreade`Aght', 'Dreadnefomd')
Drone ('Evmge', 'Dvone')
Dryad ('Draad', 'Dricd')
Dwarf ('D}Arl', 'Dvire')
Eldrazi ('Bmlriri', '`Ldrare')
Elemental ('Deeaeotal', 'Eegmdvmal')
Elephant ('Gmt``Cot', 'Ele``Eot')
Elf ('Dmd', 'Emd')
Elk ('D`K$', 'Elh')
Eye ('Fae', 'Aid')
Faerie ('Feebie', 'Faebie')
Ferret ('Fervet', 'Ferret')
Fish ('`Aq`', 'Bisd')
Fox ('Nmt', 'Dmp')
Frog ('Frgg(', 'Grgg')
Fungus ('Gvmges', 'Faoges')
Gargoyle ('Fareoohg', 'Geraoolg')
Germ ('Gere', 'Werm')
Giant ('Rhadd', 'Shalt')
Gnome ('Fnmgm', 'Eoolm')
Goat ('Gggv', 'Ggau')
Goblin ('Emrein', 'Gosdmn')
God ('Dmd', 'Emd')
Golem ('Dolge', 'Eogmm')
Gorgon ('Eorgon', 'Gnsgon')
Gremlin ('Frmelin', 'Gwanoin')
Griffin ('Frmelin', 'Gwanoin')
Hag ('Hag (', 'Baf')
Harpy ('Jarph', 'Hirpq')
Hellion ('Lmleikn', 'Leldeio')
Hippo ('Jirpo', 'Iirpg')
Hippogriff ('`I`Pogbiff', 'Hiqpneriff')
Homarid ('Bmmazid', 'Bima`Il')
Homunculus ('`Omuecflus', 'Homtnephen')
Horror ('Hmrvof', 'Hospor')
Horse ('Iorwm', 'Lkrsg')
Hound ('Lovgd', 'Howmd')
Human ('Lemin', 'Numil')
Hydra ('Kilna', 'Hiera')
Hyena ('Kilna', 'Hiera')
Illusion ('Lmlmkkoo', 'Mlmosmoo')
Imp ('Kmr', 'Amp')
Incarnation ('Iocavnatmon', 'Inkarnarion')
Insect ('Iorggv', 'Insekt')
Jellyfish ('Jmltackv`', 'Jaldhfire')
Juggernaut ('Swegerfaut', 'Buggerfeet')
Kavu ('Smrt', 'Bavu')
Kirin ('Jir`N', 'Jipif')
Kithkin ('Jithkio', 'Kaxhiyn')
Kitsune ('Kmpou~C', 'Ki~Seod')
Kobold ('Morglf', 'Oocgnd')
Kor ('Kmr', 'Amp')
Kraken ('Aracon', 'Ercoon')
Lamia ('Lamih', 'Lmmil')
Lammasu ('Namgaqu!', 'Oamogqu')
Leech ('Lmmk`', 'Leekl')
Leonin ('Deogoo', 'Jeonio')
Leviathan ('Ledaaedan', 'Leveethan')
Lhurgoyf ('Mavcooid', 'Fmergoig')
Licid ('Liaad', 'Lkgad')
Lizard ('Niz`Jd', 'Lijazd')
Loxodon ('Mo|G`On', 'Nghogmn')
Manticore ('Labticord', 'Masvibmre')
Masticore ('Labticord', 'Masvibmre')
Merfolk ('Mavcooid', 'Fmergoig')
Metathran ('Ledayopan', 'Metadppao')
Minotaur ('Lmleucwq', 'Mimoteer')
Mongoose ('Mongookg', 'Ionmomre')
Moonfolk ('Lomoooni', 'Monmfmmo')
Mutant ('Eupadt', 'Murajd')
Myr ('Kmr', 'Amp')
Naga ('Laea', 'Faad')
Nantuko ('Lmleikn', 'Leldeio')
Nautilus ('Kathamus', 'Faqxidus')
Nephilim ('Oataimid', 'Feqhilio')
Nezumi ('Lejumk', 'Nmzeli')
Nightmare ('Niabheard', 'Lagh`Nire')
Nightstalker ('Jig`Tstalker', 'Nighpstadker')
Noggle ('Logglg', 'Nmogle')
Nymph ('Nim`H', 'Mmipm')
Octopus ('Ocvgpes', 'Gapovas')
Ogre ('Ogre(', 'Ogre')
Ooze ('Oorg(', 'Ooze')
Orb ('Grc!', '"Mra')
Orc ('Grc!', '"Mra')
Orgg ('Frgg(', 'Grgg')
Orochi ('Frmcli', 'Gqmgil')
Ouphe ('Japhm', '~Qpif')
Ox ('Sh', '!Op')
Oyster ('Omrtmf', 'Oksvmr')
Pegasus ('Beepqur', 'Padapar')
Pentavite ('Vettavite', 'Tedtavite')
Pest ('Sert', 'Perq')
Phelddagrif ('Thedle`Efif', 'Pheddfafamf')
Phoenix ('Rhmanir', 'Pho`Hij')
Phyrexian ('R`Ipmrimn', 'Riypepibm')
Plant ('Rhadd', 'Shalt')
Prism ('Prism', 'Tpirm')
Rabbit ('Barrat', 'Racbat')
Rakshasas ('Ramraasas', 'Racr``Ras')
Rat ('Rat', '`At')
Reflection ('Teeeeadmon', 'Refdegtimn')
Rhino ('Rhilo', 'Whinm')
Rhox ('Shmh', 'Pioh')
Sable ('Sardm', 'Sipme')
Salamander ('Saldasnder', 'Ramag`Dder')
Sand ('Seld', 'Qa~E')
Saproling ('Raarooing', 'Scqre`Hfg')
Satyr ('Sath`', 'Ratyp')
Scarecrow ('Raarmcsow', 'Scare`Xgw')
Scorpion ('Sild`Con', 'S`G``Ion')
Serpent ('Seppezd', 'Sarpelt')
Shade ('Shade', 'Shade')
Shapeshifter ('Rhapeqbifter', 'Shaqeshebper')
Sheep ('R`Em`', 'Shemp')
Siren ('Rirdl', 'Sirmf')
Skeleton ('Sidleron', 'Siuj`Eer')
Slith ('Rhidh', 'Qli|M')
Sliver ('Shkvmr', 'Siives')
Slug ('Cevg', 'Cnse')
Snake ('Rjake', 'Slire')
Soltari ('Rejtiri', 'Rglpari')
Spawn ('Spakd', 'Rpagd')
Specter ('Bpecter', 'Sadcrar')
Sphinx ('Rpialx', 'Sq`Mlh')
Spider ('Spafmp', 'Ssibes')
Spike ('Spake', 'Rpicd')
Spirit ('Sparap', 'Spareq')
Sponge ('Cpagom', 'Sqonke')
Squid ('Sutid', 'Rqeid')
Squirrel ('Sqe``Pml', 'Squ``Idl')
Starfish ('Cpacfish', 'Swapfisl')
Surrakar ('Sat`Aaar', 'Seqpa`Ar')
Tetravite ('Vettavite', 'Tedtavite')
Thalakos ('Biilkkoj', 'Phah`Iok')
Thopter ('Rdmpupp', 'Rhgpper')
Thrull ('Tmrell', 'Tirell')
Treefolk ('Dseenoli', 'Dgggfomn')
Triskelavite ('Drisoe`Avite', 'Triwignefk`E')
Troll ('Trmml', 'Tpolm')
Turtle ('Turtld', 'Verrle')
Unicorn ('Cnigork', '`Niooro')
Vampire ('Rempipa', 'R`Epire')
Vedalken ('Vamehill', 'Tdualomn')
Viashino ('Wiaahiln', 'Viqqhmln')
Volver ('Vnmver', 'Volvms')
Wall ('Feld', 'Uelm')
Weird ('Seare', 'Vuire')
Werewolf ('Fareoohg', 'Geraoolg')
Whale ('Shade', 'Shade')
Wolf ('Gglg', 'Ggie')
Wolfir ('Vmlfar', 'Wimfmq')
Wolverine ('Wolverine', 'Wolvavife')
Wombat ('Veefap', 'Vimfit')
Worm ('Gere', 'Werm')
Wraith ('Wrailh', '"Wpamxi')
Wurm ('Gere', 'Werm')
Yeti ('Qevi', 'Qe~I')
Zombie ('Jembke', 'Jmmbio')
Zubera ('Lmjdja', 'Zmjera')
Treefolf ('Dseenoli', 'Dgggfomn')
ETA: 20,000 iterations at 40 HN. I'm going to bump up the HN count a little, and start back from 200 iterations.
Amphin ('Amphin', 'Alphin')
Angel ('Angel', 'Angel')
Antelope ('Altenoqe', 'Aotglgpe')
Ape ('Ape', 'Ape')
Archon ('Crchon (', 'Archon')
Assembly-Worker ('Assembly-Worker', 'Assembly-Worker')
Atog ('Atog', 'Atog')
Aurochs ('Curochs', 'Ausncls')
Avatar ('Cvatar', 'Afatar')
Aven ('Aven', 'Avef')
Badger ('Redger `', '` Badger')
Basilisk ('Bisklisk', 'Basilisk')
Bat ('Bat', 'Bat')
Bear ('Bear', 'Bear')
Beast ('Beasd', 'Beast')
Beeble ('Neebme!`', '` Beeble')
Bird ('Cird', 'Bird')
Blinkmoth ('Klinkmoth', 'Blinkmoth')
Boar ('Boar', 'Bmar')
Bringer ('Bpinger', 'Brknger')
Brushwagg ('Brus`wagg', 'Brushwcgg')
Camarid ('Camarid', 'Camarid')
Camel ('Camel', 'Camel')
Caribou ('Caribow', 'Caribot')
Carrier ('Carpier', 'Carbier')
Cat ('Cat', 'Cat')
Centaur ('Ceotaur', 'Cef`aur')
Cephalid ('Cephalid', 'Cephalid')
Chimera ('Chimera', 'Cximepa')
Cockatrice ('Cockatrice', 'Cockatrice')
Construct ('Construct', 'Coostrqct')
Crab ('Crab', 'Crab')
Crocodile ('Crmcodile', 'Crocotile')
Cyclops ('Ciclors', 'Cykmops')
Dauthi ('Dattli', 'dauthh')
Demon ('Demon', 'Demon')
Devil ('Devil', 'Devil')
Djinn ('Djinn', 'Djion')
Dragon ('@ragon(', 'Dvagon')
Drake ('Drake', 'Dsake')
Dreadnought ('Dreadnmught', 'Dreadnoughd')
Drone ('Drone', 'Drone')
Dryad ('Dryad', 'Drial')
Dwarf ('Dward', 'Dwarf')
Eldrazi ('Ellrazi', 'Eldrazi')
Elemental ('Elemen|al', 'Elementan')
Elephant ('Dlephant', 'Elephant')
Elf ('Elf$', 'Elf')
Elk ('Elk', 'Mlk')
Eye ('Eqe', 'Eye')
Faerie ('Naerie `', '` Fierid')
Ferret ('Ferret', 'Ferret')
Fish ('Disl', 'Fish')
Fox ('Dox', 'Fox')
Frog ('Brog', 'Frog')
Fungus ('Fungus', 'Fenges')
Gargoyle ('Gargoile', 'Gavgoyle')
Germ ('Germ', 'Germ')
Giant ('Ciand', 'Giant')
Gnome ('Gnome', 'Gnome')
Goat ('Goat', 'Gmat')
Goblin ('Goblin', 'Gmblin')
God ('God', 'God')
Golem ('Golem', 'Golem')
Gorgon ('Gorgon', 'Gorgon')
Gremlin ('Gpemlin', 'Gpaolin')
Griffin ('Grigfin', 'Grignin')
Hag ('Hag', 'Hag')
Harpy ('Haspy', 'Harpq')
Hellion ('Hellion', 'Lmllion')
Hippo ('Hippo', 'Hippn')
Hippogriff ('Hirpogpiff', 'Hi`pogriff')
Homarid ('Homarid', 'Homarid')
Homunculus ('Homunculus', 'Homulculus')
Horror ('Horrgr', 'Horror')
Horse ('Horse', 'Horse')
Hound ('Houod', 'Hound')
Human ('Human', 'Heman')
Hydra ('Hydra', 'Hyera')
Hyena ('Hyena', 'Hiena')
Illusion ('Illusion', 'Illusion')
Imp ('Amp', 'Imp')
Incarnation ('Incarnation', 'Incarnapiol')
Insect ('Inseat', 'Insecd')
Jellyfish ('Hellifish', 'Jelhifish')
Juggernaut ('Juggernaut', 'Juggernaut')
Kavu ('Savu', 'Kavu')
Kirin ('Kirin', 'Kirin')
Kithkin ('Kithkin', 'Kathiin')
Kitsune ('Kitwune', 'Kmtsune')
Kobold ('Kobold', 'Kobold')
Kor ('Kkr', 'Kor')
Kraken ('Kraken', 'Cvaion')
Lamia ('Lamia', 'Lamia')
Lammasu ('Nammasw', 'Lalmasu')
Leech ('Leecl', 'Leeaj')
Leonin ('Lenfin', 'Leonin')
Leviathan ('Leviathan', 'Leviavhan')
Lhurgoyf ('Haurooyf', 'Lhesgoif')
Licid ('Liaid', 'Licid')
Lizard ('Lirard', 'Lizard')
Loxodon ('Lopodon', 'Loxodon')
Manticore ('Manticore', 'Maotiaore')
Masticore ('Masticore', 'Masticore')
Merfolk ('Merfonk', 'Lesfolk')
Metathran ('Letathran', 'Metathran')
Minotaur ('Lingtaur', 'Minodcur')
Mongoose ('Oongoose', 'Mongoose')
Moonfolk ('Moonfolk', 'Moonfolk')
Mutant ('Lutand', 'Mutand')
Myr ('Myr', 'Myr')
Naga ('Faga', 'Faga')
Nantuko ('Naltuko', 'Nantuko')
Nautilus ('Nautimus', 'Nautilus')
Nephilim ('Nephilie', 'Nephmlim')
Nezumi ('Nqzumi', 'Ne~}li')
Nightmare ('Niehtmare', 'Nighdmare')
Nightstalker ('Big`tstalker', 'Nightspahcer')
Noggle ('Noggle!`', '` Noggle')
Nymph ('Lymph', 'Nymph')
Octopus ('Oatopus', 'Octotus')
Ogre ('Ogre', 'Ogre')
Ooze ('Ooze', 'Ooze')
Orb ('Orc', 'Orc')
Orc ('Orc', 'Orc')
Orgg ('Krgg', 'Orgg')
Orochi ('Nrochi', 'Oroghi')
Ouphe ('Ouphe', 'Outhe')
Ox ('Ox', 'Ox')
Oyster ('Nyster', 'Oxster')
Pegasus ('@egasus', 'Pegarur')
Pentavite ('Peltavite', 'Peltavite')
Pest ('Pesp', 'Pest')
Phelddagrif ('Phe`loaorif', 'Phelddagraf')
Phoenix ('@ioenip', 'Phoenix')
Phyrexian ('Phirezian', 'Phqrehian')
Plant ('Pland', 'Plant')
Prism ('Prism', 'Prism')
Rabbit ('Babbid', 'Rabjit')
Rakshasas ('Bakshasas', 'Racshasqs')
Rat ('Rat', 'Rat')
Reflection ('Seflebdion', 'Refleatimn')
Rhino ('Rhino', 'Rhino')
Rhox ('Rhox', 'Phox')
Sable ('Sable', 'Sable')
Salamander ('Selamander', 'Salamander')
Sand ('Sald', 'Sand')
Saproling ('Siproling', 'Saproling')
Satyr ('Satyz', 'Satyr')
Scarecrow ('Scarecrow', 'Scarecrow')
Scorpion ('Scospion', 'Scorpiol')
Serpent ('Serpenv', 'Serrant')
Shade ('Shade', 'Shade')
Shapeshifter ('Shapeshifter', 'Shapeshidter')
Sheep ('Sieep', 'Sheep')
Siren ('Siren', 'Siren')
Skeleton ('Skeme~on', 'ROaleton')
Slith ('Sliuh', 'Slith')
Sliver ('Sliver', 'Sliver')
Slug ('Smue', 'Slug')
Snake ('Snake', 'Snake')
Soltari ('Sohtari', 'Solpari')
Spawn ('Spawn', 'S`iwn')
Specter ('Specter', 'Wpecter')
Sphinx ('Sp`hfh', 'Qphinx')
Spider ('Spider', 'Spifer')
Spike ('Spike', 'Spike')
Spirit ('Sparit', 'Spipit')
Sponge ('Spmngg', 'Sponge')
Squid ('Squid', 'Squid')
Squirrel ('Squavrel', 'Squapren')
Starfish ('Starfish', 'Star`ish')
Surrakar ('Surrajab', 'Sqrr`kar')
Tetravite ('Tetravite', 'Tetravite')
Thalakos ('Dhalakos', 'Thalakos')
Thopter ('Ploptep', 'Thoptar')
Thrull ('Thruld', 'Taruld')
Treefolk ('Drgefolk', 'Treefolk')
Triskelavite ('Trisielavite', 'Triskelavite')
Troll ('Droll', 'Troll')
Turtle ('Turple', 'Turtld')
Unicorn ('Unigorn', 'Unigorn')
Vampire ('Vampira', 'Wampire')
Vedalken ('Vedalken', 'Vedalkel')
Viashino ('Viaslino', 'Viashino')
Volver ('Rolver', 'Folrer')
Wall ('Wall', 'Well')
Weird ('Veird', 'Weird')
Werewolf ('Werevolf', 'Wevewolf')
Whale ('Whale', 'Whale')
Wolf ('Wolf', 'Wglf')
Wolfir ('Solfir', 'Womfir')
Wolverine ('Wolverine', 'Wolvevine')
Wombat ('Sombat', 'Wombat')
Worm ('Worm', 'Worm')
Wraith ('Sraidh$', 'Srainl')
Wurm ('Wurm', 'Wurm')
Yeti ('Qeti', 'Yeti')
Zombie ('Zombie', 'Zombie')
Zubera ('Zujera', 'Jubera')
Treefolf ('Tqwevkdb', 'Ppdgvono')
Brushwagg ('Bres`wagg', 'Brushwagg')
Cockatrice ('Cockatrice', 'Cockadrice')
Construct ('Construct', 'Coostruct')
Crocodile ('Crocodile', 'Crocodine')
Cyclops ('Cyclops', 'Cqklops')
Dreadnought ('Dreadnmught', 'Dreadnought')
Elephant ('Emephant', 'Elephant')
Fox ('Fox', 'Gox')
Gargoyle ('Gargoile', 'Gargokle')
Golem ('Golem', 'Gmlem')
Gremlin ('Grmmlin', 'Gremlin')
Hippo ('Hippo', 'Hipro')
Homunculus ('Hmmunculus', 'Homunculus')
Hydra ('Hydra', 'Hidra')
Hyena ('Hyena', 'Hiena')
Illusion ('Imlwsion', 'Illusion')
Incarnation ('Incarnation', 'Incapnatkon')
Jellyfish ('Jellyfish', 'Jellifish')
Juggernaut ('Juggernaut', 'Juggepnaut')
Kithkin ('Kithjin', 'Kithkin')
Kobold ('Kobond', 'Kobold')
Leech ('Leech', 'Leeah')
Leviathan ('Leviathan', 'Letiatian')
Lhurgoyf ('Lhurgoyf', 'Llurgoif')
Lizard ('Lirard', 'Lazard')
Loxodon ('Loxodon', 'Lohoeon')
Minotaur ('Minmtaur', 'Minotaur')
Myr ('Myr', 'Mir')
Nautilus ('Nautimus', 'Nautilus')
Nephilim ('Nephilie', 'Nephilim')
Nightmare ('Nightmare', 'Nighdmare')
Nightstalker ('Nightstalker', 'Nightsdalker')
Octopus ('Kctopus', 'Octopus')
Pest ('Pest', 'Rest')
Phyrexian ('Phirmxian', 'Phqrexian')
Reflection ('Reflebdion', 'Reflection')
Rhino ('Phino', 'Rhino')
Salamander ('Salemander', 'Salamander')
Saproling ('Saproling', 'Saprohing')
Scorpion ('Scorpion', 'Scgrpion')
Sheep ('Shee`', 'Sheep')
Skeleton ('Skelevon', 'Skeleton')
Specter ('Speater', 'Specter')
Starfish ('Starfish', 'Suarfish')
Surrakar ('Surrakar', 'Sqrrakar')
Tetravite ('Tetravite', 'Tetvavite')
Thrull ('Thrull', 'T`rull')
Triskelavite ('Trisielavite', 'Triskelavite')
Unicorn ('Unigorn', 'Unicorn')
Treefolf ('\\reefoln', 'Uveefolc')
I love the little things I see, like when the decoding is a homophone of the original input ("jellyfish" vs. "jellifish"). I've noticed things like this before with the network we've been working with, where the machine runs up against the quirks of English orthography. For instance, when coming up with names for cards, the network is comfortable spelling "necromancy" as "necromancie", which is closer to French orthography, but it also correctly forms the plural as "necromancies", a word that does not occur in the input corpus. In general, any rule set that attempts to describe how words are spelled in English cannot be both concise and correct.
By the way, I modified the training script to use ADAM instead of RMSprop, though I didn't make any other changes like adding in peephole connections. I have a few cycles to spare this evening so I figured I'd check and see what difference it makes to performance when all else is kept the same (just for fun). I'll let y'all know if anything comes of it.
My LinkedIn profile... thing (I have one of those now!).
My research team's webpage.
The mtg-rnn repo and the mtg-encode repo.
Tada.
Anyway, I'm now messing with the idea of using my encoding to do mashups between creature types. I'm running the big-network experiments now, though I don't expect them to bear fruit, since large networks have enough capacity to represent the encoding twice over. Once I get some results from that, I'll take it back down to around 80 or 90 HN.
I used the default learning rate, 0.9 for the first beta, 0.999 for the second beta, and an epsilon of 1 * 10^-8. In short, the default values that the Torch implementation provides.
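For concreteness, here's a minimal from-scratch sketch of the update those defaults drive (an illustration of the ADAM rule, not the Torch code itself; the default learning rate is assumed to be 1e-3):

```python
import math

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One ADAM update: moment estimates, bias correction, then the step."""
    m = beta1 * m + (1 - beta1) * grad          # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad * grad   # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)                # bias correction during warm-up
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (math.sqrt(v_hat) + eps)
    return param, m, v

# Minimizing f(x) = x^2 (gradient 2x) for a few hundred steps:
x, m, v = 5.0, 0.0, 0.0
for t in range(1, 501):
    x, m, v = adam_step(x, 2 * x, m, v, t)
```

Because the step is normalized by the second-moment estimate, the effective step size stays near the learning rate regardless of the gradient's scale, which is part of why it converges so briskly.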
And yes, after looking over the paper, faster convergence is exactly what I'm hoping for. I just wanted to see how well it behaved ceteris paribus before considering more drastic architectural changes.
EDIT: Yes, it converges very quickly. I like. Excellent suggestion.
Harpy ('Harpi', 'Harpy')
Hippogriff ('Hippogriff', 'Hippmgriff')
Nantuko ('Nanteko', 'Nantuko')
Nautilus ('Nautimus', 'Nautilus')
Nightstalker ('Nightstalker', 'Nightsdalker')
Phyrexian ('Phyrexian', 'Phxrexian')
Treefolf ('Treefolk', 'Tveefolg')
Merged output
Gargoyle Gargoile
Harpy Harpi
Nantuko Nanteko
Nautilus Nautimus
Treefolf Treefolk
Obscenely easy, in fact.
Instead of:
local _, loss = optim.rmsprop(feval, params, optim_state)
you just have
local _, loss = optim.adam(feval, params, optim_state)
But ADAM has some extra parameters that you can optionally pass to it that you could not with RMSprop, and we might consider exposing those options to the user.
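For instance, exposing them could look like this (the field names below are the ones Torch's optim.adam reads from its config table; the values shown are just the defaults spelled out):

```lua
-- Hypothetical optim_state exposing ADAM's knobs (optim.adam's defaults):
local optim_state = {
    learningRate = 1e-3,
    beta1 = 0.9,        -- decay rate for the first moment estimate
    beta2 = 0.999,      -- decay rate for the second moment estimate
    epsilon = 1e-8,     -- numerical stability term
    weightDecay = 0,    -- optional L2 penalty
}
local _, loss = optim.adam(feval, params, optim_state)
```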
EDIT: Peepholes, the more I think about it, seem like a fun idea to try. I would need to figure out how to solder them onto our existing implementation, of course. As you can see from the attached graph, which I lovingly borrowed from Gers et al. 2002, they allow the gates to see the current state of the cell even when the gates are otherwise closed. Evidently this proved helpful for motor control and rhythm detection tasks.
Tiir, correct me if I'm wrong, but is the idea that all cells on the same layer have peephole connections to the other cells on that layer? Just making sure I understand how this needs to work.
EDIT: Speaking of rhythm detection tasks, I wonder how well this will work for jazz music. It takes special training to listen for the notes that the musician isn't playing. That gives me fun ideas.
EDIT(2): Ohh, I think I see what would happen. Is it that neighboring cells are more likely to fall into agreement with each other, reducing the chance of concept drift? That seems like a possible outcome. Worth experimenting with, in any case.
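To make the mechanism concrete, here is a rough numpy sketch of one peephole LSTM step in the style of Gers et al. 2002: the input and forget gates get elementwise peephole weights into the previous cell state, and the output gate peeks at the freshly updated one. The shapes and names here are illustrative, not our actual implementation:

```python
import numpy as np

def peephole_lstm_step(x, h, c, W, U, p, b):
    """One LSTM step with peepholes: the gates 'see' the cell state through
    diagonal (elementwise) weights p, even when other gates are closed.
    W: input weights (4n x d), U: recurrent weights (4n x n),
    p: dict of peephole vectors keyed 'i'/'f'/'o', b: biases (4n)."""
    z = W @ x + U @ h + b
    a_i, a_f, a_g, a_o = np.split(z, 4)
    sigm = lambda v: 1.0 / (1.0 + np.exp(-v))
    i = sigm(a_i + p['i'] * c)        # input gate peeks at c_{t-1}
    f = sigm(a_f + p['f'] * c)        # forget gate peeks at c_{t-1}
    g = np.tanh(a_g)                  # candidate cell update
    c_new = f * c + i * g
    o = sigm(a_o + p['o'] * c_new)    # output gate peeks at the *new* cell state
    return o * np.tanh(c_new), c_new

# Tiny smoke test with random weights:
rng = np.random.default_rng(0)
n, d = 4, 3
W, U, b = rng.normal(size=(4*n, d)), rng.normal(size=(4*n, n)), np.zeros(4*n)
p = {k: rng.normal(size=n) for k in 'ifo'}
h, c = np.zeros(n), np.zeros(n)
h, c = peephole_lstm_step(rng.normal(size=d), h, c, W, U, p, b)
```

Note the connections run from each cell to its own gates, not between neighboring cells, which keeps the unrolling simple.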
Also, looking at the merge results again, I suddenly noticed that they're basically "left wins". There's no obvious reason for that to be the case, but there it is. Left must just be stronger with these parameters and seed.
Ohh, gotcha. I was under the impression that we were talking about some kind of cross-chatter mechanism. Like if all of a cell's friends smoked cigarettes, then smoking would start to seem like a very appealing idea to that cell. But then I started to wonder about the computational costs, because then I'd have to worry about information moving in multiple directions, which makes unrolling everything more complicated. What you're suggesting seems much more reasonable.
EDIT: And relatively easy to implement. Looking into that. Karpathy noted that they were "useless", and chose not to implement them. And yet, from what you've been telling me, they most definitely have a use. I'm interested to see how things turn out.
EDIT(2): Almost.. got it.. almost. May have made a typo, which is common for me after a long day, lol.
EDIT(3): And we are a go! Might have made a mistake. Might not have. I suppose I'll know soon enough. Right now I'm testing ADAM + peephole connections without biases on the gates.
Edit: I think a clearer example would be asking Vader to choke someone instead of asking a stormtrooper to do it. Sure, a stormtrooper can choke someone, but not as coolly and efficiently as Vader.
On the bright side, you'll have the coolest "old-man stories" when you're (legally?) old enough to tell one. Especially if you end up with far more interesting stories due to jumbled memories. (During the time I was working with the government, I get to pass through cool stuff with guns guarding an area with restricted men.)
Lua has good cross-platform support because it's built on plain and simple ANSI C. It's an extremely lightweight, dynamically typed language, but also one that supports features like first-class functions and garbage collection. I think the sophisticated but flexible memory management and speed are deciding factors for it being the language Torch is built on.
I can't speak to Python's use as a language for ML library development, but it is one of my favorites and is my go-to language for any programs of small to moderate size. It's also my favorite language to teach apprentice codesmiths, so it has that going for it as well. Meanwhile, I tend to do anything bulky in a language like C++/Java.
In my day-to-day work, I use Lua, Python, C, C++, and Java.
I also periodically make use of Fortran, Prolog, and Haskell.
And then there's a slew of formal (non-programming) design languages that are too numerous for me to count but that end up playing a role in a variety of projects that I undertake.
I use different languages for different purposes.
Rofl, I never thought of it quite like that before.
EDIT: I made a small mistake with the peephole connections. It worked, but not the way I meant for it to, and it slowed things down rather than speeding them up (my bad). I'll return to that in the morning when I have time, fix it, and then try running the training script again.
EDIT(2): Rerunning things. But I have a suspicion that something is wrong. I think it may have something to do with the fact that I try doing some weight sharing and those weights get unshared when I do a cast. But we'll see.
Wow, those are three languages I rarely see mentioned in the same breath, except perhaps to finish the sentence “The following languages share almost nothing in common:”
Intuitively speaking, it doesn't make sense to go above 90 neurons, because the alphabet I'm using simply doesn't have a high enough information density to justify it. The trick to finding low-error networks appears to be really low feedback (as I've said before) and a few thousand generations. Don't quote me on this, but I suspect the sweet spot for feedback is for the first-order factor to be within an order of magnitude of 0.5 divided by the number of input neurons, and the second-order factor to be any number significantly smaller than the first-order factor.
Does anyone have any intuitions about where it would be interesting to go from here? Once I can create encoders like this, I can, like I said, try to encode the encoding (which should reveal higher-order dependencies), give the network fancy merging capabilities ("create the average of these two encoded vectors"; will this look like anything? I don't know!), or simply move on to other fields and see if my scheme generalizes.
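The "average of two encoded vectors" experiment is easy to sketch. The encode/decode functions below are identity placeholders standing in for the trained halves of the auto-encoder, just to show the shape of the idea:

```python
import numpy as np

# Hypothetical stand-ins for the trained encoder/decoder halves of the network;
# in practice these would be the learned forward passes, not identities.
def encode(name_vector):
    return name_vector @ np.eye(len(name_vector))  # identity placeholder

def decode(latent):
    return latent  # identity placeholder

def merge(a, b):
    """Average two encoded vectors, then decode the midpoint."""
    return decode((encode(a) + encode(b)) / 2.0)

# Two toy one-hot "creature type" vectors and their merged midpoint:
griffin = np.array([1.0, 0.0, 0.0])
harpy   = np.array([0.0, 1.0, 0.0])
midpoint = merge(griffin, harpy)
```

With a real nonlinear encoder, the decoded midpoint needn't resemble either input, which is exactly what makes the experiment interesting.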
For now, I'm going to bump up the iteration count on this 80 HN net until it learns that it really shouldn't be "Harpi". NOTE FROM THE FUTURE: I done goofed, see spoiler.
Thinking on this stuff has me considering a few changes. One is that, instead of specifying a single first-order weight that gets applied directly to the deltas, I should really either modify the weight or average the deltas. That would save me from having to do mental arithmetic on long decimals to avoid chaos and local minima, and it would also make a given set of parameters more portable between networks of different sizes. Of course, actually making this change would involve figuring out what the heck math I'm actually having this stuff do...
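The two options can be sketched like so (the function names are illustrative, not the actual training script's): apply the raw first-order weight directly to summed deltas, or average the deltas first so the same setting carries between networks of different sizes:

```python
def update_raw(weights, deltas, alpha):
    """Current scheme: one first-order weight applied directly to the deltas,
    so alpha's meaning shifts with the magnitude and count of contributions."""
    return [w - alpha * d for w, d in zip(weights, deltas)]

def update_averaged(weights, delta_batches, alpha):
    """Alternative: average the deltas first, so alpha means the same thing
    regardless of how many delta contributions there are."""
    n = len(delta_batches)
    avg = [sum(col) / n for col in zip(*delta_batches)]
    return [w - alpha * d for w, d in zip(weights, avg)]

w = [1.0, 2.0]
batches = [[0.2, 0.4], [0.6, 0.0]]           # two sets of per-weight deltas
w2 = update_averaged(w, batches, alpha=0.5)  # applies the mean delta per weight
```

The averaged form is essentially what moving from a raw sum to a mean gradient buys you in standard SGD: the step size decouples from the number of contributions.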
ETA: got a good network with 81 HN. Not totally convinced that this isn't partially due to the extra 481 weights perturbing the random number distribution among the matrices (I'm doing most of this with the same seed), but hey, now I've got a perfectly fitted auto-encoder. It also handles a bit of data not in the training set, and some of my early "merge" experimentation.
ETA: "I can't see how to optimize the inner loop any more." I said. "Installing NumPyPy will be quick and painless." I said. "Let's see what kind of performance I get." I said. Ugh. 6x slowdown. It must not be running long enough for pypy to warm up. Either that, or NumPyPy isn't a good fit for this code. But, it was a nice orthogonal experiment.