I bet they're similar. Probably more similar than we might care to admit - and even our faculties of reason too. (Though personally, I'd say it only partly indicates that animal instincts are more complex - rather, it mostly shows that we aren't so far removed from other animals. But maybe I'm too behaviourist.)
Possibly. Although they could be quite dissimilar on an absolute scale and still be analogous. Look at a bat's wing, an elephant's foot, and a human hand.
Live, and farewell. If you know anything better than these precepts,
share it frankly; if not, use these with me.
Do you think we could recognize what another animal feels? A good starting place would probably be our recognition of what other humans feel. If we see the actions of other people and create our theory of mind by projecting our own mental processes onto them (and in doing so, accept that others have minds like ours - even if we don't really know exactly what/how they feel), what would be the equivalent between us and a theory of (non-human) animal minds? It would appear then that convergence wouldn't matter if we use actions to infer intentions, since functionality is the important feature; and given convergence, even though some animal actions may appear odd to us, we should be able to recognize some kind of intent, similar to humans*. Then, it looks like we would have to do what might be considered anthropomorphizing to accept a theory of animal mind. At the same time, we would probably take into account neural structure as a consequence of inferring intent in other humans and doing some neurological inquiring. But is that circular?
We consider ourselves to have minds and intentions. We see other people who act like we would and infer they have minds and intentions. We notice that effects on the brain have systematic effects on minds and intentions. We conclude that the brain is directly related to minds and intentions. But does using this principle deny, or merely complement the inference of superficial intentionality? Am I at all coherent?
*To steal your nice structuring of asides, is a human's attempt to create a theory of animal mind like an autist's attempt to create a traditional theory of mind? Even if one can recognize other minds on a higher order, there would still be something blocking an intuitive recognition.
While it is true that hyenas have more kills stolen from them by lions than vice versa, the reverse still happens. In addition, leopards and cheetahs do in fact suffer from competition with lions (called interspecific competition, since it's competition between different species), which is why, for example, leopards will often carry their prey up a tree to eat it, and, as mentioned earlier, cheetahs are very wary of lions because lions steal their kills. It seems that because hyenas live in packs, are strong enough to challenge lions through their numbers and the strength of each individual, overlap with lions in prey, and steal food from them, natural selection has eventually favoured the lions that are most aggressive toward hyenas. Most other examples of interspecific competition do not involve predators killing each other, because fighting is dangerous and extremely energy-consuming; most animal confrontations consist of bluffing and ritual displays meant to scare off the opponent before either side has to resort to actual fighting. However, the benefits to lions of fighting hyenas may outweigh the costs in this scenario.
I did that joke already. Nobody laughed at me either.
Seems the Internet is serious business.
Good game?