If mind-body dualism is false, then our mental processes are determined not by ourselves but by external physical forces alone (whether those forces form a deterministic chain of events or are aleatory).
It seems to me that the only way for us to make decisions and be responsible for their outcomes is if those decisions do not result from physical forces outside our control.
So, can free will exist without mind-body dualism (which most claim to be false)?
However, I've seen people define "self" in such a way as to render the point moot. If you assume the atoms that make up your body are "you" and their actions are what you 'choose' to do, the issue goes away and you have free will... of course, so does the computer I'm typing on...
Anyway, I do think that "Free Will" is just an illusion brought on by our false sense of introspection. We've essentially evolved to believe we have free will when the whole thing is really a sham.
So if "no free will" is the state of the art, how can morals exist?
There shouldn't be any difference between a deadly lightning bolt and a murderer.
I guess I shouldn't feel bad about the wrongs I did, nor should I feel good about the rights I did, since I don't even exist to begin with, lol?
italofoca, your post is really just an appeal to consequences. The convenience of free will in theories about justice does not have any bearing on its validity. But I will address your statements anyway.
Getting rid of the idea of free will only negates the theory of retribution, not rehabilitation.
There shouldn't be any difference between a deadly lightning bolt and a murderer.
If we can do something about either, we should. If we can save lives by locking the murderer up, or by finding a way to prevent deaths from lightning, these are actions that should be taken.
Do you think we should not bother to prevent lightning strikes because lightning doesn't have a soul? No, the question is a non sequitur. A thing's lack of free will does not prevent you from trying to rehabilitate it.
I guess I shouldn't feel bad about the wrongs I did, nor should I feel good about the rights I did, since I don't even exist to begin with, lol?
If you want to be a functioning member of society and your actions are preventing you from achieving that goal, you should feel bad. Actions can be evaluated in the context of goals.
However, if you DID have Free Will, how would you go about evaluating your actions in terms of "good" and "evil," anyway?
italofoca, your post is really just an appeal to consequences. The convenience of free will in theories about justice does not have any bearing on its validity. But I will address your statements anyway.
I guess it could 'sound' like that, but it was a genuine question >_< ..!
Getting rid of the idea of free will only negates the theory of retribution, not rehabilitation.
Wouldn't it also negate the existence of morals? The same way one can say "an earthquake is not a moral matter" (although it does harm), one could say "a murder is not a moral matter". Sentience (a requirement for morality) is an illusion without free will, because we can't make decisions outside what nature decides. We're just spectators without any say.
There's no point in calling certain "actions" wrong, since all actions taken by everyone are natural events and nothing more.
By not worrying about whose "fault" evil is. If something is doing something harmful, it should be stopped regardless of whether it's "really to blame" or not.
Well, morality is not only about what should be stopped. I guess any immoral action should be stopped, but not all actions that should be stopped can be considered immoral.
If we can do something about either, we should. If we can save lives by locking the murderer up, or by finding a way to prevent deaths from lightning, these are actions that should be taken.
Do you think we should not bother to prevent lightning strikes because lightning doesn't have a soul? No, the question is a non sequitur. A thing's lack of free will does not prevent you from trying to rehabilitate it.
The free will believer would consider it important to prevent lightning strikes because HE has a conscience and HE should avoid lightning (or prevent lightning from striking people he cares about).
To the free will believer, actions are taken based on his morality and the possibilities offered by the external environment, not the morality of the external environment.
The question is not whether we should prevent harm (because in both worlds we should), but why an individual should not harm others.
Since individuals don't decide anything, the harm that will be done will be done.
If you want to be a functioning member of society and your actions are preventing you from achieving that goal, you should feel bad. Actions can be evaluated in the context of goals.
I don't "want" anything, since I don't have will. Whether I reach "my goals" or not will be decided by nature alone, not by me. It is either pre-determined by a deterministic chain of events or will be determined by luck (or a mix of both).
If actions can be evaluated in the context of goals alone, then there's no reason why I shouldn't harm others while chasing my goals, or why I shouldn't adopt "harming others" as my goal.
Morals are not needed to justify why we should try to prevent harm, as you said. But they are needed for an individual to justify to himself why he shouldn't choose to cause harm. Without free will there's no reason why an individual shouldn't harm another, aside from self-preservation.
However, if you DID have Free Will, how would you go about evaluating your actions in terms of "good" and "evil," anyway?
Well, if I cause suffering to others or if I screw things up for myself (like getting a low grade or cheating on a test), I feel bad about it because I'm responsible for what happened. This feeling will certainly guide me not to make these mistakes again. I know the feeling of guilt is a biochemical reaction, but my mistakes aren't (under the free will hypothesis), so the "trigger" will happen.
There's no reason to ever feel guilty in a world without free will, since your mistakes are not your doing. I see no reason to regret a low grade, the same way I see no reason to regret someone else's low grade, since I'm just as responsible for my low grade as I am for anyone else's (which is, not responsible at all). Also, there's no reason to regret any gains/losses from those grades to my future, because the grade was already decided by nature (in deterministic or stochastic fashion).
Wouldn't it also negate the existence of morals? The same way one can say "an earthquake is not a moral matter" (although it does harm), one could say "a murder is not a moral matter". Sentience (a requirement for morality) is an illusion without free will, because we can't make decisions outside what nature decides. We're just spectators without any say.
There's no point in calling certain "actions" wrong, since all actions taken by everyone are natural events and nothing more.
It recontextualizes how we evaluate moral actions, but I would argue it does not get rid of them. There are still things we are morally right in doing and others we are not. We "ought" to do some things and "ought" not to do others. http://en.wikipedia.org/wiki/Is%E2%80%93ought_problem
Well, morality is not only about what should be stopped. I guess any immoral action should be stopped, but not all actions that should be stopped can be considered immoral.
I would disagree.
Assuming we are talking in the context of a moral goal like "humans dying should be prevented."
The free will believer would consider it important to prevent lightning strikes because HE has a conscience and HE should avoid lightning (or prevent lightning from striking people he cares about).
To the free will believer, actions are taken based on his morality and the possibilities offered by the external environment, not the morality of the external environment.
Why do you feel the need to blame the dead guy? Why blame anyone?
Just work on fixing the problem, not pointing fingers.
The question is not whether we should prevent harm (because in both worlds we should), but why an individual should not harm others.
Since individuals don't decide anything, the harm that will be done will be done.
Again, whether or not we "blame" people for their actions isn't the point. Some actions need to be prevented, and steps need to be taken to prevent them. "Blame" does not matter in that context.
I don't "want" anything, since I don't have will. Whether I reach "my goals" or not will be decided by nature alone, not by me. It is either pre-determined by a deterministic chain of events or will be determined by luck (or a mix of both).
If actions can be evaluated in the context of goals alone, then there's no reason why I shouldn't harm others while chasing my goals, or why I shouldn't adopt "harming others" as my goal.
You could just act like you have Free Will anyway. That's what I do.
Morals are not needed to justify why we should try to prevent harm, as you said. But they are needed for an individual to justify to himself why he shouldn't choose to cause harm. Without free will there's no reason why an individual shouldn't harm another, aside from self-preservation.
Now you sound like a Christian telling an atheist he can't be moral without God.
However, if you DID have Free Will, how would you go about evaluating your actions in terms of "good" and "evil," anyway?
Well, if I cause suffering to others or if I screw things up for myself (like getting a low grade or cheating on a test), I feel bad about it because I'm responsible for what happened. This feeling will certainly guide me not to make these mistakes again. I know the feeling of guilt is a biochemical reaction, but my mistakes aren't (under the free will hypothesis), so the "trigger" will happen.
There's no reason to ever feel guilty in a world without free will, since your mistakes are not your doing. I see no reason to regret a low grade, the same way I see no reason to regret someone else's low grade, since I'm just as responsible for my low grade as I am for anyone else's (which is, not responsible at all). Also, there's no reason to regret any gains/losses from those grades to my future, because the grade was already decided by nature (in deterministic or stochastic fashion).
You either have Free Will right now or you do not.
Why should suddenly realizing you don't have it change anything?
Italofoca is correct in identifying the problem of blame as a crucial consideration in discussions of free will. If you have free will, then you receive blame for committing wrong acts and credit for committing right ones. If we can't properly assign blame to anyone, then, for example, a punitive prison system (such as that of the United States, but not, say, that of Norway) is itself a moral crime... but a moral crime committed by whom? If nobody can be assigned blame, nobody is responsible for the status of such a prison system, and nobody can be blamed for failing to fix it.
All of ethics is essentially thrown up in the air, and while it's not impossible that it'll find another way to stand up (it was never really on all that solid ground anyway!), the project of creating a solid, consistent ethics that most people would accede to becomes more difficult. Ethics aren't the only possible grounds for action, but that doesn't make them unimportant.
All that said, I disagree with the premise:
If mind-body dualism is false, then our mental processes are determined not by ourselves but by external physical forces alone (whether those forces form a deterministic chain of events or are aleatory).
This is the conventional chain of reasoning - and I think it's wrong. To see why, you have to define the term 'ourselves'. If the mind/body dualism is false, then the only thing that can be meant by that term is the physical stuff making up our bodies, and the mental 'software' - the complex, intra-communicative, self-aware system - which is our mind.
The consequences of that reasoning aren't well appreciated, even among professional philosophers. When people talk of hard determinism, they seem to have in mind a universe in which 'your' decisions are overruled by physical law, but that's nonsensical. To be fair, I'm pretty sure nearly everyone would recognize how nonsensical that is, if it's pointed out to them, but it generally isn't and they generally continue to act as though that's meant. It bears repeating, because it's crucially important: at no point does physical law step in and overrule 'you' or force you to change your mind.
In the hard determinist world (or the quantum determinist world, but that's more controversial), you always make exactly the decisions that you wish to make, which are often different than the decisions someone else would make if put in your position. It's true on that view that if the world were physically a bit different, you would make different decisions... but 'you' being physical matter and the complex system built of physical matter, if the world were physically a bit different, you would be different. For unimportant decisions which you make on a lark, the universe (and you) would only need to be a bit different. For critically important decisions that cut to the core of your beliefs, the universe (and you) would need to be very different for you to decide differently.
I'm not sure people think too much about this, but we don't want our decisions to be uncaused. We want them to be caused by us. Uncaused decision-making is as bad as fully determined decision-making on the worst hard determinist/incompatibilist view - from our perspective the decisions might as well then be random. We want our decision-making to be caused - by the right causes (this, incidentally, is why quantum indeterminacy is no help at all for resolving questions of free will, except for 'solutions' that involve pseudo-mystical hand-waving along the lines that quantum indeterminacy ends up being determined by 'your' dualist soul).
Well, if hard determinism is true, our decisions are caused, and they're caused by us. We wouldn't decide otherwise unless we were different. Why doesn't that count as the right sort of cause?
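The compatibilist point above can be sketched as a toy program (purely illustrative, and all of the names here are made up for this post): a deterministic "agent" whose decision is a pure function of its own internal state. Nothing external ever steps in and overrules it, and a different agent decides differently.

```python
# Hedged sketch of the compatibilist idea: decisions fully determined,
# but determined *by the agent's own values*, not by an outside force.
from dataclasses import dataclass


@dataclass(frozen=True)
class Agent:
    values: tuple  # the agent's preferences, ordered strongest first

    def decide(self, options):
        # Pick the first option that matches the agent's strongest value.
        # Fully deterministic: same agent + same options -> same decision.
        for value in self.values:
            for option in options:
                if value in option:
                    return option
        return options[0]


honest = Agent(values=("truth", "comfort"))
coward = Agent(values=("comfort", "truth"))
options = ("tell the truth", "stay in comfort")

# Same agent, same situation -> always the same choice.
assert honest.decide(options) == honest.decide(options)

# A different agent (i.e. "if *you* were different") chooses differently.
assert honest.decide(options) == "tell the truth"
assert coward.decide(options) == "stay in comfort"
```

The point of the sketch: for `honest` to decide otherwise, `honest` itself would have to be different, which mirrors the claim that in a deterministic world the decision is still caused by "you".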
Italofoca is correct in identifying the problem of blame as a crucial consideration in discussions of free will. If you have free will, then you receive blame for committing wrong acts and credit for committing right ones. If we can't properly assign blame to anyone, then, for example, a punitive prison system (such as that of the United States, but not, say, that of Norway) is itself a moral crime... but a moral crime committed by whom? If nobody can be assigned blame, nobody is responsible for the status of such a prison system, and nobody can be blamed for failing to fix it.
Only if it exists for retribution, not for rehabilitation.
When my car stops working I don't try to punish it or get angry at it, I just try to get it working again. I need to fix it, not blame it. A prison should be focusing on "fixing" criminals, not "punishing" them.
Anyway,
I hope we can all agree rehabilitation is more useful than retribution with or without free will.
If mind-body dualism is false, then our mental processes are determined not by ourselves but by external physical forces alone (whether those forces form a deterministic chain of events or are aleatory).
A physicalist might simply say that this is an improper assertion of ontological priority. It is your self that is determined by your mental processes. And without that backwards ontology, the rest does not follow.
Italofoca is correct in identifying the problem of blame as a crucial consideration in discussions of free will. If you have free will, then you receive blame for committing wrong acts and credit for committing right ones. If we can't properly assign blame to anyone, then, for example, a punitive prison system (such as that of the United States, but not, say, that of Norway) is itself a moral crime... but a moral crime committed by whom? If nobody can be assigned blame, nobody is responsible for the status of such a prison system, and nobody can be blamed for failing to fix it.
Only if it exists for retribution, not for rehabilitation.
When my car stops working I don't try to punish it or get angry at it, I just try to get it working again. I need to fix it, not blame it. A prison should be focusing on "fixing" criminals, not "punishing" them.
Anyway,
I hope we can all agree rehabilitation is more useful than retribution with or without free will.
That's what I said. The United States prison system DOES operate on that principle, and we assume it so powerfully that the Norwegian system - with its 21 year maximum sentence even for the most heinous of crimes, provided the criminal is judged to be rehabilitated at the end of the sentence - is met with incredulity when described to Americans.
If we're unable to assign blame, the US prison system is an incredible moral crime, but by whom and against whom?
If we're unable to assign blame, the US prison system is an incredible moral crime, but by whom and against whom?
By the US government and against mankind (or--at the very least--the criminals).
However, look at the statement. It assigns the blame to "the US government" as if "the US government" were an entity unto itself with its own free will.
Would it not be better to just knock it off, instead of trying to hold "the US government" accountable?
Why does it necessarily matter who is to blame? Just stop it.
If we're unable to assign blame, the US prison system is an incredible moral crime, but by whom and against whom?
By the US government and against mankind (or--at the very least--the criminals).
However, look at the statement. It assigns the blame to "the US government" as if "the US government" were an entity unto itself with its own free will.
Would it not be better to just knock it off, instead of trying to hold "the US government" accountable?
Why does it necessarily matter who is to blame? Just stop it.
But if we can't assign blame to people because of the lack of free will, then there's no individual in the US government who can be held responsible for failing to stop it. We can't just translate back to 'respectable' individual-centric language and claim that resolves the problem; it doesn't. If we can't assign blame to individuals, there's nobody who has a moral obligation to fix the US prison system, because there are no moral obligations. Every person who might have the power to stop it can simply say, "The deterministic universe caused me not to stop it."
Saying that someone should knock it off when nobody can be assigned blame or held responsible is borderline nonsensical - you might as well say that a sinkhole ought not open under someone's house or a volcano ought not explode near a city. "Just stop it" isn't an argument.
One thing to be very clear on: we're down the rabbit hole of 'blame' right now. If it seems absurd, it sort of is. It has actually been wielded as a reductio ad absurdum to 'disprove' hard determinism - though I think most people at this point realize that it does nothing of the kind and that such an argument is wishful thinking at best, abject self-delusion at worst. It's not clear either that there is no free will (I've already stated a compatibilist position - a position that says that even if the universe is deterministic (or quantum indeterministic), we can still have free will) or that a lack of free will would mean we could not assign blame; there are proposed solutions in both cases.
But if we can't assign blame to people because of the lack of free will, then there's no individual in the US government who can be held responsible for failing to stop it.
And I'm saying instead of trying to assign blame, you should just try and fix the problem.
Now, finding "the source" of the problem would be a good way to go about trying to fix it. But, again, you don't need to assign "blame" in doing that. You can find something responsible for the problem without blaming it.
We can't just translate back to 'respectable' individual-centric language and claim that resolves the problem; it doesn't. If we can't assign blame to individuals, there's nobody who has a moral obligation to fix the US prison system, because there are no moral obligations. Every person who might have the power to stop it can simply say, "The deterministic universe caused me not to stop it."
Saying that someone should knock it off when nobody can be assigned blame or held responsible is borderline nonsensical - you might as well say that a sinkhole ought not open under someone's house or a volcano ought not explode near a city. "Just stop it" isn't an argument.
I find this line of reasoning borderline nonsense.
If my computer breaks down, does my inability to morally blame this malfunctioning piece of equipment somehow prevent me from finding the part responsible and replacing it? If I find the part responsible, can I not fix the problem because the part doesn't have free will? Of course not; don't be silly.
Also, I find your implication that acts of nature are unstoppable defeatist.
One thing to be very clear on: we're down the rabbit hole of 'blame' right now. If it seems absurd, it sort of is. It has actually been wielded as a reductio ad absurdum to 'disprove' hard determinism - though I think most people at this point realize that it does nothing of the kind and that such an argument is wishful thinking at best, abject self-delusion at worst. It's not clear either that there is no free will (I've already stated a compatibilist position - a position that says that even if the universe is deterministic (or quantum indeterministic), we can still have free will) or that a lack of free will would mean we could not assign blame; there are proposed solutions in both cases.
We can still find "sources" of the problems, even without blame. My car mechanic does it all the time.
I think you're missing the point entirely.
Your car does not choose whether it will break. You, Taylor, can do things to prevent your car from breaking, or can fix it when it breaks. No one is denying that anything that causes harm should be stopped. The morality lies in stopping harm, not in the harm itself; we all agree.
Still, how do you convince someone not to do harm? How do you convince yourself not to cause harm? What makes it so that harm should be avoided? How do you decide whether someone's action must be prevented or not (sometimes causing harm is within someone's rights and needs)?
Morality is the answer to all those questions. God is not needed to prevent people from hurting each other. Morality is.
Without free will, a person who is in a position to do harm in order to reach their goals without suffering any consequences has zero reason not to proceed, since the person is not responsible for the consequences of their own acts.
Your car does not choose whether it will break. You, Taylor, can do things to prevent your car from breaking, or can fix it when it breaks. No one is denying that anything that causes harm should be stopped. The morality lies in stopping harm, not in the harm itself; we all agree.
And how does my ability to prevent things change with or without free will?
Still, how do you convince someone not to do harm? How do you convince yourself not to cause harm?
How does my ability to change others get affected if it's revealed none of us has free will? Would that suddenly stop me from being able to change other things around me?
I was unaware that objects without free will were unable to affect other objects without free will.
Morality is the answer to all those questions. God is not needed to prevent people from hurting each other. Morality is.
And I am saying Free Will and morality don't necessarily have anything to do with each other. No more than God and morality do.
If you define morality so that it is completely contingent on "choice," then--yes--you need free will to have morality.
Just like if you define morality such that good can only come from God, then you need God to have morality.
However, you don't need to define it like that at all.
Without free will, a person who is in a position to do harm in order to reach their goals without suffering any consequences has zero reason not to proceed, since the person is not responsible for the consequences of their own acts.
Without GOD, a person who is in a position to do harm in order to reach their goals without suffering any consequences has zero reason not to proceed, since the person is not responsible for the consequences of their own acts.
Only if you're intent on defining morality that way.
The principle is identical. The choice is simply determined by the components of the car, their interactions, and the car's environment (so even more interactions). The choice is obviously fully dependent on the sum of all of the above, but it's still a choice, because some of the above factors can be manipulated.
A human being "breaks down" for similar reasons: his mind is in the right state and the circumstances are right. Neither is set in stone.
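The car analogy above can be put as a tiny toy function (purely illustrative; the inputs `worn_belt` and `freezing` are invented for this post): the breakdown is fully determined by parts plus environment, yet the factors can still be manipulated.

```python
# Hedged sketch: a "breakdown" as a deterministic function of the car's
# components and its environment. Nothing here "chooses", but the inputs
# are not set in stone -- change them and the outcome changes.
def breaks_down(worn_belt: bool, freezing: bool) -> bool:
    # Deterministic: the output depends only on the inputs.
    return worn_belt or freezing


# Same inputs always give the same result...
assert breaks_down(worn_belt=True, freezing=False) is True

# ...but replacing the belt (manipulating a factor) changes the outcome.
assert breaks_down(worn_belt=False, freezing=False) is False
```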
So if the principle is identical, a person who did harm did not choose to do it (the same as with the car).
There isn't any moral issue. A killer's murder and a lightning strike carry the same amount of moral infraction (zero).
Humans have certain desires. You can convince someone by appealing to those. Or by appealing to yours and forcing the other to take them into account.
This won't work if the choice involving causing harm has more appeal, considering the consequences for yourself or the feelings of the third party. Without free will, people are not responsible for their choices.
Just imagine a human being offered a choice: where do the options he considers come from? Based on which criteria does he decide? How is all of that not pre-determined?
Options are always pre-determined, no matter in which light you see this problem.
The criteria, in a free-will world, wouldn't be pre-determined. They would be arbitrarily determined by the person, influenced by the info he holds.
Define this thing "free will" that you think we do not have in a non-dualistic universe, and why you think it is noteworthy that we don't have it.
Okay.
i. Free Will means decisions are caused in a non-deterministic and non-aleatory chain of events. They are arbitrary and we are the arbiters.
ii. Every physical happening is caused in a deterministic chain of events. The exceptions are physical happenings that occur in stochastic fashion (the quantum stuff).
iii. In the absence of a body-mind duality, our decision-making process is a physical happening. If our decision-making is a physical happening, then it is either deterministic or stochastic. Thus decisions are not arbitrary, and free will doesn't exist.
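The argument (i)-(iii) can be checked as a toy propositional exercise (this says nothing about whether the premises are true of the world; the labels `physical`, `det`, `stoch`, `arbitrary` are my own encoding of the premises above):

```python
# Hedged sketch: exhaustively check that premises (i)-(iii), encoded as
# propositional constraints, jointly entail "decisions are not arbitrary".
from itertools import product


def premises_entail_conclusion():
    # Enumerate every truth assignment to the four propositions.
    for physical, det, stoch, arbitrary in product([False, True], repeat=4):
        p_ii = (not physical) or det or stoch           # (ii) physical -> det or stoch
        p_iii = physical                                # (iii) no dualism: the decision is physical
        p_i = (not arbitrary) or (not det and not stoch)  # (i) arbitrary -> neither det nor stoch
        if p_i and p_ii and p_iii and arbitrary:
            # A model satisfies all premises but violates the conclusion.
            return False
    return True


assert premises_entail_conclusion()  # the conclusion does follow from (i)-(iii)
```

So the argument is formally valid; the debate in this thread is over whether premise (i)'s definition of free will is the right one, and whether (iii) really follows from rejecting dualism.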
i. Free Will means decisions are caused in a non-deterministic and non-aleatory chain of events. They are arbitrary and we are the arbiters.
ii. Every physical happening is caused in a deterministic chain of events. The exceptions are physical happenings that occur in stochastic fashion (the quantum stuff).
iii. In the absence of a body-mind duality, our decision-making process is a physical happening. If our decision-making is a physical happening, then it is either deterministic or stochastic. Thus decisions are not arbitrary, and free will doesn't exist.
And what moral repercussions would this realization cause? And, if we don't have Free Will, how does that change our moral actions?
But if we can't assign blame to people because of the lack of free will, then there's no individual in the US government who can be held responsible for failing to stop it.
And I'm saying instead of trying to assign blame, you should just try and fix the problem.
Now, finding "the source" of the problem would be a good way to go about trying to fix it. But, again, you don't need to assign "blame" in doing that. You can find something responsible for the problem without blaming it.
I don't think you've fully appreciated the problem of blame. That might be my fault; I am trying my darnedest to communicate why it's an issue but I seem to be coming up short.
Sticking with the US prison system as an example. I will take it for granted that US sentencing is in fact based on a punishment principle first, a rehabilitation principle as a distant second. I'm open to evidence that this is not so, but taking it for granted for the sake of the example...
Assume we can't assign blame. That is, nobody can be morally responsible for anything they do, because hard determinism is true and because you can't be morally responsible for an action that you physically could not have prevented (set aside the bit of incoherence to this argument that both crashing00 and I have noted, and the question of definition that B_S raised - this is the incompatibilist hard determinist position, it's just that none of us accept it philosophically).
So, what we have is a prison system which is punishing individuals for crimes which we have already held that they can't be morally responsible for, since they couldn't have physically prevented them.
You're suggesting that this system is broken, so we should just fix it. By what standard do you say that it's broken?
If nobody can ever be morally responsible for any action, good or bad, then in what sense are they moral agents? Is an unjustifiable prison system really 'broken' if it's not being imposed on moral agents?
By what standard do you say 'we should fix it'? More to the point, what do you mean by 'we' and what do you mean by 'should'? There are no moral agents who could be responsible for fixing it, so there are no agents who 'should' do anything!
This is the essential motivation for philosophers to fear the problem of blame - taken to an extreme, it completely eliminates all moral agents, all moral decision-making, all values and all action.
I've already said that I think the problem is easily resolved, of course. I'm not trying to convince you that it's irresolvable. I'm trying to convince you that resolving it matters - that it can't just be brushed aside as unimportant.
We can't just translate back to 'respectable' individual-centric language and claim that resolves the problem; it doesn't. If we can't assign blame to individuals, there's nobody who has a moral obligation to fix the US prison system, because there are no moral obligations. Every person who might have the power to stop it can simply say, "The deterministic universe caused me not to stop it."
Saying that someone should knock it off when nobody can be assigned blame or held responsible is borderline nonsensical - you might as well say that a sinkhole ought not open under someone's house or a volcano ought not explode near a city. "Just stop it" isn't an argument.
I find this line of reasoning borderline nonsense.
If my computer breaks down, does my inability to morally blame this malfunctioning piece of equipment somehow prevent me from finding the part responsible and replacing it? If I find the part responsible, can I not fix the problem because the part doesn't have free will? Of course not, don't be silly.
Also, I find your implication that acts of nature are unstoppable defeatist.
It's supposed to be defeatist. It's the worst nightmare of generations of ethical philosophers, after all.
Incidentally, this use of 'responsible' is very different from previous uses (where it was used as 'morally responsible'). You're using it here as 'the cause of'. Nobody is claiming that hard determinism means people can't cause the death of other people; some are claiming that hard determinism means that people can't be morally responsible for the death of other people - that the fact that they physically caused the death of another person doesn't mean that they bear any moral burden. It's best to be more careful with terms in a discussion of this kind, since equivocation on terms brings a serious risk of misleading readers.
More to your point:
If your computer breaks down, but there are no moral agents around to judge that it 'should' operate in a particular way, in what sense has it broken down? It's just a physical object. It's not 'wrong' or 'right', it just is.
When there's a being capable of assigning value and defining right and wrong around, that being can define it as broken and decide it has to be fixed, but if there are no moral beings around...
This is the state of the US prison system: It's punishing people who aren't morally guilty, but nobody's around to judge that it's broken. You and I can't decide that it's broken; we're just taking our predetermined actions because of our physical history and physical law. It'll just keep on doing what it's doing.
Define this thing "free will" that you think we do not have in a non-dualistic universe, and why you think it is noteworthy that we don't have it.
Okay.
i. Free Will means decisions are caused in a non-deterministic and non-aleatory chain of events. They are arbitrary and we are the arbiters.
ii. Every physical happening is caused in a deterministic chain of events. The exceptions are physical happenings that occur in stochastic fashion (the quantum stuff).
iii. In the absence of a body-mind duality, our decision-making process is a physical happening. If our decision making is a physical happening, then it is either deterministic or stochastic. Thus it is not arbitrary and free will doesn't exist.
You could define free will as non-deterministic and non-aleatory, but I don't think you have to. If you define it this way, well, we don't have free will. That's pretty simple. However, I think we can get every important consequence of free will even in a deterministic system - and if we can meet every important consequence of free will except the definitional 'non-deterministic' and 'non-aleatory' bits, well, I don't see that they're crucial to the definition, but if they are, then free will is unimportant and we should be talking about Free Will-A - the version that gives us everything we want from free will, but where we're in a deterministic (or quantum indeterministic) universe. Free Will becomes completely inconsequential.
Faced with something like that, well, throw out those aspects of the definition. They are not a crucial razor for finding important differences.
And, what moral repercussions would this realization cause? And, if we don't have it, how does that change our moral actions?
Why does it matter if we have free will or not?
If free will doesn't exist, the distinction between a sentient and a non-sentient being is false. Everything, a rock, a plant and us, holds the same amount of power to change the course of our 'actions' (no power at all).
Morality degenerates from an objective set of principles all sentient beings ought to respect into something that doesn't exist in our reality.
In economic crime theory (Becker's model) we assume that criminals make a perfectly rational evaluation between committing a crime for a certain reward and not taking on the crime's risks and costs (the same way an investor decides his portfolio, managing risks based on his preferences).
In empirical verification, people find that there's some 'gap'. Not everyone in a favorable position to commit a crime does it (unlike an investor, who invests whenever he finds a favorable position for profit). Some researchers call this 'gap' 'the moral cost', which is an arbitrary preference people have not to break rules, for its own sake.
Without morals, the moral cost is actually misinformation. As soon as someone truly believes in the non-existence of morality, the moral cost no longer exists for him, and he will commit a crime whenever the opportunity appears (where 'opportunity' includes his evaluation of the gains/risks of committing the crime).
Ex: someone finds the opportunity to steal a considerable amount of money from an unknown person, and he knows that no one will ever find out. Not everyone would do it, because they have an arbitrary preference to 'not steal'. If they realize they are not the cause of their own actions (exogenous physical happenings are), there's no reason to keep this preference.
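The decision rule in Becker's model can be sketched in code. This is a minimal illustration, not Becker's actual formulation: the function name, the specific numbers, and the way the 'moral cost' enters as a flat extra cost are all illustrative assumptions.

```python
# Illustrative sketch of a Becker-style rational-crime decision rule.
# All numbers and the 'moral_cost' term are assumptions for the example.

def commits_crime(gain, p_caught, penalty, moral_cost):
    """Return True if the expected benefit of the crime exceeds its expected cost.

    gain:       payoff if the crime succeeds
    p_caught:   probability of being caught (0..1)
    penalty:    cost imposed if caught
    moral_cost: the empirical 'gap' term - an intrinsic preference not to break rules
    """
    expected_benefit = (1 - p_caught) * gain
    expected_cost = p_caught * penalty + moral_cost
    return expected_benefit > expected_cost

# With a positive moral cost, the risk-free opportunity is refused...
print(commits_crime(gain=1000, p_caught=0.0, penalty=5000, moral_cost=2000))  # False
# ...but if the moral cost erodes to zero, the same opportunity is taken.
print(commits_crime(gain=1000, p_caught=0.0, penalty=5000, moral_cost=0))     # True
```

The point of the sketch is the thread's argument in miniature: with `p_caught = 0`, only the `moral_cost` term stands between opportunity and action, so dropping it flips the decision.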
It's really a matter of brain science. Can we cause energy to behave in a truly random way that matter cannot? Can the brain do this? I can't make a call on that. I haven't studied the brain, really.
If free will doesn't exist, the distinction between a sentient and a non-sentient being is false. Everything, a rock, a plant and us, holds the same amount of power to change the course of our 'actions' (no power at all).
Morality degenerates from an objective set of principles all sentient beings ought to respect into something that doesn't exist in our reality.
WHOA WHOA WHOA..... slow down.
You're going to have to justify the jump from "no free will" to "no morals."
You can't just say "Sentience is an illusion, therefore no morals." Why is one dependent on the other?
Additionally, you've been implying all along that if it were proven that no one had free will, we would just go around raping and pillaging all the time, or something. Is that really the case? Is the only thing stopping you from murdering people your belief in "Free Will?"
It's really a matter of brain science. Can we cause energy to behave in a truly random way that matter cannot? Can the brain do this? I can't make a call on that. I haven't studied the brain, really.
If free will doesn't exist, the distinction between a sentient and a non-sentient being is false. Everything, a rock, a plant and us, holds the same amount of power to change the course of our 'actions' (no power at all).
Morality degenerates from an objective set of principles all sentient beings ought to respect into something that doesn't exist in our reality....
WHOA WHOA WHOA..... slow down.
You're going to have to justify the jump from "no free will" to "no morals."
You can't just say "Sentience is an illusion, therefore no morals." Why is one dependent on the other?
Without free will, our actions cannot be moral or immoral; they are all amoral, like the actions of non-sentient beings. For an action to be moral or immoral, the person has to deliberately choose it, and without free will people don't choose the actions they take.
A world with only robots and natural phenomena wouldn't have morals, since every action is pre-determined. Robots could still be programmed to avoid danger and avoid harm, but none of these actions would be moral, since the robots don't choose what they do. If our world is equally pre-determined, then morals don't exist here either.
Additionally, you've been implying all along that if it were proven that no one had free will, we would just go around raping and pillaging all the time, or something. Is that really the case? Is the only thing stopping you from murdering people your belief in "Free Will?"
People wouldn't rape because rape doesn't bring satisfaction to most of us, thankfully. Among other reasons, of course.
And there are plenty of things "stopping" me from murdering another besides morals: i. I have no reason to kill anyone, ii. I also have no means to kill anyone, iii. killing people involves the risk of being caught and sent to jail.
But if I didn't believe in free will, and had a reason, the means, and certainty I would avoid the consequences, I see no reason why I shouldn't, since I'm not responsible for my actions.
More often than not, the circumstances reward you for doing something 'immoral', such as cheating and lying. If one realizes that all possible courses of action are actually amoral natural events, then the satisfaction of taking a moral action is lost.
This actually explains a lot. People lie, cheat and do wrong things all the freaking time. If I have no moral high ground from which to convince them they shouldn't (hey, I don't ask the lightning bolt not to fall!), I guess there's nothing to be done.
The problem with arguing about free will is that you can't argue successfully unless both sides believe in an objectively neutral universe. God help you if an individual involved can't distinguish between subjective and objective.
From the perspective of human society, the concept of free will is irrelevant. Morality is a concept that is ingrained in us and affects our actions like any other causal factor. But just because I believe something is right or wrong doesn't make it so; after all, one person can view an action as right while another views that same action as wrong.
To imply that a lack of free will means people are no longer responsible for their actions is woefully simplistic. The entire concept of moral responsibility is a subjective human concept, and is not applicable to an objective universe.
Ultimately, that means in an objectively neutral universe, a lack of free will still means people are 'responsible' for their actions, they're just responsible in the context of the human society they live in.
Without free will, our actions cannot be moral or immoral; they are all amoral, like the actions of non-sentient beings. For an action to be moral or immoral, the person has to deliberately choose it, and without free will people don't choose the actions they take.
Why?
Continually stating the same thing without justification does not make it so. WHY can morality only exist with choice?
But if I didn't believe in free will, and had a reason, the means, and certainty I would avoid the consequences, I see no reason why I shouldn't, since I'm not responsible for my actions.
Ummm.... well, it might be better for all parties involved if you just keep believing in Free Will then. Maybe you should wait to argue this.
It's about the mind-body dualism.
If mind-body dualism is false, then our mental processes would be determined not by ourselves, but by foreign physical forces alone (be those forces a deterministic chain of events or aleatory ones).
It seems to me that the only way for us to make decisions and be responsible for the outcomes is if those decisions do not result from physical forces outside of our control.
So, can free will exist without mind-body dualism (which most claim to be false)?
BGU Control
R Aggro
Standard - For Fun
BG Auras
I started a thread about it a while back:
http://forums.mtgsalvation.com/showthread.php?t=468829
However, I've seen people define "self" in such a way as to render the point moot. If you assume the atoms that make up your body are "you" and their actions are what you 'choose' to do, the issue goes away and you have free will.... of course, so does the computer I'm typing on....
Anyway, I do think that "Free Will" is just an illusion brought on by our false sense of introspection. We've essentially evolved to believe we have free will when the whole thing is really a sham.
So if 'no free will' is the state of the art, how can morals exist?
There shouldn't be any difference between a killing lightning bolt and a murderer.
I guess I shouldn't feel bad about the wrongs I did, nor should I feel good about the rights I did, since I don't even exist to begin with, lol?
Getting rid of the idea of free will only negates the theory of retribution, not rehabilitation.
By not worrying about whose "fault" evil is. If something's doing something harmful, it should be stopped, regardless of whether it's "really to blame" or not.
If we can do something about either, we should. If we can save lives by locking the murderer up, or by finding a way to prevent deaths from lightning, these are actions that should be taken.
Do you think we should not bother to prevent lightning strikes because lightning doesn't have a soul? No, the question is a non sequitur. A thing's lack of free will does not prevent you from trying to rehabilitate it.
If you want to be a functioning member of society and your actions are preventing you from achieving that goal, you should feel bad. Actions can be evaluated in the context of goals.
However, if you DID have Free Will, how would you go about evaluating your actions in terms of "good" and "evil," anyway?
I guess it could 'sound' like that, but it was a genuine question >_< ..!
Wouldn't it also negate the existence of morals? The same way one can say "an earthquake is not moral" (although it does harm), one could say "a murder is not moral". Sentience (a requirement for morality) is an illusion without free will, because we can't make decisions outside what nature decides. We're just spectators without any say.
There's no point in calling certain "actions" wrong since all actions taken by everyone are natural events and nothing more.
Well, morality is not only about what should be stopped. I guess any immoral action should be stopped, but not all actions that should be stopped can be considered immoral.
The free will believer would consider it important to prevent lightning because HE has a conscience and HE should avoid lightning (or prevent lightning from falling on people he cares about).
To the free will believer, actions are taken based on his morality and the possibilities offered by the external environment, not the morality of the external environment.
The matter is not whether we should prevent harm (because in both worlds we should), but why an individual should not harm others.
Since individuals don't decide anything, the harm that will be done will be done.
I don't "want" anything, since I don't have will. Whether I reach "my goals" or not will be decided by nature alone, not by me. It is either pre-determined by a deterministic chain of events or will be determined by luck (or a mix of both).
If actions can be evaluated in the context of goals alone, then there's no reason why I shouldn't harm others while chasing my goals, or why I shouldn't adopt "harming others" as my goal.
Morals are not needed to justify why we should try to prevent harm, as you said. But they are needed for an individual to justify to himself why he shouldn't choose to cause harm. Without free will there's no reason why an individual shouldn't harm another, aside from self-preservation.
Well, if I cause suffering to others or if I screw up myself (like getting a low grade or cheating on a test), I feel bad about it because I'm responsible for what happened. This feeling will certainly guide me not to make these mistakes again. I know the feeling of guilt is a biochemical reaction, but my mistakes aren't (under the free will hypothesis), so the "trigger" will happen.
There's no reason to ever feel guilty in a world without free will, since your mistakes are not your doing. I see no reason to regret a low grade, the same way I see no reason to regret someone else's low grade, since I'm just as responsible for my low grade as I am for anyone else's (which is, not responsible at all). Also, there's no reason to regret any gains/losses from that grade in my future, because the grade was already decided by nature (in deterministic or stochastic fashion).
http://en.wikipedia.org/wiki/Is%E2%80%93ought_problem
I would disagree.
Assuming we are talking in the context of a moral goal like "humans dying should be prevented."
Why do you feel the need to blame the dead guy? Why blame anyone?
Just work on fixing the problem, not pointing fingers.
Again, whether or not we "blame" people for their actions isn't the point. Some actions need to be prevented, and steps need to be taken to prevent them. "Blame" does not matter in that context.
You could just act like you have Free Will anyway. That's what I do.
Now you sound like a Christian telling an atheist he can't be moral without God.
You either have Free Will right now or you do not.
Why should suddenly realizing you don't have it change anything?
All of ethics is essentially thrown up in the air, and while it's not impossible that it'll find another way to stand up (it was never really on all that solid ground anyway!), the project of creating a solid, consistent ethics that most people would accede to becomes more difficult. Ethics aren't the only possible grounds for action, but that doesn't make them unimportant.
All that said, I disagree with the premise:
This is the conventional chain of reasoning - and I think it's wrong. To see why, you have to define the term 'ourselves'. If the mind/body dualism is false, then the only thing that can be meant by that term is the physical stuff making up our bodies, and the mental 'software' - the complex, intra-communicative, self-aware system - which is our mind.
The consequences of that reasoning aren't well appreciated, even among professional philosophers. When people talk of hard determinism, they seem to have in mind a universe in which 'your' decisions are overruled by physical law, but that's nonsensical. To be fair, I'm pretty sure nearly everyone would recognize how nonsensical that is, if it's pointed out to them, but it generally isn't and they generally continue to act as though that's meant. It bears repeating, because it's crucially important: at no point does physical law step in and overrule 'you' or force you to change your mind.
In the hard determinist world (or the quantum determinist world, but that's more controversial), you always make exactly the decisions that you wish to make, which are often different than the decisions someone else would make if put in your position. It's true on that view that if the world were physically a bit different, you would make different decisions... but 'you' being physical matter and the complex system built of physical matter, if the world were physically a bit different, you would be different. For unimportant decisions which you make on a lark, the universe (and you) would only need to be a bit different. For critically important decisions that cut to the core of your beliefs, the universe (and you) would need to be very different for you to decide differently.
I'm not sure people think too much about this, but we don't want our decisions to be uncaused. We want them to be caused by us. Uncaused decision-making is as bad as fully determined decision-making on the worst hard determinist/incompatibilist view - from our perspective the decisions might as well then be random. We want our decision-making to be caused - by the right causes (this, incidentally, is why quantum indeterminacy is no help at all for resolving questions of free will, except for 'solutions' that involve pseudo-mystical hand-waving along the lines that quantum indeterminacy ends up being determined by 'your' dualist soul).
Well, if hard determinism is true, our decisions are caused, and they're caused by us. We wouldn't decide otherwise unless we were different. Why doesn't that count as the right sort of cause?
Only if it exists for retribution, not for rehabilitation.
When my car stops working I don't try to punish it or get angry at it, I just try to get it working again. I need to fix it, not blame it. A prison should be focusing on "fixing" criminals, not "punishing" them.
Anyway,
I hope we can all agree rehabilitation is more useful than retribution with or without free will.
A physicalist might simply say that this is an improper assertion of ontological priority. It is your self that is determined by your mental processes. And without that backwards ontology, the rest does not follow.
That's what I said. The United States prison system DOES operate on that principle, and we assume it so powerfully that the Norwegian system - with its 21 year maximum sentence even for the most heinous of crimes, provided the criminal is judged to be rehabilitated at the end of the sentence - is met with incredulity when described to Americans.
If we're unable to assign blame, the US prison system is an incredible moral crime, but by who and against whom?
By the US government and against mankind (or--at the very least--the criminals).
However, look at the statement. It gives the blame to "the US government" as if "the US government" were an entity unto itself with its own free will.
Would it not be better to just knock it off, instead of trying to hold "the US government" accountable?
Why does it necessarily matter who is to blame? Just stop it.
But if we can't assign blame to people because of the lack of free will, then there's no individual in the US government who can be held responsible for failing to stop it. We can't just translate back to 'respectable' individual-centric language and claim that resolves the problem; it doesn't. If we can't assign blame to individuals, there's nobody who has a moral obligation to fix the US prison system, because there are no moral obligations. Every person who might have the power to stop it can simply say, "The deterministic universe caused me not to stop it."
Saying that someone should knock it off when nobody can be assigned blame or held responsible is borderline nonsensical - you might as well say that a sinkhole ought not open under someone's house or a volcano ought not explode near a city. "Just stop it" isn't an argument.
One thing to be very clear on: we're down the rabbit hole of 'blame' right now. If it seems absurd, it sort of is. It has actually been wielded as a reductio ad absurdum to 'disprove' hard determinism - though I think most people at this point realize that it does nothing of the kind and that such an argument is wishful thinking at best, abject self-delusion at worst. It's not clear either that there is no free will (I've already stated a compatibilist position - a position that says that even if the universe is deterministic (or quantum indeterministic), we can still have free will) or that a lack of free will would mean we could not assign blame; there are proposed solutions in both cases.
Now, finding "the source" of the problem would be a good way to go about trying to fix it. But, again, you don't need to assign "blame" in doing that. You can find something responsible for the problem without blaming it.
I find this line of reasoning borderline nonsense.
If my computer breaks down, does my inability to morally blame this malfunctioning piece of equipment somehow prevent me from finding the part responsible and replacing it? If I find the part responsible, can I not fix the problem because the part doesn't have free will? Of course not, don't be silly.
Also, I find your implication that acts of nature are unstoppable defeatist.
We can still find "sources" of the problems, even without blame. My car mechanic does it all the time.
I think you're missing the point entirely.
Your car does not choose whether it will break or not. You, Taylor, can do things to prevent your car from breaking, or can fix it when it breaks. No one's denying that anything that causes harm should be stopped. The morality lies in stopping harm, not in the harm itself; on that we all agree.
Still, how do you convince someone not to do harm? How do you convince yourself not to cause harm? What makes it so that harm should be avoided? How do you decide whether someone's action must be prevented or not (sometimes causing harm is within someone's rights and needs)?
Morality is the answer to all those questions. God is not needed to prevent people from hurting each other. Morality is.
Without free will, a person who is in a position to do harm in order to reach his goals and not suffer any consequences has zero reason not to proceed, since the person is not responsible for the consequences of his own acts.
How does my ability to change others get affected if it's revealed none of us has free will? Would that suddenly stop me from being able to change other things around me?
I was unaware that objects without free will were unable to affect other objects without free will.
The same thing that does it now.
The same way I do now.
And, I am saying Free Will and morality don't necessarily have to have anything to do with each other. No more than God and morality do.
If you define morality so that it is completely contingent on "choice," then--yes--you need free will to have morality.
Just like if you define morality such that good can only come from God, then you need God to have morality.
However, you don't need to define it like that at all.
Without GOD, a person who is in a position to do harm in order to reach his goals and not suffer any consequences has zero reason not to proceed, since the person is not responsible for the consequences of his own acts.
Only if you're intent on defining morality that way.
So if the principle is identical, a person who did harm did not choose to do it (the same as for the car).
There isn't any moral issue. A killer's murder and a lightning bolt's strike hold the same amount of moral infraction (zero infraction).
This won't work if the choice involved in causing harm has more appeal, considering the consequences for yourself or the feelings of the third party. Without free will, people are not responsible for their choices.
Options are always pre-determined, no matter in which light you see this problem.
The criteria, in a free-will world, wouldn't be pre-determined. It would be arbitrarily determined by the person, influenced by the info he holds.
Okay.
i. Free Will means decisions are caused in a non-deterministic and non-aleatory chain of events. They are arbitrary and we are the arbiters.
ii. Every physical happening is caused in a deterministic chain of events. The exceptions are physical happenings that occur in stochastic fashion (the quantum stuff).
iii. In the absence of a body-mind duality, our decision-making process is a physical happening. If our decision making is a physical happening, then it is either deterministic or stochastic. Thus it is not arbitrary and free will doesn't exist.
Why does it matter if we have free will or not?
I don't think you've fully appreciated the problem of blame. That might be my fault; I am trying my darnedest to communicate why it's an issue but I seem to be coming up short.
Sticking with the US prison system as an example. I will take it for granted that US sentencing is in fact based on a punishment principle first, a rehabilitation principle as a distant second. I'm open to evidence that this is not so, but taking it for granted for the sake of the example...
Assume we can't assign blame. That is, nobody can be morally responsible for anything they do, because hard determinism is true and because you can't be morally responsible for an action that you physically could not have prevented (set aside the bit of incoherence to this argument that both crashing00 and I have noted, and the question of definition that B_S raised - this is the incompatibilist hard determinist position, it's just that none of us accept it philosophically).
So, what we have is a prison system which is punishing individuals for crimes which we have already held that they can't be morally responsible for, since they couldn't have physically prevented them.
You're suggesting that this system is broken, so we should just fix it. By what standard do you say that it's broken?
If nobody can ever be morally responsible for any action, good or bad, then in what sense are they moral agents? Is an unjustifiable prison system really 'broken' if it's not being imposed on moral agents?
By what standard do you say 'we should fix it'? More to the point, what do you mean by 'we' and what do you mean by 'should'? There are no moral agents who could be responsible for fixing it, so there are no agents who 'should' do anything!
This is the essential motivation for philosophers to fear the problem of blame - taken to an extreme, it completely eliminates all moral agents, all moral decision-making, all values and all action.
I've already said that I think the problem is easily resolved, of course. I'm not trying to convince you that it's irresolvable. I'm trying to convince you that resolving it matters - that it can't just be brushed aside as unimportant.
It's supposed to be defeatist. It's the worst nightmare of generations of ethical philosophers, after all.
Incidentally, this use of 'responsible' is very different from previous uses (where it was used as 'morally responsible'). You're using it here as 'the cause of'. Nobody is claiming that hard determinism means people can't cause the death of other people; some are claiming that hard determinism means that people can't be morally responsible for the death of other people - that the fact that they physically caused the death of another person doesn't mean that they bear any moral burden. It's best to be more careful with terms in a discussion of this kind, since equivocation on terms brings a serious risk of misleading readers.
More to your point:
If your computer breaks down, but there are no moral agents around to judge that it 'should' operate in a particular way, in what sense has it broken down? It's just a physical object. It's not 'wrong' or 'right', it just is.
When there's a being capable of assigning value and defining right and wrong around, that being can define it as broken and decide it has to be fixed, but if there are no moral beings around...
This is the state of the US prison system: It's punishing people who aren't morally guilty, but nobody's around to judge that it's broken. You and I can't decide that it's broken; we're just taking our predetermined actions because of our physical history and physical law. It'll just keep on doing what it's doing.
You could define free will as non-deterministic and non-aleatory, but I don't think you have to. If you define it this way, well, we don't have free will. That's pretty simple. However, I think we can get every important consequence of free will even in a deterministic system - and if we can meet every important consequence of free will except the definitional 'non-deterministic' and 'non-aleatory' bits, well, I don't see that they're crucial to the definition, but if they are, then free will is unimportant and we should be talking about Free Will-A - the version that gives us everything we want from free will, but where we're in a deterministic (or quantum indeterministic) universe. Free Will becomes completely inconsequential.
Faced with something like that, well, throw out those aspects of the definition. They are not a crucial razor for finding important differences.
If free will doesn't exist, the distinction between a sentient and a non-sentient being is false. Everything - a rock, a plant, and us - holds the same amount of power to change the course of its 'actions' (no power at all).
Morality degenerates from an objective set of principles that all sentient beings ought to respect into something that doesn't exist in our reality.
In economic crime theory (Becker's model) we assume that criminals make a perfectly rational evaluation between committing a crime for a certain reward and not taking on the crime's risks and costs (the same way an investor manages his portfolio's risks based on his preferences).
In empirical verification, researchers find that there's some 'gap': not everyone in a favorable position to commit a crime does it (unlike an investor, who invests whenever he finds a favorable position for profit). Some researchers call this gap 'the moral cost' - an arbitrary preference people have for not breaking rules for its own sake.
Without morals, the moral cost rests on a false belief. As soon as someone truly believes in the non-existence of morality, the moral cost no longer exists for him, and he will commit a crime whenever the opportunity appears (where 'opportunity' includes his evaluation of the gains and risks of committing the crime).
Ex: Someone finds the opportunity to steal a considerable amount of money from an unknown person, and he knows that no one will ever find out. Not everyone would do it, because they have an arbitrary preference to 'not steal'. If they realize they are not the cause of their own actions (exogenous physical happenings are), there's no reason to keep this preference.
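The Becker-style calculation described above can be sketched as a simple decision rule. The function name, its parameters, and all the numbers below are illustrative assumptions for this thread's example, not values from Becker's published model:

```python
def commits_crime(gain, p_caught, punishment, moral_cost):
    """Return True if the expected payoff of a crime exceeds its
    expected cost: probability of capture times punishment, plus the
    'moral cost' (the rule-following preference discussed above)."""
    expected_cost = p_caught * punishment + moral_cost
    return gain > expected_cost

# With a positive moral cost, the opportunity is declined:
# expected cost = 0.1 * 500 + 100 = 150 > 100 gain.
print(commits_crime(gain=100, p_caught=0.1, punishment=500, moral_cost=100))  # False

# If belief in moral responsibility vanishes (moral_cost = 0),
# the very same opportunity is taken: 0.1 * 500 = 50 < 100 gain.
print(commits_crime(gain=100, p_caught=0.1, punishment=500, moral_cost=0))    # True
```

This is the 'gap' in one line: the only term separating the two calls is the moral cost, which is exactly the term the post argues disappears once someone stops believing in moral responsibility.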
You're going to have to justify the jump from "no free will" to "no morals."
You can't just say "sentience is an illusion, therefore no morals." Why is one dependent on the other?
Additionally, you've been implying all along that if it were proven that no one had free will, we would just go around raping and pillaging all the time, or something. Is that really the case? Is the only thing stopping you from murdering people your belief in "Free Will"?
This page has some good info on that:
http://en.wikipedia.org/wiki/Neuroscience_of_free_will
Without free will our actions cannot be moral or immoral; they are all amoral, like the actions of non-sentient beings. For an action to be moral or immoral, the person has to deliberately choose it, and without free will people don't choose the actions they take.
A world with only robots and natural phenomena wouldn't have morals, since every action is pre-determined. Robots could still be programmed to avoid danger and avoid harm, but none of these actions would be moral, since the robots don't choose what they do. If our world is equally pre-determined, then morals don't exist here either.
People wouldn't rape because rape doesn't bring satisfaction to most of us, thankfully. Among other reasons, of course.
And there are plenty of things "stopping" me from murdering another person besides morals: i. I have no reason to kill anyone; ii. I also have no means to kill anyone; iii. killing people involves the risk of being caught and sent to jail.
But if I didn't believe in free will, and had a reason, the means, and certainty I would avoid the consequences, I see no reason I shouldn't, since I'm not responsible for my actions.
More often than not the circumstances reward you for doing something 'immoral', such as cheating and lying. If one realizes that all possible courses of action are actually amoral natural events, then the satisfaction of taking a moral action is lost.
This actually explains a lot. People lie, cheat and do wrong things all the freaking time. If I have no moral high ground from which to convince them they shouldn't (hey, I don't ask the lightning bolt not to fall!), I guess there's nothing to be done.
From the perspective of human society, the concept of free will is irrelevant. Morality is a concept that is ingrained in us and affects our actions like any other causal factor. But just because I believe something is right or wrong doesn't make it so - after all, one person can view an action as right while another views that same action as wrong.
To claim that a lack of free will means people are no longer responsible for their actions is woefully simplistic. The entire concept of moral responsibility is a subjective human concept, and is not applicable to an objective universe.
Ultimately, that means that in an objectively neutral universe, a lack of free will still leaves people 'responsible' for their actions; they're just responsible in the context of the human society they live in.
So, you're rejecting compatibilism?
Compatibilism is the belief that free will and determinism are compatible ideas.
Also, our current understanding of physics undermines determinism. [1]
Why?
Continually stating the same thing without justification does not make it so. WHY can morality only exist with choice?
Ummm.... well, it might be better for all parties involved if you just keep believing in Free Will then. Maybe you should wait to argue this.