Instigator / Pro
Points: 0
Rating: 1487
Debates: 31
Won: 35.48%
Topic
#3711

THBT: Conscious reason devoid of all emotion is impossible.

Status
Finished

The debate is finished. The distribution of the voting points and the winner are presented below.

Winner & statistics
Better arguments: 0 / 0
Better sources: 0 / 0
Better legibility: 0 / 0
Better conduct: 0 / 0

After not so many votes...

It's a tie!
Parameters
Type: Standard
Number of rounds: 3
Time for argument: One week
Max argument characters: 20,000
Voting period: Two weeks
Point system: Multiple criterions
Voting system: Open
Contender / Con
Points: 0
Rating: 1527
Debates: 14
Won: 39.29%
Description

Reason definition: to form conclusions, judgments, or inferences from facts or premises.

Consciousness definition: consciousness, at its simplest, is sentience or awareness of internal and external existence.

Pro: you cannot have reason without emotion
Con: you can have reason without emotion

Round 1
Pro
#1
Definitional outlining, the ontological foundations
Hello, Christian. I have a lot to unpack in this debate (once again). First I must begin by explaining what an emotion is if we're to know whether self-aware conscious reason is possible without its input. For the purpose of the debate, we will describe an emotion as: a strong feeling deriving from one's circumstances, mood, or relationships with others. If CON has any reservations about this definition, they are free to edit it or import their own definition into the discussion.

The neuroscience of emotions
What constitutes an emotion? Well, happiness constitutes an emotion, so does love, so does curiosity, so does admiration, jealousy, envy, etc. Yet, the most peculiar of sensations which could be considered an emotion (and is an emotion) is pain. Even physical pain is an emotion.

The sensation of pain is a necessary function that warns the body of potential or actual injury. It occurs when a nociceptor fiber detects a painful stimulus on the skin or in an internal organ (peripheral nervous system).¹ The detection of that signal is “picked up” by receptors at the dorsal horn of the spinal cord and brainstem and transmitted to various areas of the brain as sensory information.
The facilitators of this pathway are known as neurotransmitters. Neurotransmitters are endogenous chemical messengers that transmit signals across a chemical synapse, from one neuron to another “target” neuron, muscle cell, or gland cell.² Some neurotransmitters are excitatory, facilitating transmission of messages, while others are inhibitory neurotransmitters, impeding transmission.² These chemical messages are critical in the modulation of pain.
Pain, then, is a mental phenomenon, a chemical expressed within the brain, just like happiness is, just like love is. If physical pain is created by chemicals in the brain, and the central nervous system is, in actuality, an extension of the brain spread throughout the body, then pain must necessarily exist in the mind as an emotion. If pain (a sense of tactile sensation) is in actuality an emotion, it would then follow that a sense of touch itself is also an emotional stimulus as opposed to a physical phenomenon. It then follows that all of our phenomenal, experiential experiences are predicated upon such chemicals (emotions) created within the brain. What happens when we take them away? What are we left with?
 

Avicenna's floating man thought experiment
If it follows that our sense of pain and therefore physical and emotional sense of comfort are in actuality mental phenomena (chemicals like happiness and sadness), then it would follow that a sense of taste and an experience of taste are also chemical reactions (an emotion) within the mind. Although time may exist outside of the human phenomenal experience (meaning it objectively exists outside of the mind), what is not objective is how we experience said time. How fast time feels to move is also necessarily created through chemical reactions in the brain. In theory, a second for a person could feel like 100 years for another if we tweaked their emotional experience of time.
-
Within Avicenna's floating man thought experiment, he asks us to imagine a floating body in a formless environment. What kind of self would be produced from that? There would evidently be no stimulus to thought except through the sensations (emotions) generated by the movements of one's own body. 

What the floating man would have is a very limited selfhood. He could not be a Tom who enjoys Tom and Jerry or a person who enjoys ice cream. Tom and ice cream do not exist as conceptual categories for the floating man. All the floating man may know is that a thinking thing exists, yet he may not even recognise this thinking thing as himself (many philosophers make this argument against the cogito). It goes too far to imply an "I" in the thinking thing. Jean-Paul Sartre would argue that one can only come to attain self-awareness through the eyes of another. Without another to contrast with, this floating man may never even form an "I".

We're probably all familiar with those who are so depressed they do not even get out of bed in the morning. If simply feeling a negative emotion can do this to you, what would a lack of all experience do to you? Well, even if an outside world were to exist, you would simply allow yourself to die without ever coming to your first thoughts, as there would be no ID desires (instinctual desires) propelling your thoughts.

If it remains true that curiosity and all desire and experience are generated through chemicals in the brain, then regardless of whether an outside world exists or not, if one were to turn off these chemicals, experience or consciousness (at least self-consciousness) would cease altogether. Yet even in CON's best-case scenario, where a sense of touch, pain, and a sense of space and time cease but thought does not, this person would still simply let themselves die, probably never having another thought again, as one has no emotional drive to do anything (as we literally have no experience of anything).
-
It then follows that emotion is a necessary precursor to consciousness itself. It is my theory that the reason robots have not attained consciousness like humans is likely not a programming problem, but a chemical, biological, emotional problem.


Argument through self-interest
Both Aristotle and Plato noticed that, in everything a human does, we must necessarily gain some sort of pleasure from it. We may recognise there is more pleasure in eating junk food and not going to the gym, yet some still go to the gym despite hating working out, because it makes them feel better in the long run. Plato and Aristotle then noticed that humans are physically incapable of performing actions we have no sort of motivation (emotion) to do. When one decides to give to charity, or even die for a cause, there is some sort of "greater good" or some form of emotional desire, fulfilment, or driving force behind acting this way. When this ceases, no action can occur. If helping another person always left us with the literal sensation of having horse faeces in our mouth, we would (in most cases) become incapable of assisting another, regardless of the reason. This demonstrates that through reason we can influence our emotions to take action, yet when an emotion becomes strong enough, reason becomes incapable of controlling it. Freud argues that reason is the horseman and emotions are the horse: reason simply acts as the arbiter and director of the horse, yet the horseman cannot stop an unruly horse from running. This is the perfect analogy for the relationship between reason and emotion (experience).
-
Freud would argue that the reason imagination came to exist was so we could temporarily sate our ID (instinctual desires). When one feels the need to drink water and imagines drinking it, the brain is unaware whether we're actually drinking or not, so the imagined act temporarily occupies the mind, which cannot differentiate reality from fiction. This mechanism is how we can find movies scary despite consciously recognising they're fake; our subconscious never comes to know the movie as fake.
-
The point at which Plato and Aristotle disagreed was not whether humans do things for emotional fulfilment, but whether this emotional fulfilment is virtuous. You wouldn't get out of bed in the morning if you didn't believe or hope for something good to happen today or at some point in the future. Everything we do appears to involve the Freudian pleasure principle. Once this is gone, motivation and the desire to do anything (even think and move) will cease. When emotion itself ceases, so does the self and phenomenal experience.

Argument through robotics
Robots are currently incapable of consciousness, and the almost certain reason for this is the lack of chemicals within a robot's code. Yet evidently, it appears impossible to give metal a chemical emotion. It would then follow that self-aware metal robots are probably a fiction.


Conclusion
Every sort of phenomenal, experiential experience that we have is necessarily an emotional experience. Even so-called "tactile physical experiences" are actually simply emotions. When we strip someone of these chemicals, they will either be incapable of propelling a thought to fruition, or they will be completely incapable of thinking and may be completely unaware of their own existence. Emotion, therefore, is a necessary prerequisite for human consciousness (at least in the material realm).

Emotion is the propeller of reason, and reason is the arbiter and director of the emotions.
Con
#2
Defining emotion

I will use a slightly different definition for emotion: a natural instinctive state of mind deriving from one's circumstances, mood, or relationships with others. Two men can be drunk (a feeling) but one can be calm about this while the other panics. We don't consider someone "emotional" because they have the ability to feel pain; we consider them emotional if they're constantly sad or angry.

Pain is an experience but not an emotion. Emotions are opinions about how to feel about something, while pain is an objective experience, such as seeing a color or hearing a sound. Emotions are thus felt in a narrower sense than feelings [1]. Pro wishes to conflate feelings with emotions, but there is a clear difference. One can experience something (a loud noise, for example) and remain completely indifferent to it. In many ways, emotions are a choice. One can often choose to be happy or sad based on how they look at their circumstances. You can't choose not to feel pain in the same way.

Reason without emotion

Certain people have an inability to feel emotions, yet still have the ability to reason [2]. It doesn't require emotion to perform mathematics. One can feel happy, sad, angered, or jealous that 1+1=2, yet the problem itself is not partial to any of these emotions. The floating man can do math or imagine thought experiments without feeling emotional about it.

Self-interest

Computers can work toward a goal without being emotional about it. Motivation is not an emotion in and of itself. There's a difference between simply wanting money and being blinded by greed. Emotions cause someone to overestimate the benefits of one course of action rather than thinking through all options objectively. An emotional chess player may play impulsively, while someone with an inability to feel emotion would consider all options before making a decision. Both may want to earn the prize money, but only one feels emotional about this. A virtuous executioner will be motivated by an objective sense of justice, while an emotional executioner might be motivated by revenge and other times by pity.

Is everything an emotion?

Redefining everything as an emotion is simply changing a definition to win an argument. Emotions (happiness, sadness, anger) are mental responses to outward stimuli. Plants and computers can detect temperatures and light, but these aren't considered emotions. Why should humans detecting pain or outward stimuli be considered emotions just because they are self-aware about it? It seems that computers can have goals, reason, etc., but not feel sadness and happiness (emotions). Just because humans are conscious of the former does not mean they must also be motivated by the latter.
Round 2
Pro
#3
Defining emotions
I will use a slightly different definition for emotion: a natural instinctive state of mind deriving from one's circumstances, mood, or relationships with others.
Thank you, Christian. I'll accept your definition as we go forth in this debate, although you may soon find it favours my position more than my initial one did. If an emotion is simply a natural instinctive state of mind (which is exactly how pain comes about), your own definition calls into question your segregation of pain from emotions like happiness.
 

Two men can be drunk (a feeling) but one can be calm about this while the other panics. We don't consider someone "emotional" because they have the ability to feel pain; we consider them emotional if they're constantly sad or angry.

On the surface, this appeared to be a good rebuttal, but upon closer inspection, it holds very little substance and has feet of clay. We may not consider someone "emotional" for getting their ankles knocked by a scooter; when we call someone "emotional," we generally mean they're very flamboyant, expressive, or exaggerated in their expression of emotions. This, then, is a non-argument unless you can actually show that emotion and emotional are synonymous words. As you correctly pointed out, we may consider someone emotional when they crash their car (they're going to be very expressive). In this same sense, someone can feel physical pain and be very emotional over and through the physical pain. If it remains true that physical pain is a chemical in the brain just like happiness, you have yet to create any meaningful distinction outside of the fact that one is controllable and the other is not. That doesn't actually deny that physical pain is an emotion (you would have to disprove the neuroscience for that). All it amounts to is an argument that it is a different form of emotion (one which cannot be controlled).
-
You have yet to justify how having some control over one experience and not the other makes the one an emotion and the other not. I can easily just disagree and say "no" regardless of whether I can control it or not: it's an emotion (as dictated by the science). Why does being able to control an emotion make it an emotion? What about those with extreme anger issues who cannot control their emotions, just as some cannot deal with physical pain? Is such anger not an emotion? And yet, this isn't the only assumption in this section of the debate. What if free will does not exist? Does no one, therefore, have emotions? I'll leave it to you to sort out these conundrums.
-

I. Pain is an experience, not an emotion rebuttal
Pain is an experience but not an emotion
Experience definition as given by Oxford Languages: an event or occurrence which leaves an impression on someone.
-
Something being an experience and being an emotion are not mutually exclusive. I experience my emotions, but they're still emotions. This definition also proves the point, unless you think the Oxford dictionary is incorrect. If so, that's OK; dictionaries get things wrong all the time, but you ought to explain why it's wrong.
Emotions are opinions about how to feel about something

Following the definition of opinion, it does not follow that opinions are created just or only from emotions. I have the personal opinion that the ice cream man will come down the road again tomorrow, as he hasn't missed a day doing this trip in years. Although this isn't certain knowledge, it's an opinion which could be wrong.

Within one of CON's sources, a distinction is offered between feelings and emotions, which goes as follows: let us call the first instance (currently feeling proud about something) an emotional experience, the second instance (being generally proud about that thing) an emotion or sentiment, and the third instance (being a proud kind of person) a trait.
This is true... but it doesn't actually deny my philosophy at all. If we get rid of the ability for us to formulate any one of these 3 statements, we lose the ability to do the other two. Therefore, this article does not debunk my case.
 
 In many ways, emotions are a choice. One can often choose to be happy or sad based on how they look at their circumstances. You can't choose not to feel pain in the same way.

I believe I've already offered an adequate response to this; it is rather vague. I have little reason to believe something is not an emotion just because you can't control it. Some people with OCD cannot control how much they wish to brush their teeth or tie their shoes. Is that not an emotion compelling them to do this? As a result, drawing the line between emotions and other inner experiences based on controllability does not hold even within your own paradigm (uncontrollable anger would then not be an emotion), and thus this point is contradictory and therefore debunked. This argument also rests on the ideas of free will and intuition, both of which can be wrong.
 

II. Reason without emotion rebuttal
Certain people have an inability to feel emotions, yet still have the ability to reason [2]
CON offers us an article to read to prove his point, although it is obvious CON never actually read the article, as the article itself says people with alexithymia do experience emotions; they are just unable to process them properly to be expressed. The article then goes on to say this has been shown through brain scans revealing a lack of connectivity between the left and right hemispheres. CON is yet to prove reason can exist without emotion, and this article does not do it.

Psychopaths are also incapable of all sorts of emotions, such as empathy for others. Yet psychopaths are not devoid of all emotion, for everything they do is predicated on self-interest and survival, which in itself is necessarily an emotion prompting them to act in their self-interest. Just as we can see (through looking at psychopaths) that when empathy is shut off they do not care to be empathetic, we can also reasonably say that if we shut off their instincts for self-interest too, they would not act in that manner either, and would simply let themselves die.


 It doesn't require emotion to perform mathematics. One can feel happy, sad, angered, or jealous that 1+1=2, yet the problem itself is not partial to any of these emotions. The floating man can do math or imagine thought experiments without feeling emotional about it.
You're right, it doesn't matter what one's emotional state is while doing mathematical equations. However, if someone has no desire or curiosity to solve the math equation in the first place, no thought about the math equation will occur. You have yet to disprove the robust neuroscience I presented except through semantic games and arbitrary lines on what constitutes an emotion or not. It therefore follows that the experience of time, space, self, body, and balance are all actually emotions. With all of this gone, no self can form.

All forms of conscious knowledge require knowledge of oneself as an experiencer of that thing at the same time. If one has no concept of the self, then any conscious knowledge of anything outside of the self is impossible.


III. Self-interest rebuttal
Computers can work toward a goal without being emotional about it. Motivation is not an emotion in and of itself. 
A human is not a computer. Humans have conscious knowledge of themselves and things outside of themselves; a robot does not. A robot can complete tasks without the need for pleasure because these are autonomous processes with no conscious input. If we were to take away all human conscious emotions, all conscious reasoning would become impossible. This, however, does not mean the body could not do things of itself without you being aware. One only needs to look at animals to notice that even creatures which are not sentient can accomplish tasks. The difference is that this is not done consciously, and therefore it is outside the rules of the debate.
-
Emotions cause someone to overestimate the benefits of one course of action rather than thinking through all options objectively. An emotional chess player may play impulsively, while someone with an inability to feel emotion would consider all options before making a decision. Both may want to earn the prize money, but only one feels emotional about this. A virtuous executioner will be motivated by an objective sense of justice, while an emotional executioner might be motivated by revenge and other times by pity.
There's some truth in this. The difference, however, is that a robot does not need conscious thought to make its decisions, which is obvious, as it isn't conscious. This debate is about whether we can make conscious decisions or not without emotions; therefore, appealing to robots is a non sequitur. Robots don't make conscious decisions. Even if we assumed our body would make decisions without our consciousness present, these would not be conscious decisions. Therefore, none of these arguments are applicable to anything I've said thus far.


Redefining everything as an emotion is simply changing a definition to win an argument. Emotions (happiness, sadness, anger) are mental responses to outward stimuli. Plants and computers can detect temperatures and light, but these aren't considered emotions. Why should humans detecting pain or outward stimuli be considered emotions just because they are self-aware about it? It seems that computers can have goals, reason, etc., but not feel sadness and happiness (emotions). Just because humans are conscious of the former does not mean they must also be motivated by the latter.
As I have previously said, many things go against our intuition. The semantics may agree with you, but the science agrees with my semantic usage. Therefore, I say we ought to change the definition if you're incapable of refuting my philosophy within the next round. Outside of that, I believe I have reasonably responded to the rest of what you said in this section in other parts of the debate.


CONCLUSION

  • CON's own definition has worked against him and indirectly supported my argument.
  • CON uses arguments about robots against me (which doesn't disprove anything I said).
  • I readily debunked both of CON's sources, and one even supported my argument (demonstrating CON never actually read the article).
  • CON's distinction between pain and other sorts of experiences has been shown to be not just vague, but contradictory.

Con
#4
"State of mind" here refers to how someone considers outside circumstances. A therapist can help you improve your state of mind (having a better outlook) even if they don't change the amount of pain you feel. A shot can still hurt, but someone with a positive state of mind won't be as scared of it.

This, then, is a non-argument unless you can actually show emotion and emotional are synonymous words.
Emotional means dominated or prone to emotion [1]. But we wouldn't call someone emotional just because they experience physical pain, unless there are a lot of emotions that go with it (sadness, anger, etc.)

If it remains true that physical pain is a chemical in the brain just like happiness, you have yet to create any meaningful distinction outside of the fact that one is controllable and the other is not.
That's a very important distinction. One is completely the result of outside forces and the other is a partly controllable method for how to react to it. Just because they're both chemicals in the brain doesn't mean that there's not a very clear difference between them.

Why does being able to control an emotion make it an emotion? What about those with extreme anger issues who cannot control their emotions like those who cannot deal with physical pain?
That's just what we refer to when we talk about emotions. Again, anger is distinct because someone with anger issues could go to therapy, while someone with pain would consult a physical doctor. With therapy (and a better state of mind) it would be possible to be less angry, even if only a little.

What if free will does not exist? Does no one, therefore, have emotions?
My opponent has not provided any evidence that free will does not exist, so this point is void anyway. But emotions are the kind of thing that can be improved with therapy. The kind of thing that gets you labeled as "emotional." Not simply things that happen to you.

Something being an experience and an emotion are not mutually exclusive.
Sure, but pain is an experience and not an emotion (even if some things are both). Pro's case relies on redefining physical pain and motivation as emotions, even though few of us would consider James Bond to be emotional most of the time. A non-emotional person can experience pain and take it in stride while trying to make money or break the law without letting emotions get in the way. Pretty much everyone uses the term "emotion" the way I'm using it. Redefining it for the sake of winning an argument seems rather silly.

Therefore, this article does not debunk my case.
You're ignoring the part where it lists pain and hunger as physical sensations and distinct from emotion [2]:

An emotional experience, by virtue of being a conscious experience, is necessarily a feeling, as are physical sensations such as hunger or pain
Back to Pro's argument:
Some people with OCD cannot control how much they wish to brush their teeth or tie their shoes.
OCD makes someone predisposed to certain emotions, but they can still be mitigated with therapy and large amounts of willpower. It's still in the realm of things that can be controlled in the mind and with therapy (even if it's very difficult to do so).

the article itself says people with alexithymia do experience emotions; they are just unable to process them properly to be expressed
Some of them, yes. But the article also says that Caleb doesn't experience any emotions most of the time. As the article points out, there are many different types of alexithymia.

You're right, it doesn't matter what one's emotional state is while doing mathematical equations. However, if someone has no desire or curiosity to solve the math equation in the first place, no thought on the math equation will occur.
Again, someone can want something without being considered emotional. We'd hardly consider a math professor emotional just because they solve a math problem.

Pro again talks about robots not being conscious. But someone who behaves entirely like a robot, solving math problems and conducting scientific studies, should not be considered emotional just because they are conscious of what they are doing. Just being self-aware doesn't suddenly make someone emotional.

The semantics may agree with you, but science agrees with my semantic usage.
But we're having a scientific debate through the lens of English semantics. If the subject of this debate was "Should pain be categorized as an emotion?" or "Is pain very similar to emotions?" then Pro would have a point. But this debate must use the term emotion as it's currently defined. Science doesn't decide what words mean. We do.
Round 3
Pro
#5
Sadly, due to time constraints today (busy day) and the effects this debate has on my dopamine receptors, I will not be posting an argument this round. CON admitted his argument was purely rhetorical and based not on any facts of reality, but on words; I think that says enough. The neuroscience goes completely unchallenged, and CON tries to derail the conversation into word games instead of actually engaging with the points of empirical evidence and ontology.
Con
#6
I'll post a short argument as well since I want to avoid a final round "blitzkrieg" so to speak.

The science here is very relevant, but words have clear definitions. Emotions and reason can occur independently, but my opponent relies on redefining pain and motivation as emotions, rather than showing how reason is influenced by actual emotions. The science is important, but words have meanings that shouldn't be changed for the sake of winning an argument.