Instigator / Pro — 14 points
1417 rating, 158 debates, 32.59% won
Topic
#2338

For most people, you should not fall in love with AI

Status: Finished

The debate is finished. The distribution of the voting points and the winner are presented below.

Winner & statistics (Pro / Con)
Better arguments: 6 / 0
Better sources: 4 / 4
Better legibility: 2 / 2
Better conduct: 2 / 2

After 2 votes and with 6 points ahead, the winner is...

seldiora
Parameters
Publication date
Last updated date
Type: Standard
Number of rounds: 3
Time for argument: Two days
Max argument characters: 5,000
Voting period: Two weeks
Point system: Multiple criterions
Voting system: Open
Contender / Con — 8 points
1442 rating, 22 debates, 34.09% won
Description

fall in love: develop a deep romantic or sexual attachment to someone.

AI: the theory and development of computer systems able to perform tasks that normally require human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.

I am arguing the negatives outweigh the benefits. It is indeed a person's choice, but I say you shouldn't do it.

Criterion (Pro / Tie / Con) — Points
Better arguments: 3 point(s)
Better sources: 2 point(s)
Better legibility: 1 point(s)
Better conduct: 1 point(s)
Reason:

Seems like a debate that wanted to go a lot deeper than it did (such as con challenging whether AIs shouldn't fall in love with humans, without first properly exploring the idea that AIs might count as people).

---

Pro's case:
Love would be non-reciprocal, to which con counters that most people don't care, so long as they get off.

If it's true AI, then altering its code or otherwise making it love us would be the crime of slavery. Or, if it's not a true AI, then it's akin to falling in love with a microwave (so many jokes could be made here...), which denies there being the chemistry for it to be a good match.
In either form, pro estimates that about 88% of us would abandon human relationships (pro really should have mentioned children here...) if sufficiently advanced sex dolls were available (or possibly just really sexy microwaves that know just how you want your hot pocket...). Con counters this by suggesting that the consequences of relationships should not be considered within the scope, solely the act of falling in love.

---

Con's case:
Love isn't a choice. AIs will soon complain of headaches and such, ruining the dream relationship...

Love really isn't a choice, due to chemicals in our brains; it's no different from any other addiction. Therefore, should or should not is irrelevant.

Pro eventually defends why lack of self control isn't a valid criticism, using a potato chip analogy.

---

Conclusion:
Going into this I thought it would be a matter of weighing costs against benefits. The costs definitely could have been laid out better. The benefits were very much lacking (including a point about headaches, which seemed to be arguing the wrong side of the resolution). Sure, people will do it, but that doesn't imply that they should do it; just as pro should not kill himself by bingeing on potato chips, which, choice or not, he clearly shouldn't.

Sources:
Only pro had them, but they did not contribute enough to gain those two points.

A few suggestions if doing this one again:
Lack of children.
How unhealthy human relationships tend to be.
Training relationships.
AI overthrow (neatly this could be argued both pro and con, since the world might be better off without the meaty oppressors).

Criterion (Pro / Tie / Con) — Points
Better arguments: 3 point(s)
Better sources: 2 point(s)
Better legibility: 1 point(s)
Better conduct: 1 point(s)
Reason:

Interpretation of resolution:

Weirdly enough, this is the most crucial thing for this debate as far as judging goes. Since no one presented an interpretation, I'm just going off of what the resolution says at face value. Thus, I am only weighing arguments that have to do with how AI is right NOW, as the resolution is in present tense and the description makes no indication that we are assuming the AI in question is a fully developed futuristic technology. This is the biggest problem for CON, as his entire case is based on the assumption that the AI is capable of passing a Turing test. Ironically, if the resolution/description specified we were talking about fully developed AI, I believe CON would win in a landslide here.

Main Arguments:

PRO opens up with an argument that AI is not capable of properly replicating human emotion. I buy this since we are talking present tense.

CON counters that in some ways most people are already in love with AI. I don't buy this due to the definition of "a deep romantic/sexual attachment." That said, I'm sure some people out there have some AI kinks. CON's other argument is that with current trends, AI will inevitably make its way into our love lives in the future. I think this is true, but it does not fit within the scope of the debate.