Would you rather that one person's lifespan was reduced by 20 years...

Author: Savant

Posts

Total: 36

Savant
...or that everyone on earth had their lifespan reduced by a tenth of a second?

Inspired by this post, which was itself inspired by a similar thought experiment.

Consider the following two options: (1) Reduce one person's lifespan by 20 years or (2) Reduce two people's lifespans by 10 years each. To me, these seem relatively equivalent. We could value equality, but neither of these people is being treated equally to the rest of the population. In terms of preventing harm to people, both options seem equally bad.

But we could break down the problem even further. Suppose we are given the option to reduce ten people's lifespans by 2 years each, then a hundred people's lifespans by 73 days each, and so on, until we're given the option to reduce 10.5 million people's lifespans by a minute each (these are approximations), and finally the opportunity to reduce the lifespan of everyone on earth by a tenth of a second.

Anything less than 10 milliseconds should be unnoticeable, but 70 billion people having their lives reduced by a hundredth of a second is a greater decrease in lifespan than a single person having their lifespan reduced by 20 years. Unnoticeable does not mean nonexistent—years are just a lot of milliseconds, after all. I think most people would rather inconvenience everyone on earth than bite the bullet and kill one person, but that requires us to determine where the analogy breaks down. When does premature death become an inconvenience rather than a tragedy? If human life is sacred, are milliseconds of a human life sacred?
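The round numbers above can be sanity-checked; a quick sketch (figures are approximate, and the 70 billion comes from the hypothetical doubling sequence, not Earth's actual population):

```python
# Quick check of the lifespan arithmetic (round numbers).
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~31.6 million

# 20 years expressed in minutes:
twenty_years_minutes = 20 * 365.25 * 24 * 60
print(round(twenty_years_minutes))  # ~10.5 million minutes

# 100 people losing 73 days each totals 20 years:
print(100 * 73 / 365)  # 20.0 years

# 70 billion people each losing a hundredth of a second:
total_seconds = 70e9 * 0.01
print(total_seconds / SECONDS_PER_YEAR)  # ~22 years, more than one person's 20
```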

A utilitarian could argue this recursively: Reducing one person's lifespan by X amount is morally equivalent to reducing the lifespan of 2 people by X/2 amount...after enough iterations, we would have to conclude that reducing one person's lifespan by X amount is morally equivalent to reducing the lifespan of 2^999 people by X/(2^999) amount.

We're also not doing some action X to prevent some outcome Y, so this isn't strictly a question of whether the ends justify the means. We're choosing between two direct harms X and Y and deciding which is worse—should we use utility as a measurement here, or something else?

ADreamOfLiberty
@Savant
This is the most interesting question I've seen on this site to date, and whoever ends up writing the moral framework of our new robot overlords will have to know the answer inside and out.

zedvictor4
@Savant


A lifespan cannot be reduced or increased..... A lifespan is what it is.

Best.Korea
A similar question was solved by someone on YouTube a long time ago.

Would you rather cause X amount of pain to one person, or X/100 amount of pain to each of 100 persons?

The answer is: It is better to cause X/100 amount of pain to each of 100 persons, rather than X amount of pain to one person.

It is a question of honour and justice, two moral values often ignored by many.

Intelligence_06
@Savant

    ...or that everyone on earth had their lifespan reduced by a tenth of a second?

I say this one, simply because it does not directly ruin the life of basically anyone.

ADreamOfLiberty
@Best.Korea
    The answer is: It is better to cause X/100 amount of pain to each of 100 persons, rather than X amount of pain to one person.

    It is a question of honour and justice, two moral values often ignored by many.
Justice is complicated (as complicated as ethics). Equality is simple (given context). This is an application of a principle of equality.


ADreamOfLiberty
@Intelligence_06
    ...or that everyone on earth had their lifespan reduced by a tenth of a second?
    I say this one, simply because it does not directly ruin the life of basically anyone.
Which is to say harm is non-linear.

In the OP's example the total lifetime lost is about 20 years in either case, but the problem comes when it costs 10,000 human-years to save one guy's 20 years.

That might be only a second of life each, but if a computer were making the decision and did not apply non-linear transformations to the final value proposition, it would readily sacrifice the one guy.

Also consider this: nobody would admit to weighing a second of their own life against someone else's lifetime, but in real-world scenarios such tradeoffs happen constantly.

By definition, doing anything but maximizing total lifetime will result in sub-optimal lifespans. What happens when the computer is saving 100,000 people from the 20-year loss and now everybody is losing days and months?

Savant
@Best.Korea
    The answer is: It is better to cause X/100 amount of pain to each of 100 persons, rather than X amount of pain to one person.
Interesting. What about causing X/100 amount of pain to 200 people? Or 1,000 people? (vs causing X amount of pain to 1 person.) Does utility ever outweigh equality as a value, in your opinion?

Best.Korea
@ADreamOfLiberty
Justice means:

What is banned for one is banned for all.

I don't see how you can ban one person from not feeling pain without banning the same for everyone else, and call that justice.

I also don't see how you can ban one person from living without banning all from living, and call that justice.

Savant
Another interesting thought experiment, this one a bit more positive:

You are the life fairy, a magical being who can extend lifespans. You can choose any number, denoted as X, and X random people will have their lifespans extended by 20/X years. You can increase the lifespans of billions of people by fractions of a second, increase 1 lifespan by 20 years, or something completely different. Assume that all of these people would appreciate an increased lifespan. What number do you choose, and how does equality factor in?
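Whichever X the fairy picks, the total life added is the same; a quick sketch of that invariant:

```python
# The total extension is invariant in X: X people x (20/X years) = 20 years.
for x in (1, 20, 1_000_000, 8_000_000_000):
    total_years = x * (20 / x)
    print(x, total_years)  # always ~20 years in total
```

So the choice is purely about distribution, not about how much life is created.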

Best.Korea
@Savant
    Interesting. What about causing X/100 amount of pain to 200 people? Or 1,000 people? (vs causing X amount of pain to 1 person.) Does utility ever outweigh equality as a value, in your opinion?
In an ideal world, utility should never outweigh justice and equality.

Yes, in an ideal world.

The main two moral theories are:

1. Ban that which does more harm than good (consequentialism).

2. What is banned for one is banned for all (justice, equality).

In your case, it is a case of equality vs utility.

1. Causing 1 person X pain 
2. Causing 200 persons X/100 pain each

Option 2 causes more pain in total (200 × X/100 = 2X). So from a consequentialist point of view, choosing 2 is wrong.

However, from a point of view of equality, choosing 1 is wrong since it is not an equal treatment for all.

There is a point where utility makes equality undesirable.

For example, being forced to decide between 1. torturing the entire world with X amount of pain or 2. torturing just 1 person with X amount of pain.

From the standpoint of equality, 1 is equal treatment for all. From the standpoint of utility, 2 is the obvious choice.

Best.Korea
@Savant
    You are the life fairy, a magical being who can extend lifespans. You can choose any number, denoted as X, and X random people will have their lifespans extended by 20/X years. You can increase the lifespans of billions of people by fractions of a second, increase 1 lifespan by 20 years, or something completely different. Assume that all of these people would appreciate an increased lifespan. What number do you choose, and how does equality factor in?
Equality would choose the same for all. Consequentialism would likely choose 20 people to extend their lives by 1 year each, or something like that. Maybe extend the lives of 240 persons by one month each?

Lemming
@zedvictor4
@Savant
There 'are jobs though, that decrease the workers' lifespans,
Black lung disease for coal miners, for example,

Though I would imagine but not know,
That technology might have mitigated this some.
I'd also imagine that one might argue that coal miners 'choose their profession,
But there 'have been and are societies where individuals are 'forced into their professions by government.

Thus some governments have sacrificed X many people's health by X,
To increase the health by X of X many 'other people,
Assuming the country is strengthened by the coal or trade of the coal.

As I don't expect to have kids,
And 'hope to live to 100 (Probably won't happen)

I sometimes wonder who I would leave my savings to?
(I don't expect to have a close relationship with any nephews, nieces)

Personally I'd rather leave it to a few or single people,
Because I value being appreciated,
Because the change would be more substantial 'looking, than the money spread around,
Though arguably the change might be 'bigger spread around, as many people might need a 'little money at times in their life, to get out of holes,
While once a 'single person has a certain amount of money, they are out of the hole,
Hm, maybe I ought leave it to many.
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

Of the 'harm question, of life,
. . 
I think it'd be better to use 'examples,
Many people recoil due to non harm values, at the 'concept,
Yet 'situational, behave different.

It's all just a bunch of arbitrary values though, in my view.
Arbitrary actions influencing values,
Some person is nice to you, you value that person 'slightly more,
Then weigh it against some value you have, such as fine living.
. .
School shooters, have been known to spare people who have been nice to them,
They value their insane misguided stupid revenge,
Yet also value other people,
Or perhaps value their revenge and think the person doesn't factor into it,
Though their revenge is still stupid anyhow, misguided, wrongful.
. . . .

At work, were I a boss,
I'd rather 'everyone pitch in to some task,
Than 'one person take it easy,
All suffer together, a bit less.
Though the above assumes a 'contract, all agreed to work, assumes anyone can 'leave if they want (Barring financial straits that come with not working)
I value fairness, honoring one's word.
. .
Though I also value not being in pain, one could also argue the boss has a 'different contract/job,
But if the different job was not also painful, I think I might choose to share the pain of the task (Probably),
I value not feeling guilty.

Were I in a lifeboat,
I don't think my family would appreciate me killing another person, so they could eat more,
I value my family's will.
I don't think 'I would appreciate killing a person so I could eat more,
I value not being grossed out by cannibalism,
I 'do value other people,
It might not extend my own life.
Hm. . .
. .
Yet people more easily make decisions where they are not 'close,
Encourage war, or trade, blood diamonds.

I think it likely, that if I could live substantially longer, say 200 years (Though preferably 'much longer), there might be 'no amount of people I'd not be willing to sacrifice, if all it took was a word.
I value myself.
But I could be wrong of myself.


Best.Korea
@Lemming
    I think it likely, that if I could live substantially longer, say 200 years (Though preferably 'much longer), there might be 'no amount of people I'd not be willing to sacrifice, if all it took was a word.
I have a similar moral problem.

For example, if I had to choose between:
1. Me experiencing great pain
Or
2. Millions of other people experiencing great pain

I would always choose 2.
I don't like experiencing pain myself, and the pain of other people doesn't hurt me.

Best.Korea
Consequentialism is only fun while it's others who get tortured. When it's you, it's no longer fun.

Lemming
@Best.Korea
In the Manga Kaiji,
There are times the protagonist refuses to harm other people, for self gain,
For reasons of empathy, honor.
Though there are 'also times he harms other people,
For reasons of survival, empathy, honor.

An Antagonist, Hyoudou Kazutaka,
Makes a point in one scene, as he pokes the broken leg of a man,
How he does not 'feel the pain of that man, or others, when they are in pain,
Nor he asserts, does Kaiji feel their pain.

Yet I remember the Human Derby,
Rails high up above the ground, thinning in width further on one goes,
The way to win,
To 'push those ahead of you,
One 'wins by going last.
. .
The crowd of rich laughing, watching,
But I remember the tears of the participants,
The empathy (Pain), connection,
I remember Kaiji's refusal to push.
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

Heh,
That said, I don't know where my own moral compass will spin,
Doubtless there have been times on either side,
Push, don't push.

Savant
Now consider another variation:

The human species has reached a new height of power and will survive for millions of years. Every month for 51,000 years, the world government is given a choice between killing three people or infecting all humans and their descendants with a brand new condition that will kill them one hour before they would otherwise die. This effect is cumulative, but if lifespans get too low, children can still be created and cared for by AI machines. If the government never bites the bullet and kills three people, human lifespans will hit one hour permanently after 51,000 years. The other extreme involves killing around 1.8 million people total. How often should the government choose to directly kill three people, if ever?
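The numbers in this scenario can be checked with round figures:

```python
# Checking the 51,000-year scenario (round figures).
months = 51_000 * 12  # one choice per month
print(months)         # 612,000 choices, i.e. up to 612,000 hours of cumulative loss

# 612,000 hours expressed in years:
print(months / (365.25 * 24))  # ~69.8 years, roughly a full lifespan gone

# Always killing three people instead:
print(3 * months)  # 1,836,000 people, i.e. the ~1.8 million total above
```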

Best.Korea
@Savant
Well, there would come a point where lifespans would be too short for everyone.

With over 600,000 hours of lost life for each individual, which is a lifetime, the individual would be born just to die one hour later.

Of course, it would still be unjust to kill 3 random people to prevent that, especially since it's not the fault of those 3 people.

Savant
@Best.Korea
These are two killing options, not "doing X to prevent Y." So you'd be killing 3 people early in their lives or killing a lot of people an hour before they'd otherwise die. Neither is just, but you have to choose.

Best.Korea
@Savant
Again, consequentialism would kill just the 3 people, since that results in the greatest overall good.
8 billion people = 8 billion hours lost when it's just 1 hour each.
3 persons = about 1.8 million hours total (a ~600,000-hour lifetime each).

Just treatment would kill everyone equally.
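The comparison for a single month's choice can be sketched with the same round figures (assuming a ~600,000-hour, i.e. ~68-year, lifetime):

```python
# Comparing one month's two options in total hours of life lost.
LIFETIME_HOURS = 600_000  # rough full lifetime, ~68 years

everyone_loses_an_hour = 8_000_000_000 * 1  # 8 billion hours
three_people_die = 3 * LIFETIME_HOURS       # ~1.8 million hours

print(everyone_loses_an_hour / three_people_die)  # ~4,400x more hours lost
```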

Best.Korea
@Savant
    These are two killing options, not "doing X to prevent Y."
When you have only 2 mutually exclusive options, doing one prevents the other. That's what I meant.

Dr.Franklin
@zedvictor4
You're dense

zedvictor4
@Dr.Franklin
So how can a lifespan be any shorter or longer than what it is?

Say a person lived for 50 years.......Lifespan: 50 years.

If the same person had perhaps lived a healthier lifestyle, they may have lived for 60 years.......Lifespan might have been 60 years......But this is only speculation.

We can only know a person's actual lifespan, which is the duration of life from conception to death.

Speculation is futile.


Critical-Tim
@zedvictor4
I agree, you have pointed out a technicality in the example. A person's lifetime cannot be shortened because it is still that person's lifetime. However, the argument remains relevant with some proper articulation: is decreasing an individual's lifespan from its predetermined expiration theoretically equivalent to distributing the same decrease over numerous individuals, so that the life taken from each is unnoticeable?

b9_ntt
@Savant
The pain of choosing may be mitigated:
1) Ask for volunteers.
2) Choose people who are already on their death beds.
3) Kill unrepentant murderers or career violent offenders.


ebuc
...."8 guys in this country have more money than 4 billion people combined, but the mom buying groceries with food stamps is the problem"...unknown quote

Standard of living is rising and has risen for all; however, the gap between rich and poor has been and is still growing.

Money can't buy happiness, but it can make life easier.

A life of misery is no fun. A life of suffering is no fun.

Why do so many kill themselves when the standard of living is always rising?

zedvictor4
@Critical-Tim
Hi Tim.

Not sure that's articulated enough.

The only relevant appreciators of the exercise would be the deceased, who by definition wouldn't notice any difference in lifespan, whether that be 10 years or ten days.

As an exercise, one might as well present two comparative mathematical equations and ask which is best.

Sort of......Which is preferable...... 100 - 10 or 100 - 1 x 10.

Critical-Tim
@zedvictor4
    I agree, you have pointed out a technicality in the example. A person's lifetime cannot be shortened because it is still a person's lifetime. However, the argument remains relevant with some proper articulation. Is it theoretically equivalent to decrease an individual's lifespan from its predetermined expiration as to equally distribute the amount decreased over numerous individuals so that the life taken is unnoticeable?

    Hi Tim. I'm not sure that's articulated enough. The only relevant appreciators of the exercise would be the deceased, who by definition wouldn't notice any difference in lifespan. Whether that be 10 years or ten days. As an exercise, one might as well present two comparative mathematical equations and ask which is best. Sort of......Which is preferable...... 100 - 10 or 100 - 1 x 10.
I believe my description of the question was accurate; what makes you believe otherwise? You claim the only relevant appreciators of the exercise would be the deceased. Why do you assume that the people affected would be unaware their lives were shortened? It seemed to me, since it was not specified, that this was an irrelevant factor that does not contribute to the equation. Perhaps the argument could be made that no person would be unaware their life was shortened, but then again, we are speaking theoretically, so we cannot make this assumption unless we specify it. Would the morality change depending on whether the individuals knew their lives were being shortened, or were ignorant of it?

zedvictor4
@Critical-Tim
Hi Tim.

In the first instance, I was simply pointing out that a lifespan can neither be shortened nor increased.

I don't see how such a simple fact can be affected by moral concepts.

Moral concepts are relative to interactive behaviour, and interactive behaviour might curtail a life's duration, thereby finalising a lifespan.

Two related but separate discussions.

TwoMan
@zedvictor4
    I was simply pointing out that a lifespan can neither be shortened nor increased.
    I don't see how such a simple fact can be affected by moral concepts.
If a person is made aware, convincingly, that their lifespan will be shortened relative to what it would otherwise be, they would be mentally and emotionally harmed by this knowledge. That is worthy of moral consideration.