Huh. Well said. I concede defeat on this whole thread.
Now let's see some responses to some of the other recent posts!
As you all know, tomorrow is a pretty important day (it's the 2nd
anniversary of ImproveTheWorld)!
Oh, I started a yootles blog: http://dreeves.wordpress.com/
I would derive a lot of utility from y'all checking it out!
Danny
--- \/ FROM Erik Talvitie AT 06.11.06 19:11 (Today) \/ ---
> Aha. But if we take into account the level of meta-utility that Dan
> introduced, we might just find, if we factor in your and my and lots of
> other people's disutility for living in a society where torture is condoned,
> that it would in fact be allocatively efficient to ban torture, regardless of
> any fairness criterion. Which raises the question, what is it about fairness
> (and by extension search for knowledge) as an additional criterion that
> separates it from alternative criteria like, say "no cruelty to animals" or
> "fancy hotels should put a mint on your pillow?" If the society as a whole
> values fairness, then it will automatically be captured by social welfare. If
> it doesn't, then it won't be, but who says a society that doesn't value
> fairness should be forced to practice it anyway?
>
> Which to me is why saying "Social welfare is the only thing worth fighting
> for" seems a bit meaningless. Isn't that like saying "The only things worth
> studying are in the universe?" I mean, the concept of utility is so
> all-encompassing that efficiency seems nigh impossible to measure and
> therefore doesn't seem to be much help when it comes down to deciding what
> laws to enact or what rallies to attend or what to put on people's pillows.
>
> $0.02 from Erik
>
> Matt Rudary wrote:
>> I meant to put this out here before: Fairness, or at least justice, is not
>> in fact part of maximizing social welfare. Perhaps that was too strong a
>> statement, but here is another shot:
>>
>> A social system that maximizes the sum of the welfare of the individuals in
>> the society is not just. For instance, assuming A) that torture is an
>> effective means of obtaining information and B) the setup of the standard
>> ticking-time-bomb thought experiment, in which getting information from a
>> terrorist would save lives (but only if you get it soon), such a society
>> would justify the torture of one individual. I know that A is not necessarily a
>> valid assumption, but I would oppose torture even if it were. In general,
>> such a society allows the *premeditated and intentional* sacrifice of a
>> small population *against their will* to benefit a larger population.
>>
>> So ensuring fairness is separate from, though not orthogonal to, maximizing
>> social welfare.
>>
>> Matt
>>
>> Daniel Reeves wrote:
>>
>>> That's another tricky thing about maximizing social welfare (synonymous
>>> with maximizing utility, as Dave notes) -- deciding how to include
>>> nonhumans in the equation. You have to include animals' utility in some
>>> way; otherwise it would be ethically A-OK to torture animals for fun.
>>> Or maybe it suffices that there are *people* who get disutility from the
>>> torture of animals. For example, if we had a yootles auction to decide
>>> whether to kill a puppy, we wouldn't need the puppy's participation to
>>> decide not to do it.
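>>>
>>> Concretely, here's a toy sketch of that auction in Python, with invented
>>> bids and a bare-bones "larger total wins" rule (not the actual yootles
>>> mechanism, and ignoring payments entirely):
>>>
>>>   # Signed bids in yootles: positive = willing to pay to kill the puppy,
>>>   # negative = willing to pay to spare it. Numbers are made up; the
>>>   # puppy itself casts no bid.
>>>   bids = {"danny": -30, "erik": -45, "sadist": 50}
>>>   total = sum(bids.values())
>>>   outcome = "kill" if total > 0 else "spare"
>>>   print(outcome)  # "spare": human disutility alone settles it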
>>>
>>> That puts me tentatively in the "animals don't count" camp. Anyone else?
>>>
>>> (I disagree with Dave that 2 & 3 are subsets of 1. Splitting utility
>>> equally is often more important than maximizing the sum of utilities. For
>>> example, it's not OK to steal money from someone who doesn't need it as
>>> much as you.)
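>>>
>>> To put toy numbers on that (a sketch assuming square-root utility of
>>> wealth, i.e. diminishing marginal utility, which is my assumption here):
>>>
>>>   from math import sqrt
>>>
>>>   def total_utility(rich_wealth, thief_wealth):
>>>       # sqrt(wealth): each extra dollar matters less the richer you are
>>>       return sqrt(rich_wealth) + sqrt(thief_wealth)
>>>
>>>   before = total_utility(1000000, 100)
>>>   after = total_utility(999900, 200)   # after stealing $100
>>>   print(after > before)  # True: the sum goes up, yet fairness says no
>>>
>>> So sum-maximization alone would endorse the theft; it takes the fairness
>>> criterion to rule it out.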
>>>
>>> (And knowledge, truth, and scientific understanding are intrinsically
>>> valuable, beyond their applicability to improving social welfare. But
>>> perhaps my own strong feelings about this undermine my own point. In
>>> other words, maybe we don't need to include it for the same reason we
>>> don't need to include animal welfare.)
>>>
>>>
>>> --- \/ FROM Dave Morris AT 06.10.30 11:25 (Oct 30) \/ ---
>>>
>>>> I think that it's important to note that 2 & 3, while distinct and
>>>> interesting components of the discussion, are in fact subsets of 1, which
>>>> could be rephrased in its general sense as "maximization of utility" if
>>>> you don't want to treat only the defined subset of "human". :-)
>>>>
>>>> On Oct 28, 2006, at 1:30 PM, Daniel Reeves wrote:
>>>>
>>>>> Based on off-line discussion with my grandfather, I propose that there
>>>>> are only three fundamental principles worth fighting for in human
>>>>> society:
>>>>> 1. Social Welfare
>>>>> 2. Fairness
>>>>> 3. The Search for Knowledge
>>>>>
>>>>> (This started with an argument about the parental retort "who says
>>>>> life's supposed to be fair?")
>>>>>
>>>>> (1 and 2 are distinct because if we're all equally miserable, that's
>>>>> fair but not welfare-maximizing. Likewise, among the methods for
>>>>> dividing a cake, for example, "I get all of it" maximizes the sum
>>>>> of our utilities, but we nonetheless prefer splitting it in half.)
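>>>>>
>>>>> (Quick check of the cake claim, assuming utility is simply linear in
>>>>> one's share of the cake, an assumption I'm adding for the arithmetic:
>>>>>
>>>>>   for my_share in (1.0, 0.5, 0.0):
>>>>>       print(my_share, my_share + (1 - my_share))  # sum is 1.0 every time
>>>>>
>>>>> so "I get all of it" merely ties for the maximum sum, and fairness is
>>>>> what breaks the tie toward the even split.)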
>>>>>
>>>>> Is there a number 4?
>>>>>
>>>>> --
>>>>> http://ai.eecs.umich.edu/people/dreeves - - search://"Daniel Reeves"
>>>>>
>>>> David P. Morris, PhD
>>>> Senior Engineer, ElectroDynamic Applications, Inc.
>>>> morris @ edapplications.com, (734) 786-1434, fax: (734) 786-3235
>>>>
>>>>
>>>
>
--
http://ai.eecs.umich.edu/people/dreeves - - search://"Daniel Reeves"
Let us live!!
Let us love!!
Let us share the deepest secrets of our souls!!
...
You first.