Date: Mon, 6 Nov 2006 19:46:13 -0500 (EST)
From: Daniel Reeves
To: Erik Talvitie
cc: improvetheworld @ umich.edu
Subject: Re: social welfare + fairness + knowledge

Huh. Well said. I concede defeat on this whole thread. Now let's see some
responses to some of the other recent posts!

As you all know, tomorrow is a pretty important day (it's the 2nd
anniversary of ImproveTheWorld)!

Oh, I started a yootles blog: http://dreeves.wordpress.com/
I would derive a lot of utility from y'all checking it out!

Danny

--- \/ FROM Erik Talvitie AT 06.11.06 19:11 (Today) \/ ---

> Aha. But if we take into account the level of meta-utility that Dan
> introduced, we might just find, if we factor in your and my and lots of
> other people's disutility for living in a society where torture is
> condoned, that it would in fact be allocatively efficient to ban torture,
> regardless of any fairness criterion. Which raises the question: what is
> it about fairness (and by extension the search for knowledge) as an
> additional criterion that separates it from alternative criteria like,
> say, "no cruelty to animals" or "fancy hotels should put a mint on your
> pillow"? If the society as a whole values fairness, then it will
> automatically be captured by social welfare. If it doesn't, then it
> won't be, but who says a society that doesn't value fairness should be
> forced to practice it anyway?
>
> Which to me is why saying "Social welfare is the only thing worth
> fighting for" seems a bit meaningless. Isn't that like saying "The only
> things worth studying are in the universe"? I mean, the concept of
> utility is so all-encompassing that efficiency seems nigh impossible to
> measure and therefore doesn't seem to be much help when it comes down to
> deciding what laws to enact or what rallies to attend or what to put on
> people's pillows.
>
> $0.02 from Erik
>
> Matt Rudary wrote:
>> I meant to put this out here before: Fairness, or at least justice, is
>> not in fact part of maximizing social welfare.
>> Perhaps that was too strong a statement, but here is another shot:
>>
>> A social system that maximizes the sum of the welfare of the
>> individuals in the society is not just. For instance, assuming A)
>> torture is an effective means of obtaining information and B) the
>> standard ticking-time-bomb torture thought experiment in which getting
>> information from a terrorist would save lives (but only if you get it
>> soon), in such a society torture of one individual would be justified.
>> I know that A is not necessarily a valid assumption, but I would oppose
>> torture even if it were. In general, such a society allows the
>> *premeditated and intentional* sacrifice of a small population *against
>> their will* to benefit a larger population.
>>
>> So ensuring fairness is separate from, though not orthogonal to,
>> maximizing social welfare.
>>
>> Matt
>>
>> Daniel Reeves wrote:
>>
>>> That's another tricky thing about maximizing social welfare
>>> (synonymous with maximizing utility, as Dave notes) -- deciding how to
>>> include nonhumans in the equation. You have to include animals'
>>> utility in some way, otherwise it would be ethically A-OK to torture
>>> animals for fun. Or maybe it suffices that there are *people* who get
>>> disutility from the torture of animals. For example, if we had a
>>> yootles auction to decide whether to kill a puppy, we wouldn't need
>>> the puppy's participation to decide not to do it.
>>>
>>> That puts me tentatively in the "animals don't count" camp. Anyone
>>> else?
>>>
>>> (I disagree with Dave that 2 & 3 are subsets of 1. Splitting utility
>>> equally is often more important than maximizing the sum of utilities.
>>> For example, it's not OK to steal money from someone who doesn't need
>>> it as much as you.)
>>>
>>> (And knowledge, truth, and scientific understanding are intrinsically
>>> valuable, beyond their applicability to improving social welfare. But
>>> perhaps my own strong feelings about this undermine my own point.
>>> In other words, maybe we don't need to include it for the same reason
>>> we don't need to include animal welfare.)
>>>
>>> --- \/ FROM Dave Morris AT 06.10.30 11:25 (Oct 30) \/ ---
>>>
>>>> I think that it's important to note that 2 & 3, while distinct and
>>>> interesting components of the discussion, are in fact subsets of 1,
>>>> which could be rephrased in its general sense as "maximization of
>>>> utility" if you don't want to treat only the defined subset of
>>>> "human". :-)
>>>>
>>>> On Oct 28, 2006, at 1:30 PM, Daniel Reeves wrote:
>>>>
>>>>> Based on off-line discussion with my grandfather, I propose that
>>>>> there are only three fundamental principles worth fighting for in
>>>>> human society:
>>>>> 1. Social Welfare
>>>>> 2. Fairness
>>>>> 3. The Search for Knowledge
>>>>>
>>>>> (This started with an argument about the parental retort "who says
>>>>> life's supposed to be fair?")
>>>>>
>>>>> (1 and 2 are distinct because if we're all equally miserable, that's
>>>>> fair but not welfare maximizing. Likewise, of the methods for
>>>>> dividing a cake, for example, the method of "I get all of it"
>>>>> maximizes the sum of our utilities, but we nonetheless prefer
>>>>> splitting it in half.)
>>>>>
>>>>> Is there a number 4?
>>>>>
>>>>> --
>>>>> http://ai.eecs.umich.edu/people/dreeves - - search://"Daniel Reeves"
>>>>
>>>> David P. Morris, PhD
>>>> Senior Engineer, ElectroDynamic Applications, Inc.
>>>> morris @ edapplications.com, (734) 786-1434, fax: (734) 786-3235

--
http://ai.eecs.umich.edu/people/dreeves - - search://"Daniel Reeves"

Let us live!! Let us love!!
Let us share the deepest secrets of our souls!!
... You first.
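P.S. The cake example from the original post can be made concrete with a
tiny numerical sketch. The utility numbers below are invented purely for
illustration: assume, hypothetically, that person A happens to enjoy cake
twice as much per unit as person B. Then the sum-of-utilities criterion
(social welfare) hands A the whole cake, while an egalitarian "maximize the
worst-off person's utility" reading of fairness picks a more even split.

```python
# Hypothetical two-person cake split. A's per-unit utility for cake is 2,
# B's is 1 -- made-up numbers, just to separate the two criteria.
def utilities(share_a):
    """Return (A's utility, B's utility) given A's share of the cake."""
    return 2.0 * share_a, 1.0 * (1.0 - share_a)

def social_welfare(share_a):
    """Criterion 1: the sum of utilities (allocative efficiency)."""
    return sum(utilities(share_a))

def fairness(share_a):
    """Criterion 2, one formalization: the worst-off person's utility."""
    return min(utilities(share_a))

shares = [i / 100 for i in range(101)]  # candidate shares for A: 0.00..1.00

best_for_welfare = max(shares, key=social_welfare)   # "I get all of it"
best_for_fairness = max(shares, key=fairness)        # near 1/3 here
```

If A and B valued cake identically, the fairness criterion would pick
exactly the half-and-half split from the original post, and every split
would tie on the welfare criterion.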