Date: Mon, 06 Nov 2006 18:29:08 -0500
To: improvetheworld@umich.edu
From: Matt Rudary
Subject: Re: social welfare + fairness + knowledge

I meant to put this out here before: Fairness, or at least justice, is not in fact part of maximizing social welfare. Perhaps that was too strong a statement, but here is another shot: A social system that maximizes the sum of the welfare of the individuals in the society is not just.
For instance, assuming A) torture is an effective means of obtaining information, and B) the standard ticking-time-bomb thought experiment, in which getting information from a terrorist would save lives (but only if you get it soon), in such a society torture of one individual would be justified. I know that A is not necessarily a valid assumption, but I would oppose torture even if it were. In general, such a society allows the *premeditated and intentional* sacrifice of a small population *against their will* to benefit a larger population. So ensuring fairness is separate from, though not orthogonal to, maximizing social welfare.

Matt

Daniel Reeves wrote:
> That's another tricky thing about maximizing social welfare (synonymous
> with maximizing utility, as Dave notes) -- deciding how to include
> nonhumans in the equation. You have to include animals' utility in some
> way, otherwise it would be ethically A-OK to torture animals for fun.
> Or maybe it suffices that there are *people* who get disutility from the
> torture of animals. For example, if we had a yootles auction to decide
> whether to kill a puppy, we wouldn't need the puppy's participation to
> decide not to do it.
>
> That puts me tentatively in the "animals don't count" camp. Anyone else?
>
> (I disagree with Dave that 2 & 3 are subsets of 1. Splitting utility
> equally is often more important than maximizing the sum of utilities.
> For example, it's not OK to steal money from someone who doesn't need it
> as much as you.)
>
> (And knowledge, truth, and scientific understanding are intrinsically
> valuable, beyond their applicability to improving social welfare. But
> perhaps my own strong feelings about this undermine my own point. In
> other words, maybe we don't need to include it for the same reason we
> don't need to include animal welfare.)
>
> --- \/ FROM Dave Morris AT 06.10.30 11:25 (Oct 30) \/ ---
>
>> I think that it's important to note that 2 & 3, while distinct and
>> interesting components of the discussion, are in fact subsets of 1,
>> which could be rephrased in its general sense as "maximization of
>> utility" if you don't want to treat only the defined subset of
>> "human". :-)
>>
>> On Oct 28, 2006, at 1:30 PM, Daniel Reeves wrote:
>>
>>> Based on off-line discussion with my grandfather, I propose that
>>> there are only three fundamental principles worth fighting for in
>>> human society:
>>> 1. Social Welfare
>>> 2. Fairness
>>> 3. The Search for Knowledge
>>>
>>> (This started with an argument about the parental retort "who says
>>> life's supposed to be fair?")
>>>
>>> (1 and 2 are distinct because if we're all equally miserable, that's
>>> fair but not welfare maximizing. Likewise, of the methods for
>>> dividing a cake, for example, the method of "I get all of it"
>>> maximizes the sum of our utilities, but we nonetheless prefer
>>> splitting it in half.)
>>>
>>> Is there a number 4?
>>>
>>> --
>>> http://ai.eecs.umich.edu/people/dreeves - - search://"Daniel Reeves"
>>
>> David P. Morris, PhD
>> Senior Engineer, ElectroDynamic Applications, Inc.
>> morris@edapplications.com, (734) 786-1434, fax: (734) 786-3235
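
P.S. The sum-vs-fairness tension in this thread can be made concrete with a toy calculation. All the utility numbers below are invented for illustration (nothing in the thread specifies them): a policy that maximizes the plain sum of utilities can be exactly the one that sacrifices a single person, while a fairness-leaning criterion such as maximin (or the Nash product, for the cake example) picks the even-handed alternative.

```python
# Toy welfare comparison with made-up utility numbers (illustrative only).

def util_sum(utilities):
    """Utilitarian social welfare: the plain sum of individual utilities."""
    return sum(utilities)

def maximin(utilities):
    """Rawlsian criterion: the welfare of the worst-off individual."""
    return min(utilities)

# Policy A sacrifices person 0 to benefit the other four;
# policy B treats everyone comparably.
policy_a = [-50, 30, 30, 30, 30]   # sum = 70, worst-off = -50
policy_b = [10, 12, 11, 12, 11]    # sum = 56, worst-off = 10

assert max([policy_a, policy_b], key=util_sum) is policy_a  # sum picks the sacrifice
assert max([policy_a, policy_b], key=maximin) is policy_b   # maximin refuses it

# The cake example shows the same tension: if I value cake at 1.0 per unit
# and you at 0.8 (assumed numbers), "I get all of it" maximizes the sum,
# while the Nash product of our utilities peaks at the 50/50 split.
def cake_utils(my_share):
    return 1.0 * my_share, 0.8 * (1 - my_share)

shares = [i / 100 for i in range(101)]
best_by_sum = max(shares, key=lambda s: sum(cake_utils(s)))
best_by_nash = max(shares, key=lambda s: cake_utils(s)[0] * cake_utils(s)[1])
print(best_by_sum, best_by_nash)  # 1.0 0.5
```

With unequal linear valuations the sum is maximized by giving everything to whoever values the cake most, so a sum-maximizing society is indifferent to the distribution that most people would call fair; that is the gap Matt's argument points at.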