Date: Tue, 07 Nov 2006 17:44:17 -0500
From: Matt Rudary
To: improvetheworld@umich.edu
Subject: Re: social welfare + fairness + knowledge

I don't have a whole lot to add to this conversation, as my reading on
the subject of ethics is woefully thin, but a starting point for this
discussion may be found at
http://en.wikipedia.org/wiki/Utilitarianism#Criticism_and_defense_of_utilitarianism

Matt

Dave Morris wrote:
> I put forward the somewhat controversial point that sometimes
> absolutely horrible things are in fact the ethically correct course of
> action.
> We live in a universe that doesn't care whether we live or die; people
> suffer regularly as a part of life. Furthermore, we live as a species
> filled with people who are willing to commit atrocities, because they
> are mentally broken, either by genetics or by what has been done to
> them. This is reality. That one can posit a situation that requires
> one to do horrible things in response to this reality does not mean
> that one's ethical code is flawed. It means that our universe is
> flawed (if bad things happening were the definition of a flaw).
>
> In the extreme and unrealistic ticking-bomb situation - where a) I
> know the person I have captive knows the answer I need, b) I know that
> the threat is real (but for some reason don't know where the bomb
> is?), and c) I know that torture is the only way to get the
> information - of course I'd torture the person for information; you'd
> be a fool not to. But in the real world, a, b, and c are never true.
> And in the real world, I would happily sign a universal ban on
> torture, even though I admit to my first assertion. In reality, people
> won't wait to see that a, b, and c are true; they'll use torture more
> and more often, for more and more trivial reasons, and many, many
> people will suffer all the time, which is a greater cost than the very
> low-probability event of losing New York City because you failed to
> torture the right person at the right time. Even knowing that we live
> in a world where our government can take almost anyone, almost any
> time, and disappear them to Guantanamo Bay, and do whatever they want
> to them there without oversight or regulation, is a huge cost to me.
> It really bothers me, and it didn't even happen to me or anyone I
> know. That's the realistic consideration of any utilitarian argument
> about torture: the realistic costs in the realistic situations.
> Maybe the law should be that torturing a subject for information is a
> capital offense, and that this would be universally applied,
> regardless of outcome. In which case I would still commit to my
> assertion at the beginning of this paragraph - any rational, ethical
> person would.
>
> So no, utilitarianism is not broken because it can be used to justify
> torture.
>
> And I think Erik put it well - fairness etc. are adequately captured
> by utilitarianism as well: since fairness is important to people, it
> provides them with utility.
>
> The value of this argument is that we accept that the basis for
> argument about such topics should be the overall utility of the
> decision. So when we decide whether or not to pass a law banning
> torture, or requiring hotels to put mints on pillows, we can talk
> about the utility it will provide, and remove, to how many, and to
> whom, and thus come to agreement on the best course of action. That's
> far more useful than talking about what feels right, or what God says
> we should do, or most other decision-making processes I've seen.
>
> Just my thoughts - you can tell I've gone over this argument more than
> once before. :-)
>
> Dave
>
> On Nov 6, 2006, at 5:53 PM, Daniel Reeves wrote:
>
>> That's another tricky thing about maximizing social welfare
>> (synonymous with maximizing utility, as Dave notes) -- deciding how
>> to include nonhumans in the equation. You have to include animals'
>> utility in some way; otherwise it would be ethically A-OK to torture
>> animals for fun.
>> Or maybe it suffices that there are *people* who get disutility from
>> the torture of animals. For example, if we had a yootles auction to
>> decide whether to kill a puppy, we wouldn't need the puppy's
>> participation to decide not to do it.
>>
>> That puts me tentatively in the "animals don't count" camp. Anyone
>> else?
>>
>> (I disagree with Dave that 2 & 3 are subsets of 1.
Splitting utility
>> equally is often more important than maximizing the sum of
>> utilities. For example, it's not OK to steal money from someone who
>> doesn't need it as much as you do.)
>>
>> (And knowledge, truth, and scientific understanding are intrinsically
>> valuable, beyond their applicability to improving social welfare. But
>> perhaps my own strong feelings about this undermine my own point. In
>> other words, maybe we don't need to include it for the same reason we
>> don't need to include animal welfare.)
>>
>>
>> --- \/ FROM Dave Morris AT 06.10.30 11:25 (Oct 30) \/ ---
>>
>>> I think that it's important to note that 2 & 3, while distinct and
>>> interesting components of the discussion, are in fact subsets of 1,
>>> which could be rephrased in its general sense as "maximization of
>>> utility" if you don't want to treat only the defined subset of
>>> "human". :-)
>>>
>>> On Oct 28, 2006, at 1:30 PM, Daniel Reeves wrote:
>>>
>>>> Based on off-line discussion with my grandfather, I propose that
>>>> there are only three fundamental principles worth fighting for in
>>>> human society:
>>>> 1. Social Welfare
>>>> 2. Fairness
>>>> 3. The Search for Knowledge
>>>> (This started with an argument about the parental retort "who says
>>>> life's supposed to be fair?")
>>>>
>>>> (1 and 2 are distinct because if we're all equally miserable,
>>>> that's fair but not welfare-maximizing. Likewise, of the methods
>>>> for dividing a cake, for example, the method of "I get all of it"
>>>> maximizes the sum of our utilities, but we nonetheless prefer
>>>> splitting it in half.)
>>>> Is there a number 4?
>>>> --
>>>> http://ai.eecs.umich.edu/people/dreeves - - search://"Daniel Reeves"
>>> David P. Morris, PhD
>>> Senior Engineer, ElectroDynamic Applications, Inc.
>>> morris@edapplications.com, (734) 786-1434, fax: (734) 786-3235
>>>
>>>
>>
>> --
>> http://ai.eecs.umich.edu/people/dreeves - - search://"Daniel Reeves"
>>
>> "Lassie looked brilliant in part because the farm family she lived
>> with was made up of idiots. Remember? One of them was always
>> getting pinned under the tractor and Lassie was always rushing
>> back to the farmhouse to alert the other ones. She'd whimper and
>> tug at their sleeves, and they'd always waste precious minutes
>> saying things like: "Do you think something's wrong? Do you think
>> she wants us to follow her? What is it, girl?", etc., as if this
>> had never happened before, instead of every week. What with all the
>> time these people spent pinned under the tractor, I don't see how
>> they managed to grow any crops whatsoever. They probably got by on
>> federal crop supports, which Lassie filed the applications for."
>> -- Dave Barry
> David P. Morris, PhD
> Senior Engineer, ElectroDynamic Applications, Inc.
> morris@edapplications.com, (734) 786-1434, fax: (734) 786-3235
>