
Matt McAlister's List: Less Wrong 7 - Fun Theory

    • They're just there to collect credit for the Deep Wisdom of asking the question.  It's enough to imply that the question is unanswerable, and therefore, we should all drop dead.
    • The primary experimental result in hedonic psychology - the study of happiness - is that people don't know what makes them happy.

    9 more annotations...

    • So this is the ultimate end of the prophecy of technological progress - just staring at a screen that says "YOU WIN", forever.
    • eternal laziness "sounds like good news" to your present self who still has to work.

    7 more annotations...

    • It's not a worthy use of a human-level intelligence.
    • How large is Fun Space?

    10 more annotations...

    • Is there enough fun in the universe, sufficiently accessible, for a transhuman to jog off the hedonic treadmill - improve their life continuously, at a sufficient rate to leap to an even higher hedonic level before they had a chance to get bored with the previous one?
    • Robin, in "For Discount Rates", pointed out that an investment earning 2% annual interest for 12,000 years adds up to a googol (10^100) times as much wealth; therefore, "very distant future times are ridiculously easy to help via investment".  (A quick check of this arithmetic appears below.)

    7 more annotations...
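
    A quick sanity check of Robin's arithmetic, sketched in Python - the 2% rate and the 12,000-year horizon come straight from the quote; everything else is illustrative:

        # Does 2% annual interest, compounded for 12,000 years,
        # really reach a googol (10^100)?
        import math

        years = 12_000
        rate = 0.02

        orders = years * math.log10(1 + rate)
        print(f"growth factor ~= 10^{orders:.1f}")  # ~10^103.2

        # 10^103 comfortably exceeds a googol, so the quoted claim
        # holds (it is, if anything, slightly conservative).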

    • Failing to escape the computer tiger would also have fewer long-term consequences than failing to escape a biological tiger - it would be less a part of the total story of your life - meaning you're also likely to be less emotionally involved.
    • And there's the sense of touch that indicates the wind on your skin; and the proprioceptive sensors that respond to the position of your limbs; and the nerves that record the strain on your muscles.  There's a whole strip of sensorimotor cortex running along the top of your brain, that would be much more intensely involved in "real" running.

    5 more annotations...

    • I don't know if I'd call agriculture a mistake.  But one should at least be aware of the downsides.
    • In Guessing the Teacher's Password I talked about the (well-known) problem in which schools end up teaching verbal behavior rather than real knowledge.  In Truly Part of You I suggested one test for false knowledge:  Imagine deleting a fact from your mind, and ask if it would grow back.

    7 more annotations...

    • jurisprudence constante:  The legal system must above all be predictable, so that people can execute contracts or choose behaviors knowing the legal implications.
    • Judges are not necessarily there to optimize, like an engineer.  The purpose of law is not to make the world perfect.  The law is there to provide a predictable environment in which people can optimize their own futures.

    8 more annotations...

    • a rational agent shouldn't do worse by having more options.  The more available actions you have, the more powerful you become - that's how it should work.
    • the pain of losing something is between 2 and 2.5 times as great as the joy of gaining it (a small numerical sketch of this ratio appears below)

    6 more annotations...
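
    The 2-to-2.5 ratio quoted above is the loss-aversion coefficient from prospect theory.  Here is a minimal Python sketch of a value function with that property - the coefficient comes from the quote, and the piecewise-linear shape is a deliberate simplification of Kahneman and Tversky's curve:

        # Loss-averse value function: losses weigh LAMBDA times as
        # much as equal-sized gains.
        LAMBDA = 2.25  # midpoint of the quoted 2-2.5 range

        def subjective_value(change):
            """Felt value of gaining (+) or losing (-) `change` units."""
            return change if change >= 0 else LAMBDA * change

        print(subjective_value(100))   # a 100-unit gain feels like +100
        print(subjective_value(-100))  # a 100-unit loss feels like -225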

    • Helion had leaned and said, "Son, once you go in there, the full powers and total command structures of the Rhadamanth Sophotech will be at your command.  You will be invested with godlike powers; but you will still have the passions and distempers of a merely human spirit.  There are two temptations which will threaten you.  First, you will be tempted to remove your human weaknesses by abrupt mental surgery.  The Invariants do this, and to a lesser degree, so do the White Manorials, abandoning humanity to escape from pain.  Second, you will be tempted to indulge your human weakness.  The Cacophiles do this, and to a lesser degree, so do the Black Manorials.  Our society will gladly feed every sin and vice and impulse you might have; and then stand by helplessly and watch as you destroy yourself; because the first law of the Golden Oecumene is that no peaceful activity is forbidden.  Free men may freely harm themselves, provided only that it is only themselves that they harm."
    • Helion looked sardonic.  "'Mistake' is such a simple word.  An adult who suffers a moment of foolishness or anger, one rash moment, has time enough to delete or destroy his own free will, memory, or judgment.  No one is allowed to force a cure on him.  No one can restore his sanity against his will.  And so we all stand quietly by, with folded hands and cold eyes, and meekly watch good men annihilate themselves.  It is somewhat... quaint... to call such a horrifying disaster a 'mistake.'"

    16 more annotations...

    • if you create an AI and tell it to model the world around it, it may form models of people that are people themselves.  Not necessarily the same person, but people nonetheless.
    • The model that most precisely predicts these facts, may well be a 'simulation' detailed enough to be a person in its own right.

    4 more annotations...

    • Creating a true child is the only moral and metaethical problem I know that is even harder than the shape of a Friendly AI.  I would like to be able to create Friendly AI while worrying just about the Friendly AI problems, and not worrying whether I've created someone who will lead a life worth living.  Better by far to just create a Very Powerful Optimization Process, if at all possible.
    • Depending on how you look at it, either no intelligence has 'free will', or anything that simulates alternative courses of action has 'free will'.

    8 more annotations...

    • And suppose that, while these AIs did care for one another, and cared about themselves, and cared how they were treated in the eyes of society -

       

      - these trillions of people also cared, very strongly, about making giant cheesecakes.

    • Would we be right to go on trying to seize the destiny of the galaxy - to make of it a place of peace, freedom, art, aesthetics, individuality, empathy, and other components of humane value?

       

      Or should we be content to have the galaxy be 0.1% eudaimonia and 99.9% cheesecake?

    2 more annotations...

    • The only desire the Culture could not satisfy from within itself was one common to both the descendants of its original human stock and the machines they had (at however great a remove) brought into being: the urge not to feel useless.  The Culture's sole justification for the relatively unworried, hedonistic life its population enjoyed was its good works; the secular evangelism of the Contact Section, not simply finding, cataloguing, investigating and analysing other, less advanced civilizations but - where the circumstances appeared to Contact to justify so doing - actually interfering (overtly or covertly) in the historical processes of those other cultures.
    • Iain Banks is the one to beat

    10 more annotations...

    • Dunbar himself did another regression and found that a community of 150 primates would have to spend 43% of its time on social grooming; Dunbar interpreted this as suggesting that 150 was an upper bound rather than an optimum, a size reached only when groups were highly incentivized to stay together.
    • LWA suggests that community satisfaction has two peaks, one at size ~7 for simple groups, and one at ~60 for complex groups; and that any community has to fraction, one way or another, by the time it approaches Dunbar's Number.  (A toy illustration of how relationship counts scale with these group sizes appears below.)

    9 more annotations...
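
    One intuition for why social upkeep climbs so steeply near Dunbar's Number: the number of pairwise relationships grows quadratically with group size.  The sketch below is a back-of-envelope illustration using the group sizes quoted above, not Dunbar's actual regression on primate grooming data:

        # Pairwise relationships in a group of n people: n choose 2.
        def pairwise_relationships(n):
            return n * (n - 1) // 2

        for size in (7, 60, 150):  # sizes from the annotations above
            print(size, pairwise_relationships(size))
        # 7 -> 21, 60 -> 1770, 150 -> 11175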

    • I'm worried about the prospect of nonsentient romantic partners
    • In a nutshell - sex/romance, as we know it now, is a primary dimension of multiplayer fun.  If you take that fun and redirect it to something that isn't socially entangled, if you turn sex into an exclusively single-player game, then you've just made life that much simpler - in the same way that eliminating boredom or sympathy or values over nonsubjective reality or individuals wanting to navigate their own futures, would tend to make life "simpler".  When I consider how easily human existence could collapse into sterile simplicity, if just a single major value were eliminated, I get very protective of the complexity of human existence.

    6 more annotations...

    • Terrence Deacon's The Symbolic Species is the best book I've ever read on the evolution of intelligence.
    • It's not just a question of increased computing capacity, like adding extra processors onto a cluster; it's a question of what kind of signals dominate, in the brain.

    14 more annotations...

    • I think that the difficulty and danger of fiddling with emotions is oft-underestimated.
    • I'm sorry, but there are some things that are much more complicated to actually do than to rattle off as short English phrases, and "changing sex" has to rank very high on that list.  Even if you're omnipotent so far as raw ability goes, it's not like people have a binary attribute reading "M" or "F" that can be flipped as a primitive action.

    9 more annotations...

    • "Stories are about people's pain."  Orson Scott Card.

       

      "Every scene must end in disaster."  Jack Bickham.

    • You simply don't optimize a story the way you optimize a real life.  The best story and the best life will be produced by different criteria.

    13 more annotations...

    • Movies that were made in say the 40s or 50s, seem much more alien - to me - than modern movies allegedly set hundreds of years in the future, or in different universes.  Watch a movie from 1950 and you may see a man slapping a woman.  Doesn't happen a lot in Lord of the Rings, does it?  Drop back to the 16th century and one popular entertainment was setting a cat on fire.  Ever see that in any moving picture, no matter how "lowbrow"?
    • I am saying that the Future would discomfort us because it is better.

    8 more annotations...

    • Utopia and Dystopia have something in common: they both confirm the moral sensibilities you started with.
    • If you're trying to do Fun Theory, you have to come up with a Weirdtopia that's at least arguably-better than Utopia.  This is harder but also directs you to more interesting regions of the answer space.