• Sailor Sega Saturn@awful.systems
    1 year ago

    A lot of rationalism is just an intense fear of death. Simulation hypothesis? Means that maybe you can live forever if you’re lucky. Superintelligence? Means that your robot god might grant you immortality someday. Cryonics? Means there’s some microscopic chance that even if you die, you could be revived at some point in the future. Longtermism? Nothing could possibly matter besides whatever might someday make me immortal.

    I mean, don’t get me wrong, I’d give a lot for immortality, but I try to, uhh… stay grounded in reality.

      • self@awful.systemsM
        1 year ago

        don’t worry, folks like Yudkowsky have already taken care of substituting one god for another under rationalism’s name, in an attempt to grift atheists who miss the comfort of an afterlife to look forward to

        you can thank him by donating a tithe of all your money to effective altruism, or to whichever “AI alignment” org will save you from the Basilisk these days