How Likely Is Cryonics To Work?

http://lesswrong.com/lw/7sj/how_likely_is_cryonics_to_work/

Author: jkaufman, 25 September 2011

If an American signs up for cryonics and pays their ~$300/year, what are their odds of being revived? Talking to people at LessWrong meetups I've heard estimates of 1 in 2. My friend George Dahl, whose opinion I respect a lot, guesses "less than 1 in 10^6". Neither has given me reasons, so those numbers are opaque to me. My estimate of these odds pretty much determines whether I should sign up: I could afford $300/year, and I would pay it if I thought the odds were 1:2, but not if they were 1:10^6 [1].

In order to see how likely this is to work, we should look at the process [2]. I would sign up with a cryonics company and for life insurance. I'd go on living, enjoying my life and the people around me and paying my annual fees, until at some point I died. After death the cryonics people would drain my blood, replace it with something that doesn't rupture cell walls when it freezes, freeze me in liquid nitrogen, and leave me there for a long time. At some point, probably after the development of nanotechnology, people would revive me, probably as a computer program.

There are a lot of steps there, and it's easy to see ways they could go wrong [3]. Let's consider some cases and try to get probabilities [4]:

Update: the probabilities below are out of date, and only useful for understanding the comments. I've made a spreadsheet listing both my updated probabilities and those for as many other people as I can find: https://docs.google.com/spreadsheet/...


0.03 You mess up the paperwork, either for cryonics or life insurance

0.10 Something happens to you financially where you can no longer afford this

0.06 You die suddenly or in a circumstance where you would not be able to be frozen in time (see leading causes of death)

0.04 You die of something like Alzheimer's where the brain is degraded at death (Alzheimer's is much more common than brain cancer)

0.01 The cryonics company is temporarily out of capacity and cannot actually take you, perhaps because lots of people died at once

0.02 The life insurance company does not pay out, perhaps it's insolvent, perhaps it argues you're not dead yet

0.02 You die in a hospital that refuses access to you by the cryonics people

0.02 After death your relatives reject your wishes and don't let the cryonics people freeze you

0.10 Some law is passed that prohibits cryonics (before you're even dead)

0.20 The cryonics people make a mistake in freezing you (how do we know they don't make lots of mistakes?)

0.20 Not all of what makes you you is encoded in the physical state of the brain

0.50 The current cryonics process is insufficient to preserve everything

0.35 Other (there are always other things that can go wrong)

0.86 Something goes wrong in getting you frozen

0.30 All people die (nuclear war? comet strike? nanotech?)

0.20 Society falls apart (remember this is the chance that society will fall apart given that we did not see "all people die")

0.10 Some time after you die cryonics is outlawed

0.15 All cryonics companies go out of business

0.30 The cryonics company you chose goes out of business

0.05 Your cryonics company screws something up and you are defrosted (power loss, perhaps. Are we really expecting perfect operation for decades?)

0.30 Other

0.80 Something goes wrong in keeping you frozen

0.10 It is impossible to extract all the information preserved in the frozen brain

0.50 The technology is never developed to extract the information

0.30 No one is interested in my brain's information

0.40 It is too expensive to extract my brain's information

0.03 Reviving people in simulation is impossible

0.20 The technology is never developed to run people in simulation

0.10 Running people in simulation is outlawed

0.10 No one is interested in running me in simulation (even though they were interested enough to extract the necessary information from my frozen brain)

0.05 It is too expensive to run me in simulation (if we get this far I expect cheap powerful computers)

0.40 Other

0.93 Something goes wrong in reviving

0.05 Other

0.05 Something else goes wrong

Combined Probability of Failure: 99.82%

Odds of success: 1 in 567.


If you can think of other ways cryonics might fail, moving probability mass from "other" to something more quantifiable, that would be helpful. If you think my numbers are off for something, please let me know what a better number would be and why. This is not final.

Am I going about this right? Do people here who think it's rational to sign up for cryonics take a "the payoff is really high, so the small probability doesn't matter" view? Am I overly pessimistic about its chances of success?

[1] To figure out what odds I would accept, I think the right approach is to treat this as if I were signing up for something certain, see how much I would pay for that, and then see at what odds the expected value drops below the $300/year price. Even at 1:2 odds this is less effective than Village Reach at averting death [2], so this needs to come out of my 'money spent on me' budget. I think $10,000/year is about the most I'd be willing to spend. It's a lot, but not dying would be pretty nice. This means I'd need odds of about 1:33 to sign up.
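
Spelled out, the arithmetic is just: a certain version of this would be worth at most $10,000/year to me, the actual cost is about $300/year, so the break-even probability is roughly $300 / $10,000 = 0.03, i.e. about 1 in 33.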

[2] Counter-argument: you should care about quality-adjusted life years, not deaths averted. Someone revived should perhaps expect millennia of life at very high quality. But that seems less likely to me than the bare claim "will be revived". A lot less likely.

[3] In order to deal with independence issues, all my probability guesses are conditional on everything above them not happening. Since every one of these things must go right, this works. For example, society collapsing and my cryonics organization going out of business are very much not independent, so the probability assigned to the latter is the chance that society doesn't collapse but my organization goes out of business anyway. This means I can just multiply up the sub-elements to get probabilities for sections, and then multiply up the sections to get an overall probability.
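
As a sanity check on the arithmetic, here is a minimal sketch in Python that just multiplies up the numbers listed above (the grouping into sections follows the list; the exact figures will shift as the spreadsheet is updated):

```python
# Per-step failure probabilities from the list above.  Each one is
# conditional on everything above it not having happened (footnote 3),
# so multiplying the per-step success probabilities is legitimate.
getting_frozen = [0.03, 0.10, 0.06, 0.04, 0.01, 0.02, 0.02, 0.02,
                  0.10, 0.20, 0.20, 0.50, 0.35]
staying_frozen = [0.30, 0.20, 0.10, 0.15, 0.30, 0.05, 0.30]
reviving       = [0.10, 0.50, 0.30, 0.40, 0.03, 0.20, 0.10, 0.10,
                  0.05, 0.40]
something_else = [0.05]

def p_success(failure_probs):
    """Probability that none of these (conditional) failures happen."""
    total = 1.0
    for p in failure_probs:
        total *= 1.0 - p
    return total

overall = 1.0
for section in [getting_frozen, staying_frozen, reviving, something_else]:
    overall *= p_success(section)

print("frozen correctly: %.2f" % p_success(getting_frozen))  # ~0.14, i.e. 0.86 failure
print("stays frozen:     %.2f" % p_success(staying_frozen))  # ~0.20, i.e. 0.80 failure
print("revived:          %.2f" % p_success(reviving))        # ~0.07, i.e. 0.93 failure
print("overall success:  %.4f" % overall)                    # ~0.0018
print("1 in %.0f" % (1.0 / overall))                         # ~567
```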

[4] This has a lot in common with the Warren formula, which was inspired by the Drake equation. Robin Hanson also has a breakdown. I also found a breakdown on LessWrong that seems really optimistic.

EDIT 2011-09-26: jsalvatier suggested an online spreadsheet, which is very sensible. Created

EDIT 2011-09-27: I've updated my probabilities some, and made the updates on the spreadsheet.

Comments:

I took your estimates, and sorted them into categories. These are the categories I came up with and their total probability of failure, by your estimate:

0.90 Insurmountable technical obstacle (cryonics process at the time you die (not necessarily today) doesn't preserve everything, or technological development stops prior to development of molecular nanotech, or molecular nanotech doesn't do what we think it does and no substitutes exist)

0.74 Other

0.625 Society chooses to let you die or not resurrect you (companies go bankrupt and no one takes over the maintenance; or no one does the resurrection, conditional on no legal problems)

0.56 Societal collapse or human extinction

0.27 Cryonics or resurrection is banned (but I think this is strongly correlated with societal collapse - they're both caused by insanity)

0.27 You aren't actually frozen or your brain is badly damaged first

0.24 Cryonics companies screw up (improper freezing or later thawing)

0.2 You are not your brain

Overall probability of failure: 0.998

This is wildly, wildly pessimistic. Here are my estimates for the same categories:

0.3 Insurmountable technical obstacle

0.0 Other (this list is exhaustive)

0.05 Cryonics or resurrection is banned

0.2 Society chooses to let you die or not resurrect you

0.25 Societal collapse or human extinction

0.15 You aren't actually frozen or your brain is badly damaged first

0.1 Cryonics companies screw up

0.05 You are not your brain

Overall probability of failure: 0.71

.03 seems really high for messing up the paperwork. Sure, you might mess up the initial paperwork, but then it would be noticed and fixed.

.06 seems too low for chance of dying in a circumstance where they can't preserve you. Especially if one isn't very old, the chance of death from sudden trauma is much higher than other forms of death.

The Alzheimer's thing is overestimated. One can in many places (and the number of places is growing) engage in euthanasia. Even in the US people can directly take steps to drastically decrease their lifespans such as by self-starvation. Also, unless you die in very late-stage Alzheimer's, most of the information is likely to be intact. Alzheimer's also has a very large genetic component, so if no one in one's family got it one is probably safe.

I don't know why you think the cryonics company running out of capacity should be that likely - with so few people signed up, that simply isn't a serious risk.

The insurance company issues aren't a problem. There are companies now which have policies geared to cryonics. And if a company goes insolvent, unless this happens just when you are dying, switching to another insurance company should not be hard.

You seem to be underestimating the risk of anti-cryonics laws. There's already such a law in British Columbia.

You also seem to be assuming that extraction of information from the brain is the only possible option rather than direct revival.

I also don't know how you are combining these various probabilities. This could drastically alter the probability.

The law in BC only prohibits selling cryonic services, not getting frozen and transported elsewhere: clarification

.06 seems too low for chance of dying in a circumstance where they can't preserve you. Especially if one isn't very old, the chance of death from sudden trauma is much higher than other forms of death.

If I die now (age 25) then yes, it's most likely to be because of an accident. But I'm also unlikely to die now. Looking at overall causes of death, 5% of American deaths in 2007 were due to "accidents (unintentional injuries)". Another 0.8% were due to homicide. I assumed that within the 18.6% of "other" there might be ~0.2% of similar things, which adds up to about 6% and is where the 0.06 came from.

I could die of something like a heart attack while far from a hospital, though, so I should probably raise this probability to account for that. Do we know how common it is for people to die suddenly, from all causes (excluding accidents and homicide, which I'm already assuming are 100% no-freeze)?

Would 0.20 be better?

One can in many places (and the number of places is growing) engage in euthanasia. Even in the US people can directly take steps to drastically decrease their lifespans such as by self-starvation

I don't know if I would want to be euthanized / frozen if I thought my chances of revival were this low.

Alzheimer's also has a very large genetic component, so if no one in one's family got it one is probably safe.

I also don't know how you are combining these various probabilities. This could drastically alter the probability.

I convert all the probabilities of failure to probabilities of success by subtracting them from 1. Then I multiply them all and subtract the result from one:

P(failure) = 1 - P(success)

P(total_success) = P(success_a) * P(success_b) * P(success_c) * ...

P(total_failure) = 1 - P(total_success)


This seemed to me to be the only way to do it, so I didn't remark on it. Is this actually the right way to do it? Are there other ways that I should have considered? (I tried to deal with independence in footnote 3)
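
Concretely, a minimal sketch of that calculation, using just the four section-level failure numbers from the post (the per-item version works the same way; the small difference from 99.82% is rounding in the section totals):

```python
# Sketch: combining the four section-level failure probabilities
# from the post (0.86, 0.80, 0.93, 0.05).
section_failures = [0.86, 0.80, 0.93, 0.05]

p_total_success = 1.0
for p_failure in section_failures:
    p_total_success *= 1.0 - p_failure     # P(success) = 1 - P(failure)

print("%.4f" % p_total_success)            # ~0.0019
print("%.3f" % (1.0 - p_total_success))    # ~0.998, combined probability of failure
```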

Well, yeah, this works if they are all independent probabilities. But some of them are clearly not. For example, a lot of the worst-case post-preservation problems are likely to be correlated with each other (a lot of them have large-scale catastrophes as likely causes). Those should then reduce the chance of failure. But at the same time, other possibilities are essentially exclusive - say, dying from Alzheimer's or dying from traumatic brain injury at a young age. That sort of thing should result in an increased total probability.

Working out how these all interact might require a much more complicated model (you mentioned the Drake equation as an inspiration, and it is interesting to note that it runs into very similar issues). But I agree that as a very rough approximation, you can assume that everything is independent and probably not be too far off.

That's a really good set of points. This almost suggests that a sufficiently selfish cryonicist might want to optimize how popular cryonics becomes: popular enough to provide long-term security and pull, but not so popular as to be a target.

The other benefit would be on the revival side. My brain's information is more interesting the fewer peers I have from my own era. These revival problems are actually one of my larger concerns. I can't imagine why anyone would want to run an upload for more time than it would take to have a few conversations.

This works if they are all independent probabilities. But some of them are clearly not.

I tried to define them to be independent:

[3] "all my probability guesses are conditional on everything above them not happening"

So, my probabilities were supposed to be like:

P(failure1)

P(failure2|-failure1)

P(failure3|-failure2&-failure1)

P(failure4|-failure3&-failure2&-failure1)


In some cases they are probably unrelated. Then we can simplify:

P(society falls apart | nothing goes wrong in freezing you)

can be almost perfectly approximated as

P(society falls apart)
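
Putting it together, the multiplication is just the chain rule, so as long as each estimate really is conditional on everything above it, the product is exact:

P(success) = P(-failure1) * P(-failure2|-failure1) * P(-failure3|-failure2&-failure1) * ...

The approximation only comes in when I treat one of those conditional terms as if it were the unconditional one, like the "society falls apart" example above.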

Right, but some of them are clearly not independent. See my example of forms of death which are essentially exclusive.

I took your estimates, and sorted them into categories.

80% chance that you are so awesomely cool that future society wants to run your upload for a long time? I think it's at least as likely that just the most interesting ones are run a lot of times by a lot of different people who want to interact with those simulations.

Yes, we're talking about a future with fantastic amounts of computing resources so it wouldn't cost much, but one problem is that a lot of resources are going to be spent working out new projects to run; I'm sure there will always be more computing people would like to do than they actually can do.

Perhaps people signed up for cryonics should try to make themselves as interesting as possible as publicly as possible?

That is true, but this style of analysis is predicated on a sequence of steps, each one of which must succeed, and hence the more steps you have, the lower the end result probability must be; if you were just correcting for overestimation by analysis, then there ought to be analyses or points where one realizes one has been too pessimistic and increases the probability.

But, that can never happen with this kind of analysis: the small result is built into the conjunctions. If one realizes one was wrong in giving the probability of a particular factor, well, one can just 'fix' that by breaking it into some more sub-steps with <1 probability!

Also, some of my 86% for the first section (being frozen correctly) comes from things where we can't yet tell whether they will work out all right:

• Not all of what makes you you is encoded in the physical state of the brain

• The current cryonics process is insufficient to preserve everything

And then there are things that are not currently a problem but could become one:

• Some law is passed that prohibits cryonics

• You die in a hospital that refuses access to you by the cryonics people

Actual data on how often someone who signed up for cryonics is actually suspended, in what we think was the correct way, would be really helpful, though.

http://www.alcor.org/cases.html seems like a good starting point.

I did say 'fairly frequently'.... Nor does long involvement necessarily save one; Mike Darwin was rather angry at Ben Best over how he botched Curtis Henderson.

No, in this disjunction of conjunctions, the more details of any kind you add, the less likely a favorable outcome looks. If we expect reality to be unbiased, we should also expect some ratio of favorable to unfavorable details, which, ceteris paribus, should be maintained as we go to higher granularities of detail.

In other words, "motivated stopping" and "motivated continuation" should not, together, be a sufficient explanation for the results of an analysis.


The more details of any kind you add, the less likely a favorable outcome looks.

Say, I went into this thinking my chance of being frozen correctly was 95%. Now, with more details on what has to go right for this to happen, I think 86% is a better estimate. Details don't have to make things less favorable. They just usually do because we are optimistic.

The plan is not for "them" to revive us. The plan is that we, the cryonics community, will revive ourselves.

I think there's a decent chance that even if some of us are revived we won't have any ability to create anywhere near the economic value needed to revive others. We'd probably be pretty useless to the future, so that if reviving people is at all expensive the people revived first would not be able to continue the process.

You seem to be assigning a high probability to exotic problems (information isn't preserved by freezing, global apocalypse) and a low probability to mundane problems (you die of Alzheimer's, cryonics companies go out of business). The reverse seems more likely to me.

The nice thing about these mundane probabilities is that we have precedent and can calculate them. Not that many people die of Alzheimer's; it's pretty rare. We don't have a reason to think it's going to get more common, do we? But new technology I know little about could kill everyone. Or there could be something (electrical?) needed for the brain to work that freezing doesn't capture.

Without data that we don't have yet (actually reviving a vitrified brain, seeing what gets invented and how it is applied) I don't think it makes sense to assume bad outcomes are unlikely.

Cryonics hinges on more than just surviving all the small P values of danger like power outages or global extinctions. It means actively hoping that humanity gets through the singularity intact, gets to post-scarcity intact, and undergoes a moral revolution intact. It means society must have the knowledge, resources, and will to bring some stranger into the world with absolutely zero value other than novelty.

Cryonics relies on a very narrow bullseye of possible futures.