The probability here is one

Two orphaned sisters separated decades ago in South Korea have been reunited after being hired at the same hospital in Florida.

The women, now both in their 40s, were stunned to learn that they were related, having not seen each other since the early 1970s.

Both women had suffered tragic losses and spent time at orphanages in South Korea before being adopted by American families.

Because it’s happened, of course.

10 thoughts on “The probability here is one”

  1. So Much For Subtlety

    Two sisters met each other on Facebook the other day. Also East Asian. Also given up for adoption.

    Given the numbers of Korean girls dumped by their families – Korea used to be the biggest source of adoptees in the world – it is inevitable that many will end up in the US and will eventually bump into each other.
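
    As a rough sanity check on ‘inevitable’: a minimal back-of-the-envelope sketch in Python. Every number in it is an assumption invented for illustration, not data.

        # Chance that at least one separated sibling pair ends up in the
        # same workplace, if each pair lands independently and uniformly.
        def p_some_pair_collides(pairs: int, workplaces: int) -> float:
            p_single = 1.0 / workplaces              # P(one given pair collides)
            return 1.0 - (1.0 - p_single) ** pairs   # 1 - P(no pair collides)

        # e.g. 100,000 separated pairs spread over 1,000,000 workplaces:
        print(p_some_pair_collides(100_000, 1_000_000))  # ~0.095

    Even with deliberately cautious made-up numbers, ‘some pair, somewhere, eventually’ is not a freak event.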

  2. “The probability here is one”

    No, not really.

    Probability is always relative to the evidence, Tim. So the probability is one only relative to everything we know now (including the fact that those two are sisters and that they both went to work for the same hospital; basically your premises already include the conclusion, which is why the probability is one).

    Beforehand, however, the probability that this particular event would happen was not one.

    It’s like rolling two sixes and then saying ‘the probability of that was one, because it happened’. The probability of that particular event, relative to all we know now, is one, sure. But that doesn’t change the fact that before the roll, the probability that it would be two sixes that were rolled was 1/36.

    But, you might say, you mean that the probability that an event of this *kind* will happen is now one (as it has happened). That’s true, but that doesn’t change the fact that, before we knew these facts, this *particular* event was unlikely.

    (Although, as SMFS says, it wasn’t really _that_ unlikely.)

  3. I remember that when learning probability theory the main obstacle was the lousy English/muddled thinking of the lecturers. They seemed incapable of finding tenses and moods of verbs that clearly distinguished things that had already happened (Tim’s “probability one” events) from things that had yet to happen or might never happen. Once the penny had dropped I just went through my notes with a red pen correcting the English, and discovered to my delight that there wasn’t really much content at all to the introductory stuff. Easy-peasy. It was odd: I never came across the same difficulty in any other branch of maths.

    I did come across one vaguely comparable difficulty in chemistry, but that was less surprising: chemists, on the whole, are not clever in the way mathematicians are.

  4. Bloke in Costa Rica

    There is a big difference in statistics between probability and likelihood. Probability is prior, likelihood is post hoc. Hence it is meaningful to say, “I have a fair die; what is the probability that I roll three sixes in a row?” Having rolled three sixes with some unknown die, it is also meaningful to ask, “what is the likelihood that my die is fair?” Imprecision of language does indeed cause confusion. (There is a small numerical sketch of the two directions at the end of this comment.)

    There’s also an ongoing philosophical conundrum about whether certain types of counterfactual are meaningful. It is one thing to say, “if we keep on driving like this we are going to crash” (type A) and quite another to say, “if we had kept driving like that we would have crashed” (type B). It’s generally accepted that type A counterfactuals have both a truth value and a probability value (we carry on and we crash, or not; 7 times out of 10 in similar circumstances people crash), but type B counterfactuals only have a likelihood value (7 times out of 10 people who carried on like that crashed).
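
    On the first point, a minimal numerical sketch (the ‘loaded’ die and its bias are assumptions invented purely for contrast):

        from fractions import Fraction

        # Forward direction (probability): fix the model, ask about the data.
        p_six_fair = Fraction(1, 6)
        print(p_six_fair ** 3)            # P(three sixes | fair die) = 1/216

        # Backward direction (likelihood): fix the data, compare models.
        p_six_loaded = Fraction(1, 2)     # assumed loaded die: a six half the time
        lik_fair = p_six_fair ** 3        # L(fair | three sixes)   = 1/216
        lik_loaded = p_six_loaded ** 3    # L(loaded | three sixes) = 1/8
        print(lik_loaded / lik_fair)      # 27 -- the data favour 'loaded' 27:1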

  5. “There is a big difference in statistics between probability and likelihood. Probability is prior, likelihood is post hoc.”

    You’re thinking of the distinction between ‘probability’ and ‘belief’. A belief function Bel(p) assigns a number from 0 to 1 to each proposition, measuring the expert’s belief in the proposition, and provides a calculus for computing the belief in associated propositions like Bel(p and q) or Bel(not p) from knowledge of Bel(p) and Bel(q).

    One particular flavour of belief function is called ‘Bayesian Belief’ and basically reproduces the axioms of probability. Thus, the Bayesian Belief in a proposition is the expert’s judgement as to the probability of the proposition being true, on the assumption that the set of beliefs can be interpreted as a consistent set of probabilities and manipulated with the same algebra as probabilities. However, unlike probabilities, Bayesian Beliefs are subjective – their value depends on how much you know, and can be different for different people simultaneously.

    A likelihood is a conditional probability that reverses the roles of the condition and the outcome. Thus a conditional probability might tell you P(double-six|fair dice), while the corresponding likelihood is L(fair dice|double-six). It’s numerically the same function with the variables swapped round, but it’s not strictly a probability any more because it’s no longer normalised to add up to 1. (There’s a sketch of this at the end of this comment.)

    “There’s also an ongoing philosophical conundrum about whether certain types of counterfactual are meaningful.”

    As a believer in the Everett-Wheeler interpretation of Quantum Mechanics (aka Many Worlds, although that’s really a misnomer), I don’t find this to be a difficulty. Everything that can happen, does happen. Probabilities count ‘how many worlds’ it happens in (to put it non-technically). So counterfactuals are real and probabilities are objective.

    Not everyone subscribes, though.
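
    To see the ‘no longer normalised’ point concretely, a sketch (the loaded die is again an invented assumption):

        from fractions import Fraction

        # For a FIXED model, conditional probabilities over outcomes sum to 1:
        p_fair = {pair: Fraction(1, 36) for pair in range(36)}  # 36 ordered outcomes
        assert sum(p_fair.values()) == 1

        # The likelihood reads the same numbers the other way round: a FIXED
        # outcome (double-six) and a varying model. No reason to sum to 1:
        lik = {
            "fair":   Fraction(1, 6) ** 2,   # L(fair | double-six)   = 1/36
            "loaded": Fraction(1, 2) ** 2,   # L(loaded | double-six) = 1/4 (assumed)
        }
        print(sum(lik.values()))             # 5/18, not 1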

  6. Bloke in Costa Rica

    Yes, I’m talking specifically about a likelihood function as in the post-hoc estimation of a model’s parameters given an outcome or set of outcomes, as opposed to a probability density or mass function producing an estimate of the expected values given the model’s parameters.

  7. BiCR,

    Sorry, I messed up my notation a bit.

    The Bayesian inference rule says:
    P(H|E) = [ P(E|H)/P(E) ] P(H)
    P(H) is the prior distribution – your belief in the hypothesis before the evidence is seen.
    P(H|E) is the posterior distribution – your belief in the hypothesis after the evidence is seen.

    And P(E|H), when considered as a function of H (a different function for each possible value of the evidence E), is the likelihood. A better way to write it would be L_E(H) (that’s “L subscript E”).

    The probability of double-six given fair dice is what we mean by the ‘likelihood function’ (the one specific to the case of a double-six) of fair dice. Likelihood isn’t a synonym for probability.

    It’s a bit counter-intuitive, which of course is mathematician-speak for ‘weird’.

    Anyway, what I meant was that it’s the posterior distribution that is post-hoc, rather than the likelihood.
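
    A minimal worked update along these lines (the 50/50 prior and the loaded die’s bias are assumptions for illustration only):

        from fractions import Fraction

        prior = {"fair": Fraction(1, 2), "loaded": Fraction(1, 2)}  # P(H), assumed
        likelihood = {                                              # P(E|H), i.e. L_E(H)
            "fair":   Fraction(1, 6) ** 2,   # double-six from fair dice:           1/36
            "loaded": Fraction(1, 2) ** 2,   # double-six from assumed loaded dice: 1/4
        }

        # P(E) = sum over H of P(E|H) P(H)
        p_e = sum(likelihood[h] * prior[h] for h in prior)

        # P(H|E) = [ P(E|H) / P(E) ] P(H) -- the posterior, the genuinely post-hoc bit
        posterior = {h: likelihood[h] * prior[h] / p_e for h in prior}
        print(posterior)  # {'fair': Fraction(1, 10), 'loaded': Fraction(9, 10)}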

  8. Almost the exact same thing happened in China to two orphaned sisters during WW2. In fact, IIRC, they also ended up reuniting at a hospital or some similar institution.

  9. Bloke in Costa Rica

    “The probability of double-six given fair dice is what we mean by the ‘likelihood function’”

    No it ain’t. The likelihood of a die being fair given two sixes rolled is what “we” mean by the likelihood function. What you are describing is the probability mass function for two throws of a fair die (PMF rather than PDF because it’s a discrete distribution).

  10. “The likelihood of a die being fair given two sixes rolled is what “we” mean by the likelihood function.”

    That one’s the posterior distribution.

    But whatever.
