Absolutely wonderful numerical phenomenon. Who can explain?


I was doing some software engineering and wanted a background thread to waste CPU time for a certain test. While I could have done something really boring like for(i < 10000000) j = 2 * i, I ended up having the program start with $1$ and then, for a million steps, choose a random real number $r$ uniformly from the interval $[0,R]$ and multiply the running result by $r$ at each step. When $R = 2$, the result converged to $0$. When $R = 3$, it exploded to infinity. So of course, the question anyone with a modicum of curiosity would ask: for what $R$ does the transition occur? I then tried the first number between $2$ and $3$ that we would all think of, Euler's number $e$, and sure enough, the conjecture was right. I would love to see a proof of this.



Now, when I should be working, I'm instead wondering about the behavior of this script. Ironically, rather than wasting my CPU's time, I'm wasting my own. But it's a beautiful phenomenon. I don't regret it. :)
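For anyone who wants to reproduce this, here is a minimal Python sketch of the experiment described above (the names are mine, not from the original post). It accumulates the product in log space, since the direct product underflows or overflows a float long before a million steps:

```python
import math
import random

def log_final_product(R, steps=1_000_000, seed=0):
    """Start from 1 and multiply by `steps` draws of Uniform(0, R).
    Returns the log of the final product (the product itself
    under/overflows floating point, so we sum logs instead)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(steps):
        total += math.log(rng.uniform(0.0, R))
    return total

# Large negative log => product collapsed toward 0;
# large positive log => it exploded to infinity.
for R in (2.0, math.e, 3.0):
    print(f"R = {R:.5f}: log(final product) ~ {log_final_product(R):12.0f}")
```

Running this shows a huge negative value at $R = 2$, a huge positive value at $R = 3$, and a comparatively tiny value at $R = e$.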














If the threshold really is $e$, I'm ready for my mind to be blown.
– littleO, 9 hours ago

















probability stochastic-processes






edited 9 hours ago by Sil · asked 9 hours ago by Jake Mirra










2 Answers


















EDIT: I saw that you solved it yourself. Congrats! I'm posting this anyway because I was most of the way through typing it when your answer hit. :)



Infinite products are hard, in general; infinite sums are better, because we have lots of tools at our disposal for handling them. Fortunately, we can always turn a product into a sum via a logarithm.



Let $X_i \sim \operatorname{Uniform}(0, r)$, and let $Y_n = \prod_{i=1}^n X_i$. Note that $\log(Y_n) = \sum_{i=1}^n \log(X_i)$. The eventual emergence of $e$ as important is already somewhat clear, even though we haven't really done anything yet.



The more useful formulation here is $\frac{\log(Y_n)}{n} = \frac{1}{n} \sum \log(X_i)$, because we know from the Strong Law of Large Numbers that the right side converges almost surely to $\mathbb{E}[\log(X_i)]$. We have
$$\mathbb{E}[\log(X_i)] = \int_0^r \log(x) \cdot \frac{1}{r} \, \mathrm{d}x = \frac{1}{r} \big[ x \log(x) - x \big] \Big|_0^r = \log(r) - 1.$$



If $r < e$, then $\log(Y_n)/n \to c < 0$, which implies that $\log(Y_n) \to -\infty$, hence $Y_n \to 0$. Similarly, if $r > e$, then $\log(Y_n)/n \to c > 0$, whence $Y_n \to \infty$. The fun case is: what happens when $r = e$?
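A quick Monte Carlo sanity check of that limit (my sketch, not part of the original answer): the sample mean of $\log(X_i)$ should settle near $\log(r) - 1$, crossing zero at $r = e$.

```python
import math
import random

def mean_log(r, n=500_000, seed=42):
    """Estimate E[log(X)] for X ~ Uniform(0, r) by simulation.
    The SLLN says this should approach log(r) - 1."""
    rng = random.Random(seed)
    return sum(math.log(rng.uniform(0.0, r)) for _ in range(n)) / n

for r in (2.0, math.e, 3.0):
    print(f"r = {r:.5f}: simulated {mean_log(r):+.4f}   exact {math.log(r) - 1:+.4f}")
```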






edited 8 hours ago · answered 9 hours ago by Aaron Montgomery




I accepted your answer, as it is an excellent explanation! Thank you for taking the time!
– Jake Mirra, 9 hours ago










Was thinking a bit about your question of "what happens when r = e", and all I can say is that, once you look at it on a logarithmic scale, it's a weird, sort of lopsided random walk through the reals where you sometimes take giant steps backwards and then lots of small steps forward.
– Jake Mirra, 8 hours ago










Yep! And you can convince yourself that even though those increments are unbounded (on the negative side), they still have a finite variance...
– Aaron Montgomery, 8 hours ago










@AaronMontgomery - what happens when $r = e$? I am not good with the details of probability theory. Does $Y_n$ converge (to $1$) or does it not converge? And what has the finite variance (of $\log X_i$) got to do with it? Intuitively I would guess the sequence does not converge, but your mention of finite variance seems to hint that it would...
– antkam, 6 hours ago











When $r = e$, the fact that we are taking an average of the $\log(X_i)$ variables (which have finite variance) means that we can use the Central Limit Theorem. It implies that $\sqrt{n}\,\overline{X}$ converges (in distribution only, NOT almost surely) to a normal variable with mean $0$ and variance $\sigma^2$ (i.e., the variance of $\log(X_i)$), so $\log(Y_n)/\sqrt{n}$ does the same. Consequently, $Y_n$ just becomes diffuse, and on individual realizations it will wander, much like an ordinary random walk.
– Aaron Montgomery, 6 hours ago


















I found the answer! One starts with the uniform distribution on $[0,R]$. The natural logarithm pushes this distribution forward to a distribution on $(-\infty, \ln(R)]$ with density function $p(y) = e^y / R$ for $y \in (-\infty, \ln(R)]$. The expected value of this distribution is $\int_{-\infty}^{\ln(R)} \frac{y\,e^y}{R}\, dy = \ln(R) - 1$. Setting this equal to zero gives the answer to the riddle! Love it!
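That expected value can be confirmed numerically (a sketch of mine using simple trapezoidal quadrature, truncating the integral's lower limit; the helper name is hypothetical):

```python
import math

def expected_log(R, lo=-40.0, m=200_000):
    """Trapezoidal approximation of the pushforward mean
    integral of y * e^y / R over (-inf, ln R], truncated at `lo`.
    Should match ln(R) - 1; the tail below `lo` is negligible."""
    hi = math.log(R)
    h = (hi - lo) / m
    total = 0.0
    for i in range(m + 1):
        y = lo + i * h
        w = 0.5 if i in (0, m) else 1.0  # trapezoid endpoint weights
        total += w * y * math.exp(y) / R
    return total * h

for R in (2.0, math.e, 3.0):
    print(R, expected_log(R), math.log(R) - 1)
```

At $R = e$ the quadrature returns a value numerically indistinguishable from zero, matching the observed threshold.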






answered 9 hours ago by Jake Mirra