Binomial to Poisson Approximation: Why does $p$ have to be small?
I understand that as $n$ tends to infinity, a Binomial distribution becomes a Poisson distribution, and I have completed the proof of this.
However, I am not sure why, when approximating, $p$ has to be a relatively small value. Again, I understand why $n$ must be large, but what is the purpose/proof that the smaller $p$ is, the better the approximation?
Essentially, what I'm asking is: why does $p$ have to be small when approximating a Poisson from a Binomial?
Could someone please help explain this?
Thanks
probability-distributions approximation poisson-distribution binomial-distribution
asked 8 hours ago by Ibrahim (new contributor); edited 7 hours ago
2 Answers
In the proof, you need $np\to\lambda$. But if $n$ grows large and $np\to\lambda$, then we must have $p\to 0$, or else $np\to\infty$: if $p$ stayed bounded away from $0$, say $p\ge\varepsilon$, then $np\ge n\varepsilon\to\infty$, contradicting $np\to\lambda$.
answered 7 hours ago by FakeAnalyst56
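To see this numerically, here is a minimal sketch (an editorial illustration, not part of the answer) that fixes $\lambda = np$ and lets $n$ grow, so $p = \lambda/n$ shrinks; the choice $\lambda = 3$ and the use of SciPy are assumptions made purely for the demonstration.

```python
# Illustrative sketch: fix lambda = n*p and let n grow (so p shrinks).
# The largest pointwise gap between the Binomial(n, p) and Poisson(lambda)
# pmfs should decrease as p gets smaller.
import numpy as np
from scipy.stats import binom, poisson

lam = 3.0  # assumed value of lambda = n*p, chosen only for illustration

for n in [10, 100, 1000, 10000]:
    p = lam / n
    k = np.arange(0, n + 1)
    gap = np.max(np.abs(binom.pmf(k, n, p) - poisson.pmf(k, lam)))
    print(f"n={n:6d}  p={p:.4f}  max pmf gap = {gap:.6f}")
```

As $n$ increases (and $p$ decreases), the printed gap shrinks, matching the statement that the approximation needs both large $n$ and small $p$.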
Intuitively, for the approximation $\operatorname{Binomial}(n,p)\approx\operatorname{Poisson}(\lambda=np)$, the smaller $p$ is, the closer the variance $npq=\lambda(1-p)$ is to $\lambda$, so you expect a better approximation.
In the proof, you use
$$
\frac{n(n-1)\dots(n-k+1)}{n^k}\left(1-\frac{\lambda}{n}\right)^{n-k}\approx e^{-\lambda}
$$
to show
$$
\binom{n}{k}p^k(1-p)^{n-k}\approx\frac{e^{-np}(np)^k}{k!}.
$$
If you analyse the error terms more carefully, you get some explicit bounds such as
$$
\sum_{k=0}^{\infty}
\left\lvert
\binom{n}{k}p^k(1-p)^{n-k}-\frac{e^{-np}(np)^k}{k!}
\right\rvert\leq Cp,
$$
where $C\leq 4$. So this justifies the motto "smaller $p$ gives better approximation".
answered 7 hours ago by user10354138
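The stated $Cp$ bound can be probed empirically; the following sketch (an editorial addition, with the $(n,p)$ grid chosen arbitrarily so that $np$ stays fixed, and SciPy assumed available) computes the left-hand sum and compares it to $4p$.

```python
# Illustrative check of  sum_k |Binomial(n,p) pmf - Poisson(np) pmf|
# against the stated bound C*p with C <= 4.  A numerical probe, not a proof.
import numpy as np
from scipy.stats import binom, poisson

for n, p in [(50, 0.2), (500, 0.02), (5000, 0.002)]:
    lam = n * p                  # lambda = np is 10 in every case here
    k = np.arange(0, n + 1)      # terms beyond k = n are negligible for this lambda
    err = np.sum(np.abs(binom.pmf(k, n, p) - poisson.pmf(k, lam)))
    print(f"n={n:5d}  p={p:.3f}  error sum = {err:.5f}  bound 4p = {4 * p:.3f}")
```

In each row the error sum sits below $4p$ and shrinks roughly in proportion to $p$, consistent with the bound and with the motto above.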