Combinatorics problems that can be solved more easily using probability
I'm looking for examples of combinatorics problems that would be very difficult to solve by direct enumeration but can be solved easily using ideas from probability, such as independence, interchanging sums and expectations, and so on. I know I have seen such problems before, especially on the HMMT combinatorics subject tests, but I can't recall any good examples now.
I am NOT looking for probabilistic existence proofs (the so-called "probabilistic method" introduced by Erdős). The problems I'm interested in are enumerative.
probability combinatorics big-list
3 Answers
Graph enumeration problems are often solved using the theory of random graphs. For example, if you want to asymptotically count the number of (labeled) connected graphs on $n$ vertices, it is easiest to estimate the probability that the Erdős–Rényi graph $\mathcal G_{n,1/2}$ is connected, then multiply by $2^{\binom n2}$, the total number of labeled graphs on $n$ vertices.
For a fancier example, the asymptotic number of labeled $d$-regular $n$-vertex graphs can be counted by a similar probabilistic strategy. If $d$ is sufficiently small as $n \to \infty$, we analyze the configuration model (which generates a random $d$-regular multigraph by picking a random matching between the $dn$ half-edges) and estimate the probability that the graph is simple. For an example of the newest and fanciest results in this vein, see this paper by Liebenau and Wormald.
The fundamental ideas in such cases are often about proving asymptotic independence between many very unlikely events, which lets us make Poisson or Gaussian approximations to random variables that approach the truth as $n \to \infty$. This would be difficult to recast in a purely deterministic form, and the result would be very hard to understand.
By contrast, in a problem where the "trick" was independence or commuting sums and expectations, the probability is essentially flavor and you could recast such a proof as taking a sum in two ways or something similar.
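As a small concrete illustration of the first strategy, here is a Python sketch of my own (not part of the answer): it estimates the probability that $\mathcal G_{n,1/2}$ is connected by Monte Carlo, rescales by $2^{\binom n2}$, and compares with exact enumeration for a tiny $n$ where brute force is still possible.

```python
import itertools
import random

def is_connected(n, edges):
    """Check connectivity of a graph on vertices 0..n-1 via DFS from vertex 0."""
    adj = {v: [] for v in range(n)}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    seen, stack = {0}, [0]
    while stack:
        u = stack.pop()
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen) == n

n = 5
all_edges = list(itertools.combinations(range(n), 2))

# Exact count: enumerate all 2^C(n,2) labeled graphs (only feasible for tiny n).
exact = sum(
    is_connected(n, [e for e, keep in zip(all_edges, bits) if keep])
    for bits in itertools.product([0, 1], repeat=len(all_edges))
)

# Probabilistic route: estimate P(G_{n,1/2} is connected), then rescale by 2^C(n,2).
trials = 20_000
hits = sum(
    is_connected(n, [e for e in all_edges if random.random() < 0.5])
    for _ in range(trials)
)
estimate = hits / trials * 2 ** len(all_edges)

print(exact, round(estimate))  # exact is 728 for n = 5; the estimate should land nearby
```

For large $n$ the exact enumeration is hopeless, which is exactly where the probabilistic estimates described in the answer take over.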
Here are some problems:
- Find the sum, over all possible sequences with $2019$ ones and $2019$ zeros, of the number of continuous runs in the sequence (a small brute-force check of this one appears after the list)
- https://artofproblemsolving.com/community/c6h366278p2018435
- https://artofproblemsolving.com/community/c6h60752p366512
- https://artofproblemsolving.com/community/q2h1151650p5452212
- http://artofproblemsolving.com/community/c6h1170845p5622460
- https://artofproblemsolving.com/community/q4h1497881p9354029
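As an aside on the first problem in the list (this is my own sketch, not part of the answer): by linearity of expectation, a uniformly random arrangement of $a$ ones and $b$ zeros has $1+\frac{2ab}{a+b}$ runs on average, so the required sum should be that value times $\binom{a+b}{a}$; with $a=b=2019$ this works out to $2020\binom{4038}{2019}$. The brute-force check below confirms the formula for small cases.

```python
from itertools import permutations
from math import comb

def total_runs(a, b):
    """Brute force: sum the number of runs over all distinct 0/1 sequences
    containing a ones and b zeros."""
    total = 0
    for seq in set(permutations([1] * a + [0] * b)):
        total += 1 + sum(seq[i] != seq[i + 1] for i in range(len(seq) - 1))
    return total

# Expected runs in a random arrangement is 1 + 2ab/(a+b) (my derivation),
# so the total over all comb(a+b, a) sequences should be that times comb(a+b, a).
for a, b in [(2, 2), (3, 3), (4, 3), (4, 4)]:
    predicted = (1 + 2 * a * b / (a + b)) * comb(a + b, a)
    print(a, b, total_runs(a, b), predicted)
```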
This might be a good example. Prove that for any positive integers $m,n$ with $m< n$,
$$
\sum_{k=0}^{n} \frac{\binom{m}{k}}{\binom{n}{k}}=\frac{n+1}{n-m+1}.
$$
Solution: Imagine you have a deck of $n$ cards, $m$ of which are black, the rest of which are red. You shuffle the deck, and deal from the top until you deal a red card. What is the expected number of cards dealt?
Letting $X$ be the number of cards dealt, you have $P(X>k)=\binom{m}{k}\big/\binom{n}{k}$, since $X>k$ occurs if and only if the first $k$ cards are all black. Therefore,
$$
E[X]=\sum_{k=0}^{n}P(X>k)=\sum_{k=0}^{n} \frac{\binom{m}{k}}{\binom{n}{k}}.
$$
On the other hand, the $n-m$ red cards divide the deck into $n-m+1$ sections of black cards. Each black card is equally likely to fall in any of these sections; in particular, it lands in the top section with probability $1/(n-m+1)$. By linearity of expectation, the expected number of black cards in the top section is $m/(n-m+1)$. Since the cards dealt are exactly those black cards plus the first red card, the expected number of cards dealt is
$$
1+\frac{m}{n-m+1}=\frac{n+1}{n-m+1}.
$$
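Not part of the original answer, but here is a quick Python check of both sides of the identity and of the card-dealing interpretation (the values of $m$ and $n$ are arbitrary):

```python
import random
from math import comb

def lhs(m, n):
    # comb(m, k) vanishes for k > m, so the tail of the sum contributes nothing.
    return sum(comb(m, k) / comb(n, k) for k in range(n + 1))

def cards_dealt(m, n):
    """Shuffle m black cards (1) and n - m red cards (0); count the cards dealt
    up to and including the first red card."""
    deck = [1] * m + [0] * (n - m)
    random.shuffle(deck)
    dealt = 0
    for card in deck:
        dealt += 1
        if card == 0:
            break
    return dealt

m, n = 4, 10
print(lhs(m, n), (n + 1) / (n - m + 1))  # both equal 11/7 = 1.5714...

trials = 100_000
print(sum(cards_dealt(m, n) for _ in range(trials)) / trials)  # close to 11/7
```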
I think there are a lot of complicated combinatorial sums which can be made easy using linearity of expectation. For example,
$$
\sum_{k=0}^{n}k\binom{n}{k}p^k(1-p)^{n-k}=np
$$
is best understood by realizing that the sum is the expected value of a binomial random variable with parameters $n$ and $p$, which is $np$ by linearity of expectation. Here the sum is not especially hard, but there are harder combinatorial sums that yield to the same probabilistic reading.
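And a one-line check of that binomial-mean identity (again, just an illustrative sketch with arbitrary $n$ and $p$):

```python
from math import comb

n, p = 12, 0.3
print(sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)))  # ~3.6 = n*p
```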