Probability distribution vs. probability mass function / Probability density function terms: what's the...














I struggle to understand the difference between a probability distribution (https://en.wikipedia.org/wiki/Probability_distribution) and a probability mass function (https://en.wikipedia.org/wiki/Probability_mass_function) or probability density function (https://en.wikipedia.org/wiki/Probability_density_function). Both a probability distribution and, let's say, a PMF seem to describe the probabilities of the values of a random variable. Note that I am not asking about the difference between a PDF and a PMF.



Consider the following example, in which a four-sided die is rolled twice and X is the sum of the two throws. I calculate the probability mass function (left) and then show the result graphically (right). But it seems fair to call this graph a probability distribution. Isn't it?



Thanks!



[image: table of the PMF of X (left) and its bar chart (right)]
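For reference, a minimal sketch (in Python, not part of the original post) of the PMF computation described in the example:

```python
from collections import Counter
from fractions import Fraction

# All 16 equally likely outcomes of rolling a four-sided die twice.
outcomes = [(a, b) for a in range(1, 5) for b in range(1, 5)]

# PMF of X = sum of the two throws: P(X = x) = #{outcomes summing to x} / 16.
counts = Counter(a + b for a, b in outcomes)
pmf = {x: Fraction(c, 16) for x, c in sorted(counts.items())}

print(pmf)  # X ranges over 2..8; the mode is X = 5, with probability 4/16.
```

The resulting table is exactly what the question's left panel shows; the bar chart on the right is just this dictionary plotted.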


















  • The key is that the probability mass function is associated with discrete random variables, while the probability distribution function is associated with continuous random variables; some authors use this term, while others simply call it the PDF. While the PMF is defined as $P(X=x)$, the PDF can be heuristically motivated via $P(x-\epsilon \leq X \leq x+\epsilon)$, since for a continuous r.v. $P(X=x)=0$.
    – Ramiro Scorolli
    Dec 3 '18 at 18:45












  • In the context of measure theory a probability distribution is commonly a pushforward measure induced by a random variable/vector. This is the Kolmogorov definition, and the measure is prescribed by $B \mapsto P(X \in B)$. Using the Kolmogorov definition we do not have to distinguish between concepts like discrete distributions (linked to a PMF), absolutely continuous distributions (linked to a PDF), or something in between.
    – drhab
    Dec 3 '18 at 18:53












  • Thanks, but I am asking not about "Probability Distribution Function" but about "probability distribution" (en.wikipedia.org/wiki/Probability_distribution). A probability distribution can be binomial, which is discrete.
    – John
    Dec 3 '18 at 18:54










  • My former comment concerns "probability distribution", and the Kolmogorov definition can be found at the link you provide.
    – drhab
    Dec 3 '18 at 18:58












  • My response was for the first answer :)
    – John
    Dec 3 '18 at 19:01
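The measure-theoretic view in the comments can be made concrete with a small sketch (in Python, an illustration rather than part of the discussion): for the discrete X of the question, the distribution is the map $B \mapsto P(X \in B)$, recoverable from the PMF by summing.

```python
from fractions import Fraction

# PMF of X = sum of two four-sided dice (the question's example).
pmf = {x: Fraction(c, 16) for x, c in
       zip(range(2, 9), (1, 2, 3, 4, 3, 2, 1))}

def distribution(B):
    """The pushforward measure B -> P(X in B), for a discrete X."""
    return sum((p for x, p in pmf.items() if x in B), Fraction(0))

print(distribution({2, 3, 4}))    # P(X <= 4) = 6/16 = 3/8
print(distribution(range(2, 9)))  # the whole support: probability 1
```

In this sense the PMF and the distribution carry the same information for a discrete variable; the distribution is simply the set function the PMF induces.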
















probability probability-theory probability-distributions

asked Dec 3 '18 at 18:32, edited Dec 3 '18 at 19:07
John
2 Answers

"Probability distribution" is a general term describing a mathematical entity that is represented by its "cumulative distribution function" (or just "distribution function") and also, when it exists, by its "probability mass function" or "probability density function" (or just "density").



For example, the following sentence is perfectly correct, even though a bit wordy: "the cumulative distribution function of the Normal probability distribution is XXX, while its probability density function is YYY".



As to what your graph reflects, the cumulative distribution function is non-decreasing by definition.
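The relationship between the two representations can be sketched for the question's example (in Python, an illustration, not part of the answer): the CDF is the running sum of the PMF and is non-decreasing, unlike the bar chart of the PMF itself.

```python
from fractions import Fraction
from itertools import accumulate

# PMF of X = sum of two four-sided dice (the question's example).
support = [2, 3, 4, 5, 6, 7, 8]
pmf = [Fraction(c, 16) for c in (1, 2, 3, 4, 3, 2, 1)]

# CDF at each support point: F(x) = P(X <= x), the running total of the PMF.
cdf = list(accumulate(pmf))

print(cdf[-1])                                    # total probability: 1
print(all(a <= b for a, b in zip(cdf, cdf[1:])))  # CDF is non-decreasing: True
print(all(a <= b for a, b in zip(pmf, pmf[1:])))  # the PMF is not: False
```

Since the question's graph rises and then falls, it is the PMF, not the CDF, which matches the answer's point that a CDF is non-decreasing by definition.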






answered Dec 4 '18 at 3:56
Alecos Papadopoulos

  • Thank you a lot! Your answer was very helpful.
    – John
    Dec 4 '18 at 14:46










  • But shouldn't the CDF look like this? probabilitycourse.com/chapter3/3_2_1_cdf.php (Fig. 3.4)
    – John
    Dec 4 '18 at 16:09










  • @John. Indeed, for a discrete random variable. Where is the problem?
    – Alecos Papadopoulos
    Dec 4 '18 at 20:47










  • But in your answer you said that my graph (in the original question) reflects the CDF. In my original graph there is no cumulative aspect, in my view....
    – John
    Dec 5 '18 at 8:32










  • @John I did no such thing. Perhaps one more careful reading of my answer would be in order.
    – Alecos Papadopoulos
    Dec 5 '18 at 9:12




















Probability density functions are always associated with continuous random variables. Continuous variables are variables which can be measured, such as time and length. A probability mass function, in contrast, is a function which only takes in discrete values for its random variable. In both cases the function must satisfy two conditions in order to be a PDF or PMF: 1) the honesty condition (the sum of all the values or outcomes must equal one in the discrete case, and the integral must equal one in the continuous case); 2) given any outcome x, the function f(x) must be between 0 and 1. A probability distribution is a function which assigns a probability to an outcome.
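A sketch of the honesty check for the discrete case (in Python, an illustration using the question's PMF; note that a comment below points out that condition 2 applies to PMFs but not to PDFs):

```python
from fractions import Fraction

def is_valid_pmf(pmf):
    """Check the two conditions stated above for a discrete PMF:
    1) honesty: the probabilities sum to one;
    2) each value lies between 0 and 1."""
    return sum(pmf.values()) == 1 and all(0 <= p <= 1 for p in pmf.values())

# PMF of X = sum of two four-sided dice (the question's example).
pmf = {x: Fraction(c, 16) for x, c in
       zip(range(2, 9), (1, 2, 3, 4, 3, 2, 1))}

print(is_valid_pmf(pmf))                  # True
print(is_valid_pmf({1: Fraction(1, 2)}))  # False: probabilities sum to 1/2
```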






answered Dec 3 '18 at 19:29, edited Dec 3 '18 at 19:34
user612135

  • Thank you. In my figure, did I plot: a) a probability distribution; b) a PMF; c) both?
    – John
    Dec 3 '18 at 19:52










  • Here it is written that "PMF p(S) specifies the probability distribution for the sum S of counts from two dice." Will you agree? en.wikipedia.org/wiki/Probability_distribution#/media/…
    – John
    Dec 3 '18 at 19:57










  • The second condition must be satisfied by a PMF but not necessarily by a PDF. The values taken by a PDF are not probabilities and can exceed $1$.
    – drhab
    Dec 4 '18 at 7:35
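That comment can be illustrated with a small sketch (in Python, not part of the discussion): the density of the uniform distribution on $[0, 1/2]$ takes the value $2$ on its support, exceeding 1, yet it still integrates to 1.

```python
# Density of the uniform distribution on [0, 0.5]: f(x) = 2 on the support.
def f(x):
    return 2.0 if 0.0 <= x <= 0.5 else 0.0

# f takes values above 1, yet it is a valid PDF: it integrates to 1.
# Midpoint-rule approximation of the integral of f over [0, 1]:
n = 100_000
integral = sum(f((i + 0.5) / n) for i in range(n)) / n

print(f(0.25))   # 2.0 -- a density value greater than 1
print(integral)  # 1.0 -- total probability is still one
```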










