• Title: Marginal Distributions

  • Series: Probability Theory

  • YouTube-Title: Probability Theory 20 | Marginal Distributions

  • Bright video: https://youtu.be/wJf1mJoxZk8

  • Dark video: https://youtu.be/u6YuSebR82g

  • Quiz: Test your knowledge

  • PDF: Download PDF version of the bright video

  • Dark-PDF: Download PDF version of the dark video

  • Print-PDF: Download printable PDF version

  • Thumbnail (bright): Download PNG

  • Thumbnail (dark): Download PNG

  • Subtitle on GitHub: pt20_sub_eng.srt (missing)

  • Timestamps (n/a)

  • Subtitle in English (n/a)

  • Quiz Content

    Q1: Let $X : \Omega \rightarrow \mathbb{R}^2$ be a random vector which is uniformly distributed on the unit square $[0,1] \times [0,1]$. What is the correct marginal probability density function of the first component $X_1$? (A hint follows the options.)

    A1: $$ f_{X_1}(t) = \begin{cases} 1 &\text{ for } t \in [0,1] \\ 0 &\text{ else } \end{cases} $$

    A2: $$ f_{X_1}(t) = \begin{cases} 1 - t &\text{ for } t \in [0,1] \\ 0 &\text{ else } \end{cases} $$

    A3: $$ f_{X_1}(t) = \begin{cases} t^2 &\text{ for } t \in [0,1] \\ 0 &\text{ else } \end{cases} $$

    A4: $$ f_{X_1}(t) = \begin{cases} 1+t &\text{ for } t \in [0,1] \\ 0 &\text{ else } \end{cases} $$
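
    Hint: recall how a marginal density is obtained from a joint density, namely by integrating out the other component. For the first component this reads

    $$ f_{X_1}(t) = \int_{\mathbb{R}} f_X(t,s) \, ds. $$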

    Q2: Let $X : \Omega \rightarrow \mathbb{R}^2$ be a random vector with components $X_1: \Omega \rightarrow \mathbb{R}$ and $X_2: \Omega \rightarrow \mathbb{R}$. If $X_1$ and $X_2$ are (absolutely) continuous random variables, which implication for the marginal density functions is correct? (A hint follows the options.)

    A1: $$ f_{X}(s,t) = f_{X_1}(s) f_{X_2}(t) $$ for all $s,t$ implies that $X_1$ and $X_2$ are independent.

    A2: $$ f_{X}(s,t) = f_{X_1}(s) + f_{X_2}(t) $$ for all $s,t$ implies that $X_1$ and $X_2$ are independent.

    A3: $$ f_{X}(s,t) = f_{X_1}(s) f_{X_2}(t) $$ for all $s,t$ implies that $X_1$ and $X_2$ are not independent.

    A4: $$ f_{X}(s,t) = f_{X_1}(s) + f_{X_2}(t) $$ for all $s,t$ implies that $X_1$ and $X_2$ are not independent.
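
    Hint: recall the characterization of independence via distribution functions, writing $F$ for the joint and marginal distribution functions of $X$, $X_1$ and $X_2$: the components $X_1$ and $X_2$ are independent if and only if

    $$ F_X(s,t) = F_{X_1}(s) \, F_{X_2}(t) \quad \text{for all } s,t \in \mathbb{R}. $$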

  • Last update: 2024-10

  • Back to overview page