Random variables $X$ and $Y$ are mutually completely dependent if there exists a one-to-one function $g$ for which $P\lbrack Y = g(X)\rbrack = 1.$ An example is presented of a pair of random variables that are mutually completely dependent but "almost" independent. This example motivates a new concept of dependence, called monotone dependence, in which the function $g$ above is additionally required to be monotone. Finally, this concept of monotone dependence leads to the definition and study of a new numerical measure of statistical association between random variables $X$ and $Y,$ defined by $\sup \{\operatorname{corr} \lbrack f(X), g(Y)\rbrack\},$ where the $\sup$ is taken over all pairs of suitable monotone functions $f$ and $g.$
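As an illustrative special case (a standard fact stated here only for concreteness, not drawn from the abstract itself): suppose $(X, Y)$ is bivariate normal with correlation coefficient $\rho.$ Taking $f(x) = x$ together with $g(y) = y$ (if $\rho \ge 0$) or $g(y) = -y$ (if $\rho < 0$) gives a pair of monotone functions with $\operatorname{corr}\lbrack f(X), g(Y)\rbrack = |\rho|,$ while no pair of functions, monotone or not, can exceed the maximal correlation, which for the bivariate normal is also $|\rho|.$ Hence in this case

$$\sup \{\operatorname{corr} \lbrack f(X), g(Y)\rbrack\} = |\rho|,$$

with the supremum attained by linear (hence monotone) functions.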