Let $X_1, X_2, \cdots, X_n$ be a sequence of random variables (r.v.'s) and put $S_m = \sum^m_{\nu = 1} X_\nu$, $1 \leqq m \leqq n$. It is well known that
\begin{equation*}\tag{1}
E|S_n|^r \leqq n^{r - 1} \sum^n_{\nu = 1} E|X_\nu|^r, \quad r > 1; \qquad E|S_n|^r \leqq \sum^n_{\nu = 1} E|X_\nu|^r, \quad r \leqq 1.
\end{equation*}
However, if the r.v.'s satisfy the relations
\begin{equation*}\tag{2}
E(X_{m + 1} \mid S_m) = 0 \quad \text{a.s.}, \quad 1 \leqq m \leqq n - 1,
\end{equation*}
it is possible to improve the first inequality considerably. The case $r > 2$ with independent r.v.'s will be treated elsewhere by one of the authors, von Bahr. If $r = 2$, we have, under (2),
\begin{equation*}\tag{3}
ES^2_n = \sum^n_{\nu = 1} EX^2_\nu.
\end{equation*}
In the case $1 \leqq r \leqq 2$, we will show that under (2)
\begin{equation*}\tag{4}
E|S_n|^r \leqq C(r, n) \sum^n_{\nu = 1} E|X_\nu|^r,
\end{equation*}
where $C(r, n)$ is a bounded function of $r$ and $n$. In Theorem 2 we show that (4) holds with $C(r, n) = 2$. If the distribution of each $X_{m + 1}$, conditioned on $S_m$, is symmetric about zero, one can put $C(r, n) = 1$ (Theorem 1).

Further, if the r.v.'s satisfy the conditions
\begin{equation*}\tag{5}
E(X_i \mid R_{mi}) = 0 \quad \text{a.s.}, \quad 1 \leqq i \leqq m + 1 \leqq n,
\end{equation*}
where $R_{mi} = \sum^{m + 1}_{\nu = 1, \nu \neq i} X_\nu$, it is possible to put $C(r, n) = 2 - n^{-1}$. The conditions (2) and (5) are fulfilled if the r.v.'s are independent and have zero means. In this case, however, it is possible to make $C(r, n)$ depend on $r$ in such a way that $C(r, n) \rightarrow 1$ as $r \rightarrow 2$. It is possible to show by an example that (4) is not in general true with $C(r, n) = 1$, even in this case.

If $1 \leqq r < s \leqq 2$ and $E|X_\nu|^s < \infty$, $1 \leqq \nu \leqq n$, it is generally better not to use (4) directly, but to combine it with the moment inequality $E|S_n|^r \leqq (E|S_n|^s)^{r/s}$, so that $E|S_n|^r \leqq (C(s, n) \sum^n_{\nu = 1} E|X_\nu|^s)^{r/s}$. By (1), the case $r < 1$ is trivial.
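To indicate why (2) yields (3), here is a short verification sketch, assuming $EX^2_\nu < \infty$ for $1 \leqq \nu \leqq n$. Writing $S_{m + 1} = S_m + X_{m + 1}$ and expanding the square,
\begin{equation*}
ES^2_{m + 1} = ES^2_m + 2E(S_m X_{m + 1}) + EX^2_{m + 1},
\end{equation*}
and by (2), $E(S_m X_{m + 1}) = E\{S_m E(X_{m + 1} \mid S_m)\} = 0$, so (3) follows by induction on $m$.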