Let f be a random multiplicative function, and consider the sum of f(n) over a short interval x ≤ n ≤ x+y (where y = o(x) as x tends to infinity). Thanks to work of Chatterjee-Soundararajan and Soundararajan-Xu, it is known that these sums have a Gaussian limiting distribution when rescaled by their standard deviation, provided x/y is at least a certain power of log x. On the other hand, work of Harper and of Caich implies that these sums converge to zero when rescaled by their standard deviation, if y is "close" to x. I will report on joint work (in preparation) with Soundararajan and Xu on this problem. We find that on the full range y = o(x), the sums have a Gaussian limiting distribution when rescaled appropriately, but the correct scaling factor changes as y approaches x. In contrast, when y ~ x there is no rescaling under which the sums have a (non-degenerate) Gaussian limit.
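
For concreteness, the setup in the known range can be written schematically as below; the choice of the Rademacher model, the half-open interval, and the unspecified exponent C are illustrative assumptions, not part of the statement above.

% Schematic form of the setup (assumptions: Rademacher model with f(p) = ±1
% independent and uniform on primes, half-open interval, exponent C left unspecified):
\[
  S_{x,y} \;=\; \sum_{x < n \le x+y} f(n),
  \qquad
  \frac{S_{x,y}}{\sqrt{\mathbb{E}\,S_{x,y}^{2}}} \;\xrightarrow{\,d\,}\; \mathcal{N}(0,1)
  \quad \text{provided } \frac{x}{y} \ge (\log x)^{C}.
\]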