Randomness in the Digits of Square Roots
Each of these "walks" plots the accumulated total of the digits of the square root of a number: each successive step goes up or down by an amount determined by the next digit, with the digits shifted so that the midpoint of their possible range corresponds to zero. Despite this simple definition, the resulting sequences seem for practical purposes random.
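The construction above can be sketched in Python. This is a minimal illustration, not the original code: the digits of the square root are obtained by integer arithmetic with `math.isqrt`, and each digit d contributes a step of d minus the midpoint (base − 1)/2, so that for base 10 the steps range from −4.5 to +4.5.

```python
from math import isqrt
from itertools import accumulate

def sqrt_digits(n, count, base=10):
    """Return the first `count` base-`base` digits of sqrt(n),
    starting with the integer part (assumes n >= 1)."""
    # isqrt(n * base**(2*(count-1))) packs the digits into one integer.
    big = isqrt(n * base ** (2 * (count - 1)))
    digits = []
    while big:
        digits.append(big % base)
        big //= base
    return digits[::-1]

def digit_walk(n, count, base=10):
    """Accumulated totals of the digits, shifted so the midpoint
    of the possible digit range, (base-1)/2, maps to zero."""
    mid = (base - 1) / 2
    return list(accumulate(d - mid for d in sqrt_digits(n, count, base)))
```

For example, the digits of sqrt(2) begin 1, 4, 1, 4, 2, so the walk begins −3.5, −4.0, −7.5, −8.0, −10.5. For a perfect square such as 4, the digits after the first are all 0 and the walk degenerates into a straight line, which is the trivial case referred to below.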
Can you find a case where the walk is not trivial, yet still shows some form of recognizable regularity? No such example is currently known.