Human beings are incredibly keen to see patterns, to the extent that we see them where none exist at all. That's why you hear someone calling your name in the white noise of the shower, or see a face in the folds of a pillow in the dark, or animals in the clouds. You can understand why this might have been useful for our ancestors – a "false positive", where you see a sabre-toothed cat in the weeds but there isn't really one, might be annoying, but a "false negative", where you don't see the sabre-toothed cat that really is in the weeds and it eats you, would be more of a problem.
We see patterns in random data as well. We want to see causes, not just the workings of chance. Again, that could have been helpful to our ancestors: if three people get cholera around the water hole, it might be a fluke, but you might not want to drink from that water hole anyway, to be on the safe side. The Nobel Prize-winning psychologist Daniel Kahneman calls this "belief in the law of small numbers": the conviction that we can extrapolate from small samples to the wider world. It is closely related to what he and Amos Tversky called the "representativeness heuristic".
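A quick simulation shows how easily small samples throw up spurious "patterns". Here is a minimal Python sketch – not from Kahneman's work; the batch sizes and the 80 per cent threshold are arbitrary choices for illustration – that flips a fair coin in small batches and in large ones, and counts how often a batch comes out lopsided:

```python
import random

random.seed(0)

def lopsided(n, trials=10_000):
    """Fraction of n-flip batches of a fair coin that come out
    at least 80% heads or at least 80% tails."""
    count = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(n))
        if heads >= 0.8 * n or heads <= 0.2 * n:
            count += 1
    return count / trials

print(f"batches of   5 flips: {lopsided(5):.1%} look lopsided")
print(f"batches of 500 flips: {lopsided(500):.1%} look lopsided")
```

With five flips, well over a third of batches look "streaky"; with 500, essentially none do. Three cholera cases at one water hole is a five-flip batch.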
The trouble starts when it leads us to make bad decisions. In the Second World War, says Kahneman in his book Thinking, Fast and Slow, Londoners noticed that parts of their city hadn't been hit by V-2 rockets – so they decided that the Germans had spies in those areas whom they were trying to keep safe. People avoided the areas where the bombs had previously landed. But an analysis of the strikes showed that they fell exactly as you'd expect if the targeting were completely random: random doesn't mean evenly spread, and chance alone produces clusters in some districts and gaps in others.
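The classic version of that analysis (R. D. Clarke's 1946 study of flying-bomb hits on south London) divided the city into a grid of equal squares and compared the number of squares receiving 0, 1, 2, ... hits with what the Poisson distribution predicts for purely random targeting. Here is a minimal simulation sketch of that idea – the grid size and strike count are illustrative numbers, loosely echoing Clarke's 576 squares and 537 hits:

```python
import random
from collections import Counter
from math import exp, factorial

random.seed(1)
G, N = 24, 537  # 24x24 = 576 grid squares, 537 strikes (illustrative)

# Scatter strikes uniformly at random and count hits per square.
hits = Counter((random.randrange(G), random.randrange(G)) for _ in range(N))

# How many squares received exactly k hits?
by_count = Counter(hits.values())
by_count[0] = G * G - len(hits)  # squares that were never hit

lam = N / (G * G)  # average hits per square
for k in range(5):
    expected = G * G * exp(-lam) * lam ** k / factorial(k)
    print(f"{k} hits: {by_count.get(k, 0):3d} squares observed, "
          f"{expected:5.1f} expected if purely random")
```

Run it and the observed column tracks the Poisson column closely – yet a few squares take four or more hits while whole stretches of the grid take none. Those are exactly the clusters and gaps that looked like deliberate targeting to wartime Londoners.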
We don't need to worry so much about V-2 rockets these days, but we still have the same problem. People who see a pattern in how often certain numbers come up in the lottery, and sportspeople who insist on wearing the same lucky socks because they scored in them once, are falling victim to the belief in the law of small numbers.