Diving deeper and deeper into machine learning and neural networks, I can’t help but notice how some of the techniques there are metaphors for what happens in our brains. After all, our brains are just larger, more sophisticated versions of the deep neural networks that do computer vision or predict clicks on ads.
When I first grokked backpropagation, I was in awe of how elegant it is. An iterative approach to computing the partial derivatives of a complex, multi-dimensional function. Beauty, in the mathematical sense of the word. Then, as I learned about the more practical aspects of it, I came across the less intuitive concept of “batch_size”.
Every round of backpropagation makes the neural network slightly better at its task, by updating the individual neurons based on the errors they make. The batch size comes from the algorithm known as stochastic gradient descent (SGD), which looks at only a small number of examples before updating. It would seem natural to look at the data one example at a time, because humans can learn from single examples. Yet things tend to work better when the network updates by the average gradient over a batch of examples.
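To make the batching idea concrete, here’s a toy sketch of my own (not from any particular library): a single linear neuron fit with mini-batch SGD, where the gradient is averaged over `batch_size` examples before each weight update, rather than updating after every single example.

```python
import random

def sgd_fit(data, batch_size=4, lr=0.1, epochs=200):
    """Fit y = w*x + b with mini-batch SGD on squared error."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        random.shuffle(data)
        for i in range(0, len(data), batch_size):
            batch = data[i:i + batch_size]
            # Average the gradient over the whole batch before updating,
            # instead of nudging the weights after every single example.
            grad_w = sum(2 * (w * x + b - y) * x for x, y in batch) / len(batch)
            grad_b = sum(2 * (w * x + b - y) for x, y in batch) / len(batch)
            w -= lr * grad_w
            b -= lr * grad_b
    return w, b

random.seed(0)
# Noiseless data from y = 3x + 1; the fit should recover w ≈ 3, b ≈ 1.
data = [(x / 10, 3 * (x / 10) + 1) for x in range(-10, 11)]
w, b = sgd_fit(data)
```

Averaging over the batch smooths out the noise of any single example, which is exactly why larger batches tend to give more stable updates.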
(Image credit: https://iamtrask.github.io/2015/11/15/anyone-can-code-lstm/)
The computer will look at multiple examples, then it will update its brain to be better at its task. A human will have a day full of work and conversations, and then they will go to sleep. Almost as if… the human is updating their brain connections during that time. Running backpropagation over the batch of experiences from the day.
Here’s how I imagine the human brain. As we go through our days, our neurons collect feedback signals about all the things they did wrong. Those signals manifest as “error chemicals” at each neuron, which temporarily update its function. If a neuron gathers too many error chemicals, it gets tired. When too many neurons get tired, we need to sleep. Sleep removes the error chemicals and updates the underlying neurons in the direction the error chemicals suggest. We can help this process of learning by getting enough sleep, but also by taking naps when needed and by meditating.
Even if you think this “error chemical” model is nuts, I don’t disagree with you. But I think the model is still more useful than nothing. At the very least, I’ve prioritized getting good sleep recently, as I’m trying to learn a whole bunch of things, and I think sleep is necessary to “save” the current learnings before moving on to new ones. I’ve also been pacing my learning through the day, realizing that breaks of mental rest but physical activity help recharge me for more learning.
And I’ve heard and read other things that suggest “the error chemical” model is not too different from reality. For example, the amount of sleep necessary decreases with age. Babies need more sleep, more often, because their brains have so much to adapt to. And old people don’t need as much sleep. Furthermore, at the end of every day, our dreams often reference things that happened to us, such as eating spaghetti or playing with a dog.
And sometimes we have nightmares about trauma. PTSD is at least partially about having learned a version of reality that doesn’t fit our normality assumptions.
Additionally, cramming for exams doesn’t work. Not if you want to get a deep and lasting understanding. Instead, by engaging in deliberate and consistent practice, it is possible to master a skill over a long time.
And not just for advanced skills. Sleep is universal among animals. Even fish sleep. Which means that sleep must have a universal and simple purpose. And updating the error chemicals makes sense even if you’re a dumb fish.
But even if you’re not a dumb fish, and are a smart human instead. There’s a bunch of research on the hippocampus. It’s where the hippopotami go to college :P. It’s the area of the brain responsible for memories. And it can get full of hippos during a day of learning.
A good night of sleep helps those hippos graduate and go around the brain, looking for a place to settle down. Not getting enough sleep is like failing all the hippos, so they remain in the hippocampus for longer. They can’t graduate, but they’ve filled the hippocampus. So no new hippos can arrive. No new memories can be made until the old hippos graduate, or drop out.
Getting good sleep later on won’t fully reverse the trouble of lost sleep. The error chemicals will saturate, and the neurons will stop adapting to new information, or they’ll start forgetting the old.
Again, this model is an approximation at best, and reflects only a small part of what sleep does. But I’ll use it as the basis of my reasoning.