The New York Times has an obituary for child neurologist Peter Huttenlocher, who surprised everyone by finding that the human brain loses connections as part of growing into adulthood.
Huttenlocher counted synapses – the connections between neurons – and as a paediatric neurologist was particularly interested in how the number of synapses changed as we grow from children to adults.
Before Huttenlocher’s work we tended to think that our brains simply got more connected as we grew older, but what he showed was that we hit peak connectivity in the first year of life and that much of brain development is actually the removal of unneeded connections.
This is known as synaptic pruning and it was demonstrated with this graph from a classic 1990 paper.
I love this graph for a couple of reasons. Firstly, it’s a bit wonky. It was hand-drawn and whenever it is reproduced, as it has been in many textbooks, it’s always a bit off-centre.
Secondly, it’s crystal clear. It’s a graph showing the density of synaptic connections in the visual cortex of the human brain, and you can see it’s rapidly downhill from the first year of life until the late teens, when things start to even out.
This is a good thing: the infant brain starts out over-connected, but through the experience of growing up it loses anything that isn’t needed as we learn which skills are most important, leaving us with only the most efficient neural connections.
One of Huttenlocher’s discoveries was that this process of synaptic pruning may go wrong in people who have neurodevelopmental disorders.
Link to NYT obituary for Peter Huttenlocher.
What in the world does “only the most efficient neural connections” mean? And how does the brain decide which synaptic connections are needed and which are not? I don’t think any high level thought process is going on here. My guess is that the explanation of what is going on is flawed by a romantic misunderstanding of reality.
Fantastic graph indeed! You should definitely post classic papers of that sort more often!
Thanks.
@Mason Kelsey
Neural networking software (a software simulation of hardware ‘neuron’ components) performs learning the same way that neurons do. It was actually developed from computational neuroscience (a field within neuroscience).
Computational neuroscience first began modelling how neurons connect and disconnect to perform information processing in the 1950s on the earliest ‘super’ computers.
They were able to do this because all information processing systems work on the same basic principle. For example, a transistor is either ‘on’ or it’s ‘off’; a neuron either fires or it doesn’t. The difference between the two was that neurons made more than just one connection to another neuron. Also, neurons could add and remove connections as needed, whereas transistors could only operate the way they were initially connected (this is why software is needed as an ‘extension of hardware’ to give instructions that can change). Software is basically a set of on and off states fed into transistors arranged to process them in a particular way.
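To make the on/off analogy concrete, here is a minimal sketch of a McCulloch–Pitts-style artificial neuron in Python. The weights and threshold values are made up for illustration: like a transistor, its output is binary, but unlike a transistor it sums many weighted input connections before deciding whether to fire.

```python
def neuron(inputs, weights, threshold):
    """Fire (return 1) if the weighted sum of inputs crosses the threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Three input connections with different strengths (weights).
print(neuron([1, 0, 1], [0.5, 0.9, 0.4], threshold=0.8))  # 0.5 + 0.4 >= 0.8, so it fires: 1
print(neuron([0, 1, 0], [0.5, 0.9, 0.4], threshold=1.0))  # 0.9 < 1.0, so it stays silent: 0
```

Learning, in both the biological and software versions, amounts to changing those weights (and adding or removing connections) rather than rewiring the underlying hardware.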
“only the most efficient neural connections”
Do you know what neural networking is commercially used for? One example is how automotive manufacturers use neural networking software to control robot arms on assembly lines. This means they don’t need to program them; they just move the arms to teach them how to weld a particular way. The robots then learn and slightly tweak what they are doing to find a more efficient way to weld. They get better at it the more they do it, the same way a human brain would.
Only the sets of connections which are the simplest remain, and overly complex instructions are pruned during the learning process. Otherwise too many neurons are wasted doing an unnecessary task. 🙂
The neurons do the pruning themselves based on the degree of activity of a particular circuit. It’s basically a feedback system, but I have no idea how exactly neurons in the brain do it. A software simulation does it with mathematics.
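Here is a toy sketch of that feedback idea, with made-up numbers: connections that see activity get strengthened, idle ones decay, and anything that falls below a floor is pruned away. This only illustrates the use-it-or-lose-it principle, not how real neurons (or any particular simulator) actually implement it.

```python
def prune_by_activity(strengths, usage, steps=50, boost=0.1, decay=0.05, floor=0.01):
    """strengths: dict mapping synapse name -> connection strength.
    usage: the set of synapses that are active on each step.
    Active connections are strengthened (capped at 1.0), idle ones decay,
    and any connection whose strength drops below `floor` is removed."""
    for _ in range(steps):
        for s in list(strengths):
            if s in usage:
                strengths[s] = min(1.0, strengths[s] + boost)
            else:
                strengths[s] -= decay
            if strengths[s] < floor:
                del strengths[s]  # pruned: the connection no longer exists
    return strengths

synapses = {"a": 0.5, "b": 0.5, "c": 0.5}
surviving = prune_by_activity(synapses, usage={"a", "c"})
print(sorted(surviving))  # only the active connections remain: ['a', 'c']
```

No global supervisor decides what to cut; each connection’s fate falls out of its own local activity history, which is what makes it a feedback system.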
Manually pruning a neural network (to significantly reduce its enormous CPU/memory load) is referred to as ‘brain damage’ in computer jargon.
Some degree of re-learning is needed to repair neural networks after pruning; they are smaller but usually slightly less efficient after manual pruning. Technically that’s what traumatic brain injury in humans amounts to.
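For illustration, the simplest form of manual pruning zeroes out the smallest-magnitude weights of a trained network. The linked paper describes a more principled procedure; this sketch and its example weights are just assumptions.

```python
def magnitude_prune(weights, fraction):
    """Zero out the smallest-magnitude fraction of weights -- the simplest
    stand-in for 'brain damage' style manual pruning."""
    n_prune = int(len(weights) * fraction)
    # Indices of the n_prune weights closest to zero.
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    pruned = list(weights)
    for i in order[:n_prune]:
        pruned[i] = 0.0
    return pruned

w = [0.9, -0.02, 0.4, 0.01, -0.7, 0.05]
print(magnitude_prune(w, 0.5))  # -> [0.9, 0.0, 0.4, 0.0, -0.7, 0.0]
```

Zeroing those weights shifts the network’s outputs slightly, which is why a round of re-training is usually needed afterwards.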
Would you like to see a technical explanation?
Click to access 00080236-a%20simple%20procedure%20for%20pruning%20back-propagation%20networks.pdf
Thanks, Steve. A basic assumption being made is that pruning is desirable. I wonder what effects would occur for children who have less pruning, or none at all. Would they learn even faster? Having AI neural networks mimic human pruning seems a bit odd. Why do that? Perhaps the real evolutionary result of pruning is to do nothing beyond making the human child dependent on its parents that much longer.
I’ve also noticed that the leveling off of pruning only occurs around the age when formal education begins. It is hard to tell from the graph because it doesn’t include any error range. Is there more than a correlative relationship here? Would pruning continue if the child were not exposed to formal education, as in the case of feral children? If so, it provides a wild suggestion that civilization is the cessation of pruning and that pruning is undesirable.
Mason: This research has already been done. Since the initial discovery animal research has shown that the same pruning occurs in other species. There are also ways to block the pruning from occurring and when that has been done in animals they are dysfunctional. For humans it is important to note that the beginning of this process aligns with when we start to learn language.
The thing is that the physical structure of the brain and how it interconnects is part of how it stores information and behaviours. As the brain prunes unneeded connections it helps create the person. A block of marble isn’t a statue until the sculptor chisels it down. The brain begins as a mass of connections and over time some are strengthened, some are weakened, and some are removed. The result is our behaviours, skills, and abilities. Pruning is part of the process that takes a mass of nerve cells and turns it into a functional human brain.
Thanks, chigaze. The implication is that memories are not just a process of building up resonance structures by creating new synapses, but that pruning of existing synapses is also part of the process of learning. The fact that there is a leveling off of that process also implies that learning slows down for most folks by the age of 10 for humans.
Are there any known cases of dysfunctional humans because of a lack of pruning? And what is meant by “dysfunctional” in animals? That is a bit vague. Thanks.
@Mason Kelsey
“Perhaps the real evolutionary results of pruning is to do nothing beyond making the human child dependent on the parents that much longer.”
Haha, I appreciate this.
“Having AI neural networks mimic human pruning seems a bit odd. Why do that?”
Well, there are a couple of reasons. There’s only one way to perform information processing: connections in circuits must be added and deleted as necessary to ‘learn’. The first solution is never perfect (it’s overly complicated), so pruning increases efficiency as connections are changed around. Evolved information processing systems are usually more efficient than engineered ones because they evolved with limited resources, and so find the best solutions. Neural networks are extremely inefficient to simulate in software, so imitating how biological neurons prune themselves is the best approach to increase efficiency.
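A hypothetical sketch of where that efficiency gain comes from: a pruned (zero) connection can be skipped entirely, so a sparser layer needs fewer multiplications per pass. The input and weight values here are made up.

```python
def forward(x, weights):
    """Weighted sum that skips pruned (zero) connections entirely.
    Returns the output and the number of multiplications performed."""
    total = 0.0
    ops = 0
    for xi, wi in zip(x, weights):
        if wi != 0.0:
            total += xi * wi
            ops += 1
    return total, ops

x = [1.0, 2.0, 3.0, 4.0]
dense = [0.5, 0.1, -0.3, 0.2]
pruned = [0.5, 0.0, -0.3, 0.0]   # two connections pruned away

print(forward(x, dense)[1])   # 4 multiplications
print(forward(x, pruned)[1])  # 2 multiplications: same structure, half the work
```

The saving scales with the fraction pruned, which is why heavily pruned networks can run on much smaller hardware.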
“Is there more than a correlative relationship here?”
lol I dunno.
So does this mean that as we get older we become more set in our ways, neurologically speaking? Less capacity for forming new neural pathways?