Thursday, April 11, 2019

Synaptic Delays Mean Brain Signals Must Move at a Snail's Pace

Scientists have long advanced the claim that the human brain is the storage place for memories and the source of human thinking. But such claims are speech customs of scientists rather than things they have proven. There are numerous reasons for doubting such claims. One big reason is that the proteins in synapses have an average lifetime of only a few weeks, which is only about a thousandth of the length of time (50 years or more) that humans can store memories. Another reason is that neurons and synapses are way too noisy to explain very accurate human memory recall, such as when a Hamlet actor flawlessly recites roughly 1,480 lines. Another general reason can be stated as follows: the human brain is too slow to account for very fast thinking and very fast memory retrieval.

Consider the question of memory retrieval. Given a prompt such as a person's name or a very short description of a person, topic or event, humans can accurately retrieve detailed information about such a topic in one or two seconds. We see this ability constantly displayed on the long-running television series Jeopardy. On that show, contestants will be given a short prompt such as “This opera by Rossini had a disastrous premiere,” and within a second after hearing that, a contestant may click a buzzer and then a second later give an answer mentioning The Barber of Seville. Similarly, you can play a game with a well-educated person, a game you might call “Who Was I?” You just pick random names of actual people from the arts or history, and ask the other player to identify each person within about two seconds. Very frequently the player will succeed. We can imagine a session of such a game, occurring in only ten seconds:

John: Marconi.
Mary: Invented the radio.
John: Magellan.
Mary: First to sail around the globe.
John: Peter Falk.
Mary: A TV actor.

We can also imagine a visual version of this game, in which you identify random pictures of any of 1000 famous people. The answers would often be just as quick.

The question is: how could a brain possibly achieve retrieval and recognition so quickly? Let us suppose that the information about some person is stored in some particular group of neurons somewhere in the brain. Finding that exact tiny storage location would be like finding a needle in a haystack, or like finding just the right index card in a swimming pool full of index cards. It would also be like opening the door of some vast library with a million volumes and instantly finding the exact volume you were looking for.

There are certain design features that a system can have that will allow for very rapid retrieval of information. One of these features is an indexing system. An indexing system requires a position notation system, in which the exact position of some piece of information can be recorded. An ordinary textbook has both of these things. The position notation system is the page numbering system. The indexing system is the index at the back of the book. But the brain has neither of these features. There is nothing in the brain like a position notation system by which the exact position of some tiny group of neurons can be identified. The brain has no neuron numbers, and a brain has no coordinate system similar to street names in a city or Cartesian coordinates in a grid. Lacking any such position notation system, the brain has no indexing system (something that requires a position notation system).
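To make concrete what an indexing system buys you, here is a minimal sketch in Python; the dictionary and its entries are invented purely for illustration and are not a claim about any real system. A lookup table that maps a topic to a numbered position allows retrieval in a single step, and the point of the paragraph above is that the brain appears to have no analog of either the position numbers or the index built on top of them.

```python
# A toy "book index": topics mapped to numbered positions (page numbers).
# This illustrates the two features discussed above: a position notation
# system (the page numbers) plus an index built on top of it.
book_index = {
    "Marconi": 212,      # invented page numbers, for illustration only
    "Magellan": 87,
    "Peter Falk": 305,
}

def find_page(topic):
    # One-step retrieval, possible only because positions are numbered
    # and an index maps each topic to a position.
    return book_index.get(topic)

print(find_page("Magellan"))  # -> 87, found without scanning the whole book
```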

So how is it that humans are able to recall things instantly? It seems that the brain has nothing like the speed features that would make such a thing possible. You can't get around such a difficulty by claiming that each memory is stored everywhere in the brain. There would be two versions of such an idea. The first would be that each memory is entirely stored in every little spot of the brain. That makes no more sense than the idea of a library in which each page contains the information in every page of every book. The second version of the idea would be that each memory is broken up and scattered across the brain. But such an idea actually worsens the problem of explaining memory retrieval, as it would only be harder to retrieve a memory if it is scattered all over your brain rather than in a single little spot of your brain.

We also cannot get around this navigation problem by imagining that when you are asked a question, your brain scans all of its stored information. That doesn't correspond to what happens in our minds. For example, if someone asks me, "Who was Teddy Roosevelt?" my mind goes instantly to my memories of Teddy Roosevelt, and I don't experience little flashes of knowledge about countless other people, as if my brain were scanning all of its memories.

When we consider the issue of decoding encoded information, we have an additional strong reason for thinking that the brain is way too slow to account for instantaneous recall of learned information. In order for knowledge to be stored in a brain, it would have to be encoded or translated into some type of neural state. Then, when the memory is recalled, this information would have to be decoded: it would have to be translated from some stored neural state into a thought held in the mind. This requirement is the most gigantic difficulty for any claim that brains store memories. Although they typically maintain that memories are encoded and decoded in the brain, no neuroscientist has ever specified a detailed theory of how such encoding and decoding could work. Besides the huge difficulty that such a system of encoding and decoding would require a kind of "miracle of design" we would never expect a brain to have naturally acquired (something a million times more complicated than the genetic code), there is the difficulty that the decoding would take quite a bit of time, a length of time greater than the time it takes to recall something.

So suppose I have some memory of who George Patton was, stored in my brain as some set of synapse states or neural states, after that information had somehow been translated into such states using some encoding scheme. Then when someone asks, "Who was George Patton?" I would have to not only find this stored memory in my brain (like finding a needle in a haystack), but also translate these synapse or neural states back into an idea, so I could instantly answer, "The general in charge of the Third Army in World War II." The time required for the decoding of the stored information would be an additional reason why instantaneous recall could never be happening if you were reading information stored in your brain. The decoding of neurally stored memories would presumably require protein synthesis, but the synthesis of proteins requires minutes of time.

There is another reason for doubting that the brain is fast enough to account for human mental activity. The reason is that the transmission of signals in a brain is way, way too slow to account for the very rapid speed of human thought and human memory retrieval.

Information travels about in a modern computer at a speed thousands of times faster than nerve signals travel in the human brain. If you type "speed of brain signals" into the Google search engine, you will see in large letters the number 286 miles per hour, which is a speed of about 128 meters per second. This is one of many examples of dubious information which sometimes pops up in a large font at the top of the Google search results. The particular number in question is an estimate made by an anonymous person who quotes no sources, and one who merely claims that brain signals "can" travel at such a speed, not that such a speed is the average speed of brain signals. There is a huge difference between the average speed at which some distance will be traveled and the maximum speed at which part of that distance can be traveled (for example, while you may briefly drive at 40 miles per hour while traveling through Los Angeles, your average speed will be much, much less because of traffic lights).

A more common figure you will often see quoted is that nerve signals can travel in the human brain at a rate of about 100 meters per second. But that is the maximum speed at which a nerve signal can travel, and only while it is traveling along what is called a myelinated axon. Below we see a diagram of a neuron. The axons are the tube-like parts in the diagram below.


[Diagram of a neuron]

The less sophisticated diagram below makes it clear that axons make up only part of the length that brain signals must travel.

[Diagram of connected neurons]

There are two types of axons: myelinated axons and non-myelinated axons (myelinated axons having a sheath-like covering shown in blue in the diagram above). According to this article, non-myelinated axons transmit nerve signals at a slower speed of only .5-2 meters per second (roughly one meter per second). Near the end of this article is a table of measured speeds of nerve signals traveling along axons in different animals; in that table we see a variety of speeds ranging between .3 meters per second (only about a foot per second) and about 100 meters per second.

But from the mere fact that nerve signals can travel across myelinated axons at a maximum speed of about 100 meters per second, we are not at all entitled to conclude that nerve signals typically travel from one region of the brain to another at 100 meters per second. For nerve signals must also travel across dendrites and synapses, which we can see in the diagrams above. It turns out that nerve signal transmission is much slower across dendrites and synapses than across axons. To give an analogy, the axons are like a road on which you can travel fast, and the dendrites and synapses are like traffic lights or stop signs that slow down your speed.

According to neuroscientist Nikolaos C Aggelopoulos, there is an estimate of 0.5 meters per second for the speed of nerve transmission across dendrites. That is a speed 200 times slower than the nerve transmission speed commonly quoted for myelinated axons. According to Bratislav D. Stefanovic, MD, the conduction speed across dendrites is between .1 and 15 meters per second. Such a speed bump seems more important when we consider a quote by UCLA neurophysicist Mayank Mehta: "Dendrites make up more than 90 percent of neural tissue."  Given such a percentage, and such a conduction speed across dendrites, it would seem that the average transmission speed of a brain must be only a small fraction of the 100 meter-per-second transmission in axons. 

Besides this “speed bump” of the slower nerve transmission speed across dendrites, there is another “speed bump”: the slower nerve transmission speed across synapses (which you can see in the top “close up” circle of the first diagram above). There are two types of synapses: chemical synapses and electrical synapses. The parts of the brain allegedly involved in thought and memory have almost entirely chemical synapses. (The sources here and here and here and here and here refer to electrical synapses as "rare."  The neurosurgeon Jeffrey Schweitzer refers here to electrical synapses as "rare."  The paper here tells us on page 401 that electrical synapses -- also called gap junctions -- have only "been described very rarely" in the neocortex of the brain. This paper says that electrical synapses are a "small minority of synapses in the brain.")

We know of a reason why transmission of a nerve signal across chemical synapses should be relatively sluggish. When a nerve signal comes to the head of a chemical synapse, it can no longer travel across the synapse electrically. It must travel by neurotransmitter molecules diffusing across the gap of the synapse. This is much, much slower than what goes on in an axon.

[Diagram of a synapse]

There is a scientific term used for the delay caused when a nerve signal travels across a synapse. The delay is called the synaptic delay. According to this 1965 scientific paper, most synaptic delays are about .5 milliseconds, but there are also quite a few as long as 2 to 4 milliseconds. A more recent (and probably more reliable) estimate was made in a 2000 paper studying the monkey prefrontal cortex. That paper says, "the synaptic delay, estimated from the y-axis intercepts of the linear regressions, was 2.29" milliseconds. It is very important to realize that this synaptic delay is not the total delay experienced by a nerve signal as it passes across many different synapses. The synaptic delay is the delay incurred each and every time that the nerve signal passes across a synapse.

Such a delay may not seem like too much of a speed bump. But consider just how many such "synaptic delays" would have to occur for, say, a brain signal to travel from one region of the brain to another. It has been estimated that the brain contains 100 trillion synapses (a single neuron may have thousands of them). So it would seem that for a neural signal to travel between two parts of the brain separated by only 5% or 10% of the brain's length, the signal would have to endure many thousands of such "synaptic delays," adding up to quite a few seconds of time.

An average male human brain has a volume of about 1300 cubic centimeters. Let's try to calculate the minimum number of synapses that would have to be sequentially traversed in order for a neural signal to travel through a volume of only 1 cubic centimeter (a cube about .39 inches on each side).

If there are 100 trillion synapses in a brain of 1300 cubic centimeters,  then the number of synapses in this volume of 1 cubic centimeter would be roughly 100 trillion divided by 1300, which gives 77 billion. (This page gives an estimate of 418 billion synapses per cubic centimeter, but notes that estimates of synapse density vary; so let's just stick with the smaller number.)

It would be a big mistake to assume that a neural signal would have to sequentially traverse all those 77 billion synapses. To traverse the shortest path across this volume, the signal would have to pass through only a number of synapses that is roughly the cube root of the total number of synapses in the volume (the number n such that n × n × n equals that total). Similarly, if we imagine a ball with 64 equally spaced connected nodes, including nodes in the center, something rather like the ball shown below, then it is clear that the shortest path from any one node at the outer edge of the ball to a node on the opposite side of the ball would require that you traverse a number of nodes that is at least the cube root of 64, which is 4.



So to roughly compute the shortest series of synapses that would have to be traversed for a brain signal to travel through this 1 cubic centimeter volume, we can take the cube root of 77 billion (the number n such that n × n × n = 77 billion). The cube root of 77 billion is about 4254. So it seems that to traverse the shortest path through a volume of 1 cubic centimeter containing 77 billion synapses, traveling a distance of about 1 centimeter, a neural signal would have to pass sequentially through a path containing at least 4000 different synapses (along with other neural elements such as dendrites).

To calculate how long this traversal would take across a 1 cubic centimeter region of the brain, considering only the dominant delay factor of synaptic delays, we can simply multiply this number of 4000 by the synaptic delay (the time needed for the signal to cross a single synaptic gap). Using the smallest estimate of the synaptic delay (the estimate from 1965 of about .5 millisecond), and ignoring the more recent year 2000 estimate of 2.29 milliseconds, this gives us a total time of 4000 multiplied by .5 millisecond. That is a total time of two seconds (2000 milliseconds) for a nerve signal to travel across one centimeter of brain tissue. The nerve signal speed we get from this calculation is less than 1 centimeter per second (about half a centimeter per second).
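The back-of-the-envelope arithmetic above can be written out explicitly. The short Python sketch below simply reproduces the figures used in this post (100 trillion synapses, a 1300 cubic centimeter brain, and the two published synaptic delay estimates); it is rough arithmetic that counts only synaptic delays and ignores every other slowing factor.

```python
# Rough estimate of brain-signal speed when only synaptic delays are counted.
# All inputs are the figures quoted in the text above; this is not a
# physiological simulation, just the arithmetic of the argument.
total_synapses = 100e12      # estimated synapses in a whole brain
brain_volume_cc = 1300       # approximate brain volume in cubic centimeters

synapses_per_cc = total_synapses / brain_volume_cc      # about 77 billion
synapses_per_cm_path = synapses_per_cc ** (1.0 / 3.0)   # cube root, about 4254

for delay_ms in (0.5, 2.29):   # 1965 estimate and year-2000 estimate
    seconds_per_cm = synapses_per_cm_path * delay_ms / 1000.0
    print(f"delay {delay_ms} ms: {seconds_per_cm:.1f} s per cm, "
          f"{1.0 / seconds_per_cm:.2f} cm per second")

# Approximate output:
#   delay 0.5 ms:  2.1 s per cm, 0.47 cm per second
#   delay 2.29 ms: 9.7 s per cm, 0.10 cm per second
```

Rounding the 4254 synapses down to 4000, as the text does, is what yields the half-a-centimeter-per-second figure.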

Take careful note that this speed is more than 10,000 times slower than the "100 meters per second" figure that is given by some experts when they are asked how fast a brain signal travels. Such an expert answer is very misleading, because it gives only the fastest speed at which a nerve signal can travel inside the brain, while it is passing through the fastest tiny parts of the brain (myelinated axons), not the average speed of such a brain signal as it passes through different types of brain tissue and many different synapses. It turns out that because of the "speed bump" of synaptic delays, the average speed of a nerve signal traveling through the brain should be about 20,000 times slower than "100 meters per second" -- a slowpoke speed of about half a centimeter per second. That's half the maximum speed at which a snail can move. If I had used the year 2000 estimate of the synaptic delay (2.29 milliseconds), I would have gotten a speed estimate for brain signals of only about 0.1 centimeters per second, roughly a tenth of the speed of a moving snail.


[Image: slow brain]

This calculation is of the utmost relevance to the question of whether the brain is fast enough to account for extremely rapid human thinking and instantaneous memory retrieval.  Based on what I have discussed, it seems that signal transmission across regions of the brain should be very slow -- way too slow to account for very fast thinking and instantaneous recall and recognition.  

Many a human can calculate as fast as he or she can recall. For example, the Guinness world record web site tells us, "Scott Flansburg of Phoenix, Arizona, USA, correctly added a randomly selected two-digit number (38) to itself 36 times in 15 seconds without the use of a calculator on 27 April 2000 on the set of Guinness World Records in Wembley, UK."  Such speed cannot be explained as the activity of a brain in which signals literally move at a less than a snail's pace. 

To give another example, in 2004 Alexis Lemaire was able to calculate in his head the 13th root of this number:

85,877,066,894,718,045,602,549,144,850,158,599,202,771,247,748,960,878,023,151,390,314,284,284,465,842,798,373,290,242,826,571,823,153,045,030,300,932,591,615,405,929,429,773,640,895,967,991,430,381,763,526,613,357,308,674,592,650,724,521,841,103,664,923,661,204,223

In only 77 seconds, according to the BBC, Lemaire was able to state that the answer is 2396232838850303, the number which when raised to the 13th power equals the number above. Here we have calculation speed far beyond anything that could be possible if calculation is done by a brain in which signals literally move at less than a snail's pace.
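For a sense of the scale of that feat, the arithmetic itself is easy to state in code. The sketch below uses Python's built-in arbitrary-precision integers to raise the answer reported by the BBC to the 13th power; its full 200-digit expansion is what should match the number printed above.

```python
# The 13th-root answer attributed to Lemaire, as reported by the BBC.
reported_root = 2396232838850303

thirteenth_power = reported_root ** 13     # exact big-integer arithmetic
print(len(str(thirteenth_power)))          # 200 -- a 200-digit number
print(str(thirteenth_power)[:18] + "...")  # leading digits, for comparison
                                           # with the number quoted above
```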

In this matter it seems our neuroscientists have acted as if they were afraid to put two and two together. They have measured the speed of brain signal transmission in axons, dendrites and synapses. But I find a curious avoidance in the neuroscience literature of the basic topic of the average time it should take a signal to travel from one region of the brain to another. It's as if our neuroscientists are afraid to do the math which might lead them to the conclusion that signals cannot travel from one random brain region to another nearby region at a rate of more than an inch a second. For if they were to do such math, their claim that brains are the source of our thinking and recall would be debunked.

Echoing part of what I have said here, a textbook says "the cumulative synaptic delay may exceed the propagation time along the axons." But why aren't scientists more explicit, by telling us that this cumulative synaptic delay will actually exceed the propagation time along the axons by a factor of more than 1000, leading to "snail's pace" brain signals? Another source vaguely tells us that "cumulative synaptic delay would affect the speed of information processing at every level of cognitive complexity" without mentioning what a crippling effect this would be if our brains were doing thinking and recall. 

I may note that whenever a neuroscientist answers a question such as "how fast do brain signals travel?" by mentioning only the fastest rate at which a brain signal can travel through the fastest little parts of the brain (myelinated axons), as neuroscientists typically do, such an answer is either deceptive or very clumsy. It's like answering the question "how fast can you travel across Manhattan?" by citing the maximum speed limit on a Manhattan cross-street such as 42nd Street, without considering all the delays caused by traffic lights. Synaptic delays are comparable to traffic light delays, and they are a factor that must be included when realistically considering how fast a brain signal typically travels inside the brain.

It is interesting that both this 1979 scientific paper and this 2008 scientific paper estimate the number of synapses in the human cortex as being about a billion per cubic millimeter, which equals a trillion per cubic centimeter. That is more than ten times greater than the 77 billion per cubic centimeter figure I was using above. The more synapses, the more speed bumps, and the slower the brain signal. If I had done the brain speed calculation specifically for cortex tissue (the supposed center of higher thought), the cube root of a trillion is about 10,000, meaning roughly 10,000 synapses along each centimeter of path; at .5 millisecond per synapse, that works out to about five seconds per centimeter, a brain signal speed of only about 0.2 centimeters per second, slower still than the half a centimeter per second result reached above.
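Under the same rough method, the cortex-specific recalculation looks like this (again a sketch that counts only synaptic delays):

```python
# Same back-of-the-envelope method, using the cortex estimate of
# about one trillion synapses per cubic centimeter.
synapses_per_cc = 1e12
synapses_per_cm_path = round(synapses_per_cc ** (1.0 / 3.0))  # cube root = 10,000
seconds_per_cm = synapses_per_cm_path * 0.5 / 1000.0          # 0.5 ms per synapse

print(seconds_per_cm)        # 5.0 seconds to cross one centimeter
print(1.0 / seconds_per_cm)  # 0.2 centimeters per second
```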

To sum up,  we have several gigantic reasons for thinking that brains must be too slow to account for instantaneous recall:

(1) Finding the exact little spot where a memory was stored would be like finding a needle in a haystack, given the lack of any indexing system or position coordinate system in the brain.
(2) Decoding stored memories from encoded neural states would take additional time that would make neural memory recall much less than instantaneous.
(3) The "snail's pace" speed of brain signals (greatly slowed by synaptic delays) would prevent an instantaneous recall of memories and stored information such as humans often have. 

The slowness of the brain is one of many neuroscience reasons for believing that the brain cannot be the storage place of our memories, and cannot be the source of our thinking and consciousness.  Human mentality must be primarily a psychic or spiritual or non-biological reality rather than a neural reality. 

I can imagine various ways in which a person could try to rebut some of the argumentation in this post. Someone could simply say that we know that signals must travel very fast in a brain, because humans are able to recall things instantly or recognize things instantly. But we do not at all know that recognition or recall are actually effects produced by the brain, and we have good reasons for doubting that they are (such as the short lifetimes of synapse proteins, and the fact that the high noise levels in brains and synapses are incompatible with the fact that humans such as Hamlet actors can flawlessly recall very large bodies of memorized information). So we cannot use the speed of recognition or recall to deduce the speed of brain signals.

Another way you could try to rebut this post would be to cite some expert who estimated how fast signals move about in a brain.  But further analysis would generally show that such an estimate was not derived from a calculation of all the low-level factors (such as synaptic delay) affecting the speed of brain signals, but was simply a calculation based on the assumption that brains must pass about signals at the speed at which humans recognize or recall things or respond to things.  We cannot use such circular reasoning or "begging the question" when considering this matter. The only intelligent way to calculate the speed of a brain signal is to do a calculation based on low-level things (such as synaptic delays) that we definitely know, rather than starting out making grand assumptions about the mind and brain that are unproven and actually discredited by the very low-level facts (such as the length of synaptic delays)  that should be examined. 

Although neuroscientists typically claim that synapses are where memories are stored in the brain, there are four ways in which the characteristics of synapses are telling us that thinking and memory are not brain-caused:

(1) Synapses show no signs of having stored information, and their main structural feature (the disorganized little blob or bag that is the synaptic knob or head) seems like pretty much the last type of structure we'd expect to see in something storing information for decades. 
(2) Synapses are unstable units undergoing spontaneous remodeling, and synapses consist of proteins with average lifetimes of only a few weeks, only a thousandth of the maximum length of time that humans store memories.
(3) Synapses are very noisy, so noisy that one expert tells us that a signal passing through a synapse "makes it across the synapse with a probability like one half, or even less," making synapses unsuitable as reliable transmitters of the memory information that humans such as Wagnerian tenors can recall abundantly with 100% accuracy. Given such noise levels, which would seem to have the effect of rapidly extinguishing brain signals, there is good reason to suspect that it is effectively impossible for brain signals to travel more than a centimeter or an inch without vanishing or dwindling to tiny traces of their original strength.
(4) The most common type of synapse is slow,  and although the synaptic delay in a single synapse is only about a millisecond,  when we calculate the cumulative synaptic delay we find that brain signals must be slower than a snail's pace, way too slow to explain instantaneous recall and fast thinking. 

In fact, if some designer of the human body had specifically designed something to tell us (by its characteristics) that our brains cannot be the source of our fast thinking and instantaneous memory, it's rather hard to imagine anything that would do a better job than our signal-slowing, very noisy, unstable synapses. Our synapses are telling us (by their characteristics) that thinking and memory are not brain-caused, but our neuroscientists (trapped in ideological enclaves of dogma and reigning speech customs) aren't listening to what our synapses are telling us.

Postscript: I may note that you do not get a much faster estimate for the speed of brain signals if you calculate the speed from one neuron to the nearest neuron, rather than the speed through a cubic centimeter. The speed is the same snail's pace I have calculated, because the signal will always have to pass through synapses that are the dominant slowing factor. 

There is an entirely different method you could use to calculate the speed of signals inside the brain, using not estimates of the number of synapses per cubic centimeter, but instead the average distance between neurons. This paper mentions an average distance of about 26 micrometers between neurons in a rat cortex, and it says, "we believe that the parameter of 26 µm [micrometers] average distance between neurons is also a valid assumption in the human brain." I assume that by "average distance between neurons" this source means the average distance between two adjacent neurons. Below are some calculation figures that we get if we use this average distance figure, and we use a synaptic delay estimate that is about the average of the .5 millisecond and 2.29 millisecond estimates quoted above.


Average distance between neurons: 26 micrometers
This distance in centimeters: 0.0026
Synaptic delay: 1 millisecond
Time needed to cross the distance above, considering only the synaptic delay: 0.001 seconds
Total distance that could be traversed by a brain signal in one second: 1000 × 0.0026 centimeters = 2.6 centimeters
Signal speed between adjacent neurons: 2.6 centimeters per second
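The same numbers can be checked in a couple of lines of Python (again a sketch that treats the synaptic delay as the only slowing factor):

```python
# Alternative estimate: one synaptic delay per neuron-to-neuron hop.
distance_between_neurons_cm = 26e-4   # 26 micrometers = 0.0026 centimeters
synaptic_delay_s = 1e-3               # about 1 ms, midway between the quoted estimates

hops_per_second = 1.0 / synaptic_delay_s                       # 1000 hops per second
speed_cm_per_s = hops_per_second * distance_between_neurons_cm
print(speed_cm_per_s)   # about 2.6 centimeters per second (roughly an inch)
```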


Using this method, we get a result in the same ballpark as the result calculated by my first method.  The first method found that brain signals travel at a rate of about .5 centimeters per second, and this method finds that brain signals travel at about 2.6 centimeters per second, which is about an inch per second.  Either way, this speed is way too slow to account for instantaneous recall and very rapid thinking. 

A more realistic estimate of brain signal speed would also take into account two other factors ignored in the calculations above (and also ignored by neuroscientists when discussing the speed of brain signals):
(1) The noise in synapses, and the fact that in the cortex, signal transmission across synapses is highly unreliable. A scientific paper says, "In the cortex, individual synapses seem to be extremely unreliable: the probability of transmitter release in response to a single action potential can be as low as 0.1 or lower."  Considered over a large section of brain tissue, this unreliability would be equivalent to a big additional slowing factor, and might well lead to speed estimates much lower than I have made here. 
(2) Synaptic fatigue,  a temporary inability of the head or vesicle of a synapse to send a signal, because of a depletion of neurotransmitters.  It's hard to find any specific number regarding synaptic fatigue, but it could be a very large additional slowing factor. Referring to synaptic fatigue, one paper states the following:

By contrast, following neurotransmission, synaptic vesicle membranes are internalized within seconds, and the recycled synaptic vesicles can be reloaded with neurotransmitter within 1–2 minutes.

It sounds as if synapses, like penises, are things that can't keep firing continuously without substantial rest periods. This sounds like what could be a very large additional slowing factor, making it all the more unlikely that brain signals inside the cortex can regularly travel about at much more than about a centimeter per second. 

Sunday, February 24, 2019

"Brains Store Memories" Dogma Versus the Reality of Noisy Brains

Neuroscientists typically maintain that human mental phenomena are entirely produced by the brain. But this claim is inconsistent with many low-level facts that neuroscientists have discovered. Remarkably, the facts and details that neuroscientists have learned on a low level frequently contradict the dogmatic high-level assertions neuroscientists make.

The pairings below summarize this conflict, matching each high-level claim with the low-level facts that contradict it.


High-level claim: "Brains produce thinking."
Low-level facts discovered by neuroscientists:
- Human cognitive ability and memory are not strongly damaged by hemispherectomy operations in which half of a brain is removed to treat epilepsy seizures.
- Most of Lorber's hydrocephalus patients with brains mostly consisting of watery fluid had above-average intelligence, and a Frenchman was able to long hold a civil service job while almost all of his brain was gone.
- Brain scans do not show brains working significantly harder during either heavy thinking or recall, and no signal change greater than 1% occurs during such activities.

High-level claim: "When we do accurate mental calculations, it is our neurons that are doing the work."
Low-level facts discovered by neuroscientists:
- Neurons are noisy, and synapses transmit signals with only a 50% likelihood or less -- the type of thing that should prevent accurate mental arithmetic of the kind savants can perform.

High-level claim: "Our memories are stored in our brains."
Low-level facts discovered by neuroscientists:
- Neurons and synapses have been extensively examined at very high microscopic resolutions, and no sign of stored information or encoded information has been found in them other than the gene information in DNA.
- There is high protein turnover in the synapses that neuroscientists claim to be the storage place of memories, and the average lifetime of the proteins that make up synapses is only a few weeks -- only a thousandth of the lifespan of very old memories in old people.
- There seems to be nothing in the human brain resembling the write mechanism we see in storage systems such as computers.

High-level claim: "When we remember, we read data from our brains."
Low-level facts discovered by neuroscientists:
- There seems to be nothing in the human brain resembling the read mechanism we see in storage systems such as computers.
- There is in the human brain no position coordinate system, no indexing, no neuron numbering system, nor anything else that would seem to make possible an instantaneous recall of information from some very precise location in a brain, in a manner similar to retrieving data from a particular page of a particular book.
- Although we would expect information to be reliably transmitted across neurons during precise and accurate human recall, neurons are actually quite noisy, and transmit signals with only low reliability.
- Synaptic density studies show that the density of synapses in brains drops strongly between puberty and adulthood, at the very time when learned knowledge is piling up.

By following the links above, you can read detailed discussions of the low-level facts listed above – except for my claims about neurons being very noisy, which I will justify in this post.

When we talk about the noise in a communication system, we can imagine this as a kind of static that prevents the transmission from occurring without errors. A young reader may not even know what static is, since nowadays digital communication occurs with very little noise. But I experienced static frequently in my youth, back in the days long before the internet. One type of static would occur when I listened to the radio. When I tuned in to a radio station too far away, the radio signal would be mixed with a crackling noise or static that might prevent me from hearing particular words or musical notes in the transmission. In my youth there was also a problem with television noise or static. On top of a TV set there would be an antenna, and if it wasn't pointing just right, a TV signal might be rather noisy. The noise might be of a visual type, with random little blips appearing on the TV screen. Sometimes the static would be so bad you couldn't see much of anything on the TV you recognized.

The table below illustrates an example of noise in a signal transmission system.


Low-noise system
Input: “Toto, I've a feeling we're not in Kansas anymore.”
Output: “Toto, I've a feeling we're not in Kansas anymore.”

High-noise system
Input: “Toto, I've a feeling we're not in Kansas anymore.”
Output: “Tojo, I've a f2@eling we're Xot in K3$sas anymore.”
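The “high-noise system” row can be mimicked with a few lines of code. The sketch below randomly corrupts a fraction of the characters in a message, purely to illustrate what noise does to a transmission; the corruption rate and the message are arbitrary choices of mine, not a model of any actual channel.

```python
import random

def noisy_channel(message, error_rate=0.1):
    # Return the message with roughly error_rate of its characters corrupted,
    # imitating the "high-noise system" row in the table above.
    alphabet = "abcdefghijklmnopqrstuvwxyz0123456789@#$%"
    output = []
    for ch in message:
        if random.random() < error_rate:
            output.append(random.choice(alphabet))
        else:
            output.append(ch)
    return "".join(output)

print(noisy_channel("Toto, I've a feeling we're not in Kansas anymore."))
# e.g. "Tojo, I've a f2@eling we're Xot in K3$sas anymore." (output varies per run)
```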

A neuron acts as an electrical/chemical signal transmitter. A neuron will receive an electrical/chemical input, and transmit an electrical/chemical output. But a neuron does not act as efficiently and reliably as a cable TV wire or a computer cable that transmits signals with a very low error rate. Neuroscientists know that a large amount of noise occurs when neurons transmit signals. In other words, when a neuron receives a particular electrical/chemical input signal, there is a very significant amount of chance and variability involved in what type of electrical/chemical output will come out of the neuron. The wikipedia.org article on “neuronal noise” identifies many different types of noise that might degrade neuron performance: thermal noise, ionic conductance noise, ion pump noise, ion channel shot noise, synaptic release noise, synaptic bombardment, and connectivity noise.

In a very recent interview, an expert on neuron noise states the following:

There is, for example, unreliable synaptic transmission. This is something that an engineer would not normally build into a system. When one neuron is active, and a signal runs down the axon, that signal is not guaranteed to actually reach the next neuron. It makes it across the synapse with a probability like one half, or even less. This introduces a lot of noise into the system.

So according to this expert, synapses (the supposed storage place of human memories) transmit signals with a probability of about 50 percent or less. Now that's very heavy noise – the kind of noise you would have if half of the characters in your text messages got scrambled by your cell phone carrier. A scientific paper tells us the same thing. It states, "Several recent studies have documented the unreliability of central nervous system synapses: typically, a postsynaptic response is produced less than half of the time when a presynaptic nerve impulse arrives at a synapse." Another scientific paper says, "In the cortex, individual synapses seem to be extremely unreliable: the probability of transmitter release in response to a single action potential can be as low as 0.1 or lower."

Another scientific paper tells us, “Neuronal variability (both in and across trials) can exhibit statistical characteristics (such as the mean and variance) that match those of random processes.” Another scientific paper tells us that “Neural activity in the mammalian brain is notoriously variable/noisy over time.” Another paper tells us, "We have confirmed that synaptic transmission at excitatory synapses is generally quite unreliable, with failure rates usually in excess of 0.5 [50%]."

This is a problem for all claims that memories are retrieved from brains, because humans are known to be able to remember things very accurately, but “neural noise limits the fidelity of representations in the brain,” as a scientific paper tells us.

Now, a neuroscientist might claim that such facts can still be reconciled with the mental performance of humans. He might argue like this:

Yes, neurons are pretty slow and noisy, but that's why human memory is slow and unreliable. Think of how it works when you suddenly see some old schoolmate that you haven't seen in twenty years. It may be a while before you remember their name. And when you remember something about that person, your memory will probably be not terribly accurate. So you have a kind of a slow “noisy” memory.

But it is easy to come up with examples of human memory performing without error in a noiseless manner. I just closed my eyes and recited the following lines without any error at a rate faster than you can read these lines aloud:

I am the very model of a modern Major-General
I've information vegetable, animal, and mineral
I know the kings of England, and I quote the fights historical
From Marathon to Waterloo, in order categorical

I'm very well acquainted, too, with matters mathematical
I understand equations, both the simple and quadratical
About binomial theorem I'm teeming with a lot o' news
With many cheerful facts about the square of the hypotenuse

But that's not very impressive, for there are singers who can flawlessly sing without any errors, at a very rapid pace, the entire delightful song “I Am the Very Model of a Modern Major General” from Gilbert and Sullivan's “The Pirates of Penzance,” and the song is about eight times longer than what I have quoted. Also, in the world of opera there are singers who can flawlessly sing every note and every word of the part of Hans Sachs in Wagner's four-hour opera Die Meistersinger von Nurnberg, an opera in which Hans is on stage singing for a large fraction of those four hours. There are other singers who can flawlessly sing the title role in the opera Siegfried, which requires the lead singer to sing on stage for most of its three hours. There are other singers who can flawlessly sing the role of Tristan, which makes a similar demand. In such cases we have a very rapid and flawless error-free retrieval of an amount of information that would take many, many pages to write down.

A rock singer at a funky free-wheeling concert might get away with an error rate of 2% in his memory recall of words, but opera fans are very intolerant of errors. When Wagner fans (who have typically heard an opera many times on recordings) go to something like the Bayreuth festival, they expect singers to recall Wagner's notes and words with 100% fidelity, and that is what they usually get, even when hearing roles such as Tristan and Siegfried which require a singer to memorize hours of singing.  Every time an actor performs Hamlet, he recites 1480 lines of dialog, and many such actors recall all such lines without any errors. 


[Image: neuron noise]

Then there is Leslie Lemke, who according to this article in wikipedia.org "can remember and play back a musical piece of any length flawlessly after hearing it once." It is well documented that there are quite a few Muslims who can recite the entire holy book of their religion, a book of some 80,000 words. Then there are people who flawlessly remember content that is hard to remember. According to the site of Guinness World Records, Rajveer Meena memorized pi to 70,000 digits, reciting those 70,000 digits without any errors. Lu Chao memorized pi to 67,000 digits. A 1917 scientific paper stated that one or more people had accurately "memorized the exact layout of words in more than 5,000 pages of the 12 books of the standard edition of the Babylonian Talmud."

How could such feats occur if memory retrieval is being performed by neurons and synapses that are very noisy? They could not. In these cases, human memory is acting at a reliability vastly surpassing what should be possible if memory retrieval or thought is a neural phenomenon. A scientific paper states, "Neural noise limits the fidelity of representations in the brain." But humans such as those I have mentioned seem to be able to recall huge amounts of learned text or song without any such degradation of "fidelity of representations."

A similar conclusion is forced on us when we consider the accuracy of the most impressive human calculators. In 2004 Alexis Lemaire was able to calculate in his head the 13th root of this number:

85,877,066,894,718,045,602,549,144,850,158,599,202,771,247,748,960,878,023,151,390,314,284,284,465,842,798,373,290,242,826,571,823,153,045,030,300,932,591,615,405,929,429,773,640,895,967,991,430,381,763,526,613,357,308,674,592,650,724,521,841,103,664,923,661,204,223

In only 77 seconds, according to the BBC, Lemaire was able to state that the answer is 2396232838850303, the number which when raised to the 13th power equals the number above. Here we have calculation accuracy far beyond anything that could be possible if noisy neurons are the source of human thought.

Given the high amount of noise in neurons and synapses, which would strongly degrade the accuracy of neural memory retrieval and neural signal transmission, the facts of very accurate human calculation and very accurate human memory recall (as shown by calculation savants, Hamlet actors, and Wagnerian opera singers) are very much in conflict with the dogmas that our thinking is performed by our brains and our memories are stored in and retrieved by our brains.  This is yet another case in which the low-level facts of neuroscience defy the dogmatic claims of neuroscientists. 

Think for a moment about the implications if a synapse can only transmit a signal with about a 50% reliability, as indicated by the previously quoted expert on neuron noise. This does not at all mean that people would recall things with about 50% accuracy if memories are stored in brains; it's much worse than that. Since any act of neural memory retrieval would involve innumerable different signal transmissions through innumerable neurons, we would expect the actual accuracy to be only some tiny fraction of 50% if we were using synapses to retrieve our learned knowledge. Similarly, if you play the game "Chinese whispers" (also called "gossip") at a school lunch table, and have everyone at the table play noisy music in their earphones as the story is whispered among the players, the tenth person to receive the story will be unlikely to receive even 20 percent of it accurately.
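The point about compounding can be made quantitative. If each synaptic crossing succeeds independently with some probability, the chance that a signal survives a chain of crossings falls off exponentially. The sketch below uses the 0.5 and 0.1 reliability figures quoted earlier in this post; the chain lengths are arbitrary illustrative values, and independence is an assumption of the sketch.

```python
# Probability that a signal survives a chain of unreliable synapses,
# assuming each crossing succeeds independently of the others.
for p_single in (0.5, 0.1):            # per-synapse reliabilities quoted above
    for chain_length in (5, 10, 20):   # illustrative numbers of crossings
        p_chain = p_single ** chain_length
        print(f"per-synapse {p_single}: after {chain_length} synapses, "
              f"survival probability {p_chain:.2e}")

# With 0.5 per synapse, ten crossings already leave less than a 0.1% chance
# that the signal gets through at all.
```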

Let us imagine a planet in which the sky was perpetually covered in very thick clouds, so that no one had seen the stars or the local sun.  On such a planet there would be a great mystery: from where comes the heat that keeps life on the planet warm? If you were a rather clumsy thinker on such a planet, you might come up with some cheesy theory to explain the heat on your planet, and dogmatically cling to it -- maybe the theory that rocks on your planet warm the planet through radioactivity, or that heat shoots up from the hot core of the planet. But if you were a better thinker, you would say, "There is nothing anyone has observed that can explain this planet's heat -- it must come from some mysterious unseen reality."  It is something similar that we should say about our mental capabilities: that nothing we have observed can explain them, and that they must come mainly from some mysterious unseen reality. 

Postscript:  It is sometimes suggested that by transmission redundancy we can escape the consequences of unreliable and noisy synaptic transmission (in which signals may travel across a synapse only 50% of the time or as little as 10% of the time). But this paper makes clear that in the cortex of the brain there is little such redundancy. It states the following:

In the cortex, individual synapses seem to be extremely unreliable: the probability of transmitter release in response to a single action potential can be as low as 0.1 or lower. In other words, as many as nine out of ten presynaptic stimuli fail to trigger transmitter release. The critical difference between these cortical connections and those at the neuromuscular junction is that, in the cortex, the synaptic connection between a pair of cells is often made up of only a few release sites, sometimes only one [6][7]. In the cortex, then, the postsynaptic response to a single presynaptic action potential is highly variable, because it is the average over a small and unreliable population.... In the periphery [of the brain], reliability is achieved by averaging over many release sites. In the cortex, rich interconnectivity within a restricted volume limits the possible number of such redundant connections.

Monday, January 7, 2019

Memories Can Form Many Times Faster Than the Speed of Synapse Strengthening

The main theory of a brain storage of memories is that people acquire new memories through a strengthening of synapses. There are many reasons for doubting this claim. One is that information is generally stored through a writing process, not a strengthening process. It seems that there has never been a verified case of any information being stored through a process of strengthening.

If it were true that memories were stored by a strengthening of synapses, this would be a slow process. The only way in which a synapse can be strengthened is if proteins are added to it. We know that the synthesis of new proteins is a rather slow effect, requiring minutes of time. In addition, there would have to be some very complicated encoding going on if a memory was to be stored in synapses. The reality of newly-learned knowledge and new experience would somehow have to be encoded or translated into some brain state that would store this information. When we add up the time needed for this protein synthesis and the time needed for this encoding, we find that the theory of memory storage in brain synapses predicts that the acquisition of new memories should be a very slow affair, which can occur at only a tiny bandwidth, a speed which is like a mere trickle. But experiments show that we can actually acquire new memories at a speed more than 1000 times greater than such a tiny trickle.

One such experiment is the experiment described in the scientific paper “Visual long-term memory has a massive storage capacity for object details.” The experimenters showed some subjects 2500 images over the course of five and a half hours, and the subjects viewed each image for only three seconds. Then the subjects were tested in the following way described by the paper:

Afterward, they were shown pairs of images and indicated which of the two they had seen. The previously viewed item could be paired with either an object from a novel category, an object of the same basic-level category, or the same object in a different state or pose. Performance in each of these conditions was remarkably high  (92%, 88%, and 87%, respectively), suggesting that participants successfully maintained detailed representations of thousands of images.

In this experiment, pairs like those shown below were used. A subject might be presented for 3 seconds with one of the two images in the pair, and then hours later be shown both images in the pair, and be asked which of the two was the one he saw.



Although the authors probably did not intend for their experiment to be any such thing, their experiment is a great experiment to disprove the prevailing dogma about memory storage in the brain. Let us imagine that memories were being stored in the brain by a process of synapse strengthening. Each time a memory was stored, it would involve the synthesis of new proteins (requiring minutes), and also additional time (presumably additional minutes) for an encoding effect in which knowledge or experience was translated into neural states. If the brain stored memories in such a way, it could not possibly keep up with remembering images that appeared for only three seconds each in a long series. It would be a state of affairs like that depicted in what many regard as the funniest scene that appeared in the “I Love Lucy” TV series, the scene in which Lucy and her friend Ethel were working on a confection assembly line. In that scene Lucy and Ethel were supposed to wrap chocolates that were moving along a conveyor belt. But while the chocolates moved slowly at first, the conveyor belt kept speeding up faster and faster, totally exceeding Lucy and Ethel's ability to wrap the chocolates (with ensuing hilarious results).


The experiment described above in effect creates a kind of fast moving conveyor belt in which images fly by at a speed so fast that it should totally defeat a person's ability to memorize accurately – if our memories were actually being created through the slow process imagined by scientists, in which each memory requires a protein synthesis requiring minutes, and an additional time (probably additional minutes) needed for encoding. But nonetheless the subjects did extraordinarily well in this test.
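The mismatch can be put in numbers. The sketch below compares the rate at which new images arrived in the study (2,500 images over five and a half hours) with an assumed memory-formation time of a few minutes per item, the figure suggested above for protein synthesis plus encoding; the "five minutes" value is an assumption of this post's argument, not a measurement from the paper.

```python
# Rate of image presentation in the experiment versus an assumed
# synapse-strengthening time of a few minutes per stored memory.
images = 2500
session_seconds = 5.5 * 3600                  # five and a half hours

seconds_per_image = session_seconds / images  # about 7.9 seconds between images
assumed_storage_minutes = 5                   # assumption: protein synthesis + encoding
storage_seconds = assumed_storage_minutes * 60

print(round(seconds_per_image, 1))                 # about 8 seconds per new image
print(round(storage_seconds / seconds_per_image))  # storage would lag about 38x behind
```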

There is only one conclusion we can draw from such an experiment. It is that the bandwidth of human memory acquisition is vastly greater than anything that can be accounted for by neural theories of memory storage. We do not remember at the speed of synapse strengthening, which proceeds at a snail's pace comparable to the speed of arm muscle strengthening. We instead are able to form new memories in a manner that is basically instantaneous. The authors of the scientific paper state that their results “pose a challenge to neural models of memory storage and retrieval.” That is an understatement, for we could say that their results are shockingly inconsistent with prevailing dogmas about how memories are stored.

There are some people who are able to acquire new memories at an astonishing rate. The autistic savant Kim Peek was able to recall everything he had read in the more than 7000 books he had read. Here we had a case in which memorization occurred at the speed of reading. Stephen Wiltshire is an autistic savant who has produced incredibly detailed and accurate artistic works depicting cities that he has seen only from a brief helicopter ride or boat ride. Of Wiltshire, savant expert Darold Treffert says, "His extraordinary memory is illustrated in a documentary film clip, when, after a 12-minute helicopter ride over London, he completes, in 3 hours, an impeccably accurate sketch that encompasses 4 square miles, 12 major landmarks and 200 other buildings all drawn to scale and perspective." Again, we have a case in which memories seem to be formed at an incredibly fast rate. Savant Daniel Tammet (who once publicly recited the value of pi accurately to 22,514 digits) was able to learn the Icelandic language in only 7 days. Derek Paravicini is a blind and brain-damaged autistic savant who has the incredible ability to replay any piece of music after hearing it for the first time. In 2007 the Guardian reported the following:

Derek is 27, blind, has severe learning difficulties, cannot dress or feed himself - but play him a song once, and he will not only memorize it instantly, but be able to reproduce it exactly on the piano. One part of his brain is wrecked; another has a capacity most of us can only dream of.

Other savants such as Leslie Lemke and Ellen Boudreaux have the same extraordinary ability to replay perfectly a song heard for the first time. 

Cases such as these are inconsistent with prevailing theories of memory. Are we to believe that such people (typically with substantial brain damage) can somehow synthesize proteins in their brains ten times or thirty times faster than the average human, so that their synapses can get bulked up ten times or thirty times faster? That's hardly credible. But if memories are not actually stored in brains, but stored in or added to a human psychic or spiritual facility, something like a soul, then there would be no reason why the brain-damaged might not have astonishing powers of memorization.

Some people can form memories 1000 times faster than should be possible under prevailing theories of brain memory storage, which postulate protein synthesis and encoding operations that should take minutes. This thousand-fold shortfall is only one of three thousand-fold shortfalls of the prevailing theory of brain memory storage. The two other shortfalls are: (1) humans can remember things for 50 years or more, which is 1000 times longer than the synaptic theory of memory storage can account for (synapses having average protein lifetimes of only a few weeks); (2) humans can recall things 1000 times faster than should be possible if you stored something in some exact location of the brain. If you stored a memory in your brain (an organ with no numbering system or coordinate system), it would be like throwing a needle onto a mountain-sized heap of needles, in the sense that finding that exact needle at some later point should take a very long time.

The imaginary conversation below illustrates some of the many ways in which prevailing dogma about brain memory storage fails. It's the kind of conversation that might occur if memories were formed according to the "brain storage of memory" dogmas that currently prevail among neuroscientists. 

Costello: Alright, guy, I'm now going to teach you an important geographical fact: which city is the capital city of Spain.
Abbott: Go ahead, I'm all ears.
Costello: Okay, here it is. The capital city of Spain is Madrid.
Abbott: Okay, I'll try to remember that.
Costello: So what is the capital city of Spain?
Abbott: I haven't formed the memory of that yet. It takes time. I'm still synthesizing the proteins I need to strengthen my synapses, so I can remember that.
Costello: So try hard. Remember, Madrid is the capital city of Spain.
Abbott: I'm working on forming the memory.
Costello: So do you remember by now what the capital city of Spain is?
Abbott: Don't ask me too soon. It takes minutes to synthesize those proteins.

After five additional minutes like this, the conversation continues.

Costello: Okay, so it's been five minutes since I first told you what the capital city of Spain is. You should have had enough time to have formed your memory of this fact.
Abbott: I'm sure by now I have formed that memory, because there has been enough time for protein synthesis in my synapses.
Costello: So what is the capital city of Spain?
Abbott: I can't recall.
Costello: But you formed the memory by now. Why can't you recall it?
Abbott: The problem is that I don't know exactly where in my brain the memory was stored. So I can't just instantly recall the memory. The memory is like a tiny needle in a haystack. There's no way I can find that quickly.
Costello: Can't you just search through all the memories in your brain, looking for this one?
Abbott: I could try, but it would take hours or days to search through all those memories.
Costello: Sheesh, this is driving me crazy. How about this? I can teach you that Madrid is the capital city of Spain, and when you form the memory, you can tell me the exact tiny spot where your memory was formed. So maybe you'll tell me, “Okay I stored that memory at brain neuron number 273,835,235.” Then I'll just say to you something like, “Please look in your brain at neuron number 273,835,235, and retrieve the memory you stored of what is the capital city of Spain.”
Abbott: That's a brilliant idea!
Costello: Thanks.
Abbott: On second thought, it will never work.
Costello: Why not?
Abbott: Neurons aren't numbered, and the brain has no coordinate system. It's like some vast city in which none of the streets are named, and none of the houses have house numbers. So if I put a memory in one little “house” in the huge brain city, I'll never be able to tell you the exact address of that house.
Costello: So how the hell am I supposed to teach you anything?
Abbott: Beats me. And if I ever learn anything new, I'm sure I won't remember it for more than a few weeks. That's because there's a big problem with those proteins that I will synthesize to store those new memories. They have average lifetimes of only a few weeks.

As long as they cling to “brain storage of memory” dogmas, our neuroscientists will never be able to overcome difficulties such as those mentioned in this conversation.