Tuesday, September 21, 2021

They Wrote As They Would Write If Brains Don't Store Memories

Recently www.gizmodo.com asked a set of brain experts and technologists the question "Will it be possible to upload information to my brain?" The answer that should be given is: no, it never will be, because brains do not store memories and do not store learned information. None of the respondents gives us this answer. But the answers we do get are just the type of answers we would expect if (a) brains do not store memories or learned information, and (b) there were an unwarranted dogma popular among neuroscientists that brains store memories and learned information.

In such a case, we would expect the experts to go around in circles, failing to mention any specific way in which information could be uploaded to the brain; and we would expect them to say things such as "we need to learn much more before we can do this," just as if they had no real idea how anyone could upload information to the brain. We might also expect the experts to give us "red herring" distractions, referring to little things that have been done relating to the brain and technology, things that are not at all the uploading of information or memories into brains. That is just what happens.

The first brain expert (Michael Beyeler) answers, "I think the prospect of augmenting our senses and our intellect with a brain device is certainly within our reach." But that was not the question he was asked; the question was whether information could be uploaded into the brain. He then states the following, making a misleading claim often made by neuroscientists:

"However, the biggest challenge I see is that our understanding of the brain is simply not good enough to make brain uploads viable. We need to better understand how information is stored and accessed in the brain." 

The second sentence is misleading because it implies that there is some current understanding of how learned information is stored and accessed in the brain. There is no such understanding at all. No one has any detailed credible theory of how a brain could store and retrieve learned information. What we have learned about the brain suggests that it is totally unsuitable for such a task. There is no sign of any write mechanism in the brain, and no sign of any read mechanism in the brain. The synapses of the brain are places of constant molecular turnover, with the proteins that make up synapses having average lifetimes of less than two weeks. No scientist has ever read information from a dead brain or from tissue extracted from a living organism, other than the genetic information that exists in all cells of the body. Therefore, statements such as "we need to better understand how information is stored and accessed in the brain" are misleading, because they imply that we have partial knowledge of such a thing, when no such partial understanding exists. Such statements are like someone saying, "We need a better understanding of how extraterrestrials killed John Kennedy."

Next we hear from Rajesh P. N. Rao, who claims that some sending of information into brains has already occurred; but what he is talking about is not at all uploading information to brains, merely sending signals into a brain. Then, committing an error just like the one described in the previous paragraph, Rao says that "uploading more complex information into a brain will require advances in at least three areas," including "a deeper understanding of how abstract information is processed and stored in the brain." Since there does not currently exist any understanding at all of how abstract information is stored or could be stored in the brain, it is misleading to call for a "deeper understanding" of such a thing, a phrase incorrectly implying that there is currently some understanding of it. Again, we have a statement that is like saying, "We need a better understanding of how extraterrestrials killed John Kennedy."

We then hear from Spencer LaVere Smith, who gives us the same confession that we do not know how to upload information into brains. Smith at least avoids the previously discussed error, saying this:

"Uploading expertise in a new language or a detailed memory—that won’t be possible anytime soon, for two reasons: (1) our technologies for manipulating neural circuitry are too crude, and (2) our understanding of what to manipulate and how is too primitive."

Smith rather gives away that neuroscientists have not the slightest idea of how to upload information into the brain, by referring to a million-year timeframe for accomplishing such a task.

We next hear from Andrew Maynard, who speaks as if uploading information into your brain is not something that will occur in the lifetime of anyone living, and who says that "we almost certainly shouldn't" do such a thing. Then Kevin Warwick states, "As for downloading things like memories (which you haven’t actually had) into the brain, I can’t see any reason why this will not be possible in the future, but to do that we need to learn a lot more about how memories are stored and the process of recall." Again, we have a misleading insinuation that something is now known about memory storage in a brain. We have no such understanding at all.

We then hear from Dong Song, who states the following:

"First, I think this is definitely something theoretically possible. The common understanding in the scientific community is that information is stored in the brain in the form of synaptic weights and/or neural activities, and that these can be altered externally in many different ways, including via brain-machine interface. If they are altered in the right way, information will then be uploaded into the brain."

There is no such "common understanding" about synapses being the storage place of memories, and the phrase "synaptic weights and/or neural activities" itself betrays the lack of any such understanding (you would not use "and/or" followed by a vague phrase if there were a real understanding of synapses storing memories). There is merely a senseless speech custom of claiming that memories are stored in synapses. Such a custom makes no sense because:

(1) no one has any credible detailed theory of how information could be stored through an alteration of weights, and we know of no one who has ever stored any complex information by altering weights;  

(2) we know that humans can instantly form permanent new memories, something that would not be possible if memory storage involved an alteration of weights that would take at least several minutes;

(3) we know that the average lifetimes of proteins in synapses are only a few weeks or less, which is only about a thousandth of the length of time (50 years or more) that humans can remember things (a quick check of this ratio appears after this list);

(4) we know that synapses typically last for relatively short times, because synapses are physically associated with dendritic spines, almost all of which last much less than a year.
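
As a sanity check on the ratio mentioned in point (3), here is a trivial calculation, a sketch using only the round figures given above (a two-week protein lifetime and a 50-year memory span):

```python
# Rough check: how does a two-week synaptic protein lifetime compare
# with a 50-year human memory span? (Round figures from the text above.)

weeks_per_year = 52.18                     # average, counting leap years
memory_span_weeks = 50 * weeks_per_year    # about 2,609 weeks
protein_lifetime_weeks = 2

ratio = protein_lifetime_weeks / memory_span_weeks
print(f"{ratio:.5f}")                      # about 0.00077, roughly a thousandth
```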

Song's claim that "information is stored in the brain in the form of synaptic weights and/or neural activities" suggests a lack of any real knowledge on this topic, just as you would reveal a lack of any clear knowledge of who killed John Kennedy by saying that he was killed by "Oswald and/or some murky conspiracy."

"Synaptic strengthening" is the kind of jargon droplet that neuroscientists spit out when asked about neural memory storage, to try to make us think they have some understanding of such a topic. There is no detailed theory behind such an empty phrase, and the phrase is as empty as the vague empty phrase "cellular reconfiguration."  When asked about how a brain could instantly recall a memory, neuroscientists don't even have any jargon droplets to spit out.  The brain has no sign of repeated tokens used for memory storage, no sign of any stored images, no sign of a coordinate system or position notation system, and no sign of any indexes. So the brain is like some book with no letters, no characters, no photos and no pictures, without any page numbers, and without any index. Just as such a book would have no resemblance to an object for instantly retrieving information on a topic, the brain bears no resemblance to a device for instantly retrieving a memory such as humans are able to do. 

We then hear from Gopala Krishna Anumanchipalli, who says this: "It is not inconceivable that one day, we could 'upload' more complex information like a new skill or delete a traumatic episode from memory." But he says nothing to suggest any idea of how such a thing could be done.

We then hear from William Eugene Bishop, who makes the same misleading insinuation as others by saying, "Our knowledge about the code for representing information and how that code is persistently stored in the brain—things that will come down to the level of individual neurons and how they are connected—is very limited." Again we have the insinuation that some knowledge of such a thing exists; no such knowledge actually exists. After incorrectly referring to "our knowledge about the code for representing information and how that code is persistently stored in the brain," something that does not actually exist to any degree (except for the genetic information common to all cells), Bishop states, "while we are surely many years, likely decades, away from systems that could be routinely used to upload information to our brain, it seems likely that one day this will be possible," without doing anything to justify such a claim. The fact that such a feat is predicted to occur only decades in the future gives away that the speaker has no understanding of how it could be done.

Finally Joshua R. Smith states, "I find it much harder to imagine that one could ever successfully generate in the brain higher level cognitive input in the brain, such as words or thoughts, or even sophisticated visual information at the level of readable text." 

The answers the experts gave are just what we would expect if (a) brains do not store memories and do not store learned information, and (b) there were a groundless dogma popular among neuroscientists that brains store memories and learned information. Just as expected in such a case, we hear the experts go around in circles, failing to mention any specific way in which information could be uploaded to the brain; and we mainly hear the experts say things such as "we need to learn much more before we can do this," just as if they had no real idea how anyone could upload information to the brain.

The type of responses given are like the responses you might get if there were some experts calling themselves "cognitive podiatrists" who believed that memories are stored in the feet, and you asked them, "When will we be able to upload memories to people's feet?"  Such experts might talk about this or that little experiment done with feet to try to create the impression that they are on the right track, and then they might say things like "we need to know a lot more about how feet store your memories before memories can be uploaded into feet."

Thursday, September 2, 2021

The Inaccuracy of Electronic or Mechanical Metaphors for the Brain

Humans love to make metaphorical comparisons for things in biology, but most of these metaphors give the wrong idea. Often an organism or one of its organs is compared to something that humans made. But you vastly underestimate what a wonder a large organism is when you compare it to some mechanical device humans made, because human mechanical devices don't make copies of themselves. There is no airplane that splits itself into two working airplanes, and no car that reproduces itself. So every large organism is something vastly more impressive than anything humans have made.

Scientists like to compare the brain to some work of human invention. The most common metaphor is one in which the brain is compared to a computer.  But this is not a correct comparison. For one thing, computers are controlled by software. We know of nothing in the brain that is equivalent to software. For another thing, computers have information storage devices unlike anything in the brain. 

Consider a computer with a hard drive. It is a stable data storage system in which newly acquired information can be permanently stored for many years. It includes a read mechanism and a write mechanism, such as a read/write head that can be positioned to read or write at any location on a storage disk. It also includes an addressing system, allowing data to be stored at some exact location on the storage device and allowing data to be very quickly read from some other exact location.
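
To make that contrast concrete, here is a minimal Python sketch (the file name is hypothetical) of what "write mechanism, read mechanism, and addressing" amount to on a computer: bytes can be written at an exact numeric address on a storage device and later read back from that same address.

```python
# A minimal sketch of addressed read/write storage, the kind of thing
# a hard drive (plus operating system) provides. The file name is hypothetical.

ADDRESS = 4096  # an exact byte offset where we choose to store the data

with open("storage.bin", "wb") as disk:
    disk.seek(ADDRESS)            # position the "write head" at the address
    disk.write(b"my new memory")  # write mechanism: store bytes there

with open("storage.bin", "rb") as disk:
    disk.seek(ADDRESS)            # position the "read head" at the same address
    print(disk.read(13))          # read mechanism: prints b'my new memory'
```

Nothing resembling that seek-to-address step has ever been identified in the brain.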

The brain has nothing like any of these things. We know of neither a read mechanism nor a write mechanism in the brain. The brain seems to have no place where learned information could be permanently stored for many years, or even a single year. The most common claim about neural storage of memory is that memory is stored in synapses. But the proteins in synapses have average lifetimes of only two weeks or less, only about a thousandth of the maximum length of time that humans can remember things.

Because your computer has a filing system, you can add a named file to some particular directory on your computer. The brain has nothing equivalent to files. Because brains completely lack any coordinate system or position notation system, if you stored something in your brain you would never be able to quickly find it. Writing to the brain would be like throwing an index card into a swimming pool filled with index cards. Under such a system there is no way to quickly retrieve some exact piece of information you previously wrote. Since none of the locations of the brain have any addresses or coordinates, you could never retrieve something from the brain by doing anything like saying, "Okay, let me retrieve what I stored at neural address #73428234." No such addresses exist.
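
The difference can be sketched in a few lines (the names and contents below are hypothetical, for illustration only): with an index, retrieval is a single direct lookup; without one, the only option is to scan everything, like fishing one card out of that swimming pool.

```python
# Retrieval with an index (a filing system) versus without one.
# Names and contents here are hypothetical, for illustration only.

# With an index: one direct lookup by name/address.
indexed_store = {"/memories/paris_trip.txt": "We saw the Eiffel Tower."}
print(indexed_store["/memories/paris_trip.txt"])   # instant retrieval

# Without an index: an unlabeled pile, like index cards in a swimming pool.
unindexed_pile = ["a grocery list", "We saw the Eiffel Tower.", "a phone number"]
for card in unindexed_pile:        # nothing to do but examine every card
    if "Eiffel" in card:
        print(card)
        break
```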

Another reason why the "brain as computer" metaphor is inappropriate is that humans have lives, consciousness, and experience, a "life flow" that computers don't have. And contrary to the misleading term "artificial intelligence," computers don't actually understand anything (although they can process information). So you cannot explain your mind by saying it's caused by a computer inside your skull. Your brain bears virtually no resemblance to a computer.

An alternative idea was presented long ago, before computers were invented. William James wrote an 1898 book in which he wrongly asked us to assume that "thought is a function of the brain," something for which there was no good evidence either in his time or in ours. He then presented a theory that imagined the brain as a kind of "receiver" that somehow, in some sense, receives mentality or thought transmitted from some external source. It is probably no coincidence that this theory came three years after Marconi invented the radio. In 1898 wireless telegraphy was the cool new technology, so there might have been a certain appeal in comparing the brain to such a thing.

While there may well be truth in the idea that our mental capabilities come from some mysterious external source, the analogy between mental activity and radio reception was never a good one.  A radio passively receives whatever is being transmitted on some particular frequency. But a human mind is a very active and thoughtful and creative reality, unlike the entirely passive and uncreative and thoughtless machine that is a radio receiver.  So trying to draw an analogy between human minds (or human brains) and radio receivers was never a very good idea. 

A recent article in Discover magazine gives us another example of trying to compare the brain to a mechanical device. The article is entitled "Your Brain Is Not a Computer. It Is a Transducer." Again, we have a misguided analogy comparing the brain to a mechanical device. A transducer is usually a fairly simple device converting some analog signal into electrical signals; do a Google image search for "transducer," and you'll see mostly small, simple gadgets.

The author (a psychologist named Robert Epstein) dares to contradict the unfounded dogma of neural memory storage, one that has been stated so many times in Discover magazine (a bastion of biology groupthink). Mentioning the pianist and conductor Daniel Barenboim, who has memorized an incredibly large amount of musical information, Epstein states the following:

"Do you think all this content is somehow stored in Barenboim’s ever-changing, ever-shrinking, ever-decaying brain? Sorry, but if you study his brain for a hundred years, you will never find a single note, a single musical score, a single instruction for how to move his fingers — not even a 'representation' of any of those things. The brain is simply not a storage device."

I am very pleased that we can read in the very mainstream Discover magazine the same contrarian idea that I have advanced for several years on this blog: that brains do not store human memories. Unfortunately, Epstein's article is so rambling and disorganized that I cannot recommend it for much other than its links, which may point you to interesting anomalies worth reading about further. At one point Epstein rather seems to suggest the very silly idea that mind is somehow sent to you from a parallel universe imagined by speculative physics. There is no good evidence for any such universes, and no explanatory need to believe in them. If such universes existed, they would not be a credible source for any of the main human mental phenomena.

We can see in the article the effects of the mainstream's thought taboos. Epstein seems very interested in anomalies that cannot be explained by conventional claims about the brain. But he seems to forbid himself from discussing the best-documented anomalies of this type: things such as ESP, apparition sightings, out-of-body experiences, and inexplicable successes by mediums. Instead he draws our attention to interesting but less-established anomalies such as terminal lucidity (when those with dementia suddenly regain normal mentality shortly before dying) and near-death experiences of the blind. But why should we study such things while avoiding the evidence for ESP, apparition sightings, out-of-body experiences, and inexplicable successes by mediums, when the evidence for those things is much better and more voluminous than the evidence for terminal lucidity or near-death experiences of the blind?

It is as if Epstein is carrying around in his pocket a list of taboo things he is forbidden from discussing, for fear of being deprecated by his colleagues who never studied such things but have negative opinions about them; and it is as if Epstein feels free to mention other anomalies that discredit conventional ideas about the brain, only because his academia colleagues haven't yet got around to declaring such things taboo. 

I regard Epstein as someone who might become a solid thinker about minds and brains once he gets his thoughts more organized and starts making a much wider study of anomalies that cannot be explained under "your brain makes your mind" ideas, without paying attention to which topics have been declared taboo by his colleagues.  I recommend that he lose his "brain as transducer" idea, which makes little sense, and also recommend he discard his weird claim that he has "decapitated consciousness" by showing that it is not mysterious.  The more we learn about the mind, the more mysterious it seems. 

The brain cannot be accurately compared to any mechanical or electronic device. Discarding all the unfounded claims made about brains (so often contradicted by the low-level facts we have learned about brains), we can have a good minimalist concept of the brain: that the brain is a helper organ that helps other parts of your body do their jobs. So the brain helps your eyes see, your muscles move, your ears hear, and your lips speak; the brain helps your lungs keep breathing at the right rate and your heart keep beating at the right rate; and the brain helps your pain receptors alert you to pain. There is no electronic or mechanical device that acts in all those ways. As for the human mind, it cannot be compared to any device humans have created, not even to computers, which don't actually have lives or experience or understanding.