Storing information in a network of cells (Brain)

In summary: the conscious mind has to decide which of two clusters of neurons to use. It can retrieve the information from the cluster that corresponds to the phonological input, or the information from the other cluster; it knows that 'apple' is the correct answer because that cluster is connected to the neurons that code for the input.
  • #1
Deemreo
Hello all, this is my first post here. I know these things have been discussed before in this forum, but that was several years ago, and I have some questions that weren't asked.

Basically, I need to know how information is stored in the brain, presumably as memories. What I already know is that the way the cell networks are connected is the stored information; but what I don't know is how they are created and accessed.

For instance, let's say you have a perception worthy of being stored in your memory; as it is perceived, it is processed through the various neurons before being stored as... what, a network of connected cells? Or is there another part that I'm missing?

I know that a lot remains to be answered about the brain, but my general understanding is that the answer to this is already known; I just can't find it.

How does the brain then retrieve this information, i.e. how does it know that the information is stored in a particular pattern? If each neuron has hundreds of different connections, and a 'memory' is stored as the overall network of connections between a number of cells, how does the memory know which cell is the next part of the memory?

Is my general understanding correct? Or am I completely wrong?

I also have several follow-up questions for later.
Thank you for your time.
 
  • #2
Some things to research:
long term potentiation
long term depression
Papez circuit
limbic system
amygdala
hippocampus
 
  • #3
I know this hasn't been discovered yet, but does anyone have any theories about how the human brain engram/read-write process works?
 
  • #4
Deemreo said:
I know this hasn't been discovered yet, but does anyone have any theories about how the human brain engram/read-write process works?
"writing":

one of the ways is by long-term potentiation (LTP), in which the surface area of the synaptic interface between particular neurons is increased. I don't know the molecular/chemical details of the process, but knowing the search term should help you. I do know it has to do with "excessive" amounts of calcium in the neuron and, reportedly, its relationship with NMDA receptors.

Brain Research
Volume 453, Issues 1-2, 21 June 1988, Pages 308-314

Synaptic interface surface area increases with long-term potentiation in the hippocampal dentate gyrus

http://www.google.com/url?sa=t&source=web&cd=2&ved=0CBoQFjAB&url=http%3A%2F%2Flinkinghub.elsevier.com%2Fretrieve%2Fpii%2F0006899388901710&ei=zqYaTYX8OoG6sQPl3rzRAg&usg=AFQjCNHsXv8uxmuyYNTDwTp2MTdQjrwqOA

I don't know if there's quite such a similar concept of "reading" as in computers. The changes brought by LTP modify the network, so signal-processing aspects are altered, which leads to different behavior (learned behavior); but I don't really see a separate "reading" process, just a different "doing" process as a result of LTP.

I would also note that the Wikipedia article on long-term potentiation is a mine of references.
 
  • #5
Much obliged, Pythagorean, you've been a tremendous help so far. I'll do some more research in those areas.

I'll be checking back often, if anyone else would like to share some input.
 
  • #6
Pythagorean said:
I don't know if there's quite such a similar concept of "reading" as in computers.
Artificial neural nets give at least one analogy, in which reading is content-addressed: the information is engrammed in a neural net, and each time part of a learned association is presented, the net will output the rest of the information. A minimal sketch of this idea is given below.
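To make the content-addressable idea concrete, here is a minimal sketch of a tiny Hopfield-style network (the patterns, their size, and the 'apple'/'brick' labels are invented purely for illustration, not taken from any of the studies mentioned in this thread):

```python
import numpy as np

# Two made-up "memories" stored as +/-1 patterns over 8 units.
apple = np.array([ 1,  1,  1,  1, -1, -1, -1, -1])
brick = np.array([ 1,  1, -1, -1,  1,  1, -1, -1])

# Hebbian learning: strengthen connections between units that are active together.
W = np.outer(apple, apple) + np.outer(brick, brick)
np.fill_diagonal(W, 0)

def recall(cue, steps=10):
    """Let the units repeatedly update each other; the state settles on a stored pattern."""
    s = cue.astype(float).copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1.0
    return s

# Present only the first half of 'apple' (the rest is unknown, set to 0)...
partial = apple.astype(float)
partial[4:] = 0
# ...and the network fills in the rest of the association.
print(np.array_equal(recall(partial), apple))  # True for this toy example
```

The point is just that retrieval here is driven by the content itself: there is no address to look up; the partial input selects which stored pattern the network completes.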

Deemreo said:
if anyone else would like to share some input.
One thing to add is the relationship between memories and sleep. The Wikipedia article on sleep and memory (http://en.wikipedia.org/wiki/Sleep_and_memory) does a very good job of describing current knowledge and includes many references.
 
  • #7
The only thing I really need to know at this point is how the brain keeps information separate and organized. Here's a simplified example.

Your brain has two clusters of neurons, one with information about apples, one about bricks. For the sake of simplicity, neither cluster is connected to the other.

When your conscious mind wants to bring up information about apples, what happens?

Since your conscious mind doesn't know anything in advance, presumably it pings both clusters. Since they both respond, how does the conscious mind choose the right information? How does it know that apples are the right choice, and not bricks?

--

It would seem that in order for the conscious mind to bring up information from a cluster of neurons, it would already need to know enough to make the right choice about which information to use.
 
  • #8
Deemreo said:
For the sake of simplicity, neither cluster is connected to the other.
For the sake of simplicity, assume instead that both clusters partially overlap. 'Apple' and 'brick' would each be a pattern of activated/silent neurons, with, say, two neurons in common (that's just an example of how it might work...). Suppose these shared neurons code for the phonological information. To ping 'apple' you activate them as 01; to ping 'brick' you activate them as 10. This will specifically activate the whole set of activated/silent neurons connected to the phonological information for either 'apple' or 'brick'.
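Here is a toy sketch of that selection step, assuming made-up weights (positive = excite, negative = silence) from the two shared 'phonological' neurons onto six downstream neurons, three per cluster:

```python
import numpy as np

# Rows: six downstream neurons; the first three form the 'apple' cluster,
# the last three the 'brick' cluster. Columns: the two shared cue neurons.
W = np.array([
    [-1.0,  1.0],   # apple cluster: excited by the cue pattern 01
    [-1.0,  1.0],
    [-1.0,  1.0],
    [ 1.0, -1.0],   # brick cluster: excited by the cue pattern 10
    [ 1.0, -1.0],
    [ 1.0, -1.0],
])

def ping(cue):
    """Return which downstream neurons become active for a given cue on the shared neurons."""
    return (W @ np.asarray(cue) > 0).astype(int)

print(ping([0, 1]))  # [1 1 1 0 0 0] -> the 'apple' cluster responds
print(ping([1, 0]))  # [0 0 0 1 1 1] -> the 'brick' cluster responds
```

In this toy picture the conscious mind doesn't need to know the answer in advance: the cue pattern itself determines which cluster lights up.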
 
  • #9
One can find particular neurons that only fire during a visual encounter with particular concepts.

One particular neuron in a study fired for "The Beatles" whether it was text saying "The Beatles" or pictures of the band members.

Another fired only when a particular building was shown, and once when somebody mistook another building for that building.

These experiments were performed by Christof Koch and presented in his talk, "The Quest for Consciousness".
 
  • #10
Thank you for all of the information so far, it's definitely been very helpful.

I'm still having a bit of difficulty figuring out how the information is stored in the networks of neurons. It would seem that the connection 'strength' is where the actual information is stored, is that correct?
 
  • #11
I think it's more accurate to say the connection strength is a major factor. We're still not completely sure how it works.

One perspective I've found helpful is that you don't store memory statically. It's malleable (in fact, it's easy to plant memories with appropriately framed questions and association triggers).

So, IMO, it's better to think of neurons as always processing information and not really ever storing it.
 
  • #12
So, when a neuron sends information to another neuron, does the initial neuron send the same signal to every neuron that it's connected to?

Since a neuron usually transmits through the axon, and the axon usually branches and connects to many other neurons, the initial neuron must talk to every other neuron that it connects to, right?
 
  • #13
Deemreo said:
the initial neuron must talk to every other neuron that it connects to, right?
Right, except that some axo-axonal synapses may cancel the message arriving at some of the target neurons.
 
  • #14
Okay, time for a new question. How much information can a neuron hold, and how does it hold it?

Apparently there has been recent information which suggests that a single cell can hold 'fleeting memory' for 'at least a minute, maybe more'.

So, I know that memories are stored as the connections between neurons, or rather that the sum of the parts equals the whole. What I don't know is how much information each neuron can hold.

The way I see it, there are two options: each neuron contains one element of the memory, such as 'a brick is heavy' or 'a brick is often red'; the other option would be that every cell in the network for a particular memory has the same memory... once again using the principle that the more neurons there are for a memory, the brighter, more detailed, and clearer the memory is.

Are either of those close at all? Or am I completely missing it?
 
  • #15
Pythagorean said:
I don't know if there's quite such a similar concept of "reading" as in computers. The changes brought by LTP modify the network, so signal-processing aspects are altered, which leads to different behavior (learned behavior); but I don't really see a separate "reading" process, just a different "doing" process as a result of LTP.

I would also note that the Wikipedia article on long-term potentiation is a mine of references.

Wasn't there some research that came out not long ago (maybe a year or two max) showing that we don't "read" memories, we recall them by the brain recreating the thoughts, feelings, emotions, smells, etc. we had at the time? And so, in this way, it isn't analogous to "reading" a hard drive like a computer does?

Sorry, I don't have time to look into it right now. I was thinking you might be aware of the study (or studies) I'm talking about, being into neuro and all.
 
  • #16
Deemreo said:
Okay, time for a new question. How much information can a neuron hold, and how does it hold it?
According to the most common view, the information is held in the synapses (with some contribution from glial cells, but that is another mechanism for modulating synapse strength), so in a sense a single neuron holds nothing.

Deemreo said:
Apparently there has been recent information which suggests that a single cell can hold 'fleeting memory' for 'at least a minute, maybe more'.
I'm not sure what you're talking about. The time frame corresponds to early activation of c-fos expression. Is that what you're referring to?

Deemreo said:
The way I see it, there are two options: each neuron contains one element of the memory, such as 'a brick is heavy' or 'a brick is often red'; the other option would be that every cell in the network for a particular memory has the same memory... once again using the principle that the more neurons there are for a memory, the brighter, more detailed, and clearer the memory is.
Doesn't work this way. What really matters is the number, location, and strength of the synapses. We can try an estimate based on this, but it's handwaving a bit. Say the strength can vary from 1-100, the number from 1-1000, and the location from 1-100000 (this number is huge because there are many possible locations for a synapse, even on a single dendrite); the total would then be at most on the order of 10^10 distinguishable synaptic configurations per neuron.

It's likely, however, that the biological processes include some probabilistic rules so that the exact limit is in fact lower, but who knows... what seems quite clear is that memory capacity is not a limiting factor for any living animal.
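For what it's worth, here is the back-of-the-envelope arithmetic behind the estimate above, treating the three ranges as independent (which is exactly the handwaving part):

```python
import math

strength_levels    = 100      # synapse strength varying from 1-100
synapse_count      = 1000     # number of synapses from 1-1000
possible_locations = 100_000  # possible locations on the dendritic tree

configurations = strength_levels * synapse_count * possible_locations
print(configurations)             # 10000000000, i.e. on the order of 10^10
print(math.log2(configurations))  # ~33.2, i.e. a few tens of bits per neuron under these assumptions
```

As noted above, the real biological limit could easily be lower (or the counting could be done differently); this is only meant to show where the 10^10 figure comes from.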
 
  • #17
bobze said:
Wasn't there some research that came out not long ago (maybe a year or two max) showing that we don't "read" memories, we recall them by the brain recreating the thoughts, feelings, emotions, smells, etc. we had at the time? And so, in this way, it isn't analogous to "reading" a hard drive like a computer does?
This is not really that new :redface: the excitement came when it was realized that each time a memory is recalled/recreated, there is a window in which the memory can be modified, so it could be an avenue for the treatment of post-traumatic stress disorder and maybe some other psychiatric conditions. Obviously, this may also constitute a scientific foundation for some Freudian ideas. What is not clear to me is why you say it would not be analogous to reading a hard drive?
 
  • #18
bobze said:
Wasn't there some research that came out not long ago (maybe a year or two max) showing that we don't "read" memories, we recall them by the brain recreating the thoughts, feelings, emotions, smells, etc. we had at the time? And so, in this way, it isn't analogous to "reading" a hard drive like a computer does?

Sorry, I don't have time to look into it right now. I was thinking you might be aware of the study (or studies) I'm talking about, being into neuro and all.

Yeah, that was what I was trying to get at. We actually "do" instead of "read". I have no idea about the studies though; this was presented in a matter-of-fact way to me in my neurobiology course.
 
  • #19
Lievo said:
I'm not sure what you're talking about. The time frame corresponds to early activation of c-fos expression. Is that what you're referring to?

This was one of the two stories that I read a while back: http://www.livescience.com/health/090125-memory-cell.html
Lievo said:
Doesn't work this way. What really matters is the number, location, and strength of the synapses. We can try an estimate based on this, but it's handwaving a bit. Say the strength can vary from 1-100, the number from 1-1000, and the location from 1-100000 (this number is huge because there are many possible locations for a synapse, even on a single dendrite); the total would then be at most on the order of 10^10 distinguishable synaptic configurations per neuron.

It's likely, however, that the biological processes include some probabilistic rules so that the exact limit is in fact lower, but who knows... what seems quite clear is that memory capacity is not a limiting factor for any living animal.

Okay, this is some very interesting information here. Can you tell me where you learned this?

I do have a few questions about it. When you say strength, number, and location, I presume that by number you are referring to when one neuron has multiple connections to another neuron, correct?

And of course, as an extension of a question I asked a few weeks ago, how would a neuron differentiate between connections on different parts of its dendrites? For instance, if a neuron has a connection on both sides, how does it know that the connection on the right side means different information than the connection on the left side?

If you have the patience and time, could you perhaps explain this process in a bit more detail? I would be very grateful.
 
  • #20
Lievo said:
This is not really that new :redface: the excitement came when it was realized that each time a memory is recalled/recreated, there is a window in which the memory can be modified, so it could be an avenue for the treatment of post-traumatic stress disorder and maybe some other psychiatric conditions. Obviously, this may also constitute a scientific foundation for some Freudian ideas. What is not clear to me is why you say it would not be analogous to reading a hard drive?

Like Pythagorean pointed out, I was under the impression that we recall through "doing" ("recreating", for lack of a better word, the thoughts, sights, smells, etc. during the "time" of the memory), unlike how a computer simply "reads" stored bits of information.
 
  • #21
Deemreo said:
This was one of the two stories that I read a while back: http://www.livescience.com/health/090125-memory-cell.html
Ok, this is in fact not new at all. To get the idea, see http://abstrusegoose.com/321 . What happens is that synaptic plasticity can go through fast mechanisms or through second messengers that impact the physiology of the cell, either using calcium or by changing gene expression. As the latter takes some time, it could be said that the cell memorizes something. But in fact this 'memory' cannot be read until it changes the cell's response, and that requires changing the synapse (ok: there is also the possibility of changing the threshold, but this too can be seen as a form of synaptic plasticity). So a better analogy is RAM that cannot be read until it has been written to the hard drive.

Deemreo said:
When you say strength, number, and location, I presume that by number you are referring to when one neuron has multiple connections to another neuron, correct?
Not necessarily. A neuron can both increase the number of connections it has with another neuron and change the set of neurons it's talking to.

Deemreo said:
how would a neuron differentiate between connections on different parts of its dendrites? For instance, if a neuron has a connection on both sides, how does it know that the connection on the right side means different information than the connection on the left side?
Of course there is no mechanism by which the cell 'knows' that something is coming from the left or the right, or from which precise location on a dendrite. But the question should be: would it make a difference in the input/output relationship if a synapse were on the right or on the left? And yes, it makes a huge difference.

Remember, what causes a neuron to fire is that the membrane potential at the axon hillock reaches a threshold (see http://en.wikipedia.org/wiki/Action_potential). EPSPs (excitatory post-synaptic potentials) are produced at the post-synaptic membrane (one location/synapse per EPSP) and passively transmitted from their original synaptic location to the axon hillock, as a wave that is progressively attenuated along its course.
This progressive attenuation means that the closer a synapse is to the axon hillock, the more likely the same EPSP will cause the neuron to fire. More importantly, this leads to summation (http://en.wikipedia.org/wiki/Summation_(neurophysiology)), i.e. when two EPSPs collide they sum up and become more likely to make the cell fire. But for a collision to take place, they need to meet at some location in the first place.

So look at the synapses on the right dendrite and imagine the EPSPs there are just a little bit too weak to reach threshold. If an additional EPSP is produced on the left: nothing. If it's produced on the right: the cell fires. This is just a very simple example, but it generalizes: the spatial distribution of synapses on the dendritic tree dominates when trying to understand the input/output relationship, because it determines which inputs will collaborate and in which way. A toy illustration of this attenuation-and-summation idea is sketched below.
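To illustrate the attenuation-and-summation point, here is a minimal sketch; the threshold, amplitudes, distances, and length constant are invented purely for illustration and are not physiological values:

```python
import math

THRESHOLD = 1.0   # depolarization needed at the axon hillock for the cell to fire
LAMBDA = 100.0    # length constant of the passive decay (arbitrary units)

def at_hillock(amplitude, distance):
    """EPSP amplitude remaining after passive exponential attenuation over 'distance'."""
    return amplitude * math.exp(-distance / LAMBDA)

def fires(epsps):
    """Do these simultaneous EPSPs, given as (amplitude, distance) pairs, sum past threshold?"""
    return sum(at_hillock(a, d) for a, d in epsps) >= THRESHOLD

# Two EPSPs far out on the 'left' dendrite: each arrives heavily attenuated, the cell stays silent.
print(fires([(0.8, 150.0), (0.8, 150.0)]))   # False

# The same two EPSPs close to the hillock on the 'right' dendrite: they summate and the cell fires.
print(fires([(0.8, 30.0), (0.8, 30.0)]))     # True
```

This ignores timing and the need for EPSPs to actually meet on the same branch, but it captures the basic point that where a synapse sits determines whether its input can push the cell over threshold.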
 
  • #22
bobze said:
I was under the impression that we recall through "doing"
Ok, thanks. I also think that's the correct picture, but I found it not so bad as an analogy of memory reading - that's just a matter of taste.
 

Related to Storing information in a network of cells (Brain)

1. How is information stored in the brain?

Information is stored in the brain through the formation of new connections between neurons. When we learn something new, our brain creates new connections or strengthens existing ones between neurons, allowing us to retrieve and store that information for later use.

2. Can the brain run out of storage space?

The brain has an incredible capacity for storing information and is estimated to have the ability to store the equivalent of 2.5 million gigabytes of data. Therefore, it is highly unlikely that the brain will run out of storage space.

3. How does the brain retrieve stored information?

The retrieval of stored information in the brain is a complex process that involves the activation of specific neural pathways. When we try to remember something, our brain sends signals along these pathways, triggering the activation of certain neurons and allowing us to retrieve the information we stored previously.

4. Can we improve our brain's ability to store information?

Yes, we can improve our brain's ability to store information through activities such as learning, practicing, and challenging our brains. These activities help to create new connections between neurons and strengthen existing ones, improving our brain's overall storage capacity.

5. Is there a limit to the types of information that can be stored in the brain?

The brain has the ability to store a wide range of information, including facts, memories, skills, emotions, and more. However, there may be limitations to the amount of information that can be stored in each type of memory, and certain information may be prioritized over others for storage.
