[deleted] 1 points
I know it's been 27 days since your post, and I only recently discovered /r/neuroscience (and /r/neuro even more recently), so I understand if this thread is dead, but I came across this today: http://www.mit.edu/newsoffice/2012/conjuring-memories-artificially-0322.html
It appears that a memory can be encoded in a very small number of neurons rather than in a large-scale network.
[deleted] 1 points
As a neuroimaging analyst, I often develop simple programs to automate my data processing. It's useful to think about brain function in a computational manner, but that framing won't capture the complexity of neural function at the level of the whole brain. Neurons are not like bytes; they don't represent only a binary state. I know I've oversimplified your analogy - it could be that each neuron stores a larger number of states, but it's more likely the links between them that encode patterns of environmental and internal stimuli. If you look at recent AI research that attempts to re-create patterns of neural function in computer hardware, you'll find support for your latter suggestion.
I was thinking about walking through some basic computational modeling based on weighted connections as an example, but I'll leave that to you, if you're interested.
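For anyone curious what "the links encode the memory" looks like in code, here's a minimal sketch of one classic weighted-connection model (a Hopfield-style associative network with Hebbian learning). This is my own toy illustration, not the commenter's model: the stored patterns live entirely in the weight matrix between units, not in any single unit's state, and a corrupted input can still settle back onto the stored pattern.

```python
import numpy as np

def train(patterns):
    """Hebbian learning: strengthen links between co-active units.

    The 'memory' ends up distributed across the weight matrix W,
    not stored in any individual unit.
    """
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)  # no self-connections
    return W / len(patterns)

def recall(W, state, steps=10):
    """Repeatedly update every unit from its weighted inputs."""
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1  # break ties toward +1
    return state

# Two arbitrary +1/-1 patterns to memorize.
patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, 1, -1, -1, -1]])
W = train(patterns)

noisy = np.array([1, -1, 1, -1, 1, 1])  # pattern 0 with the last unit flipped
print(recall(W, noisy))  # settles back onto pattern 0
```

The point of the toy: flipping one unit of the input doesn't destroy the memory, because the pattern is redundantly encoded in the pairwise connection weights, which is roughly the intuition behind "links between neurons encode patterns of stimuli."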