May 05, 2011
You can go ahead and file this story in your "More Ways Brains Are Just Like Computers and Vice Versa" folder if you'd like, but this neuroscience/computer network story does have a unique twist.
Among the many theories proposed to explain what causes schizophrenia is the "hyperlearning hypothesis." In this line of reasoning, the brains of people with schizophrenia fail to forget or cast aside information the way the brains of those without the condition do.
As a recent article described, "Without forgetting, one loses the ability to extract what's meaningful out of the immensity of stimuli the brain encounters. [Schizophrenics] start making connections that aren't real and start drowning in a sea of so many connections that they lose the ability to stitch together any kind of coherent story."
A team of researchers at the University of Texas at Austin and Yale University has discovered that this same problem can afflict computer networks that "cannot forget fast enough," which acquire a sort of virtual schizophrenia. Working with a model of a neural network, they added virtual dopamine (the trigger for the "over-remembering") to show how networks can recall memories in a schizophrenic manner as well.
This neural network, called DISCERN, has the ability to learn natural language. For this study that straddles neuroscience and computer science, the team simulated what happens to language when eight different neurological dysfunctions are introduced -- schizophrenia being among them.
The model was trained by telling a series of simple stories to DISCERN, which were, as the researchers describe, "assimilated into DISCERN's memory in much the way the human brain stores information -- not as distinct units but as statistical relationships of words, sentences, scripts and stories." The team repeated these stories, eventually "training" the computer to understand them.
One neurological researcher weighed in on the concept, noting that this is "basically a series of connectionist models. These are computer simulations of a large number of simple units or nodes that can have 'activations' of varying strengths and which have connections to other nodes. This model 'learns' by modifying the strength of these connections according to some kind of simple learning rule." This researcher notes that there are clear connections between human brains and these connectionist models -- they are far simpler than brains, yet they can do complicated things like recognize faces or objects.
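The connectionist setup the researcher describes can be sketched in a few lines of code. This is an illustrative toy, not the actual DISCERN implementation: nodes hold activations, weighted connections link them, and a simple Hebbian rule strengthens connections between co-active nodes. Cranking up the learning rate plays a role loosely analogous to the study's "virtual dopamine" -- the network over-strengthens associations rather than letting weak ones fade.

```python
import math
import random

random.seed(0)
N = 4  # number of nodes

# Small random initial connection strengths; no self-connections.
weights = [[0.0 if i == j else random.gauss(0.0, 0.1) for j in range(N)]
           for i in range(N)]

def activate(inputs, weights):
    """Propagate activations one step through the weighted connections."""
    return [math.tanh(sum(w * x for w, x in zip(row, inputs)))
            for row in weights]

def hebbian_update(weights, acts, learning_rate=0.1):
    """'Learning' = adjusting connection strengths with a simple rule.
    Here, a Hebbian rule: connections between co-active nodes grow
    stronger. An excessive learning_rate (the 'virtual dopamine'
    analogy) would make the network over-remember every coincidence."""
    return [[w + learning_rate * acts[i] * acts[j]
             for j, w in enumerate(row)]
            for i, row in enumerate(weights)]

pattern = [1.0, 0.0, 1.0, 0.0]  # a pattern presented repeatedly
for _ in range(20):
    weights = hebbian_update(weights, pattern)

# Nodes 0 and 2 were repeatedly co-active, so the connection between
# them is now much stronger than connections to the inactive nodes.
```

Real connectionist systems like DISCERN use far larger networks and more sophisticated training, but the principle -- learning as gradual adjustment of connection strengths -- is the same.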
Dr. Ralph Hoffman, a professor of psychiatry at Yale, analyzed the results, comparing the computer model's behavior to neural network patterns. While there were remarkable similarities in how information is understood, interpreted into memory, and retrieved as output, one of the project's leaders explains that this is not proof of the hyperlearning hypothesis, though it does support it.
You can read the full paper, "Using Computational Patients to Evaluate Illness Mechanisms in Schizophrenia," in the journal Biological Psychiatry, or read a condensed summary of the findings from the University of Texas at Austin.