Full Name:  Emily Anne Lewis
Username:  ealewis@brynmawr.edu
Title:  Soothing Noises: A Look Into Music Therapy
Date:  2006-04-27 13:32:14
Message Id:  19163
Paper Text:


Biology 202

2006 Third Web Paper

On Serendip

For many injuries, one might undergo physical therapy; for certain people,
however, music is a necessary part of their therapy. Music therapy is
defined as the "prescribed use of music, and the relationship that develops
through shared musical experiences, to assist or motivate a person towards
specific, non-musical goals." (1) Those who practice this therapy technique use
music to effect changes in communication, social, cognitive, and physical skills,
as well as in emotions.

For many people, music captures attention, and most kinds of music hold
a subject's attention. Listening to music makes use of many different parts of the
brain, making it possible for a person to respond to music in some way, even if
brain damage has occurred. For example, a person may not be able to process
the words of a song but may still find the tune soothing, and thus calm down.

Music therapy is available to people of all ages, and no musical experience is
necessary for it to work. Because anyone of any age can respond to music, but
not everyone can communicate by writing or speaking, music therapy is often
used with younger children who cannot yet fully communicate, or with people
with aphasia. If the right kind of music is played for an Alzheimer's patient,
they may remember something for a bit longer, or recall something that they
could not previously remember. Even for those who are not Alzheimer's
patients, music is a good memory aid. It is not the "make your child get into
Harvard" trick that it was touted to be only a few years ago, but music can
assist people in memorizing many things, whether lines for plays or note cards
for exams.

Persons who seem to respond particularly well to music therapy are those
with autism. The nonverbal, nonthreatening medium seems to encourage
attention and communication from the subjects. Many autistic children will sing
but not speak, making it easy for them to connect to the music. Music also
seems to help autistic children remember words and definitions: if a song is
presented about how to use silverware when eating, children are more likely to
remember how to do so.

For those with attention problems, music can be used to aid focusing.
Music adds some structure to time, making a long span of time seem shorter, or
vice versa. A friend of mine with Attention Deficit Disorder once explained to me
that she could only do work if music was playing. Otherwise, she said, she could
not focus as well, even with the aid of her medication.

In junior year of high school, I had surgery on my ankle and underwent
physical therapy for a time afterwards. My physical therapist said that I should
always have music playing when I did my exercises. I tried my exercises with
and without music, finding that my ankle moved better when I had the music on.
Why did this happen? Music, especially that with a good rhythm, stimulates
movement, and triggers our motor neurons, thus propelling our muscles. Music
was often what motivated me to do my exercises and kept me going as I went
through them.

Music therapy seems to be very effective when treating those who have
been abused, either physically or emotionally or both. Music can be used to set
up a trusting relationship between the therapist and the patient, as well as
encourage the patient to speak about his/her troubles and experiences.
Sometimes, it helps if the patient understands that songwriters and singers have
also had to go through the pain of domestic abuse, or a messy divorce. Thus,
songs like "Better Man" by Pearl Jam or "Family Portrait" by Pink are useful in
treating patients. In "Better Man," the lead singer of Pearl Jam sings about his
mother, who he feels settled for a man who treated her abysmally, because she
believed she wouldn't find anyone else. "Family Portrait," on the other hand, is
sung from a child's perspective, as her family falls apart.

Whether young or old, ill or healthy, anyone can benefit from music
therapy.

1)American Music Therapy Association
2)Autism and Music
3)Prelude Music Therapy

Full Name:  Gray Vargas
Username:  gvargasr@haverford.edu
Title:  What does it take to carry a tune?
Date:  2006-05-05 01:30:32
Message Id:  19232
Paper Text:


Biology 202

2006 Third Web Paper

On Serendip

I'm sure we all know someone who can't "carry a tune." There is always that one person who you are singing with who just can't seem to hit the right notes. For some of us, this ability comes naturally and does not require particular effort or specific training. I have noticed this and wondered about it my whole life, so when this topic was brought up in class, I wondered what neural mechanisms were involved in this skill and why some people have such problems. In researching this question, I also became interested in whether this ability can be or needs to be taught, and what this tone deafness tells us about how the brain processes music.

Some refer to tone deafness as amusia, but tone deafness is actually only one type of amusia, which more broadly means the inability to comprehend or produce music at all. Tone deafness (otherwise known as tune deafness, dysmelodia, or dysmusia) is when an individual cannot differentiate between two different notes (it will be referred to as amusia in this paper). In common usage, though, many people say someone is "tone deaf" when they cannot reproduce a particular note, like the earlier example of someone singing (1). Some believe that this inability to produce a tone is due more to a lack of training than is the inability to differentiate tones (1). This brings up an interesting question of whether different systems are involved in each process (differentiating or producing), how one can be deficient while the other is normal, and which is more apt to improve with training.

Amusia can be congenital or caused by an injury later in life, and is estimated to afflict 4% of the population. Some researchers argue that amusia most often results from brain or ear damage acquired later in life (6). Regardless, these individuals have normal intellectual, memory, hearing, cognitive, and language abilities (2)(3). They also show no problems with perceiving or reproducing rhythm, showing that this deficit is very specific to pitch processing. Because of the intonation involved in language production and processing, one would think that understanding speech would also be a problem for these individuals, but this is not true (4). Some researchers believe that the perceptual demands of understanding language are much less fine-tuned than those necessary for pitch perception in music, so the abnormalities these amusic individuals have are not severe enough to produce deficient language processing (4). Therefore, since the deficit is specific to music (and not to language or timing), there must be neural systems devoted specifically to music, or at least neural systems devoted to pitch processing that are not involved in rhythm processing or language (at least not to the same extent).

Since amusic individuals can be so specifically impaired in differentiating tones, many believe that there must be music-specific neural networks involved in the process (4). This brings up a more general question of how specialized the nervous system is, or how many processes use the same neural connections or systems. One would think that the nervous system would only be hard-wired specifically for certain purposes that are crucial to its survival. Because many argue that music serves no obvious purpose in our species' survival, one would not expect a neural system to be devoted to it alone. Therefore, pitch perception must be carried out in the neural systems that are in place for other functions like language, only at different levels of complexity (4). In other words, some researchers believe that "humans are born with musical predispositions that develop spontaneously into sophisticated knowledge bases and procedures that are unique to music" (4). So the musical predispositions present are actually remnants from other processing but can be developed to be used for musical skills. Other evidence for music-specific neural networks comes from the finding that a focal lesion in the left superior temporal gyrus caused amusia in one case study, where the individual could not recognize or produce melodies, but had no problems with rhythm or speech (7). While it is tempting to conclude that the left superior temporal gyrus is the site for tone deafness, it is still possible that a lesion to another region of the brain could lead to the same result.

On the other hand, there is evidence that music processing and other functions such as language are closely related, and thus might share neural systems. For example, in tonal languages such as Cantonese and Vietnamese, where the tones used to pronounce words can change their meaning, there are almost no tone deaf individuals (1). This suggests that in a culture where differentiating tone is essential to daily life, either these individuals differ genetically in the skills they are capable of, or the skill is hard-wired in all individuals worldwide and, where it is necessary from a young age and throughout life, it is maintained and not allowed to disappear. In countries with non-tonal languages such as the US, and in households where music is not practiced at a very young age, it is possible that individuals are born with the potential for this skill but never develop it. Other support for the connection between pitch recognition and other skills is the claim some researchers make that those with amusia are more likely than most people to have rhythmic and memory/recognition problems (1). Some say that musical ability is made up of many different skills ("neurally isolable processing components") which can be specialized for music (5).

A similar question is whether or not the ability to discriminate notes or reproduce them is innate. In a test where four notes are presented followed by a fifth which is either higher or lower, babies are able to tell the difference in the fifth note, while adults with amusia cannot (4). It would be interesting to study whether babies who later develop amusia are able to detect the difference in tone, since if they were, it would mean that every baby is born with this ability and it was simply not developed or maintained. Regardless, the vast majority of humans are born with the ability to discriminate different tones without needing any experience to do so, suggesting that this ability has some sort of universal utility. This five-tone test is a new method for measuring tone deafness, and the first to test it reliably (6). Researchers no longer have to rely on subjects to tell them whether they could tell the tones apart. Using an electroencephalogram (EEG), scientists can tell from a subject's brain waves whether or not they detected the change in the fifth tone. In normal individuals there is a spike of electrical activity, mainly from the right hemisphere, starting 200 ms after the tone, which signals a change in pitch; this spike is not present in individuals with amusia (2). Therefore, the amusic individuals' brains are not registering the difference.

Some researchers claim that training can improve tone deafness (some say completely), others say that training is only effective in young children, and still others say it is altogether untreatable (2). It is unclear why there would be differences in training responsiveness between pitch discrimination and pitch reproduction (matching a tone), since matching a tone in fact requires discriminating between the tone you are producing and the one you are trying to match. Pitch recognition involves comparing sensory input to either long-term or short-term memory, while matching a sung note with an outside note involves a feedback loop: an individual takes the sensory input coming from their own voice and compares it with the consecutive sensory input of the true tone. Therefore, if the feedback loop is damaged, one would not be able to tell that there was a difference between the tone being produced and the reference tone (just as those with amusia cannot differentiate between two different tones). However, it is unclear how those without amusia, who can tell notes in melodies apart, can still fail to match a note: they can discriminate between two notes in a task, but not between the note they are producing and the reference tone, and so cannot or do not alter their tone to match it.

As I was researching amusia, I became interested in how so many of us can distinguish and match tones so effortlessly. This is remarkable: to determine the tone being heard, the ear and brain must extract the pure tone, since different instruments add different oscillations on top of the pure tones, and after the pure tone is extracted, it must be accurately compared to the previous tone in short-term memory. One remarkable phenomenon on the other end of the continuum from tone deafness is those individuals who are able to produce a certain pitch without hearing an initial reference tone. This is called perfect pitch, or absolute pitch, and is present in about 1 in every 10,000 people in America (9); it has been seen in as many as 1 in every 20 people in other countries (9). These individuals are also able to recognize pitches instantly, with no effortful processing and no external reference pitch (8). Anecdotally, some individuals say that they compare the note to an internal reference pitch, while others just report knowing the pitch intuitively (8).

Again, there are opposing theories on the genetic and environmental contributions to this ability, as some believe we all were born with perfect pitch (and either do or do not develop the skill depending on our experience), and others believe certain individuals are predisposed to have the talent (9). There is some evidence that certain groups, including autistic individuals, have an especially large incidence of perfect pitch (around 1 in 20 individuals), suggesting that there is a strong genetic influence on this ability (9). It is very easy to find materials online which claim to be able to teach adults how to obtain perfect pitch, showing that a large group of people believe that this is a teachable skill, even late in adulthood (10).

So what has all this taught us? Initially I believed that being able to sing "on key" was an ability you were either born with or without, but there is convincing evidence showing that certain experiences in early life might play a role in developing these abilities, the potential for which we might all be born with. Therefore, your friend who cannot match that note on the radio might simply not have needed the skill enough in early life for it to remain or become developed. However, the differences are not entirely due to experience; the evidence for genetic variation is also quite convincing, as certain cultures and certain disorders show dramatically different rates of tone deafness and perfect pitch than others. It also seems very likely that these deficits or exceptional abilities are localized to neural processes which can be developed specifically for music, since deficits in timing and language are not often seen in tone deaf individuals, and lesions to a specific part of the brain can cause tone deafness without the related deficits. It is possible that there are differences in the neural processes involved in producing and differentiating tones, but it is not clear why, since both involve differentiation. Regardless, there is some evidence that pitch recognition, even perfect pitch, can be affected by experience, and thus can be taught late into adulthood.

1)Wikipedia tone deaf site, a good description of tone deafness on Wikipedia
2)EEG study, an article on the EEGs of tone deaf individuals
3)amusic study, an abstract of a study on amusic individuals
4)amusia study, an article describing amusia and the five tone task
5)music processing, an article on music processing
6)NPR article, an article on tone deafness for a general audience
7)lesion study, a lesion case study causing amusia
8)perfect pitch, a website on perfect pitch
9)wikipedia, a broad description of perfect pitch
10)perfect pitch training, a site advertising a perfect pitch training program

Full Name:  Stefanie Fedak
Username:  sfedak@brynmawr.edu
Title:  ...At the Risk of Sounding Glib
Date:  2006-05-05 07:51:18
Message Id:  19234
Paper Text:


Biology 202

2006 Third Web Paper

On Serendip

I published my second web paper on post-partum depression and the possibility of vitamin treatments for women who experience mild to severe episodes of post-partum depression. I drew my inspiration from the very public assertions of Tom Cruise that vitamins could 'cure' post-partum depression, and his blunt criticism of actress Brooke Shields' use of anti-depressants in the treatment of her post-partum depression. Cruise went so far as to call psychiatry a 'pseudo-science' and, furthermore, to call Today show host Matt Lauer 'glib' in the midst of a live interview. As fate would have it, Shields happened to give birth in the same hospital, on the same day, as Cruise's beloved fiancée, Katie Holmes. The irony of it all was almost too much to bear. Shortly after little Suri (Tom and Katie's child) entered the world, Paul sent me his comments about my paper, and thus the spark for this final entry on Serendip was born (no pun intended).

After reading over my initial paper, and with the help of Paul's expert (not glib) comments, I recognized that I had made a number of broad assertions about the treatment of post-partum depression without proper contextualization. In order to avoid the risk of sounding a little too much like Tom Cruise, I wanted to clarify and reflect upon the alternative of vitamin treatments, the so-called "accepted" treatment options, and attempt to understand the intricacies of post-partum depression and its effect on women.

The first and most valuable source I could think of to elucidate the mysteries of motherhood was my own mother, Nan. In a conversation shortly after the completion of my original work, I mentioned the topic of my paper to Nan; her response was, "I can completely see how many mothers would be affected by post-partum depression." She then reflected upon an incident shortly after my own birth, where a woman threw her baby out of a window in her New York City apartment; she was later diagnosed with post-partum psychosis, the most severe on the spectrum of post-partum disorders.

While my own mother did not suffer from post-partum depression, she clearly understood how the intensity of childbirth, combined with the dramatic change in living situation and what she termed the "overwhelming" responsibility of childcare, could drive a woman to harm her own child. When I asked my mother what she thought the solution to post-partum depression was, she responded that in serious cases it seemed obvious that medical treatment should be the primary recourse, but she thought the most important thing a new mother needed was support; she stressed that 'support' could come in the form of someone who is willing to help change diapers, or someone available to talk to who isn't wearing diapers.

As I thought about my mother's reaction, I began to consider my own first reaction to Cruise's claim that vitamins could 'cure' post-partum depression – "Tom Cruise is a nut job! What good would vitamins do?" While Tom Cruise's mental state might be up for debate, it is undeniably true that he has made a significant contribution in a continuing discussion about treatment options for one form of depression – though this contribution may have been made in a strange and abrasive way.

Both anti-depressants and vitamins are tested in similar ways: by comparing the reactions to, and any possible improvements in, patients who received the substance, be it vitamin or drug, against patients who did not receive it. Evidence cited in my previous paper shows that researchers have presented findings that both anti-depressants and vitamins may have some level of effectiveness in the treatment of post-partum depression, a point that went somewhat overlooked in my original work. To treat 'studies' as though they are absolutely conclusive is both ill-informed and ineffective in the furtherance of the scientific process of "getting things less wrong".

A second important observation that went unmentioned in my previous paper was patient individuality in relation to pursuit of treatment. It is unwise to seek a cookie-cutter course of action for each patient, because no patient is the same. Perhaps vitamins are all one woman needs in order to resolve her symptoms, whereas another woman might need a combination of drug therapy and counseling in order to work through her symptoms. Brain chemistry is not standardized, and treatment options shouldn't be either.

In science, it seems you can never really rule any one thing out. Nothing is entirely conclusive, and observations are the basis of all further pursuits of knowledge. In rejecting the idea that vitamins may have beneficial properties, we are no better than Tom Cruise asserting that psychiatry is a mere 'pseudo-science'. Just because something seems unorthodox doesn't make it wrong, and just because something has become somewhat of a 'norm' doesn't make it right.

Works Cited from Original Paper

(1) Brooke Shields' Op-Ed from The New York Times, July 2005. http://www.nytimes.com/2005/07/01/opinion/01shields.html?ei=5090en=7189d307fdb5772dex=1277870400

(2) Transcript from NBC's Today show with Matt Lauer, June 2005.

(3) A Brief Introduction to Postpartum Illness, Authored by Shoshana S. Bennett PhD., 2003.

(4) Synopsis of the Andrea Yates Case, courtesy of the Court TV Crime Library. http://www.crimelibrary.com/notorious_murders/women/andrea_yates/index.html

(5) BioNeurix review of Amoryn, an all natural treatment for depression. http://www.amoryn.com/formula_bvitamins.html

(6) "Altering the Brain's Chemistry to Elevate Mood", Brown, Gaby, and Reichert

Full Name:  Carolyn Theresa Dahlgren
Username:  cdahlgre@bmc
Title:  I am the "I" in Interaction between Instincts and Thought
Date:  2006-05-05 13:48:09
Message Id:  19244
Paper Text:
Biology 202

2006 Third Web Paper

On Serendip

In front of you are three doors. They are labeled one, two, and three. Behind one of the doors is a prize. You choose door number one... and you are told that it was a good choice because door number three did not have the prize. Now there are two doors. Where is the prize? What do you do? What is your choice? Do you stay with your gut choice or do you switch to the other door?
As a classroom activity, we played out this hypothetical game and discovered that, statistically, your chances of winning the prize are better if you switch your choice to the other remaining door than if you stay with your initial choice: switching wins two times out of three, while staying wins only one time out of three. To win more frequently, you have to overcome your gut reaction to stay with the door you originally chose. Either your analytical side overcomes your gut or your instincts triumph; either way, you have to choose.
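The classroom result can be checked with a short simulation (my own sketch, not part of the original paper; the helper function `play` is invented for illustration):

```python
import random

def play(switch: bool) -> bool:
    """Play one round of the three-door game; return True if the prize is won."""
    prize = random.randrange(3)    # door hiding the prize
    choice = random.randrange(3)   # contestant's initial gut choice
    # The host opens a door that is neither the contestant's pick nor the prize.
    opened = next(d for d in range(3) if d != choice and d != prize)
    if switch:
        # Switch to the one remaining closed door.
        choice = next(d for d in range(3) if d != choice and d != opened)
    return choice == prize

if __name__ == "__main__":
    trials = 100_000
    stay_rate = sum(play(switch=False) for _ in range(trials)) / trials
    switch_rate = sum(play(switch=True) for _ in range(trials)) / trials
    # Staying wins about a third of the time; switching about two thirds.
    print(f"stay: {stay_rate:.2f}, switch: {switch_rate:.2f}")
```

Running it shows the switch strategy winning roughly twice as often as staying, matching the class's statistical discovery.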
While we wonder which door to choose in order to find the prize, several parts of the self are interacting to help make the 'right' choice. There are instincts from the gut and thoughts from the head, and each is vying for the attention of the self. It is a tripartite system, one very similar to Sigmund Freud's theory of tripartite personality. Freud divides the mind into three parts: the id, the ego, and the superego. "The id contains 'primitive desires' (hunger, rage, and sex), the super-ego contains internalized norms, morality and taboos, and the ego mediates between the two and may include or give rise to the sense of self." (Wikipedia) According to Freud, the majority of our life experiences take place at the unconscious level, in the id and the superego; most of what drives us is not conscious. "At any given time, we are only aware of a very small part of what makes up our personality; most of what we are is buried and inaccessible." (All Psych Online)
In Freud's tripartite theory of personality, the id is the inner child inside the brain. The id is hidden away in the unconscious mind. It is "chaos, a cauldron of seething excitement. We suppose that it is somewhere in direct contact with somatic processes, and takes over from them instinctual needs and gives them mental expression...these instincts fill it with energy, but it has no organization and no unified will, only an impulsion to obtain satisfaction for the instinctual needs." (Freud) The id is instinct. The id focuses on somatic pleasure and gratification. As such, the id is associated with the body. The id is responsible for the gut feelings that we experienced during the door game. The id is emotion unfettered by thought or convention. In order to override this gut reaction, we need to explain another part of the unconscious mind: the superego.
The superego is the thoughtful part of the unconscious mind. It is the moral part of our unconscious which guides behavior by imposing strict moral and ethical guidelines on behavior. "Many equate the superego with the conscience as it dictates our belief of right and wrong." (All Psych Online) According to Freud, "the superego is a symbolic internalization of the father figure and cultural regulations. The super-ego tends to stand in opposition to the desires of the id because their conflicting objectives, and is aggressive towards the ego." (Wikipedia) The superego is the part of the unconscious that is responsible for the intruding thoughts during the door game; the thoughts that we should switch over to the new door.
The ego is consciousness; it is the self that we believe exists in reality. The ego is the 'I Function'. The Latin 'ego' is the nominative of the first person personal pronoun, 'I myself'. The ego is our awareness: a conglomeration of all of our unconscious desires, as well as cognition about the world and the best way to achieve those desires. "The ego understands that other people have needs and desires and that sometimes being impulsive or selfish can hurt us in the long run. It's the ego's job to meet the needs of the id, while taking into consideration the reality of the situation." (All Psych Online) It is the attention of the ego that the superego and the id are vying for; the ego is the part of the mind that mediates unconscious desires and cognition about the consequences of fulfilling those desires. "On behalf of the id, the ego controls the path of access to motility, but it interpolates between desire and action the procrastinating factor of thought... it dethrones the pleasure-principle, which exerts undisputed sway over the processes in the id, and substitutes for it the reality-principle, which promises greater security and greater success." (Freud) A healthy ego should be the strongest part of the mind so that it can "satisfy the needs of the id, not upset the superego, and still take into consideration the reality of every situation." (All Psych Online)
According to Freud, the ego, our consciousness, is mediated by unconscious processes and the constraints of the external world. If this is so, how many of the choices that we make and attribute to free will are, in reality, determined by the unconscious battle of id, ego, and superego? According to Freudian theory, "freedom of the will is, if not completely an illusion, certainly more tightly circumscribed than is commonly believed, for it follows from this that whenever we make a choice we are governed by hidden mental processes of which we are unaware and over which we have no control." (Thornton) Most of the information that the brain receives never enters consciousness, and we often do not have any control over what does enter consciousness unless the 'I Function' specifically interferes. For example, we do not feel the presence of our clothes unless the 'I Function' demands that the sensations be brought into consciousness. Even then, it is hard to maintain the feeling unless you devote your whole attention to the task.
The unconscious is a reservoir for memories which we are not aware that we remember. These 'repressed memories' influence conscious thought and behavior. Freud had the extreme view that all repressed memories were the result of traumatic events. A less drastic interpretation of Freudian theory provides an interesting interpretation of 'lost memory', that the unconscious is a storage place for all the information that we cannot process due to the massive volume of neuronal information that we receive every second. Most of our lost memories are not really lost; they just never entered conscious awareness.
Perception is mostly a filtering and defragmenting process. Our interests and needs affect perception, but most of what is available to us as potential sense data will never be processed. And most of what is processed will be forgotten. Amnesia is not rare but the standard condition of the human species. We do not forget in order to avoid being reminded of unpleasant things. We forget either because we did not perceive closely in the first place or we did not encode the experience either in the parietal lobes of the cortical surface (for short-term or working memory) or in the prefrontal lobe (for long-term memory). (Carroll)
I had never really given much thought to the credibility of Freud's theories; instead I dismissed them out of hand. Some of the new interpretations and applications of the Freudian tripartite model of personality, however, have developed into a useful story of the mind and of free will. It is folly to import Freud's theories directly into modern society and to accept some of his more extreme ideas without critical analysis. Freud did, however, have a huge impact on the field of psychology and on the understanding of the human mind. "There is ample evidence that conscious thought and behavior are influenced by nonconscious memories and processes...Freud should be considered one of our greatest benefactors if only because he pioneered the desire to understand those whose behavior and thoughts cross the boundaries of convention set by civilization and cultures." (Carroll) I suppose I had an instinctual reaction to dismiss his theories because of the criticisms, especially feminist critiques, which I had heard and which had influenced me. Freud's theory of id, ego, and superego, however, provides an interesting parallel to the interpretation of the self that we developed during our 'Choose a Door' game. Saying the self is an interaction between gut reaction and thought is nearly the same as saying the self is ego, id, and superego. The combination of these two stories is interesting; they have given me insight into free will and helped me more critically investigate the choices I make while experiencing life. I am not sure I make the choices at all. Still, with further investigation into the unconscious parts of the mind, perhaps I will gain more knowledge about how much or how little free will I actually have and who I am.

All Psych Online: The Virtual Psychology Classroom. "Freud's Structural and Topographical Models of Personality". Last Update: March 21, 2004. Date of Access: May 4, 2006.
Carroll, Robert T. The Skeptic's Dictionary. "Psychoanalysis". http://skepdic.com/psychoan.html. Date of Access: May 4, 2006.
Carroll, Robert T. The Skeptic's Dictionary. "Unconscious Mind". http://skepdic.com/unconscious.html. Date of Access: May 4, 2006.
Freud, Sigmund. "The Unconscious". http://www.ship.edu/~cgboeree/freudselection.html. Date of Access: May 4, 2006.
Thornton, Steven P. Internet Encyclopedia of Psychology: "Sigmund Freud". http://www.iep.utm.edu/f/freud.htm. Date of Access: May 4, 2006.
Wikipedia. "Ego, super-ego, and id". http://en.wikipedia.org/wiki/Ego%2C_super-ego%2C_and_id. Last Updated: May 4, 2006. Date of Access: May 4, 2006.

Full Name:  Mariya Simakova
Username:  msimakov@brynmawr.edu
Title:  The Neurobiology of Nostalgia: A Story of Memory, Emotion, and the Self
Date:  2006-05-05 14:18:12
Message Id:  19245
Paper Text:


Biology 202

2006 Third Web Paper

On Serendip

-Is nostalgia debilitating or enriching?
-Neither. It's one of a thousand tender emotions.

Vladimir Nabokov
Interview with BBC-2, 1969

It is a normal day on the Bryn Mawr campus. I am walking to class, my reading is done, it is the beginning of spring, and the air is filled with the scent of magnolias. Suddenly I stop, shell-shocked. It is too early in the season for the daily ritual of grass-mowing, so a few brave weeds recklessly hasten to reach their full height. Nothing has changed: the buildings, the trees, the green are as they were a moment ago. And yet I am in Russia; I feel it within every cell of my body. Even the tangible evidence of the gray stone walls seems insignificant in comparison; in fact, the buildings around me look as if they have been cut out of cardboard. Reality feels artificial, while the holistic memory within me has an air of truth. How does my mind make that leap? I can logically trace the development of that thought: they don't mow the grass all that often in Russia, so it is natural to make the connection. But how can I explain the sheer intensity of the memory, the feeling of its immediate reality?

The phenomenon of nostalgia is complex and fascinating. Although the term is usually used for the particular shade of immigrant homesickness, for the desire to return to one's home country, it can also be applied to our common longing for "the old times," for the flavor of jam that our grandmother used to make or for the song that they used to play on the radio twenty years ago. Despite its widespread character, nostalgia is not well-studied. The neurobiology of memory and emotion is still in its infancy, and nostalgia is, perhaps, much too complex to consider meaningfully at this time. Yet it seems that the current understandings of emotion and memory can be applied to it. In this paper, I will attempt to do so by trying to unpack my own experiences of and with nostalgia in light of contemporary neurobiological research. Naturally, this is a laywoman's attempt and bound to be wrong as such, but I think it will help me understand something about myself that, until now, I have taken for granted.

1. What Is Memory?

When most of us think about memory, we imagine a grand neural library, with complete records filed away at particular locations, ready to be pulled off the shelf when the correct stimulus is introduced. Current neurobiological research, however, challenges this model of memory and suggests that there are no complete "records" stored anywhere in the brain. Instead, both long- and short-term memories arise from the synaptic interactions between neurons. When a short-term memory is created (for instance, when you hear the name of a new acquaintance), the initial firing of the neuron makes the synapse temporarily more sensitive to similar subsequent signals: that is, the synapse is "strengthened." Over time, however, this sensitivity wears out. In order to acquire a long-term memory of the event (for instance, in order to remember the name of a new friend), the synapse must be strengthened permanently; that is, it must remain sensitive to or familiar with the particular neural firing pattern. (6) This permanent strengthening constitutes a memory "trace" in the brain, a path that, once made, is followed easily by subsequent signals. Complex memories of events constitute patterns of traces formed over large areas of the brain. When a familiar signal (such as the sight of an unmowed weed) is introduced, our brain facilitates its communication to other neurons, which respond with their own firing patterns. A particular memory, then, is not "retrieved," in the sense of being pulled off the shelf. It is, in fact, recreated in "real time" by complex neural interactions, using the existing trace pathways. (1) The task of remembering is essentially a creative and multivalent process, not a direct "input-and-output" connection.
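The strengthen-then-decay-or-consolidate picture described above can be caricatured in a few lines of code. This is a deliberately crude, illustrative sketch, not a model taken from the paper or from actual neuroscience; the threshold and decay values are arbitrary assumptions made only to show the asymmetry between short- and long-term strengthening:

```python
class Synapse:
    """Toy synapse: temporary strengthening that either fades or consolidates."""

    def __init__(self):
        self.strength = 0.0
        self.consolidated = False

    def fire(self):
        # Each firing temporarily sensitizes ("strengthens") the synapse.
        self.strength += 1.0
        # Repeated firing past an (arbitrary) threshold makes the trace permanent.
        if self.strength >= 3.0:
            self.consolidated = True

    def decay(self):
        # Without repetition, short-term strengthening wears out over time;
        # a consolidated (long-term) trace does not.
        if not self.consolidated:
            self.strength = max(0.0, self.strength - 0.5)


s = Synapse()
s.fire()                 # hear a name once: short-term strengthening only
for _ in range(10):
    s.decay()            # time passes without repetition
print(s.strength, s.consolidated)    # prints: 0.0 False  (the memory fades)

s2 = Synapse()
for _ in range(3):
    s2.fire()            # repeated exposure crosses the threshold
for _ in range(10):
    s2.decay()
print(s2.strength, s2.consolidated)  # prints: 3.0 True  (a lasting trace)
```

The only point of the toy is the asymmetry: a single firing leaves a trace that wears out, while repeated firing crosses a threshold after which the "memory" persists, loosely mirroring the short-term versus long-term distinction the paragraph describes.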

Such a model of memory stresses the importance of the "rules of storytelling" (1), of the larger interrelated brain structures required for the creation of particular memories. In order for me to connect the sight of the weeds to my earlier Russian experiences, a complex pattern of neural interactions, which constitutes my "story" about the event, must be played out. What is that pattern and where is it located in the brain?

2. The Geography of Memory

Memory is not unitary. There are "subgenres" of it, each constituting a separate way of telling stories. Declarative memory, for instance, is conscious. We normally think of declarative memory as the memory phenomenon par excellence, since it is responsible for our lasting knowledge of events, facts, and the like, and makes it possible for us to access this knowledge at will. Declarative memory can be subdivided into episodic memory (which contains autobiographical, and especially spatiotemporal, information) and semantic memory (our knowledge of general facts about the world). Procedural memory, on the other hand, is unconscious and is responsible for our ability to retain information about action patterns, such as driving a car or making coffee.

Classically, different areas of the brain are thought to be responsible for different subgenres. Thus, short-term declarative memory is believed to be connected to the cortical regions of the brain, with the left prefrontal cortex involved in episodic memory formation and prefrontal and parietal cortical areas involved in episodic (and especially spatiotemporal) memory retrieval (7). At some stage of the process of long-term declarative memory formation, the hippocampal regions become involved. Studies show that if these areas are damaged, patients can no longer form lasting memories, although they function well in situations calling for short-term retrieval. After some time, however, long-term declarative memory becomes independent of the hippocampus, since patients with damage to this area are able to retain memories of events prior to the injury (7). Nevertheless, it is difficult to pinpoint the exact brain regions responsible for particular memory functions, and it may be suggested that the processes responsible for memory formation and retrieval are not localized but, rather, exist in widely distributed brain structures. Moreover, it is difficult to separate the processes and structures involved in memory formation and retrieval (7) and it may be that these processes are, in fact, interrelated and perhaps even simultaneous. It is also hard to separate memory-related processes from other directly related cognitive processes, such as the evaluation of particular memories in terms of the current situation.

3. Emotion and Memory

While the above structures and processes are involved primarily in memory formation due to repetition, it has long been established that emotions can have a strong effect on memory even in the absence of repetitive situations. Studies exploring the interactions between fear and memory shed some light on the processes and structures implicated.

It has been suggested that there exist at least two distinct memory systems that can act independently of each other. The first system is linked to the amygdala and specializes in forming memories directly related to emotional experiences. The second system is connected to the hippocampus and regulates declarative and episodic memory unrelated to emotion (such as repetition-based memory and recollection of events at will). Although these systems can theoretically stand on their own, in practice they continually influence each other.

The amygdala, which is connected to human emotion, can influence the encoding and storage of hippocampal-dependent episodic memories. We all know from experience that memories of emotional events are often more vivid and persistent than the ones acquired by mere repetition. One reason for this may be that, when the emotional stimulus is encountered for the first time, the amygdala enhances our ability to perceive it and the attention that we pay to it, which facilitates the formation of the memory trace. The amygdala has direct connections with sensory cortical processing regions (for instance, with the visual cortex). Therefore, it receives signals from them before the hippocampus does and can increase our sensitivity to them without our being consciously aware of it. It has been suggested that the events "tagged" by the amygdala as emotionally significant receive priority in memory encoding. (4)

Moreover, the amygdala can influence the retention of emotional memories. It is known that hippocampal-dependent memories undergo a period of fragility, when the synapses can be easily desensitized if the signal is not repeated. This has been called the process of memory consolidation. Emotional reactions mediated by the amygdala (such as arousal and the release of hormones) can "tag" these memories as necessary for survival and, therefore, worthy of retention. Brain imaging studies show a correlation between amygdala activity at the moment of encoding and later persistent memory of emotional stimuli. Moreover, the amygdala is thought to influence perceptual systems as well, possibly affecting the very strength of the signals we receive. (4)

The hippocampus, in turn, can influence amygdala function. When a subject learns by verbal instruction to associate a signal (such as, for instance, a blue square) with an aversive event (a mild shock to the wrist), there is activity in the amygdala even when the signal is never actually accompanied by the aversive event. That is, the amygdala "experiences" a painful emotion every time a blue square is presented, even if this experience is merely "imaginary." The acquisition and possibly retrieval of hippocampal-dependent episodic memory can thus significantly influence the amygdala, which, in turn, has an effect on the physiological expression of emotion when an appropriate signal is encountered. (4)

The interaction between the amygdala and the hippocampus may also play an important role in our self-regulation of social behavior. For instance, when subjects are taught specific strategies (which require hippocampal-dependent memory for acquisition and application) for reappraising negative events in a positive or non-emotional light, both subjective reports of emotional reaction to negative stimuli and the amygdala response to them are diminished. The hippocampus thus plays a role in modulating amygdala function (4), which is connected to such emotions as fear, sadness, and anticipation of pain, especially those with autobiographical and personal value, and to the generation of responses to these emotions (5).

4. Memory and the Self

While it is evident that memory is one of the components making up the complex thing we call the self, the mechanisms of its participation in this formation are not clear. It has long been noted that patients with damage to the hippocampus can still "remember," that is, identify as theirs, specific personality traits. Therefore, something other than the hippocampus must be able to reconstruct personal memories. Todd Heatherton, a psychologist at Dartmouth College, suggests that the medial prefrontal cortex, located in the cleft between the hemispheres of the brain, plays an important role in consolidating and recreating perceptions and memories to produce a unitary sense of self. Just as the hippocampus does not "store" but creates memories by combining trace patterns, the medial prefrontal cortex creates (at least in part) the sense of self. (3)

Matthew Lieberman of UCLA suggests that there may be two systems connecting memories and the sense of self in the human brain. His experiments imply that when we are confronted with words or experiences that have become deeply entrenched in us, whether by repetition or emotion (for instance, athletes strongly associate the word "athletic" with their sense of self), what he calls the reflexive memory system is at work. It mediates not memories but intuitions; bypassing the regions of the brain that require explicit reasoning, it taps into the regions that produce quick emotional responses. These types of connections take a long time to form, but once formed, they allow athletes to identify themselves as "athletic" without the mechanism of memory retrieval. The reflective memory system, on the other hand, mediates our responses to neutral or ambiguous personal stimuli (for instance, the association of the word "performer" with the sense of self in an athlete). Lieberman argues that this system utilizes the hippocampus and other areas of the brain already known to be involved in memory retrieval/recreation; when faced with such signals (especially in new circumstances), the reflective system is responsible for thinking consciously about our experiences and for connecting them meaningfully with already existing memory traces. (3)

Overall, the lower regions of the brain are responsible for the retention of memory traces, while the higher regions (such as the neocortex) provide frameworks of stories for the meaningful recreation and recombination of these traces. When a new input is received, the neocortex attempts to fit it into the already existing frameworks or pattern structures (1). Our brains are conservative: they place a value (often an emotional value) on the coherence and stability of old stories and do not easily change them. Humans want to preserve things the way they were; they feel comfortable when new experiences fit their previous stories (2).

It should also be noted that the mechanisms of memory reconstruction are so complex and interrelated that sometimes there can be no understanding of what has affected us and, especially, of what will affect us in the future. We like to preserve our past stories, but the brain is also a creative organ that can change its frameworks from one moment to another. Therefore, memory can be productive and generative, with the new sensory inputs (or sensory inputs resembling the old ones but appearing in new situations) altering our stories of self. Precisely because memory is not a complete record that can be pulled off the shelf, precisely because it is never "real" or "faithful," it can play an important role in creating new stories (2).

5. What All of This Means for My Nostalgia

What do these understandings about memory, emotion, and the self have to do with my nostalgia? Everything. They provide me with a framework for exploring its mechanisms and the effects it has on my emotional and rational well-being. Over time, complex interrelated patterns of traces have formed in my brain. Partially, they were formed by repetition: for example, I have learned to associate unmowed weeds with the summers spent at my grandmother's country house. Partially, they have been shaped by the emotions associated with them, including the sense of belonging, comfort with my surroundings, love for my country, the happiness of my childhood, and so on. After my move to America, the positive value of these emotions was subjectively strengthened by the fact of my remembering, or perhaps reminding myself, that I am deprived of them. Moreover, the existence of these set memory patterns, of stories already used by the brain to account for certain signals (such as the unmowed weed associated with Russia, summer, and childhood), is comfortable in its own right. Therefore, when the signal (such as the weed) is presented to my conservative and emotional brain, the latter recreates the familiar story. The problem is that this story clashes with the current reality: I am in America, not in Russia. Instead of accommodating the story to the signal, my memory and emotional systems attempt to fit the signal into the existing schemes. When they are successful, I experience the feeling of being back in Russia and of the unreality of the American landscape. When they fail, the emotional pain of nostalgia enters as a reaction to the mismatch between the brain's expectations and the outside signals. The feelings of unreality and pain can be simultaneous, since both are involved in the processes related to my story of self in the present moment.

Most painful and disruptive nostalgia episodes can, in fact, be self-induced. If the hippocampus and the amygdala do exist in a highly reciprocal relationship, I can actually "teach" myself that certain events or experiences (such as eating my mother's pies, for example) are pleasant, while others (such as American cuisine) are not, even if there is no "objective" emotional value to these experiences. When I encounter them, they will elicit strong positive or negative nostalgic emotions.

On the other hand, some of my memories and their emotional associations have migrated into my unconscious and become deeply interwoven with my enduring sense of self. Perhaps this can account for the fact that I react nostalgically to certain stimuli without consciously understanding or acknowledging them. Sometimes a sight or a smell makes me happy or sad, and only later do I realize that it was because it made me feel like I was back home. And I am not certain that I can (or want to) change these responses, since such an alteration would modify some parts of my self that I consider fundamental.

Although nostalgia is usually associated with painful and negative emotions, with the inability to adapt to new surroundings and situations, with the insistence on continuously reliving the past, this need not be true in all cases. There can be a creative and positive way of experiencing the feelings of nostalgia, whether they involve happiness or pain. If memory retrieval triggers new memory formation, as in my particular experience with the weeds in front of Denbigh, it can result in a creative re-appraisal of my current stories of self. I can consciously teach myself to re-evaluate these experiences in a positive and generative light, for instance, as providing me with new knowledge or understandings, and by doing so diminish the negative (in the sense of debilitating and uncreative) emotional responses. But I don't even have to transform painful emotional memories into happy ones. The sadness and pain of nostalgia are not necessarily incapacitating; they can (and do) provide me with a different way of looking at the world, with a more complete sense of my being. Often, nostalgia is a prompting for creative work par excellence, and this paper can serve as an example. As with all elements of our stories of the self, nostalgia can be useful to our well-being, emotional and rational. And I believe that this usefulness is enhanced by a clearer understanding of the neural mechanisms involved in it.

WWW Sources

1)A summary of the 2004-2005 Brown Bag Discussion in BMC Center for Science in Society on Memory, History, and the Brain, Part II., Transcribed by Anne Dalke. Presenters: Elliott Shore, Paul Grobstein, and Paula Viterbo.

2)A summary of the 2004-2005 Brown Bag Discussion in BMC Center for Science in Society on Memory, History, and the Brain, Part I., Presenters: Elliott Shore and Paul Grobstein.

3)Zimmer, Carl. "The Neurobiology of Self." Scientific American, November 2005.

4)Phelps, Elizabeth. "Human emotion and memory: interactions of the amygdala and hippocampal complex." Current Opinion in Neurobiology, 2004.

5)Phillips, M. L. "Understanding the Neurobiology of Emotion Perception: Implications for Psychiatry." British Journal of Psychiatry, 2003.

6)Fields, Douglas R. "Making Memories Stick." Scientific American, 2005.

7)Roskies, Adina. "Mapping Memory with Positron Emission Tomography." Proceedings of the National Academy of Sciences of the United States of America, March 1994.

Full Name:  Brom Snyder
Username:  msnyder@haverford.edu
Title:  Language and the Brain
Date:  2006-05-05 15:15:10
Message Id:  19251
Paper Text:


Biology 202

2006 Third Web Paper

On Serendip

Language's complexity and constant evolution make it one of the most fascinating and difficult frameworks to understand. Human beings have the ability to take two completely unrelated things, such as a feeling of warmth and an arbitrary sound, and use the sound to convey the meaning and essence of that feeling to all the humans around them. The study of the brain and language is a rich and diverse field, opening the door to a greater understanding of the most basic elements of humanity.

The definition of language varies depending on the source. For some, language is defined as a process of neural pattern transference: language involves the imperfect translation of the neural pattern in one individual to the brain of another. (1) Other scientists view language as a code with symbols "that can be connected to the words and phrases in a language." (2) Language serves as the point of convergence of physical objects and the abstract, assigning arbitrary but agreed-upon signifiers to objects and ideas that exist in individuals' brains.

Within the brain, the left hemisphere dominates language processing. Much of the brain's language processing occurs in areas associated with the sylvian fissure, which divides the parietal, frontal, and temporal lobes and serves as a point of convergence for these three regions of the brain. (3) The sylvian fissure houses the advanced association cortex, an area of the brain integral to the cognitive processes linked with language. The necessity of the advanced association cortex in the processing of language makes sense because, on a basic level, language acts as a metaphor: "it finds 'connections' between the things in the mind" and things seen and experienced. (1) The generation of connections between unrelated entities, like a sound and an object or idea, may be a common form of synaesthesia, a neurological condition in which the subject experiences "a neurological mixing of senses." (4) People experiencing synaesthesia will see certain colors associated with certain sounds. In the case of language, an object or feeling may have initially been associated with a sound made by humans. If this sound was consistently replicated by a group of humans, it could develop into language, pairing a specific meaning with a specific sound.

For most of human history, the only opportunity to examine the brain's language processing ability occurred when the brain suffered an injury. Doctors in the eighteenth and nineteenth centuries realized that the brain played a vital role in the processing of language because strokes or head injuries to certain areas caused a loss of language capability. One of the most famous doctors associated with the study of the link between brain damage and language problems was Paul Broca, a nineteenth-century French neurologist, whose patients exhibited a large vocabulary but could not form grammatically correct sentences. These patients had suffered damage to the inferior frontal gyrus, a region of the brain thought to control the movement of the tongue and mouth. (5) The development of instruments like MRI, which reveal the amount of neural activity in areas of the brain during a wide variety of tasks, promotes a much greater understanding of the neural activity associated with forming and processing language.

Among linguists there exists a theory that a "universal grammar" underlies all human languages, meaning that "all humans are 'programmed' with a structure that underlies all surface expressions of language." (6) Noam Chomsky, one of the most outspoken supporters of the universal grammar theory, argues that "certain aspects of our knowledge and understanding are innate, part of our biological endowment, genetically determined, on a par with the elements of our common nature that cause us to grow arms and legs rather than wings." (7) Chomsky's assertion that part of language is biologically innate opens the door for an interesting examination of the brain. If Chomsky is correct, there must be a structure within the brain, possibly the advanced association cortex, where the neural pathways in almost every human follow a basic pattern, processing the signals involved with language similarly at the most basic level. The innate nature of language leads to the question of whether other capacities, like reason and morality, also have a center in the brain. Do humans have an innate morality or framework for organizing the world that emerges despite differences in environmental factors?

In many ways, language is the factor that distinguishes humans from every other living organism. While other organisms use sounds to convey messages concerning direct matters of survival, humans use language to describe and transfer abstract concepts, an ability other organisms lack. The emergence of language marked one of humanity's greatest triumphs. Like any worthy scientific inquiry, an examination of language and its relationship to the brain leaves the scientist with more questions than they started with. While language is vital to humanity, we are currently only scratching the surface of how it is created and processed.

1)"Language as a neural process"
2) www.med.harvard.edu/publications/On_The_Brain/Volume 4/Number4/F95Lang.html "Language and the Brain" by David Caplan
3)"Lateral Sulcus"
4) "Synaesthesia –union of the senses" by Adrianhon
6)"Universal Grammar"
7) http://www.zmag.org/ZMag/grammar.htm "Universal Grammar and Linguistics" by Michael Albert

Full Name:  Perrin Braun
Username:  pbraun@brynmawr.edu
Title:  4.0 = Good for the Mind, Not for the Body?
Date:  2006-05-09 21:27:14
Message Id:  19295
Paper Text:
Biology 202
2006 Third Web Paper
On Serendip

As the end of finals week approaches at Bryn Mawr, many frazzled Mawrtyrs will attest that their eating habits change drastically during midterms and finals. Seniors, in the midst of writing their theses, often complain of weight gain or suppressed appetite that they claim correlates with the level of stress they are experiencing. Regarding weight fluctuation, it is not uncommon to hear impending graduates warn underclassmen that "it's not the freshman fifteen, it's the thesis thirty." Many Mawrtyrs can relate to coming back to their rooms from a stressful day of class and studying and mindlessly binge eating. It is no secret that college students have notoriously horrible eating habits. Many people report eating even when they are not hungry to begin with. Studies show that people who eat while performing other tasks, like studying or watching television, eat more than people without distractions because they tend not to register the quantity or quality of what they are consuming. However, every individual has different eating habits, which can be categorized into two groups: restrained and unrestrained eaters. "Restrained" eaters are intentionally restrictive with their diet but are far more at risk for binging and excessive eating when distracted by stress than their unrestrained counterparts. In a recent study performed at Swarthmore College, in which half the participants were unrestrained and the other half restrained eaters, students were asked to perform mental tasks with a bowl of snack food within reach. The findings show that the restrained group snacked on much more food during the test than the unrestrained eaters did. Essentially, the response of these two groups to food during periods of concentration is quite different (2). For many individuals, however, mindless eating is a way of coping with feelings of pressure or sadness.
Indulging in "comfort food" as a way to alleviate stress is common in a collegiate setting, where incoming freshmen are thrust into a brand-new environment and seniors are preoccupied with advanced research and thoughts of life after college. Dietary restraint, a theory extremely applicable to Bryn Mawr students, "conceptualizes individuals as differing on a spectrum of dietary concern and self-awareness about body image and weight; the relatively highest prevalence rates for dietary restraint are found in young women" (1). Several theories exist as to why some people have the tendency to over-eat. It has been suggested that binging provides a temporary reprieve from stress and re-directs our anxiety towards food; these theories postulate, however, that food influences stress recovery differently for restrained and unrestrained eaters. Re-attributing our anxiety to binging does not completely alleviate our stress, for it only re-directs it to a different source. When we experience periods of great emotional stress, our bodies produce hormones like adrenaline and cortisol. When the period of stress is over, cortisol increases appetite so that the carbohydrates and fat that should have been burned are replaced. This is due to our biological history: when our ancestors were under stress, they needed these chemicals to increase their heart rate and tense their muscles so that they could be alert enough to fight (4). Our ancestors were much more physically active than we are today, but we still have the same hormones, which tell us to eat even if we have been sedentary all day, studying at a desk. While some foods do have a temporarily calming effect due to certain chemicals they release in the body, the ones we commonly reach for when tense actually increase our level of irritability.
These foods are often called "pseudostressors" or "sympathomimetics" because they mimic the sympathetic nervous system, which is involved in stress reactions. For instance, foods that contain refined sugar and carbohydrates are not easily processed by the body. Additionally, consuming a large amount of sugar in a short time span results in hypoglycemia, which is characterized by headaches, anxiety, and irritability. Sugar raises blood glucose levels, leading to insulin resistance and placing more strain on the pancreas. Processed foods, such as common "junk" or snack foods, contain synthetic products, such as Sodium Yellow and Tartrazine, that are believed to cause stress. Artificial colors are actually believed to excite the central nervous system, contributing to hyperactivity and allergies in children (3). Therefore, the prevalence of obesity in our society might be due in part to the amount of environmental and self-imposed stress that Americans experience daily. Healthy eating and exercise can help to alleviate our daily anxiety, but there is a definite biological explanation for why we are so quick to reach for that bag of cookies or potato chips. Though food consumption can certainly gratify our senses temporarily, stress-eating does not successfully combat the problem that is causing us to eat.

Works Cited
1)Journal of Behavioral Medicine, experiment on eating habits
2)MedicineNet, guide on healthy living
3)Better Nutrition, article on food and stress
4)Overweight website, article on food and stress

Full Name:  Perrin Braun
Username:  pbraun@brynmawr.edu
Title:  4.0 = Good for the Mind, Not for the Body?
Date:  2006-05-09 21:29:16
Message Id:  19296
Paper Text:
Biology 202
2006 Third Web Paper
On Serendip

As the end of finals week approaches at Bryn Mawr, many frazzled Mawrtyrs will attest to the fact that their eating habits drastically change during midterms and finals. Seniors, in the midst of writing their theses, will often complain of weight gain or suppressed appetite that they claim correlates to the level of stress they are experiencing. Regarding weight fluctuation, it is not uncommon to hear impending graduates warn underclassmen that "it's not the freshman fifteen, it's the thesis thirty." Many Mawrtyrs can relate to coming back to their rooms from a stressful day of class and studying and mindlessly binge eating. It is no secret that college students have notoriously horrible eating habits. Many people report eating even when they are not hungry to begin with. Studies show that people who eat while performing other tasks, like studying or watching television, eat more than people without distractions because they tend to eat without realizing the quantity or quality of what they are consuming. However, every individual has different eating habits, which fall into two groups: restrained and unrestrained eaters. "Restrained" eaters are intentionally restrictive with their diet, but are far more at risk for binging and excessive eating when distracted by stress than their unrestrained counterparts. In a recent study performed at Swarthmore College, in which half the participants were unrestrained and the other half were restrained eaters, students were asked to perform mental tasks with a bowl of snack food within reach. The findings show that the restrained group snacked on much more food during the test than the unrestrained eaters did. Essentially, the response of these two groups to food during periods of concentration is quite different (2). For many individuals, however, mindless eating is a way of coping with feelings of pressure or sadness. 
Indulging in "comfort food" as a way to alleviate stress is common in a collegiate setting, where incoming freshmen are thrust into a brand-new environment and seniors are preoccupied with advanced research and thoughts of life after college. Dietary restraint, a theory that is extremely applicable to Bryn Mawr students, "conceptualizes individuals as differing on a spectrum of dietary concern and self-awareness about body image and weight; the relatively highest prevalence rates for dietary restraint are found in young women" (1). Several theories exist as to why some people have the tendency to over-eat. It has been suggested that binging provides us with a temporary reprieve from stress and re-directs our anxiety towards food. These ideas postulate, however, that food influences stress recovery in different ways for restrained and unrestrained eaters. Re-attributing our anxiety towards binging does not completely alleviate our stress, however, for it only serves to re-direct it to a different source. When we experience periods of great emotional stress, our bodies produce hormones like adrenaline and cortisol. When the period of stress is over, cortisol increases appetite so that the carbohydrates and fat that should have been burned are replaced. This is due to our biological history, for when our ancestors were under stress, they needed these chemicals to increase their heart rate and tense their muscles so that they could be alert enough to fight (4). Our ancestors were much more physically active than we are today, but we have inherited the same hormones, which tell us to eat even if we have been sedentary all day studying at a desk. While some foods do indeed have a temporarily calming effect due to certain chemicals that they release in the body, the ones that we commonly reach for when we are tense actually increase our level of irritability. 
These foods are often called "pseudostressors" or "sympathomimetics" because they imitate the sympathetic nervous system, which is involved in stress reactions. For instance, foods containing refined sugar and carbohydrates are not easily processed by the body. Additionally, consuming a large amount of sugar in a short time span results in hypoglycemia, which is characterized by headaches, anxiety, and irritability. Sugar raises blood glucose levels, leading to insulin resistance and placing more strain on the pancreas. Processed foods, such as common "junk" or snack foods, contain synthetic products, such as Sodium Yellow and Tartrazine, that are believed to cause stress. Artificial colors are actually believed to excite the central nervous system, contributing to hyperactivity and allergies in children (3). Therefore, the prevalence of obesity in our society might have to do in part with the amount of environmental and self-imposed stress that Americans experience on a daily basis. Healthy eating and exercise can help to alleviate our daily anxiety, but there is a definite biological explanation for why we are so quick to reach for that bag of cookies or potato chips. Though food consumption can certainly gratify our senses temporarily, stress-eating does not successfully combat the problem that is causing us to eat.

Works Cited
1)Journal of Behavioral Medicine, experiment on eating habits
2)MedicineNet, guide on healthy living
3)Better Nutrition, article on food and stress
4)Overweight website, article on food and stress

Full Name:  Rachel Freeland
Username:  rfreelan@brynmawr.edu
Title:  Review of Phantoms in the Brain: Probing the Mysteries of the Human Mind, By V.S. Ramachandran
Date:  2006-05-11 10:55:34
Message Id:  19314
Paper Text:


Biology 202

2006 Third Web Paper

On Serendip

In Phantoms in the Brain, Dr. V.S. Ramachandran recounts his work with patients who have bizarre neurological disorders, such as blindsight, phantom limb pain, and anosognosia. His findings shed light on the complex structural design of the brain and how the brain enables us to construct our body image, control our emotions, dream, make decisions, and have beliefs. These findings tell us about who we are. In addition, his studies support the idea that the self (the I-function) is an illusion arising from the interactions of many brain functions.

Although many neurological phenomena are explored throughout the book, I will be focusing on laughter and pseudocyesis (phantom pregnancy). While reading, I found these two neurological mysteries to be the most intriguing. In addition, I feel that they are good examples of how the brain, and in particular the I-function, contributes to our understanding of how signals from inside of the box get translated to outputs.

In Chapter 10, "The Woman Who Died Laughing," Ramachandran opens with a story about Willy. While Willy was attending his mother's funeral, he broke into a fit of laughter that wouldn't stop. His family checked him into the local hospital for evaluation. The doctors could find nothing wrong, save for his uncontrollable laughter. Two days later, a nurse found Willy unconscious in his bed, having suffered a subarachnoid hemorrhage, and he died without regaining consciousness. The postmortem showed a large ruptured aneurysm in an artery at the base of his brain that had compressed part of his hypothalamus, mammillary bodies, and other structures on the floor of his brain. According to Ramachandran, "the abnormal activity or damage that sets people giggling is almost always located in portions of the limbic system, a set of structures including the hypothalamus, mammillary bodies, and cingulate gyrus that are involved in emotion" (1). This theory fits perfectly with what we learned in class about the role of the I-function. If part of the limbic system is sending a constant signal to the I-function that says 'laugh', then the I-function might override the signal from the frontal lobes saying 'this is inappropriate'. I think it's fascinating that one small change in the brain's wiring causes a whole cascade of processes to go wrong.

In the next chapter, "You Forgot to Deliver the Twin," Ramachandran explores the phenomenon of phantom pregnancies. Mary was nine months pregnant when she first visited Dr. Monroe. She felt the baby kicking, suspected that labor was about to begin, and wanted Dr. Monroe to make sure the baby was in the right position. While examining her, Dr. Monroe found that her abdomen was vastly enlarged and low, suggesting that the fetus had dropped, and her breasts were swollen, the nipples mottled. However, he could not get a fetal heartbeat, and her navel was all wrong. One sure sign of pregnancy is a pushed-out belly button; Mary's was inverted. Mary was exhibiting a classic case of pseudocyesis: some women who want to get pregnant, or dread pregnancy, develop all the signs and symptoms of a true pregnancy. Everything seems normal except there is no baby. Dr. Monroe knew that if he told Mary there was no baby, she would not believe him. Instead, he told her he was going to put her to sleep and deliver the baby. When she awoke, he told her the baby had died during birth. Right away her abdomen began to subside.

Can one's mind really will one's body to be pregnant? Ramachandran explains this phenomenon as a case of operant conditioning and depression. "When Mary, who wants to be pregnant, sees her abdomen enlarge due to gas and feels her diaphragm fall, she learns unconsciously that the lower it falls, the more pregnant she looks" (1). In addition, intense longing for a child and the associated depression might reduce levels of dopamine and norepinephrine, which could in turn reduce production of both prolactin-inhibiting factor and follicle-stimulating hormone. Low levels of these hormones would suppress ovulation and menstruation while producing breast enlargement, lactation, and maternal behavior.

Pseudocyesis is a perfect example of how forming new observations can lead to a better understanding. One of the things I enjoyed learning this semester was not to take everything at face value. Science is not exact, and new observations are always being made and tested. This is very important: one should always question, since you never know what you'll prove or disprove. Science is not about being right, but about being less wrong. Ramachandran does a wonderful job of explaining the step-by-step processes he took to make science less wrong.

This book paralleled the course of the semester well because as I was reading it, we were going more into depth about the subjects in class. For example, while we were learning about vision, I was reading the chapter on blindsight. I was able to take the knowledge that I learned in class and apply it to what I was reading. I think it would be very helpful to students if this book was assigned as mandatory reading during the semester. It's easy to read, funny, and relates exactly to the class material.

Works Cited
1) Ramachandran, V.S. Phantoms in the Brain. New York: HarperCollins, 1998.

Full Name:  Rachel Freeland
Username:  rfreelan@brynmawr.edu
Title:  Should Marijuana be Legalized for Medicinal Use?
Date:  2006-05-11 11:13:19
Message Id:  19316
Paper Text:


Biology 202

2006 Third Web Paper

On Serendip

Abigail was diagnosed with breast cancer the year she turned forty. While undergoing chemotherapy, she was sick and vomiting constantly as a result of her treatments. No legal drugs, including the synthetic "marijuana" pill Marinol, helped her situation. As a result, she turned to marijuana, which she was forced to obtain illegally. With it, however, she was able to keep her food down, was comfortable, and even gained weight. Every day, people suffering from diseases such as AIDS, cancer, glaucoma, epilepsy, Tourette's syndrome, Huntington's disease, and Parkinson's disease turn illegally to marijuana to help relieve their debilitating symptoms. Why is marijuana illegal if it has helped so many people cope with their illnesses? Should marijuana be legalized for medicinal use?

Marijuana is the common name for Cannabis sativa, a hemp plant that grows throughout temperate and tropical climates. Delta-9-tetrahydrocannabinol (THC) is the primary psychoactive ingredient; depending on the particular plant, either THC or cannabidiol is the most abundant cannabinoid in marijuana.

When marijuana is ingested, THC binds to the CB1 cannabinoid receptor in the brain. The presence of cannabinoid systems in key brain regions is strongly tied to the functions and pathology associated with those regions. For example, nausea and vomiting (emesis) are produced by excitation of one or a combination of triggers in the gastrointestinal tract, brain stem, and higher brain centers (2). There are numerous cannabinoid receptors in the nucleus of the solitary tract, a brain center that is important in the control of emesis. The clinical value of cannabinoid systems is best understood in the context of the biology of these brain regions.

In both AIDS and cancer patients, weight loss due to nausea and vomiting is very common. "Weight loss of as little as 5% is associated with decreased survival and a body weight about one-third below ideal body weight results in death" (2). In addition, patients may even break bones or rupture the esophagus while vomiting. Many patients decide to stop chemotherapy treatments in order for the nausea and vomiting to cease, even though they know it might mean death. Although there are appetite stimulating drugs available on the market, many patients find that they are not effective or cause unpleasant side effects. Smoking marijuana seems to stimulate appetite and enhance the flavor of food. In a 1971 study by L. Hollister, "Hunger and appetite after single doses of marihuana, alcohol, and dextroamphetamine," four groups of subjects received either marijuana, alcohol, dextroamphetamine (a stimulant), or a placebo after fasting for twelve hours. They were then periodically offered milkshakes and asked to say how hungry they were and how much they enjoyed the food. "The subjects who took marijuana felt hungrier and ate more, those who took dextroamphetamine felt less hungry and ate less, and the effect of alcohol was negligible" (1).

However, the most compelling concerns regarding marijuana smoking in AIDS and cancer patients are its possible effects on immunity. AIDS and cancer patients already have weak immune systems; if marijuana decreases their immune response, then they won't be able to fight off the disease. The human body protects itself from invaders, such as bacteria and viruses, through an elaborate and dynamic network of organs and cells. Cannabinoids, especially THC, can modulate the function of immune cells in various ways, in some cases enhancing and in others diminishing the immune response. However, the natural function of cannabinoids in the immune system is not known. "Immune cells respond to cannabinoids in a variety of ways, depending on such factors as drug concentration, timing of drug delivery to leukocytes in relation to antigen stimulation, and type of cell function" (2). Should the risk of a diminished immune system keep patients from obtaining marijuana to relieve their nausea and vomiting?

The question is really not whether marijuana can be used as an herbal remedy, but rather how well this remedy meets today's standards of efficacy and safety. We understand much more than previous generations about medical risks. Our society generally expects its licensed medications to be safe, reliable, and of proven efficacy; contaminants and inconsistent ingredients in our health treatments are not tolerated (2). That applies not only to prescription and over-the-counter drugs, but also to vitamin supplements and herbal remedies purchased at the grocery store. Do the risks associated with smoking marijuana (tolerance, dependence, withdrawal, decreased immunity and cognitive ability, and changes in mood) outweigh the benefits (decreased nausea and vomiting, weight gain, diminished pain perception, relief from muscle spasticity in neurological disorders, and lower arterial systolic blood pressure)? According to the FDA, the risks associated with smoking marijuana for medicinal purposes outweigh the positives. It seems to me that medicinal marijuana should be used only when no other treatments are successful, and only if the patient is made fully aware of the consequences of ingestion. Many medications on the market today have adverse side effects; no drug is perfect. Why let people suffer when there is relief available?

Works Cited
1) Hollister, L.E. "Hunger and Appetite after Single Doses of Marihuana, Alcohol, and Dextroamphetamine." Clinical Pharmacology and Therapeutics 12 (January 1971): 44-49.
2)Assessing the Science Base of Marijuana and Medicine, information about medicinal marijuana

Full Name:  Fatu Badiane
Username:  fbadiane@brynmawr.edu
Title:  Diagnosis: Culture
Date:  2006-05-11 11:47:12
Message Id:  19318
Paper Text:


Biology 202

2006 Third Web Paper

On Serendip

One's understanding of the outside world comes from a combination of what actually exists and what the brain tells us exists. One sees colors and hears sounds. Part of this perception is truth, the actual light waves that make color and the vibrations in the air that make sound; and part of it is the fantasy our brain uses to make sense of the world: blue instead of red, and high pitch instead of low pitch. This same need to make sense of the world can be seen within the cultures of the world. The many cultures that exist contain beliefs that help those within them make better sense of the world. Especially with regard to health, treatment can come through spirituality, herbal remedies, and sacred rituals.

With these many views on the treatment of health disorders also comes the occurrence of illnesses that vary from culture to culture: that is to say, syndromes that are found in some societies, but not within others. This is the basis of the definition given to culture-bound syndromes. The Diagnostic and Statistical Manual of Mental Disorders (DSM) "defines a culture-bound syndrome (CBS) [as] recurrent, locality-specific patterns of aberrant behavior and troubling experience that may or may not be linked to a particular DSM-IV diagnostic category. Many of these patterns are indigenously considered to be 'illnesses,' or at least afflictions, and most have local names." A CBS, then, is an illness that is particular to a certain grouping of people and may or may not be identifiable using the methods of Western medicine. More interestingly, such syndromes often do not have a biological basis (1).

At the heart of CBS is culture. Culture is the shared beliefs, norms, and values of a particular group (2). Culture is also a creation of peoples, and of their brains, to make order, bring unity, and understand the world around them. In terms of CBS, culture is an example of how the brain, along with other brains, works to justify what is inexplicable in its surrounding environment. But, this justification is not equally seen by all the brains of the world. The same characteristics can even be seen differently by different brains.

The existence of CBS is disputed among members of different disciplines. Currently, there are two main beliefs: one that CBS is rooted in biology, and the other that it is rooted in culture. Robert Hahn and Ivan Karp have written about their opposing views with regard to CBS. In "Culture-bound Syndromes Unbound," Robert A. Hahn expresses his belief that biology, as well as culture, is important in disease, but that humans are too closely attached to their culture. One of his main beliefs is that the diagnosis of an illness as culture-bound is flawed. For example, if one were to take a person trained in Western medicine and place them in another country with a different culture, they would find behaviors that seem abnormal to them. That observer would then return home after some time with a "new syndrome." This "new syndrome" is only found in that one place he visited, and is considered to be culture-bound. Hahn believes that during this evaluation process the biological basis of the "new syndrome" is being ignored. He believes that the condition must have a biological source, but because of the way it presents in that different environment, and because it does not fit in with known Western disorders, its origin is placed in the culture; it becomes a culture-bound syndrome (3).

Ivan Karp shares a slightly different view. In "Deconstructing Culture-Bound Syndromes" he agrees with Hahn that culture-to-culture comparison is to blame for the diagnosis of culture-bound syndrome. In the prior example, had the Westerner not compared the new culture with Western culture in such a negative light, he would not have believed the behavior to be a CBS. However, Karp also states that culture plays the most important role in the emergence of CBS. He believes that culture-bound syndromes do exist. Unlike Hahn, who believes there is only a biological basis to them, Karp thinks that there are different behaviors in cultures that point to the same overall behavioral sickness. He believes CBS is a real medical syndrome rooted in one's culture (3).

The difference between these two viewpoints lies in what each takes to be the origin of CBS. Hahn believes it is biology, and that culture changes the way one sees the symptoms. Karp believes that the foundation lies within the culture, and that different cultures can have different ways of expressing the symptoms of the same behavioral illness.

The role of culture in CBS is in establishing what is acceptable and what is not within that culture, and what should then be taken for illness based on those factors. An example of this is anorexia nervosa. The behavior seen in anorexia, which is considered non-adaptive in Western cultures, may be seen as something entirely different in other cultures. But in the Western world, because of this view, it is treated as a disease. This thought process is consistent with Karp's theory (3).

The existence of anorexia in the Western world is based within Western culture. Appearance, especially in women, is an ideal that is stressed in the Western world. Women must be slim and youthful to be considered beautiful and healthy. The opposite of these characteristics is viewed as ugly and lacking self-discipline. For some, anorexia is a way to obtain a desired body, while for others it is a means of self-discipline or control. These cultural values are what lead to the destructive behaviors that are associated with anorexia. In societies that are becoming more and more Western, such as Japan, anorexia is also becoming more and more common. This leads to the conclusion that the demands of Western culture on women are part of the prevalence of the illness within that society, and this makes anorexia nervosa a CBS (3).

Apart from the requirement that a CBS be categorized as a disease within the culture itself, as with anorexia, there are four more ways in which a CBS is recognized. A CBS must have at least one of the following characteristics: it is seen as a disease within the culture, it is familiar within the culture, it is unfamiliar to outside cultures, it has no demonstrable biological basis, and/or it can be treated with the folk medicine of the culture (4). Anorexia nervosa fits at least the first three descriptions: it is seen as a disease within Western culture, it is familiar in Western culture, and it is not familiar to non-Western cultures. Another disorder to note is amok, or mata elap, known in Malaysia. It is an aggression disorder characterized by violent, aggressive, or homicidal behavior towards other people or objects. Amok is also similar to disorders found in Polynesia, Puerto Rico, and among the Navaho Native Americans. It can be seen as a CBS in that it does not have a biological cause, nor does it correspond to any illness within Western culture. Another example is shenjing shuairuo, a depressive disorder in China. Shenjing shuairuo is very similar to the Western diagnosis of major depressive disorder, but it has only the physical symptoms of depression and none of the emotional features (1).

Culture-bound syndromes are indeed real. They are not imaginary with respect to cultural views, nor should they be dismissed. They are real disorders that are greatly influenced by their cultural surroundings. As stated previously, culture is a way of creating order and unity within a society of people through the workings of the brain. The brain works to change its environment through culture, but the culture can rebound and in turn affect the brain. This is the case in CBS, and it explains why some syndromes are seen only in certain cultures and not others.

The environment the brain is found in differs for all of the cultures of the world. A different environment means slightly different brains among the cultures of the world. And within those cultures, individuals will also have slightly different brains from one another. This explains how bouffée délirante, a panic disorder, can be found among West African and Haitian populations, but not among others (1). The environment constructed by African and Haitian culture demands different characteristics to be expressed by the brain. If these characteristics, or tasks, go astray, a disorder common only to that population will arise.

Culture-bound syndromes are an interesting phenomenon that results from the way groups of brains construct their environment. These constructions vary from place to place. With the acceptance of different values, norms, and customs, illnesses of the body and mind will be accepted and treated to varying degrees in different locations. Culture-bound syndromes could, as Hahn believes, be rooted purely in biology, but since many have no identifiable biological source, it is hard to reach this conclusion for all disorders. Karp's view, that the syndromes are rooted in the various cultural beliefs of the world, is easier to follow. Although the actual origin of these syndromes can only be seen within the particular culture they affect, it is still important to realize that they are factors to consider when looking at health, especially in Western society, which is often used as the standard for comparison.

Works Cited

1)Glossary of Culture-Bound Syndromes, This website contains a good list of the most common CBS found in Asia and other parts of the world. It gives information on what criteria a CBS must have.

2)Mental Health: Culture, Race, and Ethnicity – A Supplement to Mental Health: A Report of the Surgeon General, Although this report was not very relevant to the paper, its second chapter, "Culture counts: The influence of culture and society on mental health," takes a look at cultural differences in mental health treatment in the United States. It can give a good idea of how CBS can make things difficult for both doctors and patients.

3)Culture-Bound Syndromes, This thesis on ADHD contains a section that looks at how it may be a CBS. It also looks at anorexia nervosa and the two main theories that surround the origin of CBS.

4)Culture-Specific Syndrome, This article from Wikipedia gives general information about CBS. It also has good links for further research at the bottom of the page.

Full Name:  Andrea Goldstein
Username:  agoldste
Title:  Name That Tune: A Look at the Neural Basis of Tone Deafness
Date:  2006-05-11 16:35:13
Message Id:  19326
Paper Text:


Biology 202

2006 Third Web Paper

On Serendip

Music has long been considered a uniquely human concept. In fact, most psychologists agree that music is a universal human instinct. Like any ability, however, there is great variation in people's musical competence. For every brilliant composer or pianist in the world, there are several people we refer to as tone deaf. Far from being simply unable to carry a tune, people with tone deafness (or amusia, as it has been technically named) are unable to discriminate between tones, notice dissonance, or recognize familiar melodies. Such a "disorder" has been found to occur after some sort of brain damage, but in the last decade, a great deal of research has been done in an attempt to discover the cause of congenital amusia, which cannot be pinned to any brain damage, hearing problems, or lack of exposure to music. The findings of this research make a strong argument for the existence of an inborn disorder that impairs musical ability.

According to the extensive research of Dr. Isabelle Peretz of the University of Montreal, amusia is more complicated than the inability to distinguish pitches (1). An amusic can distinguish between two pitches that are far apart, but cannot tell the difference between intervals smaller than a half step on the Western diatonic scale, while most people can easily distinguish intervals much smaller than that. When listening to melodies in which a single note has been altered so that it is out of key with the rest of the melody, amusics do not notice a problem. Similarly, amusics find dissonant chords significantly less unpleasant than do people with normal musical abilities. As would be expected of someone who identifies as tone deaf, amusics perform significantly worse than normal people at singing and at tapping a rhythm along with a melody (2).

The most fascinating aspect of amusia is how incredibly specific to music it is. Because of music's close ties to language, it would seem that a musical impairment might be caused by a language impairment, or the other way around. Studies suggest, however, that language and musical ability are independent of one another. People with brain damage in areas critical to language are often still able to sing, despite being unable to communicate through speech (3). More interestingly, while amusics show deficiencies in their recognition of pitch differences in melodies, they show no impairment in recognizing intonation in speech. For example, amusics who speak tonal languages, such as Chinese, do not report having any difficulty discriminating between words that differ only in their intonation. The linguistic cues inherent in speech make discrimination of meaning much easier for tone deaf people (4). Without context clues, however, amusics show impairment in interpreting speech in the same way as in interpreting melodies. Amusics are also successful most of the time at detecting the mood of a melody, despite their inability to interpret its tonality. Further experiments have shown that amusics are able to recognize the lyrics of a song out of the context of its melody, can identify a speaker based on his or her voice, and can discriminate and identify environmental sounds (2). It seems that the issue has nothing to do with hearing; it is a processing deficit that occurs further along the neural pathway (5). In addition, amusics have no general deficiency of memory or attention; the disorder is simply very specific to music (2).

More recent work has focused on locating the part of the brain that is responsible for amusia. The temporal lobes, which are home to the primary auditory cortex (6), have been the primary suspects. It has long been believed that the temporal lobes, especially the right temporal lobe, are most active when engaged in musical activity (3), so any musical disability should logically stem from there as well. Because it has been shown that there is no hearing deficit in amusia, researchers moved on to the temporal neocortex, which is where more sophisticated processing of musical cues was thought to take place. The most recent studies, however, have suggested that the deficits in amusics are located outside the auditory cortex. EEGs of the brains of amusics do not show any reaction at all to differences smaller than a half step. When changes in tones are large, their brains overreact, showing twice as much activity on the right side of the brain as a normal brain hearing the same thing. These differences do not occur in the auditory cortex, indicating again that the deficits of amusia lie not in hearing impairment, but in higher processing of melodies (5).

So what does all this mean? It depends on your point of view. Looking only at the vast body of research by Peretz and her colleagues in the neuropsychology of music, it would appear that there is definitely some sort of disorder at play in amusia. As a student of neurobiology, however, I meet this idea with some skepticism. There is no doubt that the studies by Peretz et al. are legitimate and have found significant differences between the brains of so-called amusics and normal brains. The more important question now becomes one of normality. Every trait from skin color to intelligence to mood exists on a continuum; there is a great deal of variation from one extreme to the other. Just because we recognize basic musical ability as something that the vast majority of people have does not mean that the lack of it is abnormal. Although our culture has set a cut-off point for many traits (like intelligence or mood), these are arbitrary distinctions between normal and abnormal.

What makes an amusic worse off than a musical prodigy? Musical ability is culturally valued, and may have been a factor in survival at one point in human history, but it does not seem likely that it is still being selected for on an evolutionary scale. Darwin believed that music was adaptive as a way of finding a mate (3), but who needs to be able to sing to find a partner in an age when it's possible to stand outside someone's window playing a song that expresses your emotions on your iPod?

While the idea of amusia is interesting, it seems unfounded beyond the interpretation that it is simply one end of the continuum of innate musical ability. Comparing this "disorder" to learning disorders like dyslexia or specific language impairment seems to be going too far in our modern "a diagnosis for everything" society. Before amusia can really be declared a disability, further research must be done to determine whether lack of musical ability is actually detrimental in any way. If no disadvantages can be found to having amusia, then it is no more a disability than having poor fashion sense or bad handwriting.

1) Music on the Brain, from Time Magazine

2) Ayotte, J., Peretz, I., & Hyde, K. "Congenital amusia: A group study of adults afflicted with a music-specific disorder." Brain, 125(2), 238-251.

3) Music of the Hemispheres, from Discover

4) Tonal Languages for the Deaf, from WonderQuest

5) Experts tune in on tone deafness, from BBC News

6) Tonal Hearing, from Instituto de Fisiología Celular

Full Name:  Nicolette Belletier
Username:  nbelletier@brynmawr.edu
Title:  Lucid Dreaming and the Self
Date:  2006-05-11 23:00:08
Message Id:  19333
Paper Text:


Biology 202

2006 Third Web Paper

On Serendip

When someone remembers a dream, he or she is quick to share it with friends the next day. If the story is particularly strange, the dreamer is likely to be bewildered yet delighted to recount his or her strange adventures. A particular person may have made an unexpected appearance, or, more seriously, a loved one may have died. For some, dreams are an entertaining coincidence. Others seek meaning in dream dictionaries. No matter what the initial reaction, it is difficult to know what to make of dreams. Although the information in dream analysis books may not provide the dreamer with any insight about himself or herself, dreams stimulate introspection. Specifically, consciousness in dreams presents humans with an opportunity to explore their minds and gain a better understanding of the self.

The average dreamer may be startled by the level of reality that he or she experiences while maintaining the notion that a state of sleep requires a lack of mobility and consciousness. Widespread as this conception is, it has been shown that a person can carry out complex processes while remaining asleep. "Sleepwalkers" can walk and talk while technically sleeping. One young woman climbed to the top of a 130-foot crane, and there have been homicide cases in which the defendant claimed to have been sleepwalking while committing the crime (1), (2). Situations like these show that the sleep state can quickly turn from a place for creative adventures into a source of danger. For some, it might be fun and interesting to lie back and let dreams occur, only to perhaps remember them in the morning. For others, the lack of control that many people experience while asleep can lead to unpleasant nightmares or, in the case of people with sleep disorders, embarrassing behavior.

Some people however, do not experience dreams as uncontrollable or mysterious occurrences. Lucid dreamers have the ability to recognize that they are currently dreaming and are sometimes even able to control what they do in their dreams. Today, scientists are exploring the characteristics of lucid dreams and developing techniques for people to use to create them.

For people who have never experienced a lucid dream, the prospect of creating a complex adventure with no consequences in reality is fascinating. The Lucidity Institute organizes vacation trips in Hawaii to provide the curious with techniques and the proper setting to become lucid dreamers (3). However, lucid dreaming is not something in which only well-off tourists dabble. Wikibooks, a sister project of the widely successful open-content encyclopedia Wikipedia, features a volunteer-written guide to lucid dreaming; the Lucid Dreaming book was voted Book of the Month in February 2005 (4). It is clear that there is a public interest in learning about dreams and attaining consciousness without having to confront reality.

Some of the techniques being developed today to bring about lucidity may seem like science fiction. For example, Stephen LaBerge of the Lucidity Institute created the NovaDreamer, a sleep mask fully equipped with lights, speakers, and a device to track eye motion. The lights and sounds serve as "dream cues." Eye movement during REM sleep has been proven to be indicative of lucidity because lucid dreamers can perform certain predetermined eye movements while dreaming. By measuring eye movement, the NovaDreamer can detect the point at which a dreamer gains control of the dream. Eye movement measurements also allow the NovaDreamer to time cues to occur during REM sleep. Sometimes the cues will be incorporated into the dream; for example, a flashing light might become part of the dream story. If the dreamer becomes lucid, he or she will remember that the lights are part of the NovaDreamer and the dream state will be interrupted. Another function of the NovaDreamer is a button that the subject can press upon waking to delay the stimuli while he or she falls back asleep. The button makes a beeping noise when pressed, so it can provide yet another dream cue: some people will dream about waking up and pressing the button, and when there is no beep, it is a sign that they are dreaming (5). Recognizing the dream state is the first step to maintaining consciousness throughout the dream and controlling the events that take place in it.
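The cue-timing behavior described above can be sketched as a small simulation. This is only a toy model: the class name, the fixed delay value, and the REM flag are hypothetical choices for illustration, and the real device's internal logic is certainly more involved.

```python
# Toy sketch of the NovaDreamer-style cue logic. All names and
# thresholds here are hypothetical illustrations, not the device's
# actual implementation.

class DreamMaskSim:
    """Simulates a sleep mask that flashes cues during REM sleep."""

    def __init__(self, delay_seconds=300):
        self.delay_until = 0              # cues suppressed until this time
        self.delay_seconds = delay_seconds
        self.cues_sent = []

    def press_button(self, now):
        # The wearer woke up and pressed the button: suppress cues
        # while falling back asleep. (In a dream, a press with no
        # beep is itself a cue that one is dreaming.)
        self.delay_until = now + self.delay_seconds

    def tick(self, now, in_rem):
        # Deliver a light/sound cue only during detected REM sleep,
        # and only if the wearer has not recently pressed the button.
        if in_rem and now >= self.delay_until:
            self.cues_sent.append(now)
            return True
        return False

mask = DreamMaskSim(delay_seconds=300)
mask.tick(0, in_rem=False)     # non-REM: no cue
mask.tick(60, in_rem=True)     # REM detected: cue fires
mask.press_button(70)          # wearer wakes and presses the button
mask.tick(100, in_rem=True)    # suppressed: still inside the delay window
mask.tick(400, in_rem=True)    # delay elapsed: cue fires again
print(mask.cues_sent)          # [60, 400]
```

The point of the sketch is simply that two independent signals (REM detection and the snooze button) gate when a cue is delivered, which is the essence of the mechanism the paragraph describes.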

Although the practice of conditioning the brain to increase the chance of lucidity may seem like a new, even unbelievable, phenomenon, it has been a spiritual tool for more than a thousand years (6). Through lucid dreaming, Buddhists have sought enlightenment. The philosophy of Buddhism holds that existence entails suffering (7). After first accepting that life leads to suffering and that enlightenment is possible, one must follow the Eightfold Path in order to escape from suffering. One step of the Eightfold Path is "Right Mindfulness," in which a person sustains awareness of the surroundings and of the way the self fabricates a story from them. The Buddha outlined Right Mindfulness as consisting of four concepts: contemplation of the body, contemplation of feeling, contemplation of the state of mind, and contemplation of phenomena (7). Since Buddhist philosophy maintains that "reality" as we see it is not truly reality, because it is obscured by our perceptions of it, Buddhists have long sought a clearer reality in the sleep state. To them, dreaming is not an oddity, nor is lucid dreaming a lark. Through "dream yoga," individuals can maintain consciousness while asleep and seek a better understanding of the self.

The techniques of dream yoga are ancient and extensive. For example, as one is falling asleep, he or she is directed to "Sleep on the right side as the lion doth. With the thumb and ring finger of the right hand press the pulsation of the throat-arteries; stop the nostrils with the fingers [of the left hand]; and let the saliva collect in the throat."(6) Despite the fact that such instructions seem foreign in the Western world, they may in fact cause REM sleep to occur faster by lowering heart rate, and may also create a generally heightened state of consciousness as the subject falls asleep (6). The concept of mindfulness is essential in dream yoga, for the subject must maintain the same way of thinking all day long as when sleeping. In contrast to Western strategies today, in which individuals ask themselves "Am I awake?" (3), dream yoga suggests that the person constantly remind himself that he is dreaming (6).

Certainly, centuries-old spiritual dream practices will differ from developing scientific findings in the Western world. However, it is telling that people from such different walks of life seek refuge in the dream world. Perhaps most Americans would not see dreams as a source of salvation. Still, the fact that people have an interest in controlling their dreams suggests that they have an interest in learning about consciousness and the way the brain works. Human curiosity about dreams inspires us to confront issues of reality and fabrications of the mind. Even if an individual does not explore lucid dreaming through a spiritual lens as in Buddhism, it inevitably is a way for individuals to escape from reality as they perceive it and explore themselves without any outside stimulation.

WWW Resources:

1) Teen 'sleepwalks to top of crane', BBC article

2) 'Sleepwalker' accused of murder, BBC article

3) "Inward Bound", New York Times article

4) "Lucid Dreaming", open-content textbook

5) Lucid Dreaming FAQ by The Lucidity Institute

6) Wallace, B. Alan (Ed.). (2003). Buddhism and Science. New York, New York: Columbia University Press. See especially Stephen LaBerge's "Lucid Dreaming and the Yoga of the Dream State: A Psychophysiological Perspective" pp 233-255.

7) "About Buddhism"

Full Name:  Caroline Troein
Username:  ctroein@brynmawr.edu
Title:  If You Change Your Mind: The Effects of Learning on the Brain
Date:  2006-05-12 00:58:15
Message Id:  19338
Paper Text:
Biology 202

2006 Third Web Paper

On Serendip

Students at this time of year invariably come to a point where they feel that learning has become too difficult to continue. More than simple sleep deprivation, the stresses of time and the sheer wealth of knowledge to be absorbed can make further learning seem impossible.

The question is: does learning change the structure of the brain? Implications are far ranging if this is the case. For instance, the concept of learning could be changed to accommodate methods more productive towards changing the brain. Furthermore, natural predispositions towards certain methods of learning could be uncovered by exploring the structure of the brain. If the brain is changed by learning, it could be possible to artificially induce those changes or administer drugs to make the brain more malleable to those changes.

The ability of brains to physically adapt is called brain plasticity [1]. This can occur in two ways, either through "a change in the internal structure of the neurons...[or] an increase in the number of synapses between neurons." [2] The ease with which short term memory transfers into long term memory is one measure of the plasticity of the brain. "One theory of short term memory states that memories may be caused by 'reverberating' neural circuits": the synaptic impulses branch out and become widespread in the system. If the information survives a period of time, it can become more permanent, or long term. The conversion into long term memory is associated with anatomical and biochemical changes. [2]

Famously, a group of researchers from University College London released a report that the brains of London taxi drivers were more developed than those of ordinary citizens [3]. For this research, they would earn an Ig Nobel Prize [4] for Medicine in 2003. Their study showed that the posterior hippocampi of London taxi drivers were significantly larger. Since being a taxi driver demands a rich spatial representation, the results are consistent with the idea that the hippocampus stores a spatial representation of the world. Not only this, but it appears that the region can expand posteriorly and shrink anteriorly in response to a higher demand for spatial representation.

Being a London taxi driver demands not only spatial understanding but also the ability to process large amounts of information. Drivers are required to pass an extensive test, the Knowledge [5], in which they must immediately recall the route between any two points in London; it takes on average 34 months of preparation to pass. By physically practicing one of 320 standard routes by moped, future drivers develop an impressive recall of the exact details of routes, such as intersections, roundabouts, and landmarks. The longer drivers have been driving these routes, the more their hippocampi expand posteriorly. This demonstrates that there is not a pre-existing neurological difference, but that the hippocampus is malleable enough to adapt to the changing needs of the individual.

Learning new routes is an essential component of human existence since it carries a high survival value. An issue raised by the change in hippocampal volume is that, if there is no need for spatial knowledge, this area of the brain could be severely underdeveloped. Learning in this model requires exposure, interaction, and time. Time is needed so that the structure of the brain can physically change to accept new information. Imprinting knowledge is not enough to retain it; one must have the capability to retain it as well as the adaptability to do so.

Children have a great capability to learn, especially language. Language acquisition demonstrates the interplay between the adaptability of the brain and whatever capacities for understanding language are already present. Aside from theories claiming that humans can innately distinguish between syllables [6], infants may have an innate knowledge of the basic structural foundation of language, a sort of universal grammar. This knowledge of universal grammar seems to disappear after childhood, and an adult can never achieve the same fluency in a new language. Yet adults are more adept at the "conscious study of a second language in a classroom setting" than children, most likely because adults have learned different heuristics, or shortcuts, to use in understanding language.

Being bilingual changes the brain's structure. Bilingual speakers have more grey matter in the language region, with a direct correlation between the amount of grey matter and the age at which they became bilingual. Even learning a new language later in life changes the brain by producing more grey matter, although the brain is not as elastic as before. The capability to change the structure of the brain seems to decrease with age. [7]

"The brain never stops changing and adjusting"[8] Although it may become more difficult to gain the proficiency of a native speaker, the brain still has the ability to adapt to new situations, such learning Braille after become blind.[9] The brain can adapt its language circuitry at any age. The use of specific brain exercises, however, can make the process of adaptation more expedient.[8] Thus the task of learning new languages is always possible, but the change must be directed to utilize the correct aspects of brain plasticity.

Learning does change the structure of the brain quite significantly. Whether with spatial knowledge or languages, the brain has a high degree of malleability throughout a person's life with which to adapt to new situations. Recognizing this is essential to how we treat opinions and information that people hold. People may value familiarity from a psychological perspective, enjoying the security that it affords, but they are not cemented into thought processes for the rest of their lives. If the brain is able to change, then issues must be dealt with on a cognitive level and approached with an understanding that learning is possible with time.

Web References

1) Neuroplasticity, general resource on neuroplasticity

2) Brain Plasticity, overview with specific topics about brain plasticity

3) Navigation-related structural change in the hippocampi of taxi drivers, UCL study on London taxi drivers

4) Ig Nobel Prizes: Past Winners, the alternative Nobel prizes

5) The Knowledge

6) Where are the boundaries between words?, how infants recognize syllables

7) Learning a second language changes brain

8) Brain Plasticity, Language Processing and Reading, how brains are malleable throughout life

9) Old Brains, New Tricks, blindness and relearning

Full Name:  Ebony Dix
Username:  edix@brynmawr.edu
Title:  Seeing the World in Color: An Unfair Advantage?
Date:  2006-05-12 03:02:18
Message Id:  19341
Paper Text:


Biology 202

2006 Third Web Paper

On Serendip

An important part of our everyday lives is being able to see the world in color. Color helps us recognize and distinguish between objects of varying hue and saturation, it attracts our attention, and it serves as a "nonlinguistic code that gives us instant information about the world around us" (1). It enables us to tell whether our steak is medium or rare, to detect whether a newborn child is healthy or jaundiced, to find our car in a parking lot and to know when to slow down, stop, or go at a traffic light. What would our world be like if we could not see color? Just ask someone who is color blind. Individuals with colorblindness are at a seemingly huge disadvantage in our society because they are unable to perceive the world like the majority of the population, and thus are often dubbed disabled.

First, it is important to establish what is meant by color and how we are able to see it. Most scientific sources define color as a sensation produced in the brain when light entering the eye stimulates the cone cells (one of the two types of photoreceptors found in the retina) through the absorption/reflection of different wavelengths and frequencies of photons (4). When light is transmitted from an object to the eye, it stimulates the different color cones of the retina, making the perception of various colors in the object possible. However, this definition does not say for what purpose color vision evolved or why it is so important from an evolutionary perspective. This paper will attempt to briefly explore the evolution of color vision and how it came to be so useful in our everyday lives. In addition, it will weigh the advantages and disadvantages of color vision against those of color blindness in order to assess whether the human race can look forward to significant changes in the form and function of the color cones of the eye in the future.

Researchers at Caltech recently discovered that the long-held view that color vision developed in primates for the purpose of finding ripe fruit was wrong. Instead, the color cones in the eyes of primates (which include humans) are "optimized to be sensitive to subtle changes in skin tone due to varying amounts of oxygenated hemoglobin in the blood" (5). These findings suggest that color vision evolved in old-world primates for the purpose of distinguishing between changes in skin tone caused by blushing and blanching. Today, we do not utilize color vision solely for seeing emotion or distinguishing levels of oxygenated blood in the faces of mates or enemies as old-world primates did, but rather, we seem to need color to perform simple daily functions. It seems that color vision was adapted by old-world primates for certain functions and over time has been exapted for a set of different functions in human beings. The reason for exaptation is debatable, as many theories suggest different things. Within society, where the cultural norm is that colorblind individuals are in the minority, one might argue that we exapted color vision as an improvement to make us better suited to our environment. Some might even posit that it was a selective advantage for the purpose of survival that eventually led to the development of trichromat vision in primates and hence in humans. This would imply that those who lack trichromat vision are selectively disadvantaged, and that if individuals in society were competing for survival of the fittest, those who are colorblind would lose.

Before categorizing the potential losers of this competition, I should take a step back and recognize that a more politically correct term for the color blind individual has been coined: color deficient (2). Individuals who are color deficient experience color blindness to varying degrees, in which they are unable to perceive differences among some or all colors that others can distinguish. There are varieties within the category of red-green colorblindness, in which individuals lack certain cones in the retina that normally enable them to distinguish between the green-yellow-red part of the spectrum (3). There are also varieties of blue-yellow color blindness, which involve the inactivation of the short-wavelength sensitive cone system whose absorption spectrum peaks in the bluish-violet (3). Monochromacy is the complete inability to distinguish any colors, which also occurs in different forms.
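The distinction between trichromatic vision and the dichromatic varieties described above can be illustrated with a toy model of cone responses. The Gaussian sensitivity curves and the shared 50 nm bandwidth below are simplifying assumptions invented for illustration; only the peak wavelengths are rough textbook values, so this is a sketch, not a measured model of real cone spectra.

```python
import math

# Toy trichromatic model: each cone class gets a Gaussian sensitivity
# curve. Peak wavelengths are rough textbook values (nm); the Gaussian
# shape and the common bandwidth are simplifying assumptions.
CONE_PEAKS = {"S": 420.0, "M": 534.0, "L": 564.0}
WIDTH = 50.0  # assumed common bandwidth, nm

def cone_responses(wavelength_nm, missing=()):
    """Relative response of each cone class to a monochromatic light.

    `missing` lists cone classes absent in a given dichromacy; e.g.
    missing=("L",) crudely approximates a red-green deficiency in
    which the long-wavelength cones are inactive.
    """
    return {
        cone: 0.0 if cone in missing
        else math.exp(-((wavelength_nm - peak) / WIDTH) ** 2)
        for cone, peak in CONE_PEAKS.items()
    }

# A trichromat distinguishes greenish-yellow light largely via the
# relative L and M responses; silencing the L cone removes that signal.
normal_560 = cone_responses(560)
protan_560 = cone_responses(560, missing=("L",))
```

The model's only point is structural: removing one cone class collapses part of the comparison the brain uses to tell colors apart, which is why the red-green and blue-yellow deficiencies described above correspond to the loss or inactivation of specific cone systems.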

While colorblindness as a disability remains controversial, some argue that individuals with color deficiencies have some advantages that enable them to perform certain tasks better than those who can see color. For instance, color blind hunters are claimed to be more successful at selecting prey against a confusing background, and the military have found that color blind soldiers can sometimes see through camouflage that fools everyone else (3). One might expect the military to develop special recruiting units solely for color deficient individuals, so that they can snatch up this rare and underrepresented talent. Unfortunately, however, that is not the case. Certain agencies, such as the military, the coastguard, and the Federal Aviation Administration (FAA), perform color blind tests in order to screen potential employees (7). Colorblind individuals do not possess "the ability to perceive those colors necessary for the safe performance of airman duties," according to the FAA (8). Might this suggest that there are safe-performance airman tasks for which color vision was exapted from the old-world primates?

Our society has constructed an environment in which many things we use on a daily basis, such as road maps and traffic lights, are color coded, which certainly does not benefit the small portion of the population that can't use them. One might argue that the process of natural selection will eventually weed out those less capable of functioning in our changing environment, but that just doesn't seem fair. Where does that leave the lonely anomalous trichromat, dichromat, or monochromat individual (7)? Perhaps over time, within the next few million years, color vision will go through an additional metamorphosis and alter the way human beings live. For instance, perhaps colorblindness will become more prevalent in populations over time and the benefits of the condition will outweigh its deficiencies. But until then, it is important to realize that colorblindness is a disability from the perspective of individuals living in a world constructed by members of the majority who perceive their shared surroundings in a different way. Until changes are made to make everyday tasks, such as reading a color coded map, more accessible to colorblind individuals, they will remain at a disadvantage because they will not be able to function fully in society with the rest of its members. Perhaps one day it will be those of us who are capable of seeing color who are dubbed disabled, as the ones selectively disadvantaged to function amongst the rest of the population!


1) http://psy.ucsd.edu/~dmacleod/221/color%20papers/Neitzreview.pdf - Article entitled Molecular Genetics of Color Vision and Color Vision Defects, by Maureen Neitz, PhD and Jay Neitz, PhD.

2) http://webexhibits.org/causesofcolor/2.html - Website that gives detailed information on colorblindness.

3) http://en.wikipedia.org/wiki/Colorblindness - Free online encyclopedia that provides basic definitions of terms such as colorblindness.

4) http://www.cis.rit.edu/mcsl/faq/faq1.shtml - This site provides answers to color and color perception questions, as well as a nice diagram of the eye.

5) http://www.sciencedaily.com/releases/2006/03/060320221839.htm - Website that publishes daily science news. This particular piece is an article from Caltech on the evolution of color vision in old-world primates.

6) http://www.sciencedaily.com/releases/1999/11/991109072142.htm - Website that publishes daily science news. This particular piece is an article from the University of Chicago Medical Center on the origin of primate color vision.

7) http://www.agape1.com/color%20vision.htm – Website of the Agape Optometry Center that discusses colorblindness.

8) http://www.leftseat.com/baggish.htm - Website by Pilot Medical Solutions, Inc. that provides an article with details on color vision information for pilots.

Full Name:  Sylvia Ncha
Username:  sncha@haveford.edu
Title:  Unconsciousness vs. Consciousness vs. Awareness... What It All Means
Date:  2006-05-12 04:25:49
Message Id:  19345
Paper Text:


Biology 202

2006 Third Web Paper

On Serendip

Nowadays, we think we know which decisions or behaviors we make consciously and which habits we carry out unconsciously, but how do we know the difference? When people are in a public setting and one person yawns, why do others yawn right after? Is this action unconsciously letting the person know that they are tired? Humans are conscious subjects of experience: we have our perceptions and sensations, and for the most part we understand why we have them. We understand why we feel pain and pleasure and why we do certain things and not others. However, what is the difference between a sleepwalker slapping someone unconsciously and a person doing it consciously? What in consciousness makes the two situations different?

A sleepwalker, from what we have discussed in class, is not really responsible for his actions because in a sense he did not do anything; the sleepwalker's brain is what is in control at that point. So consciousness is not just the brain, because nothing in the brain alone produces the "I" in "what I did" or in "I did not know that I did that." When people refer to themselves as "I," they are referring to a state of being, and being can only occur if the person is aware; this is where you get consciousness. Consciousness is where we are; we are consciousness. Consciousness comes in different forms, like tasting, seeing, hearing, smelling, and feeling. These are sensory forms of consciousness, or awareness: seeing is our visual awareness, while hearing is our auditory awareness. At the same time, if you hear, see, or smell something, you are conscious of it. Though they seem quite similar, are consciousness and awareness the same idea? If I hear the sound of an instrument, then I am aware that a sound is being played from somewhere and therefore I am conscious of it; but if I cannot identify the instrument that is producing the sound, then I am not aware of the instrument. This is really a play on words: I believe strongly that consciousness and awareness are the same thing. Awareness is just knowing that something exists, and that is exactly what consciousness is. So now that we have established that they are the same thing, let's go deeper.

You can be aware of a sound but not aware of what is producing it. In other words, you can be aware of something but not know what that something is. "What is that I hear?" is an example of this idea. A person can therefore be aware of something but not be aware that they are aware of it. As another example, a child can be aware that they hear barking, and recognize it as a distinct sound, without being aware that it comes from a dog. The relationship between awareness and consciousness is complicated because it has often been framed in a way that makes consciousness an aspect of awareness, with awareness on a higher level.

Although it was established earlier that consciousness is not simply the brain, that does not mean the brain is not a powerful entity of its own. Consciousness can take off in one direction while behavior goes in another. The brain can therefore carry on at least part of its job without consciousness being present. For example, people can daydream while driving, and though the body is doing the driving by means of the brain, they are not quite there. The driver is in another world, yet somehow the body is still able to maneuver through the roads and not harm anyone. As we said in class, some people are able to carry out full conversations and give directions while they are sleeping. Sleep talkers are not conscious when they are doing this, yet they are still able to perform the same skills as if they were. This says a lot about the brain: it is one of the strongest forces in our bodies and could, and I said could, probably do without consciousness. This is not at all to say that we do not need our consciousness; we do, because with it we are who we are and we are aware of everything around us.

Furthermore, consciousness, unconsciousness, and awareness are all quite complex ideas, but they are all necessary aspects of being human. Awareness, as I believe, is the same as consciousness, because both ideas have to do with what we know, what we see, hear, taste, etc. Our consciousness helps us to understand the world around us, and our unconsciousness allows us to understand that our bodies can work without us being aware of it. Another interesting idea about unconsciousness is that sometimes, in sleep, the unconscious may seize on important events that could happen in the future and try to give the conscious mind warnings, eventually creating ominous dreams. Overall, people are complex in that what makes them a being is not only the brain but their consciousness as well.

Full Name:  Bethany Canver
Username:  bcanver@brynmawr.edu
Title:  Storytelling: The Sixth Sense
Date:  2006-05-12 08:40:03
Message Id:  19355
Paper Text:


Biology 202

2006 Third Web Paper

On Serendip

Storytelling is an integral part of everyday life that we are first exposed to as young children and then actively engage in as we grow older. Traditionally, we think of storytelling in terms of its educational or cultural functions; stories can teach morals, cultural expectations, and behavioral norms. A large part of what maintains and perpetuates society comes in the form of a story, whether it's in a book, magazine, on television, in a movie, on the internet, or on the evening news. The knowledge individuals have because of the storytelling they are constantly exposed to vastly outweighs the firsthand knowledge they have about the world. Therefore, storytelling can be described as biologically imperative because, like sight or hearing, stories help individuals make sense of the complex world around them.

Through language and symbols whose meanings are learned during the process of socialization, human beings are able to communicate complicated stories to one another. The utility of this phenomenon is that even when an individual has not had the experience being told as a story, the individual is able to recreate the scenario in his/her mind. The recreated scenario is now a part of the individual's body of knowledge and can be drawn on for reference in the future. Just as our senses reduce the multitude of external signals to a less complex "story" that dictates how individuals understand and experience the world, our mind works at a macro level (relative to sight, hearing, smell, taste, and touch) to reduce complex inputs so that they are manageable. The brain groups certain patterns of sensation as well as behaviors into stories that are used to predict outcomes. For example, when an individual smells smoke and hears sirens, he/she is most likely to conclude that there is a fire because his/her brain has already linked smoke and sirens in a previous story. A series of such causal relationships brings order to the complexity of the external world (1).

Not only are individuals passive listeners of stories, but they are also actively creating and telling stories to others. A new experience, for example, is reduced to a story with a causal relationship embedded in it and classified as what Ken Baskin calls an "antenarrative" (1), or a mutable version of the story. When repeated testing of the story's causal relationship yields consistent outcomes, the story becomes a fixed part of an individual's personal knowledge and aids the individual in understanding how the world works according to his/her perceptions. Baskin describes this process in terms of a "self-reinforcing feedback loop" (1) in which actual outcomes that are consistent with the expected outcomes predicted from a story define what an individual knows and defines as real.

In communicating ideas, images, or situations through storytelling, a greater understanding of experiences is gained by both the storyteller and the listener because, as Robin Mello notes, storytelling is an interactive phenomenon (2). By sharing stories with others, an individual can compare that which he/she has determined to be factual against another individual's story of his/her experience in what is essentially a "negotiated transaction" (2). Therefore, the ways individual brains make sense of the world are part of a larger collective understanding of how things work. It is this collective understanding of stories which ensures some degree of uniformity across a given culture's stories. The causal relationships in stories which are accepted beyond the individual level outline the rules used to generate fictitious stories.

Not only are stories told at the interpersonal level, but they are also told by the storyteller portion of the brain to the audience portion of the brain (3). The brain is at once collecting and organizing information and presenting it in a logical way to the I-function. It is likely that the process of creating the story for the I-function includes a great deal of choice and editing that is governed by prior experience and pre-existing stories that the I-function already has knowledge of.

Stories are intertwined with reality and memory (2) so that imaginary and factual narratives influence reality and memory, and reality and memory, in turn, influence the propagation of stories. How and what is remembered or considered a part of reality is determined by how individual and collective stories categorize experiences. In addition, the stories that an individual has archived are subject to revision if the expected outcomes predicted by the story are not fulfilled. This, perhaps, accounts for the variation among memories of many people who experienced the same event.

Like science, storytelling is a constantly evolving summary of observations, which is why science can be thought of as a story (3). In essence, every human being is a scientist who is conducting experiments at all times, and every scientist is a storyteller whose story's plot and development are based on prior experiences.

An interesting consequence that results from the use of stories in developing a knowledge base and a sense of reality is the multiplism inherent in the process. Because every brain is wired differently, it can be deduced that the storytelling portion of every brain does not categorize and present stories to its respective I-function in a uniform manner. Despite the existence of the collective understanding of stories described above, the similarities between individuals' stories are limited to the use of language and images that have relatively common meanings at the cultural level. How two individuals process and interpret the same experience can vary widely. Where does this variation come from? Is it genetically determined that some people will only interpret experiences as positive and others as negative? If so, perhaps this can explain those in society who are called pessimists or optimists. If not genetically determined, then it is possible that this variation comes from a difference in environment. For example, people who live in poverty interpret experiences differently than those who are of a higher class status. This argument has been used by Daniel Moynihan and the subsequent theory of the Culture of Poverty.

Another point to ponder is what the storytelling portion of the brain is leaving out. The storytelling brain has to compensate for that which gets left out of the story because it is not detected or does not fit into the organizational system of the brain, just as the brain has to compensate for the portion of the eye that does not have photoreceptors. This portion of the eye that is not receiving input results in a blind spot that the brain fills in by completing the story.

Stories told by the brain to the I-function and stories told from one person to another make up what we know to be reality. We experience the world indirectly by way of our senses and our brain's organization of that information, without necessarily having to have the said experience firsthand. Though we are influenced by the stories of others and the stories that are common to society as a whole, there is a great deal of variation among any number of individuals who have had the same experience, due to the unique neural arrangement of each human being.

1. Baskin, Ken. "The Function of Storytelling in Knowing". www.peaceaware.com/scmol/abstracts_2005/Baskin.doc
2. Mello, Robin. "The Power of Storytelling: How Oral Narrative Influences Children's Relationships in Classrooms". http://ijea.asu.edu/v2n1/
3. Grobstein, Paul. "Science, Pragmatism, and Multiplism". http://serendipstudio.org/sci_cult/pragmatism.html
4. Georges, Robert A. "Toward an Understanding of Storytelling Events". The Journal of American Folklore. Vol. 82 1969: 313-328.

Full Name:  Erin Schifeling
Username:  eschifel@brynmawr.edu
Title:  There is no "I" [-function] in Procedural Memory
Date:  2006-05-12 10:28:26
Message Id:  19363
Paper Text:


Biology 202

2006 Third Web Paper

On Serendip

In my earlier explorations of memory (1), (2), I came across an interesting distinction between procedural and declarative memory and a gap in my research (memory development from conception to adulthood). The process of neurological development suggests that declarative memories are complex procedural memories located, morphologically and evolutionarily, near the emergence of the I-function.

Procedural memories are memories formed when repeated signals reinforce synapses (the connections between neurons). These memories explain central pattern generators (CPGs) not present at birth. Although a procedural memory can be as simple as a connection between two nerve cells in the fingertip, CPGs are a good example of complex, centralized procedural memories. Procedural memories generate learned output, sometimes in response to specific inputs and other times to create coordinated "motor symphony" outputs for a given command.

Declarative memories are nervous system changes that allow us to recall and narrate past experiences. Academic theories, historical facts, the relationships between family members, and word meanings are also declarative memories. While it is relatively easy to conceptualize repeated actions forming procedural memories in already-present synapses, a network of neurons to record the experience of your 10th birthday does not exist at birth. Also, teaching coordinated outputs seems at first glance simpler than reinforcing inputs.

Data from cricket studies help with some of this. Male crickets that are hybrids of two species produce a hybrid song. Female crickets do not chirp. Still, the hybrid females are more attracted to the hybrid song than the song of either parent species. Similar genetic characteristics produce song generators and song receptors that are hybridized in the same way. (3) If receiving and producing systems of neurons respond to the same genes, it is not difficult to imagine that memories would form in similar ways, and might even employ some of the same neuron networks.

The interneuron configuration that can produce procedural memories should produce declarative memories in a similar way; the reception of information from one's environment under certain situations (repetition if one is studying times tables, and heightened emotions during important events) can cause certain receiving neurons to fuse together, forming a memory of sensory information instead of muscular output. Furthermore, once sensory signals are inside the brain, they are not, in themselves, in any way different from signals starting within the brain. All the sensory inputs that form a memory could be joined in a pattern generator that fires the right sensory neurons for us to perceive, again in the brain, remembered perceptions. This makes declarative memories seem not that different from procedural ones, but there is an important difference, and that difference is the presence of the I-function.

The declarative-procedural division is also described as an explicit-implicit divide because, unlike procedural (implicit) memories, declarative (explicit) memories form and are recalled alongside the I-function. We learn to walk not by thinking and figuring out how to walk, but through attempts that subconsciously pattern and train our motor neurons. We are conscious of what happens around us and within our brains at our tenth birthday parties and when we remember them years later. We consciously learn and use vocabulary in a second language.

How does this framework compare to the developmental evidence, which according to "ontogeny recapitulates phylogeny" parallels evolution? Does it indicate that declarative memories are a step up from procedural ones, using the same basic biological properties and the addition of the I-function? While possibly more complex, declarative memories form among the excess of "blank" nerves toward the front of the brain that can connect in a myriad of ways based on the person's experiences. These synapses have more flexibility because they are not directly motor or sensory neurons or interneurons necessary for survival.

Before birth, human embryonic nervous systems develop from a ridge of cells on the back that form the neural tube. The neural tube eventually branches out to all parts of the body and bulges at the head to form the brain. The neural cells multiply and divide, producing most of the necessary cells before birth. Then, the individual cells position themselves and begin to form the connections necessary for survival. (4) Development begins and finishes first at the tail end of the embryo and lastly in the head. Within six months, all the spinal and motor neurons in the body and toward the base of the brain are functional, and the baby can survive outside the womb. This tail to head order of development reoccurs within the brain as well, with the brainstem mostly developed by six months but the cerebral cortex not finished until fifteen to twenty years after birth. (5)

Following birth, the sensory neurons must be programmed through experience so that the baby can properly see and hear. Through childhood the brain continues to develop more and more based on experience, moving from the base toward the highest and most frontal parts of the brain and forming procedural pattern generators for complex movements. Children learn to distinguish words and faces by first establishing procedural auditory and visual processors and then through declarative memory. While most of the neurons for life are present at birth, the newborn brain is only about a quarter of the adult brain size. Neurons grow, branching and fusing together; myelination occurs, accelerating signals between neurons; and unused cells die, leaving only useful and efficient networks. (4), (5)

After early childhood, the next set of major changes occurs during adolescence. In addition to hormones and physical body development due to puberty, teenagers are still developing the frontal parts of their brains. The number of certain receptors and the gray matter (amount of neurons) in the frontal lobes peak in early adolescence and then decline as connections solidify, unused neurons die, and white matter (nervous system cells that speed up signal transmission) increase (6). The frontal regions of the brain that change the most during the second decade of life are used by adults for planning, judgment, and goal directed behavior (7).

While it is impossible to prove, declarative memories do appear in age groups (and species) with I-function abilities. These memories appear without structural changes to the nervous system, so they are probably not structurally or chemically different. Also, these processes appear to occur in the brain regions closest to the forehead, the most recently evolved and the latest to develop. Thus declarative memories probably evolved when the biological framework from procedural memories continued to be used in the more recently evolved and less restrained frontal areas of the brain where the I-function emerges.


1)Memory Loss and Recovery, Erin Schifeling

2)The Biological Basis for Memory Manufacture, Erin Schifeling

3) Bentley, D. and Hoy, R. "The neurobiology of cricket song." Scientific American. August, 1974. In class lecture notes, January 23

4)Brain Development, Chudler, Eric H.

5)Brain Development: Frequently Asked Questions

6)Alcohol and the Adolescent Brain, White, Aaron M.

7)Inside the Teenage Brain

Full Name:  Faiza Mahmood
Username:  Fmahmood@brynmawr.edu
Title:  Bipolar Disorder
Date:  2006-05-16 04:16:32
Message Id:  19395
Paper Text:


Biology 202

2006 Third Web Paper

On Serendip

Bipolar disorder is a disorder that causes unusual shifts in a person's mood, energy, and ability to function. We all know that everyone has their good days and their bad days. We all have our ups and downs; this is all part of life. However, this is quite different from the symptoms that characterize bipolar disorder. Bipolar disorder, often referred to as manic depression, is a serious and chronic mental illness that is characterized by dramatic and disruptive mood swings ranging from high (manic) to low (depressive) states. It is one of the most common, severe, and persistent mental illnesses that exist today. (1) It is estimated that more than 2.5 million Americans have bipolar disorder. However, this figure may be largely inaccurate because bipolar disorder is difficult to detect and is often misdiagnosed. As many as 80 percent of individuals go undiagnosed or misdiagnosed for up to 10 years. (6)

Bipolar disorder is an illness that causes dramatic mood swings—from overly "high" and/or irritable to sad and hopeless, and then back again, often with periods of normal mood in between. (2). It might be helpful to think of the various mood states of this illness as a spectrum or continuous range. At one end is severe depression, which includes moderate depression; then comes mild and brief mood disturbances, then normal mood, then hypomania (a mild form of mania), and then mania. (5).

Symptoms of the manic state include increased energy, restlessness, increased libido, racing thoughts and talking, a decreased need for sleep, excessive euphoria, reckless behavior without regard for consequences, unrealistic beliefs in one's ability and powers and severe thought disturbances, which may or may not include psychosis. Symptoms of the depressive state include loss of interest or pleasure in activities once enjoyed; a distinctly low or irritable mood; change in appetite and sleep patterns; fatigue, and thoughts of death or suicide, or suicide attempts. (7) Between these highs and lows, patients usually experience periods of higher functionality and can lead a productive life. Sometimes, sufferers may experience mixed episodes, when they feel both manic and depressive symptoms simultaneously (2)

The symptoms described above are primarily characteristic of BPI. However, varying degrees of bipolar disorder exist. BPII is a milder disorder consisting of depression alternating with periods of hypomania. Hypomania is essentially a less severe form of mania that does not include psychotic symptoms or lead to major impairment of social or occupational function, which BPI does. There are also different subtypes of bipolar disorder, depending on the frequency of the episodes. A person suffers from rapid cycling when they experience four or more episodes per year. Ultra-rapid cycling is similar except that the episodes occur more often, with four or more per week. (5) Thus, the type, severity, and duration of mood episodes experienced can vary. Some individuals might have a predominance of either mania or depression, whereas some sufferers may experience equal numbers of both. However, in general the depressed mood tends to last longer than the manic mood. If left untreated, these episodes can last from several days to several months. (6)

No racial predilection exists for this illness, and the disorder seems to be equally prevalent across genders, although rapid cycling is more common in women. (1) However, the timing of when bipolar disorder typically develops can vary. It usually tends to occur in late adolescence or early adulthood. Nevertheless, some people have their first symptoms during childhood, and some develop them late in life. In its early stages, bipolar disorder may be seen as more of a behavioral problem than a mental illness. For example, it may first appear as alcohol or drug abuse, or poor school or work performance. If left untreated, bipolar disorder tends to worsen and the person experiences episodes of full-fledged mania and clinical depression. (1)

Like other mental illnesses, bipolar disorder cannot yet be identified physiologically—for example, through a blood test or a brain scan. (1) Therefore, a diagnosis of bipolar disorder is made on the basis of symptoms, course of illness, and, when available, family history. The diagnosis of BPI usually requires the presence of a manic episode of at least 1 week's duration that leads to hospitalization or other significant impairment in occupational or social functioning. This episode of mania cannot be caused by another medical illness or by substance abuse. This also may or may not include the occurrence of a depressive episode. (7)

The causation of bipolar disorder is still unclear. The disorder's symptoms appear to be caused by biochemical imbalances of hormones or certain neurotransmitters in the brain (especially dopamine, serotonin, norepinephrine, and acetylcholine). However, the cause of this imbalance is unclear. (6) Oftentimes bipolar disorder tends to run in families, which leads to the proposal that there are genetic factors involved in producing this imbalance. The gene for bipolarity has mainly been traced to chromosome 18, as well as some others. However, this is uncertain because it has been found in some cases that chromosome 18 is unaffected in bipolar patients. It is also likely that bipolar disorder is linked to several genes acting together. However, this may not be the only cause of bipolar disorder, as evidenced by studies of identical twins, who share all the same genes. These studies indicate that both genes and other factors play a role in bipolar disorder. If bipolar disorder were caused entirely by genes, then the identical twin of someone with the illness would always develop the illness, and research has shown that this is not the case. (6) Therefore, what are other possible causes?

The psychodynamic theory holds that the dynamics of manic depression are linked through one common pathway. Practitioners of this theory feel that depression is a manifestation of losses (the loss of self-esteem and a sense of worthlessness); mania therefore serves as a defense against the feelings of depression. However, biochemical findings seem to implicate the opposite of this proposal and hold that multiple biochemical pathways most likely contribute to bipolar disorder. For example, the blood pressure drug reserpine, which depletes catecholamines from nerve terminals, was noted to cause depression. This led to the catecholamine hypothesis, which holds that an increase in epinephrine and norepinephrine causes mania and a decrease in epinephrine and norepinephrine causes depression. Other things that may exacerbate mania include hormonal imbalances and disruptions of the hypothalamic-pituitary-adrenal axis involved in homeostasis and the stress response, causing symptoms of bipolar disorder. Lastly, environmental factors may play a role in the onset of bipolar disorder. It seems that in some cases the cycle of the illness may be directly linked to external stresses or pressures, which may serve to exacerbate some underlying genetic or biochemical predisposition. (7) It seems that there is no single cause for bipolar disorder; rather, many factors act together to produce the illness. It may be that although the initial triggers of bipolar disorder begin differently and take different pathways, they all ultimately lead to a common pathway that induces the illness in those who experience it. It also still seems unclear to me whether the biochemical imbalances are a cause of mood disorders, a result of the symptoms, or a little of both.

Although the cause of bipolar disorder may be unclear, one thing that is clear is that bipolar disorder is a serious lifelong struggle and challenge. (7) If left untreated, it can result in damaged relationships, poor job or school performance, and even suicide. However, with treatment people with this illness can lead full and productive lives. (2) Theoretically there is no cure for bipolar disorder; however, treatment to help cope with the illness is available. Two forms of treatment exist: physiological and psychotherapeutic. Different categories of drugs are used to help cope with the illness, including mood stabilizers, antidepressants, and anticonvulsants. The two most commonly used medications are lithium and valproate. Lithium, a mood stabilizer, can control episodes and decrease the likelihood of recurrence, but it is still unknown exactly how it works and why it works for some and not others. Valproate, an anticonvulsant or antiepileptic drug (AED), can also be used as a mood stabilizer and is usually used in cases where the patient cannot stand the side effects of, or does not respond to, lithium. (7) Psychotherapeutic treatment may include behavioral therapy, family therapy, and general education about the disease. Because bipolar disorder is a recurrent illness, long-term preventive treatment is encouraged. A strategy that combines medication and psychosocial treatment is usually optimal for managing the disorder over time. (6)

Bipolar disorder is an illness that raises questions about, and challenges, the concept of brain=behavior. It is diseases like this one that make us reexamine the whole concept of brain=behavior. During both manic and depressive episodes, the person loses control on a behavioral level. However, is this behavior equivalent to what is occurring in the brain, or is it due to something that seems to override the so-called force of the brain?

Whatever the force may be that is responsible for the symptoms associated with bipolar disorder, it is nevertheless a very real illness that can have a devastating impact on many people. Close personal experience with bipolar disorder has sparked my interest in this topic. My father was very recently diagnosed with bipolar disorder, and I have witnessed firsthand many of the symptoms and behaviors previously discussed. Bipolar disorder goes undiagnosed all too often, and the consequences of this can be very hard not only for the person with the illness but for those who surround them as well. Like other serious illnesses, it can be hard on spouses, other family members, and employees. Family members of people with bipolar disorder often have to cope with serious behavioral problems (such as wild spending sprees and out-of-control behavior) and the lasting consequences of these behaviors. (6)

Initially, upon beginning my research, I was aware of the basics of bipolar disorder; however, my understanding included only that extreme shifts of mood occurred, ranging from depression to mania, or even psychosis. I was largely unaware of the actual symptoms and behavioral characteristics of the disorder. Upon starting my investigation, I had a lot of unanswered questions. Was bipolar disorder something that could just appear? When does it usually emerge? Can it really go undiagnosed for that long a time period? Many of these questions were answered by my research. Initially, the diagnosis didn't make much sense, as there is no family history of bipolar disorder in our family. However, it seems that multiple factors may have caused the onset of the illness; perhaps a rare genetic mutation occurred that caused the onset of the illness, which may or may not have involved environmental triggers acting on a genetic or biochemical predisposition to the illness.

I feel, through research and personal experience, that there are alarmingly high rates of misdiagnosis and underdiagnosis of bipolar disorder. This is something that must be addressed. Bipolar disorder is often not recognized as an illness, and people may suffer for years before it is properly diagnosed and treated. (6) Like diabetes or heart disease, bipolar disorder is a long-term illness that must be carefully managed throughout a person's life. (6) For this reason, awareness must be raised so that the illness is recognized and those experiencing its symptoms are urged to seek help. An NMHA survey recently revealed that while more than 60 percent of people can identify bipolar disorder as a mental illness, more than two-thirds of those surveyed said they had limited or no knowledge of it. (3) This points to gaps in public awareness of the illness. It just seems that there are too many Americans living with bipolar disorder who are unaware of the illness and the toll it may be taking on their lives, work, and relationships. Bipolar disorder can have serious, life-altering consequences if left untreated. In fact, bipolar disorder has one of the highest mortality rates among all mental illnesses. Approximately 25-50% of those with bipolar disorder attempt suicide, and 11% actually commit it. (7)

There are also certain stereotypes and a stigma surrounding bipolar disorder that still persist today, thereby discouraging people who may have the illness from seeking a diagnosis and treatment. Many people think that people with bipolar disorder are just crazy; they do not consider it an illness. This is not the case; in fact, many people with bipolar disorder have been among the world's most talented and famous people, today and throughout history: Beethoven, Winston Churchill, Mark Twain, Marilyn Monroe, Larry King, Billy Joel, and Elton John, to name a few noteworthy individuals. (4) Nevertheless, there still seems to exist a stigma surrounding bipolar disorder, which may be in large part due to the fact that the media's coverage of bipolar disorder seems to be more negative than positive. Whatever the case, the stigma surrounding bipolar disorder must be eliminated.

People must educate themselves and others about bipolar disorder and other mental illnesses. I know that I, for one, would have liked to have had more knowledge about the disorder so that the illness in my own family wouldn't have gone undiagnosed for so long. However, for those who have the illness, or know someone who does, it is important to realize that this illness can be a very treatable one. It is possible to restore people to their "normal" selves, that is to say their nonmanic, nondepressive selves. I have recently begun to witness the power of treatment myself. There does seem to be much hope for those with bipolar disorder, for those who are lucky enough to be properly diagnosed, at least.

1)NIMH Bipolar Disorder
2)Bipolar Disorder
3)NMHA Bipolar Disorder
4)Famous People With Bipolar Disorder
5)What is bipolar disorder and Subtypes
6)Bipolar Disorder Questions & Answers
7)Bipolar Disorder

Full Name:  Beatrice Johnson
Username:  besiar@aol.com
Title:  Fear the Feelings
Date:  2006-05-18 18:31:02
Message Id:  19405
Paper Text:


Biology 202

2006 Third Web Paper

On Serendip


What is this thing that takes up so much of our time? What is this thing that causes or prevents so much to, or from, happening in an individual? What is it that drains and overwhelms the individual to the point where they can no longer function or even think deeper than the present moment, and in that moment there is nothing but confusion? What is it that affects an individual's thinking to the point that it drains him or her of their full strength and intelligence? Of course there are answers to these questions, but do they really satisfy, or do the answers really answer the fear?
Of course a fair amount of fear is healthy and necessary in the human being. It has provided humans with a protective response in times of danger. This has come to be known as the "fight or flight response" (a term coined by Harvard physiologist Walter Cannon). The "fight or flight response" is our body's primitive, automatic, inborn response that prepares the body to "fight or flee" from perceived attack, harm, or threat to our survival. (1) This view is further elaborated on:
When we experience excessive stress – whether from internal worry or external circumstances – a bodily reaction is triggered, called the "fight or flight response". This response is hardwired into our brains and represents a genetic wisdom designed to protect us from bodily harm. (2)

When our fight or flight system is activated, we tend to perceive everything in our environment as a possible threat to our survival. By its very nature, the fight or flight system bypasses our rational mind – where our more well thought out beliefs exist – and moves us into "attack" mode. This state of alert causes us to perceive almost everything in our world as a possible threat to our survival. As such, we tend to see everyone and everything as a possible enemy. Like airport security during a terrorist threat, we are on the lookout for every possible danger. We may overreact to the slightest comment. Our fear is exaggerated. Our thinking is distorted. We see everything through the filter of possible danger. We narrow our focus to those things that can harm us. Fear becomes the lens through which we see the world. (3)

I must ask, what part does stress play in fear? Is it a part of fear, or is it something altogether different? Is stress fear, or something that comes before it? Is fear a combination of feelings? If it is a primitive, automatic, inborn response, has it changed its response to fear? Has it evolved to the needs of the body and the mind? Or has it remained the same, to protect us in urgent and immediate survival situations? Is fear just a protective signal for the body and mind to take action, to flee or to think? If fear is distorted and exaggerated, what causes this to happen? Why does it go beyond that protective state?
Fear as a feeling can be dealt with in a different manner, and can be viewed alongside other feelings in a more general way:
Feelings, on the other hand, are always hidden, like all mental images necessarily are, unseen to anyone other than their rightful owner, the most private property of the organism in whose brain they occur. Emotions play out in the theater of the body. Feelings play out in the theater of the mind. (4)

A feeling is the perception of a certain state of the body along with the perception of a certain mode of thinking and of thoughts with certain themes. Feelings emerge when the sheer accumulation of mapped details reaches a certain stage. (5)

With this view in mind, it would seem that all feelings arise in the same manner. But fear seems much stronger when viewed against other feelings. It seems that fear of some sort exists within all feelings.
On a personal note, I feel the feeling of fear, and I cannot put it into words; it simply has to be felt. Words are very shallow when you try to convey a feeling. Feelings cannot be explained; the words do not come out, only the feeling. The feeling seems to cancel out the words. Some of my questions have been answered, but that only made room for more questions.


Damasio, Antonio. Looking for Spinoza: Joy, Sorrow, and the Feeling Brain.
Neimark, Neil F. "The Fight or Flight Response." www.TheBodySoulConnection.com

Full Name:  Bethany Keffala
Username:  bkeffala@bmc
Title:  Autism, Mirror Neurons, and Theory of Mind
Date:  2006-05-18 23:40:26
Message Id:  19407
Paper Text:


Biology 202

2006 Third Web Paper

On Serendip

Autism is a developmental disorder whose symptoms tend to manifest very early in life. They often include unusual behavior and social interaction, as well as great difficulty communicating (3). It has been hypothesized that autistic people may lack development of theory of mind, and it has also been suggested that a mirror neuron system deficiency may be a cause. It seems to me that both of these guesses are on the right track, but that they are connected, and that theory of mind is a side effect of our advanced mirror neuron system.

Non-autistic children generally start developing theory of mind as early as five months old, with a growth spurt at around three or four years, and with development nearing completion at around five years of age (6). With theory of mind, we are able to recognize that the self and its desires are separate from the rest of the world, and that other people have minds, and mental and emotional states, that are similar to our own but are products of their own experiences. Autistic children develop theory of mind later than non-autistic children, if they develop it at all.

It is very plausible that theory of mind is connected to the mirror neuron system. Theory of mind is all about using our own experience and our own thoughts and beliefs in order to think about someone else's thoughts or beliefs, and how they might be similar or different. We use what we have as a sort of template to try to comprehend where another person is coming from. This is much like the role of mirror neurons, which are the key link between action and observation, and which underlie how we understand the actions of another person.

Though mirror neurons were first discovered as a link between action and observation, it has since been suggested that they may also play a very central role in the learning and evolution of language. They are found in a very important language center of the brain, and it has been demonstrated that they might be at least partially responsible for our ability to comprehend each other's speech, much in the way that they help us to comprehend one another's actions. As a system, it seems that the function of mirror neurons in general is to act as templates for understanding the variety of behaviors of those around us. We use the same system to act and speak as we do to comprehend this behavior in others, much like theory of mind for which we refer to our own beliefs and experiences to form guesses about what is going on in another person's mind.

Mirror neuron systems in autistic people are not as highly functioning as those of non-autistic adults. It has been suggested that mirror neurons are responsible for our ability to empathize with another's emotions, and recently there has been much research looking at a possible link between autism and a deficiency in the mirror neuron system. When a non-autistic person watches someone expressing emotion, experimenters can see an activation of the limbic system (which is linked with emotion) in the brain of the watcher. This, however, is not the case for someone with autism. They may be able to imitate the facial expressions or actions of the person, but the limbic system is not activated (1). One study, in which autistic children watched a tape of people using different facial expressions to express different emotions, showed that the less blood flow there was to mirror neurons, the more trouble the person had in deducing which emotion was being displayed by the person in the tape (2).

Another example of a possible connection between autism, mirror neurons, and theory of mind can be found when we look at language, particularly the use of figurative language in everyday conversation. It is thought that understanding another's intention is vital to understanding everyday conversation. This touches on theory of mind, in that theory of mind helps us to understand another person's states of mind and intentions. Without theory of mind, everyday conversation, which is full of figurative language, would become very confusing. This seems to be the case for autistic people. Some examples of interactions with autistic children follow:
"A request to 'Stick your coat down over there' is met by a serious request for glue. Ask if she will 'give you a hand', and she will answer that she needs to keep both of her hands and cannot cut one off to give to you. Tell him that his sister is 'crying her eyes out' and he will look anxiously on the floor for her eyeballs" (7) . If an autistic person has a faulty or non-existent theory of mind, then they will have a difficult time guessing the intention of a speaker with whom they are trying to converse. This missing understanding of intention prevents them from arriving at a figurative conclusion, and instead they react as if the speaker were speaking literally.

Some deficiency in the mirror neuron system could very well result in an impaired development of theory of mind. If mirror neurons are vital to the development of theory of mind, and if both theory of mind and mirror neurons are vital to successful communication, then it makes a lot of sense that we find autism paired with an underdeveloped mirror neuron system. It may be beneficial for us to look at more ways to test the connections between these phenomena. It seems that the level of development of theory of mind ranges across cases of autism. It would be a good idea to see if there is a correlation between mirror neuron activity and the level of development of theory of mind. It might also be productive to see if this correlates with level of ease in comprehension and utilization of figurative language.

Works consulted:

7) Happé, Francesca G. E. "Understanding Minds and Metaphors: Insights from the Study of Figurative Language in Autism." In Metaphor and Symbolic Activity. Lawrence Erlbaum Associates, Inc., 1995.

8) Dennis, Maureen, Lazenby, Anne L., and Lockyer, Linda. "Inferential Language in High-Function Children with Autism." Journal of Autism and Developmental Disorders, Vol. 31, No. 1, 2001.