Author Archives: Celine Elizabeth Gosselin

Why do humans get nostalgic?

Nostalgia is one of my greatest weaknesses, I think. For such a long time I’ve been severely missing aspects of my childhood, and often these memories are a source of depression. For me, nostalgia is a barrier to moving forward. However, I know many other people experience nostalgia as a calming mechanism and can easily look back on happy memories without any emotional turmoil in the present. “‘When you’re nostalgic about something, there’s a little bit of a sense of loss—[the moment has] happened, it’s gone—but usually the net result is happiness,’ says Clay Routledge, a social psychologist at North Dakota State University, who, with several other researchers, has studied the emotion extensively over the past decade.” Either way, experiencing nostalgia definitely influences mood and can affect how we live our lives. So where does it come from?

Routledge conducted a study on the possible triggers of nostalgia (found here). Participants were asked to describe situations that led them to experience nostalgia, and the researchers found that negative emotions, chiefly loneliness, were the most common answers. This makes sense, going along with Routledge’s initial statement- nostalgia may be brought about as an attempt to cope with a negative mental state. Routledge refers to these states as “psychological threats.” But if this is the case, it doesn’t make sense that my negative mental states often come from the nostalgia itself. Also, this study is observational, and the results may be confounded by third variables such as the age of the participants or possible mental disorders. I would need an actual experiment measuring levels of brain chemicals during episodes of nostalgia to be completely convinced that nostalgia is brought about by a negative mental state.

Upon doing some more digging, I found it a bit odd that many news articles about nostalgia and its effects cite the same study by Routledge (and on some occasions, nothing else). I find it hard to believe that there are so few modern studies on this topic in psychology. There are several older studies that take a more negative view of nostalgia’s effects, but the only study used to inform the public about the new light in which scientists are seeing nostalgia is the one conducted at North Dakota State University. The study’s findings do look decent, and many researchers were involved, but more scientists should conduct similar experiments in order to reduce the likelihood that the results are due to chance.



Cracking your knuckles: debunked

I am a violist in Penn State’s Philharmonic Orchestra, and after a long two-hour rehearsal, my fingers are often stiff and I’m just itching to crack my knuckles to relieve the tension. There are some people who are chronic knuckle crackers, and then there are people like my mom (who freaks out and tells me I’ll get arthritis if I keep doing that). What I want to know is whether there is actual scientific evidence that cracking your knuckles is bad for your joints, or whether it’s just a myth made up by people who get grossed out by the popping noise.

John Indalecio, a hand therapist at the Hospital for Special Surgery in New York, believes there is not enough compelling scientific evidence to suggest that knuckle cracking causes arthritis. However, if a habit is formed, there is a chance that more problems could occur down the road. That sounds reasonable to me, but the first article I found was from the Huffington Post, so I’ll need to look at the actual studies to see if the results support this claim.

A PLOS ONE study concludes, “Presently, the literature in this area is confusing in that the energy produced during joint cracking is thought to exceed the threshold for damage [51], but habitual knuckle cracking has not been shown to increase joint degeneration [52]. Ultimately, by defining the process underlying joint cracking, its therapeutic benefits, or possible harms, may be better understood.” In a nutshell: making a habit of cracking your knuckles has NOT been shown to cause an increase in joint issues. More experiments would need to be done in order to further assess the long-term effects.

What I’m wondering is whether there has ever been a study examining the effect of knuckle cracking across many different people, taking into account genetics that may make someone more prone to joint or muscle issues. If you have a family history of arthritis or osteoporosis, for example, does the effect of cracking your knuckles worsen? Are there benefits for certain types of people in cracking their knuckles to create these cavities in the finger joints?

Another thing- the PLOS ONE study was definitely very thorough, but the average person would find it difficult to get the big picture of what the scientists discovered. For so long, everyone believed (and some still believe) that the sound of cracking knuckles came from air bubbles being popped, but this study suggests otherwise. I think the information would be much more accessible to the general public, and less likely to be misunderstood, if another version of the study were published that simplified the findings and explained what they mean for us.

With all this new technology, why are humans still so ignorant?

There are times when I am just floored by how ignorant humans, specifically Americans, are these days, even when answers are right at their fingertips. In fact, one international survey ranked America as the developed world’s second most ignorant country. First, I’d just like to compile a long list of ridiculous statistics demonstrating the stupidity of some Americans.

  1. Only 71% of Americans can locate the Pacific Ocean on a map
  2. 22% of Americans could name all five Simpson family members, compared with just 1 in 1,000 people who could name all five First Amendment freedoms
  3. A 2007 National Constitution Center poll found that two-thirds of Americans couldn’t name all three branches of the U.S. federal government, nor a single Supreme Court justice.
  4. When respondents of another poll were asked whether they could recall any of the rights guaranteed by the First Amendment, a majority could name only free speech. More than a third were unable to list any First Amendment rights.
  5. 14% of voters in one poll believe in Bigfoot
  6. 13% of voters think Barack Obama is the anti-Christ
  7. Polls over the last few years have variously shown that about 30% of us couldn’t name the vice president
  8. About 35% of Americans couldn’t assign the proper century to the American Revolution
  9. 25% of Americans were unable to identify the country from which America gained its independence. Although 19% stated that they were unsure, Gallup findings indicated that others stated answers varying from France to China. Older folks scored much better than young people on this question, as a third of those 18-29 were unable to come up with the correct answer. [source]
  10. Despite being a constant fixture in school curricula, another 30% of Americans didn’t know what the Holocaust was. [source]
  11. 20% of Americans believe that the Sun revolves around the Earth

I think you get the picture. The statistics are definitely crazy- slightly entertaining, but also kind of disturbing. Please keep in mind that these statistics may not be current or taken from reliable sources. That being said, the general trend when googling “american ignorance” is that we are an overwhelmingly stupid population.

I know it’s fiction, but I think M. T. Anderson’s novel Feed is a really great depiction of how the ability to easily acquire knowledge does not mean the human race will become any more knowledgeable. With Google, you can find the answer to any question you might have, but having the information served to you on a silver platter makes it harder to retain what you’ve learned. If you have to go digging to find something out, you have to WORK for that knowledge, and chances are you are going to come across something along the way that will stick with you.

This article supports my theory: “Why are we so deluded? The error can be traced to our mistaking unprecedented access to information with the actual consumption of it… only a small percentage of people take advantage of the great new resources at hand. In 2005, the Pew Research Center surveyed the news habits of some 3,000 Americans age 18 and older. The researchers found that 59% on a regular basis get at least some news from local TV, 47% from national TV news shows, and just 23% from the Internet.”

We are awash in news and information. Some is good, some is bad or anecdotal, and it can be hard to distinguish one kind from another. In addition, the previous link suggests that fewer people today read the paper, watch the news, or visit news websites. Voter turnout is declining, especially among young people, perhaps because a lack of information breeds apathy and diminishes the perceived importance of nationwide elections. It’s so easy to be informed in this day and age, but people take it for granted and don’t understand how much they truly do not understand about the world.

Where do our fears come from?

Everybody has something that keeps them up when things go bump in the night. The National Comorbidity Survey, a study of more than eight thousand respondents in the United States, ranked the prevalence of several common phobias:

  1. Bugs, mice, snakes, and bats
  2. Heights
  3. Water
  4. Public transportation
  5. Storms
  6. Closed spaces
  7. Tunnels and bridges
  8. Crowds
  9. Speaking in public

Our fears tend to change as we grow older: for example, WebMD says that common fears of preschoolers include masks, the dark, monsters, and ghosts, while school-age children fear snakes and spiders, being home alone, angry teachers, doctors, and natural disasters. But how do we develop these fears?

The most obvious answer is that a fear can be formed by a traumatic experience in the past involving the source of terror. For example, if I was stung by a bee as a child, I would naturally be afraid of bees in the present due to anxiety that I might be stung again. However, this isn’t always the case- I’m somewhat afraid of bees even though I’ve never been stung or even seen it happen to someone else.

Another possible explanation is that fear arises from witnessing another person’s anxiety or phobia. Michael Cook at the University of Wisconsin discovered that “monkeys born in the wild are afraid of snakes — a useful asset for their survival. But monkeys raised in a laboratory don’t react when they see a snake, whether it’s poisonous or not. This shows that our fears can’t be genetic.” However, “monkeys who have never been afraid of snakes quickly learn to be frightened of them: they only have to see that another monkey is scared. And it only has to happen once.”

This would certainly explain many phobias, but I’m still left wondering why seeing a horror movie in a theater with other scared people has less of an effect on many of us than seeing one at night, in the dark, home alone. This probably stems from being afraid of the dark and sudden noises as a preschooler, but why do some people carry this fear into adulthood?

Maybe a fear of darkness is so common because most humans fear the unknown or what they cannot see with the naked eye. “This fear, and the grinding anxiety that it generates, acts as a check and limiting mechanism against reckless behavior like, say, running around in the dead of the African night with a continent’s-worth of big cats out on the prowl. In other words, it’s an evolutionary advantage.” Anxiety increases awareness and at a time when humans were far from the top of the food chain, being afraid of the dark made all the difference in the survival of the species.

[Image: common fears (source: https://www.scienceworld.ca/ads)]


Is there such a thing as too much affirmative action in some college degree programs?

For a while, I’ve wondered whether affirmative action in talent-based degree programs such as theater, architecture, music performance, and fashion design is doing its job correctly.

My sister is a senior in high school and is applying to many major schools for musical theater. She attends the Academy of Theatre Arts, where many students do go on to pursue theater at a university. Last year, a graduating senior was accepted into the highly competitive NYU Tisch School of the Arts in New York City. He is a good dancer, but we puzzled over why he was picked above the many other talented “triple threats” vying for a spot in the program. This student happened to be a mixed-race Hispanic, so it got me wondering whether that had any effect on his being chosen over students who might have been more talented.

Keep in mind- the above story is an anecdote. There is too much we do not know about the situation to judge how much this student’s race affected his acceptance. He could have had an excellent audition, known or worked with NYU faculty beforehand, received superb letters of recommendation… for all we know, the talent level of last year’s applicants may simply not have been as high.

However, in this blog post I do want to examine the pros and cons of affirmative action, and whether it has truly succeeded in doing its job.


It’s a common myth that affirmative action will cause many Caucasian workers to lose out. Government statistics suggest that this is not necessarily true. According to the Department of Commerce, “there are 2.6 million unemployed Black civilians and 114 million employed White civilians (U.S. Bureau of the Census, 2011). Thus, even if every unemployed Black worker in the United States were to displace a White worker, only 2% of Whites would be affected.” This statement, however, applies to the general workforce and not to the creative pursuits where talent comes into question. Statistics and studies in this specific area are severely lacking.
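(The arithmetic behind that figure, for the curious: 2.6 million ÷ 114 million ≈ 0.023, or roughly 2%.)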

I do understand that many African Americans don’t have as much of an opportunity to learn how to play the violin well, study under a prestigious dance teacher, or visit cities with famous architecture to inspire them in their work. I think this is the problem that should be remedied- that way, we can level the playing field, and the main criterion for college acceptances or job offers will be talent, like it should be. Surprisingly, one of my least favorite Supreme Court justices agrees with me. Clarence Thomas, “who has opposed affirmative action even while conceding that he benefited from it, told a reporter for The New York Times in 1982 that affirmative action placed students in programs above their abilities. Mr. Thomas, who was then the 34-year-old chairman of the Equal Employment Opportunity Commission, didn’t deny the crisis in minority employment. But he blamed a failed education system rather than discrimination in admissions.”

There aren’t many studies showing whether it would be more beneficial to improve the early education system than to set quotas for minority students applying to college. In some cases, I’m sure affirmative action works great- there are more minority and female workers in the workforce today than ever before. However, when talent is called into question, I wonder if there is a better solution to the inequality conundrum that still plagues us today.

Toothbrush germs: can you get rid of them?

There isn’t much science on the age-old dilemma of the “right” way to brush your teeth. Sure, plenty of people have thoughts or opinions on why you should or shouldn’t wet your toothbrush before brushing, and especially on the internet, we can be quite vocal about our preferences.

Some people argue that by wetting the toothpaste once it’s on the brush, you create extra foam to get into and clean all those hard-to-reach crevices in your mouth. Others claim that their dentists advised against this and that the water reduces the toothpaste’s effectiveness. I myself am part of a third group, preferring to wet my toothbrush before putting any toothpaste on the bristles at all. These may just be personal preferences, but each method could have some impact on how clean your teeth are after each brush.

Is there a “best way” to keep your teeth clean?

A MythBusters experiment confirmed that if you leave your toothbrush on the bathroom counter, all it takes is one toilet flush for tainted water droplets and fecal matter to contaminate it. Bacteria can spread to anything within a twenty-foot radius of the toilet. Ever since I was a little kid, this fact has traumatized me. By rinsing my toothbrush before putting the toothpaste on it, I try to avoid the disgusting mental image of what may or may not be going into my mouth.

Unfortunately, this method is not as effective as one might hope. It’s nearly impossible to escape the germs, regardless of when or whether you rinse off your toothbrush before use. One possible solution is to flush with the toilet lid down and avoid making a choice about your toothbrushing habits altogether. In college, however, most dorm toilets don’t have lids.

Another possibility is to rinse your toothbrush in mouthwash before each use. Sounds like a good idea. But in this study, scientists found that there were essentially no differences between toothbrush cleaning practices to prevent contamination. “Using a toothbrush cover doesn’t protect a toothbrush from bacterial growth, but actually creates an environment where bacteria are better suited to grow by keeping the bristles moist and not allowing the head of the toothbrush to dry out between uses,” explains study author Lauren Aber. She suggests that regardless of your habits, you should replace your toothbrush about every 3 months.

In conclusion, I do stand behind my earlier opinion: you SHOULD rinse your toothbrush thoroughly before each use. There may not be a huge benefit to making this a habit, but all it will cost you is a bit of tap water and maybe ten extra seconds in your morning or nighttime routine.

Perfect Pitch: how you get it and why it matters


Fair warning: some of this might not make a whole lot of sense to those without much musical experience. I apologize.

I’m a musician. I play the viola in Penn State’s Philharmonic Orchestra, hope to earn a BM in Music Composition, and struggle with sight singing in music theory just like anyone else. I have what is usually called “relative pitch”- I can hear a melody and sing it back on the same pitches but I couldn’t tell you what letter notes I sang. For all I know, the key of the melody (or ‘do’ in solfege) could have been anything from C# to F. Basically, if you put a piece of sheet music in front of me and pointed at a pitch on the staff, I wouldn’t be able to sing it without first hearing it on the piano.

This is the norm for most people. Relative pitch is a left-brain skill that can be learned, and it allows musicians to understand why a piece of music works and what makes something sound a certain way (1).

On the other end of the spectrum, you have something called perfect pitch, or absolute pitch. It is not a skill you can acquire over time as an adult. Only one in ten thousand people has it, and many of them go their entire lives not knowing they do (2). Basically, anyone with perfect pitch can hear a note played on any instrument and identify it. In my high school, I knew someone with absolute pitch who correctly figured out that the fire alarm was a Bb without any external reference. While perfect pitch does not equate to any understanding or comprehension of the music that is heard, it allows you to hear each note by the “color” of its sound (1). A person with perfect pitch can see a notated piece of music and sing its exact pitches without any prompting, hear a piece of music and know what key it is in, and distinguish that a note is a “B” instead of a “Cb”, pitches that sound identical to the ear (1).

Almost every musician I know is envious of the special few who are graced with absolute pitch. Though it is not something that can make or break a performer, it is extremely useful when sight-reading music for the first time or dictating a melody by ear in music theory classes. As a composer, I wish I could hear a melodic pattern and play it back on my viola or even a piano, so that I could analyze compositional techniques more quickly.

Getting down to the science of it, it appears that the auditory system of an absolute listener is the same, both physically and functionally, as that of a non-absolute listener. Perfect pitch is an act of cognition, “needing memory of the frequency, a label for the frequency (such as “B-flat”), and exposure to the range of sound encompassed by that categorical label. Absolute pitch may be directly analogous to recognizing colors, phonemes (speech sounds) or other categorical perception of sensory stimuli. (3)” Basically, just like most people learn to identify the color purple by the “frequencies of the electromagnetic radiation that is perceived as light, it is possible that those who have been exposed to musical notes together with their names early in life will be more likely to identify, for example, the note C. (3)”. Most people who have perfect pitch, it turns out, start music lessons before the age of six. Perfect pitch might also be related to a person’s genes, but this is largely untested. It’s just as likely that it can only be obtained through exposure to music at a young age (3).
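To make that “label for the frequency” idea concrete, here is a minimal sketch (my own illustration, not taken from any of the cited sources) of the mapping a perfect-pitch listener performs instinctively. It assumes twelve-tone equal temperament with A4 tuned to 440 Hz; real ears, of course, do this without logarithms:

```python
import math

# Pitch classes in twelve-tone equal temperament, starting from C.
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def note_name(frequency_hz: float, a4_hz: float = 440.0) -> str:
    """Return the nearest equal-tempered note name (with octave) for a frequency."""
    # Distance from A4 in semitones, rounded to the nearest note.
    semitones_from_a4 = round(12 * math.log2(frequency_hz / a4_hz))
    midi = 69 + semitones_from_a4   # MIDI convention: A4 = note number 69
    octave = midi // 12 - 1         # MIDI 60 = C4 (middle C)
    return f"{NOTE_NAMES[midi % 12]}{octave}"

print(note_name(466.16))  # A#4 -- the same sounding pitch as the fire alarm's Bb
print(note_name(261.63))  # C4  -- middle C
```

The fire alarm from my high school story would come out as A#4 here, which is enharmonically the same note as Bb.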

While I understand that this issue is far from urgent (which would explain the lack of studies), understanding why some people have absolute pitch is something I definitely feel could use more experimentation and research. Scientific study began in the 19th century, but many aspects of the anomaly are still unexplained or uncertain. For example, how much music does a child have to listen to at a young age to increase his or her chances of developing perfect pitch? How does having perfect pitch improve cognition in other aspects of life?

I don’t expect these studies would be controversial or expensive to conduct. This is not a life-or-death issue, so the ethics of conducting an experiment focusing on the benefits of developing perfect pitch, or the correlation between a musically trained ear and higher intelligence, shouldn’t be a problem. Scientists could simply conduct a trial with a control group of children in a certain age range who are not exposed to classical music and a randomly assigned group of the same size that listens to classical music for x hours a day. After a set period of time, all children would be tested on physical and cognitive function.
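As a rough sketch of what that randomized assignment might look like in practice (a toy example with made-up placeholder scores, not a real analysis), the key step is shuffling the participant pool before splitting it, so neither group is systematically different from the other:

```python
import random
import statistics

random.seed(42)  # reproducible shuffle for this example

# Hypothetical participant pool -- IDs only, no real children or data.
children = [f"child_{i:03d}" for i in range(200)]

# Randomized assignment: shuffle, then split evenly into two groups.
random.shuffle(children)
half = len(children) // 2
music_group, control_group = children[:half], children[half:]

# After the trial period, each child takes the same cognitive test.
# Placeholder scores drawn from one distribution stand in for real results.
scores = {child: random.gauss(100, 15) for child in children}

music_mean = statistics.mean(scores[c] for c in music_group)
control_mean = statistics.mean(scores[c] for c in control_group)
print(f"music group: {music_mean:.1f}, control group: {control_mean:.1f}")
```

Because the placeholder scores here come from a single distribution, the two group means should come out nearly equal; in the real trial, a meaningful gap between them would be the signal worth investigating.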

Reverse causation is ruled out by the design in this instance: the children have no choice in whether they listen to music or not. And with a randomized trial, confounding variables should balance out, leaving an even mixture of more and less intelligent children in each group. Therefore, if a positive correlation DOES appear in the study, scientists can reasonably conclude that developing perfect pitch early on causes an increase in intelligence. Scientists could then conduct further experiments testing the effectiveness of certain music genres in developing perfect pitch, the most beneficial age to train one’s ear, and how much time one has to train each day to best develop the skill.

So what I’m trying to say is… when I google “perfect pitch”, I’d much rather see articles that answer the questions above, or studies that expand upon information that has been around for a long time, than what currently pops up- reviews of the disaster that is the movie Pitch Perfect.

[Screenshot: Google search results for “perfect pitch”]

  1. http://www.perfectpitch.com/perfectrelative.htm
  2. http://www.creativitypost.com/arts/do_you_have_perfect_pitch
  3. https://en.wikipedia.org/wiki/Absolute_pitch


Initial Blog Post: Celine Gosselin


I’ve always been far more interested in learning about the arts than the sciences. Even though my dad found a career teaching Exercise Science at the University at Buffalo and my mom is a clinical psychologist, I’ve always preferred creating something new to discovering something already in existence but unknown to humanity. I grew up with a love of reading, dance, music, and drawing- pretty much any medium with the ability to convey a message and tell a story. So it’s not super surprising that I’m a freshman studying Music Composition at Penn State.

This is not to say that I have a passionate hatred for the sciences. Throughout my middle school and high school career, none of my science teachers did enough to immerse me in the material and, like you said in class today, Andrew, I ended up memorizing facts and forgetting most of them soon after. Chemistry especially never seemed applicable to my daily life- all humans are made up of elements and ions and molecules… so what? How does knowing that these particles are in my body make me a better person? How can I actively contribute to a class discussion where there is only one right answer? How can I distinguish myself as a unique individual in class if earning an “A” can only be achieved by one way of thinking?

I’ve always LOVED to debate. I have a lot of opinions and don’t really have a problem expressing what I believe or how I feel. I always do much better in classes where I am able to contribute to discussion in my own unique way, and none of my science classes have allowed me to do that in the past. For me to learn something well, I need to be able to connect the material to something I know from my own life experience. This is why I was excited to see a science class based on controversy that fulfills part of my natural sciences requirement. While science will never be my cup of tea, I’m hoping a more interactive take on how the world works will be a better fit for me than my past experiences have been.


[Photo: this is me. I am obsessed with Disney World.]

Attached is a link to one of my compositions. I play the viola.

“Lovebirds” written by Celine Gosselin