Author Archives: Patrick Winch

Never Been Sicker

Regardless of whether you consider it good or bad, the drinking culture at Penn State (and on many college campuses) is pervasive. This fact has been addressed in other people's posts as well. And I'm sure most of us, even if we don't really drink, have heard the phrases "liquor before beer, you're in the clear" and "beer before liquor, never been sicker". After hearing and thinking about these phrases, I was compelled to find out whether they are actually true. So, does it really matter what order you consume different types of alcohol in, and does the order affect the consumer's health in any way?


The null hypothesis in this study is that it does not matter in what order you drink different types of alcohol. The alternative hypothesis is that the order of alcohol types consumed has an effect on whether or not the consumer becomes sick or gets a hangover. The independent variable here is the order in which types of alcohol are consumed, and the dependent variable is sickness or hangovers.

First, we have to look at how the body processes alcohol. Brown University explains it simply: after consumption, alcohol is absorbed through both the stomach and the small intestine. Enzymes in the liver then break down the alcohol. The liver can only process about one standard drink an hour; if the consumer drinks more than that, the liver becomes saturated and the extra alcohol builds up in the blood and tissues until it can be processed. The liver does not process different types of alcohol at different rates. As stated earlier, the liver can only handle one standard drink an hour, which means 12 ounces of beer, 5 ounces of wine, or 1.5 ounces of spirits or hard liquor. In addition, George Washington University explains that the body metabolizes about 0.015 BAC an hour, which is roughly equal to that one standard drink. So if we know how the body works and how long it takes to sober up, why do so many people still follow old urban myths?
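The arithmetic above lends itself to a quick back-of-the-envelope sketch. This is a rough illustration only, not medical guidance: the 0.015 BAC-per-hour clearance rate comes from the GWU figure above, while the 0.025 BAC-per-drink rise is an assumption I've added purely for the example (real BAC depends on weight, sex, food, and more).

```python
# Rough sketch of the figures above: the liver clears roughly one
# standard drink (~0.015 BAC) per hour, so anything consumed faster
# than that accumulates until it can be processed.

METABOLISM_RATE = 0.015   # BAC cleared per hour (one standard drink)
BAC_PER_DRINK = 0.025     # assumed rise per standard drink (illustrative)

def hours_until_sober(drinks: int, hours_drinking: float) -> float:
    """Estimate hours after the last drink until BAC returns to zero."""
    # Peak BAC = what was added minus what the liver cleared while drinking.
    peak_bac = max(drinks * BAC_PER_DRINK - hours_drinking * METABOLISM_RATE, 0.0)
    return peak_bac / METABOLISM_RATE

# e.g. five drinks over two hours leaves several hours of clearance:
print(round(hours_until_sober(5, 2), 1))  # → 6.3
```

Notice what the sketch makes obvious: the total number of drinks and the pace set the outcome, and the drink *type* never enters the calculation at all.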


Allison Van Dusen of Forbes Online debunks these claims in a 2007 article. She cites Julia Chester of Purdue University, saying that it does not matter in which order you drink different types of alcohol. The only difference is that the body does absorb different types of alcohol at different rates; for example, a 12-ounce glass of beer will be absorbed more slowly on an empty stomach than its liquor equivalent. All in all, it's the speed at which you drink that has the biggest influence on sickness and level of intoxication. So, simply put, the phrase is more or less useless. If the average consumer follows that phrase in hopes of avoiding a hangover, there are much better ways to do that.

Some of the best ways, writes Jess Novak of the Daily Meal (2014), include drinking less, drinking water alongside your alcohol, and even taking an Advil before bed. Obviously, no one under the age of 21 should be consuming alcohol in the first place. There are also third variables that can affect a person's level of intoxication, such as weight, tolerance, and the amount of food in the drinker's stomach. Another important point to address is the potential power of the anecdote. For example, a person could easily say that they are an adamant believer in the "liquor before beer, you're in the clear" phrase just because they followed the rule and didn't get sick or get a hangover. But that person could also have taken other measures, whether knowingly or unknowingly, to prevent themselves from becoming sick or getting a hangover. In fact, it's likely that both phrases were created based on repeated anecdotes alone, with no science to back them up.

To conclude, it can be said that the average consumer of alcohol should not worry about or follow those phrases. Rather than change the order of alcohol consumption, a person should instead limit their drinking and be more cautious of their pace.


Image 1 found here

Image 2 found here

The Deadly Habit

I’ve been cracking my knuckles for years. Whenever I did it around my mom as a little kid, she would get mad at me and tell me that it’ll give me arthritis someday. Now, I’ve come to the realization that she didn’t really have any evidence to prove that claim, and she also most likely told me that to scare me and get me to stop. But lately, I’ve been wondering whether or not I should try to break the habit that I’ve been told is so harmful.

The first question to ask is what makes the cracking noise. Tina Hesman Saey of ScienceNewsforStudents (2015) writes that the noise comes from a bubble between the joints in the finger. The original data come from a study performed in 1947 that showed the two joints in the finger rapidly separating and forming an air bubble. Since then, different hypotheses have been suggested, including one that claimed the noise came from the bubble itself popping. But recently, two scientists from the University of Alberta in Canada decided to take an MRI of a hand while the knuckle was being cracked. The study showed that the noise comes from the formation of a cavity in the knuckle joint, created by fluid rushing into the space between the bones. The results were in line with previous thinking, but now there was actual scientific data to back up the theories. The study concluded with a statement that the scientists plan to repeat their work in the future with bigger sample sizes, including individuals who are unable to crack their knuckles.

The image above shows the cavity in the joint.

The null hypothesis in this study is that cracking knuckles does not lead to any negative effects on the body. The alternative hypothesis is that cracking knuckles can lead to potential negative effects such as arthritis and other medical conditions.

So, the question being asked here is whether or not this knuckle-cracking habit is actually harmful. Does the cracking of one's knuckles (independent variable) cause arthritis (dependent variable) or any other medical conditions? Michael Behr, M.D. of Piedmont Healthcare doesn't seem to think so. Although he does not directly dismiss the potential negative effects of knuckle cracking, he does say that there is no direct correlation between the cracking and arthritis or any other harmful medical condition. In addition, he makes it clear that although there is no proof that cracking leads to arthritis, any pain felt while cracking joints should be taken very seriously.

It is important to address and consider third variables when it comes to the development of medical conditions such as arthritis. Things like genetics play a big part in whether or not a person develops such conditions, and even though a person who cracks their knuckles often may develop arthritis later in life, it can always be said that correlation does not equal causation. Elizabeth Cohen of CNN (2016) writes of a study similar to the one mentioned earlier, in which joints were examined under ultrasound before and after they were cracked. The study again showed that the knuckle crackers did not experience any negative effects, and actually had a greater range of motion. This topic does not suffer from the file drawer problem, as there has been more than one study put out on the matter. As of now, there are no active studies in progress on the topic.

To conclude, it can be said that as far as we know now, cracking your knuckles can actually be beneficial! The average person who cracks his or her knuckles should not stop doing so just because their parents asked them to.


Image found here

Great Minds Think Alike?

I’m sure we’ve all heard the phrase “great minds think alike”. I’ve said this many times before, usually when my friends and I have similar ideas. But recently in class, we’ve had several guest speakers who advocate for the type of thinking that isn’t necessarily conventional. In fact, in our “How to win a Nobel Prize” lecture, we were told that we should be constantly testing and challenging concrete ideas. This got me thinking: does collaboration in the job environment hinder progress?

If one were to look at that question based only on the last few lectures we had before break, they might think that the answer is obvious. We were shown several cases in which previously concrete ideas that had been accepted and followed were suddenly overturned thanks to new thinking.

In today's day and age, writes Susan Cain, employees are hired based on things like people skills, and offices often don't even have walls. The idea of individuals working on their own in solitude is dying out. And that might not necessarily be a good thing. For one, collaboration rewards and even demands like-minded thinking. When working in a group, there are several confounding variables to worry about that could potentially hinder progress. For example, a group member may be hesitant to share ideas out of fear of ridicule or embarrassment. Or, one member might rely on another to come up with the ideas or do the work (I know most of us can relate to that one). On a more advanced level, Cain cites the ideas of Hans Eysenck, emphasizing that group work and collaboration can be harmful to innovation, because working with others carries negative externalities such as energy wasted on social cues, which reduce focus. In addition, Cain discusses major innovative geniuses such as Steve Wozniak and Steve Jobs, many of whom share introverted tendencies. Vivian Giang of Business Insider writes about multiple ways that collaboration in companies can be unsuccessful, mentioning problems such as group competitiveness, failed recognition of individual achievement, and bias disguised as loyalty to a specific group.


In her article, Susan Cain cites a study called the Coding War Games, which examined programmers from ninety-two separate companies. The study showed that workers within the same company performed at roughly the same level, but that there was a big difference between companies in terms of productivity. 62% of the programmers at the top-performing companies reported that their workspace was mostly private, whereas only 19% of programmers at the worst-performing companies reported the same thing. She goes on to say that brainstorming sessions are for the most part extremely unsuccessful, and that the quality of group work decreases as the size of the group increases.

There are no current studies found being done on this topic, and it is unlikely that this topic could suffer from the file drawer problem. Also, sample size does not seem to matter when it comes to this topic, since many studies and anecdotes have been recorded, whether formal or informal.

To conclude, it is reasonable to say that collaboration in the job environment does, in many instances, hinder or slow progress. Different people succeed in different ways, but it can be said that working in self-controlled isolation is the best way for the average worker to develop stronger products and better ideas.

Sound of Silence

I've always been very interested in both rock and roll and the 1960s in general. Last year, I even wrote a 25-page research paper on the topic of "How did the Beatles evolve from pop music idols into social and political commentators in the 1960s?". The other day, I was in the car and the song "Sound of Silence" by Simon and Garfunkel came on. In the song, silence is treated as if it were a real, tangible thing that can be touched and affected. This got me thinking: what role does silence play in our lives?

From a literal standpoint, true silence is found virtually nowhere. There are very few natural places with an absolute absence of sound. But there are times and places that allow us to come as close as possible to that silence. Silence allows people to focus better and can be soothing, but it can also be harmful, as in cases involving mental patients. From a more positive point of view, Carolyn Gregoire of the Huffington Post gives several reasons why silence is good for the brain. She explains that silence allows humans to relax and restore mental resources while relieving tension. She also cites environmental psychologist Craig Zimring's 2004 paper, which explains how unnecessary noise can affect a person's health as well, leading to higher blood pressure and an increased heart rate. Lastly, Gregoire explains the concept of the default mode network, the brain's ability to tap into deep thought, meditation, or even daydreaming. Accessing this network allows humans to think analytically and creatively. Based on this article, we can assume that silence is very beneficial to the human brain when given the right amount of exposure.


But silence (or isolation) can be used as a weapon. Historically, isolation has been used against prisoners in hopes of reform, and even against prisoners of war by other countries' governments. Michael Bond (2014) of the BBC writes that in 1951, Donald O. Hebb of McGill University in Montreal decided to use human subjects to see what effect extreme isolation had on the human mind. For his experiment, he had college students spend extended periods of time in silent cubicles with no human interaction or stimulation of any kind. The effects began to appear after only a few hours, with students experiencing anxiety, heightened emotions, and even intense hallucinations. The study didn't last longer than a week, and many students had to be taken out after just two days. Sample size in this experiment makes little difference to the conclusion, as many similar experiments have been done since. At the same time, we cannot rule out confounding variables, as some subjects may be more prone to anxiety or mental breakdowns that lead to detrimental effects.

This practice of isolation has also been employed in the past for practical and seemingly ethical purposes, such as with prisoners. Eastern State Penitentiary opened in 1829, and the prison advocated complete isolation for each prisoner, giving them only a Bible and restricting any communication between prisoners. It wasn't until 1933, David Kidd (2014) writes, that the penitentiary acknowledged the harm of placing men in an environment of complete isolation and hopelessness. The penitentiary was closed completely in 1971 because it was viewed as unfit for human habitation.

No active studies are being performed on this topic, and it is pretty clear that isolation has negative effects on the mind in cases like incarceration, whether genuine or simulated. But for college students (and non-incarcerated citizens in general), it can be concluded that silence is beneficial and an important part of everyday life.

To Spongebob or not to Spongebob

When I was growing up, my parents never really limited my TV time. I didn't watch a whole lot of TV anyway, so it was never a big deal. Now, the only shows I enjoy watching on TV are Spongebob and SportsCenter. I'm very vocal about my love for Spongebob, and last year I had an interesting conversation with one of my teachers when she told me that she wasn't going to let her child watch cartoons at all. She claimed that excessive cartoons, and Spongebob in particular, make kids dumber. This got me thinking: how big a factor is TV in brain development, and does excessive TV consumption have negative impacts on a person's life?

The first thing to consider is brain development in children and its stages. According to developmental psychologist Jean Piaget, children go through three stages of cognitive development before they reach age twelve. Stage one occurs between birth and age two; stage two occurs between ages two and seven; and stage three occurs between ages seven and twelve. The final stage runs from age twelve onward. Generally speaking, it can be assumed that children will watch most of their TV between ages two and twelve. Piaget's studies state that during the intuitive phase (ages 4-7) of the pre-operational period (ages 2-7), children develop and begin to understand concepts in raw form. Many of those concepts become so ingrained that they are effectively irreversible. It isn't until the period of concrete operations (ages 7-12) that kids are able to use logic and reason more fluently. Obviously, different cartoons portray different messages, and some are more advanced than others. So it can be said that, depending on the cartoon, TV shows potentially do affect the minds of children, and in some stages those effects can be permanent.

Historically, it has been believed that excessive television consumption "rots the brain". According to Kyla Boyse (2010), kids who watch TV excessively have a higher chance of being obese, are more likely to drink and do drugs at a younger age, and are often misinformed about sexuality. She argues that television often promotes these things, and that this promotion can have a very negative impact on the minds and lives of children. Boyse touches on a study of the effect of TV consumption on school life, stating that the effects can last many years and even harm the average consumer as late as age 26. In addition, she states that TV consumption as a child is associated with a decreased chance of graduating from college and an increased chance of flunking or dropping out of school.


From a more concrete standpoint, there are obviously many opportunity costs to watching TV, including physical activity, reading, or even playing with friends and thereby developing communication and interpersonal skills. A study done in early 2016 somewhat undermines the previously stated idea, suggesting that heredity plays a huge and widely overlooked role in how children develop. Criminologists Joseph Schwartz of the University of Nebraska and Kevin Beaver of Florida State University (2016) studied the correlation between kids' TV consumption at a young age and other variables, like ethnicity, gender, and rates of incarceration and violent crime, as they got older. What set this study apart was that it also tested multiple sibling pairs. In almost all of the results, it was found that when blood relation was taken into consideration, almost all correlation between excessive TV consumption and negative life outcomes disappeared. So, all in all, Schwartz and Beaver are saying that TV consumption does not necessarily lead to bad behavior in the teenage or adult years; those behaviors are more likely influenced by heredity. Heredity is a confounding variable that could lead to bad behavior regardless of how much TV a person watched as a child.

To conclude, TV consumption may hinder brain development in children, but it is not the defining cause of delinquency. Granted, it can be concluded that excessive consumption does have negative impacts on one's life, in that a child's time is limited, and while watching TV he or she is not playing with friends, reading, or engaging in physical activity. Despite Schwartz and Beaver's conclusions, a logical parent would monitor both the type of shows their child watches and how long they watch them, and encourage them to spend more of their time at play, something I enjoyed daily despite my adoration of Spongebob.


Bad Metabolism

For some reason, I seem to get sick much more often than any of my friends or family. Fortunately, these sicknesses usually don't amount to anything more than your average head cold or sore throat, but nevertheless, my immune system just never seems to be able to keep up. This has never really made sense to me. I'm an 18-year-old male; in high school I played two sports and worked out very often, got a good amount of sleep, and definitely ate well. To top it all off, last week I got pinkeye for the third time in the past ten months.

To tackle the question of why I get sick so often, I decided to do some basic research into what could be making me sick. The article attached here lists several reasons that people get sick. It touches on matters like oral health and dirty hands, neither of which pertains to me. In fact, my friends often ridicule me for being a germ freak, since I feel the need to wash my hands before and after everything. After going through everything, the only section that sounded familiar was the one about having allergy-like symptoms without any reported allergies.


The condition that section describes is called nonallergic rhinitis. According to the Mayo Clinic Staff (2016), nonallergic rhinitis can include symptoms common to an average head cold, such as a runny nose and congestion. However, there are no signs of an actual allergic reaction in nonallergic rhinitis, which suggests the condition is not particularly threatening.

So why is it that I am cursed with this while so many of my friends are not? Obviously I can't officially diagnose myself, but my symptoms seem to match the listed effects pretty closely. For starters, my father has pretty bad post-nasal drip; he is always clearing his throat and coughing up phlegm, which could definitely be at the root of the problem. According to the Mayo Clinic Staff (2016), constant changes in weather, in things like humidity and temperature, can also contribute to the problem. I am from Maryland, and Maryland, like Pennsylvania, definitely experiences all four seasons. Weather in Maryland is extremely volatile: one day it will be 80 degrees and sunny, and the next it'll be 63 degrees and rainy. The article above also talks about the effects of dry or dirty air. Like I said, I live in Maryland, just 20 minutes outside of DC. I worked in DC this summer and took the Beltway to work every day; there is plenty of smog, and the air is not very clean. So, although millions of other people living in Maryland and DC don't suffer from nonallergic rhinitis, I think it may be safe to say that weather, along with air pollution and unfriendly genes, may be the major contributors to my problem. All in all, I suppose the only thing I can really do is keep eating healthy, exercising, and living a healthy lifestyle in hopes of combating sickness.

Image found HERE.

Athletes V Athletic Trainers

For the past four years, I attended a high school in Montgomery County, Maryland, where I played lacrosse and basketball. The sports teams at my school are collectively very competitive, not only in our conference but at a national level as well. With that being said, athletes in general would obviously prefer not to sit out during any part of the season for medical reasons. But at my high school, injury is not what most athletes fear. Ironically, it is the training staff that most athletes are afraid to consult. The trainers at my school have a reputation for prolonging the time athletes are not allowed to play, especially when a concussion is involved.

So, I decided to ask: do doctors over-diagnose injuries among high school athletes?

Before I continue, I will say that obviously there are extreme cases where athletes need serious recovery time, or even need to stop playing their sport altogether. For example, one of my friends from middle school suffered six concussions over the course of five years, and he is no longer allowed to participate in any contact sport. It is also important to note that all athletic trainers should, and usually do, have the athlete's health and safety in mind when diagnosing injuries.

To address this question, I first did a bit of research on how age and brain development may affect recovery time. Bentz & Purzycki (2008) explain that younger athletes, and high school athletes in particular, are susceptible to longer recovery periods after concussions. This is because the brains of younger athletes are not as developed as those of college or professional athletes, which makes younger athletes more vulnerable to severe brain damage. The chart below shows the difference in recovery time between high school and college athletes.



With that knowledge, it's also worth touching on the pressure put on physicians and trainers to get athletes back on the field as soon as possible. This Amednews article (2010) acknowledges that pressure, explaining how physicians and team trainers at all levels are pushed to clear players before they are ready to play again. Because of this, the article explains, certain states have decided to strengthen their concussion laws, proposing measures like increased education about concussion risks and requiring a doctor's approval before an athlete is cleared to resume activity. If passed, these laws could help protect athletes and potentially save lives in the long run.

According to Joseph Nordqvist (2015), athletes who suffer concussions during their athletic careers are still affected by the incident months later, whether they realize it or not. In addition, older players who suffered concussions earlier in their lives are also likely to feel the effects, showing symptoms that mirror both Parkinson's and Alzheimer's disease. The article ends by clearly stating that athletes who suffer multiple concussions after returning to their sport before adequate recovery are much more vulnerable to severe brain damage in the long run.


To conclude, I would say that students at my old high school, myself included, should probably have trusted our athletic trainers more than we did. Brain damage and injury are very serious matters, and it seems that any degree of concussion can be dangerous. Athletic trainers have the athlete's best interest in mind, and high school athletes should trust them.


The first image came from HERE.

The second image came from HERE.

Stick it to the Man

Medicine as a whole is arguably one of the most important advancements humanity has ever made. With that being said, medicine is not hard to get a hold of, and many people abuse it. But to the responsible consumer, medicine can help with everyday problems like headaches or sore throats.

For some reason, I seem to be the only person in my family who refuses to take medicine when I come down with common illnesses. My older sister gets migraines often, so it's safe to say she is an avid (but responsible) user of Advil. We always have medicine in the house, but whenever I get a little sick, I decide that I'd rather tough it out and wait for it to pass. This isn't because I think I'm some sort of tough guy, because I'm not. But over time I have developed the idea that attitude and mental toughness play a pretty big role in recovery and response to illness. By this I simply mean that I believe there are times when a person can overcome or minimize the effects of an everyday problem. With that being said, I obviously don't think that mental toughness is going to prevent anyone from getting the flu or breaking bones. But I do believe that if I take medicine less often, it will be more effective when I actually do take it, simply because my body isn't accustomed to the additional support.

I decided to address and attempt to answer the question of whether or not mental toughness and attitude play any role in response to sickness.


Wikipedia gives a basic definition of mental toughness that can be found HERE, and although this definition makes no direct mention of health, the article found HERE expands on the matter, discussing how increased mental toughness can help a person achieve their goals in health, business, and life. For example, someone who is mentally tough may remain frictionally unemployed for a shorter amount of time, or may stick to their diet more closely than someone who is not as mentally tough.

Psychologists in general believe in the importance of mental toughness. A psychologist named Gary Seeman wrote the article listed HERE, in which he advocates psychotherapy to develop the mental toughness needed to confront the ordeals life throws our way. So, while mental toughness may not influence health directly, it definitely plays an important part in everyday life and in one's ability to persevere through hardship.

As with alcohol, tolerance forms when a consumer uses a drug repeatedly, to the point at which larger doses are needed to deliver the same effect. The article listed HERE explains the difference between tolerance, resistance, and dependence. With that in mind, my stance on only taking medicine when absolutely necessary might not be completely justified. But even with this new knowledge, I most likely won't change my ways.




Bad Experiences

Hey everybody, my name is Patrick Winch, and I'm a freshman from Silver Spring, Maryland, about 20 minutes north of DC.

Personally, I've had many bad experiences with science classes. In high school, I took biology and chemistry my freshman and sophomore years. I didn't do too well in either, so I decided not to take a science junior year, but then had to take a sports exercise and health science class my senior year. So basically, I haven't even taken physics, which might be an issue. In addition, my mom is an English teacher and my dad is a musician. The sciences don't really run in my family, which is one of the reasons being a science major is more or less out of the question. But with that being said, I've always loved space. I watched Star Wars religiously as a child, and that love was revived with the new movie. When I was little, I wanted to be an astronaut, but that dream was crushed when I realized I wasn't very good at math or science. At NSO, when I was planning my schedule, my advisor recommended this class to me. I remember reading the description and immediately wanting to take the course.

Like I said before, space has always amazed me. I went to a Catholic middle school, and my eighth grade religion teacher taught my class syllogisms and proofs of God's existence. He was a proponent of the idea that science and religion work hand in hand, and one of the proofs he taught rested on the fact that the universe is expanding over time. He explained that if we were to go back in time, the universe would contract up to the point of nonexistence, the same point at which an external force would be needed for creation. In my opinion, religion and science are both very human in their own ways. Religion focuses on human connection with an intangible force or forces, while science focuses on the worlds around and outside of us and how we interact with them. So, with that being said, I hope this course allows me to deepen my understanding of the role of science in our world and how it relates to other areas of study.

I still watch Star Wars pretty often; in fact, I even joined the Star Wars club here at PSU. When I was younger, I never fully understood or appreciated the science behind the movies. The National Geographic article I found, which can be accessed HERE, explains some of that science that attracted me as a young boy and still amazes me today.