Monthly Archives: November 2016

Does height affect performance in basketball?

When I was younger I used to play a lot of basketball with my friends, and I realized one day that the smallest kid playing always seemed to get picked last regardless of his ability. This brings me to the question of whether height impacts performance in basketball. Across the NBA there are obviously more tall players than short ones, which gives many of us the impression that tall basketball players are better and more dominant, but that might not actually be true. In the history of the National Basketball Association there has been a player as tall as 7 foot 7 and a player as short as 5 foot 3, so maybe it's not about height as much as we all think.

Being Tall

According to Dr. Steven Halls, "Male basketball players tend to be 9 inches taller than the average male," which shows just how much size is a trend in this sport. There are many obvious reasons why being tall helps in a sport like basketball; for example, the rim is set at 10 feet, and the closer you are to it the easier it is to get the ball in the basket. A common phrase you hear in basketball is "You can't teach size," meaning that even a player who may not be that skilled already has size, which some people see as a benefit in itself and a head start toward making the player better. Stuart Kim, a professor and genetics researcher at Stanford University, agrees with many sports analysts and physicians that height correlates with being more injury prone, which is a disadvantage in basketball, especially when teams are paying these athletes so much money.

Being Short

Many people who watch basketball assume that being short is always a disadvantage, but believe it or not, being smaller in a big man's game may also have benefits. According to scientist David Robson, in sports, and especially basketball, the shorter player often has the upper hand: he has the quicker first step and is faster because it takes less time for nerve impulses to travel between the limbs and the brain. Players like Allen Iverson and Nate Robinson, both under 6 feet tall, have silenced all doubters and given shorter athletes confidence: Robinson, at 5 foot 9, has won 3 slam dunk contests, and Iverson, at 5 foot 11, is arguably one of the best NBA players of all time. Speed and quickness, as well as the ability to shoot better than big men, are all advantages; mixed in are the obvious disadvantages of being a shorter player, such as being farther from the basket, having a shorter arm span, and being unable to play "above the rim."

graph above

As the graph above shows, average height in the NBA has been on a steady incline, which suggests one thing: height does seem to be a winning factor and strategy for teams in the league.

Among the top 10 scoring leaders in the most recent NBA season, 7 players were above 6 foot 7 and 3 were below, and all 10 of the top rebounders were above 6 foot 8. These simple stats suggest that the null hypothesis (that height does not affect performance) would be rejected for the most recent season, and that height does affect performance in basketball. These are just examples of how, in basketball and especially the NBA, height is a necessity: the numbers do not lie, and season after season the gifted taller players dominate the major statistical categories.
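As a quick sanity check on this kind of claim, the 7-of-10 figure can be run through a simple one-sided binomial test. The 50% baseline below is a hypothetical assumption for illustration (the post doesn't report what fraction of all NBA players stand above 6 foot 7), so the resulting p-value is only a sketch of the method:

```python
from math import comb

# Hypothetical baseline: assume 50% of NBA players are taller than 6'7".
p_baseline = 0.5
n, k = 10, 7  # 7 of the top 10 scorers were above 6'7" (from the post)

# P(X >= 7) under the null hypothesis that scoring rank ignores height
p_value = sum(comb(n, i) * p_baseline**i * (1 - p_baseline)**(n - i)
              for i in range(k, n + 1))
print(round(p_value, 3))  # 0.172
```

Under that made-up baseline the p-value is about 0.17, so one season's top-10 scoring list would not by itself reject the null at the usual 0.05 level; the 10-of-10 rebounding figure, or several consecutive seasons pooled together, would be far stronger evidence.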

http://stats.nba.com/leaders/#!?Season=2015-16&SeasonType=Regular%20Season&StatCategory=REB

http://www.si.com/nba/photos/2014/10/08/nbas-tallest-and-shortest-together

http://www.livestrong.com/article/555710-how-to-use-shortness-to-your-advantage-in-basketball/

Classical studying

For the longest time, I have been jealous of everyone that can listen to their favorite music while studying or doing work. I’ve tried time and time again but somehow I always end up getting distracted by it or find myself accidentally writing the lyrics in the middle of my essays. One day I decided to try listening to classical music and discovered it was perfect for me. With no lyrics and just the sound of instruments playing along, I was able to actually get my work done in peace. I had heard several times that listening to classical music can actually improve your studying but never looked into this idea. In this blog, I will investigate whether this rumor is in fact true or false.

Check out Mozart, what a babe


While researching, I came across this study involving 249 students who were split into two groups and attended the same lecture; one group, however, listened to classical pieces by composers such as Mozart, Tchaikovsky, and Bach during it. When tested afterwards, the group that had listened to classical music scored higher than the group that did not.

According to this article, the reasoning behind this is that classical music's beat pattern stimulates both the right and left sides of the brain. As a result, the brain is able to retain more information easily. Not only that, but music can also increase your blood pressure and heart rate, allowing you to concentrate more effectively.

Another interesting factor that can play into improved studying is emotion. This report describes how classical music can stimulate the hypothalamus, a region in the brain that controls many functions, including emotional activity. I decided to add this in because there is a correlation between mood and motivation. If you find yourself in a good mood, you may also notice that your drive toward accomplishing something (such as studying or homework) increases as well. Of course, Andrew would point out that CORRELATION DOES NOT EQUAL CAUSATION, so it is safe to say that one may not necessarily cause the other, but it is not uncommon to see the two go hand in hand.

Maybe next time you study you'll consider listening to a little Mozart. If I had told you to try this based only on my own interesting experience, you probably wouldn't trust it as much, because that would just be an anecdote. However, my research shows that classical music can help when studying. So don't just take my word for it, try it out yourself!!


Sources:

sciencedirect.com/…/pii/S1041608011001762

classicalforums.com/articles/Music_Brain

sciencedirect.com/…/pii/S1053811905004052

Pictures:

MTE1ODA0OTcxNzMyNjY1ODY5.jpg

BRAN_MusicActivityPic.png

bach.jpg

Do Power Poses Work?


Frank Underwood (Kevin Spacey) doing a high power pose

One of the most popular TED talks of all time is Amy Cuddy's "Your body language shapes who you are", currently sitting at 37.7 million views. The talk is famous for Cuddy's term "power poses": body positioning that brings confidence, energy, and overall well-being. She breaks the poses down into high power (for example, standing up straight with your hands on your hips) and low power (for example, looking down and crossing your arms to protect yourself). Cuddy even cites a study breaking down the increase in risk tolerance, testosterone, and cortisol in participants who performed a power pose. Overall, it sounds like there is no reason not to be doing power poses. However, a recent study from the University of Pennsylvania is finding different results.


Low power poses

Penn researchers attempted to replicate the study performed by Amy Cuddy's group. They took 250 college-aged males and ran an experimental study that compared saliva samples between groups assigned to each power pose. Two experimental groups performed high power and low power poses respectively, while the control group performed neutral poses, meaning they sat in a natural position. After viewing faces on a computer screen for fifteen minutes, as in the original study, the men played tug-of-war and provided another saliva sample. Unlike the results Cuddy presented in her talk, the researchers did not find a noticeable difference between groups, leading them to reject their hypothesis. The study even suggested a possible harm in performing power poses, based on a comparison to a 1970s experiment with sparrows.


Amy Cuddy’s book on power poses

The replication from the University of Pennsylvania is not alone. An article from Slate cites another replication study by psychology researchers Joe Simmons and Uri Simonsohn with findings very similar to the Penn study's. Simmons and Simonsohn actually found a small decrease in risk taking below the baseline of Cuddy's original study. They concluded that if an effect from performing power poses exists at all, it would be very difficult to study and replicate, which suggests the original experiment had fundamental problems with how it was conducted. As Andrew showed us, a good experiment should be easy for other scientists to replicate in order to demonstrate consistency. Cuddy's experiment was difficult for researchers to replicate, and when it was replicated, the results were inconsistent with the originals.

Overall, power poses may not be as beneficial as millions of people were led to believe from Amy Cuddy's TED talk. The scientific backing does not support the original results, and some researchers even cite potential problems with the poses. Would it be logical for an SC 200 student to perform power poses? Based on the studies presented, I would say no. However, the brain is very confusing and human intuition is lousy, so if power poses make you feel more confident, energetic, or positive, they might still be worth trying.


Works Cited

Cuddy, Amy. "Your Body Language Shapes Who You Are." TED.com. TED Conferences, LLC, June 2012. Web. 30 Nov. 2016.

Gelman, Andrew, and Kaiser Fung. "Amy Cuddy's "Power Pose" Research Is the Latest Example of Scientific Overreach." Slate Magazine. Graham Holdings Company, 19 Jan. 2016. Web. 30 Nov. 2016.

"Power Poses Don't Help and Could Potentially Backfire." ScienceDaily. ScienceDaily, 28 Nov. 2016. Web. 30 Nov. 2016.

Could our society survive a zombie apocalypse?

Although I know we're going to talk about this in class soon enough, I find the topic of a zombie apocalypse as a conceivable event fascinating. We have all seen, if not heard of, the hit TV show The Walking Dead, and I'm sure many of us have watched other shows and movies about zombies. They're fictional, completely made up, dating back nearly two hundred years. So what makes it so intriguing that people take the threat seriously, or at least consider it a possibility that may occur in the future? I think people find it relevant because it comes from the genre of dystopian literature. Just as people are paranoid about Big Brother from 1984 becoming a reality, they see a kind of relevance in a zombie outbreak. Both are set in our world, cast with normal people, and include a single factor that changes everything. Whether or not a zombie apocalypse will occur won't be known until the moment Patient Zero emerges, but until then it is left for us to speculate on whether we could survive if it does. It is my belief that if it were thirty, maybe even twenty years ago, we would not be prepared in the least. However, in the present I think we would have a chance.

Why would we have a chance to survive a zombie apocalypse now, compared to two decades ago? The media. Zombie culture has grown exponentially in recent years as an overwhelming number of zombie-themed movies, television shows, video games, books, comics, and more have been released. A majority of these take place in the present or the near future, allowing people to feel connected to the story, following the paths of the characters and comparing the characters' actions to what they speculate their own would be if a similar event happened to them. Zombies themselves have also adapted over the years through their media sources; the movie Night of the Living Dead, released in 1968, was the first time zombies were depicted as undead cannibals. Before the 21st century, most zombies were depicted as eerie and slow-moving, incapable of anything faster than an awkward shuffle. Now, to make movies and shows more thrilling, zombies are given the ability to sprint after their prey and are often capable of feats that require great strength.

The changes to the original design of the creatures made them even more frightening to consumers. And what do people do when they’re frightened of something? When a child is afraid of monsters that hide in the dark, their parents tell them that the monsters are afraid of the light, and won’t bother the child when their nightlight is on. So in the case when society is the child and zombies are the monsters, who is the parent and what is the nightlight?

Society asked, and in 2011 the CDC answered. The Centers for Disease Control and Prevention posted a Zombie Apocalypse survival guide, including a list of supplies (water bottles, food, medication, sanitation, etc.) for an emergency kit, instructions for deciding on an emergency plan with your family, and information about joining their Zombie Task Force. Also in 2011, an article was published in the Journal of Clinical Nursing that includes instructions for nurses on how to care for patients with the zombie virus (referred to as Solanum), listing details for different stages of the disease (fever, coma, death, re-animation, etc.) and discussing the possibility of euthanasia for patients already infected with the virus. While the CDC post on the subject has some levity to it, the article in the nursing journal takes the threat very seriously, effectively going through all types of symptoms, the reasoning behind the illness, and even how to deal with the psychological problems of the patient's loved ones after the person has re-animated or been euthanized.

In 2009 a rather popular mathematical analysis was published, analyzing the possible outcomes of a zombie virus outbreak. In the report's conclusion, the authors state that unless immediate action is taken to prevent the spread of the disease, such as treatment or complete eradication of the virus carriers or virus source, a zombie virus has the capability to destroy civilization, leading to a world not far from those in media depictions. The report is referenced in several of the other sources I looked at, and seems to be acknowledged as the basis of prediction for the outcome of a possible zombie apocalypse.
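The model in that 2009 paper is a set of differential equations in the spirit of an SIR epidemic model, with classes for susceptibles (S), zombies (Z), and removed/dead (R). Below is a minimal Euler-method sketch of that kind of SZR model; the parameter values are illustrative guesses, not the paper's:

```python
def simulate_szr(beta=0.0095, alpha=0.005, zeta=0.0001, dt=0.01, steps=1000):
    """Euler integration of a simple SZR outbreak model."""
    S, Z, R = 500.0, 1.0, 0.0  # 500 humans and one Patient Zero
    for _ in range(steps):
        dS = -beta * S * Z                             # humans bitten
        dZ = beta * S * Z + zeta * R - alpha * S * Z   # new zombies + risen dead - destroyed
        dR = alpha * S * Z - zeta * R                  # destroyed zombies, minus the risen
        S, Z, R = S + dS * dt, Z + dZ * dt, R + dR * dt
    return S, Z, R

S, Z, R = simulate_szr()
print(S < 1.0)  # True: with no intervention, the human population collapses
```

The qualitative conclusion matches the paper's: because the zombie class in this model only ever gains members (the dead can rise again), the susceptible population is driven toward zero unless an eradication or cure term is added.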

As of yet, only hoaxes of zombie viruses have been reported, and the closest nature has gotten to something similar is in cases such as this parasite, which infests the bodies of honeybees and causes the bee to act unnaturally before its death. There has yet to be, to my knowledge, a virus in humans, animals, bugs, or any kind of life that reanimates the victim after the initial death.

In conclusion, with the fear of zombies from recent media inciting a desire to be prepared in the potential event of a zombie apocalypse, I think that people would be able to react quickly enough to prevent a drastic spread of the virus. The mathematical analysis claimed that a cure or total eradication is necessary in order to completely stop the disease, which I believe our society may be capable of, given the CDC's official guide for surviving an outbreak and people's general knowledge of zombies from the media. It seems silly to write all of this in a serious manner, but the fact that other people debate it earnestly as well, such as the authors of the nursing journal article, only proves my point that while such an outbreak seems implausible, we as a society would be prepared nonetheless.

Sources:

Wiki Zombie Films

CDC Blog Post

Zombie Task Force

Nursing Journal Article

Mathematical Analysis

Honeybee Parasite

Pictures:

Night of the Living Dead

Zombie Girl

Latent Infection Graph

The Walking Dead

Ice Melt

Living on Long Island, New York, I have seen my share of hard winters. The endless shoveling and clearing off the car is almost a staple routine. My mom has a really bad ankle and is always TERRIFIED of slipping on the ice, so recently my parents have resorted to rock salt. But there have been so many claims of it being unsafe. I have my doubts and concerns about it, but first of all, how does it even work? My hypothesis is that ice melt is highly dangerous to our health if touched or even ingested.

According to an article I found here, when you add salt to ice it lowers the freezing point of the water. If you sprinkle salt on your driveway, sidewalk, or road, the lower freezing point means the ice will melt into liquid water at temperatures where it would otherwise stay frozen. This is why salt causes ice to melt.


(picture found here)
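Numerically, the effect the article describes is freezing-point depression, ΔT = i·Kf·m. A rough sketch, using NaCl's van 't Hoff factor i = 2 and water's cryoscopic constant Kf = 1.86 °C·kg/mol (both standard textbook values):

```python
def freezing_point_c(moles_salt, kg_water, i=2.0, kf=1.86):
    """Approximate freezing point of salty water in degrees Celsius.

    i: van 't Hoff factor (NaCl dissociates into Na+ and Cl-, so i = 2)
    kf: cryoscopic constant of water, 1.86 degC*kg/mol
    """
    molality = moles_salt / kg_water   # moles of salt per kg of water
    return 0.0 - i * kf * molality     # pure water freezes at 0 degC

# One mole of rock salt (~58 g of NaCl) per kilogram of melt water:
print(round(freezing_point_c(1.0, 1.0), 2))  # -3.72
```

This is an idealized dilute-solution formula; real rock salt on a real driveway is messier, but it shows why a salted surface can stay liquid several degrees below freezing.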

Now that we understand a little of the science: is it safe? According to an article found here, ice melting chemicals usually contain a number of different ingredients in order to do the job. Usually, the mix contains rock salt, potassium, calcium chloride, urea, and magnesium chloride. Now, what we are all wondering is how dangerous this combination is. Basically, if it touches the skin (or paws, for our furry friends), you may find some rash or dryness in the area. This is not too serious, but I could see how it could bother our pets. Poison control seems to receive a ton of calls each winter season due to young children accidentally ingesting ice melt. Basically, the child might experience some intestinal discomfort, but it doesn't seem life threatening. A remedy poison control recommends is to wipe out your mouth and drink a glass of milk. I think the media suggests severe damages and repercussions, which makes the market for "safe" or "pet friendly" ice melt so popular.

After reading another article, I found that it is highly unlikely for your pet to eat enough of this salt to die or even get sick. It is crazy to me how much press this topic gets when in reality these scenarios are highly unlikely. The bottom line is, we know chemicals are harmful to our health and shouldn't be played with, but if an accident were to occur, the effects would NOT be detrimental to us or our pets.


(Picture found here)

The Predestined Termites

As a stupid person, I was completely exposed, along with my kind, by Lewis Terman, the founder of the IQ test.

In the early 1920s, Terman, a psychology professor at Stanford, created a test to determine how smart a person was. The test soon became known as the Intelligence Quotient test (or IQ test).

Terman’s hypothesis, and subsequent conclusion, was that IQ is the primary determinant of a person’s future success. The study has been referred to as the Genetic Study of Genius.

An article by the New York Times details how the study is ongoing and is one of the longest-running longitudinal studies ever conducted.

The Study (as reported by the NYT):

  • Terman selected 1,521 students from schools throughout California who had IQs over 135, which is well above average intelligence. These kids came to be referred to as Termites.
  • Every five years (but far more frequently in the initial years), a researcher would check in on the subjects' lives, tracking factors such as their relative success, personal lives, and happiness.
  • It was classified as an observational study and didn't have any controls, although the data and perceptions of success could give the researchers the baseline comparisons needed to draw a conclusion.
  • The study is still ongoing. The "Termites" are currently in their 80s.

The Results:

  • About two years after the study was initiated, Terman concluded that these gifted students were demonstrating success in their academic and personal lives.

Even though the test was supposed to track the success of the subjects, the lifelong study revealed much more data than initially anticipated.

For example, it was found that the average lifespan of subjects with divorced parents was 76, whereas the average for subjects whose parents stayed together was 80.

Even though the study yielded important results that went far beyond the testing, critics have picked apart Terman’s conclusion. Here are the general issues with the conclusion (some brought up by the NYT):

  • Terman's involvement: The professor formed a relationship with many of the subjects, which may have skewed some of the data. Terman allegedly used his Stanford connections to help some of the subjects get into the prestigious university.
  • Sons of professors, the third confounding variable: Professor parents, whose children score higher on average due to the genetic component of IQ, typically provide an educated, steady, upper-middle-class household for a child to grow up in. This may skew the data, making the results more a reflection of socioeconomic status than of IQ.
  • Reverse causation: Could the subjects' success influence the IQ scores? This article points to that possibility. IQ scores are typically erratic when taken at a young age, which means the subjects' scores may have been inflated when they were young. A false genius? A little Einstein impostor? It is unlikely, but it cannot be ruled out based on the way the study was conducted.

It would also be interesting if Terman had told a certain faction of kids with IQs below 135 that their score was above 135, creating a kind of placebo effect. Maybe the confidence of being labeled a genius would propel those subjects toward more intense learning and higher pressure for success. Even though this would raise ethical issues, because lying to a child so they carry a lifelong incorrect perception of themselves is generally frowned upon, the results would make the data carry more weight.

Beam Me Up Scotty

You have no idea how many times in my life I have wished that teleportation could be a reality. I love to travel, as in I love being where I'm traveling to, but the traveling itself has gotten a bit tedious for me. I'm sick of long car rides, and I hate the struggle of airports. It would be incredible if I could push a button on a little teleportation device, or hop into a teleportation machine, and have it send me wherever I want to go in barely any time at all. Unfortunately, this technology doesn't exist, but we are on the way there! I was curious, so I tried to figure out what sort of research is being done on teleportation technology.


I found one research paper reporting that a group of European researchers managed to successfully complete a quantum teleportation of photons across a distance of about 88 miles between two Canary Islands! This article discussed how the feat was accomplished, reported just a few short days after Chinese researchers reported teleporting photons about 60 miles. In quantum teleportation, information can be sent between two locations with the help of a previously shared relationship (entanglement) between the sending and receiving locations.

It’s taken years to even get to this point, and we’re only just now figuring out how to send tiny little photons very short distances. It took the European researchers close to a year just to develop systems that could withstand the weather conditions they would encounter. They also used single-photon detectors to detect the information teleported across the two points, and synced the clocks between these two points.

Both of these successful attempts at quantum teleportation have set the basis for future attempts. We now know the fundamentals behind quantum teleportation, and we can really only move forward from here. However, as this article says, neither of these reports has been peer-reviewed, which means future reports may show slightly different results. The reports are essentially proof of concept that quantum teleportation can actually be used.

Keeping all of this in mind, we can still hold high hopes that researchers will accomplish some incredible things in quantum teleportation in the coming years. We're one step closer to a real-life Star Trek…

Sources:

Source 1

Source 2

Source 3

Picture Source

Vicks Vapor Rub on… Feet?

It seems that I have been chronically sick since coming to college. I was telling my mom my symptoms and she mentioned putting Vicks vapor rub on my feet with socks over them. I was shocked and grossed out, but curious to see if it would actually work. As far as I can tell, for that one night I did not cough as much. This could have been due to chance or to an actual effect. So, I wanted to look into this and see if there's evidence.

In an article I read, it talks about the fact that some people swear by this method; however, the article explains that it has no supporting scientific evidence. Vicks rub contains menthol, which tricks the brain into thinking that airways are clear, making you feel less congested. Because of this, Vicks is typically rubbed onto the chest so that it is close enough to your nose to be inhaled. Rubbing it on your feet does not allow the menthol to be inhaled because of the distance from your airways. Even so, as previously stated, some swear that rubbing it on your feet helps the common cough.

If I were to run a trial, I would do a randomized double-blind trial with:

Group 1: No Vicks on feet or chest for 8 hours of sleep

Group 2: Vicks only on chest for 8 hours of sleep

Group 3: Vicks only on feet for 8 hours of sleep

All involved would need to be sick with the same cold or bronchitis, since the illness affects chest congestion. Then, I would see which group showed the most improvement, with less coughing and less congestion.
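The assignment step for a trial like this can be sketched in a few lines. This is only an illustration of randomization; the participant labels and group sizes are placeholders (and note that a Vicks trial is hard to truly blind, since participants can smell the rub):

```python
import random

def assign_arms(participants, n_arms=3, seed=None):
    """Shuffle participants, then deal them round-robin into equal-sized arms."""
    rng = random.Random(seed)
    shuffled = list(participants)
    rng.shuffle(shuffled)
    arms = [[] for _ in range(n_arms)]
    for idx, person in enumerate(shuffled):
        arms[idx % n_arms].append(person)
    return arms

# 30 hypothetical subjects split into the three arms described above
arms = assign_arms([f"subject{i}" for i in range(30)], seed=42)
print([len(arm) for arm in arms])  # [10, 10, 10]
```

The shuffle is what makes the trial randomized; keeping the assignment list hidden from both the subjects and whoever scores their coughing is what would make it double-blind.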

Takeaway

There's no harm in trying this unconventional method; however, be aware that it might not be the best method for curing your cough.

Health Benefits of Chocolate

Ever since I was little, I have gotten terrible headaches. Although they have become less frequent over the years, they are still unbearable, and I was never able to find something that would alleviate the pain. I decided to research different remedies that could help get rid of headaches, and one particular remedy was chocolate. As strange as it seems, many people believe that chocolate has health benefits that can ease the pain of a headache. As much as I wish that chocolate directly cured headaches, considering my love for chocolate, there is a good chance that it does not. I decided to do some further research on this topic; the null hypothesis is that chocolate does not relieve headaches, and the alternative hypothesis is that chocolate does relieve headaches.

According to an article on Live Strong, the cocoa bean contains antioxidants that get rid of free radicals. Merriam-Webster defines a free radical as a reactive group of atoms. These atoms can cause blood vessels to become inflamed, which leads to headaches and migraines; therefore, eating chocolate is one possible way to reduce inflammation and pain (Bond 2015). Dark chocolate would be the most effective because it contains the highest amount of cocoa powder. In addition, people with diabetes can sometimes get hypoglycemic headaches from low sugar intake, and eating chocolate has been shown to reduce the pain of these types of headaches (Bond 2015). These findings still do not mean there is a direct link between eating chocolate and relieving headache pain. In another article, ABC News explains how one study conducted on rats supported the same theory: the cocoa powder in chocolate reduces inflamed cells, which in turn reduces headache pain. However, they also explain that many other studies cannot establish whether chocolate can or cannot relieve headaches (Chitale 2008).

Another theory is that chocolate can actually trigger headaches. In a double-blind placebo study conducted at the University of Pittsburgh, researchers examined whether chocolate can cause headaches and migraines. The study was restricted to women who had chronic headaches. In this double-blind trial, the women were randomly given two samples of chocolate and two samples of carob, a healthy substitute for cocoa (Merriam-Webster). The study found that chocolate was no more likely to start a headache than the carob (Marcus, Scharff, Turk, and Gourley).

So in conclusion, there is no definite proof that chocolate can relieve headache pain, and there is also no solid evidence of chocolate provoking a headache. The main takeaway is that although chocolate is not a sure way to stop the pain of a headache, it wouldn't hurt to eat a little piece the next time you have one, just to try it.

Sources:

http://www.chocbox.co.uk/media/catalog/category/chocolate-bar.png

http://www.merriamwebster.com/dictionary/free%20radical

http://www.livestrong.com/article/428310-why-does-chocolate-alleviate-aheadache/

http://abcnews.go.com/Health/MindMoodNews/chocolate-studies-headache/story?id=8530685

https://www.ncbi.nlm.nih.gov/pubmed/9453274

http://www.merriam-webster.com/dictionary/carob

Sleep Apnea: How Dangerous is it?

Image 1

I think we can all agree that sometimes there is truly nothing more satisfying than a good night's sleep. When we are asleep, we feel safe and protected, snuggled under a soft blanket and lost in a dream as our brain and body are given a chance to relax and rejuvenate. Sleep is vital for an individual's health, but when your sleep cycle is constantly disrupted there can be many dangerous consequences. Sleep apnea, a chronic disorder, interrupts sleep as one's breathing pauses for up to 20 seconds at a time. This causes sufferers to snore, constantly wake up feeling short of breath, and choke or gasp for air in the middle of the night.

Sleep apnea happens when the airways to the lungs are blocked or the soft tissue in the upper airway of the throat collapses, which restricts oxygen from flowing throughout the body. A person's diaphragm and chest are forced to work harder in order for the lungs to receive oxygen. The brain and lungs don't receive enough oxygen, and it becomes difficult to breathe because the brain doesn't signal when it is necessary to breathe. The majority of people with sleep apnea are overweight, middle-aged men with a history of excessive drinking and smoking. Other risk factors that could potentially lead to sleep apnea include sleeping on your back and enlarged tonsils. Sleep apnea isn't just associated with men; nine percent of women and five percent of children are diagnosed with it.

Something as seemingly minuscule as snoring is perceived to be a minor problem, but sleep apnea is an extremely dangerous disorder. Since the airways are blocked, blood vessels have a difficult time supplying the heart. This increases the risk of high blood pressure, which leads to heart disease as well as strokes. Those with obstructive sleep apnea also become over-tired because they aren't getting effective sleep, so they have poor concentration, weaker immune systems, and greater difficulty learning.

Image 2

A study was conducted by Daniel J. Gottlieb to discover whether sleep apnea is linked with heart failure. This longitudinal observational study is quite reliable considering the large sample size of 1,927 men and 2,495 women over 40 years old, followed for nine years. The null hypothesis states that there is no relationship between coronary heart disease and obstructive sleep apnea; the alternative hypothesis is that individuals with obstructive sleep apnea have a greater risk of heart failure. In this study, the null hypothesis was rejected because men with sleep apnea were 58 percent more likely to have heart failure, attributed to nocturnal blood pressure elevation during apnea episodes. The study revealed that middle-aged men with sleep apnea had a greater risk of cardiovascular problems and heart failure than women did.

Another study revealed how sleep apnea negatively affects a child's cognitive potential. Researchers sampled 19 kids with obstructive sleep apnea and 12 kids who were healthy sleepers in order to see whether kids with sleep apnea have problems learning, focusing, and making decisions. After the kids were given neuropsychological tests, researchers concluded that brain cells had been damaged, as there were severe changes in metabolites inside the brains of kids with obstructive sleep apnea. This caused those kids to perform worse on IQ tests as well as decision-making tests. An interrupted sleep cycle has negative effects on the learning of children and ultimately their future success. Obstructive sleep apnea alters their decision making and has been linked to an increased chance of ADHD as well as behavior problems. This causes them to get easily distracted and makes it hard to focus, which impacts them tremendously in school. The control group in this study had no patients with ADHD, a factor that could have helped in determining a stronger correlation between sleep apnea and ADHD.

Sleep apnea may be overlooked and underdiagnosed, but it is a very dangerous disease that people must be aware of. Snoring, daytime fatigue, quick wake-ups, and restlessness during sleep may seem insignificant and harmless, but they can signal something extremely dangerous. Since there is a correlation between sleep apnea and body weight as well as overall health, this disease is to an extent preventable: lifestyle changes like eating healthier and reducing the use of alcohol and drugs will lower the chance of developing obstructive sleep apnea.

Image 3

Citations

www.helpguide.org/articles/sleep/sleep-apnea.htm

www.medicalnewstoday.com/articles/178633/.php

circ.ahajournals.org/content/122/4/352.full

http://www.medicaldaily.com/sleep-apnea-children-effects-may-include-adhd-learning-problems-244833


Coffee: Can it Prevent Diabetes?


Image 1

Before I entered college I wasn't a coffee drinker. As refreshing as an iced coffee looks, it never satisfied my tastebuds. Frankly, I never understood the huge craze for coffee and why people would wait on a long line and pay the price for that little cup of coffee each and every day. After a few long, rough weeks in college, I realized it provides my body with beneficial compounds like antioxidants and boosts the hormone adiponectin, which may benefit my health in the long run and help protect me from diseases like diabetes.

So what even is diabetes? Type 2 diabetes is a widespread disease that over 29 million people are diagnosed with. Diabetes disrupts the regulation of glucose within the body because the cells don't allow glucose to enter them. People with diabetes have an insulin deficiency or insulin resistance, so their bodies don't produce or absorb the correct amount of sugar. Insulin is the hormone that regulates how much sugar can be absorbed by the body's cells. If there is too little or too much sugar inside the body, the central nervous system is weakened because the brain and muscles don't have enough energy to function properly. Diabetes is extremely dangerous because it can lead to future problems like kidney failure, heart failure, and strokes.

Image 2

A longitudinal observational study published by the American Diabetes Association tested whether increasing the amount of coffee women drink reduces their risk of developing diabetes. 88,259 women who didn't have diabetes, ages 24 to 46, were followed to see whether coffee intake would actually lead to a lower chance of getting diabetes. This observational study included a large sample size and a high follow-up rate on the participants over 10 years. These factors make the study fairly reliable and cancel out a great deal of the bias that can be associated with this type of study. The null hypothesis was that there is no relationship between coffee and the chances of getting diabetes. The alternative hypothesis was that there is a correlation, and that when coffee is consumed the chances are lowered. The study did have possible confounding third variables that could have produced a relationship between coffee and diabetes, for example body mass index, genetics, smoking, and the consumption of alcohol. After the data was analyzed, the results illustrated that the more cups of black coffee a woman drinks, the more her risk of developing type 2 diabetes is reduced. The relative risk when a woman drinks one cup of coffee a day is .87, drops to .58 at 2-3 cups, and drops to .53 at 4 cups, compared with non-coffee drinkers. The explanation for these statistics is that the more coffee consumed, the better glucose tolerance a person has. These people usually have a blood glucose level under 140 mg/dL, as opposed to around 200 mg/dL for people diagnosed with type 2 diabetes.
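To put those relative-risk numbers in perspective, here is a quick sketch that turns each one into a percent reduction compared with non-coffee drinkers (the .87/.58/.53 figures are the ones reported above; the code itself is just illustrative arithmetic):

```python
# Relative risk of type 2 diabetes vs. non-coffee drinkers,
# using the figures reported in the study above.
relative_risk = {
    "1 cup/day": 0.87,
    "2-3 cups/day": 0.58,
    "4+ cups/day": 0.53,
}

for cups, rr in relative_risk.items():
    reduction = (1 - rr) * 100  # percent lower risk than non-drinkers
    print(f"{cups}: {reduction:.0f}% lower risk")
```

So one cup a day corresponds to roughly a 13 percent lower risk, and four cups to nearly half the risk of a non-coffee drinker.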

In a 2014 study published in Diabetologia, Shilpa N. Bhupathiraju tested the subsequent risk of type 2 diabetes in both men and women after changes in coffee consumption. 48,464 women and 27,759 men were followed over a 4-year period. Participants who increased the amount of coffee they drank per day by one cup reduced their risk by 11 percent, while those who decreased their coffee intake by a cup a day actually increased their relative risk of developing diabetes by 17 percent. The ongoing question "does coffee reduce the risk of developing diabetes" doesn't fall under the Texas Sharpshooter Fallacy or the File Drawer Problem, because no data is being ignored or hidden and no false conclusions are being drawn: there are over 28 published studies on whether coffee has an effect on diabetes.

If someone is looking to make a lifestyle change in order to try to prevent type 2 diabetes, they must live healthier and cut back on unhealthy eating and drinking. Increasing their coffee intake could reduce their risk of diabetes. It is important to drink black coffee because it is healthier and works more efficiently than coffee with sweetener and sugar. Patients who have type 2 diabetes have a greater amount of serum amyloid, an inflammatory protein in the blood. Coffee drinkers have less serum amyloid in their blood, which may help explain how coffee can reduce one's risk of developing type 2 diabetes.

Image 3

Citations 

http://link.springer.com/article/10.1007/s00125-014-3235-7

http://care.diabetesjournals.org/content/29/2/398

http://www.medicinenet.com/script/main/art.asp?articlekey=17982


Is a School Uniform Policy Beneficial?

Image 1

In high school, I was given the freedom to dress in any style I wanted. On any given day, I could wear anything in my closet as long as it wasn't offensive to my teachers and classmates. The way a person dresses reveals how they want to be perceived and determines how people respond to them. It also shows their interests, their style, and their background. Every day, students wear inappropriate, revealing, racist, or embarrassing clothing that may lead to bullying and causes problems among students and across the school district. The way we dress has a big influence on a school's environment and on how disciplined a student is.

School is a place of learning where kids and teenagers are taught social skills, behavior skills, proper dress attire, fundamentals, and most importantly education. These lessons are essential when these students grow up and enter college and then the workplace. School uniforms in public schools are being adopted across the United States at a higher rate than ever before. The percentage of public schools with uniform policies increased from 12 percent in 2000 to over 20 percent in 2014. Schools are now implementing uniform-based dress codes because of the positive effects on the community as well as among the students. For example, just two years after enforcing a uniform policy in Long Beach, CA, officials found students were safer because there was less crime going on. The rates of sexual assaults, battery charges, robberies, drug possession, and vandalism were all cut substantially. When all of the students are dressed in an appropriate uniform, there are fewer sexual attacks than if students were wearing inappropriate or degrading clothing. Students are also less likely to conceal weapons under big, baggy clothes when they are wearing a uniform. Uniforms also make it easy to keep track of students and difficult for intruders to get inside the school. When an environment is safe, students are kept off the dangerous streets and are able to put more focus on school and learning.

Image 2

In 2009, Elisabetta Gentile from the University of Houston tested how school uniforms affect students' behavior and academic success in school districts around the southwest. The data for this observational study was collected from 1993 to 2006 and included test scores, attendance records, and disciplinary records. In only 5 years, attendance trended upward, rising from 93 percent to 96 percent. Attendance among girls also increased 7 percent once the uniform policy was in place. Students did show a small increase on language exams, between .02 and .04 standard deviations. Disciplinary actions didn't decrease, but the reason for this was most likely harsher rules within the building as well as new uniform infractions.

Image 3

school-uniforms-5-728

A study conducted by Virginia Draa of Youngstown State University found a positive correlation between uniforms and improvements in graduation, attendance, and behavior rates among six big-city Ohio schools. She saw that when students were required to wear a uniform to school, their graduation rates increased by 11 percent, whereas non-uniform public schools saw graduation rates drop 4 percent. In 4 of the 6 schools, attendance rates were also up 3.5 percent. Since only 6 schools were tested and only 2 didn't show an increased attendance rate, that result could simply have been due to chance; there is no controlled explanation for why those 2 schools' attendance rates didn't go up.

It’s obvious that throwing on khakis, dress shoes, and a button-down or polo isn’t going to magically make students more intelligent or make them perform better in school. But it will be beneficial to the community and the other members of their school. Giving students too much freedom in their dress code can cause unnecessary problems, fighting, and stress. When there is less violence, bullying, and peer pressure, students feel safer attending school. Uniforms are beneficial most of all in poorer and more dangerous areas where crime is high. In the future, implementing uniforms in cities around the world would greatly benefit students.

Citations

http://www.cleveland19.com/story/4356460/study-says-school-uniforms-might-help-attendance-graduation-rates.

www.uh.edu/econpapers/RePEc/hou/wpaper/2009-03.pdf

www.schooluniforms.procon.org


Live High, Train Low

Image 1

In order to be an Olympic athlete, you must be willing to put your body through the most challenging and uncomfortable conditions. These elite athletes alter their training and living conditions in order to give their bodies an advantage during competition. The United States Olympic Training Center is located in Colorado at an altitude of roughly 8,000 feet above sea level. Olympic athletes who want that "edge," or want to improve their performance in their sport, live here in preparation for the Olympic Games. They follow the training method Live High, Train Low, in which the body must acclimate to high altitude while living there, and training is done at lower altitudes.

When a person breathes, oxygen enters the lungs and then diffuses into the red blood cells, but at higher altitudes the air is thinner, which makes it more difficult to breathe because there is less oxygen per volume of air. When the body can't get enough oxygen into the lungs, the hormone erythropoietin triggers the production of more red blood cells to carry oxygen to the muscles. Athletes want to produce more red blood cells, and with them more hemoglobin, because they want to increase the oxygen-carrying capacity of the blood, which then fuels the muscles with more energy. When there are more red blood cells in the body, more oxygen can be delivered, which increases performance and maximizes training intensity when an athlete moves to a lower altitude. This training method has positive effects on the body, for example a lower heart rate and lactate level, increased hematocrit, and increased mitochondrial density.

The Journal of Sports Science and Medicine published a study whose aim was to see how living at high altitude and then training at low altitude affects the blood characteristics and race times of triathletes. The study limited bias, as both men and women followed the same protocol over the course of 17 days and there was also a placebo group. The researchers found a 3.2 percent increase in hemoglobin mass. The results also revealed a positive shift in the lactate-power curve, so the athletes could sustain a faster speed. Times improved 2.8 percent, which does illustrate that following this training method is advantageous for a serious competitor; Olympic athletes compete on the highest stage, where every second matters.

Image 2

In a 1998 literature review, Dr. Baker wanted to find out the effects on athletes of living high and training low. She focused on five studies of athletes who lived at high altitude, where the air was thin, and then trained close to sea level. In every study, she found an improvement in the times of runners who followed this training method. The improvement was only 2-3 percent, but for an elite athlete that is the difference between winning and losing. In one experiment of 22 runners, 54 percent of them responded positively to living at a high altitude and then training at a low altitude. From these studies it is easy to conclude that this method provides advantages for runners who want to improve their speed and endurance. When the body finally adapts to the thin air, it can increase its oxygen-carrying capacity, which allows the athlete to run at a better pace for a greater amount of time.

The training procedure that requires athletes to live at high altitude and then train at low altitude, with more red blood cells and oxygen, is beneficial to a small degree. It must be noted that the improvements are so small that a normal athlete who doesn't compete at the highest level shouldn't make a major life change to follow this training method. It is feasible and worthwhile for the best athletes in the world who are looking for that gold medal. Maximum strength and power sports won't reap the benefits, but this type of procedure can significantly help runners, cyclists, swimmers, skiers, and athletes in some other endurance sports.

Image 3


Citations 

http://altitudetraining.com/main/sports/research/PracticalApproachBurke

http://www.medicinenet.com/erythropoietin/article.htm

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3772580/


Can drinking water improve your grades?

Staying hydrated is one of the most important habits in human health.  But we've all had days where we somehow just forget to drink enough water.  The next morning is when we wake up with chapped lips and suddenly attempt to re-hydrate.  This morning was one of those times for me, except not only did I wake up with a dire need to quench my thirst; it felt as though my brain was foggy for the entire morning.

This got me thinking.  Could hydration mean something more than physical wellness?  Could drinking enough have a direct correlation with mental health?

According to a study I found on hydration and cognitive function in children, this might just be a correlation that is completely underappreciated in everyday human health.


My question is whether x variable (drinking water) has a positive, negative, or no effect on y variable (cognitive function).

The first study I looked at focused on different degrees of dehydration: none/mild, moderate, and severe.  At birth, humans are composed of 75% water; by late childhood/early adulthood, we have lost nearly 25% of this water.  In the average human, mild dehydration can mean as little as a 2% loss of body weight in water, and far too often we experience this 2% loss.  Mild dehydration can decrease performance on tasks as simple as short-term memory and psychomotor skills, or on tasks such as arithmetic and perceptual discrimination.  In this study, research participants were deliberately dehydrated through either heat exposure or treadmill exercise.  After observing participants post-dehydration, researchers found that participants' levels of concentration and tracking performance were significantly lower than before the dehydration process.  They also discovered that participants showed signs of increased tiredness and headaches.  Additionally, participants were asked to identify the color of an object; reaction time for this test increased post-dehydration, though participants still answered correctly.  What is notable about this study is that cognitive function only continued to decrease as participants were further dehydrated.

In a second study that I found, researchers deprived participants of fluids for a total of 28 hours.  In the average participant, this caused a 2.6% loss in body mass.  Researchers were unable to note a clear decrease in cognitive function, but the volunteer participants self-reported an increase in tiredness and decreases in concentration, effort, and overall alertness.


By looking at both of these studies, it can be concluded with a good deal of certainty that failing to stay hydrated, even within a single day, can decrease cognitive function.  I would argue that X and Y have a strong negative relationship.

What’s the practical takeaway then?  Stay hydrated, kids.  As we approach the end of the semester, keep a water bottle with you and refill it throughout the day as you continue to drink water.  Staying hydrated won’t only keep you physically healthy, but it will absolutely help your mental/cognitive state.  My high school crew coach taught me a rule that I do my best to live by: drink half your body weight (in pounds) in fluid ounces every day.
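Just for fun, my coach's rule can be written as a tiny bit of arithmetic (the function name is my own, and this is a rule of thumb, not medical guidance):

```python
def daily_water_ounces(weight_lbs: float) -> float:
    """Crew-coach rule of thumb: drink half your body weight
    (in pounds) as fluid ounces of water per day."""
    return weight_lbs / 2

# e.g. a 160 lb student would aim for about 80 fl oz a day
print(daily_water_ounces(160))  # 80.0
```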

Who knows though?  Staying on top of your hydration game might just make the difference in that borderline A!

 

D’Anci, Kristin E. “Hydration and Cognitive Function in Children.” Nutrition Reviews 64.10 (2006): 457-64. Web. 30 Nov. 2016.

Lieberman, Harris R. “Hydration and Human Cognition.” Nutrition Today 45.6 (2010): 33-36. Web. 30 Nov. 2016.

 

Photo 1 Link

Photo 2 Link

Should You Take That Drink?

When a student enters college, finally leaving their parents after 18 years, they instantly begin their journey to becoming an adult. College reveals a new world that gives kids a chance to make a new name for themselves as they express their feelings and beliefs. They are surrounded by a new group of diverse individuals and must adapt and develop friendships and beneficial relationships. These freshmen are given new responsibilities and challenges that they must deal with in order to finally prove their independence and maturity. One of the hardest adjustments for students is the social aspect of college. Let's be real: in college, kids like to party, and at these events they are usually drinking. Because of the fun and excitement, drinking and partying can sometimes take over a student's life, and they begin to put their priorities and schoolwork aside in order to go out and drink with their new friends. Unlike in high school, their parents can't influence their decision making and prevent them from consuming alcohol at such a high rate. So this brings me to my question: is there a correlation between a college student's grade point average (GPA) and how much they drink?

Drinking alcohol is extremely common among college students and has been for a long time. It is easy for college kids to drink whenever they want because they don't have parental supervision at all times. The more time these kids spend drinking, the less time they are sober, which means they will be studying a lot less. If they are less prepared for their classes and exams, they most likely aren't going to do as well as they could have. Waking up hungover can cause a student to lose motivation and energy, which can negatively affect their work ethic and study habits. According to the National Institute on Alcohol Abuse and Alcoholism, a student who was intoxicated the night before is 5 times more likely to miss class the following day than a student who was sober. Alcohol also has many negative effects on the body. For example, the areas of the brain responsible for learning, memory, verbal skills, vision, and cognition are all affected. The heart is affected as heart rate increases and blood pressure rises. Cancers of the liver, breast, mouth, and throat can also develop at a higher rate with heavy consumption of beer and liquor. Alcohol also increases the risk of sexual assaults, campus violence, death, dropouts, and the spread of sexually transmitted diseases. All these factors illustrate the negative consequences of drinking alcohol and how it could impact not only a college student's GPA but their overall health.

Image 1

Knowing all the dangers of alcohol consumption, one might expect an inverse relationship between alcohol intake and GPA (the more one drinks, the lower the GPA). I read a study conducted by Jill Coymen testing how alcohol use affects the academic success of students at the University of New Hampshire. In the study, her independent variable was alcohol use and the dependent variable was academic performance. She sampled 223 students (59 percent female, 41 percent male, 85 percent white) with 74 survey questions about drinking habits and academic success. The null hypothesis in this study was that there is no difference in GPA between students who drink alcohol and students who don't. The alternative hypothesis was that students who abuse alcohol will have lower GPAs. After analyzing all of the data, Coymen discovered there was no correlation between alcohol abuse and college students' GPAs; therefore we fail to reject the null hypothesis. 67 percent of the students drank 2 or more nights a week. A majority of the students had between a 3.0 and a 3.5, which are very satisfying grades in college. Even students with a 3.5 reported drinking 3 days a week.
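To make the hypothesis-testing language concrete, here is a sketch of how one could compare mean GPAs of two groups with a Welch t-statistic using only Python's standard library. The GPA numbers below are made up for illustration and are not from Coymen's data:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t-statistic for two independent samples."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    return (mean(a) - mean(b)) / math.sqrt(va + vb)

# Hypothetical GPAs: frequent drinkers vs. non-drinkers
drinkers = [3.1, 3.4, 2.9, 3.5, 3.2, 3.0]
non_drinkers = [3.3, 3.0, 3.4, 3.1, 3.2, 3.5]

t = welch_t(drinkers, non_drinkers)
print(round(t, 2))  # a t near 0 means little evidence against the null
```

With samples like these, the t-statistic comes out small, which is exactly the "fail to reject the null" situation Coymen reported: the group means are too close, relative to their spread, to call the difference real.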

screen-shot-2016-11-24-at-10-07-13-pmscreen-shot-2016-11-24-at-10-08-56-pm

Image 2&3

 

A second survey-based study aimed to find a correlation between alcohol consumption and GPA. This study received 113 responses from college students and wanted to determine whether students' GPAs would be lower because of alcohol intake. The sample was 78 percent female and 22 percent male, which could in fact lead to some biased data, but a majority of the respondents drank alcohol at least 2 nights a week in college. Overall, 58 percent of the college students had over a 3.0 and 42 percent had over a 2.0. This illustrates that drinking doesn't automatically destroy a student's grades.

This blog isn't suggesting that if you drink more your GPA will rise, because there is no way that statement is true, but it does reveal that consuming alcohol doesn't necessarily mean your grades will drop. I believe that students can spend their free time doing whatever they please as long as they manage their time, go to class, and study for exams. Consuming alcohol is dangerous and can negatively impact a student, for instance by causing their GPA to drop and making them fail out of school. Going out and enjoying yourself is part of the college experience, as long as the student can get it done in the classroom first.


Why am I sick… Again?

Andrew has been telling us all semester that we should get a head start on our blogs before the end of the semester arrives, when people start to get all kinds of sicknesses. I, along with plenty of other students, heard this advice and thought, "oh, that isn't going to happen to me." Well, here I am at the end of the semester with a runny nose and a vicious cough to accompany my overwhelming amount of work to do before grades are due. It seems ironic that I am sitting here writing this blog post, about advice that fell on deaf ears, with a tissue box to my right and a bag of cough drops to my left. But I am left wondering why it is that every year, students get sick at the same point in the semester. Why am I suffering from the Penn State Plague once again?


Without fail, every year students get sick around the end of the semester, so there must be a reason for it. While brainstorming the reason for such a sudden wave of sickness, I thought about the time of year and made an assumption: maybe stress from the end of the semester affects students' immune systems in a negative manner. The null hypothesis is that stress does not affect the immune system, whereas the alternative hypothesis is that stress is lowering the immune system. Stress would be the independent variable and the Penn State Plague would be the dependent variable, but there are plenty of possible confounding third variables as well, one of which could be lack of time: you are stressed because you don't have enough time to get all your responsibilities done, and the lack of time takes a toll on personal hygiene, which makes you sick. While researching my hypothesis, I found a study that followed medical students during their schooling. The study found these students' immunity decreased during the 3-day period when they were taking a test. They also found that these students had a lower level of killer T-cells. Killer T-cells are a type of white blood cell with the ability to attack and kill viral or abnormal cells in the body. This study opened up a slew of studies researching this area. After all these studies were complete, Suzanne Segerstrom conducted a meta-analysis of them. This meta-analysis of over 300 studies concluded that stress all around weakens the immune system. We know from class that a meta-analysis is considered strong evidence, much stronger than anecdotal evidence.

While stress is definitely a key factor in why people are getting sick, I don't think it is the only variable causing these sicknesses. Sickness could also be caused by students' lack of cleanliness or even the change of seasons, as I found in the article. You might notice the atmosphere starting to get colder, but this is not directly the reason why people get sick. As it turns out, though, it does have some link to why we are getting sick: when the temperature starts to change, certain viruses thrive in the colder environment, leading to higher exposure to those viruses. Therefore the change of seasons allows viruses like the flu to infect more people.

Overall, there could be many reasons why I have come down with flu-like symptoms. Do I think that stress from 3 exams this week might be one of those reasons? Absolutely. With the evidence of a meta-analysis, stress is definitely one of the factors in why I am sick, along with the change of seasons. The take-home message would be not to procrastinate on your blogs, because no matter how strong your immune system is, you never know what virus you might catch in the latter part of the semester.

What’s so ‘super’ about super-foods?

Blueberries, quinoa, kale, Greek yogurt, oatmeal, green tea, almonds, cranberries.  The list could go on and on.  To some, this is just a list of food.  To others, this list of foods is known as something more: super-foods.  But what is it about these particular foods that makes them so super?  Is there something that allows a seemingly average food to earn the prefix 'super?'

I’m not vegetarian.  I’m not vegan.  I wouldn’t even consider myself an organic eater.  I do, however, love fresh food.  I like to know where the meat on my plate came from and whether or not my produce was grown locally or even regionally.  In other words, I try to be conscientious about what I put in my body.  After skimming a list of super-foods, I discovered how many I incorporate into my diet without realizing it.

Now I grew up in a home that put a good deal of emphasis on health and nutrition.  I know what a balanced meal is supposed to look like and how many cups of vegetables you’re supposed to eat in a given day.  If we were to poll the class, I’m fairly certain that most of the class would be able to distinguish healthy foods from unhealthy or junk foods.

But we’ve also been taught to pair health benefits with certain foods.  I’m sure all of us have heard that eating carrots improves your eyesight.  Or that drinking milk will strengthen your bones.  Or even that eating oranges during the winter will help keep you from catching a cold.

So is this it?  If a food brings a certain health benefit to the table (pun not intended), can it now be classified as super?  According to an article from the Journal of Agricultural and Food Chemistry– the answer is both yes and no.

In the realm of science, one of the most defining characteristics of super-foods is nutrient density.  Looking at fruits or vegetables in particular, these special foods are especially rich in phytochemicals.  Phytochemicals, otherwise known as phytonutrients, are exactly what is in the word: the nutrients or chemicals of a plant.  These play a hugely significant role in human health and can be found in fresh super-food produce.  In produce such as kale or pumpkin, there exists a phytochemical called beta-carotene that benefits the immune system, skin health, bone health, and vision.  Produce that contains high levels of phytochemicals automatically gets bumped up to the status of super.


So why classify foods as super?  I think that this name is helpful in understanding that no matter what the benefit may be, super-foods have something special that makes them stand out that other foods may not have.  Whether it be a load of vitamins or antioxidants, these foods can earn this title because they all have one thing in common: they boost your overall health levels.

Yes, on a college campus these foods might be harder to find, but it might just be worth taking a closer look at the dining hall menus or maybe stopping by Jamba Juice for an Acai Super Antioxidant smoothie.

As we have discussed in class, science informs public health policy, but there is much more to science than just the facts.  We see foods getting bumped up to super status all the time in food journals, magazines, and everyday news.  Science is ever changing, as is the realm of nutrition.  Probably the most important takeaway, despite this, is that no good food eaten without moderation will remain a good thing.  In other words, no matter how super a food may be, it is ultimately more important to maintain a balanced diet.

 

Lunn, J. (2006). Superfoods. Nutrition Bulletin, 31(3), 171-172.

Seeram, N. P. (2008, January 23). Berry Fruits for Cancer Prevention: Current Status and Future Prospects. Journal of Agricultural and Food Chemistry, 56(3), 630-635. Retrieved November 30, 2016.

What Are Phytonutrients? – Fruits & Veggies More Matters. (n.d.). Retrieved November 30, 2016, from http://www.fruitsandveggiesmorematters.org/what-are-phytochemicals

Photo 1 Link

Photo 2 Link

Does consuming dairy cause us to break out?

My whole life I’ve been a huge fan of dairy products – milkshakes, yogurt, ice cream, cheese, you name it. My family always had these things stocked in our fridge, so I never saw anything wrong with them. I did notice, though, that I always broke out on my cheeks and forehead whenever I ate milk chocolate and custard, and I always dreaded waking up with acne after indulging in my favorite chocolate ice cream. A few months ago my friend told me that almost all of her acne was gone simply because she cut out dairy from her diet. This got me thinking – do dairy consumption and what we put into our bodies affect how much acne we get? The null hypothesis would be that consuming dairy doesn’t have any real effect on acne, while the alternative would be that certain foods and beverages, like cheese and milk, cause us to break out! I researched this question and it turns out that yes, dairy does play a huge role in our breakouts!


 


First Study

Although the relationship between diet and acne has always been extremely controversial, a study published in the Journal of the American Academy of Dermatology by Dr. Caroline L. LaRosa shows a positive association between low fat/skim milk consumption and breakouts. 225 people were recruited to participate in this study and were divided into two groups – the control group and the group that consumed the dairy products. The study stressed that both groups of people previously had moderate facial acne and were cleared by a dermatologist to participate in the study so that it was as unbiased as possible. Both groups were given low fat/skim milk, but the control group was given a significantly smaller amount. The result? Although both groups consumed dairy, the control group, or the group that consumed much less of it, had significantly less acne on their faces than the group that consumed more dairy. It should be noted that both groups started off with relatively the same amount of acne. Although the study was able to make an association between dairy consumption and acne, it was unable to determine causation, so we don’t know the mechanism, or why this occurs.

A conjectured mechanism as to why dairy might cause acne is that it is a high-glycemic-index food group, and regular consumption of foods with a high glycemic index elevates serum insulin concentrations, which may stimulate sebum production. Sebum is an oily secretion from the sebaceous glands that, when the body produces too much, causes us to break out.


 


Meta-analysis

Just a few years ago, scientists knew almost nothing about dairy’s effects on our skin, but so many studies have been conducted since then that Dr. Whitney P. Bowe of SUNY Downstate Medical Center published a meta-analysis of the results in the Journal of the American Academy of Dermatology. In her meta-analysis, she talks about twenty different studies that more or less had the same results: either increased consumption of dairy increased the non-control group’s amount of acne, or decreased consumption of dairy decreased the control group’s amount of acne. I do not believe that this topic suffers from the file drawer problem, because the accepted idea today in science is that dairy consumption does cause acne, so if a scientist conducted a study and found no correlation between the two, they would definitely want to publish that. To my knowledge, not many studies like that exist. This is also why this issue does not suffer from the Texas Sharpshooter problem. There isn’t just one study that claims that dairy consumption causes acne; there are many. There have been too many positive correlations between the two for any one positive result to be emphasized as correct.


Something that we always have to worry about when it comes to science is anecdotes. A great example of this was the video that we watched in class about a woman who claimed that a vaccine caused her to be disabled, impairing her speech and forcing her to jerk violently when walking forward. Not only was this proven to be a hoax later on, but your chances of getting a life-threatening reaction to a vaccine, the smallpox vaccine for example, are 30 in 1,000,000. I found many blogs online whose authors boasted about their positive experiences with cutting out dairy, like this one titled “I Gave Up Dairy And All I Got Was The Best Skin Of My Life”. Although anecdotes should not be anyone’s primary reason for doing anything, this topic has extensive research and results showing that cutting out dairy does, in fact, reduce acne. Since dairy is not an absolute necessity in our diet, someone who has struggled with acne and sees these results might find it worth a shot to see if avoiding dairy would in fact help stop their breakouts.

From these results, we can reject the null hypothesis that our diet doesn’t have any real effect on acne. Next time you’re breaking out and notice that you’re eating too much ice cream or butter, try switching them out with fruits and vegetables and see if there’s a major difference in your skin!

Pictures –

First picture

Second picture

Is the ‘Asian glow’ real?

As someone who is half Chinese, I was shocked to find out about “Asian glow”. Asian glow is the reaction that many Asians have after drinking alcohol: their face and neck redden and they can get a headache or feel nauseous, among a variety of other possible symptoms. When I asked my dad, who is full Chinese, about it, he said he never had that issue. He had heard about it, but all throughout his career in the wine industry, he never had a problem. This only made me question it more: why does something that’s such a big deal not even affect the Asian side of my family?


“Asian glow” is actually Alcohol Flush Reaction (AFR), and it affects about half of the Asian population. According to Yale Scientific, alcohol is usually metabolized in your liver, where it is oxidized into acetaldehyde, which is then turned into acetate. People who deal with AFR, however, lack the specific enzyme (ALDH2) that turns acetaldehyde into acetate. Lacking this enzyme can cause as much as 10 times the average amount of acetaldehyde to be in their system. This large amount of acetaldehyde then causes the glow, headache, and nausea.


Man before versus after drinking alcohol, AFR example

Further studies about this could be beneficial in figuring out how serious AFR could be. One study reports research that points towards an increased chance of getting esophageal cancer. It’s not so much that AFR causes the increased potential of getting esophageal cancer, but rather that the lack of ALDH2 causes it. Some sort of longitudinal study of people who deal with AFR could be beneficial in seeing if they ever develop esophageal cancer, and of course there would need to be another group of people in this study who do not deal with AFR. But then again, as Andrew often reminds us, it could always come down to chance, or even be a completely different third variable that hasn’t yet been considered or talked about.

So the answer to this question is yes: Asian glow is definitely real and could potentially be linked to more problems than just a flushed face and not feeling well, but more research needs to be done to better understand the other dangers of not having ALDH2. My fellow Asians, depending on their level of concern, may want to get checked for esophageal cancer at some point in their lives.

Image sources:

Asian glow example

Shot glasses

Cupping Season

Recently, a post about this past Summer Olympics showed up on one of my various social media news-feeds. Unsurprisingly, it was Michael Phelps. In it, he had those round marks all over him like we saw quite a few athletes bearing during those games, left there by the physical therapy technique called cupping therapy. Cupping therapy isn’t a new thing, though it has recently been experiencing a revitalized popularity; it dates back so far that there are records of the ancient Egyptians using it in 1550 B.C. This therapy technique involves placing special cups on a person’s body, using heat or air to create suction. Many people use it to relieve pain, reduce inflammation, improve blood flow, or simply as a deep-tissue massage. Cupping comes in many forms (dry, wet, flash, moving, needling, etc.), but in the traditional fire method a therapist lights a flammable substance inside a cup and, as the fire goes out, places the cup upside down on the patient’s skin. A vacuum is created as the air inside cools, which causes the blood vessels to expand and the skin to rise and turn red (that’s the red circles we saw covering all those athletes). Sometimes, a rubber pump is used to create the suction instead of fire. Now, we have the anecdotal reports of many Olympic athletes, as well as quite a few celebrities, who claim that cupping therapy really works to relieve stress, soreness, and other ailments. However, I wanted to find some actual research to support their claims. I mean, the method has been around for thousands of years, right? So it has to have some truth and effectiveness behind it.

I found one study that focused specifically on how traditional cupping therapy affects people who suffer from carpal tunnel syndrome (CTS). CTS is a condition caused by a pinched nerve in the wrist that leads to numbness, tingling, and other such symptoms in the hands and arms. This study was a randomized, controlled, open trial that involved 52 CTS patients who ranged in age from 18 to 70 and who met certain required criteria. 52 is a rather small sample size, which I did not really like to see. Before the trial itself started, each participant went through a physical examination and filled out a questionnaire. Then, the participants were each randomly assigned to one of two groups: wet cupping or local thermal therapy (the control group). The former group went through one treatment of wet cupping, and the control had one local application of heat over the trapezius muscle. Seven days after the treatment, the researchers performed a follow-up.

The results showed that cupping therapy was significantly more beneficial to the patients than the heat treatment. There was also a lessened amount of disability in daily life among the patients who underwent the cupping method, as well as less neck pain, which they had all suffered from at the start of the trial. The results did take into account that around 80% of the patients in both groups thought that their respective treatments were going to be effective, and statistical adjustment for this didn’t change the outcomes. So, it cannot be said that the results were due to the patients’ expectation bias.

Another study, this one also a randomized control trial, was done in Taiwan. The researchers’ goal was to study how effective cupping therapy is at relieving chronic neck and shoulder pain. On a personal note, being someone who has suffered for quite a while with intense neck and shoulder pain, I would love to try out cupping therapy. This study was done as a single-blind experiment in which 60 people (another unfortunately small sample size) were randomly assigned to two different groups. The first group, the cupping group, received fire cupping therapy at three acupuncture points on their shoulders and necks. The control group, on the other hand, was simply told to rest for 20 minutes. This, to me, does not seem like a very effective way to conduct an experiment. The results of this study did show a significant relief of pain in the neck and shoulders due to the cupping therapy, which lends support to cupping as an acceptable complementary treatment for pain and other ailments. However, we do have to keep in mind that while the results were significant, this study was done with only 60 people. Results are always more reliable when a sample size is bigger.

Because the sample size for the first study was only 52 and the sample for the second was 60, I cannot confidently say that the results of these studies are decent enough proof of cupping therapy’s effectiveness. There are quite a handful of anecdotal reports that support cupping, and very few reports that do not support it. However, the research just is not definitive enough for me. I would need to see better and larger tests done to really make a statement supporting or rejecting cupping therapy.

Sources:

Source 1

Source 2

Source 3

Source 4

Picture source.

 

Drop a pint, it might save your life!

My last blog, Annual physical exams…are they really necessary?, prompted me to write this one. In that blog, I suggested that donating blood could be a good substitute for the routine screening normally conducted in a doctor’s office for weight, iron level, blood pressure, temperature, and pulse. Added to that list, donated blood is tested for West Nile virus, syphilis, hepatitis B & C, and HIV, among others. Donors are notified if their blood tests positive on any of the tests conducted.

In high school, I sponsored several blood drives. I was looking for a volunteer opportunity and landed on blood drives because, at the time, my grandfather was awaiting a heart transplant. Thankfully, he got one, and two years later he is still with us. I spent a lot of time in the hospital around that time and became aware of the desperate need for blood, every day. The blood drives were my way of helping a little, but I also learned quite a bit. That said, I never looked at donating blood in terms of benefits to the donor, other than learning the results of one’s screening tests. I researched the topic and discovered that donating whole blood can reduce the amount of iron in our systems, which may reduce the risk of cardiovascular events. Such an event can cause damage to the heart muscle. This would certainly be a good reason to donate whole blood, but is it proven?

Null hypothesis: Donating whole blood does not affect levels of iron in your system, nor does it reduce cardiovascular events.

Alternate hypothesis: If you donate whole blood, you will reduce the amount of iron in your body, thus reducing the risk of cardiovascular events.


Research:

This article provides a good overview of several benefits of donating blood, all resulting from reduced levels of iron in the body. It suggests that lower levels of iron reduce risks associated with liver and heart ailments, cancer, and hemochromatosis, a genetic disorder that causes iron overload.

 

Source: Photo 1

 

I then found this retrospective cohort study of donor activity over a three-year period for two groups. The first group, consisting of 1508 subjects, donated only one unit of blood in that timeframe. The second group, also consisting of 1508 subjects, donated more than one unit of blood each year over the 3-year period. The subjects were matched for gender and age. Medical records as well as a common questionnaire were used to determine medical developments for the subjects 10 years after the study period. 2104 subjects remained in the study for evaluation, which concluded that the subjects who gave more frequently had fewer cardiovascular events, were less likely to be taking medication, and weighed less than the subjects who donated blood less frequently.

A research article published in the British Medical Journal compared males aged 40 and older with no known heart ailments. A total of 3,855 subjects initially participated; 655 were whole blood donors and 3,200 were non-donors. All the subjects had varying education levels, non-cardio ailments, and levels of activity. At the final assessment, 2,966 participants remained in the study. Cardiovascular events were reported for 64 donors (only non-smokers were included) and 567 non-donors, leading the authors to conclude, at the 95% confidence level, that whole blood donation may reduce iron in the blood and thus the potential for cardiovascular events. The study did suggest additional clinical trials as a follow-up for confirmation.
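As a rough back-of-the-envelope check on those raw counts, we can compare the crude event rates in the two groups. This is only a sketch: it ignores the adjustments for smoking, education, and activity level that the published analysis made, and uses a simple normal approximation for the difference in proportions.

```python
from math import sqrt

# Raw counts from the BMJ cohort described above (crude, unadjusted comparison)
donor_events, donor_n = 64, 655
nondonor_events, nondonor_n = 567, 3200

p1 = donor_events / donor_n          # donor event rate, ~9.8%
p2 = nondonor_events / nondonor_n    # non-donor event rate, ~17.7%

# Normal-approximation 95% confidence interval for the difference p2 - p1
diff = p2 - p1
se = sqrt(p1 * (1 - p1) / donor_n + p2 * (1 - p2) / nondonor_n)
lo, hi = diff - 1.96 * se, diff + 1.96 * se

print(f"donor rate {p1:.1%}, non-donor rate {p2:.1%}")
print(f"difference {diff:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```

Even this crude interval excludes zero, which is at least consistent with the study's claim that donors experienced fewer cardiovascular events, though only the adjusted analysis (and the follow-up trials the authors call for) can say whether donation itself is responsible.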

Conclusion:

While I believe more robust studies are needed to solidify the evidence of reduced heart ailments from donating whole blood, the evidence provided is certainly convincing enough for me to continue to donate and potentially change the frequency of my blood donations. Take a look at the BLOOD FACTS below. If these studies don’t convince you, the facts listed below should be reason enough to consider donating as well!

Blood Facts:

  • Red blood cells have a shelf life of about 35 days.
  • Each blood donation is tested for HIV, hepatitis B & C, syphilis, West Nile virus and other infectious agents.
  • Testing requires 24 hours.
  • Every 5 seconds someone needs blood: a friend, a family member or maybe even you.
  • Less than 5% of the population donates blood, yet 80% of the population needs blood.
  • Nearly 4 million Americans would die each year without life-saving blood transfusions.
  • An estimated 109,500 Americans will be diagnosed with leukemia, lymphoma and myeloma this year.
  • A bone marrow recipient needs up to 20 units of red blood cells and 120 units of platelets.
  • An automobile accident victim needs up to 50 units of red blood cells.
  • A sickle cell anemia patient needs up to 14 units of red blood cells per treatment.
  • A cancer patient needs up to 8 units of platelets per week.
  • A heart surgery patient needs up to 6 units of red blood cells and 6 units of platelets.
  • An organ transplant recipient needs 30 units of platelets and 25 units of plasma.

References

Organic Facts: Health Benefits of Blood Donation

https://www.organicfacts.net/health-benefits/other/blood-donation.html

Study #1: Google Scholar: Wiley Online Library:

http://onlinelibrary.wiley.com/doi/10.1046/j.1537-2995.2002.00186.x/full

Google Scholar: BMJ – British Medical Journal

http://heart.bmj.com/content/78/2/188.short

Community Blood Services: Blood Facts – DID YOU KNOW?

http://www.communitybloodservices.org/resources/blood-facts/

 

 

Is Ethanol an Easy Fix?

Between the views of the winner of the election and the views of one of our recent guest speakers in class, Mike Mann, I have been confronted with some very conflicting perspectives on climate change. I, like Mike Mann, believe that it exists, is aided by human actions, and is an extremely troublesome issue. There are several different areas in which people think we could change the way we do things to be less harmful to the environment. One of the ideas which has been tossed around is converting our method for powering vehicles from gasoline to ethanol, or at least making ethanol a more substantial part of the mixture which powers cars. Would this switch have a positive impact on the environment?

To properly examine this question, we must first define what ethanol is. Ethanol is a fuel produced from biomass. Biomass can include corn, sugarcane, grass, garbage, or other organic materials.  Over the past several years it has been slowly integrated into standard fuels. Currently, the average amount of ethanol in regular gasoline is 10%, so it does not account for too much of the overall composition of standard fuel. There are several ways in which this fuel can be assessed to determine how environmentally friendly it is. Specifically, I am going to examine its effect on air quality, its effect on water quality, and the efficiency of its production.

One can measure the way in which ethanol pollutes the air in two ways: emissions during production, and emissions caused by burning the fuel. Currently, we are at a point where the production emissions are higher than those of gasoline. However, the emissions coming from burned ethanol are less harmful than those of burned gasoline. It is a bit of a mixed bag.

The effect of ethanol on water quality comes from its production. If the crop used to make the ethanol requires large amounts of fertilizer and pesticides, then more of that substance will be exposed to the environment. Amounts of these substances will find their way into water runoff and groundwater, which can contaminate larger water supplies.

There are two ways in which ethanol is produced: from food matter and from non-food matter. How it gets from seed to use is called its life-cycle emissions, and this affects its environmental impact. Life-cycle emissions vary by source of production. Ethanol produced from a non-food source is more efficient and has lower life-cycle emissions than ethanol produced from food matter. Despite that, the primary way we produce ethanol is via food matter. While it is renewable, unlike traditional fossil fuels, producing ethanol from a food crop like corn is very demanding in terms of the land and water needed. It also does not have a good energy balance; the amount of fossil fuel energy needed for production is more than the energy the fuel provides. Therefore, recycling non-food organic matter is superior, because it doesn’t have that same environmental drain and has a better energy balance.

It is hard to reach a definitive conclusion on whether or not switching cars to a fuel with a higher concentration of ethanol will positively impact the environment.  There are a lot of factors influencing how green ethanol really is. The largest environmental problem with ethanol is its production. In order to make ethanol a viable alternative, its production needs to become less demanding of land, water, and fuel. All of those problems, however, are solvable with developing technologies and enhanced methods of production. I would conclude that in the long run ethanol will be an environmentally friendly option for fueling vehicles, but we are not there yet. Once that point is reached, I think that transitioning cars to ethanol will have a positive effect on the environment. The biggest roadblock, I think, will be the same one we discussed in class that is affecting policies regarding green energy now: politicians who support the fossil fuel industry.

Picture Links:

http://weknowyourdreams.com/car.html

https://www.google.com/imgres?imgurl=https%3A%2F%2Fmoussamahmoud.files.wordpress.com%2F2014%2F11%2Fcropped-danger-of-ethanol-fuel.jpg&imgrefurl=https%3A%2F%2Fmoussamahmoud.wordpress.com%2F&docid=6YubqfalRafAIM&tbnid=LNgPCc4_q1VHuM%3A&vet=1&w=1120&h=743&bih=710&biw=1536&ved=0ahUKEwjq6Iby3M_QAhXo8YMKHSXwDa8QMwhWKA4wDg&iact=mrc&uact=8

That eyewitness sketch looks a bit too accurate…

I watch a lot of crime shows. I don’t know what it is about them that makes them so entertaining, but I just can’t seem to tear myself away. Currently I’ve been binge-watching NCIS, and I noticed something that happens a lot when they’re getting eyewitness reports on a suspect. So many times they’ll show a sketch of a suspect’s face, recalled from the witness’s memory hours, sometimes days, after the incident, and it ends up looking nearly exactly like the person they catch. That is just so hard for me to see as believable. Very few people have memories good enough to make recalling all the details perfectly a possibility. Even if we are extremely confident that what we are remembering is what we actually saw, chances are we’re at least a little bit off. As soon as we experience something, our mind immediately begins to forget the details. I also strongly believe our ability to remember things is greatly influenced by stress. So, how much do eyewitnesses to crimes, especially murders, really remember?

So first off, we need to form a null and an alternative hypothesis. Our null hypothesis is that the accuracy of an eyewitness’s claim is not affected greatly by high emotional levels, such as stress. Our alternative is that the accuracy of an eyewitness’s claim is affected greatly by high emotional levels.

This study about eyewitness memory was done using a group of people who had witnessed a shooting four to five months before the study. 21 people witnessed a shooting which killed one person and seriously injured another; they were all interviewed by the police during the investigation. The shooting happened on a major street outside of a gun shop, so each of the 21 witnesses saw the incident from a different vantage point. 13 of these witnesses agreed to participate in this study months later. The researchers took into account inevitable losses or changes in the memories of the participants by looking at the interviews done by the investigating officers in addition to the ones the researchers themselves conducted. The results of both of these interviews ended up showing that the witnesses’ reports were usually very accurate to the actual incident. The lowest accuracy level, at around 76% for the police interviews and 73% for the research interviews, came from the descriptions of people, while the object descriptions resulted in about 89% accuracy on average for the police and 85% for the researchers. I noticed that there wasn’t much change in the eyewitness reports even after 4 or 5 months. I believe this may be due in part to the fact that people wouldn’t often forget the details of an experience such as this, but also because the researchers asked quite a few in-depth questions that had to have led to memories resurfacing. In regards to stress, the five witnesses who had direct contact with something pertaining to the shooting ended up reporting the most stress during and after it. In this study, stress is recorded as a confounding variable, which unfortunately means that stress was not greatly taken into consideration during the research. However, the witnesses who reported the most stress had memory accuracy levels that were around 70-80%, which was the average.
The results of this study do show that there is some inaccuracy in eyewitness reports due to the eyewitnesses being unable to retain some details in their memories, particularly after some time has passed. However, the inaccuracy is not as great as I would have thought, which is reassuring to hear. And it does not seem that stress or other such high emotional levels affected the witnesses’ abilities to recall details of the incident. At least that means we can usually trust eyewitness accounts to be accurate.

Police line-up ... a famous scene from the film The Usual Suspects.

Now, this article is a review of a few other studies that have been done on stress as a factor in eyewitness reports. Quite a few of the studies were done around the 1960s to 80s, and each was published in a credible research journal. The review states that many studies showed that memories from before and after an event are occasionally less accurate or forgotten altogether (at least temporarily) if the witness was at a high emotional level at the time of the event. But on the other hand, details of the event itself were usually very well retained, even when the event caused high emotional levels. These various studies show that there is less loss of detail in events that cause high emotion than in events where the witnesses did not feel high emotional levels. So it seems that high emotional responses to events actually increase a witness’s ability to accurately recount details, not decrease it.

I suppose this means that I was wrong. Stress and other such factors do not seem to hinder a person’s ability to recall details of an event. This means that we fail to reject the null hypothesis. I am, however, disappointed that I wasn’t able to find more studies done on a topic like this one. I would think that it would be an important subject, especially to forensic scientists and investigators. It’s possible that this topic suffers from the file drawer problem, and that studies have been done that were never published.

Sources:

Source 1

Source 2

Picture Source

Dogs: Self-awareness and understanding names

Every dog is capable of responding to the sound of its name being called. Whether it responds by performing a specific action, coming to its owner, turning to look toward someone, or even just having its ears perk up a little, a trained dog knows when its name is being called.

Experts say that certain kinds of names may work better than others. Specifically, dogs can hear an ‘s’ sound to a much more intense degree than we do. In addition, dogs respond better to shorter names with two consonant sounds. The idea is that more intense-sounding names are more likely to attract the attention of a dog, while having too many sounds in a longer name may cause confusion and be much harder to learn and understand.

Dogs actually have the ability to understand, on average, 165 words, according to studies. These words range from simple commands to specific objects. While they don’t necessarily understand what an object’s purpose is, they can usually identify it (for example, a leash or a ball).

To see the difference between a name being a simple trigger word and a name having self-meaning to the dog, we must first understand whether or not dogs are actually self-aware. While studies have been done to test this through the use of a mirror (and subsequently failed, as many dogs could not recognize their own reflection), Marc Bekoff developed a test that was designed specifically for dogs. In this test, he was able to show that a dog was less interested in its own “mark”, while it appeared interested in and sniffed the marks of other dogs. The conclusion established that there are simply degrees of awareness among creatures. While dogs may lack many of the traits that allow us to identify ourselves, they do have a sense of what their body can do and what’s theirs.

So what does this have to do with names? Well, it all comes down to how names are interpreted in general. Isn’t a human’s name just as much of a “trigger” as a dog’s would be? We respond to what we are called, just as a dog does. Our own degree of self-awareness is unaffected by the name that we identify with. In addition, a dog can learn to know somebody else by name, the same way we do.

The truth is, we don’t know exactly what dogs think. However, we do know that they can memorize and understand words, be that in the form of triggers and commands or just identification of objects. If a dog has the ability to identify itself and associate a name with having its attention desired, it’s fair to say that a name may mean the same to them as it does to us, on a basic level.

Also, here’s a video that shows a group of dogs that each responds individually to their own name.

Sources:
Research:
http://www.waycooldogs.com/naming-your-dog/
http://www.animalplanet.com/pets/can-dogs-understand-what-we-say/
http://pets.thenest.com/self-aware-dog-12082.html
http://marcbekoff.com/
Images:
http://www.rawstory.com/wp-content/uploads/2016/02/Happy-dog-Shutterstock-800×430.png
https://media.npr.org/assets/img/2011/03/02/dog_wide-bc317aa163488fb54c338a6a9af677f681581694.jpg?s=1400
http://img-aws.ehowcdn.com/600x600p/photos.demandstudios.com/getty/article/189/181/52192335.jpg
Video:
https://www.youtube.com/watch?v=fZjoySj3LCc

Why You Should Never Pop Your Pimples.

[Image: deep acne scars]

(Photo by http://www.pimple.com.sg/permanently-remove-pimple-scars/).

I don’t break out, or so I thought… until three days ago.

The other day I looked in the mirror and cringed. I discovered a huge pimple on my chin just in time for a wedding I am going to this weekend… perfect timing, right? I had never popped a pimple before and hoped I never would, but I picked at this one, and I have never seen my roommates yell at me faster than when I told them.

“You’re going to scar,” they said. “Don’t ever do that!”

I didn’t know, and now I have a giant scab on my face. I tried googling everything under the sun to get rid of it, but everything thus far has failed me… I’m just going to give it time and let it do its thing.

But this whole thing got me thinking: why do pimples scar? You would think the damaged cells would regenerate and the old skin would eventually die off, creating new skin, but that’s not the case.

It turns out that it is very dangerous to pop pimples!

1. Dr. Julia Tzu advises that popping a pimple can make the inflammation much worse than it already was. With worse inflammation, scars can occur when the area is affected.

2. Scabs can form. Mental Floss explains how a scab occurs, but scabs are harder to cover up than a pimple, so do not pick at the acne.

3. Discoloration can occur. Post-inflammatory hyperpigmentation (PIH) is when the spot becomes darker than the skin around it.

4. Infection can occur. If your hands are dirty, or you use a dirty tweezer (or whatever tool you use), you are putting yourself at risk for an infection.

You are taking a gamble (or a chance!) when you pop a pimple.

X Variable- Pimple

Y Variable- Scar

Confounding Variables- how many times you’ve picked at the pimple, whether you keep touching the area, what products you’ve already used on the spot to remove the pimple, etc.

This reasoning suggests that you will get a scar from picking a pimple; chances are the scar will be minor, but you can make it worse. It’s best to leave the pimple alone when you first see it on your face.

In this journal article, the authors explain that acne scars can never fully go away; the results of a scar are permanent. Yes, the scarring can be minimal, but you will never have “perfect” skin again. This study examined laser treatments and how the skin is made up. By learning all of the ingredients in what you are using on your face, you can find what works and what doesn’t.

I would look at treating acne as an experimental study, because you can try a million home remedies, medications, scrubs, etc., but it will take a while to learn what works on your skin and prevents future pimples from appearing.

I think acne is observable: people can watch it appear and see it go away or fade, you can clearly see a scar on a person’s face, and you can see when a pimple is inflamed.

I think people cannot predict when they are going to break out or whether they’ll wake up to a pimple on their face in the morning, but they can use preventive products to lessen the appearance of pimples.

Why would anyone want to risk ruining their skin permanently? Next time you think about popping a pimple, take a step back and think about your face.

[Image: acne]

(Photo by http://howtogetridofacne.trusted-guide.com).