This site is no longer open for business. RIP.
If you are in the class of **2015**, your website is here.
There has been an ongoing debate among sports fans for many years about whether the WWE is real or staged. People seem to struggle to find a good answer to this question.
In an article published in 2012 in the Huffington Post, they explain how many aspects of a match are predetermined. Some of the main things they mention are how the wrestlers protect and assist each other, how the referees help fake injuries, and how the outcome of the match is decided in advance. Although some people do think the WWE is actually real, the Huffington Post seems to have some very convincing arguments. It is believed that the referee can sometimes slip a wrestler a razor blade to make the audience think he is seriously injured. In the slow-motion video linked below, wrestler Kurt Angle is caught cutting his forehead with a razor blade while acting like he is injured. https://www.youtube.com/watch?v=Ybg4P6_ZMOc This video is compelling visual evidence that at least parts of the WWE are staged.
Another aspect believed to be staged is how the wrestlers are said to protect each other. In this next video, you can see in slow motion how superstar John Cena pretends he is going to hit his opponent Randy Orton but in reality barely touches him. https://www.youtube.com/watch?v=IcN16bmkSWU This video is just another good example of how the WWE stages its fights.
Although people still dispute whether the WWE is real, there seems to be some pretty good evidence that it is staged. It would be inaccurate to say that every part of the WWE is staged, but for the most part it is evident that what happens during a match is determined before the first punch is thrown. From the evidence shown, I believe it is much more likely staged than real. I also think this question may have suffered from the file drawer problem in the past, but more recently we have seen much more evidence come out that gives us a much clearer answer.
During the winter, I tend to be a lot less active and motivated than during the summer. I find myself watching countless hours of Netflix and feeling far less energetic than I do in the summer and fall. So I wonder: is there a link between the winter season and these feelings of laziness?
There is a condition known as seasonal depression, or seasonal affective disorder, in which a person experiences common symptoms of depression at the same time each year, usually starting in the fall and ending in the spring. Only about half a million people in the U.S. actually suffer from seasonal depression, but ten to twenty percent of people will suffer from a milder case.
It has also been found that people who are deficient in vitamin D are at greater risk for depression. Vitamin D, or the sunshine vitamin, is obviously harder to come by from the late months of fall to the early months of spring.
So is the winter making me depressed? And will supplements of vitamin D help me return to my normal state of slightly-less-than-average motivation?
A small study of three subjects, all women, gave each person vitamin D supplementation over the course of twelve weeks, bringing their serum vitamin D levels to 32 to 38 ng/mL. After their treatment, they reported a significant decrease in depression as measured by the Beck Depression Inventory.
Since this was an extremely small study, can we conclude that vitamin D will reduce symptoms of depression? Definitely not. All subjects were women, so will vitamin D have the same effect on men? Where were these women from? Certainly environment and location must have an effect on their state of depression: California is a lot sunnier than Pennsylvania, especially during the fall and winter months.
In a systematic review and meta-analysis of observational studies and randomized controlled trials, 31,424 participants were studied. Researchers found that the hazard ratio for depression was significantly higher in those with low vitamin D levels than in those with high vitamin D levels.
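To make the hazard-ratio language concrete, here is a toy calculation. The numbers below are invented purely for illustration (the meta-analysis did not report them); a hazard ratio essentially compares how often depression occurs in the low vitamin D group versus the high vitamin D group over the follow-up period.

```python
# Toy illustration of what a "higher hazard ratio" means.
# NOTE: these event counts and person-years are made up for illustration;
# they are NOT figures from the meta-analysis discussed above.

low_d_events, low_d_person_years = 120, 10_000    # hypothetical low vitamin D group
high_d_events, high_d_person_years = 80, 10_000   # hypothetical high vitamin D group

rate_low = low_d_events / low_d_person_years      # depression cases per person-year
rate_high = high_d_events / high_d_person_years

# A crude rate ratio: a rough stand-in for the hazard ratio a survival model estimates.
rate_ratio = rate_low / rate_high
print(f"crude rate ratio: {rate_ratio:.2f}")      # 1.50: a 50% higher rate in the low group
```

A ratio above 1 is what "significantly higher hazard in the low vitamin D group" is describing, though the real analysis adjusts for follow-up time and other variables.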
Since the number of subjects is significantly larger than in the previous case, the sample is far more likely to be representative. Reverse causation can probably be ruled out, because it is unlikely that a high risk of depression causes low vitamin D levels. I think we can safely assume there is an association between a person's vitamin D levels and depression.
So, in these upcoming months, if you're feeling low and unmotivated, consider the fact that we aren't getting very much sun and are therefore subject to lower levels of vitamin D. Try to get more vitamin D through supplements or milk from your closest convenience store.
I have absolutely awful eyesight. My mom blamed it on the fact that I hate carrots so I never ate them when I was younger. But new studies have found that video games can be better for your vision than carrots. So really was I not eating enough carrots? Or was I not playing enough video games?
Researchers have found that action-based video games can improve contrast sensitivity, a visual ability that usually worsens as you get older. One study measured contrast sensitivity at several spatial frequencies in subjects with lots of action video game experience compared to those with virtually none. They found that the video game players had higher contrast sensitivity at a majority of the spatial frequencies than the non-players.
Could the increased contrast sensitivity also be due to good genes? Both my parents have awful eyesight, so I always assumed that was why my vision was awful. Reverse causation can probably be ruled out, as having good eyesight does not obviously make a person play more video games. I am unsure how many subjects were actually studied in this case, so there is a possibility that the sample is not accurately representative of the population as a whole.
In another study, researchers took a group of non-video-game players and put them through video game "training," in which they played a video game for 50 hours over a nine-week period. Half of the subjects were assigned an action-based video game while the other half were assigned a non-action, simulation-based video game. When contrast sensitivity was measured for both groups, researchers found significant improvements only in the group that played the action-based games.
Will these results be consistent across all age groups? Or are action-based video games most effective when played at a certain age? Also, we should consider other effects that video games may have on children, such as a possible link to ADHD. Although video games have been found to improve contrast sensitivity in several studies, we must keep in mind that other health issues can be a concern. After all, if a child is sitting in the house playing video games, he's not outside being active.
Moral of the story, I should have eaten my carrots, but at least I wasn’t stuck inside playing video games all day, even if they can improve my eyesight.
Is working out at the gym better for you than working out outside? As college students, we dread weight gain or any sign at all that we’re getting any bigger than we used to be. If you’re like me, you’ll look for the easy way out. I’m always looking for the “quick fix” when it comes to exercising. I love working out at the gym because I find working on the machines more entertaining and effective than when I’m left to my own devices. But then, taking a run outside has its advantages too. Which one is really better?
In eleven randomized and non-randomized trials that recorded information from 833 adults, exercising outdoors emerged as the reigning champion. Researchers found that working out in natural outdoor environments made subjects feel more refreshed, increased their energy, and decreased feelings of depression and anxiety.
But did these studies actually measure the fitness of the subjects? Subjects may FEEL more refreshed, but are their bodies physically improving alongside their state of mind? And were all subjects generally the same age, or did the ages vary? Exercise habits among young adults versus the middle-aged could certainly differ. And how do we know how rigorously subjects exercise in the gym? This could vary from person to person. Not to mention the environment a person is surrounded by: if it's freezing cold outside in Pennsylvania, it's debatable how happy a person would feel exercising outdoors there compared to 72-degree, sunny California. Personally, I would much rather be exercising in 72-degree weather.
In comparison, a study performed on older adults found that those who exercised outside tended to work out for longer periods of time. Subjects averaged around 66 years of age. Those who worked out outside, mainly by walking, were more active than those who worked out indoors, spending on average 30 minutes more exercising.
It's clear that this study, too, cannot be applied to the total population, as it only gathered information from adults in the later years of their lives. Will working out outside motivate me to work out more than working out inside would? Is it strictly psychological? Scientists haven't found any concrete evidence that working out indoors is worse for you than working out outdoors. Since actual physical fitness was not measured, I don't think we can draw any significant conclusions as to which is more advantageous.
Bottom line: if you're looking for a shortcut to that summer beach bod, working out indoors is just as effective as working out outdoors; you might just be a little happier if you're outside. Although your outdoor surroundings could affect how you feel as well.
My sisters and I have two completely different body types: I'm pretty tall and curvy while they are both short and very thin. I've always wondered why we have such different figures. When I asked my mom, she told me it was because I was given baby formula while they were breastfed. And now I wonder whether there is a difference in how formula and breast milk affect a baby's development.
Infants who are exclusively breastfed for six months (having no solids or liquids other than human milk) are less likely to develop gastrointestinal infections. This was found by assessing information from 23 independent studies across 11 developing countries. A majority of these studies found that children who were exclusively breastfed showed no significant difference in height, weight, or body mass index at 6.5 years of age compared to those who were not.
As always, we must consider any underlying factors. Could the country itself be a factor in this study? It's safe to say that people in the United States have different eating habits than those in countries such as Finland or Belarus. We have access to foods other countries have difficulty finding, and vice versa. Is it possible that the food available in a country can affect the quality of women's breast milk? Also, does the number of times a mother feeds her child each day have an effect on development? Will a child who is fed five times a day consume more nutrients than one who is fed three times a day? These factors are not given in the study.
In comparison, a study that compared feeding habits in siblings during infancy and how they affected development found that the bottle-fed sibling did better on 10 of the 11 outcomes studied. Researchers took note of three physical measures (BMI, obesity, and asthma diagnosis), three behavioral indicators (hyperactivity, parental attachment, and behavioral compliance), as well as five outcomes used to predict academic achievement (Vocabulary Test, Reading Recognition, Math Ability, Wechsler Intelligence Scale, and scholastic competence).
This study was performed on siblings 4 to 14 years of age. One discrepancy could be that the children aren't fully matured yet. What if one goes through puberty more slowly than the other, but in the end they both end up with similar physical, behavioral, and academic attributes? And are health issues taken into account? Were they part of the study?
Although most people will say that breastfeeding is better than baby formula, we can't be completely sure that a child will develop better if exclusively breastfed.
My little sister has ADHD. It's not a huge deal; yes, she fidgets a lot and she rarely hears anything I say, but she lives a normal life, save for having to take a pill before she goes off to school. But ever since I was little I always wondered how she got it. Was it genetics? Was it contagious? I remember asking my dad when we were little, and he told me, "It's because she watched T.V. when she should have been doing her homework; maybe you'll listen to me now." And honestly, little 10-year-old me was scared to the bone. I didn't want to have to take a pill every morning, and my mom always got irritated with my sister when she wouldn't listen to her; I didn't want that. Eventually I got over the fear of getting ADHD, which is absurd in all aspects, but I still wonder whether watching T.V. can directly cause ADHD.
A cross-sectional research study used the National Longitudinal Survey of Youth to examine whether hyperactivity was linked to the amount of television children watched daily at ages 1 and 3. Of the 1,278 children measured at age 1 and the 1,345 children measured at age 3, about ten percent had attention issues at age 7. Records state, "In a logistic regression model, hours of television viewed per day at both ages 1 and 3 was associated with attention problems at age 7 (1.09 [1.03-1.15] and 1.09 [1.02-1.16]), respectively."
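To unpack that quoted number: assuming (my reading, not spelled out in the quote) that 1.09 is an odds ratio per additional hour of daily TV, the odds of attention problems compound multiplicatively with each extra hour. A quick sketch:

```python
# Rough interpretation of the reported logistic-regression figure.
# Assumption (mine, not stated in the quote above): 1.09 is the odds ratio of
# attention problems at age 7 per additional hour of daily TV viewing.

def odds_multiplier(odds_ratio_per_hour: float, hours: float) -> float:
    """Multiplicative change in the odds for `hours` of daily viewing,
    assuming the per-hour odds ratio applies log-linearly (as in the model)."""
    return odds_ratio_per_hour ** hours

for hours in (1, 2, 3):
    print(f"{hours} h/day -> odds multiplied by {odds_multiplier(1.09, hours):.2f}")
```

Under that assumption, three hours of TV a day corresponds to roughly 30% higher odds of attention problems (1.09 cubed is about 1.30), which is an association, not proof of causation.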
The study concluded that exposure to television early in a child's life is linked to hyperactivity and attention problems at age 7. I believe that linking television exposure to hyperactivity is reasonable: with today's video games and T.V. shows, none of the content is remotely slow-moving or mentally relaxing. But what if children are exposed to more television later in life? If a child watches a significant amount of television at age 7 but rarely watched any at ages 1 or 3, is he still subject to the same probability of developing an attention deficit disorder?
A different study in 2007 compared television watching in a group of children with A.D.H.D. and a group of children without it. In this study, experts found that the environment the child was surrounded by had a big effect on the comprehension of televised stories. Researchers found that although children with A.D.H.D. could recall facts and the plot line just as well as kids without A.D.H.D., they had a harder time comprehending the purpose of the narrative and what was most important.
So, does television help cause A.D.H.D.? Somewhat. Can television be the only reason some children develop A.D.H.D.? Certainly not. These studies were done either on children at a very young age or on children who had already been diagnosed. That leaves me to wonder: if children ages 7 to 9 were exposed to the same amount of television as those ages 1 to 3, would they be likely to develop A.D.H.D. by age 13?
Choosing to go to college in Pennsylvania was one of the biggest decisions I have ever made in my whole life. Coming from California, I knew I was going to face a few challenges when I moved to the east coast. There are the obvious financial costs, homesickness, and drastic weather change, but then there are other things that I didn’t think about when moving across the country. One of the lesser, but still aggravating, problems that I have encountered over this past semester is jet lag. Exhaustion, loss of appetite, headaches, and lack of concentration are common when flying across multiple time zones. Although I love going to Penn State with all my heart (WE ARE!), jet lag can be a pain to deal with.
Many studies have been done to help frequent flyers manage symptoms of jet lag after long flights. In a double-blind, placebo-controlled crossover trial, twenty volunteers were flown from New Zealand to London and back, then given a dose of melatonin or a placebo to measure melatonin's effect on jet lag. Subjects were given 5 mg of melatonin or placebo before and during their flight, as well as once a day for three days after arrival. Subjects then completed a questionnaire about their fatigue and activity. The study found that symptoms of jet lag were milder in those taking melatonin, as these subjects returned to a normal sleep pattern and regular energy levels more quickly, concluding that melatonin was effective in reducing the effects of jet lag.
Personally, I wasn't a fan of taking melatonin to deal with my jet lag, as it was inconvenient to have to go to the nearest drug store to find a bottle of supplements. So I researched further and found a different study concluding that caffeine could also help reduce the effects of jet lag.
This study compared caffeine to melatonin. Twenty-seven subjects, all reservists in the US Air Force, received a 300 mg dose of slow-release caffeine, a 5 mg dose of melatonin, or a placebo before, during, and/or after their transmeridian flight. Researchers took saliva and urine samples from the subjects to test hormone rhythms over the four days. The study found that although melatonin works more quickly than caffeine in relieving jet lag, both can help reduce its tiring effects.
These studies provided enough information for me to figure out how to deal with the irritating symptoms of jet lag, although the subjects in these studies were traveling much farther than I do. Since melatonin and caffeine helped subjects flying across continents, they are more than likely to help people who are only flying across the country. Melatonin might be a healthier and more effective choice than caffeine, and that study could be considered more reliable since it was double-blind.
“But Mom, she’s crazy!” shouted Jess and I at the expense of our youngest sister, Juliana. Being seven years older than her, and Jess being 5 years older than her, we seemed to gang up on her quite a lot; especially considering she had some traits that were not consistent with what we considered the average little sister should have. She would wake up in the middle of the night screaming, crying, and begging for my mom (even when she was sitting right in front of her). Juliana just seemed to look past her and continue to cry and scream out of sheer terror. But what was there to be afraid of? Jess and I just didn’t understand. The doctor later called these dramatic episodes “night terrors,” and because of these horrific dreams and outbursts that used to occur very often, Juliana is now occasionally afraid to go to sleep, in fear of experiencing one.
Sigmund Freud believed that childhood experiences greatly influence the development of later personality traits and psychological problems in a person, and with this theory he developed psychoanalysis. Psychoanalysis emphasizes unconscious conflict and past events, or early childhood traumas.
Researching night terrors further, I found that according to WebMD, an estimated 1%-6% of children experience night terrors; boys and girls are equally affected, children of all races seem to be affected equally, and the disorder is usually outgrown by adolescence. Night terrors are most likely caused by stressful life events, fever, sleep deprivation, medications that affect the nervous system, or recent anesthesia given for surgery.
Even now, years after the night terrors stopped, Juliana still has trouble sleeping some nights. As she lays her head down on her pillow, she becomes nervous and afraid to fall asleep. "Who's afraid to fall asleep?" Jess and I used to think to ourselves, but looking back on Juliana's past, it makes sense now. Having consistent night terrors and never wanting to fall asleep when she was younger still affects her 7 years later, and most likely will for the rest of her life, because of the early childhood traumas she suffered through.
Being so young and naive, Jess and I just assumed that she was crazy and that what was happening to her was one hundred percent out of the ordinary. We had no interest in learning what was actually happening within Juliana's mind; we just told everyone she was crazy because it was a much easier (and, might I add, harsher) explanation. Now, although we tease her about many things (after all, that's what big sisters are supposed to do), we avoid the topic of the terrifying night terrors she used to have as a younger child, because she really couldn't have done anything to prevent those, just like she can't really do anything to prevent her fear of sleeping some nights now.
I love Instagram. (But who doesn't, right?) For some reason I follow many people from Australia who are vegan, vegetarian, or raw vegan. They all rave about it, and they are all very fit. Many of them claim they turned to veganism to help rid themselves of some condition or disease. Many of these vegans have said that they felt very unhealthy, went to the doctor, and were told they had conditions such as candida, asthma, arthritis, chronic headaches, or severe acne. Then, once they went vegan, they found that all of their symptoms disappeared and they were left feeling healthier and more energetic than ever before. I'm not quite sure what to think of this phenomenon. It seems plausible (stop eating chemical-ridden, processed foods and many of your ailments will go away), but I am also very skeptical: how can something like arthritis be "cured" just by not eating any animal products? I decided to investigate.
I decided to look at HOW exactly veganism helps with ailments. One study I found looked at how going raw vegan (eating no cooked foods) helped relieve pain in patients with rheumatoid arthritis. The study was randomized, although unfortunately it doesn't state its size or length; some patients were put on a raw vegan diet high in lactobacilli and chlorophyll. The patients on the raw vegan diet noticed relief of their arthritis symptoms during the trial. Some patients felt side effects like nausea and vomiting during the diet and decided to go back to eating cooked foods and meat. These patients felt that their symptoms flared back up upon returning to their non-vegan diet. Source
The various other studies I read are basically the same as this one: a randomized group with certain ailments is studied, part of the group is given a vegan regimen and the rest aren't, and those on the vegan regimen start to feel relief from their ailments; some, such as those with acne, are completely rid of it after a few months. The mechanisms are still unclear to scientists. The best answers they have are that cooking foods removes certain nutrients, so we aren't using those foods' full health capacity, and that fruits and vegetables have little cholesterol (a main contributor to heart disease) and are rich in fiber and antioxidants, which help fight many diseases and promote overall wellness.
With all of the genuine, unpaid testimonials from people on social networks saying they're more energized, have clearer skin, lost weight, and so on, I would say that if you have one of these ailments it absolutely does not hurt to try going vegan. You have nothing to lose!
Last blogging period I wrote a blog on the rise in ADD/ADHD diagnoses and wondered whether the cause was rooted in our genetics or was more of a social matter. The answer was probably a little of both. Then I stumbled upon an article (thanks to another SC200 blogger) stating that the "father of ADHD" said, "ADHD is a prime example of a fictitious disease." First off, I find it very interesting that these words were said only a few years ago but got no attention. Why is that? My first thought was that the ADD/ADHD medicine market is booming (1 in 10 male ten-year-olds takes ADHD medication daily). There is a ton of money being made off these medicines, and to me it feels like it's very easy for a doctor to convince a patient that they have ADD/ADHD. So my question is: is ADHD even a real disease?
In an article I found, studies suggest that certain brain abnormalities tied to a variation of a dopamine receptor gene might cause a "behavioral condition". Researchers studied the brains of 105 kids diagnosed with ADHD and 103 non-ADHD children. They found that a common variable in those diagnosed with ADHD was a variant of a specific dopamine receptor gene that leads to "thinner tissue in areas of the brain that control attention". "'This is a very important study as it adds increasing evidence that ADHD is a heritable disease with genetically determined neurobiological underpinnings and adds further evidence that this is a valid mental disorder, often requiring neurobiological interventions [such as] psychopharmacological treatment,' said Dr. Jon A. Shaw, professor and director of child and adolescent psychiatry at the University of Miami Miller School of Medicine." This particular study states that these findings, along with various others the researchers have access to, help prove the validity of ADHD. Source
I think the study makes sense, but there isn't much information on the specifics. This could be a case of the Texas sharpshooter problem: the scientists knew what they were looking for and may have called this gene variant proof when in fact they weren't looking at the thousands of other variables in the brain. Also, we don't know whether decreased attention might cause this variation. I have many questions about the validity of this study.
To explain why there would be any bias, and how these drugs get approved if the disease is in fact fictitious, an investigation into American Psychiatric Association members and the pharmaceutical industry found that "Of the 170 APA panel members 95 (56%) had one or more financial associations with companies in the pharmaceutical industry. 100% of the members of the panels on 'Mood Disorders' and 'Schizophrenia and Other Psychotic Disorders' had financial ties to drug companies. The connections are especially strong in those diagnostic areas where drugs are the first line of treatment for mental disorders." Source
What this means is that while psychiatrists should be evaluating patients using their extensive knowledge of their practice, many members of the APA have big financial stakes in pharmaceutical companies, meaning it's more financially beneficial for them to tell their patients they need ADHD medicine.
I’m not quite sure if I believe in ADHD, but then again that could mean I was a little biased in writing this blog post. I did take an article called “Brain Studies Show ADHD Is A Real Disease” and sort of put it down. But, to be fair ADHD is so blown out of proportion these days it’s hard for me to see what someone who really has trouble focusing looks like. I think this is a huge issue in America that definitely needs some concrete answers.
I wrote my last post on yellowing teeth, and at the beginning of college, as I noticed my teeth weren't as white as they used to be, I considered whitening strips. I've always been cautious and steered clear of whitening strips because my friends who have used them say their teeth get super sensitive and usually go back to being yellow after a couple of months. I've also heard that at-home whitening kits aren't safe for your teeth, and I wanted to know whether that's true and why.
Here are the basics: whitening strips are pieces of plastic coated with a treatment containing a form of peroxide, a whitening agent that oxidizes on contact with teeth; the resulting reactions break stain bonds, thus removing stains. source
The Crest website assures readers that its Whitestrips are tested for safety and are completely harmless. I also found that "According to the American Dental Association (ADA), both over-the-counter (OTC) and whitening products you buy from the dentist are mostly safe and effective. Some products are even eligible for the ADA Seal of Acceptance. However, the ADA recommends a dental consultation before self-treating in order to avoid exacerbating any existing problems with teeth and gums, or covering up tooth darkening that would help a dental professional in finding potential problems." source
However, these sources also address the irritation and pain that can come along with whitening. This pain depends on the user, as well as how well they follow the instructions. When the treatment is applied too heavily to the gums, "extreme sensitivity and soreness may result"; keeping the strips on longer than the recommended time can also strip the tooth's enamel and cause problems in your gums. Users who go through multiple packages run the risk of permanently damaging their teeth.
I’m actually really glad to know that the ADA has deemed these products safe, and I think I just might try whitening strips (following the recommended instructions) and see what happens!
The first things I notice on a person are their teeth: the shape, the size, how straight they are, and most importantly how white they are. I have always prided myself on maintaining my straight, bright white teeth; however, about two months into college I noticed that my teeth weren't as white as they used to be. In fact, they were nearing a shade of yellow. I couldn't understand why this was happening: I drink the same amount of coffee as I used to (possibly even less, since I no longer have a coffee machine at my disposal), I'm brushing regularly, I don't smoke... It wasn't adding up. So I decided to look into what could've caused this sudden yellowing of my teeth.
After looking through a few different websites, I found that the following can cause tooth yellowing: certain antibiotics, tea and coffee, acidic fruits, smoking, teeth grinding, and excess fluoride.
I looked at this list, and not much of it pertained to me: I don't take any antibiotics, I do drink tea and coffee (but no more than I did before my teeth started yellowing), I've actually eaten fewer acidic fruits because I don't have access to them, I don't smoke, and I don't grind my teeth. I wasn't sure about fluoride; I know it's in our toothpaste to strengthen our teeth, but I always thought it was a good thing.

I looked into fluoride, and it turns out that tap water in America is fluoridated in an effort to reduce tooth decay. However, there is evidence suggesting it may be doing more harm than good in our water, as fluoride is toxic in large amounts. The FDA now mandates that all toothpastes and tooth products containing fluoride carry a warning against swallowing too much. Excessive fluoride can cause a discoloration of teeth known as dental fluorosis, and has also been linked to arthritis, bone frailty, glucose intolerance, and certain cancers. Although the exact amounts of fluoride that cause these illnesses have yet to be defined, it seems as though the fluoride in our water is unnecessary. I'm not really sure why there is still fluoride in our water, as studies show that countries with non-fluoridated water have a decreasing tooth decay trend almost identical to those that do fluoridate their water. It seems as though Americans, and those living in countries with fluoridated water, may actually be getting too much fluoride. source
It might seem like I’m going off on a tangent about fluoride that has nothing to do with my yellowing teeth, but I’m getting to the point! Fluoride was the only variable I could think to change. I realized that since coming to college I’ve been drinking water from the sinks and water fountains, since it’s easier and I don’t have a filtration system. To see if this was making a difference, I stopped drinking any water that didn’t come from a store-bought water bottle for a little over a month. I ate the same foods, drank the same amount of tea and coffee, and brushed my teeth regularly as usual, and in the end I noticed that my white smile was starting to come back. Although this could be due to third variables out of my control, or maybe just a freak chance, my experiment and my extra research on fluoride were enough to get me to stop drinking the water from the school’s pipes, and I hope to have my white smile back soon.
In my last article I asked the question “does going outside into the cold with wet hair cause sickness?” and the results I found were lacking. However, during my research another question came up: why does it seem as though colder temperatures cause sickness?
Most people consider winter “cold and flu” season and associate cold weather with getting sick. It’s always been accepted that more people get sick in the colder months, but why? When you really think about it, what’s different in the winter? The only reasonable explanation I can think of is that a decreased body temperature possibly weakens our immune system.
I found a study where 180 volunteers were randomly split into two groups, a control group and a variable group. The variable group was to put their feet in cold water a few times a day to lower their body temperature, while the control group did nothing; both were then to go about their daily lives. The results showed that after about a week, twice as many people in the chilled-feet group reported catching a cold compared to the control group. Source
I think this is an interesting study because it deals with lowered body temperature but doesn’t actually expose the participants to a cold virus; rather, it lets the world take its natural course to see if they’d get sick. However, 180 people is a very small sample, and this could just be coincidence.
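To see why a sample of 180 could still come down to coincidence, here is a rough sketch of a two-proportion z-test. The counts are hypothetical, purely for illustration, since the write-up above only says “twice as many” colds in the chilled group, not exact numbers:

```python
import math

# Hypothetical counts for illustration only -- the study summary above
# reports "twice as many" colds in the chilled-feet group, not exact numbers.
chilled_colds, chilled_n = 12, 90   # assumed
control_colds, control_n = 6, 90    # assumed

p1 = chilled_colds / chilled_n
p2 = control_colds / control_n
p_pool = (chilled_colds + control_colds) / (chilled_n + control_n)

# Two-proportion z-test: z = (p1 - p2) / sqrt(p*(1-p)*(1/n1 + 1/n2))
se = math.sqrt(p_pool * (1 - p_pool) * (1 / chilled_n + 1 / control_n))
z = (p1 - p2) / se
print(round(z, 2))  # well below 1.96, the usual cutoff for significance
```

With these assumed counts the z-score falls short of the conventional 1.96 threshold, so a doubling of colds in a group this small could plausibly be chance.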
Another theory I found states that “when your body gets chilled the blood vessels in the nose and throat constrict. These same vessels deliver infection-fighting white blood cells, so if fewer white blood cells reach the nose and throat your defenses against a cold virus are lowered for a short time. When your hair dries off or you go indoors your body warms up again, your blood vessels dilate, and the white blood cells continue to fight the virus. But by then it could be too late and the virus might have had enough time to replicate and trigger the symptoms.” Source
So what this means is that if you already have the virus in your system, but no symptoms because your body is working to fight it off, and suddenly your body temperature drops, then not enough white blood cells reach your nose and throat to fight the infection and you’re more likely to become sick. If this theory holds up under further testing, I’ll be pretty happy with my hypothesis, but there is still little evidence backing it.
I’m surprised that my research didn’t come up with any concrete answers or larger-scale experiments. I feel as though this is something that would help everyone who deals with the cold, and could save a lot of hardship and money.
My parents are from Russia and have some crazy superstitions because of it. The most annoying thing they always nag me about is not going outside with wet hair when it’s cold, because they say that I’ll get sick. I’m not sure if this is just a superstition or if it actually holds some truth. Part of me thinks it makes sense: maybe wet hair makes your head colder, and I’ve always been told that most body heat is lost through your head (hence why we wear hats), so if your body is very cold, maybe that lowers the strength of your immune system? The other part of me thinks that our intuition is lousy and that this is just another weird superstition, like the idea that whistling indoors or opening umbrellas inside brings bad luck.
I first decided to see how this theory even began, and found that back in the 1800s Louis Pasteur noticed that chickens were immune to anthrax and believed it was due to their high body temperature, so he decided to experiment. He found that when he put the chickens’ feet into cold water and exposed them to the disease, they contracted anthrax and died. He then repeated the experiment, but warmed the chickens back up after exposing them to anthrax, and they survived. This is most likely where the theory that cold temperatures cause colds came from. (More on that in my next article)
In this experiment, you can rule out reverse causation because clearly anthrax didn’t cause the decrease in body temperature, as that was manipulated directly. Pasteur even tested what happens when you bring the temperature back up and found that the chickens lived. However, bigger trials are needed (although it has been found that cold monkeys are more susceptible to polio, and cold mice are more susceptible to pneumonia), more evidence is needed on what exactly causes this effect, and animals are not humans.
I found another experiment in Salisbury, England where thousands of volunteers were given a drop of infected mucus into their noses to see what happens. There was a variable group who were “assigned to bathe and then wander around cold corridors in wet socks and bathing suits ‘for half an hour or as long as they could bear it’”. Source
The control group did not have to subject themselves to the cold. The results showed that the variable group “showed a drop of several degrees in body temperature and felt rather chilly and unhappy for a time, but were no more likely to catch cold than their warmer colleagues”. Source
This seems to be a well-done study; however, it does not answer the question of whether going outside with wet hair leads to higher susceptibility to colds. While hospitals are home to millions of bacteria, they are also heavily sterilized, unlike the outside world of dirty buildings and public transportation. Maybe having a decreased temperature while surrounded by all of the germs outside can cause a higher risk of catching something.
I’m still not quite convinced, as there aren’t many experiments that deal exclusively with wet hair outdoors and the chances of getting a cold, but the information I’ve found shows that scientists are pretty stumped too. There is also the question of why lower temperatures seem to cause higher rates of sickness, which I’ll tackle in my next article.
I’m hoping for more studies to be done on this in the future, as it may help either protect us from sickness, or just save us a little time.
While there is still no known cure, HIV/AIDS is an extremely serious epidemic affecting lives all around the world. Fortunately, with improved methods of medical treatment, most patients diagnosed with HIV can prevent the disease from developing into AIDS, ultimately saving their lives. However, these treatments are only readily available to those in developed countries, particularly those who can afford the medical expenses. Places where HIV and AIDS are most serious, such as parts of Africa, have little to no access to these treatments, resulting in a high death rate.
However, while there is no cure yet for the disease, there is a small chance that someone exposed to the virus could be completely immune, thanks to a beneficial mutation. C-C chemokine receptor type 5 (CCR5) is a receptor molecule located in the membranes of white blood cells (WBCs) and nerve cells. CCR5 delta 32, the beneficial mutation, disables the normal functioning of CCR5: since HIV uses CCR5 as an entry point into white blood cells, the mutation can block the entry of HIV or slow down the disease’s progression.
Like most mutations, this “HIV immunity” is a rare find in human beings. However, further studying the CCR5 molecule as well as the CCR5 mutation could offer great insight in the medical world, and perhaps even one day lead to the discovery of a cure.
“Beneficial Mutation” by Buzzle Staff http://www.buzzle.com/articles/beneficial-mutation.html
As a girl who dyes my hair every two or three months, I’ve always known that the dye is slightly damaging to my hair but I never thought it was too harmful. Yet recently I’ve heard that hair dye could cause cancer, which would seem plausible considering the harsh chemicals in the product, but is it true?
A question like this is very broad, but the American Cancer Society explains the possible link between cancer and hair dye. The main concern is permanent dye, because it causes lasting chemical changes in the hair shaft. It is also the most popular of the hair dyes, since it lasts the longest. You are exposed to hair dyes if you dye your hair, but the people with the most exposure are hairdressers. Some studies have fed lab animals large amounts of the dyes over a long period of time. These studies found that the chemicals in the hair dyes, specifically certain aromatic amines, caused cancer in the animals. Yet they did not find a link between the dye’s application to skin and cancer. In people, multiple observational studies have focused on the risk of bladder cancer, breast cancer, and leukemia. The majority of studies that focused on hairdressers or other people exposed to dyes at work reported a fairly consistent higher risk of bladder cancer. Yet they found no increased risk for those who are only exposed when they dye their own hair.
Although the studies so far have not found a very strong relationship between hair dye and cancer, more research needs to be done to determine if hair dye might increase your risk for cancer after long term use. As of right now though there does not seem to be enough evidence to alter your hair dye usage.
I used to think depression was something that people could easily snap themselves out of. By finding activities to do, making more friends and just being a more social person in general, I thought it’d be nothing too hard to get over. However, after taking an interest in things like loneliness and anxiety disorders and doing some research, I’ve found that depression is not just a feeling but a disorder of the brain. Depressed people have problems bigger than not being happy with their lives.
How does the mood disorder manage this? Well, when someone is depressed, which can be triggered by plenty of factors such as mental problems, chemical dysregulation in the brain and stressful life events, they suffer from poor chemical signaling in the brain. These reactions influence perceptions and experiences: the chemicals, working in the nerve cells, make it possible for you to understand and take in experiences, but in depression the balance is thrown off. The chemicals involved in these imbalances, such as serotonin, norepinephrine and dopamine, are called monoamines.
A study done by Dr. Jeffrey Meyer at the Canadian-based Centre for Addiction and Mental Health suggests there is a certain chemical in the brain that causes the imbalance of the others. He found that something called monoamine oxidase A was capable of affecting the chemicals mentioned previously, and in people with depression monoamine oxidase A was up to 34% higher. In short, depression actively affects the brain’s ability to regulate feelings and perceptions.
These imbalances can result in a number of negative feelings that could be detrimental. Depressed people experience things like fatigue or loss of energy frequently, feelings of worthlessness often and diminished interest in experiences they once found pleasure in. I found a video online that did a great job of explaining it in a simple, understandable way!
Photo courtesy of www.telegraph.co.uk
A study tried to confirm that the link between melanoma and indoor tanning is strong after all. The scientists found that among 1,167 cases and 1,101 controls, 62.9% of the cases and 51.1% of the controls had tanned indoors, and that the risk of melanoma increased with years of use. It also claims that no matter what your age, your risk of melanoma is higher if you tan indoors.
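A case-control study like this is usually summarized with an odds ratio. Here is a minimal sketch that rebuilds the 2×2 table from only the proportions reported above (the exposed counts are rounded estimates, not figures from the paper itself):

```python
# Rebuild the 2x2 table from the reported proportions: 62.9% of 1,167
# melanoma cases and 51.1% of 1,101 controls had tanned indoors.
cases_exposed = round(0.629 * 1167)      # estimated count, not from the paper
cases_unexposed = 1167 - cases_exposed
controls_exposed = round(0.511 * 1101)   # estimated count, not from the paper
controls_unexposed = 1101 - controls_exposed

# Odds ratio = (odds of exposure among cases) / (odds among controls)
odds_ratio = (cases_exposed / cases_unexposed) / (controls_exposed / controls_unexposed)
print(round(odds_ratio, 2))
```

An odds ratio above 1 here means the melanoma cases had higher odds of having tanned indoors than the controls, which is the direction of the association the study claims.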
This study has a good sample size, but the information was gathered through surveys and phone calls. Not only does this mean there could be bias in the answers given, it’s also not a very reliable way to collect information. There is always the possibility of chance, and the researchers didn’t get a full background report on everyone involved, which makes it hard to tell whether a participant had a family history of higher melanoma risk. The study is observational rather than experimental, which isn’t as reliable in this case, and correlation doesn’t always mean causation.
Even though this particular study is weak, there have been other studies conducted on the same topic as well.
This study also looks at the link between melanoma and indoor tanning. I believe that with all of the studies conducted, there is enough information to believe that there is a link between the two and it is harmful to people that tan indoors.
“Result Filters.” National Center for Biotechnology Information. U.S. National Library of Medicine, n.d. Web. 05 Dec. 2014. <http://www.ncbi.nlm.nih.gov/pubmed/20507845>.
“Result Filters.” National Center for Biotechnology Information. U.S. National Library of Medicine, n.d. Web. 05 Dec. 2014. <http://www.ncbi.nlm.nih.gov/pubmed/20669232>.
I’ve lived my entire life in Pennsylvania, which means I’ve been dealing with cold winters since I was a baby. All I know is that the cold comes in November and stays until February or March. I love living in the Northeast because if I go on vacation or if it’s a nice day, I appreciate the nice weather so much more. People who live in summer 365 days a year don’t feel the same appreciation, but they are definitely not cool with the cold. Many studies have aimed to figure out whether weather can affect a person’s mood and to what extent. After reading John M. Grohol, Psy.D.’s article on weather’s effect on mood, I believe the evidence is compelling that weather is a direct cause of mood shifts. It may seem very obvious and typical that nice weather will make the day better: it means more time spent outside, more sunlight, less clothing, etc. But it has a deeper effect than making a person happy that it’s sunny out or sad that it’s cold.
In one study by Hardt and Gerbershagen (what a name), they questioned 3,000 chronic pain patients about their depression. Their questionnaire results showed no correlation between depression and the time of year, nor the number of daily hours of sunshine. This experiment could have been improved if the patients had reported how much sunlight they actually got, and if there had been a control group and an experimental group to compare.
“Weather Can Change Your Mood.” Psych Central.com. N.p., n.d. Web. 03 Dec. 2014.