Author Archives: Victoria Chelsea Bushman

Why Do Some People Hate Christmas?

For those of us who celebrate Christmas, it may just be the happiest time of the year. The carefully strung lights, decorated Christmas trees, time spent with family, and even presents as an added plus! Who could hate such a fun holiday? Believe it or not, quite a few people do, each with their own reasons. There are even widely known movies about them, the best examples being How The Grinch Stole Christmas and A Christmas Carol. This leads me to wonder whether there is any scientific explanation for why some people can be such a Scrooge during the holiday season.

Although the Grinch is a fictitious, non-human character, neuroscience can explain why the Grinch stole Christmas. Dr. John Cacioppo, a pioneer in the field of social neuroscience, says it is because of his social isolation. Because the Grinch is stowed away in his cave far from other people, he has become lonely and bitter. Christmas brings a constant display of happiness that the Grinch cannot relate to, which helps explain his actions. This comes from a study by Cacioppo in which two groups of human subjects, one lonely and one not, viewed a series of varied images while in an fMRI machine. Some of the images carried positive associations (people having fun) and others negative ones (human conflict). The area of the brain that recognizes rewards showed the greatest response in the non-lonely subjects compared with those who were lonely. It was also discovered that the lonely subjects responded much more strongly to unpleasant images of people than to unpleasant images of objects, suggesting that lonely people are especially drawn to human conflict; non-lonely subjects showed no difference between the two. Thus, people who are isolated appear to have a reduced response to things that make most people happy, and a greater reaction to human conflict.

That study leads me to believe that How The Grinch Stole Christmas may have an underlying non-fictitious plot to it. Maybe the people who hate Christmas are in fact just very lonely, and hate to see all the merriness around them. I decided to look further into the matter, and stumbled upon a post from a real-life Scrooge/Grinch. The blog post emphasizes the religious weight placed on this holiday, and how a non-religious person can find it offensive to be pushed to partake in the associated customs. Not only that, but what makes it worse is that Christmas always comes way too early: right after Halloween (and sometimes even before), decorations and advertisements are out for the holiday, even though Christmas may be two months away. Other notable points were how the "Christ" in Christmas seems to have been lost, and how the whole gift-giving side tends to go to people's heads. Although I myself am a Christmas lover, I do believe there are valid points in the post.

So yes, maybe some people hate Christmas simply because it is yet another overhyped holiday for people to get worked up about. In my research, though, I found an interesting article from Psychology Today, which says that according to the National Institutes of Health, Christmas has the highest incidence of depression of the entire year. In fact, hospitals and police officers reported high incidences of suicide and suicide attempts, while psychologists and psychiatrists reported a notable increase in complaints of depression from their patients. One North American survey even projected that 45% of Americans dread the holiday. So why are so many people depressed during and around Christmas? Some get depressed out of annoyance with the commercialism surrounding what is supposed to be a religious holiday, while others get depressed because they fixate on finding the perfect gifts and social activities. People also get depressed because the holiday seems to be a big trigger for excessive self-reflection, as they compare their lives to other people's (based on presents and so on). Anxiety also rises sharply because of the perceived need to spend a lot of money on gifts and activities for the holidays, which only adds to worries about debt and about what people do not have.

All in all, I think there are a variety of reasons why some people hate Christmas, and it cannot be broadly pinpointed without going through each person individually (because there are way too many third-variable causes to count). Although I did not find any research on it, I believe experiences from past Christmases can definitely affect the way people feel about the present holiday. If they associate a very bad memory or tough time with the holiday, it would make sense for them to feel a kind of hatred toward it. So hate it or not, Happy Holidays!

Where Does Our Music Taste Come From?

Music is universal, and yet universally we all have different tastes. Some people prefer indie music while others swear by rap, and the list goes on. Of course this also means we can dislike other genres, which leads to many arguments over which genre is better and who has the better music taste. All of which leads me to wonder: where do we get our music taste? And is it fair to judge someone based on their taste in music?

According to an article featured on Business Insider, the music you like says a lot about how you process information. The researchers based this particular study on the idea that there are two ways people respond to their surroundings: empathizing (being outgoing and interactive with others) and systemizing (being more of an introvert who needs a pre-set notion of how to act). The participants, whose ages ranged from the twenties up to 61, were first asked to take a psychological survey to determine whether they were empathizers, systemizers, or a balance of both. They were then asked to evaluate 50 songs from 26 genres of music, to rule out any predispositions toward a single style. The results showed that empathic participants tended to like mellow, humble, or contemporary music such as folk, country, and even popular techno songs that could express negative emotions, though they seemed to strongly dislike genres such as heavy metal or punk. The systemizing participants, on the other hand, enjoyed music reflecting high energy and positive emotion, along with complex songs such as classical music. A chart accompanying the study showed what kind of music each group preferred: Type E being the empathizers, Type S the systemizers, and Type B the balance of both.

To answer my initially proposed question, there in fact is a "magic age" at which we develop our taste in music. Just as every human has a critical age for language development, we also have a critical period for musical taste, starting at around 14 and peaking at the age of 24. According to Daniel J. Levitin, a professor of psychology and the director of the Laboratory for Music Perception, Cognition, and Expertise at McGill University, 14 is considered a critical age because many of us are experiencing pubertal growth hormones then; everything feels more important just as we are reaching the point in our cognitive development when we establish our tastes. At 24, when an individual has reached the peak of musical taste acquisition, they are at their most open-minded and accepting, meeting many different people with differing backgrounds. The music heard at this age tends to be the music that sticks with the individual, and after their twenties people tend to no longer branch out of their established taste. Part of why this happens is that starting at 24 or 25, an individual's ear begins to lose sensitivity to higher pitches and can no longer detect small changes in pitch. This makes new music less vibrant and pleasurable than the music heard during the peak years, and thus less preferable. But one of the biggest reasons is that after 24, an individual's personality starts to firm up and they become sure of their identity (less easily influenced or challenged by others).

Not to mention, researchers Dr. Samuel Gosling and Peter Rentfrow at the University of Texas at Austin have found that music preference can say a lot about one's personality traits. In fact they found that music choice falls into four broad categories: Reflective and Complex, Intense and Rebellious, Upbeat and Conventional, and Energetic and Rhythmic. Collecting data about music preferences from undergraduate students and 500 listeners across the country, they were able to infer personality traits using a new scale called the Short Test of Music Preferences (STOMP). The study suggests that people favoring music in the Reflective and Complex category (classical, jazz, folk, and the blues) are inventive, have active imaginations, and tend to be politically liberal. The Intense and Rebellious category (heavy metal, rock, alternative music) hosts individuals who are risk-takers and quite intelligent. Upbeat and Conventional listeners (pop, country, religious music) are outgoing and cheerful, see themselves as physically attractive, and hold more conservative views. Lastly, Energetic and Rhythmic listeners (funk, hip-hop, soul, electronica) are talkative, have high energy and confidence, and are generally forgiving people.

Thus it seems plausible to judge someone on their music taste, as you may be able to learn a little about the individual before you even fully interact. As for how we develop our music taste, it is reasonable to accept that we do have a "critical age" at which our taste forms. I do wonder how set in stone that is, though, because the supporting studies do not really rule out many third variables (such as barely listening to music as a teen, or being heavily influenced by those around you, which could create a false picture of development). Also, reverse causation has not been entirely ruled out: it is possible that 14 is simply the age when we start being exposed to more music, in which case the growth hormones and cognitive development at that time may not matter.

 

New Year’s Resolutions and Why They Fail

Year after year most of us make one or more resolutions to bring in the New Year, ranging from vowing to work out more to vowing to stop procrastinating; the options are endless. It's no secret that New Year's resolutions are known for getting ditched not long after they are made; in fact, some of them never even get started. So what is the point in making them when we are all aware we will probably never follow through? Is there a scientific reason why we have such a hard time with our New Year's resolutions?

According to a blog on Bufferapp, 50% of Americans make a resolution for New Year's, and according to researcher Richard Wiseman, 88% of those resolutions crash and burn. There is actually a science to why we humans can't stick with a resolution: we do not have enough willpower to do it. The brain cells that handle willpower are located in the prefrontal cortex (the area right behind your forehead), which is responsible for staying focused, handling short-term memory, and solving abstract tasks, among other things. This was demonstrated in a Stanford experiment led by Baba Shiv, in which several dozen undergraduates were divided into two groups. One group was given a two-digit number to remember while the other was given a seven-digit number to memorize. The groups were then told to walk down a hallway, where they were presented with two snack options: a slice of chocolate cake or a bowl of fruit salad. The students who had to remember the seven-digit number were nearly twice as likely to pick the chocolate cake as the students who were given a two-digit number. According to Professor Shiv, this is because the extra numbers took up valuable space in the brain (a "cognitive load"), making it harder to resist indulging in such a decadent dessert (willpower is weaker when the prefrontal cortex is overtaxed). Thus New Year's resolutions consistently fall through because our willpower is inherently limited, and according to Shiv our brains were "not built for success".

Willpower is not the only culprit behind our failure to keep resolutions. A lot of it also has to do with us humans not truly wanting to change our habits, and not really wanting to make the leap to reinvent ourselves. Timothy Pychyl, a psychology professor at Carleton University, asserted that although people make these resolutions as a means of motivation, in reality most people are not quite ready to change their bad habits, especially particularly bad ones (such as smoking and drinking). Other times we set goals that are too grand to follow through on, a phenomenon known as "false hope syndrome". This phenomenon is defined by setting unrealistic goals (like losing x pounds in a month), and ultimately this kind of thing can do more harm than good. Roy Baumeister suggests that if you really want to better yourself, rather than making a yearly resolution, you should make one every month. He suggests this because you only have one supply of willpower, and in making a list of different resolutions you leave them all competing with one another; in trying to follow one of them you reduce your willpower capacity for all the others. In fact, statistics show that those who make only one New Year's resolution tend to have a higher success rate.

All in all, I think the studies provide very plausible evidence for why we have trouble keeping up with our resolutions. Of course, there may be many other third variables that have not been factored in, such as age group and lifestyle (do these people have kids, a high-stress job, etc.), that could explain why many are not able to follow through. Perhaps studies should be conducted on different age groups to see if there is any difference in following through. So before you make a list of New Year's resolutions, consider setting such goals every month rather than all at once, so you have a higher chance of success!

 

The Truth About Holiday Weight Gain

'Tis the season for family, presents, and weight gain! Holiday weight gain is feared by many, as it is said to set in around Thanksgiving and last until around New Year's, when all the holiday festivities have fully simmered down. While I hear people talk about this quite often, especially right now, I myself have never experienced any noticeable weight gain during the holiday season. With that being said, is it really true that many people gain weight around the holidays, or is it just another myth?

According to The New York Times, although the average surveyed person says they have gained about 5 pounds over the holidays (and media stories exaggerate the gain to 5-7 pounds), many studies show the average is actually just one pound or less. Many people tend to overestimate their weight gain due to post-holiday guilt: they feel bad about how much they indulged and therefore assume they packed on the pounds. So while the average person is not at high risk for holiday weight gain, people who are already overweight are. Many studies suggest that those who start with a higher BMI tend to gain more weight in general, and therefore during the holidays may in fact gain 5-10 pounds or even more. This was found in the same study (of about 2,000 participants) that debunked widespread holiday weight gain, which showed that 20% of obese subjects and 10% of overweight subjects gained more than 5 pounds (only 5% of non-overweight people in the study did the same).

Another study revealed that those who were formerly overweight struggled with weight gain over the holiday season. The study tracked 178 people who had formerly been very overweight but had lost a significant amount of weight (77 pounds or more), along with 101 people who had never been overweight. Despite trying their hardest not to gain weight and making careful plans, 39% of the formerly overweight people gained at least 2.2 pounds over the holidays, compared with 17% of the never-overweight group.

But what about us college kids? We are used to eating dining hall food and maybe even eating less than when we are at home. That being said, do college kids gain a significant amount of weight over the holidays? A study conducted by researchers at the University of Oklahoma weighed 94 students the day before Thanksgiving, and then weighed them again about two weeks later. Students of normal weight gained about half a pound, while students who were overweight (body mass index of 25 or more) gained about two pounds. With that conclusion, I think it is safe to assume you can expect similar results for Christmas and New Year's. If you find yourself still worried about the possibility of gaining a few pounds over the holidays, Health.com lists 15 tips to help avoid it. Some of the most notable ones suggest setting up a morning workout, being very picky about what you eat (especially on the actual holiday), building up willpower, and chewing slowly to minimize how much you eat.

It's safe to say that the dreaded holiday weight gain is largely a myth, as countless studies over the years have come up with the same result: the weight gain is insignificant (although some studies show that the one pound gained over the holidays is hard to lose, and some people never do lose it; this mostly pertains to adults). Of course, for some this is not a myth: if the third variable of already being overweight is involved, significant weight gain can and most likely will occur. Still, most of us can face the holidays without fear of packing on the pounds. And with that, happy eating!

 

Christmas Trees: Real or Fake?

For years the dilemma of choosing between a real Christmas tree and a fake one has loomed over the holiday season. In fact, many say it was Theodore Roosevelt who started the debate when he refused to have a Christmas tree in the White House due to "environmental concerns". While for some it is an obvious choice (due to allergic reactions to some of the most common types of trees), others struggle to decide which would be better. There are many factors to weigh, because it is true that getting and putting up a real Christmas tree is much more of a hassle; but I wonder if there is any evidence that one option is definitively better than the other.

According to Rodale's OrganicLife, 60% of Americans set up fake Christmas trees rather than real ones. Yet in terms of the most eco-friendly option, real Christmas trees win despite being the less popular choice. Fake trees are made from polyvinyl chloride (PVC), the same plastic used for piping, and PVC production releases the toxic chemical dioxin. PVC also contains hormone-disrupting plastic softeners called phthalates, and in the event of a fire the tree will burn and emit dioxins. It has also been found that many fake trees are contaminated with lead, some even coming with a warning label to wash your hands after handling them to avoid ingesting the metal. Not to mention that fake Christmas trees are not recyclable, so if you decide to toss yours and get a new one, the discarded tree will sit in a landfill indefinitely. In fact, for your fake tree to be considered more environmentally friendly than a real evergreen, you would have to use it for 20 years!

A different article, from TDN, takes both sides of the debate, saying it all really depends. Like the previous article, it mentions the Montreal study done in 2009, which found that an artificial tree (which typically has a life span of 6 years) has three times more impact on climate change and resource depletion than a natural tree. Artificial trees also create an even bigger carbon footprint once you factor in transportation, because most fake trees are shipped overseas from China; a family driving to a local tree farm leaves a much smaller footprint. On the other side of the argument, an artificial tree can last 10-20 years if you buy a good-quality tree and store it in a safe place (in which case the travel-related footprint shrinks, since getting a real tree means a trip to the farm every year). Then again, families getting real trees for the holiday season lets tree farmers employ local people and brings business to local farms, which in turn helps a town's economy. The article concludes that it all really comes down to preference, because it really is easier to get an artificial tree: they require no water, are set up in minutes, and many come pre-lit.
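As a back-of-the-envelope illustration of where a break-even figure like the 20 years mentioned above can come from, here is a minimal sketch; the footprint numbers are hypothetical placeholders, not values from either study, and only the "3x the impact over a 6-year life span" ratio comes from the Montreal result as summarized above.

```python
# Rough break-even arithmetic for artificial vs. real Christmas trees.
# The per-year footprint figure is a hypothetical placeholder.

real_tree_per_year = 3.0                             # footprint units per real tree, made up
artificial_tree_total = 3 * real_tree_per_year * 6   # ~3x a real tree's impact over a 6-year life

# Years of skipped real-tree purchases needed for the artificial tree to break even
break_even_years = artificial_tree_total / real_tree_per_year
print(f"Break-even after roughly {break_even_years:.0f} years of reuse")  # ~18 years
```

Under these assumptions the artificial tree only comes out ahead after nearly two decades of reuse, which is roughly the figure the OrganicLife article cites.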

The Christmas Tree Association points out that a recent Life Cycle Analysis concluded that neither tree has a significant negative impact on the environment, so it really comes down to which tree best fits your personal lifestyle. The analysis, conducted by PE International, does not seem very informative, though, as the write-up is bland and does not mention any of the results found. The study was also sponsored by the Christmas Tree Association (the same group that put out the article saying the impact from artificial trees is not environmentally significant), which leads me to wonder how trustworthy it all is, considering it was referred to as a "third party study". I feel a file drawer problem could be at work here, although there are not many studies like the PE International one to support that hypothesis.

All in all, I've concluded it really is about what is best for you. If you are looking for a hassle-free setup, an artificial tree is the way to go. But if you are like my family, who value the Christmas tree as a tradition, a real one is the obvious route. While there does seem to be a good amount of evidence that artificial trees contain toxins, it's highly doubtful they will bring much harm to a household.

 

 

Is Chewing Gum Bad for You?

Whether you are chewing gum out of boredom or to mask the fact that you forgot to brush your teeth, gum is a convenience to many. It's easily portable, only a few calories, and many popular gum brands are sugar-free. It all sounds great, but on the rare occasion I chew gum I wonder whether it could actually be bad for you (I especially wonder this when my jaw starts to hurt from all the chewing). After all, it is just a small stick of… well, what is it exactly? According to How Stuff Works, before WWII gum was made out of chicle, a latex sap that comes from the sapodilla tree, mixed with flavorings. In other words, the substance was just a form of rubber. Nowadays scientists have learned how to make artificial gum bases (essentially a synthetic rubber) that mimic chicle, and they add either artificial or natural sugars and sweeteners to make the gum taste good.


Knowing what gum is made of still raises a lot of questions, but the most pressing one remains: is it bad to chew gum, especially in excess? An article found on ABC News says there are in fact "gross" side effects caused by chewing gum. For instance, chewing gum before a meal can lead you to eat more junk food and skip the fruit, because the minty flavor of most gums makes fruit taste bitter; the article suggests drinking tea before dinner instead of chewing gum as a safer bet for limiting calorie intake. Chewing gum is also linked to gastrointestinal problems, specifically irritable bowel syndrome, as excess air can be swallowed and is a contributing factor to abdominal pain and bloating. The artificial sweeteners in most gums can even have a laxative effect on otherwise healthy people! Gum chewing can also trigger temporomandibular joint disorder (TMJ), which involves jaw pain in the chewing muscles and the joints that connect your lower jaw to your skull.

According to another article, from Thank Your Body, while chewing gum with sugar is not recommended, sugar-free gum is almost as bad (if not worse), because it is sweetened with artificial sweeteners. As an example, aspartame (an artificial sweetener used in most sugar-free gums) has been linked to brain tumors, birth defects, and cancer. There are also many other artificial ingredients in gum, which travel through the bloodstream faster and in higher concentrations because they are absorbed straight through the walls of the mouth; these ingredients therefore bypass the normal filtering of digestion. This source also mentions that gum can cause headaches. According to Dr. Ben Kim, eight different facial muscles are involved in chewing, so unnecessary gum chewing can cause chronic tightness in two of these muscles located close to your temples. Chewing can put pressure on the nerves that supply this area of the head, which can lead to chronic headaches.

A chart in the source shows that Americans now mostly choose sugar-free gum over gum made with sugar (sugarless being the better option for your teeth).

While there are downsides to chewing gum (as with anything), there are also some benefits. Mouth Healthy says clinical studies have shown that chewing sugarless gum for twenty minutes following a meal can help prevent tooth decay. Gum increases the flow of saliva, which washes away food and neutralizes acids produced by bacteria in the mouth; the extra saliva also carries disease-fighting substances throughout the mouth and can help strengthen tooth enamel. There is also evidence that chewing gum is good for the brain! Japanese research published in the journal Brain and Cognition explained that the improved performance is caused by the chewing itself, which increases arousal and leads to temporary improvements in blood flow to the brain. Volunteers' brains were scanned while they carried out tasks both with and without gum. The researchers found an increase in alertness, and the brain regions most active during chewing were those involved with movement and attention. So gum chewing can lead to temporary improvements in cognitive performance, although the article does note at the end that gum could interfere with short-term memory (which is supported here).

Seeing as there are both pros and cons to chewing gum, there does not seem to be enough compelling evidence to suggest it is in everyone's best interest to swear it off. Rather, it may be wise to make sure you are not chewing gum in excess, rather than eliminating it altogether. Also, because there has been a lot of research on gum chewing as a whole, meta-analyses are worth considering, which means each study and point cited in these articles could in fact be suffering from the file drawer problem (which is why I drew inferences from both sides of the argument). That being said, chew away on sugarless gum, but as with anything, keep it in moderation.

 

Are Cats Stress Relievers?

You may have heard of, or even come in contact with, a therapy dog. These pups are specifically trained to provide comfort for the people who need it, such as stressed-out students who need to relieve some of their stress. As much of a dog person as I am, I wonder whether cats can provide the same comfort and relieve stress the way dogs have been shown to do. And if studies show cats do relieve stress, could they provide therapy as well?

According to Examiner, cats do in fact reduce stress in humans. For one thing, cats are undemanding. Many of the most stressed people constantly feel like everyone and everything needs something from them, while cats are there for you without expecting anything in return. Secondly, cats can lessen the feeling of loneliness, much like a dog, because they are always there for you to play with, cuddle, and even vent to. Cats can also make you laugh, which is a universal stress reliever. As poised as they come off to be, cats leap at seemingly nothing, make funny noises, and love to play. That will surely bring out a laugh from anyone and make them forget about the stressful things in life, even if just for a little while. It is also said that rhythmic stroking (which is beneficial to the cat as well) relieves anxiety, because the soft feel of the cat's fur and the repetitive motion of petting can soothe seemingly even the worst stress. Lastly, cats always seem to know when something is wrong, and will behave as if they want to comfort you; such an act relieves stress and strengthens the bond.

Much of that sounds very familiar from dogs, so why are cats not trained as therapy animals? It appears many people believe cats are less sociable and more independent than dogs, and that may be the biggest reason. Cat Behavior Associates, though, is dispelling the myth that cats are unaffectionate and completely independent. Cats show their affection in different ways, and may not always be sitting on your lap to show it: by simply being inches away from you, rubbing up against your leg, or slowly blinking at you, they are showing how much they care. Cats also love to be petted, and while they may not enjoy belly rubs like dogs do, a chin scratch will suffice just the same. And while cats are more independent than dogs, in the sense that they can be left home for longer periods of time, they still rely on human care and affection; the misconception that they do not is what causes many cats to suffer both physically and emotionally, even though they may not show it as clearly as their canine counterparts. And even though cats have a reputation as solitary creatures that are not fond of socializing, this article squashes that notion, explaining it mostly comes from the fact that they hunt alone and are much more territorial than dogs.

With all of that information I am still left wondering why cats aren't used as widely for therapy as dogs. While I love both animals, some people are strictly dog people while others are strictly cat people. That being said, what happens to those who do not like dogs, or are allergic to them? Maybe one day we will see more therapy cats.

To put the above points to the test, if you are feeling stressed, take a gander at this funny cat compilation to brighten up your day!

Nail Polish and Your Health

If you're like me, naked nails *gasp* are not something you will ever sport; I am constantly painting mine. Today as I was doing so, the smell got to me, and before I knew it I was sneezing like crazy. It might just have been because I am recovering from a sinus infection (after all, correlation does not always equal causation), but it led me to wonder whether nail polish has any adverse effects, not only on the health of our nails but on the health of our bodies.

According to Tips on Healthy Living, nail polish is likely one of the most toxic cosmetic products out there. This is because the substances in the polish, such as formaldehyde, phthalates, acetone, toluene, and benzophenones, are poisonous. Phthalates (a solvent for the colors) are toxic to the nervous system, while acetone and toluene (which keep the color in liquid form) evaporate quickly and send noxious fumes into the air, putting your respiratory system at risk. Benzophenones have also been tentatively linked to cancer. While the article stresses that applying nail polish in adequately ventilated rooms should be fine, painting your nails several times a week is not advisable (it also states you should only use nail polish remover twice a month). One scientist in particular, Dr. Thu Quach of Stanford University and the Cancer Prevention Institute of California, takes this further and says exposure to the chemicals found in nail polish can lead to issues ranging from cancer to fertility problems, though that finding applies more to people working in nail salons, who are exposed to the chemicals countless times per week. Luckily, many nail polish companies have now taken these chemicals, a.k.a. the "toxic trio", out of their formulas (phthalates in particular). It is still important to check the ingredients, though, because the United States does not restrict what goes into beauty products.

While the fumes can be, or rather once were, considered toxic to our health, does the actual nail polish have any effect on the health of our nails? An article from Today about 10 things that are destroying your nails says that the only thing polish will do (if it is left on the nails for a few weeks) is dry them out. If you see white spots on your nails when removing your polish, that is a sign of drying. It is suggested that you remove the polish as soon as it starts to chip, to keep the nails from drying out.

A brand-new study from researchers at Duke University and the Environmental Working Group found evidence of a suspected endocrine-disrupting chemical (widely used in popular nail polishes) in the bodies of more than two dozen women who took part in a biomonitoring study. The study found that all women who had painted their nails in the previous 10-14 hours had a metabolite of triphenyl phosphate (TPHP) in their bodies, and their levels of diphenyl phosphate, or DPHP (which forms when the body metabolizes TPHP), had increased nearly sevenfold. According to EWG's cosmetics database, more than 1,500 nail products contain TPHP, including products from very popular brands such as Sally Hansen and OPI. In a number of lab studies, exposure to this chemical caused reproductive and developmental problems, and it may even contribute to weight gain and obesity.

While the chemicals in nail polish do seem as though they could be damaging to our health, I would not jump the gun yet. There do not seem to be many studies on nail polish and TPHP, so this, like anything, could be a fluke or a false positive. Unless you are working in a nail salon, I wouldn't worry just yet about nail polish damaging your health!

Is Microwaving Food in Plastic Actually Dangerous?

For years and years it has been said that microwaving your food in plastic containers can be harmful to one's health, causing cancer, reproductive problems, and other ailments. Hearing it time and time again with no real explanation as to why makes me wonder whether it is truth or myth. Is heating your food in plastic actually a danger to your health?

According to Harvard Health, while there is some truth to this, a lot of it is misinformation. The earliest warnings stated that microwaved plastic releases cancer-causing chemicals known as dioxins into food. The problem with this warning is that plastic does not contain dioxins; they are only created when the material is burned. So unless you are burning your food, no dioxins will be present. The article also stresses that there is no single substance called "plastic", because the term covers many materials made from an array of compounds. Substances such as bisphenol-A (added to make plastic clear and hard) and phthalates (added to make plastic soft and flexible) are two additives believed to be "endocrine disrupters", meaning these substances mimic human hormones (and not in a good way). It is therefore said that when food is wrapped in plastic or placed in a plastic container, BPA (the worse offender) and phthalates could potentially leach into the food. The FDA recognizes this, however, and tests each type of container before it can pass; if and when a container does, the microwave-safe label is placed on it (meaning any chemical migration into your food stays within levels considered safe). As for containers that do not carry the microwave-safe seal, the FDA has said they aren't necessarily unsafe; their safety just has not been determined.

The Wall Street Journal seems to be more against heating plastic, and emphasizes the importance of knowing when to toss a container in the trash. Dr. Halden, the director of the Center for Environmental Security at the Biodesign Institute at Arizona State University, says that the amount of chemicals leaching into your food depends not only on the type of plastic but also on its condition. If the container is old, cracked, and has been washed hundreds of times, it will often give off more toxins when heated. Therefore any deformities or signs of discoloration mean it's better off in the garbage.

It is still up for debate whether heating food in plastic in the microwave is a culprit for illnesses, such as cancer, later in life. A new series of studies from NYU Langone Medical Center, however, suggests that two increasingly used chemicals, di-isononyl phthalate (DINP) and di-isodecyl phthalate (DIDP), are linked to a rise in the risk of high blood pressure and diabetes in children and adolescents. Ironically, DINP and DIDP are replacements for another chemical, di-2-ethylhexyl phthalate (DEHP), which the same researchers previously found to have similar adverse effects. In their most recent study (on the replacements), the investigators reported a significant association between study subjects' DINP and DIDP levels and their blood pressure: for every tenfold increase in the amount of phthalates consumed, there was a 1.1 mm Hg increase in blood pressure. The researchers tested blood and urine samples from a diverse group of 356 children and adolescents to evaluate phthalate and glucose levels (and also measured blood pressure), taking diet, physical activity, race, gender, ethnicity, and income into account as factors independently associated with insulin resistance and hypertension. Although the detailed results of this study are not emphasized, the takeaway is that families should limit their exposure to phthalates.
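To make that dose-response figure concrete, here is a minimal sketch of the log-linear relationship it describes; the exposure values are made-up placeholders, and the only number taken from the study summary above is the 1.1 mm Hg-per-tenfold slope.

```python
import math

# Log-linear dose-response sketch: the NYU study summarized above reports
# roughly 1.1 mm Hg of additional blood pressure per tenfold increase in
# phthalate exposure. Exposure units are arbitrary as long as they match.
SLOPE_MMHG_PER_TENFOLD = 1.1

def predicted_bp_increase(baseline_exposure: float, new_exposure: float) -> float:
    """Predicted blood-pressure increase (mm Hg) when exposure rises from
    baseline_exposure to new_exposure (hypothetical illustration only)."""
    return SLOPE_MMHG_PER_TENFOLD * math.log10(new_exposure / baseline_exposure)

# Hypothetical example: a 100x jump in exposure (two tenfold steps) -> ~2.2 mm Hg
print(round(predicted_bp_increase(1.0, 100.0), 1))  # 2.2
```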

Overall, while this question is still under debate, I feel it is better to be safe than sorry. Instead of microwaving your food in plastic, it is probably better to just get a glass container, or use a glass plate to heat up your foods. That way you won’t have to worry that you could be consuming phthalates, which could be harmful to your body.

 

Are Some People More Susceptible to Cavities?

Growing up, I always seemed to have lucked out with my set of chompers. Not only did I never have to get braces, but 18 years have gone by and I have still never had a cavity. I chalk this up as peculiar, because I have quite the sweet tooth and am notorious for forgetting to brush my teeth (much to the dismay of my mother, who is a dental hygienist). Meanwhile, almost all of my friends have had a few or even multiple cavities, even the ones with great brushing and flossing habits! So I wonder: are some people simply prone to cavities, while others are not?


What is a cavity, exactly? KidsHealth defines a cavity as a hole in a tooth that can grow bigger and bigger over time as the tooth decays and breaks down. According to Joel Berg, 97% of the population will get a cavity (also known as dental caries) in their lifetime, and he offers some explanations for why some people are more cavity-prone than others. The first has to do with bacterial biofilms, better known as plaque (a biofilm forms when bacteria adhere to a surface in a watery environment and begin to secrete a slimy, glue-like substance that can stick to all kinds of material). Bacterial biofilms differ from person to person; certain strains (such as mutans streptococci) are the primary culprits, though carrying one of them does not guarantee a person will get a cavity. Because strains vary in their cariogenicity (cavity-causing potential), it is not just the amount of biofilm that matters in predicting who might get cavities, but also the specific strain a person is infected with.

A second explanation for why some people get more cavities than others is, of course, diet. If you expose yourself to sugar often, you have a higher risk of getting cavities, because caries-causing organisms prefer sugar as an energy source, and their metabolism of that sugar into lactic acid is what causes cavities. A third factor is salivary flow and composition: the more saliva in the mouth the better, because it naturally cleans the tooth surfaces that house caries-causing organisms and the acids they produce. A fourth, more obvious explanation has to do with one's oral hygiene, as brushing and flossing keep caries under control. Lastly, Berg explains that tooth morphology also plays a role: deep grooves on the surface of the teeth (molars in particular) make someone more likely to develop caries, because the grooves trap biofilms more easily and make them harder to remove. Royal Dental also explains that some people are more cavity-prone because of the pH level of their mouths: if it drops below 5.5, tooth enamel begins to erode (acid is very damaging to your teeth).

While there do not seem to be many trials on why some people are more cavity-prone, there are many on what causes cavities and whether anything actually prevents them. As mentioned before, acidity is one of the major causes of cavities and can be found in many things we eat and drink, soft drinks being a specific culprit. A recent study published in the Academy of General Dentistry's journal reports that drinking any sort of soft drink hurts your teeth, due to the citric or phosphoric acid in the drink. The study measured the acidity (pH) of 20 commercial sodas, including the diet version of each, immediately after the cans were opened. Researchers then weighed slices of enamel from freshly extracted teeth before and after immersing them in each of the sodas for 48 hours. They found that teeth immersed in Coke, Pepsi, RC Cola, Squirt, Surge, 7 Up, and Diet 7 Up lost more than 5% of their weight, while the other sodas caused smaller losses ranging from 1.6% to 5%. The results show that a soda's acidity is not the whole story; other important factors are the type of acid in the soda and its calcium content. The study's conclusion is that any soda can damage teeth and should be avoided, though one can ask how realistic the setup is. As Tracey Halliday, an American Beverage Association spokesperson, put it, "these findings cannot be applied to real life situations where people's eating and drinking behaviors are very different and there are many factors at work." I still find the research eye-opening, as I was never too fond of soft drinks; they make my teeth feel funny.
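For reference, the weight-loss figure reported in a study like this is simple arithmetic; the sample weights in this sketch are hypothetical placeholders, not data from the paper.

```python
# Percent enamel weight loss after immersion, the measure used in the soda
# study above. The gram values are made-up placeholders for illustration.
def percent_weight_loss(weight_before_g: float, weight_after_g: float) -> float:
    return (weight_before_g - weight_after_g) / weight_before_g * 100

print(round(percent_weight_loss(0.250, 0.236), 1))  # 5.6 -> "more than 5%" territory
```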

So it is true: some people really are just more cavity-prone than others. While many factors contribute to it, there are preventive steps against dental caries, which are important for a healthy and happy mouth. So get to flossing and brushing!

Why Are We So Addicted to Social Media?

You are planning to get some work done, or to do something as simple as sleep, when you decide it would be best to check your phone first. You start off by checking your texts or emails, and then you move on to social media such as Instagram, Twitter, or Facebook. Before you know it, you have refreshed each newsfeed at least three times, and it has been over an hour since you started. This scenario is a familiar one for this generation (especially for us procrastinators), as we are practically glued to our phones on a daily basis. I feel the biggest culprit is not the phone itself, but the social media apps downloaded onto our phones, which lead to mindless hours of staring at a screen. Is it just because we are hiding from the world and living through our phones, or is there a more scientific reason behind it?


According to CMSWire, we are addicted to social media because of psychology. As Dr. Pamela Rutledge explains, social validation is an important factor in human nature because it affirms our existence; thus "it's not how social media works that drives people to use it how they do - it's people who are driving how social media works for us". According to the same article, several psychological traits drive us to use social media as often as we do. The first is the fear of missing out: if we do not check our accounts, we fear we will not be up to date on and involved in the lives of others, and we may feel like we are missing out on something big (56% of users admit they check social media regularly because of this). Secondly, social media feeds our ego, so in most cases it is more about having an audience than about socially connecting, because humans are self-driven. Lastly, we are psychologically inclined to compare other people's lives with our own when we are uncertain about something (how to behave, etc.), so social comparison lets us boost our self-esteem. Getting good feedback from an audience on social media boosts self-esteem as well, which psychologically drives us to use it more in order to feel secure.

A recent Harvard study offered evidence for why social media is addictive through two main experiments. In the first, researchers asked their test subjects questions about their own and others' opinions while the subjects were hooked up to an MRI machine. The scans showed that when people talked about themselves, the regions of the brain associated with reward were strongly engaged, and less engaged when they talked about others. In a second experiment, the researchers found that when people got to share their thoughts with a friend or family member, there was again a large amount of activity in the reward regions of the brain, whereas when they were asked to keep their opinions to themselves there was less of a reward sensation. The conclusion of this study is that one of the reasons people are so "addicted" to social media, and use it so often, is that using it activates the same part of the brain associated with pleasure (the same pleasure we get from eating food, receiving money, or having sex).

I feel researchers could take this even further and set up a trial in which a large number of adolescents are persuaded, by paying them a certain amount of money, to live without any social media for an entire month. During this period a participant could drop out at any point, but they would have to give the money back. Charting how many adolescents stuck with it and how many dropped out would be an efficient way to see how social media has affected this generation and whether we can actually go without it. I feel it would complement the previous studies well, because it does not seem like social media is going anywhere any time soon.

 

 

 

The Deal With Breaking Out

It happens when you least expect it, and when you least want it to. One day you wake up, look in the mirror, and there they are in all their red, clustered, ugly glory: pimples. The first question that comes to mind is along the lines of "why me?" as you begrudgingly cake makeup on your face to hide the breakout(s). And for the rest of the day, you are quite possibly left feeling insecure. While you may have some idea of why acne happens and how to treat it, you may not know the whole kit and caboodle.

According to the Mayo Clinic, acne occurs when the glands connected to your hair follicles secrete an excess of an oily substance known as sebum, along with dead skin cells. These build up in the hair follicles (pores) and form a soft plug; if bacteria infect the clogged pores, inflammation is the result. There are many reasons we break out, according to vasseurbeauty, and one of the biggest and most unavoidable is hormones. This is especially prevalent in females around their menstrual cycles, as hormones fluctuate during this time and can cause severe breakouts. Another big cause is stress, because stress pushes the adrenal glands to overproduce cortisol (a steroid), which results in oilier skin: a breeding ground for acne. Subconsciously touching your face is another cause of breakouts, as your hands carry millions of germs, and in touching your face you spread them to your pores. Vasseurbeauty even suggests that holding your phone to your ear to talk could be causing acne on the cheeks, because your phone also carries bacteria. Surprisingly, the amount of dairy and sugar you consume could also be a cause: breaking out on your chin, jawline, or neck area is said to be a sign that your body has consumed more sugar and dairy than it can tolerate, so cutting down on sweets and dairy products may show some results. Lastly, not washing your face can cause blemishes, because your pores are never deeply cleaned; washing your face regularly may result in clearer skin (and if that doesn't work, prescription drugs specifically designed to clear skin may be prescribed in severe cases).

Genetics are also a major cause of acne, so if one or both of your parents suffered from it, there is a good chance you will too. While there is no settled explanation for why some people will never have acne, or even a single pimple, in their lives, a more recent UCLA study may offer one. According to this study, 1 in 5 people will rarely develop pimples despite the fact that acne-causing bacteria live on everyone's skin. The researchers' explanation is that there are "bad" strains of acne bacteria (which cause pimples) and "good" strains (which may protect the skin). Using pore strips, researchers lifted Propionibacterium acnes (a bacterium that thrives in the depths of oily pores) from the noses of 49 pimply and 52 clear-skinned volunteers. Using a genetic marker, they identified the strains of bacteria in each volunteer and noted whether that person suffered from acne. One finding was that in one out of every five volunteers who suffered from acne, two unique strains were present that rarely occurred in the clear-skinned volunteers. The biggest discovery, as they put it, was a third strain of P. acnes, found commonly in healthy skin and rarely in acne-prone skin, which appears to act as a natural defense mechanism that destroys acne-causing bacteria before they can infect the skin. Thus these researchers may not only have found the reason some people never develop pimples, but also a better treatment for acne sufferers.

Don't rejoice yet, though: as with any experimental study, many things need to be considered before such a discovery can be taken to hold true. I do not think meta-analyses are causing problems here, since this seems to be one of the first experiments of its kind; but without meta-analyses, the conclusion that this strain could be a treatment, and the reason some people never get acne, may quite possibly be a false positive.

So while acne is a persistent pest for many of us, taking care of your skin and being careful about what goes into your body can help alleviate breakouts. If your acne is severe and causing you discomfort, though, seeing a dermatologist can help you get the treatment you need to fight these persistent bacteria!

 

 

Is Waist Training Harmful?

If you haven't heard, one of the newest celebrity fads is waist training. That's right: celebrities such as Kim Kardashian and Kylie Jenner constantly post pictures to Instagram wearing their trainers and gush about how well they work. Waist trainers are designed to do just what the name implies: train your waist into a slimmer, more feminine form by compressing your core. Basically, they are tight corsets designed to be worn during everyday activities such as going out or working out.

My question is, is it safe to use a product that is virtually able to shrink your waist? According to Women's Health, if you are wearing it every day for long periods of time, it is most definitely not. In fact, according to health experts it doesn't even really work, as the effect is temporary rather than permanent; if you take it off, your waist will eventually go back to its normal size. Also, wearing it every day not only restricts your breathing but can also damage your ribs over time. In a more in-depth article, Yahoo says waist training can cause acid reflux: your waist and organs are so compressed that the little food you are able to eat does not digest correctly, with acid reflux (heartburn or an acid taste in the throat) as the result. It also causes rib bruising, though most users mistake this for their bones shifting into a slimmer shape; really they are just harming and bruising their ribs, which can sometimes cause even more serious effects.

The article emphasizes that even more damage is caused in your lungs, as waist trainers are notorious for restricting breathing and causing oxygen shortages, which can lead to loss of consciousness. There is also a link to fluid build-up in the lungs, putting people at risk for illnesses such as pneumonia. Lastly, waist training can decrease blood flow in your veins, which can lead not only to blood clots but also to heart damage; some cases have even reported kidney and spleen damage!

All in all, using waist trainers religiously is not a safe or permanent fix. Instead it is recommended to lose weight and slim your waist the old-fashioned way: eating healthily and exercising regularly! So skip this trend to save your body from harmful damage down the road.

Is It Beneficial to Drink a Gallon of Water a Day?

Most people know that the commonly recommended amount of water to consume per day is 8 glasses. Recently, though, another kind of challenge has arisen: drink an entire gallon of water every day for 30 days and record the results (i.e., take pictures). That's right, an entire gallon, which is equivalent to 16 cups of water per day! If you're like me, you value the importance of drinking water, as it is shown to have many benefits. But even as an avid water drinker, I feel a gallon of water may be overkill. Or is it? Maybe eight more cups a day would result in even more benefits?

According to Livestrong, drinking a gallon of water a day, even though it is double the suggested intake, is more than fine (our bodies simply excrete the excess as sweat and urine), and it can be beneficial for several reasons. One is that in hot weather you need to drink more than the recommended amount of water to stay hydrated. According to Johns Hopkins University, how much you appear to sweat is not a reliable indicator of dehydration, because dry weather causes sweat to evaporate more quickly; it may lead you to believe you aren't losing much water when in reality you very well may be. Another benefit is healthy circulation and more elastic (firmer) skin. Livestrong also emphasizes that drinking plenty of water throughout exercise can help you stay in top physical form. In fact, according to the American Council on Exercise, you should be drinking 7 to 10 ounces of water for every 20 minutes of working out. Not only that, but hydration before exercise is also key: 17 to 20 ounces of water should be consumed 2 to 3 hours prior to exercising so your body is completely ready for whatever workout you do.
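As a quick worked example of how those guidelines stack up against a full gallon (the 60-minute workout length is an assumed figure for illustration, not from the article):

```python
# Rough hydration arithmetic using the American Council on Exercise figures above.
# The 60-minute workout length is an assumed example, not part of the source.
GALLON_OZ = 128                 # 1 US gallon = 128 fl oz = 16 cups
PRE_WORKOUT_OZ = (17, 20)       # recommended 2-3 hours before exercise
PER_20_MIN_OZ = (7, 10)         # recommended during every 20 minutes of exercise

workout_minutes = 60
during_oz = tuple(r * workout_minutes / 20 for r in PER_20_MIN_OZ)   # 21-30 oz

total_low = PRE_WORKOUT_OZ[0] + during_oz[0]    # 38 oz
total_high = PRE_WORKOUT_OZ[1] + during_oz[1]   # 50 oz
print(f"Exercise-related intake: {total_low:.0f}-{total_high:.0f} oz, "
      f"at most {total_high / GALLON_OZ:.0%} of a {GALLON_OZ} oz gallon")
```

Under those assumptions, exercise-related drinking covers well under half a gallon, which is why the challenge's remaining intake has to be spread across the rest of the day.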

 

Lastly, drinking such a large amount of water per day ensures complete hydration and all but eliminates the chance of becoming dehydrated. Of course, drinking too much water too quickly can cause a condition known as water intoxication. Water intoxication, or hyponatremia, is extremely rare but mimics the symptoms of heat stroke (feeling hot, headache, feeling generally unwell, nausea, diarrhea, vomiting). WebMD describes the issue as one of sodium levels: drinking too much water causes an imbalance, and liquid moves from your blood into your cells, causing them to swell. This condition can be deadly, as brain swelling is lethal.

So why not try drinking a gallon of water a day for 30 days (and no other drinks, including coffee, juice, etc.), like the women who recorded their experiences on a blog called No Excuse Mom? Reading through each log, most of the participants' cravings stopped and their skin cleared up immensely. Across the different accounts, acne cleared, eczema calmed down, and dark circles were a thing of the past. Some even experienced weight loss because they stopped craving sweets and other fattening junk foods. And all said that they had a lot more energy than normal. Who knew something as simple as drinking a gallon of water a day could have so many beneficial effects! Now drink up!

Impulse Shopping and Science

Yes, I am a girl, and yes, I love to shop; but sometimes I feel that I may like shopping too much. 'WHAT? HOW IS THAT EVEN POSSIBLE?' you may be asking yourself, but it is true. I am what some call an impulse buyer: I buy what I like right when I see it. Most of the time there is no reasoning behind my purchase, unless you count "I just had to have it" as a reason (that's probably the only way I can explain buying a $40 dinosaur head hat at Disney). Thankfully, upon coming to college I have learned to keep my impulses to a minimum, as buying something would mean more money going out than coming in. But the question is: is there any science behind impulse buying? Or is it just a phrase dubbed for the average shopaholic?

According to Brain Blogger, one of the reasons we impulse shop could be our appetite. That's right, having an appetite influences our buying not only in grocery stores but in regular stores as well. The article highlights two experiments. In the first, participants chose between a lottery with smaller, immediate winnings and one with a larger payout received over a period of time. Participants were shown pictures of either nature or food, and those who were exposed to the food pictures tended to pick the smaller, immediate winnings. In the second experiment, women were given a budget and sent to either a store with an unscented candle or a store with a cookie-scented candle. The study showed that 50% more women spent more money in the cookie-scented store, despite having little wiggle room budget-wise. Thus, simply being hungry can cause impulse buying.

An article by Psychology Today also gives some reasons why we tend to impulse shop. The simplest and most logical one is that some people just really love to shop. Getting something new feels empowering to those who can't get enough, so they do it often. Another reason is the loss-aversion switch, which the article defines as the fear that you will regret the purchase. This is why retailers put up discounts: the fear then switches to "what if I miss this deal?" and the purchase is made. Basically, the desire to save is what really causes the impulses to kick in. If consumers think they will be saving money (e.g., buy one pair, get the other half off), they will go for it. Retailers also use twisted heuristics to get the consumer to buy: they sell things in bulk, or add extras such as a free item with a purchase, so the buyer thinks they are getting good value (when in reality they might not be). The last reason Psychology Today gives is that a lot of consumers live with "rose-tinted glasses," meaning we think of the future rather than the present, which distorts our thinking and thus our purchases.

Therefore, I have come to the conclusion that I not only really like shopping, but more often than not I am sucked in by retailers' sales schemes. And while there is nothing wrong with impulse shopping for some, I am glad I have suppressed it, at least for now.

Can You Prevent A Cold Before It Strikes?

Ah, it's that time of year again! The time when germs run rampant and absolutely no one is safe. Here at Penn State it is dubbed "the plague," but it is more commonly known everywhere else as the common cold. And if you're like me, you find yourself, more often than not, afflicted with this prevalent illness. Which is why I'm posing the question: is it possible to take preventative measures against a cold before it strikes?

Speaking from past experience, the day I wake up with a slight sore throat I know I am doomed to be miserable, as thanks to my deviated septum I will most likely end up with a sinus infection. A lot of the time the common cold develops into something worse (a sinus infection, ear infection, or bronchitis). Of course, I have tried the usual things, such as drinking plenty of orange juice and slurping unusually great amounts of chicken noodle soup, but nine times out of ten to no avail. So I did a little research to find out if there really is any way to stop a cold in its tracks. According to WebMD, you can get rid of a cold in just 12 hours; the key is catching it early. As soon as any symptoms of a cold (runny nose, scratchy throat) creep up, take immediate action. Colds are bad enough as it is, but because bacteria thrive in trapped mucus, nasty infections commonly follow. Dr. Marshall, a source quoted in the WebMD article, states that keeping your nose open (i.e., being able to breathe through your nose) will lower your chances of a worsened and prolonged cold.

Common cold graph (found here)

So how does one go about keeping their nose open? One way is to use a decongestant spray or pill to open up the nasal lining so the mucus has a way to drain. You can also try taking an antihistamine, a sometimes-sedative drug that inhibits the action of histamine and thereby counters the allergic response (definition found here). The article also stresses getting rid of the mucus, and suggests trying an over-the-counter medication such as Mucinex to break down what is trapped. It is also very important that you blow your nose correctly; yes, that's right, many people blow their nose completely wrong and only put more stress on their sinuses. The correct way to blow your nose is to place a tissue over it while covering one nostril, then blow for 3-5 seconds before switching sides and repeating the process.

As I mentioned earlier, the article also states that sipping chicken soup is beneficial in treating a cold. Studies have shown that it reduces inflammation, thus relaxing the sinuses. Speaking of sinuses, it is important to keep them warm. So take a steaming hot shower, put a warm washcloth over your face, or drape a towel over your head, lean over a pot of hot water, and breathe in the steam. All of these steps can help prevent or stop a cold sooner rather than later, so if you find yourself feeling the symptoms of the plague, take these preventative measures immediately!

Is Being Gluten Free the Healthier Option?

Gluten: a word heard day in and day out when it comes to people's diets. Of course, many of the people who cut gluten from their eating habits are prompted to do so for a medical reason (celiac disease), but some do it just because they think it is healthier. But is it really healthier to cut such a source from your diet? The answer is no.

Gluten is the general name for the group of proteins found in wheat, rye, barley, and triticale that holds food together and helps it maintain its shape (source). It is found in breads, cereals, pastas, baked goods, and much more (refer to the source link for the full list). A person with celiac disease is not able to consume gluten because it damages the small intestine, causing less-than-enjoyable symptoms such as severe stomach pain and diarrhea (source). Yet lately it has become a trend for people without celiac disease or a sensitivity to gluten to go "gluten free," due to the false belief that it is the healthier option.

In reality, it is considered unhealthy to cut gluten out of your diet, because it eliminates many nutritious food options. Gluten-containing whole grains are rich in vitamins and minerals (B vitamins and iron) and even fiber. They have also been linked to a lower risk of heart disease, type 2 diabetes, and even some forms of cancer. Therefore, cutting such an important food category from one's diet could result in nutritional deficiencies, according to health experts. Their warning is that it is not advisable to go gluten free unless you are medically required to. It is not worth the unhealthy risk to eliminate such an important food category. So if you have been toying with succumbing to the gluten-free fad (and don't suffer from celiac disease or a sensitivity), don't do it. Enjoy eating gluten; it's the healthy and fun thing to do! (source)

Don't believe me? Watch this "Royals" song parody from a person with celiac disease, sharing his constant food envy: Click to watch

Me+Science=??????

Once upon a time there lived a girl who thought she loved science; she even thought she was quite good at it. That is, until she took honors chemistry and realized, sitting at her desk on the day of the first test, that she was absolutely and positively screwed for the rest of the year. That girl was me. My name is Victoria Bushman, I'm from Manahawkin, New Jersey (the town just outside of Long Beach Island), and I plan to major in broadcast journalism and political science. I took this class not only because my advisor recommended it, but also because it doesn't involve ANY sort of math; at least not that I'm aware of….

Now, while I do find science to be very intriguing, I am turned off by the fact that a lot of it is math-heavy. Because I am in no way a math person under any circumstance besides algebra, being a science major just really isn't for me. I won't completely write science off, though, as most of it tends to be very intellectually captivating. That being said, check out this video of cool science facts that may just "blow your mind": Click here to watch!

And for your viewing pleasure, here is a panorama shot I took with my iPhone after a storm on Long Beach Island. (I swear it isn't edited.)