Author Archives: Anthony Michael Calligaro

Is Laughter Actually the Best Medicine?

I like to laugh a lot, and honestly, who doesn’t? Still, I have always been known to laugh more than the average teenager. Unfortunately, this is not always a good quality. When I hear bad news or someone tells me a sad story, my first instinct is to try to laugh it off. Although humor can serve as a coping mechanism, it has gotten me into trouble in certain circumstances. In any case, scientists and the public alike generally believe that laughter has numerous health benefits, but is there any validity behind this belief? Is it fair to say that laughter is actually the best medicine?


There are many theories that laughing is healthy, with numerous proposed short- and long-term benefits. According to the Mayo Clinic, laughter can increase the endorphins released by the brain, which can trigger positive feelings in the body. Laughing is also thought to increase oxygen intake and stimulate the muscles, lungs, and heart. Additionally, laughter can help relax muscles, relieving stress and tension in the body. As for long-term effects, one possibility is that laughter helps the body produce its own natural painkillers. Similarly, by releasing neuropeptides, the bodies of people who laugh a lot may fight off disease more effectively. Unfortunately, these are proposed benefits rather than proven ones; none of them is backed by strong evidence on its own.

Most people, even some scientists and medical professionals, simply accept that laughter is beneficial. Luckily, there are multiple studies on the relationship between laughter and health that at least attempt to test the null hypothesis (that laughter causes no health benefits) against the alternative hypothesis (that it does). In one small study of 20 people between the ages of 60 and 80, conducted by researchers at Loma Linda University, the participants each took a quick memory test and gave saliva samples to measure their levels of cortisol, a stress hormone. Half were asked to watch a funny video, while the other half sat silently. After about twenty minutes, the participants took another memory test and had their cortisol levels measured again. The group that watched the funny video improved by 43.6% in recalling events, while the group that sat silently improved by only 20.3%. The laughing group also showed a larger decrease in stress levels than the silent group. Another small study, conducted at Vanderbilt University, indicated that laughing for about 10-15 minutes a day can burn up to 40 calories.
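To make the null-versus-alternative logic concrete, here is a minimal sketch of a permutation test. The improvement scores below are made up for illustration; they are not the Loma Linda data. The idea: if laughter had no effect (the null hypothesis), then shuffling the group labels should produce a difference in group means as large as the observed one fairly often, and the p-value measures how often.

```python
import random

random.seed(0)

# Hypothetical percent improvements on the second memory test.
laughter = [44, 39, 51, 47, 42, 40, 46, 43, 48, 41]
silent = [21, 18, 25, 19, 22, 17, 24, 20, 23, 16]

# Observed difference in mean improvement between the two groups.
observed = sum(laughter) / len(laughter) - sum(silent) / len(silent)

# Shuffle the pooled scores many times; count how often a random
# relabeling produces a difference at least as large as observed.
pooled = laughter + silent
trials = 10_000
count = 0
for _ in range(trials):
    random.shuffle(pooled)
    diff = sum(pooled[:10]) / 10 - sum(pooled[10:]) / 10
    if diff >= observed:
        count += 1

p_value = count / trials  # a small p-value is evidence against the null
print(f"observed difference = {observed:.1f}, p-value = {p_value}")
```

With these (deliberately well-separated) fake numbers, the p-value comes out tiny, so a study like this would reject the null; with overlapping groups it would not.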

Other small studies, according to Susan Brink, suggest even more benefits of laughter. In a 2005 study, scientists measured the blood flow of 20 people before and after they watched a sad movie and a funny movie. They found that after the funny movie, the average person’s blood flow had increased by 22%, whereas after the sad movie, blood flow was more restricted in 14 of the 20 volunteers. Interestingly, a 2003 study of 33 women showed that laughing out loud in particular, not just laughing, was associated with more natural killer cell activity, which helps the body fight disease. Lastly, according to Noreen Fraser, scientists have been studying the relationship between laughter and brain waves since the 1970s; these researchers claim that laughter creates a sense of well-being by triggering the release of endorphins in the brain.


While a substantial number of studies support the belief that laughter might, in fact, be the best medicine, none of them has been large enough. Moreover, many studies that supposedly “prove” laughter is the best medicine are poorly designed. For instance, according to the University of Maryland Medical Center, in one of the few large-scale studies on this topic, researchers concluded that laughing reduces the risk of heart disease. In an observational study of 300 people, half of whom had already had a heart attack or heart surgery and half of whom had not, the participants filled out a questionnaire rating how much they laugh in certain situations. Not surprisingly, the people who had never experienced a heart attack or heart surgery laughed more. I say “not surprisingly” because what the study really shows is that people who have undergone a heart attack or heart surgery are less likely to laugh a lot, a fairly obvious conclusion. Yet the researchers claim it proves the reverse, that laughter reduces the chance of a heart attack or heart surgery, which the study’s design cannot support; correlation in an observational study like this says nothing about the direction of cause and effect.

Besides poorly designed studies, some studies simply do not provide enough evidence either to reject or to fail to reject the null hypothesis that laughter provides no health benefits. According to William Strean, the studies on whether laughter is the best medicine provide barely any evidence that laughter has positive health effects. However, Strean goes on to admit that there really aren’t any negative effects of laughter either. Thus, people can take the “risk” of laughing, as it can really only cause positive health effects. Well, most of the time… (see the picture below).


Consequently, I believe people should laugh, but not because it is the best medicine. In class, we discussed how unhappiness does not actually correlate with sickness, despite many people’s belief that it does. This is a similar situation. Based on the studies conducted so far, I think it is safe to say that laughter is definitely not the best medicine. The original saying was probably meant as an exaggeration, and taken literally, I would consider it incorrect. However, numerous studies have indicated that laughter could confer certain health benefits. Although more research needs to be done, laughter can’t hurt anybody. Therefore, I would advise everyone to laugh: people could easily gain health benefits from doing so, and if not, the worst that happens is that they waste a few seconds of breath, which is not the end of the world.

Is Creatine Bad For You?

My mother has always told me, “Don’t put junk in your body.” I always assumed muscle-building supplements fell into that category, which is probably accurate. Nevertheless, creatine may be the outlier. A few weeks ago, I was shocked to find out how many of my friends take creatine. I assumed that, like the majority of other muscle-building supplements, creatine would make people appear stronger than they actually are while damaging the body over time. After researching the matter, I was surprised to find that creatine is a little more complex than I originally anticipated.

Before I explore the pros and cons of creatine, it is important to understand its history, how its use has developed over time, and why it has become so popular of late. In 1835, the French scientist Chevreul identified a previously unknown component of skeletal muscle and named it creatine. Scientists first noticed that creatine is necessary for muscle activity when they learned that wild animals contain about ten times more creatine than domesticated animals do. According to the Mayo Clinic, oral creatine supplements became popular in the 1990s as people wished to be stronger, look stronger, and gain an advantage in athletics. To this day, creatine remains very popular in the US because it is effortless to take and offers multiple benefits.


According to Iron Man Magazine, creatine is the number one sports supplement, due in part to studies showing that 80 percent of people using creatine find it effective. Moreover, according to Health Research Funding, creatine decreases the time needed to regenerate ATP, which gives muscles energy. As a result, creatine is expected to improve performance in brief but intense workouts by 5 to 10 percent. According to Mark A. Jenkins, several double-blind, placebo-controlled experiments have been conducted in which half of the participants were given about 20 grams of creatine each day for 5 to 6 days, while the other half were given placebos. Afterward, each study had the participants perform a certain exercise for a short period of time. The studies all concluded that the group taking creatine performed better than the group taking the placebo. Surprisingly, creatine might also make the heart stronger and healthier. According to Penn State Hershey Medical Center, a few studies were conducted on people with heart failure. Like the other studies, these divided the participants into two groups, half taking creatine and half taking a placebo, while both groups received standard medical care. The results showed that, in general, those who took creatine could exercise longer before becoming fatigued than those who took the placebo. On top of all these benefits, creatine is cheap and easy to use, making it the most popular sports supplement in America today.

Like any supplement people put in their bodies, there will always be side effects, but is taking creatine worth the possible consequences? Since supplemental creatine has only been popular for a few decades, there are not many studies on how it can harm the body. Obviously, too much creatine can be unsafe. According to MedlinePlus, some research indicates that creatine could damage the liver, kidneys, and heart, but not enough studies have been completed to show a significant correlation between creatine and these side effects. According to WebMD, other outcomes such as irregular heartbeats and skin conditions are possible effects of creatine, but, yet again, there is not enough research to establish a relationship. However, according to Men’s Health, studies have shown that five to seven percent of people who take creatine will get diarrhea, stomach aches, or both. Additionally, creatine does not work as well for meat and fish eaters, because these people already have high levels of creatine in their muscles. Some studies have indicated that creatine could cause kidney problems for people with high blood pressure or diabetes, for those on anti-inflammatory drugs such as ibuprofen, or for people over the age of 40. Despite this evidence, unsurprisingly, there have not been enough large-scale studies to establish a distinct correlation between creatine and these side effects.

Possible Side Effects of Creatine

All in all, deciding whether creatine is bad for people would, at this moment, be premature. There are simply not enough large-scale studies to determine whether creatine supplements correlate with the long list of possible side effects. In my own research, the only study with a clearly stated negative conclusion claimed that five to seven percent of people who take creatine will get stomach aches, diarrhea, or both. Nevertheless, five to seven percent is not a very high chance, and these side effects are neither terrible nor long-term.

The file drawer problem could also play a part in this lack of information on creatine’s side effects. Bodybuilding companies have performed studies on the effects of creatine, yet a majority of bodybuilding websites claim that the side effects are, for the most part, all myths. It is hard to believe that sites like WebMD, Men’s Health, and MedlinePlus have all concluded that there is not enough information on the side effects of creatine supplements, while bodybuilding websites have somehow done enough large-scale studies to conclude that these side effects are all misconceptions. I find it more likely that bodybuilding websites publish only the studies with positive results while hiding those that show a correlation between supplemental creatine use and certain side effects. Hence, the file drawer problem could well be at work here. Therefore, I believe there is not enough evidence to say whether creatine is bad for people in the short term, and especially in the long term. Personally, I don’t see the advantage of taking creatine supplements until more studies are conducted, as people could be at risk of serious side effects.


Why Do People Believe in Conspiracy Theories?

I’ve never been a huge fan of conspiracy theories. I could never understand how, if a conspiracy theory were actually true, such a significant event could be kept secret from the public, especially when there are mountains of evidence showing the theory can’t possibly be true in the first place. On the other hand, I love discussing conspiracy theories and whether they are plausible. I may not believe in them, but I wonder why people are so attracted to them. According to a Fairleigh Dickinson University poll, 63% of registered American voters think at least one political conspiracy theory is real. Could it be the appeal of mystery, an inclination to distrust powerful people and groups, a desire to feel in control of a turbulent situation, or just an honest belief that there is enough evidence to support certain conspiracy theories?


There are many different reasons people believe in conspiracy theories. According to political scientists Joseph Parent and Joseph Uscinski, education could play a large role. In one study, they found that only 23% of people with postgraduate degrees, compared to a shocking 42% of people without a high school diploma, are likely to believe in conspiracy theories. This may make it sound as though Parent and Uscinski are calling believers stupid, but that is not the case. The study only indicates that people with more practice searching for actual evidence, rather than being persuaded by emotion and instinct, can more easily notice when a conspiracy theory has no credible support.

Another reason certain people believe in conspiracies is to feel in control. Jan-Willem van Prooijen performed multiple studies to test this assertion. In one, van Prooijen gathered 119 people and told half of them to write down instances when they felt in control and the other half to write down times when they did not. When researchers then asked each individual about a certain conspiracy theory, the results were clear: the people who wrote about times they did not feel in control were much more likely to believe in it.

Besides feeling out of control, cynicism, narcissism, and low self-esteem are other attributes common among people who believe in conspiracy theories. According to Viren Swami, a psychology professor, believers in conspiracies are more likely to think poorly of themselves and to distrust others. Similarly, researchers at the University of Kent conducted multiple studies of over 200 people, asking each person whether they agreed with certain conspiracy theories and also asking them to assess their own egotism and self-esteem. The results revealed that, in general, those who believed in more conspiracies were also more likely to have low self-esteem and to be narcissistic.

Considering all of these studies on why certain people believe in conspiracy theories, could believers actually be right? In some cases, yes. One of the most famous instances ran from 1953 to 1964 in a project known as MKUltra, in which the CIA worked with 80 institutions, including prisons, hospitals, and universities, to perform immoral experiments on people without their permission. The goal was to gain knowledge of biological and chemical agents in order to create better weapons in preparation for the Cold War. Many of these “patients” were tortured in odd and varied ways during the program’s existence. Another conspiracy theory that turned out to be true dates to 1964, when North Vietnamese torpedo boats reportedly attacked a US destroyer, the USS Maddox. After the Maddox had “successfully” defended itself and fired back, President Lyndon Johnson announced that two of the enemy boats had been destroyed, maybe more. Some people advanced a conspiracy theory that the crew had only pretended to be under attack, with no enemy actually there, in order to intensify the conflict in Vietnam. It turned out that these people were correct.


Despite these rare cases where believers in conspiracies were actually correct, there are numerous other instances where plenty of evidence disproves a conspiracy theory. One of the more ridiculous theories is that the 1969 moon landing never happened. Not only would hundreds of people at NASA have had to keep that secret for over 40 years, but there is also direct evidence of the landing. The Apollo 11 astronauts brought back lunar rocks, which were then studied by scientists who would have noticed if the rocks had simply come from Earth. Also, China’s and India’s lunar probes, along with those of other space agencies, have photographed evidence of the American landing sites on the moon. Additionally, the astronauts took photographic evidence themselves. Surprisingly, according to a 60 Minutes/Vanity Fair poll, 14% of Americans still believe the first moon landing was faked.


In fact, according to Oxford University’s Dr. David Robert Grimes, large conspiracies should take no more than about four years to be revealed to the public. Dr. Grimes built an equation to estimate the probability of a conspiracy being accidentally, or purposely, exposed by someone inside the operation: the more people involved and the longer the secret must be kept, the more likely a leak becomes. By this equation, conspiracies about the moon landings, 9/11, Paul McCartney, Elvis Presley, global warming, the flying saucer at Roswell, Princess Diana, and JFK should all have been exposed by now, and so can be considered debunked. It is good to question commonly held beliefs, and conspiracy theories do turn out to be factual in rare circumstances. Yet if there is solid evidence that an event took place, one should not continue to doubt it on blind faith. This is similar to the prayer experiment we discussed in class: no matter what the results showed, they are rendered useless if certain people can’t be influenced because of blind faith in a certain belief. In my opinion, based on the studies mentioned earlier, many of these believers are likely to be narcissistic, cynical, low in self-esteem, feeling out of control, or possibly less educated. Once again, I am not saying people who believe in conspiracies are bad or stupid, but based on the science, they are likely to have some of these attributes.
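Grimes’s actual equation is more involved, but the core idea can be sketched with a simplified, hypothetical model: if each insider has some small independent chance of leaking per year, the probability that at least one person leaks grows quickly with the number of conspirators and the years of required secrecy. The numbers below (400 insiders, a 1-in-10,000 per-person yearly leak chance) are illustrative assumptions of mine, not Grimes’s figures.

```python
def p_exposed(n_conspirators, p_leak_per_person_year, years):
    """Chance that at least one insider leaks within `years`,
    treating each person-year as an independent leak opportunity."""
    p_all_silent = (1.0 - p_leak_per_person_year) ** (n_conspirators * years)
    return 1.0 - p_all_silent

# Even with a tiny per-person leak rate, secrecy decays fast.
for years in (1, 4, 10, 40):
    print(f"{years:>2} years: P(exposed) = {p_exposed(400, 1e-4, years):.3f}")
```

The point of the sketch is the shape of the curve, not the exact numbers: doubling either the number of conspirators or the duration pushes the exposure probability up sharply, which is why very large, very long-lived conspiracies are implausible.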

Should the Drinking Age Be 18 or 21?

As one of the biggest party schools in the nation, Penn State is no stranger to underage drinking. Penn State students may drink more, on average, than students at other universities, but drinking happens at every college across America. Before 1984, many scientists, parents, and concerned adults worried that a drinking age of 18 was too low, as 16-to-20-year-olds were the age group most commonly involved in drunk driving. Once President Reagan effectively forced each state to raise the drinking age to 21 on July 17, 1984, the share of alcohol-related fatal crashes among 16-to-20-year-olds dropped dramatically: in 1982, alcohol was involved in 61% of this age group’s fatal car accidents, yet by 1995, the figure was only 31%. Nevertheless, does this evidence prove that President Reagan made the correct decision in raising the drinking age to 21, or could other factors indicate otherwise?


According to a petition to return the drinking age to 18, about 77% of countries around the world set a drinking age of 18 or lower. Additionally, even with the drinking age at 21, the United States ranks third among all countries in the percentage of road-accident fatalities involving alcohol, at 31%. According to Professor Ruth C. Engs of Indiana University, the decline in drunk-driving incidents began in 1980, not 1984, and was due to numerous confounding variables other than the legal drinking age being raised. For instance, awareness of the dangers of driving inebriated has increased since 1980. Furthermore, driving in general has become less dangerous, with safer automobiles, lower speed limits, increased use of seat belts and air bags, and more people under the influence taking taxis or Ubers rather than driving themselves.


Besides the third variables that may discredit the belief that raising the drinking age reduced drunk driving, there is evidence that the law may actually hurt young adults more than it helps them. According to John McCardell, a former president of Middlebury College, students under 21 are attracted to risk, which causes them to drink more often. Similarly, since they are technically not allowed to drink, many young adults binge drink before going out in public. Although this is just an anecdote, without any real scientific method applied, McCardell could be on to something. Among college students who drink, 32% of those under 21 consider themselves heavy drinkers (over five drinks per week), compared to just 24% of students over 21. Scientists worry that heavy drinking in a short span of time is a particular danger for young adults, since binge drinking can impair the prefrontal cortex, which is still developing at that age. The prefrontal cortex gives people the ability to judge the consequences of their actions, control their urges, and engage in abstract thought, all clearly essential for young adults.

Nevertheless, there are plenty of reasons why MADD (Mothers Against Drunk Driving) and other groups argued as ardently as they did to raise the drinking age to 21. Primarily, studies in the late 1970s and early 1980s compared states with different drinking ages but similar cultures, in order to remove certain confounding variables, and the evidence in favor of raising the drinking age was overwhelming. Furthermore, once states raised the drinking age to 21, alcohol-related driving accidents immediately dropped. Researchers have even accounted for third variables such as seat-belt use, safer car designs, and regional drinking rates, yet the evidence still shows that raising the drinking age to 21 decreases drunk-driving incidents among people aged 21 to 30, and even more dramatically among people under 21. According to James C. Fell, the drinking age should be 21 because young people get drunk twice as fast as adults yet find it difficult to know when to stop. Fell further argues that the drinking age should remain 21 because of the harm alcohol does to young adults’ developing brains, the reduction in car accidents, and the welfare of young adults in general. Lastly, according to Gabrielle Glaser, a study conducted from 1998 to 2005 showed that the number of 18-to-24-year-olds who died of alcohol poisoning nearly tripled over that span.

Although many believe this argument was settled years ago when the drinking age was changed to 21, the debate still persists. Many studies have been carried out, and there is evidence to support both sides. Raising the drinking age may help prevent drunk driving, but it may also cause teens, who are attracted to risk, to drink more and to binge drink more. Both of these conclusions, however, rely on soft endpoints. Still, common sense tells us that drunk driving causes more fatal car accidents and that more binge drinking cannot be healthy for teens’ developing brains or their overall safety. A true experiment to decide which drinking age is better is practically impossible to conduct: scientists cannot make a large population drink or not drink based on age and then watch over the following years to see which groups suffer more fatal car accidents or brain impairments. Such an experiment would be immoral and unrealistic. Nevertheless, observational studies can be quite reliable if confounding variables are accounted for. I am on the fence about whether the drinking age should be 18 or 21, but what struck me is that researchers have taken these third variables into account, and the rate of fatal car accidents still decreased drastically after the drinking age was raised to 21. I find it very difficult to argue with those results. Therefore, I believe the drinking age should remain 21 in America, as lowering it to 18 would be too risky and could harm many young adults; with more research, though, I would be willing to change my mind.


Is Time Travel Possible?

I have always thought of time travel as the Santa Claus of science: I want to believe it is real, but I know, deep down, that I am terribly mistaken. Or am I? After watching movies like Back to the Future, Hot Tub Time Machine, Looper, Click, Men in Black 3, Austin Powers in Goldmember, Groundhog Day, Escape from the Planet of the Apes, and Interstellar, I have trained myself to believe that time travel is too good to be true, but maybe these movies had some substance to them.

For about the last hundred years, scientists have slowly worked out what it would take to travel through time. To jump noticeably forward in time, one would have to move at close to the speed of light, which is 299,792,458 m/s. I know, such a large number can be hard to comprehend: to move at the speed of light, you would need to circle the Earth about 7.5 times in a single second. This is yet another example of how poor human intuition is, as it is impossible to even imagine circling the Earth that many times in one second. Although scientists have many theories on the matter, the basic laws of physics established long ago hold that an object traveling faster than the speed of light is impossible.
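The 7.5-laps-per-second figure is easy to check with the speed of light and the Earth’s equatorial circumference (about 40,075 km):

```python
SPEED_OF_LIGHT = 299_792_458          # m/s (exact, by definition of the meter)
EARTH_CIRCUMFERENCE = 40_075_000      # m, equatorial (approximate)

laps_per_second = SPEED_OF_LIGHT / EARTH_CIRCUMFERENCE
print(f"{laps_per_second:.2f} trips around the Earth per second")  # about 7.48
```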

Albert Einstein’s theory of special relativity, published in 1905, implies that it is impossible to accelerate a massive object to the speed of light, for two reasons. First, in the relativistic picture, the faster an object goes, the more effective mass it accumulates. This follows from Einstein’s famous equation, E = mc², where c, the speed of light, is constant. Since an object must gain an enormous amount of energy to approach the speed of light, its effective mass increases as well; to actually reach the speed of light, the object would need infinite energy, which makes the proposition impossible. Additionally, if an object were to travel at the speed of light, length contraction would shrink the object’s length to zero, which also makes the endeavor impossible.
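The energy blow-up can be made concrete with the Lorentz factor, γ = 1/√(1 − v²/c²), which multiplies a body’s rest energy in special relativity. A short sketch of how γ diverges as speed approaches c:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def lorentz_factor(v):
    """gamma = 1 / sqrt(1 - v^2/c^2); grows without bound as v -> c."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

# At half the speed of light, gamma is barely above 1; near c it explodes.
for fraction in (0.5, 0.9, 0.99, 0.999999):
    print(f"v = {fraction} c  ->  gamma = {lorentz_factor(fraction * C):,.2f}")
```

Since the kinetic energy needed scales with γ, no finite energy budget ever gets a massive object all the way to v = c.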

However, numerous experiments suggest that time travel into the future is, in a limited sense, already real. For instance, in 1975, physicist Carroll Alley used two synchronized atomic clocks, which are extremely accurate and precise, to show that time passes more slowly for a faster-moving observer. Alley repeatedly flew a plane around for hours with one atomic clock aboard and the other on the ground, and the results consistently showed that the airborne clock ran a tiny fraction of a second behind. This slowing of time due to motion is called time dilation. Furthermore, scientists have placed atomic clocks at the top and bottom of skyscrapers to see whether the mass of the Earth affects time. The results showed that clocks on the ground, closer to the Earth’s mass, ran slightly slower than clocks at the top of the skyscraper.
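The size of the motion effect in such an experiment can be estimated from the time-dilation formula. This is a rough sketch, not the experiment’s actual numbers: it assumes an airliner-like speed of 250 m/s and a 10-hour flight, and it ignores the gravitational effect of altitude (which pushes the other way).

```python
import math

C = 299_792_458.0  # speed of light, m/s

def airborne_clock_lag(speed, flight_seconds):
    """Seconds a moving clock falls behind a stationary one,
    using special-relativistic time dilation only (gravity ignored)."""
    gamma_inv = math.sqrt(1.0 - (speed / C) ** 2)
    return flight_seconds * (1.0 - gamma_inv)

# Assumed flight: 10 hours at 250 m/s (~560 mph).
lag = airborne_clock_lag(250.0, 10 * 3600)
print(f"clock lag: {lag * 1e9:.1f} nanoseconds")
```

The answer comes out on the order of tens of nanoseconds, which is exactly why atomic clocks are needed to see the effect at all.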

Although there is evidence that traveling into the future is possible, traveling back in time is a completely different challenge. One way to accomplish it would be to travel faster than the speed of light, which is thought to be impossible; even if it weren’t, we don’t have the resources to build such a machine, or anything close to it. Another way would be to use a wormhole as a shortcut back in time, yet to this day there is no proof that wormholes even exist. Furthermore, the idea of traveling into the past raises deep logical problems. The grandfather paradox, proposed by science fiction writer Nathan Schachner in 1933, asks how someone could travel back in time and shoot their own grandfather: how could the grandfather be killed by his grandchild if the grandchild could never have been born, the grandfather having died before he could have children?

Clearly, experiments that could demonstrate significant time travel are impossible to perform at large scales for now, because we do not possess the resources or energy to do so. On minute scales, however, time travel into the future has already been demonstrated. In my opinion, not in my lifetime, and not in my children’s lifetimes, but perhaps in a few hundred years, people might create the technology and harness enough energy to travel a few minutes into the future. But hey, who knows? I bet people of the 1700s never could have imagined the technology the 21st century possesses. Maybe humans will surprise themselves.

Grass vs Turf: Which is Safer?

Anyone who has played an outdoor sport can relate to this topic and probably has an opinion of their own. Grass or turf? For most people, it is a matter of preference. As a kid, turf fields seem much more professional, because most kids play on poorly maintained grass fields. Once you play in high school, however, you start to form your own views. In the summer, turf is too hot; in the winter, grass becomes muddy and icy. In both cases, the playing surface can be hazardous for athletes to run, cut, slide, kick, tackle, or make pretty much any other movement a sport requires. Rather than asking which surface players prefer, I wonder whether there is a difference in the frequency of injuries on turf compared to grass.

Comparing the safety of modern artificial turf fields to grass fields has become a back-and-forth debate over the past few years, with studies providing evidence both for and against the idea that turf is more dangerous than grass. According to Justin Shaginaw, an athletic trainer for the US Soccer Federation, a 2011 study found a higher frequency of ankle injuries on turf among football, soccer, and rugby players, and a 2012 study showed that more college football players suffered ACL injuries on turf than on grass. On the other hand, both a 2010 study of college football players and a 2013 study of female college soccer players found a higher frequency of overall injuries on grass than on turf. Meanwhile, according to John Brenkus in one of his Sport Science videos, recent studies have shown that turf reduces overall injury risk by over 10%, yet increases stress on the ACL by about 45%. Brenkus also cited a study covering over 2,600 NFL games which found that players were 67% more likely to sprain their ACL on turf than on grass.

So what makes one surface more dangerous than another? According to Mark Drakos, most scientists believe two properties of turf and grass affect injury rates: the coefficient of friction and the coefficient of restitution. The coefficient of friction is exactly what it sounds like: how much friction the surface creates. A low coefficient of friction causes players to slip a lot, whereas a high coefficient makes a stickier surface. Surfaces with higher coefficients of friction cause more ACL injuries because there is less “give” in the turf or grass. Similarly, a higher coefficient of restitution will, in general, cause more injuries. Roughly speaking, the coefficient of restitution reflects how hard the surface is, that is, how little impact energy it absorbs, and is measured with a G-max value; concrete, for instance, has a very high G-max value. Consequently, grass and turf fields with higher G-max values lead to higher rates of concussions and other contact injuries.

Although grass and turf fields might cause similar rates of musculoskeletal injuries, some scientists believe one feature of turf makes it far more dangerous. No, I’m not talking about sprained ankles, pulled muscles, or torn ligaments; I am talking about cancer. According to Hannah Rappleye, the crumb rubber in artificial turf is made of bits of tire that contain carcinogens and other chemicals that could cause cancer. There is evidence that exposure to benzene, carbon black, and lead, among other substances, can cause cancer, but proving a link is harder than it seems: a crumb-rubber field contains bits of tens of thousands of different tires from different brands, making it difficult for scientists to study the relationship between crumb rubber and cancer.

Personally, from my experience playing soccer, football, and even baseball on both surfaces, I enjoy playing on turf more, but I am definitely more sore afterward than I am after playing on grass. Assuming normal weather and fields that are well taken care of, it is a tough choice. In a few years, the difference between injury rates on turf and grass should be clearer. Nevertheless, with the information available to date, I would prefer grass, for a few reasons. Grass may cause more common injuries, such as ankle sprains and muscle strains, but turf seems more likely to cause serious injuries like ACL and MCL tears. Also, although it may turn out to be nothing, I would rather avoid any potential cancer risk if given the chance. A higher risk of common injuries is worth it compared to risking an ACL tear or, worse, cancer. So there it is: grass fields win. To conclude, enjoy this clip from The Comebacks of someone tearing his ACL; great scene.



Dear Science, it’s not you, it’s me

Hi everyone, my name is Anthony Calligaro and I am from Narberth, Pennsylvania (right outside of Philadelphia). For those of you who don’t know where Narberth is: I live next to Lower Merion High School, where Kobe Bryant went to school. Here is a video of Kobe’s top 10 plays from high school (I also like basketball).

I am a freshman, and when I attended New Student Orientation, I was enrolled in the Eberly College of Science to major in Mathematics. For some reason, on the second day of Orientation, I decided to switch into the Division of Undergraduate Studies to explore all of my options. As of now, however, I am aiming to major in something within the business school.

Clearly, I haven’t mentioned science yet for a reason. From biology to chemistry to physics, it was just one headache after another. I learned that the mitochondria are the powerhouse of the cell, that the periodic table is impossible to memorize, and that I’ll never know how to correctly perform an experiment. The only part of science that kept my attention was terrible science jokes:


I chose this class because I’ve heard great things about it, including that it actually makes science interesting. So why not give it a shot? Based on some of the topics on the schedule, SC200 seems like it will be a very entertaining and fun class.