Author Archives: Trevor Dennehy

Is eating dirt good for you?

One of the most vivid memories I have from childhood is of the time my brother ate dirt. He was an adventurous child when it came to eating non-traditional foods, like cork, stickers, receipts, and yes, dirt. I remember sitting on my kitchen floor drawing, only to glance into the other room to see my brother, younger than me by three years, munching down on the potting soil from a rather large plant my mom had placed on the floor. Quickly, my mother noticed, and washed his mouth out, telling him eating dirt was bad for him and that he shouldn’t do it.

This is probably something we’ve all heard as children. Parents and teachers are certainly not to be faulted for telling us to avoid chowing down on our favorite brand of soil, whether it be sandy silty loam or the limestone-heavy soil we have in this part of the state. The main reason for this fear of eating dirt is the many potential parasites found in it, such as intestinal worms. Parents, wishing to protect their children from the problems that come with parasitic infections, feel that the best way to do so is to keep them from eating dirt at all. Beyond that, it is simply traditional wisdom that eating dirt is ill-advised.

[Image: hookworm]

Science is anti-authoritarian in nature, however, so it makes sense that a recent meta-analysis was published claiming there may be some benefits to eating dirt, at least under certain conditions. The study was conducted mostly by reviewing recorded instances of people eating dirt, many of them anthropological accounts. By its conclusion, some 480 reports of dirt consumption had been gathered, including the earliest ever recorded, by Hippocrates in ancient Greece.

The null hypothesis of this study would be that eating dirt serves no benefit to people, and the alternative hypothesis would be that it has at least some positive health effect. The researchers were unsure about exactly what benefit it would yield, if any. The first explanation they looked for was people eating dirt to fulfill their nutritional needs, which seemed unlikely, as the majority of dirt-eating accounts came from people who ate clay, which is not very high in essential nutrients, apart from calcium.

The next possible reason they considered was eating clay to satisfy hunger, especially in instances where food was scarce. This was also found to be unlikely, as many of the accounts of eating dirt came from societies where food was plentiful.

The conclusion that the authors of this meta-analysis eventually came to was protection from possible pathogens and parasites. Eating dirt is supposed to act as a deterrent by building up one’s immune system. This was supported by the fact that most accounts of consumption of this bizarre food involved pregnant women and young children, like my brother when he ate the dirt. People at these stages of their lives have weakened immune systems, as stated in this study. The researchers are quick to point out, however, that most of the soil people eat comes from deeper layers, while parasites usually live near the surface. They also note that it is common practice to boil the dirt prior to consumption, which leaves me wondering how this helps to build up immunities.

This correlation found between the building of immunities and the consumption of dirt could be explained one of four ways:

Direct Causality- The consumption of dirt causes the build up of immunities against pathogens and parasites.

Reverse Causality- The build up of immunities against pathogens and parasites causes people to consume dirt.

Third Variable- The consumption of dirt and the build up of immunities against pathogens and parasites are separate events, each caused by something separate.

Chance

It would seem that the researchers want us to believe direct causality is responsible for the relationship between the two variables, but seeing as this is a meta-analysis composed mainly of anecdotal accounts of dirt consumption, I feel that much more research needs to be done on this correlation. An experimental study would provide the clearest picture of whether there is a causal link between the consumption of dirt and protection from pathogens and parasites, though that may not be feasible: feeding participants dirt falls into an ethical grey area, since putting people at risk for parasitic infections is dangerous.

Still, the data of the meta-analysis seems to support its alternative hypothesis, and dismissing it outright would risk a false negative. So, at least until further research is conducted, it would appear that eating dirt could help you build up immunities to pathogens and parasites, although it would seem wise to take it from deeper within the ground, and perhaps even to boil it first.

Picture One Source

Picture Two Source

Soils of PA Source

Parasites Found in Dirt Source

Meta-Analysis Source

Pregnant Women’s Immune System Source

Can running help you stay young?

I was raised around running. My dad was a runner his whole life; in high school, he held the school’s record for the 400 meter sprint. (A record which, as he always admits, he broke at the last meet of his senior year, only to have it broken by someone else at the first meet the next year.) When I was younger, my dad would frequently compete in various races and track meets all over the local area, and when I was in seventh grade, I began running with him–or should I say running behind him. That fall I enrolled in cross country and that spring I did track, two sports I participated in every year from then until I graduated.

During that time, I heard about the many health benefits of running, although that was not something I paid attention to. I was into running because it was fun, not because of the health benefits; that’s not something that occurred to me as a twelve-year-old. But now, at 19, I have started to pay attention to the health aspects. Many are rather immediate, such as improved cardiovascular health, stronger joints, and the burning of calories. These benefits are the reason many people continue to run for years.

[Image: man running]

According to a recent study, however, the effects of running could be much more long-term than just burning off some of your daily caloric intake. Apparently, runners could actually live longer than non-runners.

The study, published by Stanford’s medical school, is a longitudinal study, as it followed the same participants for 20 years. The participants, chosen in 1984 from a group of people 50 years or older, fell into two major groups: the first was runners, of which there were 538; the second was made up of non-runners and was approximately the same size.

The study was started in response to the popular scientific opinion at the time, which held that intense exercise like running had a negative effect on older people. This belief most likely arose from the fear that it would cause more older people to break bones now feeble with age. The researchers disagreed, believing that running could make basic tasks, such as walking, easier to accomplish even with the difficulties of aging.

With that in mind, the alternative hypothesis was that running would help slow the effects of aging, while the null hypothesis was that it would not; if anything, the prevailing opinion held that running expedited them.

So, in order to set up the study, participants were asked to answer a yearly questionnaire about how well they were able to perform these basic tasks. By 2008, when the study was published, only 16% of the runners had kicked the bucket, while 34% of the non-runners had gone the same way. Both groups were significantly less able to perform the basic tasks by this point, but the runners reached that level of decline an average of 16 years after the non-runners. It is important to note, though, that the runners began the study running an average of 240 minutes a week, compared to only 76 minutes by 2008. The researchers involved in the study believe the mechanism for these results lies in the runners’ more slender body types, as well as their commitment to health.

[Image: older man running]
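To put those death rates side by side, here is a quick back-of-the-envelope sketch in Python. The group sizes and percentages are the ones quoted above; the implied death counts are my own rough estimates derived from them, not figures taken from the paper.

```python
# Rough arithmetic on the mortality gap described above. Group sizes and
# percentages come from the study as quoted; the implied death counts are
# my own estimates, not numbers reported by the researchers.

runners_n = 538             # runners enrolled in 1984
nonrunners_n = 538          # the non-runner group was "approximately the same size"

runners_dead_pct = 0.16     # ~16% of runners had died by 2008
nonrunners_dead_pct = 0.34  # ~34% of non-runners had died by 2008

runners_dead = round(runners_n * runners_dead_pct)           # ~86 deaths
nonrunners_dead = round(nonrunners_n * nonrunners_dead_pct)  # ~183 deaths

risk_ratio = nonrunners_dead_pct / runners_dead_pct
print(f"Estimated deaths: runners ~{runners_dead}, non-runners ~{nonrunners_dead}")
print(f"Non-runners were roughly {risk_ratio:.1f} times as likely to have died by 2008")
```

Even as a rough estimate, that is a sizable gap between two groups of roughly the same starting age.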

In this case, the alternative hypothesis is to be accepted, as it seems clear that not only were the runners having an easier time with basic everyday activities, but they were also living longer lives. Back in 1984, this would have been considered an unusual result. The study typifies science’s anti-authoritarian nature, and how our knowledge is constantly changing. As far as I could find, however, there are not many other studies of this nature, an absence that could be explained by the file drawer problem: the prevailing opinion about elderly runners may have been so strong that studies looking to disagree with it were never published, or even attempted.

The study isn’t perfect, of course. 538 people is a relatively small sample size for each group, although any more might be difficult to keep track of over such a long time. Additionally, the results could be due to chance, although from the data that seems relatively unlikely. The results could also be due to confounding variables, such as genetic predispositions to cancer and other illnesses that could have caused participants to die, partially skewing the statistics. It’s definitely a study that could benefit from meta-analysis, or from similar studies attempting to replicate its results.

At least for now though, it would seem that running in the autumn of one’s life is advisable. It could make life easier down the road. Just being able to put clothes on and walk upstairs to the bedroom increases one’s quality of life as an older person. Most people say that being old is not that great, but if running is a part of their life, it need not be so bad.

Picture One Source

Picture Two Source

Cardiovascular Health Source

Stronger Joints Source

Study Source

Could exposing infants to peanuts prevent peanut allergies?

I love peanut butter, and I am definitely not one of those many health-conscious people who have decided to go gluten-free, so having a food allergy to one of those things would be really difficult for me. When I was a kid, my parents had me tested to see what allergies I may have, and while I tested positive for allergies to 50+ medicines, grasses, trees, particles, and animals, I consider myself very fortunate to not have any foods listed among them.

These allergies don’t typically develop at birth, but rather early in life. Genetics do play a role, however: according to WebMD, children have a 33% chance of having allergies of their own if one parent has them, and a 70% chance if both parents are allergic to something. People who are genetically prone to allergies typically develop them through frequent close contact with allergens. With this in mind, it makes sense that parents have traditionally been advised to keep their young children away from possible food allergens, especially if one or both parents has allergies.

[Image: allergies]

So when is it prudent to introduce these allergens to children? The American Academy of Pediatrics has traditionally recommended waiting one year to introduce children to cow’s milk, two years for eggs, and three years for peanuts and other nut-related products. The reasoning is that children have weaker immune systems as infants, so exposure to these possible allergens at that time could lead to an increased risk of becoming allergic to them.

In addition to being exposed to these foods by eating them at a young age, some researchers believe in another route of exposure: through the infant’s skin. This idea, combining skin exposure with early oral exposure, has become known as the dual-allergen-exposure hypothesis.

Figure 1: Dual signal hypothesis for the pathogenesis of food allergy

Even as this hypothesis has become popular in recent years, some recent studies have begun to support early introduction of these allergens rather than shielding infants from them. One such study surveyed Jewish school children in the UK and Israel to see if they had peanut allergies: 5171 children were surveyed in the UK and 5615 in Israel. The reason this could be significant is that while the UK conforms to the late-introduction guidelines for allergens, Israel does not. In Israel, children are usually introduced to peanuts at a much younger age, typically around 8 months.

The study’s null hypothesis is that early exposure to peanuts does not provide any protection against peanut allergies, while the alternative hypothesis is that it does.

[Image: peanut butter]

The study’s results showed that 1.85% of the UK children surveyed had an allergy to peanuts. The percentage of Israeli school children with peanut allergies was substantially lower, at 0.17%. So, according to the study’s conclusion, early exposure to peanut products does in fact build up some kind of protection against peanut allergies. Of course, these results could be due to third variables, such as parents disregarding the guidelines, as well as genetic predispositions. Additionally, by limiting the demographic of the study so much (to only Jewish school children), the researchers limited how widely its results can be applied.
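To make the size of that gap concrete, here is a minimal sketch comparing the two prevalence figures. The sample sizes and percentages are the ones reported above; the implied case counts are my own rounding, not numbers taken from the study itself.

```python
# Comparing the two prevalence figures quoted above. Sample sizes and
# percentages are as reported; the implied case counts are my own rough
# rounding, not figures taken from the study.

uk_n, uk_prev = 5171, 0.0185          # UK: 1.85% of children had a peanut allergy
israel_n, israel_prev = 5615, 0.0017  # Israel: 0.17% of children had a peanut allergy

uk_cases = round(uk_n * uk_prev)              # ~96 children
israel_cases = round(israel_n * israel_prev)  # ~10 children

ratio = uk_prev / israel_prev
print(f"Implied cases: UK ~{uk_cases}, Israel ~{israel_cases}")
print(f"Peanut allergy was about {ratio:.0f} times as common in the UK sample")
```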

Furthermore, the study is observational, as the researchers did not manipulate the early introduction of peanuts to any of these children, which would probably be unethical. This is an area that could certainly use more research. More in-depth questionnaires could be administered, asking parents of subjects when they decided to introduce their children to peanuts, and surveying a wider range of demographics would also help strengthen further studies in this area.

This study has, however, changed the way some people think about the early introduction of peanuts to children. The American Academy of Pediatrics now recommends that parents need not wait the traditional three years to introduce their children to peanut products, though it still does not recommend introducing them prior to 6 months of age. So, at least for now, there is little reason to fear introducing young children to peanut products.

Picture 1 Source

Picture 2 Source

Picture 3 Source

Traditional AAP Recommendation Source

Allergy Statistics Source

Dual-Allergen-Exposure Hypothesis Source

Study Source

Updated AAP Recommendation Source

Do Video Games Make People Sexist?

Ever since their creation, video games have gotten a bad rap. The media has accused them of everything from not being art to causing real-life violence. It’s amazing to see the amount of controversy caused by an industry that has surpassed even movies in terms of profits, becoming America’s most popular form of entertainment.

Recently, in the wake of more widespread awareness of political correctness, video games have been accused of fostering sexist behaviors in their players. The basis for these claims is the stereotypical portrayal of women in video games: these animated women are typically defined by flat characterization and exaggerated features, most notably their breasts and behinds, both of which are usually enlarged well past what is considered realistic. Examples of these types of women are all over the medium, from Lara Croft of the Tomb Raider franchise to Cortana of the Halo franchise, pictured below.

[Image: Cortana from the Halo franchise]

Advocates of the theory that video games encourage sexist behaviors in their players, like the researchers who performed this study, believe that games with these types of female characters reinforce a lack of empathy: players do not empathize with the female stereotypes in the games, and that carries over into a lack of empathy for women in real life.

The study was conducted with 154 Italian high school students, who signed up voluntarily. They were randomly split into three separate groups: one was told to play Grand Theft Auto, another played Half-Life, and the third played either a pinball game or a puzzle game, each for 25 minutes. The idea behind choosing these games was that GTA is both violent and sexualized, Half-Life is violent but not sexualized, and the third group’s games were neither, which made that group essentially the control. The participants were not told ahead of time what the researchers were testing for, so it was a blind trial.

The null hypothesis for this study would be that playing violent and sexualized video games does not cause a lack of empathy towards women, while the alternative hypothesis would be that it does.

After playing these games, the researchers asked participants several questions about what they had just done, as well as about masculinity and attitudes towards women. They were given statements and asked to rate, on a scale of 1 to 7, how much they agreed with each.

[Image: screenshot from the study]

They were also shown this picture and asked to rate, on the same scale, how much they believed the woman depicted was suffering. When the results came back, the people who had played Grand Theft Auto showed much less empathy, based on their average scores from 1 to 7 for each question, both for the women in the game they had played and for the woman pictured. Based on these results, the researchers rejected the null hypothesis in favor of the alternative hypothesis.
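For readers unfamiliar with how this kind of comparison works, here is a toy example of comparing average 1-to-7 ratings between groups. The scores below are invented purely for illustration; they are not the study’s actual data, which I don’t have access to.

```python
# Toy illustration of a between-group comparison of 1-7 empathy ratings.
# The scores are made up for the example; they are NOT the study's data.
from statistics import mean

ratings = {
    "GTA (violent and sexualized)": [3, 2, 4, 3, 2, 3, 4, 2],
    "Half-Life (violent only)":     [5, 4, 5, 6, 4, 5, 5, 4],
    "Pinball/puzzle (control)":     [5, 6, 5, 6, 5, 4, 6, 5],
}

for group, scores in ratings.items():
    print(f"{group}: mean empathy rating = {mean(scores):.2f}")

# The researchers' actual analysis compared group averages like these and
# found the GTA group's mean empathy rating to be significantly lower.
```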

Another study was recently conducted in which participants were recruited through a phone survey of more than 50,000 people who were at least 14 years old. The survey asked how often they played video games. Of those surveyed, 12,587 said they played frequently. From those 12,000+ respondents, 4,500 were chosen randomly to participate in the first wave of the study, and an additional 902 were chosen to participate in the third wave, which was the only other wave to ask questions about gender roles.

This study would seemingly have a similar null and alternative hypothesis to the previous one, although whereas the first study measured participants’ empathy towards women, this one measured the more nebulous “sexist attitudes” of the participants. With that in mind, this study’s null hypothesis would be that frequently playing video games does not cause players to develop sexist attitudes, while the alternative hypothesis would be that it does.

[Image: kids playing a video game]

Like the previous study, participants were asked to rate how much they agreed with statements about gender, this time on a scale from 1 to 5. The results of this study seem to indicate that frequently playing video games does not cause players to have sexist attitudes. This was consistent in both males and females.

The two studies, while measuring slightly different things, seem to support diametrically opposed conclusions. Why could this be? For one thing, the results of either study could be due to confounding variables, such as the attitudes of the participants’ friends and family prior to their participation. For another, it is important to look at the design of the studies.

The first study, while it has a smaller sample size, went about its procedure in a more experimental manner, having all participants play assigned games for the same amount of time, rather than just asking how often they played video games. Its questioning seemed to be more in-depth and more varied as well, and its research question more focused.

The second study has a much larger sample size than the first, to its credit, but it seems far more limited in its reach. It is an observational study, which gives researchers less control, and it also measures a less specific problem, which could lead to some inaccuracy. Additionally, as the surveys were conducted over the phone, the study admits to abbreviating its scales to make answering the questions easier. So while both studies have some flaws, I believe the first study to be better conducted.

Keep in mind, though, that the first study was not necessarily measuring whether the games made players more sexist, but rather whether they lessened their empathy towards women. Both studies could use some meta-analysis to check for problems such as the Texas Sharpshooter problem, but for now, it would seem that video games are at least partially guilty of one of the many charges leveled against them.

Picture 1 Source

Picture 2 Source

Picture 3 Source

Accusation of being not art source

Accusation of Violence Source

Study One Source

Study Two Source

Do weight loss apps really help people lose weight?

Americans have long struggled with obesity. According to the National Institute of Diabetes and Digestive and Kidney Diseases, a division of the National Institutes of Health, in 2010, 74% of American men and 64% of American women were either overweight or obese. That proportion has more than doubled since 1962, a fact that leads many to label this trend an epidemic, the cure for which is unclear on a large scale.

On a much smaller scale, people can use diet and exercise to lose weight. Many people consider this a goal they are currently pursuing; it’s something that always comes up in conversations about New Year’s resolutions, especially the ones people quickly abandon by January. But why do people have such trouble with this? It should be a simple process of eating fewer calories than you need to function. In practice it is difficult, because it is hard to tell how many calories are in each specific food, or even how many are needed for daily functioning, which is why many people turn to fad diets that promise quick results.

Some people choose to go the traditional route, however, and in the digital age there are several apps that can help. Apps such as MyFitnessPal allow users to input the foods they eat, and the app in turn calculates which foods are highest in calories and totals their calories for the day. This is meant to help users keep track of how many calories they need to cut to lose weight. That is, it’s supposed to in theory. But does it really?
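To give a sense of the bookkeeping these apps do, here is a minimal sketch of a daily calorie total. The food database and the daily target are invented for illustration; this is not MyFitnessPal’s actual data or code.

```python
# A minimal sketch of what a calorie-tracking app does under the hood:
# look up calories per serving, multiply by servings logged, total the day.
# The lookup table and daily target below are hypothetical, for illustration only.

CALORIES_PER_SERVING = {
    "oatmeal": 150,
    "peanut butter sandwich": 340,
    "apple": 95,
    "chicken breast": 280,
}

def daily_total(log):
    """Sum calories for a day's log of (food, servings) entries."""
    return sum(CALORIES_PER_SERVING[food] * servings for food, servings in log)

todays_log = [("oatmeal", 1), ("peanut butter sandwich", 1),
              ("apple", 2), ("chicken breast", 1.5)]

target = 2000  # hypothetical daily calorie goal
total = daily_total(todays_log)
print(f"Logged {total:.0f} kcal of a {target} kcal target ({target - total:.0f} kcal remaining)")
```

The whole system only works, of course, if what gets logged matches what actually gets eaten.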

[Image: weight loss apps]

A study was recently conducted to figure out whether these apps had any effect on the weight lost by young people. The study was a randomized control trial of adults aged 18-35, who were deemed suitable if they were considered obese or overweight and had a cellphone. From these criteria, the researchers were able to obtain the participation of 365 people, who were broken down into three groups.

The first of these groups used the weight loss app to keep track of the calories they consumed. The second group was given access to personal coaches who would give participants recommendations on what kinds of foods to eat and help them stay on track toward their weight loss goals. These coaches performed a similar function to the app, the difference being that they could create a human dialogue with the participants about their weight loss, a dialogue the app could not reproduce. The third and final group was the control group. Each group was approximately the same size, at 120-123 people.

The null hypothesis of this study would be that weight loss apps do not contribute to the successful loss of body weight; the alternative hypothesis would be that they do.

Each of the three groups was weighed at the beginning of the study, and then again at 6 months, 12 months, and 24 months. 86% of participants responded all the way to the end of the study, making the end result mostly representative of the original sample.

[Figure: average weight change by group (PC, CP, and control) over 24 months]

At 24 months, the subjects who had access to the personal coaches, designated as PC on the graph above, had lost an average of 2.45 kilograms. The control group had lost an average of 1.44 kilograms, and the users of the weight loss app, designated on the graph as CP, had lost an average of 0.99 kilograms.

The study was forced to accept its null hypothesis: weight loss apps do not contribute to the successful loss of weight. The data supports this, as those using the app lost less weight than even the control group, who had no guidance on how to lose the weight at all. This makes sense when one considers how these apps work. Users have to input the foods they eat manually, and the app then calculates how many calories are in each food. This can be inaccurate, as users may not enter a specific enough food, or the correct number of servings, causing the app to calculate a different number of calories than was actually consumed. These incorrect calorie counts could lull users into a false sense of security, fooling them into thinking they can eat more than they actually should. In short, the app could suffer from user error, causing calories to be reported incorrectly.
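Here is a toy example of that user-error effect. All the foods, serving sizes, and calorie values are invented for illustration; the point is only to show how a few wrong entries make the app’s total drift below what was actually eaten.

```python
# Toy example of the user-error effect described above: under-logging serving
# sizes makes the app's daily total lower than what was actually consumed.
# All numbers are invented for illustration.

calories_per_serving = {"pasta": 220, "soda": 150, "cookies": 160}

actual_servings = {"pasta": 2.0, "soda": 1.5, "cookies": 3}  # what was eaten
logged_servings = {"pasta": 1.0, "soda": 1.0, "cookies": 1}  # what was entered

actual = sum(calories_per_serving[f] * n for f, n in actual_servings.items())
logged = sum(calories_per_serving[f] * n for f, n in logged_servings.items())

print(f"Actually consumed: {actual:.0f} kcal")               # 1145 kcal
print(f"What the app counted: {logged:.0f} kcal")            # 530 kcal
print(f"Invisible to the user: {actual - logged:.0f} kcal")  # 615 kcal
```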

The false sense of security provided by inaccurate calorie counts is probably what caused the app users to lose even less weight than the control group. With a personal coach, these kinds of errors are a little less likely, as a person can discern when the subjects make mistakes, unlike the app.

The results of the study could still be due to third or confounding variables, such as each individual person’s determination to lose the weight. The study’s design as a randomized control trial, however, makes this less likely, as participants in one group are probably no more determined than those in another, thanks to the random assignment. Another possible problem is the relatively small sample size of only 365 participants.

This study seems to be one of the first of its kind, although it’s likely that other studies in the field have suffered from the file drawer problem, especially given the negative nature of the results: previous studies may have gone unpublished because of similarly negative findings. Further studies, or meta-analyses, should be conducted to verify whether this result holds true, and adding more participants to get a larger sample size would give a clearer picture. For now, though, there seems to be no reason to rely on a weight loss app to support you on your weight loss journey.

Picture One Source

Picture Two Source

Study Source

NIDDK Source

Obesity Epidemic Source

MyFitnessPal Homepage

Link

My name is Trevor Dennehy, and I am a sophomore political science major and film studies minor. I will probably go to law school after I graduate in a few years. The minor in film studies is just because I am a pretty big film buff. Here is a memorable clip from Network, one of my favorite movies.


Poster for Network (1976)

I’ve always been fascinated by science; in fact, I used to have aspirations of being some kind of scientist, back in elementary and middle school. But when I got to high school, I began to realize science wasn’t just doing fun experiments about how the world works; it involved a good deal of math, a subject I have always disliked. On top of that, a chemistry teacher of mine basically ruined the class for me by incorporating zero labs (except one time, when we literally cleaned test tubes) and instead spending the class lecturing and making us memorize numerous ions, none of which I remember today. So that’s probably why I’m not a science major.

I took this class mainly because of my interest in the exploratory nature of science, rather than the math aspects. This seemed like a class that focused more on what I was interested in. Plus, it fit nicely into my schedule.