
Does cracking your knuckles cause arthritis?

Ever since I was a little kid, I have cracked my knuckles as a nervous habit. I’m not a nail biter, nor a hair twister; I push down on the joints of my fingers until I hear that satisfying “popping” sound that calms my nerves. Because of this habit, everyone I know has told me countless times that I will develop large knuckles or arthritis in my fingers later in life. After several years of cracking my knuckles, I have yet to develop swollen fingers or any pain in my joints, which leads me to question whether there really is a correlation between cracking your joints and developing arthritis.

Image found here.

The “popping” sound we hear when we crack our knuckles comes from the bursting of gas bubbles in the synovial fluid that surrounds our joints. This occurs when we press on the joints or pull and stretch the bones apart (Harvard Health Publications). Research shows that anywhere between 25 and 54 percent of the population cracks their knuckles, with the habit being more common among males than females. Because the habit is so widespread, several studies have been done to find out whether it creates any risks to our bodies (Medical News Today).

One such experiment was published in 1998 by Dr. Donald Unger. He reported that he used his right hand as the control while he cracked the knuckles on his left hand twice a day for 50 years. It is estimated that his left knuckles were cracked up to 36,500 times. After the 50 years, it was found that neither hand had arthritis, nor were there any significant differences between the two hands. This study concluded that knuckle cracking did not cause arthritis (Medical News Today).
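For a sense of where that estimate comes from, it is simply the twice-daily routine carried out over the full span: 2 cracks per day × 365 days per year × 50 years = 36,500 cracks on the left hand.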

However, while cracking your knuckles may not lead to arthritis in your hands, it may correlate with swollen fingers and a weak hand grip. A further study found that 84 percent of chronic knuckle crackers developed swollen joints later in life, compared to a mere 6 percent of non-crackers (Samuel Merritt). Now, does this mean that cracking your knuckles directly causes swollen joints and decreased hand strength? Or is there another variable that creates these harmful effects? Perhaps many knuckle crackers are people who use their hands a lot, such as manual laborers, typists, writers, or painters. They would therefore feel the need to stretch or “relieve” their fingers more often, and so they crack their knuckles. In that case the knuckle cracking would not directly cause the swelling or weakness; it would be due to a confounding variable such as occupation.

Another idea to consider is reverse causation, meaning that arthritis and swelling of the joints cause people to crack their knuckles. That explanation seems highly implausible, so it can reasonably be ruled out. As always, chance is a possibility: it may be a complete coincidence that people who crack their knuckles do or do not develop arthritis or swollen joints.

In conclusion, studies have reported that cracking your joints does not lead to the development of arthritis. Whether it causes swelling and a weaker grip in the hands is still undecided, though a correlation between the two has been reported. For now, I would say that cracking your knuckles does no real harm to your body and may just be an annoyance to the people around you, so continue to “pop” away.

Is Sitting the New Smoking?

Think about how much time you sit during your day. If you are like the average college student, you likely spend long stretches of time sitting in classes, studying for tests, and writing blogs. A friend of mine recently told me that sitting is the new smoking. As a curious (and easily worried) individual, I wanted to see for myself whether sitting for long periods of time can have a negative effect on our health. I wondered: is there a correlation between sitting too much and negative health effects such as early death? And if so, could the correlation be causal, or is it due to chance?

Figure 1

First, I needed to find an unbiased, recent study that could at least give me a better understanding of what we are looking at. I first looked at the US National Health and Nutrition Examination Survey, which tracked people’s activity throughout the day using accelerometers. According to this study, U.S. adults spend, on average, more than half of their day in a sedentary position. Below is a chart depicting the average time one spends sedentary.

Figure 2: http://www.sciencedirect.com/science/article/pii/S0168822712002082

Next I looked at a study conducted by Canadian researchers studying the link between sitting time and mortality rates. The study accounted for multiple possible confounding variables such as physical activity, body mass index, smoking status, and alcohol consumption. As seen in figure 3, the results show a particularly strong correlation between time spent sitting and survival rate over a span of 14 years.

The P-value in this particular case was determined to be less than 0.0001; in other words, if sitting time truly had no relationship with mortality, a result this extreme would be expected by chance fewer than one time in ten thousand. This study makes a strong case that the link between sitting for too long and early death is causal, because it studied a large group of people (17,013 Canadians), accounted for multiple potential confounding variables, and has a considerably low P-value.

Although the researchers have conducted what seems to be a thorough and convincing study, more studies should be taken into consideration, given the small possibility that the correlation between sedentary behavior and a higher mortality rate is due to chance.

I do not have the time or the resources available to me to conduct a thorough meta-analysis myself. With that being said, meta-analyses can be very helpful in establishing a correlation between two variables. Australian researchers included six studies, spanning 1989 to 2013, in a thorough meta-analysis. In total, more than 595,000 people were involved in the studies, accumulating more than 3,500,000 person-years of data (Chau). The meta-analysis concluded that it is very likely that sitting too much decreases one’s lifespan. However, it should be noted that no P-value was made evident.

Figure 3: http://www.ergotron.com/portals/0/literature/other/english/acsm_sittingtime.pdf

Though the mechanism is not clear, some scientists believe that when muscles are not used frequently, lipoprotein lipase activity, an essential metabolic process in the human body, becomes suppressed (Dunstan).

Although the effects may not be immediate, there seems to be a positive correlation between sitting and negative health effects. More time spent sitting per day is generally associated with a higher likelihood of developing serious health issues and dying earlier. After looking into this myself, I will try to stand up and move around to break up my studying. So next time you are cramming for a test or trying to write 5 blogs the day before the deadline, make it a point to move around.

Works Cited:

Dunstan, David W. “Too much sitting – A health hazard.” Diabetes Research and Clinical Practice, vol. 97, Sept. 2012. Science Direct, www.sciencedirect.com/science/article/pii/S0168822712002082. Accessed 13 Oct. 2016.

Katzmarzyk, Peter T. “Sitting Time and Mortality from All Causes, Cardiovascular Disease, and Cancer.” Google Scholar, www.ergotron.com/portals/0/literature/other/english/acsm_sittingtime.pdf.

Chau JY, Grunseit AC, Chey T, Stamatakis E, Brown WJ, Matthews CE, et al. (2013) Daily Sitting Time and All-Cause Mortality: A Meta-Analysis. PLoS ONE 8(11): e80000. doi:10.1371/journal.pone.0080000

Got Milk?

As a child I was always told to drink milk. My babysitter would always make me drink the milk out of my cereal or drink a glass of milk with my bagel. She always said that I needed to drink milk every day to make my bones strong. Milk is calcium-rich, but we can get calcium from other foods we consume. The real question is: is milk truly important?

If you search for milk images on the internet, you will probably find hundreds of images of seemingly endless celebrities with milk moustaches, part of the “Got Milk?” campaign. It was an ingenious way for the dairy industry to push the importance of milk in our diets. But is milk a necessary staple in our diet? This blog is not here to argue the merits of whole milk versus non-fat milk, nor raw milk versus pasteurized milk. It seeks to examine whether drinking milk, of whatever type, has any real merit at all.


Obviously, milk is essential in the early stages of mammalian growth and development. In fact, the defining characteristic of mammals is the presence of mammary glands that produce milk for their offspring. One important feature of all non-human mammals is that they suckle their young only until the young are able to become independent (able to eat solid foods). This varies between species, with some, such as the hooded seal, nursing for a mere four days, while others, such as the orangutan, may nurse for up to seven years! But what does that mean for humans? In the United States, women receive conflicting advice about when to wean their children completely from breastfeeding. The American Academy of Pediatrics recommends one year, while WHO and UNICEF recommend at least two years. No matter the duration, what benefit, if any, is there to consuming milk after the weaning process?

USDA Food Guide Pyramid (daily servings image)

According to the Food Guide Pyramid published by the USDA, humans should consume 2-3 servings of dairy (milk, yogurt, cheese) every day, as we see in the image above. Why? The USDA says that milk and milk products are the best source of calcium, which helps build and maintain strong bones, preventing osteoporosis. But if milk is so important for getting the calcium needed for healthy bones and preventing osteoporosis, why do mammals naturally stop producing milk for their young once they are weaned?

Whenever calcium is lacking in the diet, it is pulled from the bones temporarily, until it can be replaced. But this is not what causes osteoporosis later in life. During the aging process, bone is lost naturally.

However, if calcium is consumed at high rates early in life, the amount of bone density lost as we age is reduced. The uncertainty occurs because studies have not yet determined how much calcium is needed to build optimum bones and teeth.

Long-term studies suggest no correlation between high calcium intake and a reduced risk of osteoporosis. In large Harvard studies of male health professionals and female nurses, individuals who drank one glass of milk (or less) per week were at no greater risk of breaking a hip or forearm than those who drank two or more glasses per week.

Additionally, when researchers combined the data from the Harvard studies with other studies, no association between calcium intake and fracture risk could be established. American adults may not need as much calcium as is currently recommended. In countries such as India, Japan, and Peru, the incidence of bone fractures is very low despite the fact that people there take in less than a third of the U.S. recommended calcium intake (for adults ages 19 to 50).

According to this Harvard milk study conducted by David Ludwig, a professor and pediatrician, milk is only one of many sources of calcium. Calcium can be found in other food sources such as dark leafy green vegetables and some types of legumes. And though the dairy industry boasts about the calcium-rich content of milk, studies show that we may need less calcium in our diets than once thought. And if that is true, the calcium we get from a diet rich in green leafy vegetables and legumes may be adequate.

So is milk truly necessary in your diet? In my research I found that it is extremely helpful but maybe not as necessary as we are made to believe. Would I advise anyone to stop drinking milk based on these findings? No. Milk moustache to your heart’s content!

Works Cited

Dettwyler, Katherine A. “A Time to Wean.” La Leche League International. N.p., 14 Oct. 2007. Web. 3 Oct. 2016.

Harvard T.H. Chan School of Public Health. “The Nutrition Source.” N.p., n.d. Web. 03 Oct. 2016.

Hulk Image

Servings Image

Have plants reached their “peak” carbon consumption?

Anyone who knows anything about botany knows that plants absorb carbon dioxide from the air during photosynthesis. Photosynthesis is one of the most important things a plant does; without it, the plant would not be able to survive and grow. In a study published in the journal Weather, botanists found that carbon uptake by plants in the Northern Hemisphere peaked in 2006 and has been declining steadily every year since. The results of this study have come as a shock to many people in the world of botany; according to Cosmos Magazine, this peak was projected to take place at some point after 2030. Not only is the amount of carbon consumed by plants decreasing, it is decreasing quickly. The decrease in carbon consumption from 2013 to 2014 was equal to the amount of carbon emitted by humans in China over a whole year.


Diagram that illustrates some of the many negative impacts of climate change on the Earth (epa.gov)

Not only are these findings shocking, they are also alarming and signal a big problem for our world. If plants aren’t absorbing as much carbon dioxide during photosynthesis, more and more CO2 will stay in the atmosphere. As a result, the process of climate change, which is a hot-button issue in society today, happens more quickly. These findings bring up several questions: why are plants absorbing less CO2, and how can we help stop this alarming trend?

Figure 1.

Graph showing the increase, peak and subsequent decrease of carbon dioxide consumption of plants – a shocking & alarming finding that could cause big problems for the planet (Weather)

According to the study in Weather, plants are consuming less and less carbon dioxide because, to put it simply, more CO2 is being produced by humans every year. The findings of the study show that 30% more CO2 is produced by humans every year. According to the United States Environmental Protection Agency, this means a CO2 producer the size of China is added to the Earth every single year. Since these extremely high numbers demonstrate that the Earth’s plants simply cannot absorb CO2 at the rate they once could, more measures need to be taken to cut carbon emissions into the atmosphere. This is an issue that has garnered mainstream media attention, with celebrities like Leonardo DiCaprio urging people to take care of our planet and help slow the process of climate change. This study is another convincing piece of data showing the world that climate change is truly one of the biggest issues facing our planet today.

Leonardo DiCaprio (right), pictured here with two elephants and two other environmentalists, is one of the most famous people in the world to actively help maintain the Earth’s environment (Twitter/@LeoDiCaprio)

See you in 7 years…or sooner?

We’ve all been there… that terrifying moment when you’re aimlessly walking around and suddenly you feel a punch in your throat, and then what feels like a never-ending boulder rolling down to your stomach. That’s right: swallowing gum. One of the worst feelings, and one that I bet has caught almost everyone reading this blog by surprise.

Kid’s blog about swallowing gum

I know you’ve heard the myth that you won’t see that piece of gum for another 7 years, but is that really true? NO!

According to Michael F. Picco, M.D., certified in gastroenterology and a consultant for the Mayo Clinic in Florida, swallowed gum does not make a permanent home in a person’s stomach. It cannot be digested along with the food that is swallowed with it; however, it moves through the small and large intestines and, in most cases, exits the body within a short period of time.

For the kids who love their gum, it’s important to make sure that they know why it’s called CHEWING gum, not SWALLOWING gum. The American Academy of Pediatrics (AAP) recalls a case of a young boy with severe constipation, found to be caused by an extreme amount of swallowed chewing gum. It was not one piece of gum that did this, but the child’s habit of constantly swallowing the gum he was given. This rare outcome can occur if you do not chew gum properly (or at all).

If there were to be an experiment on this topic, researchers would need several groups of children: some made to swallow large amounts of gum (potentially creating constipation), some who chew but never swallow their gum, and some who chew and swallow gum at their own rate. The null hypothesis would be that nothing happens when the children swallow different amounts of gum, with the gum simply coming out of the body in stool a few days later. The alternative hypothesis would be that the gum creates a large build-up in the intestines and harms the body.

The National Center for Biotechnology Information (NCBI) notes that since a chewing-gum-like substance has been around since the time of the cavemen, there would be more information on the topic if it were really thought to be doing a ton of harm. But since the only foreseeable outcome is bad constipation for a short period of time if a child swallows a lot of gum, there is not much research left to do.

So, we know gum doesn’t stay in your system for 7 years. But what happens to it while it is inside you?

While the food is being digested and all the good parts of the food (and gum, in this case) are being absorbed into the body, the rubbery base of the gum is left sitting in your stomach, resistant to the acids inside you. So when it does come time for the gum to pass through, it is nothing more than the rubbery remnants of the gum that once was.

So, the bottom line: don’t swallow chewing gum. It’s meant to be chewed and spit out. Although it won’t reside in your intestines for 7 years, why risk the potential discomfort of constipation and the lingering pain that comes with swallowing gum? If you need to put your gum down for a while, you can always stick it behind your ear. 😉

demonstration of how to properly chew gum

Searching for a maniCURE

Growing up, I dreaded getting my nails done. I am the daughter of an ex-manicurist, so our dining room table was often turned into a makeshift salon table. Though this would typically seem like a Libby Lu-obsessed tween girl’s dream, there was one issue: I was a nail-biter.

My nails were never “nice,” and even if they were polished, I would still bite them. My mom tried everything on me: specially formulated “no-bite” polish, acrylic nails, gel nail polish, yet nothing worked. So I wonder: what really causes nail biting, and is there any way to kick the horrible habit?


Why do we bite our nails?

It seems that many impulsive behaviors, such as nail biting, are a result of primitive instincts relating to grooming.

Nail biting can be triggered by a number of things. For a compulsive nail-biter, even mundane everyday activities can trigger a nail-biting binge. So much so that the latest edition of the Diagnostic and Statistical Manual of Mental Disorders includes a category of Obsessive-Compulsive and Related Disorders, which covers pathological grooming behaviors such as nail-biting. This kind of pathological grooming has also been seen in mice, leaving many with severe hair loss.

Another reason people bite their nails is simply that it is satisfying. Many use it as a stress reliever, or it can be triggered by anxiety.

Why stop?

Nail-biting is not only a nuisance; it can also be very dangerous. The habit increases the risk of infection and sickness by leaving your body susceptible to bacteria and germs. Yet, as we discussed in class with regard to the hand-washing experiment, it is unclear how much of the bacteria on our hands is actually harmful.

You know you want to stop, but how?

A 1973 study found habit reversal to be widely effective in decreasing nail biting. Though the trial was successful, a closer look shows that this specific study involved only 12 people. As Andrew taught in class, such studies, no matter how effectively performed, do not provide sufficient evidence to responsibly validate a claim.

However, the scientists involved in this particular study recognized that fact and performed the study again in 1979, this time with 97 participants. Habit reversal once again proved superior, reducing nail biting by 99% over a 5-month observation period. Still, considering what we have learned in class, I find this data insufficient, since the study involved fewer than 100 people and only followed their progress for a short period of time.

Scientists today are even going as far as to test the effect of pharmaceutical drugs on nail biting. A 2013 double-blind, randomized, placebo-controlled trial found that a drug called NAC (N-acetylcysteine) significantly reduced nail-biting in children. Though this is an experimental, controlled study, it is another small study observed only over a short period of time, and it had a large dropout rate. As was instilled in us in class, further testing is required before a solid claim can be made.


So what now?

For me, my nail-biting habit was insignificant. I grew out of it by high school, and now the nasty habit only comes back during finals and other times of high stress. However that is not the case for many.

As the studies presented here show, there is not yet a clear cure for nail-biting, leaving many compulsive nail-biters turning to home remedies, such as regular manicures, to kick their habit. Some even go as far as hypnosis, as exemplified in this video.

Maybe one day we will find a cure, but until research catches up we will just have to deal with a society of short nails and wasted manicures.

Picture Links:

Picture 1

Picture 2

Aspartame: Is it really cancerous?

As a fan of Diet Coke, I have always vouched for its zero-calorie, fat-free formula over its counterpart, regular Coca-Cola. The carbonation and sweet taste have always brought my taste buds pleasure. However, I have always been curious as to how Diet Coke is able to taste so good with a non-existent calorie count on the label. I recently discovered the answer to my curiosity: the ingredient used to create such an appealing taste is a controversial artificial sweetener known as aspartame.

Aspartame has recently been labeled a potentially carcinogenic ingredient, turning many away from products that list its name on the label. It is frowned upon by household health freaks, but are their concerns valid?

Aspartame is an economic blessing for large brand producers. According to the American Cancer Society, the artificial sweetener produces a sweet taste in far smaller quantities than actual sugar. Because a small amount of aspartame produces the same taste as a much larger amount of sugar, the calorie count of the product decreases dramatically. Naturally, from the mindset of a giant producer such as the Coca-Cola Company, this ingredient is appealing financially. However, as beneficial as aspartame is to the Coca-Cola Company, it is potentially just as much of a hazard to the health of all who consume it.


As aspartame became more and more popular among industries looking to save money by mimicking sugar with a more efficient substitute, researchers became curious about the side effects of this artificial sweetener. The American Cancer Society article refers to two kinds of studies done to determine the potential health risks involved in consuming aspartame: one experimental and the other correlational.

Image found Here

Within the experimental study, researchers used rats and gave them copious amounts of aspartame to see the effects it had on their health. According to the article, there was no link between the rats’ health and the amount of aspartame they were exposed to.

As for the correlational studies observing humans, the majority have shown that aspartame has no cancerous effect. A study in 1970 showed a potential correlation between the two, but confounding third variables, such as differences between the diets of people studied in 1970 and in 2016, make the argument shaky at best. The article on the American Cancer Society website references two large studies that examined upwards of 500,000 adults and reached the same conclusion: aspartame most likely, since nothing is definite, does not cause cancer.

As a result, I strongly urge people who refuse to drink Diet Coke or consume products that list aspartame as an ingredient to think twice about writing these products off as unhealthy. Unlike the correlation between lung cancer and smoking discussed in class, there is, as of now, no definite link between cancer and aspartame. Therefore, being a rational person, I will continue to drink and enjoy my favorite soda without worrying about the supposed health risks of aspartame.

Do Celebrity Endorsements Promote Unhealthy Eating?

Whether we like it or not, every time we turn on the TV, click on a YouTube video, or wait for a news article to load, we come face to face with advertisements featuring celebrities. As we know, celebrities endorse all types of products, from clothing brands to cleaning products to bath and beauty, and everywhere in between. One of the largest markets with celebrity endorsements is the food market. Beckham at Burger King. The Manning Brothers selling Oreos. Sofia Vergara donning red lipstick and sipping on a Diet Pepsi. As entertaining as all of this celebrity screen time is, could it be having a negative effect? Consumers are enticed by the sight of their favorite celebrities, making them more willing to buy a product. But could all of this celebrity endorsement actually be a cause for unhealthier eating habits of consumers?

The direct causation would be celebrity endorsements causing unhealthy eating habits. Since a consumer’s unhealthy eating habits cannot cause celebrity endorsements, we can rule out reverse causation. However, confounding variables could still have an effect. For example, people who watch more TV are likely to be more inclined to snack on junk food in the first place.

I’ve thought out two hypotheses that could be the outcomes of any experiment conducted to answer my question.

  • Null hypothesis—Celebrity endorsed advertisements do not cause unhealthier eating habits.
  • Alternative hypothesis—Celebrity endorsed advertisements cause unhealthier eating habits.

A study performed at the NYU Langone Medical Center last June speaks directly to my question. Researchers at NYU discovered a correlation between the frightening rise in childhood obesity and junk food advertised by well-known musicians and pop stars. Their study found that very few celebrities endorsed natural foods like fruits, vegetables, and unprocessed grains, while high numbers of famous people endorsed snacks, junk food, and chain restaurants.

The researchers began by gathering data on musicians. To do so, they looked through Billboard’s Top 100 from 2013-2014, then analyzed each artist’s popularity with the youth. They discovered that 65 of the 165 celebrities picked accounted for 57 different food product endorsements. The scientists then researched the food brands endorsed by celebrities, finding that 21 of 26 products were considered unhealthy. However, this study was purely observational; no experiments were conducted. It also cannot be taken as gospel, because “unhealthy” in an observational sense is subjective.
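To put those numbers in perspective, 21 of 26 works out to roughly 81 percent of the endorsed food products being judged unhealthy, and 65 of 165 means that about 39 percent of the celebrities examined had at least one food endorsement.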

Luckily, in 2006 the University of Liverpool conducted an experimental study with a similar hypothesis. In this study, 181 children ages 8 to 11 were asked to view a 20-minute cartoon with commercial breaks containing one of three advertisements or regular footage of the famous British TV sports anchor Gary Lineker. The three commercials advertised Walkers Crisps (endorsed at the time by Lineker), another unnamed snack, or a children’s toy. While they watched the show, the children were given two different bowls of chips, one marked “Walkers Crisps” and the other labeled “Supermarket Crisps.” Although both bowls were actually filled with Walkers Crisps, the children who viewed footage of Lineker or the commercials for Walkers Crisps ate more from the bowl labeled “Walkers” than the children who watched the other commercials.


Gary Lineker with his Walkers Crisps

This study is interesting because it provides evidence that children are affected by celebrity endorsements even when the commercialized product isn’t being broadcast to them directly. As we discussed in class, there is still room for chance, especially considering there is no data on the p-value of this experiment. However, while this one experiment cannot prove for sure that celebrity advertisements cause unhealthier eating habits, the evidence weighs against the null hypothesis, because a correlation was observed.

What I’ve taken away from researching this question is that I should be more careful about how much I allow celebrities and the media in general to influence my purchases, especially food products. I take pride in the healthy lifestyle I try to live, and holding myself back by falling for advertising tricks would be foolish. Even though chance is always a possibility, the two studies I’ve looked at lead me to believe there’s a strong correlation between unhealthy food consumption and celebrity backings. However, before we could be fully confident in the alternative hypothesis, we would need to test it more thoroughly. An interesting way to take this study further would be to have participants track their consumption of celebrity-endorsed, unhealthy products and record changes in their health over an extended period of time. Researchers would then compile the data, observing whether or not the food contributed to a decline in healthy living.