Ever since we were little, our mothers have pushed us to eat our vegetables so that we would grow up to be strong and healthy. Now that we can choose what to eat, we still try to include fruits and vegetables to balance our diets and get enough nutrients. But even if what we are eating seems healthy, is it actually healthy if it is not organically grown? Many people claim that organic produce is healthier than conventionally grown produce. But why is this? And is it true?
Studies by several universities, including Washington State, suggest that organically grown produce is healthier not only for the consumer but for the earth itself. Organic farms replace the nutrients in the soil where they grow their products, while industrial farms tend to draw nutrients out of the soil without replacing them. This causes the soil to erode and eventually become infertile ground saturated with chemicals. Industrial farms also harm the environment through the pesticides they spray on their crops. Excess chemicals run off from the farms into rivers, causing fish to become infertile and river ecosystems to die from the poison.
When consumers hear the word organic, they think “clean.” Organic products have not been sprayed with any pesticides or chemicals, which gives them a natural look, feel, and taste. Added hormones and pesticides are also not healthy for the human body; the food we consume should give us enough nutrients and vitamins without the help of hormones and added chemicals. A study by Washington State University compared organic and conventionally grown strawberries to see which had more nutrients, and the conventionally grown strawberries had in fact lost some of their vitamin content due to the added chemicals.
So it does appear to be true that organic produce is healthier both for the human body and for the environment itself.
http://www.grassrootsonline.org/news/articles/five-reasons-organic-food-good-you-and-planet
I think there is so much recent push toward organically grown foods, but I think the problem is not that Americans don’t want to eat healthier, it’s that shopping organically takes significant time and money, which many people just don’t have. I’ve also always wondered why organic foods need to be more expensive. What are they doing to the products to warrant such a significant increase in price?
A student in a class I took over the summer did this same topic for a short speech and had a totally different opinion. She stated that just because something is organic does not mean it is good for you. Is this still up in the air, or is organic definitely the better choice?
I do believe there is truly a benefit to eating organic fruits and vegetables over conventionally grown produce. As healthy as eating an apple may be, it immediately loses some of its nutrients when chemicals and pesticides are added. But just because organic foods are good for the earth, does that mean they are naturally as good for us? David Klurfeld, chairman of the department of Nutrition and Food Science at a university in Detroit, states that “There’s really very limited information in people on actual health outcomes with consumption of these products.” His full review can be found in this article (http://www.webmd.com/food-recipes/features/organic-food-better). So before you break the bank in order to switch to a healthier lifestyle, consider all opinions and options!
This is interesting to me because I like to think of myself as a healthy eater. Now I am questioning whether I am actually eating healthily or not. After this post, I will definitely try to eat more organic foods, because doing so positively impacts not only our bodies but also the earth!