Author Archives: Matthew O'Brien

Does your eye color affect your vision?


Josh Hamilton

Have you ever been in a situation where you needed to smile for a picture outdoors but just could not help squinting because of the sunlight? For the longest time, I accepted that this was just some strange idiosyncrasy of mine that prevented me from being very photogenic outdoors, until a story about baseball player Josh Hamilton made me think otherwise. Hamilton, a former professional outfielder, had far superior batting statistics when playing at night (a more than 100-point difference), a disparity he attributed to having blue eyes in the sun. Is there legitimacy to his claim? Is eye color a factor that affects properties of sight?


Admittedly, I am not the first person to raise this question, but it is rather difficult to find any enlightening scientific data on the topic. This study, however, provides strong experimental backing for a relationship between macular pigment and visual performance in glare conditions. By studying reactions to light stimuli in 36 healthy adults, researchers determined that higher macular pigment densities led to better handling of glare. These findings can be synthesized with another study that links macular pigment density to eye color: researchers were able to associate lower densities with lighter eye colors. The likely explanation is that people with lighter eye colors are more stressed in glare conditions, and that additional stress accelerates macular degeneration. These findings are consistent with the idea that light-eyed people might be at a disadvantage with regard to vision clarity.


But what about other aspects of good vision? This is where the data cease to support any kind of eye-color-induced eyesight disparity. There are no studies supporting the notion that people with a particular eye color have better vision overall. There is a study, however, suggesting that different eye colors might confer advantages in certain areas of sight. For example, it found that, generally speaking, dark-eyed individuals perform better at reactive tasks such as hitting a baseball, while light-eyed people slightly outperformed in self-paced activities such as hitting a golf ball. This subtle difference might lend Josh Hamilton's claims some slight support, though the study ultimately concluded that there were no significant differences.

This issue can also be examined with statistical analysis. In this article, the author attempts to demonstrate a statistical relationship between eye color and a night-day batting performance disparity. He began by determining that, on average for all players, there was no significant difference in batting numbers between day and night games. He then analyzed whether this held true for batters of specific eye colors. The statistics show that, if anything, dark eye color slightly hindered performance rather than aiding it as Josh Hamilton would suspect. The data support the idea that vision is largely unaffected by eye color.

Take away: It is true that people with light-colored eyes might experience a slight disadvantage due to glare, but beyond that there is no reason to suspect that having a certain eye color comes with any kind of eyesight advantage or disadvantage.

Note: The pictures themselves are links to their sources.

Can thermonuclear warheads protect Earth from Asteroids?


Meteor exploding over Russia

Sixty-five million years ago, the fifth and most recent mass extinction famously brought an end to the dinosaurs. It occurred when a massive asteroid, about 10 km across, struck the Earth near the modern-day Yucatán Peninsula. While it is certainly unlikely that Earth will be hit by an asteroid large enough to cause a mass-extinction event any time soon (experts put the odds at about 1 in 600,000), the topic of asteroid avoidance is taken very seriously by world governments and space agencies. Perhaps this is because modern humanity is no stranger to the perils of what comes from beyond the atmosphere. On June 30, 1908, for example, the Earth was taken by surprise in what is now referred to as the Tunguska Event. Out of nowhere, an explosion more than 100 times as powerful as the atomic bomb dropped on Hiroshima flattened 2,000 square kilometers of forest in rural Russia. Residents 35 miles away reported being blown off their feet. Most recently, in 2013, 1,500 people were injured after a meteor exploded 12 miles above Chelyabinsk, Russia. Scientists did not see it coming. Since then, the topic of thwarting credible threats from medium-sized space rocks (164-492 feet in diameter) has been the subject of many studies and proposals. The questions I pose are: Is it possible to use nuclear weapons to thwart these threats? Can they be deployed effectively on short notice? Is this idea actually being considered?

Who is considering this?

In 2005, the United States Congress mandated that NASA develop strategies to detect and subsequently deflect or destroy incoming meteors that could cause loss of life or serious damage on Earth. In March of 2007, NASA presented Congress with a detailed report outlining plans to coordinate with other government agencies to improve detection and response capabilities. The report concluded that nuclear scenarios are 10 to 100 times more likely to be effective against asteroids than any non-nuclear option considered. It did express concern, however, that destroying the target could be potentially hazardous, as the broken pieces might still head toward Earth. To account for this, other options involving nuclear detonation should be considered. For example, computer simulations have suggested that detonating underneath the asteroid could alter its path enough to miss the Earth, depending on its distance and trajectory at the time. The significance of this NASA report is that the agency does in fact feel that nuclear weaponry is our best hope of thwarting these types of threats should we need to in the near future.

Feasibility and deployment:

Nine countries command a combined stockpile of 15,375 nuclear weapons, roughly 2,000 of which can be deployed on a moment's notice. While these facts certainly have some frightening implications for mankind, perhaps these weapons can save lives without ever having to take one. The bombs vary in yield and size and can be chosen based on which best suits the situation. Blast yield is not the biggest concern, as weapons with yields of 100 megatons of TNT have been developed before (the USSR's Tsar Bomba). The current problem lies with delivery methods. Modern intercontinental ballistic missiles cannot feasibly be launched into space at an asteroid, and the famous human-sacrifice scene from the movie Armageddon is certainly not ideal. According to researchers at Iowa State University, deflection is not possible if


Tsar Bomba

we are given short (several weeks) notice of an impending collision. Fortunately, they are not without a potential solution. They pitched a design for what they call a Hypervelocity Asteroid Intercept Vehicle (HAIV). With this idea they managed to solve two key problems: first, that the bomb would be far more effective if detonated inside the asteroid rather than on the surface, and second, that a high-speed impact might defuse the nuclear warhead by preventing it from detonating. They solved both with a two-piece spacecraft. The first piece would smash into the asteroid, creating a crater. It would be followed by the nuclear warhead, which would detonate in the crater immediately before its own impact could disarm it. The lead researchers, Brent Barbee and Bong Wie, claim that $500 million is needed to test it, and so far it has gone unfunded. NASA has acknowledged the idea, however, and has expressed enthusiasm at the prospect of further collaboration.


Getting hit by asteroids is not something that should keep us up at night, but the thought that we can use one of our thousands of nuclear weapons to save us in a doomsday scenario is quite captivating. This is not science fiction, and it will be interesting to see what progress is made on this front in the years to come.


How does daylight savings affect public health?

Most (but not all) Americans participate in daylight savings each year, moving clocks forward by one hour in March and back on the first Sunday in November. The history of daylight savings is long and complicated: it was used to support the war efforts of both World Wars and to conserve energy during the 1973 oil embargo. The current schedule was set by an act of Congress in 2005 and has been observed that way ever since. Daylight savings has always been controversial; it seems to defy logic, and its relevance is under constant scrutiny. The economic impact of a longer evening has also been the subject of much criticism. I will not discuss the logistical or economic implications of daylight savings in this post; rather, I will apply scientific study to the idea that the current practice might actually be impacting public health and safety.

Risk of heart attack:

Naturally, skipping an entire hour of the night to implement daylight savings shortens the sleep that most people on fixed schedules get that night. In 2011, a team of researchers from the Swedish Register of Information and Knowledge about Swedish Heart Intensive Care Admissions (RIKS-HIA) conducted a study using statistical analysis to demonstrate that there is indeed an increase in the number of heart attacks suffered during the first week after the start of daylight savings. They were able to conclude (at a 95% confidence level) that there was a 4 percent increase in heart attacks attributable to the beginning of daylight savings. The mechanism behind the causal relationship might be more complicated than just losing an hour of sleep, according to an article published by Medical Daily. It postulates that the adverse health effects (including heart attacks and strokes) are likely more a result of the sudden change of environment than of one hour less sleep. This idea is supported by a 2007 study involving 50 test subjects that examined the disruptive effects of daylight savings on natural human adjustment to seasonal changes in sunrise. Essentially, the researchers found that the participants transitioned through the changing time of sunrise well without DST but did not when DST was in effect. Considering the evidence that circadian rhythms can affect heart attacks, this mechanism is certainly plausible, and the effects could theoretically be avoided by eliminating daylight savings.
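To see how a result like "4 percent increase, 95% confidence" is typically framed, here is a minimal sketch with hypothetical counts of my own invention (this is not the RIKS-HIA authors' actual method or data) of a log-normal confidence interval for the ratio of two event counts:

```python
import math

def rate_ratio_ci(after: int, baseline: int, z: float = 1.96):
    """Approximate 95% CI for the ratio of two Poisson event counts
    with equal exposure, via the usual log-normal approximation.
    Illustrative only; the study's actual model was more involved."""
    ratio = after / baseline
    se_log = math.sqrt(1 / after + 1 / baseline)  # std. error of log(ratio)
    lower = math.exp(math.log(ratio) - z * se_log)
    upper = math.exp(math.log(ratio) + z * se_log)
    return ratio, lower, upper

# Hypothetical counts: 2600 heart attacks in the week after the shift
# vs. 2500 in a comparison week give a ratio of 1.04 (a 4% increase);
# the interval shows how precise that estimate would be.
ratio, lo, hi = rate_ratio_ci(2600, 2500)
```

The width of the interval, not the point estimate alone, is what lets researchers call an increase statistically meaningful.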

Increase in stress:

There is experimental evidence to support the possibility that the sudden shift of time during daylight savings can cause stress. The 2014 study, which lasted 13 years, examined differences in cortisol levels among 27,569 people who were subjected to different sunrise times. Cortisol level is a key indicator of stress. The researchers also made sure to exclude anybody taking a confounding medication or having a preexisting condition that might skew baseline cortisol levels. The data showed a 5% increase in blood cortisol for each hour that sunrise was pushed back. The extremely damaging effects of chronic stress are well documented and certainly worth preventing. If the abrupt delay of sunrise can act as an additional stressor on humans, it raises the question of whether it is really worthwhile to continue the tradition of daylight savings.

Other considerations:

The health effects of daylight savings might not be entirely negative, according to some studies. For example, better road visibility due to a later sunset might prevent some car accidents. This study, conducted in Minnesota, found that even though there are more cars on the road when daylight savings is in effect, there is still a noticeable decline in motor vehicle crashes. This supports the idea that DST can actually be helpful and practical in some ways during the later parts of the day. Another study, involving 23,000 children, found that more daylight in the evening leads to a decrease in childhood obesity. The mechanism lies in the fact that kids play more when the sun is out than when it is not. It is important to note that the changes were rather small, but impactful when extrapolated to an entire population.

Take home message: 

Daylight savings originated with a specific purpose that is no longer relevant today, yet it remains an integral part of our yearly schedule. Scientific studies establish causal links between daylight savings and several noteworthy health risks, as well as some benefits. Different people might weigh the pros and cons differently, but I conclude from my research that this is certainly a conversation worth having and worth researching further. Lives could depend on it.


Do glasses worsen myopia in nearsighted youth?

I have been a contact lens wearer for more than five years, and I think they are great. A few seconds in the morning gives me the gift of perfect sight without having to wear a primitive piece of eyewear on my face. Unfortunately, the second I take them out for the night, I am essentially blind at distances farther than an arm's length! I am certainly not alone. According to the American Optometric Association, nearly thirty percent of Americans are affected by myopia (nearsightedness), which is caused by an elongated eyeball and/or incorrect curvature of the cornea; light entering the eye does not come into proper focus. Contact lenses and glasses correct this. Myopia can be inherited genetically and provoked by environmental factors. I can say that my vision is definitely worse now than it was five years ago. I have chalked this up to some sort of "dependence" developed as a result of corrective lens use, and I have accepted it as a natural consequence of wearing them every day. This Nigerian study found that 64 percent of students feel the same way. Is there validity to this belief? Would people be wise to wear their lenses sparingly to avoid worsening myopia? Furthermore, is there a difference between contact lenses and eyeglasses with regard to myopic progression?

A simple Google search will return advice from many laymen claiming the affirmative: that corrective lenses do in fact worsen eyesight. The science, however, leads in a different direction. Testing this is not easy because it is normal for eyesight to worsen naturally with age, but this study offered valuable data. The scientists reasoned that if prescription glasses worsened eyesight, then undercorrecting should slow that deterioration. To study this, 94 children aged 9 to 14 were randomly assigned to two groups. The control group was fully corrected, while the experimental group was given prescriptions 0.75 units below full correction. Their eyes were measured yearly. The results of the study were surprising: the control group eventually leveled off, while the experimental group experienced elongated eyes and worsening vision. This study debunked the idea that a weaker prescription will slow myopic progression by showing that it in fact accelerates it. Those with proper prescriptions did not experience worsening vision as a result. An interesting note: this was a rare example of a study that had to be stopped prematurely because the results were so immediately obvious. We discussed ethics in class with regard to this type of occurrence. The researchers concluded that continuing the study would be unnecessarily harmful to the experimental group. Undercorrection is no longer an acceptable practice for attempting to slow myopia.

The takeaway so far is that worsening nearsightedness as a result of corrective glasses is a complete myth. But does the same hold true for wearers of contact lenses? According to a project called the Adolescent and Child Health Initiative to Encourage Vision Empowerment (ACHIEVE), children who wore contact lenses experienced no clinically relevant increase in myopic progression. The researchers determined this by randomly allocating 484 children who had not previously worn contacts into two groups: one that wore contact lenses and one that wore glasses. They then studied the progression of myopia in the two groups and found no statistical difference.

Take away: The notion that nearsighted people will experience more rapid eyesight deterioration by wearing corrective lenses is false. Undercorrecting to avoid worsening eyesight will actually worsen it! Contact lenses are safe with regard to myopic progression.



Play Video Games for a Powerful Brain

America is in the midst of one of the slowest economic recoveries in its history, but that has not stopped the video game industry from exploding in recent years. According to the Entertainment Software Association, the industry generated more than $22 billion in revenue in 2014 and is growing more than four times as fast as the broader U.S. economy. Much of this revenue is driven by adolescents (12 to 17 years of age), 97 percent of whom play video games according to a recent ABC News article. Strikingly, more than half of adolescents report playing video games on an almost daily basis. I must admit that I was no exception to this trend. I played for at least a couple of hours a day when I fell into that age category, and I likely would have played more had my mom not feared that my brain would rot. Perhaps out of pure spite, I rejected the notion that gaming could be at all harmful to the mind and continued to play as often as I was allowed. I still contend that parents in fear of rotting brains need not worry, and in this post I will explore my hypothesis that playing video games actually improves cognitive function and perhaps even grows the brain in a physical and measurable way.

Note: In this blog I will not seek to discuss the social and moral implications of the content that many of today's popular games possess. The consequences of video game violence, drug references, sexual innuendo, etc. are a fiercely debated question that has yet to be fully answered and will not be considered in this post.

In the course of my research, I found studies that show benefits going beyond enhanced quick decision making and hand-eye coordination. In this experiment, researchers were able to establish that playing video games with strategic elements leads to improved strategic thinking and enhanced brain power. The experimental design was simple yet enlightening: researchers split 72 test subjects into three groups. The control group played The Sims, a game that requires no strategic thinking. The two experimental groups played StarCraft, a real-time science fiction strategy game; one of them played on a higher difficulty that required more complex strategic solutions to problems presented in the game. The researchers addressed potential confounding variables with the following measures: all of the groups had roughly the same median age and prior experience with video games; all participants were female undergraduate students; and all three groups had roughly the same median score on benchmark tests of strategic thinking prior to the experiment. Each group played 40 hours of its game over a six-week span. The results were measured using a complex battery of cognitive tests to determine whether the groups playing StarCraft saw an increase in brain function compared to The Sims group. The tests showed that not only did the StarCraft groups perform better, but the benefits derived from playing seem to be proportional to the complexity of the game. In other words, the group playing the more advanced version of StarCraft saw even greater cognitive gains than the group playing the less complicated version. These results support the claim that video games can help improve strategic decision making.
The researchers were unable, however, to establish a firm physiological mechanism to explain their results (and to confirm my hypothesis), so I continued to research to try to find a causal link between video games and brain power…

To support my claim that video games directly improve the brain and its functions (and to explain the findings of the last study), I sought evidence that video games have a profound impact on the anatomical structure of the brain itself, and I encountered a 2013 experiment that demonstrated just that. The experiment was conducted by the Max Planck Institute for Human Development. Scientists there used magnetic resonance imaging (MRI) to measure any volumetric increases that might have occurred in the brains of people who played Super Mario 64 daily for 30 minutes over a two-month span. They also monitored the brains of a control group whose participants did not play video games during that period. They concluded that playing the video game sparked neurogenesis (the creation of new neurons in the brain) and an overall increase in grey matter (where nerve cells are located). Growth was detected in many important regions of the brain, including the right hippocampus, the right prefrontal cortex, and the cerebellum. According to the Mayfield Clinic, these structures are critical for functions such as hand-eye coordination, short-term memory, strategic thinking, and other cognitive tasks. The researchers also noted that the ability to train specific parts of the brain with video games could have very valuable therapeutic applications that should be researched in more depth.

Based on my research, I conclude that playing video games is likely to improve many aspects of cognitive function by training the brain in a way that results in measurable volumetric increases in brain matter. I warn, however, that at present there is a lack of data to show the long-term effects of frequent video game play. It is unknown whether the initial benefits will erode quickly when one stops playing or if they will continue to benefit the player well beyond that. Despite this, I conclude with confidence that playing video games is a fun way to do your brain some good.


Get it out of your head!

SC 200 is designed to make us non-majors better consumers of science so that we gain an appreciation for the scientific process and are able to intelligently consider the many scientific claims that will be made in our lifetimes. Essentially, we are here to learn not to blindly believe whatever anybody tells us! In the spirit of being better-informed science consumers, I felt it would be interesting to discuss and debunk three common and prevailing scientific misconceptions so that we can all stop believing them!

Seasons are caused by distance to the sun

The idea that it is hotter in the summer because that is when the Earth is nearest the sun is completely false. In fact, we are actually farther from the sun in the summer than we are in the winter. The correct explanation for the seasonal weather changes we experience annually is the 23-degree tilt of the Earth's axis. In the summer, the northern hemisphere is hit more directly by the sun's rays than it is during the winter, when we are tilted slightly away from them. The southern hemisphere experiences this in opposite fashion. Source

One human year equals seven dog years. 

There is no question that humans tend to outlive their four-legged best friends by a good many years, but simplifying the lifespan comparison to the common 1:7 ratio is misleading and incorrect. There are two main reasons for this. First, dogs develop at different (faster) rates than humans do. The first year of a dog's life corresponds more closely to about 15 human years, while the last few years of a dog's life correspond to about 4 each. The second reason to drop this myth is the rather large variance in the lifespans of different-sized dog breeds. For example, living longer than 15 years is much more of a feat for large dogs (labs, etc.) than it is for small ones that weigh less than 20 pounds. Source
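The numbers above suggest a piecewise conversion rather than a flat multiplier. A minimal sketch in Python, assuming (as a simplification) that the first year counts as 15 human years and every later year as 4; in reality the curve varies by breed and the second year is often weighted differently:

```python
def dog_to_human_years(dog_age: float) -> float:
    """Rough piecewise dog-to-human age conversion.

    First year of a dog's life ~ 15 human years; each year after
    that ~ 4 human years. Breed size shifts these numbers in practice.
    """
    if dog_age <= 0:
        return 0.0
    if dog_age <= 1:
        return dog_age * 15
    return 15 + (dog_age - 1) * 4

# A 10-year-old dog maps to 15 + 9 * 4 = 51 "human years",
# not the 70 that the 1:7 rule would give.
```

Even this simple two-piece rule shows why the flat ratio overstates the age of older dogs and understates the maturity of puppies.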

There is no gravity in space

Just because one does not necessarily "fall" in space as one would on Earth does not mean that there is no gravity. In fact, there is gravity everywhere in space, and the force can act on objects extraordinarily far away. Gravity binds the Earth to the Sun and even keeps every star in the galaxy revolving around the supermassive black hole at its center. Gravity is a relatively weak force compared to the likes of electromagnetism and the nuclear forces, but it is everywhere. This article sums it up nicely.

There are many more misconceptions to be discussed, but most are a bit more outrageous or obvious than these. I wanted to pick a few that many college-aged people such as ourselves tend to actually believe, in hopes that we can let go of them. These three are a good start!


Should we eat the whole egg?

Egg whites are all the rage nowadays. From McDonald's to Chick-fil-A, many companies and brands have been making efforts to appeal to the growing number of consumers who feel that leaving the yolk out of their eggs is the healthy choice. Is there validity to this line of thinking, or are egg whites just another diet fad that shall soon pass?

Argument for egg whites:

According to the Livestrong Foundation, an egg without the yolk remains an excellent source of natural protein, with more than two thirds of the protein found in a whole egg. In other words, it is wrong to assume that ditching the yolk means sacrificing the protein benefits of egg consumption. Additionally, egg whites are completely fat free, making them an excellent snack or breakfast add-on for anyone who wishes to lose or maintain weight. According to Livestrong, fat-free natural protein sources are an excellent way to lower the risk of heart disease. Another benefit of ditching the yolk is that you also do away with the very large amount of cholesterol found in it: a single egg yolk nearly reaches the maximum daily cholesterol intake recommendation of 300 mg.

Argument for keeping the yolk:

As it turns out, cholesterol isn't the only thing being tossed away with the yolk. According to a nutrition facts chart found here, to save a little bit of fat from the yolk one must sacrifice a host of valuable nutrients. For example, the yolk of an egg contains all of its vitamin A, vitamin D, and vitamin B12, and almost all of its calcium. If you take a look at this chart from Harvard University, it becomes apparent that the vitamins in the yolk have the potential to do things such as reduce cancer risks and revitalize hair, skin, and nails. The yolk is also rich in omega-3 fatty acids, which have a whole list of their own health benefits. When a dose of healthy antioxidants is added to the list, it seems pretty illogical to discard an egg yolk to save a couple of grams of fat that can actually be healthy.

What to do?!

At first glance, eating egg whites seems to give most of the good and almost none of the bad associated with egg consumption. This is a dramatic misrepresentation of the truth. The yolk, despite containing cholesterol and a few grams of fat, also happens to be the source of many great health-enhancing vitamins and oils that should be part of any nutritious breakfast. Unless you are diabetic and need to cut cholesterol, or there is just no room in your diet for a few grams of fat, disposing of the yolk is definitely not the healthier choice. Eat the yolk and look elsewhere for fat savings if necessary!




Are there more than four dimensions?


Warped fabric of spacetime (Source)

Start with the Massive

It has been more than 100 years since Albert Einstein finalized his famous theory of general relativity, describing four-dimensional spacetime and redefining how mankind views its universe. In short, he was able to demonstrate mathematically that time is not absolute as we once thought, but rather warped by gravity and by the relative speeds of objects. For example, time moves ever so slightly faster at high altitudes on Earth than it does at low ones. The effects are almost negligible to humans, although GPS satellites must be periodically adjusted to account for the fact that time moves faster for them than it does for slow-moving humans closer to the center of the Earth. If you had a twin who left at nearly the speed of light and returned, you would have aged and likely died while they would have perceived only a short time passing by (read about the Twin Paradox). It seems like science fiction, but it is true, and it has been experimentally demonstrated using extremely accurate atomic clocks on fast-moving planes and rockets (example here). Einstein's revolutionary theory describes the universe well on a massive scale (galactic, etc.), but not at all on a minutely small one (subatomic).
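The twin scenario follows directly from the Lorentz factor of special relativity. A minimal sketch in Python (the speed and trip length are made-up illustration values, not figures from any experiment):

```python
import math

C = 299_792_458.0  # speed of light in m/s

def lorentz_factor(v: float) -> float:
    """Time-dilation factor gamma = 1 / sqrt(1 - v^2 / c^2).

    A clock moving at speed v ticks slower by this factor
    relative to a stationary observer.
    """
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

# Illustration: a traveler moving at 99.9% of light speed.
gamma = lorentz_factor(0.999 * C)   # roughly 22.4
traveler_years = 50 / gamma         # roughly 2.2 years pass for the
                                    # traveler while 50 pass on Earth
```

The factor grows without bound as v approaches c, which is why the traveling twin's clock lags so dramatically at extreme speeds.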



Now the tiny

Fortunately, the physics of the very small did not go unexplored in the 20th century. The development of the mathematically and conceptually complex framework called quantum mechanics has told us a lot about how the fundamental forces act on a subatomic scale. It has led to the discovery of the building blocks of protons and neutrons (quarks), the understanding that light can also behave like particles and matter like waves, and the discovery of antimatter, amongst other things. It also proposes a hypothetical particle, the graviton, to explain how gravity might act on mass across unlimited distances, though gravitons have never been observed. Much of the work of contemporary particle physicists has centered on observing and proving the existence of the many particles that govern the fundamental forces, and in this endeavor they have had a great deal of success in recent years. The search for particles to test quantum theories is the main purpose of the famous CERN Large Hadron Collider in Geneva, Switzerland. Much of quantum theory is very new and absent from many high school and college textbooks.


Elementary particles of quantum mechanics (source)

How do these two generally accepted theories come together?

Short answer: they really don't, and that is the question that has kept physicists busy for the past century. The mathematics of the two theories are incompatible and inconsistent with each other. Both have observational evidence validating them, yet they do not follow the same rules of physics. The goal of modern theoretical physics is to develop what is called a Grand Unified Theory, bridging the gap between general relativity and quantum mechanics and unifying the forces. None of the most influential people in physics have been able to do this. In fact, many believe that classical physics is not capable of producing such a theory.

So did they all give up?

Not at all. Many have, however, accepted the possibility that deriving a grand unified theory will require abandoning the constraints of the four-dimensional reality that we can perceive. It is thought, for example, that gravity might seem so weak compared to the other forces because most of its energy is manifesting in other spatial dimensions. One of the leading schools of thought in this area is superstring theory. It takes us much closer to a grand theory, but its mathematics requires the existence of at least 10 dimensions; some versions require as many as 26. Opening our minds to this possibility might be the key to understanding the universe.

What would a 5th or higher dimension even look like?

Short answer: we do not, and probably cannot, know. Our perception is limited to the three spatial dimensions (X, Y, Z) and the fourth, time (t). While we can continue to mathematically probe higher dimensions and gather more and more evidence of their existence, we will never be able to perceive and experience them. A helpful way to think about it is to imagine a two-dimensional being. If such a being existed (picture a shape on a piece of paper), it would have no concept of what the third dimension could possibly look or be like (read about Flatland). Yet as three-dimensional beings, we humans can easily observe and manipulate something two-dimensional. If a three-dimensional object were to visit a two-dimensional plane, its true form would not be perceived correctly, only as a 2D cross section. The same holds true for us and higher dimensions. If they do exist, we are just as oblivious to their reality as a two-dimensional shape would be to ours. If there are five-dimensional beings, we are only perceiving their three-dimensional cross sections and are completely oblivious to their true form (pretty mind-blowing).
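The cross-section idea can even be made concrete with a little arithmetic (a minimal illustrative sketch, not from any physics source): a sphere of radius R sliced by a plane at height h appears, to an observer confined to that plane, as a circle of radius sqrt(R² − h²). The function name here is my own invention for illustration.

```python
import math

def cross_section_radius(sphere_radius: float, slice_height: float) -> float:
    """Radius of the 2D circle that a plane at slice_height cuts from a sphere.

    A hypothetical 2D observer living in the slicing plane would perceive a
    passing sphere only as this circle, growing and then shrinking.
    """
    if abs(slice_height) > sphere_radius:
        return 0.0  # the plane misses the sphere entirely
    return math.sqrt(sphere_radius ** 2 - slice_height ** 2)

# A unit sphere passing through the plane of a 2D world:
for h in [-1.0, -0.5, 0.0, 0.5, 1.0]:
    print(f"h = {h:+.1f} -> circle of radius {cross_section_radius(1.0, h):.3f}")
```

The 2D observer would see the circle swell to its maximum (the sphere's equator) and vanish again, never grasping the sphere itself, which is exactly our predicament with any higher-dimensional object.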

The takeaway here is that this is not science fiction. Math and logic point to the existence of higher, imperceptible dimensions that we might very well need to understand if we are ever to explain the true nature of the universe. But of course, as we learned in class, human intuition is extremely limited, and it is always possible that we have been going about this all wrong!


Einstein's Discovery of General Relativity, 1905-1915

The Physics of Everything: Understanding Superstring Theory


It’s a bad time to be a science major!

Hello everyone,


My name is Matt O’Brien and I am a sophomore enrolled in the Smeal College of Business. I want to major in both accounting and finance while simultaneously pursuing a Master’s degree in accounting (MAcc) through Smeal’s integrated program, and I plan on adding minors in Spanish and International Business. I am taking this class for a couple of reasons. First, my former roommate, who happened to be this class’s highest scorer last year, insisted that no other course he took that year was more valuable or more worth his time. Second, I do not see the point, for non-science majors, of devoting an entire course to one narrow subset of science. The idea of a course that, in a single semester, can draw from all of the scientific fields relevant to our being good “consumers” of science is very appealing to me. We need not waste time on the nuts and bolts to gain an appreciation for and an understanding of what scientists do and why it is so important.


To be honest, I do enjoy science quite a bit. I like to keep current on the failures and successes of modern-day scientific endeavors. I was fascinated by Stephen Hawking’s book “A Brief History of Time,” and I often find myself putting a great deal of thought into some of the great unanswered scientific questions. That being said, I do not have a desire or a drive to be the person who makes a scientific breakthrough! The aforementioned “nuts and bolts” make me miserable and destroy my liking for science.





When choosing a major, it comes down to picking the one that best utilizes your passions and aptitudes, and that will almost certainly lead to a lucrative career with security and the opportunity to advance through hard work. For me, business school checks these boxes; in business, the sky is the limit. Unfortunately, the hard sciences produce graduates who on average are grossly underpaid and scarily insecure in their employment. There just is not a whole lot to do without a Ph.D. This is why I believe that science lovers would be better off in engineering and technology fields.


Here is an interesting article about why we should drop the “S” from STEM.