Recently, many of my posts have focused on technology and on how society continues to improve it year after year. Advances appear daily in nearly every aspect of life. One topic I find especially interesting is artificial intelligence.
A recent study by a U.K. research team gave 400 kids advanced technology in the classroom. The main piece of technology was a state-of-the-art desk that gave the teacher much more control over the pace of the class. The teacher could now see each student's work on a problem at any point and assign different problems to different kids.
The theory behind this is that because the teacher can better manage each student individually, the group stays closer together and can therefore learn at a more rapid pace instead of waiting for the slowest kid. The technology also helps the kids by letting them see what the other students are doing, work on problems together, and have the teacher assign different students to different parts of a problem. One knock against the study is that it was conducted with younger students, and the math involved was not very difficult or conceptual. The researchers admit that much more work is needed to finalize their findings, but so far the kids have shown significant improvement since using the technology.
This study is a great sign of how science can be applied to various parts of our lives. It is not easy to think of learning as a science, but using a control group of students without the technology allows the researchers to accurately measure the difference between them and the students who have it. The main downside of the experiment is that the equipment is very expensive, so the test group cannot be very large; while the control group is well randomized, the test group is small in comparison.
All throughout school, my teachers emphasized the need to reread, revise, and edit everything. Of course, spell-check or auto-correct on Microsoft Word and smartphones usually caught the obvious mistakes: using an “a” instead of an “i” in definitely, writing “teh” when we meant to say “the”, etc. However, more subtle errors often went unnoticed. Sadly for many of us, spell-check often fails to pick up on human errors like writing “to” instead of “too” or using the incorrect form of “their/there/they’re”. Being the perfectionist that I am, these errors never cease to irritate me, causing me to delete many a misspelled Tweet or go back and edit my SC 200 blogs until they are error-free. But in reading other blogs, I have found too many frustrating spelling and grammar errors that seriously interfere with my understanding of what the person is trying to communicate. Often these blogs remain only half read, as three or more errors in the first paragraph make finishing and commenting on the blog almost unthinkable. So what’s to blame: a lack of revision, an absent or low-quality spell-check, or just sheer laziness? Probably a combination of all those factors, but the so-called “Grammar Nazis” want answers. And so, of course, I went searching.
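To see why spell-check misses "to/too" and "their/there/they're" mix-ups, here is a minimal sketch (with a tiny invented word list, just for illustration) of how a dictionary-only checker works: it asks whether each word exists at all, not whether it fits the sentence, so a misused real word sails right through.

```python
# A toy dictionary-based spell checker. The word list below is a made-up
# sample, not any real spell-check dictionary.
DICTIONARY = {"the", "their", "there", "they're", "to", "too", "two",
              "we", "meant", "say", "going", "park"}

def naive_spell_check(sentence):
    """Return the words a dictionary-only checker would flag as misspelled."""
    words = sentence.lower().replace(",", "").split()
    return [w for w in words if w not in DICTIONARY]

# "teh" is not a word at all, so it gets caught:
print(naive_spell_check("we meant to say teh"))      # ['teh']

# "their" IS a word, so the misuse of "their" for "they're" goes unnoticed:
print(naive_spell_check("their going to the park"))  # []
```

Catching the second kind of mistake requires looking at context (the surrounding words), which is exactly why these "real-word" errors slip past a basic checker.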
My older sister works in Silicon Valley and constantly shares with me the newest things happening in technology. A few weeks back, she told me she was at a restaurant down the street from her office and saw a man wearing a weird headpiece with a small glass frame on the corner of one eye. A coworker of hers pointed and said, “Look, Google Glass!”
Google Glass is a new product that Google is experimenting with in its early stages. In fact, only a few people outside of Google’s own employees have actually been able to experience Google Glass.
What is Google Glass?
Google Glass is an entirely hands-free computer that you wear like a pair of sunglasses, except there are actually no lenses. Instead, there is just a small box in the corner of the frame on the right side. In a New York Times article (http://pogue.blogs.nytimes.com/2012/09/13/google-glass-and-the-future-of-technology/), the writer notes that when a user looks up at the piece of glass, the half-inch display actually appears comparable to a big laptop screen.
Although Google Glass is still so new, think about what this could mean for the future of technology. A YouTube video called Google Glasses Project (http://www.youtube.com/watch?v=JSnB06um5r4) provides some examples of how Google Glass could be incorporated into your day. Imagine a computer that you could just speak to and it would act, that was in view whenever you looked up, and that could be a companion throughout the day. Google Maps could turn into arrows that appear in front of your face. What will happen to our brains, the way that we think, our ability to multitask? Will textbooks become a digital medium that can appear in front of your eyes?
So how will the technology work?
Google explains that Glass will eventually have the ability to “get online.” Right now, Google Glass connects wirelessly to a cell phone, but in the future the device could have its own system entirely. This technology is pretty exciting overall, and I think it really could be the next big thing. I imagine these will replace cell phones and even computers some day. What is the need for a heavy laptop when you have a practically weightless, hands-free device that serves the same purpose?
There are some problems that could arise with this technology, however. For example, if the glasses doubled as a textbook, they would need to be banned from classrooms to avoid cheating. In an article in Technology Review (http://www.technologyreview.com/review/428212/you-will-want-google-goggles/), the writer tries on a pair of Glasses after interviewing Starner, a technical lead for the project, who had been wearing the glasses himself. The journalist then realized that during the interview, Starner had been reading notes on the screen about what he could and could not say. Right in front of his eyes, he was able to participate in the interview and read a screen without being detected. Google Glass could change the way we communicate, the way we learn, the way we think, and so much more. It could very well be the next big thing in technology, but how much is too much? I also think of the many hypotheses we have gone over in class that seemed like revolutionary ideas but went terribly wrong. Only time and more trial and error will tell, but how do you feel about the future of Google Glass?
Google has a lot of other cool stuff it’s working on. Check out these links below:
Google’s Self-driving Cars
There’s actually a Google Mars (Like Google Earth)
Most of us have probably cried when computers or iPods broke and could not be fixed. Okay, maybe not cried, but we have been pretty upset. We are, after all, constantly using them for most or all aspects of our lives. If you’re on a club sports team, you get emails about practices and meetings; a THON committee requires you to check your email or Facebook group often; and when it comes to classes, everything is digital.
So our computers have become like our children. We take care of them, spend a lot of time with them, buy fun toys for them to connect to. If we lost them, we might find ourselves lost until we replace them with newer, prettier ones.
For all the love we give our computers, have you ever thought they cared about us even a little bit? According to a recently published article, experts in computer software and technology are trying to engineer computers that can assess our emotions from our facial and body expressions.
For more than two decades, the Massachusetts Institute of Technology has been working on software that can tell when a user is growing bored with the information they are reading or learning about. The computer would be able to tell if the user is annoyed, tired, confused, or unhappy with search results, information, or a computer program.
The idea of technology that can read users’ emotions wouldn’t be limited to computers.
Doctors have developed a prototype to help people with Asperger’s: special glasses that warn the wearer when the people they are talking to are becoming bored with the conversation. People with Asperger’s often cannot tell when others are growing bored, uninterested, or irritated in a conversation. The glasses would flash a yellow light when they sense that the other person is uninterested or finished with the conversation.
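The article doesn't describe how the glasses decide when to flash, but the general idea can be sketched as a simple rule: combine a few observed cues into one "interest" score and trigger the light when it drops too low. Everything here, the cue names, weights, and threshold, is invented for illustration.

```python
# A purely illustrative sketch of an engagement rule. All cues are
# assumed to be scaled 0-1 by some upstream sensor; the weights and
# threshold are made up.

def interest_score(eye_contact, nod_rate, gaze_wander):
    """Combine facial/body cues into a single engagement estimate."""
    # Eye contact and nodding suggest engagement; a wandering gaze does not.
    return 0.5 * eye_contact + 0.3 * nod_rate - 0.4 * gaze_wander

def warning_light(eye_contact, nod_rate, gaze_wander, threshold=0.2):
    """Return 'yellow' when the listener seems to be losing interest."""
    if interest_score(eye_contact, nod_rate, gaze_wander) < threshold:
        return "yellow"
    return "off"

print(warning_light(0.9, 0.6, 0.1))  # engaged listener -> off
print(warning_light(0.1, 0.0, 0.9))  # distracted listener -> yellow
```

A real system would get those cue values from computer vision on a camera feed, which is the genuinely hard part; the final decision rule can stay this simple.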
GPS systems are also being considered. The idea is to detect when a driver using a GPS is getting frustrated because it is leading them into traffic jams or onto longer routes. The GPS would then be able to adjust its normally cheerful voice if it senses the driver is less than cheery.
The software has come a long way in the past couple of decades, and we may see computers in schools as early as next school year that can read a child’s reaction to lessons, tests, and quizzes online.
This is all very futuristic stuff if you ask me. Although there are many advantages to this new technology, it may prove to be too invasive for some. What do you think? Is technology reading our emotions and reacting as a human being would too much to swallow in 2012?
Where are you reading this blog post from? Your laptop? iPod? iPhone? iPad? A better question for the sake of getting my point across is “how many of these items are within arm’s reach at the moment?” We are constantly surrounded by not just one but multiple electronic devices, as the New York Times article Digital Devices Deprive Brain of Needed Downtime explains by detailing the experience of 40-year-old Diane Bates, as “she listens to a few songs on her iPod, then taps out a quick e-mail on her iPhone and turns her attention to the high-definition television” all while at the gym.
This behavior is typical of frequent gym-goers and popular among the general public, as we have all witnessed and experienced. But have we become too invested in our electronic screens? Phones, which have essentially become miniature computers, serve many functional purposes. At the same time, we have come to rely on them to fill every moment of boredom in our lives, to “relieve the tedium of exercising, the grocery store line, stoplights or lulls in the dinner conversation.”
The problem is that while we might not be “bored” anymore, we aren’t giving ourselves any mental alone time; we are “forfeiting downtime that could allow them to better learn and remember information, or come up with new ideas.” As an individual growing more and more dependent on her phone for entertainment (but also as someone who greatly values deep thinking), I was alarmed when I thought about the way I have been compromising my thought development for meaningless conversations, pointless games, or monotonous scrolling through social networking sites.
So where is the science in this? Well, I’m getting there. The article addresses research from the University of California, San Francisco as well as from the University of Michigan. In California, a study about experience and brain activity was conducted on rats. The experiment showed that “when rats have a new experience, like exploring an unfamiliar area, their brains show new patterns of activity. But only when the rats take a break from their exploration do they process those patterns in a way that seems to create a persistent memory of the experience.” With the assumption that our minds develop memories similarly, the study communicates the importance of taking a “mental break” from our lives so that our brain can process our experiences and develop them into long term memories. So could our phones really be preventing us from getting the most out of life experiences?
The University of Michigan study examined the effects of walking in nature versus walking in an urban environment. It found that the urban environment overwhelmed individuals with information to process, while walking in nature allowed them to think clearly. According to the article, people are actually fatiguing their minds by being constantly tuned in to their electronics.
I have to say that this article has honestly made me consider my dependence on my phone for a constant source of entertainment. I am definitely going to maintain my phone usage for functional purposes, but I am going to strive to avoid doing insignificant activities just to entertain myself when I could otherwise be thinking. A perfect example of this is walking around campus. There’s no reason not to put my phone away and focus on the landscape and architecture instead of what my friends are tweeting.
Something I have never understood is why people insist on getting the newest version of everything, even when there is practically no difference. Every company does it: they make last year’s model seem insignificant and the new model a must-have. Apple has mastered this technique with its new iPhone 5. The new iPhone is slightly thinner, slightly faster, and has a slightly bigger screen, but all of these changes are so slight that people can’t actually tell the difference. Check out this clip from Jimmy Kimmel Live, where he fools people into thinking an iPhone 4S is an iPhone 5; the people in the video are convinced that the iPhone 4S they are holding is actually the iPhone 5. This article explains why some people crave the newest technologies. People who constantly yearn for the newest and best technology are called neophiliacs. They are never content with what they have and are always searching for the newest piece of technology to give them excitement. Apple has control over the minds of all the neophiliacs, and more and more of us are becoming one.
I feel as though this all started back in the ’50s with the sense of “keeping up with the Joneses,” where we buy the newest and best thing for no practical reason other than the fact that it is new. We all fall victim to this: we just bought a new iPhone and are really excited, only to find out that next month there is an iPhone 5 or 6 or 7 that makes our technology seem obsolete. The new iPhone still doesn’t have many of the features that similar smartphones, like Androids, have (see Shortfalls of the iPhone). Apple has turned all of us into neophiliacs, and I wonder if that will ever change.
I feel as though this will never change, though, because if it does, Apple will go out of business. No one will buy the new iPhone if they realize it’s the same as the one they have. However, we are reaching an end: there is only so much Apple can do to improve the iPhone, so when it can no longer improve, it will simply go out of business. You be the judge: can you tell the difference? If so, is the difference worth at least $200?
In my previous blog post, I talked about the dangers of concussions in football and how they can have an extremely negative impact on players’ current and future lives. In this blog post, I will explain what is currently being done to try to minimize the chances and risks of concussions, and whether it is working.
Hey guys! My name is Harry Augustus, and I am a sophomore majoring in broadcast journalism. I am originally from Tibet. I love my country, and I also love the slow-tempo life in State College.
Overwhelmed by both the grandeur and the confusion that science has created for us in this ever-changing world, I find it practically impossible to detach myself from technology. There is also absolutely no doubt that science is the fountainhead of innovation, which is constantly being enhanced to meet the ever-growing demands of our survival.
When I was young, science was simply not my cup of tea. I viewed this enormous field as a swamp of unidentified patterns awaiting the whole world to unveil. Honestly speaking, even during my adolescence, I didn’t have a snippet of fervor for scientific exploration. That is not to say I was deprived of the aspiring curiosity a kid is innately bestowed with, or that I was a reclusive hermit. This insensitivity, broadly speaking, was rooted in an intrinsic aversion to acquiring scientific knowledge. To many of my peers, science was a captivating nebula hiding colossal mysteries waiting to be disentangled; to me, science, in its most essential form, was a paradigm of absurdity. In my dictionary, science is the complete set of human knowledge that exists in specific media. Some of it is mutable, the rest perdurable; some esoteric, the rest self-evident; some conducive, the rest deleterious. Despite the dichotomy it brings to this world, one monolithic truth stands: science, beneath its manifold camouflages, is a futile enterprise. Any morsel of scientific lore precedes its own revelation; that is to say, we are only the acknowledgers of science, not its facilitators. For instance, according to our recognition to this date, the theoretical freezing point of water sits at 32 degrees Fahrenheit. Although in actual circumstances this thermal threshold might swing away from the mark, scientists propose that in ideal conditions the liquid property of water will be stemmed and morphed into a solid state. If we were the facilitators of science, we could have turned things around.
Were water to freeze at a higher temperature, fewer glaciers would be in jeopardy of disappearing; on the other hand, were it to freeze at a lower temperature, wide expanses of permafrost would be relegated to tundra or taiga landforms, which might provide a more optimistic future for new human settlement. Nevertheless, human beings are servants to science, and will remain in that subordinate position evermore.
The world system itself, which we refer to as science, has been settled by the world itself. There is a telling metaphor for it. Science, consisting of unlimited items of fact, has been placed into a boundless matrix, in which each column and row is circumscribed by disparate vectors such as when, where, what, how, and why. A scientific fact is exposed, if not sufficiently evidenced, when scientists endeavor to locate the complete coordinates of an item in this matrix. In sum, science survives because multitudes of researchers tirelessly throw themselves into this intricate puzzle. It is the puzzle that manipulates us, rather than we who galvanize new wisdom within it, which perfectly explains why I withdrew from science in the past, withdraw at present, and will withdraw in the future. Science is amoral and has no life. Researchers persistently invest immense efforts in their oh-so-exquisite rules of the world by excavating facts that have remained extant since a time even humans cannot fathom. Scientists end up in disconsolate mopes when they fail, and in unrestrained elation when they succeed. These emotions aroused by their enterprise are, of course, not reproachable; but quite unfortunately, they are grave-diggers, not trail-blazers, and that says it all.
It may seem that I am a daredevil dissident against the value of science, yet I decided to take this course. Obviously, I have a great desire to fulfill my department requirement. On the other hand, and this is the point I want to underscore, I do have an incentive to become more acquainted with science. The reason is by no means to develop a long-lost interest, or what you might call appreciation; I merely don’t like the embarrassment of being despised as a “yeoman” every time someone talks about high tech. Once I set this goal, I fully dedicated myself to it. That is my learning style: show some respect to the material I am studying.
The picture below was taken in the Tibetan area. Welcome to Tibet!