
Posts Tagged ‘english’

  1. Emotional Contrast: Being Sad Makes You Happy

    April 21, 2014 by Daniel Friedland

    I really like being happy. In fact, it is my life goal to be truly happy. However, I tend to overlook the importance of sadness in achieving happiness. Most, if not all, people do not like being sad, but if we were never sad, we could never be happy either. Our emotions lie on a spectrum ranging from deep sorrow and grief to joy and bliss, but imagine if all negative emotions were removed. All that would be left is happiness, and that would be horrible. Now, I will attempt to explain why.

    I’ll start with a brief synopsis and interpretation of the final portion of Brave New World by Aldous Huxley. John lives in a world in which all people are genetically engineered and strictly confined to a predetermined path in society. Emotions are controlled by a substance referred to as “soma,” which induces a state of hallucinogenic bliss and delusion. It is truly ubiquitous, as all people use it and are exposed to it as a substitute for negative emotions. John comes from a “savage reserve,” which contains people who were born naturally and live in a technologically more basic society than the outside world. He leaves the reserve and takes the world by storm. After observing how the “brave new world” around him operates, he is appalled at the apparent lack of humanity. Although he lusts after a woman, Lenina Crowne, he refuses to be with her so as not to contribute to the superficial nature of the world he observes. After attempting and essentially failing to find any scrap of genuine, natural human emotion, John hangs himself, symbolizing the death of true humanity in a superficial world. The ending quote from the novel remains one of the most powerful conclusions I have read:

    “Slowly, very slowly, like two unhurried compass needles, the feet turned towards the right; north, north-east, east, south-east, south, south-south-west; then paused, and, after a few seconds, turned as unhurriedly back towards the left. South-south-west, south, south-east, east. …”


    I interpret John’s death as a reminder that emotional contrast is essential to living a meaningful life. Most of the characters in Brave New World cope with negative emotions by merely using “soma” to forget their problems. Thus, everyone lives in a perpetual state of artificial happiness. John is unable to cope with artificial happiness because he realizes that it is meaningless. If we are never sad, how can we be truly happy? People who experience only good things have no negative experiences with which to compare them. Therefore, their “happiness” becomes the normal, median emotion, which is essentially synonymous with apathy. The more sorrow we feel, the greater capacity we have for feeling joy.

    Here lies the brutal reality of life: We must go through heartbreak, grief, and disappointment if we want to feel positive emotions to the same degree. It is up to us to find a healthy medium between our emotions. We certainly do not want to be sad to the point of depression or suicide, but we really shouldn’t discount our tough experiences in life as pointless. The point of negativity is to make way for positivity. Tell me what you think!



  2. Teachers: The Backbone of Public Education

    April 21, 2014 by Daniel Friedland

    In discussing education reform, I have covered a wide variety of potential problems that need to be addressed, including socioeconomic inequality, competition and rankings, faulty school curricula, and more. Although I briefly discussed teachers in a previous post, I believe they deserve a fully fleshed-out consideration.

    No one can deny that teachers are the resource most directly involved in educating students. However, the public education system treats them as just “resources” when they are so much more than that. Teachers are not only responsible for providing information for students to ponder and digest; they also hold an incredible amount of autonomy and directly influence a child’s eventual passions and intellectual curiosities.

    I think we can all remember the magnitude of a teacher’s influence on us, whether for better or for worse. For example, I had never taken a philosophy course before college, when my mother forced one upon me. I didn’t know what to expect, but I gained so much from that class, not just in terms of intellectual development, but also in terms of personal growth. My professor taught philosophy in a very open way in that she didn’t take her own interpretation as “law.” She was always eager to consider what the students thought about a given passage or idea, and this made for a very discussion-based environment. That class eventually led me to choose philosophy as my second major.

    On the other hand, students sometimes dismiss a subject as boring or uninspiring based solely on a teacher. Here lies a terrible problem. Perhaps a student had the mental capability to become a great mathematician, but somewhere along the way, the student suffered through a class with a horrible teacher. This has probably happened to all of us to some extent, but we don’t really like to think about it. For example, I was once extremely interested in pursuing math as a career, but after one bad teacher along the way, I decided against it. Granted, I now realize that advanced mathematics is really difficult for me and ultimately beyond the scope of my intellectual curiosity; however, that one teacher influenced me so strongly without my even realizing it.

    The point is that teachers have a huge influence on their students. Thus, the true value of teaching is reflected neither by the value we as a society place on it, nor by the financial value that comes in the form of a teacher’s paycheck.

    I don’t think I need to convince you of the importance of teachers in the public education system, but the stigma that goes along with the career, as well as teachers’ relatively low incomes, both require some further discussion.

    I have posted something like this before, and I’ll post it again: “Those who can’t do, teach.” This sentiment essentially describes how many people feel about the profession as a whole, and before casting this off as mere bigotry, let’s consider the trends. There is much controversy over the true statistics regarding the class standing of future teachers in college, and while the idea that teachers frequently make up the “bottom third” of their college graduating classes is not necessarily proven, we can say as a generalization that education majors as a whole do not perform as well in college as do students in many other majors. Why is this? In general, the requirements for an education degree are perceived as “easier” than the requirements for majors such as engineering or mathematics. Whether or not this is true, engineering has the highest dropout rate of any major. Interpret that how you will.

    So far, I have been careful not to step on any toes here, but it’s time to get real. How can we attract higher-achieving students to the field of education? Some assert that a higher income for teachers is the answer, and that makes logical sense. However, the prevalent idea in our society that effort yields reward cannot be entirely dismissed. While teachers serve an important role in society, the education needed to get a teaching certificate in most states is a bachelor’s degree. The income range for teachers is generally $35,000–$80,000, and the median starting salary for graduates with a bachelor’s degree is $45,000. Based on these numbers, a teacher’s salary is about the same as the salary of the average person with a bachelor’s degree. Therefore, we could logically say that teachers are being paid accordingly based upon their level of education (“effort”).

    But do we think that the important role of teachers in society entitles them to more money? Also, since a greater income would logically draw more potentially “high-achieving” students to majoring in education, is it justifiable to increase the income of teachers throughout the nation? I would argue that the answer is yes to both questions, but it is important to consider both sides of the argument. What do you think?

    That’s all for now! Stay fly, and goodbye!

    -Dan



  3. Legal Ethics

    April 11, 2014 by Daniel Friedland

    I recently came across an interesting hypothetical scenario that allows us to fully explore the potential limits of morality. It goes like this:

    Three men and an orphaned child are stranded on a boat in the middle of the ocean with a declining food supply and no means of catching fish or obtaining any other food. Rescue and aid will not reach them for two months, and they are all aware of this fact. The three men each have a family of their own on the mainland, yet the child has no family or other quasi-familial social ties. When the four are down to only a few scraps of food, one man pulls the other men aside and asks the question no one wants to consider: “What should we do when the food runs out?” The other men grimace as they are forced to consider their options: 1) The men could kill and eat the child so as to promote their own survival, OR 2) The men could choose not to kill and eat the child, essentially sealing their imminent deaths.

    What should they do? There are numerous factors to consider. First, the men, like civilized society, believe that murder is wrong. Therefore, they feel to some extent that killing the child would go against their moral codes. However, without killing the child, all four will die of starvation regardless. Logic, to some people, dictates that one death at the cost of three survivals is more practical than four deaths.

    Other factors lie in the nature of the child as he compares to the men. Because the child has no family or close social relationships, he will not necessarily be sorely missed in the context of the world. The men, on the other hand, have families who rely upon them to provide shelter, food, and general care. Their deaths could be considered more detrimental because of the dependents for whom they provide. (DISCLAIMER: I do not mean to perpetuate the male chauvinism in our society; I am merely giving a hypothetical scenario in which the men happen to support their families financially.) Another factor to consider is that the child is so young and naïve, yet to experience the wonders of the world. Should he be given the chance to live a full, meaningful life, which the men arguably had the chance to do already?

    Synthesizing all of the moral arguments for both options, the basic question comes down to a choice between personal morality and survival. The idea of “survival of the fittest” comes into play as well. Regardless of the sophistication of present-day society, humans are still animals and are strongly influenced by the will to survive, no matter the cost.

    Bringing this scenario around to apply to legal ethics, I will conclude the scenario:

    The men choose to kill and eat the child, thus allowing them to survive until they are rescued after the two months have passed. Upon rescue, the remains of the child are found on the boat, and the men are charged with murder.

    How should courts rule in this case?

    Often, there is a clear connection between legality and morality, but what happens when the line is blurred?

    Must courts adhere to the law that murder is inherently illegal or should they recognize or place value upon the prospect of survival that was involved?

    Should the men be found guilty or innocent?

    Legal ethics tries to answer tough questions like these. Tell me what you think! How would you answer these questions? Remember, there are no wrong answers in philosophy!

    That’s all for now! Stay fly, and goodbye!

    -Dan


  4. Hero Worship

    April 7, 2014 by Daniel Friedland

    While researching for my persuasive essay this past weekend, I came across this term several times – “hero worship.” In my paper, I am attempting to persuade education reformers and teachers alike to incorporate philosophy into the public school (as in pre-higher-education) curriculum. While I chose not to address hero worship directly in the paper, I would like to address it here because it really is a fascinating, yet controversial, idea.

    Simply put, “hero worship” is the idea that studying certain people (philosophers, in this case) causes one to idolize them and their thoughts and thus become unable to develop those ideas further. For example, believers in hero worship would say that if I study Plato in depth, I am likely to take what he writes as wise and true, never attempt to disagree with him, and therefore stunt my own philosophical prowess by worshipping the works of Plato. In general, I disagree, though there are some valid concerns here as well.

    As for the valid ideas within hero worship, it makes logical sense that if a person becomes too obsessed with one particular idea or way of thinking, it tends to constrict his or her perspective. This can be seen in politics. Generally (as a stereotype), Democrats watch MSNBC, while Republicans watch FOX. The obvious bias towards either end of the political spectrum caters to the general audience of the network, which is arguably detrimental to all audience members involved. A liberal who is continuously exposed to his or her own ideas over and over again will become more solidified in those beliefs and be less likely to consider any conservative ideas. The converse is true for conservatives.

    The danger certainly exists that a person who studies only one philosopher or way of thinking will be less likely to consider other perspectives; however, I would argue that this is extremely rare. Most who study philosophy study both a breadth and depth of philosophers so as to broaden their perspective and not get too caught up in a certain idea. While the concern of hero worship makes logical sense, the phenomenon addresses a very rare danger and is therefore limited in its applicability and, by extension, its validity.

    In my opinion, the idea of hero worship is deeply flawed and could have negative effects on society if accepted at face value. Studying philosophers or any specific academic concentration is extremely important. We generally learn the broad history of the field so that we can build upon past and present ideas to promote progress beyond what has been done. In order for progress to occur, we must first recognize previous ways of thinking so that we understand why a field of study exists as it currently does. Without that basis, we would merely rehash old ideas and regress into the past. Is this really what we want? Probably not.

    It seems to me that this idea of hero worship, though valid in a limited scope, is overall a hollow argument that would promote laziness and regression in today’s society.

    Tell me what you think!

    That’s all for now. Stay fly, and goodbye!

    -Dan


  5. Philosophy in Public Schools

    April 7, 2014 by Daniel Friedland

    I came across an interesting article the other day that revealed a misconception I had about public education in the United States. The Huffington Post article cites a study done by the University of Southern California (USC) which found that the United States is the “clear leader in spending-per-student.” At first, I was baffled by the fact that the U.S. still has a weaker public education system than many other prominent nations despite spending so much money per student. However, it makes sense if you consider that the quality of the education we receive in the U.S. does not meet the standards of many other nations throughout the world.

    We have been discussing ways of improving the quality of education in the U.S. throughout this semester, and one of the main reform efforts we have discussed is the de-standardization of public education. I would like to take this a step further and assert that we must not only free ourselves from stringent standardized guidelines, but we must also implement ways of broadening the minds of students so as to truly break away from the regurgitation mentality. Therefore, as many education reformers have suggested, I support the implementation of philosophy in public schools, namely middle or high schools.

    The prospect of injecting philosophy into the curricula of public schools below the level of higher education is extremely controversial, and I certainly realize this. Critics definitely have some valuable concerns, yet some are factually inaccurate. For one, opponents state that philosophy is too academic and esoteric in nature to be valued by the everyday person. This is far from the truth, and the argument against this assertion follows that of the value of history (“How can we learn from our mistakes without studying the mistakes of the past?”). Philosophy is simply the “love of wisdom,” and similarly, how can we understand where we are today and where we are going in the future without first understanding the thinking that brought about society’s institutions and conventional ways of thinking?

    Another argument lies in the assertion that philosophy in public schools could cause children to question their religion. Maybe this is just me, but I find this argument to be so fundamentally flawed that it doesn’t really serve as a valid argument at all. Religion has been tested for thousands of years, yet it continues to endure despite individual questioning. More importantly, though, religious indoctrination without the ability to ask questions is not okay, in my opinion. Granted, I understand that parents have raised their children a certain way, and they certainly have a right to do so; however, philosophy and religion are not mutually exclusive. In fact, there are numerous philosophers who practice a diverse array of religions. There exists a conception that those who study philosophy lose their faith and become atheists. Philosophy helps people to explore all possible realms of thought with an open mind so as to promote a more universal perspective, but that does not mean that they do not retain their own faith. Oftentimes, the study of philosophy strengthens individuals’ faith in their religion because they are able to understand other ways of thought and morality and choose what they agree with. This is a lot more meaningful than religious indoctrination, in my opinion.

    Lastly, there always remains the idea that philosophy may be valuable to study, yet it is not as essential as the rest of the academic subjects. Therefore, it is left out of high school curricula because there is simply not “enough room” for it. However, I would argue that philosophy is potentially even more valuable than some subjects in school because it introduces an entirely new concept into public education – free thought. Currently, almost all academic subjects before higher education are knowledge-based. Although some would argue that English or literature classes are more interpretive than empirical, from my own experience, this is only partially true. In my AP Literature class, our grades were often based upon regurgitating given themes of short stories or detailing events from a novel, which borders on rote memorization. In contrast, philosophy would allow students to think for themselves and critically consider various perspectives on life, death, and other more tangible applications to society. To me, it is more important for students to develop critical thinking skills, logic, and a broadened perspective so as to promote both themselves as individuals and the improvement of society.

    Here is the article I discussed at the start of this post. Give it a read!

    “It Lies Within: The Key to Education Reform”

    That’s all for now! Stay fly and goodbye!

    -Dan


  6. Spirituality: Theism vs. Atheism

    March 28, 2014 by Daniel Friedland

    Every philosophical argument has to operate on some subjective assertion, so here is mine: people have an insatiable desire to know. By “knowing,” I do not mean the mere accumulation of arbitrary knowledge. I mean knowing what fulfillment entails, what meaning is, and who is really in control of your life. These are questions with which, on some level, everyone grapples. The frustrating observation I make with regard to these types of “life” questions is that they are unanswerable. Sure, we can give subjective answers to them by assigning values to various aspects of our lives, but how do we really know that our values hold weight or “are correct”? In my opinion, we really can’t know. We can only hope and have faith that our values in life reflect the type of universal or personal meaning for which we strive.

    Being spiritual to me means being in touch with oneself intellectually, physically, and emotionally. In other words, spirituality entails that we get to know who we are on the deepest of levels. This often includes exploring questions without answers, such as those regarding universal meaning. The way I see it, people generally attempt to be spiritual upon the basis of theism or atheism. I would like to discuss both forms of spirituality separately, eventually linking them together to show the underlying root of their discrepancies.

    I will start by defining theism as the belief in a God or Gods as a supreme power believed to have created the world and humanity, as well as a being that directly or indirectly intervenes in our lives. Most of the world’s religions are based on either monotheism (belief in one God) or polytheism (belief in more than one God). It is often asserted by scholars of religion that religion is a man-made construct created to explain the unknown. Religions provide potential explanations of meaning, purpose, and creation that serve to guide people’s spirituality. The concreteness of God is comforting or even essential for many people because it provides a basis for all of life. For some, it provides the purpose of serving God(s), and for most, it provides a basis for morality. In my last post, I discussed my own belief that our commitments are to OURSELVES, but I also recognize that generating concrete or abstract constructs to commit to makes people more productive and more devoted. In discussing religion, we commit on some level to the idea that something greater than us exists. This can be neither proven nor disproven, at least for now.

    Atheism takes the opposite approach in exploring spirituality. It is defined as the rejection of the existence of any deity. Many atheists assert that the individual is the root of reality, which I agree with to an extent. However, the question still remains, “How did we get here?” Science tells us that nothing can be spontaneously created out of nothing, so who or what caused our existence? I am not sure whether atheists do not care about this question or whether they have decided it is not relevant. The truth probably lies closer to the latter.

    The main difference between theism and atheism is this: Theists prefer to believe that the existence of a deity cannot be disproven, while atheists prefer to believe that the existence of a deity cannot be proven. The question comes down to whether you believe that beliefs or assertions can or should be proven versus disproven. Currently, I believe that nothing can be truly proven; however, theories can be disproven on the basis of logic. Therefore, I tend to favor the side of the theist more so than that of the atheist, because I think it is more practical to live based on unproven speculation than to live with no basis for anything (if nothing can be proven). I seem to have strayed a bit from the original topic, but what are your thoughts on the validity or value of atheism vs. theism? Do you believe that assertions can be proven, disproven, both, or neither? Tell me what you think!

    That’s all for now! Stay fly and goodbye!

    -Dan



  7. Commitments: Do we truly commit to ourselves or to others?

    March 20, 2014 by Daniel Friedland

    I would like to consider the nature of a commitment and the role of society and relationships in our commitments. I will start by defining a commitment (in this instance) as something we feel obligated to do. This definition is potentially flawed and certainly subjective, but it works as a general starting point for the nature of commitments I would like to discuss.

    The question posed here was inspired by a statement made by a friend about his commitment to studying viola. We were discussing balancing school work, free time, and other commitments. He said something along these lines:

    “I love viola, and the way I see it, if I can’t keep that commitment to myself, how and why should I be keeping a commitment to anyone else?”

    This struck me for a lot of reasons.

    First, he frames things like school work and the so-called “obligatory” aspects of college life as commitments not to himself, but to something or someone else. But what is that “something else”? Is it a commitment to his career? To his parents? To his professors? To the university? The answer seems truly elusive and abstract.

    My first thought is to denounce this type of assertion as a means of placing responsibility on something or someone else. We do our school work so that we can learn FOR OURSELVES or to get good grades FOR OURSELVES. Any deviation from this fundamental belief (of mine) is delusional, in my opinion.

    But I also realize that it makes sense to try to base some of our commitments on other people or other concepts such as our future success. It allows us to separate from the idea that all of our actions ultimately are a result of our own desires. (I could also extend this belief to my view on religion, but I’ll save that for another post.) While I think we all realize on some level that we are ultimately committing to ourselves in everything we do, we use abstract constructs of our desires (i.e. future success, pleasing another person, pleasing “the system”) to our advantage so that we do not feel solely responsible for the commitments we make to ourselves.


    I don’t want to be a hypocrite here, though. As much as I acknowledge the nature of my commitments, I am no different from anyone else. I still pretend to commit to other people or organizations subconsciously because it makes it easier to cope with the fact that my decisions have potentially graver consequences for me than I would care to realize.

    Upon further thought, I realize that separating ourselves from our commitments has another really important consequence – accountability. I will use an example here to explain. I want to do well in school so that my GPA is good enough to get into medical school. I want to do well in medical school so that I can hopefully have a fulfilling career in medicine and help people achieve their highest potential physical state so that they can pursue their own dreams. Because I base this commitment upon a future goal of mine (an extension of myself), I am more likely to be motivated to achieve it. If I were to take this sequence even further, I would realize that I really just want to do well in school because it will make me happy. My desire to be happy and feel fulfilled makes me more likely to work for the sake of my future. But the problem is that the future does not really exist until it happens. I make a commitment to something other than myself – in this case, my future – so as to instill more accountability. All I am really doing is trying to make myself happy, but what kind of motivator is that? The concept of happiness is so abstract and vague that it won’t motivate me nearly as much as some “thing” I can strive towards.

    I guess all I am trying to say here is that it is OKAY to base your commitments on things other than yourself so long as you realize deep down that anything you really commit to is a commitment to yourself.

    Realize what you do. Be selfish. Work for YOU. And most importantly (not really), DON’T BE A PART OF THE SYSTEM.

    The Lonely Island – Threw It On The Ground

    That’s all for now! Stay fly and goodbye!

    -Dan



  8. Freedom of Choice: When should it begin?

    March 7, 2014 by Daniel Friedland

    This week, I began thinking a lot about whether we should be guided or even compelled into making choices by anyone other than ourselves. This idea stemmed from a musing regarding a phenomenon in the music world that I have encountered time and time again.

    As some background, I play the violin and have been doing so actively for about 11 years, and while I work fairly hard, I consider myself decent at best. I try to practice 2-3 hours per day, but I continually struggle with the reality that I will never “make it” as a soloist or concert violinist. I am sure some of you have seen or encountered so-called child prodigies who cultivate their musical abilities at an almost unbelievably early age. These kids are the ones who come to dominate the mainstream classical music scene during adulthood, and, admittedly, I am left with bitter feelings of resentment and inadequacy a lot of the time.

    Each time I am preparing a new piece of music, I go to YouTube to listen to recordings to help guide me in the right direction. Undoubtedly, no matter how “difficult” the piece is considered, there always seems to be a 6-8 year old child among the results who plays the piece with a level of technique and musicality I don’t even come close to reaching. And while I recognize that it is not especially effective to compare myself to others, I can’t help but feel a tinge of frustration about my comparatively low level of playing. Sometimes, I wish I was one of them.

    Take a look at this video, and you’ll see what I mean…

    Wieniawski’s Variations on an Original Theme – Soo-Been Lee (11 years old)


    However, I started to think about these young children and how they got their start. These prodigies must start their instrumental studies at around age three, but at this point, they probably do not show a specific interest in learning an instrument. Rather, it tends to be the parents who decide when a child begins their studies. The children are guided in practice by teachers and parents until they have the self-discipline to do it on their own.

    My question is this: Is it ethical to “force” a child to devote a large portion of their life to something (i.e. playing an instrument) they may not want to do?

    I have heard a variety of perspectives on this, but I can really see both sides of the argument, and I hesitate to name one answer as fully correct. I often hear stories of children being forced to play an instrument at an early age, and they sometimes grow to resent both their parents and the instrument itself. On the other hand, some children grow to love their instrument and develop their skill into a successful career and lifetime passion.

    But how can you know who will go in which direction? Is it more ethical to wait until a child requests to start an instrument (like I did at age 8), yet have them face the setbacks that come with starting later than the prodigies? Or is it more ethical to start a child on an instrument early so that they have the potential to really make something of themselves as a musician later in life? Now, bear in mind, I am not saying that there are not successful violinists (I use violin as an example because I am most familiar with it) who start late, but look at when some of “the greats” began their studies:

    Sarah Chang – Age 4

    Hilary Hahn – Age 3

    Itzhak Perlman – Age 3

    Jascha Heifetz – Age 2

    I think it is an interesting moral dilemma for parents to make such a choice for their children. We all like to think that we should be free to make our own choices, but maybe it is necessary for others to make choices for us when we are unable to make them for ourselves. Or maybe it is wrong to coerce people into certain choices because you are “taking advantage” of their innocence or naïveté. What is the answer? I don’t know.

    Let me know what you think!

    That’s all for now. Stay fly, and goodbye!

    -Dan


  9. Individualism: How do we achieve it?

    February 21, 2014 by Daniel Friedland

    Throughout your life, you have probably been told to “Just be yourself!” Individualism is something which our society praises (for the most part), and it is something that a lot of people work towards. In the context most people think of it, individualism relates to finding yourself and expressing who you are, as an individual separate from all others, in some distinctive way. And while we strive for this sense of individualism, we seldom think about how to achieve it or how to qualify it at all.

    When it comes to finding one’s individualism, two schools of thought come to mind. One, supported by both Plato and Emerson, states that individualism is a solitary concept in which the individual must be removed from society in order to find oneself. Emerson, specifically in “Self-Reliance”, advocates for solitary individualism because he qualifies individualism as a term synonymous with self-certainty. He believes that society hinders self-certainty because it conforms a person’s beliefs by its very nature. In other words, he is asking, “How can someone be sure of themselves when other people oversimplify and impose customs and values on each blossoming individual?” This is certainly a valid question, but it pertains specifically to the conception of individualism as self-assuredness. Other conceptions lead to other questions.

    The second school of thought asserts that individualism must base itself upon societal entities. John Dewey argues this point in “The Lost Individual,” claiming that things like religion, politics, and other institutions are essential in terms of individualism because they provide the basis for all the values and other ideas we could have. Dewey appears to qualify individualism as association. Based upon this assertion, it makes sense that individuals should not be solitary. If individuals are not exposed to the ideas that societal institutions provide, then they have no basis for any ideas. In an argument with Emerson, Dewey might assert that self-assuredness risks narrow-mindedness. Without exposure to diverse ideas, individuals rely only on what they perceive personally, severely limiting their potential understanding of the world from other perspectives.

    Although these two means of attaining true individualism seem to counter each other, both means make sense following each person’s definition of what individualism is:

    Emerson’s Individualism = Self-assuredness

    Dewey’s Individualism = Association

    Which is the correct definition? That is hard to say because I believe the answer depends on the person and their circumstances. As a college student who is immersed in society in a variety of ways, I find it difficult to fathom solitary individualism because I cannot imagine who I would be without the influence of others. For that reason, Dewey’s idea of association resonates more with me. I believe that it is important to be exposed to ideas because in order to be the fullest and most genuine individual possible, one must realize other perceptions. The saddest and most frustrating thing for me is that I only have one narrow perspective on the world, which limits my understanding of it. But if I can get a glimpse into other perspectives, I feel that I can grow as an individual and assimilate those ideas with my own to the degree I see fit.

    Remember, this is all my opinion, which is ever-changing, so feel free to disagree. Which quality of individualism resonates with you?


  10. Public Education: Personalization vs. Standardization

    February 7, 2014 by Daniel Friedland

    The American education system is something to which I can easily relate, as I take part in it by being a Penn State student. Unfortunately, many people do not question the current quality of public education because it is taken as an immovable precedent which essentially serves its purpose. Currently, articles, debates, and other forms of media and communication explore a variety of subtopics and controversies within the broad scope of the American public educational system. Some of these issues include the debt accumulation of college students, the role of the teacher in education, and the relative autonomy shared between the variety of administrators and instructors in the educational setting. However, we first must ask, “What is the purpose of learning?” I do not mean public education, nor do I mean the institution of education in any form. I am referring to the purpose of learning as an entity completely separate from education.

    To me, the purpose of learning is to explore one’s consciousness through the acquisition of experiences. Learning certainly does not refer to the accumulation of the knowledge it takes to operate in American society, although the vast majority of people (on both sides of the aisle) seem to accept this very conception. One of the most basic arguments about public education involves the value of learning: specifically, whether learning is meant to be a personal exploration of the mind and the world or a standardized method to produce proper, hard-working citizens of the American economy. Regardless of my preconceived notions of what learning is, the institution of public education closely follows the latter conception.

    As a public institution, education must work towards the improvement of society rather than the improvement of an individual, which is perfectly logical if we accept the fact that humans are social creatures who depend on societal entities and other associations to influence and shape who they are. Moving past all of the philosophical hullabaloo, the fact remains that public education is here to stay, and it is the basis for the rearing of young and adolescent minds in the United States.

    Even so, the question of personalization vs. standardization continues to play into today’s form of public education in this country. If the goal of public education is to produce responsible citizens of a nation who are devoted and willing to work towards its goals and defend its founding principles, is it better for students to all be taught the same thing? Or would it be better for each student to be taught differently according to his or her determined needs and (possibly) interest?

    Both standardization and personalization have obvious pros and cons. Standardized education allows every child to have a completely equal opportunity for “success” in society because what each student learns is essentially the same, promoting the sense of equality on which this nation was based. Also, students who share the same educational background are more likely to relate to one another and unify due to their shared experience, and unity is an essential aspect of the “ideal” American nation. Unfortunately, standardized education does not guarantee that each child will adequately acquire and apply the education he or she needs to thrive in American society. Some students learn more quickly, and more effectively, than others, and standardization within the context of public education merely ignores this blatant issue.

    On the other hand, personalization does allow different students to learn in different manners and at different paces. This is certainly effective in producing productive citizens who are able to fulfill the variety of roles within a society. However, personalization does not necessarily generate citizens with similar mindsets. The more diversified an education system becomes, the less likely it is to produce equally prepared students in terms of their relative readiness for entering and thriving in society. Also, personalized education is impractical in that it is expensive and requires diverse teachers and learning equipment.

    Upon considering the pros and cons of each education method, it becomes apparent that the answer lies somewhere in the middle. Legislators are currently debating WHERE in the middle public education belongs. Before exploring the solutions to the cons of both methods and prior to synthesizing them into one uniform methodology, it is important to recognize what the issues are so that they can be methodically resolved.

    I look forward to exploring the complexities of public education this semester and possibly deducing some sort of answer for myself regarding the vast issues encompassed by the topic. Thanks for reading!

    Stay fly and goodbye!

    -Dan

