Do We Need Letter Grades?

A, B, C, D and F. What do they all have in common? Of course, they are all part of the alphabet, but they also make up the letter grading system in schools and universities across the country. 

The history of grading scales goes back to as early as the 1600s, but it wasn’t until 1897 that Mount Holyoke College became the first to tie letter grades to a numerical/percentage scale. K-12 schools later began to implement letter grades as well; with mandatory attendance laws driving up enrollment, the detailed descriptive report for each student lost ground to the far more time-efficient letter notation. Combined with the desire to standardize grades across the country, letter grading has been embraced in the US for decades. 

Now, the longtime tradition is facing controversy over how well those letters actually reflect student learning and success. So, do we need letter grades?

There’s a reason letter grades have been around for so long. The most obvious is that they are universal, so their meaning is easily understood no matter the circumstances or geographic location. Virtually every parent, teacher or child knows the difference between an “A” and an “F” simply by glancing at the letter. Because the scale varies only slightly from school to school, letter grades let students seamlessly track their academic progress even if where they learn changes. The simplicity of the scale, as opposed to its historic descriptive counterpart, benefits educators as well; teachers can see how a student is doing relative to their classmates and provide extra help as needed just by looking at a letter.

But, others believe it is time for a change. Opponents argue that the simplicity of the letters can also be a drawback because the actual grading is extremely subjective; in other words, it is not standardized. This may not be true for subjects like math and science, where answers are either right or wrong, but subjects like English are more interpretive, and the grade received depends on how the teacher judges the answer. Consequently, one teacher may give an essay a B while another gives it a C. In the end, the letter grade does little to tell the student how well they actually wrote the paper.

This has negative effects as well, since many students and parents treat the grades received as a measure of intelligence. 

With GPAs, college acceptances and job offers at stake, many students under current grading systems value getting the A more than actually learning. As such, many instructors are looking to base their grading more on student effort or to give narrative feedback in order to motivate students to grow academically. It’s too soon to tell, but these new ideas could usher in a new future of grading. 

Do We Need Standardized Tests?

To all of my in-state readers: how many of you remember the two weeks each spring when we got to take the Pennsylvania System of School Assessment, more commonly known as the PSSAs? Even if you are from out of state or international, I am sure you took some version of a test that assessed core curriculum skills in school and when applying to colleges.

In the United States, standardized tests have been used since as early as the mid-1800s to assess students’ proficiency and readiness and to inform administrative and policy decisions. In 1899, the College Entrance Examination Board (now known as the College Board) was formed to provide a quantitative measure of whether or not a student was ready for college; the test now known as the SAT was first administered in 1926.

However, it wasn’t until the No Child Left Behind Act of 2001 that standardized testing became the norm at every grade level. Although the Every Student Succeeds Act has dialed back that emphasis over the past decade, standardized testing is still a very common practice in schools across the country. But, do we really need these tests? 

Some say yes, arguing that standardized testing provides a universal standard that makes it easier to evaluate specific geographic areas, schools and even teachers.

Most of these tests are multiple choice, which means they can be administered fairly in schools with different curricula and at a lower cost to the state. The results offer quantitative evidence of students’ academic progress, which can provide valuable insight into whether a curriculum is effective and hold school administrators, policymakers and teachers accountable for inadequate performance. More importantly, if standardized tests are used constructively, they can lead to “higher standards” in curriculum and give all students valuable problem-solving, time-management and critical thinking skills for college and career.

However, there is another side to the story. First of all, standardized tests assume that all students learn and behave the same way. By focusing on a very narrow curriculum, the tests fail to recognize that students who test poorly may excel in other areas, such as the arts, or may have learning disabilities and lack the proper accommodations to demonstrate their skills. Additionally, despite being billed as a “fair” examination, many standardized tests can be considered discriminatory. One study on college admissions found that the SAT and ACT often discriminate against low-income and minority students, who may not have access to tutors and well-funded schools.

As such, according to Mark Kantrowitz, the study’s author, these not-so-fair standardized tests put a barrier in front of academically talented students. 

Standardized tests are likely here to stay, but there have been many changes to curricula and college admissions in recent years. With more colleges staying test-optional after the pandemic and school administrators finding new ways to assess students’ abilities, we must continue to ask ourselves: do we need standardized tests?

Do We Need Technology In Education?

From communal laptops in carts to individual iPads and MacBooks, I have always had some sort of technology to help me learn in class, from kindergarten through high school. Even now, we depend on our laptops more than anything when it comes to learning at universities across the world. 

The 21st century has seen a shift in the way we use technology to make our lives easier and build for the future. Schools are no exception; to keep pace with an increasingly technology-dependent society, educators have begun to integrate technology into their classrooms, changing the way they teach while giving students more access to learning. This can include smartboards for teaching the curriculum, one-to-one devices for students to do classwork and homework, and online platforms like Canvas and Google Workspace. 

With all of these benefits and more, why is there such a huge debate about the use of technology in education?

Many teachers and professors have specific teaching styles that can be helpful to some students but challenging for others. Technology can help combat this by integrating visual, auditory and kinesthetic activities, like virtual labs and videos, to reinforce what was learned in class. It can even give teachers the ability to make lessons more fun and engaging through platforms such as Kahoot.

In the United States, students who spent more than an hour per week on a device in the classroom “achieved the highest outcomes for reading and science,” according to McKinsey & Company.

Technology is also helpful to educators. Not only has it given teachers a plethora of resources for building lessons, but it has also streamlined grading and tracking student progress through learning management platforms such as Canvas.

However, the integration of technology in classrooms also has its drawbacks. Students are already constantly in front of screens for gaming, social media, video streaming and more. Opponents argue that letting screens become a dominant form of education would be harmful to students because of the physical and mental effects screen time can cause, such as obesity or depression. Others argue that it can also hurt learning itself: a study by the National Institutes of Health showed that children often scored lower on aptitude tests if they spent more than two hours a day in front of a screen. 

Technology may also prove harmful to skills that are not taught explicitly, such as the interpersonal skills built through face-to-face human interaction. Charles Nechtem Associates, for instance, believes children will become more “socially withdrawn and awkward.”

Other drawbacks include shortened attention spans due to distractions, more opportunities to cheat and the cost of implementation, especially in low-income areas.

Technology has revolutionized the way we communicate, work and, now, gain an education. It certainly helped us with distance learning during COVID-19, but there are still many drawbacks that must be taken into consideration. So, do you think we need technology in education? 

Do We Need Regulations On LGBTQ+ Discussions?

School is a place of learning, exploration and getting to know yourself as an individual and a student. But, what happens when discussion about a part of your individuality is banned? If you don’t already know what I am talking about, it’s the Florida Legislature’s House Bill 1557, more commonly known as the “Don’t Say Gay” bill, signed by Governor Ron DeSantis in the spring of 2022.

The law took effect in July and is one of many across the country targeting LGBTQ+ students. Consequently, it has led to debates among parents, educators and politicians about whether such laws are truly needed. 

The US has a long history of activist groups and lawmakers targeting LGBTQ+ discussions, from restricting access to certain books to debating transgender students’ use of bathrooms and participation on athletic teams. 

This specific law, however, restricts classroom instruction on sexual orientation and gender identity in kindergarten through third grade. In October, House Republicans also introduced a federal bill with similar ideas, which could affect LGBTQ+ students at all schools and federally funded institutions. The bill, titled the “Stop the Sexualization of Children Act,” aims to prohibit the use of federal funds to develop or implement sexually oriented material for children under 10. 

However, despite its nickname in the media, proponents insist that the bill is not anti-gay but rather an effort to make conversations more age-appropriate; the bill is commonly known as “Don’t Say Gay” because of its perceived anti-LGBTQ+ intent, but it doesn’t actually ban the word “gay” from being said in school. Instead, supporters argue, it protects young children from ideological “woke” indoctrination at school. In an interview with Focus on the Family, a Christian fundamentalist group, DeSantis said there is a “concerted effort to inject… gender ideology and sexuality into the discussions with the very youngest kids.” The bill would supposedly restrict this imposition of ideology and give parents more of a say in their children’s education.

On the other hand, critics argue that the bill is an assault on the rights of LGBTQ+ Americans in service of political ideology. By hindering conversations in classrooms and federal institutions, the government would essentially limit students’ ability to explore their identities and erase LGBTQ+ history. 

Chasten Buttigieg, a former teacher and the husband of the Secretary of Transportation, went even further and tweeted that the bill “will kill children,” referring to the implications it would have for LGBTQ+ youth’s mental health. Opponents also argue that, depending on how the bill is interpreted, it could lead to Title IX violations and increased discrimination in and out of school. 

LGBTQ+ discussions in school have been a widely debated topic for many years, and questions remain about what these bills will actually mean for schools and federal institutions, what consequences they will carry, and whether they will actually be beneficial. 

Do We Need Affirmative Action?

Affirmative Action. It’s probably something we all heard about when applying to colleges, especially the extremely selective ones. But, what does it actually mean, and why do (or don’t) we need it in the college admissions process? 

Affirmative action refers to the policies put into place to improve racial and minority equity. This so-called “positive discrimination” allows workplaces and educational institutions to offer employment or admission to those who have been historically underrepresented based on gender, race, sexuality or nationality. Its history can be traced back to 1961, during the Civil Rights Movement, when President John F. Kennedy issued Executive Order 10925, which required government contractors to provide equal opportunities for all.

However, it has become a highly debated topic in recent years, with shifting policies across presidential administrations and challenges in the courts. 

One of the primary reasons advocates fight for affirmative action is that, without it, we probably wouldn’t have as much racial diversity, hindering higher education opportunities for racial minorities. According to Valerie Strauss of the Washington Post, diversity “won’t happen by choice.” For instance, California, which banned affirmative action in the UC system, has seen the Black student population at UC Berkeley fall from 6 percent in 1980 to only 3 percent in 2017. 

It also has the power to draw people to new areas of study. A popular example is women in STEM: the share of women working in engineering occupations rose from 3 percent in 1970 to 15 percent in 2019.

There have also been more men entering female-dominated fields like nursing and teaching, illustrating how affirmative action has allowed people to break stereotypes and has increased diversity in certain fields. 

On the other hand, opponents argue that affirmative action actually enables discrimination of its own. When minority students get into a selective college, people often credit the success to affirmative action and demean their actual efforts by saying they only got in because of their underrepresented status. Similarly, colleges may favor underrepresented applicants even when another candidate has stronger academics and extracurriculars. In 2018, Harvard University went to trial over a lawsuit alleging that it discriminated against Asian American applicants. The criticism was that specific races often faced a “penalty” rooted in stereotypes (like the assumption that most Asian students go into STEM fields) and that these applicants needed to work harder than underrepresented applicants (higher SAT scores, GPAs, etc.). 

Affirmative action is a very tricky debate with many moving parts. On one hand, it is beneficial in providing opportunities to people from every background; on the other, it tends to undermine the work and effort of those who are well-represented. I believe that while affirmative action is beneficial, it needs to be significantly restructured and amended to put all students on an equal playing field.

Do We Need Gen Eds?

If you’ve registered for classes before, you may have noticed that besides the classes meant for your major, you have “General Education” slots in your curriculum. Now, you not only have to worry about getting into major-specific courses but also about fulfilling the humanities, arts and culture requirements. Otherwise, you probably can’t graduate. 

General education, or “gen ed,” classes can be found at almost every university across the US and even internationally. These classes, usually unrelated to your intended major, are meant to give students a more holistic view of the world around them and encourage them to learn new skills and knowledge that can help them in the future.

The idea took hold in the mid-twentieth century, when educators believed that pushing students toward specific majors led to overspecialization and left graduates without the skills to adapt to technological change. Consequently, by 1990, nearly 85% of American universities required the completion of certain gen ed classes. 

Now, however, these classes have been at the center of educational debates for years. Some believe they help create well-rounded students who find new passions or apply their existing skills to different fields. For instance, a marketing major whose dream is to work in pharmaceutical advertising might find it beneficial to take biology courses. Supporters also claim that the humanities and culture requirements are a beneficial counterweight to the ever-increasing push toward STEM, giving engineers and scientists crucial communication and interpersonal skills for a diverse world. 

On the other hand, opponents feel that gen eds are a scam used by universities to collect more tuition and keep students in school for four whole years (more years = more money). At Penn State, an in-state student is expected to pay around $700 per credit; factoring in the 45-credit General Education requirement, students pay a whopping $31,500 on gen eds alone! Not only is this a hefty required financial obligation, but the extra classes also take time away from studying for the actual major or participating in activities outside of school. 
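For anyone who wants a quick back-of-the-envelope check of that figure, the arithmetic below assumes the roughly $700-per-credit in-state rate and the 45-credit requirement cited above (actual per-credit rates vary by campus and year):

$$45 \ \text{credits} \times \$700/\text{credit} = \$31{,}500$$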

I personally believe that gen ed classes should not be required. In my experience, students tend to choose the ones that are easier to get an A in rather than the ones that match their genuine interests. However, I also believe students should have the opportunity to take such classes if and when they want to. Rather than setting fixed requirements, it may be time to shift toward an open curriculum like the one at Brown University; one graduate chose the Ivy League school precisely for the academic freedom and the opportunity to create their own path, unrestricted.

General Education sounds good in theory. Why would anyone object to students exploring more passions and gaining the skills to effectively engage in society? But factor in money and the busy lives of students, and maybe we need a further discussion.