A Comparative Study on the Impact of the Use of Grammarly on the Quality of Writing in Core Business Courses

DOI: https://doi.org/10.64010/OAHP6175

Abstract

Effective writing skills are central to higher education and the working world. Writing is an expression of professionalism and can have legal ramifications if done poorly. Developing effective writing skills is a continuous process of improvement that takes time, and a critical component of this process is consistent, timely, and useful feedback. This study compared the writing quality of formal papers submitted by students in two undergraduate core courses: Grammarly was used as a writing feedback tool in some sections of these courses, and instructor feedback was used in the others. Can a computer-based feedback application provide better writing mechanics feedback than an instructor can realistically and reliably offer? This study answers that question empirically. The data obtained provide strong empirical evidence that Grammarly’s use as a writing feedback tool leads to a statistically significant improvement in the quality of written work submitted by students.

Introduction

Background and Need for the Study

Effective writing skills are central to higher education and the working world. Over 70% of employers rank writing skills as paramount in the hiring process (Kowalewski & Halasz, 2019). Timely and precise transfer of information is critical to businesses, as poor communication can have legal ramifications and affect the bottom line.

The federal government places such a high value on quality written communication that it enacted the Plain Writing Act of 2010. The Act requires federal employees’ writing to be “clear, concise, well-organized, and consistent with other best practices appropriate to the subject or field and intended audience” (US Food and Drug Administration, 2020). It emphasizes that poorly written communication can harm the reputation of both the individual and the organization and can cost organizations thousands of dollars. However, educators and employers continue to voice concerns about the significant gap between their expectations and the writing skills they actually observe (Rentz & Lentz, 2018).

Not only are writing skills essential for professional success; the ability to compose high-quality text is also the single best predictor of success in a student’s freshman year of college (Kellogg & Raulerson III, 2007). Those who can write well also perform better in other subjects in the curriculum, even accounting (Campbell, Choo, Lindsay, & Tan, 2020). However, writing well is a significant challenge for students because it requires memory, language, critical thinking ability, and rapid retrieval of knowledge. It is not something learned only in a freshman composition class. College students must deliberately practice the craft of writing extended text throughout the curriculum and over time. Writers, just like athletes or musicians, must train and devote hours to honing their craft.

A barrier to helping students hone their craft is that faculty are reluctant to assign writing-intensive papers because they take too long to grade. Moreover, grading such assignments is work that often goes unrewarded by college administrations, which place more emphasis on faculty research and work with graduate students. This unrecognized grading work may partly explain why writing-intensive courses have been on the decline at American universities (Bok, 2006).

In addition to fewer writing assignments, there is inconsistency in the quality of feedback and in students’ motivation to use it (Bok, 2006). Higher education students commonly report dissatisfaction with the feedback they receive on assignments, calling it vague, discouraging, or too late (Mulliner & Tucker, 2015). When faculty grade for content, punctuation, grammar, flow, construction of ideas, and proper sentence structure, they may experience fatigue and put progressively less effort into the process as they proceed. Mood changes and fluctuating motivation can add further variability to the feedback.

Another barrier to developing professional writing skills is that so many students ignore the feedback given to them by their instructor. Learners say they want and expect comprehensive feedback, but only 32% of all students actually read the instructor’s feedback (Mandernach, 2020). An instructor spends an average of 11-13 hours each week on grading only to have a majority of the students ignore the comments made. Without practice, detailed feedback, and the motivation to read and apply the feedback, students cannot hone their craft.

To lessen the grading load and allow instructors to focus on content, many faculty turn to computer-based feedback programs (Fiock & Garcia, 2019). Written feedback from instructors is very time consuming to produce and may be misconstrued: the instructor may come across as harsher than intended or make comments the students do not understand. Using a computer-based feedback tool takes the emotion out of the message and may be more developmental. In addition, students can check their work before submitting it and gain immediate feedback on their writing; the earlier the feedback, the better (Fiock & Garcia, 2019). Computer-based feedback on preliminary drafts may also motivate students to improve their scores before turning in their final paper, resulting in higher quality work.

Objectives of the Study

This study aims to determine whether computer software feedback via Grammarly improves the quality of students’ submitted written work and, if it does, by how much and in what ways.

Importance of the Study

Using computer-based feedback on writing skills may provide better writing mechanics feedback than an instructor can realistically offer. Only about 3% of an instructor’s feedback is unique to a particular student (Mandernach, 2020); the rest is repeated across all of the papers graded. If faculty can turn the task of providing feedback on writing mechanics over to computer software, they can focus more on grading the content and construction of ideas.

Literature Review

The Importance of Writing Skills

Students must adhere to quality writing standards to succeed in higher education and the working world. To develop writing skills, feedback on student compositions is an essential component (Zimmerman, 2006). Unfortunately, higher education students express great dissatisfaction with the quality of feedback they receive (Higher Education Academy, 2013). In contrast, instructors often believe they are providing high-quality, timely feedback to their students. There appears to be a performance and expectation gap when it comes to feedback on writing.

The Challenge of Developing Writing Skills

Writing is not something that is learned overnight or in one particular class; practice over time is essential. Faculty must train college-level writers in correct spelling, punctuation, grammar, diction, thesis development, paragraph development and organization, and paper flow (Kellogg, 1994). Deliberate practice is required to do all of this. According to the ten-year rule, one must practice deliberately for at least a decade to reach expert level (Ericsson, Prietula, & Cokely, 2007). This deliberate practice involves the student’s effort to improve performance, the motivation to practice and learn, the desire to practice within one’s ability, the opportunity to receive detailed feedback, and the ability to practice with high levels of repetition (Zimmerman, 2006).

The difficulties with providing students opportunities for deliberate practice include the spacing of that practice and the ability to provide timely and useful feedback (Kellogg & Raulerson III, 2007). Appropriately distributed practice promotes long-term learning and transfer of skills (Schmidt & Bjork, 1992). Writing in marathon sessions is something often practiced by students and even their professors to meet deadlines; such writing causes tension, anxiety, exhaustion, and writer’s block (Kellogg & Raulerson III, 2007). Deliberate practice also requires timely and useful feedback. However, providing that feedback is a challenge for instructors, so they assign fewer writing assignments, leaving fewer training opportunities (Bok, 2006). Moreover, the quality of feedback often diminishes as the instructor becomes exhausted after hours of grading (Mandernach, 2020).

There is persuasive evidence that high-quality feedback is the single most powerful influence on student achievement (Brown & Knight, 1994; Hattie, 1987; Hattie & Timperley, 2007). For feedback to be useful, students must engage with it and act upon it. To do that, they must understand the feedback, and it must be given in a timely fashion (Mulliner & Tucker, 2015). Computer-based feedback is one way to provide both understandable and timely feedback. In a study conducted by Grammarly (n.d.), students reported that Grammarly’s computer-generated feedback increased their writing confidence, saved them time, and improved their grades. In another study conducted at Central Queensland University (CQUniversity), Australia (O’Neill & Russell, 2019), students reported that grammar feedback was “very important” and that Grammarly improved their assignments.

Grammarly Feedback

Correct writing mechanics are an essential aspect of any writing, and various tools are available today to help students improve their writing quality. Grammarly is one such tool: a computer-based writing feedback application developed in 2008 and now the leading tool on the market (Product Habits, n.d.). Grammarly’s free Chrome extension has been downloaded 10 million times, and the company has 6.9 million daily active users.

Grammarly offers a free version and a paid premium subscription. The difference between the two is not immediately apparent to students unless they can compare the subscriptions’ output. Students often use the free version and are disappointed when their instructor points out additional writing errors that Grammarly missed. Even with the paid version, students must be encouraged to proofread and to evaluate whether Grammarly’s suggestions are applicable.

Summary of Literature Reviewed

High-level writing skills are critical to student success and essential in the working world. Students should be given the opportunity to hone their craft at writing throughout the curriculum; however, opportunities for deliberate practice are hampered by having few writing assignments, poor or late instructor feedback, and/or a lack of motivation to read and implement the feedback. One way for instructors to reduce the time spent evaluating writing mechanics, improve the timeliness of feedback, and empower students to engage with the feedback is to incorporate a computer-based writing assessment program such as Grammarly.

Scope and Focus of the Study

This study involved a comparative analysis of formal papers submitted by students in two Southern Oregon University Business program undergraduate core courses, BA 330: Principles of Marketing and BA 427: Business Policy. Both of these classes require a significant amount of writing. In both courses, students wrote and submitted drafts of specific sections of a market research paper over the course of the term. These sectional drafts were read and graded by the instructors for content and writing quality. The drafts, with feedback and comments on both writing structure and content, were then returned to the students. At the end of the term, the students combined these sectional drafts into the final paper (full paper), which was expected to incorporate the feedback and comments made by the instructors on the drafts. For the draft papers submitted in the Without Grammarly classes, the feedback on writing structure was provided manually by the instructors as they read and graded them. For the draft papers submitted in the With Grammarly classes, the instructors used the premium version of Grammarly as an automated way of generating detailed feedback on writing structure. The instructors still provided feedback on these drafts, but only on the content.

The study focused on comparing the writing quality in each of the two courses where Grammarly was used as a writing feedback tool against sections of the same courses where Grammarly was not used. The following set of Grammarly goals was used in evaluating all the papers in this study:

Audience: Knowledgeable

Formality: Formal

Domain: Academic

Tone: Neutral

Intent: Inform

Research Questions

This research attempted to answer the following questions:

  1. Does Grammarly’s use as an automated writing feedback tool result in improved quality of submitted written work by students?
  2. If Grammarly’s use as a feedback mechanism results in improved quality of submitted written work by students, how much is this improvement on a per-student basis, on average?

Measurement of Writing Quality Variables

For this research, “improved quality” of submitted written work is defined and measured in four objective ways, as reported by Grammarly for each paper analyzed:

  1. Overall Percentile Score of Paper – This score represents a paper’s writing quality measured as a percentile rank. The percentile score is based on the total word count and the number and types of writing issues detected by Grammarly, which are compared in terms of accuracy relative to other Grammarly papers with the same set of goals. This set of goals includes audience (general, knowledgeable, expert), formality (informal, neutral, formal), domain (academic, business, general, email, casual, creative), tone (neutral, confident, joyful, optimistic, friendly, urgent, analytical, respectful), and intent (inform, describe, convince, tell a story) (Grammarly, 2020). For example, a score of 75 means that writing in a paper is more accurate than the writing of 75 percent of the documents with similar goals.
  2. Number of Advanced Issues per 100 Words – Grammarly considers the following to be “advanced” issues: confused prepositions, overuse of the passive voice, wordy sentences, repetitive words, and common writing and grammar mistakes such as misplaced apostrophes (Collins, 2020). While Grammarly reports the total number of advanced issues in each document, the incidence of advanced issues is expressed per 100 words for this study.
  3. Number of Critical Issues per 100 Words – A “critical” issue is an explicit grammatical error (Francis, 2016). Grammarly reports the total number of critical issues in each document. For this study, the incidence of critical issues is expressed per 100 words.
  4. Number of Issues per 100 Words – This is the sum of the number of critical and advanced issues, with the incidence of writing issues expressed per 100 words.

Methodology and Data Analysis

To perform an objective comparative analysis of whether the use of Grammarly results in improvement in the quality of students’ written work, two formal written papers from each of the participating classes described in the section above, the first and the final papers, were run by the instructors through the premium version of Grammarly. For the courses where Grammarly was used as a writing feedback tool, only the students’ first paper lacked the benefit of Grammarly feedback; all subsequent papers, up to and including the final papers, received Grammarly feedback. For the papers submitted in the courses where Grammarly was not used as a feedback mechanism, the Grammarly reports were generated from the electronically archived student papers. The following data points were obtained and recorded for each paper:

  1. Overall Score (percentile ranking)
  2. Total number of words
  3. Total number of sentences
  4. Number of Writing Issues
  5. Number of Advanced Issues
  6. Number of Critical Issues

Due to the varying lengths of the papers in terms of the number of words and sentences, the writing issues were converted into instances per 100 words for consistency and comparable measures of quality of writing. Three new data points were computed from above: (1) Number of writing issues per 100 words, (2) Number of advanced issues per 100 words, and (3) Number of critical issues per 100 words. In addition to the Overall Score, these three computed data points comprise the variables used for measuring writing quality, as presented in this study.
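
As an illustration of this normalization step, the short Python sketch below computes the three per-100-words measures from raw Grammarly counts. The column names and values are hypothetical placeholders for illustration only; they are not taken from the study's data.

import pandas as pd

# Hypothetical Grammarly report data for three papers; the column names
# ("total_words", "advanced_issues", "critical_issues") are illustrative only.
papers = pd.DataFrame({
    "total_words":     [2450, 3120, 1980],
    "advanced_issues": [31, 22, 40],
    "critical_issues": [12, 9, 21],
})

# Total writing issues are the sum of advanced and critical issues.
papers["writing_issues"] = papers["advanced_issues"] + papers["critical_issues"]

# Normalize each count to instances per 100 words so that papers of
# different lengths can be compared on the same scale.
for col in ["writing_issues", "advanced_issues", "critical_issues"]:
    papers[col + "_per_100_words"] = papers[col] / papers["total_words"] * 100

print(papers.round(2))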

Two statistical analyses were performed on the data – a between-group comparison and a within-group comparison. In both comparisons, the magnitude of the differences in the means of the two groups as well as the mean differences within groups were estimated.

Between-group comparison: With Grammarly vs. Without Grammarly

This comparison examined the quality of the final papers produced by the sections of BA 330 and BA 427 in which Grammarly was used as a writing feedback tool against the final papers submitted by the sections in which Grammarly was not used. In the latter group, the feedback on the drafts was provided by the instructors as they read and graded the papers. The two-sample t-test of means for independent samples was used to analyze the four measures of writing quality. This statistical test addressed the first research question: “Does the use of Grammarly in evaluating and providing feedback result in improved quality of submitted written work by students?”
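
For readers who wish to see how such a between-group comparison can be reproduced with standard tooling, the following Python sketch applies a two-sample t-test to hypothetical overall percentile scores. The values are placeholders, and the use of Welch's variant is an assumption; the study does not specify whether equal variances were assumed.

from scipy import stats

# Hypothetical overall percentile scores of final papers (placeholder values).
with_grammarly = [78, 82, 75, 88, 91, 84, 79, 86]
without_grammarly = [70, 74, 68, 77, 80, 72, 69, 75]

# Two-sample t-test of means for independent samples. Welch's variant
# (equal_var=False) is shown here as a common default; this is an
# assumption, not a detail reported in the study.
t_stat, p_value = stats.ttest_ind(with_grammarly, without_grammarly, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")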

Within-group comparison: First Paper vs. Final Paper

This comparison examined the difference in quality between the first paper and the final paper submitted by each student within the classes where Grammarly was used as a writing feedback tool. The first paper submitted in these classes did not have the benefit of Grammarly feedback, while the final paper did. The paired t-test of means for dependent samples was used to compare the quality of the first paper against the final paper on the four measures of writing quality. This addressed the second research question: “If the use of Grammarly as a feedback mechanism results in improved quality of submitted written work by students, how much is this improvement on a per-student basis, on average?”
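
A corresponding Python sketch for the within-group comparison is shown below; each position in the two lists represents the same student's first and final paper, and the values are again hypothetical placeholders rather than the study's data.

from scipy import stats

# Hypothetical per-student overall percentile scores (placeholder values);
# index i in both lists refers to the same student.
first_paper = [65, 72, 58, 80, 74, 69]   # no Grammarly feedback
final_paper = [76, 81, 70, 89, 85, 80]   # with Grammarly feedback

# Paired t-test of means for dependent samples.
t_stat, p_value = stats.ttest_rel(final_paper, first_paper)

# The mean per-student difference estimates the average improvement asked
# about in the second research question.
mean_improvement = sum(f - s for f, s in zip(final_paper, first_paper)) / len(first_paper)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}, mean improvement = {mean_improvement:.1f} points")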

Results of the Study

Presented below are the statistical results and analysis of the Grammarly data of the papers in the classes considered in this study.

Tables 2 and 3 below summarize the results of the two-sample t-tests for independent samples on the four measures of writing quality for the two groups: a) With Grammarly and b) Without Grammarly. Table 2 summarizes the comparative tests for BA 427: Business Policy, and Table 3 those for BA 330: Principles of Marketing.

Table 2: BA 427: Business Policy – Comparison of Means on the Four Measures of Writing Quality, With Grammarly versus Without Grammarly

Table 3: BA 330: Principles of Marketing – Comparison of Means on the Four Measures of Writing Quality, With Grammarly versus Without Grammarly

Findings

Between-group comparisons: With Grammarly vs. Without Grammarly

In both BA 330 and BA 427, the final papers written in the classes where Grammarly was used as a feedback tool on students’ writing were of statistically significantly higher quality, at the 0.05 significance level, than the final papers from the classes where Grammarly was not utilized as a feedback tool. This result was consistent across all four measures of writing quality and in both courses.

For BA 427, the overall mean percentile score of the final papers submitted in the With Grammarly group was about 10% higher than that of the final papers submitted in the Without Grammarly group (p-value of 0.000584). In terms of writing quality measured by the number of issues per 100 words, the With Grammarly group had a significantly lower number of issues, both advanced and critical, than the Without Grammarly group. The reductions in the number of total issues, advanced issues, and critical issues per 100 words were 24.2% (p-value of 0.000724), 19.4% (p-value of 0.007857), and 37.4% (p-value of 0.006732), respectively.

Similar results were observed for BA 330. The overall mean percentile score of the final papers submitted in the With Grammarly group was 12.1% higher than that of the final papers submitted in the Without Grammarly group (p-value of 0.00864). In terms of writing quality measured by the number of issues per 100 words, the With Grammarly group had a significantly lower number of issues, both advanced and critical, than the Without Grammarly group. The reductions in the number of total issues, advanced issues, and critical issues per 100 words were 32% (p-value of 0.01295), 20.9% (p-value of 0.04112), and 55.2% (p-value of 0.00117), respectively.

Within-group comparisons: First Paper (without Grammarly feedback) vs. Final Paper (with Grammarly feedback)

The paired t-test for dependent samples was used to measure the improvement in the quality of writing of individual students in the classes where Grammarly was used as a feedback tool. For each student, the quality of the first paper (submitted without Grammarly feedback) was compared against the final paper (submitted with Grammarly feedback) on the four measures of writing quality. Tables 4 and 5 below summarize the results of the paired-samples t-tests for the BA 330 and BA 427 classes where Grammarly was used. The data provide extremely strong evidence (p-values close to 0) that the use of Grammarly as a writing feedback tool results in significant improvement in the quality of students’ submitted work.

In BA 427, the overall percentile score difference between the first paper (no Grammarly feedback) and the final paper (with Grammarly feedback) showed an improvement of 11.0 percentile points on average, on a per-student basis. For each student, the number of writing issues – advanced and critical issues – per 100 words improved (declined) as well between the first and the final papers.

Similar results were observed in BA 330. The overall percentile score difference between the first paper (no Grammarly feedback) and the final paper (with Grammarly feedback) showed an improvement of 8.5 percentile points on average, on a per-student basis. For each student, the number of writing issues – advanced and critical issues – per 100 words improved (declined) as well between the first and the final papers. The number of advanced issues per 100 words declined by 4 per student, on average.

Conclusion

The data obtained in this study provide strong empirical evidence that the use of Grammarly as a writing feedback tool leads to an improvement in the quality of written work submitted by students. The study data showed that in the classes where Grammarly was used for writing feedback, the submitted written work had better overall quality, manifested in a significantly lower number of writing issues, both advanced and critical, per 100 words compared to the non-Grammarly counterparts. The reduction in the number of writing issues directly resulted in average overall percentile scores at least ten percent higher for the papers submitted in the classes that used Grammarly compared to those that did not. On a per-student basis, the use of Grammarly resulted in improvements of 11 and 8.5 percentile points in overall writing scores in BA 427 and BA 330, respectively, and an across-the-board reduction in the number of writing issues per 100 words in both courses.

The results of this comparative study provide strong evidence for promoting the use of Grammarly as an automated writing feedback tool in classes where a significant amount of writing is part of the curriculum. The comparative data not only show a significant improvement in the quality of the submitted papers where Grammarly was used but also reveal the magnitude of that improvement, manifested in the increase in students’ writing percentile ranking and the reduction in the number of advanced and critical issues observed in the submitted papers.

Benefits to the Instructors

From the instructors’ perspective, the most significant benefit of Grammarly’s use as an automated writing feedback tool is the lessening of the grading workload. It allows them to devote more time to reading, evaluating, and commenting on content, where their subject matter expertise is most crucial, rather than spending an excessive amount of time proofreading and correcting structural flaws in students’ papers. While mechanics are an important aspect of writing, they can be addressed much more comprehensively, consistently, and promptly by an automated checking application like Grammarly.

Benefits to the Students

From the students’ perspective, on the other hand, the use of Grammarly provides instantaneous, comprehensive, and detailed feedback on writing issues that they can immediately address and correct as they write their papers. At the same time, students can learn the mechanics of proper writing structure in real time. A student in BA 427, where Grammarly was used, commented, “This was my first term using Grammarly, and I can say that might have been the most helpful tool I have used since I have been to college.” Another student observed, “Grammarly is easy and a beneficial way to check grammar, punctuation, and word choice, which is difficult for me to check on my own.” Grammarly’s active use can increase students’ writing confidence and motivate them to improve their writing quality scores through improved writing mechanics.

Limitations of the Study

The instructors used the premium version of Grammarly; it is unknown whether the free version would have the same impact, or whether any of the students used Grammarly on their own, free or premium, before submitting their work. In addition, business students were the subjects of this study; it is unknown whether similar results would occur in other disciplines, e.g., creative writing.

This study only examined writing mechanics. The plagiarism checker feature in Grammarly was not used. There is much more involved in writing a high-quality paper. Content, for example, was not evaluated in this study.

References

  • Bok, D. (2006). Our underachieving colleges: A candid look at how much students learn and why they should be learning more. Princeton, NJ: Princeton University Press.
  • Daily Logo Challenge. (2020). Grammarly review 2020 (Premium vs. Free). Retrieved from https://blog.dailylogochallenge.com/grammarly-review/
  • Ericsson, K. A., Prietula, M. J., & Cokely, E. T. (2007, July-August). The making of an expert. Retrieved from Harvard Business Review: https://hbr.org/2007/07/the-making-of-an-expert
  • Fiock, H., & Garcia, H. (2019, November 11). How to give your students better feedback with technology. Retrieved from The Chronicle of Higher Education: https://www.chronicle.com/article/how-to-give-your-students-better-feedback-with-technology/?cid=gen_sign_in
  • Gain, A., Rao, M., & Bhat, S. (2018). Usage of Grammarly – Online grammar and spelling checker tool at the Health Sciences Library. Manipal Academy of Higher Education, Manipal.
  • Grammarly (n.d.) Grammarly User Survey Analysis. Retrieved from https://www.grammarly.com/press/research/docs/grammarly-studentsurvey-121018133119-phpapp01.pdf
  • Product Habits. (n.d.). How Grammarly quietly grew its way to 6.9 million daily users in 9 years. Retrieved from Product Habits Blog: https://producthabits.com/how-grammarly-quietly-grew-its-way-to-7-million-daily-users/
  • Higher Education Academy (2013). HEA Feedback toolkit. York, United Kingdom.
  • Kellogg, R. T. (1994). The psychology of writing. New York: Oxford University Press.
  • Kellogg, R. T., & Raulerson III, B. A. (2007). Improving the writing skills of college students. Psychonomic Bulletin & Review, 14(2), 237-242.
  • Kowalewski, S. J., & Halasz, M. E. (2019, February 25). Why are written communication skills important for business students? Archives of Business Research, 7(2), 95-102.
  • Mandernach, J. (2020). Integration of holistic feedback to engage and motivate students. Innovate Educators.
  • Mulliner, M., & Tucker, M. (2015). Feedback on feedback practice: Perceptions of students and academics. Assessment & Evaluation in Higher Education, 42(2), 266-288. DOI: 10.1080/02602938.2015.1103365.
  • Nickerson, R. S., Perkins, D. N., & Smith, E. E. (1985). The teaching of thinking. NJ: Erlbaum.
  • O’Neill, R., & Russell, A. M. (2019). Stop! Grammar time: University students’ perceptions of the automated feedback program Grammarly. Australasian Journal of Educational Technology, 35(1).
  • Rentz, K., & Lentz, P. (2018). Business Communication: A Problem Solving Approach. McGraw Hill Publishers.
  • Rizqan, K., & Darayani, N. A. (2018, January). Grammarly as a tool to improve students’ writing quality. Sains Sosial dan Humaniora.
  • Schmidt, R. A., & Bjork, R. A. (1992). New conceptualizations of practice: Common principles in three paradigms suggest new concepts for training. Psychological Science, 207-217.
  • US Food and Drug Administration. (2020, September 5). The Plain Writing Act of 2010. Retrieved from https://www.fda.gov/about-fda/plain-writing-its-law/plain-writing-act-2010
  • Zimmerman, B. J. (2006). Development and adaptation of expertise: The role of self-regulatory processes and beliefs. In K. A. Ericsson, The Cambridge handbook of expertise and expert performance (pp. 705-722). New York: Cambridge University Press.