Supporting Student Engagement and Sense of Community through the Use of Video in Online Discussion Boards

DOI: https://doi.org/10.64010/LYON8593

Abstract

Students and instructors are gravitating toward online education due to its ease of use and the convenience of anytime, anyplace learning. As the use of online education increases, the exchange of best-practice techniques is needed so that facilitators have the tools to lead effective learning. Understanding and increasing the student’s connection to the course is also of importance. Much research supports a constructivist, student-centered approach to online learning. This paper presented a pilot study that compared and contrasted different types of asynchronous discussion prompts (text vs. video) to measure the student’s reaction within the online classroom. It was a quasi-experimental quantitative survey research study conducted within a Midwestern institution of higher education. The results provided a model for measuring varied techniques within the online classroom.

Keywords: constructivism, online discussion board, online education

Introduction

The use of online education has continued to increase in education (Simonson, Smaldino, & Zvacek, 2015). Traditional and non-traditional student populations seem to be gravitating towards flexibility in education, and online learning provides a vehicle for both asynchronous and synchronous learning opportunities. Within the field, however, there remains much debate over whether or not the modality can support learning outcomes in the same way that face-to-face instruction can. What are the online education teaching techniques that foster students’ ability to feel connected to the class, the teacher, their classmates, and the educational institution? Can online learning mirror or mimic the face-to-face interaction achieved in a traditional classroom? How can a sense of community be achieved in an online learning format?

Much of the research has indicated that online learning supports constructivist (student-centered, collaborative) learning, which can be seen as a positive approach to teaching and learning (Campbell & Schwier, 2014; Grandzol & Grandzol, 2006). It has also been shown that a sense of community is important to students in the online setting, even though this community may have a different emotional feel than a traditional classroom (deNoyelles, Zydney, & Chen, 2014). Instructors looking to use best practices in the online setting must implement techniques to support student-centered learning and a high sense of community within the various domains of learning (Simonson, 2015). There has been a plethora of applied research conducted to discuss best practices in online learning; however, much of the dialogue is qualitative in nature. The research is often conducted by practitioners teaching online who share what works and what does not work from personal perspectives. The field is only beginning to call for more specific quantitative measurement of techniques that add value and increase the sense of community in online education.

Connecting students to learning within the online environment presents challenges not present in the face-to-face format due to disconnections in both time and space. In a face-to-face format, students and teachers share curriculum together in a classroom at the same time and thus build community during classroom instruction and other interactions before and after class. In the asynchronous online learning environment, these same conditions are not present. The separation causes a natural disruption in the ability to interact, to create, and to feel community. The disconnection can hinder the opportunity to achieve a student-centered learning focus.

The methodology presented in this study is a first step to creating a model to specifically test and measure various techniques in the online classroom. Since the primary means of interaction within online learning is through the discussion board (Andresen, 2009), the purpose of the pilot study presented here sought to compare and contrast different types of asynchronous discussion prompts (text vs. video) and identify whether or not the type of prompt influenced the student’s sense of community within the online classroom. The methodology developed is also presented as a model for future research to begin to add to a data-driven body of literature to define quality online education.

Review of Literature

Simonson et al. (2015) outlined the evolution of online education and identified that the modality has become widely accepted in education. In fact, research articles, academic presentations, and conferences have increased, and there has been momentum to study online education including its pedagogy, hardware, software, assessment, and more (Simonson et al., 2015). Studies have shown there are numerous advantages to online learning for students and teachers. The courses can be available with a simple internet connection from anywhere in the country and even the world (Simonson et al., 2015). Students can often work at their own pace within the constraints of the course (Simonson et al., 2015). In recent years, many online learning technology methods have become straightforward, easy to use, and easy to update (Simonson et al., 2015).

Baghdadi (2011) outlined best practices for online instructors. Most importantly, the online instructor should be actively engaged in the course by regularly interacting and responding to students (Baghdadi, 2011). Instructors should also use a carefully prepared syllabus that includes specific information to help the student navigate the online course, and the instructor should prepare a well-defined pattern to the course material (Baghdadi, 2011). Grandzol and Grandzol (2006) compiled an extensive list of best practices in online education in business programs. These included consistency in course design, courses prepared in advance of the course start date, learner-centered teaching style, and an establishment of trust within the first few days of class (Grandzol & Grandzol, 2006). Thiede (2012) listed best practices such as blogging, simulation and case studies, use of wiki pages, video, e-portfolio, and online discussion boards as ways to engage online students. While these studies share valuable information for the online instructor, the information was provided as suggestions from practice rather than from research.

The online discussion board has been demonstrated as a common area for students to interact with one another and for teachers to nurture response and inquiry (Woods & Bliss, 2016; deNoyelles et al., 2014). Garrison and Anderson (2003) pointed to the online discussion board as an area to promote interaction and facilitate learning. Many research studies have been conducted to explain the importance of the discussion board and techniques to use in the discussion board. However, much of the research has been qualitative in nature.

Woods and Bliss (2016), for example, developed several suggestions for best practices in online education based on the literature. The suggestions included flexible design in discussion questions, incorporating reflective assignments, and managing class sizes, among others (Woods & Bliss, 2016). The suggestions were evidence-based (Woods & Bliss, 2016), and future research might continue to quantitatively measure their effect on students. Levine (2007) compiled a list of ten relevant conditions to assist instructors in effectively using an online discussion board. The conditions, including posting meaningful questions and focusing on high cognitive domains, were based in research; however, there was no quantitative evidence to support the use of the conditions.

In recent years, some researchers have begun to look at specific tools to employ within the discussion board, such as audio, video, text, and visual elements (Covelli, 2017). Kahn, Everington, Kelm, Reid, and Watkins (2017) supported this notion that educational technology can promote engagement of online students. Among other factors, Kahn et al. (2017) noted the increased use of social media in the online classroom and the links to increased interactions. Teng and Taveras (2004) provided a practitioner case study in which they added streaming video, audio, synchronous chat, and an asynchronous open forum to an online class and observed positive results. However, their measurements were instructor and student feedback interpreted by the researchers rather than a statistical model that might demonstrate greater clarity or statistically significant results. Clark, Strudler, and Grove (2015) conducted a mixed-methods study to compare asynchronous and synchronous video against text-based discussion. The use of video in the study was centered around video-conferencing-style interaction (Clark et al., 2015). Overall, the study found the video discussions rated higher in the areas tested (Clark et al., 2015).
Swartzwelder, Murphy, and Murphy (2019) studied text-based and video-based discussions in nursing education. They found that text-based discussions were favored over video discussions, although the paper encouraged instructors to continue the use of video to promote social learning (Swartzwelder, Murphy, & Murphy, 2019). This type of research demonstrated promise to continue to add best practices to the field and to test different modalities. Rudd and Rudd (2014) conducted a review of literature that supported the use of video in synchronous and asynchronous online environments, including web conferencing, virtual classrooms, and computer-based training in the form of videos. Similar to other studies, the compilation of information was helpful, but it did not test these techniques in a statistical manner.

The goal of reviewing and continuing to add research in the area of best practices in online education stems from the desire to improve online education and the use of specific techniques to engage the student. The Community of Inquiry (CoI) framework has grown in popularity as a construct to observe and measure effective online teaching through three lenses: teaching presence, social presence, and cognitive presence (Garrison, Anderson, & Archer, 2000; Armellini & De Stefani, 2016). As described on the Canadian-based website dedicated to compiling and sharing information on the CoI framework, the theory was originally developed and funded through a research project examining the text-based computer environment in education (Garrison et al., 2000). The cognitive presence was described as the ability of the learner to create meaning within a course or experience (Garrison et al., 2000). The social presence was described as the ability of the learner to feel connection and derive meaning from the experience, and the teaching presence was described as the ability to facilitate the social and cognitive lenses to create worthwhile learning experiences (Garrison et al., 2000).

As the CoI framework evolved, additional indicators were developed within each presence area. In the cognitive presence, indicators were identified in four phases: a triggering event, exploration, integration, and resolution (Akyol et al., 2011). In the social presence, indicators included affective expression, open communication, and group cohesion, and in the teaching presence, indicators can be measured in the areas of design and organization, facilitation, and direct instruction (Akyol et al., 2011).

The CoI framework viewed online learning through a constructivist or student-centered focus (Garrison, Cleveland-Innes, & Fung, 2010). Akyol, Vaughan, and Garrison (2011) and others offered that CoI provides an effective framework for building online learning communities. Newer research on CoI indicated promise that the three constructs remained valid areas of importance in online education, with some indication that the social category was central to both teaching and cognitive areas (Armellini & De Stefani, 2016). The element of teaching also continued to be important. Budhai and Williams (2016) found that planning teaching around the needs of the students was important. Also, using emerging technologies and incorporating these into the online classroom was proving to be a best practice within the CoI framework (Budhai & Williams, 2016). Additional studies using the CoI framework that measure specific techniques for creating a robust online learning community will help contribute definitive research and have the potential to continue to add best practices to the profession of online teaching.

Research Design and Methods

This study, and the model developed, furthered the use of quantitative measures to test specific strategies and tools that can be used to enhance a community-centered approach to online learning. The research questions were: Does the use of asynchronous video discussion prompts impact the sense of community within online undergraduate courses? In what way are the teaching, social, and cognitive presences impacted by the use of video discussion prompts versus text discussion prompts?

A validated survey tool was available as a CoI questionnaire, measuring 34 indicators across three categories and ten subcategories using a five-point Likert scale (Garrison et al., 2014). The independent variable was the type of discussion prompt. The dependent variables were the ten indicators: teaching presence (1. design & organization (TDO), 2. facilitation (TF), 3. direct instruction (TDI)), social presence (4. affective expression (SA), 5. open communication (SOC), 6. group cohesion (SG)), and cognitive presence (7. triggering event (CT), 8. exploration (CE), 9. integration (CI), 10. resolution (CR)) (Akyol et al., 2011; Arbaugh, 2008; deNoyelles et al., 2014).

The survey tool had been previously tested and was found to be reliable with Cronbach’s Alpha values of α = 0.91 for teaching presence, α = 0.91 for social presence and α = 0.95 for cognitive presence (Arbaugh, Cleveland-Innes, Diaz, Garrison, Ice, Richardson, & Swan, 2008).
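The reliability statistic used here can be illustrated with a short sketch. The following is a minimal, hypothetical example of computing Cronbach's alpha for a small matrix of Likert responses; the data are invented for illustration and are not the study's responses:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical five-point Likert responses (rows = students, columns = items)
responses = np.array([
    [5, 4, 5, 4],
    [4, 4, 4, 3],
    [3, 3, 2, 3],
    [5, 5, 5, 5],
    [2, 2, 3, 2],
])
print(round(cronbach_alpha(responses), 2))  # → 0.96
```

Alpha increases as the items covary; values above 0.9, as reported by Arbaugh et al. (2008) and in this study, indicate highly internally consistent scales.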

The research design was a quasi-experimental quantitative survey research study using a one-time post-test delivered via surveymonkey.com near the end of four undergraduate psychology courses at a Midwestern public university. The four course sections were chosen based on the following research parameters: two courses, each offered in two sections (four sections total), at one institution, at the same course level (undergraduate), in the same academic department, taught by the same instructor in the same semester. The learning management system for all four courses was Blackboard, and the general style of instruction was similar across all the courses. The study and use of the survey tool were approved by the internal Institutional Review Board (IRB).

The study involved two groups. Section one of each course was used as the control group. The discussion boards in the control group experienced text-based asynchronous discussion prompts. Section two of each course was the quasi-experiment group. The discussion boards in the quasi-experiment group experienced video-based asynchronous discussion prompts.

The sample size was determined by enrollment in the courses. The researcher did not interfere with the enrollment process for the courses. Students self-selected their courses through the standard process of course selection at the university.

Methods of analysis included standard statistical interpretation of the data set. The research questions, hypotheses, variables, and methods of analysis are summarized in Table 1.

Results

Using SPSS 22, the data were viewed as one data set (two control courses and two quasi-experiment courses). Variables were coded for ease of interpretation.

Descriptive Data

Seventy-two students responded to the survey; however, one student chose to withdraw from the study at the final question. Therefore, N = 71.

In the control courses (section one), there were 43 participants, and in the test courses, there were 28 participants.

An initial descriptive means test was performed with type of discussion prompt as the independent variable and the 34 Likert presence questions as the dependent variables for the data set. Within the CoI factors of teaching, social, and cognitive presence, the survey participants reported overall high scores, with a mean range for the 34 CoI factors in all four courses of m = 3.71 – 4.51, demonstrating a positive sense of community in all four courses (text-based and video-based).

The range of means for test, video-based courses was m = 3.71 – 4.29, and the range for control, text-based courses was slightly higher at m = 3.77 – 4.51.

Overall, students scored the control, text-based courses higher than the video-based courses in each category.

The CoI survey questionnaire was divided into three presences with ten subcategories of variables. A more detailed descriptive means test was performed with type of discussion prompt as the independent variable and the ten subcategories as the dependent variables for the data set. When viewing the means by combined subcategories, the survey participants reported similarly high scores, with a mean range for the ten CoI subcategories in all four courses of m = 3.76 – 4.43.

The range for test, video-based courses was m = 3.76 – 4.15, and the range for control, text-based courses was slightly higher at m = 3.92 – 4.43. Recalling that the Likert scale was a five-point scale with five representing strongly agree and one representing strongly disagree, the means test also demonstrated that students scored the control, text-based courses higher than the video-based courses in each category. Table 2 summarizes the means for the ten subdivided categories.
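A grouped means comparison of this kind can be sketched as follows; the scores below are invented for illustration and are not the study's data:

```python
import pandas as pd

# Hypothetical subcategory scores per respondent; 'group' marks the prompt type.
df = pd.DataFrame({
    "group": ["text", "text", "text", "video", "video"],
    "SG":    [4.4, 4.1, 4.5, 3.9, 3.8],   # social presence: group cohesion
    "CT":    [4.2, 4.0, 4.3, 4.0, 3.9],   # cognitive presence: triggering event
})

# Mean of each subcategory by prompt type (the descriptive test described above)
means = df.groupby("group")[["SG", "CT"]].mean().round(2)
print(means)
```

With the full data set, the same pattern over all ten subcategories produces a table like Table 2, one row per prompt type and one column per subcategory.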

Inferential Data

In this study, the scale reliability analysis for the survey indicated a Cronbach’s Alpha α = 0.99 for the 34 Likert scale items on the CoI survey suggesting a high level of consistency for the scale. Within the three CoI categories, the Cronbach’s Alpha values were α = 0.97 for teaching presence, α = 0.93 for social presence and α = 0.98 for cognitive presence. See Table 3 for the alpha for each subcategory. The high scores support the previous study by Arbaugh et al. (2008) that indicated a high level of reliability using Cronbach’s Alpha.

An independent sample t test was performed with type of discussion prompt as the independent variable and the ten subdivided presence categories as the dependent variables: teaching-design & organization, teaching-facilitation, teaching-direct instruction, social-affective expression, social-open communication, social-group cohesion, cognitive-triggering event, cognitive-exploration, cognitive-integration, cognitive-resolution. Table 4 summarizes the data.

The t test revealed no statistically significant differences between the control and test courses on any variable. Social presence-group cohesion (SG) came nearest to a statistically reliable difference in a two-tailed test, between the means of the video-based (m = 3.87, sd = 1.23) and text-based (m = 4.30, sd = 0.91) groups, t (69) = -1.71, p = 0.09, α = .05.
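As a sketch of this analysis, an independent-samples t test on one subcategory might look like the following. The scores are simulated from the reported group means and standard deviations (clipped to the 1–5 Likert range), so the resulting t and p values only approximate those reported above:

```python
import numpy as np
from scipy import stats

# Simulate hypothetical group-cohesion (SG) scores; group sizes mirror the
# study (43 text-based, 28 video-based), giving df = 43 + 28 - 2 = 69.
rng = np.random.default_rng(42)
text_sg = rng.normal(4.30, 0.91, size=43).clip(1, 5)
video_sg = rng.normal(3.87, 1.23, size=28).clip(1, 5)

# Two-tailed independent-samples t test with pooled variance (SciPy's default)
t_stat, p_value = stats.ttest_ind(video_sg, text_sg)
print(f"t(69) = {t_stat:.2f}, p = {p_value:.3f}")
```

A p value above the α = .05 threshold, as found here, means the null hypothesis of equal group means cannot be rejected.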

Discussion

When examining the results of the primary t test and the comparative means of the video-based versus text-based scores within the context of the CoI presences, both hypotheses were rejected, indicating that the type of discussion prompt used in this study did not have a measurable effect on sense of community.

The hypotheses had qualitative support from the literature review, which included studies from Teng and Taveras (2004), Rudd and Rudd (2014), and Clark et al. (2015) suggesting that sense of community was important and that deliberate use of tools (such as video) and techniques should be employed to develop it. That research indicated positive results when delivering audio and visual content within the online learning classroom that impacted the factors of the CoI framework. While the literature supported the general idea that video adds value to the online classroom, this study revealed that there may be different uses of video that may or may not contribute to the overall sense of community. A video discussion prompt did not lead to a significant difference in the student’s sense of community. These results specifically support why the methodology presented here is so important to the field. While video as a modality was supported as adding value to the online classroom, this study quantitatively demonstrated that video as a discussion prompt alone does not promote a greater sense of community.

Multiple or varying uses of video should be tested and measured to identify which types of video may or may not be most effective. For example, varying the type of prompt (problem-based, project-based, and debate prompts) has been shown to affect the student’s interaction with higher levels of cognitive function in the discussion board in varying ways (deNoyelles et al., 2014). An informational prompt may have a different effect than a more interactive prompt. The academic level of the student (graduate versus undergraduate) may also make a difference in a student’s cognitive interpretation of a discussion prompt delivered in a text-based or video-based format. The instructor’s mannerisms, tone, facial expressions, and body language may positively or negatively impact the student’s interpretation of the video. Finally, the student’s technological access to the video (whether via a public computer, private computer, tablet, phone, etc.) and the speed of the internet connection might affect how the student connects with the content of the video, and this in turn may influence the student’s opinion of the video.

One of the reasons behind this study’s results may be related to the controls put forth in the setup of the study. Recall that the research design held numerous elements constant in an effort to capture clean data on the difference between video-based and text-based discussion prompts rather than be influenced by extraneous variables such as the skill set or technical prowess of the instructor, institutional factors, and other outside factors. The instructor was held constant. The university was held constant. The learning management system was held constant. The type of courses within one academic department was held constant. These constants allowed the study to narrowly examine whether or not the video-based versus text-based prompts made a difference. It was found that the video alone did not significantly factor into the scores related to sense of community. Varying these factors might change the results of the study.

In this study, the mean scores were high in both types of courses. It is plausible that this particular instructor may have a unique ability to create a sense of community in both text-based and video-based environments (as demonstrated by the mean scores). If this were the case, a more general difference between the two types of prompts may not be evident in the data based on the setup of this study. Also, there may be factors within the learning management system, or in the type of courses (psychology), that contribute to students’ sense of community.

There are other conditions for effective use of discussion boards that involve more than simply the initial prompt. This study sought to specifically measure the effect of the prompt rather than measuring multiple effects. This study revealed that the prompt media (video/text) does not make a significant difference. Other factors that may lead to a greater sense of community within online discussion boards include outlining rules, providing a how-to guide to threaded discussion, stimulating participation, encouraging reflection, and summarizing key ideas (Levine, 2007), along with the use of video, text, or other tools. The students’ use of video in the discussion board, rather than just the instructor’s, may also make a difference. Measuring these and other types of techniques was outside the scope of this study.

The fact that both text-based courses and video-based courses scored high reveals that the media of a discussion prompt does not have a statistically significant impact on the student’s sense of community. Therefore, using the methodology presented in this study, other techniques should be explored to continuously measure the specific tools that impact the online experience.

Strengths and Limitations

For the purposes of this study, the sample size was deemed sufficient; however, a larger sample would strengthen the validity of the data, as more participants would improve the randomization of the sample.

For future studies, the process of assigning students to the courses should be considered. Randomizing a survey sample is often a challenge. Here, it was assumed that the enrollment process would sufficiently randomize the desired sample. However, most institutions fill section one of a course prior to filling section two. This creates a potentially smaller sample in section two and may unnaturally fill the sections with students who possess uniform traits (i.e., those who tend to register early versus those who tend to register later). It is suggested that future studies seek to further randomize the sample, if possible, by shuffling students between sections in a more random manner.

Additional research questions to explore might be: Does the use of video discussion prompts and video feedback within the discussion boards influence the students’ sense of community? Does the use of video feedback within the grading process of the course have an effect on specific subcategories within the CoI framework? How does age factor into the students’ sense of community and reaction to video in the online classroom? As digital natives increasingly populate the online classroom, what is the effect of the use of video and audio techniques on the CoI framework factors? Does the instructor’s level of experience (number of years teaching online) impact the CoI framework factors?

Expanding on this study might also lead to continued research on the use of audio tools within the online discussion board and the combined use of audio and visual tools. It would also be interesting to study the use of these tools by students and whether or not the student use of video within discussion boards has an effect on the sense of community within the classroom. Replicating this study by expanding it to a multi-university, multiple instructor study would further test whether or not these factors had an impact on the data found in this study.

Finally, further research should be conducted to study the type of discussion prompt and to measure whether or not the type (e.g., case study, ice-breaker, reflective, problem-based, project-based, debate, open-ended, closed-ended, required) and the media (video/text) correlate with creating a greater sense of community.

Conclusions

Online learning has become an important modality in education. As such, research to enhance the pedagogical techniques of delivering online education is vital to the future success of online students and educational pathways and degree programs. Engaging online students in a student-centered, interactive, collaborative manner is essential to aid in their sense of community and support the learning process.

The CoI framework remains a strong tool to understand how students interact in the online learning environment. The tool helps to measure students’ sense of presence within the classroom and can aid in the study of online learning.

This study presented a methodology that can be repeated as a quantitative quasi-experiment to measure specific teaching techniques in the online environment. The study investigated the three presences of the CoI framework (teaching, social, and cognitive) and their ten subcategories: teaching (design & organization, facilitation, direct instruction), social (affective expression, open communication, group cohesion), and cognitive (triggering event, exploration, integration, resolution).

The results failed to reveal a statistically reliable difference between the text-based discussion prompts and the video-based discussion prompts. The hypotheses had qualitative support from the literature, as the use of video had been found to have positive effects within the online classroom. Furthering this study by using video in multiple ways within a discussion board may reveal positive aspects for the online learner, their connection to the online classroom, and their sense of community. This study was weighted with non-traditionally aged students, and the students’ age may impact their opinions on video-based tools and online learning.

Additional research should explore other specific tools and techniques within online asynchronous discussion boards and how these tools impact the sense of community. The emphasis on quantitative research was suggested because statistical data aid the discussion of which specific techniques work or do not work, rather than superlatives and generalizations about tools used to improve online learning. Video tools may add to the sense of community, but this study provided data indicating that the discussion prompt alone may not be the most effective place to employ them.

While this study was limited in scope, it demonstrated that a video-based discussion prompt alone did not produce a greater sense of teaching presence, social presence, or cognitive presence in a course.

Implications to Practice

Online learning continues to increase and remains a relevant, expanding modality for learning in both formal and informal educational settings. On a professional level, this study added to the ongoing research in online learning, the CoI framework, and the use of asynchronous online discussion boards. Garrison et al. (2014) maintain ongoing records of research being conducted using the CoI framework and seek to continually refine the tool and framework to contribute to best practices in developing online and blended learning courses and environments and the interconnection of the teaching, social, and cognitive presences. Studies of this sort contribute to the body of literature and provide evidence-based support for implications of the framework.

On an organizational level, this study added value for practitioners, faculty members, and instructional designers seeking to include tools and techniques in online courses to improve the student experience, the student’s sense of community, and the student’s connection to the institution and to his or her knowledge. There is often a general sense that certain techniques, such as video or audio additions, will aid the student experience, both in the face-to-face environment and the online environment. However, little data specific to these techniques is often available, or the techniques are tested in mixed studies analyzing numerous factors within a course. This study sought to specifically measure one technique (the video-based prompt) in one area of the online classroom (discussion boards). By controlling the conditions, the study also accounted for extraneous variables so that the comparison of the video-based and text-based prompts was not confounded. While narrow in scope, the study provided statistical evidence that a video-based discussion prompt alone does not impact the student’s connection to the teaching, social, or cognitive presence within the online classroom.

Institutions of higher education can use this data to explore varying teaching techniques such as incorporating video into other areas of the online classroom, in addition to the discussion board, and then measuring the effect. Organizations can also use this study as a model to test other types of manipulation with discussion prompts such as the use of audio, combinations of audio and video, use of visuals, photos, links to external feeds and more. The type of discussion prompt is also an important factor in the student’s experience in the online discussion board, and institutions can encourage instructional designers and instructors to vary their type of discussion prompt and measure the results. The organizational goal should be to improve online learning and provide greater outcomes for students.

On the individual level, this study added data and discussion surrounding the student's experience with online learning. A sense of community is an important factor in learning, and the online classroom is no different. Promoting opportunities for students to feel a greater sense of community within the teaching, social, and cognitive presences of the online classroom supports each individual's learning. Interactive and collaborative online learning is at the core of the Community of Inquiry framework, and discovering tools that contribute to a student-centered online classroom is a worthy pedagogical goal. This study has continued that dialogue and encourages future research toward the goal.

References

  • Akyol, Z., Vaughan, N., & Garrison, D. R. (2011). The impact of course duration on the development of a community of inquiry. Interactive Learning Environments, 19(3), 231-246.
  • Andresen, M. A. (2009). Asynchronous discussion forums: Success factors, outcomes, assessments, and limitations. Educational Technology & Society, 12(1), 249-257.
  • Arbaugh, J. B. (2008). Does the community of inquiry framework predict outcomes in online MBA courses? International Review of Research in Open and Distance Learning, 9(2), 1-21.
  • Arbaugh, J. B., Cleveland-Innes, M., Diaz, S., Garrison, D. R., Ice, P., Richardson, J., & Swan, K. (2008). Developing a community of inquiry instrument: Testing a measure of the Community of Inquiry framework using a multi-institutional sample. The Internet and Higher Education, 11, 133-136.
  • Armellini, A., & De Stefani, M. (2016). Social presence in the 21st century: An adjustment to the Community of Inquiry framework. British Journal of Educational Technology, 47(6), 1202-1216. doi:10.1111/bjet.12302
  • Baghdadi, Z. D. (2011). Best practices in online education: Online instructors, courses, and administrators. Turkish Online Journal of Distance Education, 12(3), 109-117.
  • Budhai, S., & Williams, M. (2016). Teaching presence in online courses: Practical applications, co-facilitation, and technology integration. The Journal of Effective Teaching, 16(3), 76-84.
  • Campbell, K., & Schwier, R. (2014). Major movements in instructional design. In O. Zawacki-Richter & T. Anderson (Eds.), Online distance education: Towards a research agenda. Edmonton, AB: AU Press.
  • Clark, C., Strudler, N., & Grove, K. (2015). Comparing asynchronous and synchronous video vs. text based discussions in an online teacher education course. Online Learning, 19(3), 48-69.
  • COI. (n.d.). Community of Inquiry. [Website]. Retrieved from: https://coi.athabascau.ca/
  • Covelli, B. (2017). Online discussion boards: The practice of building community for adult learners. The Journal of Continuing Higher Education, 65(2), 139-145.
  • deNoyelles, A., Zydney, J., & Chen, B. (2014). Strategies for creating a community of inquiry through online asynchronous discussions. Journal of Online Learning & Teaching, 10(1), 153-165.
  • Garrison, D. R., & Anderson, T. (2003). E-learning in the 21st century. London: Routledge Falmer.
  • Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2, 87-105.
  • Garrison, D. R., Cleveland-Innes, M., & Fung, T. S. (2010). Exploring causal relationships among teaching, cognitive and social presence: Student perceptions of the community of inquiry framework. The Internet and Higher Education, 13, 31-36.
  • Garrison, D. R., Cleveland-Innes, M., & Vaughan, N. (2014). CoI survey. Retrieved from: https://coi.athabascau.ca/coi-model/coi-survey/
  • Grandzol, J. R., & Grandzol, C. J. (2006). Best practices for online business education. International Review of Research in Open & Distance Learning, 7(1), 1-18.
  • Kahn, P., Everington, L., Kelm, K., Reid, I., & Watkins, F. (2017). Understanding student engagement in online learning environments: The role of reflexivity. Educational Technology Research & Development, 65(1), 203-218. doi:10.1007/s11423-016-9484-z
  • Levine, S. (2007). The online discussion board. New Directions for Adult & Continuing Education, (113), 67-74.
  • Rudd, D. I., & Rudd, D. P. (2014). The value of video in online instruction. Journal of Instructional Pedagogies, 13.
  • Simonson, M., Smaldino, S., Albright, M., & Zvacek, S. (2015). Teaching and learning at a distance: Foundations of distance education (6th ed.). Boston: Pearson.
  • Swartzwelder, K., Murphy, J., & Murphy, G. (2019). The impact of text-based and video discussions on student engagement and interactivity in an online course. Journal of Educators Online, 16(1).
  • Teng, T., & Taveras, M. (2004). Combining live video and audio broadcasting, synchronous chat, and asynchronous open forum discussions in distance education. Journal of Educational Technology Systems, 33(2), 121-129.
  • Thiede, R. (2012). Best practices with online courses. US-China Education Review, A(2), 135-141.
  • Woods, K., & Bliss, K. (2016). Facilitating successful online discussions. The Journal of Effective Teaching, 16(2), 76-92.