

Educative.io:

Usability of Online Programming Courses

Winter 2018

Sami West, John Frens, Kit Collins, Jihyun Lee


Executive Summary

We conducted a usability study to assess the ease of use of three major features of Educative.io courses — navigation, quizzes, and coding exercises. We observed seven University of Washington students as they thought aloud while working through selected tasks in an introductory JavaScript lesson. Our observations and subsequent interviews uncovered opportunities to improve the user interface (UI) and layout, increase the salience and usability of interactive feedback, and individualize the pace of the content. Our recommendations to improve the UI include increasing visual contrast, drawing attention to interactive elements, and matching UI behavior with presentation. Feedback recommendations include using beginner-friendly error messages, increasing their prominence and immediacy, and offering additional specificity and justification. Our recommendations for layout and pace include segmenting the content into shorter pages, placing examples and exercises in close proximity, and providing access to additional or extended content. This report offers a detailed discussion of these findings and a prioritization of the opportunity areas.


Overview

Educative.io features online coding courses that leverage interactive interfaces in support of textual course content. The primary users of Educative.io have an introductory or intermediate understanding of software development and little to moderate experience with online courses. The purpose of this study is to provide data about the usability of the interactive features and to uncover opportunities for improvement. The study focuses specifically on the courses, as opposed to other parts of the website, and particularly on the interactivity of navigation (see Figure 1), quizzes (see Figure 3), and coding exercises (see Figure 2). We conducted our usability sessions on a JavaScript lesson from Educative’s introductory web development course.


Figure 1: JavaScript lesson. The navigation bar on the left side allows users to change pages, search for content, and keep track of finished lessons.


Research Questions

Our study investigated the following research questions:

  • Do users understand the navigation of the course content? Do the lesson breakdowns and structure make sense?

  • Can users quiz themselves easily and effectively? Can they quickly get feedback they can understand? Do they perceive the feedback to be helpful?

  • Do users understand the two-language (HTML/JavaScript) coding practice interfaces? Can they edit and run the code easily?

  • What obstacles do users encounter with the interface while working on a lesson?


Participants

We recruited seven undergraduate and graduate Human Centered Design & Engineering (HCDE) students from the University of Washington (UW) who were interested in learning introductory-level JavaScript programming. We distributed a screener to multiple HCDE-affiliated Facebook groups. All seven respondents participated in our usability sessions.


Demographics

Participants were aged 19 to 29. Six self-identified as female and one as male. Two were undergraduate HCDE students, while the remaining five were in the master’s program.


Figure 2. Interactive Coding Example. Users are presented with an example of how to use JavaScript to create an alert. In this example the code is already written; users press the “Click Me!” button to see the alert in their browser.
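This kind of example takes only a few lines of code. The sketch below is our own minimal reconstruction, not the course’s exact code; the button id and alert text are assumptions.

    // Minimal sketch of the Figure 2 example (our reconstruction, not the
    // course's exact code). Assumes the page contains:
    //   <button id="click-me">Click Me!</button>
    var button = document.getElementById('click-me');
    button.addEventListener('click', function () {
      alert('Hello from JavaScript!'); // the alert text is an assumption
    });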

Figure 3. Quiz. As a user progresses, they are presented with opportunities to “check their understanding.” We were interested in whether users could use these quizzes to quickly receive feedback on how they were doing.

Coding Experience and Interest

We asked participants to report previous experience with Python, Java, JavaScript, HTML, CSS, PHP, and C++. All seven participants reported prior coding experience in at least one language. Figure 4 summarizes participants’ experience levels.

Figure 4. Participant coding experience. All participants had some experience with coding. Five participants had no JavaScript experience, one had less than a year, and one had over a year of JavaScript experience.

We asked participants to indicate their interest in learning these languages. Six of the seven selected JavaScript as one of the languages they were interested in. On the subsequent pre-session questionnaire, all seven participants agreed or strongly agreed with the statement, “I am interested in learning JavaScript.”

We asked participants whether they had prior experience with online learning platforms. Six indicated they had previously used one; the platforms listed included Codecademy, Coding Dojo, Lynda, Udacity, and Coursera.


Methodology and Data

We conducted a 45-minute usability session with each participant on the UW campus between February 13 and February 27, 2018. We asked participants to think aloud as they worked through seven tasks (see Appendix A). During each session, one research team member moderated the study, another timed each task, and a third recorded notes while watching a second monitor.


Procedure

At the beginning of the session, the moderator explained the procedures of the study and requested the participant’s consent. Next, the participant answered a pre-session questionnaire. The moderator then opened the Educative.io course and read each task aloud. After the participant completed all tasks, they answered a post-session questionnaire. Finally, the moderator asked the participant five questions in a semi-structured interview.


Data Gathering

Data types gathered during the study:

  • Think-Aloud

  • Pre/Post Questionnaire

  • Post-Session Interviews

  • Time on Task

  • Observations

  • Number of Page Changes

During the usability session, we collected the amount of time users spent on each task, as well as the number of times they changed pages. This descriptive quantitative data gives a sense of the variance among users and provides a benchmark for future studies utilizing the same tasks; for instance, new navigation features could be tested against times from the status quo. We also collected observational notes from each session and semi-structured interview. Finally, we collected pre/post Likert measures of interest and confidence in JavaScript, as well as additional post-session Likert questions gauging users’ perceptions of the content, quizzes, exercises, and platform overall.



Findings and Successes

What worked well?

Our testing sessions surfaced several positive themes. Primarily, participants liked the visual design of the platform and thought it offered a large amount of rich content on learning JavaScript. Participants also found the left-hand course navigation bar easy to use.

Areas of Opportunity

In our study, we identified aspects of the lesson where participants struggled. We have thematically grouped these findings into three primary categories: UI Clarity, Course Feedback, and Content & Layout.

UI Clarity

One theme we identified across the platform was that certain interface elements did not function as expected, or left participants unsure of what they were supposed to do. For example, when participants attempted the quiz, they were surprised that after selecting one of the multiple-choice options, the quiz automatically jumped to the next question. Because there was a “Next” button in the bottom corner (see Figure 3 above), participants expected to select an answer and then click Next. Another example was the hamburger menu icon next to the left-hand navigation bar (see Figure 1 above). When participants clicked this icon, they expected a menu of additional options to appear, but its actual function was to hide the left-hand navigation bar. Additionally, participants failed to discover multiple UI components that were directly relevant to the tasks.

Course Feedback

During the study and the interviews that followed, participants often commented on the feedback presented to them during the quiz or interactive coding session. The most prominent example of participant confusion occurred when an error stack trace appeared during the coding exercise: participants were not able to interpret the information and subsequently struggled to fix the issue in their code. Participants also voiced a need for additional or more detailed feedback. One participant stated she would like to know exactly where and why the code went wrong; another suggested adding a feature for asking questions of instructors. Finally, in many cases we observed participants ignoring or skipping opportunities for feedback, such as the “view details” button at the end of quizzes and the “test” button on coding exercises.

Content & Layout

Over the course of our sessions, many participants commented on the content and layout of the lesson. A significant theme was that the lesson contained a large amount of text, the text often included technical terms unfamiliar to the user, and the page itself was long. Observations showed that when trying to answer the quiz questions or complete the exercise, participants would often scroll back up to the text to remind themselves of the material. One participant said that the lesson sometimes felt like reading a “PDF of a textbook.” Participants liked the interactive coding sections and wanted more of them.


Implications and Recommendations

UI Clarity

During our sessions, we noticed that while participants liked the visual design of the user interface, some of the icons and interactions were misunderstood (e.g., check marks on quiz answers, the hamburger menu that hides the navigation bar) or went undiscovered (e.g., the search bar, the details tab on the course overview page). These issues often left participants confused when an interaction differed from their expectations, and made tasks take longer than expected.

We recommend increasing the contrast between the interactive elements and the non-interactive elements as well as considering a redesign of icons like the hamburger menu and the quiz multiple-choice check marks. By addressing these concerns, we think that the platform will be easier to use and understand for all users.
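To illustrate how behavior could better match presentation in the quiz, the sketch below decouples answer selection from advancing, so that the visible “Next” button is what moves the user forward. This is a hypothetical sketch under our own assumptions; the class names, ids, and the showNextQuestion helper are not Educative.io’s code.

    // Hypothetical sketch: selecting an answer only records and highlights it;
    // the visible "Next" button is what actually advances the quiz.
    var selectedAnswer = null;

    function showNextQuestion(answer) {
      // Stub: a real quiz would grade the answer and render the next question.
      console.log('Advancing with answer:', answer);
    }

    document.querySelectorAll('.quiz-option').forEach(function (option) {
      option.addEventListener('click', function () {
        selectedAnswer = option.dataset.value; // record the choice
        document.querySelectorAll('.quiz-option').forEach(function (other) {
          other.classList.remove('selected'); // clear any previous highlight
        });
        option.classList.add('selected');
      });
    });

    document.getElementById('next-button').addEventListener('click', function () {
      if (selectedAnswer !== null) {
        showNextQuestion(selectedAnswer); // advance only on explicit confirmation
      }
    });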

Course Feedback

One critical finding was the effect of the error message that appeared when participants attempted to create an event listener in the JavaScript exercise at the end of the lesson (see Figure 5).

Figure 5: Runtime error message shown during the JavaScript coding exercise.

Similarly, when users gave an incorrect answer on the quiz, they first needed to click “View Details” to see which question they had answered incorrectly; the platform then showed the correct answer alongside the incorrectly selected one (see Figure 6).

In both of these instances, participants said they wanted more feedback and helpful hints.

In the case of the feedback on the coding exercise, participants said that the Runtime Error information was too complicated and intimidating. One participant said they would leave the platform if they encountered that kind of error message. Participants also said that the error did not help them realize what they had done wrong, and they wanted access to hints or tips to help them figure out how to fix their code. They wanted the feedback relayed in a conversational tone instead of a complicated error stack they could not understand.
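We did not capture the exact code participants wrote, but the reconstruction below shows the kind of small slip, and the kind of raw error, that an event-listener exercise can produce; the element id and alert text are assumptions.

    // Illustrative reconstruction, not the actual exercise code. If the id
    // passed to getElementById matches no element, the lookup returns null,
    // and attaching a listener throws a raw runtime error like the one in
    // Figure 5:
    var btn = document.getElementById('submit-buton'); // misspelled id -> null
    btn.addEventListener('click', function () {
      alert('Submitted!');
    });
    // Uncaught TypeError: Cannot read properties of null (reading 'addEventListener')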

In the case of the quiz, participants liked being able to see which answers were correct and incorrect, but they also wanted more information on why the correct answer was right and the incorrect answer was wrong. Observations during this portion of the session showed that participants spent little time, if any, viewing and considering the quiz feedback.

We recommend revising or supplementing the current feedback systems with additional information or helpful hints. We also recommend rephrasing and restructuring the feedback so that a beginner-level developer can easily approach and understand it. We believe this will help users better identify their mistakes and make quicker improvements to their code.
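One possible shape for such feedback, sketched below under our own assumptions rather than as Educative.io’s actual implementation, is to catch common runtime errors in the exercise runner and translate them into plain-language hints:

    // Hypothetical sketch: run the learner's submission and translate common
    // errors into beginner-friendly hints instead of a raw stack trace.
    function runLearnerCode(source) {
      try {
        new Function(source)(); // execute the submitted code
        return 'Your code ran without errors!';
      } catch (err) {
        if (err instanceof SyntaxError) {
          return 'There is a typo somewhere: check for missing brackets, ' +
                 'parentheses, or quotes.';
        }
        if (err instanceof TypeError && /null|undefined/.test(err.message)) {
          return 'Your code tried to use something that does not exist yet. ' +
                 'Double-check the id or variable name you are referencing.';
        }
        return 'Something went wrong: ' + err.message; // fall back to the raw message
      }
    }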

Content & Layout

Figure 7: Lesson text containing a large number of new vocabulary terms.

During our usability sessions, we noticed that the course was very text heavy. Although the text served as the main guide to the course content, the amount of text per page actually slowed down learning. One example occurred during the exercise portion of the lesson: once participants reached the exercise, they had to scroll continuously between the exercise and the examples above it in order to complete it.

We also noticed that within the text were a large number of new vocabulary terms (see Figure 7). Participants generally had little experience with programming, so retaining these terms as they made their way down the page was difficult — especially with the amount of content on the page and lack of additional resources.

We recommend first segmenting the lessons into shorter pages. This would help users digest the lessons in small portions, reducing the chance of feeling overwhelmed as they work through the content. It would also place examples closer to the actual exercises, making the tasks easier to complete. For users who are struggling, we also recommend providing additional resources for support, such as a glossary of vocabulary words or access to an instructor when they feel stuck on an exercise or concept.


Future Studies

For future usability testing of Educative.io, we recommend the following:

Increased Diversity of Participants

One limitation of our study is that our participants were predominantly female master’s students in the HCDE program at the University of Washington. We recommend that future studies recruit participants with different educational backgrounds and interests, as well as a more diverse age range and gender breakdown.

Assessing a Variety of Courses

Due to time constraints, we could only assess the usability of the first JavaScript lesson in the Introduction to Website Development course. We recommend adapting this usability study to courses covering other software development languages and practices. We also recommend running the study on later portions of a course so that it does not focus solely on introductory materials.

Consistency Between Sessions

Due to room availability on the UW campus, the location of our study sessions differed for almost every participant. Each location had a different room layout, a different monitor size and placement for observing the participant’s interactions, and a significantly different amount of space. Additionally, we switched roles throughout the study to gain experience in moderating, timing, and notetaking. We recommend that future studies use the same space every time and standardize the roles of study administrators so that each person serves the same role for all participants.

Remote Testing

A challenge in studying this product is that participants are meant to learn and practice coding, which takes time and is not typically done in front of an audience. We conducted our sessions in person, with the moderator and notetakers observing the participant in the same room, which could have made participants more nervous or anxious than if they were using the platform alone. We recommend attempting this type of study using remote testing so that the presence of the researcher is less likely to impact the participant’s behavior. Remote testing may also improve the external validity of the study because participants would be in their home environment — a place where they are more likely to try an online course platform like Educative.io.