NoobLab is an online e-learning platform developed by Paul Neve, a lecturer at Kingston University, and used in conjunction with his lectures in the Programming 1 course. It incorporates dynamic elements such as practical workshops and gamification to promote “active” pedagogy.


NoobLab serves two purposes: i.) to present the framing material for a practical programming workshop task alongside a facility for students to compose, edit and run the workshop code, and ii.) to provide dynamic feedback based on students’ activities as they complete a workshop task, providing an analogue of the learning loop.


The aim of this project, through the evaluation and analysis of NoobLab and its users, is to improve the user experience of NoobLab in order to elevate student engagement with the platform and enhance their motivation for learning programming.


To achieve this aim, several key objectives were established. These objectives then helped to shape the research questions:

1. Conduct in-depth user research in order to gain an insight into the relationship between NoobLab and its users.

2. Evaluate NoobLab’s usability by looking at factors such as its efficiency, intuitiveness, and the subjective satisfaction of its users.

3. Review NoobLab’s information architecture and its content structure.

4. Analyze NoobLab’s user-interface design and visual aesthetics.


Due to time and resource constraints, undergraduate students attending Neve’s Programming 1 module at Kingston University were selected as the sole focus of this study. A set of research questions, revolving around these students, was developed:

1. Is NoobLab’s current user-interface intuitive to use and easy to navigate?

2. Are students content with NoobLab’s ability to help them learn programming?

Are there any functions or features they deem missing but would be beneficial to their learning?

3. Does NoobLab’s visual design evoke positive emotions from students?

Do students find NoobLab desirable?

4. Does NoobLab allow students to track their progress and achievements effectively?

What method of data visualization could help students improve their progress tracking?


A timeline was developed with all parties involved, and the project was split into two phases. Phase One, covered in this project, concentrates on research and user development.

Phase Two, not part of this report, involves further user development and iterations, prototypes, user testing, data visualization, and implementing the design recommendations into NoobLab.


The core of the research focuses on experience and perceptions, so the epistemological relationship between the researcher and the researched adheres to an emic paradigm. The methodology chosen to achieve the aim of the project is grounded theory.

Five participants were chosen for the in-depth interviews based on their gender, their coding skill level and the team to which they were assigned: team Skywalker for the more advanced students or team Solo for beginners. A more condensed version of the planned interview was also conducted with an additional three participants at a Codebash workshop hosted by Neve.

A guided approach was followed and a set of predefined questions was created; the questions are all open-ended, designed to promote discussion between the interviewer and the interviewees.

A secluded setting was chosen for the in-depth interviews in order to minimize distractions, while the condensed interviews were conducted on site at the Codebash, where students were partially engaged with NoobLab alongside the interviews. All participants were informed of the purpose of the study, and were made aware of how the data would be used and of their right to withdraw at any time.

Open coding, as part of the analysis for grounded theory, was carried out in the hope of discovering patterns and categorizations and ultimately forming conclusions that would lead to design recommendations.

The first round was conducted manually with pen and paper, with the coding done informally; once a clear direction and a rough concept for potential categories were established, a second round of coding was performed using an Excel spreadsheet.

Three comparative e-learning platforms were chosen for competitor analysis based on the criteria of accessibility, cost and the existence of a learning path: Codecademy, Khan Academy and Duolingo. The three systems were first briefly examined for the type of learning they provide, the topics they cover, the gamification elements they offer, and the overall impression each platform makes. An informal expert review was then administered, using Nielsen’s 10 Heuristics (Nielsen, 1995), to assess their strengths and weaknesses with regard to user experience.

Two evaluators were involved in the evaluation, and each scored every requirement on a five-star scale, with five stars meaning perfect (no problems found) and one star meaning catastrophic (completely unsuccessful). The evaluators were given three tasks: i.) browse freely through NoobLab’s interface, ii.) follow through two workshops of their choosing, and iii.) complete one of each of the medal challenges (one bronze, one silver and one gold) within exercises of their choosing. The final score is the average of the two evaluators’ scores, and their feedback was combined into a single report.
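As a minimal sketch of the scoring step described above (the heuristic names and star ratings here are hypothetical placeholders, not the actual evaluation data, which is in the linked report), the per-requirement averaging could be expressed as:

```python
# Hypothetical example: averaging two evaluators' five-star scores
# per heuristic. Names and numbers are illustrative only.
evaluator_a = {"Visibility of system status": 4, "Error prevention": 2}
evaluator_b = {"Visibility of system status": 3, "Error prevention": 3}

# Final score for each requirement is the mean of the two ratings.
final_scores = {
    heuristic: (evaluator_a[heuristic] + evaluator_b[heuristic]) / 2
    for heuristic in evaluator_a
}
# e.g. final_scores["Visibility of system status"] == 3.5
```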

An expert information architecture review was also carried out in the form of a content analysis and content mapping, in which the consistency and appropriateness of labeling and information-grouping hierarchies were evaluated. The byproducts of this process include a blueprint sitemap and an inventory of NoobLab’s contents.


Below are summary tables showing findings from each of the methods used in this project. To see the full report for each of the methods, please download the PDF here.

As a result of the card-sorting exercise conducted with another MSc student during the interview planning stage, a total of 12 categories, consisting of questions from both researchers aimed at answering their research questions, were created and assembled into an interview guide: some of the categories overlapped between the two researchers’ needs, and some were tailored specifically to their separate objectives.

From the codes, five categories emerged: Design, System, User Needs, Usability and Help. Design refers to the visual feedback and design of NoobLab. System encompasses all of the elements within NoobLab itself. User Needs relates to how the students perceive NoobLab, as well as how NoobLab affects their learning. Usability focuses on the practical usage of NoobLab, including functions and learnability. Lastly, Help covers the assistance students need when working with NoobLab; this includes error messages as well as methods of seeking help from either Paul Neve or the teaching assistants.

As the codes were examined against the interview results, an unexpected pattern also emerged: many of the codes appeared to relate to Nielsen’s 10 Heuristics, and the feedback from the participants regarding NoobLab’s strengths and weaknesses directly reflected the guidelines set out by Jakob Nielsen.

Based on the results of the open coding, four personas were developed on the basis of four categories summarized from the interview results: “Context of Use”, “Information Needs”, “Expectations”, and “Progress Tracking”.

To see the results from the heuristic evaluation, please download the report here.

The classroom structure of Programming 1 divides the students into two teams, team Skywalker and team Solo, but NoobLab’s content structure presents a number of workshops that overlap between the teams. While team Skywalker has 19 workshops and team Solo has 18, only four workshops within each team are tailored exclusively to that team; the rest are shared between the two. The sitemap also outlines the number of available exercises within each workshop, as well as the number of medal challenges available within each exercise.


Below is a summary table showing some of the redesign recommendations produced from the analysis conducted for this project. To see the full report for the redesign recommendations, please download the PDF here.