🧩 Business Opportunities
• The expanding business's need for an improved Learning Management System with a flexible curricular structure, allowing content to be reused across different courses.
• A pattern of declining student engagement rates at several stages of the course, directly affecting the company's north star metric.
🧪 My role
As the Product Designer of a core business squad within the Education Experience area, responsible for student engagement and the overall improvement of the learning platform, I was tasked with the redesign of the LMS.
The project included the design of a new navigational structure for a flexible curriculum, as well as key UX improvements with the end goal of increasing student engagement.
✨ Results and Deliveries
• Delivery of the blueprint for a brand new, flexible Learning Management System, as well as leadership and guidance for other squads to fit their contexts and tools within the new navigational structure.
• A 32.4% increase in student engagement metrics on course sections with lower engagement/approval rates.
👋 Context
About Trybe
Trybe is an online tech school whose business model operates on the Income Share Agreement (ISA) payment model: students pay tuition only after landing a job that pays above a minimum salary threshold.
As all classes are online, the school relies on digital tools to teach students in both synchronous and asynchronous methods, such as Zoom, Slack, and an online learning platform.
Originally, the platform was conceived as a rudimentary MVP that allowed students to view a daily content agenda and read lessons and assignments for the flagship Web Development Course. All student communication and messaging happened through Slack, and online classes took place within Zoom. Students had to develop code projects and exercises in their own text editors.
The problem
As Trybe expanded and took in more cohorts, the needs of its users and of its business plan grew as well. By mid-2022, the school aimed to develop into a complete tech school that could offer a variety of tech-related courses. Its second course, focused on C++, was scheduled to launch in June 2022.
However, the LMS had a legacy-code problem that made updates and improvements difficult, on top of a visible decrease in student engagement rates starting in the last half of the Front-End Module, as shown by samples from the last three to five cohorts.
To support this expansion, the LMS' curricular structure, its navigation, the way learning content was stored and shown to students, and its other learning functionalities would all have to be rebuilt into a flexible, modular, replicable structure that several different courses could be built upon.
My Squad
During that time, I worked for the Learning Experience Squad (named EdX) as a Product Designer. My team focused on creating solutions for the end user to improve student engagement metrics that directly impacted course approval rates & job landing success.
As the EdX squad was directly responsible for student engagement and improving content consumption, during Q3–Q4 2021 it became the squad in charge of creating and developing the blueprint for the front end of an LMS platform able to host several tech courses, with modular, replicable learning functions alongside adaptable learning management tools.
We also collaborated with an operations team that designed the Back-End experience of the platform and how content would be stored and reused.
Key Metric
Student Engagement: Overall, this metric is defined by a student's interaction with the course's content during a study day (6 hours). An engaged student makes a minimum number of commits on code projects, clicks on video classes and quizzes, and attends live classes and mentorship sessions.
Engagement with the course's content was a leading metric that directly related to the school's north star metric of student approval and, therefore, student hirings.
In summary, a cohort's overall engagement was measured by the number of students whose behavior fit the criteria listed above.
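For illustration, those criteria could be expressed roughly as the sketch below. This is a minimal TypeScript sketch assuming a simplified activity record; the field names and thresholds are my stand-ins, not Trybe's actual data model or cut-offs.

```typescript
// Hypothetical shape of one student's activity on a study day; the fields are
// assumptions standing in for the real analytics events.
interface StudyDayActivity {
  projectCommits: number;      // commits pushed to code projects
  videoClassViews: number;     // clicks/plays on video classes
  quizInteractions: number;    // quiz attempts or completions
  attendedLiveClass: boolean;  // presence in the day's live class
  attendedMentorship: boolean; // presence in a mentorship session
}

// Illustrative minimums only; the real thresholds were defined by the data team.
const MIN_COMMITS = 1;
const MIN_CONTENT_INTERACTIONS = 1;

/** A student counts as engaged on a study day if they hit the minimum
 *  thresholds across code, content, and synchronous activities. */
function isEngaged(day: StudyDayActivity): boolean {
  const interactedWithContent =
    day.videoClassViews + day.quizInteractions >= MIN_CONTENT_INTERACTIONS;
  return (
    day.projectCommits >= MIN_COMMITS &&
    interactedWithContent &&
    (day.attendedLiveClass || day.attendedMentorship)
  );
}

/** Cohort engagement: the share of students whose study day met the criteria. */
function cohortEngagementRate(days: StudyDayActivity[]): number {
  if (days.length === 0) return 0;
  return days.filter(isEngaged).length / days.length;
}
```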
Definitions of Success
• Delivery of a complete modular structure for the LMS embracing every context, including careers, finance/payments, and educational content.
• An increase of at least 20% in student engagement in the last two segments of the Front-End Module. This is the point in the student's journey where engagement and approval rates tend to be lowest due to an increase in difficulty, as shown by samples from the last three cohorts. With a brand new curricular structure and improved UX, we expected engagement rates to rise, which we would verify by testing the new platform with two new cohorts.
🔎 Discovery Process
Analyzing the platform and the school's curricular structure
My squad and I met with stakeholders to better understand the business's needs and what needed to change within the curriculum's structure. The former learning platform was focused on a single course: Web Development. All of its learning content, data, and resources were integrated into the only existing course, creating a completely static experience.
To design a multi-course Learning Management System, Trybe's curriculum and Learning Assessment teams would need to rethink the former curricular structure to design a flexible, modular learning experience that could be replicated across several different courses, similar to a system template.
Within the former platform, the course was divided into the following hierarchies:
• Modules: The largest grouping within the Web Development Course, modules are divided by subjects in Web Development, such as Fundamentals, Front-End, Back-End and Computer Science.
• Blocks: Equivalent to one week of learning content within a module; a block hosts around five content days.
• Content Days: Contain texts, videos, exercises, assignments, etc. Formerly designed as a static, continuous curriculum.
Project timeline & governance
With a clearer picture of the scope of the project, my squad's Product Manager and I organized a timeline spanning the entire three-month quarter to structure our sprints and deliveries.
Also, every two weeks, my squad held a general progress report meeting with stakeholders and the company board.
My squad's main priority was the new learning experience, meaning we would focus on redesigning the flagship course's structure to make it replicable across different courses. This involved redesigning the entire navigation experience, affecting platform areas owned by different teams; so, inevitably, all squads would be partly involved in the construction.
For this reason, other squads' PMs and Designers were also included in the report meetings, so that everyone could acquire context and define their squad's next steps during each stage of the process. The Design Team also held weekly Design Sync meetings, where I constantly kept the other designers up to date on the project.
While I designed the blueprint for the project, the front-end development team would work on organizing the platform architecture and developing product initiatives defined in the previous quarter. I also included developers in every step of the process so the entire team could gain enough context, given the scope and complexity of the construction.
The delivery/sprint timeline was organized as follows:
Research
To guide the team's and stakeholders' priorities and decisions, and to collect insights on how the platform and its content should be organized, I decided to:
• Browse the former flagship platform and conduct a quick analysis using Nielsen and Molich's heuristic evaluation of user interfaces;
• Gather qualitative data regarding the user's learning experience. Although some major improvements seemed relatively clear to us, we wanted to prioritize the user's perspective and learn how we could best shape the experience to fit their learning needs.
Method and objectives
Method: Exploratory interviews, scoped to two weeks including planning, conducting, and consolidating the sessions.
Main Goals: To collect qualitative data in order to understand the decrease in engagement and approval rates at specific points of the course, and to discover ways to mitigate critical pain points.
Output Expectations: To use this data to guide our decisions as we designed the new LMS platform and released it for testing, and to prioritize a roadmap of opportunities to explore further in the next quarters.
User Grouping
Intending to acquire a more varied sample of insights and to avoid repetition bias (accidentally choosing similar student profiles at random), we decided to divide participants according to performance and engagement within the platform. With the help of the data science team, we divided our student base into three groups, defined as follows:
• Group A
High-performing, engaged students
These students tend to receive approval on projects and assignments an average of two to five days before the deadline.
• Group B
Engaged students at risk of falling off track
These students tend to receive approval on their assignments an average of zero to two days before the deadline. Some may be doing, or have had to do, recovery projects.
• Group C
Disengaged, off-track students
This group contains students who have fallen behind a cohort at some point during the course; in other words, students who did not hand in a recovery assignment by the deadline.
I decided to conduct user interview sessions with a sample of 15 students in total (five from each group), coupled with short usability tests on the current platform.
We would mainly focus on Groups B and C when prioritizing our insights. Still, we believed Group A students would offer valuable insights regarding study routine organization and commitment to a career in web development. (A rough sketch of the grouping logic follows.)
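This is a minimal sketch assuming a simplified student record; the field names and exact cut-offs are illustrative stand-ins for the data science team's actual criteria.

```typescript
// Hypothetical student record; the fields stand in for real engagement features.
interface StudentRecord {
  id: string;
  avgDaysBeforeDeadline: number; // mean days between approval and the deadline
  missedRecovery: boolean;       // failed to hand in a recovery assignment on time
}

type EngagementGroup = 'A' | 'B' | 'C';

function classifyStudent(s: StudentRecord): EngagementGroup {
  if (s.missedRecovery) return 'C';             // disengaged / off-track
  if (s.avgDaysBeforeDeadline >= 2) return 'A'; // approvals two to five days early
  return 'B';                                   // approvals zero to two days early
}

/** Sample up to n participants from one group for the interview sessions. */
function sampleGroup(
  students: StudentRecord[],
  group: EngagementGroup,
  n = 5,
): StudentRecord[] {
  return students.filter((s) => classifyStudent(s) === group).slice(0, n);
}
```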
Exploratory Interview script
The interview script contained general questions about:
• Challenges with study routine and general experience with video classes and content;
What their study routine is like, what they feel is lacking, and when they were last frustrated and how that happened;
• Challenges with projects and learning assessment;
When was the last time they practiced, and how did they manage their doubts and difficulties?
• How students last used the platform's features such as live class recordings, quiz exercises, etc;
Subsequently, I would ask students to share their screens with me via Zoom and simulate their last study routine while I observed and took notes on how they navigated the platform and used its features.
Research analysis and insights
After conducting the research, I transposed all the data onto a Figjam whiteboard so we could get a broader, more organized view of the pain points and opportunities we would be dealing with. To better identify patterns and opportunities, I transcribed research insights and nuggets onto digital post-it notes and then clustered all the data into an affinity diagram.
I organized insights into groups with small opportunity solution trees, adapting the method described by Teresa Torres in her book Continuous Discovery Habits. I chose to organize insights this way (instead of using personas or mental models) because:
• I believed this would give the team and stakeholders a clearer, more straightforward picture of user pain points & opportunities to prioritize;
• We already had quantifiable data on our user base, and the specific goal of grouping users was to acquire a varied set of insights.
The team voted collectively on the opportunities we believed would have the highest impact when building a new navigation tree and improving our learning platform.
We prioritized the following:
Content consumption & learning assessment:
• Students mentioned difficulties with the platform's content, as most of it relied solely on text and lacked more diverse or interactive forms of media (such as videos, quizzes, etc.);
• This was further emphasized by the study habits we analyzed throughout the research; some students prefer to review the platform's content before attending live classes, and a lack of more diversified media might impact their study habits negatively;
• Between reading content and attending live classes, students mentioned that they do not have enough time during the day to focus on exercises;
• Several students from Groups B and C mentioned that content & exercises tend to accumulate and snowball into a large number of basic difficulties, making it harder to keep track of assignments and projects on time;
Video consumption:
• Students mentioned having trouble finding live class recordings in the current interface;
• Students felt a lack of support resources in videos and during live class participation on Zoom, such as audio transcriptions and the code/exercises demonstrated by teachers;
• When browsing live class recordings in the recordings interface, students had trouble finding a specific day;
• The recordings interface was not mobile-optimized, which made studying outside school hours, or when away from the desktop, more troublesome;
• Students complained about the lack of information on each video: the title alone was not enough to remember what the class was about, or whether it would be useful for their current studies.
Routine organization:
• Students mentioned having trouble organizing their study routines; getting lost in the content listing was commonplace for most students in Group C;
• When browsing the content agenda interface, most students had trouble finding specific days, with the exception of a few students in Group A;
Hypotheses and bets:
• We decided to experiment with a modular video / live-class recording component that could be reused throughout the content, which would help extend the number of videos throughout the learning experience;
• The new video-class page should have space to include video transcription and other complementary resources;
• The new video-class interface should be easier to browse and contain clearer information on each video;
• We decided students needed a clearer, more straightforward way to keep track of their exercises and assignments;
• The new learning interface should have more variety between text, exercises, and videos. The curriculum/learning team should be able to seamlessly incorporate exercise/quiz components into text content to create a more dynamic experience with learning assessment;
• Exercise difficulty scale would have to be revised by the Curriculum team and stakeholders, and if necessary, be re-arranged across the content;
✏️ Definition
Shaping a flexible content navigation structure
After collecting and documenting the research data, I used our newly acquired insights to guide the definition of what a flexible, high-quality learning experience should look like. The platform's structure would need to enable content modularity, address urgent user pain points, and be able to grow and change over the long term with high-quality code and navigational architecture.
While I collected and organized data, the curriculum team finished designing how Trybe's modular learning experience should be organized. Collaboratively, we organized a hierarchy tree to summarize this experience:
• Module: This structure stayed the same, dividing Courses into major study areas;
• Section: Similar to blocks, this groups a part of the module's content, and each section can host several lessons inside;
• Lesson: The equivalent of one day of learning content.
• Learning Objects: The smallest units within the structure; they can be text, quizzes, exercises, videos, and more. Each learning object is a modular unit that can be recycled throughout the curriculum and other courses, allowing the Learning team to build courses at scale. (A type sketch of this hierarchy follows the list.)
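To make the four-level hierarchy concrete, it could be modeled roughly as the TypeScript types below. Only the Module → Section → Lesson → Learning Object structure comes from the project itself; the type and field names are my assumptions.

```typescript
// Illustrative model of the curricular hierarchy; names are assumptions.
type LearningObjectKind = 'text' | 'quiz' | 'exercise' | 'video';

interface LearningObject {
  id: string; // a stable id lets the same object be reused across courses
  kind: LearningObjectKind;
  title: string;
}

interface Lesson {
  title: string;       // the equivalent of one day of learning content
  objectIds: string[]; // references, not copies, so objects stay reusable
}

interface Section {
  title: string;       // groups part of a module's content
  lessons: Lesson[];
}

interface Module {
  title: string;       // a major study area, e.g. "Front-End"
  sections: Section[];
}

interface Course {
  title: string;
  modules: Module[];
}

// A platform-level pool of learning objects, shared across all courses.
type LearningObjectPool = Record<string, LearningObject>;
```

Keeping learning objects in a shared pool and referencing them by id, rather than embedding copies inside lessons, is one way to support the cross-course reuse described above.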
Benchmarks
With a flexible learning structure defined by stakeholders, we decided it was time to shape the platform's new navigation tree. Thus, with the collaboration of a User Researcher and a Designer from the Operations team (who would be designing the back-office and content organization architecture), we got to work.
At first, the three of us conducted a desk research study of platforms like Alura, Coursera, Udemy, Rocketseat, and Reforge, with the intention of collecting references and identifying common patterns in the navigation structure of LMS/learning/course platforms.
We printed screens and laid out the navigation of our benchmarks on a Figjam whiteboard that we studied and commented on.
Subsequently, we collaboratively discussed and designed Trybe's new navigation structure; each designer would draw an individual tree containing their hypothesis of what an intuitive hierarchy for the new LMS would look like.
What we all agreed on was that the new structure would need to contain:
• A homepage/hub
Where students would be able to continue the courses they enrolled in, explore new courses, and have access to their data;
• Personal profile and financial/payment area
An area where students would be able to edit and upload their data, verify their employment status, and make payments;
• Study area / In-course content navigation
That should be replicated across all courses;
• A career assessment area
The area where students would be able to find job listings and career-related content;
• A Help area
With links to Zendesk, different FAQs, and Student Support access;
🗺 Tree Testing
To assess the effectiveness of our structure and test our hypotheses, we decided to run asynchronous tree testing sessions.
We picked a sample of 15 students plus 15 candidates from a marketing lead database and divided them into three groups of 10 people each; each group would test one of the navigation trees we had shaped, and participants took the tests individually.
We decided to use the Maze app to create & send the test to our participants and to also gather quantified results.
The tree test was conducted as a questionnaire with specific objectives that would help us understand user mental models and the findability of platform features and learning content. Test objectives included finding the user's payment profile and specific ISA features, finding a specific content day within the flagship course (which involved traversing the module structure), finding job listings, consulting school rules within the help menu, etc.
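As a rough illustration of how such a test is scored, the sketch below models each task as an expected path through the tree and counts direct successes. The task prompts and node names are hypothetical; in practice, Maze produced these metrics for us.

```typescript
// Illustrative tree-test scoring; tasks and paths are hypothetical.
interface TreeTestTask {
  prompt: string;         // what the participant is asked to find
  expectedPath: string[]; // the intended branch through the navigation tree
}

interface TaskResult {
  taskPrompt: string;
  chosenPath: string[];   // the path the participant actually traversed
}

const exampleTasks: TreeTestTask[] = [
  { prompt: 'Find your payment profile', expectedPath: ['profile', 'payments'] },
  { prompt: 'Find a job listing', expectedPath: ['careers', 'job-listings'] },
];

/** Direct success: the participant followed the expected path exactly. */
function isDirectSuccess(task: TreeTestTask, result: TaskResult): boolean {
  return (
    task.expectedPath.length === result.chosenPath.length &&
    task.expectedPath.every((node, i) => node === result.chosenPath[i])
  );
}

/** Success rate for one task across all participants in a group. */
function taskSuccessRate(task: TreeTestTask, results: TaskResult[]): number {
  const attempts = results.filter((r) => r.taskPrompt === task.prompt);
  if (attempts.length === 0) return 0;
  return attempts.filter((r) => isDirectSuccess(task, r)).length / attempts.length;
}
```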
Test results & final navigation tree
Using the results collected from Maze, we not only assessed the effectiveness of the different navigation trees on each specific task but also observed which parts of the hierarchy would need extra attention during design. We combined the best pieces of each navigation tree into a final version and validated it one last time with another sample of five different students before wireframing. It's worth noting that we also scoped time to conduct usability tests on the wireframes.
🗺 Designing a Cross-Navigation Menu
Materializing a multi-faceted navigational structure
To enable users to easily navigate across several different courses, career assessment options, job listings, personal info, and more, I realized we would have to design a simplified, cross-section navigation menu.
While we ran the final test and consolidated our navigation tree, I called on other team members and designers to participate in collaborative co-creation sessions to design the new menu. Considering our time limitations, I employed a simplified version of the Design Sprint process across two days, with one asynchronous moment for research and benchmarking in between.
Preparations
Below is a quick summary of how I organized and facilitated the collaborative design sessions with multiple participants, and how we arrived at a satisfactory result for this component.
Intending to unite a variety of views and skillsets, I settled on inviting different professional profiles to participate, contacting them one week ahead of the start of the sprint.
• Two Product Designers from different teams. One designer was from the ISA/Payment Operations squad and another from the Student Admissions/top-of-funnel squad;
• One Product Manager, from my Squad;
• One Tech Lead, from my Squad;
• One person from the Learning Operations team, who teaches, helps, and guides students directly throughout their routine. The person was a Web Development (Front-End Module) teacher who conducts live classes with students;
• One Front-End Developer and one Back-End Developer from my Squad. Both would be directly involved in the construction of the component.
I organized the process across two consecutive days, with one asynchronous moment in between. I used Figma's Figjam as our whiteboard.
Day 01 [1 hour]
Synchronous moment / 1hr:
10 minutes / context: I opened with a moment to contextualize the team on the platform's new design and navigational structure, leaving some space for quick questions.
20 minutes / usability problem discussion: Subsequently, we moved to the first brainstorm frame, which contained the question: "Besides the need for a new navigational structure, what are the problems with our current menu?" This frame was intended to surface the most urgent usability problems we would be looking to change.
All participants listed the different issues and problems with our current menu design on sticky notes; finally, we organized the sticky notes into groups by affinity and voted on the most urgent issues.
20 minutes / brainstorm: The last frame contained the question: "What would the ideal menu need?" This brainstorm was intended to collect insights on usability improvements we could prioritize in the new design. Again, all participants listed different solutions and improvements on sticky notes; then we grouped the notes by affinity and voted on the most impactful points. At the end of this brainstorm, we discussed our insights, for which I planned another 20 minutes.
5 minutes / closing & next steps: I asked the team to collect visual benchmarks of good menu designs to use in the next workshop.
I deliberately left an extra five minutes of buffer, but even then we surpassed the meeting time by a couple of minutes, which motivated me to extend the next meeting by ten minutes, just in case.
Day 02 [1 hour, 40 minutes]
On Day 01, I had asked everyone to collect visual benchmarks of good menu designs. As expected of a busy team at a tech startup, not everyone gathered their benchmarks on time. I decided to (gently) remind the participants who had not done their "homework" yet via Slack a couple of hours before the session, understanding that I would inevitably need to squeeze in a few minutes of benchmark-searching at the beginning.
Synchronous moment / 1hr:
I started with a 5-minute refresher of the previous meeting, then added 10 more minutes for everyone to observe the benchmark shots we had collected. We added notes highlighting the positive aspects of each menu design and details we could incorporate into ours. Participants who had not collected their benchmarks on time could fetch a few pictures during this window as well;
20 minutes / benchmark discussion: Each participant would briefly present their benchmarks and notes with the intent of creating discussion and collecting ideas for the next step;
10 minutes / Crazy 8s: With our benchmarks and notes ready, this was the moment to start ideating with the Design Sprint's Crazy 8s method:
Each participant would grab a sheet of paper, fold it into eight frames (one horizontal fold and four vertical columns), and draw an idea of the product in each frame, with one minute per frame.
The idea is to detach from perfectionism and just get multiple ideas out;
10 minutes / Crazy 8s presentation: Each participant quickly presented the ideas they had sketched;
10 min / final sketch: After the Crazy 8s and presentations, it was time to sketch a cleaner solution, closer to what we would expect to see in an ideal navigation menu. Each participant sketched out their idea of an actual prototype, and the majority converged on relatively similar designs;
15 min / final presentations: Finally, each participant presented their prototype sketch and explained the ideas behind it. In the end, we all voted on what we believed was the most effective design;
05 min / closing & next steps: I added a frame to collect feedback that I could act on the next time I facilitated a sprint or workshop;
I clarified to all participants that, although we held a vote, all sketches were valuable ideas for the team to work with.
✏️ Wireframing and Validation
With a validated guide to our navigational structure, we decided to start with simple wireframes of all interfaces across the platform. Although this seemed like a large number of screens to design, we would be reusing and repurposing some of the interfaces already available on the former platform; besides, my main priority was to focus on the content and learning assessment areas of the LMS.
Menu prototype
Before wireframing all interfaces for validation, I decided to start by designing the new navigational menu based on our collaborative studies. Given the wide range of contexts and hierarchies, we settled on a side-navigation menu with a header for the user profile and notifications.
Header-menu:
1: Menu toggle: On desktop, the menu is open by default; upon clicking, it collapses into an icon-only sidebar. On mobile screens, the menu is hidden by default and overlays the content when opened.
2: Logotype: Upon clicking, the student is taken back to the LMS' hub/homepage area.
3: The student profile dropdown holds cross-platform areas such as the profile, payment profile, logout options, etc.
The side menu features content areas separated into three hierarchies:
4: General: Standard platform areas such as the Homepage & Student Career Profile;
5: Learning area: When inside the context of a specific course, this section appears, showing every page within that course;
6: Course support: When inside the context of a specific course, this section shows student absences & the support/help area. (A sketch of the menu's state logic follows below.)
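As a minimal sketch, the open/collapse and context-dependent behavior described above could be modeled like this; the type and function names are illustrative, not taken from the actual codebase.

```typescript
// Illustrative state model for the cross-navigation menu.
type MenuState = 'expanded' | 'collapsed' | 'hidden';

interface NavContext {
  isMobile: boolean;
  activeCourseId?: string; // set when the student is inside a course
}

/** Mirrors the default behavior above: open on desktop, hidden on mobile. */
function defaultMenuState(ctx: NavContext): MenuState {
  return ctx.isMobile ? 'hidden' : 'expanded';
}

/** The toggle collapses to an icon-only sidebar on desktop and shows/hides
 *  the overlay on mobile. */
function toggleMenu(state: MenuState, ctx: NavContext): MenuState {
  if (ctx.isMobile) return state === 'hidden' ? 'expanded' : 'hidden';
  return state === 'expanded' ? 'collapsed' : 'expanded';
}

/** Course-specific sections (5 and 6 above) only render in course context. */
function visibleSections(ctx: NavContext): string[] {
  const sections = ['general'];
  if (ctx.activeCourseId) sections.push('learning-area', 'course-support');
  return sections;
}
```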
Wireframing
With a base design for the menu and a validated navigation tree, I started sketching and wireframing interfaces for each area of the platform. Given our time scope and the number of interfaces, I intended to quickly lay out the essential features of each area so we could move to validation in collaboration with other teams as soon as possible, since the Career Assessment, Operations, and Finance squads would also have to re-adapt their respective contexts and interfaces inside the new LMS.
🧪 Usability Testing
With the initial wireframes ready, I decided it was time for a final round of validation with users. While the tree testing had validated the structure we intended to design, we still needed to verify the interface experience and the findability of every area students would have to access.
Method
For the usability test, I replicated the same activities as our earlier tree test, since it had sought to validate the findability of every step of the user journey.
I organized all tasks in a spreadsheet that tracked the effectiveness of each one. Besides these simple activities, I also asked users to tell me how they perceived each interface.
The spreadsheet listed:
• If the user accomplished the task;
• How easily the user accomplished the task, as a measure of interface efficacy: whether the user completed it with relative ease, without trying more than a couple of times, without clicking at random to advance to the next step, and without needing me to redirect them;
In other words, the interface would be considered efficient based on how easily users managed to accomplish tasks.
Each user had a specific page in the spreadsheet, where I would mark if they accomplished the task and if they did it with ease.
After the test, each student received a standard System Usability Scale (SUS) questionnaire in Google Forms to evaluate the interface based on the test.
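SUS scoring follows a standard formula: each of the 10 items is answered on a 1–5 scale; odd-numbered (positively worded) items contribute their score minus 1, even-numbered (negatively worded) items contribute 5 minus their score, and the sum is multiplied by 2.5 to land on a 0–100 scale. A small sketch:

```typescript
/** Standard SUS scoring for one participant's 10 responses (each 1–5). */
function susScore(responses: number[]): number {
  if (responses.length !== 10) throw new Error('SUS expects exactly 10 responses');
  const sum = responses.reduce((acc, score, i) => {
    // i = 0 is item 1 (odd-numbered), so even indexes are positively worded.
    const contribution = i % 2 === 0 ? score - 1 : 5 - score;
    return acc + contribution;
  }, 0);
  return sum * 2.5; // scales the 0–40 sum to a 0–100 range
}

/** The reported result is the mean SUS score across all participants. */
function meanSus(allResponses: number[][]): number {
  if (allResponses.length === 0) return 0;
  const total = allResponses.map(susScore).reduce((a, b) => a + b, 0);
  return total / allResponses.length;
}
```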
Grouping
I recruited five students from Groups B and C who had not participated in any previous discovery or validation rounds. As with the interview process, I contacted students via e-mail with a short form and a Calendly link.
Test results and iterations
SUS (System Usability Scale) result: 92.2 (excellent)
Given the overall results and the final SUS score, we determined the test was successful overall, aside from a few urgent iterations we included in the final screens (which I will describe shortly) and several minor improvements we put in the team's backlog for later prioritization.
🎨 Final Prototypes: Styles
Design tokens & styles
Before designing the final screens, I revised our Design System's global tokens and styles to better guide my final prototyping process.
All the design tokens were structured and revised by me and a senior accessibility designer while we simultaneously worked on our squads' projects.
🎨 Final Prototypes
LMS homepage and course dashboard
This is the area where the user arrives upon logging in. The homepage is divided into three main sections:
01: Access to Current courses: Courses the student is enrolled in. Upon clicking a card, they are taken to that course’s context so they can study.
02: Access to the Career assessment area: Only appears if a student is enrolled in a course. This area is owned by the Career Assessment Squad. This is where users can see job listings, learn more about hiring partners, etc.;
03: Explore Section: In this section, users can explore a variety of different tech courses to enroll in. If a user is not yet enrolled in a course, this is the only section that appears, with a Featured Courses area. This area is owned by the Student Admission and Top of Funnel squads.
Course dashboard page
When a student is enrolled in a course, they are able to access its content through the "Your Courses" section on the homepage. This is the Course Dashboard, where users can check their overall progress, explore lessons and content in general;
01: This is a card that directly links the student to the latest lesson they have seen;
02: All Course Modules plus progress data. Upon clicking a Module card, the student is taken to that module's page;
03: A project & assignment list plus progress data;
Course module page
Upon accessing a Course Module through the content page, the user will have access to a list of all the sections plus the lessons they can explore within that module.
01: Students are able to find specific lessons using a search bar;
02: Students can view a list of all sections within that module. Upon clicking a section, a dropdown with a list of lessons opens up. By clicking one of the lessons, the student is taken to a learning interface. Content that the student has completed has a "checked" status.
Lesson page
Upon entering a lesson, be it through the "Latest Lesson" card on the Course Dashboard or through the lesson listing on the previously shown Module page, the student is able to read, study, and interact with a lesson's content;
The intention behind this page's design was to create more breathing room and focus, with tokenized spacing and fewer elements.
01: Upon entering a lesson's interface, the side menu dynamically changes into a lesson menu to suit the new context. It lists all lessons within the section the user is in, allowing them to easily navigate through the content;
02: Breadcrumb allowing students to go back to the content listing page;
03: Video-class module is a Vimeo embed; the student is able to control the video's speed, activate captions, enlarge the screen, etc. On mobile screens, upon clicking on the video module, it overlays the page's content;
04: A floating control bar allows the student to switch between the previous and next lessons; it is also accessible and controllable via the arrow keys (sketched below). On mobile devices, it fills the bottom of the screen, so it retains an accessible clickable area of at least 40px.
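The arrow-key behavior could be wired up roughly as follows, assuming plain DOM APIs; the callback names are hypothetical.

```typescript
// Minimal sketch of the control bar's keyboard navigation.
function bindLessonNavigation(
  goToPreviousLesson: () => void,
  goToNextLesson: () => void,
): void {
  document.addEventListener('keydown', (event: KeyboardEvent) => {
    // Ignore key presses while the user is typing in a form field.
    const target = event.target as HTMLElement;
    if (target.tagName === 'INPUT' || target.tagName === 'TEXTAREA') return;

    if (event.key === 'ArrowLeft') goToPreviousLesson();
    if (event.key === 'ArrowRight') goToNextLesson();
  });
}
```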
In-course context: Projects and Assignments Dashboard
This is the area where students can keep track of their progress on projects and assignment deliveries. It's accessible via the assignments tile on the Course Dashboard page or through the "Projects" item on the side nav.
The assignments are presented in a list of horizontal cards that include the name, deadline, and progress status (in progress / approved / in recovery).
Upon clicking a card, the student is taken to that assignment's page, where they can see the details of their overall progress, their grades, and access to the repository where they can deliver their work, as shown below.
In-course context: video class recordings
Upon clicking the "Live classes" option on the side-navigation menu, the student is able to view a list of all their live-class recordings;
01: Students can watch the recording of their latest class;
02: A search bar in order to optimize class findability;
03: Students can filter recordings by module & also by most/least recent;
04: Smaller video thumbnails containing more information on each class, which are also easier to adapt to a mobile interface. Upon clicking a thumbnail, the student is taken to a video interface containing the video's transcription.
🚚 Delivery
With each step of the process continuously reported to and validated by stakeholders and the company board, my squad and I presented the final interfaces of the LMS (with a summary of the entire process) to the rest of the company.
To deliver the blueprint for development, I documented the entire project by context/platform area so we could better lay out our priorities. I started with a single page containing a general view of all screens (with respective screen-flow navigation arrows) before specifying each LMS area on separate pages, all documented and specified within Figma.
Documentation
Final Prototypes grouped by context & functionality:
• General Homepage
• Course Content / module / lesson pages
• Video Class Page
• Project & Assignment Page
Hand-Off
Every Final Prototype Page contained:
• The final interfaces for every flow, connected by arrows for clarity.
• Above each prototype, its respective Jira Task number & link.
• Below each prototype, a quick description of the interface.
• Responsive screens
• Components and behaviors explicitly pointed out with tooltips for developers.
Rollout and development
Finally, the LMS was set to roll out by the end of Q2/2022, before the C++ course launched. During Q4/21 and Q1/22, our team would develop, test, and validate each stage of the platform, starting with the Course Content, Module, and Lesson pages.
During the rest of the development process, I validated the front-end work alongside the team's Product Manager as I worked on other projects and initiatives.
✨ Experiments & Conclusion
Finally, the LMS would roll out in three phases:
• Phase 01: Internal Teams
Timeframe: Q1/22
Only developers, designers, PMs, and stakeholders gained access to the LMS platform, so they could report bugs, errors, and other urgent changes;
• Phase 02: Experiment with Cohorts
Timeframe: Q1/22
We observed the engagement rates of two different cohorts that were halfway through the Front-End Module and about to reach the point where engagement typically decreases near its end. One cohort, the Experimental Group, would test the new LMS' improved layout and structure while studying; the other cohort, the Control Group, would continue their studies on the former platform.
As the Experimental Group would be testing a brand new platform, we chose to roll out the new LMS earlier in their Front-End journey to allow users to grow accustomed to the new experience and layout first. We also made feedback widgets available to the Experimental Group so they could report bugs and other problems to the team. Nearing the end of the module, we observed a significant rise in engagement from the Experimental Group, even as they reached its final projects. While many students did lose engagement and a small percentage weren't approved on their first try, student engagement was over 30% higher than the Control Group's.
• Phase 03: Final Release
Timeframe: Q2/22
Finally, after measuring the results from the second phase, we released the LMS platform to our entire user base, also installing widgets to continuously measure CSAT and to complement future initiatives within the product.
Thank you!