Empowering People to Leverage Personal Data with Learner-Centered Tools

I am a Human-Computer Interaction (HCI) researcher committed to empowering people through learning - by helping them gain insights from personal data and online information to enhance their understanding and decision-making when tackling complex or unfamiliar topics. At the core of my work is a belief in taking a holistic view of learning - one that includes mastery-based learning for vocational skills, spontaneous exploration for personal hobbies, and critical engagement with information for better decision-making in everyday life. To support this vision, I draw on my expertise in HCI, Learning Sciences, and Information Visualization to develop and study self-monitoring technologies that leverage individual learning histories to promote self-directed learning while fostering self-reflection.

My work has been published in top-tier Human-Computer Interaction and Visualization venues such as CHI, TVCG, VIS, VL/HCC, GI, and Learning@Scale.

Leveraging Personal Data to Support Self-Directed Learning of Complex Computational Skills

A core part of my research draws on personal informatics work in health, well-being, and productivity, together with theories from the Learning Sciences, to design and develop tools that help individuals leverage their personal data while learning complex skills with online resources.

Figure 1: MILESTONES is implemented as a browser plugin and accessed through a side-bar. a) A recording button allows users to start/stop recording their study sessions (URL, time, tags, and bookmarks). b) MILESTONES allows associating tags with currently active web-resources. c) Input for assigning a new tag to the current web-resource. Three interactive visual overviews of the collected data show time spent via Time Pulse (d) (Fig. 2), tagged resources via Cue-Connect (e) (Fig. 2), and automatically categorized resources via Sortify (f) (Fig. 3). g) A page action button allows the side-bar to be opened and closed on demand.


A Semi-Automated Approach to Tracking Progress in Learning

In my dissertation, I designed, implemented, and deployed MILESTONES, a novel self-monitoring tool for learning. The inspiration behind MILESTONES stemmed from one of my formative studies, which showed that learners of complex computational skills tend to take a spontaneous, trial-and-error approach to "just get things done," with little opportunity to reflect on their choices, making it challenging to gauge their progress [4]. Therefore, I designed MILESTONES [1] to support this spontaneous approach to learning while introducing opportunities to pause for feedback and reflection, by leveraging semi-automated data recording and on-demand interactive visual overviews (Figure 1). MILESTONES is implemented as a browser plugin that can record activity on any web page. While recording is on, MILESTONES updates the interactive overviews in real time to present the data in three different ways (Figure 2 and Figure 3).

Figure 2: 1) The Time Pulse is a fully automated, calendar-based overview that presents the categorized web-resources in the order in which they were accessed. Users can see an overview of their week (3 days shown in the image) and can also view one day at a time. 2) Cue Connect organizes the cue-tags that users associate with web-resources. a) Cue-tags appear as a collection; users can view and visit the collection of web-resources that share the same tag by using the cue-tags as a filter. b) The curated web-resources are displayed alongside the collection of tags, giving users an overview of their chosen subtopics and their hand-picked associated learning resources.


The recording feature in MILESTONES is designed to strike a balance between reducing the burden of data collection and providing user control. For example, it integrates automatic tracking of user interactions (e.g., URL, time of access, duration of activity) while allowing users to make manual adjustments (e.g., determining when to start and stop recording). Specifically, MILESTONES allows learners to associate cue-tags (short, topic-based, action-oriented tags, e.g., "read-more-about-tensorflow") with the recorded web-resources. Cue-tags allow learners to curate resources while searching and to use them as filters for retrieving relevant resources through the visual overviews. To evaluate the ecological validity of MILESTONES, I deployed it with 17 participants on their personal devices over 2 months, generating 103 hours of recordings covering 1,700 web-resources overall. Each participant used MILESTONES for an average of 6 hours and created an average of 5 unique cue-tags. One key insight was that manually associating cue-tags with resources helped learners maintain their focus on specific topics and integrated easily into their workflows without disrupting them. Additionally, cue-tagging helped learners evaluate their progress by facilitating reflection in the context of their learning goals.
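To make the semi-automated recording concrete, below is a minimal sketch of how such a browser-extension recorder could combine automatic capture (URL, title, time of access, duration) with manual control (start/stop, cue-tags). This is an illustrative assumption, not the released MILESTONES code; names such as SessionRecorder, VisitRecord, and cueTags are hypothetical, and the chrome.tabs listener assumes a standard WebExtension background script with the "tabs" permission.

```typescript
// Hypothetical sketch of semi-automated session recording in a WebExtension
// background script. Not the actual MILESTONES implementation.

interface VisitRecord {
  url: string;
  title: string;
  startedAt: number;   // epoch ms, captured automatically
  durationMs: number;  // filled in when the page changes or recording stops
  cueTags: string[];   // user-assigned, e.g. "read-more-about-tensorflow"
  bookmarked: boolean;
}

class SessionRecorder {
  private recording = false;          // the user decides when to start/stop
  private current: VisitRecord | null = null;
  readonly visits: VisitRecord[] = [];

  start(): void { this.recording = true; }

  stop(): void {
    this.closeCurrent();
    this.recording = false;
  }

  // Called automatically whenever the active page changes while recording.
  onPageVisited(url: string, title: string): void {
    if (!this.recording) return;
    this.closeCurrent();
    this.current = { url, title, startedAt: Date.now(), durationMs: 0, cueTags: [], bookmarked: false };
  }

  // Manual control: associate a cue-tag with the currently active resource.
  tagCurrent(tag: string): void {
    if (this.current && !this.current.cueTags.includes(tag)) {
      this.current.cueTags.push(tag);
    }
  }

  private closeCurrent(): void {
    if (this.current) {
      this.current.durationMs = Date.now() - this.current.startedAt;
      this.visits.push(this.current);
      this.current = null;
    }
  }
}

// Wiring into the browser: record completed page loads while recording is on.
const recorder = new SessionRecorder();
chrome.tabs.onUpdated.addListener((_tabId, changeInfo, tab) => {
  if (changeInfo.status === "complete" && tab.url && tab.title) {
    recorder.onPageVisited(tab.url, tab.title);
  }
});
```

The division of labor in the sketch mirrors the balance described above: timestamps and durations are captured without user effort, while starting, stopping, and tagging remain deliberate user actions.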

Interactive Visual Overviews to Promote Self-Reflection on Learning Approaches

I included three interactive visual overviews to allow learners to explore and interact with their collected data on demand at any time during their study sessions. The collected data include automatically tracked attributes (e.g., webpage title, URL, time, duration) and user-generated ones (e.g., cue-tags and bookmarks). Each overview affords a different type of interaction with the recorded data. For example, the Time Pulse Overview (Figure 2.1) presents the data in a familiar calendar layout, allowing learners to switch between daily and weekly views. The Cue-Connect Overview (Figure 2.2) allows learners to use cue-tags as filters to revisit relevant resources. The Sortify Overview (Figure 3) uses an AI model to automatically categorize and color-code resources based on pre-determined broad groups (e.g., tutorials, videos, articles) and allows users to correct errors in categorization. MILESTONES presents the same underlying data through these three overviews so that learners can explore their data from different perspectives and discover patterns in their learning behaviors in light of their personal self-directed learning goals. One key insight from the deployment field study was that learners tended to refer to the overviews to reflect on their learning during a learning session, rather than before or after it. By allowing learners to view and interact with their data on demand in various ways, MILESTONES helped learners expand their perception of progress from productivity-focused metrics (e.g., task completion) to personally meaningful indicators aligned with their goals, such as continually evaluating the relevance of resources, gauging topic understanding, becoming aware of learning tendencies, and adapting learning approaches to better align with goals. This work was recently accepted for publication at CHI 2025.
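As an illustration of how multiple overviews can be derived from one underlying record, the sketch below shows two simple projections of the VisitRecord shape assumed earlier: a Cue-Connect-style filter over cue-tags and a Time Pulse-style grouping by calendar day. Function names are hypothetical and stand in for whatever the actual tool computes.

```typescript
// Illustrative projections of the same recorded data into two overview shapes.
// Assumes the hypothetical VisitRecord interface sketched above.

function filterByCueTag(visits: VisitRecord[], tag: string): VisitRecord[] {
  // Cue-Connect-style view: every resource the learner associated with this tag.
  return visits.filter(v => v.cueTags.includes(tag));
}

function groupByDay(visits: VisitRecord[]): Map<string, VisitRecord[]> {
  // Time Pulse-style view: resources in access order, bucketed by day.
  const byDay = new Map<string, VisitRecord[]>();
  for (const v of visits) {
    const day = new Date(v.startedAt).toISOString().slice(0, 10); // "YYYY-MM-DD"
    const bucket = byDay.get(day) ?? [];
    bucket.push(v);
    byDay.set(day, bucket);
  }
  return byDay;
}
```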


Figure 3: The Sortify Overview automatically categorizes all web-resources visited during a recorded session into broad groups (articles, lectures and videos, step-by-step tutorials) and specific resources (discussion forums, AI help, "my work"). a) The section headers display the aggregate time spent on each group and dynamically reorder in descending order of time, giving users a resource-based view of their time. b) Each section expands to show the specific web-resources the user visited during the recorded session. c) Users can also recategorize resources from within each section.

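The time aggregation described in the Figure 3 caption can be summarized in a few lines: total the recorded duration per category and order the section headers by descending time. The sketch below is a hypothetical rendering of that step only; in MILESTONES the category itself comes from an AI model and can be corrected by users, which is not modeled here.

```typescript
// Illustrative Sortify-style aggregation: total time per category,
// ordered so the most-used resource group appears first.

type Category = "articles" | "lectures and videos" | "step-by-step tutorials"
  | "discussion forums" | "AI help" | "my work";

function aggregateTimeByCategory(
  visits: { category: Category; durationMs: number }[]
): [Category, number][] {
  const totals = new Map<Category, number>();
  for (const v of visits) {
    totals.set(v.category, (totals.get(v.category) ?? 0) + v.durationMs);
  }
  // Descending sort drives the dynamic reordering of section headers.
  return [...totals.entries()].sort((a, b) => b[1] - a[1]);
}
```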

Understanding the Motivations and Needs of Diverse Learners

A core value that guides my research is taking an inclusive approach to designing technology. Understanding and addressing the needs of diverse users requires engaging with them directly and incorporating multiple perspectives throughout the design process. This value is reflected in my work as I employ participatory and iterative design-based methods to capture the needs of diverse users, actively involving them from the earliest design stages. I also seek insights from domain experts with hands-on experience in deploying functional tools. My studies with different stakeholders (e.g., individuals from various educational backgrounds, students, industry professionals, designers, and researchers) [1,2,3,4] have revealed key insights into how learners engage with online learning resources and support tools.

Figure 4: Three examples of paper-based visual overviews used in Study 1. Design idea #1, the muddy-point approach, is illustrated in Figures 1a and 1b. Design idea #2, goal-setting, is shown in 2a and 2b. The third column shows design idea #3, the vocabulary-based filter.


Challenges in Practicing Learner-Centered Design: Balancing Learners’ Needs with Experts’ Suggestions