Measuring experiences

Sensor-based observational research

Bridging the quantitative-qualitative divide in research using sensors, digital tracking and design research methods 

We live, work, play and learn in a world where products, services, companies, individuals, groups and just about every other unit of organisation are vying for as much of our attention as they can get. Companies spend billions of dollars annually to position their often irrelevant and contextually unaware products and services in every facet of consumers' lives, regardless of the effect this may have on our mental wellbeing and cognitive functions. Mobile devices beep, shake and light up with notifications, bots invade our email inboxes with lifestyle offers, and increasingly aggressive public-space advertisements herald the coming of a new future "for you and just for you". In short, we are experiencing an unrelenting onslaught of digital stimuli that grows ever more invasive of our lives and minds.

This invasiveness penetrates every part of our lives, but it is particularly worrisome when it comes to education and our ability to learn under such circumstances. It is important that we begin to ask whether digital tools are helping or hindering the way we need to learn. Given that digital devices have become learning tools, how are people experiencing this form of distracted learning, and what can we do to support learning that is both effective and keeps our minds healthy?

Our ability to focus, and to remain focused for extended periods of time, is under attack. As a result, teachers take extraordinary measures to reduce the distracting effect of technology in their classrooms by banning device use and pushing back against the introduction of devices into the curriculum. At the time of this research, there were close to 1 million mobile tablets sitting unused in Chicago Public Schools for this very reason. Teachers simply did not have a way to understand the effect of devices, and their use, on students' ability to learn. We set out to take a small step towards bridging this gap in knowledge by orchestrating a study that used innovative observational research methods to build a bottom-up model of distracted learning.

This applied research project was designed to use existing technology and both quantitative and qualitative research approaches to measure people's behaviours and understand the meaning behind them. We set out to build a robust understanding of "distracted learning" by sensing focus and distraction in the way graduate students interacted with their digital ecosystems, and by comparing their behaviours across various learning environments such as home, study areas and class.

Figure 1. The study focused on distracted learning by understanding behaviours in context, using a combination of methods and tools adapted to measure the various components (in white)

Our study focus area (Figure 1) and hunt statement (Figure 2) were the result of extensive secondary research as well as deep experimentation with technology. In hindsight, one cannot overstate how important, and how different, it is to formulate research questions for instrumented studies involving unstructured data such as usage logs spanning many types of digital applications. The outcomes of this project were: a) the logic for a mathematical algorithm for sensing focus and distraction through digital tracking tools that can be scaled to a larger study population, and b) a new method for bridging the gap between quantitative and qualitative methods and insights in user research.

Figure 2. A detailed hunt statement is an essential tool for framing the project and avoiding scope creep throughout the many decisions you must make in data-driven insight projects

The study tracked the students' interactions with their mobile, tablet and desktop devices by recording their "live" application usage second by second with the RescueTime productivity tracking software. Additionally, we used IFTTT for cataloging instances of communication, T2 Mood Tracker for measuring experiences (experience sampling), the Chronos app for tracking location, and self-assessment tools to understand more about the context of their everyday lives. All these data sources were corroborated through several rounds of in-depth interviews and data reviews in which we created an environment where participants could interact with their data and help us explore the meaning behind their behaviours.
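
To make that corroboration step concrete, here is a minimal sketch of fusing the streams onto a single timeline with pandas. The file names and column layouts are hypothetical stand-ins; the real exports from RescueTime, IFTTT and Chronos each have their own schemas.

```python
import pandas as pd

# Hypothetical exports, each with at least a 'timestamp' column.
usage = pd.read_csv("rescuetime_log.csv", parse_dates=["timestamp"]).sort_values("timestamp")
comms = pd.read_csv("ifttt_events.csv", parse_dates=["timestamp"]).sort_values("timestamp")
places = pd.read_csv("chronos_places.csv", parse_dates=["timestamp"]).sort_values("timestamp")

# Attach the most recent known location to every usage record, so an
# interview can ask "what were you doing, and where, at 14:32 on Tuesday?"
timeline = pd.merge_asof(usage, places, on="timestamp", direction="backward")

# Flag any second that falls within a minute of a logged communication event.
timeline = pd.merge_asof(
    timeline, comms.assign(comm_event=True)[["timestamp", "comm_event"]],
    on="timestamp", direction="nearest", tolerance=pd.Timedelta("60s"),
)
timeline["comm_event"] = timeline["comm_event"].fillna(False)
```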

In order to build a holistic view of distracted learning, we first had to understand what focus and distraction even look like in the data, how different focus and distraction may be across various contexts, what types of digital activities may influence distraction, which devices and applications (and patterns of use) enable distraction, and whether there are any significant patterns across our participant sample.

86,400 seconds per day x 7 days x 4 participants = 2.42 million data points per device tracked in RescueTime alone

The quantitative evidence collected with the various tracking tools was brought into qualitative interviews and data review sessions with the research participants, which allowed us to iteratively analyze the data and attribute user-defined categories to the raw, unstructured records. Figure 3 shows a snippet of RescueTime data after it has been cleaned, merged, tagged and prepared for visualisation and analysis.

Figure 3. A 45-second snippet from a fully prepped RescueTime data file
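
The tagging step amounts to a lookup from raw application names to the categories participants assigned during the review sessions. A sketch, with an illustrative mapping rather than the participants' actual categories:

```python
import pandas as pd

# Illustrative mapping; the real categories were defined by participants.
CATEGORY_MAP = {
    "Microsoft Word": "Schoolwork",
    "Google Docs": "Schoolwork",
    "Gmail": "Communication",
    "Facebook": "Social Media",
    "Spotify": "Entertainment",
}

def tag_usage(raw: pd.DataFrame) -> pd.DataFrame:
    """Clean a raw per-second usage log and attach participant-defined tags."""
    df = raw.dropna(subset=["app"]).copy()
    df["app"] = df["app"].str.strip()
    df["category"] = df["app"].map(CATEGORY_MAP).fillna("Uncategorised")
    return df
```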

Figure 4 shows the aggregate data of Shilpa, one of our student participants: seven days of time spent on each of her devices, the distribution of her digital activity across various self-assigned activity categories, and the relative amount of time spent on 'Schoolwork' activities across her devices. This snapshot view of her activity is a good indicator of overall usage, but it falls short of capturing focus or distraction, because participants experience those concepts as slivers of everyday digital life compared to the aggregates of moments shown in Figure 4.

Figure 4. An overview of the participant's aggregate data illustrating the relative amount of time spent on her devices and on various categories of tasks
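
Given a tagged log, these aggregate views reduce to a few group-bys. The sketch assumes a hypothetical tagged_log.csv with timestamp, device, app, category and seconds columns (seconds = 1 for a per-second log):

```python
import pandas as pd

# Hypothetical cleaned-and-tagged file of the kind behind Figure 3.
log = pd.read_csv("tagged_log.csv", parse_dates=["timestamp"])

hours_per_device = log.groupby("device")["seconds"].sum() / 3600
category_share = log.groupby("category")["seconds"].sum() / log["seconds"].sum()
schoolwork_hours_by_device = (
    log[log["category"] == "Schoolwork"].groupby("device")["seconds"].sum() / 3600
)
```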

Shilpa's daily use offers a more detailed look at the density of her digital activity. Figure 5 breaks her data down across days and devices, and again into activity types. It shows the overlap of device activities and lets us determine that Mondays and Wednesdays were particularly heavy laptop-use days; we already know her laptop accounts for close to 99% of all of her Schoolwork activity. We also know that she tends to use her mobile device towards the end of her day.

Figure 5. A time series view of all digital activity for Shilpa across the entire study period.
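
In tabular form, the day-by-device view behind Figure 5 is a single pivot over the same assumed file:

```python
import pandas as pd

# Total hours by day of week and device, the tabular cousin of Figure 5.
log = pd.read_csv("tagged_log.csv", parse_dates=["timestamp"])
log["day"] = log["timestamp"].dt.day_name()

daily_hours = log.pivot_table(
    index="day", columns="device", values="seconds",
    aggfunc="sum", fill_value=0,
) / 3600
# Heavy laptop-use days (here, Mondays and Wednesdays) stand out as the
# largest values in the laptop column.
```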

To truly understand the effect of distraction on a student's ability to learn, we must look very closely inside the periods when a student's likelihood of being in focus is highest. We asked our participants to tell us ahead of time when that might be, and they identified two general categories of what we dubbed "learning periods". Class time is a rather rigid learning period that usually takes place at fixed times in the week. Study or practical time is when students choose for themselves when to work on Schoolwork activities. Each has its pros and cons, and our goal was never to evaluate the validity of either type of education, but rather to understand how different focus and distraction might look for students in each.

Over the course of the study, Shilpa went through eight learning periods in five days (Figure 6). Since the learning periods she reported were of varying lengths, we chose to focus on the relative ratio of Schoolwork and non-Schoolwork activities in this initial view. As you can see, her Schoolwork activity during Learning Period 2 (Tuesday 2-6pm) was the highest of all, but in aggregate she spent the most time on Schoolwork on Wednesday, across two periods. We can hypothesise that those periods offer the best chance of observing focus unique to Shilpa, along with indicators of distraction around those stretches of extended focus.

Figure 6. Comparison of a participant's Schoolwork and non-Schoolwork activity for each of the self-reported learning periods
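
Because the reported periods vary in length, normalising to a share of time rather than raw seconds is what makes them comparable. A sketch, assuming a hypothetical learning_periods.csv listing the self-reported windows:

```python
import pandas as pd

# Self-reported windows: period_id, start, end (hypothetical file).
log = pd.read_csv("tagged_log.csv", parse_dates=["timestamp"])
periods = pd.read_csv("learning_periods.csv", parse_dates=["start", "end"])

rows = []
for p in periods.itertuples():
    window = log[(log["timestamp"] >= p.start) & (log["timestamp"] < p.end)]
    rows.append({
        "period": p.period_id,
        # With one row per second, the share of rows is the share of time.
        "schoolwork_share": window["category"].eq("Schoolwork").mean(),
    })

shares = pd.DataFrame(rows)
```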

Through interviews with our participants, we defined focus, or the absence of distraction, as a period of time during which one is deeply engaged in a single activity (a single application); as the rate of application switching increases, an individual's level of focus falls correspondingly. Identifying stretches of digital activity where a single application on a single device is used for an extended period would therefore signal focus, while areas of rapid switching between applications would signal distraction.
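
That definition translates almost directly into a rolling-window heuristic. The sketch below counts application switches in a trailing five-minute window and flags high-churn spans; the window size and threshold are illustrative choices, not values derived in the study:

```python
import pandas as pd

WINDOW = "5min"              # illustrative trailing window
MAX_SWITCHES_FOR_FOCUS = 3   # illustrative churn threshold

log = (
    pd.read_csv("tagged_log.csv", parse_dates=["timestamp"])
    .sort_values("timestamp")
    .set_index("timestamp")
)

# A switch is any second where the foreground app differs from the last one.
log["switched"] = log["app"].ne(log["app"].shift()).astype(int)

# Count switches over the trailing window and classify each second.
log["switch_rate"] = log["switched"].rolling(WINDOW).sum()
log["state"] = (log["switch_rate"] <= MAX_SWITCHES_FOR_FOCUS).map(
    {True: "focus", False: "distraction"}
)
```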

Looking deeper into each of the learning periods, we can determine where concentration was achieved in relation to non-Schoolwork activities, such as searching for inspiration or browsing the internet, as the first three hours of LP5 in Figure 7 indicate.

Figure 7. A holistic look at Shilpa's digital activity in a selected learning period representing periods of concentration related to non-Schoolwork activities

Figure 8, on the other hand, distinctly shows a period of concentration when Shilpa was engaged in Schoolwork activities. Even though the number of switches between applications was quite high, the data shows that those switches happened largely at the beginning and end of the learning period, when she had not yet fully focused on her task and when her concentration was waning, respectively. After looking at a week's worth of activity, we can start to build an individual view of Shilpa's concentration, and of the applications and scenarios that cause her to be more or less distracted.

Figure 8. Concentration on Schoolwork-related activities leaves a distinct digital footprint
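
The concentration blocks visible in Figures 7 and 8 can be recovered from the log as maximal runs of a single application, kept only when they exceed a minimum duration. The ten-minute cutoff here is an assumption for illustration:

```python
import pandas as pd

MIN_RUN = pd.Timedelta("10min")  # illustrative cutoff

log = pd.read_csv("tagged_log.csv", parse_dates=["timestamp"]).sort_values("timestamp")

# Each change of foreground app starts a new run; cumsum labels the runs.
run_id = log["app"].ne(log["app"].shift()).cumsum()
runs = log.groupby(run_id).agg(
    app=("app", "first"),
    category=("category", "first"),
    start=("timestamp", "min"),
    end=("timestamp", "max"),
)
runs["duration"] = runs["end"] - runs["start"]

concentration_blocks = runs[runs["duration"] >= MIN_RUN]
```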

From information gathered in this way from Shilpa and the other participants, we can begin to build mathematical models that automate the detection of focus and distraction based on how long a person stays engaged with particular types of digital applications or tasks. Much more detail can be derived from this study's data: by analysing combinations of digital applications, by sub-categorising activity based on the duration of time spent in each one, or by scaling the study to a larger number of people and a greater variety of learning contexts. Such analyses lead to models like the one below, which can inform the logic of the algorithms built to interpret users' activities and, accordingly, the way responsive services might work. As an example, we built an interpretive model of Shilpa's activity to illustrate how a model might work based on our findings. We identified three distinct ways Shilpa's context influences her ability to focus.

We looked for distraction and its triggers, but found different lenses to identify concentration through task duration.

First, we observed that the overall intensity of task switching differed across the days of Shilpa's study. Her end-of-day debrief forms indicated that the days on which her rate of switching was fast and erratic were the ones where she travelled and switched learning environments frequently, even though she had enough time to dive into a concentrated state in each of those places. This suggests there may be a particular "day switch" profile to everyone's digital activity, in which the ability to concentrate is compromised almost by the design of the day itself. Understanding something about one's own activity may help a person overcome such a daily challenge by adjusting their approach in some way.

The second part of our model was driven by the observation that the nature of the learning period significantly changed Shilpa's digital activity. Her ability to focus was much more pronounced, and sustained for longer, in 'flexible' periods she designated for herself than in fixed periods at school. Even though this was based on a small data set, we could begin to ask how rigid school learning environments should be if students exhibit a greater degree of concentration everywhere other than school. More needs to be understood before making such a claim with any degree of certainty.

Finally, we can determine with some certainty that switching between Schoolwork applications and non-Schoolwork ones, as well as between devices, has a lasting effect on a student's ability to concentrate after the switch. We observed that distraction events related to particular task categories, such as email, messaging and social media checks, as well as web browser use, were especially disruptive.
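
To illustrate how these three observations might combine into the algorithmic logic described earlier, here is a toy scoring function. The features, weights and thresholds are entirely illustrative; a real model would be fit on the scaled-up data this study argues for:

```python
def focus_likelihood(day_switch_rate: float,
                     flexible_period: bool,
                     recent_category_switch: bool) -> float:
    """Rough 0-1 likelihood that a student can sustain focus right now.

    All weights below are illustrative placeholders, not fitted values.
    """
    score = 1.0
    # 1) Erratic, travel-heavy days compromise focus by design.
    score -= 0.3 * min(day_switch_rate / 10.0, 1.0)
    # 2) Self-designated (flexible) periods sustained focus better than fixed ones.
    score += 0.2 if flexible_period else -0.1
    # 3) A recent Schoolwork/non-Schoolwork or device switch has a lingering cost.
    if recent_category_switch:
        score -= 0.25
    return max(0.0, min(1.0, score))
```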

This study, albeit small in scale and hardly conclusive, shows the potential of exploring the realities of our everyday digital existence and the connection the virtual world has with the physical environment. What is most interesting to me is the profound impact of digital technology on our ability to concentrate, and also the effect of the context of use on the way those technologies are used. Even though it was not explicitly a part of our study, we explored a world of "slivers of everyday life", or in other words micro-interactions, that clearly were a huge part of the way our participants make sense of their everyday world. These interactions and their pervasiveness in our everyday lives constitute a well-established culture of digitalism that requires a new approach: a methodology that includes both quantitative and qualitative tools. I intend to continue to develop and incorporate this approach into my professional work.