
Daydream

Mostly ramblings

Paternalism 'lite'

I’ve recently subjected myself to a thought experiment. I spent a few months thinking about the potential of “big data” to create responsive government services that have a positive impact on life in modern cities. Simultaneously, I explored the implications of data collection and storage for our individual sense of privacy in the context of socio-technical systems. The contrast was jarring, and it made me think about the role of designers in this context.

The socio-technical approach recognizes the interaction between people and technology in various contexts, and the rate at which these systems are created has accelerated dramatically in the past 20 years. Looking at the landscape of product and service systems being built for both end consumers and workplaces, one would have a hard time finding a socio-technical system that does not operate under the premise that the more information it can collect and retain about the user, the better. Good examples are Quantified Self systems or workplace productivity management tools. They are essentially behavior modification tools that let users understand something about their behaviors, with the goal of changing that behavior to a more “desired” state.

I think that it is not always our job as designers to correct users’ behavior and, furthermore, that the assumption that anything is worth trading off for this “new desired state” requires a deeper contextual inquiry into the user’s information norms. More on the trade-offs in a little bit, but first I would like to briefly talk about what I think this means for our work as designers. I suspect we have all fallen victim, at some point in our work (I know I certainly have), to this underlying assumption about the role of data in socio-technical systems: the promise that data leads either to systemic behavior change or to the ability to accurately predict human behavior. Regardless of your position on whether systems can deliver on this promise, I don’t think many of you will disagree that it has led to an unquenchable thirst for more and more information. We are living in an infinitely more connected world where information sharing is no longer an option but a ubiquitous state of being. While writing this article I am creating a personal data exhaust that is being used by many systems foreign to me in both existence and purpose.

We, designers and users of these systems, are victims of what behavioral economists call “coherent arbitrariness”: when a product or service is introduced in a particular way, it becomes a permanent point of reference against which we compare all future relevant solutions. It illustrates the momentum that every idea carries once it is introduced. So, in a roundabout way, I think this unquenchable thirst and the momentum it has created have also led us to continuously re-anchor to a lower and lower expectation of privacy. This manifests itself in systems that are designed to penetrate ever deeper into our lives in search of more personal information to deliver on the above-mentioned promises.

Part of the reason this is happening is simply that it is difficult to visualize the trade-offs you make by “allowing” these systems to do their thing. To use behavioral economics terminology, the true cost of the convenience of Amazon’s predictive algorithm or Google’s robust analytics is hidden from us; it sits below our line of visibility. In addition, the nature of payment in these systems has changed: the currency is now personal information. This has created a cognitive degree of separation, much as a credit card payment is separated from our idea of money. I don’t often think twice about logging in to another service with my Facebook or Gmail account, because I don’t have a model, or access to information, that allows me to understand what this decision means. I would go even further and say that we don’t even consider this a transaction anymore, because how does one directly compare the value of one’s personal information to Google against the benefit of having access to a world of information and a powerful tool to maneuver it? It becomes too complex a task, and so we resort to the default, which is not to think about it.

The real trade-off you are making, however, is convenience and access to things in the present (present gains) versus a diminished ability to manage your individual identity in the future (future losses). The predictive nature of these services means they will know what you want (based on your behavior and that of people like you) before you do, which undermines your ability to control and express your own identity. This is just a simple case of shopping for goods online, but if you apply the same logic to a situation where Quantified Self principles are at play, it doesn’t take a huge leap of imagination to see how problematic this might become if, say, the Nike FuelBand strikes a partnership with a large insurance company.

Of course, as designers we cannot predict future arrangements between companies or how they will share information about users. One thing we can do, however, is recognize that we don’t know whether the people we are designing for would be OK with this trade-off if they understood how to compare the two sides of it. I think a new approach to the research and design of socio-technical systems could lead to more participatory meaning-making in big data management: one that considers and engages users earlier and more frequently in the research and design process, leading to more transparency and to value that navigates our information norms instead of breaching them.

I don’t presume to have answers to this whale of a problem. What I do know is that as human beings we evaluate privacy, and information, based on the context in which it operates. What constitutes an invasion of privacy is essentially a breach of what Helen Nissenbaum calls a context-relative informational norm. Whether we are aware of it or not, these breaches are a constant part of modern life and, in my opinion, have profound effects on our lives: breaches caused by designs of socio-technical systems that continue to re-anchor to a new, more invasive vision of user-centeredness, a vision based on relatively shaky faith in the predictive power of big data.

I painted a pretty gloomy picture towards the end there, but I also think there is amazing work being done by researchers in this field, and I’m looking forward to sharing an actionable (yet untested) approach to privacy in design influenced by their work. I would like to encourage everyone to follow me down the privacy rabbit hole by checking out the references below. It is truly a fascinating and powerful topic to explore as part of your design body of knowledge.