Making active choices in online environments

20 Aug 2021 03:42 PM

The Centre for Data Ethics and Innovation recently published a joint report with the Behavioural Insights Team, the last in a series focusing on ‘active online choices’. This means empowering individual users to better control how they use digital products and services, including aspects such as data sharing, advertising and social media content.

The concluding report looks at how well different interface designs enabled participants in randomised controlled trials to make informed choices about their privacy and personalisation settings. The research looked at three different digital environments: smartphones, web browsers and social media platforms. For each of these, different choice designs were created and tested against a control design, which was based on common existing interfaces.

The designs varied for each of the environments, but as an example, participants in the smartphone trial were presented with three treatment interfaces: a slider design where the options ranged from ‘connected’ to ‘private’, a binary design with a choice between either ‘regular’ or ‘private’ settings, and a ‘trusted third party’ design where participants could select settings recommended by a range of organisations. Each choice came with a brief explanation of its implications, outlining whether apps could access location data, whether the default browsing mode would be private, and whether phone data could be used for personalised ads.

The research focused on which of the designs best enabled participants to choose the right settings for three different ‘personas’ with varying levels of privacy preferences, whether they understood the consequences of the different options, and lastly, the extent to which they felt in control. For the smartphone and web browser experiments, the treatment designs tended to outperform the control design both on choosing the appropriate settings for different user preferences and on understanding the consequences. For the social media trial, however, the differences were negligible, and some of the treatment designs even performed worse. This highlights the complexity of user control and the need for solutions to be tailored to the specific platform and its audience.

Interestingly, the ‘feelings of control’ variable did not correspond with the designs that gave participants the best understanding of the consequences of their choices. This was particularly evident in the social media experiment where the sense of control increased with the treatment designs, yet participants had the same or worse level of understanding of the outcomes compared to the control design.

One of the main conclusions of the report is that gaining users’ consent is not enough, as it provides no guarantee that they have actually understood what they are consenting to. Designs that prioritise informed and active choices could help overcome this problem. The main implications for the tech industry highlighted by the report are as follows:

If your organisation is working on user engagement and control, and you would like to share progress or an interesting approach, please get in touch with