
Bachelor’s Thesis Spring 2020

Log In or Sign Up

Author: Emil Gunnarsson
Supervisor: Cassandra Troyan, Matilda Plöjel
Examiner: Mathilda Tham
Semester: 20VT

Subject: Visual Communication + Change

Level: Bachelor’s
Course code: 2DI68E


Acknowledgements

I want to thank every tutor and visiting lecturer during this semester for engaging with my project and helping it reach its full potential. A special thanks to Cassandra Troyan, who has encouraged and assisted me in developing my creative writing skills during the last three years. Along with Cassandra, I thank Matilda Plöjel, Helga Steppan, and Anthony Wagner for the feedback, insights, and all the relevant works, methods, and inspiration you provided me with during the process.

Thanks to everyone who participated in my workshop and took my survey, and to all my classmates for creating the best possible working environment. To my girlfriend, my parents, and my sisters: thank you for all your love and support.


Abstract

This thesis details the theoretical framework, methods, and results of the design project “Log In or Sign Up”. The result of the project is an ebook of the same title, consisting of short stories and poems that revolve around digital surveillance and its effects on the self and society. The writings explore how the digital surveillance ecosystem affects our behavior and self-image in ways that we might not even realize. The context is our digital lives in a time when we rely more and more on the internet for our daily activities. The writings attempt to portray the emotions we go through while using digital platforms that are designed to extract our personal data. The stories and poems are presented in an interactive layout.

Along with the aforementioned topics, this thesis also addresses the role of design in the advanced digital surveillance methods created by companies such as Google, Facebook, Amazon, and Microsoft: for example, the gamification of social media, the deceptive presentation of data protection rights, and the curation of content for emotional effect.

Key words

Digital Surveillance, Visual Communication, Surveillance Economy, Capitalism, Economics, Privacy, Advertising, Personal Data, Social Media, Facebook, Google, Web tracking, Dark Patterns


Table of Contents

Introduction
1.0 – Log In or Sign Up

Theoretical Framework
2.0 – The Surveillance Economy
2.1 – Designing Away Consent
2.2 – Addictive Platforms and Manipulative Design

Methodology
3.0 – Facebook Beacon and “Ads Interest” Workshop
3.1 – Survey

Design Process
4.0 – Design That Deals With Digital Surveillance
5.0 – Interactive Ebook

Results
6.0 – Results
7.0 – Discussion


1.0 – “Log In or Sign Up”

Digital surveillance by companies has become part of our everyday discussions. However, this has not yet been enough to bring about a radical change in public opinion about how the internet should be. A few reasons can be attributed to this. One is that these platforms are new: they simply have not existed long enough for the average person to understand the full extent of the surveillance and the economic motives behind it. Another is that the surveillance itself is designed to be hidden. Both of these reasons will be examined further in this thesis.

Even though we know that Facebook uses our personal data to target ads, and that Google gathers information about everything we do when using their services, information which they then use for whatever they please, that knowledge has not had a deep enough impact, at least when it has reached the public in the form of news coverage or opinion articles. As a designer, I wanted to use the medium of design to put these issues in another context and thereby highlight how the makers of digital platforms use design to keep you uninterested in the surveillance aspect of their services.

Through short stories and poetry set in an interactive ebook, this project aims to lift the veil on the methods used to deceive us on digital platforms. Through the stories and the interactive elements, the reader should see how digital platforms have been designed to deceive and manipulate them. It does so by painting realistic pictures of how digital surveillance impacts the self and our society, using metaphors, analogies, and repetitions of familiar situations.


2.0 – The Surveillance Economy

To understand what motivates digital surveillance by companies, one must have some fundamental understanding of how the surveillance economy operates.

Shoshana Zuboff uses the term “Surveillance Capitalism” to describe the surveillance economy. In her renowned book “The Age of Surveillance Capitalism”, released in 2019, she details the history of “Surveillance Capitalism”. She says that it started at Google in the early 2000s with the discovery of behavioral surplus. The book explains how it has both secretly and openly been expanding to different sectors of the economy while remaining unregulated. (Zuboff, 2019, p. 52-55)

Her definition of “Surveillance Capitalism” is that it is an economic logic that claims the human experience as a free resource. This resource is then turned into behavioral data. Some of this material, called behavioral surplus, is then processed through “machine intelligence” and turned into “prediction products.” These predictions are then sold in a new marketplace of behavioral predictions that she calls “behavioral futures markets.” In the process, our privacy is breached in multiple ways: our social interactions, location, facial expressions, personal photos, emotions, medical history, web searches, and other personal data become monetized. She asserts that “Surveillance Capitalism” is a new and growing economic logic that will continue to have adverse effects on our society if left ungoverned or self-governed. (Zuboff, 2019, p. 8-12)

Zuboff’s detailed descriptions of the formation of the digital surveillance economy, and the brilliant way in which she explains the methods of digital surveillance in her book, numerous articles, and interviews, were essential in providing a theoretical framework for this project. The ebook “Log In or Sign Up” even opens with a quote by Zuboff in which she likens Google’s surveillance methods to a one-way mirror, an analogy that also inspired one of the short stories in the book, “Evelyn and The Magic Mirror”.

In its first stages, this project aligned itself very closely with Zuboff’s grand theory; with time, however, it shied away from embracing every aspect of it completely. “Surveillance Capitalism” is very real and poses a large threat to our society. It may, however, not be necessary to consider it as separate from or independent of capitalism, which is Zuboff’s claim when she describes “Surveillance Capitalism” as a new, rogue form of capitalism. (John, 2019)

“Surveillance Capitalism” does not have to be seen as a deviation from capitalism; it does not meaningfully differentiate itself from capitalism’s current state. Katie Fitzpatrick likewise considers Zuboff’s political analysis insufficient in addressing the nature of the current state of capitalism. In her review of Zuboff’s “The Age of Surveillance Capitalism” she writes:

“The Age of Surveillance Capitalism succeeds in painting a dark portrait of Silicon Valley’s growing power, but it ultimately fails in its political analysis. In whose service and at whose expense is the control of surveillance capitalism effected? Zuboff reaches for the grandest possible explanation: She argues that Silicon Valley is in the thrall of a radical instrumentarian ideology that aims to supplant liberal individualism with large-scale social engineering. But we don’t need a spooky new political theory to explain what’s going on; it’s already perfectly legible in the context of liberal capitalism. Companies do not pursue control in a quest for Skinner’s or Pentland’s engineered utopias. Their goals are much simpler: first, to accrue profits through targeted advertising and, second, to promote their direct economic and political interests. The problem with surveillance capitalism is as much the capitalism as it is the surveillance.”

(Fitzpatrick, 2019)

Figure 1
From “The Age of Surveillance Capitalism”

2.1 – Designing Away Consent

The way we consent to our data being collected on the internet has long been up for debate. On the one hand, websites, apps, and social media platforms insist that their services do not work without tracking. On the other side of that argument are those concerned about the public’s privacy, who say that the laws are too relaxed and that users are being exploited for their personal data without their consent or knowledge.

There is no denying that the GDPR regulations in the EU have pushed the envelope for the data rights of European citizens. GDPR is a good beginning. Unfortunately, the changes that were expected, and that are necessary to change the online environment, have not been sufficient, and the digital platforms have quickly found ways around these new laws.

This is because the regulation did not follow through by setting rules for how consent is given, and it therefore allows the creators of digital platforms to formulate and design the question in a way that makes the interaction meaningless and the choice non-existent. To “allow cookies” has become trivial, and in most cases the choice has been eradicated. This is one way that websites and apps can design away consent. Another is how social media platforms slowly obtain more data about their users through small changes that are hard to notice if one is not paying attention. The main point here is that we cannot consent to something that is designed to be hidden.

It should be noted that cookie notices are usually not created by the sites they appear on. They are made by consent management platforms (CMPs): companies that sell ready-made systems that site creators use to simplify the cookie consent process. Researchers from Aarhus University in Denmark, MIT, and University College London found that five such companies cover 58 percent of the UK’s 10,000 most visited websites. (Nouwens et al., 2020)

The study “(Un)informed Consent”, published in October 2019 by Ruhr-University Bochum, Germany, examined the effect that position, colors, type of choice, and available options have on consent notices. The researchers conducted experiments on 80,000 users of a German website. They note that websites often highlight the option to “accept all cookies” by setting it in a larger font and a more eye-catching color, and they found that people will simply click the button that is bigger or brighter. Given a binary option, people are most likely to simply “allow all cookies”, compared to having to allow tracking for each category or company one by one. They declare the following about their findings: “Our findings demonstrate the importance for regulation to not just require consent, but also provide clear requirements or guidance for how this consent has to be obtained in order to ensure that users can make free and informed choices.” (Utz et al., 2019)

In April 2020 another report on cookie notices was published, by the Irish Data Protection Commission. They looked at 40 websites and found that a quarter used pre-ticked boxes and that around half might not even meet the requirements set by GDPR. It is their belief that almost every one of those 40 websites has had trouble complying with the GDPR, with issues ranging from minor to serious. (Data Protection Commission, 2019)
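The nudges described in these studies can be captured in a small illustrative sketch. The following Python function is a hypothetical checker; the configuration format and field names are invented for this example, and no real consent management platform exposes such a structure. It flags a consent banner when the accept option is visually privileged or when category boxes are pre-ticked:

```python
# Illustrative sketch only: a hypothetical checker for the consent-banner
# nudges described by Utz et al. (2019) and the Irish Data Protection
# Commission report. All field names here are invented for the example.

def likely_dark_patterns(banner: dict) -> list[str]:
    """Return a list of nudges found in a hypothetical banner config."""
    findings = []
    accept = banner.get("accept_button", {})
    decline = banner.get("decline_button", {})
    # A highlighted "accept all": larger font or more eye-catching styling.
    if accept.get("font_size", 0) > decline.get("font_size", 0):
        findings.append("accept button set in a larger font")
    if accept.get("highlighted") and not decline.get("highlighted"):
        findings.append("accept button highlighted, decline button not")
    # Pre-ticked category boxes (the practice the Irish DPC flagged).
    if any(cat.get("pre_ticked") for cat in banner.get("categories", [])):
        findings.append("tracking categories pre-ticked")
    return findings

banner = {
    "accept_button": {"font_size": 18, "highlighted": True},
    "decline_button": {"font_size": 12, "highlighted": False},
    "categories": [{"name": "analytics", "pre_ticked": True}],
}
print(likely_dark_patterns(banner))
```

The point of the sketch is that each nudge is a concrete, checkable design decision, which is exactly why the Utz et al. authors argue that regulation could specify how consent must be obtained.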


2.2 – Addictive Platforms and Manipulative Design

Dark pattern is a term coined by user experience designer Harry Brignull, who registered the website www.darkpatterns.org in 2010. On the website, dark patterns are described as “Tricks used in websites and apps that make you do things that you didn’t mean to, like buying or signing up for something.” (Brignull, H., no date) The term is widely recognized and often used by those critical of manipulative web and app design. There is even a Twitter account, run by Harry Brignull and Alexander Darlington, that retweets screenshots from people who tag them in posts about specific instances where companies utilize dark patterns. On the website, they refer to their Twitter feed as their “Hall of Shame”. (Brignull, H., no date)

Following are a few examples of the types of dark patterns the website lists:

“Roach Motel”

“You get into a situation very easily, but then you find it is hard to get out of it (e.g. a premium subscription).”

“Privacy Zuckering”

“You are tricked into publicly sharing more information about yourself than you really intended to. Named after Facebook CEO Mark Zuckerberg.”

“Confirmshaming”

“The act of guilting the user into opting into something. The option to decline is worded in such a way as to shame the user into compliance.”

“Forced Continuity”

“When your free trial with a service comes to an end and your credit card silently starts getting charged without any warning. In some cases this is made even worse by making it difficult to cancel the membership.”

“Friend Spam”

“The product asks for your email or social media permissions under the pretense it will be used for a desirable outcome (e.g. finding friends), but then spams all your con- tacts in a message that claims to be from you.”

(Brignull, H. , no date)


Figure 2 – 4

From @Darkpatterns Hall of shame


In their publication about dark patterns, published in May 2020, Sebastian Rieger and Caroline Sinders urge policymakers and governments to respond and regulate the design of digital services and platforms. They believe dark patterns are partly responsible for the erosion of privacy online. The main issue is that the digital platforms in question are designed to make their users share personal data while simultaneously making that data difficult to protect. They claim that in Europe this weakens the European Union’s privacy regulations with regard to individual consent. (Rieger and Sinders, 2020)

Many of us are familiar with being overwhelmed by notifications on our phones or in our browsers, which can lead to what is sometimes referred to as “notification fatigue”. We would not feel pressured to look into every single notification badge if there were not something telling us that important information lies behind it: every red dot can potentially contain important information. And even when we know exactly what is behind the badge, we still feel the urge to look.

Andrew Wilshere at DesignLab describes notifications as a powerful tool that UX designers and developers knowingly use to push our psychological buttons. He claims that because we want to be socially engaged, we will look at notifications no matter what. On DesignLab’s blog, he contemplates what has become of the notification:

“Notifications were once there to tell us something we needed to know. But has the des- peration of companies to get us to engage with their product turned notifications into an annoyance—a manipulative, destructive dark pattern?” (Wilshere, 2017)

Wilshere believes that designers and developers should have more respect for users and not subject them to these manipulative methods in order to generate clicks.

(Wilshere, 2017)

Tristan Harris is a former design ethicist at Google; he resigned in 2016. While at Google he studied the effects technology has on behavior, attention, and well-being. In an article he wrote for Der Spiegel, he describes our smartphones as slot machines that we keep in our pockets. He likens the “pull to refresh” feature on digital platforms to the pull handle of a slot machine: when we pull to refresh, he says, we are essentially gambling to see whether we get a reward or nothing. Either it is a new post, a new notification, a new email, or nothing. (Harris, 2016)
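Harris’s analogy can be made concrete with a tiny simulation. The sketch below illustrates the analogy only; it is not any platform’s actual code, and the reward probability is invented. It models “pull to refresh” as what behavioral psychology calls a variable-ratio reward schedule, where each pull unpredictably yields something new or nothing at all:

```python
# Illustrative sketch of the slot machine analogy: each "pull to refresh"
# pays out unpredictably, a variable-ratio reward schedule. The 0.3
# reward probability is invented for this example.
import random

def pull_to_refresh(rng: random.Random, reward_chance: float = 0.3) -> str:
    """Simulate one refresh: usually nothing, sometimes a 'reward'."""
    if rng.random() < reward_chance:
        return rng.choice(["new post", "new notification", "new email"])
    return "nothing"

rng = random.Random(42)  # fixed seed so the run is repeatable
results = [pull_to_refresh(rng) for _ in range(10)]
print(results)
```

Because the payout is unpredictable, every pull carries the possibility of a reward, which is precisely the property Harris argues keeps us pulling.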

Harris is not the only one drawing comparisons between addictive design on digital platforms and the design of slot machines. In “The Age of Surveillance Capitalism”, Zuboff draws parallels between social media design and casino design, which she bases on Natasha Dow Schüll’s work on addictive design in slot machines. In Schüll’s book “Addiction by Design”, she says that addicted players seek to enter what she calls the “machine zone”, a state of complete immersion. (Schüll, 2012) Zuboff claims that the parallels to social media design are apparent. Both are designed to keep the user from looking away from the screen, with the end goal that they become incapable of looking away from it. Schüll describes this as a key element in designing towards the “machine zone”. (Schüll, 2012)


“Every feature of a slot machine – its mathematical structure, visual graphics, sound dynamics, seating and screen ergonomics – is calibrated to increase gamblers’ ‘time on device’ and encourage ‘play to extinction’.” (Schüll, 2008)

When comparing the two, Zuboff points to Facebook’s marketing director boasting that its users never have to look away from the screen. She also points out that former Facebook president Sean Parker admitted that Facebook was deliberately designed to consume as much of its users’ time and consciousness as possible. (Zuboff, 2019, p. 451)

Zuboff claims that when applied to social media, these methods affect the younger users most. She says: “Social media is designed to engage and hold people of all ages, but it is principally molded to the psychological structures of adolescence and emerging adulthood, when one is naturally oriented towards the ‘others’, especially towards the results of group recognition, acceptance, belonging and inclusion.” (Zuboff, 2019, p. 449)

3.0 – Facebook Beacon and “Ads Interest” Workshop

On the 28th of February, I hosted a group discussion about targeted advertising and social media. The participants were five fellow students from the Linnaeus University Department of Design: three from the Design + Change program and two from Visual Communication + Change. At the start of the discussions, I promised the participants that they would remain anonymous and that I would only take general notes on what came up. I made that promise in hopes of creating a relaxed environment for honest conversations on a peer level, which is why the meeting was not recorded and the participants are not quoted word for word.

The workshop started with me reading a review of Facebook’s discontinued service “Beacon”, which was launched in 2007. At the time it caused controversy due to privacy concerns and was considered by many to have been catastrophic for Facebook. It was a service that allowed Facebook advertisers to track users and disclose their purchases without permission. The advertisers would publish the users’ purchases on their “Facebook wall” for all their friends to see. This was not an opt-in service, and users would suddenly notice that they had been posting their purchases on Facebook without their knowledge. After facing numerous complaints and lawsuits, Facebook shut down the service and issued a public apology. (Lynch, 2007)

As a group, we discussed what could be learned about the nature of targeted advertising by looking at the problematic aspects of its initial form, as it appeared in Facebook’s Beacon. None of the participants were aware of this early version of Facebook’s targeted advertising service. Some mentioned that they felt Facebook still did the exact same thing, but that the information was being conveyed in more subtle ways: you still get targeted advertising based on what your friends buy, but the information about what your friends buy is not shared with you directly. Another participant mentioned that the whole thing sounded very clumsy and therefore out of character for a service like Facebook.


Then the discussion shifted to our interactions with targeted advertising. One participant mentioned that she could often not tell which posts in her Instagram feed came from accounts she followed and which were ads. As an example, she explained that large-scale ceramic resellers and online furniture stores used the same aesthetics as the designers she followed when posting images of their own work. She said that she often caught herself pressing “like” on an image and then realizing it was from a store, which made her want to “unlike” the picture.

Together we speculated on other reasons why the images blended so seamlessly into the feed. We wondered about the complexity of the algorithms that arrange the content we see, and whether the ads are placed between posts that make them blend in.

Another participant noted that she felt algorithms were removing the human element from so many parts of our daily lives. All the participants, myself included, agreed, and we spent some time discussing the topic. At the end of the discussion, I showed all the participants how they could access the “Ads Interests” that Instagram and Facebook claim to use when sending targeted advertisements to their accounts. After laughing at oddly specific categories and topics that were completely irrelevant, such as “horse care” in the case of one participant, we questioned the authenticity of these categories. This was my intention with the workshop: first, to consider targeted advertising by looking into the past; then to discuss its current state, which incidentally led to discussions about technology’s effect on our society (not planned, but a welcome addition); and after that, to consider how Facebook and Instagram try to explain targeted advertising to us, in the hope that we chalk it up as something useless and unsophisticated. The workshop ended up being successful and provided me with new perspectives from hearing what my fellow design students had to say about digital surveillance.


3.1 – Survey

On the 18th of March 2020, I published a survey consisting of 15 questions regarding social media, targeted advertising, digital surveillance, and personal data, as well as the respondents’ feelings about and opinions on those subjects. The purpose of the survey was to gather opinions, experiences, and personal stories relating to digital surveillance. The content I gathered was used to inform my writing and to create realistic scenarios that readers could relate to. It is my belief that reading about all these different perspectives on our digital lives has given me a better understanding of the topic. Without this content, all my writing would have been based on my own experiences or on guesses about how others experience the digital platforms we use every day. It was important to include more perspectives so that, in the end, the writings in my book would speak to many different people of all ages.

The survey was first sent out to fellow Visual Communication students across all three years of the program. This way I knew that these answers were coming from design students, who might see the design of digital platforms from a particular perspective. Two days later it was published on social media and shared by three people, which brought in respondents from higher age groups. I ended up with 31 responses from respondents aged 17 to 77. About half of the participants were aged 23 to 26.

When asked to name one negative thing about social media, many mentioned time-wasting. This issue also came up when participants were asked to name three feelings they associate with social media; distraction and addiction were mentioned as well. This was true for all age groups. The younger participants often mentioned comparison to others as a negative side effect of using social media. Some addressed this by specifically naming things such as social hierarchy, inferiority complex, anxiety, and envy as the feelings they associated with social media. Often, those who named such feelings would also mention comparison to others or social hierarchy as the worst thing about social media. Most participants declared that they do not like targeted ads, and this applies to all age groups. A few said that they do not mind targeted advertising and that they prefer to see something relevant.

Almost every single participant felt that fighting for privacy online was worth it, although few explained why. The small minority who felt it was not worth it said that they felt it was already too late. Generally, the younger the participants were, the better they could explain how digital surveillance takes place in their daily lives. This difference was very clear when comparing participants aged 17 to 22 with those aged 30 to 37, with the latter mostly relying on guesses instead of facts. It was a pleasant surprise to see how well informed a few of the participants in the younger age group were about digital surveillance methods.


What is your age?

24

Do you use social media? if so, which?

Facebook, Instagram, Snapchat, Strava, Linkedin

What three feelings do you associate with using social media?

Community, Social hierarchy, Anxiety

Which social media app or website is your favorite? Why?

Strava. It seems to be the least bullshit social media, you can only post activities and comment on other peoples activities. Not many ads.

What is the best thing about social media?

Keeps you in contact with people far away.

What is the worst thing about social media?

It puts people in their own echo chamber or bubble, making people more polarized.

If you have ever used Facebook, what words would you use to describe the design of the website and app?

Started very simple back in the day, has evolved a bit since then but isn’t very pleasing in design.

Do you ever hesitate to give away personal information to a website?

Yes, when registering to various services etc. that I feel like they don’t need to know certain things. For example free wifis that require login, some login you might need to set up to use an app or device you have.

Do you feel surveilled when you use the internet? Can you write about an example where you felt like you were being watched?

For example not having googled a movie, but texted someone about it (using a google keyboard on the phone) and then getting ads for that movie. Maybe this is imagination or coincidence but it felt super strange.

What is your opinion on targeted advertising?

On one hand it is good to see stuff that you actually are interested in, but on the other hand if you are presented with random things you are probably less likely to buy stuff, i.e. consuming less.

Can you name some examples of targeted ads you have received? Can you explain why you received them?

After googling contact lenses, I seem to get a lot of contact lens ads. Also my instagram feed is full of cycling related stuff.

Have you ever played the location-based games such as Pokemon Go? How was your experience?

If you count tinder as a game, then yes. The tinder experience is very dependent on how you respond to the adictive nature of that app. But in terms of the location based aspect going to new places and seeing what was there was always interesting.

Do you believe there is such a thing as privacy on the internet?

For sure, but having some background with programming and web services etc. it is pretty clear that the data all has to be there and the companies usually need a certain level of data for their services to be useful. The question then becomes how much data should the collect and how the companies prevent the misuse of the data, i.e. the data leaking out or people within the company taking a look at the data. This also has to be done with legislation, and then there have to be checks to see if the companies are actually following them.

Do you think companies like Google and Facebook are concerned about their user’s privacy?

I think they have to comply with the legislation, but it is always in their interest to know more about the people. Hopefully the employees of these companies can not use access individual data, or identify certain individuals without their consent, but it is hard to imagine that they can’t.

Do you think protecting our privacy online is something worth fighting for?

For sure, but it is also the responsibility of the consumer to know that there is always the chance that someone is doing something they are not supposed to. This means that legislation and regulation needs to be there as well, and we should fight for that.


Figure 5

Answers from a participant who agreed to have his answers made public


4.0 – Design That Deals With Digital Surveillance

In recent years many designers and artists have chosen to create critical work about digital surveillance. A common method is to recontextualize fundamental aspects of surveillance technologies. An example of this is two installations exhibited at the London Design Biennale in 2018. R Luke Dubois’ “Expression Portrait” used facial recognition technology to guess the visitors’ emotional state, age, and race. In the same exhibition, “Expression Mirror” by Zach Lieberman used technology similar to Instagram’s filter software to create collages matching the faces of visitors who were displaying a similar emotion. Both of these works show the viewer facial recognition doing something that they are not used to seeing it do. All of a sudden, they have to consider the reality that simple facial recognition software can read a lot of information. At the same time, they experience a stripped-down version of facial recognition that is not on their own personal device. The imperfections of these artworks also reflect the imperfections of facial recognition software. Both of the artists have commented that issues of privacy need more attention, and that more technological literacy is needed for that to happen. (Aouf, 2018)


Figure 6

Zach Lieberman – Expression Mirror at the London Design Biennale

Figure 7

R Luke Dubois – Expression Portrait at the London Design Biennale


The design studio Metahaven, founded by Vinca Kruk and Daniel van der Velden, and their research project “FaceState” from 2011 served as an inspiration for this project. “FaceState” explores the connection between social media and the state. The “FaceState” is the dream state of neoliberal politicians and successful entrepreneurial technocrats. It suggests that Facebook might become a tool for government surveillance in the future. They call it the “ultraminimal state”. In their visuals, they reference Facebook’s branding as a neutral entity and the standardization that takes place on social media. The “FaceState” embraces all of these aspects in its branding. The project took the form of an installation at the exhibition “Graphic Design: Now in Production” at the Walker Art Center in Minneapolis, where Metahaven exhibited visual artifacts for the “FaceState”. (Hyde, 2011) Their critique of Facebook’s branding posing as a neutral entity, and their prediction of states embracing social media as neutral and democratic entities, was quite accurate and shows that the project was based on a great understanding of the political situation at that point in time.


Figure 8

Metahaven Facestate at Walker Art Center

Figure 9

Metahaven Facestate at Walker Art Center


“Puzzled by Espionage”, a project by Ruben Pater, engages the viewer with the complex topic of espionage through the solving of puzzles. These were published weekly in a Dutch newspaper shortly after former NSA contractor Edward Snowden unveiled the mass surveillance being conducted by the governments of the United States, the United Kingdom, and their allies. With the project, Pater draws on the connection between cryptography and digital surveillance (Pater, 2014).


Figure 10

Ruben Pater – Puzzled by Espionage


Any effort to engage the public in issues of privacy and digital surveillance should be applauded. “Log In or Sign Up” puts forth methods and ideas similar to those of the aforementioned projects, using them to bring these issues into a different context than the one the reader might be used to seeing them in. The question is: if reading news reports about Facebook’s slip-ups, or about Google’s smart home technology being a surveillance system, does not move public opinion enough, how can we paint a picture of what is really happening, and how can that result in a change in general opinion? That is exactly the intention of the book: taking the current state of the digital milieu and showing what is hiding in it and how it really affects us.

Digital surveillance happens in various ways, and in the writings we see some of the methods companies use to surveil users and how these affect them. As companies now pride themselves on being able to use digital methods to affect offline behavior, we are entering a period of surveillance capitalism in which the connection between these surveillance methods, the ads, and our online interactions is becoming harder to understand and grasp.

5.0 – Interactive Ebook

I was first introduced to the idea of working with interactive elements in PDF and EPUB files in a 2017 lecture by Johan Ahlbäck, a teacher at the Linnaeus University Department of Design. During the lecture he showed the interactive ebook he had designed for Janna Holmstedt as part of her doctoral thesis in fine arts at Lund University. Her project “Are You Ready for a Wet Live-In?” included artworks that relied on sound and moving images in their presentation (Holmstedt, 2017). Seeing this book at the time opened up a new way of thinking about ebooks for me. In the final project for the module in which that lecture took place, I included a video file in a long-format poster that was meant to be read as a book in scrolling format.

Because of this earlier experience, it was natural for me to think of a book as something that could also include interactive and moving elements. Including interactive elements made it easier to imitate the digital experiences that the writings revolve around.

It is quite commonplace for articles relating to digital surveillance to include slight references to the digital experience in the form of interactive elements or moving images. Digital publishers utilize these methods more and more, often seizing the opportunity when the topic relates to something digital. They use the interactive elements or moving objects either as illustration or for infographics.

Two different articles about digital tracking, one in Norway and one in the United States, both published in the last year, gathered a lot of attention. They included similar interactive elements in presenting their content. One was a New York Times article published in December of 2019, about “The Times’ Privacy Project” obtaining the locations of 12 million Americans through their smartphones. The file was handed to the journalists by an anonymous source who was concerned about the information being so easily available and wanted to inform the public as well as lawmakers. With the information in the file, the journalists could track the exact location of a single smartphone over a period of several months in 2016 and 2017.

(Thompson and Warzel, 2019)


The other article was from NRK, the Norwegian government-owned public radio and television broadcaster. Its journalists bought access to the location data of tens of thousands of smartphones in Norway from a British data reseller. In the article, they detail the whereabouts of one Norwegian citizen whose information they had bought, revealing every detail about his life they could gather from his location data alone (Futuly, Lied and Gundersen, 2020).

Both articles tell a story using interactive elements. The presentation helps put the rather overwhelming and scary information in context and keeps the reader engaged. One could even argue that the presentation of the latter article helps convince readers that they are being tracked at this very moment. Both articles served as inspiration for the project’s presentation method.


Figure 11

Twelve Million Phones, One Dataset, Zero Privacy

Figure 12

Twelve Million Phones, One Dataset, Zero Privacy


Figure 13

Twelve Million Phones, One Dataset, Zero Privacy

Figure 14

Twelve Million Phones, One Dataset, Zero Privacy

Figure 15

Twelve Million Phones, One Dataset, Zero Privacy


Figure 16

Avslørt av mobilen

Figure 17

Avslørt av mobilen

Figure 18

Avslørt av mobilen


Figure 19

Avslørt av mobilen

Figure 20

Avslørt av mobilen

Figure 21

Avslørt av mobilen


6.0 – Results

In the following section, I detail the process of writing the short stories and poems and provide the context for the research and ideas they are grounded in. I also describe the design choices I made when designing the interactive ebook.

Pop up - Terms of Service

At the very start of the book, there is a pop-up asking the reader to agree to the terms of service. This pop-up is placed there to show the reader that they can interact with the buttons in the book. It also sets the tone for the book by immediately referencing the manipulative design of consent notices. The reader is given two options: to agree, or to read the terms of service and then agree. If the reader clicks to read, a wall of text shows up. The button to agree is much more appealing to click because it is bright, but the reader might click “read” out of curiosity.


Figure 22 Log In or Sign up


Find Your Friends

The first poem of the book is about the manipulative language companies use when asking you to sign up for a platform or a service. I wanted the reader to be reminded of the language of Facebook, which uses your friends’ names and profile photos to manipulate you into signing up for more of its services, giving it access to more of your data. I also wanted it to recall friend spam, where you receive messages that appear to be from or about your friends but are actually about a service or a product.


Figure 23 Log In or Sign up


Necktie

“Necktie” is a story about targeted advertising. More specifically, it addresses how an algorithm takes advantage of a grown man’s low point and convinces him to consume. In this story, we follow a few hours in the life of a traveling businessman who is experiencing feelings of self-doubt and some anxiety. Through his scrolling of his Instagram feed, we are informed of his personal life and his emotional state. From the context, the reader can assume that the character uses social media to escape his anxiety. I was thinking about the “machine zone”, which I mentioned earlier in the thesis, when I wrote about how he interacted with his smartphone.

I specifically mention it when he looks away, so that the reader knows that before that moment he has been staring at the screen without looking away. In the story, we see how his algorithm communicates with him through curated content and ads on social media. In the end, our character, Adrian, buys an expensive tie to escape these feelings. The purchase is influenced by a targeted ad he received that made him feel like he was missing something. Adrian convinces himself that he needs the expensive tie so as not to look bad in comparison to his coworkers. At the point of purchase, he believes it is appropriate for him to buy this expensive tie, although it is implied that he cannot afford such luxuries. The idea of a targeted ad having such an effect on its subject that it changes how they see themselves is based on a study by Christopher A. Summers, Robert W. Smith, and Rebecca Walker Reczek that demonstrates precisely that (Summers, Smith and Reczek, 2016). By separating out the phone’s, or the algorithm’s, perspective and placing it on the side of the story, I think the reader can better understand the ideas put forth in the story.

At the very beginning of our story, the character is reduced to a set of numbers and a profile. In the middle of the story, a scrolling text appears in which the content he interacts with is seen from his social media algorithm’s perspective. At the end of the story, he is reduced to a transaction. These parts of the story are set in a coding typeface and are meant to appear to the reader as information intended for an algorithm.


Figure 24 Log In or Sign up


Finish Setting Up

“Finish Setting Up” is a poem about being bullied into giving away information. It explores the neutral and friendly, but also commanding, language used in these situations. More specifically, it draws attention to Apple’s wish that you hand over your credit card information so that your phone can replace it as a payment method.

This way your phone can track every single payment you make. What sparked the idea for the poem was that on the iPhone, a red notification dot sticks to the Settings app on the home screen unless you connect your credit card to Apple Pay. If you refuse to do that, Apple sends you multiple pop-ups during the first days of using your new phone, asking you to sign up for the service. After that, the red notification dot stays there as a reminder of your disobedience.

After the reader has been on the page for around one minute, a pop-up appears asking whether they really wish to continue without setting up. This time the language is a bit more demanding. In the pop-up they are given a non-option: to sign up now, or to be reminded to sign up later.

Apple is on the verge of “surveillance capitalism”, meaning that it is not yet in the business of selling data, at least not that we know of. But Apple uses the same methods as any other digital service to extract our data. Whether or not it takes the full leap into surveillance capitalism remains to be seen, but if it ever needs to, it certainly has the capabilities. I will note that Apple still relies on selling products as its main source of income. If that changes, I have no doubt that it will join the other tech giants in the data market.

I picked Apple as the target of this poem not because of surveillance, but because it popularized the red dot notification. The dot was most notably used in Apple’s Mail app in Mac OS X, and when the iPhone launched in 2007 it became a permanent part of our lives (Herrman, 2018).


Figure 25 Log In or Sign up


Give me

A poem about the greed of the companies that surveil us and constantly ask for more personal data. The poem appears throughout the book in three parts. As the information being demanded becomes more personal, the layout becomes more broken down. What I tried to convey with the broken layout is that the viewer is seeing information meant for computers, or a corrupted file, text that is not meant for the reader. For added effect, I added moving backgrounds, blinking elements, and a blacked-out background that disappears and reappears.


Figure 26 Log In or Sign up


Evelyn and the Magic Mirror

This is a short story about a teenage girl and a day in her life. In the story, her smartphone and social media take the form of a possessed magic mirror. Inside the mirror lives an evil presence that surveils and manipulates her. The evil presence is referred to as a black box in the story. In science, computing, and engineering, the term black box describes systems and processes that can only be described by the input they consume and the output they create. Often, those who work with algorithms do not know exactly how an algorithm works, only whether or not it can provide the desired outcome. The mirror concept was inspired by Zuboff referring to Google’s surveillance tactics as a one-way mirror (Zuboff, 2020).

The story is inspired by a leaked Facebook document revealing that Facebook officials bragged to advertisers in Australia and New Zealand that they were able to target teenagers when they were experiencing emotional low points (Levin, 2017). It is also based on answers from the survey in which younger participants detailed their feelings about social media. Teenage participants mentioned that they spent a large part of their day on social media and that they carefully considered how they presented themselves.

At the end of the story, we enter the same sort of dark layout that was present when we saw the algorithm’s perspective in “Necktie”. Here we see how the black box takes Evelyn’s interactions and turns them into prediction products. Through moving text, we watch it arrange content for Evelyn’s day in front of the mirror.


Figure 27 Log In or Sign up


Leaving the room

“Leaving the room” is a poem about wanting to leave social media, about the fear of missing out on social media, and about who owns the internet. It also addresses what is by this point a familiar topic: surveillance. In the poem, I speak in the first person, wonder why we have allowed the tech giants to take over the internet, and try to inspire the reader to imagine an internet where the people are the owners.

In the poem, digital platforms become buildings, and the online places we interact in, such as chats, forums, comment sections, or video calls, are referred to as rooms. The internet is the city. The giant tech corporations, or their billionaire owners, are the landlords who listen to our conversations through the vents.

During the spring months of 2020, we design students at Linnaeus University had no choice but to use the video calling platform Zoom on an almost daily basis. On that platform, video conference meeting places are called rooms. In my mind, this is a shift toward referring to digital platforms as actual places, a trend that might catch on. I therefore found it fitting to use this analogy in the final poem of the book.

At the end of the poem, I suggest that we change the way we use the internet and get out from under the landlords, the tech giants. I wanted to end the book with some optimism, to inspire the reader to believe in another digital future, one where we are in control.


Figure 28 Log In or Sign up


Epilogue

In the epilogue, I address many of the same things as I do in this thesis. I am not afraid to oversimplify or over-explain this project, because I genuinely care that the reader understands my writings and the project in general. This is why I did not shy away from speaking directly in the epilogue. I hope that if the reader had not clearly received the moral, ideas, and criticism in the book, the epilogue helped clarify some of it.

Reading list

At the end of the book, I included a reading list. It included:

A few selected books about technology and surveillance.

Websites about design, surveillance, and data.

Links to articles that relate directly to the short stories, in case the reader believed that they were purely fictional.

And, at the end, some resources for fighting digital surveillance on an individual level.


7.0 – Discussion

At the start of this project, I set out to engage people in the topic of corporate digital surveillance through creative writing and design. After doing extensive research on what drives the surveillance economy, I also began researching the emotional aspect of being surveilled. By seeing how designers, journalists, activists, and others engaged people in the issue, I saw how my project could be an extension and continuation of that movement. Holding a workshop with other design students and receiving so many replies to my survey benefitted the project greatly.

I was able to see the issues I was so immersed in from others’ perspectives, and I was able to learn from all of them. This is knowledge that would have been hard to find in books, articles, or documentaries, as it was so deeply personal. It was also important for me to be reminded that people do care about the issue of digital surveillance. Many simply do not know where to direct their anger, confusion, or fears.

As I mentioned earlier in the thesis, it seems that reporting on the privacy issues we face on the internet does not inspire enough of a change in attitudes towards the platforms that exploit them. People may find Facebook annoying and say that they hate it, yet most continue to use it or the other platforms it owns. As with many problems in our society, people need to be engaged in the topic; they need concrete examples, they need context, and they have to see how it relates to them.

As for the limitations of my project, I wish I could have provided more solutions to the issue of surveillance. Fighting digital surveillance on an individual level can be a lot of work, and it can feel lonely when people around you do not understand or participate in your efforts.

However, what is most important to me is that people do not view the current digital climate as normal or inevitable. They should be outraged, and they should notice when they feel tricked, manipulated, and surveilled. When they have noticed it, they should share it and protest it. I hope my project can add to the discussion of digital surveillance and inspire someone to work against the surveillance that takes place on the internet. I believe in the possibility of another digital future. I believe that we can turn to safer platforms that are not owned by billionaires. I can only hope that others will believe in and fight for the same cause.


References

Aouf, R. S. (2018) Cooper Hewitt reveals sinister side of facial recognition technology. Available at: https://www.dezeen.com/2018/09/25/cooper-hewitt-face-values-us-pavilion-london-design-biennale/ (Accessed: 18 February 2020).

Brignull, H. (no date) Dark Patterns. Available at: https://www.darkpatterns.org/ (Accessed: 4 June 2020).

Fitzpatrick, K. (2019) ‘At the Frontiers of Surveillance Capitalism’, 30 April. Available at: https://www.thenation.com/article/archive/shoshana-zuboff-age-of-surveillance-capitalism-book-review/ (Accessed: 31 May 2020).

Futuly, T., Lied, H. and Gundersen, M. (no date) Avslørt av mobilen – Norge, Norsk rikskringkasting. Available at: https://www.nrk.no/norge/xl/avslort-av-mobilen-1.14911685 (Accessed: 4 June 2020).

Data Protection Commission (2019) Report by the Data Protection Commission on the use of cookies and other tracking technologies. Available at: https://www.dataprotection.ie/en/news-media/publications/report-dpc-use-cookies-and-other-tracking-technologies (Accessed: 3 June 2020).

Harris, T. (2016) Smartphone addiction is part of the design, Der Spiegel. Available at: https://www.spiegel.de/international/zeitgeist/smartphone-addiction-is-part-of-the-design-a-1104237.html (Accessed: 3 June 2020).

Herrman, J. (2018) ‘How Tiny Red Dots Took Over Your Life’, The New York Times, 27 February. Available at: https://www.nytimes.com/2018/02/27/magazine/red-dots-badge-phones-notification.html (Accessed: 1 June 2020).

Holmstedt, J. (2017) Are You Ready for a Wet Live-In?: Explorations into Listening. Lund University. Available at: https://portal.research.lu.se/portal/en/publications/are-you-ready-for-a-wet-livein(19ab6e88-5844-4625-a118-10d49697692a).html (Accessed: 3 June 2020).

Hyde, A. (2011) Metahaven’s Facestate. Available at: https://walkerart.org/magazine/metahavens-facestate (Accessed: 14 April 2020).

John, L. (2019) ‘Harvard professor says surveillance capitalism is undermining democracy’, Harvard Gazette, 4 March. Available at: https://news.harvard.edu/gazette/story/2019/03/harvard-professor-says-surveillance-capitalism-is-undermining-democracy/ (Accessed: 31 May 2020).

Levin, S. (2017) ‘Facebook told advertisers it can identify teens feeling “insecure” and “worthless”’, The Guardian, 1 May. Available at: https://www.theguardian.com/technology/2017/may/01/facebook-advertising-data-insecure-teens (Accessed: 23 March 2020).


Lynch, C. G. (2007) A Wake-Up Call for Users in Facebook-Beacon Controversy, CIO. Available at: https://www.cio.com/article/2437512/a-wake-up-call-for-users-in-facebook-beacon-controversy.html (Accessed: 4 June 2020).

Nouwens, M. et al. (2020) ‘Dark Patterns after the GDPR: Scraping Consent Pop-ups and Demonstrating their Influence’, Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, pp. 1–13. doi: 10.1145/3313831.3376321.

Pater, R. (2014) Puzzled by Espionage, Untold Stories. Available at: http://www.untold-stories.net/?p=Puzzled-by-Espionage (Accessed: 14 April 2020).

Rieger, S. and Sinders, C. (2020) ‘Dark Patterns: Regulating Digital Design’. Available at: https://www.stiftung-nv.de/en/publication/dark-patterns-regulating-digital-design

Schüll, N. D. (2008) ‘Beware: “Machine Zone” Ahead’, Washington Post, 6 July. Available at: http://www.washingtonpost.com/wp-dyn/content/article/2008/07/04/AR2008070402134.html (Accessed: 4 June 2020).

Schüll, N. D. (2012) Addiction by Design: Machine Gambling in Las Vegas. Princeton University Press.

Summers, C. A., Smith, R. W. and Reczek, R. W. (2016) ‘An Audience of One: Behaviorally Targeted Ads as Implied Social Labels’, Journal of Consumer Research, 43(1), pp. 156–178. doi: 10.1093/jcr/ucw012.

Thompson, S. A. and Warzel, C. (2019) ‘Opinion | Twelve Million Phones, One Dataset, Zero Privacy’, The New York Times, 19 December. Available at: https://www.nytimes.com/interactive/2019/12/19/opinion/location-tracking-cell-phone.html (Accessed: 18 February 2020).

Utz, C. et al. (2019) ‘(Un)informed Consent: Studying GDPR Consent Notices in the Field’, Proceedings of the 2019 ACM SIGSAC Conference on Computer and Communications Security, pp. 973–990. doi: 10.1145/3319535.3354212.

Wilshere, A. (2017) Are Notifications A Dark Pattern? Available at: https://trydesignlab.com/blog/are-notifications-a-dark-pattern-ux-ui/ (Accessed: 4 June 2020).

Zuboff, S. (2019) The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. 1 edition. New York: PublicAffairs.

Zuboff, S. (2020) ‘Opinion | You Are Now Remotely Controlled’, The New York Times, 24 January. Available at: https://www.nytimes.com/2020/01/24/opinion/sunday/surveillance-capitalism.html (Accessed: 14 April 2020).


Images

Figure 1

Zuboff, S. (2019) The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. 1 edition. New York: PublicAffairs. (p. 97)

Figures 2-4

Bloor on Twitter: ‘Adobe Reader now insists I **sign in** to read a PDF that I hold on my local computer. Fuck off Adobe. When you’ve finished fucking off, pause for breath, and FUCK OFF again. There is no “do this later”. There is no “cancel”. Even Quit is greyed out. Force Quit -> Uninstall. https://t.co/WmVrbv9tpJ’ / Twitter (no date) Twitter. Available at: https://twitter.com/alexbloor/status/1050396820692717574 (Accessed: 4 June 2020).

Josh Zelonis on Twitter: ‘.@NewYorker This is a dark pattern. The option for “Do Not Sell My Personal Information” appears to be ON as in “Please don’t sell my info” but the instructions say to toggle it to the left to turn it off. Which is it? @ahhogan @darkpatterns https://t.co/MkUg2auhpi’ / Twitter (no date) Twitter. Available at: https://twitter.com/josh_zelonis/status/1244441571405099008 (Accessed: 4 June 2020).

Sean Taylor on Twitter: ‘@darkpatterns how about do not show this message again https://t.co/B28xiPMF21’ / Twitter (no date) Twitter. Available at: https://twitter.com/addressforbots/status/1202890676662226944 (Accessed: 4 June 2020).

‘Snapshot’ (no date a). Available at: https://twitter.com/josh_zelonis/status/1244441571405099008 (Accessed: 4 June 2020).

‘Snapshot’ (no date b). Available at: https://twitter.com/addressforbots/status/1202890676662226944 (Accessed: 4 June 2020).

‘Snapshot’ (no date c). Available at: https://twitter.com/alexbloor/status/1050396820692717574 (Accessed: 4 June 2020).

Figure 5

Gunnarsson, E. (2020) Anonymous survey, Digital Surveillance

Figures 6 – 7

Aouf, R. S. (2018) Cooper Hewitt reveals sinister side of facial recognition technology. Available at: https://www.dezeen.com/2018/09/25/cooper-hewitt-face-values-us-pavilion-london-design-biennale/ (Accessed: 18 February 2020).

Figures 8 – 9

Hyde, A. (2011) Metahaven’s Facestate. Available at: https://walkerart.org/magazine/metahavens-facestate (Accessed: 14 April 2020).


Figure 10

Pater, R. (2014) Puzzled by Espionage, Untold Stories. Available at: http://www.untold-stories.net/?p=Puzzled-by-Espionage (Accessed: 14 April 2020).

Figures 11 – 15

Thompson, S. A. and Warzel, C. (2019) ‘Opinion | Twelve Million Phones, One Dataset, Zero Privacy’, The New York Times, 19 December. Available at: https://www.nytimes.com/interactive/2019/12/19/opinion/location-tracking-cell-phone.html (Accessed: 18 February 2020).

Figures 16 – 21

Futuly, T., Lied, H. and Gundersen, M. (no date) Avslørt av mobilen – Norge, Norsk rikskringkasting. Available at: https://www.nrk.no/norge/xl/avslort-av-mobilen-1.14911685 (Accessed: 4 June 2020).

Figures 22 – 28

Gunnarsson, E. (2020) Log In or Sign Up. Available at: https://www.loginsignup.eu/



Read “Log In or Sign Up” at

http://www.loginsignup.eu/
