{"id":11682,"date":"2018-03-23T13:46:10","date_gmt":"2018-03-23T13:46:10","guid":{"rendered":"http:\/\/www.lifeandnews.com\/articles\/?p=11682"},"modified":"2018-03-23T13:46:10","modified_gmt":"2018-03-23T13:46:10","slug":"dont-quit-facebook-but-dont-trust-it-either","status":"publish","type":"post","link":"https:\/\/www.lifeandnews.com\/articles\/dont-quit-facebook-but-dont-trust-it-either\/","title":{"rendered":"Don&#8217;t quit Facebook, but don&#8217;t trust it, either"},"content":{"rendered":"<p><span><a href=\"https:\/\/theconversation.com\/profiles\/denise-anthony-454904\">Denise Anthony<\/a>, <em><a href=\"http:\/\/theconversation.com\/institutions\/dartmouth-college-1720\">Dartmouth College<\/a><\/em> and <a href=\"https:\/\/theconversation.com\/profiles\/luke-stark-454903\">Luke Stark<\/a>, <em><a href=\"http:\/\/theconversation.com\/institutions\/dartmouth-college-1720\">Dartmouth College<\/a><\/em><\/span><\/p>\n<p>Is it time to <a href=\"https:\/\/www.ft.com\/content\/bc8a642a-2b6a-11e8-9b4b-bc4b9f08f381\">give up on social media<\/a>? Many people are thinking about that in the wake of revelations regarding <a href=\"https:\/\/www.theguardian.com\/news\/series\/cambridge-analytica-files\">Cambridge Analytica\u2019s questionable use<\/a> of personal data from over 50 million Facebook users to support the Trump campaign. Not to mention the troubles with <a href=\"https:\/\/slate.com\/technology\/2018\/03\/cambridge-analytica-demonstrates-that-facebook-needs-to-give-researchers-more-access.html\">data theft<\/a>, <a href=\"https:\/\/datasociety.net\/output\/dead-reckoning\/\">trolling, harassment<\/a>, the <a href=\"https:\/\/datasociety.net\/output\/lexicon-of-lies\/\">proliferation of fake news<\/a>, <a href=\"https:\/\/datasociety.net\/output\/media-manipulation-and-disinfo-online\/\">conspiracy theories and Russian bots<\/a>. 
<\/p>\n<p>The <a href=\"https:\/\/www.vox.com\/policy-and-politics\/2018\/3\/21\/17144748\/case-against-facebook\">real societal problem<\/a> might be <a href=\"https:\/\/slate.com\/technology\/2018\/03\/the-real-scandal-isnt-cambridge-analytica-its-facebooks-whole-business-model.html\">Facebook\u2019s business model<\/a>. Along with other social media platforms, it makes money by nudging users to provide their data (without understanding the potential consequences), and then using that data in ways well beyond what people may expect.<\/p>\n<p>As researchers who <a href=\"https:\/\/scholar.google.com\/citations?user=xKR6oTIAAAAJ&amp;hl=en\">study social media<\/a> and the <a href=\"https:\/\/scholar.google.com\/citations?user=iQaNa-kAAAAJ&amp;hl=en\">impact of new technologies on society<\/a> in both the past and the present, we share these concerns. However, we\u2019re <a href=\"https:\/\/medium.com\/@cfiesler\/why-data-sharing-privacy-controversies-arent-killing-social-media-platforms-a3e3ecfdd801\">not ready to give up<\/a> on the idea of social media just yet. A main reason is that, like all forms of <a href=\"https:\/\/mitpress.mit.edu\/books\/always-already-new\">once \u201cnew\u201d media<\/a> (including everything from the telegraph to the internet), social media has become an <a href=\"http:\/\/www.pewinternet.org\/2018\/03\/01\/social-media-use-in-2018\/\">essential conduit<\/a> for interacting with other people. We don\u2019t think it\u2019s reasonable for users to be told their only hope of <a href=\"https:\/\/www.nytimes.com\/2018\/03\/19\/opinion\/facebook-cambridge-analytica.html\">avoiding exploitation<\/a> is to isolate themselves. 
And for many vulnerable people, including members of <a href=\"https:\/\/mitpress.mit.edu\/books\/out-shadows-streets\">impoverished, marginalized or activist communities<\/a>, leaving Facebook is <a href=\"https:\/\/slate.com\/technology\/2018\/03\/dont-deletefacebook-thats-not-good-enough.html\">simply not possible<\/a> anyway.<\/p>\n<p>As individuals, and society as a whole, come to better understand the role social media plays in life and politics, they\u2019re wondering: Is it possible \u2013 or worthwhile \u2013 to trust Facebook?<\/p>\n<h2>Designing for attention<\/h2>\n<p>Of course, social media platforms don\u2019t exist without their users. Facebook has grown from its origins serving only college students by exploiting the <a href=\"https:\/\/hbr.org\/product\/information-rules-a-strategic-guide-to-the-network-economy\/863X-HBK-ENG\">network effect<\/a>: If all your friends are socializing on the site, it\u2019s tempting to join yourself. Over time this network effect has made Facebook not only more valuable, but also harder to leave. <\/p>\n<p>However, now that Facebook and its ilk are under fire, it\u2019s possible that those network effects might unravel the other way: Facebook\u2019s <a href=\"https:\/\/techcrunch.com\/2018\/01\/31\/facebook-q4-2017-earnings\/\">number of active users continued to rise in 2017<\/a>, but in the final three months of the year, its growth showed signs of slowing. If all your friends are leaving Facebook, you might go with them.<\/p>\n<p>The design of social media platforms like Facebook \u2013 and many other common apps, such as Uber \u2013 is intentionally engrossing. Some scholars go so far as to call it \u201c<a href=\"https:\/\/press.princeton.edu\/titles\/9156.html\">addictive<\/a>,\u201d but we\u2019re uncomfortable using the term so broadly in this context. 
Nevertheless, digital designers <a href=\"https:\/\/darkpatterns.org\/\">manipulate users\u2019 behavior<\/a> with a wide array of interface elements and <a href=\"http:\/\/captology.stanford.edu\/wp-content\/uploads\/2015\/02\/RSA-The-new-rules-of-persuasion.pdf\">interaction strategies<\/a>, such as <a href=\"https:\/\/yalebooks.yale.edu\/book\/9780300122237\/nudge\">nudges<\/a> and cultivating routines and habits, to keep users\u2019 attention.<\/p>\n<p>Attention is at the center of the social media business model because it\u2019s worth money: Media theorist Jonathan Beller has observed that \u201c<a href=\"http:\/\/www.cabinetmagazine.org\/issues\/24\/beller.php\">human attention is productive of value<\/a>.\u201d<\/p>\n<h2>Playing tricks on users<\/h2>\n<p>To attract users, keep them engaged and ensure they want to come back, companies manipulate the details of visual interfaces and user interaction. For example, the <a href=\"https:\/\/papers.ssrn.com\/sol3\/papers.cfm?abstract_id=2686227\">ride-sharing app Uber<\/a> shows customers <a href=\"https:\/\/motherboard.vice.com\/en_us\/article\/mgbz5a\/ubers-phantom-cabs\">phantom cars<\/a> to trick them into thinking drivers are nearby. The company uses similar <a href=\"https:\/\/www.nytimes.com\/interactive\/2017\/04\/02\/technology\/uber-drivers-psychological-tricks.html\">psychological tricks<\/a> when sending drivers text messages encouraging them to stay active.<\/p>\n<p>This manipulation is particularly effective when app developers <a href=\"https:\/\/www.nngroup.com\/articles\/the-power-of-defaults\/\">set default options<\/a> for users that serve the company\u2019s needs. 
For example, some privacy policies make <a href=\"https:\/\/doi.org\/10.1023\/A:1015044207315\">users opt out of sharing their personal data, while others allow users to opt in<\/a>. This initial choice affects not only what information users end up disclosing, but also their overall trust in the <a href=\"https:\/\/doi.org\/10.1108\/14684520710832342\">online platform<\/a>. Some of the <a href=\"https:\/\/www.vox.com\/technology\/2018\/3\/21\/17148852\/mark-zuckerberg-facebook-cambridge-analytica-breach\">measures announced<\/a> by Facebook CEO Mark Zuckerberg in the wake of the Cambridge Analytica revelations \u2013 including tools showing users which third parties have access to their personal data \u2013 could further complicate the design of the site and discourage users even more. <\/p>\n<h2>Frameworks of trust<\/h2>\n<p>Was users\u2019 trust in Facebook misplaced in the first place? Unfortunately, we think so. Social media companies have never been transparent about what they\u2019re up to with users\u2019 data. Without <a href=\"http:\/\/www.kellogg.northwestern.edu\/trust-project\/videos\/grayson-ep-1.aspx\">full information about what happens<\/a> to their personal data once it\u2019s gathered, we recommend people default to not trusting companies until they\u2019re convinced they should. Yet neither regulations nor third-party institutions currently exist to ensure that social media companies are trustworthy.<\/p>\n<p>This is not the first time new technologies created social change that disrupted established mechanisms of trust. For example, in the industrial revolution, new forms of organization like factories, and major demographic shifts from migration, increased contact among strangers and across cultures. That altered established relationships and forced people to do business with unknown merchants.<\/p>\n<p>People could <a href=\"https:\/\/doi.org\/10.1086\/228791\">no longer rely<\/a> on interpersonal trust. 
Instead, <a href=\"http:\/\/psycnet.apa.org\/record\/1988-10420-001\">new institutions<\/a> arose: Regulatory agencies like the Interstate Commerce Commission, trade associations like the American Railway Association, and other third parties like the American Medical Association\u2019s Council on Medical Education established systematic <a href=\"https:\/\/www.russellsage.org\/publications\/cooperation-without-trust-0\">rules for transactions<\/a>, standards for product quality and professional training. They also offered accountability if <a href=\"https:\/\/www.researchgate.net\/publication\/261707664_Solving_the_Problem_of_Trust\">something went wrong<\/a>.<\/p>\n<h2>A new need for protection<\/h2>\n<p>There are <a href=\"https:\/\/doi.org\/10.1515\/auk-2004-0111\">not yet similar standards<\/a> and accountability requirements for 21st-century technologies like social media. In the U.S., the <a href=\"https:\/\/yalebooks.yale.edu\/book\/9780300122237\/nudge\">Federal Trade Commission<\/a> is one of the few regulatory bodies working to hold digital platforms to account for business practices that are deceptive or potentially unfair. The <a href=\"https:\/\/www.bloomberg.com\/news\/articles\/2018-03-20\/ftc-said-to-be-probing-facebook-for-use-of-personal-data\">FTC is now investigating<\/a> Facebook over the Cambridge Analytica situation.<\/p>\n<p>There is <a href=\"https:\/\/pdfs.semanticscholar.org\/d764\/3e79b0a382ef535a2fcd49d351069690920f.pdf\">plenty of demand<\/a> for <a href=\"https:\/\/papers.ssrn.com\/sol3\/papers.cfm?abstract_id=2567476\">more supervision<\/a> of <a href=\"https:\/\/papers.ssrn.com\/sol3\/papers.cfm?abstract_id=2573181\">social media platforms<\/a>. Several existing proposals could <a href=\"http:\/\/codev2.cc\/\">regulate<\/a> and <a href=\"http:\/\/www.hup.harvard.edu\/catalog.php?isbn=9780674035072\">support<\/a> trust online. 
<\/p>\n<p>Other countries have rules, such as the EU\u2019s <a href=\"https:\/\/www.eugdpr.org\/\">General Data Protection Regulation<\/a> and Canada\u2019s <a href=\"https:\/\/www.priv.gc.ca\/en\/privacy-topics\/privacy-laws-in-canada\/the-personal-information-protection-and-electronic-documents-act-pipeda\/\">Personal Information Protection and Electronic Documents Act<\/a>. However, in the U.S., technology companies like Facebook have actively <a href=\"https:\/\/www.eff.org\/deeplinks\/2017\/10\/how-silicon-valleys-dirty-tricks-helped-stall-broadband-privacy-california\">blocked<\/a> and resisted these efforts while <a href=\"https:\/\/theintercept.com\/2018\/03\/21\/ftc-facebook-chuck-schumer\/\">policymakers<\/a> and other tech gurus have convinced people they\u2019re not necessary.<\/p>\n<p>Facebook has the technical know-how to give users more control over their private data, but <a href=\"https:\/\/medium.com\/@shanegreen\/facebook-ignored-recommendations-from-2016-internal-study-on-their-data-and-privacy-problem-6dc7c5f75b6f\">has chosen not to<\/a> \u2013 and that\u2019s not surprising. No laws or other institutional rules require it, or provide necessary oversight to ensure that it does. 
Until a major social media platform like Facebook is <a href=\"https:\/\/www.npr.org\/2018\/03\/21\/595791380\/sen-richard-blumenthal-weighs-in-on-how-congress-could-regulate-facebook\">required<\/a> to reliably and transparently demonstrate that it is protecting the interests of its users \u2013 as distinct from its advertising customers \u2013 the calls to <a href=\"https:\/\/www.theguardian.com\/commentisfree\/2018\/mar\/22\/restructure-facebook-ftc-regulate-9-steps-now\">break the company up<\/a> and start afresh are only going to grow.<\/p>\n<p><span><a href=\"https:\/\/theconversation.com\/profiles\/denise-anthony-454904\">Denise Anthony<\/a>, Professor of Sociology, <em><a href=\"http:\/\/theconversation.com\/institutions\/dartmouth-college-1720\">Dartmouth College<\/a><\/em> and <a href=\"https:\/\/theconversation.com\/profiles\/luke-stark-454903\">Luke Stark<\/a>, Postdoctoral Fellow in Sociology, <em><a href=\"http:\/\/theconversation.com\/institutions\/dartmouth-college-1720\">Dartmouth College<\/a><\/em><\/span><\/p>\n<p>This article was originally published on <a href=\"http:\/\/theconversation.com\">The Conversation<\/a>. Read the <a href=\"https:\/\/theconversation.com\/dont-quit-facebook-but-dont-trust-it-either-93776\">original article<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Denise Anthony, Dartmouth College and Luke Stark, Dartmouth College Is it time to give up on social media? Many people are thinking about that in the wake of revelations regarding Cambridge Analytica\u2019s questionable use of personal data from over 50 million Facebook users to support the Trump campaign. 
Not to mention the troubles with data [&hellip;]<\/p>\n","protected":false},"author":44,"featured_media":11683,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[8],"tags":[483,549,2024,702,1748,506],"_links":{"self":[{"href":"https:\/\/www.lifeandnews.com\/articles\/wp-json\/wp\/v2\/posts\/11682"}],"collection":[{"href":"https:\/\/www.lifeandnews.com\/articles\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.lifeandnews.com\/articles\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.lifeandnews.com\/articles\/wp-json\/wp\/v2\/users\/44"}],"replies":[{"embeddable":true,"href":"https:\/\/www.lifeandnews.com\/articles\/wp-json\/wp\/v2\/comments?post=11682"}],"version-history":[{"count":1,"href":"https:\/\/www.lifeandnews.com\/articles\/wp-json\/wp\/v2\/posts\/11682\/revisions"}],"predecessor-version":[{"id":11684,"href":"https:\/\/www.lifeandnews.com\/articles\/wp-json\/wp\/v2\/posts\/11682\/revisions\/11684"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.lifeandnews.com\/articles\/wp-json\/wp\/v2\/media\/11683"}],"wp:attachment":[{"href":"https:\/\/www.lifeandnews.com\/articles\/wp-json\/wp\/v2\/media?parent=11682"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.lifeandnews.com\/articles\/wp-json\/wp\/v2\/categories?post=11682"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.lifeandnews.com\/articles\/wp-json\/wp\/v2\/tags?post=11682"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}