Monday, April 18, 2011

Session 7: Management and Conflict


I chose to analyze the rules of Facebook for this week’s assignment. For the final project, I will compare the features of Facebook with those of a similar Korean social network site, Cyworld. The official rules governing Facebook can be found by clicking the “Terms” link at the bottom of the main page, which takes you to the Statement of Rights and Responsibilities (http://www.facebook.com/terms.php); that statement includes additional links to a detailed privacy policy (http://www.facebook.com/policy.php), a safety policy, and other documents. The same rules can also be found on the Help Center page (http://www.facebook.com/help/), which doubles as a useful guide for Facebook users.


Broken Rules


1. This group created a page to discuss anti-Israel views; many of its members used hate speech, including profanity, directed at Jews on the basis of ethnicity, religion, or national origin. This violates the safety policy that “You will not post content that is hateful, threatening… or [that] contains graphic or gratuitous violence.” Because there is a built-in reporting feature, other users can flag the hate speech or violent behavior. If I were an administrator, I would first let users leave feedback before removing the group for sharing inappropriate content. Answerbag takes a similar approach: its feedback system allows users to flag content as spam or inappropriate for administrator review (Gazan, 2009). However, if the group were planning terrorism or collecting money to support threats, I would remove it immediately; the safety policy states that “Any credible threats to harm others will be removed. We may also remove support for violent organizations that intimidate or harass any user.”
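To make the flow I am proposing concrete, here is a minimal sketch (not Facebook’s actual implementation) of a report queue where community feedback accumulates before an administrator reviews a group, while credible threats are removed at once. The names (ModerationQueue, REVIEW_THRESHOLD, the reason strings) are all hypothetical.

```python
from collections import defaultdict

REVIEW_THRESHOLD = 5          # reports needed before a group reaches admin review
IMMEDIATE_REMOVAL_REASONS = {"credible_threat", "violent_organization"}

class ModerationQueue:
    def __init__(self):
        self.reports = defaultdict(list)   # group_id -> list of (reporter, reason)
        self.pending_review = set()        # groups awaiting administrator review
        self.removed = set()               # groups already taken down

    def report(self, group_id, reporter, reason):
        """Record a user report and decide what happens next."""
        if group_id in self.removed:
            return "already removed"
        if reason in IMMEDIATE_REMOVAL_REASONS:
            # Credible threats are removed without waiting for more feedback.
            self.removed.add(group_id)
            return "removed immediately"
        self.reports[group_id].append((reporter, reason))
        if len(self.reports[group_id]) >= REVIEW_THRESHOLD:
            # Enough community feedback has accumulated; queue it for a human admin.
            self.pending_review.add(group_id)
            return "queued for administrator review"
        return "report recorded"

queue = ModerationQueue()
print(queue.report("hate_speech_group", "user1", "hate_speech"))
print(queue.report("hate_speech_group", "user2", "credible_threat"))
```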

 


 
2. This user continuously posted pornographic content to their wall and promoted their adult video website with a link. This violates Facebook’s strict policy on nudity and pornography, which states that “any content that is inappropriately sexual will be removed.” Because other users can report the person for inappropriate wall posts, or even block them, administrators can remove the pornographic content and suspend the account immediately after receiving the feedback. As Gazan (2007) explains, many rogue behaviors are identified by site moderators through reviewing and monitoring users’ feedback. However, unlike Flickr or YouTube, which both have a safety mode as the default setting, Facebook has no feature to filter content that is unsuitable for minors. I would create such a feature, which supports the idea that administrators and designers should listen to users’ feedback and never stop refining a user-centered design (Grimes et al., 2008).
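The filtering feature I have in mind could work roughly like the sketch below: posts that moderators have confirmed as adult content are hidden from minors and from anyone with safety mode switched on by default. The Post and Viewer objects and the "adult" flag are hypothetical; Facebook exposes no such API.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    adult: bool = False    # set after moderators confirm reports of sexual content

@dataclass
class Viewer:
    name: str
    age: int
    safe_mode: bool = True   # filtering on by default, as suggested above

def visible_posts(posts, viewer):
    """Return only the posts this viewer should see."""
    if viewer.age < 18 or viewer.safe_mode:
        return [p for p in posts if not p.adult]
    return posts

wall = [Post("spammer", "buy my adult videos", adult=True),
        Post("friend", "happy birthday!")]
print([p.text for p in visible_posts(wall, Viewer("minor_user", age=15))])
```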


 
3. There are several fake Facebook accounts and fan pages for Justin Bieber in the search results. These violate the Registration and Account Security policy that “You will not provide any false personal information on Facebook, or create an account for anyone other than yourself without permission,” as well as the rule that “You will not post content or take any action on Facebook that infringes or violates someone else’s rights or otherwise violates the law.”




 
I have heard that some fans create fake accounts to promote their favorite celebrities. Even though there is already an official account, as in the screenshot above, the user in the screenshot below uses almost the same images and page structure to imitate Justin Bieber. What is worse, some people believe the page is real because its name includes the word “Official.” Because impersonation is not allowed on Facebook, there is a feature that lets you report the person by choosing the option “This profile is impersonating someone or is fake.” In practice, though, people can create multiple accounts with false information under different email addresses, so these fake accounts are not easily removed; verifying identities takes time and raises privacy issues. To build trust between users, I would instead limit each IP address to creating only one account.

There are also many groups created for reporting abuse and violations of Facebook’s rules. It is good that people gather and send feedback together so that administrators will pay attention: “By working collectively, users have a much stronger ability to influence producers to make alterations. Functioning as a collective group in the form of a team or guild allows users the ability to exert more formidable pressure than they could ever accomplish individually” (Grimes et al., 2008). At the same time, there are so many similar groups that they split people into smaller and smaller clusters; given this, I would combine them into a single “Feedback Center,” built around updated community standards and policy documents: “Oversight increased both the quantity and quality of contributions while reducing antisocial behavior, and peers were as effective at oversight as experts” (Cosley et al., 2005).
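As a rough illustration of the “one account per IP address” idea above, a registration check could look like the sketch below. The signup function and in-memory store are hypothetical and only show the idea; they are not how Facebook actually verifies accounts.

```python
accounts_by_ip = {}   # ip address -> account name already registered from it

def register(name, email, ip_address):
    """Allow registration only if no account exists yet for this IP address."""
    if ip_address in accounts_by_ip:
        raise ValueError(
            f"An account ({accounts_by_ip[ip_address]}) already exists for {ip_address}")
    accounts_by_ip[ip_address] = name
    return f"Account '{name}' created for {email}"

print(register("real_fan", "fan@example.com", "203.0.113.7"))
# A second registration from the same address is rejected:
# register("fake_bieber", "fake@example.com", "203.0.113.7")  -> ValueError
```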
Finally, here are Five Unwritten Rules that I came up with for users:
  1. Do not share personal information with anyone.
  2. Exercise your freedom of speech and expression, as long as it does not violate Facebook’s rules.
  3. Be aware of the importance of diversity.
  4. Take time to learn newly released features and applications.
  5. Act like you are in the real world; treat other users with respect and good faith.
References
Cosley, Dan, Dan Frankowski, Sara Kiesler, Loren Terveen, and John Riedl (2005). How Oversight Improves Member-Maintained Communities. CHI 2005, April 2-7, 2005, Portland, Oregon.
Grimes, Justin, Paul Jaeger, and Kenneth Fleischmann (2008). Obfuscatocracy: A stakeholder analysis of governing documents for virtual worlds. First Monday 13(9).
Gazan, Rich (2009). When Online Communities Become Self-Aware. Proceedings of the 42nd Hawaii International Conference on System Sciences, Waikoloa, HI, 5-8 January 2009.
Gazan, Rich (2007). Understanding the Rogue User. In: Diane Nahl and Dania Bilal, eds. Information & Emotion: The Emergent Affective Paradigm in Information Behavior Research and Theory. Medford, New Jersey: Information Today, 177-185.