Censorship

#OccupationUQAM

In "Crowd, Power and Post-Democracy in the 21st Century”, Zizek (2008) argues that when the normal run of things is traumatically interrupted, the field is open for a ‘discursive’ ideological competition. In other words, when there is a crisis.


After our class yesterday there was an occupation at l’UQAM. It ended late into the night with an unnecessarily large police presence in riot gear, endless tear gas burning throats, and violent dispersal tactics. The night occupation was an immediate reaction to the events of earlier that day at UQAM.

Earlier that day, on the campus of the Université du Québec à Montréal, facing a court order demanding that classes be held and the threat of expulsion issued by their administration, hundreds of students turned up to disrupt classes and enforce their democratically voted strike mandate.

In response, the university administration called in the Montreal police, who arrived in full riot gear with pepper spray, tear gas, rubber bullets and batons at the ready. Paradoxically enough, their stated role there was to ensure that classes could occur as scheduled.

The students were soon boxed in. The bulk of the reported 22 arrests happened as riot police swept onto campus. Students then set up barricades, and police formed a line and prepared to move in.

The occupation was also a response to larger institutional issues: the different forms of violence against students (physical, economic, etc.) and the austerity myth. The violence of those in power is far more insidious and invisible than the violence of destroying university property, and that asymmetry creates an unbalanced imaginary in this political struggle.

When we see images in the media and read the discourse of others that tries to decontextualize a serious and complicated issue, focusing only on ‘violent protestors doing damage to property’, we must remember history, context, and the nuanced ways that bodies are excluded from spaces. What are the modes of recourse against the systemic and systematic violence we are faced with? Yes, all of us, whether we recognize it or not. Admittedly, some of us are affected by institutional violence more than others, which is where our friend intersectionality comes in. And even then, we must reflect on the subject position we come from and how it shapes the way we orient towards events like the occupation last night. Why do we have x or y opinion on protests/protestors? What has shaped that perspective?

We must also remember the ways in which regulatory powers favor property over human bodies. When are riot gear, tear gas, rubber bullets, etc. appropriate modes of policing bodies? (Remember the Jason Farman example about security cameras increasingly pointed at computer labs rather than placed for the safety of students on campus.)

[Photo gallery of the occupation]

Photos courtesy of Caroline Ramirez.

Blog Post 3: What the Removal of Rupi Kaur’s Photo Says About Our Society

Rupi Kaur’s recent Instagram post, displaying menstrual blood on her pants and bed, was removed twice before being restored to her profile. The episode shows that the bodies that are “allowed” to take up online space remain governed by minds (still) ruled by the asymmetrical gender norms established by our society.

[Rupi Kaur’s Instagram photo]

As menstrual blood is something that women are meant to hide and “keep quiet” about, Rupi Kaur refused to conform to that social norm, and chose instead to share the realities of womanhood by posting to Instagram a photo deemed explicit. In a Facebook post that went viral this past weekend, Kaur explained her horror at the removal of the photo and her disbelief at the stigmatized notions surrounding menstruation, as demonstrated by the initial deletion.

Kaur’s post represented a lack of self-surveillance: because it did not conform to what is expected and accepted by the dominant hegemony, it exposed a truth that half of the world should be able to publicly identify with. Instead, for many, the shame attached to menstruation led to an initial reaction of rejecting the menstruating female identity that Kaur displayed.

Although Kaur is an “attractive” woman by most stereotypical standards (an able-bodied, fairly light-skinned young woman), her body was rejected from the public online space of Instagram because of the direct affiliation between her body and the period stains representing menstruation, which remains a stigmatized topic.

What is fascinating about her post going viral is that these archaic notions surrounding menstruation are being confronted and discussed by those who have viewed, and shared, the photo. Dominant ideologies concerning menstruation that involve silence and shame are now being criticized and re-conceptualized as empowering. In her Facebook post (which has 71,301 likes and 71,206 shares) Kaur encourages people to think of menstruation not as a disgusting bodily process but, rather, as “a source of life for our species,” considered holy by many civilizations. In this way, Kaur questions the misogyny and patriarchy demonstrated by the people working at Instagram, and insists that these old-fashioned ideals must be confronted with vigour.

One might even say that Kaur has become a micro-celebrity with this post. Characterized by the interaction she has with her followers online, she has inspired many to openly critique the negative reactions of those at Instagram, and of society at large. In fact, Kaur continues to interact with her followers through a website she has created, which features a photo series of similar work: http://www.rupikaur.com. The photo series publicly displays the realities of her body in private space as it deals with the natural process of menstruation. Once again, through another online space, Kaur calls into question censorship, gender, and identity.

Do you think that Rupi’s work would have been as well received if it had been displayed in a public space such as a gallery rather than made public through an online space? Do you think that the fact that Rupi has an “attractive” body facilitated the acceptance and popularity of the post?

The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed

This weekend I was presenting at the annual Society for Cinema and Media Studies Conference as part of a panel arranged by Prof. Fenwick McKelvey. Sarah T. Roberts was part of the two-part panel and shared with us an eye-opening presentation on commercial content moderators (CCMs) -- the people who do the labour of approving or removing objectionable content when it is flagged. They are the ones who experience the reality of the internet in order to keep up the internet mythology we believe in. Below is the only available article on the issue, an article that took years to put together and relied heavily on Roberts's ethnographic research.

by Adrian Chen, 23 October 2014.

The campuses of the tech industry are famous for their lavish cafeterias, cushy shuttles, and on-site laundry services. But on a muggy February afternoon, some of these companies’ most important work is being done 7,000 miles away, on the second floor of a former elementary school at the end of a row of auto mechanics’ stalls in Bacoor, a gritty Filipino town 13 miles southwest of Manila. When I climb the building’s narrow stairwell, I need to press against the wall to slide by workers heading down for a smoke break. Up one flight, a drowsy security guard staffs what passes for a front desk: a wooden table in a dark hallway overflowing with file folders.

Past the guard, in a large room packed with workers manning PCs on long tables, I meet Michael Baybayan, an enthusiastic 21-year-old with a jaunty pouf of reddish-brown hair. If the space does not resemble a typical startup’s office, the image on Baybayan’s screen does not resemble typical startup work: It appears to show a super-close-up photo of a two-pronged dildo wedged in a vagina. I say appears because I can barely begin to make sense of the image, a baseball-card-sized abstraction of flesh and translucent pink plastic, before he disappears it with a casual flick of his mouse.

Baybayan is part of a massive labor force that handles “content moderation”—the removal of offensive material—for US social-networking sites. As social media connects more people more intimately than ever before, companies have been confronted with the Grandma Problem: Now that grandparents routinely use services like Facebook to connect with their kids and grandkids, they are potentially exposed to the Internet’s panoply of jerks, racists, creeps, criminals, and bullies. They won’t continue to log on if they find their family photos sandwiched between a gruesome Russian highway accident and a hardcore porn video. Social media’s growth into a multibillion-dollar industry, and its lasting mainstream appeal, has depended in large part on companies’ ability to police the borders of their user-generated content—to ensure that Grandma never has to see images like the one Baybayan just nuked.

“EVERYBODY HITS THE WALL. YOU JUST THINK, ‘HOLY SHIT, WHAT AM I SPENDING MY DAY DOING?’”

So companies like Facebook and Twitter rely on an army of workers employed to soak up the worst of humanity in order to protect the rest of us. And there are legions of them—a vast, invisible pool of human labor. Hemanshu Nigam, the former chief security officer of MySpace who now runs online safety consultancy SSP Blue, estimates that the number of content moderators scrubbing the world’s social media sites, mobile apps, and cloud storage services runs to “well over 100,000”—that is, about twice the total head count of Google and nearly 14 times that of Facebook.

This work is increasingly done in the Philippines. A former US colony, the Philippines has maintained close cultural ties to the United States, which content moderation companies say helps Filipinos determine what Americans find offensive. And moderators in the Philippines can be hired for a fraction of American wages. Ryan Cardeno, a former contractor for Microsoft in the Philippines, told me that he made $500 per month by the end of his three-and-a-half-year tenure with outsourcing firm Sykes. Last year, Cardeno was offered $312 per month by another firm to moderate content for Facebook, paltry even by industry standards.

Here in the former elementary school, Baybayan and his coworkers are screening content for Whisper, an LA-based mobile startup—recently valued at $200 million by its VCs—that lets users post photos and share secrets anonymously. They work for a US-based outsourcing firm called TaskUs. It’s something of a surprise that Whisper would let a reporter in to see this process. When I asked Microsoft, Google, and Facebook for information about how they moderate their services, they offered vague statements about protecting users but declined to discuss specifics. Many tech companies make their moderators sign strict nondisclosure agreements, barring them from talking even to other employees of the same outsourcing firm about their work.

“I think if there’s not an explicit campaign to hide it, there’s certainly a tacit one,” says Sarah Roberts, a media studies scholar at the University of Western Ontario and one of the few academics who study commercial content moderation. Companies would prefer not to acknowledge the hands-on effort required to curate our social media experiences, Roberts says. “It goes to our misunderstandings about the Internet and our view of technology as being somehow magically not human.”

Read more »»»

Week 9 — Online Bodies

This week we focused on a variety of online bodies and spaces. We were going to have Ana Voog Skype/chat with us, but that didn’t work out and in the end we didn’t focus on her at all, even though she is the most prolific camgirl that ever existed. Here is the video she made for us!

The space of the blog filled up today with a lot of interesting stories about sexuality, censorship, and shaming online, which shows how invested we are in the internet and our lives online. This stuff really affects us. Digital dualism, go away!

For this week, we read two historical pieces about the internet—Terri Senft’s chapter from CamGirls, and a series of conversations—focused on the internet as an embodied space, and the ways in which particular bodies embody the internet (before the internet was mobile). We unpacked reductive and oppressive ideas about narcissism, about shaming others and their bodies through modes of surveillance, and about viewing the expression of sexuality as frivolous and lacking in political valence. The two historical pieces are meant to help you reflect on the way we function on the internet today, particularly around social media, identity, and image sharing. I assigned the series of emails to demonstrate that forms of knowledge do not just come in published books, articles, or distributed films. It is important that we take narratives and epistolary exchanges just as seriously; they matter for our thinking, and they are also primary research material, particularly for reading first-hand accounts of how people experienced the internet and themselves online in its early days. We also briefly looked at contemporary artist Jillian Mayer and a community of women attempting to circumvent censorship online.

We looked at the ways in which surveillance, self-surveillance, and sousveillance are enacted on the internet, the way we police other people’s behaviours on social media, and the ways in which we internalize ideologies and self-monitor our own behaviour.

Until the late 1990s, being on the Internet typically meant communicating with peers on Usenet discussion forums, IRC, and massively multiplayer text-based online environments/games. What’s important for us is that in 1993, the Mosaic web browser became the first to display inline images, rather than opening them separately in a new window. This started the internet as we know it now: a visually based medium.

We presented the history of the practice of webcamming, recognizing that these women (camgirls) built the very web platforms they used to distribute their work: it is not simply that there were no templates that made the upload and distribution of images readily accessible (like Squarespace, Tumblr, or WordPress); the development of a culture of online content sharing of any form was contemporaneous with the construction of the platforms used for such practices. There was no Kickstarter or Patreon to ask people for funding.

These users built their own spaces and their own communities; they built space where there was no space for them. We must ask ourselves why we judge these experiences as narcissistic, as opposed to other forms of expression. Why is it that when a woman wants to control her own image it is seen as narcissistic?


Twitter bans nonconsensual intimate photos, a.k.a. ‘revenge porn’

Mere moments ago this appeared on my Twitter feed. Fitting for today's lively discussion. We must take seriously the space many of us spend a lot of time within — cyberspace! As important as it is that platforms help create safer spaces for everyone, we must also question the near-ubiquity of posting other people's nude photos as a shaming method. How did this start? Why is it so effective and affective? Why do we use each other's bodies and the expressions of our bodies as shaming devices? Why is the nude body such a threat that it can be used to ruin people's lives? Why do we find sexuality threatening and 'inappropriate'? Why is there so much discourse that states, "well if she didn't want her photos leaked, she shouldn't have taken them"? That is the same logic we saw when we focused on the Pamela George case, and the same logic that plagues so many rape/murder cases — "She shouldn't have been walking there." Or the historical discourse that teaches women how not to get raped rather than teaching men not to rape. How do we reproduce the rhetoric that sticks, making people's lives so unbearable that some of them commit suicide?

What about if you had posted images online and now you want them removed? What happens then? Can people not change their mind about what content they want available? How?

These questions also point to the fallacy of the online/offline virtual/real binary.

by Kashmir Hill

It has historically been a nightmare if nude or intimate photos of you made their way out onto the Internet. Beyond the sheer embarrassment of exposure, it was very, very hard to get those photos removed. If pleas to websites to take down revealing pics posted by vengeful exes or hackers didn’t work, women — and occasionally men — resorted to creative legal threats, claiming copyright over scandalous selfies or filing lawsuits saying that the posting was an invasion of privacy. Websites, protected from liability for what their users posted, were often unsympathetic and legally in the clear. But the tide is starting to change around nonconsensual porn — also called “revenge porn” — with social media platforms making it easier for people to get pics they never wanted publicly exposed taken down. Last month, Reddit banned revenge porn. On Wednesday, Twitter followed suit.

Technically, Twitter added this clause to its rules: “You may not post intimate photos or videos that were taken or distributed without the subject’s consent.” So if your vengeful ex decides to tweet a graphic present you bestowed upon him when you were dating, you’ll now be able to report it and Twitter says it will take it down “in a timely manner.” In a recent blog post, Twitter said it’s tripled the size of its abuse response team, and responds to reports far more quickly now, though the company doesn’t give out specific numbers.

[…]

So, how exactly will this thing work? I asked the employee just how intimate a photo needs to be for a person to take it down. Does it need to be X-rated or could it be a “nonconsensual” underwear bulge or side boob shot? He said that while there’s no hard-and-fast rule on what counts as intimate, the company is trying to get at the ‘range of horrendous behaviors that people engage in’ including not just full frontals and lingerie shots, but up-skirt photos and perhaps even what Reddit likes to call “creep shots,” revealing photos taken of unsuspecting women in public.

One catch is that you have to recognize yourself in the photo and report it; Twitter doesn’t want “body police” going through tweets and reporting every pornographic image they find. If an offending tweet is removed, all native retweets will disappear too, but you’ll have to report all manual RTs and any further postings of the photo or video. Twitter does not yet have a technical way to block a given photo once it’s been flagged as banned, though the company is working on it. Franks, for one, thinks it’s problematic that bystanders can’t report the posting of explicit images of others. “Every minute private sexual material is available increases the number of people who can view it, download it, and forward it, so even if Twitter responds quickly to complaints, it may be too late to stop the material from going viral,” she said by email.

Read more

#loveyourlines

As discussed earlier in class, the Instagram community guidelines are bizarre in the sense that they prioritize some images over others. There is no mention of violence, for example, yet sexual content is immediately censored. However, body parts can be shown in certain contexts, such as educational or marketing ones, or if showing them benefits the community.

[Screenshot of Instagram’s community guidelines]

This last point made me think about the #loveyourlines Instagram page. The account has more than 106K followers and is a true inspiration for women all over the world who live with stretch marks. When I think about accounts featuring breastfeeding or breast cancer survivors’ images that get shut down, I think about this page. The photos can sometimes be very graphic, but they are in no way sexual. The point is to show real women and real bodies, and to spread self-love to others.

How can Instagram even think of shutting down this page? It is not even an option. Photos are sent in by very different kinds of women, of all ages, but they all share this one thing, and I think that is what makes the page special.