Spaces

#OccupationUQAM

In “Crowd, Power and Post-Democracy in the 21st Century” (2008), Žižek argues that when the normal run of things is traumatically interrupted, the field is open for a ‘discursive’ ideological competition. In other words, when there is a crisis.


After our class yesterday there was an occupation at l’UQAM. It ended late in the night with an unnecessarily large police presence in riot gear, endless tear gas burning throats, and violent dispersal tactics. The night occupation was an immediate reaction to the events earlier that day at UQAM.

Today, on the campus of the Université du Québec à Montréal, facing a court order demanding that classes be held and the threat of expulsion issued by their administration, hundreds of students turned up to disrupt classes and enforce their democratically voted strike mandate.

In response, the university administration called in the Montreal police, who arrived in full riot gear with pepper spray, tear gas, rubber bullets and batons at the ready. Paradoxically enough, their stated role there was to ensure that classes could occur as scheduled.

The students were soon boxed in. The bulk of the reported 22 arrests happened as riot police swept onto campus. Students then set up barricades, and police formed a line and prepared to move in.

The occupation was also a response to larger institutional issues: the different forms of violence against students (physical, economic, etc.) and the austerity myth. The violence of those in power is much more insidious and invisible than the violence of destroying university property, and this asymmetry creates an unbalanced imaginary in this political struggle.

When we see images in the media and read discourse that tries to decontextualize a serious and complicated issue, focusing only on ‘violent protestors doing damage to property’, we must remember history, context, and the nuanced ways that bodies are excluded from spaces. What are the modes of recourse against the systemic and systematic violence we are faced with? Yes, all of us! Whether we recognize it or not. Some of us, though, are affected by institutional violence more than others, which is where our friend intersectionality comes in. And even then, we must reflect on the subject position we come from in the ways we orient toward events like the occupation last night. Why do we hold this or that opinion on protests and protestors? What has shaped that perspective?

We must also remember the ways in which regulatory powers favor property over human bodies. When are riot gear, tear gas, rubber bullets, etc. appropriate modes of policing bodies? (Remember the Jason Farman example of security cameras being added to point at computer labs rather than to protect students on campus.)


Photos courtesy of Caroline Ramirez.


Debunk dah Funk: Rethinking Legends, Icons, & Rebels

https://vimeo.com/123808787

Blog Post 3: Spaces and Bodies Online – #LOVEYOURLINES

I have already briefly discussed the #LOVEYOURLINES Instagram account, but I thought it would be fitting to continue the discussion of this online space for the third and final blog post. This time I will be more critical and analytical of the subject matter.


 


#LOVEYOURLINES is a growing Instagram community (over 108K followers) that aims to inspire women all over the world who live with stretch marks. The black and white photos can sometimes be very graphic, since there is a lot of nudity, but they are meant to be testimonial rather than sexual. The account displays ‘real’ women’s bodies, submitted to the page admin by women who have consented to share them. By ‘real’ bodies I mean real compared to what we often see in the media: bodies that are digitally modified to enhance and sexualize women. This approach raises awareness among women and helps them realize that they are not the only ones living with skin covered in stretch marks.

As previously seen in class, Instagram tends to have double standards when it comes to women’s bodies, because censorship promotes, as they say, a “comfortable experience”. Systems such as the Instagram community guidelines have been put in place because every type of body can have access to the World Wide Web. For this reason, ‘non-normative’ bodies disrupt online spaces and are therefore censored. By doing this, I believe Instagram promotes stigma around important issues of women’s self-image. Since Instagram allows certain bodies to receive more attention, how does that shape our perception of bodies in online spaces? This is what I will investigate in this blog post. Certainly, Instagram must consider how we conceptualize bodies and interrogate ideas of the natural, of beauty, etc., before censoring any kind of excluded body.

#LOVEYOURLINES is an interesting case because Instagram is clearly an online space exclusionary to certain bodies. However, this page promotes the idea that every kind of body is accepted: black, white, teenagers, mothers, etc. This shapes our perception that, when Instagram is confronted with a supportive global community account, even one that displays non-conventional body types, it is okay for those images to be shared. Why? We can refer here to Lefebvre’s differential space: “(…) a new space (differential space) cannot be born (produced) unless it accentuates differences” (293). #LOVEYOURLINES functions as an online differential space that generates new, diverse relations, relations that foreground and emphasize shared differences and, more specifically, the experiences of women.

#LOVEYOURLINES is also a special place online because testimonials are the way the images are described for viewers. From a phenomenological point of view: “(…) the body no longer conceived as an object of the world, but as our means of communication with it, to the world no longer conceived as a collection of determinate objects, but as the horizon latent in all our experience (…)” (Merleau-Ponty 54). Starting from lived experience, #LOVEYOURLINES enables women who share the same struggles to connect and build a community. In this way, viewers’ participation in the online space has an impact on their day-to-day lives and their own perception of their bodies, because they know they are not alone.

Questions:

1. How can women feel empowered and strong when images of them are taken down from popular social sites such as Instagram? Is there an alternative to censorship?

2. As a social community, what can be done to support oppressed/censored individuals?

By Ana Patricia Bourgeois

 

Works Cited:

Lefebvre, Henri. “The Production of Space.” In Jen J. Gieseking and William Mangold (Eds.), The People, Place and Space Reader (289-293). London: Routledge.

Merleau-Ponty, Maurice. (2009). “The Experience of the Body and Classical Psychology.” In Mariam Fraser and Monica Greco (Eds.), The Body: A Reader (52-54). London: Routledge.

Gender equality in music, still struggling

Last week, Canadian singer-songwriter Kandle put up a post on Facebook showing the lack of female acts at some major music festivals in the US and the UK. She demonstrated this by removing the male acts from three festival posters: Coachella (US), Reading/Leeds (UK) and Download (UK). For anyone familiar with Coachella’s poster, for instance, that left us with a very starry, very empty sky.


The question to ask now is: what are the boundaries surrounding music festivals that limit women’s participation? That is, how are these spaces more open and accessible to male acts?

I believe this question goes beyond music festivals – it seems to be an even more profound issue: the answer basically lies in the way the music industry functions. It would not be fair to assume that female acts were refused participation in these festivals; the issue may instead lie in their access to the music industry in general. And if it is hard to know how many bands and music acts there are in the world, it is just as hard to know the male-to-female ratio in the music industry.

Based solely, and I really do mean solely, on what I know and what I have experienced, there seems to be a higher proportion of male musicians, so the odds of their presence at festivals are accordingly much higher.

That seems undeniable, yet, how did it end up like this? Why can’t we see/hear more women?

Does this have to do with the particular music genres these festivals promote? Would that mean the indie and metal scenes don’t like women as much as they like men?

Of course, other factors should come into consideration. Maybe there have been scheduling conflicts for some female acts; or, more realistically, no additional spots were made available to women’s acts, simply because those acts don’t exist!

If one of these festivals had been a pop festival, it would have been a whole different story. Beyoncé, Lady Gaga, Katy Perry, Rihanna, Taylor Swift… they would all have been there. The difference is, these women have a gigantic industry running after them. Is becoming a pop star the only route to success for female artists? And what restrains them from trying and showing their talent? Does this have to do with beauty standards? Does a woman need to be comfortable with her body first in order to perform in such a space? Where should they find their confidence?

I don’t necessarily intend to answer all these questions, but rather to stir up a discussion. The struggle is real and is surely great food for thought.

The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed

This weekend I was presenting at the annual Society for Cinema and Media Studies Conference as part of a panel arranged by Prof. Fenwick McKelvey. Sarah T. Roberts was part of the two-part panel and shared with us an eye-opening presentation on commercial content moderators (CCMs) -- the people who do the labour of approving or removing objectionable content when it is flagged. They are the ones who experience the reality of the internet in order to keep up the internet mythology we believe in. Below is the only available in-depth article on the issue, an article that took years to put together and relied heavily on Roberts's ethnographic research.

by Adrian Chen, 23 October 2014.

THE CAMPUSES OF the tech industry are famous for their lavish cafeterias, cushy shuttles, and on-site laundry services. But on a muggy February afternoon, some of these companies’ most important work is being done 7,000 miles away, on the second floor of a former elementary school at the end of a row of auto mechanics’ stalls in Bacoor, a gritty Filipino town 13 miles southwest of Manila. When I climb the building’s narrow stairwell, I need to press against the wall to slide by workers heading down for a smoke break. Up one flight, a drowsy security guard staffs what passes for a front desk: a wooden table in a dark hallway overflowing with file folders.

Past the guard, in a large room packed with workers manning PCs on long tables, I meet Michael Baybayan, an enthusiastic 21-year-old with a jaunty pouf of reddish-brown hair. If the space does not resemble a typical startup’s office, the image on Baybayan’s screen does not resemble typical startup work: It appears to show a super-close-up photo of a two-pronged dildo wedged in a vagina. I say appears because I can barely begin to make sense of the image, a baseball-card-sized abstraction of flesh and translucent pink plastic, before he disappears it with a casual flick of his mouse.

Baybayan is part of a massive labor force that handles “content moderation”—the removal of offensive material—for US social-networking sites. As social media connects more people more intimately than ever before, companies have been confronted with the Grandma Problem: Now that grandparents routinely use services like Facebook to connect with their kids and grandkids, they are potentially exposed to the Internet’s panoply of jerks, racists, creeps, criminals, and bullies. They won’t continue to log on if they find their family photos sandwiched between a gruesome Russian highway accident and a hardcore porn video. Social media’s growth into a multibillion-dollar industry, and its lasting mainstream appeal, has depended in large part on companies’ ability to police the borders of their user-generated content—to ensure that Grandma never has to see images like the one Baybayan just nuked.

“EVERYBODY HITS THE WALL. YOU JUST THINK, ‘HOLY SHIT, WHAT AM I SPENDING MY DAY DOING?’”

So companies like Facebook and Twitter rely on an army of workers employed to soak up the worst of humanity in order to protect the rest of us. And there are legions of them—a vast, invisible pool of human labor. Hemanshu Nigam, the former chief security officer of MySpace who now runs online safety consultancy SSP Blue, estimates that the number of content moderators scrubbing the world’s social media sites, mobile apps, and cloud storage services runs to “well over 100,000”—that is, about twice the total head count of Google and nearly 14 times that of Facebook.

This work is increasingly done in the Philippines. A former US colony, the Philippines has maintained close cultural ties to the United States, which content moderation companies say helps Filipinos determine what Americans find offensive. And moderators in the Philippines can be hired for a fraction of American wages. Ryan Cardeno, a former contractor for Microsoft in the Philippines, told me that he made $500 per month by the end of his three-and-a-half-year tenure with outsourcing firm Sykes. Last year, Cardeno was offered $312 per month by another firm to moderate content for Facebook, paltry even by industry standards.

Here in the former elementary school, Baybayan and his coworkers are screening content for Whisper, an LA-based mobile startup—recently valued at $200 million by its VCs—that lets users post photos and share secrets anonymously. They work for a US-based outsourcing firm called TaskUs. It’s something of a surprise that Whisper would let a reporter in to see this process. When I asked Microsoft, Google, and Facebook for information about how they moderate their services, they offered vague statements about protecting users but declined to discuss specifics. Many tech companies make their moderators sign strict nondisclosure agreements, barring them from talking even to other employees of the same outsourcing firm about their work.

“I think if there’s not an explicit campaign to hide it, there’s certainly a tacit one,” says Sarah Roberts, a media studies scholar at the University of Western Ontario and one of the few academics who study commercial content moderation. Companies would prefer not to acknowledge the hands-on effort required to curate our social media experiences, Roberts says. “It goes to our misunderstandings about the Internet and our view of technology as being somehow magically not human.”

Read more »»»

Terri Senft answers q’s about internet life

Theresa Senft (author of Camgirls, which we read) answered questions via video for Leandra Preston's class "Virtual Girls: Girls and Digital Media" at the University of Central Florida in 2011. These are cogent articulations of some of the issues we have been thinking about throughout the semester, and they inherently argue against digital dualism and the hierarchy and binary of virtual/real, online/offline.

1. Are people more honest offline or online?

2. Can you have sustainable friendships online?

3. What should be the role of the net in “real” activism offline?

4. How well do you think your viewers know you?

5. Why are all the camgirl sites devoted to porn now?

6. What about pre-teens and young teens who want to become camgirls?

7. Can you speak about creating “safe spaces” for women and others?

Twitter bans nonconsensual intimate photos, a.k.a. ‘revenge porn’

Mere moments ago this appeared on my Twitter feed. Fitting for today's lively discussion. We must take seriously the space many of us spend a lot of time within — cyberspace! As important as it is that platforms help create safer spaces for everyone, we must also question the near-ubiquity of posting other people's nude photos as a shaming method. How did this start? Why is it so effective and affective? Why do we use each other's bodies and the expressions of our bodies as shaming devices? Why is the nude body such a threat that it can be used to ruin people's lives? Why do we find sexuality threatening and 'inappropriate'? Why is there so much discourse that states, "well, if she didn't want her photos leaked, she shouldn't have taken them"? That is the same logic we saw when focused on the Pamela George case, the logic that plagues so many rape/murder cases — "she shouldn't have been walking there" — or the historical discourse that teaches women how not to get raped rather than teaching men not to rape. How do we reproduce the rhetoric that sticks, making people's lives so unbearable that some of them commit suicide?

What about if you had posted images online and now you want them removed? What happens then? Can people not change their mind about what content they want available? How?

These questions also point to the fallacy of the online/offline virtual/real binary.

by Kashmir Hill

It has historically been a nightmare if nude or intimate photos of you made their way out onto the Internet. Beyond the sheer embarrassment of exposure, it was very, very hard to get those photos removed. If pleas to websites to take down revealing pics posted by vengeful exes or hackers didn’t work, women — and occasionally men — resorted to creative legal threats, claiming copyright over scandalous selfies or filing lawsuits saying that the posting was an invasion of privacy. Websites, protected from liability for what their users posted, were often unsympathetic and legally in the clear. But the tide is starting to change around nonconsensual porn — also called “revenge porn” — with social media platforms making it easier for people to get pics they never wanted publicly exposed taken down. Last month, Reddit banned revenge porn. On Wednesday, Twitter followed suit.

Technically, Twitter added this clause to its rules: “You may not post intimate photos or videos that were taken or distributed without the subject’s consent.” So if your vengeful ex decides to tweet a graphic present you bestowed upon him when you were dating, you’ll now be able to report it and Twitter says it will take it down “in a timely manner.” In a recent blog post, Twitter said it’s tripled the size of its abuse response team, and responds to reports far more quickly now, though the company doesn’t give out specific numbers.

[…]

So, how exactly will this thing work? I asked the employee just how intimate a photo needs to be for a person to take it down. Does it need to be X-rated or could it be a “nonconsensual” underwear bulge or side boob shot? He said that while there’s no hard-and-fast rule on what counts as intimate, the company is trying to get at the ‘range of horrendous behaviors that people engage in’ including not just full frontals and lingerie shots, but up-skirt photos and perhaps even what Reddit likes to call “creep shots,” revealing photos taken of unsuspecting women in public.

One catch is that you have to recognize yourself in the photo and report it; Twitter doesn’t want “body police” going through tweets and reporting every pornographic image they find. If an offending tweet is removed, all native retweets will disappear too, but you’ll have to report all manual RTs and any further postings of the photo or video. Twitter does not yet have a technical way to block a given photo once it’s been flagged as banned, though the company is working on it. Franks, for one, thinks it’s problematic that bystanders can’t report the posting of explicit images of others. “Every minute private sexual material is available increases the number of people who can view it, download it, and forward it, so even if Twitter responds quickly to complaints, it may be too late to stop the material from going viral,” she said by email.

Read more