Arvida Byström and Molly Soda first met on the world wide web. A couple of likes and a follow or two later, and the pair had struck up an online friendship. Both strident feminists who found fame online — Molly from her leaked nude selfies and late night sexts, and Arvida from her candy-colored visions of girlhood — it was a match made in digi heaven. Although they’ve previously had their work exhibited alongside one another, in group shows such as Grace Micelli’s Girls at Night on the Internet, this next project is a joint venture that aims to explore the politics of online censorship in relation to the human body and question what it is that makes an image “appropriate” in the digital age.
Conceived in the form of a book, the project saw Molly and Arvida place an ironic open call on Insta for any images that had been taken down by the social network for supposedly violating Instagram’s strict rules. From pubic hair to nipples, hijabs to pink gloop, the submissions have varied wildly, but you’ll have to wait till next year, when the book comes out, to see which ones actually made the cut. Here we catch up with the artists to talk sex, censorship and the power of the nude selfie.
How did you guys meet?
Molly Soda: We met online! Probably on Tumblr. We actually met for the first time in person only after we had already decided to work on the book together.
How did the idea come about?
M: As I recall, Arvida had just had a photo taken down from Instagram and made a post expressing her frustration over it. I commented saying something like “We should make a book (of removed Instagram photos)” and she messaged me. It all happened really quickly and naturally.
Why a book?
Arvida Byström: I like the idea of putting this into a book and asking in 10, 20, or 50 years if anything changed.
M: The book acts as a sort of time capsule. When an image gets removed off of Instagram, it is lost. You’re never told which image was deleted, just that an image of yours violated the terms. The removal immediately elevates that image in terms of importance, through Instagram’s attempt at destruction. Placing these “lost” or “destroyed” photographs into a book is a nice way to commemorate them — a way to highlight the unintentional elevation that is taking place.
What kinds of submissions have you received?
M: Everything! Some are quite literal while others are more abstract.
Have there been any surprises?
A: The most surprising submission by far is a photo of a person in a hijab. The guy who took it had put a caption that people could misinterpret as having terrorist undertones, which is totally not what he intended.
M: That photo is the most surprising one because there’s no nudity, no implied nudity, nothing graphic or violent or anything of that nature; it’s a portrait.
A: Sometimes you can’t even tell if the image has anything to do with the human body, but then the caption refers to a nipple and it gets taken down.
M: We also got two submissions with some sticky gloop on a hand.
A: It’s interesting, as it shows you how society works: we’re less surprised about images of female bodies being taken down than ones of men.
How does gender play into it? Have you received more images of women than men? Are women’s images more to do with nudity and men’s ones to do with violence?
M: I’d say there are more female-identifying individuals in our submissions solely based on the way that women use the internet and are more likely to photograph themselves. I haven’t seen any submissions that would be deemed “violent.”
A: I’m sure there are things that have to do with violence that get taken down. I even read an article about the people whose job it is to remove images of beheadings and other horrible photos and videos from major platforms. But I don’t think the accounts that upload violent images are following us, or are interested in mine or Molly’s work, so they won’t submit those photos. Our book will be more about bodies, femaleness, and the challenges surrounding the body when shown in a private vs. public space.
Both of you have had images of yourselves removed. How does it feel to be told that your body has somehow violated some massive, male-dominated corporation’s code?
M: At this point, I don’t get angry. I’ve been online long enough to know what will get taken down and it’s not surprising when something does. These apps/websites are not our friends, but we still use them because we feel somewhat tied to and dependent on them. They allow others access to our work and our ideas; they are valuable tools within a very flawed system.
A: I’ve been online forever too. I came from a very insecure point as a teen, having a body and hating it, etc. Now I have a way more relaxed relationship with my body, so the fact that a corporation disagrees with me at times doesn’t surprise me. These platforms are built on the idea that bodies only serve them as long as they can sell products and don’t harm the company. I do find it very interesting, though, what gets demonized or deemed unsuitable for young people, because “save our kids” is usually the argument. One could argue that only ever seeing one type of body (naked or not) is actually more damaging than being able to see different naked (and dressed) bodies once in a while.
M: It’s interesting: you keep finding out about new problems with your body, or new things that are supposedly shameful about you, because photos keep getting taken down.
A: I posted a photo the other day that I thought would be taken down, of my pubic region. I was wearing pants, and you couldn’t see the hair on either side of the panty line (even though I definitely have hair there). But there was a heart shape cut out in the middle of the pants, where you can see my pubic hair, and because the hair isn’t on the side or above, it wasn’t taken down — even though it’s pubic hair, in fact it’s probably the closest thing to having my vagina out. But then you have photographs where you’re wearing even bigger panties or a swimming costume, but with pubic hair on the side, and they get taken down.
M: Do you think those same photos would be taken down if I didn’t have any pubic hair?
A: No they wouldn’t, 100%. There are a lot of well-shaved girls who share so much of their bodies, who don’t have their photos taken down. Where should you draw the line?
M: You start to wonder, whose bodies are more acceptable? Some bodies are more “appropriate” than others. It would be interesting to see what happened if a group of people with different bodies took the same photo; I wonder whose would get removed.
When it comes to images of women’s bodies, to what extent do you think that sex factors into Instagram’s decision to censor them?
M: Women’s bodies are constantly in conversation with sex because they are seen as objects — in this sense it immediately becomes sexual, whether or not that is their intent and whether or not they are fully clothed.
How can women rake in “likes” for sexualized images of themselves, fully clothed, when photos of excess body hair get taken down?
A: This is such a big subject, but yeah, a nude body is not socially accepted outside the private sphere or the changing rooms at the gym. But I feel we’ve got to ask: are naked bodies actually harmful? Do naked bodies actually have anything to do with sex? And is sex really that dangerous for young people to be exposed to? I think sexual education is more important than taking down photos and trying to estrange young people from their bodies; the photos usually aren’t even about sex, they just get sexualized.
How have things like race and diversity played into it?
A: So, this is very real and not to be ignored: it’s mainly mine and Molly’s followers who will submit photos, and not very surprisingly that means a lot of white, abled, cis young women, often pretty thin. I also believe these kinds of people tend to feel more entitled to their bodies and more comfortable showing them. So if you’ve got a chubby, fat, non-binary, brown body, we just hope you know you are equally super duper cute and are welcome to send us photos!
What overall message are you trying to convey with the book?
M: This book isn’t a fight against Instagram. I’m not trying to “free the nipple” and this isn’t the biggest struggle feminism is facing by any means. It’s just a way to encapsulate some “unseen” and “covered up” images that to me have value and should be seen.
A: It’s even fair not to read it as a feminist book at all; how people deal with their bodies on social media and in everyday life might just be a way to process and make sense of their bodies in a very unaccepting society. It is a discussion about the current state of bodies getting uploaded to Instagram and taken down. When a photo gets taken down, Instagram asks you to reread the guidelines in order to “keep Instagram safe.” Maybe our society has to re-evaluate what is safe and unsafe? It will be a book about the point we’re at and, as we talked about before, a time capsule that might feel funny and hopefully outdated in a few years. I mean, I wouldn’t mind if nipples of all sorts were less stigmatized and not deemed unsafe, along with the rest of the body and its hairs and liquids. Maybe we could say it is a book about bodies failing to be what corporations wish them to be. Like, how the fuck is a shaved leg-hip joint safer than an unshaved one? It’s only “safer” because of the money and labor spent on getting rid of that hair.
If you’re interested in contributing to the project, email your banned images to instabannedbooked@gmail.com with your name and Insta handle.
Credits
Text Tish Weinstock