Meta’s Oversight Board calls for more inclusive rules on adult nudity

Meta’s Oversight Board has overruled the company’s takedowns of two Instagram posts showing a transgender and non-binary couple with bare chests and covered nipples. One of the images was posted in 2021 and the other last year. In the captions, the couple discussed trans healthcare. The posts noted that one of them planned to undergo gender-affirming surgery to create a flatter chest and that the duo was fundraising to pay for the procedure.

However, Meta took down the posts for violating its rules on sexual solicitation. The Oversight Board says that moderators reviewed the images multiple times after user reports and alerts from automated systems. The couple appealed Meta’s decisions to the company and the Oversight Board. Meta determined that removing the posts was the incorrect call and restored them, but the board looked into both cases all the same.

The Oversight Board overruled Meta’s original takedown decisions. It determined that the removal of the images was not in line with the company’s “community standards, values or human rights responsibilities” and that the cases underline core issues with Meta’s policies.

The board wrote that Meta’s directives to moderators on when to remove posts under the sexual solicitation policy are “far broader than the stated rationale for the policy or the publicly available guidance.” It claimed the discrepancy causes confusion for moderators and users. Meta itself has noted that this approach has led to content being incorrectly removed.

In addition, the board called out the inherently restrictive binary perspective of the adult nudity and sexual activity community standard. It notes that the rules, as things stand, generally don’t allow Meta’s users to post images of female nipples, though there are exceptions for things like breastfeeding and gender confirmation surgery.

“Such an approach makes it unclear how the rules apply to intersex, non-binary and transgender people, and requires reviewers to make rapid and subjective assessments of sex and gender, which is not practical when moderating content at scale,” the board wrote. It called the current rules “confusing” and noted that the extensive exceptions (which also allow for images related to protests and breast cancer awareness) are “often convoluted and poorly defined.” As such, the board claimed, the policy is not workable in practice.

“The board finds that Meta’s policies on adult nudity result in greater barriers to expression for women, trans and gender non-binary people on its platforms,” an Oversight Board blog post reads. “For example, they have a severe impact in contexts where women may traditionally go bare-chested, and people who identify as LGBTQI+ can be disproportionately affected, as these cases show. Meta’s automated systems identified the content multiple times, despite it not violating Meta’s policies. Meta should seek to develop and implement policies that address all these concerns.”

The board recommended that the company modify its rules on adult nudity and sexual activity to include “clear, objective, rights-respecting criteria” so that everyone is “treated in a manner consistent with international human rights standards, without discrimination on the basis of sex or gender.” It urged Meta to review the policy to determine if it protects users against the non-consensual sharing of images and whether other rules need to be tightened on that front. Moreover, it called on Meta to align its guidance to moderators with the public rules on sexual solicitation to minimize errors in enforcing the policy.

“We welcome the board’s decision in this case. We had reinstated this content prior to the decision, recognizing that it should not have been taken down,” a Meta spokesperson told Engadget. “We are constantly evaluating our policies to help make our platforms safer for everyone. We know more can be done to support the LGBTQ+ community, and that means working with experts and LGBTQ+ advocacy organizations on a range of issues and product improvements.”

In public comments on the case (PDF), several people criticized Meta for the original decisions, claiming that there was nothing sexually explicit about the images. One user called on Meta to bring in LGBTQIA+ human rights specialists and establish policies to protect trans, non-binary and other LGBTQIA people from harassment and unfair censorship. Another called out Instagram for a double standard, accusing the platform of permitting images in which nipples are covered only by body tape while removing others where they’re covered by pasties (patches that cover nipples and areolae).

One person noted that the couple “have helped me accept myself and help me understand things about myself,” noting that content shared on the account is “very educational and useful.” The comment added that “there is nothing sexual about their nudity and them sharing this type of picture is not about being nude and being provocative.”

Breast pump maker Willow now has an Apple Watch app

Willow, a company that makes smart breast pumps, is looking to make it easier for users to monitor and control their pumping sessions. The Willow 3.0, a wearable wireless pump, now has an Apple Watch companion app. As such, you won’t necessarily need your phone to manage pumping sessions (though your iPhone will still need to be within 30 feet of both the pumps and your watch).

The app enables users to start and pause sessions, switch between stimulation and expression pump modes and adjust suction levels, all from their wrist. Your watch will also let you know when your last session ended, display your per-pump milk output in real time and show the battery level of each pump.

More than half of Willow users “regularly use an Apple Watch and many want to control their pumps with their smartwatches,” the company’s CEO Laura Chambers said in a statement. Chambers added that Willow designed the app to give users more “ease and control” over pumping sessions. You’ll need an Apple Watch Series 3 or above and at least watchOS 8 to use the Willow 3.0 app.

Meta sues surveillance company for allegedly scraping more than 600,000 accounts

Meta has filed a lawsuit against Voyager Labs, which it has accused of creating tens of thousands of fake accounts to scrape data from more than 600,000 Facebook users’ profiles. It says the surveillance company pulled information such as posts, likes, friend lists, photos, and comments, along with other details from groups and pages. Meta claims that Voyager masked its activity using its Surveillance Software, and that the company has also scraped data from Instagram, Twitter, YouTube, LinkedIn and Telegram to sell and license for profit.

In the complaint, which was obtained by Gizmodo, Meta has asked a judge to permanently ban Voyager from Facebook and Instagram. “As a direct result of Defendant’s unlawful actions, Meta has suffered and continues to suffer irreparable harm for which there is no adequate remedy at law, and which will continue unless Defendant’s actions are enjoined,” the filing reads. Meta said Voyager’s actions have caused it “to incur damages, including investigative costs, in an amount to be proven at trial.”

Meta claims that Voyager scraped data from accounts belonging to “employees of non-profit organizations, universities, news media organizations, healthcare facilities, the armed forces of the United States, and local, state, and federal government agencies, as well as full-time parents, retirees, and union members.” The company noted in a blog post that it disabled accounts linked to Voyager and that it filed the suit to enforce its terms and policies.

“Companies like Voyager are part of an industry that provides scraping services to anyone regardless of the users they target and for what purpose, including as a way to profile people for criminal behavior,” Jessica Romero, Meta’s director of platform enforcement and litigation, wrote. “This industry covertly collects information that people share with their community, family and friends, without oversight or accountability, and in a way that may implicate people’s civil rights.”

In 2021, The Guardian reported that the Los Angeles Police Department had tested Voyager’s social media surveillance tools in 2019. The company is said to have told the department that police could use the software to track the accounts of a suspect’s friends on social media, and that the system could predict crimes before they took place by making assumptions about a person’s activity.

According to The Guardian, Voyager has suggested factors like Instagram usernames denoting Arab pride or tweeting about Islam could indicate someone is leaning toward extremism. Other companies, such as Palantir, have worked on predictive policing tech. Critics such as the Electronic Frontier Foundation claim that tech can’t predict crime and that algorithms merely perpetuate existing biases.

Data scraping is an issue that Meta has to take seriously. In 2021, it sued an individual for allegedly scraping data on more than 178 million users. Last November, the Irish Data Protection Commission fined the company €265 million ($277 million) for failing to stop bad actors from obtaining millions of people’s phone numbers and other data, which were published elsewhere online. The regulator said Meta failed to comply with GDPR data protection rules. 

Netflix’s ‘Formula 1: Drive to Survive’ will return for its fifth season on February 24th

Formula 1: Drive to Survive, the docuseries that helped the motorsport become a bigger deal in the US, will return for its fifth season on Netflix on February 24th. That’s smart timing, since the three-day preseason test for this year’s F1 calendar wil…

Instacart will pay $5.25 million to settle a workers’ benefit case

Instacart will pay workers $5.1 million as part of a settlement after it allegedly failed to provide some benefits, as The San Francisco Chronicle reports. San Francisco accused the company of violating healthcare and paid sick leave ordinances. The company, which has not admitted to wrongdoing, will pay an additional $150,000 to cover the city’s legal costs and pay for a settlement administrator to distribute the funds.

“Instacart has always properly classified shoppers as independent contractors, giving them the ability to set their own schedule and earn on their own terms,” Instacart said in a statement. “We remain committed to continuing to serve customers across San Francisco while also protecting access to the flexible earnings opportunities Instacart shoppers consistently say they want.”

People who worked as independent contractors for Instacart in the city between February 2017 and December 2020 are eligible for payments based on how many hours they worked. San Francisco estimates that between 6,000 and 7,000 people are affected by the settlement. The city and Instacart previously reached a similar settlement that covered an earlier time period. San Francisco has settled a benefits-related case with DoorDash too.

After December 15th, 2020, Instacart workers were subject to Proposition 22, which afforded them some benefits without the company having to define them as employees. An Alameda County Superior Court judge ruled in 2021 that the measure was unconstitutional, but it remains in force while Instacart, DoorDash, Uber, Lyft and other gig companies that bankrolled Prop 22 appeal the decision. Another suit — filed by San Francisco, Los Angeles and San Diego — claims that Uber and Lyft drivers should have been classified as employees until Prop 22 passed.

SBF thought it was a good idea to start a Substack

Sam Bankman-Fried is in a world of trouble. He’s facing up to 115 years in prison if he’s convicted of federal fraud and conspiracy charges. And yet the embattled founder of collapsed crypto exchange FTX — who has pleaded not guilty and is out on a $250 million bond while awaiting trial — figured it’d be a great idea to write about his perspective on the saga in a Substack newsletter.

In his first post, which is ostensibly about the collapse of FTX International, Bankman-Fried (aka SBF) claims that “I didn’t steal funds, and I certainly didn’t stash billions away.” SBF notes that FTX US (which serves customers in America) “remains fully solvent and should be able to return all customers’ funds.” He added that FTX International still has billions of dollars in assets and that he is “dedicating nearly all of my personal assets to customers.” SBF, who once had a net worth of approximately $26.5 billion, said at the end of November that he had $100,000 in his bank account, though he pledged to give almost all of his personal shares in Robinhood to customers.

The post covers much of the same ground that SBF has gone over in the myriad interviews he gave between FTX’s collapse in November and his arrest last month. He discusses the multiple crypto market crashes in 2022 and a tweet from Binance CEO Changpeng Zhao that sparked a run on FTX’s FTT token and prompted the implosion of his exchange. SBF also writes about how he was pressured to file for Chapter 11 bankruptcy protection for FTX. Meanwhile, he notes that many of the numbers he cites in the post are approximations, since he has been locked out of FTX’s systems by those overseeing its bankruptcy proceedings.

What’s more interesting is what SBF doesn’t address. He does not mention that FTX co-founder Zixiao “Gary” Wang and former Alameda Research CEO Caroline Ellison pleaded guilty to fraud charges and are cooperating with prosecutors.

SBF has continued to give interviews and tweet about the situation while he’s out on bail. That’s despite the complaint filed against him by the Securities and Exchange Commission citing his tweets and comments he made in an interview in early December. Perhaps this whole Substack thing will turn out to be a mistake too.