Opinion

Apple was a role model, until it wasn't

Lennon Torres again finds herself disappointed with the tech giant after it allegedly caves to homophobia.
By Lennon Torres and Sarah Gardner
An Apple store in China.
Credit: Cheng Xin / Getty

Apple shaped how entire generations think about technology. For many of us, its products symbolized creativity and progress. The company taught us to “Think Different” and to believe technology could make life better. But today, Apple’s actions tell a different story — when ethics collide with revenue, Apple folds.

In 2014, Tim Cook came out as gay and positioned it as a moral stand. Apple wrapped itself in Pride flags, marketed inclusion, and sold us the idea that it stood for something bigger than profit. The company embraced its status as the first Fortune 500 company with an openly gay CEO, and Cook authored a piece in Bloomberg Businessweek declaring that he is "proud to be gay."

Fast-forward to now: at Beijing's request, Apple quietly removed two of the largest gay dating apps in China, Blued and Finka. No statement. No defense of queer communities. Just silent compliance.



This isn’t an isolated decision. It’s a pattern.

When ethics collide with revenue

Apple’s support of marginalized communities seems to collapse under pressure. Take child sexual abuse material (CSAM). In 2021, Apple acknowledged that verified images and videos of children being sexually abused were stored on iCloud. Knowing this was a problem, it developed a detection system designed to protect privacy and vetted by independent experts. Apple proudly announced the plan in August 2021, then paused the rollout 30 days later.

Apple commissioned its own cryptography experts to confirm the system safeguarded privacy. Independent reviewers like David Forsyth and Benny Pinkas agreed: No innocent user data would be exposed. Yet Apple abandoned the plan after backlash over privacy concerns, retreating to arguments it had previously dismantled.

Apple's pivot to services like iCloud has made subscriptions a core revenue driver, generating nearly $100 billion annually with gross margins around 75 percent. Despite this profitability, Apple has still not implemented a meaningful solution to stop the spread of known CSAM, leaving iCloud as one of the few major cloud platforms that does not proactively detect known CSAM. This failure has sparked lawsuits from thousands of survivors who argue Apple's decision enables predators to pay for storage of abuse imagery, effectively monetizing their trauma. By contrast, companies like Google deploy industry-standard safeguards, combining hash-matching against NCMEC databases and AI to detect and report CSAM at scale. Apple's refusal to implement similar measures underscores a gap: While profiting from cloud services, it has not ensured those services are free from exploitation.

This isn’t just complacency. It’s negligence.

Ethics shouldn’t be optional

It’s easy to do the right thing when it sells. Pride campaigns drive revenue when the White House is lit up in rainbow colors and consumers reward ethical branding. But standing up for queer communities in China, when the government is pressuring you to side with oppression? That’s harder. Tackling child abuse on your own platform? That’s riskier. Apple will remove LGBTQ+ apps to appease Beijing without putting up a fight, but won’t take decisive action against child predators.

Apple doesn’t "Think Different" anymore. It thinks profit. And until we demand better, it will keep choosing power over people.

What needs to change

Apple has the resources and expertise to lead on both fronts — protecting vulnerable communities and safeguarding children online. It could implement proven, privacy-conscious CSAM detection tools developed by experts at Thorn, NCMEC, and Johns Hopkins’ MOORE Center. It could take a public stand against censorship that erases LGBTQ+ lives. Instead, it has chosen silence and inaction.

Regulators, investors, and consumers must hold Apple accountable. Tech companies should not be allowed to monetize harm while hiding behind branding campaigns. Ethics cannot be optional in the digital age.


This article reflects the opinions of the writers.

Lennon Torres is a Public Voices Fellow on Prevention of Child Sexual Abuse with The OpEd Project. She is an LGBTQ+ advocate who grew up in the public eye, gaining national recognition as a young dancer on television shows. With a deep passion for storytelling, advocacy, and politics, Lennon now works to center the lived experience of herself and others as she crafts her professional career in online child safety at Heat Initiative. The opinions reflected in this piece are those of Lennon Torres as an individual and not of the entities she is part of. Lennon’s substack: https://substack.com/@lennontorres

Sarah Gardner is Founder and CEO of the Heat Initiative. With more than 13 years of technical and policy expertise in online child safety, she is an internationally recognized voice in advocating for the rights of children and survivors of child sexual abuse. Heat Initiative is an organization of technology experts, parents, survivors and advocates who believe strongly that tech companies like Apple and Meta need to remove CSAM from their platforms and implement policies that will keep children safe online.
