Graham Mudd, vice president of product marketing at Meta, said the platform would be removing “options that relate to topics people may perceive as sensitive, such as options referencing causes, organizations, or public figures that relate to health, race or ethnicity, political affiliation, religion, or sexual orientation.” Mudd’s examples included terms like “World Diabetes Day” and “same-sex marriage,” as well as topics tied to political beliefs, social issues, causes, organizations and public figures.

Facebook has long faced backlash from civil rights groups and lawmakers over how its ad-targeting system has been used. ProPublica reported in 2017 that Facebook let advertisers target nearly 2,300 people who had expressed interest in the topics “Jew hater,” “How to burn jews,” or “History of ‘why jews ruin the world.’” Job ads posted on the site specifically targeted male Facebook users, excluding all women and non-binary users. The Department of Housing and Urban Development sued Facebook in 2019 because the platform allowed landlords and property owners to restrict who saw certain housing ads based on race, religion and nationality. Facebook settled the lawsuit for $5 million and said it would change its ad-targeting system so that housing, employment and credit ads could no longer be selectively shown to certain ethnicities, genders or age groups.

While defending the idea that advertising is best done in a “personalized” way, Mudd said the company wants to “better match people’s evolving expectations of how advertisers may reach them” on the platform and “address feedback from civil rights experts, policymakers and other stakeholders on the importance of preventing advertisers from abusing the targeting options” they make available.

“It is important to note that the interest targeting options we are removing are not based on people’s physical characteristics or personal attributes, but instead on things like people’s interactions with content on our platform. However, we’ve heard concerns from experts that targeting options like these could be used in ways that lead to negative experiences for people in underrepresented groups,” Mudd said. “We routinely review, update and remove targeting options to simplify our ads system, provide more value for advertisers and people, and reduce the potential for abuse. The decision to remove these Detailed Targeting options was not easy, and we know this change may negatively impact some businesses and organizations. Some of our advertising partners have expressed concerns about these targeting options going away because of their ability to help generate positive societal change, while others understand the decision to remove them. Like many of our decisions, this was not a simple choice and required a balance of competing interests where there was advocacy in both directions.”

Mudd added that while some options are being removed, others will remain available to organizations that rely on targeted ads. Ads can still be aimed at broad categories such as age or gender, and advertisers can still reach people based on whether they liked a page or watched a video in their news feed. The “Lookalike Audiences” tool still lets businesses and organizations reach new audiences that resemble their existing fans and customers, and location targeting remains available as well.
“Even after we update our targeting options, people may still see ad content they aren’t interested in, which is why we are also working to expand the control that allows people to choose to see fewer ads about certain types of content. Today, people can opt to see fewer ads related to politics, parenting, alcohol, and pets,” Mudd said. “Early next year, we will be giving people control of more types of ad content, including gambling and weight loss, among others. Just as we have for the past several years, we will continue to evaluate and evolve our ad system. At the same time, we will make sure to provide our partners with the tools they need to reach their customers and increase their performance on our platform.”

Facebook’s own staff have long pushed for the company to limit the microtargeting features available to certain organizations, particularly those involved in politics. Facebook researchers repeatedly warned the company’s leaders that the ad-targeting tools were being used to target specific ethnic groups in efforts to dissuade them from voting and to feed particular audiences misinformation. Researchers also noted that the tools were used by gun manufacturers to aim ads at far-right militias ahead of the terrorist attack on the U.S. Capitol on January 6. Leaked documents from the company show CEO Mark Zuckerberg personally overruled employees and continued to allow political campaigns to target people. Other major ad platforms, including Twitter and Google, have banned or sharply restricted targeted political advertising.