Safety & Privacy Center

Our Approach to Violent Extremism

Spotify seeks to give artists an opportunity to live off their art and billions of fans the opportunity to enjoy and be inspired by it. In support of that endeavor, our global teams work around the clock to ensure that the experience along the way is safe and enjoyable for creators, listeners, and advertisers.

On Spotify, the vast majority of listening time is spent on licensed content. Regardless of who created the content, our top priority is to allow our community to connect directly with the music, podcasts, and audiobooks they love. However, this does not mean that anything goes.

Spotify strictly prohibits content that promotes terrorism or violent extremism and takes action on content that violates our Platform Rules or the law.

When it comes to violent extremism, we carefully review an entity's on-platform and offline behavior, including (but not limited to) violent conduct and incitement of violence. We work closely with third parties with expertise in extremism to ensure we are making the most informed decisions throughout these processes, taking local, regional, and cultural context into account.

We address potential violent extremist content through multiple policies, which include, but are not limited to:

  • Our hate policies prohibit content that explicitly incites violence or hatred toward people based on protected characteristics, including race, sex, ethnicity, or sexual orientation.
  • Our dangerous content policies clearly state that material promoting or supporting terrorism or violent extremism is strictly prohibited on the Spotify platform.

We identify potentially violative content for review through a combination of proactive monitoring, human expertise, and user reports. We also use insights from global third-party experts to monitor emerging abuse trends and ensure we're constantly improving our approach.

When it comes to enforcement, we may take various actions, including removing the content or the creator, reducing distribution, and/or demonetizing the content. When determining what action to take, we consider the risk that the content could lead to offline harm. Additional factors may include:

  • Is there region-specific context or nuance?
  • Could this content increase the risk of offline harm?
  • What is the nature of the content (for example, is it news or a documentary? A comedy or satire?)
  • Is the speaker discussing their own lived experience?

Additionally, when users search for violent extremist content, they may be pointed to resource hubs that offer support for those who have been exposed to radicalizing content. This material was created in partnership with third-party experts, including the Spotify Safety Advisory Council, and encourages users to critically evaluate the content they consume.

This space is nuanced, complex, and always evolving. We are committed to iterating and improving upon our approach to keep violent extremist content off our platform. You can read more about our safety work here.