Thesis statement:
While TikTok’s algorithm can amplify feminist issues by “calculating feminism,” giving women’s voices some space on the platform, it does not allow those voices to express themselves freely: the platform can steer the conversation by hiding content and throttling traffic.
Introduction:
In China, feminist topics have long been suppressed in mainstream media, and the space for public expression is limited. With the rise of social media, and of TikTok in particular, women have turned to digital platforms for voice and connection. These platforms not only broaden the channels through which feminism circulates but also, to a certain extent, foster the formation of social awareness online. For example, the hashtag #girlshelpgirls went viral on TikTok through discussions of mutual aid among women and of sexual harassment, drawing tens of millions of views. At the same time, however, many videos have been restricted or even deleted for “stirring up antagonism,” so this kind of expression is far from completely free. TikTok’s recommendation algorithm amplifies certain content while exercising “algorithmic control” over other voices through traffic throttling and censorship mechanisms, creating a phenomenon known as “algorithmic feminism.” This essay explores how TikTok’s recommendation algorithm affects the dissemination and discussion of feminist topics in China and argues that the algorithm can either promote social change or become an implicit technological means of restricting women’s voices. First, I analyze how TikTok shapes and aids the spread of feminist topics; second, I explore how the platform limits the visibility of related content; finally, I examine how algorithmic preferences reshape the forms of feminist discourse. Through these analyses, the essay reflects on the complex impact of TikTok’s algorithms on digital feminist activism in China and aims to reveal the dual role that platform algorithms play in both promoting and inhibiting social change.
How TikTok amplifies feminist topics
Among social media platforms, TikTok has become an important channel for the spread of feminist topics thanks to its powerful content recommendation system. TikTok’s “For You” feed continually optimizes what it pushes according to users’ interests, preferences, and interaction behavior, which provides a distinctive path for feminist topics to gain rapid popularity (Zhang & Liu, 2021). Unlike traditional media, content dissemination on TikTok does not rely on institutional channels but is built on users’ spontaneous creation and sharing. This decentralized mode of communication allows gender issues that were once marginalized to enter the mainstream (Hampton & Wellman, 2018). First, TikTok’s recommendation algorithm is driven by user interactions, including likes, comments, shares, and how long a video is watched. This mechanism makes feminist content that resonates widely more likely to be recommended again and again: accounts of workplace discrimination, stories about menstrual shame, or critiques of the traditional division of labor in the family. Such content often generates extensive discussion and sharing because of its genuine, strongly emotional expression. One example is #girlshelpgirls, a hashtag that went viral with tens of millions of views around the themes of women helping one another and warning and protecting each other from street harassment. The popularity of such topics reflects not only the resonance among female users but also the algorithm’s tendency to prioritize emotional content that attracts broad social sympathy (Milan, 2015). In addition, some “lighthearted” or “humorous” feminist content is also popular on TikTok.
For example, some users playfully imitate the differences between men’s and women’s daily lives; this kind of content is light, funny, and able to convey gender awareness while making people laugh, which makes it easier to spread on the platform. Other users excel at explaining complex feminist concepts in simple terms, making serious issues such as “patriarchy” and “gender stereotypes” more accessible. This kind of “soft communication” not only lowers the threshold of communication but also reduces audience resistance, enabling more people to understand these theories and helping feminist ideas spread further (Drüeke & Zobl, 2018). Furthermore, TikTok’s “challenge” campaigns and hashtags, such as #feminism and #girlpower, have given feminist topics additional tools for diffusion. Once a hashtag becomes popular, many creators produce content around it, further expanding the topic’s influence. The platform in turn incorporates these popular tags into its recommendation logic so that they keep appearing on more users’ pages, forming a wave of diffusion. In sum, TikTok’s algorithm and platform features provide a new space for feminist topics to be expressed and disseminated. Through personalized recommendation, hashtag aggregation, and emotional dissemination, the platform contributes to the wide diffusion and diverse expression of feminist discourse and promotes a digital gender politics grounded in everyday experience (Hampton, 2015).
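The engagement-driven ranking described above can be illustrated with a deliberately simplified sketch. This is not TikTok’s actual algorithm: the signal names, weights, and scoring formula here are invented purely to make the mechanism concrete, namely that content attracting more likes, comments, shares, and watch time is recommended more.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    likes: int
    comments: int
    shares: int
    watch_ratio: float  # average fraction of the video that viewers watch

# Hypothetical weights: each interaction signal contributes to a ranking score.
WEIGHTS = {"likes": 1.0, "comments": 2.0, "shares": 3.0, "watch_ratio": 50.0}

def engagement_score(v: Video) -> float:
    """Toy engagement score: a weighted sum of interaction signals."""
    return (WEIGHTS["likes"] * v.likes
            + WEIGHTS["comments"] * v.comments
            + WEIGHTS["shares"] * v.shares
            + WEIGHTS["watch_ratio"] * v.watch_ratio)

def rank_for_feed(videos: list[Video]) -> list[Video]:
    """Order candidate videos by descending engagement score."""
    return sorted(videos, key=engagement_score, reverse=True)
```

Under this toy model, a personal story that provokes many comments and shares outranks a video with the same number of views but less interaction, which is why emotionally resonant feminist content can spread so quickly in such a system.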
How TikTok hides or restricts feminist content
However, although TikTok provides a platform for the spread of feminist topics, its hiding and restriction of specific content is equally evident, revealing a mechanism of censorship and demotion of “sensitive content.” The platform has decisive control over video visibility, and many female users have found that their expressions about gender-based violence, bodily autonomy, and even moderate women’s rights activities have seen their views plummet or their content removed outright, raising widespread questions about censorship standards (Zeng & Kaye, 2022). TikTok’s community guidelines prohibit “violence, nudity, hate speech, and sensitive social issues,” but these terms are vaguely defined, leaving considerable room for interpretation by algorithms and human moderators. For example, a user sharing her own experience of harassment in a public place may find the video marked as “not suitable for recommendation” and restricted, even though the content is mild. By contrast, some content that parodies women from a male perspective goes unrestricted, exposing the platform’s double standard on gender issues (Zeng & Kaye, 2022). The platform’s censorship is reflected not only in content removal but also in the “shadow banning” mechanism: content and accounts are not blocked directly, but their exposure on recommendation pages is sharply reduced. Many feminist creators report that videos receive initial traffic and then suddenly hit a “viewing plateau” hours later, which is often associated with “sensitive keywords” such as “gender equality,” “women’s rights,” and “bodily autonomy” (Milan, 2015). To get around algorithmic censorship, some users turn to deliberate misspellings, emoji, or English substitutions, such as “#girlpower” instead of “feminist.” This strategy reflects users’ awareness of and adaptation to platform rules, and it also shows that feminist expression on TikTok survives by skirting the edges of the rules rather than through open discussion.
More worryingly, such restrictions are prompting some users to self-censor. To avoid throttling and bans, some creators have moved away from issues of gender inequality toward lighthearted entertainment. This shift compresses the space for public discussion, blunts the critical edge feminism should have, and makes it gradually entertaining and depoliticized on the platform, sapping its social influence. Moreover, such restrictions are widespread across the globe: in more culturally and politically conservative countries, TikTok is more likely to remove content that addresses gender or sexuality (Zeng & Kaye, 2022). The platform’s “localized censorship policy” means that users in different regions face significant differences in freedom of expression and visibility on the same issues, which further exposes the inconsistency of platform governance and the limits of cross-cultural feminist expression. To sum up, TikTok amplifies the dissemination of feminist topics on the one hand while, on the other, it suppresses critical content through vague censorship standards, shadow bans, and keyword restrictions, creating an environment of expression that seems free but is in fact tightening, and thereby quietly weakening the reach and social effectiveness of feminist voices.
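The “visibility moderation” logic described in this section, where content is not deleted but quietly demoted when it matches sensitive keywords, can also be sketched in simplified form. Again, this is an illustrative toy, not TikTok’s real system: the keyword list and the throttling factor are hypothetical, chosen only to show why a misspelling or an English substitute hashtag can slip past such a filter.

```python
# Hypothetical "visibility moderation": flagged content is not removed,
# but its distribution weight is sharply reduced (a shadow ban).
SENSITIVE_KEYWORDS = {"gender equality", "women's rights", "bodily autonomy"}

def visibility_multiplier(caption: str) -> float:
    """Return 1.0 for unflagged captions, 0.1 when a sensitive keyword appears.

    The reduced multiplier models a shadow ban: the video stays online
    and visible to followers, but is rarely pushed to recommendation feeds.
    """
    text = caption.lower()
    if any(keyword in text for keyword in SENSITIVE_KEYWORDS):
        return 0.1  # throttled, not deleted
    return 1.0
```

Because the filter matches literal strings, a caption using “#girlpower” in place of “feminist” or “women’s rights” passes through untouched, which mirrors the substitution strategies users describe.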
The bias of TikTok’s algorithm and its impact on discourse power
The spread of feminism on TikTok is influenced not only by content restrictions but also, more deeply, by the platform’s algorithmic bias. How the algorithm distributes content determines which voices gain greater exposure and which are marginalized. This technology-driven screening in effect builds a mechanism of “selective amplification” that shapes both the form and the dominant direction of feminist discourse on the platform (Milan, 2015). TikTok favors videos that are light in style, positive in mood, and fronted by likable figures. For example, a well-dressed, soft-spoken female blogger sharing a “how to love yourself” video is usually more likely to receive likes and recommendations than a sharply critical, accusatory one. This algorithmic preference shapes not only the “appearance” of feminism but also its “voice”: mild, upbeat, and entertaining becomes the mainstream style the platform recommends (Zhang & Liu, 2021). Content that attempts to challenge traditional gender norms and criticize patriarchal structures, by contrast, is often ignored or even demoted by the algorithm for being “negative” or “controversial.” For example, feminist content that emphasizes structural injustice, class difference, and the intersection of racial and gender oppression is often deemed “not engaging enough” to gain platform support. This preference marginalizes more radical feminism and shapes a platform environment that favors “moderate feminism” (Drüeke & Zobl, 2018). The algorithm’s bias also extends to the selection of creators: those who are good-looking, fluent, and at ease on camera are more likely to be recommended, while ordinary users or bloggers with atypical images struggle to gain visibility, however profound their views (Hampton, 2015). This suggests that the platform is not a neutral technological space but an invisible part of ideological screening.
To “cater to the algorithm,” some bloggers have gradually adjusted their style, reducing the critical component and emphasizing personal growth, emotional healing, and lifestyle instead. Although such content can still be considered part of feminism, its dominance weakens feminism’s political edge, leading to its gradual depoliticization and commodification in the feed (Zeng & Kaye, 2022). In addition, algorithmic tendencies reinforce the echo chamber effect: users are more easily exposed to content consistent with their own views, resulting in a lack of diversity of perspectives. Social media platforms are themselves social constructs, and their technical design determines how users build connections and communities. On TikTok, the algorithm’s priority recommendation of certain types of content helps some issues become popular but also limits in-depth interaction and understanding between different groups, forming “selective communities” (Hampton & Wellman, 2018). This makes discussion within feminism prone to collapsing into a single position, with little room for intersectionality, diversity, and inclusiveness, which hampers the deeper development of the issue. In short, TikTok’s algorithm determines not only the path of content distribution but also, more fundamentally, who speaks, about what, and how. The platform is not simply a “technological intermediary” but an active participant in the selection and construction of feminist discourse, with profound effects on the distribution of discursive power (Drüeke & Zobl, 2018).
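The “selective amplification” argued for in this section can be reduced to a one-line model: exposure is not a function of engagement alone, but of engagement multiplied by a tone preference. The labels and multipliers below are hypothetical, invented only to show how identical audience interest can translate into very different visibility depending on how the platform classifies a video’s tone.

```python
# Hypothetical tone-based preference: the same engagement yields different
# exposure depending on the tone the platform assigns to a video.
TONE_MULTIPLIER = {
    "lighthearted": 1.5,  # boosted: positive, entertaining content
    "neutral": 1.0,       # passed through unchanged
    "critical": 0.5,      # demoted: "negative" or "controversial" content
}

def adjusted_exposure(base_engagement: float, tone: str) -> float:
    """Scale raw engagement by a tone multiplier; unknown tones pass through."""
    return base_engagement * TONE_MULTIPLIER.get(tone, 1.0)
```

In this toy model, a lighthearted “how to love yourself” clip and a sharply critical video with identical engagement end up three-to-one apart in exposure, which is the structural incentive pushing creators toward “moderate feminism.”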
Conclusion:
In conclusion, TikTok, as a deeply algorithm-driven social platform, plays a complex and multifaceted role in the spread of feminist topics. On the one hand, the platform breaks the centralized structure of traditional media through mechanisms such as the “For You” feed, enabling ordinary users to participate in public issues and promoting the everyday, visible expression of feminism. On the other hand, its algorithmic logic and censorship mechanisms continually compress the boundaries of feminist expression, pushing gender issues on the platform toward the entertaining, the moderate, and even the depoliticized. This essay has shown how TikTok intervenes in the circulation of feminist discourse at three levels: amplification, restriction, and shaping. This process is not technologically neutral but deeply embedded in specific ideologies and logics of power. From the popularity of #girlpower, to the normalization of shadow bans, to the algorithmic preference for “moderate feminism,” the spread of feminism on TikTok presents a paradoxical state: both elevated and domesticated. This not only shows that technology platforms can selectively shape public opinion but also reminds us to be wary of digital platforms’ active participation in the construction of discourse. As Milan (2015) puts it, algorithms are not just distribution tools but part of collective action and discursive politics. While TikTok offers an unprecedented platform for public expression, its algorithmic logic and content management mechanisms are quietly reshaping the structure of public discourse. Feminism on TikTok has gained new life but also faces the challenge of compromise and reconstruction. Future research could further explore how feminist content circulates in different national, cultural, and policy contexts, and how users reframe their expressive strategies under algorithmic discipline.
Reference List:
Drüeke, R., & Zobl, E. (2018). Forming publics. The Routledge Companion to Media and Activism, 134–141. https://doi.org/10.4324/9781315475059-14
Hampton, K. N. (2015). Persistent and Pervasive Community. American Behavioral Scientist, 60(1), 101–124. https://doi.org/10.1177/0002764215601714
Hampton, K. N., & Wellman, B. (2018). Lost and Saved . . . Again: The Moral Panic about the Loss of Community Takes Hold of Social Media. Contemporary Sociology, 47(6), 643–651. https://www.jstor.org/stable/26585966
Milan, S. (2015). When Algorithms Shape Collective Action: Social Media and the Dynamics of Cloud Protesting. Social Media + Society, 1(2), 2056305115622481. https://doi.org/10.1177/2056305115622481
Zeng, J., & Kaye, D. B. V. (2022). From content moderation to visibility moderation: A case study of platform governance on TikTok. Policy & Internet, 14(1), 79–95. https://doi.org/10.1002/poi3.287
Zhang, M., & Liu, Y. (2021). A commentary of TikTok recommendation algorithms in MIT Technology Review 2021. Fundamental Research, 1(6), 846–847. https://doi.org/10.1016/j.fmre.2021.11.015