In early March, Point de Contact published its first transparency report as a trusted flagger, providing a clear overview of how platforms and hosting services handle moderation.
In 2025, the association received 122,317 reports, with 45,086 pieces of content classified as illegal. The overall removal rate reached 97%, but only 64% for non-consensual intimate images (NCII). These figures highlight both the scale of harmful content online and the crucial role trusted flaggers play in enabling faster and more effective moderation. The report also reveals significant disparities in how digital services process and acknowledge notifications.
Based on these findings, Point de Contact makes several recommendations. Under the Digital Services Act, trusted flagger status applies only to online platforms, which Article 22 requires to treat notices submitted by trusted flaggers with priority. Point de Contact therefore calls for extending this priority treatment to other intermediary services that play a key role in the online ecosystem but are not currently covered by these obligations, in particular hosting service providers and internet access providers. The association also recommends creating a dedicated fund to support trusted flaggers and encouraging intermediary service providers to proactively detect NCII. These measures aim to strengthen prevention and accelerate content removal, thereby reducing harm to victims.