Our commitment to a safer digital space

This is the third edition of our biannual Transparency Report, which provides insight into our content moderation efforts over a six-month period (January–June 2025) and reaffirms our commitment to safety, transparency, and compliance with the Digital Services Act. The data for this report was collected between September and October 2025.
Linktree’s mission is to empower anyone to curate, grow, and monetize their digital universe.

Our highest priority is ensuring a safe and reliable experience for the visitors, creators and businesses worldwide who trust our platform. To this end, we have invested significantly in both people and technology to swiftly identify and remove harmful content.

Our Trust & Safety team accomplishes this through proactive and reactive reporting workflows.

  • Our internal tooling and workflows identify and flag high-risk content, which is then automatically removed or sent to our moderation team for manual review.

  • Our website forms (Violation Report Form and IP Infringement Report Form) and mobile app enable anyone to easily report any content they come across that they believe is in breach of our Terms & Conditions or Community Standards.

  • Any requests we receive from law enforcement or government bodies are reviewed and promptly actioned when required.

Our approach to content moderation

Creative expression is central to our community, and while members have the freedom to curate their profiles, we have safeguards and limits in place so they can trust that they’ll have a safe experience on our platform.

Legal requirements

The simplest but most important reason for these safeguards is to ensure legal compliance. A large cohort of our user base is creatives and content creators; for this reason, we take claims of intellectual property infringement very seriously. We take action on the misuse or the unauthorized use of any copyrighted or trademarked material.

A safe and trusted experience

At Linktree, we prioritize individual privacy and safety. As part of these efforts, we don’t allow Linktrees that expose others’ personal information (e.g., addresses, IDs). In line with our intellectual property policies, we also prohibit Linktrees that impersonate individuals or organizations.

Linkers cannot use Linktree to share content with the intention to intimidate, harass, threaten, or bully another person. Exposing private media and image-based sexual abuse (including the unauthorized sharing of personal media from subscription-based online platforms, DMCA-protected adult content, or non-consensual pornography) is also prohibited.

Furthermore, any content featuring hate speech (discrimination against an individual or a group based on their religion, ethnicity, nationality, race, color, descent, gender, or other identity factors) will be removed from Linktree.

Protection against physical harm

We prohibit the promotion of behaviors or actions that could cause physical harm, illness, or even death. This category includes medical misinformation, such as dangerous “alternative” COVID-19 remedies.

Our policies

When creating a Linktree account, Linkers must confirm their adherence to our policies, including our Terms & Conditions and Community Standards. Any violations of these policies will result in warnings, content removal, and/or account termination.

Our Community Standards provide guidelines for what content can and can’t be shared on Linktree. In most cases, they cover content that is considered illegal in the EU; however, we may also remove content in some categories that is legal but violates our platform policies.

Our Community Standards may limit or prohibit the following content if found on a Linktree:

  • Adult content

  • Child harm

  • Copyright or trademark infringement

  • Electoral fraud

  • Extremism or terrorism

  • Harassment

  • Hate speech

  • Illegal goods and services

  • Invasion of privacy or impersonation

  • Misinformation

  • Self-harm

  • Shocking or violent content

  • Spam and fraud

  • URL abuse

Team structure

Linktree's Trust & Safety team operates around the clock, with members strategically located across Australia, the Philippines, India and the United States.

With a global team reviewing content and responding to community violation reports 24/7, we can address issues efficiently and remove harmful content as quickly as possible – with multiple escalation pathways within the moderation team, and between full-time Trust & Safety managers at Linktree.

All team members have access to regular and comprehensive training, monthly quality assurance checks, and regular meetings with their team leaders. Additionally, due to the nature of the content reviewed, we provide access to mental health support to protect our team’s wellbeing while they protect the Linktree community.

How we remove content

Linktree employs a combination of automated, manual and hybrid moderation methods to flag, review, and manage content on the platform.

We have two primary approaches to address content that violates our Community Standards: content removal and Linktree suspension.

Content removal

Linktree prioritizes removing violative content from a Linktree before considering suspension whenever possible. This process may involve removing links that direct to harmful sites, deleting images containing inappropriate content, or applying a sensitive content warning to specific links to alert visitors before they proceed to third-party websites.

The Linker is notified whenever content is removed from their Linktree. They’re also shown how to appeal the moderation action if they believe it was made in error.

Linktree suspensions

In cases where a violation is particularly severe or multiple infractions are associated with a Linktree, the entire Linktree may be suspended. As with content removals, Linkers are notified when their Linktree is suspended and are given instructions to appeal the decision, unless a business or legal reason prevents us from doing so.

Notifications

Community violation reports

If Linkers or visitors believe they have found content that violates our policies, they can access our community violation report form on any Linktree. Once completed, the report is manually reviewed by our moderation team and the reporting party is notified once it has been processed.

Intellectual property reports

For cases of intellectual property infringement on a Linktree, visitors or rights holders can submit an intellectual property report that will be reviewed manually, usually within 1-2 business days.

Anyone who submits an intellectual property report will be notified of the outcome, including whether their report was accepted or rejected.

Linkers will also be notified if any of their content is moderated due to an intellectual property report and will receive instructions on how to submit a counter-notice if they believe their content was removed in error.

Appeals

As mentioned previously, if a person believes that their content was wrongly removed or their profile unjustly suspended, they can submit an appeal for a manual review.

Our team carefully reviews each appeal, and in some cases appeals may be escalated for further evaluation.

There are three possible outcomes:

  1. The decision is overturned, and the Linktree or content is restored.

  2. The decision is modified, resulting in a warning being applied to (or removed from) a specific link.

  3. The decision is upheld, and the Linktree or content is not restored.

Once the final decision is made, the person who submitted the appeal is promptly notified of the outcome.

Methodology

Automation

Linktree uses various automated techniques to detect and remove violative content. Our automated tools use a combination of text, URL and image models to detect potentially violative content and remove it – or prevent it from being added at all. All automated models include a degree of human oversight to address biases and improve accuracy.

Hybrid

Much of our content moderation includes an element of manual review, which classifies these processes as hybrid actions. These can include items flagged by our automated detection systems that require manual review, either to confirm the decision or to help retrain our models. Linktree also uses third-party tools and services to detect violative content, with the results reviewed by our content moderation team.

Manual

Linktree favors manual moderation for violative categories that require additional context and judgement. Examples include community violation reports, policy/ban appeals, and intellectual property reports. Handling these manually is how we balance speed and accuracy: making sure we handle these cases in an appropriate timeframe and that the correct moderation action is applied.

Content moderation actions (Global & EU)

Automated link removals

The table below displays the percentage of links removed and banned from Linktrees, and the percentage of decisions that were reversed, providing insight into the accuracy of the automated processes.

Volume of Moderation Actions

Manual vs hybrid content moderation

The table below shows the breakdown of manual link removals submitted by the content moderation team during the reporting period versus how many were eventually reversed.

These removals are associated with flagged items that were reviewed by the team and determined to be unsafe, illegal or in violation of our Community Standards.

Volume of Reversals

Linktree Suspensions

The table below shows the breakdown of Linktree suspensions issued during the reporting period versus how many were eventually reversed.

These suspensions are associated with Linktrees that were reviewed by the team and determined to be unsafe, illegal or in violation of our Community Standards.

Profile Suspensions

Banned accounts: Top violative categories

This chart shows a breakdown of the most common reasons for an account suspension on a global basis. ‘Other Ban’ refers to bans not covered by any existing categories.

Number of Bans

Community violation reports

Number of violation reports received

In addition to proactively detecting violative content on our platform, we welcome Linkers and visitors to report any content or accounts they believe to be in breach of our Community Standards.

During the January–June period covered by this report, we received 9,668 violation reports, an average of 1,611 per month.

Community Violation Reports

Below is a breakdown of the Community Violation reports that Linktree has received from EU member states.

Please note that the category indicates what the reporter selected when submitting the report; it does not determine or guarantee the moderation action taken.

EU Community Violation Reports

Violation reports rejected vs accepted

This graph shows the ratio of reports where there was no confirmed violation, compared to reports that did result in a moderation action.

Overall, this shows that on average only 18.26% of the reports submitted during this period concerned content in breach of our Community Standards.

Violation Reports: Rejected vs Accepted

Median handling time

This data reflects the difference in time (seconds) between when the violation report was submitted and when it was closed out by the moderator, including review time and moderation action.

Median Time in Queue (Seconds)
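As a hypothetical illustration (the report does not describe the underlying pipeline), the metric above can be sketched in Python: each report’s handling time is the elapsed seconds between submission and closure, and the published figure is the median across all reports. The timestamps below are invented for the example.

```python
from datetime import datetime
from statistics import median

# Invented sample data: (submitted, closed) timestamp pairs for three reports.
reports = [
    ("2025-03-01T09:00:00", "2025-03-01T09:05:00"),  # 300 seconds
    ("2025-03-02T14:00:00", "2025-03-02T14:01:30"),  # 90 seconds
    ("2025-03-03T08:00:00", "2025-03-03T08:10:00"),  # 600 seconds
]

def handling_seconds(submitted: str, closed: str) -> float:
    """Elapsed seconds between report submission and closure."""
    fmt = "%Y-%m-%dT%H:%M:%S"
    return (datetime.strptime(closed, fmt) - datetime.strptime(submitted, fmt)).total_seconds()

# The median is robust to a few unusually slow or fast cases,
# which is why it is preferred over the mean for queue times.
median_seconds = median(handling_seconds(s, c) for s, c in reports)
print(median_seconds)  # 300.0
```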

Volume of appeals received

Appeals may be submitted for almost any moderation action taken on an account, such as an account ban or link removal. We received a total of 4,947 appeals during this period.

Volume of Appeals

The chart below provides insight into the appeals submitted by individuals within the EU.

Appeals within the EU

Appeals rejected vs. accepted

This is the ratio of appeals that were rejected (i.e. where the ban stayed in place) vs. appeals that were accepted (i.e. decision overturned and account/content restored).

On average, 20.48% of appeals during this time were accepted.

Appeals: Rejected vs Accepted

Median handling time

This calculates the difference in time (seconds) between the appeal being submitted and the decision being made by the moderator, including review time and any reversal of moderation action.

Median Time in Queue (Seconds)

Intellectual property reports

Linktree respects intellectual property rights and we expect Linkers to do the same.

When we receive an intellectual property report, it is manually reviewed to determine whether the report is valid and if the reported Linktree contains infringing content. If it does, then the content is removed and, if appropriate, the Linktree is suspended.

More on our intellectual property policy

Here’s a breakdown of the global volume of IP reports we received during the data period.

Trademark and Copyright

The two pie charts below highlight the ratio of copyright and trademark reports that were accepted, denied, or resulted in no action globally.

Trademark Reports

Copyright Reports

The chart below shows the median time in hours to respond to an intellectual property report. During the data period the average median first response time was 9.92 hours.

Median Response Time (Hours)

The following chart shows the volume of intellectual property reports that we received from within the European Union.

Below is a table that shows the complete breakdown for the intellectual property reports we’ve received from each EU member state as well as the total actions taken with each report.

Legal requests

At Linktree, we take all legal information requests seriously. Each request is carefully reviewed by our legal team and actioned promptly in accordance with applicable laws and our law enforcement access request policy. Legal requests can be submitted to [email protected].

The table below provides a breakdown of all legal information requests received during the reporting period. Please note that subpoenas are only actioned when accompanied by the appropriate and official legal documentation.

Trusted Flaggers

Linktree has not received any reports from Trusted Flaggers, as defined by the Digital Services Act, during the data period. We remain committed to closely monitoring activity and maintaining open channels of communication with Trusted Flaggers to ensure swift action when necessary.

Monthly active users

Article 24(2) of the DSA requires online platforms like Linktree to publish information on their average monthly active recipients of the service every six months, counting recipients geographically located in the EU. The primary purpose of publishing this number is to identify whether an online platform is a ‘very large online platform’ (VLOP), defined as one with at least 45 million average monthly active users in the EU.

Between 1 January 2025 and 30 June 2025, the average number of monthly active recipients was below the 45 million user threshold for being designated as a VLOP.

We define a monthly active recipient as a Linker or visitor who visits our platform and interacts with a profile at least once (for example, by clicking a link) during the calculation period. We have also attempted to limit this number to unique visits only, i.e., counting multiple visits by the same user only once in each month.
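The deduplication described above can be sketched as follows. This is a hypothetical illustration with invented user IDs, not Linktree’s actual implementation: each visit is reduced to a (user, month) pair, so repeat visits by the same user within a month count as a single active recipient.

```python
from collections import defaultdict

# Invented sample data: (user_id, month) for each recorded visit.
visits = [
    ("alice", "2025-01"),
    ("alice", "2025-01"),  # repeat visit in the same month: not re-counted
    ("bob",   "2025-01"),
    ("alice", "2025-02"),
]

# A set per month keeps each user at most once per month.
unique_per_month = defaultdict(set)
for user_id, month in visits:
    unique_per_month[month].add(user_id)

monthly_counts = {month: len(users) for month, users in unique_per_month.items()}
average = sum(monthly_counts.values()) / len(monthly_counts)

print(monthly_counts)  # {'2025-01': 2, '2025-02': 1}
print(average)         # 1.5
```

The published figure would then be this per-month average over the six-month reporting window.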

We will continue to monitor the number of average monthly active recipients and publish updated information in accordance with Article 24(2) of the DSA.