Internal Compliance Document

Content Moderation & Complaints Procedure

Service name: Discreet Liaisons
URL: discreetliaisons.co.uk
Document version: 1.0
Date: 14 February 2026
Review date: 14 February 2027
Prepared by: Platform Operator

1. Purpose and Scope

This Content Moderation and Complaints Procedure (“Procedure”) documents how Discreet Liaisons identifies, reviews, and acts upon content and behaviour that breaches our Terms & Conditions or is otherwise illegal or harmful. It is prepared in accordance with the Online Safety Act 2023 (“OSA”) and Ofcom's guidance for user-to-user services.

This Procedure is available to Ofcom on request as evidence of the platform's compliance with its content moderation and complaints obligations under Part 3 of the OSA.

2. Moderation Architecture

Discreet Liaisons operates a hybrid moderation model:

2.1 Pre-Publication Photo Moderation

All photographs submitted to the platform are reviewed by a human moderator via the platform's administration interface before they are made visible to any other member. A photograph is approved and displayed only if it meets all of the photo content criteria set out in the platform's Terms & Conditions.

Photographs that do not meet these criteria are rejected and the account holder is notified. Repeated non-compliant submissions result in account suspension.

Approved photographs are stored in encrypted cloud storage (Amazon S3). Images are served via temporary signed URLs with short expiry periods, ensuring photographs are not permanently accessible via direct links.
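The temporary signed-URL mechanism described above can be sketched as follows. The platform uses Amazon S3 presigned URLs in production; this self-contained HMAC example (the host media.example.com and the secret key are illustrative, not the platform's actual values) shows the underlying idea of signing an expiry time into the link so it cannot be extended or shared indefinitely:

```python
import hashlib
import hmac
import time
from urllib.parse import urlencode

# Illustrative secret; in production this would be a server-side key.
SECRET_KEY = b"replace-with-server-side-secret"

def signed_photo_url(photo_key: str, ttl_seconds: int = 300) -> str:
    """Build a temporary URL for an approved photograph.

    The expiry timestamp is signed with HMAC-SHA256, so the URL cannot be
    altered to extend its lifetime, and it stops working once it lapses.
    """
    expires = int(time.time()) + ttl_seconds
    payload = f"{photo_key}:{expires}".encode()
    sig = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return f"https://media.example.com/{photo_key}?" + urlencode(
        {"expires": expires, "sig": sig})

def verify_signed_url(photo_key: str, expires: int, sig: str) -> bool:
    """Serve the image only if the signature matches and the URL is fresh."""
    payload = f"{photo_key}:{expires}".encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig) and time.time() < expires
```

S3 presigned URLs work on the same principle: the signature covers the object key and expiry, and the storage service refuses any request after the expiry or with a tampered signature.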

2.2 Reactive Moderation of Private Messages

Private messages between members are not pre-moderated. Messages are reviewed reactively when a report is received. The platform's design ensures users can report any message or any member profile at any time. Users can also block any other member, which immediately prevents all further contact from that member.

Where reports indicate illegal content within messages (including suspected CSAM, fraud, or harassment), the platform operator will take immediate action including preservation of relevant records and notification to appropriate authorities.

2.3 Profile Content Moderation

Profile text fields (name, bio, preferences) are reviewed during the photograph moderation process and on receipt of a report. Profile content that breaches the Terms & Conditions (including contact information, commercial solicitation, or abusive content) will be removed and the account holder notified.

3. User Reporting Mechanism

The platform provides a built-in reporting mechanism accessible from every member's profile page. Reports can also be submitted by email to safety@discreetliaisons.co.uk.

When a user submits a report, they are asked to select a reason corresponding to the triage categories in section 4.1: CSAM / child safety; trafficking / exploitation; threats / violent content; fraud / financial solicitation; harassment; fake profile; or other.

All reports are logged in the platform's admin dashboard with a timestamp, the reporting user's ID, the reported user's ID, and the reason selected. Reports are visible to the platform administrator via the moderation interface.
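The logged fields described above can be represented as a simple record. This is an illustrative sketch only; the field names and reason codes are assumptions, not the platform's actual schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

# Illustrative reason codes, mirroring the triage categories in section 4.1.
REPORT_REASONS = {
    "csam", "trafficking", "threats", "fraud",
    "harassment", "fake_profile", "other",
}

@dataclass
class ReportLogEntry:
    """One row in the report log: who reported whom, why, and when."""
    reporter_id: int
    reported_id: int
    reason: str
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))
    outcome: Optional[str] = None  # recorded after review (section 4)

    def __post_init__(self) -> None:
        if self.reason not in REPORT_REASONS:
            raise ValueError(f"unknown report reason: {self.reason!r}")
```

Rejecting unknown reason codes at the point of logging keeps the report log consistent with the triage table, so every entry maps to a defined triage window.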

4. Report Review Process

4.1 Initial Triage

On receipt of a report, the platform operator performs an initial triage within the following timeframes:

Report Category | Initial Triage | Resolution Target
CSAM / child safety | Immediate (within 1 hour) | Immediate suspension + law enforcement referral within 24 hours
Trafficking / exploitation | Within 4 hours | Within 24 hours; NCA referral if credible
Threats / violent content | Within 8 hours | Within 24 hours
Fraud / financial solicitation | Within 24 hours | Within 48 hours
Harassment | Within 24 hours | Within 72 hours
Fake profile / fraud | Within 48 hours | Within 72 hours
Other / general | Within 72 hours | Within 5 working days
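As a sketch, the triage windows in the table can be encoded as a lookup so every incoming report gets a concrete deadline (the category keys are illustrative shorthand for the table's categories):

```python
from datetime import datetime, timedelta

# Initial triage windows (hours) from the table above.
TRIAGE_WINDOW_HOURS = {
    "csam": 1,            # CSAM / child safety: immediate, within 1 hour
    "trafficking": 4,
    "threats": 8,
    "fraud": 24,
    "harassment": 24,
    "fake_profile": 48,
    "other": 72,
}

def triage_deadline(category: str, received_at: datetime) -> datetime:
    """Latest time by which initial triage of a report must be completed."""
    return received_at + timedelta(hours=TRIAGE_WINDOW_HOURS[category])
```

Computing the deadline at the moment the report is logged makes it straightforward to surface overdue reports in the admin dashboard.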

4.2 Review and Investigation

The reviewing moderator will:

  1. Review the reported profile, including all submitted photographs, profile text, and any available message thread;
  2. Review any prior reports involving either the reported user or the reporting user;
  3. Assess the report against the platform's Terms & Conditions and applicable law;
  4. Determine the appropriate action (see section 5 below);
  5. Record the action taken and the rationale in the moderation log.

5. Actions Available

The following actions may be taken following a content review:

Action | Applicable Circumstances
No action / dismiss | Report is unsubstantiated or amounts to a false or malicious report
Content removal | Specific content (photograph, profile text) breaches T&Cs but the account otherwise appears legitimate
Formal warning | Minor breach, first offence; account holder is notified of the breach
Temporary suspension | Repeated minor breaches or a moderate single breach
Permanent ban | Serious breach of T&Cs, including illegal content, repeated harassment, fraud, or impersonation
Law enforcement referral | Any suspected criminal conduct including CSAM, trafficking, fraud, or serious harassment
IWF referral | Any identified or suspected CSAM
Reporting user notified | Where the report resulted in action, the reporting user is informed (without disclosing the specific action taken)

6. Escalation Path

The following escalation path applies:

  1. Platform moderator — handles all routine reports (fake profiles, minor harassment, T&Cs breaches)
  2. Platform operator (senior review) — handles complex cases, permanent ban decisions, and any report involving potential illegality
  3. External referral — NCA, IWF, Action Fraud, or Police (as appropriate) for confirmed or strongly suspected criminal conduct
  4. Ofcom notification — where required under s.66 OSA (terrorism content, CSAM) or where there is a material risk of significant harm

7. User Right of Appeal

Users whose accounts are suspended or whose content is removed may appeal the decision by contacting safety@discreetliaisons.co.uk within 14 days of the action. The appeal will be reviewed by the platform operator (a person not involved in the original decision, where possible). A decision on appeal will be communicated within 10 working days.

Permanent bans issued following identification of CSAM or other serious illegal content are not subject to appeal.

8. Record Keeping

The following records are maintained:

Record Type | Retention Period
All user reports (log of report, reporter, reported user, outcome) | 3 years
Moderation decisions and rationale | 3 years
Suspended and banned account records | 5 years (to prevent re-registration)
Law enforcement / IWF referrals | 6 years or as directed by the relevant authority
This Procedure document and all compliance assessments | 3 years from supersession
User appeals and outcomes | 3 years

Records are stored securely on the platform's encrypted infrastructure. Where records relate to criminal investigations, they will be preserved in accordance with instructions from the relevant law enforcement body.
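A minimal sketch of how the retention schedule above could be enforced programmatically (the record-type keys are illustrative; the year values come from the table in this section):

```python
from datetime import date, timedelta

# Retention periods in years, from the record-keeping table in section 8.
RETENTION_YEARS = {
    "user_report": 3,
    "moderation_decision": 3,
    "banned_account": 5,
    "law_enforcement_referral": 6,
    "procedure_document": 3,   # 3 years from supersession
    "appeal": 3,
}

def retention_expired(record_type: str, created: date, today: date) -> bool:
    """True once a record has passed its scheduled retention period.

    Uses a 365-day year approximation. Records tied to a live criminal
    investigation must instead be preserved as directed (section 8).
    """
    years = RETENTION_YEARS[record_type]
    return today >= created + timedelta(days=365 * years)
```

A scheduled job could run this check and flag expired records for deletion, with a manual hold applied to anything subject to a law enforcement preservation request.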

9. Transparency and Review

This Procedure will be reviewed annually (next review: 14 February 2027) or whenever there is a material change to the platform, user base, or regulatory framework. A summary of moderation activity (total reports received, actions taken, referrals made) will be recorded annually as part of the platform's OSA compliance record.

This document is available to Ofcom on request. A summary of the platform's safety approach is available publicly at our Online Safety page.
