
Internal Complaint Handling Regulations and Reporting Policy

Effective Date: 12 March 2024

§ 1. General Provisions

  1. Krzysztof Świtalski, conducting business activity under the name "Pixel Perfection Krzysztof Świtalski", with the permanent place of business at ul. Zielna 38, 51-313 Wrocław, Tax Identification Number (NIP): 8951838474, National Business Registry Number (REGON): 367017298 (hereinafter referred to as the "Provider"), in relation to the provision of intermediary services in the form of hosting services within the meaning of Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a single market for digital services and amending Directive 2000/31/EC (Digital Services Act) (hereinafter referred to as the "DSA"), and the obligations arising from that legal act, including, among others, the implementation of an internal system for handling complaints (reports of illegal content), adopts these Regulations (hereinafter referred to as the "Regulations").
  2. Whenever these Regulations refer to:
    • Report – it refers to the Provider's annual report made public concerning content moderation,
    • Social Media – it refers to all profiles (fan pages), closed and open groups operated by the Provider within X, YouTube, Medium, Facebook, Instagram,
    • User – it refers to any person visiting the Provider's Social Media,
    • Complaint – a report of content that constitutes illegal content within the meaning of the DSA.

§ 2. Internal Complaint Handling System

  1. The Provider, fulfilling its obligations under the DSA, handles complaints that have been submitted to it in accordance with the procedures adopted through Social Media.
  2. User complaints are submitted via email to [email protected].
  3. The Provider introduces a mechanism for handling complaints, ensuring Users can easily submit complaints and implementing a standardized format for Users to report violations. The template constitutes Annex No. 1 to these Regulations.
  4. The Provider designates a person responsible for handling the Complaint, who is an employee or collaborator of the Provider. The Complaint is handled in a timely, non-discriminatory, objective, and non-arbitrary manner, in accordance with the provisions of these Regulations and the Shop/Service Regulations.
  5. The Provider, or a person appointed by them (in accordance with section 4), upon receiving a complaint from a User, immediately, but no later than within 10 working days of the complaint's receipt, confirms receipt to the User in a return message with the following content: "Thank you for your report. The information you have provided will be verified by us. Within 10 working days, we will get back to you with a response regarding the actions we have taken and the decision issued on the reported content."
  6. The complaint is processed within a maximum of 10 working days.
  7. In the course of handling the complaint, depending on the decision, the Provider takes the following steps in particular:
    • Informs the User about the non-recognition of the complaint,
    • Informs the User about the partial recognition of the complaint and the further steps that will be taken,
    • Informs the User about the positive resolution of the complaint and the further steps that will be taken.
  8. The Provider notifies, without undue delay, the User whose content the complaint concerns, informing them that the complaint is under consideration, of the possibility of subsequently appealing the Provider's decision, and of all material information related to the appeal (appeal deadlines, appropriate contact points). The User's appeal will be considered by the Provider within a maximum of 14 working days. The template for notifying the User concerned constitutes Annex No. 2 to the Regulations. The template for the appeal decision constitutes Annex No. 4.
  9. Depending on the circumstances, in the course of partially or fully recognizing the User's complaint, the Provider may take the following actions:
    • Block access to the content in Social Media;
    • Permanently remove the content from Social Media;
    • Report the content to the Social Media infrastructure provider.
  10. The Provider also notifies, without undue delay, the User who submitted the report of the decision concerning the reported content and of the facts and circumstances on which that decision was based. The template notification for the User who made the report constitutes Annex No. 3 to these Regulations. In the event of an appeal by the User, the Provider either upholds the appeal in full or rejects it, and informs the User of the decision within 14 days from the date of the appeal's receipt. The template for the appeal decision constitutes Annex No. 4.
  11. If during the process of handling a complaint, the Provider receives any information that gives reason to suspect that a crime endangering the life or safety of a person or persons may have been committed, it immediately informs the relevant public authorities.
  12. The Provider maintains a Complaints Register, which constitutes Annex No. 5 to these Regulations.
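As an illustration only of how the Complaints Register might be kept in practice, the sketch below uses hypothetical column names and sample values; the authoritative template is Annex No. 5, which is not reproduced here.

```python
import csv
import io

# Illustrative columns only; the authoritative template is Annex No. 5.
FIELDS = [
    "complaint_id", "date_received", "reporter_contact",
    "reported_content", "decision", "decision_date", "appeal_outcome",
]

def append_entry(register_csv: io.StringIO, entry: dict) -> None:
    """Append one complaint record to the register."""
    writer = csv.DictWriter(register_csv, fieldnames=FIELDS)
    writer.writerow(entry)

# Create the register with a header row, then record one sample complaint.
register = io.StringIO()
csv.DictWriter(register, fieldnames=FIELDS).writeheader()
append_entry(register, {
    "complaint_id": "2024-001",
    "date_received": "2024-03-15",
    "reporter_contact": "user@example.com",
    "reported_content": "link or description of the reported content",
    "decision": "content blocked",
    "decision_date": "2024-03-25",
    "appeal_outcome": "",
})
```

A tabular format such as this keeps each complaint's receipt date, decision, and appeal outcome traceable, which supports both the deadlines in § 2 and the counts required for the annual Report in § 3.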

§ 3. Annual Report on Content Moderation

  1. In accordance with Article 15 of the DSA, the Provider prepares a Report concerning content moderation.
  2. The Provider prepares the Report once a year between December and January.
  3. The Report is published for a period of 3 days on the Provider's website at https://sheetdb.io, in a machine-readable format, accessible directly on the website without the need to download the text.
  4. The Report includes:
    • The number of orders received from Member State authorities, including orders grouped according to the type of relevant illegal content,
    • The number of notifications made in accordance with Article 16 of the DSA, grouped according to the type of relevant potential illegal content, the number of notifications made by trusted flaggers, any actions taken in accordance with the notifications divided according to whether the action was taken on the basis of legal provisions or the terms of use of the Provider's services, the number of notifications handled using automated means, and the median time needed to take action,
    • Significant and comprehensible information about content moderation carried out on the Provider's own initiative, including: the use of automated tools; measures taken to provide training and assistance to those responsible for content moderation; the number and type of measures adopted that affect the availability, visibility, and accessibility of information conveyed by Users and the ability of Users to convey information via the service; as well as other related service restrictions. The information contained in the Report is grouped according to the type of illegal content,
    • The number of complaints received through internal complaint handling systems in accordance with the terms of use of the Provider's services,
    • Any cases of using automated means for content moderation purposes, including a qualitative description, specification of specific goals, accuracy indicators, and the possible level of error of the automated means used to achieve those goals, and the applied safeguards.
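The DSA does not prescribe a schema for the machine-readable Report required by § 3(3); as an illustration only, with every field name below being an assumption rather than a prescribed format, the items listed above could be exposed as structured data along these lines:

```python
import json

# Hypothetical structure for the annual Article 15 report; all field
# names are illustrative assumptions, not a prescribed DSA schema.
report = {
    "reporting_period": {"from": "2023-01-01", "to": "2023-12-31"},
    "member_state_orders": {
        "total": 0,
        "by_illegal_content_type": {},  # e.g. {"hate_speech": 0}
    },
    "article_16_notices": {
        "total": 0,
        "by_content_type": {},
        "from_trusted_flaggers": 0,
        "actions_on_legal_basis": 0,
        "actions_on_terms_of_use": 0,
        "handled_by_automated_means": 0,
        "median_action_time_days": None,
    },
    "own_initiative_moderation": {
        "automated_tools_used": False,
        "training_and_assistance_measures": [],
        "measures_by_illegal_content_type": {},
    },
    "internal_complaints_received": 0,
    "automated_moderation": {
        "qualitative_description": "",
        "goals": [],
        "accuracy": None,
        "error_rate": None,
        "safeguards": [],
    },
}

# Serialising to JSON yields a machine-readable document that can be
# published directly on the website, as § 3(3) requires.
machine_readable = json.dumps(report, indent=2)
```

Publishing the figures in a structured format like this, rather than as free text, is one way to satisfy the "machine-readable" requirement while keeping the report directly viewable in a browser.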

§ 4. Final Provisions

  1. These Regulations come into effect on 12 March 2024.
  2. In matters not regulated by these Regulations, the appropriate provisions of generally applicable law will apply, in particular the provisions of the DSA, the Consumer Rights Act, and the Copyright and Related Rights Act.