On March 6, 2022, the Committee for Adapting the Law to the Challenges of Innovation and Technology Acceleration in the Ministry of Justice and the Advisory Committee to Examine Regulation on Digital Content Platforms in the Ministry of Communications issued an open call to the public. The committees requested the public’s comments on alternative regulatory arrangements designed to block offensive content posted on digital platforms. The committees also sought opinions on the transparency guidelines applicable to the platforms, their responsibility to handle users’ submissions, and how to classify the platforms to which future legislation will apply.
The Israel Internet Association (ISOC-IL) and the Israel Democracy Institute submitted a comprehensive policy document in response to the open call, mapping potential policies and tools to curb harmful content, proposing regulation that would increase the transparency and accountability of social networks operating in Israel, and referencing alternatives considered in other countries.
Read the full Response as submitted (Hebrew)
Summary of our position on the regulation of content platforms in Israel
Regulatory objectives to curb offensive or harmful content:
The committees of the Ministry of Communications and the Ministry of Justice face a policy challenge: adopting regulatory measures that allow the Israeli government to reduce the societal harm caused by the ever-increasing scope of legal yet inappropriate online content, without causing greater damage through the (deliberate or unintentional) removal of legal and desirable content.
To deal with this challenge, we mapped the three key policy alternatives currently under consideration around the world:
- Increasing the platforms’ legal liability for the illegal content distributed through them: This alternative will revoke the platforms’ current immunity from criminal liability for illicit content uploaded to them, and will advance a procedure for criminalizing inappropriate content that is not yet prohibited by law.
- Intervention in the platforms’ content guidelines: This alternative will enable government intervention in the community guidelines of digital platforms by outlining a minimum set of policies that each platform is obligated to implement.
- Increasing platforms’ accountability and responsiveness toward the public and the State: This alternative is conceptually compatible with the approach the European Union and the UK have chosen for dealing with offensive online content. It will increase the platforms’ responsiveness to the public regarding the actual enforcement of the guidelines on legal yet inappropriate content, via three channels:
- Require platforms to display detailed and clear community guidelines and enforce them effectively, fairly, and equally.
- Define basic transparency guidelines designed to give the users and the State ways to examine whether the terms of fair enforcement were violated.
- Require platforms to grant users the means to participate in the filtering process, whether by reporting inappropriate content or appealing decisions made against them.
We presented previous efforts to implement each of these alternatives worldwide, their respective advantages, and the difficulties they raise when balanced against the constitutional right to freedom of expression and the risk of harming users’ ability to upload legal content.
Based on the various considerations, we recommended adopting the third alternative, which balances the desire for greater supervision over digital platforms with the risk of government censorship of legal content.
Proposal to increase transparency and accountability of social networks
Considering the clear advantages of the third policy described above, we proposed implementing it through a comprehensive regulatory arrangement. This proposal is based on a bill that ISOC-IL is currently promoting together with other civil society organizations. Under the proposal, the regulatory arrangement will be integrated into the Consumer Protection Law and will focus on regulating the consumer rights of digital platform users. Accordingly, the Consumer Protection and Fair Trade Authority will be responsible for enforcing it.
The bill will apply to operators of online social networks (as distinguished from other online services) that are used by more than 3% of Israelis over the age of 13.
Unlike similar proposals worldwide, this definition means that the bill will also apply to instant messaging platforms, including WhatsApp and Telegram, as they play a significant role in the distribution of harmful content in Israel. Like the European Digital Services Act, the bill includes “basic duties” that will apply to all social network operators above the minimum threshold and “augmented duties” that will apply to large platforms used by more than 25% of Israelis over the age of 13 for more than three months.
As a prerequisite for promoting the social values addressed within the regulatory framework, the bill ensures that Israeli law applies to network operators everywhere, regardless of their geographical location, and that Israeli users have a local point of contact. To this end, the bill will determine that:
- Operators will comply with instructions issued by the relevant authorities according to Israeli law.
- Any proceeding involving an online social network operator and a user in Israel will be under the jurisdiction of Israeli authorities. The provisions of the law will apply notwithstanding any prior waiver or conflicting agreement.
- The operator will be responsible for appointing a local representative for user inquiries and customer service. Large operators will be required to engage legal representation in Israel to serve as the party responsible for legal submissions.
- Operators will examine and process reports of alleged violations of the community guidelines within an appropriate time, depending on the type of report. Operators will provide users with clear and accessible ways to get support regarding actions taken against their content or accounts.
As for the obligations that will apply to the platforms, we propose the following:
- Operators will be required to present clear community guidelines. The relevant Minister will determine the required scope of such guidelines.
- Operators will enforce their community guidelines effectively, fairly, and equally. This entails notifying users when their content is deleted, their account is suspended or blocked, or any other sanction is imposed, and specifying the reason and the means of appeal. This information will be available in the language of the content or of the user interface.
- Operators will allow users to appeal any decision, whether a decision to allow content concerning them or to remove content they created.
- Operators will be required to comply with transparency guidelines, including the publication of annual enforcement reports and the provision of extensive information to users who violated the guidelines. Large operators will be required to provide additional details and reasoning.
- Operators will notify anyone who reports content that violates the community guidelines of the outcome of their report. If necessary, the operators will inform the reporting party or community managers of any decision regarding content deletion, account suspension or blocking, or any other sanction related to them.
To ensure compliance with the new duties and to allow public and regulatory supervision over the platforms’ activity, every online social network operator will be required to comply with transparency guidelines regarding content and its moderation, including minimum requirements for annual transparency reports:
- Large operators will be required to issue an annual public report, in Hebrew and Arabic, listing information about content moderation actions concerning users in Israel. The Minister will determine the required details.
- Operators will issue an annual public report, in Hebrew and Arabic, listing content moderation actions concerning users in Israel that were taken following a request from a government authority. The Minister will determine the required details.
- Operators that offer a personalized advertising platform will allow any user to learn why they were targeted and will enable them to block specific content types. The Minister will determine the required details.
- Operators will ensure that users can clearly recognize that content presented to them on the platform is an advertisement and can identify its publisher. The Minister will determine rules prohibiting personalized advertising to vulnerable populations, such as minors.