Major Chinese internet firms, including Tencent Holdings Ltd. and ByteDance Ltd., have disclosed some information about their key algorithms to Beijing for the first time, a move aimed at curbing data abuse that may nonetheless put the confidentiality of critical corporate details at risk.
The Cyberspace Administration of China (CAC) last week published a list of 30 algorithms used by apps operated by Alibaba Group Holding Ltd. and Meituan, among other companies, that collect users’ data to personalize the recommendations and content they receive.
The published list did not include actual code, nor was it clear how much information about the internet giants’ core software had been privately provided to regulators. It covered only basic details on how the algorithms work and the products and use cases in which they are employed.
The CAC currently requires only basic details from companies, but it may demand more when investigating complaints of data abuse.
China’s New Algorithm Regulation
The algorithms that determine which posts, videos, or images users see on social media platforms play a major role in capturing attention and driving growth.
In March, China implemented policies requiring internet firms to disclose such algorithms, seeking to address concerns over potential data abuse and to exercise stricter supervision of internet companies.
The country’s adoption of the disclosure requirement sets it apart from countries such as the US, where Meta Platforms Inc. and Alphabet Inc. have been able to keep their algorithms private, even as policymakers and activists sought greater insight into how those companies select and tailor content and handle data.
China has been imposing stricter regulations to rein in the once largely unsupervised growth of its major tech companies. In November 2021, the country implemented the Personal Information Protection Law and the Data Security Law, introducing tighter rules for how firms manage user data.
In addition to the publicly shared information about their algorithms, the companies are required to provide the CAC with non-public details, including self-assessments regarding private biometric or identity information and the data sources used to train the algorithms.
The information the companies submitted to the CAC likely contains further details that were not released to the public and may include corporate secrets never intended for public view.