Anthropic to limit use of services by Chinese-owned entities
Anthropic, a leading AI company, has announced a significant policy change restricting the sale of its AI services to entities majority-owned by Chinese companies [1]. The move is part of a broader strategy to prevent AI technology from being misused for military and intelligence purposes, particularly by entities aligned with the Chinese Communist Party (CCP).
The decision comes amid growing concerns about the misuse of AI, highlighted in Anthropic's recent threat intelligence report. The report detailed threat actors exploiting AI models for malicious activities such as large-scale data theft, extortion, and the development of AI-generated ransomware. It also found that North Korean operatives had used AI to secure remote employment at US companies, circumventing international sanctions.
In response to these threats, Anthropic has banned the accounts involved in malicious activity, shared its findings with the relevant authorities, and is continuing to develop safety measures to mitigate the risks of AI-powered cybercrime.
The restrictions also extend to entities based in other countries considered US adversaries, including Russia, Iran, and North Korea. The move underscores the growing need for vigilance and proactive measures against the misuse of AI technology.
In a separate development, the United States has imposed visa restrictions on Central Americans linked to CCP influence, highlighting a broader effort to counter China's expanding influence in the region [2].
These actions by Anthropic and the US government reflect a broader trend of tightening controls on the use of advanced technologies to prevent misuse and ensure national security.
References:
[1] https://pulse24.ai/news/2025/9/4/22/anthropic-restricts-ai-access
[2] https://www.greaterbelize.com/u-s-imposes-visa-restrictions-on-central-americans-linked-to-ccp-influence/