Nvidia warns that any GPU 'kill switch' or 'backdoor' into its AI chips would 'fracture trust in US technology'
Briefly

Nvidia firmly states that its AI chips should not contain backdoors or kill switches, arguing that such mechanisms would create vulnerabilities and erode user trust. In a blog post, the company's chief security officer emphasized the significant risks posed by government surveillance and control mechanisms. China has raised concerns about potential backdoor risks in Nvidia's H20 chips, underscoring the demand for secure technologies. Nvidia says it will preserve the integrity and security of the tech ecosystem rather than yield to external pressure.
NVIDIA GPUs do not and should not have kill switches and backdoors. Allowing such technologies is an open invitation to disaster and fractures trust in US technology.
Allowing backdoors would make the technology itself more vulnerable, creating security risks and raising concerns among users, including those in the Chinese market.
Read at Business Insider