The growing influence of artificial intelligence (AI) has many organizations scrambling to address the new cybersecurity and data privacy concerns created by the technology, especially as AI is deployed in cloud systems. Apple addresses AI's security and privacy issues with its Private Cloud Compute (PCC) system.
Apple appears to have solved the problem of offering cloud services without undermining user privacy or adding new layers of insecurity. It had to: Apple needed to build a cloud infrastructure on which to run generative AI (genAI) models that require more processing power than its devices could provide, while still protecting user privacy, as a Computerworld article noted.
Apple is opening the PCC system to security researchers so they can "learn more about PCC and perform their own independent verification of our claims," the company announced. Apple is also expanding its Apple Security Bounty program.
What does this mean for the future of AI security? Security Intelligence spoke with Ruben Boonen, CNE Capability Development Lead at IBM, to find out what researchers think of PCC and Apple's approach.