
Most AI providers try to improve their products by training them on both public information and user data. The latter approach, however, puts a privacy-focused company like Apple in a difficult position. How can it improve its Apple Intelligence technology without compromising the privacy of its users? It's a tough challenge, but the company believes it has found a solution.
Synthetic data vs. real data
OpenAI, Google, Microsoft, and Meta train their products in part by analyzing your chats. The goal is to improve the reliability and accuracy of their AI by drawing on real conversation data. Although you can generally opt out of this type of data sharing, the process for doing so varies by product. That means the burden is on you to figure out how to sever the connection.
Apple has always prided itself on being more privacy-focused than its tech rivals. To that end, the company has relied on something called synthetic data to train and improve its AI products. Created using Apple's large language model (LLM), synthetic data tries to mimic the essence of real data.
Also: Want AI to work for your business? Then privacy needs to come first
For example, the AI can create a synthetic email that is similar in topic and style to a real message. The goal is to teach the AI how to summarize that email, a feature already built into Apple Mail.
Apple's solution: “differential privacy”
The problem with synthetic data is that it can't reproduce the human touch found in real-world content. That limitation led Apple to adopt a different approach, known as differential privacy. As Apple describes in a blog post published Monday, differential privacy combines synthetic data with real data. Here's how it works.
Also: Apple's AI doctor will be ready to see you next spring
Let's say Apple wants to teach its AI how to summarize an email. The company starts by creating a large number of synthetic emails on a variety of topics. Apple then generates an embedding for each synthetic message to capture key elements such as language, topic, and length. These embeddings are sent to the devices of Apple users who have opted in to device analytics sharing.
Each device selects a small sample of real user emails and generates its own embeddings for them. The device then determines which synthetic embeddings most closely match the language, topic, and other characteristics of the user's emails. Using differential privacy, Apple learns which synthetic embeddings were most similar. In the next step, the company can curate those samples to further refine the data or start using them to train its AI.
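Apple's post describes this matching step only at a high level. As a rough illustration, the sketch below compares hypothetical on-device email embeddings against server-supplied synthetic embeddings using cosine similarity and picks the closest synthetic candidate; the vectors, function names, and averaging strategy are assumptions made for the example, not Apple's implementation.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_synthetic_match(user_embeddings: list[np.ndarray],
                         synthetic_embeddings: list[np.ndarray]) -> int:
    """Return the index of the synthetic embedding closest, on average,
    to the sampled on-device email embeddings (hypothetical matching rule)."""
    avg_scores = [
        np.mean([cosine_similarity(u, s) for u in user_embeddings])
        for s in synthetic_embeddings
    ]
    return int(np.argmax(avg_scores))

# Toy 4-dimensional embeddings; real embeddings are far larger.
rng = np.random.default_rng(0)
synthetic = [rng.normal(size=4) for _ in range(5)]   # received from the server
on_device = [rng.normal(size=4) for _ in range(3)]   # sampled user emails, never uploaded
print("Closest synthetic email index:", best_synthetic_match(on_device, synthetic))
```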
Also: Forget the new Siri: here's the advanced AI I use on my iPhone instead
In an example provided by Apple, imagine that an email about playing tennis turns out to be one of the top matches. A similar message is then generated by swapping “tennis” for “soccer” or another sport and added to the curation or training list. Varying the topic and other elements of each email helps the AI learn to produce better summaries for a wider variety of messages.
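To make the variation step concrete, here is a minimal sketch that swaps the topic word of a top-matching synthetic email to produce additional training candidates. The message text, topic list, and helper function are hypothetical; Apple's actual pipeline would use an LLM to rewrite the message rather than simple string substitution.

```python
# Hypothetical illustration of deriving topic variants from a top-matching synthetic email.
TOP_MATCH = "Want to play tennis tomorrow at 11:30?"
ALTERNATE_TOPICS = ["soccer", "basketball", "swimming", "chess"]

def make_variants(message: str, original_topic: str, topics: list[str]) -> list[str]:
    """Return copies of the message with the topic word swapped out."""
    return [message.replace(original_topic, topic) for topic in topics]

training_candidates = [TOP_MATCH] + make_variants(TOP_MATCH, "tennis", ALTERNATE_TOPICS)
for email in training_candidates:
    print(email)
```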
How does Apple protect your privacy?
If you're wondering how this process actually protects your privacy, device analytics sharing is turned off by default, so only people who opt in participate in the data training.
You can easily review, opt in, or opt out on any Apple device. Open Settings (System Settings on a Mac) and select Privacy & Security. Scroll down the screen and tap the Analytics & Improvements setting. You'll see the name and description of each option so you can decide which data, if any, you want to share. The full range of options will be available with iOS/iPadOS 18.5 and macOS 15.5.
Also: How to turn on the new Siri glow effect in iOS 18, and other settings you should change
In addition, the sampled user email data never leaves the device and is never shared with Apple. The device of someone who has opted in to analytics sharing sends only a signal indicating which synthetic emails are closest to the real user emails. That signal is not tied to an IP address, an Apple account, or any other data associated with the user.
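This anonymized signal is where the "differential" part of differential privacy comes in: before anything leaves the device, the report is randomized so that no single report can be trusted to reflect a particular user's emails. The sketch below uses randomized response, a classic local differential-privacy mechanism, to illustrate the idea; the epsilon value and candidate count are invented, and this is not Apple's actual mechanism.

```python
import math
import random

def randomized_response(true_index: int, num_candidates: int, epsilon: float) -> int:
    """Report the true best-match index with probability p, otherwise a random
    other index, so every report carries plausible deniability (local DP)."""
    p = math.exp(epsilon) / (math.exp(epsilon) + num_candidates - 1)
    if random.random() < p:
        return true_index
    others = [i for i in range(num_candidates) if i != true_index]
    return random.choice(others)

# The device sends only this integer: no email text, IP address, or account ID.
report = randomized_response(true_index=2, num_candidates=5, epsilon=1.0)
print("Reported index:", report)
```

Across many opted-in devices, the injected noise averages out, so Apple can still see which synthetic candidates are most representative overall even though no individual report reveals anything reliable about one user.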
Apple has already used differential privacy for its Genmoji feature, which uses AI to create personalized emoji based on your descriptions. In that case, the company was able to identify popular prompts and patterns without tying them to specific users. Going forward, Apple said it plans to expand its use of differential privacy to other AI features, including Image Playground, Image Wand, Memories Creation, Writing Tools, and Visual Intelligence.
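To see how aggregate trends can be recovered from such noisy reports, the sketch below debiases the observed counts from the randomized-response scheme shown earlier to estimate how many devices truly favored each candidate. The counts and epsilon are illustrative, and the estimator is the textbook one for k-ary randomized response, not a description of Apple's pipeline.

```python
import math

def estimate_true_counts(observed_counts: list[int], epsilon: float) -> list[float]:
    """Debias randomized-response counts: each device reports its true choice
    with probability p and any specific other choice with probability q, so
    E[observed_v] = n_v * p + (n - n_v) * q, giving n_v = (observed_v - n*q) / (p - q)."""
    k = len(observed_counts)
    n = sum(observed_counts)
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    q = 1.0 / (math.exp(epsilon) + k - 1)
    return [(obs - n * q) / (p - q) for obs in observed_counts]

# Illustrative report counts for 5 candidate synthetic emails from 700 devices.
observed = [117, 130, 206, 130, 117]
print([round(est) for est in estimate_true_counts(observed, epsilon=1.0)])
# The third candidate clearly stands out, even though any single report may be noise.
```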
Also: How I set ChatGPT as Siri's backup, and what else it can do on my iPhone
“By building on our many years of experience using techniques like differential privacy, as well as new techniques like synthetic data generation, we are able to improve Apple Intelligence features while protecting user privacy for users who opt in to the device analytics program,” Apple said in its blog post. “These techniques allow Apple to understand overall trends without learning information about any individual, like what prompts they use or the content of their emails.”