From InfoQ
1 week ago
Google Introduces VaultGemma: An Experimental Differentially Private LLM
Differential privacy is a mathematical technique for publishing statistical information derived from a dataset without leaking information about the individual samples it contains. This is typically achieved by injecting calibrated noise during training or querying, in such a way that the dataset's aggregate statistical properties are preserved while it becomes much harder to infer details about any specific sample.
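As a rough illustration of the idea, the sketch below applies the classic Laplace mechanism to a simple counting query rather than to model training; the function names and the epsilon value are purely illustrative, not part of VaultGemma's actual pipeline.

```python
import math
import random

def laplace_noise(scale):
    # Sample Laplace(0, scale) via inverse-CDF sampling.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, predicate, epsilon):
    # A counting query has sensitivity 1: adding or removing one
    # record changes the result by at most 1, so Laplace noise with
    # scale 1/epsilon yields epsilon-differential privacy.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# The noisy answer stays close to the true count on average,
# while any single record's presence is masked by the noise.
ages = list(range(100))
noisy = dp_count(ages, lambda a: a >= 50, epsilon=1.0)
```

Smaller epsilon values inject more noise and give stronger privacy at the cost of accuracy; differentially private training methods such as DP-SGD apply the same trade-off to per-example gradients instead of query results.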