How to add noise (differential privacy) to clients' weights in federated learning?

I want to add noise to the gradients on the client side. I replaced tf.keras.optimizers.Adam() with DPKerasAdamOptimizer() from TensorFlow Privacy, but it doesn't work.

    iterative_process = tff.learning.build_federated_averaging_process(
        model_fn=Create_tff_model,
        client_optimizer_fn=lambda: DPKerasAdamOptimizer(1,1.85))

The error is

AssertionError: Neither _compute_gradients() or get_gradients() on the differentially private optimizer was called. This means the training is not differentially private. It may be the case that you need to upgrade to TF 2.4 or higher to use this particular optimizer.

I can add noise on the server side using tff.learning.model_update_aggregator.dp_aggregator(noise_multiplier, clients_per_round), but how do I add noise on the client side?



Solution 1:[1]

First, have a look at the tutorial Differential Privacy in TFF, which shows the simple Gaussian mechanism using tff.learning.model_update_aggregator.dp_aggregator.

If you would like to customize the details of the mechanism, you can either look at how dp_aggregator is implemented, in particular at tff.aggregators.DifferentiallyPrivateFactory, which is parameterized by a TensorFlow Privacy DPQuery object, or write a custom aggregator from scratch.

Note that using DPKerasAdamOptimizer as the client optimizer is probably not the right path: the interesting part is usually to privatize whatever data leaves the client, whereas the intermediate steps taken at a client do not need to be private.

Solution 2:[2]

Try using the Thread.CurrentThread property to get the information you are looking for.

    // Shared state the original snippet left undeclared.
    var tasks = new List<Task<Double>>();
    var rnd = new Random();
    var rndLock = new Object();   // Random is not thread-safe; guard it with a lock

    // Helper that reports which thread a task runs on via Thread.CurrentThread.
    void ShowThreadInformation(String taskName) =>
        Console.WriteLine("{0} on thread {1}",
                          taskName, Thread.CurrentThread.ManagedThreadId);

    Task<Double> t = Task.Run(() => {
        ShowThreadInformation("Main Task (Task #" + Task.CurrentId + ")");
        for (int ctr = 1; ctr <= 20; ctr++)
            tasks.Add(Task.Factory.StartNew(() => {
                ShowThreadInformation("Task #" + Task.CurrentId);
                long s = 0;
                for (int n = 0; n <= 999999; n++) {
                    lock (rndLock) {
                        s += rnd.Next(1, 1000001);
                    }
                }
                return s / 1000000.0;   // mean of one million draws
            }));

        Task.WaitAll(tasks.ToArray());
        Double grandTotal = 0;
        Console.WriteLine("Means of each task: ");
        foreach (var child in tasks) {
            Console.WriteLine("   {0}", child.Result);
            grandTotal += child.Result;
        }
        Console.WriteLine();
        return grandTotal / 20;
    });
    Console.WriteLine("Mean of Means: {0}", t.Result);

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Jakub Konecny
Solution 2: Ricardo Ortega Magaña