Question 1. Adversarial training of a linear classifier. Consider adversarially training a linear model on the soft-SVM loss. The minimax objective is given by
\[
\min_{w}\; \mathbb{E}_{x,y} \max_{\|\delta\|_{p} \le \varepsilon} \bigl[\max\bigl(0,\, 1 - y\,w^{\top}(x+\delta)\bigr)\bigr], \tag{1}
\]
where $x$ is the instance and $y \in \{-1, +1\}$ is the label. In this question, we will simplify Equation (1).
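To make Equation (1) concrete, the following is a minimal numerical sketch (with made-up weights, data point, and radius) that approximates the inner maximization by random search over perturbations in the $\ell_p$ ball; it is only an illustration and not part of what the question asks for.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)

def hinge_loss(w, x, y, delta):
    # Soft-SVM (hinge) loss evaluated at the perturbed point x + delta.
    return max(0.0, 1.0 - y * w @ (x + delta))

def approx_inner_max(w, x, y, eps, p, n_samples=10000):
    # Approximate max_{||delta||_p <= eps} of the hinge loss by random search.
    best = hinge_loss(w, x, y, np.zeros_like(x))
    for _ in range(n_samples):
        delta = rng.uniform(-1.0, 1.0, size=x.shape)
        norm = np.linalg.norm(delta, ord=p)
        if norm > 0:
            delta = eps * delta / norm  # push the sample onto the l_p sphere
        best = max(best, hinge_loss(w, x, y, delta))
    return best

# Hypothetical toy values, chosen only for illustration.
w = np.array([1.0, -2.0, 0.5])
x = np.array([0.2, 0.1, -0.3])
print(approx_inner_max(w, x, y=1, eps=0.1, p=np.inf))
\end{verbatim}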
Question 1.1. Let $p = \infty$. For a fixed weight vector $w$ and data point $x$, show that the optimal perturbation $\delta^{*}(x)$ that maximizes the soft-SVM loss has a closed-form solution:
\[
\delta^{*} = -y\varepsilon\,\mathrm{sign}(w),
\]
where
\[
\mathrm{sign}(a) =
\begin{cases}
1, & \text{if } a > 0;\\
0, & \text{if } a = 0;\\
-1, & \text{if } a < 0,
\end{cases}
\]
and $\mathrm{sign}(w)$ denotes applying the sign operation element-wise to the vector $w$.
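As an optional sanity check (again with hypothetical numbers, not required by the question), one can verify numerically that the stated $\delta^{*}$ attains at least as large a hinge loss as random perturbations drawn from the $\ell_\infty$ ball:
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(1)

w = np.array([1.0, -2.0, 0.5])   # hypothetical weight vector
x = np.array([0.2, 0.1, -0.3])   # hypothetical data point
y, eps = 1, 0.1

def loss(delta):
    # Soft-SVM (hinge) loss at the perturbed point x + delta.
    return max(0.0, 1.0 - y * w @ (x + delta))

# Closed-form worst-case perturbation stated in Question 1.1 (p = infinity).
delta_star = -y * eps * np.sign(w)

# Random perturbations drawn uniformly inside the l_inf ball of radius eps.
random_losses = [loss(rng.uniform(-eps, eps, size=x.shape)) for _ in range(10000)]

assert loss(delta_star) >= max(random_losses)
print(loss(delta_star), max(random_losses))
\end{verbatim}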
Question 1.2. Let $p = \infty$. Simplify the adversarial training objective in Equation (1).
Question 1.3. Simplify the adversarial training objective in Equation (1) for general $p \ge 1$.