Engineering Applications of Machine Learning and Data Analytics Homework #3
1 The $\ell_2$ Support Vector Machine [20pts]

In class, we discussed that if our data are not linearly separable, then we need to modify our optimization problem to include slack variables. The formulation used in class is known as the $\ell_1$-norm soft margin SVM. Now consider the formulation of the $\ell_2$-norm soft margin SVM, which squares the slack variables within the sum. Notice that the non-negativity constraint on the slack variables has been removed.

\[
\arg\min_{w,\,b,\,\xi} \;\; \frac{1}{2}\|w\|^2 + \frac{C}{2}\sum_{i=1}^{n}\xi_i^2
\qquad \text{s.t.} \quad y_i(w^T x_i + b) \ge 1 - \xi_i \;\; \forall i \in [n]
\]

Derive the dual form expression along with any constraints. Work must be shown.

Hint: Refer to the methodology that was used in class to derive the dual form. The solution is given by

\[
\arg\max_{\alpha} \;\; \sum_{i=1}^{n}\alpha_i - \frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n}\alpha_i\alpha_j y_i y_j x_i^T x_j - \frac{1}{2C}\sum_{i=1}^{n}\alpha_i^2
\qquad \text{s.t.} \quad \alpha_i \ge 0 \;\; \forall i \in [n] \quad \text{and} \quad \sum_{i=1}^{n}\alpha_i y_i = 0
\]

2 Domain Adaptation Support Vector Machines [20pts]

We now look at a different type of SVM that is designed for domain adaptation: it optimizes the source hyperplane $w_S$ before optimizing the target hyperplane $w_T$. The process begins by training a support vector machine on the source data; then, once data from the target are available, a new SVM is trained using the hyperplane from the first SVM together with the target data to solve for a new "domain adaptation" SVM. The primal optimization problem is given by

\[
\arg\min_{w_T,\,\xi} \;\; \frac{1}{2}\|w_T\|^2 + C\sum_{i=1}^{n}\xi_i - B\, w_T^T w_S
\]
\[
\text{s.t.} \quad y_i(w_T^T x_i + b) \ge 1 - \xi_i \;\; \forall i \in \{1,\dots,n\}, \qquad \xi_i \ge 0 \;\; \forall i \in \{1,\dots,n\}
\]

where $w_S$ is the hyperplane trained on the source data (assumed to be known), $w_T$ is the hyperplane for the target, $y_i \in \{\pm 1\}$ is the label for instance $x_i$, $C$ and $B$ are regularization parameters defined by the user, and $\xi_i$ is a slack variable for instance $x_i$. The problem becomes finding a hyperplane $w_T$ that minimizes the above objective function subject to the constraints. Solve/derive the dual optimization problem.

Note: I will give the class the solution to this problem prior to the due date, because Problem #3 requires that you implement this algorithm in code.

3 Domain Adaptation SVM (Code) [20pts]

Implement the domain adaptation SVM from Problem #2. Data sets for the source and target domains (both training and testing) have been uploaded to D2L. There are several ways to implement this algorithm. If I were doing this assignment, I would implement both the domain adaptation SVM and the normal SVM directly using quadratic programming. You do not need to build the full classifier (i.e., solve for the bias term); however, you will need to find $w_T$ and $w_S$. To find the weight vectors, you will need to solve a quadratic programming problem; look through the documentation of your chosen package to learn how to solve this optimization task. The following Python packages are recommended:

• CVXOPT (https://cvxopt.org/)
• CVXPY (https://www.cvxpy.org/install/)

Note: Your solution can (and should) use any of the packages above.
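As a starting point, the sketch below shows one possible way (not the required solution) to pose a soft-margin SVM dual as a quadratic program with CVXOPT's solvers.qp, for example to obtain $w_S$ from the source data. The function name svm_dual_qp, the placeholder file names, and the value C = 1.0 are illustrative assumptions and are not part of the assignment; the domain adaptation SVM would reuse the same solvers.qp call with the P, q, G, h, A, b implied by the dual you derive in Problem #2 (where $w_S$ enters through the linear term).

# Minimal sketch: standard l1 soft-margin SVM dual solved as a QP with CVXOPT.
# Assumes y has labels in {+1, -1}; names and file paths below are placeholders.
import numpy as np
from cvxopt import matrix, solvers

def svm_dual_qp(X, y, C):
    """Solve min_a 0.5 a^T P a + q^T a  s.t.  G a <= h,  A a = b,
    with P_ij = y_i y_j x_i^T x_j, q = -1, 0 <= a_i <= C, sum_i a_i y_i = 0."""
    n = X.shape[0]
    K = X @ X.T                                            # linear-kernel Gram matrix
    P = matrix((np.outer(y, y) * K).astype(float))         # quadratic term of the dual
    q = matrix(-np.ones(n))                                # maximize sum(alpha) -> minimize -sum(alpha)
    G = matrix(np.vstack([-np.eye(n), np.eye(n)]))         # stacked inequalities
    h = matrix(np.hstack([np.zeros(n), C * np.ones(n)]))   # 0 <= alpha_i <= C
    A = matrix(y.astype(float).reshape(1, -1))             # equality: sum_i alpha_i y_i = 0
    b = matrix(0.0)
    sol = solvers.qp(P, q, G, h, A, b)
    alpha = np.ravel(sol['x'])
    w = (alpha * y) @ X                                    # w = sum_i alpha_i y_i x_i
    return w, alpha

# Illustrative usage with hypothetical file names (replace with the D2L data):
# X_src = np.load('source_train_X.npy'); y_src = np.load('source_train_y.npy')
# w_S, _ = svm_dual_qp(X_src, y_src, C=1.0)

The same pattern extends to the target-domain problem: once the dual from Problem #2 is known, only the vectors and matrices passed to solvers.qp change, so most of the code above can be reused.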