tip: the lecturer talks extremely fast... painful = =

Linear Classifiers: Loss Functions and Optimization

Multiclass SVM loss: \(L_{i} = \sum_{j \neq y_{i}} \max(0, s_{j} - s_{y_{i}} + 1)\)

\(Loss = \frac{1}{N} \sum_{i=1}^{N} L_{i}\)

Q1: What if the sum was instead over all classes (including j = y_i)?
A1: From the computation we can see this changes nothing useful: in the formula it amounts to adding 1, because for j = y_i the extra term is \(\max(0, s_{y_{i}} - s_{y_{i}} + 1) = 1\), so each \(L_{i}\) just grows by a constant 1 and the best W stays the same.
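As a quick sanity check of the formula above, here is a minimal sketch of the per-example hinge loss (the helper name `svm_loss_single`, the score vector, and the label index are illustrative assumptions, not code from the course):

```python
import numpy as np

def svm_loss_single(scores, y_i, delta=1.0):
    """Multiclass SVM loss L_i for one example, given its class scores."""
    loss = 0.0
    for j in range(len(scores)):
        if j == y_i:
            continue  # skip j = y_i; including it would only add max(0, 1) = 1 (see Q1 above)
        loss += max(0.0, scores[j] - scores[y_i] + delta)  # hinge term max(0, s_j - s_{y_i} + 1)
    return loss

# Hypothetical 3-class score vector for one example whose true class is 0
scores = np.array([3.2, 5.1, -1.7])
print(svm_loss_single(scores, y_i=0))  # max(0, 5.1-3.2+1) + max(0, -1.7-3.2+1) = 2.9 (up to float rounding)
```

The full dataset loss is then just the mean of these per-example values, matching \(Loss = \frac{1}{N} \sum_{i=1}^{N} L_{i}\).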
# -*- coding: utf-8 -*-
# CS231n example of linear vs. non-linear classifiers (Softmax)
# Note the backpropagation computation in what follows
import numpy as np
import matplotlib.pyplot as plt

N = 100  # number of points per class
D = 2    # dimensionality
K = 3    # number of classes
X = np.zeros((N*K, D))  # data matrix (each row = single example)