【StatLearn】kNN Experiments in Statistical Learning (2)
This post continues from 【StatLearn】kNN Experiments in Statistical Learning (1).
Problem:
- Explore the data before classification using summary statistics or visualization
- Pre-process the data (such as denoising, normalization, feature selection, …)
- Try other distance metrics or distance-based voting
- Try other dimensionality reduction methods
- How to set the k value, if not using cross validation? Verify your idea
We visualize the data with a parallel coordinates plot. The data is first normalized so that its dynamic range lies in [0,1]; note that the normalization is applied to each feature separately.
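As a minimal sketch (the same normalization appears again in the full script below), each feature column is rescaled to [0,1] independently:
%Min-max normalize every feature column to [0,1].
%Column 1 of `data` is the class label and is left untouched.
X=data(:,2:end);
uniData=(X-repmat(min(X),size(X,1),1))./repmat(max(X)-min(X),size(X,1),1);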
Looking closely at the plots, we pick out the features whose class distributions overlap the least; feature selection simply keeps the attributes that separate the classes best as input to the next classification step. We select the features numbered (1), (2), (5), (6), (7) and (10). In my view, feature selection is a crude but simple way of doing dimensionality reduction and denoising at the same time, yet it can work remarkably well.
Based on this step, the parallel coordinates plots suggest that features (1), (2), (5), (6), (7) and (10) are well suited for classification. Running kNN on just these features gives the following results:
At k=1 the accuracy already reaches 85.38%, and compared with plain kNN or PCA+kNN, normalization plus feature selection improves the accuracy substantially. Other feature combinations can also be tried to push the result further.
MaxAccuracy= 0.8834 when k=17 (Normalization+FeatureSelection+KNN)
The denoising code is as follows:
function [DNData]=DataDenoising(InputData,KillRange)
%Trim outliers: for every feature column, sort the samples by that feature
%and drop the KillRange smallest and KillRange largest rows.
%Column 1 holds the class label, so the loop starts at column 2.
DNData=InputData;
for i=2:size(InputData,2)
    [temp,DNIndex]=sort(DNData(:,i));
    DNData=DNData(DNIndex(1+KillRange:end-KillRange),:);
end
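For example, dropping the two smallest and two largest samples of every feature from the training set (as the main script does below):
Trainning=DataDenoising(Trainning,2);
Note that the trimming is applied feature by feature, so rows removed for one feature are no longer candidates for the next.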
Using LLE for dimensionality reduction and comparing it against the schemes above:
MaxAccuracy= 0.9376 when k=23 (LLE dimensionality reduction to 2)
For the LLE algorithm, see this paper:
- Sam Roweis & Lawrence Saul. Nonlinear dimensionality reduction by locally linear embedding. Science, vol. 290, no. 5500, Dec. 22, 2000, pp. 2323–2326.
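A minimal usage sketch, assuming the reference lle.m implementation distributed with the paper (interface Y=lle(X,K,d); the variable names here are mine):
% X: D-by-N data matrix, one sample per column
% K: number of nearest neighbours for the local reconstruction
% d: target dimensionality of the embedding
X=uniData(:,2:end)';   %features only, transposed to D-by-N
Y=lle(X,5,2);          %2-D embedding with 5 neighbours, as in the script below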
Source code:
StatLearnProj.m
clear;
data=load('wine.data.txt');
%5-fold cross-validation with plain kNN
Accuracy=[];
for i=1:5
Test=data(i:5:end,:);
TestData=Test(:,2:end);
TestLabel=Test(:,1);
Trainning=setdiff(data,Test,'rows');
TrainningData=Trainning(:,2:end);
TrainningLabel=Trainning(:,1);
Accuracy=cat(1,Accuracy,CalcAccuracy(TestData,TestLabel,TrainningData,TrainningLabel));
end
AccuracyKNN=mean(Accuracy,1);
%PCA
Accuracy=[];
[Coeff,Score,Latent]=princomp(data(:,2:end));
dataPCA=[data(:,1),Score(:,1:6)];
Latent
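%A quick check (an added sketch, not part of the original experiment):
%the cumulative explained variance helps justify keeping 6 components.
ExplainedVar=cumsum(Latent)/sum(Latent)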
for i=1:5
Test=dataPCA(i:5:end,:);
TestData=Test(:,2:end);
TestLabel=Test(:,1);
Trainning=setdiff(dataPCA,Test,'rows');
TrainningData=Trainning(:,2:end);
TrainningLabel=Trainning(:,1);
Accuracy=cat(1,Accuracy,CalcAccuracy(TestData,TestLabel,TrainningData,TrainningLabel));
end
AccuracyPCA=mean(Accuracy,1);
BarData=[AccuracyKNN;AccuracyPCA];
bar(1:2:51,BarData');
%best accuracy for each method; the winning k is 2*I(1)-1
[D,I]=sort(AccuracyKNN,'descend');
D(1)
I(1)
[D,I]=sort(AccuracyPCA,'descend');
D(1)
I(1)
%pre-processing data
%Normalization
labs1={'(1)Alcohol','(2)Malic acid','(3)Ash','(4)Alcalinity of ash'};
labs2={'(5)Magnesium','(6)Total phenols','(7)Flavanoids','(8)Nonflavanoid phenols'};
labs3={'(9)Proanthocyanins','(10)Color intensity','(11)Hue','(12)OD280/OD315','(13)Proline'};
uniData=[];
for i=2:size(data,2)
uniData=cat(2,uniData,(data(:,i)-min(data(:,i)))/(max(data(:,i))-min(data(:,i))));
end
figure();
parallelcoords(uniData(:,1:4),'group',data(:,1),'labels',labs1);
figure();
parallelcoords(uniData(:,5:8),'group',data(:,1),'labels',labs2);
figure();
parallelcoords(uniData(:,9:13),'group',data(:,1),'labels',labs3);
%Normalization && Feature Selection
uniData=[data(:,1),uniData];
%Normalization, all features
Accuracy=[];
for i=1:5
Test=uniData(i:5:end,:);
TestData=Test(:,2:end);
TestLabel=Test(:,1);
Trainning=setdiff(uniData,Test,'rows');
TrainningData=Trainning(:,2:end);
TrainningLabel=Trainning(:,1);
Accuracy=cat(1,Accuracy,CalcAccuracy(TestData,TestLabel,TrainningData,TrainningLabel));
end
AccuracyNorm=mean(Accuracy,1);
%compare KNN, PCA and Normalization
BarData=[AccuracyKNN;AccuracyPCA;AccuracyNorm];
bar(1:2:51,BarData');
%Normalization & FS: select features 1 2 5 6 7 10
%(feature f is column f+1 of uniData; column 1 is the class label)
FSData=uniData(:,[1 2 3 6 7 8 11]);
size(FSData)
Accuracy=[];
for i=1:5
Test=FSData(i:5:end,:);
Trainning=setdiff(FSData,Test,'rows');
TestData=Test(:,2:end);
TestLabel=Test(:,1);
TrainningData=Trainning(:,2:end);
TrainningLabel=Trainning(:,1);
Accuracy=cat(1,Accuracy,CalcAccuracy(TestData,TestLabel,TrainningData,TrainningLabel));
end
AccuracyNormFS1=mean(Accuracy,1);
%Normalization & FS: features 1 6 7
FSData=uniData(:,[1 2 7 8]);
Accuracy=[];
for i=1:5
Test=FSData(i:5:end,:);
Trainning=setdiff(FSData,Test,'rows');
TestData=Test(:,2:end);
TestLabel=Test(:,1);
TrainningData=Trainning(:,2:end);
TrainningLabel=Trainning(:,1);
Accuracy=cat(1,Accuracy,CalcAccuracy(TestData,TestLabel,TrainningData,TrainningLabel));
end
AccuracyNormFS2=mean(Accuracy,1);
figure();
BarData=[AccuracyNorm;AccuracyNormFS1;AccuracyNormFS2];
bar(1:2:51,BarData');
[D,I]=sort(AccuracyNorm,'descend');
D(1)
I(1)
[D,I]=sort(AccuracyNormFS1,'descend');
D(1)
I(1)
[D,I]=sort(AccuracyNormFS2,'descend');
D(1)
I(1)
%denoising
%Normalization & FS: features 1 6 7
FSData=uniData(:,[1 2 7 8]);
Accuracy=[];
for i=1:5
Test=FSData(i:5:end,:);
Trainning=setdiff(FSData,Test,'rows');
Trainning=DataDenoising(Trainning,2);
TestData=Test(:,2:end);
TestLabel=Test(:,1);
TrainningData=Trainning(:,2:end);
TrainningLabel=Trainning(:,1);
Accuracy=cat(1,Accuracy,CalcAccuracy(TestData,TestLabel,TrainningData,TrainningLabel));
end
AccuracyNormFSDN=mean(Accuracy,1);
figure();
hold on
plot(1:2:51,AccuracyNormFSDN);
plot(1:2:51,AccuracyNormFS2,'r');
%other distance metrics
Dist='cityblock';
Accuracy=[];
for i=1:5
Test=uniData(i:5:end,:);
TestData=Test(:,2:end);
TestLabel=Test(:,1);
Trainning=setdiff(uniData,Test,'rows');
TrainningData=Trainning(:,2:end);
TrainningLabel=Trainning(:,1);
Accuracy=cat(1,Accuracy,CalcAccuracyPlus(TestData,TestLabel,TrainningData,TrainningLabel,Dist));
end
AccuracyNormCity=mean(Accuracy,1);
BarData=[AccuracyNorm;AccuracyNormCity];
figure();
bar(1:2:51,BarData');
[D,I]=sort(AccuracyNormCity,'descend');
D(1)
I(1)
%denoising
FSData=uniData(:,[1 2 7 8]);
Dist='cityblock';
Accuracy=[];
for i=1:5
Test=FSData(i:5:end,:);
TestData=Test(:,2:end);
TestLabel=Test(:,1);
Trainning=setdiff(FSData,Test,'rows');
Trainning=DataDenoising(Trainning,3);
TrainningData=Trainning(:,2:end);
TrainningLabel=Trainning(:,1);
Accuracy=cat(1,Accuracy,CalcAccuracyPlus(TestData,TestLabel,TrainningData,TrainningLabel,Dist));
end
AccuracyNormCityDN=mean(Accuracy,1);
figure();
hold on
plot(1:2:51,AccuracyNormCityDN);
plot(1:2:51,AccuracyNormCity,'r');
%call lle
data=load('wine.data.txt');
uniData=[];
for i=2:size(data,2)
uniData=cat(2,uniData,(data(:,i)-min(data(:,i)))/(max(data(:,i))-min(data(:,i))));
end
uniData=[data(:,1),uniData];
LLEData=lle(uniData(:,2:end)',5,2);
%size(LLEData)
LLEData=LLEData';
LLEData=[data(:,1),LLEData];
Accuracy=[];
for i=1:5
Test=LLEData(i:5:end,:);
TestData=Test(:,2:end);
TestLabel=Test(:,1);
Trainning=setdiff(LLEData,Test,'rows');
Trainning=DataDenoising(Trainning,2);
TrainningData=Trainning(:,2:end);
TrainningLabel=Trainning(:,1);
Accuracy=cat(1,Accuracy,CalcAccuracyPlus(TestData,TestLabel,TrainningData,TrainningLabel,'cityblock'));
end
AccuracyLLE=mean(Accuracy,1);
[D,I]=sort(AccuracyLLE,'descend');
D(1)
I(1)
BarData=[AccuracyNorm;AccuracyNormFS2;AccuracyNormFSDN;AccuracyLLE];
figure();
bar(1:2:51,BarData');
save('ProcessingData.mat');
CalcAccuracy.m
function Accuracy=CalcAccuracy(TestData,TestLabel,TrainningData,TrainningLabel)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%Calculate the classification accuracy for k=1,3,...,51.
%TestData: M*D matrix, where D is the dimension and M the number of test samples
%TrainningData: T*D matrix
%TestLabel: labels of TestData
%TrainningLabel: labels of TrainningData
%Accuracy(j) holds the accuracy for k=2*j-1.
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
CompareResult=[];
for k=1:2:51
    ClassResult=knnclassify(TestData,TrainningData,TrainningLabel,k);
    CompareResult=cat(2,CompareResult,(ClassResult==TestLabel));
end
SumCompareResult=sum(CompareResult,1);
Accuracy=SumCompareResult/length(CompareResult(:,1));
CalcAccuracyPlus.m
function Accuracy=CalcAccuracyPlus(TestData,TestLabel,TrainningData,TrainningLabel,Dist)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%Same as CalcAccuracy, but with a selectable distance metric.
%TestData: M*D matrix, where D is the dimension and M the number of test samples
%TrainningData: T*D matrix
%TestLabel: labels of TestData
%TrainningLabel: labels of TrainningData
%Dist: distance metric passed on to knnclassify (e.g. 'euclidean','cityblock')
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
CompareResult=[];
for k=1:2:51
    ClassResult=knnclassify(TestData,TrainningData,TrainningLabel,k,Dist);
    CompareResult=cat(2,CompareResult,(ClassResult==TestLabel));
end
SumCompareResult=sum(CompareResult,1);
Accuracy=SumCompareResult/length(CompareResult(:,1));
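The problem statement also asks for distance-based voting, which none of the experiments above implement. Below is a minimal sketch of inverse-distance-weighted voting; it assumes the Statistics Toolbox function knnsearch, and the function name WeightedKnnClassify and its interface are my own, not part of the original code:
WeightedKnnClassify.m
function PredLabel=WeightedKnnClassify(TestData,TrainningData,TrainningLabel,k)
%Inverse-distance-weighted kNN: each of the k nearest neighbours votes
%with weight 1/(d+eps), so closer neighbours count more.
[Idx,D]=knnsearch(TrainningData,TestData,'K',k);
Classes=unique(TrainningLabel);
PredLabel=zeros(size(TestData,1),1);
for i=1:size(TestData,1)
    W=1./(D(i,:)+eps);               %weights of the k neighbours of sample i
    Votes=zeros(length(Classes),1);
    for j=1:k
        c=find(Classes==TrainningLabel(Idx(i,j)));
        Votes(c)=Votes(c)+W(j);      %accumulate the weighted vote
    end
    [~,Best]=max(Votes);
    PredLabel(i)=Classes(Best);
end
Swapping this in for knnclassify inside CalcAccuracyPlus would allow a direct comparison between weighted and unweighted voting.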