R | The mlr package helps you pick the machine-learning model (classification, regression) that best fits your data, plus a Python/R machine-learning cross-reference
1. R's mlr package
After installing it with install.packages("mlr"), you can see which machine-learning algorithms are available in R and which package each one comes from:
library(mlr)          # the listing functions live in mlr itself
a <- listLearners()   # returns a data frame with one row per available learner
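Because the result is an ordinary data frame, it can be filtered directly. As a small sketch (the property names below are the ones shown in the table that follows; exact columns may vary between mlr versions), this is one way to narrow the list down to learners that fit your data:

library(mlr)

lrns <- listLearners()                      # every learner mlr can construct
head(lrns[, c("class", "package", "numerics", "factors", "missings", "prob")])

# classifiers that output probabilities and tolerate missing values
listLearners("classif", properties = c("prob", "missings"))$class

# learners applicable to one concrete task
task <- makeClassifTask(data = iris, target = "Species")
listLearners(task)$class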
I first heard about this package in the CDA online course "R语言与机器学习实战" (R and Machine Learning in Practice) taught by Yu Wenhua; it looks excellent and is worth digging into further later on. The table below, taken from the listLearners() output (57 learners, covering classification, regression, clustering, and survival), shows where each algorithm comes from and what kinds of data it can handle.
# | class | name | short.name | package | note | type | installed | numerics | factors | ordered | missings | weights | prob | oneclass | twoclass | multiclass | class.weights | se | lcens | rcens | icens |
--- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
1 | classif.avNNet | Neural Network | avNNet | nnet | `size` has been set to `3` by default. Doing bagging training of `nnet` if set `bag = TRUE`. | classif | TRUE | TRUE | TRUE | FALSE | FALSE | TRUE | TRUE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE |
2 | classif.binomial | Binomial Regression | binomial | stats | Delegates to `glm` with freely choosable binomial link function via learner parameter `link`. | classif | TRUE | TRUE | TRUE | FALSE | FALSE | TRUE | TRUE | FALSE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
3 | classif.C50 | C50 | C50 | C50 | | classif | TRUE | TRUE | TRUE | FALSE | TRUE | TRUE | TRUE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE |
4 | classif.cforest | Random forest based on conditional inference trees | cforest | party | See `?ctree_control` for possible breakage for nominal features with missingness. | classif | TRUE | TRUE | TRUE | TRUE | TRUE | TRUE | TRUE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE |
5 | classif.ctree | Conditional Inference Trees | ctree | party | See `?ctree_control` for possible breakage for nominal features with missingness. | classif | TRUE | TRUE | TRUE | TRUE | TRUE | TRUE | TRUE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE |
6 | classif.cvglmnet | GLM with Lasso or Elasticnet Regularization (Cross Validated Lambda) | cvglmnet | glmnet | The family parameter is set to `binomial` for two-class problems and to `multinomial` otherwise. Factors automatically get converted to dummy columns, ordered factors to integer. | classif | TRUE | TRUE | TRUE | FALSE | FALSE | TRUE | TRUE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE |
7 | classif.gausspr | Gaussian Processes | gausspr | kernlab | Kernel parameters have to be passed directly and not by using the `kpar` list in `gausspr`. Note that `fit` has been set to `FALSE` by default for speed. | classif | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | TRUE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE |
8 | classif.gbm | Gradient Boosting Machine | gbm | gbm | `keep.data` is set to FALSE to reduce memory requirements. Note on param 'distribution': gbm will select 'bernoulli' by default for 2 classes, and 'multinomial' for multiclass problems. The latter is the only setting that works for > 2 classes. | classif | TRUE | TRUE | TRUE | FALSE | TRUE | TRUE | TRUE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE |
9 | classif.glmnet | GLM with Lasso or Elasticnet Regularization | glmnet | glmnet | The family parameter is set to `binomial` for two-class problems and to `multinomial` otherwise. Factors automatically get converted to dummy columns, ordered factors to integer. Parameter `s` (value of the regularization parameter used for predictions) is set to `0.1` by default, but needs to be tuned by the user. | classif | TRUE | TRUE | TRUE | FALSE | FALSE | TRUE | TRUE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE |
10 | classif.h2o.deeplearning | h2o.deeplearning | h2o.dl | h2o | | classif | TRUE | TRUE | TRUE | FALSE | FALSE | TRUE | TRUE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE |
11 | classif.h2o.gbm | h2o.gbm | h2o.gbm | h2o | 'distribution' is set automatically to 'gaussian'. | classif | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | TRUE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE |
12 | classif.h2o.glm | h2o.glm | h2o.glm | h2o | 'family' is always set to 'binomial' to get a binary classifier. | classif | TRUE | TRUE | TRUE | FALSE | FALSE | TRUE | TRUE | FALSE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
13 | classif.h2o.randomForest | h2o.randomForest | h2o.rf | h2o | | classif | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | TRUE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE |
14 | classif.knn | k-Nearest Neighbor | knn | class | | classif | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE |
15 | classif.ksvm | Support Vector Machines | ksvm | kernlab | Kernel parameters have to be passed directly and not by using the `kpar` list in `ksvm`. Note that `fit` has been set to `FALSE` by default for speed. | classif | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | TRUE | FALSE | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE |
16 | classif.lda | Linear Discriminant Analysis | lda | MASS | Learner parameter `predict.method` maps to `method` in `predict.lda`. | classif | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | TRUE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE |
17 | classif.logreg | Logistic Regression | logreg | stats | Delegates to `glm` with `family = binomial(link = "logit")`. | classif | TRUE | TRUE | TRUE | FALSE | FALSE | TRUE | TRUE | FALSE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
18 | classif.lssvm | Least Squares Support Vector Machine | lssvm | kernlab | `fitted` has been set to `FALSE` by default for speed. | classif | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE |
19 | classif.lvq1 | Learning Vector Quantization | lvq1 | class | | classif | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE |
20 | classif.mlp | Multi-Layer Perceptron | mlp | RSNNS | | classif | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | TRUE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE |
21 | classif.multinom | Multinomial Regression | multinom | nnet | | classif | TRUE | TRUE | TRUE | FALSE | FALSE | TRUE | TRUE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE |
22 | classif.naiveBayes | Naive Bayes | nbayes | e1071 | | classif | TRUE | TRUE | TRUE | FALSE | TRUE | FALSE | TRUE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE |
23 | classif.nnet | Neural Network | nnet | nnet | `size` has been set to `3` by default. | classif | TRUE | TRUE | TRUE | FALSE | FALSE | TRUE | TRUE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE |
24 | classif.plsdaCaret | Partial Least Squares (PLS) Discriminant Analysis | plsdacaret | caret | | classif | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | TRUE | FALSE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
25 | classif.probit | Probit Regression | probit | stats | Delegates to `glm` with `family = binomial(link = "probit")`. | classif | TRUE | TRUE | TRUE | FALSE | FALSE | TRUE | TRUE | FALSE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
26 | classif.qda | Quadratic Discriminant Analysis | qda | MASS | Learner parameter `predict.method` maps to `method` in `predict.qda`. | classif | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | TRUE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE |
27 | classif.randomForest | Random Forest | rf | randomForest | Note that the rf can freeze the R process if trained on a task with 1 feature which is constant. This can happen in feature forward selection, also due to resampling, and you need to remove such features with removeConstantFeatures. | classif | TRUE | TRUE | TRUE | TRUE | FALSE | FALSE | TRUE | FALSE | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE |
28 | classif.rpart | Decision Tree | rpart | rpart | `xval` has been set to `0` by default for speed. | classif | TRUE | TRUE | TRUE | TRUE | TRUE | TRUE | TRUE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE |
29 | classif.svm | Support Vector Machines (libsvm) | svm | e1071 | | classif | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | TRUE | FALSE | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE |
30 | classif.xgboost | eXtreme Gradient Boosting | xgboost | xgboost | All settings are passed directly, rather than through `xgboost`'s `params` argument. `nrounds` has been set to `1` by default. `num_class` is set internally, so do not set this manually. | classif | TRUE | TRUE | TRUE | FALSE | FALSE | TRUE | TRUE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE |
31 | cluster.dbscan | DBScan Clustering | dbscan | fpc | A cluster index of NA indicates noise points. Specify `method = "dist"` if the data should be interpreted as dissimilarity matrix or object. Otherwise Euclidean distances will be used. | cluster | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
32 | cluster.kkmeans | Kernel K-Means | kkmeans | kernlab | `centers` has been set to `2L` by default. The nearest center in kernel distance determines cluster assignment of new data points. Kernel parameters have to be passed directly and not by using the `kpar` list in `kkmeans` | cluster | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
33 | regr.avNNet | Neural Network | avNNet | nnet | `size` has been set to `3` by default. | regr | TRUE | TRUE | TRUE | FALSE | FALSE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
34 | regr.cforest | Random Forest Based on Conditional Inference Trees | cforest | party | See `?ctree_control` for possible breakage for nominal features with missingness. | regr | TRUE | TRUE | TRUE | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
35 | regr.ctree | Conditional Inference Trees | ctree | party | See `?ctree_control` for possible breakage for nominal features with missingness. | regr | TRUE | TRUE | TRUE | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
36 | regr.gausspr | Gaussian Processes | gausspr | kernlab | Kernel parameters have to be passed directly and not by using the `kpar` list in `gausspr`. Note that `fit` has been set to `FALSE` by default for speed. | regr | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | TRUE | FALSE | FALSE | FALSE |
37 | regr.gbm | Gradient Boosting Machine | gbm | gbm | `keep.data` is set to FALSE to reduce memory requirements, `distribution` has been set to `"gaussian"` by default. | regr | TRUE | TRUE | TRUE | FALSE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
38 | regr.glm | Generalized Linear Regression | glm | stats | 'family' must be a character and every family has its own link, i.e. family = 'gaussian', link.gaussian = 'identity', which is also the default. | regr | TRUE | TRUE | TRUE | FALSE | FALSE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | TRUE | FALSE | FALSE | FALSE |
39 | regr.glmnet | GLM with Lasso or Elasticnet Regularization | glmnet | glmnet | Factors automatically get converted to dummy columns, ordered factors to integer. Parameter `s` (value of the regularization parameter used for predictions) is set to `0.1` by default, but needs to be tuned by the user. | regr | TRUE | TRUE | TRUE | TRUE | FALSE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
40 | regr.h2o.deeplearning | h2o.deeplearning | h2o.dl | h2o | | regr | TRUE | TRUE | TRUE | FALSE | FALSE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
41 | regr.h2o.gbm | h2o.gbm | h2o.gbm | h2o | 'distribution' is set automatically to 'gaussian'. | regr | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
42 | regr.h2o.glm | h2o.glm | h2o.glm | h2o | 'family' is always set to 'gaussian'. | regr | TRUE | TRUE | TRUE | FALSE | FALSE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
43 | regr.h2o.randomForest | h2o.randomForest | h2o.rf | h2o | | regr | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
44 | regr.ksvm | Support Vector Machines | ksvm | kernlab | Kernel parameters have to be passed directly and not by using the `kpar` list in `ksvm`. Note that `fit` has been set to `FALSE` by default for speed. | regr | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
45 | regr.lm | Simple Linear Regression | lm | stats | | regr | TRUE | TRUE | TRUE | FALSE | FALSE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | TRUE | FALSE | FALSE | FALSE |
46 | regr.mob | Model-based Recursive Partitioning Yielding a Tree with Fitted Models Associated with each Terminal Node | mob | party | | regr | TRUE | TRUE | TRUE | FALSE | FALSE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
47 | regr.nnet | Neural Network | nnet | nnet | `size` has been set to `3` by default. | regr | TRUE | TRUE | TRUE | FALSE | FALSE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
48 | regr.randomForest | Random Forest | rf | randomForest | See `?regr.randomForest` for information about se estimation. Note that the rf can freeze the R process if trained on a task with 1 feature which is constant. This can happen in feature forward selection, also due to resampling, and you need to remove such features with removeConstantFeatures. | regr | TRUE | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | TRUE | FALSE | FALSE | FALSE |
49 | regr.rpart | Decision Tree | rpart | rpart | `xval` has been set to `0` by default for speed. | regr | TRUE | TRUE | TRUE | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
50 | regr.rvm | Relevance Vector Machine | rvm | kernlab | Kernel parameters have to be passed directly and not by using the `kpar` list in `rvm`. Note that `fit` has been set to `FALSE` by default for speed. | regr | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
51 | regr.svm | Support Vector Machines (libsvm) | svm | e1071 | | regr | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
52 | regr.xgboost | eXtreme Gradient Boosting | xgboost | xgboost | All settings are passed directly, rather than through `xgboost`'s `params` argument. `nrounds` has been set to `1` by default. | regr | TRUE | TRUE | TRUE | FALSE | FALSE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE |
53 | surv.cforest | Random Forest based on Conditional Inference Trees | crf | party,survival | See `?ctree_control` for possible breakage for nominal features with missingness. | surv | TRUE | TRUE | TRUE | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | TRUE | FALSE |
54 | surv.coxph | Cox Proportional Hazard Model | coxph | survival | | surv | TRUE | TRUE | TRUE | FALSE | FALSE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | TRUE | FALSE |
55 | surv.cvglmnet | GLM with Regularization (Cross Validated Lambda) | cvglmnet | glmnet | Factors automatically get converted to dummy columns, ordered factors to integer. | surv | TRUE | TRUE | TRUE | TRUE | FALSE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | TRUE | FALSE |
56 | surv.glmnet | GLM with Regularization | glmnet | glmnet | Factors automatically get converted to dummy columns, ordered factors to integer. Parameter `s` (value of the regularization parameter used for predictions) is set to `0.1` by default, but needs to be tuned by the user. | surv | TRUE | TRUE | TRUE | TRUE | FALSE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | TRUE | FALSE |
57 | surv.rpart | Survival Tree | rpart | rpart | `xval` has been set to `0` by default for speed. | surv | TRUE | TRUE | TRUE | TRUE | TRUE | TRUE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | FALSE | TRUE | FALSE |
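Once a candidate has been picked from the table, the usual mlr workflow is: define a task, construct the learner, and estimate its performance by resampling before committing to it. Below is a minimal sketch on the built-in iris data (the learner names come from the table above; the measures, fold count, and choice of candidates are illustrative, and the underlying randomForest, rpart, and e1071 packages are assumed to be installed):

library(mlr)

# 1. define the task: data plus the target column
task <- makeClassifTask(data = iris, target = "Species")

# 2. build a learner picked from the table, e.g. a random forest with probability output
lrn <- makeLearner("classif.randomForest", predict.type = "prob")

# 3. estimate performance with 5-fold cross-validation
rdesc <- makeResampleDesc("CV", iters = 5)
res <- resample(lrn, task, rdesc, measures = list(acc, mmce))
res$aggr

# 4. compare several candidates from the table with one benchmark run
lrns <- list(
  makeLearner("classif.rpart"),
  makeLearner("classif.naiveBayes"),
  makeLearner("classif.randomForest")
)
bmr <- benchmark(lrns, task, rdesc, measures = acc)
getBMRAggrPerformances(bmr, as.df = TRUE)

# 5. fit the chosen model on all of the data and predict
mod <- train(lrn, task)
pred <- predict(mod, newdata = iris)
head(as.data.frame(pred))

benchmark() is the step that most directly matches the idea of "picking the model that best fits the data": every candidate learner is evaluated under the same resampling plan, so the aggregated measures are directly comparable.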
2. A machine-learning cross-reference between Python and R