Building Hadoop 3.2.1 on 64-bit Windows 10 with VS2015

1 Environment Setup

1.1 Download and Install the JDK

1.1.1 Download

JDK 1.8 (jdk1.8.0_102)
Download: http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html (pick the build that matches your machine)

1.1.2 Installation

Extract to the folder of your choice.

(1) Install the JDK.
(2) Create the system variable JAVA_HOME = D:\Program Files\Java\jdk1.8.0_102
(3) Edit the system variable Path and append %JAVA_HOME%\bin and %JAVA_HOME%\jre\bin
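The same configuration can also be scripted from an elevated cmd window; a minimal sketch (the JDK path is the one used above; note that setx /M writes machine-level variables and flattens any %...% references already stored in Path):

rem sketch only: set JAVA_HOME and extend Path (run as Administrator)
setx JAVA_HOME "D:\Program Files\Java\jdk1.8.0_102" /M
setx Path "%Path%;D:\Program Files\Java\jdk1.8.0_102\bin;D:\Program Files\Java\jdk1.8.0_102\jre\bin" /M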

1.2 Download and Install Maven

1.2.1 Download

Reference: https://blog.csdn.net/changge458/article/details/53576178

Download from http://maven.apache.org/download.cgi

1.2.2 Installation

Extract to the folder D:\marven\apache-maven-3.6.3

Add the system environment variable MARVEN_HOME (the author's name for it; MAVEN_HOME is the conventional one) pointing to D:\marven\apache-maven-3.6.3\, then add its bin directory (%MARVEN_HOME%\bin) to the Path system variable.

To test whether mvn is installed, open cmd as Administrator; otherwise you can easily get the error: 'mvn' is not recognized as an internal or external command, operable program or batch file.

To check the configuration, run mvn -v in the console; if it prints the Maven version information, the setup succeeded (the original screenshot is omitted).
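For reference, a successful check looks roughly like this (versions and paths are illustrative and will vary with your installation):

C:\> mvn -v
Apache Maven 3.6.3
Maven home: D:\marven\apache-maven-3.6.3
Java version: 1.8.0_102, vendor: Oracle Corporation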

Open D:\marven\apache-maven-3.6.3\conf\settings.xml and add the Aliyun mirror: find the <mirrors> tag and add the entry below. Maven downloads its dependencies much faster from this mirror.

<mirror>
  <id>nexus-aliyun</id>
  <mirrorOf>central</mirrorOf>
  <name>Nexus aliyun</name>
  <url>http://maven.aliyun.com/nexus/content/groups/public</url>
</mirror>

1.3 Build and Install Protobuf

Protobuf serializes data; the Hadoop build depends on this library.

1.3.1 Download

Protocol Buffers 2.5.0 (two files: protobuf-2.5.0.zip and protoc-2.5.0-win32.zip)
Download: https://github.com/google/protobuf/releases/tag/v2.5.0

Note: besides the protobuf source, you also need the matching prebuilt Windows protoc command (protoc-2.5.0-win32.zip); protoc converts .proto files into Java or C++ source files.

1.3.2 Installation


Extract protobuf-2.5.0.zip to a directory of your choice.

Extract protoc-2.5.0-win32.zip and copy protoc.exe into C:\WorkSpace\protobuf-2.5.0\src.

Install the Protobuf Java runtime from a CMD window:

cd C:\WorkSpace\protobuf-2.5.0\java
mvn test
mvn install
protoc --version    (this command still fails at this point, because protoc.exe is not yet on the Path; the next step fixes that)

Add the directory containing protoc.exe, C:\WorkSpace\protobuf-2.5.0\src, to the system variable Path.
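Once the new Path takes effect (open a fresh cmd window), the version check should pass; for this release it prints:

C:\> protoc --version
libprotoc 2.5.0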

1.4 Download and Install Git

Git makes Linux-style commands (notably bash) available from the Windows cmd console. Without it, the build fails with:

Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (pre-dist) on project hadoop-project-dist: Command execution failed.: Cannot run program "bash" (in directory "D:\hadoop\hadoop-3.2.1-src\hadoop-project-dist\target"): CreateProcess error=2, The system cannot find the file specified. -> [Help 1]

1.4.1 Download

https://git-for-windows.github.io/

1.4.2 Installation

Run the installer and work through the setup wizard. (The original post illustrated installer steps (1) to (5) with screenshots, which are omitted here.)
1.5 Install CMake

1.5.1 Download

https://cmake.org/download/
Windows win64-x64 ZIP: cmake-3.16.0-win64-x64.zip

1.5.2 Installation

Just extract to a folder and add the bin directory to the Path environment variable:

D:\hadoop\cmake-3.16.0-win64-x64\bin
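A quick sanity check from a new cmd window (the output shown is illustrative for this version):

C:\> cmake --version
cmake version 3.16.0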

1.6 Download and Install Zlib

1.6.1 Download

http://jaist.dl.sourceforge.net/project/libpng/zlib/1.2.8/

1.6.2 Installation

Extract to a folder and add a zlib system variable pointing at it.
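The post does not name the variable; Hadoop's BUILDING.txt uses ZLIB_HOME to locate zlib on Windows, so assuming the archive was extracted to D:\zlib-1.2.8, a sketch:

rem variable name per Hadoop's BUILDING.txt; the path is an assumption
setx ZLIB_HOME "D:\zlib-1.2.8" /M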

2 Building Hadoop

2.1 Upgrade the Projects' Visual Studio Version

When building Hadoop, the build automatically picks a Visual Studio compiler based on the project files. Hadoop 3.2.1 defaults to VS2010 (officially Visual Studio 2010 Professional), but this post builds with Visual Studio 2015, so the project files winutils.sln and native.sln must be upgraded: open each one in Visual Studio 2015, and when the upgrade dialog pops up, accept it to convert the project to the Visual Studio 2015 toolset:

(1) Windows tools (winutils):

D:\hadoop\hadoop-3.2.1-src\hadoop-common-project\hadoop-common\src\main\winutils

(2) hadoop.dll:

D:\hadoop\hadoop-3.2.1-src\hadoop-common-project\hadoop-common\src\main\native

This project's build output is the hadoop.dll file. If you click Build in the IDE it reports many missing-header errors, because the header files are included from ..\..\..\target\native\javah; during the Maven build, the build commands first copy the generated headers into that folder automatically, and only then can this project compile.

Open a cmd window, cd into D:\hadoop\hadoop-3.2.1-src\, and run:

mvn package -Pdist,native-win -DskipTests -Dtar

2.2 Apache Hadoop Common Build Errors

2.2.1 convert-ms-winutils Error

Error description:

[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 12.916 s]
[INFO] Apache Hadoop Common ............................... FAILURE [02:06 min]
[INFO] Apache Hadoop NFS .................................. SKIPPED
[INFO] Apache Hadoop KMS .................................. SKIPPED
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (convert-ms-winutils) on project hadoop-common: Command execution failed.: Process exited with an error: 1 (Exit value: 1) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <args> -rf :hadoop-common

Solution:

(1) Confirm that both projects from section 2.1 were upgraded successfully; both .sln files should show Visual Studio 14.
(2) From the VS2015 x64 Native Tools Command Prompt, run: mvn package -Pdist,native-win -DskipTests -Dtar -e -X

2.2.2 javah Error

Error description:

[ERROR] Failed to execute goal org.codehaus.mojo:native-maven-plugin:1.0-alpha-8:javah (default) on project hadoop-common: Error running javah command: Error executing command line. Exit code:1 -> [Help 1]

org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.codehaus.mojo:native-maven-plugin:1.0-alpha-8:javah (default) on project hadoop-common: Error running javah command

Solution:

In the pom.xml under D:\hadoop\hadoop-3.2.1-src\hadoop-common-project\hadoop-common, the element <javahPath>${env.JAVA_HOME}/bin/javah</javahPath> relies on env.JAVA_HOME, which is not resolved here, so replace it in pom.xml with the absolute path to javah (D:\Java\jdk1.8.0_181\bin\); it occurs in two places.
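For illustration, each of the two occurrences ends up looking like this (the JDK path is the one the original post used):

<javahPath>D:\Java\jdk1.8.0_181\bin\javah</javahPath>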

Then resume the build with the -rf :hadoop-common argument, which restarts the build from the hadoop-common module:

mvn package -Pdist,native-win -DskipTests -Dtar -rf :hadoop-common

2.3 HDFS Native Client Build Error

2.3.1 Error Description

[INFO] Apache Hadoop HDFS Native Client ................... FAILURE [02:26 min]

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs-native-client: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...<exec failonerror="true" dir="D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target/native" executable="cmake">... @ 5:140 in D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target\antrun\build-main.xml
[ERROR] -> [Help 1]

org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs-native-client: An Ant BuildException has occured: exec returned: 1
around Ant part ...<exec failonerror="true" dir="D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target/native" executable="cmake">... @ 5:140 in D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target\antrun\build-main.xml

2.3.2 Solution

Open D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\pom.xml and change the true in the section the original post highlighted to false (the screenshot showing the exact element has not survived).
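Judging from the cmake error above and the msbuild log in section 2.6 (where failonerror is already "false" on the same kind of task), the attribute in question is most likely failonerror on the antrun <exec> task for cmake; under that assumption, the edit would be:

<exec failonerror="false" ...>    (was failonerror="true")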

2.4 hadoop-hdfs-native-client RelWithDebInfo Build Error

2.4.1 Error Description

Finished at: 2019-12-01T18:20:30+08:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs-native-client: An Ant BuildException has occured: D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target\native\bin\RelWithDebInfo does not exist.
[ERROR] around Ant part ...<copy todir="D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target/bin">... @ 13:101 in D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target\antrun\build-main.xml
[ERROR] -> [Help 1]

2.4.2 Solution

The error means the directory D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target\native\bin\RelWithDebInfo does not exist, so create it.
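A one-line fix, sketched with the path taken verbatim from the error message:

mkdir D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target\native\bin\RelWithDebInfo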

2.5 exec-maven-plugin:1.3.1:exec (pre-dist) Failure

2.5.1 Error Description

Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (pre-dist) on project hadoop-hdfs-native-client: Command execution failed.: Process exited with an error: 1 (Exit value: 1) -> [Help 1]

org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (pre-dist) on project hadoop-hdfs-native-client: Command execution failed.

2.5.2 Solution

Download Cygwin from http://www.cygwin.com/setup-x86_64.exe, install it, and add D:\cygwin64\bin to PATH.

Then open a Cygwin64 terminal and run:

mvn package -Pdist,native-win -DskipTests -Dtar -e -X -rf :hadoop-hdfs-native-client

2.6 msbuild Build Error

2.6.1 Error Description

Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs-native-client: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "msbuild" (in directory "D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target\native"): CreateProcess error=2, The system cannot find the file specified.

[ERROR] around Ant part ...<exec failonerror="false" dir="D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target/native" executable="msbuild">... @ 9:143 in D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target\antrun\build-main.xml

2.6.2 Solution

Download and install VS Code (https://code.visualstudio.com/docs/?dv=win), as the original post suggests; the operative step, though, is adding the MSBuild 14.0 directory (installed with Visual Studio 2015) to the Path environment variable:

C:\Program Files (x86)\MSBuild\14.0\Bin
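A quick check from a fresh cmd window that msbuild now resolves (the version line is illustrative for VS2015):

C:\> msbuild /version
Microsoft (R) Build Engine version 14.0.23107.0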

2.7 maven-surefire-plugin Build Error

2.7.1 Error Description

Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M1:test (default-test)

2.7.2 Solution

In D:\hadoop\hadoop-3.2.1-src\hadoop-common-project\hadoop-common\pom.xml, find the org.apache.maven.plugins maven-surefire-plugin entry and add <testFailureIgnore>true</testFailureIgnore> to its <configuration> so that test failures do not fail the build.

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <testFailureIgnore>true</testFailureIgnore>
    <forkCount>${testsThreadCount}</forkCount>
    <reuseForks>false</reuseForks>
    <argLine>${maven-surefire-plugin.argLine} -DminiClusterDedicatedDirs=true</argLine>
    <systemPropertyVariables>
      <testsThreadCount>${testsThreadCount}</testsThreadCount>
      <test.build.data>${test.build.data}/${surefire.forkNumber}</test.build.data>
      <test.build.dir>${test.build.dir}/${surefire.forkNumber}</test.build.dir>
      <hadoop.tmp.dir>${hadoop.tmp.dir}/${surefire.forkNumber}</hadoop.tmp.dir>
      <!-- Due to a Maven quirk, setting this to just -->
      <!-- surefire.forkNumber won't do the parameter substitution. -->
      <!-- Putting a prefix in front of it like "fork-" makes it -->
      <!-- work. -->
      <test.unique.fork.id>fork-${surefire.forkNumber}</test.unique.fork.id>
    </systemPropertyVariables>
  </configuration>
</plugin>

2.8 Apache Hadoop Distribution Build Error

2.8.1 Error Description

[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (dist) on project hadoop-dist: Command execution failed.: Process exited with an error: 1 (Exit value: 1) -> [Help 1]

2.8.2 Solution

Run mvn package -Pdist,native-win -DskipTests -Dtar -e -X -rf :hadoop-dist; the -e and -X flags print detailed errors. They reveal a directory-layout problem: the path below cannot be found. The fix is to create the missing path and copy the generated jar artifacts into it.

[DEBUG] Executing command line: [bash, D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../dev-support/bin/dist-layout-stitching, 3.2.1, D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target]

cp: cannot stat '/d/hadoop/hadoop-3.2.1-src/hadoop-common-project/hadoop-kms/target/hadoop-kms-3.2.1/*': No such file or directory
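A sketch of that workaround for the hadoop-kms failure above (the directory name comes from the cp error; exactly which jar artifacts need copying depends on what the module produced):

mkdir D:\hadoop\hadoop-3.2.1-src\hadoop-common-project\hadoop-kms\target\hadoop-kms-3.2.1
copy D:\hadoop\hadoop-3.2.1-src\hadoop-common-project\hadoop-kms\target\*.jar D:\hadoop\hadoop-3.2.1-src\hadoop-common-project\hadoop-kms\target\hadoop-kms-3.2.1\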

2.9 hadoop-dist exec (toolshooks) Build Error

2.9.1 Error Description

Executing command line: [bash, D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../dev-support/bin/dist-tools-hooks-maker, 3.2.1, D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target, D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../hadoop-tools]

File not found - *.tools-builtin.txt

D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../dev-support/bin/dist-tools-hooks-maker: line 137: D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target/hadoop-3.2.1/etc/hadoop/hadoop-env.sh.new: No such file or directory

mv: cannot stat 'D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target/hadoop-3.2.1/etc/hadoop/hadoop-env.sh.new': No such file or directory

Rewriting D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target/hadoop-3.2.1/etc/hadoop/hadoop-env.sh

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Apache Hadoop Distribution 3.2.1:
[INFO]
[INFO] Apache Hadoop Distribution ......................... FAILURE [ 25.920 s]
[INFO] Apache Hadoop Client Modules ....................... SKIPPED
[INFO] Apache Hadoop Cloud Storage ........................ SKIPPED
[INFO] Apache Hadoop Cloud Storage Project ................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  33.530 s
[INFO] Finished at: 2019-12-08T11:56:12+08:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (toolshooks) on project hadoop-dist: Command execution failed.: Process exited with an error: 1 (Exit value: 1) -> [Help 1]

org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (toolshooks) on project hadoop-dist: Command execution failed.

2.9.2 Solution

(1) The cause is similar to 2.8: folders and files are missing. Create the folders named in the error output above and put the missing files into them.

(2) Accordingly, the etc/hadoop folder was added and hadoop-env.sh was copied into it from D:\hadoop\hadoop-3.2.1-src\hadoop-common-project\hadoop-common\src\main\conf\, but each build run kept deleting the folder and file automatically. The deletion happens while the following command executes:

[DEBUG] Executing command line: [bash, D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../dev-support/bin/dist-layout-stitching, 3.2.1, D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target]

Current directory /d/hadoop/hadoop-3.2.1-src/hadoop-dist/target
$ rm -rf hadoop-3.2.1
$ mkdir hadoop-3.2.1
$ cd hadoop-3.2.1

(3) Open the file D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../dev-support/bin/dist-layout-stitching: it contains the statement run rm -rf "hadoop-${VERSION}", which deletes the hadoop-3.2.1 folder, recreates it, and then copies files in; so no matter how etc/hadoop is added by hand, it gets deleted.

(4) Append the following lines at the end of the script, so that the script itself creates the folders and copies the hadoop-env.sh file. (dist-layout-stitching runs under bash; the original post uses the Windows copy command here, and if your shell lacks it, run cp "${ROOT}/hadoop-common-project/hadoop-common/src/main/conf/hadoop-env.sh" . is the equivalent.)

run mkdir "etc"
run cd "etc"
run mkdir "hadoop"
run cd "hadoop"
run copy "${ROOT}\hadoop-common-project\hadoop-common\src\main\conf\hadoop-env.sh"

(5) Run: mvn package -Pdist,native-win -DskipTests -Dtar -e -X -rf :hadoop-dist

3 Build Success

[DEBUG]   (f) siteDirectory = D:\hadoop\hadoop-3.2.1-src\hadoop-cloud-storage-project\src\site
[DEBUG]   (f) skip = false
[DEBUG] -- end configuration --
[INFO] No site descriptor found: nothing to attach.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Apache Hadoop Distribution 3.2.1:
[INFO]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 39.961 s]
[INFO] Apache Hadoop Client Modules ....................... SUCCESS [  5.721 s]
[INFO] Apache Hadoop Cloud Storage ........................ SUCCESS [  10:36 h]
[INFO] Apache Hadoop Cloud Storage Project ................ SUCCESS [  1.471 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  10:37 h
[INFO] Finished at: 2019-12-09T10:05:07+08:00
[INFO] ------------------------------------------------------------------------


