Setting up an apache-storm-0.9.6.tar.gz cluster (3 nodes) (illustrated guide)
Without further ado, straight to the point!
Choosing a Storm version
I am using apache-storm-0.9.6.tar.gz here.
Installing Storm in local mode
Local mode simulates all the functionality of a Storm cluster inside a single process, which is very convenient for development and testing. Running a topology in local mode is similar to running it on a real cluster.
To create an in-process "cluster", simply use a LocalCluster object:
import backtype.storm.LocalCluster;
LocalCluster cluster = new LocalCluster();
You can then submit topologies through the LocalCluster object's submitTopology method, which behaves exactly like the corresponding method on StormSubmitter. submitTopology takes three arguments: the name of the topology, the topology configuration, and the topology object itself. You can kill a topology with killTopology, which takes the topology name as its only argument.
To shut down a local cluster, simply call:
cluster.shutdown();
and you are done.
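For completeness, here is a minimal runnable local-mode sketch. It assumes storm-core 0.9.6 on the classpath; the class names and the spout/bolt wiring (built around the built-in TestWordSpout) are my own illustration, not something from the original post:

import backtype.storm.Config;
import backtype.storm.LocalCluster;
import backtype.storm.testing.TestWordSpout;
import backtype.storm.topology.BasicOutputCollector;
import backtype.storm.topology.OutputFieldsDeclarer;
import backtype.storm.topology.TopologyBuilder;
import backtype.storm.topology.base.BaseBasicBolt;
import backtype.storm.tuple.Fields;
import backtype.storm.tuple.Tuple;
import backtype.storm.tuple.Values;
import backtype.storm.utils.Utils;

public class LocalModeDemo {

    // A trivial bolt that appends "!" to every incoming word.
    public static class ExclaimBolt extends BaseBasicBolt {
        @Override
        public void execute(Tuple tuple, BasicOutputCollector collector) {
            collector.emit(new Values(tuple.getString(0) + "!"));
        }
        @Override
        public void declareOutputFields(OutputFieldsDeclarer declarer) {
            declarer.declare(new Fields("word"));
        }
    }

    public static void main(String[] args) {
        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("words", new TestWordSpout(), 1);                 // built-in test spout
        builder.setBolt("exclaim", new ExclaimBolt(), 2).shuffleGrouping("words");

        Config conf = new Config();
        conf.setDebug(true);                                               // log every emitted tuple

        LocalCluster cluster = new LocalCluster();                         // the in-process "cluster"
        cluster.submitTopology("demo-topology", conf, builder.createTopology());

        Utils.sleep(10000);                                                // let it run for 10 seconds
        cluster.killTopology("demo-topology");
        cluster.shutdown();
    }
}

Running the main method starts the in-process cluster, runs the topology for ten seconds, then kills it and shuts the cluster down, using exactly the submitTopology/killTopology/shutdown calls described above.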
Installing Storm in distributed (cluster) mode (this post)
Official installation documentation:
http://storm.apache.org/releases/current/Setting-up-a-Storm-cluster.html
Machine layout: the Storm package goes into the /home/hadoop/app directory on each of master, slave1 and slave2.
1. Download apache-storm-0.9.6.tar.gz
http://archive.apache.org/dist/storm/apache-storm-0.9.6/
Alternatively, download it online directly into the installation directory:
wget http://apache.fayea.com/storm/apache-storm-0.9.6/apache-storm-0.9.6.tar.gz
Here I chose to download the package first and then upload it to the machines.
2. Upload the package
[hadoop@master ~]$ cd app/
[hadoop@master app]$ ll
drwxrwxr-x  5 hadoop hadoop 4096 May  1 15:21 azkaban
drwxrwxr-x  7 hadoop hadoop 4096 Apr 21 15:43 elasticsearch-2.4.0
drwxrwxr-x  6 hadoop hadoop 4096 Apr 21 12:12 elasticsearch-2.4.3
...
lrwxrwxrwx. 1 hadoop hadoop   15 Apr 12 11:28 zookeeper -> zookeeper-3.4.6
drwxr-xr-x. 10 hadoop hadoop 4096 Apr 12 17:13 zookeeper-3.4.6
[hadoop@master app]$ rz
[hadoop@master app]$ ll
-rw-r--r--  1 hadoop hadoop      ...      apache-storm-0.9.6.tar.gz
drwxrwxr-x  5 hadoop hadoop 4096 May  1 15:21 azkaban
...
lrwxrwxrwx. 1 hadoop hadoop   15 Apr 12 11:28 zookeeper -> zookeeper-3.4.6
drwxr-xr-x. 10 hadoop hadoop 4096 Apr 12 17:13 zookeeper-3.4.6
[hadoop@master app]$
The same applies to slave1 and slave2; I won't repeat it here.
3. Extract the package and make sure the hadoop user and group own it
[hadoop@master app]$ tar -zxvf apache-storm-0.9.6.tar.gz
The same applies to slave1 and slave2; I won't repeat it here.
4. Delete the tarball and create a symlink, so that multiple versions can coexist cleanly
See also: creating and removing symlinks when setting up the various big-data components (recommended)
[hadoop@master app]$ ll
drwxrwxr-x  9 hadoop hadoop 4096 May 21 13:15 apache-storm-0.9.6
-rw-r--r--  1 hadoop hadoop      ...      apache-storm-0.9.6.tar.gz
drwxrwxr-x  5 hadoop hadoop 4096 May  1 15:21 azkaban
...
lrwxrwxrwx. 1 hadoop hadoop   15 Apr 12 11:28 zookeeper -> zookeeper-3.4.6
drwxr-xr-x. 10 hadoop hadoop 4096 Apr 12 17:13 zookeeper-3.4.6
[hadoop@master app]$ rm apache-storm-0.9.6.tar.gz
[hadoop@master app]$ ln -s apache-storm-0.9.6/ storm
[hadoop@master app]$ ll
total 64
drwxrwxr-x 9 hadoop hadoop 4096 May 21 13:15 apache-storm-0.9.6
drwxrwxr-x 5 hadoop hadoop 4096 May 1 15:21 azkaban
drwxrwxr-x 7 hadoop hadoop 4096 Apr 21 15:43 elasticsearch-2.4.0
drwxrwxr-x 6 hadoop hadoop 4096 Apr 21 12:12 elasticsearch-2.4.3
lrwxrwxrwx 1 hadoop hadoop 20 Apr 21 15:00 es -> elasticsearch-2.4.0/
lrwxrwxrwx 1 hadoop hadoop 11 Apr 20 12:19 flume -> flume-1.6.0
drwxrwxr-x 7 hadoop hadoop 4096 Apr 20 12:17 flume-1.6.0
drwxrwxr-x 7 hadoop hadoop 4096 Apr 20 12:00 flume-1.7.0
lrwxrwxrwx. 1 hadoop hadoop 12 Apr 12 11:27 hadoop -> hadoop-2.6.0
drwxr-xr-x. 10 hadoop hadoop 4096 Apr 12 16:33 hadoop-2.6.0
lrwxrwxrwx. 1 hadoop hadoop 13 Apr 12 11:28 hbase -> hbase-0.98.19
drwxrwxr-x. 8 hadoop hadoop 4096 Apr 12 17:27 hbase-0.98.19
lrwxrwxrwx. 1 hadoop hadoop 10 Apr 12 11:28 hive -> hive-1.0.0
drwxrwxr-x. 8 hadoop hadoop 4096 May 14 14:08 hive-1.0.0
lrwxrwxrwx. 1 hadoop hadoop 11 Apr 12 10:18 jdk -> jdk1.7.0_79
drwxr-xr-x. 8 hadoop hadoop 4096 Apr 11 2015 jdk1.7.0_79
drwxr-xr-x. 8 hadoop hadoop 4096 Aug 5 2015 jdk1.8.0_60
lrwxrwxrwx 1 hadoop hadoop 18 May 3 21:41 kafka -> kafka_2.11-0.8.2.2
drwxr-xr-x 6 hadoop hadoop 4096 May 3 22:01 kafka_2.11-0.8.2.2
lrwxrwxrwx 1 hadoop hadoop 26 Apr 21 22:18 kibana -> kibana-4.6.3-linux-x86_64/
drwxrwxr-x 11 hadoop hadoop 4096 Nov 4 2016 kibana-4.6.3-linux-x86_64
lrwxrwxrwx 1 hadoop hadoop 12 May 1 19:35 snappy -> snappy-1.1.3
drwxr-xr-x 6 hadoop hadoop 4096 May 1 19:40 snappy-1.1.3
lrwxrwxrwx. 1 hadoop hadoop 11 Apr 12 11:28 sqoop -> sqoop-1.4.6
drwxr-xr-x. 9 hadoop hadoop 4096 May 19 10:31 sqoop-1.4.6
lrwxrwxrwx 1 hadoop hadoop 19 May 21 13:17 storm -> apache-storm-0.9.6/
lrwxrwxrwx. 1 hadoop hadoop 15 Apr 12 11:28 zookeeper -> zookeeper-3.4.6
drwxr-xr-x. 10 hadoop hadoop 4096 Apr 12 17:13 zookeeper-3.4.6
[hadoop@master app]$
The same applies to slave1 and slave2; I won't repeat it here.
5. Set up the environment variables
[hadoop@master app]$ su root
Password:
[root@master app]# vim /etc/profile
The same applies to slave1 and slave2; I won't repeat it here.
#storm
export STORM_HOME=/home/hadoop/app/storm
export PATH=$PATH:$STORM_HOME/bin
The same applies to slave1 and slave2; I won't repeat it here.
[hadoop@master app]$ su root
Password:
[root@master app]# vim /etc/profile
[root@master app]# source /etc/profile
[root@master app]#
The same applies to slave1 and slave2; I won't repeat it here.
6. Other dependencies required by the Storm cluster
Storm's nodes also need Python. Since my machines run CentOS 6.5, it is already included:
[hadoop@master ~]$ python
Python 2.6.6 (r266:84292, ...)
[GCC 4.4.x ...] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>>
7. Configure Storm's configuration file
[hadoop@master storm]$ pwd
/home/hadoop/app/storm
[hadoop@master storm]$ ll
total
drwxrwxr-x hadoop hadoop May : bin
-rw-r--r-- hadoop hadoop Oct CHANGELOG.md
drwxrwxr-x hadoop hadoop May : conf
-rw-r--r-- hadoop hadoop Oct DISCLAIMER
drwxr-xr-x hadoop hadoop Oct examples
drwxrwxr-x hadoop hadoop May : external
drwxrwxr-x hadoop hadoop May : lib
-rw-r--r-- hadoop hadoop Oct LICENSE
drwxrwxr-x hadoop hadoop May : logback
-rw-r--r-- hadoop hadoop Oct NOTICE
drwxrwxr-x hadoop hadoop May : public
-rw-r--r-- hadoop hadoop Oct README.markdown
-rw-r--r-- hadoop hadoop Oct RELEASE
-rw-r--r-- hadoop hadoop Oct SECURITY.md
[hadoop@master storm]$
The same applies to slave1 and slave2; I won't repeat it here.
Go into Storm's conf directory and edit storm.yaml:
[hadoop@master conf]$ pwd
/home/hadoop/app/storm/conf
[hadoop@master conf]$ ll
total
-rw-r--r-- hadoop hadoop Oct storm_env.ini
-rw-r--r-- hadoop hadoop Oct storm.yaml
[hadoop@master conf]$ vim storm.yaml
The same applies to slave1 and slave2; I won't repeat it here.
Here is a very useful tip:
Configuration-file tips for setting up the various big-data components (for CentOS and Ubuntu) (recommended)
Note that in storm.yaml every entry must begin with a space in the first column, otherwise the file will not be parsed correctly.
 storm.zookeeper.servers:
     - "master"
     - "slave1"
     - "slave2"

 nimbus.host: "master"

 ui.port: 9999

 storm.local.dir: "/home/hadoop/data/storm"

 supervisor.slots.ports:
     - 6700
     - 6701
Note: I set ui.port to 9999 (a custom value) to avoid the clash between Storm's default UI port and Spark's default port 8080.
For supervisor.slots.ports I configure two worker ports (Storm's default ports 6700 and 6701 are shown here as an example), since my cluster has only the two worker nodes slave1 and slave2. Strictly speaking, this setting defines how many worker slots each supervisor node offers; with two ports, each of slave1 and slave2 can run up to two worker processes.
Configure slave1 and slave2 the same way; I won't repeat it here.
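To double-check how Storm actually parses these settings, you can read the merged configuration back through the client API. This is a small sketch of mine, assuming storm-core 0.9.6 is on the classpath and the conf directory containing this storm.yaml is visible to the JVM (for example when launched through the storm script); the class name is illustrative:

import java.util.Map;

import backtype.storm.Config;
import backtype.storm.utils.Utils;

public class ShowStormConfig {
    public static void main(String[] args) {
        // Merges defaults.yaml with the storm.yaml found on the classpath.
        Map conf = Utils.readStormConfig();

        System.out.println("zookeeper servers : " + conf.get(Config.STORM_ZOOKEEPER_SERVERS)); // expect [master, slave1, slave2]
        System.out.println("nimbus host       : " + conf.get(Config.NIMBUS_HOST));             // expect master
        System.out.println("ui port           : " + conf.get(Config.UI_PORT));                 // expect 9999
        System.out.println("storm local dir   : " + conf.get(Config.STORM_LOCAL_DIR));         // expect /home/hadoop/data/storm
        System.out.println("worker slot ports : " + conf.get(Config.SUPERVISOR_SLOTS_PORTS));
    }
}

If any of the printed values do not match what you wrote above, the YAML indentation is usually the culprit.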
8. Create the directory for Storm's local data
[hadoop@master conf]$ mkdir -p /home/hadoop/data/storm
The same applies to slave1 and slave2; I won't repeat it here.
9. Start the Storm cluster
1. First, start Nimbus on master:
storm nimbus &
jps should now show a nimbus process.
2. Then, also on master, start the UI:
storm ui &
jps should now show a core process (the Storm UI runs as backtype.storm.ui.core).
3. Finally, start the supervisors on slave1 and slave2:
storm supervisor &
jps should now show a supervisor process.
Alternatively, run the daemons in the background (recommended):
- Start nimbus in the background: bin/storm nimbus < /dev/null 2>&1 &
- Start supervisor in the background: bin/storm supervisor < /dev/null 2>&1 &
- Start ui in the background: bin/storm ui < /dev/null 2>&1 &
a) Start the storm nimbus process on the nimbus machine (master in my case)
[hadoop@master storm]$ jps
QuorumPeerMain
Jps
AzkabanWebServer
ResourceManager
AzkabanExecutorServer
NameNode
SecondaryNameNode
[hadoop@master storm]$ storm nimbus &
[]
[hadoop@master storm]$ jps
QuorumPeerMain
AzkabanWebServer
ResourceManager
AzkabanExecutorServer
config_value
NameNode
SecondaryNameNode
Jps
[hadoop@master storm]$ jps
QuorumPeerMain
AzkabanWebServer
Jps
ResourceManager
AzkabanExecutorServer
config_value
NameNode
SecondaryNameNode
[hadoop@master storm]$ Running: /home/hadoop/app/jdk/bin/java -server -Dstorm.options= -Dstorm.home=/home/hadoop/app/apache-storm-0.9.6 -Dstorm.log.dir=/home/hadoop/app/apache-storm-0.9.6/logs -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib -Dstorm.conf.file= -cp ... -Xmx1024m -Dlogfile.name=nimbus.log -Dlogback.configurationFile=/home/hadoop/app/apache-storm-0.9.6/logback/cluster.xml backtype.storm.daemon.nimbus
b) Start the storm ui process on the nimbus machine (master in my case)
[hadoop@master storm]$ storm ui &
[]
[hadoop@master storm]$ jps
QuorumPeerMain
Jps
AzkabanWebServer
ResourceManager
AzkabanExecutorServer
config_value
NameNode
SecondaryNameNode
[hadoop@master storm]$ Running: /home/hadoop/app/jdk/bin/java -server -Dstorm.options= -Dstorm.home=/home/hadoop/app/apache-storm-0.9.6 -Dstorm.log.dir=/home/hadoop/app/apache-storm-0.9.6/logs -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib -Dstorm.conf.file= -cp ... -Xmx768m -Dlogfile.name=ui.log -Dlogback.configurationFile=/home/hadoop/app/apache-storm-0.9.6/logback/cluster.xml backtype.storm.ui.core
c) Start the storm supervisor process on slave1 and slave2
storm supervisor &
[hadoop@slave1 storm]$ jps
NodeManager
DataNode
QuorumPeerMain
Jps
[hadoop@slave1 storm]$ storm supervisor &
[]
[hadoop@slave1 storm]$ Running: /home/hadoop/app/jdk/bin/java -server -Dstorm.options= -Dstorm.home=/home/hadoop/app/apache-storm-0.9.6 -Dstorm.log.dir=/home/hadoop/app/apache-storm-0.9.6/logs -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib -Dstorm.conf.file= -cp ... -Xmx256m -Dlogfile.name=supervisor.log -Dlogback.configurationFile=/home/hadoop/app/apache-storm-0.9.6/logback/cluster.xml backtype.storm.daemon.supervisor
[hadoop@slave2 storm]$ jps
NodeManager
DataNode
Jps
QuorumPeerMain
[hadoop@slave2 storm]$ storm supervisor &
[]
[hadoop@slave2 storm]$ Running: /home/hadoop/app/jdk/bin/java -server -Dstorm.options= -Dstorm.home=/home/hadoop/app/apache-storm-0.9.6 -Dstorm.log.dir=/home/hadoop/app/apache-storm-0.9.6/logs -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib -Dstorm.conf.file= -cp ... -Xmx256m -Dlogfile.name=supervisor.log -Dlogback.configurationFile=/home/hadoop/app/apache-storm-0.9.6/logback/cluster.xml backtype.storm.daemon.supervisor
10. Check the Storm UI
http://192.168.80.145:9999/index.html
Success! I'll leave exploring the rest of the UI to you; no need to go into more detail here.
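With the UI reachable, you can also submit a topology to the cluster and watch it appear there. Below is a minimal sketch under the same assumptions as the local-mode example earlier (it reuses that example's illustrative ExclaimBolt; the jar and topology names are placeholders):

import backtype.storm.Config;
import backtype.storm.StormSubmitter;
import backtype.storm.testing.TestWordSpout;
import backtype.storm.topology.TopologyBuilder;

public class ClusterSubmitDemo {
    public static void main(String[] args) throws Exception {
        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("words", new TestWordSpout(), 1);
        builder.setBolt("exclaim", new LocalModeDemo.ExclaimBolt(), 4).shuffleGrouping("words");

        Config conf = new Config();
        conf.setNumWorkers(4);   // 2 slots.ports x 2 supervisors = at most 4 worker slots

        // nimbus.host, the ZooKeeper servers, etc. come from the storm.yaml of the
        // node this is submitted from (via the storm jar command below).
        StormSubmitter.submitTopology("cluster-demo", conf, builder.createTopology());
    }
}

Package it and submit it from any node whose storm.yaml points at this cluster, for example: storm jar demo-topology.jar ClusterSubmitDemo. The topology should then show up at http://192.168.80.145:9999/index.html.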