1. Using Sqoop to import MySQL data into Hive produced the error below.

The first attempt used the following command:

[hadoop@slaver1 sqoop-1.4.5-cdh5.3.6]$ bin/sqoop import --connect jdbc:mysql://localhost:3306/test --username root --password 123456 --table tb_user --hive-import --m 1
Warning: /home/hadoop/soft/sqoop-1.4.5-cdh5.3.6/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/hadoop/soft/sqoop-1.4.5-cdh5.3.6/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/05/18 19:57:51 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.3.6
18/05/18 19:57:51 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/05/18 19:57:51 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
18/05/18 19:57:51 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
18/05/18 19:57:51 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/05/18 19:57:51 INFO tool.CodeGenTool: Beginning code generation
18/05/18 19:57:51 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_user` AS t LIMIT 1
18/05/18 19:57:51 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_user` AS t LIMIT 1
18/05/18 19:57:51 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/soft/hadoop-2.5.0-cdh5.3.6
Note: /tmp/sqoop-hadoop/compile/cb8f61449b0ae521eecbd2bccba40b07/tb_user.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
18/05/18 19:57:54 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/cb8f61449b0ae521eecbd2bccba40b07/tb_user.jar
18/05/18 19:57:54 WARN manager.MySQLManager: It looks like you are importing from mysql.
18/05/18 19:57:54 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
18/05/18 19:57:54 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
18/05/18 19:57:54 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
18/05/18 19:57:54 INFO mapreduce.ImportJobBase: Beginning import of tb_user
18/05/18 19:57:55 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/05/18 19:57:55 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
18/05/18 19:57:56 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
18/05/18 19:57:56 INFO client.RMProxy: Connecting to ResourceManager at slaver1/192.168.19.131:8032
18/05/18 19:57:59 INFO db.DBInputFormat: Using read commited transaction isolation
18/05/18 19:58:00 INFO mapreduce.JobSubmitter: number of splits:1
18/05/18 19:58:00 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1526642793183_0002
18/05/18 19:58:01 INFO impl.YarnClientImpl: Submitted application application_1526642793183_0002
18/05/18 19:58:01 INFO mapreduce.Job: The url to track the job: http://slaver1:8088/proxy/application_1526642793183_0002/
18/05/18 19:58:01 INFO mapreduce.Job: Running job: job_1526642793183_0002
18/05/18 19:58:14 INFO mapreduce.Job: Job job_1526642793183_0002 running in uber mode : false
18/05/18 19:58:14 INFO mapreduce.Job: map 0% reduce 0%
18/05/18 19:58:30 INFO mapreduce.Job: Task Id : attempt_1526642793183_0002_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.RuntimeException: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
    at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:167)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:746)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.lang.RuntimeException: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
    at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:220)
    at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:165)
    ... 9 more
Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:408)
    at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:1137)
    at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:356)
    at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2504)
    at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2541)
    at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2323)
    at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:832)
    at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:46)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:408)
    at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:417)
    at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:344)
    at java.sql.DriverManager.getConnection(DriverManager.java:571)
    at java.sql.DriverManager.getConnection(DriverManager.java:215)
    at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:302)
    at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:213)
    ... 10 more
Caused by: java.net.ConnectException: Connection refused
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    at java.net.Socket.connect(Socket.java:579)
    at java.net.Socket.connect(Socket.java:528)
    at java.net.Socket.<init>(Socket.java:425)
    at java.net.Socket.<init>(Socket.java:241)
    at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:258)
    at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:306)
    ... 26 more

18/05/18 19:58:37 INFO mapreduce.Job: Task Id : attempt_1526642793183_0002_m_000000_1, Status : FAILED
Error: java.lang.RuntimeException: java.lang.RuntimeException: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
(stack trace identical to the first failed attempt)

18/05/18 19:58:44 INFO mapreduce.Job: Task Id : attempt_1526642793183_0002_m_000000_2, Status : FAILED
Error: java.lang.RuntimeException: java.lang.RuntimeException: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
(stack trace identical to the first failed attempt)

^C[hadoop@slaver1 sqoop-1.4.5-cdh5.3.6]$
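The decisive line is the `java.net.ConnectException: Connection refused` at the bottom of the exception chain. The `--connect` URL is shipped inside the MapReduce job configuration, so every map task resolves the hostname on its own worker node; with `localhost`, each task dials the worker itself, where no MySQL server is listening. A small illustration of which name the workers will resolve (the `jdbc_host` helper is my own sketch, not part of Sqoop):

```shell
# Extract the hostname from a --connect string; this is the name every
# worker node tries to resolve when the map task opens its JDBC connection.
jdbc_host() { echo "$1" | sed -E 's#jdbc:mysql://([^:/]+).*#\1#'; }

jdbc_host "jdbc:mysql://localhost:3306/test"   # localhost: each worker dials itself
jdbc_host "jdbc:mysql://slaver1:3306/test"     # slaver1: resolves to the MySQL host everywhere
```

Before rerunning the job, `mysql -h slaver1 -P 3306 -u root -p` should succeed from a worker node, and the MySQL account must allow connections from hosts other than localhost.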

2. The second import used the command below.

The key is to get the connection hostname right (the real host name slaver1 rather than localhost) and to supply the correct username and password.

[hadoop@slaver1 sqoop-1.4.5-cdh5.3.6]$ bin/sqoop import \
> --connect jdbc:mysql://slaver1:3306/test \
> --username root \
> --password 123456 \
> --table tb_user \
> --hive-import \
> --m 1
Warning: /home/hadoop/soft/sqoop-1.4.5-cdh5.3.6/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/hadoop/soft/sqoop-1.4.5-cdh5.3.6/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/05/18 20:01:43 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.3.6
18/05/18 20:01:43 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/05/18 20:01:43 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
18/05/18 20:01:43 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
18/05/18 20:01:43 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/05/18 20:01:43 INFO tool.CodeGenTool: Beginning code generation
18/05/18 20:01:44 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_user` AS t LIMIT 1
18/05/18 20:01:44 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_user` AS t LIMIT 1
18/05/18 20:01:44 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/soft/hadoop-2.5.0-cdh5.3.6
Note: /tmp/sqoop-hadoop/compile/02d8e06c0a500fbe72ac09d7f0dca9c3/tb_user.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
18/05/18 20:01:47 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/02d8e06c0a500fbe72ac09d7f0dca9c3/tb_user.jar
18/05/18 20:01:47 WARN manager.MySQLManager: It looks like you are importing from mysql.
18/05/18 20:01:47 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
18/05/18 20:01:47 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
18/05/18 20:01:47 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
18/05/18 20:01:47 INFO mapreduce.ImportJobBase: Beginning import of tb_user
18/05/18 20:01:47 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/05/18 20:01:48 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
18/05/18 20:01:49 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
18/05/18 20:01:49 INFO client.RMProxy: Connecting to ResourceManager at slaver1/192.168.19.131:8032
18/05/18 20:01:52 INFO db.DBInputFormat: Using read commited transaction isolation
18/05/18 20:01:52 INFO mapreduce.JobSubmitter: number of splits:1
18/05/18 20:01:53 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1526642793183_0003
18/05/18 20:01:53 INFO impl.YarnClientImpl: Submitted application application_1526642793183_0003
18/05/18 20:01:54 INFO mapreduce.Job: The url to track the job: http://slaver1:8088/proxy/application_1526642793183_0003/
18/05/18 20:01:54 INFO mapreduce.Job: Running job: job_1526642793183_0003
18/05/18 20:02:05 INFO mapreduce.Job: Job job_1526642793183_0003 running in uber mode : false
18/05/18 20:02:05 INFO mapreduce.Job: map 0% reduce 0%
18/05/18 20:02:16 INFO mapreduce.Job: map 100% reduce 0%
18/05/18 20:02:16 INFO mapreduce.Job: Job job_1526642793183_0003 completed successfully
18/05/18 20:02:16 INFO mapreduce.Job: Counters: 30
    File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=132972
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=87
        HDFS: Number of bytes written=153
        HDFS: Number of read operations=4
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=2
    Job Counters
        Launched map tasks=1
        Other local map tasks=1
        Total time spent by all maps in occupied slots (ms)=7821
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=7821
        Total vcore-seconds taken by all map tasks=7821
        Total megabyte-seconds taken by all map tasks=8008704
    Map-Reduce Framework
        Map input records=10
        Map output records=10
        Input split bytes=87
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=88
        CPU time spent (ms)=1230
        Physical memory (bytes) snapshot=100917248
        Virtual memory (bytes) snapshot=841768960
        Total committed heap usage (bytes)=15794176
    File Input Format Counters
        Bytes Read=0
    File Output Format Counters
        Bytes Written=153
18/05/18 20:02:16 INFO mapreduce.ImportJobBase: Transferred 153 bytes in 27.1712 seconds (5.631 bytes/sec)
18/05/18 20:02:16 INFO mapreduce.ImportJobBase: Retrieved 10 records.
18/05/18 20:02:16 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_user` AS t LIMIT 1
18/05/18 20:02:16 INFO hive.HiveImport: Loading uploaded data into Hive
18/05/18 20:02:23 INFO hive.HiveImport:
18/05/18 20:02:23 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/home/hadoop/soft/hive-0.13.1-cdh5.3.6/lib/hive-common-0.13.1-cdh5.3.6.jar!/hive-log4j.properties
18/05/18 20:02:36 INFO hive.HiveImport: OK
18/05/18 20:02:36 INFO hive.HiveImport: Time taken: 4.423 seconds
18/05/18 20:02:36 INFO hive.HiveImport: Loading data to table default.tb_user
18/05/18 20:02:38 INFO hive.HiveImport: Table default.tb_user stats: [numFiles=1, numRows=0, totalSize=153, rawDataSize=0]
18/05/18 20:02:38 INFO hive.HiveImport: OK
18/05/18 20:02:38 INFO hive.HiveImport: Time taken: 1.68 seconds
18/05/18 20:02:38 INFO hive.HiveImport: Hive import complete.
18/05/18 20:02:38 INFO hive.HiveImport: Export directory is not empty, keeping it.
[hadoop@slaver1 sqoop-1.4.5-cdh5.3.6]$
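The throughput line reported by `ImportJobBase` can be checked by hand: 153 bytes over 27.1712 seconds should give the logged 5.631 bytes/sec.

```shell
# Reproduce the rate in the log above: bytes transferred / elapsed seconds.
awk 'BEGIN { printf "%.3f\n", 153 / 27.1712 }'   # prints 5.631
```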

3. Because the import from MySQL lands in Hive's default database, go to the default database to verify the result.

[hadoop@slaver1 ~]$ hive

Logging initialized using configuration in jar:file:/home/hadoop/soft/hive-0.13.1-cdh5.3.6/lib/hive-common-0.13.1-cdh5.3.6.jar!/hive-log4j.properties
hive> show databases;
OK
course
default
test0516
test20180509
Time taken: 0.759 seconds, Fetched: 4 row(s)
hive> use default;
OK
Time taken: 0.025 seconds
hive> show tables;
OK
tb_user
user
Time taken: 0.035 seconds, Fetched: 2 row(s)
hive> select * from tb_user;
OK
1	张三	15236083001
2	李四	15236083001
3	王五	15236083001
4	小明	15236083001
5	小红	15236083001
6	小别	15236083001
7	7	7
8	8	8
9	9	9
10	10	10
Time taken: 1.14 seconds, Fetched: 10 row(s)
hive>
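The table ended up in default only because no target database was named. If a specific Hive database is wanted, Sqoop 1.4.x accepts --hive-database (or a qualified name via --hive-table). The variant below is a hedged sketch, not a command from this session: mydb is a hypothetical database that must already exist, and -P prompts for the password instead of exposing it on the command line.

```shell
# Hypothetical variant: import into mydb.tb_user instead of default.tb_user.
bin/sqoop import \
  --connect jdbc:mysql://slaver1:3306/test \
  --username root -P \
  --table tb_user \
  --hive-import \
  --hive-database mydb \
  --hive-table tb_user \
  --m 1
```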
