1. The following error occurred when using Sqoop to import data from MySQL into Hive.

The first attempt used this command:

    [hadoop@slaver1 sqoop-1.4.5-cdh5.3.6]$ bin/sqoop import --connect jdbc:mysql://localhost:3306/test --username root --password 123456 --table tb_user --hive-import --m 1
    Warning: /home/hadoop/soft/sqoop-1.4.5-cdh5.3.6/../hcatalog does not exist! HCatalog jobs will fail.
    Please set $HCAT_HOME to the root of your HCatalog installation.
    Warning: /home/hadoop/soft/sqoop-1.4.5-cdh5.3.6/../accumulo does not exist! Accumulo imports will fail.
    Please set $ACCUMULO_HOME to the root of your Accumulo installation.
    18/05/18 19:57:51 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.3.6
    18/05/18 19:57:51 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
    18/05/18 19:57:51 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
    18/05/18 19:57:51 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
    18/05/18 19:57:51 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
    18/05/18 19:57:51 INFO tool.CodeGenTool: Beginning code generation
    18/05/18 19:57:51 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_user` AS t LIMIT 1
    18/05/18 19:57:51 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_user` AS t LIMIT 1
    18/05/18 19:57:51 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/soft/hadoop-2.5.0-cdh5.3.6
    Note: /tmp/sqoop-hadoop/compile/cb8f61449b0ae521eecbd2bccba40b07/tb_user.java uses or overrides a deprecated API.
    Note: Recompile with -Xlint:deprecation for details.
    18/05/18 19:57:54 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/cb8f61449b0ae521eecbd2bccba40b07/tb_user.jar
    18/05/18 19:57:54 WARN manager.MySQLManager: It looks like you are importing from mysql.
    18/05/18 19:57:54 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
    18/05/18 19:57:54 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
    18/05/18 19:57:54 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
    18/05/18 19:57:54 INFO mapreduce.ImportJobBase: Beginning import of tb_user
    18/05/18 19:57:55 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    18/05/18 19:57:55 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
    18/05/18 19:57:56 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
    18/05/18 19:57:56 INFO client.RMProxy: Connecting to ResourceManager at slaver1/192.168.19.131:8032
    18/05/18 19:57:59 INFO db.DBInputFormat: Using read commited transaction isolation
    18/05/18 19:58:00 INFO mapreduce.JobSubmitter: number of splits:1
    18/05/18 19:58:00 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1526642793183_0002
    18/05/18 19:58:01 INFO impl.YarnClientImpl: Submitted application application_1526642793183_0002
    18/05/18 19:58:01 INFO mapreduce.Job: The url to track the job: http://slaver1:8088/proxy/application_1526642793183_0002/
    18/05/18 19:58:01 INFO mapreduce.Job: Running job: job_1526642793183_0002
    18/05/18 19:58:14 INFO mapreduce.Job: Job job_1526642793183_0002 running in uber mode : false
    18/05/18 19:58:14 INFO mapreduce.Job: map 0% reduce 0%
    18/05/18 19:58:30 INFO mapreduce.Job: Task Id : attempt_1526642793183_0002_m_000000_0, Status : FAILED
    Error: java.lang.RuntimeException: java.lang.RuntimeException: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

    The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
        at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:167)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:746)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
    Caused by: java.lang.RuntimeException: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

    The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
        at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:220)
        at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:165)
        ... 9 more
    Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

    The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at com.mysql.jdbc.Util.handleNewInstance(Util.java:408)
        at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:1137)
        at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:356)
        at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2504)
        at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2541)
        at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2323)
        at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:832)
        at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:46)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at com.mysql.jdbc.Util.handleNewInstance(Util.java:408)
        at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:417)
        at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:344)
        at java.sql.DriverManager.getConnection(DriverManager.java:571)
        at java.sql.DriverManager.getConnection(DriverManager.java:215)
        at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:302)
        at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:213)
        ... 10 more
    Caused by: java.net.ConnectException: Connection refused
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339)
        at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200)
        at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
        at java.net.Socket.connect(Socket.java:579)
        at java.net.Socket.connect(Socket.java:528)
        at java.net.Socket.<init>(Socket.java:425)
        at java.net.Socket.<init>(Socket.java:241)
        at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:258)
        at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:306)
        ... 26 more

    18/05/18 19:58:37 INFO mapreduce.Job: Task Id : attempt_1526642793183_0002_m_000000_1, Status : FAILED
    Error: java.lang.RuntimeException: java.lang.RuntimeException: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
    (stack trace identical to attempt _0, trimmed here)

    18/05/18 19:58:44 INFO mapreduce.Job: Task Id : attempt_1526642793183_0002_m_000000_2, Status : FAILED
    Error: java.lang.RuntimeException: java.lang.RuntimeException: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
    (stack trace identical to attempt _0, trimmed here)

    ^C[hadoop@slaver1 sqoop-1.4.5-cdh5.3.6]$
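Reading the trace bottom-up, the real failure is `java.net.ConnectException: Connection refused`: the map task runs on a YARN worker node, so `localhost` in the JDBC URL resolves to that node rather than to the machine where MySQL is actually listening. As a quick diagnostic (a sketch of my own, not part of the original post; `port_open` is a made-up helper name, and `slaver1`/3306 come from the log), TCP reachability can be checked from the worker node before rerunning the job:

```shell
# Exit 0 if HOST:PORT accepts a TCP connection (uses bash's /dev/tcp device).
port_open() {
    timeout 2 bash -c "exec 3<>/dev/tcp/$1/$2" 2>/dev/null
}

# Run on the worker node; hostname and port are taken from the log above:
# port_open slaver1 3306 && echo reachable || echo unreachable
```

If the MySQL port is unreachable from the worker nodes, the same `Connection refused` will recur regardless of Sqoop options; the MySQL `bind-address` setting and remote-access grants are worth checking as well.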

2. The second import succeeded with the command below. The fix was the connection hostname: the JDBC URL must name the machine actually running MySQL (here `slaver1`) rather than `localhost`, and the username and password must be correct.
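Sqoop's log also warns that passing `--password` on the command line is insecure ("Consider using -P instead"). Besides interactive `-P`, Sqoop 1.4.x accepts `--password-file`. The sketch below (mine, not from the original post; the path `/tmp/sqoop.pwd` is illustrative) only prepares such a file and shows, commented out, how it would be referenced:

```shell
# Keep the password out of shell history and `ps` output.
rm -f /tmp/sqoop.pwd
printf '%s' '123456' > /tmp/sqoop.pwd   # printf avoids a trailing newline
chmod 400 /tmp/sqoop.pwd                # owner read-only

# Then (not executed here; check the flag against your Sqoop version's docs):
# bin/sqoop import --connect jdbc:mysql://slaver1:3306/test \
#     --username root --password-file file:///tmp/sqoop.pwd \
#     --table tb_user --hive-import --m 1
```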

    [hadoop@slaver1 sqoop-1.4.5-cdh5.3.6]$ bin/sqoop import \
    > --connect jdbc:mysql://slaver1:3306/test \
    > --username root \
    > --password 123456 \
    > --table tb_user \
    > --hive-import \
    > --m 1
    Warning: /home/hadoop/soft/sqoop-1.4.5-cdh5.3.6/../hcatalog does not exist! HCatalog jobs will fail.
    Please set $HCAT_HOME to the root of your HCatalog installation.
    Warning: /home/hadoop/soft/sqoop-1.4.5-cdh5.3.6/../accumulo does not exist! Accumulo imports will fail.
    Please set $ACCUMULO_HOME to the root of your Accumulo installation.
    18/05/18 20:01:43 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.3.6
    18/05/18 20:01:43 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
    18/05/18 20:01:43 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
    18/05/18 20:01:43 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
    18/05/18 20:01:43 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
    18/05/18 20:01:43 INFO tool.CodeGenTool: Beginning code generation
    18/05/18 20:01:44 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_user` AS t LIMIT 1
    18/05/18 20:01:44 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_user` AS t LIMIT 1
    18/05/18 20:01:44 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/soft/hadoop-2.5.0-cdh5.3.6
    Note: /tmp/sqoop-hadoop/compile/02d8e06c0a500fbe72ac09d7f0dca9c3/tb_user.java uses or overrides a deprecated API.
    Note: Recompile with -Xlint:deprecation for details.
    18/05/18 20:01:47 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/02d8e06c0a500fbe72ac09d7f0dca9c3/tb_user.jar
    18/05/18 20:01:47 WARN manager.MySQLManager: It looks like you are importing from mysql.
    18/05/18 20:01:47 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
    18/05/18 20:01:47 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
    18/05/18 20:01:47 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
    18/05/18 20:01:47 INFO mapreduce.ImportJobBase: Beginning import of tb_user
    18/05/18 20:01:47 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    18/05/18 20:01:48 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
    18/05/18 20:01:49 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
    18/05/18 20:01:49 INFO client.RMProxy: Connecting to ResourceManager at slaver1/192.168.19.131:8032
    18/05/18 20:01:52 INFO db.DBInputFormat: Using read commited transaction isolation
    18/05/18 20:01:52 INFO mapreduce.JobSubmitter: number of splits:1
    18/05/18 20:01:53 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1526642793183_0003
    18/05/18 20:01:53 INFO impl.YarnClientImpl: Submitted application application_1526642793183_0003
    18/05/18 20:01:54 INFO mapreduce.Job: The url to track the job: http://slaver1:8088/proxy/application_1526642793183_0003/
    18/05/18 20:01:54 INFO mapreduce.Job: Running job: job_1526642793183_0003
    18/05/18 20:02:05 INFO mapreduce.Job: Job job_1526642793183_0003 running in uber mode : false
    18/05/18 20:02:05 INFO mapreduce.Job: map 0% reduce 0%
    18/05/18 20:02:16 INFO mapreduce.Job: map 100% reduce 0%
    18/05/18 20:02:16 INFO mapreduce.Job: Job job_1526642793183_0003 completed successfully
    18/05/18 20:02:16 INFO mapreduce.Job: Counters: 30
        File System Counters
            FILE: Number of bytes read=0
            FILE: Number of bytes written=132972
            FILE: Number of read operations=0
            FILE: Number of large read operations=0
            FILE: Number of write operations=0
            HDFS: Number of bytes read=87
            HDFS: Number of bytes written=153
            HDFS: Number of read operations=4
            HDFS: Number of large read operations=0
            HDFS: Number of write operations=2
        Job Counters
            Launched map tasks=1
            Other local map tasks=1
            Total time spent by all maps in occupied slots (ms)=7821
            Total time spent by all reduces in occupied slots (ms)=0
            Total time spent by all map tasks (ms)=7821
            Total vcore-seconds taken by all map tasks=7821
            Total megabyte-seconds taken by all map tasks=8008704
        Map-Reduce Framework
            Map input records=10
            Map output records=10
            Input split bytes=87
            Spilled Records=0
            Failed Shuffles=0
            Merged Map outputs=0
            GC time elapsed (ms)=88
            CPU time spent (ms)=1230
            Physical memory (bytes) snapshot=100917248
            Virtual memory (bytes) snapshot=841768960
            Total committed heap usage (bytes)=15794176
        File Input Format Counters
            Bytes Read=0
        File Output Format Counters
            Bytes Written=153
    18/05/18 20:02:16 INFO mapreduce.ImportJobBase: Transferred 153 bytes in 27.1712 seconds (5.631 bytes/sec)
    18/05/18 20:02:16 INFO mapreduce.ImportJobBase: Retrieved 10 records.
    18/05/18 20:02:16 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_user` AS t LIMIT 1
    18/05/18 20:02:16 INFO hive.HiveImport: Loading uploaded data into Hive
    18/05/18 20:02:23 INFO hive.HiveImport:
    18/05/18 20:02:23 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/home/hadoop/soft/hive-0.13.1-cdh5.3.6/lib/hive-common-0.13.1-cdh5.3.6.jar!/hive-log4j.properties
    18/05/18 20:02:36 INFO hive.HiveImport: OK
    18/05/18 20:02:36 INFO hive.HiveImport: Time taken: 4.423 seconds
    18/05/18 20:02:36 INFO hive.HiveImport: Loading data to table default.tb_user
    18/05/18 20:02:38 INFO hive.HiveImport: Table default.tb_user stats: [numFiles=1, numRows=0, totalSize=153, rawDataSize=0]
    18/05/18 20:02:38 INFO hive.HiveImport: OK
    18/05/18 20:02:38 INFO hive.HiveImport: Time taken: 1.68 seconds
    18/05/18 20:02:38 INFO hive.HiveImport: Hive import complete.
    18/05/18 20:02:38 INFO hive.HiveImport: Export directory is not empty, keeping it.
    [hadoop@slaver1 sqoop-1.4.5-cdh5.3.6]$
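As a sanity check on the summary line, the rate Sqoop reports is simply bytes transferred divided by elapsed wall-clock time:

```shell
# 153 bytes in 27.1712 s, as reported by ImportJobBase in the log above.
awk 'BEGIN { printf "%.3f bytes/sec\n", 153 / 27.1712 }'
```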

3. Because this import lands in Hive's default schema, the table shows up in the default database:

    [hadoop@slaver1 ~]$ hive

    Logging initialized using configuration in jar:file:/home/hadoop/soft/hive-0.13.1-cdh5.3.6/lib/hive-common-0.13.1-cdh5.3.6.jar!/hive-log4j.properties
    hive> show databases;
    OK
    course
    default
    test0516
    test20180509
    Time taken: 0.759 seconds, Fetched: 4 row(s)
    hive> use default;
    OK
    Time taken: 0.025 seconds
    hive> show tables;
    OK
    tb_user
    user
    Time taken: 0.035 seconds, Fetched: 2 row(s)
    hive> select * from tb_user;
    OK
    1    张三    15236083001
    2    李四    15236083001
    3    王五    15236083001
    4    小明    15236083001
    5    小红    15236083001
    6    小别    15236083001
    7    7    7
    8    8    8
    9    9    9
    10    10    10
    Time taken: 1.14 seconds, Fetched: 10 row(s)
    hive>
