I used to think that different versions meant different problems, but it turns out that new version or old, most of the problems you run into are the same. The solutions below are offered for reference.

1. What does a "connection refused" error look like?
2. How do you fix a "table does not exist" error?
3. How do you specify the placeholder for null fields?

Environment

Hive version: hive-0.11.0
Sqoop version: sqoop-1.4.4.bin__hadoop-1.0.0
Direction: exporting from Hive to MySQL

  mysql

  mysql> desc cps_activation;
  +------------+-------------+------+-----+---------+----------------+
  | Field      | Type        | Null | Key | Default | Extra          |
  +------------+-------------+------+-----+---------+----------------+
  | id         | int(11)     | NO   | PRI | NULL    | auto_increment |
  | day        | date        | NO   | MUL | NULL    |                |
  | pkgname    | varchar(50) | YES  |     | NULL    |                |
  | cid        | varchar(50) | YES  |     | NULL    |                |
  | pid        | varchar(50) | YES  |     | NULL    |                |
  | activation | int(11)     | YES  |     | NULL    |                |
  +------------+-------------+------+-----+---------+----------------+
  6 rows in set (0.01 sec)

  hive

  hive> desc active;
  OK
  id          int     None
  day         string  None
  pkgname     string  None
  cid         string  None
  pid         string  None
  activation  int     None
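
For orientation, every section below is debugging a variant of one sqoop export invocation, which pushes the files under a Hive warehouse directory into an existing MySQL table. Its general shape, using exactly the values that appear in the logs below, is:

  sqoop export \
    --connect jdbc:mysql://10.10.20.11/test \
    --username root --password admin \
    --table test \
    --export-dir /user/hive/warehouse/actmp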

Testing that the connection works:

  [hadoop@hs11 ~]$ sqoop list-databases --connect jdbc:mysql://localhost:3306/ --username root --password admin
  Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
  Please set $HCAT_HOME to the root of your HCatalog installation.
  13/08/20 16:42:26 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
  13/08/20 16:42:26 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
  information_schema
  easyhadoop
  mysql
  test
  [hadoop@hs11 ~]$ sqoop list-databases --connect jdbc:mysql://localhost:3306/test --username root --password admin
  Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
  Please set $HCAT_HOME to the root of your HCatalog installation.
  13/08/20 16:42:40 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
  13/08/20 16:42:40 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
  information_schema
  easyhadoop
  mysql
  test
  [hadoop@hs11 ~]$ sqoop list-tables --connect jdbc:mysql://localhost:3306/test --username root --password admin
  Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
  Please set $HCAT_HOME to the root of your HCatalog installation.
  13/08/20 16:42:54 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
  13/08/20 16:42:54 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
  active

  [hadoop@hs11 ~]$ sqoop create-hive-table --connect jdbc:mysql://localhost:3306/test --table active --username root --password admin --hive-table test
  Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
  Please set $HCAT_HOME to the root of your HCatalog installation.
  13/08/20 16:57:04 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
  13/08/20 16:57:04 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
  13/08/20 16:57:04 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
  13/08/20 16:57:04 WARN tool.BaseSqoopTool: It seems that you've specified at least one of following:
  13/08/20 16:57:04 WARN tool.BaseSqoopTool:      --hive-home
  13/08/20 16:57:04 WARN tool.BaseSqoopTool:      --hive-overwrite
  13/08/20 16:57:04 WARN tool.BaseSqoopTool:      --create-hive-table
  13/08/20 16:57:04 WARN tool.BaseSqoopTool:      --hive-table
  13/08/20 16:57:04 WARN tool.BaseSqoopTool:      --hive-partition-key
  13/08/20 16:57:04 WARN tool.BaseSqoopTool:      --hive-partition-value
  13/08/20 16:57:04 WARN tool.BaseSqoopTool:      --map-column-hive
  13/08/20 16:57:04 WARN tool.BaseSqoopTool: Without specifying parameter --hive-import. Please note that
  13/08/20 16:57:04 WARN tool.BaseSqoopTool: those arguments will not be used in this session. Either
  13/08/20 16:57:04 WARN tool.BaseSqoopTool: specify --hive-import to apply them correctly or remove them
  13/08/20 16:57:04 WARN tool.BaseSqoopTool: from command line to remove this warning.
  13/08/20 16:57:04 INFO tool.BaseSqoopTool: Please note that --hive-home, --hive-partition-key,
  13/08/20 16:57:04 INFO tool.BaseSqoopTool: --hive-partition-value and --map-column-hive options are
  13/08/20 16:57:04 INFO tool.BaseSqoopTool: are also valid for HCatalog imports and exports
  13/08/20 16:57:04 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
  13/08/20 16:57:05 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `active` AS t LIMIT 1
  13/08/20 16:57:05 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `active` AS t LIMIT 1
  13/08/20 16:57:05 WARN hive.TableDefWriter: Column day had to be cast to a less precise type in Hive
  13/08/20 16:57:05 INFO hive.HiveImport: Loading uploaded data into Hive
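
As the warnings above say, the Hive-specific options only take effect together with --hive-import. A minimal sketch of an import that actually applies them, assuming the same connection details as above:

  # Import the MySQL table into Hive in one step;
  # --hive-import is what makes --hive-table take effect.
  sqoop import --connect jdbc:mysql://localhost:3306/test \
    --username root --password admin \
    --table active --hive-import --hive-table test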

1. Connection refused

  [hadoop@hs11 ~]$ sqoop export --connect jdbc:mysql://localhost/test --username root --password admin --table test --export-dir /user/hive/warehouse/actmp
  Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
  Please set $HCAT_HOME to the root of your HCatalog installation.
  13/08/21 09:14:07 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
  13/08/21 09:14:07 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
  13/08/21 09:14:07 INFO tool.CodeGenTool: Beginning code generation
  13/08/21 09:14:07 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
  13/08/21 09:14:07 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
  13/08/21 09:14:07 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/hadoop-1.1.2
  Note: /tmp/sqoop-hadoop/compile/0b5cae714a00b3940fb793c3694408ac/test.java uses or overrides a deprecated API.
  Note: Recompile with -Xlint:deprecation for details.
  13/08/21 09:14:08 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/0b5cae714a00b3940fb793c3694408ac/test.jar
  13/08/21 09:14:08 INFO mapreduce.ExportJobBase: Beginning export of test
  13/08/21 09:14:09 INFO input.FileInputFormat: Total input paths to process : 1
  13/08/21 09:14:09 INFO input.FileInputFormat: Total input paths to process : 1
  13/08/21 09:14:09 INFO util.NativeCodeLoader: Loaded the native-hadoop library
  13/08/21 09:14:09 WARN snappy.LoadSnappy: Snappy native library not loaded
  13/08/21 09:14:10 INFO mapred.JobClient: Running job: job_201307251523_0059
  13/08/21 09:14:11 INFO mapred.JobClient: map 0% reduce 0%
  13/08/21 09:14:20 INFO mapred.JobClient: Task Id : attempt_201307251523_0059_m_000000_0, Status : FAILED
  java.io.IOException: com.mysql.jdbc.CommunicationsException: Communications link failure due to underlying exception:
  ** BEGIN NESTED EXCEPTION **
  java.net.ConnectException
  MESSAGE: Connection refused
  STACKTRACE:
  java.net.ConnectException: Connection refused
  at java.net.PlainSocketImpl.socketConnect(Native Method)
  at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:351)
  at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:213)
  at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:200)
  at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
  at java.net.Socket.connect(Socket.java:529)
  at java.net.Socket.connect(Socket.java:478)
  at java.net.Socket.<init>(Socket.java:375)
  at java.net.Socket.<init>(Socket.java:218)
  at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:256)
  at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:271)
  at com.mysql.jdbc.Connection.createNewIO(Connection.java:2771)
  at com.mysql.jdbc.Connection.<init>(Connection.java:1555)
  at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:285)
  at java.sql.DriverManager.getConnection(DriverManager.java:582)
  at java.sql.DriverManager.getConnection(DriverManager.java:185)
  at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:294)
  at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.<init>(AsyncSqlRecordWriter.java:76)
  at org.apache.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(ExportOutputFormat.java:95)
  at org.apache.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:77)
  at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:628)
  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:753)
  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:396)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
  at org.apache.hadoop.mapred.Child.main(Child.java:249)
  ** END NESTED EXCEPTION **
  Last packet sent to the server was 1 ms ago.
  at org.apache.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:79)
  at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:628)
  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:753)
  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:396)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
  at org.apache.hadoop.mapred.Child.main(Child.java:249)
  Caused by: com.mysql.jdbc.CommunicationsException: Communications link failure due to underlying exception:
  ** BEGIN NESTED EXCEPTION **
  java.net.ConnectException
  MESSAGE: Connection refused

This is a MySQL user privilege problem: the export map tasks run on the cluster's worker nodes and open their own JDBC connections to MySQL, so the account must be allowed to connect from remote hosts. (Note also that later runs use the server IP 10.10.20.11 instead of localhost, since localhost would point each map task at its own node.) Grant remote access as below; a quick connectivity check follows the grants.

  mysql> show grants;
  mysql> GRANT ALL PRIVILEGES ON *.* TO 'root'@'%' IDENTIFIED BY PASSWORD '*4ACFE3202A5FF5CF467898FC58AAB1D615029441' WITH GRANT OPTION;
  mysql> FLUSH PRIVILEGES;
  mysql> create table test (mkey varchar(30),pkg varchar(50),cid varchar(20),pid varchar(50),count int,primary key(mkey,pkg,cid,pid) );
  alter ignore table cps_activation add unique index_day_pkgname_cid_pid (`day`,`pkgname`,`cid`,`pid`);
  Query OK, 0 rows affected (0.03 sec)
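
Before rerunning the export, it is worth confirming the grant from one of the Hadoop worker nodes. A minimal sketch, assuming the mysql client is installed there and using the host and credentials from this post:

  # From a worker node: verify that MySQL accepts this user remotely.
  mysql -h 10.10.20.11 -u root -padmin -e "SELECT 1;" test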

2. Table does not exist

  [hadoop@hs11 ~]$ sqoop export --connect jdbc:mysql://10.10.20.11/test --username root --password admin --table test --export-dir /user/hive/warehouse/actmp
  Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
  Please set $HCAT_HOME to the root of your HCatalog installation.
  13/08/21 09:16:26 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
  13/08/21 09:16:26 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
  13/08/21 09:16:26 INFO tool.CodeGenTool: Beginning code generation
  13/08/21 09:16:27 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
  13/08/21 09:16:27 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
  13/08/21 09:16:27 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/hadoop-1.1.2
  Note: /tmp/sqoop-hadoop/compile/74d18a91ec141f2feb777dc698bf7eb4/test.java uses or overrides a deprecated API.
  Note: Recompile with -Xlint:deprecation for details.
  13/08/21 09:16:28 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/74d18a91ec141f2feb777dc698bf7eb4/test.jar
  13/08/21 09:16:28 INFO mapreduce.ExportJobBase: Beginning export of test
  13/08/21 09:16:29 INFO input.FileInputFormat: Total input paths to process : 1
  13/08/21 09:16:29 INFO input.FileInputFormat: Total input paths to process : 1
  13/08/21 09:16:29 INFO util.NativeCodeLoader: Loaded the native-hadoop library
  13/08/21 09:16:29 WARN snappy.LoadSnappy: Snappy native library not loaded
  13/08/21 09:16:29 INFO mapred.JobClient: Running job: job_201307251523_0060
  13/08/21 09:16:30 INFO mapred.JobClient: map 0% reduce 0%
  13/08/21 09:16:38 INFO mapred.JobClient: Task Id : attempt_201307251523_0060_m_000000_0, Status : FAILED
  java.io.IOException: Can't export data, please check task tracker logs
  at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
  at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
  at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:396)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
  at org.apache.hadoop.mapred.Child.main(Child.java:249)
  Caused by: java.util.NoSuchElementException
  at java.util.AbstractList$Itr.next(AbstractList.java:350)
  at test.__loadFromFields(test.java:252)
  at test.parse(test.java:201)
  at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
  ... 10 more
When exporting data to MySQL, the target table must of course already exist, otherwise the job fails.
The actual cause of this error is that the fields sqoop parses out of the file do not line up with the columns of the MySQL table. You therefore need to pass sqoop an extra argument telling it the file's field delimiter so it can split records correctly; Hive's default field delimiter is '\001'. A quick way to verify this is sketched below.
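A minimal sketch for inspecting the raw Hive warehouse file, using the path from this post; cat -A makes control characters visible, so the \001 delimiter shows up as ^A:

  # Print the first record with non-printing characters visible (\001 appears as ^A).
  hadoop fs -cat /user/hive/warehouse/actmp/* | head -n 1 | cat -A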

3. The null field placeholder must be specified (none was given here, so the fields ended up misaligned). Hive writes NULL into its text files as the literal characters \N, which is why the successful run in section 4 also passes --input-null-string and --input-null-non-string.

  [hadoop@hs11 ~]$ sqoop export --connect jdbc:mysql://10.10.20.11/test --username root --password admin --table test --export-dir /user/hive/warehouse/actmp --input-fields-terminated-by '\001'
  Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
  Please set $HCAT_HOME to the root of your HCatalog installation.
  13/08/21 09:21:07 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
  13/08/21 09:21:07 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
  13/08/21 09:21:07 INFO tool.CodeGenTool: Beginning code generation
  13/08/21 09:21:07 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
  13/08/21 09:21:07 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
  13/08/21 09:21:07 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/hadoop-1.1.2
  Note: /tmp/sqoop-hadoop/compile/04d183c9e534cdb8d735e1bdc4be3deb/test.java uses or overrides a deprecated API.
  Note: Recompile with -Xlint:deprecation for details.
  13/08/21 09:21:08 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/04d183c9e534cdb8d735e1bdc4be3deb/test.jar
  13/08/21 09:21:08 INFO mapreduce.ExportJobBase: Beginning export of test
  13/08/21 09:21:09 INFO input.FileInputFormat: Total input paths to process : 1
  13/08/21 09:21:09 INFO input.FileInputFormat: Total input paths to process : 1
  13/08/21 09:21:09 INFO util.NativeCodeLoader: Loaded the native-hadoop library
  13/08/21 09:21:09 WARN snappy.LoadSnappy: Snappy native library not loaded
  13/08/21 09:21:10 INFO mapred.JobClient: Running job: job_201307251523_0061
  13/08/21 09:21:11 INFO mapred.JobClient: map 0% reduce 0%
  13/08/21 09:21:17 INFO mapred.JobClient: map 25% reduce 0%
  13/08/21 09:21:19 INFO mapred.JobClient: map 50% reduce 0%
  13/08/21 09:21:21 INFO mapred.JobClient: Task Id : attempt_201307251523_0061_m_000001_0, Status : FAILED
  java.io.IOException: Can't export data, please check task tracker logs
  at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
  at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
  at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:396)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
  at org.apache.hadoop.mapred.Child.main(Child.java:249)
  Caused by: java.lang.NumberFormatException: For input string: "665A5FFA-32C9-9463-1943-840A5FEAE193"
  at java.lang.NumberFormatException.forInputString(NumberFormatException.java:48)
  at java.lang.Integer.parseInt(Integer.java:458)
  at java.lang.Integer.valueOf(Integer.java:554)
  at test.__loadFromFields(test.java:264)
  at test.parse(test.java:201)
  at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
  ... 10 more

4. Success

  [hadoop@hs11 ~]$ sqoop export --connect jdbc:mysql://10.10.20.11/test --username root --password admin --table test --export-dir /user/hive/warehouse/actmp --input-fields-terminated-by '\001' --input-null-string '\\N' --input-null-non-string '\\N'
  Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
  Please set $HCAT_HOME to the root of your HCatalog installation.
  13/08/21 09:36:13 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
  13/08/21 09:36:13 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
  13/08/21 09:36:13 INFO tool.CodeGenTool: Beginning code generation
  13/08/21 09:36:13 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
  13/08/21 09:36:13 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
  13/08/21 09:36:13 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/hadoop-1.1.2
  Note: /tmp/sqoop-hadoop/compile/e22d31391498b790d799897cde25047d/test.java uses or overrides a deprecated API.
  Note: Recompile with -Xlint:deprecation for details.
  13/08/21 09:36:14 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/e22d31391498b790d799897cde25047d/test.jar
  13/08/21 09:36:14 INFO mapreduce.ExportJobBase: Beginning export of test
  13/08/21 09:36:15 INFO input.FileInputFormat: Total input paths to process : 1
  13/08/21 09:36:15 INFO input.FileInputFormat: Total input paths to process : 1
  13/08/21 09:36:15 INFO util.NativeCodeLoader: Loaded the native-hadoop library
  13/08/21 09:36:15 WARN snappy.LoadSnappy: Snappy native library not loaded
  13/08/21 09:36:16 INFO mapred.JobClient: Running job: job_201307251523_0064
  13/08/21 09:36:17 INFO mapred.JobClient: map 0% reduce 0%
  13/08/21 09:36:23 INFO mapred.JobClient: map 25% reduce 0%
  13/08/21 09:36:25 INFO mapred.JobClient: map 100% reduce 0%
  13/08/21 09:36:27 INFO mapred.JobClient: Job complete: job_201307251523_0064
  13/08/21 09:36:27 INFO mapred.JobClient: Counters: 18
  13/08/21 09:36:27 INFO mapred.JobClient: Job Counters
  13/08/21 09:36:27 INFO mapred.JobClient: SLOTS_MILLIS_MAPS=13151
  13/08/21 09:36:27 INFO mapred.JobClient: Total time spent by all reduces waiting after reserving slots (ms)=0
  13/08/21 09:36:27 INFO mapred.JobClient: Total time spent by all maps waiting after reserving slots (ms)=0
  13/08/21 09:36:27 INFO mapred.JobClient: Rack-local map tasks=2
  13/08/21 09:36:27 INFO mapred.JobClient: Launched map tasks=4
  13/08/21 09:36:27 INFO mapred.JobClient: SLOTS_MILLIS_REDUCES=0
  13/08/21 09:36:27 INFO mapred.JobClient: File Output Format Counters
  13/08/21 09:36:27 INFO mapred.JobClient: Bytes Written=0
  13/08/21 09:36:27 INFO mapred.JobClient: FileSystemCounters
  13/08/21 09:36:27 INFO mapred.JobClient: HDFS_BYTES_READ=1519
  13/08/21 09:36:27 INFO mapred.JobClient: FILE_BYTES_WRITTEN=234149
  13/08/21 09:36:27 INFO mapred.JobClient: File Input Format Counters
  13/08/21 09:36:27 INFO mapred.JobClient: Bytes Read=0
  13/08/21 09:36:27 INFO mapred.JobClient: Map-Reduce Framework
  13/08/21 09:36:27 INFO mapred.JobClient: Map input records=6
  13/08/21 09:36:27 INFO mapred.JobClient: Physical memory (bytes) snapshot=663863296
  13/08/21 09:36:27 INFO mapred.JobClient: Spilled Records=0
  13/08/21 09:36:27 INFO mapred.JobClient: CPU time spent (ms)=3720
  13/08/21 09:36:27 INFO mapred.JobClient: Total committed heap usage (bytes)=2013790208
  13/08/21 09:36:27 INFO mapred.JobClient: Virtual memory (bytes) snapshot=5583151104
  13/08/21 09:36:27 INFO mapred.JobClient: Map output records=6
  13/08/21 09:36:27 INFO mapred.JobClient: SPLIT_RAW_BYTES=571
  13/08/21 09:36:27 INFO mapreduce.ExportJobBase: Transferred 1.4834 KB in 12.1574 seconds (124.9446 bytes/sec)
  13/08/21 09:36:27 INFO mapreduce.ExportJobBase: Exported 6 records.
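
After a successful run it is worth checking the row count on the MySQL side against the "Exported 6 records" line above. A minimal sketch, assuming the mysql client and the credentials used throughout this post:

  # Expecting the same count that the sqoop log reported (6).
  mysql -h 10.10.20.11 -u root -padmin -e "SELECT COUNT(*) FROM test;" test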

5. A MySQL string column is defined too short to hold the data

  java.io.IOException: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'pid' at row 1
  at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:192)
  at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:396)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
  at org.apache.hadoop.mapred.Child.main(Child.java:249)
  Caused by: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'pid' at row 1
  at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2983)
  at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1631)
  at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:1723)
  at com.mysql.jdbc.Connection.execSQL(Connection.java:3283)
  at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1332)
  at com.mysql.jdbc.PreparedStatement.execute(PreparedStatement.java:882)
  at org.apache.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:233)
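
The fix is to widen the offending column. A minimal sketch; the new length of 100 is an assumption, so size it to your longest actual value:

  # Widen the pid column so the exported strings fit (100 is a guess, not a rule).
  mysql -h 10.10.20.11 -u root -padmin -e "ALTER TABLE test MODIFY pid VARCHAR(100);" test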

6. Date format problem (a MySQL date column requires the Hive string to be yyyy-MM-dd; I originally used yyyymmdd and got the error below)

  13/08/21 17:42:44 INFO mapred.JobClient: Task Id : attempt_201307251523_0079_m_000000_1, Status : FAILED
  java.io.IOException: Can't export data, please check task tracker logs
  at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
  at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
  at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:396)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
  at org.apache.hadoop.mapred.Child.main(Child.java:249)
  Caused by: java.lang.IllegalArgumentException
  at java.sql.Date.valueOf(Date.java:138)
  at cps_activation.__loadFromFields(cps_activation.java:308)
  at cps_activation.parse(cps_activation.java:255)
  at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
  ... 10 more
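
One way out is to rewrite the date strings in Hive before exporting, since java.sql.Date.valueOf only accepts yyyy-MM-dd. A minimal sketch, assuming the source is the active table from this post with a yyyymmdd string column named day, and that the export reads from /user/hive/warehouse/actmp (restaging through that directory is an assumption, not the author's exact pipeline):

  # Rewrite yyyymmdd into yyyy-MM-dd while staging the export data;
  # INSERT OVERWRITE DIRECTORY writes \001-delimited text by default.
  hive -e "
  INSERT OVERWRITE DIRECTORY '/user/hive/warehouse/actmp'
  SELECT id,
         concat(substr(day, 1, 4), '-', substr(day, 5, 2), '-', substr(day, 7, 2)),
         pkgname, cid, pid, activation
  FROM active;
  "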

7. Fields out of alignment, or field types inconsistent

  Caused by: java.lang.NumberFormatException: For input string: "06701A4A-0808-E9A8-0D28-A8020B494E37"
  at java.lang.NumberFormatException.forInputString(NumberFormatException.java:48)
  at java.lang.Integer.parseInt(Integer.java:458)
  at java.lang.Integer.valueOf(Integer.java:554)
  at test.__loadFromFields(test.java:264)
  at test.parse(test.java:201)
  at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
  ... 10 more
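
When a UUID lands in an int column like this, the usual cause is that the number of parsed fields differs from the MySQL column count, shifting everything over by one or more positions. A quick diagnostic sketch comparing the two counts, using the paths and names from this post:

  # Fields per line in the export file (delimiter \001)...
  hadoop fs -cat /user/hive/warehouse/actmp/* | head -n 1 | awk -F $'\001' '{ print NF }'
  # ...versus the number of columns in the target MySQL table.
  mysql -h 10.10.20.11 -u root -padmin -e "SELECT COUNT(*) FROM information_schema.columns WHERE table_schema = 'test' AND table_name = 'test';"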
