Details

Type: Bug
Status: Resolved
Priority: Major
Resolution: Fixed
Affects Version/s: None
Fix Version/s: 1.3.0
Component/s: None
Labels: None
Description
When importing data into Hive for a table that already exists, the import fails because the CREATE TABLE statement generated by Sqoop fails. Sqoop should instead allow the data to be loaded into the existing table when the --hive-overwrite option is specified.
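A minimal sketch of the desired behavior (hypothetical illustration, not Sqoop's actual TableDefWriter code): when --hive-overwrite is set, the generated DDL could use CREATE TABLE IF NOT EXISTS so an existing table no longer raises AlreadyExistsException, and the load statement could use LOAD DATA ... OVERWRITE INTO TABLE so the existing rows are replaced rather than appended.

```python
def hive_statements(table, columns, hdfs_path, overwrite):
    """Return (create_stmt, load_stmt) for a Hive import.

    Hypothetical sketch: with overwrite=True, CREATE TABLE IF NOT EXISTS
    tolerates a pre-existing table, and LOAD DATA ... OVERWRITE INTO TABLE
    replaces its contents instead of failing or appending.
    """
    # Sqoop maps imported columns to Hive types; STRING used here for brevity.
    cols = ", ".join("`%s` STRING" % c for c in columns)
    if_not_exists = "IF NOT EXISTS " if overwrite else ""
    create = ("CREATE TABLE %s`%s` (%s) "
              "ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\001' "
              "LINES TERMINATED BY '\\012' STORED AS TEXTFILE"
              % (if_not_exists, table, cols))
    load = ("LOAD DATA INPATH '%s' %sINTO TABLE `%s`"
            % (hdfs_path, "OVERWRITE " if overwrite else "", table))
    return create, load
```

Both forms (CREATE TABLE IF NOT EXISTS and LOAD DATA ... OVERWRITE INTO TABLE) are standard HiveQL, so no metadata error is raised when the target table already exists.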
Relevant command, console log, and stack trace:
SQOOP import statement:

sqoop import --driver com.teradata.jdbc.TeraDriver --connect jdbc:teradata://****.com/***_V --username **** -P --table Table_1 --split-by TABLENAME --num-mappers 1 --warehouse-dir /userdata/***/sqoop/***/qatest --hive-import --hive-overwrite --hive-table DB_1_Table_1 --verbose

Sqoop console log:

11/05/23 08:54:38 INFO mapreduce.ImportJobBase: Transferred 245 bytes in 11.3367 seconds (21.6112 bytes/sec)
11/05/23 08:54:38 INFO mapreduce.ImportJobBase: Retrieved 4 records.
11/05/23 08:54:38 INFO hive.HiveImport: Removing temporary files from import process: /***/***/sqoop/***/qatest/***/_logs
11/05/23 08:54:38 INFO hive.HiveImport: Loading uploaded data into Hive
11/05/23 08:54:38 DEBUG hive.HiveImport: Hive.inputTable: Table_1
11/05/23 08:54:38 DEBUG hive.HiveImport: Hive.outputTable: DB_1_Table_1
11/05/23 08:54:38 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM Table_1 AS t WHERE 1=0
11/05/23 08:54:38 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM Table_1 AS t WHERE 1=0
11/05/23 08:54:38 WARN hive.TableDefWriter: Column LOAD_START_DTTM had to be cast to a less precise type in Hive
11/05/23 08:54:38 WARN hive.TableDefWriter: Column LOAD_END_DTTM had to be cast to a less precise type in Hive
11/05/23 08:54:38 DEBUG hive.TableDefWriter: Create statement: CREATE TABLE `DB_1_Table_1` ( `TABLENAME` STRING, `LOAD_START_DTTM` STRING, `LOAD_END_DTTM` STRING) COMMENT 'Imported by sqoop on 2011/05/23 08:54:38' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE
11/05/23 08:54:38 DEBUG hive.TableDefWriter: Load statement: LOAD DATA INPATH 'hdfs://****/qatest/Table_1' INTO TABLE `DB_1_Table_1`
11/05/23 08:54:38 DEBUG hive.HiveImport: Using external Hive process.
11/05/23 08:54:40 INFO hive.HiveImport: Hive history file=/tmp/bejoys/hive_job_log_bejoys_201105230854_1483614299.txt
11/05/23 08:54:42 INFO hive.HiveImport: FAILED: Error in metadata: AlreadyExistsException(message:Table DB_1_Table_1 already exists)
11/05/23 08:54:42 INFO hive.HiveImport: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
11/05/23 08:54:42 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 9
    at com.cloudera.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:326)
    at com.cloudera.sqoop.hive.HiveImport.executeScript(HiveImport.java:276)
    at com.cloudera.sqoop.hive.HiveImport.importTable(HiveImport.java:218)
    at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:362)
    at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:423)
    at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
    at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
    at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:218)
    at com.cloudera.sqoop.Sqoop.main(Sqoop.java:228)