Description
Column names are not escaped in TableDefWriter.java. As a result, a column in the origin DB whose name is a Hive reserved word (for example "location") produces a CREATE TABLE statement that Hive cannot parse, and the sqoop import fails. A sketch of the kind of backtick quoting that would avoid this is included after the stack trace below.
10/10/29 19:17:15 DEBUG hive.TableDefWriter: Create statement: CREATE TABLE IF NOT EXISTS sherlock_deals ( id INT, location STRING ) COMMENT 'Imported by sqoop on 2010/10/29 19:17:15' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE
10/10/29 19:17:15 DEBUG hive.TableDefWriter: Load statement: LOAD DATA INPATH 'hdfs://b63i:9000/user/dobby/sherlock_deals' INTO TABLE sherlock_deals
10/10/29 19:17:15 DEBUG hive.HiveImport: Using external Hive process.
10/10/29 19:17:16 INFO hive.HiveImport: Hive history file=/tmp/dobby/hive_job_log_dobby_201010291917_1887769339.txt
10/10/29 19:17:16 INFO hive.HiveImport: FAILED: Parse Error: line 1:309 mismatched input 'location' expecting Identifier in column specification
10/10/29 19:17:16 INFO hive.HiveImport:
10/10/29 19:17:16 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 11
at com.cloudera.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:320)
at com.cloudera.sqoop.hive.HiveImport.executeScript(HiveImport.java:270)
at com.cloudera.sqoop.hive.HiveImport.importTable(HiveImport.java:212)
at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:362)
at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:423)
at com.cloudera.sqoop.tool.JobTool.execJob(JobTool.java:231)
at com.cloudera.sqoop.tool.JobTool.run(JobTool.java:286)
at com.cloudera.sqoop.Sqoop.run(Sqoop.java:134)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:170)
at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:196)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:205)
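For illustration only, here is a minimal sketch of the kind of identifier quoting TableDefWriter could apply when it builds the column specification. The class and helper names below are hypothetical, not the actual Sqoop code; the point is simply that wrapping each column name in backticks lets Hive accept reserved words such as "location" as identifiers.

// Hypothetical sketch: quote column names with backticks before emitting
// them into the generated Hive CREATE TABLE statement.
public final class HiveIdentifierEscaper {

  /**
   * Wraps a column name in backticks so that a Hive reserved word such as
   * "location" parses as an identifier in a column specification.
   */
  public static String escapeHiveColumn(String columnName) {
    // Backticks are not expected in names coming from a JDBC source, but
    // strip them defensively so the quoting cannot be broken.
    return "`" + columnName.replace("`", "") + "`";
  }

  public static void main(String[] args) {
    // Rebuild the column list from the failing example above.
    String[][] cols = { {"id", "INT"}, {"location", "STRING"} };
    StringBuilder sb = new StringBuilder("CREATE TABLE IF NOT EXISTS sherlock_deals ( ");
    for (int i = 0; i < cols.length; i++) {
      if (i > 0) {
        sb.append(", ");
      }
      sb.append(escapeHiveColumn(cols[i][0])).append(' ').append(cols[i][1]);
    }
    sb.append(" ) ...");
    // Prints: CREATE TABLE IF NOT EXISTS sherlock_deals ( `id` INT, `location` STRING ) ...
    System.out.println(sb);
  }
}

With quoting applied, the generated statement from the log above would read "( `id` INT, `location` STRING )" and the Hive parse error on 'location' would not occur.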