Home Page for the TeradataForum
 

Archives of the TeradataForum

Message Posted: Fri, 11 Dec 2015 @ 09:47:42 GMT


     

Subj:   Unable to load data using tdimport
 
From:   Koushik Chandra

I am trying to pull data from a Teradata database using the tdimport utility.

The job failed with the error below:

     /usr/iop/4.1.0.0/sqoop/bin/sqoop tdimport --connect jdbc:teradata://<>/database=<> --username <> --password <> --as-textfile --hive-table pos_rtl_str_test --table pos_rtl_str --columns "RTL_STR_ID, RTL_STR_LANG_CD" --split-by RTL_STR_ID
     Warning: /usr/iop/4.1.0.0/sqoop/../accumulo does not exist! Accumulo imports will fail.
     Please set $ACCUMULO_HOME to the root of your Accumulo installation.
     15/12/11 07:10:34 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6_IBM_20
     15/12/11 07:10:34 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
     15/12/11 07:10:34 INFO common.ConnectorPlugin: load plugins in jar:file:/usr/iop/4.1.0.0/sqoop/lib/teradata-connector-1.4.1.jar!/teradata.connector.plugins.xml
     15/12/11 07:10:34 WARN conf.HiveConf: HiveConf of name hive.heapsize does not exist
     15/12/11 07:10:34 INFO hive.metastore: Trying to connect to metastore with URI thrift://ehaasp-10035-master-3.bi.services.bluemix.net:9083
     SLF4J: Class path contains multiple SLF4J bindings.
     SLF4J: Found binding in [jar:file:/usr/iop/4.1.0.0/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
     SLF4J: Found binding in [jar:file:/usr/iop/4.1.0.0/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
     SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
     SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
     15/12/11 07:10:35 INFO hive.metastore: Connected to metastore.
     15/12/11 07:10:35 INFO processor.TeradataInputProcessor: input preprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor starts at: 1449817835234
     15/12/11 07:10:36 INFO utils.TeradataUtils: the input database product is Teradata
     15/12/11 07:10:36 INFO utils.TeradataUtils: the input database version is 14.10
     15/12/11 07:10:36 INFO utils.TeradataUtils: the jdbc driver version is 15.0
     15/12/11 07:10:50 INFO processor.TeradataInputProcessor: the teradata connector for hadoop version is: 1.4.1
     15/12/11 07:10:50 INFO processor.TeradataInputProcessor: input jdbc properties are jdbc:teradata://<>/database=<>
     15/12/11 07:11:07 INFO processor.TeradataInputProcessor: the number of mappers are 4
     15/12/11 07:11:07 INFO processor.TeradataInputProcessor: input preprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor ends at: 1449817867082
     15/12/11 07:11:07 INFO processor.TeradataInputProcessor: the total elapsed time of input preprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor is: 31s
     15/12/11 07:11:07 WARN conf.HiveConf: HiveConf of name hive.heapsize does not exist
     15/12/11 07:11:07 INFO hive.metastore: Trying to connect to metastore with URI thrift://<>:9083
     15/12/11 07:11:07 INFO hive.metastore: Connected to metastore.
     15/12/11 07:11:07 INFO processor.HiveOutputProcessor: hive table default.pos_rtl_str_test does not exist
     15/12/11 07:11:07 WARN tool.ConnectorJobRunner: com.teradata.connector.common.exception.ConnectorException: The output post processor returns 1
     15/12/11 07:11:07 INFO processor.TeradataInputProcessor: input postprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor starts at: 1449817867932
     15/12/11 07:11:08 INFO processor.TeradataInputProcessor: input postprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor ends at: 1449817867932
     15/12/11 07:11:08 INFO processor.TeradataInputProcessor: the total elapsed time of input postprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor is: 0s
     15/12/11 07:11:08 ERROR wrapper.TDImportTool: Teradata Connector for Hadoop tool error.
     java.lang.reflect.InvocationTargetException
             at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
             at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
             at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
             at java.lang.reflect.Method.invoke(Method.java:497)
        at com.ibm.biginsights.ie.sqoop.td.wrapper.TDImportTool.callTDCH(TDImportTool.java:104)
        at com.ibm.biginsights.ie.sqoop.td.wrapper.TDImportTool.run(TDImportTool.java:72)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
     Caused by: com.teradata.connector.common.exception.ConnectorException: Import Hive table's column schema is missing
             at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:140)
             ... 12 more
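
The relevant lines appear to be "hive table default.pos_rtl_str_test does not exist" followed by "Import Hive table's column schema is missing": the connector cannot find a column schema for the target Hive table. One possible workaround, a sketch only, is to create the Hive table before running tdimport so the connector can read its schema. The column types below are assumptions (STRING for both) and should be matched to the actual Teradata definitions of pos_rtl_str; the row format and delimiter are likewise illustrative:

```sql
-- Hypothetical pre-creation of the target Hive table.
-- Adjust column types and delimiter to match the Teradata source table.
CREATE TABLE pos_rtl_str_test (
  rtl_str_id      STRING,
  rtl_str_lang_cd STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001'
STORED AS TEXTFILE;
```

Depending on how the IBM tdimport wrapper forwards Sqoop options, Sqoop's --hive-import together with --create-hive-table may also let the tool create the table itself rather than requiring it to exist up front.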

Regards,

Koushik Chandra



     
Copyright for the TeradataForum (TDATA-L), Manta BlueSky    
Copyright 2016 - All Rights Reserved    
Last Modified: 27 Dec 2016