Archives of the TeradataForum

Message Posted: Fri, 11 May 2012 @ 21:23:11 GMT
Folks,

I am working on getting some polygon data from a source system that exceeds Teradata's 64K row-length limit. To avoid having to load CLOB data via BTEQ, the TPT Inserter operator, or a Java application, one approach I want to try is to split each row into parts and store the polygon data across multiple rows, like this:

     1|abcd|abcd|abcd|row1|polygon data
     2|abcd|abcd|abcd|row2|end of polygon data

A query would then concatenate those fields and insert the result into an ST_Geometry column. This all works fine and dandy as long as the fields are relatively small, i.e. under the 64K limit. What I need is to concatenate fields that could each be 31,000 characters long and put the result into an ST_Geometry column. The issue I am running into is that when I try this, I get the error "[3798] A column or character expression is larger than the max size."

How should I approach getting the data into this ST_Geometry column without going the TPT Inserter/BTEQ route? I really don't want to have to deal with a data file plus 300K+ geometry CLOB files. That is a bear to manage, and the source system didn't like the idea of having to generate them. So I figured that if I could split the data into parts, I could use our existing FastLoad/TPT loader programs and then just merge the data from the staging table into the final table. I just need to get around this maximum length issue.
My staging table is defined as:

     CREATE MULTISET TABLE UI_RESULTS_DB.postal_boundary_stg3 ,NO FALLBACK ,
          NO BEFORE JOURNAL,
          NO AFTER JOURNAL,
          CHECKSUM = DEFAULT,
          DEFAULT MERGEBLOCKRATIO
          (
           POSTAL_BOUNDARY_ID DECIMAL(38,0),
           POSTAL_CD CHAR(10) CHARACTER SET LATIN NOT CASESPECIFIC,
           ISO_CTRY_3_CD CHAR(3) CHARACTER SET LATIN NOT CASESPECIFIC,
           ISO_CTRY_2_CD CHAR(2) CHARACTER SET LATIN NOT CASESPECIFIC,
           ADMIN_1_NM VARCHAR(100) CHARACTER SET UNICODE NOT CASESPECIFIC,
           ADMIN_2_NM VARCHAR(100) CHARACTER SET UNICODE NOT CASESPECIFIC,
           ADMIN_3_NM VARCHAR(100) CHARACTER SET UNICODE NOT CASESPECIFIC,
           ADMIN_4_NM VARCHAR(100) CHARACTER SET UNICODE NOT CASESPECIFIC,
           ADMIN_5_NM VARCHAR(100) CHARACTER SET UNICODE NOT CASESPECIFIC,
           POSTAL_NM VARCHAR(100) CHARACTER SET UNICODE NOT CASESPECIFIC,
           POSTAL_TYPE_CD VARCHAR(50) CHARACTER SET UNICODE NOT CASESPECIFIC,
           ADMIN_1_LOCAL_NM VARCHAR(100) CHARACTER SET UNICODE NOT CASESPECIFIC,
           ADMIN_2_LOCAL_NM VARCHAR(100) CHARACTER SET UNICODE NOT CASESPECIFIC,
           ADMIN_3_LOCAL_NM VARCHAR(100) CHARACTER SET UNICODE NOT CASESPECIFIC,
           ADMIN_4_LOCAL_NM VARCHAR(100) CHARACTER SET UNICODE NOT CASESPECIFIC,
           ADMIN_5_LOCAL_NM VARCHAR(100) CHARACTER SET UNICODE NOT CASESPECIFIC,
           POSTAL_LOCAL_NM VARCHAR(100) CHARACTER SET UNICODE NOT CASESPECIFIC,
           LOCAL_LANGUAGE_CD CHAR(3) CHARACTER SET UNICODE NOT CASESPECIFIC,
           AREA_MILE_NBR DECIMAL(15,2),
           AREA_METER_NBR DECIMAL(15,2),
           ROW_NUM CHAR(1) CHARACTER SET LATIN NOT CASESPECIFIC,
           BOUNDARY_SPTL VARCHAR(31000) CHARACTER SET LATIN NOT CASESPECIFIC)
     PRIMARY INDEX ( POSTAL_BOUNDARY_ID );

Below is the SQL I was trying to use to insert into the final table, which defines BOUNDARY_SPTL as an ST_Geometry column. This is my first time having to deal with data that exceeds the 64K row limit.
     INSERT INTO ui_results_db.postal_boundary
     SELECT a.POSTAL_BOUNDARY_ID,
            a.POSTAL_CD,
            a.ISO_CTRY_3_CD,
            a.ISO_CTRY_2_CD,
            a.ADMIN_1_NM,
            a.ADMIN_2_NM,
            a.ADMIN_3_NM,
            a.ADMIN_4_NM,
            a.ADMIN_5_NM,
            a.POSTAL_NM,
            a.POSTAL_TYPE_CD,
            a.ADMIN_1_LOCAL_NM,
            a.ADMIN_2_LOCAL_NM,
            a.ADMIN_3_LOCAL_NM,
            a.ADMIN_4_LOCAL_NM,
            a.ADMIN_5_LOCAL_NM,
            a.POSTAL_LOCAL_NM,
            a.LOCAL_LANGUAGE_CD,
            a.AREA_MILE_NBR,
            a.AREA_METER_NBR,
            ((a.BOUNDARY_SPTL (VARCHAR(31000)))
             || (b.boundary_sptl (VARCHAR(31000)))
             || (c.boundary_sptl (VARCHAR(31000))))
     FROM ui_results_db.postal_boundary_stg3 a,
          ui_results_db.postal_boundary_stg3 b,
          ui_results_db.postal_boundary_stg3 c
     WHERE a.postal_boundary_id = b.postal_boundary_id
       AND a.postal_boundary_id = c.postal_boundary_id
       AND a.row_num = '1'
       AND b.row_num = '2'
       AND c.row_num = '3';

Thanks,

Bill
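[Editor's note: one commonly suggested workaround for error 3798 is to cast each piece to CLOB before concatenating, since the 64K ceiling applies to VARCHAR character expressions rather than to CLOBs, and ST_Geometry columns can accept well-known-text values via implicit casting. The sketch below is illustrative only and should be verified against your Teradata release; it assumes the implicit CLOB-to-ST_Geometry cast is available there.]

```sql
-- Hedged sketch (not from the original post): cast each 31,000-character
-- piece to CLOB so the concatenated intermediate expression is no longer
-- bound by the 64K character-expression limit that raises error 3798.
-- Assumes the target column's implicit cast from CLOB well-known text to
-- ST_Geometry is supported on your release.
INSERT INTO ui_results_db.postal_boundary
SELECT a.POSTAL_BOUNDARY_ID,
       a.POSTAL_CD,
       a.ISO_CTRY_3_CD,
       a.ISO_CTRY_2_CD,
       a.ADMIN_1_NM,
       a.ADMIN_2_NM,
       a.ADMIN_3_NM,
       a.ADMIN_4_NM,
       a.ADMIN_5_NM,
       a.POSTAL_NM,
       a.POSTAL_TYPE_CD,
       a.ADMIN_1_LOCAL_NM,
       a.ADMIN_2_LOCAL_NM,
       a.ADMIN_3_LOCAL_NM,
       a.ADMIN_4_LOCAL_NM,
       a.ADMIN_5_LOCAL_NM,
       a.POSTAL_LOCAL_NM,
       a.LOCAL_LANGUAGE_CD,
       a.AREA_MILE_NBR,
       a.AREA_METER_NBR,
       -- CLOB concatenation avoids the 64K VARCHAR expression ceiling
       CAST(a.BOUNDARY_SPTL AS CLOB)
         || CAST(b.BOUNDARY_SPTL AS CLOB)
         || CAST(c.BOUNDARY_SPTL AS CLOB)
FROM ui_results_db.postal_boundary_stg3 a
JOIN ui_results_db.postal_boundary_stg3 b
  ON a.postal_boundary_id = b.postal_boundary_id
JOIN ui_results_db.postal_boundary_stg3 c
  ON a.postal_boundary_id = c.postal_boundary_id
WHERE a.row_num = '1'
  AND b.row_num = '2'
  AND c.row_num = '3';
```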
Copyright 2016 - All Rights Reserved
Last Modified: 15 Jun 2023