Archives of the TeradataForum
Message Posted: Mon, 20 Oct 2003 @ 21:43:08 GMT
Tony,

The all-time fastest way to deal with this is to FASTLOAD your input data into a work table, then run two SQL statements against the work table: one joining to your provider table and copying the matching data to your target table, the other doing an exclusion join to the provider table to identify the rows to be rejected. The FASTLOAD needs to run separately, but the SQL can be presented either by BTEQ or by your COBOL program.

The problem isn't the COBOL interface; it's the fact that you're issuing 1.5 million SELECT statements in your program. DB2 copes with this by having "static SQL", so it doesn't have to reparse and re-optimize the request 1.5 million times, and the provider table is probably memory resident, so each request most likely doesn't even do an I/O. Even for cached queries like this, unfortunately, Teradata still requires the request to be passed to the Teradata machine (2-4 I/Os) for every one of those 1.5 million rows.

The strategy with Teradata is to get the data onto the Teradata machine as quickly as possible and manipulate it there to exploit the parallelism.

Pat Belle
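A minimal sketch of the two statements described above. All table and column names here (work_input for the FastLoaded work table, providers, target_claims, rejected_input, provider_id) are hypothetical placeholders, not names from the original post:

    -- 1. Inner join: copy the rows whose provider exists from the
    --    FastLoaded work table into the target table.
    INSERT INTO target_claims
    SELECT w.*
    FROM   work_input w
    INNER JOIN providers p
           ON  w.provider_id = p.provider_id;

    -- 2. Exclusion join: capture the rows with no matching provider
    --    so they can be reported as rejects.
    INSERT INTO rejected_input
    SELECT w.*
    FROM   work_input w
    WHERE  NOT EXISTS (SELECT 1
                       FROM   providers p
                       WHERE  p.provider_id = w.provider_id);

Because both statements are set-oriented, the request is parsed and optimized once per statement and all 1.5 million rows are processed inside Teradata, rather than paying a request round trip per row.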
Copyright 2016 - All Rights Reserved
Last Modified: 15 Jun 2023