Archives of the TeradataForum
Message Posted: Sun, 09 Mar 2003 @ 12:32:12 GMT
Dieter has pointed out most of the options, but I'd like to clarify a couple of them.
Using the Multiload IMPORT task is probably what you'll need if your delete criteria are "complex". To use this approach, first run a SELECT to identify the PI and primary key of every row that needs to be deleted, and export this answer set to a file on your host system. Then, in your existing Multiload IMPORT job that inserts the new rows, add a second IMPORT which reads this new file and passes the rows to a new .DML statement that issues the delete.
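As a rough sketch of what that two-IMPORT job might look like (all table, column, and file names here are hypothetical placeholders, and your actual layouts will differ):

```
.LOGTABLE mydb.ml_log;
.LOGON tdpid/user,password;
.BEGIN IMPORT MLOAD TABLES mydb.target_tbl;

/* Layout for the file of new rows to insert */
.LAYOUT new_rows;
  .FIELD pi_col * INTEGER;
  .FIELD pk_col * INTEGER;
  .FIELD col_a  * CHAR(10);

/* Layout for the exported file of PI + primary key values to delete */
.LAYOUT del_keys;
  .FIELD pi_col * INTEGER;
  .FIELD pk_col * INTEGER;

.DML LABEL ins_dml;
INSERT INTO mydb.target_tbl (pi_col, pk_col, col_a)
VALUES (:pi_col, :pk_col, :col_a);

.DML LABEL del_dml;
DELETE FROM mydb.target_tbl
WHERE pi_col = :pi_col
  AND pk_col = :pk_col;

/* Existing IMPORT: inserts the new rows */
.IMPORT INFILE new_rows.dat
  LAYOUT new_rows APPLY ins_dml;

/* Second IMPORT: applies the deletes keyed by PI + primary key */
.IMPORT INFILE delete_keys.dat
  LAYOUT del_keys APPLY del_dml;

.END MLOAD;
.LOGOFF;
```

Because the delete's WHERE clause specifies the PI, Multiload can hash each delete row straight to the right AMP in the same pass as the inserts.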
Another way of deleting large numbers of rows is to insert/select into a new table, but that requires disk space for a copy of the rows you're keeping (in your case about 1.17 billion). So whilst this is a quick way of doing it, practicalities may prevent you from taking this approach.
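If you do have the space, the insert/select route is straightforward. A minimal sketch, again with hypothetical table and column names and a placeholder delete condition:

```
/* Empty copy of the table, same definition and PI */
CREATE TABLE mydb.target_tbl_new AS mydb.target_tbl WITH NO DATA;

/* Copy only the rows you intend to keep */
INSERT INTO mydb.target_tbl_new
SELECT *
FROM mydb.target_tbl
WHERE NOT (col_a = 'OBSOLETE');   /* your "complex" delete criteria, negated */

/* Swap the tables and drop the original */
RENAME TABLE mydb.target_tbl     TO mydb.target_tbl_old;
RENAME TABLE mydb.target_tbl_new TO mydb.target_tbl;
DROP TABLE mydb.target_tbl_old;
```

The insert/select into an empty table is fast because Teradata can load it with minimal transient journaling, but the peak space requirement is the original table plus the kept rows until the old table is dropped.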
|Copyright 2016 - All Rights Reserved|
|Last Modified: 23 Jun 2019|