Archives of the TeradataForum

Message Posted: Wed, 14 Aug 2013 @ 10:40:37 GMT
You need to ask the business a few critical questions first:

  * What version of NetVault are you on?
  * Why are the big tables that need incremental backup not partitioned?
  * If the big tables cannot be partitioned, can you use AFTER JOURNAL on them?
    Remember that journaling adds overhead to your DBA work.
  * How often are the tables you want to back up incrementally being backed up
    today - weekly, daily, or more often?
  * If the tables are large enough, are the source files or source tables
    backed up instead?
  * If so, how long are the source files retained?
  * Are you receiving incremental changes or full data refreshes?

If partitioning the big table in your big database is not an option, you can
fall back on permanent journaling, roughly as follows (see the sketches at the
end of this message for a partitioned alternative and for adding journaling to
existing objects).

At the database level (replace BigDatabase and XXXspace with your own names):

  CREATE DATABASE BigDatabase FROM XXXspace AS
    PERM = 80000000000,      -- 80 GB
    SPOOL = 1000000000,
    ACCOUNT = '$xxxxxx',
    NO FALLBACK,
    AFTER JOURNAL,
    NO BEFORE JOURNAL,
    DEFAULT JOURNAL TABLE = BigDatabase.journals;

At the table level:

  CREATE TABLE BigDatabase.BigTable,
    NO FALLBACK,
    AFTER JOURNAL,
    NO BEFORE JOURNAL
  (
    COL1 BIGINT,
    COL2 VARCHAR(100) CHARACTER SET LATIN NOT CASESPECIFIC,
    COL3 VARCHAR(100) CHARACTER SET LATIN NOT CASESPECIFIC,
    COL4 DATE FORMAT 'YY/MM/DD'
  )
  PRIMARY INDEX (COL1);

In the backup Options tab:

  * Backup type: Incremental
  * Check "Use ARC Catalog" and "Generate ARC script only"
  * Enable checkpoint restart with a checkpoint frequency of 1000000
  * Sessions = 4

Please make sure you turn to the JOURNAL approach only after all other options
have been explored.

Best,
Sobhan
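P.S. Two rough sketches for illustration only; the database, table, and column
names are placeholders, not taken from any real system.

First, if you can get agreement to repartition, a row-partitioned (PPI) version
of the big table lets the backup work at the partition level instead of relying
on journaling. Something along these lines, assuming COL4 is the date column
that drives the incremental loads:

  CREATE TABLE BigDatabase.BigTable,
    NO FALLBACK
  (
    COL1 BIGINT,
    COL2 VARCHAR(100) CHARACTER SET LATIN NOT CASESPECIFIC,
    COL3 VARCHAR(100) CHARACTER SET LATIN NOT CASESPECIFIC,
    COL4 DATE FORMAT 'YY/MM/DD' NOT NULL
  )
  PRIMARY INDEX (COL1)
  PARTITION BY RANGE_N(COL4 BETWEEN DATE '2010-01-01'
                                AND DATE '2020-12-31'
                       EACH INTERVAL '1' MONTH);

Second, if you do go the journal route and the database and table already
exist, the journal options can be added after the fact, and the journal should
be checkpointed as part of each backup cycle. Roughly:

  -- give the existing database a default journal table
  MODIFY DATABASE BigDatabase AS
    DEFAULT JOURNAL TABLE = BigDatabase.journals;

  -- switch the existing table to after-image journaling
  ALTER TABLE BigDatabase.BigTable,
    AFTER JOURNAL,
    NO BEFORE JOURNAL;

  -- mark a named recovery point in the journal around each archive run
  CHECKPOINT BigDatabase.journals, NAMED nightly_ckpt;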
Copyright 2016 - All Rights Reserved
Last Modified: 15 Jun 2023