Archives of the TeradataForum
Message Posted: Tue, 19 Jan 2016 @ 09:21:08 GMT
We recently tested a named-pipe version of FastExport feeding a SQL*Loader job, and it seems to work pretty well. However, I don't know whether this will scale out unless Oracle can handle that rate of data ingestion. For another case we considered Kafka as an intermediate broker system, but I don't have details on either of these. I also know Informatica has FastReader (a legacy name) to extract large data volumes from Oracle and insert them into Teradata (we saw this working on a very large export from Oracle with inserts into Teradata). You should probably check with them whether they have something similar. I am inclined to say Informatica may have it.

Best,
Vinay Bagare
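For anyone curious about the named-pipe pattern mentioned above: the idea is that the exporter writes rows into a FIFO while the loader reads them concurrently, so no intermediate flat file ever lands on disk. Here is a minimal Python sketch of that mechanism, with stand-in producer/consumer code where FastExport and SQL*Loader would sit; the pipe path, row data, and delimiter are made up for illustration.

```python
import os
import tempfile
import threading

# Create a named pipe (FIFO) -- this is the "file" both tools would be
# pointed at: FastExport via .EXPORT OUTFILE, SQL*Loader via INFILE.
pipe_dir = tempfile.mkdtemp()
pipe_path = os.path.join(pipe_dir, "export.pipe")
os.mkfifo(pipe_path)

rows = ["1|alice", "2|bob", "3|carol"]  # stand-in for exported rows

def producer():
    # Stand-in for FastExport: write delimited rows into the pipe.
    # Opening for write blocks until a reader opens the other end.
    with open(pipe_path, "w") as pipe:
        for row in rows:
            pipe.write(row + "\n")

t = threading.Thread(target=producer)
t.start()

# Stand-in for SQL*Loader: read rows from the pipe as they arrive.
with open(pipe_path) as pipe:
    loaded = [line.rstrip("\n") for line in pipe]

t.join()
os.remove(pipe_path)
os.rmdir(pipe_dir)
print(loaded)
```

The point of the FIFO is back-pressure: the writer stalls when the reader falls behind, so memory stays bounded even for very large exports, which is why the scale-out question comes down to how fast Oracle can ingest rather than how fast Teradata can export.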
Copyright 2016 - All Rights Reserved
Last Modified: 15 Jun 2023