I recently needed to transfer some data between Redshift databases. Coming from an Oracle background, I started my search for the Redshift equivalent of import/export, and found that there isn't one. However, using the unload and copy commands you can achieve the same result. There are quite a few flag options for these commands, but here are the ones that worked for me on a large set of tables.
To extract the data from a table called PRODUCT in the XYZ schema, run the following against the source database. You need an S3 bucket to store the export files.
unload ('select * from xyz.product') to 's3://my-bucket/product' CREDENTIALS 'aws_access_key_id=<your key here>;aws_secret_access_key=<your key here>' MANIFEST DELIMITER '|' NULL AS 'null_string' ESCAPE ALLOWOVERWRITE;
In your S3 bucket this will create a manifest file called productmanifest, which lists all of the export data files created (e.g. product0001_part_00 and product0002_part_00).
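For reference, the manifest is a small JSON file pointing at each data file. It typically looks something like this (the bucket and file names here are illustrative):

```json
{
  "entries": [
    {"url": "s3://my-bucket/product0001_part_00"},
    {"url": "s3://my-bucket/product0002_part_00"}
  ]
}
```

Loading via the manifest rather than a key prefix ensures the copy picks up exactly the files the unload produced, and nothing else that happens to share the prefix.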
On the target database, issue the following command to load the data.
copy xyz.product from 's3://my-bucket/productmanifest' CREDENTIALS 'aws_access_key_id=<your key here>;aws_secret_access_key=<your key here>' MANIFEST DELIMITER '|' NULL AS 'null_string' ESCAPE;
If the load fails, you can check for the cause in the STL_LOAD_ERRORS table.
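A quick way to see what went wrong is to pull the most recent entries from that table; a query along these lines (column selection is just a suggestion) usually surfaces the offending file, line and reason:

```sql
-- Most recent load errors, newest first
select starttime, filename, line_number, colname, err_reason
from stl_load_errors
order by starttime desc
limit 10;
```

The err_reason and line_number columns are normally enough to track the problem back to a specific row in the export file.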
All the code, all the fun – Ben
Ben writes blogs about the technical side of BI: the code, all the code and not much other than the code.
You can read Oracle Apex 5 on AWS Linux or all of Ben’s blogs here.
We run regular business intelligence courses in both Wellington and Auckland. Find out more here.