with scripted data migration

This is a shortened version; for the full version, refer to the original docx (Market and Credit Risk Enhancement – HK System Migration Guide).

1. Export HIVE Tables to HDFS

Export the tables to HDFS by entering the code below into the Hive query editor.
EXPORT TABLE <Table Name> TO '<HDFS Path>/<Table Name>';

<Table Name> is the name of the table to be exported, for example, wrk_rda_business_date_jp_sg.
<HDFS Path> is the path in HDFS to store the exported tables, for example, /temp.

2. Download HIVE Tables from HDFS

Download the tables in HDFS to the host by the following command.
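The EXPORT statement has to be run once per table. As a convenience, the sketch below prints one statement per table so the output can be pasted into the Hive query editor; the table list, the `gen_exports` helper name, and the /temp path are illustrative, not part of the original guide:

```shell
#!/bin/sh
# Print one EXPORT statement per table. The table list and /temp path
# are examples; extend TABLES with the real tables to migrate.
TABLES="wrk_rda_business_date_jp_sg hist_ext_rda_bilateral_sg"
HDFS_PATH="/temp"

gen_exports() {
  for t in $TABLES; do
    echo "EXPORT TABLE $t TO '$HDFS_PATH/$t';"
  done
}

gen_exports
```

The generated statements can be reviewed first and then run in Hue, which matches the guide's manual step but avoids typing each table name twice.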
hdfs dfs -get <HDFS Path> <Host Path>

<HDFS Path> is the path in HDFS storing the tables to be downloaded, for example, /temp.
<Host Path> is the path in the host to store the downloaded tables, for example, /home/oracle/tmp.

The tables are now stored in a folder under the specified host path, for example, /home/oracle/tmp/temp. Compress the table files into a single file by the following command.
tar -cvzf <File Name> <Host Path>

<File Name> is the name of the compressed file. It should end with ".tarz", for example, temp.tarz.
<Host Path> is the location of the table files to be compressed.

For example:

tar -cvzf temp.tarz /home/oracle/tmp/temp

The compressed file is stored in the current working directory.

3. Upload HIVE Tables to HDFS

Using FileZilla, connect to the host storing the compressed file (".tarz" file), for example, hkbdaat13lv. Copy the ".tarz" file from the host to a local directory (create the directory if necessary). Then connect to the host where the tables should be uploaded, for example, sgbdaat02, and copy the ".tarz" file from the local directory to that host.

Using Putty, connect to the host where the ".tarz" file was copied, for example, sgbdaat02. Log in to the host with the "oracle" user ID and navigate to the directory storing the ".tarz" file. Unzip the ".tarz" file using the following command.
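The download and compress steps above can be wrapped in one helper. The sketch below is a dry run: it only prints the hdfs and tar commands it would execute, so they can be reviewed before running on a host with HDFS access. The `pack_tables` name is hypothetical and the paths are the guide's examples:

```shell
#!/bin/sh
# Dry-run sketch: print the download and compress commands for review.
# Remove the echo wrappers to actually execute them on a host with HDFS access.
pack_tables() {
  hdfs_path="$1"                   # e.g. /temp
  host_path="$2"                   # e.g. /home/oracle/tmp
  name=$(basename "$hdfs_path")    # downloaded folder name, e.g. temp
  echo "hdfs dfs -get $hdfs_path $host_path"
  echo "tar -cvzf $name.tarz $host_path/$name"
}

pack_tables /temp /home/oracle/tmp
```

Printing the commands first keeps the helper safe to run anywhere; on the real host the same two lines are executed as shown in the guide.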
tar -xvf <File Name>

<File Name> is the ".tarz" file name, for example, temp.tarz.

The ".tarz" file is unzipped to a folder with the same name as the file under the current working directory, for example, the "temp" folder. Upload the unzipped table files to HDFS by the following command.
hdfs dfs -copyFromLocal <File Name> <HDFS Destination Path>

<File Name> is the name of the table file stored in the host, for example, "hist_ext_rda_bilateral_sg".
<HDFS Destination Path> is the HDFS path to store the table files, for example, /user/odi_sgbdaat03/zztest.

Run the command for all the tables to be uploaded. For example:

hdfs dfs -copyFromLocal hist_ext_rda_bilateral_sg /user/odi_sgbdaat03/zztest
hdfs dfs -copyFromLocal hist_ext_rda_bond_bbg_dl_sg /user/odi_sgbdaat03/zztest
hdfs dfs -copyFromLocal hist_ext_rda_bond_bbg_sg /user/odi_sgbdaat03/zztest
hdfs dfs -copyFromLocal hist_ext_rda_bond_coupon_type_sg /user/odi_sgbdaat03/zztest
hdfs dfs -copyFromLocal hist_ext_rda_bond_day_count_convention_sg /user/odi_sgbdaat03/zztest
…

The tables are now uploaded to HDFS. Navigate to HDFS in Hue to check the result.

4. Import HIVE Tables from HDFS

Create a database for staging the tables to be imported using the following command.
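Running copyFromLocal once per table is repetitive; the loop below prints the full command list from a list of table files, which can be reviewed and then executed on the host. The `upload_cmds` helper name is hypothetical; the table names and destination path are the guide's examples:

```shell
#!/bin/sh
# Print one copyFromLocal command per table file; pipe the output to `sh`
# on the host once it looks right.
upload_cmds() {
  dest="$1"
  shift
  for f in "$@"; do
    echo "hdfs dfs -copyFromLocal $f $dest"
  done
}

upload_cmds /user/odi_sgbdaat03/zztest \
  hist_ext_rda_bilateral_sg \
  hist_ext_rda_bond_bbg_dl_sg \
  hist_ext_rda_bond_bbg_sg
```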
CREATE DATABASE [IF NOT EXISTS] <Database Name>;

<Database Name> is the name of the staging database to be created, for example, rda_dm.

Import the tables stored in HDFS to HIVE using the following command.
IMPORT TABLE <Table Name> FROM '<HDFS Path>';

<Table Name> is the name of the table stored in HDFS to be imported. Prefix it with the database name to make sure the correct table is used, for example, rda_dm.hist_ext_rda_bilateral_sg.
<HDFS Path> is the HDFS path storing the table to be imported, for example, /user/odi_sgbdaat03/zztest/hist_ext_rda_bilateral_sg.

5. Insert data to target HIVE table

Insert the staged data into the tables of the target database using the following command.
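As with the upload, the IMPORT statement is run once per table. The sketch below prints one statement per table with the database prefix applied; `import_stmts` is a hypothetical helper name, and rda_dm and the zztest path are the guide's examples:

```shell
#!/bin/sh
# Print one IMPORT statement per table, prefixed with the staging database.
import_stmts() {
  db="$1"      # staging database, e.g. rda_dm
  base="$2"    # HDFS path holding the exported tables
  shift 2
  for t in "$@"; do
    echo "IMPORT TABLE $db.$t FROM '$base/$t';"
  done
}

import_stmts rda_dm /user/odi_sgbdaat03/zztest \
  hist_ext_rda_bilateral_sg hist_ext_rda_bond_bbg_sg
```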
INSERT OVERWRITE TABLE <Destination Table Name> SELECT * FROM <Source Table Name>;

<Destination Table Name> is the name of the table the data is imported to. Prefix it with the database name to make sure the correct table is used, for example, rda_dem.hist_ext_rda_bilateral_sg.
<Source Table Name> is the name of the staging table storing the data to be imported. Prefix it with the database name, for example, rda_dm.hist_ext_rda_bilateral_sg.

Run the command for all the tables to be imported. In this example, the data in rda_dm is migrated to rda_dem. The data is now successfully migrated.
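The final INSERT OVERWRITE step also repeats per table; the sketch below prints the migration statements from the staging database to the target database. The `migrate_stmts` name is hypothetical; rda_dm, rda_dem, and the table name are the guide's examples:

```shell
#!/bin/sh
# Print one INSERT OVERWRITE statement per table, moving data from the
# staging database to the target database.
migrate_stmts() {
  src_db="$1"    # staging database, e.g. rda_dm
  dst_db="$2"    # target database, e.g. rda_dem
  shift 2
  for t in "$@"; do
    echo "INSERT OVERWRITE TABLE $dst_db.$t SELECT * FROM $src_db.$t;"
  done
}

migrate_stmts rda_dm rda_dem hist_ext_rda_bilateral_sg
```

Keeping the source and target database names as parameters makes the same script reusable for the next environment pair.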