aidan2474
Member
Hi,
As space continues to become more of a concern, I have been tasked with overhauling their Development environment refresh program.
What I have done is create one UDC table of files that can be completely deleted, and a second UDC table of files that need a specific amount of history kept.
A processing option will be populated with a date that determines how far back to keep the data in every file listed in the second UDC table.
Now for the logic on how to keep the historical data for a file (i.e. F0911) in Dev:
1. Create a duplicate object in another library (QTEMP) without any data.
2. Use the Copy File (CPYF) command to copy only the records where the keyed date field is *GE the processing option's historical date.
3. Run Display Database Relations (DSPDBR) to an outfile, then delete all the logical files listed in that outfile.
4. Clear the physical file in the original data library.
5. Run an SQL statement to insert the records from the QTEMP file back into the original data file.
6. Delete the QTEMP file.
7. Submit a job to re-create the logical files from the DSPDBR outfile. (I made this a SBMJOB so as not to hold up the main job.)
8. Get the next file from the UDC table and loop back to step 1.
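For reference, the per-file loop above might look something like this in CL. This is only a rough sketch, not the actual program: the library name DEVDTA, the date field GLDGJ, the literal cutoff date 120001, the DBROUT outfile name, and the helper program RCRTLF are all assumptions for illustration.

```
/* Sketch of the per-file loop for F0911. DEVDTA, GLDGJ, the    */
/* cutoff value, DBROUT, and RCRTLF are illustrative only.      */

/* 1. Duplicate the file into QTEMP with no data */
CRTDUPOBJ  OBJ(F0911) FROMLIB(DEVDTA) OBJTYPE(*FILE) +
             TOLIB(QTEMP) DATA(*NO)

/* 2. Copy only records at or after the historical date */
CPYF       FROMFILE(DEVDTA/F0911) TOFILE(QTEMP/F0911) +
             MBROPT(*REPLACE) INCREL((*IF GLDGJ *GE 120001))

/* 3. Capture the dependent logical files, then delete them */
DSPDBR     FILE(DEVDTA/F0911) OUTPUT(*OUTFILE) +
             OUTFILE(QTEMP/DBROUT)
/* ...read DBROUT and DLTF each dependent logical file...      */

/* 4. Clear the physical file */
CLRPFM     FILE(DEVDTA/F0911)

/* 5. Insert the kept history back into the original file */
RUNSQL     SQL('INSERT INTO DEVDTA.F0911 +
             SELECT * FROM QTEMP.F0911') COMMIT(*NONE)

/* 6. Drop the QTEMP copy */
DLTF       FILE(QTEMP/F0911)

/* 7. Re-create the logicals in a separate job */
SBMJOB     CMD(CALL PGM(RCRTLF) PARM('F0911')) JOB(RCRTLF)
```

In a real program the cutoff in step 2 would come from the processing option rather than a literal, and step 3 would read through DBROUT in a loop rather than the placeholder comment shown here.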
My question: would it be faster to re-create the logical files right after the physical file is cleared, and then SQL-insert the data back into the physical file? Or should I keep it the way I have it, with the logical files created in a separate job so they don't hold up the main job looping through the UDC table of files?
Any input is appreciated!
Thanks,
Frank