Batch load not terminating on Server

janemcs123
Hi All,

I was wondering if anyone could help me. I have written a custom UBE to export large volumes of data to custom Z files to then be loaded into an external system. The UBE extracts records from F0101, F0006 and F4801 in turn, so very large volumes involved.
For each record the UBE checks whether the record exists in the custom Z file and if it doesn't it inserts it. Very simple stuff!
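The logic described above can be sketched in Event Rules-style pseudocode. This is only an illustration of the pattern: the Z-file name F59xxZ1 and the key check are hypothetical placeholders, not the actual custom table.

```
// Per source record: check the custom Z file and insert only if missing
Open F59xxZ1                           // hypothetical custom Z file
While Fetch Next F0101 is successful   // same pattern repeats for F0006, F4801
   Fetch Single F59xxZ1 (by key)       // does this record already exist?
   If SV File_IO_Status is not equal to CO SUCCESS
      Insert F59xxZ1                   // not found, so insert it
   End If
End While
Close F59xxZ1
```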

Obviously, though, this involves a hell of a lot of table I/O, and I have had lots of performance issues with it. Run locally, it was terminating unexpectedly, and I would have to run it 3 or 4 times before it would completely load all three tables. On the server, it would load all three tables but then wouldn't complete and would have to be terminated.

I've since modified the program to use buffered inserts. This made a huge difference locally, and the report loaded all 3 tables in around 25 minutes - much faster than previously with no errors or problems.

Great! I thought, but when I ran it on the server this morning, it still loaded all the tables (a little faster), but failed to complete again and had to be terminated.

Has anyone got any ideas as to why this might be happening, or what I can do to prevent it? Incidentally, this only happens on the initial load, with huge volumes of inserts involved. All subsequent runs, where only NEW data is loaded, are fine.

Thanks in advance (and sorry it's so long)
 
Code attached....
 

Attachments

  • 151574-R570002Z1_081009.txt (25.9 KB)
Hi Jane,
1. How did you link the three different sections?
2. Personally I would try to keep things simple: load one table at a time - that is, UBE1 loads Table1, UBE2 loads Table2 ... (it makes troubleshooting a bit easier).
3. Instead of fetching each record of your source table (low performance), I would use a view over both tables (then Insert the missing records and Update the existing ones).
4. What does your jde.log say?
Let's start from here, for now, and see where we get.
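Suggestion 3 above might look roughly like this in Event Rules-style pseudocode. The business view name V59xxA and Z-file name F59xxZ1 are hypothetical, just to show the shape of the approach:

```
// Drive the section over a business view joining the source table
// to the Z file, then branch on whether a matching Z record was found
Section driven by V59xxA (F0101 left-joined to F59xxZ1)
   If the Z-file key column is <Null>   // no match came back in the join
      Insert F59xxZ1                    // record is missing, insert it
   Else
      Update F59xxZ1                    // record exists, refresh it
   End If
End Section
```

This replaces one Fetch Single per source record with a single pass over the join.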
 
Jane,

what indices are on the custom Z table?

Looking at the program and the tables involved, I wouldn't expect a lot of I/O; the only table with any significant number of rows in it is the F4801, and even that shouldn't be very big with your selection parameters. So how many rows are we talking about?
 
Hi Larry,

We have roughly:

160,000 address book records
60,000 business units
130,000 work orders

So pretty big volumes.
 
Hi Adrian,

Thanks for looking. To answer your questions:

1) I don't, they just run sequentially one after the other
2) I know, but the receiving app people are obsessed with ensuring that the data is always complete and up-to-date, i.e. customer and business unit data upload along with the new work orders relating to them.
3) I don't fetch each record from the source table; I am doing a fetch on the destination table to see whether the record exists or not.
4) Haven't got that far yet.

To be honest, this only happens for the first-ever load, which will only happen ONCE in the live scenario. We know it populates all the records, even if it won't finish successfully. For that reason I don't want to expend too much coding time to fix it. It's just one of those annoying things which bugs me!!

I appreciate your advice, and what you describe is coding best practice. Had I started with a clean slate I would have done it better, but this is one of those jobs which evolved without a spec and has turned into a monster!!
 
Hi,

I think "Flush Insert Buffer" may not be required in the End Section event before the "Close" table operation.

Just to check: have you opened your table with "Buffer Inserts" ON in the Init Section?

Checking the log would help you. See what the jde.log says.
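Putting the points above together, a rough sketch of the buffered-insert lifecycle in Event Rules-style pseudocode (F59xxZ1 again being a hypothetical Z-file name):

```
// Init Section:
Open F59xxZ1 with "Buffer Inserts" ON   // enable buffering at open time

// Do Section, once per record:
Insert F59xxZ1                          // rows accumulate in the buffer

// End Section, before closing the table:
Flush Insert Buffer F59xxZ1             // write any rows still buffered
Close F59xxZ1
```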
 