Receiving a target exception when inquiring on a 19,000-line transaction

James Monroe

Active Member
We are on E1 9.0, tools 9.1.4.7.

Recently, when trying to inquire on a large F0911Z1 transaction, the inquiry freezes at around 12,000 lines and then returns an error: 'Unknown Exception: class java.lang.reflect.InvocationTargetException->null'.

The app in question is the Store/Forward Journal Entry app, P0911Z1. We have an interface process that loads F0911Z1 with GL transactions from another system. In this case the incoming file is over 19,000 lines, which creates a single transaction in F0911Z1. However, there are errors such as business unit setup or missing accounts, so the user has to inquire on the transaction and fix them. When bringing up the transaction in P0911Z1, it begins reading and shows its progress. At about 12,000 lines (sometimes 13,000 or 14,000, but around there) it freezes and then returns the Unknown Exception error noted above.

I've also attached the agent and root logs from the web server. We use Oracle WebLogic on a Windows box.

From the logs, around 7:48 pm on 11/14/18, it appears there may be a memory issue.

Any thoughts on where to look and which memory limits might be changed to address this would be appreciated.

Thanks

Attachments: e1root_20181113_0.txt, e1agent_011142018.txt
 
You are out of memory. How much memory do you allow Java to have? In WLS you'll find it in the 'Server Start' tab under 'Arguments'. If you are already at 4G, you will probably need to make a smaller transaction.
 
A 19,000-line transaction tells me you are going about this wrong. I’ve converted millions of records, but they were always broken down into meaningful batches. It’s far easier to handle thousands of small batches, especially when it comes to fixing the errors.

Tom is correct; smaller sets of transactions are the way to go.
 
I’ve converted millions of records, but they were always broken down into meaningful batches. It’s far easier to handle thousands of small batches, especially when it comes to fixing the errors.

I want to point out some of the reasons for doing this, based on my experience:

If automated programs can convert the data into meaningful small batches, this makes sense. Usually this involves rolling up lines for the same account/subledger/subledger type combinations. But splitting a batch into smaller batches can become challenging when you have to balance out each Journal Entry, especially for automated programs (see the sketch below). Some clients also want to maintain the "pristineness" of the data as it comes from the original source and don't want it manipulated in any form. And they may want an "all-or-nothing" approach to getting the data into E1: if the data is brought in as one batch, all of it is committed or rejected.

Lastly, 19,000 lines seems like a lot of data, but it is 2018; we should be able to handle this kind of volume today without too much hassle. It is not uncommon for our users to upload 1,000- to 5,000-line entries through our software (automated or through Excel). We even had a user upload a JE batch of 40+ Journal Entries with a total of 69,000+ lines (and fix all the errors directly in Excel)!
 
Hari,

While I agree in theory, sometimes reality steps in. Perhaps with 64-bit Java (see recent announcements), this limitation will increase. For now, if you know of a way around this, please let us know.

James, one other idea: you could try setting up a dedicated web instance for this. Perhaps if no one else is on, there will be enough memory.

Tom
 
James -

A better way around this is NOT to use P0911Z1 (you may still come across the 10,000-line limitation there). Use the table browser to export ALL the data to Excel and analyze it there. An even better way is to hook Excel to F0911Z1 directly and pull the data from there (assuming your CNC will give you read access to this table).

Correction - the fetchRowsThreshold setting is only used to display a warning in the log file; it will not let you fetch more rows. I've removed my comments pertaining to this setting.


Tom -

Perhaps with 64-bit Java (see recent announcements), this limitation will increase.

The HTML Server is already at 64-bit, using 64-bit Java, to my knowledge. Please correct me if this is not the case.


While I agree in theory, sometimes reality steps in.

I am addressing the reality of what clients are faced with, no offense to anyone. All I am saying is that users don't always have the leeway to split the data into smaller chunks due to technical and/or business reasons.
 
Hari,

Not to start an argument, but "I am addressing the reality of what clients are faced with, no offense to anyone. All I am saying is that users don't always have the leeway to split the data into smaller chunks due to technical and/or business reasons."

Sometimes users want things that are technically impossible (not saying this is one of them). Part of our job is to let them know when they reach system limits.

I like the answer above about exporting/importing; it gives the users a way around these limits. If doing the export/import is unacceptable, then they will need to live within the limits.

Tom
 