How to see the real processing time of a UBE in mono-threaded queues

antoine_mpo

Hi list,

Has anyone set up something to get the real processing time of UBEs?
I mean: when you submit a UBE, a new record is added to F986110 with a submitted date and time.
When the UBE finishes (Done or Error), the last activity date and time is updated.
But if you calculate last activity minus date/time submitted, you don't know how long it stayed in "P" (Processing) status. If a UBE has been running for one hour in the mono-threaded queue before it, the next UBE will show at least one hour in that calculation, even if it only took one second to run.

I thought of creating a custom table and a trigger on F986110 to record the processing time.
Has anyone already done that? (or any other solution?)
But my fear is the performance impact... In production, we run more than 10,000 UBEs per day!! (not all in the same queue of course! :)

If we had that kind of statistics, it would be easier to set up queues and balance UBEs across the different queues based on how long they really take.

Any help would be appreciated.

Cheers,
 
How do you do?
I think you can work it out by considering the following example:


QUEUE  UBE  Submitted  Last Activity  Processing Start
Q1     R0     ...         15:17       -> ?
Q1     R1    15:12        15:18       -> 15:17
Q1     R2    15:13        15:30       -> 15:18
Q2     R4    15:19        15:29       -> 15:19
Q1     R3    15:14        15:32       -> 15:30
Q1     R1    15:17        15:33       -> 15:32
Q1     R1    15:22        15:34       -> 15:33
Q2     R4    15:20        15:40       -> 15:29
Q1     R5    15:35        15:42       -> 15:35
Q1     R1    15:36        15:43       -> 15:42


Assume there is a large number of runs of UBEs R1, R2, etc., which makes it possible to
compute averages... and to reduce the effects of multiprocessing.

For each batch on a given queue, take as the processing start date the "submitted" date
if it is later than the "Last Activity" date of the previous batch;
otherwise take the "Last Activity" date of the previous batch, and so on (see the SQL sketch after the results below).

R1 = about 1 min
R2 = 12 min
R3 = 2 min
R4 = about 10 min
R5 = 7 min
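
To turn that rule into something queryable, something along these lines might work. This is only a sketch: it assumes a view V_JOBTIMES that already exposes one row per job with real DATE columns SUBMITTED_DT and LAST_ACT_DT, the report name in UBE and the queue in JOBQUE. The real F986110 stores dates as CYYDDD Julian numbers and times as HHMISS numbers, so a conversion layer is needed first.

-- Processing start = the later of "submitted" and the previous job's last activity
-- within the same queue; run time is then last activity minus processing start.
-- (A JDE Julian date converts back with e.g. TO_DATE(TO_CHAR(jcsbmdate + 1900000), 'YYYYDDD').)
SELECT jobque,
       ube,
       submitted_dt,
       last_act_dt,
       GREATEST(submitted_dt,
                NVL(LAG(last_act_dt) OVER (PARTITION BY jobque ORDER BY last_act_dt),
                    submitted_dt)) AS processing_start_dt,
       (last_act_dt
        - GREATEST(submitted_dt,
                   NVL(LAG(last_act_dt) OVER (PARTITION BY jobque ORDER BY last_act_dt),
                       submitted_dt))) * 24 * 60 AS run_time_minutes
FROM   v_jobtimes
ORDER  BY jobque, last_act_dt;

Averaging run_time_minutes per UBE then gives figures like the R1..R5 estimates above.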

What do you think?

Regards,

Sylvain
 
I'm faced with the same problem. I also have a server which processes more than 10K jobs per day, and I want to put together some kind of performance metrics. The problem with reporting against the start and end time on the job is that it doesn't take into account the actual start time of the runbatch or runube process on the server. The UBE log is the only place I know of that records this information. If you do a search on the developer forum, you will find some interesting results. There is a business function you can use which will allow you to report against the server map F986110 table. The problem is that this table is hard-coded into the application as a system table: it won't take you to the server map F986110, which causes OneWorld reports to return zero data.

Anyone have a solution they have developed for reporting purposes?
 
Charles, Antoine,

The only efficient way I can think of is to use table triggers and put the information into a custom table, as Antoine suggested. It should be pretty simple. Just use Execution Host and Job Number as your primary key. When a job goes into processing mode (job status 'P'), insert a record into your custom table. When the job terminates (job status 'E' or 'D'), update your record in the custom table by searching on execution host and job number. You both say that you have 10K+ UBEs a day; however, you do not mention whether all of these jobs run on one server or on multiple servers. If it's multiple servers, then you are probably mapping the Job Control Status Master table, F986110, to different libraries/databases. Table triggers are really easy to activate and deactivate. Just activate your trigger for one specific server, the one that is the least taxed.
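
A minimal sketch of what that custom table could look like, assuming Oracle. The table name, column types and the extra date columns are made up; only JCEXEHOST, JCJOBNBR and JCJOBQUE are taken from F986110, and the SAUV schema name is borrowed from later in this thread.

-- One row per job, keyed on execution host + job number as Jim suggests
CREATE TABLE sauv.f59jobtrk
( jcexehost  VARCHAR2(15),   -- execution host (from F986110)
  jcjobnbr   NUMBER,         -- job number (from F986110)
  jcjobque   VARCHAR2(10),   -- job queue
  submit_dt  DATE,           -- when the job row appeared in F986110
  start_dt   DATE,           -- when the status changed to 'P'
  end_dt     DATE,           -- when the status changed to 'D' or 'E'
  end_sts    CHAR(1),        -- final status
  CONSTRAINT f59jobtrk_pk PRIMARY KEY (jcexehost, jcjobnbr)
);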

Just an idea,
 
Thank you, Sylvain!

On the other hand, technically, how do you do that? Do you run a PL/SQL routine? (because to find the previous batch you have to get it from the job number?) Or a series of queries through Access?

It doesn't look trivial to me, but what is trivial in JDE? :)

Otherwise, what do you think of my trigger approach?

Cheers,
 
Hi Jim,

Thanks for your answer.
In my case, all the UBEs run on the same server, but there are 2 JDE kernels running.
But we will probably change the architecture, as we face real performance issues. We are studying solutions with a single kernel on multiple servers (one per subsidiary), and maybe dedicated server(s) for UBEs.

About the triggers: I'm not really used to using them. Is it better to create them with JDE or directly in the Oracle database?
When you create one with JDE, it generates one in the Oracle database, doesn't it?

And what do you think of updating the date/time submitted directly in the F986110 table when the status goes to "P", instead of creating a custom table, especially when you have many UBEs!!

Cheers,
 
Antoine,

I would create the triggers directly in Oracle. JDE table triggers, also known as TER, are not reliable... search the list for TER and you will come across MANY discussions on that subject. TER in OneWorld do not create table triggers in Oracle. I would definitely go with a custom table and not modify the Date/Time Submitted in the F986110. The values in the F986110 are good for knowing how long a job took to run, from submission date/time to finished, regardless of how long it stayed in wait status.

Just my 2 cents CAD,

By the way, I'm not an Oracle expert, but this kind of thing is very doable in SQL Server. I have a server dedicated to batch processing for interfaces between OW and an in-house system, and it handles more than 1000 UBEs per day. I created a trigger on F986110 directly in SQL Server to dynamically manage the processing load across different job queues. It's a way of doing load balancing of the jobs. For those 1000+ UBEs, I have not observed any slowdown. Good luck.
 
No, the 10K jobs are just on one server. We have a couple of other servers which also run a large number of jobs per day. Those are NT based and run about three or four thousand jobs per day.
 
Charles,

I have three batch servers in my setup. All of them are NT machines. I have one specific server for running UBEs that interface information with our in-house AS/400 system. This particular server processes 1000+ jobs a day. I have created table triggers in SQL Server to load-balance these jobs across different job queues on the fly. I know that we are nowhere near you guys in terms of number of batches per day; however, I have not noticed any kind of slowdown after trigger implementation versus before, for these 1000+ jobs. I would suggest starting with the server that processes the least number of jobs, just to be sure, for benchmarking purposes.

Good luck to you,
 
Hi Jim,

I ran some tests and I'm a bit confused.
I created a custom table (which is a copy of F986110).
I put the following trigger on the F986110 table in the server map of one of our servers:
CREATE OR REPLACE TRIGGER SAUV.testf986110
BEFORE UPDATE OF JCJOBSTS
ON svm7333.f986110
FOR EACH ROW
WHEN (NEW.JCJOBSTS = 'P')
BEGIN
  -- Copy the job identification into the custom table, stamping the current
  -- date (JDE Julian format) and time as the moment the job went to 'P'
  INSERT INTO SAUV.f59jobcs (JCEXEHOST, JCJOBNBR, JCJOBQUE, JCENHV,
                             JCUSER, JCSBMDATE, JCSBMTIME, JCFNDFUF2)
  VALUES (:NEW.JCEXEHOST, :NEW.JCJOBNBR, :NEW.JCJOBQUE, :NEW.JCENHV,
          :NEW.JCUSER,
          to_number(concat('1', to_char(sysdate, 'YYDDD'))),
          to_char(sysdate, 'HH24') || to_char(sysdate, 'MI') || to_char(sysdate, 'SS'),
          :NEW.JCFNDFUF2);
END;

The trigger is working OK, because the records are inserted into my custom table, but it creates a strange issue with JDE!
When the UBE is finished, the status in the F986110 table is E (Error) instead of D (Done), even though the UBE finished successfully (you can retrieve the PDF report, and you cannot retrieve any logs, as you can when it really goes into error).

It's really a problem, because it means I cannot use my trigger!
Does anyone have any idea what's wrong?
Did I make a mistake in my trigger code, or is there a conflict with the way JDE handles the submitted jobs table?

Thanks,

Cheers,
 
Antoine,

I do not know Oracle, but your trigger seems just fine to me. Don't you also want to update your custom table when NEW.JCJOBSTS is equal to 'D' or 'E', so you can figure out how much time your job took to execute? As for your jobs ending in error: is this happening for ALL jobs?
 
Hi Jim,
Yes, I want to update my custom table when the status changes to 'D', but I haven't created that part yet, as I found the issue when I tested the first part with the record insert.

I didn't test every batch, but I think the problem indeed occurs for every batch (I tested with standard batches). And if I disable the trigger and launch the same batch, the status correctly goes to 'D'.
Really strange, no?
 
OK, I agree, there is nothing trivial about it...

Regarding the trigger solution, there is no point wondering whether it will hurt server performance:
yes, of course it will, but so what, since you are going to improve the balancing of the queues!

The CPU time taken by a trigger (JDE or not, although I suppose you won't be able to create a database trigger because of the outsourcing arrangement...) should be negligible; I am less sure about the database time. In my opinion you should not underestimate the "weight" of each record in the F986110 table, because it contains BLOBs. A (regular) purge of this table would be welcome before putting a trigger in place.

Finally, I know you can easily test this solution on your old enterprise server, so don't hesitate (watch out for the OCM issue with the Server Maps...).

Sylvain
 
Hi Antoine,

Does your table SAUV.f59jobcs have a unique key constraint?
I mean, regarding your trigger, what would happen if JDE tried to update the "last activity" of a record and also set the status to 'P' even though it is already 'P' (you know that JDE can do that)?
That should give a unique key violation, shouldn't it?

Regards

Sylvain
 
Antoine,

I think Sylvain is on to something. There must be some kind of error occurring during trigger execution, causing the transaction to fail or something. Maybe an Oracle guru on the list can review your trigger and shed some light. Any goers?
 
Antoine,

Add an Oracle date/time stamp field to the table and make it part of your primary key / unique constraint. Your problem should then go away...
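
A minimal sketch of that change against the custom table used in the earlier trigger, assuming it currently has a unique constraint named F59JOBCS_PK on (JCEXEHOST, JCJOBNBR); the TRGSTAMP column name and the constraint name are made up, so adjust to match the real table.

-- Add a timestamp column that Oracle fills in automatically
ALTER TABLE sauv.f59jobcs ADD (trgstamp DATE DEFAULT SYSDATE);

-- Rebuild the unique constraint so the stamp is part of the key,
-- allowing one row per status change instead of one row per job
ALTER TABLE sauv.f59jobcs DROP CONSTRAINT f59jobcs_pk;
ALTER TABLE sauv.f59jobcs
  ADD CONSTRAINT f59jobcs_pk UNIQUE (jcexehost, jcjobnbr, trgstamp);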
 
Hi Larry,

I'm going to try what you suggested. But I've got an issue. I made a copy of the F986110 table in a backup tablespace and set up the OCM mapping of F986110 for my user to point to that copy (I created the OCM mapping both in the System data source and in the server map). But when I submit a UBE on the server, no records are added to my copy of the table; they are still inserted into the server map table. Did I miss something in my mappings?

My other question is:
if I put a date/time stamp in the key, how will I update the record in my custom table when the status is D, i.e. update the end date and time?
Can I put a test in my trigger like:
if exists (select 1 from [customtable] where [key = values of F986110]) then update [custom table] ...
else
insert into [custom table] ....
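
In PL/SQL trigger terms, a common way to express that test (just a generic sketch, not JDE-specific, and reusing the SAUV.F59JOBCS table and JC* columns from the earlier trigger) is to try the UPDATE first and fall back to an INSERT when no row was touched:

BEGIN
  -- Try to update an existing row for this job first
  UPDATE sauv.f59jobcs
     SET jcjobsts = :NEW.jcjobsts   -- plus whatever end date/time columns you track
   WHERE jcexehost = :NEW.jcexehost
     AND jcjobnbr  = :NEW.jcjobnbr;

  -- No row updated: this is the first time we see the job, so insert it
  IF SQL%ROWCOUNT = 0 THEN
    INSERT INTO sauv.f59jobcs (jcexehost, jcjobnbr, jcjobsts)
    VALUES (:NEW.jcexehost, :NEW.jcjobnbr, :NEW.jcjobsts);
  END IF;
END;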

Thanks for your help.
 
Antoine,

1. F986110 is one of those "special" system tables that are not governed by OCM mappings.

2. Regarding what you're trying to do... you need to stop and think about this a bit. You got into problems because you were trying to insert the same row multiple times with your DB trigger. The quick way out of that, which we suggested, was to change the unique constraint on your copied table to allow multiple inserts/rows. By doing this you have a separate row (and date/time) for each status change on the job, and your report/query would then need to collapse those together. If you don't want to do things that way, then I suggest you redesign the table to NOT be an exact copy of F986110, but instead to contain only the fields you need. The unique constraint only needs the EXEHOST and JOBNBR columns in it. Then change your trigger to INSERT into your custom table when an insert occurs on F986110, and UPDATE the custom table when an update occurs on F986110. You decide which status changes you want to record time stamps on and allow for them in your table design.
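
A rough sketch of that redesign, reusing the slim tracking table outlined earlier in this thread (SAUV.F59JOBTRK, a made-up name) and assuming the server map F986110 lives in the SVM7333 schema as in Antoine's trigger:

-- One row per job, created when the job first appears in F986110
CREATE OR REPLACE TRIGGER sauv.trg_f986110_ins
AFTER INSERT ON svm7333.f986110
FOR EACH ROW
BEGIN
  INSERT INTO sauv.f59jobtrk (jcexehost, jcjobnbr, jcjobque, submit_dt)
  VALUES (:NEW.jcexehost, :NEW.jcjobnbr, :NEW.jcjobque, SYSDATE);
END;
/

-- Update that same row on the status changes you care about
CREATE OR REPLACE TRIGGER sauv.trg_f986110_upd
AFTER UPDATE OF jcjobsts ON svm7333.f986110
FOR EACH ROW
BEGIN
  IF :NEW.jcjobsts = 'P' THEN
    -- Record the processing start; keep only the first transition to 'P'
    UPDATE sauv.f59jobtrk
       SET start_dt = SYSDATE
     WHERE jcexehost = :NEW.jcexehost
       AND jcjobnbr  = :NEW.jcjobnbr
       AND start_dt IS NULL;
  ELSIF :NEW.jcjobsts IN ('D', 'E') THEN
    -- Record the end time and final status
    UPDATE sauv.f59jobtrk
       SET end_dt  = SYSDATE,
           end_sts = :NEW.jcjobsts
     WHERE jcexehost = :NEW.jcexehost
       AND jcjobnbr  = :NEW.jcjobnbr;
  END IF;
END;
/

Because the triggers only read :NEW values and write to a separate table, they avoid the duplicate-insert problem that caused the earlier jobs to end in error.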

Regards,
 