Job Entry BO updateable BAQ performance


(Brandon Anderson) #1

I’ve got to be doing something wrong here. I need to make a UBAQ on some UD fields in the JobMtl and JobAsm tables. While testing, anything I do through the JobEntry UpdateExt BO takes at least 30 seconds per row. I removed everything related to the UD fields and am just testing with the JobHead.UserChar1 field. The JobEntry Ext BO seems to require the JobMtl, JobAsm, and JobHead tables. I only have one updatable field set up, and I disabled any BPMs that have anything to do with Jobs. Still very slow.

Is there something else besides the JobEntry UpdateExt BO that I should be using? Do I need to tie more fields together? Fewer? Add filters somewhere? Any help or ideas would be appreciated.

Correction: I timed it, and it takes over two minutes.


(Brad Fraser) #2

I have an updatable BAQ on the same tables, and it always generates a “Not Responding” in the dashboard window title and takes thirty to sixty seconds for a change to the material quantity on 10 rows. Hoping for a tip on increased performance here as well.


(Tanner Post) #3

When moving from 10.0.700 to 10.1.600, we found the same problem with updatable BAQs referencing UD fields on the Job tables. We ended up converting them to external updatable BAQs because the performance was so bad and support would not acknowledge the issue.


(Brandon Anderson) #4

That doesn’t sound good for me!

So let’s try a different approach. Due to the performance issues with getting a UBAQ to work with Job Entry, I’m exploring the idea of populating a UD table with the few requisite pieces of data I need to define some packaging rules for assemblies. I know that I can get the information in there with DMT and just run updates manually, but what’s the fun in that! What’s the best way to get the tables populated automatically? I need the job number, assembly sequence, material sequence (for materials), part number, description, required quantity, and the related opcode from the JobMtl and JobAsm tables. My hope is that if I use a UD table, the BO for the UBAQ should be a lot faster, especially since I am not trying to change anything related to native Epicor fields anyway, so there shouldn’t be any fancy rules that have to be followed.

I’m going to start with just DMT and see if the UBAQ performance is acceptable; if it is, then hopefully I can set up some rules that help keep the tables in sync.
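As a sketch of what that DMT load might look like: a UD table only gives you generic Key1–Key5, CharacterNN, and NumberNN columns, so the mapping below is my own assumption (not an established convention), and the company ID and values are made up.

```csv
Company,Key1,Key2,Key3,Character01,Character02,Number01,ShortChar01
MYCO,012345,0,10,PART-001,Widget bracket,4,20
```

Here Key1 = job number, Key2 = assembly sequence, Key3 = material sequence, Character01 = part number, Character02 = description, Number01 = required quantity, and ShortChar01 = related opcode. Keying on job/assembly/material keeps each UD row uniquely tied to one JobMtl row, which should make the sync rules simpler.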


(Brandon Anderson) #5

@aidacra @Bart_Elia And anyone else Epicor,

What’s the best way to get someone to look at this performance problem with the Job Entry BO on UBAQs? Our system is slowing down more and more as we load in more jobs with more parts. I’m creating a bunch of extra complexity and workarounds in my solution by populating a UD table to duplicate what should be possible with a simple extended UD field. I suspect that there is a filtering process that isn’t working properly and that the rules are running on all jobs in the system instead of just the rows needing updates. That type of problem won’t show up in testing at Epicor, because the testing/demo databases aren’t scaled up enough to load the system.

I’m worried that if I make a service call, they will say it works, because technically it does, and then it’s just me arguing with support that they need to fix something and them saying there is nothing to fix. Do I just make the case and then hit escalate a bunch of times right off the bat?


(Bart Elia) #6

Measuring where time is spent is the important part. I cannot speak to the particular service, but in general you can turn on tracing in the server directory (appserver.config) to monitor different aspects. (There is a UI in the Admin Console, but it only exposes a few of the switches.)

trace://ice/fw/perf
trace://system/db/hits
are a good start. From there you can dig into where time is spent in different queries or loops in app code, depending on what you find.
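For reference, enabling those flags amounts to adding them to the trace flag list in the server’s appserver.config. The exact element layout below is an assumption and varies by Epicor version, so treat it as a sketch rather than a copy-paste config; only the two trace URIs come from the advice above.

```xml
<!-- Sketch only: element names vary by Epicor 10 version. -->
<trace>
  <flags>
    <add uri="trace://ice/fw/perf" />     <!-- server-side method timing -->
    <add uri="trace://system/db/hits" />  <!-- database query activity -->
  </flags>
</trace>
```

Remember to turn the flags back off when you’re done; the db/hits trace in particular generates a lot of log volume on a busy server.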

You may end up in SQL Profiler looking at query performance in the end, if you are hitting a few tables and your indexes don’t line up, so having that skill in your back pocket is always good.
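If it does come to index analysis, a lighter-weight first pass than a full Profiler trace is SQL Server’s missing-index DMVs (standard SQL Server views, nothing Epicor-specific). A minimal query might look like this; it’s only a rough signal, not a recommendation to create every suggested index:

```sql
-- Standard SQL Server missing-index DMVs; run against the Epicor database.
-- High user_seeks * avg_user_impact suggests a query pattern (e.g. on the
-- Job tables) that keeps scanning for lack of a supporting index.
SELECT TOP 10
    d.statement AS table_name,
    d.equality_columns,
    d.inequality_columns,
    d.included_columns,
    s.user_seeks,
    s.avg_user_impact
FROM sys.dm_db_missing_index_details d
JOIN sys.dm_db_missing_index_groups g
    ON g.index_handle = d.index_handle
JOIN sys.dm_db_missing_index_group_stats s
    ON s.group_handle = g.index_group_handle
ORDER BY s.avg_user_impact * s.user_seeks DESC;
```

Note the DMV counters reset on server restart, so the numbers only reflect activity since the instance last came up.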

Nathan may have some more explicit advice, and the PDT (Performance and Diagnostic Tool) is always a good resource with its help as well.