Dev and Prod on the same TS? Bad idea, right, or no big deal??

KeatonInCo

Member
Hi. I apologize ahead of time for asking a stupid question, if this indeed is a stupid question. But, I do need clarification on this, so any explanation you can provide would be appreciated. Thanks!

We are experiencing strange issues with our JDE client software. All of a sudden JITI activity is taking place and, as a result, users are receiving Windows file locking errors. People developing reports and report versions can't perform check-out/check-in operations, even though they could yesterday morning. I have a hunch about the cause, but without deeper core knowledge of JDE I can't validate my assumption. Any clarity you can provide would be very helpful.


Here is our scenario.

In our configuration we have a single PC server (a Terminal Server with several users) running the JDE client. The client is a full package build based on the PD7333 path code. The deployment was a full development package, so all of the source code was copied down to the TS as well.

After I deployed the package I ran R92CRTGL and R92TAM to copy down the complete specs for the global tables and for the data dictionary to D:\B7\PD7333\SPEC. Everything seemed to run fine after this.
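(For context, by "specs" I mean the local TAM files those two UBEs generate. Going from memory on our B7333 client, the spec folder ends up with files along these lines; the exact names may differ slightly by release, so treat this as illustrative only:

    D:\B7\PD7333\SPEC\dddict.ddb     <- data dictionary specs (R92TAM)
    D:\B7\PD7333\SPEC\dddict.xdb
    D:\B7\PD7333\SPEC\ddtext.ddb
    D:\B7\PD7333\SPEC\ddtext.xdb
    D:\B7\PD7333\SPEC\glbltbl.ddb    <- global table specs (R92CRTGL)
    D:\B7\PD7333\SPEC\glbltbl.xdb

These sit alongside the other local spec files that JITI and check-in/check-out keep updating.)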

Now, our production users all run in PROD environments that point to PD databases and PD objects. When they are using the system, creating new reports and report versions, everything they do is associated with PD7333. When they check in and check out reports, the path code is always PD7333, so when the local specs are updated they always come from the same PD environment.

A developer from our VAR has been performing development on the same Terminal Server. When he works, he uses a DATA environment which is associated with DV databases and DV objects. He is writing new programs, reports, and report versions on the same machine that our production users are on, and he is performing the work under the JDE account. I logged in as JDE and launched OMW, and I noticed that several of the reports/versions he has been working on are associated with the DV7333 path code. He has been checking these reports in and out as he works on them.

Here is my question: the package built on the Terminal Server is from the production path code (PD7333), and the spec (TAM) files in use there are all from PD7333. If, on that same Terminal Server, the developer is working with DV7333 objects and updating the local TAMs with specs out of development, can that corrupt the local TAMs, since the specs now come from two different path codes? Would this explain why we are beginning to experience check-in/check-out and JITI issues in production? Is JDE attempting to refresh potentially stale/corrupted PD7333 TAMs because of overwrites out of the DV7333 environment?

My current thought process is that performing development work on the same PC where production work is taking place is a very bad idea (duh). I was given the impression early on by our VAR that both a production package AND a development package could be deployed to the same PC at once. Then, when you select an environment to work in, JDE would know which set of local files/programs/specs to work off of.

It was only a couple of weeks ago that I discovered this is NOT the case at all: if you want both, only one can be active at a time, with the other sitting dormant, and you use snapshot.exe to switch back and forth between the two.

If our VAR needs to develop in our environment, then my new assumption is that, to prevent TAM corruption on the production Terminal Server, I need to deploy a separate TS that holds a build from DV7333, so that when they do development the local package and TAM specs match the databases and objects they are working on. Then, when they promote an object to PD7333, I build either a full or update package in PD7333 and deploy it to the users' Terminal Server.
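To make sure I'm describing this clearly, here is the separation I have in mind, purely as an illustration using our current drive layout:

    Production TS:   full package from PD7333, specs in D:\B7\PD7333\spec,
                     production users sign on to PROD, check-in/check-out against PD7333
    Development TS:  full package from DV7333, specs in D:\B7\DV7333\spec,
                     the VAR developer signs on to DATA, check-in/check-out against DV7333

That way each machine's local package and TAM specs only ever see one path code.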

Let me know if I'm on the mark or not with this, especially about potential spec file corruption on the TS.

Thanks!

Keaton
 
Hey, Keaton, you are on the right track, but not necessarily for the right reasons. The DDICT and global table files are environment specific and not shared. That said, there are a number of files that are shared across path codes, and these are most likely the source of the problems you are encountering.

I would suggest that you not develop on a production Citrix machine. If your number of developers is small, why not install a DEV package on a local PC and let them develop there? It could be a Citrix install or just a single-user machine with NetOp, DameWare, or something similar to connect remotely.

The dormant environment thing doesn't ring right with me, but I can't say that I've ever really put it to the test. We have two test Citrix servers with PY and DV, but I don't know that there has ever been an attempt to have users in both environments concurrently. My gut reaction is that it is not a problem, but maybe some wiser listers can address this issue for you. I don't think it is a bad idea to have the three environments installed on the same machine, just a bad idea to develop on a machine that a large number of users are accessing. My $0.02. HTH.
 
Our servers are hosted in a remote datacenter and I am charged by the server, so adding a new TS for development activity would be costly, but we can do it.

Here are some statements from the book "JDE OneWorld - A Developer's Guide" that I have questions about.

P. 74 - "When a package is installed on a client, the client can have only ONE active OneWorld environment. This restricts the client to one path code in use at a time."

P. 76, 77 - "SnapShot is an application that enables installation of multiple environments on a client.... SnapShot allows you to SWITCH among any environments that are already installed on your client."

P. 80 - "You can install only ONE package at a time to a workstation, and you can have only one active environment on a workstation."

Now, since we are running a multi-user TS environment and each user has their own jde.ini file, each user can sign on to a separate environment to work in.
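For example (going from memory, so take the exact section and key names with a grain of salt; I believe they live under [DB SYSTEM SETTINGS] on a B7333 client), each user's copy of jde.ini carries its own default environment and path code, roughly like:

    [DB SYSTEM SETTINGS]
    Default Env=PROD
    Default PathCode=PD7333

while the developer's copy could default to the DATA environment and DV7333 instead. So signing different users on to different environments from the same TS is at least possible.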

But, since only one package can be installed at a time to a workstation, and since that package in our case is from PD7333, there is only one set of TAM files (specs) on the TS.

I would imagine that if users are simply accessing the application to perform work, then having some users reference the DV7333 environment while others are referencing PD7333 isn't a big deal.

But what if a developer is checking out reports/versions and creating local versions of reports and programs defined as DV7333 objects, when the deployed package on the TS is from PD7333? Will this activity cause problems for the one set of TAMs located in the D:\B7\PD7333\spec folder?
 
Never mind. Stupid question.

Our TS does have both DV7333 and PD7333 path codes on it. I get it now.

Sorry for wasting the 60 seconds you spent reading my question.

-K
 