Table Conversion between 2 environments.

karenet

Active Member
I am trying to convert data from an input environment [foreign table] to an output environment [JDE table] with a different name. Are there any OCM entries that I need to make to get this to work right?

What's happening is that I cannot write any rows to the output environment. It looks as though the table conversion UBE is recognizing the output environment, but when I go to insert a row, I receive an 'error' on the File_io_status in the debugger. The errors I am receiving are of the ODBC type, like 'unable to insert row on (tablename)'. My breakpoint is positioned on the line directly after the 'insert row' line in ER, so no other I/Os are taking place to alter the File_io_status. Help please!!

We are using B7333 SP15/16.

Thanks in advance.
 
Hi "karenet",

I suppose you already know very well that you have to:
* create a OneWorld data source for your foreign input data
* create a OneWorld environment for your foreign input whose TBLE DEFAULT OCM entry points to the previously mentioned OW data source and which uses the running path code
* use this environment as the input environment in your Table Conversion UBE (do you use a Table Conversion UBE at all???)

If you want to run your TC UBE under more than one path code, then you have to create a separate environment for each path code and always select the input environment according to your current path code.
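If you want to double check what the OCM is really resolving to, you can also read the Object Configuration Master (F986101) in your system data source directly. Below is only a rough sketch of the idea; the DSN, login, owner prefix and the column names (OMENHV, OMOBNM, OMDATP, OMSTSO) are my assumptions from memory, so verify them against your own F986101 before you trust the output.

# Sketch only: show which data source the OCM maps a table (or the TBLE
# DEFAULT entry) to for a given environment, by reading F986101 directly.
# DSN name, login, owner prefix and column names are assumptions - check them
# against your own system data source first.
import pyodbc

SYSTEM_DSN = "OneWorld Local"   # placeholder: ODBC DSN of your System data source
ENVIRONMENT = "DV7333"          # placeholder: the environment you log in to
OBJECT_NAME = "DEFAULT"         # 'DEFAULT' for the TBLE DEFAULT entry, or e.g. 'F550001'

conn = pyodbc.connect(f"DSN={SYSTEM_DSN};UID=JDE;PWD=JDE")
cur = conn.cursor()
cur.execute(
    "SELECT OMENHV, OMOBNM, OMDATP, OMSTSO "
    "FROM F986101 "             # you may need an owner prefix, e.g. SYS7333.F986101
    "WHERE OMENHV = ? AND OMOBNM = ?",
    ENVIRONMENT, OBJECT_NAME,
)
for env, obj, data_source, status in cur.fetchall():
    # status 'AV' = active mapping, 'NA' = inactive (as I recall)
    print(env, obj, "->", data_source, status)
conn.close()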

Let me ask some further questions about your output problem:
Q1.) Is your output table a standard OW table or a custom one?
Q2.) Is your output environment a standard OW environment (e.g. PROD, PY, DEV, etc.)?
Q3.) Have you generated your table into a data source, if it is a custom one?
Q4.) Have you done the previous step for each running environment?
Q5.) Which data source does your output table reside in (e.g. the default Business Data, or Control Tables, etc.)?
Q6.) What is your output environment setting? Is it always the Login Env? Is it "hard coded" but identical to the Login Env or different?
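One more thing that may help isolate it: try the same insert directly through your workstation's ODBC DSN for the output data source, outside OneWorld. If a plain insert like the sketch below fails too, then the problem is in the local ODBC setup and not in your TC. The DSN, login, table and column names here are only placeholders for your own, of course.

# Sketch only: attempt one row insert straight through the local ODBC DSN,
# bypassing OneWorld, to see whether the driver itself rejects the insert.
# DSN, login, table and column names are placeholders.
import pyodbc

OUTPUT_DSN = "Business Data - TEST"   # placeholder: DSN of the output data source
TABLE = "F550101"                     # placeholder: your output table

try:
    conn = pyodbc.connect(f"DSN={OUTPUT_DSN};UID=JDE;PWD=JDE", autocommit=True)
    cur = conn.cursor()
    cur.execute(f"INSERT INTO {TABLE} (XXKEY, XXDESC) VALUES (?, ?)",
                "TEST01", "odbc test row")
    print("insert OK - the DSN and table are fine, so look at the TC / OCM instead")
except pyodbc.Error as exc:
    # the driver diagnostics usually pinpoint the bad DSN or missing table
    print("insert failed at the ODBC layer:", exc)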

Hopefully, if you let us know the answers, then we will be able to help you more.

Waiting for your answer...

Zoltán
P.S.1.: Search for "foreign" or "foreign table" in the archives of the Forum. You will get a lot of useful posts.
P.S.2.: If your issue has already been resolved, then please let us know what the problem and the solution were. Thanks.



B7332 SP11, ESU 4116422, Intel NT4, SQL 7 SP1
(working with B7321, B7331, XE too)
 
Hi "karenet",

Just another question.
Q7.) Have you tried to run your conversion with a new version (Added NOT Copied!)?

Regards,
Zoltán

B7332 SP11, ESU 4116422, Intel NT4, SQL 7 SP1
(working with B7321, B7331, XE too)
 
To: jdelist

I resolved this issue with our CNC. Below are just a few comments about what took place.

1. Last Friday, our CNC changed environment names on us to get things in place for prototyping, production, etc. My TC program had been working just fine because I was working within the same input and output environment. When our CNC changed the names, that is when I discovered that my TC no longer worked properly, even after I went through and changed the environment names in the TC.
2. My symptoms were that I could not write any rows to a table in a new environment [named development], and I could not read any rows from my input data tables [conv]. Our CNC did not do any renaming of our [conv] tables.
3. Well, as it turned out, I was logging into the new environment [development]. OCM entries needed to be made so that I could read the input data table from [conv]. The OCM entries defined the specific [conv] table name in the [dev] environment. This had something to do with my signing on to the [dev] environment, I think.
4. As far as getting rows written out to the new [dev] table, I was going crazy trying to get this to work. I kept receiving ODBC errors time and time again when I executed my UBE. Finally, my CNC allowed me to run my TC on the CNC's local PC, i.e. within my TC the input env = [conv] and the output env = [dev]. IT WORKED: rows were being written out to the new [dev] table as if there had never been a problem with the TC program.
5. OK. We had now isolated the problem to a local PC problem. That is, my laptop's local JDE configuration was the problem. Sure enough, the CNC found some DEMO folders that had not been completely removed from my laptop, along with some strange ODBC entries which had to be deleted too. Once the laptop was "cleaned up", a full reload of JDE took place onto my laptop. When done, I verified that the ODBC entries were correct and that the data source names were all present, with each one using Client Access (see the sketch after this list).
6. It was now time to test the TC locally on my laptop using the new configuration. IT WORKED: rows were being written out to the new [dev] table as if there had never been a problem with the TC program, just as it had worked on my CNC's PC in point 4 above.
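For anyone hitting the same thing: the quickest way I found to sanity-check the workstation after the clean-up was simply to list every ODBC DSN registered on the machine and try to connect to each one, which catches leftover DEMO or broken entries right away. Below is only a rough sketch of that check; the login is a placeholder for your own.

# Sketch only: list every ODBC data source registered on this workstation and
# try a connection to each, to catch leftover or broken DSNs (like the stray
# DEMO entries we had).  UID/PWD are placeholders for your own login.
import pyodbc

for dsn, driver in sorted(pyodbc.dataSources().items()):
    try:
        conn = pyodbc.connect(f"DSN={dsn};UID=JDE;PWD=JDE", timeout=5)
        conn.close()
        print(f"OK      {dsn:<40} ({driver})")
    except pyodbc.Error as exc:
        print(f"FAILED  {dsn:<40} ({driver}): {exc}")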

Thanks to those who responded. I will be back, I'm sure, to look up and perhaps write up future JDE problems that I/we come across at our shop. I have found 'jdelist' to be very helpful and educational at the same time.

Regards.


Edited by karenet on 8/31/01 03:07 AM.
 