Extract/export millions of records from several tables to "external" database




Hi all!

A customer has divested some business units (JDE companies) that have been live for years in JDE XE. There are millions of records in F0911, F4211, F4311, and F4111 (F0911 especially is huge) which the customer must make available to the party that acquired the business units concerned.

One year of F0911 holds about 1 million records for one of the JDE company codes, and we are talking about 4 companies for which data has to be made available (each of these is a unique KCOO).

- The file format created from this extract should be a database format, or one that can be imported into a database, since it must be possible to query the data
- It must be possible to export a pre-defined set of tables (not all tables are required), but if the tool has logic to extract all data relevant to the affected company codes then that is a plus
- There is no requirement to purge this data from JDE XE. It should be possible to extract the data without purging (but a purge option would be nice, of course) :)
- The JDE XE instance is running on AS/400 using DB2

I'm looking for any tips/tools/techniques and success stories that we could benefit from in order to make this happen by mid-May this year.

With kind regards
Aarto - While I think something like ARCTOOLS could probably do this rather easily, I would say this could also be done manually using the Data Transfer for iSeries tools into a CSV file, along with a corresponding FDF file for the file formats. You would simply filter on the KCOO required and export to CSV. A million records isn't really all that much these days. There would be some manual effort required to launch each export, but CSV compresses rather nicely using something like 7-Zip.

Also, I would think it would be beneficial to obtain table descriptions from JDE itself (Xe has P08001, which can be used to export the table specifics), just to have on hand for the mapping required later.
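To make the KCOO-filtered extract concrete, here is a minimal sketch of building the per-company SELECT and dumping the result to CSV. The library name `PRODDTA`, the column names, and the pyodbc usage in the comment are illustrative assumptions, not from the posts above - in JDE every column carries a two-letter table prefix, so the company column differs per table and you would need to confirm each one.

```python
import csv

def extract_sql(table, column, company, library="PRODDTA"):
    """Build the per-company SELECT. The company column name varies by
    table because of JDE's two-letter column prefixes (e.g. GLCO on
    F0911, SDKCOO on F4211), so the caller passes it in explicitly."""
    return f"SELECT * FROM {library}.{table} WHERE {column} = '{company}'"

def rows_to_csv(rows, header, fh):
    """Dump fetched rows as CSV -- one file per table per company."""
    w = csv.writer(fh)
    w.writerow(header)
    w.writerows(rows)

# Hypothetical usage with an ODBC cursor against the iSeries:
#   cur.execute(extract_sql("F0911", "GLCO", "00001"))
#   with open("F0911_00001.csv", "w", newline="") as fh:
#       rows_to_csv(cur.fetchall(), [d[0] for d in cur.description], fh)
```

Running this once per table per KCOO gives the same result as the manual Data Transfer exports, just scriptable.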
Look into SSIS (SQL Server Integration Services). It meets all of your requirements (purging may be a bit tricky, but it can be done as well).
Hi Aarto

This should actually be relatively easy. It seems to me that you are trying to extract specific table information from OneWorld Xe to tables for auditing and other purposes.

The issue is that your OneWorld instance is running on an iSeries - not the most transportable of platforms.

However, there ARE database types that JDE can easily move tables to.

Under OneWorld Xe the MOST transportable are Microsoft Access tables - you could create an Access .MDB file through ODBC, set up a OneWorld local datasource, copy the data to Access - and you're done!

The problem is that Microsoft Access has a limit to how large a database can be. I believe the limit is 1Gb with Access '97 and 2Gb with Access 2010 - and a million-record F0911 will likely be larger than that.

However, JDE does support Microsoft SQL Server - which is the next "most" transportable type of database. A "Standard Edition" SQL Server database should easily be able to cope with all of the data, and should be relatively transportable.

To be honest - the right approach should be that you SHOULD export ALL of your data from the iSeries to Microsoft SQL Server - that way you will have all of the correct financial information to hand.

To do this, you should probably consider hiring a CNC consultant. Someone experienced with JDE can easily install and configure SQL Server, create a datasource in your existing OneWorld environment, and create table conversions to export all the data from the iSeries to SQL Server. If your total dataset is < 1Tb, I'd expect absolutely no more than 80 hours of chargeable time would be needed to complete such a task - I've done identical projects for companies many times in the past and I'd expect to have this completed easily before mid-May.
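As a rough illustration of the table-by-table copy such a consultant would script (the native route inside JDE would be Table Conversion UBEs; the cursor calls below assume generic DB-API cursors for the DB2 source and SQL Server target, and the batch size is an arbitrary choice):

```python
from itertools import islice

def batches(rows, size=5000):
    """Yield fixed-size chunks so millions of F0911 rows never have
    to sit in memory at once."""
    it = iter(rows)
    while chunk := list(islice(it, size)):
        yield chunk

def copy_table(src_cur, dst_cur, table, columns):
    """Stream one table from the source to the target in batches,
    assuming the target table already exists with matching columns."""
    col_list = ", ".join(columns)
    placeholders = ", ".join("?" * len(columns))
    src_cur.execute(f"SELECT {col_list} FROM {table}")
    insert = f"INSERT INTO {table} ({col_list}) VALUES ({placeholders})"
    while rows := src_cur.fetchmany(5000):
        dst_cur.executemany(insert, rows)
```

The point of the batching is simply that a straight `fetchall()` on a multi-million-row F0911 would exhaust memory on a modest workstation.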

A further option is to create a full standalone environment using SQL Server. That would take a little longer - but would be completely transportable. I'd say you'd be able to get something like that by mid-May too, but it would probably take twice as long, depending on the number of custom tables/programs in your current environment.

Give me a call anytime - I could do this remotely for you.

Hope that helps!
I'm confused - if the units have been divested elsewhere, why would they need you to supply a DB? Isn't it easier to give them the data raw with a map to do whatever they want with? Providing a database in itself seems to me like you're setting yourself up to continue support moving forward.
I agree with the others that the idea of setting up a SQL Server database is great. The only issue I see is that you need to know that the acquiring company uses SQL Server and can restore the backup you would create. I don't know enough about the 400 to know how easy the process TFZ proposed would be, so I will throw out a JDE platform-independent process we use as another option.

Create a custom UBE for each table and use it to pull the data out into CSV. The UBE would be extremely simple and should take very little time to create. It simply reads the records you specify with data selection and writes them using database output to a text/CSV file. You can then run it on the server with no additional architecture needed. I did this for the F0911 to provide the entire prior year (also millions of records) to our auditors each January. I put it on a Zip drive and they have their own software that imports the file. I would recommend having the business users identify the relevant columns in the table and only pull those columns in your UBE. This will help keep the size of the files to a minimum, which is good for both parties. If the files are too large, you can just run the UBE multiple times with narrower data selection.
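The column-trimming step can be sketched outside JDE as well. Assuming you already have the rows as dicts, something like this keeps only the fields the business users picked - the three F0911 field names and the sample values below are illustrative, not a vetted subset:

```python
import csv
import io

def write_subset(rows, columns, out):
    """Write only the agreed-upon columns, mirroring what the custom
    UBE's database-output section would emit; extra fields on each
    row are silently dropped."""
    w = csv.DictWriter(out, fieldnames=columns, extrasaction="ignore")
    w.writeheader()
    for row in rows:
        w.writerow(row)

# Example: keep three F0911-style fields and drop the rest.
rows = [{"GLCO": "00001", "GLANI": "1.1110", "GLAA": 125000, "GLEXA": "x"}]
buf = io.StringIO()
write_subset(rows, ["GLCO", "GLANI", "GLAA"], buf)
```

`extrasaction="ignore"` is what makes the trimming work: the writer simply skips any dict keys not in the agreed column list.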

TFZ - If you know the destination database and can create a backup in that architecture, it is extremely convenient. I have even used it in the past here to move tables between our various database servers. I have a database called copytemp on each of my servers. If I need to move tables or data, I simply script the table and data into the database, run a backup, move the backup file (if needed), restore the backup to the destination server and script what I need to do with the data. SQL Server tools and compression make it easy to transport large amounts of data.

However, I tend to agree with you in this situation that CSV is probably best with the info provided since we don't know what db architectures are available (other than the source db2) and it should be a one time thing.
I was involved in a similar exercise before, and we dumped everything to CSV and gave it to the new co. If you need it in a DB then SQL or Access would be your best bet.

Thanks all! We're creating .csv files from AS/400, this seems to be working fine..
I use an open source ETL tool (Pentaho Data Integration, aka Kettle). It connects through JDBC and, in my environment, moves data from the iSeries to CSV at 6k records per second.