Custom Data Import Process

Oracle Apps Custom staging tables :

As a rule of thumb, the structure of the custom staging table should match the source data file or the source legacy database table. In addition, the table should contain columns such as Interface_ID, Batch_ID, Process_Flag, Error_Message, the WHO columns, and the extended WHO columns.
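As a minimal sketch, the DDL for such a staging table might look like the following. The table name xxinv_items_stg and the item-specific columns are hypothetical; the Process_Flag, Error_Message, WHO, and extended WHO columns follow the convention described above.

```shell
#!/bin/sh
# Stage the (hypothetical) staging-table DDL to a script file; it would be
# run once against the custom schema, e.g. via SQL*Plus.
cat > xxinv_items_stg.sql <<'EOF'
CREATE TABLE xxinv_items_stg (
  interface_id           NUMBER,          -- surrogate key for this run
  batch_id               NUMBER,
  segment1               VARCHAR2(40),    -- hypothetical source columns
  description            VARCHAR2(240),
  uom_code               VARCHAR2(3),
  process_flag           NUMBER,
  error_message          VARCHAR2(2000),
  -- WHO columns
  created_by             NUMBER,
  creation_date          DATE,
  last_updated_by        NUMBER,
  last_update_date       DATE,
  last_update_login      NUMBER,
  -- Extended WHO columns
  request_id             NUMBER,
  program_application_id NUMBER,
  program_id             NUMBER,
  program_update_date    DATE
);
EOF
# sqlplus "$APPS_LOGIN" @xxinv_items_stg.sql   # credentials never hard-coded
```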

Shell Script with SQL*Loader :

A shell script (host) type concurrent program should be used to load data files into the custom staging tables, rather than a SQL*Loader executable type. The shell script internally invokes the SQL*Loader command-line utility to load the data and then archives the data files. With a shell script the data file path is not hard-coded, and archiving and monitoring of bad files are easily achievable, which is not the case with the SQL*Loader executable type. As a rule there should be no hard coding in shell scripts, e.g. directory paths, usernames, passwords. Inbound data files can be placed in directories such as $XXINV_TOP/datafiles/in.

The shell script should read these files, load them into the custom tables, and archive the data files in $XXINV_TOP/datafiles/in/archive with a date/time stamp appended to the file name. SQL*Loader log files and bad files can reside in the $XXINV_TOP/datafiles/in/log and $XXINV_TOP/datafiles/in/bad folders.

This shell script should also read the bad files generated by SQL*Loader and complete the concurrent program with Success/Warning/Error status. A sample shell script is given in the annexure.

This shell script has the following sections :

  • Extract Apps username and password

  • Initialize file names (data, log, bad, archive files)

  • Check existence of file

  • Run SQL*Loader

  • Archive data files

  • Check bad files
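The sections above can be sketched as a single function, hedged as follows: the control-file name xxinv_items.ctl, the items.dat file name, and the convention that the concurrent manager passes the Apps username/password as the first argument are all assumptions, and the sqlldr call is guarded so the sketch degrades gracefully on a machine without the Oracle client.

```shell
#!/bin/sh
# Hedged sketch of a host-type loader wrapper (names are assumptions).
run_item_load() {
    APPS_LOGIN="$1"                         # Apps username/password from the
                                            # concurrent manager (assumed $1)
    DATA_DIR="${XXINV_TOP:-.}/datafiles/in" # no hard-coded paths
    STAMP=$(date +%Y%m%d%H%M%S)

    DATA_FILE="$DATA_DIR/items.dat"
    LOG_FILE="$DATA_DIR/log/items_$STAMP.log"
    BAD_FILE="$DATA_DIR/bad/items_$STAMP.bad"
    ARCH_FILE="$DATA_DIR/archive/items_$STAMP.dat"
    mkdir -p "$DATA_DIR/log" "$DATA_DIR/bad" "$DATA_DIR/archive"

    # Check existence of the data file
    if [ ! -f "$DATA_FILE" ]; then
        echo "No data file found: $DATA_FILE"
        return 1                            # completes the request in Error
    fi

    # Run SQL*Loader (guarded: skipped where no Oracle client is installed)
    if command -v sqlldr >/dev/null 2>&1; then
        sqlldr userid="$APPS_LOGIN" control=xxinv_items.ctl \
               data="$DATA_FILE" log="$LOG_FILE" bad="$BAD_FILE"
    fi

    # Archive the data file with a timestamp appended
    mv "$DATA_FILE" "$ARCH_FILE"

    # Check the bad file and pick the completion status
    if [ -s "$BAD_FILE" ]; then
        echo "Rejected records: see $BAD_FILE"
        return 2                            # e.g. Warning: some rows rejected
    fi
    return 0                                # Success
}
```

How nonzero return codes map onto the Warning versus Error completion status of a host concurrent program varies by setup; the annexure script should be treated as authoritative for that detail.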

Shell Script with Java based XML Parser :

A shell script (host) type concurrent program should be used to call a Java-based or PL/SQL XML parser. The directory structure, archiving, and logging rules remain the same as mentioned above. In this case we have to develop a custom Java- or PL/SQL-based XML parser which can parse the input XML files and load the custom staging table. PL/SQL-based parsers use the UTL_FILE package to perform file read operations. Building a Java-based XML parser is a topic in itself and can be considered out of scope for this white paper.
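A minimal wrapper sketch follows. The parser class name XXInvXmlParser is hypothetical, the java invocation is guarded so the sketch runs even where no JRE or parser class is present, and archiving is shown in simplified form (timestamp suffixed in place rather than moved to the archive folder).

```shell
#!/bin/sh
# Hedged sketch: host-program wrapper around a custom (hypothetical)
# Java XML parser class XXInvXmlParser.
parse_xml_file() {
    XML_FILE="$1"
    STAMP=$(date +%Y%m%d%H%M%S)

    # Check existence of the input XML file
    if [ ! -f "$XML_FILE" ]; then
        echo "Missing XML file: $XML_FILE"
        return 1
    fi

    # Invoke the parser only where a JRE and the class are available
    if command -v java >/dev/null 2>&1 && [ -f XXInvXmlParser.class ]; then
        java XXInvXmlParser "$XML_FILE" || return 2
    fi

    # Archive with a timestamp, as for flat files (simplified: in place)
    mv "$XML_FILE" "$XML_FILE.$STAMP"
    return 0
}
```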

PLSQL Package for reading Legacy outbound tables :

This can be a PL/SQL package which reads the legacy database tables over database links and writes to the inbound custom staging tables in the Oracle Apps 11i instance.
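The core of such a package is an INSERT ... SELECT over a database link, sketched below as the SQL*Plus script a host program might stage and run. The staging table, its sequence, the legacy table, the database link name, and the use of Process_Flag = 1 for freshly loaded rows are all hypothetical.

```shell
#!/bin/sh
# Write the (hypothetical) pull-from-legacy block to a SQL*Plus script.
cat > xxinv_pull_legacy.sql <<'EOF'
BEGIN
  INSERT INTO xxinv_items_stg
        (interface_id, batch_id, segment1, description, process_flag,
         created_by, creation_date, last_updated_by, last_update_date)
  SELECT xxinv_items_stg_s.NEXTVAL,  -- hypothetical sequence
         &1,                         -- batch id: first SQL*Plus argument
         item_code, item_desc,
         1,                          -- assumed flag value: newly loaded
         fnd_global.user_id, SYSDATE, fnd_global.user_id, SYSDATE
    FROM legacy_items@legacy_db_link;  -- hypothetical table and db link
  COMMIT;
END;
/
EOF
# sqlplus "$APPS_LOGIN" @xxinv_pull_legacy.sql 101   # run from host program
```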

Oracle Apps Custom Lookup Table :

This table should be used to store lookup values, i.e. mappings of legacy system values to Oracle values. It generally stores information such as units of measure, which can differ between Oracle and the legacy system. The table structure should have columns like Lookup_Type, Language, Legacy_System, Legacy_System_Value, Oracle_Value, Effectivity_Date, End_Date, and the WHO columns.
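A sketch of that structure, using the hypothetical table name xx_lookup_values and assumed column widths:

```shell
#!/bin/sh
# Stage the (hypothetical) custom lookup-table DDL.
cat > xx_lookup_values.sql <<'EOF'
CREATE TABLE xx_lookup_values (
  lookup_type         VARCHAR2(30),   -- e.g. 'UOM'
  language            VARCHAR2(4),
  legacy_system       VARCHAR2(30),
  legacy_system_value VARCHAR2(80),
  oracle_value        VARCHAR2(80),
  effectivity_date    DATE,
  end_date            DATE,
  -- WHO columns
  created_by          NUMBER,
  creation_date       DATE,
  last_updated_by     NUMBER,
  last_update_date    DATE,
  last_update_login   NUMBER
);
EOF
```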

PLSQL package for data validation and mapping :

This PLSQL package should perform the following activities :

  • Validate data in custom tables

  • Perform data mapping activities using lookup table

  • Update the custom staging table Process_Flag column to reflect the validation result (e.g. valid records are marked Process_Flag = 5)
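The mapping and flag-update steps can be sketched as two statements, again staged as a SQL*Plus script. The tables, the UOM mapping example, and the flag value 1 for unprocessed rows are assumptions; the value 5 for valid records follows the convention used in the next section.

```shell
#!/bin/sh
# Stage the (hypothetical) validation/mapping statements.
cat > xxinv_validate.sql <<'EOF'
-- Map legacy UOM codes to Oracle values via the custom lookup table.
UPDATE xxinv_items_stg s
   SET s.uom_code =
       (SELECT l.oracle_value
          FROM xx_lookup_values l
         WHERE l.lookup_type         = 'UOM'
           AND l.legacy_system_value = s.uom_code)
 WHERE s.process_flag = 1;            -- assumed flag: loaded, not yet valid

-- Mark records that passed validation as ready for the interface tables.
UPDATE xxinv_items_stg
   SET process_flag = 5               -- valid record
 WHERE process_flag = 1
   AND error_message IS NULL;
EOF
```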

PLSQL package to write data in Oracle Open Interface tables :

This PLSQL package should read all valid records (Process_Flag = 5), insert the data into the Oracle seeded interface tables, and mark the records with Process_Flag = 7.
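For an item conversion this step might look like the following sketch against MTL_SYSTEM_ITEMS_INTERFACE. Treat the column list and the use of process_flag = 1 as the "pending" value expected by the item import as assumptions to verify against the Open Interfaces documentation for your release.

```shell
#!/bin/sh
# Stage the (hypothetical) push-to-open-interface block.
cat > xxinv_push_interface.sql <<'EOF'
BEGIN
  INSERT INTO mtl_system_items_interface
        (organization_id, segment1, description, process_flag,
         set_process_id, transaction_type,
         created_by, creation_date, last_updated_by, last_update_date)
  SELECT organization_id, segment1, description,
         1,                     -- assumed: pending for the seeded import
         batch_id, 'CREATE',
         created_by, creation_date, last_updated_by, last_update_date
    FROM xxinv_items_stg
   WHERE process_flag = 5;      -- valid records only

  UPDATE xxinv_items_stg
     SET process_flag = 7       -- written to the interface table
   WHERE process_flag = 5;
  COMMIT;
END;
/
EOF
```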

Oracle Open Interface tables :

These are the inbound interface tables provided by Oracle for the various base tables in Oracle Applications.

Oracle Open Interface Concurrent Programs :

These are Oracle-supplied import programs which read the open interface tables, validate the data, and load the Oracle Apps base tables.

Oracle Supplied Public APIs :

Generally, for online processing (when batch processing is not acceptable), public APIs are used to load the Oracle base tables. In this form of interface the open interface tables are not used. The public PL/SQL APIs accept PL/SQL table type input parameters, and the APIs perform the validation and update the Oracle base tables.
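The general call shape is sketched below. The package xx_item_pub, its table type, and its parameter names are purely illustrative, not a seeded Oracle API; what the sketch shows is the common pattern of a PL/SQL table input plus return status and message OUT parameters.

```shell
#!/bin/sh
# Stage a (hypothetical) public-API call; xx_item_pub is illustrative only.
cat > xxinv_call_api.sql <<'EOF'
DECLARE
  l_return_status VARCHAR2(1);
  l_msg_count     NUMBER;
  l_msg_data      VARCHAR2(2000);
  l_item_tbl      xx_item_pub.item_tbl_type;  -- PL/SQL table type input
BEGIN
  -- populate l_item_tbl from the staging table, then:
  xx_item_pub.process_items(
      p_item_tbl      => l_item_tbl,
      x_return_status => l_return_status,
      x_msg_count     => l_msg_count,
      x_msg_data      => l_msg_data);

  IF l_return_status <> 'S' THEN
     NULL;  -- write l_msg_data back to the staging row's Error_Message
  END IF;
END;
/
EOF
```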