Parameter Mappings Between Old Export/Import and New Data Pump Expdp/Impdp
This post describes how original Export and Import parameters map to the Data Pump Export and Import parameters that provide similar functionality.
Usage of Original Export Parameters with Data Pump Expdp
Table 1-1 describes how Data Pump Export interprets original Export parameters. Parameters that have the same name and functionality in both original Export and Data Pump Export are excluded from this table.
BUFFER: This parameter is ignored.
COMPRESS: The old COMPRESS parameter is ignored. In original Export, COMPRESS affected how the initial extent was managed: setting COMPRESS=n caused original Export to use current storage parameters for the initial and next extents. Data Pump Export does have a COMPRESSION parameter, but it specifies how data is compressed in the dump file and is not related to the original Export COMPRESS parameter.
CONSISTENT: The old CONSISTENT parameter controlled the consistency of the data in the dump file. Data Pump Export determines the current time and uses the FLASHBACK_TIME parameter in place of the older CONSISTENT parameter.
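As a sketch of this mapping (the scott/tiger account, the dp_dir directory object, and the file names are hypothetical placeholders), the old consistent export and its Data Pump counterpart might look like:

```shell
# Original Export: dump file read-consistent to a single point in time
exp scott/tiger FILE=scott.dmp CONSISTENT=y

# Data Pump equivalent: pin the export to a point in time.
# dp_dir is a directory object created beforehand, e.g.:
#   CREATE DIRECTORY dp_dir AS '/u01/app/dumps';
expdp scott/tiger DIRECTORY=dp_dir DUMPFILE=scott.dmp \
  FLASHBACK_TIME=SYSTIMESTAMP
```

Note that the bare SYSTIMESTAMP value is accepted on recent releases; older releases expect a quoted TO_TIMESTAMP expression for FLASHBACK_TIME.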
CONSTRAINTS: Where old Export used CONSTRAINTS=n, Data Pump Export uses EXCLUDE=CONSTRAINTS. The default behavior is to include constraints as part of the export.
DIRECT: This parameter is ignored; Data Pump Export automatically chooses the best export method.
FEEDBACK: In original Export, feedback was given after a certain number of rows, as specified with the FEEDBACK parameter. In Data Pump Export, status is reported at a fixed interval in seconds, as specified by the STATUS parameter (for example, STATUS=30). Note that this is not a direct mapping, because STATUS returns the status of the export job as a whole, as well as the rows being processed.
FILE: In place of the older FILE parameter, Data Pump Export uses the DUMPFILE parameter; it also checks whether a directory object exists to which the schema has read and write access.
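A minimal sketch of the difference (account, directory object, and file names are hypothetical): original Export wrote to whatever operating-system path FILE pointed at, while Data Pump resolves file names against a directory object.

```shell
# Original Export: FILE is an operating-system path
exp scott/tiger FILE=/u01/dumps/scott.dmp LOG=scott.log

# Data Pump: file names are resolved against a directory object
# to which the user must have read and write access
expdp scott/tiger DIRECTORY=dp_dir DUMPFILE=scott.dmp LOGFILE=scott.log
```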
GRANTS: Where old Export used GRANTS=n, Data Pump Export uses EXCLUDE=GRANT. GRANTS=y is ignored and does not need to be remapped because that is the Data Pump Export default behavior.
INDEXES: Where old Export used INDEXES=n, Data Pump Export uses EXCLUDE=INDEX. INDEXES=y is ignored and does not need to be remapped because that is the Data Pump Export default behavior.
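The old per-object-class switches (CONSTRAINTS=n, GRANTS=n, INDEXES=n) all collapse into EXCLUDE filters in Data Pump. A sketch with hypothetical account, directory object, and file names:

```shell
# Original Export: each object class switched off individually
exp scott/tiger FILE=scott.dmp CONSTRAINTS=n GRANTS=n INDEXES=n

# Data Pump equivalent: one EXCLUDE filter per object class
expdp scott/tiger DIRECTORY=dp_dir DUMPFILE=scott.dmp \
  EXCLUDE=CONSTRAINT EXCLUDE=GRANT EXCLUDE=INDEX
```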
LOG: In place of the older LOG parameter, Data Pump Export uses LOGFILE; it attempts to determine the path that was specified or defaulted to, and whether a directory object exists to which the schema has read and write access. The contents of the log file describe the Data Pump Export operation.
OBJECT_CONSISTENT: This parameter is ignored because Data Pump Export processing ensures that each object is in a consistent state when being exported.
OWNER: In place of the older OWNER parameter, the Data Pump SCHEMAS parameter is used.
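A sketch of the OWNER-to-SCHEMAS mapping (accounts, directory object, and file names are hypothetical placeholders):

```shell
# Original Export of one user's objects
exp system/manager FILE=scott.dmp OWNER=scott

# Data Pump equivalent
expdp system/manager DIRECTORY=dp_dir DUMPFILE=scott.dmp SCHEMAS=scott
```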
RECORDLENGTH: This parameter is ignored because Data Pump Export automatically takes care of buffer sizing.
RESUMABLE: This parameter is ignored because Data Pump Export automatically provides this functionality to users who have been granted the EXP_FULL_DATABASE role.
RESUMABLE_NAME: This parameter is ignored for the same reason.
RESUMABLE_TIMEOUT: This parameter is ignored for the same reason.
ROWS: In place of the old Export parameter ROWS=y, Data Pump Export uses CONTENT=ALL. In place of ROWS=n, Data Pump Export uses CONTENT=METADATA_ONLY.
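A sketch of the ROWS-to-CONTENT mapping (account, directory object, and file names are hypothetical):

```shell
# Original Export: definitions only, no row data
exp scott/tiger FILE=ddl.dmp ROWS=n

# Data Pump equivalent
expdp scott/tiger DIRECTORY=dp_dir DUMPFILE=ddl.dmp CONTENT=METADATA_ONLY
```

ROWS=y maps to CONTENT=ALL, which is already the Data Pump default.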
STATISTICS: This parameter is ignored because statistics are always saved for tables as part of a Data Pump export operation.
TABLESPACES: If old Export also used TRANSPORT_TABLESPACE=n (the default), Data Pump Export ignores the TABLESPACES parameter. If old Export used TRANSPORT_TABLESPACE=y, Data Pump Export takes the names listed for the TABLESPACES parameter and uses them for its TRANSPORT_TABLESPACES parameter.
TRANSPORT_TABLESPACE: If original Export used TRANSPORT_TABLESPACE=n (the default), then Data Pump Export uses the TABLESPACES parameter. If original Export used TRANSPORT_TABLESPACE=y, then Data Pump Export uses the TRANSPORT_TABLESPACES parameter and only the metadata is exported.
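A sketch of a transportable-tablespace metadata export in both tools (account, directory object, tablespace, and file names are hypothetical; the tablespace must be made read only before either command):

```shell
# Original Export: transportable tablespace metadata
exp system/manager FILE=tts.dmp TRANSPORT_TABLESPACE=y TABLESPACES=users

# Data Pump equivalent; TRANSPORT_FULL_CHECK=y verifies that the
# tablespace set is self-contained
expdp system/manager DIRECTORY=dp_dir DUMPFILE=tts.dmp \
  TRANSPORT_TABLESPACES=users TRANSPORT_FULL_CHECK=y
```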
TRIGGERS: Where old Export used TRIGGERS=n, Data Pump Export uses EXCLUDE=TRIGGER. TRIGGERS=y is ignored and does not need to be remapped because that is the Data Pump Export default behavior.
TTS_FULL_CHECK: Where old Export used TTS_FULL_CHECK=y, Data Pump Export uses the TRANSPORT_FULL_CHECK parameter. TTS_FULL_CHECK=n is ignored and does not need to be remapped because that is the Data Pump Export default behavior.
VOLSIZE: When the older Export VOLSIZE parameter is used, it means the location specified for the dump file is a tape device. The Data Pump Export dump file format does not support tape devices, so the operation terminates with an error.
Usage of Old Import Parameters with Data Pump Impdp
The following list describes how older Import parameters map to the new Data Pump Impdp parameters.
BUFFER: This parameter is ignored.
CHARSET: This parameter was desupported several releases ago and should no longer be used. Specifying it causes the Data Pump Import operation to abort.
COMMIT: This parameter is ignored. Data Pump Import automatically performs a commit after each table is processed.
COMPILE: This parameter is ignored. Data Pump Import compiles procedures after they are created; a recompile can be executed if necessary for dependency reasons.
CONSTRAINTS: Where old Import used CONSTRAINTS=n, Data Pump Import uses EXCLUDE=CONSTRAINT. CONSTRAINTS=y is ignored and does not need to be remapped because that is the Data Pump Import default behavior.
DATAFILES: The Data Pump Import TRANSPORT_DATAFILES parameter is used instead.
DESTROY: Where old Import used DESTROY=y, Data Pump Import uses REUSE_DATAFILES=y. DESTROY=n is ignored and does not need to be remapped because that is the Data Pump Import default behavior.
FEEDBACK: In old Import, feedback was given after a certain number of rows, as specified with the FEEDBACK parameter. In Data Pump Import, status is reported at a fixed interval in seconds, as specified by the STATUS parameter (for example, STATUS=30). Note that this is not a direct mapping, because STATUS returns the status of the import job as a whole, as well as the rows being processed.
FILE: In place of the older FILE parameter, Data Pump Import uses the DUMPFILE parameter; it also checks whether a directory object exists to which the schema has read and write access.
FILESIZE: This parameter is ignored because the information is already contained in the Data Pump dump file set.
FROMUSER: The Data Pump Import SCHEMAS parameter is used. If FROMUSER was used without TOUSER, then for importing users who have the IMP_FULL_DATABASE role, Data Pump Import attempts to create the schema and then imports that schema's objects. Importing users who do not have the IMP_FULL_DATABASE role can only import their own schema from the dump file set.
GRANTS: Where old Import used GRANTS=n, Data Pump Import uses EXCLUDE=OBJECT_GRANT. GRANTS=y is ignored and does not need to be remapped because that is the Data Pump Import default behavior.
IGNORE: Where old Import used IGNORE=y, Data Pump Import uses TABLE_EXISTS_ACTION=APPEND, which causes the processing of table data to continue. IGNORE=n is ignored and does not need to be remapped because that is the Data Pump Import default behavior.
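A sketch of the IGNORE-to-TABLE_EXISTS_ACTION mapping (account, directory object, and file names are hypothetical):

```shell
# Original Import: keep loading into tables that already exist
imp scott/tiger FILE=scott.dmp IGNORE=y

# Data Pump equivalent: append rows to pre-existing tables
impdp scott/tiger DIRECTORY=dp_dir DUMPFILE=scott.dmp \
  TABLE_EXISTS_ACTION=APPEND
```

TABLE_EXISTS_ACTION also accepts SKIP, TRUNCATE, and REPLACE, which have no single-parameter equivalent in original Import.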
INDEXES: Where old Import used INDEXES=n, Data Pump Import uses EXCLUDE=INDEX. INDEXES=y is ignored and does not need to be remapped because that is the Data Pump Import default behavior.
INDEXFILE: The Data Pump Import SQLFILE=[directory_object:]filename and INCLUDE=INDEX parameters are used. The same attempts made when looking for a directory object described for the FILE parameter also take place for the INDEXFILE parameter. If no directory object was specified on the original Import, then Data Pump Import uses the directory object specified with the DIRECTORY parameter.
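A sketch of the INDEXFILE-to-SQLFILE mapping (account, directory object, and file names are hypothetical):

```shell
# Original Import: write index DDL to a script instead of running it
imp scott/tiger FILE=scott.dmp INDEXFILE=indexes.sql

# Data Pump equivalent: extract only the index DDL into a SQL file;
# nothing is actually imported when SQLFILE is specified
impdp scott/tiger DIRECTORY=dp_dir DUMPFILE=scott.dmp \
  SQLFILE=indexes.sql INCLUDE=INDEX
```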
LOG: Data Pump Import uses the LOGFILE parameter instead; it also checks whether a directory object exists to which the schema has read and write access. The contents of the log file describe the Data Pump Import operation.
RECORDLENGTH: This parameter is ignored because Data Pump handles issues about record length internally.
RESUMABLE: This parameter is ignored because Data Pump Import automatically provides this functionality to users who have been granted the IMP_FULL_DATABASE role.
RESUMABLE_NAME: This parameter is ignored for the same reason.
RESUMABLE_TIMEOUT: This parameter is ignored for the same reason.
ROWS: Where old Import used ROWS=n, Data Pump Import uses CONTENT=METADATA_ONLY. Where old Import used ROWS=y, Data Pump Import uses CONTENT=ALL.
SHOW: If SHOW=y is specified in old Import, the Data Pump Import SQLFILE=[directory_object:]file_name parameter is used to write the DDL for the import operation to a file. Only the DDL (not the entire contents of the dump file) is written to the specified file, and the output is not shown on the screen as it was in original Import. The name of the file will be the file name specified on the DUMPFILE parameter (or on the original Import FILE parameter, which is remapped to DUMPFILE). If multiple dump file names are listed, the first file name in the list is used. The file is located in the directory object specified on the DIRECTORY parameter or in the directory object included on the DUMPFILE parameter; directory objects specified on the DUMPFILE parameter take precedence.
STATISTICS: This parameter is ignored because statistics are always saved for tables as part of a Data Pump Import operation.
STREAMS_CONFIGURATION: This parameter is ignored because Data Pump Import determines it automatically; it does not need to be specified.
STREAMS_INSTANTIATION: This parameter is ignored for the same reason.
TABLESPACES: If old Import also used TRANSPORT_TABLESPACE=n (the default), Data Pump Import ignores the TABLESPACES parameter. If old Import used TRANSPORT_TABLESPACE=y, Data Pump Import takes the names supplied for the TABLESPACES parameter and applies them to its TRANSPORT_TABLESPACES parameter.
TOID_NOVALIDATE: This parameter is ignored. OIDs are no longer used for type validation.
TOUSER: The Data Pump Import REMAP_SCHEMA parameter is used. There may be more objects imported than with original Import, and Data Pump Import may create the target schema if it does not already exist. The FROMUSER parameter must also have been specified in original Import, or the operation fails.
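A sketch of the FROMUSER/TOUSER-to-REMAP_SCHEMA mapping (accounts, directory object, and file names are hypothetical):

```shell
# Original Import: load scott's objects into blake's schema
imp system/manager FILE=scott.dmp FROMUSER=scott TOUSER=blake

# Data Pump equivalent; may create blake if it does not already exist
impdp system/manager DIRECTORY=dp_dir DUMPFILE=scott.dmp \
  REMAP_SCHEMA=scott:blake
```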
TRANSPORT_TABLESPACE: This parameter is ignored, but if you also specified the DATAFILES parameter, the import job continues to load the metadata. If the DATAFILES parameter is not specified, an ORA-39002: invalid operation error message is returned.
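A sketch of plugging in a transported tablespace with Data Pump (account, directory object, dump file, and the datafile path are hypothetical; the datafile must already have been copied to the target system):

```shell
impdp system/manager DIRECTORY=dp_dir DUMPFILE=tts.dmp \
  TRANSPORT_DATAFILES='/u01/oradata/ORCL/users01.dbf'
```

Omitting TRANSPORT_DATAFILES here is what triggers the ORA-39002 error described above.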
TTS_OWNERS: This parameter is ignored because this information is automatically stored in the Data Pump dump file set.
VOLSIZE: When the old Import VOLSIZE parameter is used, it means the location specified for the dump file is a tape device. The Data Pump Import dump file format does not support tape devices, so the operation terminates with an error.