SQL> create or replace directory export_dir as '/opt/oracle/admin/export';
SQL> grant read, write on directory export_dir to fmosse;
DATA_PUMP_DIR directory
By default, Oracle creates the DATA_PUMP_DIR directory object, which references the file system directory '/opt/oracle/admin/<sid>/dpdump/'. A user must have the EXP_FULL_DATABASE or IMP_FULL_DATABASE role to use the DATA_PUMP_DIR database directory object.
If for some reason the file system directory defined in DATA_PUMP_DIR is not found, Oracle uses the $ORACLE_HOME/rdbms/log directory instead.
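To check which file system path DATA_PUMP_DIR currently points to, a query against the DBA_DIRECTORIES view can be used, along the lines of the following sketch (it assumes the connected user can read that view):
SQL> select directory_path from dba_directories where directory_name = 'DATA_PUMP_DIR';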
Data Pump Export Modes
Export provides different modes for unloading different portions of the database. The mode is specified on the command line, using the appropriate parameter. The available modes are as follows:
- Full Export Mode
- Schema Mode
- Table Mode
- Tablespace Mode
- Transportable Tablespace Mode
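As an illustration of tablespace mode, which is not covered by the examples below, here is a sketch of a command that exports only the objects stored in a given tablespace; the tablespace name USERS and the file names are assumptions:
expdp system tablespaces=USERS directory=DATA_PUMP_DIR dumpfile=export_users_ts.dmp logfile=export_users_ts.log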
Exporting the full database
expdp system content=ALL full=y directory=DATA_PUMP_DIR dumpfile=export_full.dmp logfile=export_full.log
The 'content' parameter specifies which data to export. This parameter can have the following values:
- ALL : to export all data and metadata
- DATA_ONLY : to export only the data
- METADATA_ONLY : to export only the structure (metadata) of the database
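For example, a sketch of a metadata-only export of the full database (the file names are assumptions):
expdp system content=METADATA_ONLY full=y directory=DATA_PUMP_DIR dumpfile=export_metadata.dmp logfile=export_metadata.log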
Exporting some schemas
expdp system content=ALL schemas=fmosse directory=DATA_PUMP_DIR dumpfile=export_full.dmp logfile=export_full.log
Exporting some tables of a specific schema
expdp system content=ALL tables=fmosse.TABLE1,fmosse.TABLE2,fmosse.TABLE3 directory=DATA_PUMP_DIR dumpfile=export_full.dmp logfile=export_full.log
The schema is given as a prefix of each table name, because the 'schemas' and 'tables' parameters cannot be combined in the same expdp job.
Exporting a database into a set of dump files, each not exceeding a specific size
expdp system content=all full=y directory=DATA_PUMP_DIR dumpfile=export_full_%u.dmp logfile=export_full.log filesize=250000
If the 'parallel' parameter is used together with the %U substitution variable, Oracle initially creates as many dump files as the value of the 'parallel' parameter. Oracle can create more files than the 'parallel' value if the specified 'filesize' requires it.
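As a sketch of that behaviour, the following command (values are assumptions) starts with four dump files and adds export_full_05.dmp, export_full_06.dmp and so on if 250000 bytes per file is not enough:
expdp system content=all full=y directory=DATA_PUMP_DIR dumpfile=export_full_%u.dmp logfile=export_full.log filesize=250000 parallel=4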
Exporting a database using several processes in parallel
expdp system content=all full=y directory=DATA_PUMP_DIR dumpfile=export_full.dmp logfile=export_full.log parallel=4
Overwriting export files if they already exist
expdp system content=all full=y directory=DATA_PUMP_DIR dumpfile=export_full_%u.dmp logfile=export_full.log parallel=4 filesize=250000 reuse_dumpfiles=Y
Estimating the size of the export
expdp system content=all full=y directory=DATA_PUMP_DIR estimate_only=y estimate=statistics
If estimate_only=y is specified, the dumpfile parameter must not be defined.
The 'estimate' parameter defines the algorithm to use for estimation. The available values are:
- BLOCKS : the default; the estimate is computed from the number of database blocks used by the objects
- STATISTICS : the estimate is computed from the statistics of each table
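For completeness, a sketch of the same estimate using the default BLOCKS method (no dump file is written in either case):
expdp system content=all full=y directory=DATA_PUMP_DIR estimate_only=y estimate=blocks logfile=estimate.log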
Importing data
impdp hr directory=DATA_PUMP_DIR dumpfile=hr_export.dmp content=all tables="EMPLOYEES" parallel=1
Data Pump import transformations
It is possible with the impdp tool to remap parts of the objects when importing a dump file:
- REMAP_DATAFILE : to remap datafiles
- REMAP_TABLESPACE : to remap tablespaces
- REMAP_SCHEMA : to remap a schema to another one
- REMAP_TABLE : to remap the name of a table
- REMAP_DATA : to alter data before the import
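For example, a sketch of an import that remaps one schema and one tablespace; the target names fmosse_dev and USERS_DEV, as well as the file names, are assumptions:
impdp system directory=DATA_PUMP_DIR dumpfile=export_full.dmp logfile=import_remap.log remap_schema=fmosse:fmosse_dev remap_tablespace=USERS:USERS_DEV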
Consistent backup
Use the flashback_time parameter to make the export consistent as of a specific point in time, for example:
flashback_time=to_timestamp( '24/10/2011 18:19:00', 'DD/MM/YYYY HH24:MI:SS')
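To avoid shell quoting issues with the flashback_time value, here is a sketch using a parameter file; the file name export_consistent.par and the dump/log file names are assumptions. With export_consistent.par containing:
full=y
directory=DATA_PUMP_DIR
dumpfile=export_consistent.dmp
logfile=export_consistent.log
flashback_time="to_timestamp('24/10/2011 18:19:00', 'DD/MM/YYYY HH24:MI:SS')"
the export is then run with:
expdp system parfile=export_consistent.par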