Exporting data using Data Pump
This article describes, step by step, how to perform an export using Data Pump. First a folder is created at the operating system level; rights then need to be granted both on the OS and in the database.
Step 1: Make a directory.
Step 2: Grant rights on it, from the operating system, to the relevant users and groups.
Step 3: Connect using SQL*Plus and define the folder as a directory object.
Step 4: Grant rights through Oracle to the user who will be doing the export (see the sketch below).
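A minimal sketch of the four steps; the path, directory name and grantee (HR) are assumptions:
# Steps 1 and 2, at the operating system level:
mkdir /u01/app/oracle/dpump
chown oracle:oinstall /u01/app/oracle/dpump
# Steps 3 and 4, connected as a DBA in SQL*Plus:
CREATE OR REPLACE DIRECTORY dpump_dir1 AS '/u01/app/oracle/dpump';
GRANT READ, WRITE ON DIRECTORY dpump_dir1 TO hr;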
Retrieving information
Examples:
To export only the HR schema (more than one schema can be exported, comma-separated):
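A minimal sketch, with assumed directory and file names:
expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp SCHEMAS=hr LOGFILE=hr.log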
To perform a full export (this can only be done by users who have been granted the EXP_FULL_DATABASE role):
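A sketch, with assumed file names:
expdp system/password DIRECTORY=dpump_dir1 DUMPFILE=full.dmp FULL=y LOGFILE=full.log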
A SYSTEM user can export several schemas at once:
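For example (the HR and OE sample schemas are assumptions):
expdp system/password DIRECTORY=dpump_dir1 DUMPFILE=schemas.dmp SCHEMAS=hr,oe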
To export all the tables, indexes, views, etc. in a tablespace, every tablespace containing those objects must be listed in the TABLESPACES parameter:
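A sketch, assuming the USERS tablespace:
expdp system/password DIRECTORY=dpump_dir1 DUMPFILE=tbs.dmp TABLESPACES=users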
By default, compression is enabled for metadata only (COMPRESSION=METADATA_ONLY). To turn compression off, use COMPRESSION=NONE:
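For example:
expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=hr_nocomp.dmp SCHEMAS=hr COMPRESSION=NONE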
Parameter File_1
DIRECTORY=dpump_dir1
DUMPFILE=system1.dmp
SCHEMAS=hr
Parameter File_2
DUMPFILE=dataonly.dmp
CONTENT=DATA_ONLY
EXCLUDE=TABLE:"IN ('COUNTRIES', 'LOCATIONS', 'REGIONS')"
QUERY=employees:"WHERE department_id != 20 ORDER BY employee_id"
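Either file is then passed to expdp via PARFILE; a sketch, assuming they are saved as param1.par and param2.par (the second file names no DIRECTORY, so it falls back to the default DATA_PUMP_DIR):
expdp system/password PARFILE=param1.par
expdp hr/hr PARFILE=param2.par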
To bring the schema/data from a full Data Pump export into a specific schema and tablespace (for example, the user's own tablespace):
Source User: sourceK
Target User: targetK
Source Tablespace: SourceTBS
Target Tablespace: TargetTBS
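A sketch of the corresponding import, assuming the full export was written to full.dmp:
impdp system/password DIRECTORY=dpump_dir1 DUMPFILE=full.dmp REMAP_SCHEMA=sourceK:targetK REMAP_TABLESPACE=SourceTBS:TargetTBS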
Parameters
EXCLUDE=INDEX
EXCLUDE=SCHEMA:"='HR'"
INCLUDE=TABLE:"IN ('EMPLOYEES', 'DEPARTMENTS')"
INCLUDE=PROCEDURE
INCLUDE=INDEX:"LIKE 'EMP%'"
EXCLUDE=TABLE:"='EMP'"
EXCLUDE=FUNCTION:"='Function_Name'"
ENCRYPTION_PASSWORD=123456
The parameter examples above are taken from a web resource.
Taking a full export in parallel
A setting of PARALLEL=4 splits the job into four separate worker processes, which write to four separate files.
The %U token in the DUMPFILE name is how Oracle names the files created in parallel: it is replaced by a two-digit incrementing number, so the user doesn't have to specify each file individually. See the sketch below.
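A sketch reconstructed from the file names listed below; the directory name is an assumption:
expdp system/password FULL=y PARALLEL=4 DIRECTORY=dpump_dir1 DUMPFILE=expdpmydbfull_%U.dp LOGFILE=expdpmydbfull.log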
This run will create the following four files in the folder concerned: expdpmydbfull_01.dp, expdpmydbfull_02.dp, expdpmydbfull_03.dp, expdpmydbfull_04.dp.
It isn't necessary to import in a similarly parallel way. As a Data Pump import is up to 50x faster than the previous-generation imp utility, there is rarely a need for a parallel import.
In Release 10.2.0.1, the statistics calculation that runs as part of an import can take a long time (possibly due to a bug). A setting of EXCLUDE=STATISTICS can prove useful for performance; the statistics can be gathered after the import instead, as in the sketch below.
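A sketch: skip the statistics during the import, then gather them afterwards (the schema name is an assumption):
impdp system/password DIRECTORY=dpump_dir1 DUMPFILE=full.dmp EXCLUDE=STATISTICS
-- afterwards, in SQL*Plus:
EXEC DBMS_STATS.GATHER_SCHEMA_STATS('HR');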
Exporting a single table
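A sketch, assuming the HR.EMPLOYEES table:
expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=emp.dmp TABLES=employees LOGFILE=emp.log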
Importing the exported table to a different user
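A sketch, assuming the table was exported to emp.dmp and SCOTT is the target user:
impdp system/password DIRECTORY=dpump_dir1 DUMPFILE=emp.dmp REMAP_SCHEMA=hr:scott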
Importing all of a schema’s objects to another schema, with a full export
Don't forget to add apostrophes (') around the tablespace names in the REMAP_TABLESPACE parameter, as in the sketch below.
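A sketch, with assumed schema and tablespace names (in a parameter file; on a shell command line the apostrophes must be escaped):
impdp system/password DIRECTORY=dpump_dir1 DUMPFILE=full.dmp FULL=y REMAP_SCHEMA=hr:scott REMAP_TABLESPACE='USERS':'SCOTT_TBS'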
Importing a single table to another user.
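A sketch, assuming the table sits in a full dump:
impdp system/password DIRECTORY=dpump_dir1 DUMPFILE=full.dmp TABLES=hr.employees REMAP_SCHEMA=hr:scott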
An impdp with a query.
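A sketch, with an assumed filter (the quoting is easiest inside a parameter file):
impdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=emp.dmp TABLES=employees QUERY=employees:"WHERE department_id = 10"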
A remap_schema and remap_tablespace import.
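A sketch, with assumed schema and tablespace names:
impdp system/password DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp REMAP_SCHEMA=hr:scott REMAP_TABLESPACE=users:scott_tbs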
Exporting a random 5% of the table.
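A sketch using the SAMPLE parameter (SAMPLE=5 means roughly 5% of the rows; the table name is an assumption):
expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=emp_sample.dmp TABLES=employees SAMPLE=5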
Using Flashback to export historical data.
FLASHBACK_TIME="TO_TIMESTAMP('30-12-2007 00:13:47','DD-MM-YYYY HH24:MI:SS')"
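A sketch of the complete command this parameter appears to come from (schema and file names are assumptions; the quoting is easiest inside a parameter file, as the next section explains):
expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=hr_asof.dmp SCHEMAS=hr FLASHBACK_TIME="TO_TIMESTAMP('30-12-2007 00:13:47','DD-MM-YYYY HH24:MI:SS')"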
Exporting using a parameter file.
## Using a parameter file, especially in UNIX/Linux environments, is often faster.
UNIX and Linux shells are case-sensitive and interpret quotation marks and spaces, so specifying values such as the INCLUDE and QUERY parameters shown below directly on the command line requires extra care.
Within a parameter file, the quotation marks and space characters need no shell escaping.
Parameter file, example 1
INCLUDE=TABLE:"LIKE '%KUTUK%'"
DIRECTORY=DUMP_DIR
DUMPFILE=expdpusr_usring.dp
LOGFILE=expdpusr_usring.log
Parameter file, example 2
directory=DUMP_DIR
dumpfile=expdpfull.dp
logfile=expdpfull.log
exclude=SCHEMA:"='SYS'"
exclude=SCHEMA:"='SYSTEM'"
exclude=SCHEMA:"='SYSMAN'"
exclude=SCHEMA:"='WMSYS'"
exclude=SCHEMA:"='WK_TEST'"
exclude=SCHEMA:"='WKSYS'"
exclude=SCHEMA:"='OLAPSYS'"
exclude=SCHEMA:"='OUTLN'"
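The file is then passed via PARFILE; a sketch, assuming it is saved as expdpfull.par and that full mode is requested on the command line (the file itself lists only exclusions):
expdp system/password PARFILE=expdpfull.par FULL=y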
Exporting only the metadata, method 1.
# The problem here is that there is no workaround if you later need object types other than those that were exported.
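The original command was not preserved here; a sketch consistent with the note above, assuming only table definitions are taken:
expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=hr_meta1.dmp CONTENT=METADATA_ONLY INCLUDE=TABLE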
Exporting only the metadata, method 2.
# If you import from a dump exported this way, all of the table structures and objects will be created in the new schema, but the tables will contain no data.
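A sketch, assuming a schema-level metadata-only export:
expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=hr_meta2.dmp SCHEMAS=hr CONTENT=METADATA_ONLY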
By using the SQLFILE parameter, as shown below, the import transaction is not actually executed; instead, all of the DDL that would have been run is written to the specified file.
Any trigger script, for example, can be retrieved this way:
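A sketch, with assumed dump and output file names:
impdp system/password DIRECTORY=dpump_dir1 DUMPFILE=full.dmp SQLFILE=triggers.sql INCLUDE=TRIGGER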
Importing using network_link (without exporting beforehand)
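A sketch; source_db is an assumed database link pointing at the source database (no dump file is written, the rows come straight across the link):
impdp hr/hr DIRECTORY=dpump_dir1 NETWORK_LINK=source_db TABLES=employees LOGFILE=netimp.log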
To view the contents of the DMP file
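One way (an assumption, since the original command was not preserved) is to write the dump's DDL to a script with SQLFILE instead of importing it:
impdp system/password DIRECTORY=dpump_dir1 DUMPFILE=full.dmp FULL=y SQLFILE=contents.sql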
A full import can be run after first creating the tablespaces contained in the dump.
If a default tablespace assigned to a user can't be found, Data Pump will attempt to create it using the same datafile path as on the source, which typically fails when moving between platforms (for example, from Windows Server to Unix). If that also fails, the import operation stops.
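One standard way to handle mismatched datafile paths (an addition, not from the original text) is the REMAP_DATAFILE parameter, best placed in a parameter file because of the quotes; the paths here are assumptions:
REMAP_DATAFILE="'C:\oradata\users01.dbf':'/u01/oradata/users01.dbf'"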