exp/imp examples
Run exp help=y to view the built-in help.
1. exp usr/pwd@sid file=c:\tb.dump tables=TB1
   To export multiple tables: tables=(tb1,tb2)
2. exp usr/pwd@sid file=c:\tb.dump full=y        (export everything)
3. exp usr/pwd@sid file=c:\tb.dump owner=(system,sys)
   Exports the tables owned by the users system and sys.
4. exp usr/pwd@sid file=c:\tb.dump tables=TB1 query=\"where name='ha'\"
   Note the placement of the escaped quotation marks.
The following commands display the full list of options:
c:\> imp help=y
c:\> exp help=y
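The import side mirrors the export examples above. A minimal sketch, with usr/pwd@sid and the file name as placeholders (ignore=y tells imp to ignore "object already exists" errors when loading into pre-created tables):

```
imp usr/pwd@sid file=c:\tb.dump tables=TB1
imp usr/pwd@sid file=c:\tb.dump full=y ignore=y
```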
About Oracle Export and Import
1. Uses of Export/Import
Oracle's Export/Import tools are used to transfer data between databases.
Export writes data from a database into a dump file; Import loads data from a dump file into a database.
Typical uses:
(1) transferring data between two databases
    - between the same version of Oracle Server
    - between different versions of Oracle Server
    - between the same type of OS
    - between different types of OS
(2) database backup and recovery
(3) moving data from one SCHEMA to another
(4) moving data from one TABLESPACE to another
2. The DUMP file
The file EXPORT produces is binary; do not hand-edit it, or you will corrupt the data.
The file format is identical on every platform Oracle supports, so a DUMP file can be used across all common platforms.
IMPORT handles DUMP files in an upward-compatible way: an ORACLE7 DUMP file can be imported into ORACLE8, but between widely separated versions there may be problems.
3. The EXPORT/IMPORT process
The DUMP file EXPORT produces contains two basic types of content: the DDL statements needed to re-create the objects, and the data itself.
The DDL in the DUMP file is stored in essentially readable form.
Even so, do not edit the file with a text editor; Oracle does not support doing so.
4. IMPORT order
When importing objects, Oracle follows a specific order, which may vary between database versions.
This order mainly resolves problems that could arise from dependencies between objects. TRIGGERs are imported last, so the INSERTs that load the data into the database do not fire them. Some PROCEDUREs may be left with status INVALID after the import: IMPORT can affect database objects they depend on, but it does not recompile the PROCEDUREs, which causes this situation.
Recompiling them solves the problem.
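To find and fix the objects left INVALID, something along these lines works (a sketch; my_proc is a placeholder name):

```sql
-- list the objects left INVALID after the import
SELECT object_name, object_type FROM user_objects WHERE status = 'INVALID';

-- recompile each affected procedure
ALTER PROCEDURE my_proc COMPILE;
```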
5. Compatibility issues
The IMPORT tool can process DUMP files created by EXPORT release 5.1.22 and later, so ORACLE7's IMPORT can
process an ORACLE6 DUMP file, and so on. But if the versions differ greatly, it may not cope. For specific issues, consult the relevant documentation, for example the settings of the COMPATIBLE parameter.
6. The views EXPORT needs
EXPORT relies on views created by CATEXP.SQL; these views organize the data into the DUMP file format.
Most of the views are used to collect the DDL statements; the rest are used mainly by Oracle developers.
These views may differ between Oracle versions, and each version may add new features.
Therefore, running a new version against an old dump file can produce errors; in general, running CATEXP.SQL solves these problems.
The general approach to backward compatibility is as follows.
When the exporting database is older than the target database:
- run the old version's CATEXP.SQL in the target database
- create the DUMP file with the old version's EXPORT
- load it with the old version's IMPORT
- run the new version's CATEXP.SQL in the target database, to restore that version's EXPORT views
When the exporting database is newer than the target database:
- run the new version's CATEXP.SQL in the target database
- create the DUMP file with the new version's EXPORT
- load it with the new version's IMPORT
- run the old version's CATEXP.SQL in the target database, to restore that version's EXPORT views
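For the first case (old source, newer target), the steps might look like the following; the SIDs, passwords, and file names are placeholders:

```
# in the target database, as a DBA, run the OLD release's catexp.sql
# (copied from the old ORACLE_HOME), e.g. from SQL*Plus:  @catexp.sql

# export with the OLD release's exp client
exp system/manager@source_sid full=y file=full_old.dmp

# import with the OLD release's imp client
imp system/manager@target_sid full=y file=full_old.dmp

# then re-run the NEW release's catexp.sql in the target database
# to restore its own EXPORT views
```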
7. Defragmentation
A very important application of EXPORT/IMPORT is defragmenting a table: IMPORT re-runs CREATE TABLE and re-inserts the data, so the whole table ends up stored contiguously. By default, EXPORT generates a "compressed" (COMPRESS) table definition in the DUMP file, but in many cases this compression is misunderstood. In fact, COMPRESS only changes the value of the INITIAL storage parameter. For example:
CREATE TABLE .... STORAGE (INITIAL 10K NEXT 10K ..)
Suppose the table has now grown to 100 extents. If the data is exported with COMPRESS=Y,
the generated statement reads STORAGE (INITIAL 1000K NEXT 10K).
As you can see, NEXT is unchanged, while INITIAL becomes the sum of all existing extents. So consider the following situation: table A has 4 extents of 100M; you run DELETE FROM A and then export with COMPRESS=Y. The generated CREATE TABLE statement will request an INITIAL extent of 400M, even though the table now holds no data! The DUMP file is tiny, yet the IMPORT creates a huge table.
In addition, INITIAL may exceed the size of a DATAFILE. For example, suppose there are four 50M data files, and table A has
15 extents of 10M each. If the data is exported with COMPRESS=Y, the result is INITIAL=150M;
on re-import, a 150M extent cannot be allocated, because a single extent cannot span data files.
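The arithmetic behind COMPRESS=Y can be sketched in a few lines of shell (toy numbers only; these are not Oracle commands):

```shell
# With COMPRESS=Y, EXPORT sets INITIAL to the sum of all
# allocated extents; NEXT is left unchanged.

# Table that grew to 100 extents of 10K each:
echo "INITIAL $((100 * 10))K"     # INITIAL 1000K

# Table A: 4 extents of 100M each, still allocated after DELETE FROM A:
echo "INITIAL $((4 * 100))M"      # INITIAL 400M

# 15 extents of 10M in a database whose data files are 50M each:
echo "INITIAL $((15 * 10))M"      # INITIAL 150M: too big for any single data file
```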
8. Moving data between USERs and TABLESPACEs
Under normal circumstances, EXPORTed data should be restored to its original place. If a table belonging to user SCOTT is exported in TABLE or USER mode and user SCOTT does not exist in the target database, the IMPORT fails. Data exported in FULL mode carries the CREATE USER information, and will therefore create the USERs needed to store the data.
Of course, you can use the IMPORT parameters FROMUSER and TOUSER to choose which USER to import into, but you must ensure
that the TOUSER already exists.
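A minimal FROMUSER/TOUSER sketch (scott and blake are placeholder accounts; blake must already exist in the target database):

```
imp system/manager@sid file=c:\tb.dump fromuser=scott touser=blake
```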
9. Effects of EXPORT/IMPORT on SEQUENCEs
In two cases, EXPORT/IMPORT can leave a SEQUENCE inconsistent:
(1) if users are taking values from the SEQUENCE while the EXPORT runs, the SEQUENCE may end up inconsistent with the exported data;
(2) if the SEQUENCE uses a CACHE, the values in the CACHE are ignored at EXPORT time:
what is exported is the current value recorded in the data dictionary.
In a FULL-mode EXPORT/IMPORT, if a column populated from the sequence is updated just before the export and neither of the two cases above applies, the pre-update data is what gets exported.
With the conventional path, each row is loaded with an INSERT statement, so consistency checks and INSERT TRIGGERs fire.
With the DIRECT path, some constraints and triggers may not fire; if a trigger uses sequence.nextval, this will affect the sequence.
Finally, one more thing worth saving.
To view the size (in MB) of each table in the current user's schema:
SELECT segment_name, SUM(bytes)/1024/1024 FROM user_extents GROUP BY segment_name;
To view the space (in MB) occupied in each tablespace:
SELECT tablespace_name, SUM(bytes)/1024/1024 FROM dba_segments GROUP BY tablespace_name;