(Reposted) DB2 Data Import (Import), Export (Export), and Load (Load)

2010-09-10  Source: original article  Category: Database

http://www.cublog.cn/u/26654/showart_363328.html

DB2 data movement covers:
1. Data import (Import)
2. Data export (Export)
3. Data load (Load)

The import and load commands take a data file in a given format and write its contents into a table in a DB2 database; the export command saves the data of a DB2 table into a file of a given format.

The purpose of data movement:

If you need to transfer data between different database management systems, data movement is usually the most practical approach: every database management system supports several common file formats, and this shared interface makes it easy to move data between different systems.

Of the three commands, Export is the simplest, because writing table data out to a file rarely goes wrong — the data already in the table cannot be invalid.

Before explaining the commands, here are the file formats DB2 uses for data movement. There are four:
1. ASC - a non-delimited ASCII file, that is, a stream of ASCII characters. Rows in the data stream are separated by the row delimiter, and each column within a row is defined by its start and end position. For example:

10 Head Office 160 Corporate New York
15 New England 50 Eastern Boston
20 Mid Atlantic 10 Eastern Washington
38 South Atlantic 30 Eastern Atlanta
42 Great Lakes 100 Midwest Chicago
51 Plains 140 Midwest Dallas
66 Pacific 270 Western San Francisco
84 Mountain 290 Western Denver

2. DEL - a delimited ASCII file, also a stream of ASCII characters. Rows in the data stream are separated by the row delimiter, and the column values within a row are separated by the column delimiter. File type modifiers can be used to change the default delimiters. For example:

10, "Head Office", 160, "Corporate", "New York"
15, "New England", 50, "Eastern", "Boston"
20, "Mid Atlantic", 10, "Eastern", "Washington"
38, "South Atlantic", 30, "Eastern", "Atlanta"
42, "Great Lakes", 100, "Midwest", "Chicago"
51, "Plains", 140, "Midwest", "Dallas"
66, "Pacific", 270, "Western", "San Francisco"
84, "Mountain", 290, "Western", "Denver"

3. WSF - worksheet format, used to exchange data with the Lotus line of spreadsheet products.

4. PC/IXF - an adapted version of the Integrated Exchange Format (IXF) data interchange architecture. The file consists of a sequence of variable-length records: a header record, a table record, a column descriptor record for each column of the table, and one or more data records for each row of the table. The records in a PC/IXF file are composed of fields that contain character data.

Part I: Data Export (Export)
Example 1: export all the data in the ORG table to the file C:\ORG.TXT.
export to c:\org.txt of del select * from org

Here, "of del" specifies the type of file to export — in this case a delimited ASCII (DEL) text file. The trailing "select * from org" is a SQL statement; the result set of that statement is the data that gets exported.

Example 2: changing the DEL format control characters
export to c:\staff.txt of del modified by coldel$ chardel'' decplusblank select * from staff
In this example the MODIFIED BY clause controls the various format characters. coldel sets the column delimiter, which is a comma by default and is changed here to the $ sign; chardel sets the character used to quote string fields, which is a pair of double quotation marks by default and is changed here to a pair of single quotation marks; decplusblank makes positive decimal values start with a blank instead of the plus sign that DB2 writes by default.

Example 3: exporting data in ASC format
The export command does not support the ASC format, so to get fixed-position output you have to do the conversion yourself. The idea is to cast each column to a fixed-length string and then concatenate all the columns you want to export into a single field.
For example, create a table N with the following structure:
create table n (a int, b date, c time, d varchar(5), e char(4), f double)
Then insert two rows:
insert into n values (15, '2004-10-21', '23:12:23', 'abc', 'hh', 35.2)
insert into n values (5, '2004-1-21', '3:12:23', 'bc', 'hhh', 35.672)
To export these two rows in fixed-position format, run:
export to c:\test.txt of del select char(a) || char(b) || char(c) || char(d, 5) || e || char(f) as tmp from n
The result is very close to ASC format, except that each line gains a pair of double quotes around it. You can strip the quotes with a text editor (WordPad, Notepad, and so on), or simply ignore them and allow for them later when you control the import format (ignoring the double quotes).
The file contents are:

"15 2004-10-2123.12.23abc hh 3.52E1"
"5 2004-01-2103.12.23bc hhh 3.5672E1"

Example 4: exporting large object (LOB) data
export to d:\myfile.del of del lobs to d:\lob\ lobfile lobs modified by lobsinfile select * from emp_photo
This command exports the data in the EMP_PHOTO table to the file d:\myfile.del; the result looks like this:
"000130", "bitmap", "lobs.001.0.43690 /"
"000130", "gif", "lobs.001.43690.29540 /"
"000130", "xwd", "lobs.001.73230.45800 /"
"000140", "bitmap", "lobs.001.119030.71798 /"
"000140", "gif", "lobs.001.190828.29143 /"
"000140", "xwd", "lobs.001.219971.73908 /"
"000150", "bitmap", "lobs.001.293879.73438 /"
"000150", "gif", "lobs.001.367317.39795 /"
"000150", "xwd", "lobs.001.407112.75547 /"
"000190", "bitmap", "lobs.001.482659.63542 /"
"000190", "gif", "lobs.001.546201.36088 /"
"000190", "xwd", "lobs.001.582289.65650 /"
The third field is of BLOB type, and the file stores only a marker for it — effectively a pointer. The real LOB data is stored under d:\lob in a series of files named lobs.001, lobs.002, and so on. The path given after LOBS TO specifies where the large object data is stored (note that the path must already exist, otherwise the command fails); LOBFILE specifies the base name of the files that hold the large object data. Do not give an extension: DB2 appends .001, .002, and so on automatically, based on the amount of data. And do not forget the MODIFIED BY lobsinfile clause.

Example 5: saving export messages to a message file.
export to d:\awards.ixf of ixf messages d:\msgs.txt select * from staff where dept = 20
This exports the rows of the STAFF table with dept = 20 to the file d:\awards.ixf, and stores all export messages (success, failure, and warnings) in the file d:\msgs.txt, so the administrator can diagnose problems by examining the message file.

Example 6: renaming the columns of the exported data.
export to d:\awards.ixf of ixf method n (c1, c2, c3, c4, c5, c6, c7) messages d:\msgs.txt select * from staff where dept = 20
By default, the fields in the exported file are named after the corresponding table columns. The METHOD N clause lets you rename each column. Note that this clause is only valid for IXF and WSF files; it cannot be used with text files.
Part II: Data Import (Import)

Example 7: import the data in the file C:\org.txt into the ORG table
import from c:\org.txt of del insert into org

The import command largely mirrors the export command: IMPORT corresponds to EXPORT, FROM corresponds to TO, and the file name and file type have the same meaning. One difference is that the import command supports the ASC format while export does not. Another is that the export command ends with a SQL statement selecting the data to export, whereas the import command ends not with a SQL statement but with the insert mode and the name of the target table.

Example 8: importing data from an ASC file
import from c:\org2.txt of asc method l (1 5, 6 19, 20 25, 26 37, 38 50) insert into org
The METHOD L clause specifies the start and end position of each field in the text file; the start and end positions of a field are separated by a space, and the fields are separated by commas.
Besides METHOD L there are also METHOD N and METHOD P, described below.

Example 9: importing data with METHOD N and creating a new table.
First, export a file to work with:
export to d:\org.ixf of ixf method n (a, b, c, d, e) select * from org
The file org.ixf now contains five columns of data, named a, b, c, d and e.
Then import the data from the file into a new table:
import from d:\org.ixf of ixf method n (d, e, b) replace_create into orgtest
This command selects three of the columns in the file to import, and they do not have to be listed in the order in which they appear in the file. The REPLACE_CREATE mode is described below.

The insert modes are:
INSERT - appends the new data to the data that already exists in the table.
INSERT_UPDATE - can only be used with tables that have a primary key. If an inserted row does not conflict with an existing primary key value, it is inserted directly; if there is a primary key conflict, the new data replaces the existing row (a minimal sketch follows this list).
REPLACE - first deletes all existing data in the table, then inserts the data into the now-empty table.
REPLACE_CREATE - if the table exists, first deletes its data and then inserts the data into the empty table; if the table does not exist, first creates it based on the fields in the file and then inserts the data into it. This mode can only be used when importing data from IXF files.
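
Here is a minimal sketch of INSERT_UPDATE. It assumes a copy of ORG called ORGCOPY (a table name used only for this illustration) whose DEPTNUMB column is NOT NULL, as it is in the SAMPLE database, so that a primary key can be added:
export to d:\orgcopy.ixf of ixf select * from org
create table orgcopy like org
alter table orgcopy add primary key (deptnumb)
import from d:\orgcopy.ixf of ixf insert_update into orgcopy
Rows in the file whose DEPTNUMB already exists in ORGCOPY update the existing rows; all other rows are inserted.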

Example 10: importing data with METHOD P
import from d:\org.ixf of ixf method p (4, 5, 2) replace into orgtest
This example has the same effect as Example 9, except that METHOD N is replaced by METHOD P, whose list only needs the ordinal positions of the columns in the file rather than their names. This example also uses REPLACE mode, which deletes the existing data in the table and then inserts the data into the empty table.

Example 11: handling null values on import. With the IXF format, importing null values is straightforward because the file itself records the null-value information. The ASC format is harder, because DB2 would insert a blank rather than a NULL. For this, DB2 provides the NULL INDICATORS clause.

import from c:\org2.txt of asc MODIFIED BY nullindchar=# method l (1 5, 6 19, 20 25, 26 37, 38 50) NULL INDICATORS (0, 0, 0, 0, 38) replace into org

In this example, the list after NULL INDICATORS says that the first four fields never contain null values, while the null indicator for the fifth field is at position 38 of each row. The MODIFIED BY nullindchar=# clause says that when the # character is found at that position, the fifth field is treated as NULL.

That is all for now — additions are welcome. Next, the Load command.
Part III: Data Load (Load)

The format of the Load command is similar to the import command — the keyword is LOAD — but it takes far more parameters than import; see the DB2 documentation for the detailed syntax.

Load is similar to import in that both move data from an input file into a target table; the differences are explained step by step in the examples below.

The target table must already exist before a load.

Load performs better than import; the reasons are explained in detail with the examples later.

The load operation is not logged, so you cannot roll it forward using the log files.

Load runs in four phases:
1. Load phase. Two things happen in this phase: the data is written into the table, and index keys are collected and sorted. During the load, the DBA can specify how often consistency points are generated.

A consistency point is a checkpoint for the load utility. If the load is interrupted during execution, it can resume from the last consistency point.

2. Build phase. In the build phase, indexes are created from the index key information gathered during the load phase. If an error occurs during the build phase, the load utility restarts it from the beginning of the build phase.

3. Delete phase. In the delete phase, all rows that violate a unique or primary key constraint are deleted and copied to an exception table (if one is specified in the statement). When an input row is rejected, a message is written to the message file.

4. Index copy phase. If a system temporary table space was specified for index creation during the load operation and the READ ACCESS option was selected, the index data is copied from the system temporary table space back to the original table space.

All four phases are part of a single operation; the load is complete only when all four phases have finished. Messages are generated in each phase, and if an error occurs in one of the phases, these messages help the DBA analyze and resolve the problem.

Import, by contrast, must check for every inserted row whether the data satisfies the constraints, and every row is written to the log.

Now let us look at some features that are unique to the LOAD command; things that IMPORT can do equally well are not covered again in detail.

Example 12: loading from a cursor. First define a cursor:
declare mycur cursor for select * from org
Create a new table whose structure is compatible with the cursor:
create table org2 like org
Load from the cursor:
load from mycur of cursor insert into org2

Besides cursors, load can also read from files, pipes and devices, whereas the import command can only import from a file.

Example 13: user-defined exception tables. An exception table can be used to store rows that violate a unique constraint or the primary key. If no exception table is specified for the load, rows that violate a unique constraint are simply discarded, with no later chance to recover or correct them.
The experiment uses the STAFF table of the SAMPLE database.
1. Create a table STAFF1 with the same structure as STAFF:
CREATE TABLE STAFF1 LIKE STAFF

2. Insert part of the data in the STAFF table into STAFF1:
INSERT INTO STAFF1 SELECT * FROM STAFF WHERE ID <= 160

3. Then create a table STAFFEXP with the same structure as STAFF1, to serve as the exception table:
CREATE TABLE STAFFEXP LIKE STAFF1

4. Extend the exception table. Compared with an ordinary table, an exception table has the same leading columns plus one or two extra columns at the end (the column names are arbitrary): the first extra column is a TIMESTAMP that records when the exception row was inserted, and the second is a large character type (a CLOB of at least 32 KB) that stores the specific constraint information explaining why the row was rejected. In this example only the timestamp column is added.
ALTER TABLE STAFFEXP ADD COLUMN TIME TIMESTAMP

5. Create a unique index on STAFF1:
CREATE UNIQUE INDEX IDXSTAFF ON STAFF1 (ID)

6. First run the export command to produce a text file:
EXPORT TO D:\STAFF.TXT OF DEL SELECT * FROM STAFF

7. Then run the load command to load the data into the STAFF1 table:
LOAD FROM D:\STAFF.TXT OF DEL INSERT INTO STAFF1 FOR EXCEPTION STAFFEXP

Because STAFF1 has a unique index, some of the rows violate this constraint and cannot be inserted into STAFF1; those rows are inserted into the exception table STAFFEXP instead.

Note that the exception table must be defined in advance; the load command cannot generate it automatically, and if the specified exception table cannot be found, the command fails.

Example 14: DUMP files. Rows whose format is incorrect are rejected. By specifying the DUMPFILE file type modifier, these rejected records can be written to a separate, named file.
The experiment again uses the STAFF table of the SAMPLE database.
1. Create a table STAFF1 with the same structure as STAFF:
CREATE TABLE STAFF1 LIKE STAFF

2. Insert part of the data in the STAFF table into STAFF1:
INSERT INTO STAFF1 SELECT * FROM STAFF WHERE ID <= 160

3. Then create a table STAFFEXP with the same structure as STAFF1, to serve as the exception table:
CREATE TABLE STAFFEXP LIKE STAFF1

4. Add the timestamp column to the exception table:
ALTER TABLE STAFFEXP ADD COLUMN TIME TIMESTAMP

5. Create a unique index on STAFF1:
CREATE UNIQUE INDEX IDXSTAFF ON STAFF1 (ID)

6. First run the export command to produce a text file:
EXPORT TO D:\STAFF.TXT OF DEL SELECT * FROM STAFF
Open the STAFF.TXT file on the D drive and replace the row whose first column equals 320 with: "abcf", "aaa", "sdfg"

7. Then run the load command to load the data into the STAFF1 table:
LOAD FROM D:\STAFF.TXT OF DEL MODIFIED BY DUMPFILE=d:\dump INSERT INTO STAFF1 FOR EXCEPTION STAFFEXP

The load reports results like the following:
SQL3118W The field value in row "32" and column "1" cannot be converted to a SMALLINT value, but the target column is not nullable. The row was not loaded.
SQL3185W The previous error occurred while processing data from row "32" of the input file.

Open the dump.000 file on the D drive and you will see the row that caused the error: "abcf", "aaa", "sdfg"

This example shows that a row whose format is incorrect is rejected by the load and written to the DUMP file, while a row whose format is correct but which does not satisfy the table's constraints is written to the exception table.

Example 15: limiting the number of rows loaded. The ROWCOUNT option specifies how many records, counted from the beginning of the file, are loaded:
LOAD FROM D:\STAFF.TXT OF DEL ROWCOUNT 3 INSERT INTO STAFF1

Example 16: failing the load when warnings occur. In some cases all the data in the file must be imported into the target table for the operation to count as successful; even a single bad record is unacceptable. In that case you can use the WARNINGCOUNT option.

Open the STAFF.TXT file on the D drive and replace the row whose first column equals 320 with: "abcf", "aaa", "sdfg"

LOAD FROM D:\STAFF.TXT OF DEL WARNINGCOUNT 1 INSERT INTO STAFF1

The output of the operation includes the following warnings:
SQL3118W The field value in row "32" and column "1" cannot be converted to a SMALLINT value, but the target column is not nullable. The row was not loaded.
SQL3185W The previous error occurred while processing data from row "32" of the input file.
SQL3502N The utility encountered "1" warnings, which exceeds the maximum number of warnings allowed.

At this point the STAFF1 table cannot be used. For example,
SELECT * FROM STAFF1
returns:
ID NAME DEPT JOB YEARS SALARY COMM
------ --------- ------ ----- ------ --------- ---------
SQL0668N Operation not allowed for reason code "3" on table "USER.STAFF1".
SQLSTATE=57016

Reason: the table is in Load Pending state. A previous LOAD attempt on this table failed. No access to the table is allowed until the LOAD operation is restarted or terminated.

The solution is to issue a LOAD command with the RESTART or TERMINATE option to restart or terminate the previously failed LOAD operation on this table.

A LOAD command containing TERMINATE terminates the load process and returns the target table to a usable state:
LOAD FROM D:\STAFF.TXT OF DEL TERMINATE INTO STAFF1

A LOAD command containing RESTART can be used, after the source file has been corrected, to restart the load process:
LOAD FROM D:\STAFF.TXT OF DEL RESTART INTO STAFF1

Example 17: suppressing warning messages. The NOROWWARNINGS file type modifier disables the warning messages generated during the load. When a load would otherwise produce a large number of warnings the user is not interested in, this option can greatly improve load efficiency.

Open the STAFF.TXT file on the D drive and replace the row whose first column equals 320 with: "abcf", "aaa", "sdfg"

LOAD FROM D:\STAFF.TXT OF DEL MODIFIED BY NOROWWARNINGS INSERT INTO STAFF1

When the command finishes, row 32 still fails and is not loaded, but no warning message is generated.

Example 18: collecting statistics
The STATISTICS option collects statistics during the load; the optimizer uses these statistics to determine the most efficient way to execute SQL statements.
Statistics of different levels of detail can be collected for tables and indexes:

① The most detailed statistics for both the table and the indexes:
LOAD FROM D:\STAFF.TXT OF DEL REPLACE INTO STAFF1 STATISTICS YES WITH DISTRIBUTION AND DETAILED INDEXES ALL

② Basic statistics for both the table and the indexes:
LOAD FROM D:\STAFF.TXT OF DEL REPLACE INTO STAFF1 STATISTICS YES AND INDEXES ALL

Other combinations can be found in the DB2 documentation.

NOTE: the STATISTICS option is only compatible with REPLACE; it is not compatible with INSERT.

Also, the statistics collected by the STATISTICS option are not displayed anywhere directly; if you want to see the results, you have to query the system catalog tables yourself.
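
For example, one way to confirm that the load refreshed the table statistics is to query the SYSCAT.TABLES catalog view (a minimal sketch; adjust the schema filter to your own environment):
SELECT TABNAME, CARD, NPAGES, STATS_TIME FROM SYSCAT.TABLES WHERE TABSCHEMA = CURRENT SCHEMA AND TABNAME = 'STAFF1'
CARD is the estimated row count and STATS_TIME is when the statistics were last collected; before any statistics are gathered, CARD is -1 and STATS_TIME is NULL.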

Example 19: clearing the check pending state
1. Connect to the SAMPLE database:
connect to sample

2. Create a table STAFF1 with the same structure as STAFF:
CREATE TABLE STAFF1 LIKE STAFF

3. Add a check constraint to the table:
alter table staff1 add constraint chk check (dept <100)

4. Open the STAFF.TXT file on the D drive and change the third column of the last row to 150, so that this row no longer satisfies the check constraint added in step 3. Then use the Load command to load the data from the file into the STAFF1 table:
LOAD FROM D:\STAFF.TXT OF DEL INSERT INTO STAFF1

5. Then run a query:
Select * from staff1
You will get an error message:
SQL0668N Operation not allowed for reason code "1" on table "USER.STAFF1".
SQLSTATE=57016
The reason is that the loaded data violates the check constraint, which puts the table into check pending state.

6. To take the table out of check pending state, use:
set integrity for staff1 check immediate unchecked
Run the query again:
Select * from staff1
The table is now usable, and the row that violates the check rule is still present in it.
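
IMMEDIATE UNCHECKED simply tells DB2 to trust the data without verifying it, which is why the violating row survives. If you want DB2 to actually verify the rows instead, a sketch of the checked form is shown below; it assumes an exception table (here called STAFF1EXP, a name chosen only for this illustration) with the same structure as STAFF1:
create table staff1exp like staff1
set integrity for staff1 immediate checked for exception in staff1 use staff1exp
With this form, rows that violate the check constraint are moved into STAFF1EXP, and STAFF1 leaves check pending state containing only valid data.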

Example 20: performance factors. When importing data from a file into a table, especially when the amount of data is large, the load command shows a clear advantage. Unlike import, which inserts one row at a time and checks for each row whether it satisfies the constraints, load reads data from the input file, builds pages, and writes those pages directly into the database, without checking row by row whether the data satisfies the constraints; in addition, load does not write to the log. All of these factors make load more efficient than import.

The load command also has several options that affect performance:
1. COPY YES / NO and NONRECOVERABLE
① NONRECOVERABLE: specifies that the load operation is not recoverable and cannot be recovered by a subsequent roll-forward operation. Roll-forward ignores the transaction and marks the table into which the data was loaded as "invalid".

② COPY NO (the default): in this case, if archive logging is enabled for the database, then when the load completes, the table space containing the table is placed in backup pending state, and it cannot be written to until a backup of the database or of the table space has been taken. The reason is that the changes made by the load operation are not logged, so a backup of the database or the table space is required in order to recover past the load operation after a failure.

③ COPY YES: in this case, if archive logging is enabled for the database, the changes made by the load operation are saved to tape, to a directory, or to a TSM server, and the table space is not placed in backup pending state.

2. FASTPARSE
This file type modifier reduces the amount of checking done on the data. It should only be used when the data is known to be correct, and it applies particularly to DEL and ASC files.

3. ANYORDER
If the SAVECOUNT option is not used, this parameter allows the data to be loaded in an order different from the order in the input file. On an SMP (symmetric multiprocessor) system with CPU_PARALLELISM greater than 1, this parameter improves load performance.

4. DATA BUFFER
This parameter specifies the number of 4 KB memory pages to allocate from the utility heap as the load's internal buffer; specifying a large buffer helps improve load performance.

5. CPU_PARALLELISM
This option, which should only be used on SMP systems, indicates how many processes or threads are used to parse, convert, and format the data.

6. DISK_PARALLELISM
This option specifies the number of processes or threads used to write data to disk.
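
To tie these options together, here is a sketch of a single load that combines several of them; the values are only illustrative, and the option order follows the LOAD command reference, so check it against your DB2 version before running:
LOAD FROM D:\STAFF.TXT OF DEL MODIFIED BY FASTPARSE ANYORDER MESSAGES D:\MSGS.TXT REPLACE INTO STAFF1 NONRECOVERABLE DATA BUFFER 2000 CPU_PARALLELISM 2 DISK_PARALLELISM 2
FASTPARSE and ANYORDER are file type modifiers, so they go in the MODIFIED BY clause, while NONRECOVERABLE, DATA BUFFER, CPU_PARALLELISM and DISK_PARALLELISM are load parameters that follow the INTO clause.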
