Monday, April 20, 2009
Oracle acquires Sun Microsystems
Oracle software is widely used on the Solaris platform, so Oracle has a clear business interest in Solaris. Oracle is also a major stakeholder and a prominent player in the Java space, so for Java lovers this is pleasant news.
It wasn't long ago that Sun acquired MySQL, so this may be bad news for MySQL lovers. We'll have to wait and see what the future holds for MySQL.
It's hard to predict the impact now, but there will surely be one.
Sunday, April 19, 2009
Data Pump continued … Remap_data
Syntax
REMAP_DATA=[schema.]tablename.column_name:[schema.]pkg.function
Example
Look at table scott.dp_test
SQL> desc scott.dp_test;
Name Null? Type
----------------------------------------- -------- ----------------------------
NAME VARCHAR2(10)
AGE NUMBER
SQL> select * from scott.dp_test;
NAME AGE
---------- ----------
ahmad 25
Gairik 31
Ryan 18
Greg 29
SQL>
I need to export the data and increment everyone's age by 10. The package has a function named modify_age, which accepts an age as input and adds 10 to it.
SQL> ed
Wrote file afiedt.buf
1 create or replace package add_age as
2 function modify_age (roll_in number) return number;
3* end;
SQL> /
Package created.
SQL> create or replace package body scott.add_age as
2 function modify_age (roll_in number) return number
3 as
4 roll_out number;
5 begin
6 roll_out := roll_in + 10;
7 return(roll_out);
8 end;
9 end;
10 /
Package body created.
I am planning to import the table into the scott1 schema. I have created a new user named scott1 and a table named dp_test similar to scott.dp_test.
SQL> create user scott1 identified by xxxxxxxx;
User created.
SQL> grant create session, imp_full_database to scott1;
Grant succeeded.
SQL> create table scott1.dp_test as select * from scott.dp_test;
Table created.
SQL> delete from scott1.dp_test;
4 rows deleted.
SQL> commit;
Commit complete.
Grant read/write privileges on the exp_dir directory, where the dump file will reside.
SQL> grant read, write on directory exp_dir to scott1;
Grant succeeded.
Export the data from scott.dp_test using the REMAP_DATA parameter and the function created above.
H:\>expdp scott/xxxxxxx directory=exp_dir dumpfile=exp_dir:dp_test.dmp logfile=exp_dir:dp_test.log tables=scott.dP_test remap_data=scott.dp_test.age:scott.add_age.modify_age
Export: Release 11.1.0.6.0 - 64bit Production on Tuesday, 14 April, 2009
16:13:08
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SCOTT"."SYS_EXPORT_TABLE_01": scott/******** directory=exp_dir dumpfile=exp_dir:dp_test.dmp logfile=exp_dir:dp_test.log table
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "SCOTT"."DP_TEST" 5.453 KB 4 rows
Master table "SCOTT"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SCOTT.SYS_EXPORT_TABLE_01 is:
D:\EXP_DIR\DP_TEST.DMP
Job "SCOTT"."SYS_EXPORT_TABLE_01" successfully completed at 16:13:26
Export completed successfully
I am importing the dump file into scott1.dp_test.
H:\>impdp scott1/xxxxxxxx directory=exp_dir dumpfile=exp_dir:dp_test.dmp logfile=exp_dir:dp_testimp.log remap_schema=scott:scott1 table_exists_action=append
Import: Release 11.1.0.6.0 - 64bit Production on Wednesday, 15 April, 2009 11:24:46
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "SCOTT1"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "SCOTT1"."SYS_IMPORT_FULL_01": scott1/******** directory=exp_dir dumpfile=exp_dir:dp_test.dmp logfile=exp_dir:dp_testimp.log remap
Processing object type TABLE_EXPORT/TABLE/TABLE
ORA-39152: Table "SCOTT1"."DP_TEST" exists. Data will be appended to existing table but all dependent metadata will be skipped due to table_
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "SCOTT1"."DP_TEST" 5.453 KB 4 rows
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Job "SCOTT1"."SYS_IMPORT_FULL_01" completed with 1 error(s) at 11:24:48
Verify the Result
Let's query scott1.dp_test and see whether everyone's age has been incremented by 10.
SQL> select * from scott1.dp_test;
NAME AGE
---------- ----------
ahmad 35
Gairik 41
Ryan 28
Greg 39
SQL>
Similarly, you can write your own functions to map the data any way you want.
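For instance, a remap function could mask sensitive data instead of transforming it arithmetically. The sketch below is purely illustrative (the scramble_pkg package and mask_name function are hypothetical names, not part of the demo above); note that the remap function must accept and return the same datatype as the column it remaps.

```sql
-- Hypothetical masking package for REMAP_DATA (illustrative sketch only).
create or replace package scramble_pkg as
  function mask_name (name_in varchar2) return varchar2;
end;
/
create or replace package body scramble_pkg as
  function mask_name (name_in varchar2) return varchar2 as
  begin
    -- Keep the first character and replace the rest with 'x'
    return substr(name_in, 1, 1) || rpad('x', length(name_in) - 1, 'x');
  end;
end;
/
```

You would then export with remap_data=scott.dp_test.name:scott.scramble_pkg.mask_name, the same way the age example above uses modify_age.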
Conclusion:
Data Pump is a great tool for DBAs and developers, and Oracle 11g has made it even more powerful. The more you use it, the more you love it.
Wednesday, April 8, 2009
Data Pump continued …
------------------------
Oracle has added some new features to Data Pump in Oracle 11g. One that caught my attention is compression of the data portion of the dump file. Metadata compression was already available in Oracle 10g; Oracle 11g extends this to compress table data during export, with the flexibility to compress only the data, only the metadata, or both.
Here are some examples.
Metadata compression only
H:> expdp scott/xxxxxxx@digdb directory=exp_dir dumpfile=exp_dir:scott1.dmp logfile=exp_dir:scott1.log compression=metadata_only job_name=scott1_exp status=5
Worker 1 Status:
Process Name: DW01
State: WORK WAITING
Job "SCOTT"."SCOTT1_EXP" successfully completed at 10:18:31
Data compression
H:> expdp scott/xxxxxxx@digdb directory=exp_dir dumpfile=exp_dir:scott2.dmp logfile=exp_dir:scott2.log compression=data_only job_name=scott2_exp status=5
Worker 1 Status:
Process Name: DW01
State: WORK WAITING
Job "SCOTT"."SCOTT2_EXP" successfully completed at 10:25:23
Full compression
H:>expdp scott/xxxxxxx@digdb directory=exp_dir dumpfile=exp_dir:scott3.dmp logfile=exp_dir:scott3.log compression=all job_name=scott3_exp status=5
Worker 1 Status:
Process Name: DW01
State: WORK WAITING
Job "SCOTT"."SCOTT3_EXP" successfully completed at 10:30:11
Now take a look at the sizes of the dump files.
D:\exp_dir>dir *.dmp
Volume in drive D is Data
Volume Serial Number is 9820-5C5B
Directory of D:\exp_dir
04/02/2009 10:18 PM 299,008 SCOTT1.DMP
04/02/2009 10:25 PM 188,416 SCOTT2.DMP
04/02/2009 10:30 PM 143,360 SCOTT3.DMP
Encryption
----------------
Another great feature is the encryption option. Oracle 10g could handle encrypted columns during export/import; Oracle 11g adds the capability to encrypt entire dump file sets. Oracle 11g provides three encryption modes: 1) transparent encryption, using Oracle wallets, 2) password encryption, and 3) dual mode, a combination of the two.
How do you secure your data if you need to send it across the network? Here is the answer.
If you look at the expdp help, you can see four encryption-related parameters.
D:\exp_dir>expdp help=y
Export: Release 11.1.0.6.0 - 64bit Production on Wednesday, 08 April, 2009 11:25:13
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Keyword Description (Default)
------------------------------------------------------------------------------
ENCRYPTION Encrypt part or all of the dump file where valid keyword
values are: ALL, DATA_ONLY, METADATA_ONLY,
ENCRYPTED_COLUMNS_ONLY, or NONE.
ENCRYPTION_ALGORITHM Specify how encryption should be done where valid
keyword values are: (AES128), AES192, and AES256.
ENCRYPTION_MODE Method of generating encryption key where valid keyword
values are: DUAL, PASSWORD, and (TRANSPARENT).
ENCRYPTION_PASSWORD Password key for creating encrypted column data.
ENCRYPTION_ALGORITHM lets you choose the algorithm from the list of values. This parameter requires you to specify ENCRYPTION or ENCRYPTION_PASSWORD along with it.
Example:
expdp scott/xxxxxxx@digdb directory=exp_dir dumpfile=exp_dir:scott1.dmp logfile=exp_dir:scott1.log job_name=scott1_exp encryption=all encryption_password=hahahehe encryption_algorithm=AES256 ENCRYPTION_MODE=PASSWORD
Export: Release 11.1.0.6.0 - 64bit Production on Wednesday, 08 April, 2009 13:25:12
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SCOTT"."SCOTT1_EXP": scott/********@digdb directory=exp_dir dumpfile=exp_dir:scott1.dmp logfile=exp_dir:scott1.log jo
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 448 KB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/COMMENT
Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/POST_SCHEMA/PROCACT_SCHEMA
. . exported "SCOTT"."DEPT" 5.945 KB 4 rows
. . exported "SCOTT"."EMP" 8.578 KB 14 rows
. . exported "SCOTT"."PRAVEEN" 9.062 KB 14 rows
. . exported "SCOTT"."SALARY" 6.632 KB 2 rows
. . exported "SCOTT"."SALGRADE" 5.875 KB 5 rows
. . exported "SCOTT"."TEST" 8.992 KB 14 rows
. . exported "SCOTT"."TEST_CASE" 5.562 KB 5 rows
. . exported "SCOTT"."BONUS" 0 KB 0 rows
Master table "SCOTT"."SCOTT1_EXP" successfully loaded/unloaded
******************************************************************************
Dump file set for SCOTT.SCOTT1_EXP is:
D:\EXP_DIR\SCOTT1.DMP
Job "SCOTT"."SCOTT1_EXP" successfully completed at 13:25:30
H:\>
Let's try importing the file. First, I tried importing without providing the encryption password:
impdp praveen1/xxxxxx@digdb directory=exp_dir dumpfile=exp_dir:scott1.dmp logfile=exp_dir:scott1_imp.log job_name=scott1_imp remap_schema=scott:praveen1
Import: Release 11.1.0.6.0 - 64bit Production on Wednesday, 08 April, 2009 13:47:36
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39002: invalid operation
ORA-39174: Encryption password must be supplied.
The import failed with an encryption password error. Let me try again, this time providing the password:
>impdp praveen1/xxxxxx@digdb directory=exp_dir dumpfile=exp_dir:scott1.dmp logfile=exp_dir:scott1_imp.log job_name=scott1_imp encryption_password=hahahehe remap_schema=scott:praveen1
Import: Release 11.1.0.6.0 - 64bit Production on Wednesday, 08 April, 2009 13:46:45
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "PRAVEEN1"."SCOTT1_IMP" successfully loaded/unloaded
Starting "PRAVEEN1"."SCOTT1_IMP": praveen1/********@digdb directory=exp_dir dumpfile=exp_dir:scott1.dmp logfile=exp_dir:scot
Processing object type SCHEMA_EXPORT/USER
ORA-31684: Object type USER:"PRAVEEN1" already exists
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. . imported "PRAVEEN1"."DEPT" 5.945 KB 4 rows
. . imported "PRAVEEN1"."EMP" 8.578 KB 14 rows
. . imported "PRAVEEN1"."PRAVEEN" 9.062 KB 14 rows
. . imported "PRAVEEN1"."SALARY" 6.632 KB 2 rows
. . imported "PRAVEEN1"."SALGRADE" 5.875 KB 5 rows
. . imported "PRAVEEN1"."TEST" 8.992 KB 14 rows
. . imported "PRAVEEN1"."TEST_CASE" 5.562 KB 5 rows
. . imported "PRAVEEN1"."BONUS" 0 KB 0 rows
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
ORA-39082: Object type ALTER_PROCEDURE:"PRAVEEN1"."F_REPORT" created with compilation warnings
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/POST_SCHEMA/PROCACT_SCHEMA
Job "PRAVEEN1"."SCOTT1_IMP" completed with 2 error(s) at 13:46:50
Please ignore the errors in the import job (the user already existed and a procedure compiled with warnings); the data import itself was successful.
I haven't tried Data Pump export/import encryption using wallets yet; I'll discuss that in a future post.
REUSE_DUMPFILES
This option can be handy and save you some time: you don't have to check the dump file directory for an existing file before exporting, because the option overwrites the file if it exists.
The parameter to use is reuse_dumpfiles=y.
Eg: C:\>expdp scott/xxxxxxxx@digdb directory=exp_dir dumpfile=exp_dir:scott1.dmp logfile=exp_dir:scott1.log job_name=scott1_exp encryption=all encryption_password=hahahehe encryption_algorithm=AES256 ENCRYPTION_MODE=PASSWORD reuse_dumpfiles=y
Export: Release 11.1.0.6.0 - 64bit Production on Wednesday, 08 April, 2009 16:13:36
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SCOTT"."SCOTT1_EXP": scott/********@digdb directory=exp_dir dumpfile=exp_dir:scott1.dmp logfil
umpfiles=y
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 448 KB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/COMMENT
Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/POST_SCHEMA/PROCACT_SCHEMA
. . exported "SCOTT"."DEPT" 5.945 KB 4 rows
. . exported "SCOTT"."EMP" 8.578 KB 14 rows
. . exported "SCOTT"."PRAVEEN" 9.062 KB 14 rows
. . exported "SCOTT"."SALARY" 6.632 KB 2 rows
. . exported "SCOTT"."SALGRADE" 5.875 KB 5 rows
. . exported "SCOTT"."TEST" 8.992 KB 14 rows
. . exported "SCOTT"."TEST_CASE" 5.562 KB 5 rows
. . exported "SCOTT"."BONUS" 0 KB 0 rows
Master table "SCOTT"."SCOTT1_EXP" successfully loaded/unloaded
******************************************************************************
Dump file set for SCOTT.SCOTT1_EXP is:
D:\EXP_DIR\SCOTT1.DMP
Job "SCOTT"."SCOTT1_EXP" successfully completed at 16:13:54
REMAP_TABLES
This option helps if you need to rename a table during the import process.
Example
C:\>impdp praveen1/zzzzzz@digdb directory=exp_dir dumpfile=exp_dir:scott1.dmp logfile=exp_dir:scott1_imp.log job_name=scott1_imp encryption_password=hahahehe remap_schema=scott:praveen1 remap_table=scott.praveen:praveen1
DATA_OPTIONS
This option lets you skip constraint errors during an import process.
Example:
C:\>impdp praveen1/zzzzzz@digdb directory=exp_dir dumpfile=exp_dir:scott1.dmp logfile=exp_dir:scott1_imp.log job_name=scott1_imp encryption_password=hahahehe remap_schema=scott:praveen1 data_options=SKIP_CONSTRAINT_ERRORS
Another great feature I like in Oracle 11g Data Pump is REMAP_DATA. I'll explain it with an example in the next post.