Datapump tips
Exporting tables from multiple schemas at once
$ expdp system/manager dumpfile=test.dmp logfile=test.log directory=TESTDIR schemas=user1,user2 include=table:\"in \(\'TEST1\',\'TEST2\',\'TEST\'\)\"

Export: Release 10.2.0.3.0 - 64bit Production on Thursday, 11 July, 2013 15:23:12

Copyright (c) 2003, 2005, Oracle. All rights reserved.

Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - 64bit Production
With the Partitioning, OLAP and Data Mining options
Starting "SYSTEM"."SYS_EXPORT_SCHEMA_01": system/******** dumpfile=test.dmp logfile=test.log directory=TESTDIR schemas=user1,user2 include=table:"in ('TEST1','TEST2','TEST')"
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 128 KB
Processing object type SCHEMA_EXPORT/TABLE/TABLE
. . exported "USER1"."TEST1" 4.914 KB 1 rows
. . exported "USER2"."TEST2" 4.914 KB 1 rows
. . exported "USER1"."TEST" 0 KB 0 rows
. . exported "USER2"."TEST" 0 KB 0 rows
Master table "SYSTEM"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYSTEM.SYS_EXPORT_SCHEMA_01 is:
/app/oracle/test.dmp
Job "SYSTEM"."SYS_EXPORT_SCHEMA_01" successfully completed at 15:23:31
Datapump trace
Data Pump tracing is enabled with the TRACE parameter, which takes a 7-digit hexadecimal value. The value must follow a few rules.
Within the 7-digit hexadecimal number:
- The first 3 digits enable tracing for a specific Data Pump component.
- The remaining 4 digits are usually 0300.
- Specifying more than 7 hexadecimal digits is not allowed; doing so results in
UDE-00014: invalid value for parameter, 'trace'.
- A leading 0x (hexadecimal prefix) is not allowed.
- The value must be specified in hexadecimal; decimal is not accepted.
- Leading zeros can be omitted, so the value may be fewer than 7 digits.
- Values are not case sensitive.
Most errors that occur during a Data Pump job can be diagnosed by creating trace files for the Master Control Process (MCP) and the Worker process(es) only.
With standard tracing, the trace files are generated in BACKGROUND_DUMP_DEST:
- Master Control Process trace file: <SID>_dm<number>_<process_id>.trc
- Worker Process trace file: <SID>_dw<number>_<process_id>.trc
With full tracing, the same two trace files are generated in BACKGROUND_DUMP_DEST, plus one additional trace file in USER_DUMP_DEST:
- Shadow Process trace file: <SID>_ora_<process_id>.trc
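To know where these files will land, the two destinations can be queried from the instance. A minimal sketch (on 11g and later the ADR locations in V$DIAG_INFO apply instead of these parameters):

-- Show where Data Pump trace files will be written (pre-11g style parameters)
SELECT name, value
  FROM v$parameter
 WHERE name IN ('background_dump_dest', 'user_dump_dest');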
The list of trace levels in Data Pump is shown below.
 Trace    DM   DW   ORA   Lines
 level    trc  trc  trc   in
 (hex)    file file file  trace  Purpose
-------   ---- ---- ----  -----  -----------------------------------------------
  10300   x    x    x            SHDW: To trace the Shadow process (API) (expdp/impdp)
  20300   x    x    x            KUPV: To trace Fixed table
  40300   x    x    x            'div' To trace Process services
  80300   x                      KUPM: To trace Master Control Process (MCP) (DM)
 100300   x    x                 KUPF: To trace File Manager
 200300   x    x    x            KUPC: To trace Queue services
 400300        x                 KUPW: To trace Worker process(es) (DW)
 800300        x                 KUPD: To trace Data Package
1000300        x                 META: To trace Metadata Package
-------   +
1FF0300   x    x    x            'all' To trace all components (full tracing)
All entries except the last one are individual tracing levels in hexadecimal. You can use a single value or a combination of values; if you sum all the individual values you get 1FF0300, which is full tracing.
To use full tracing, run the Data Pump export as:
expdp DUMPFILE=expdp.dmp LOGFILE=expdp.log TRACE=1FF0300
To use full tracing for a Data Pump import, run the import as:
impdp DUMPFILE=expdp.dmp LOGFILE=expdp.log TRACE=1FF0300
However, in most cases full tracing is not required. Trace 400300 traces the Worker process(es) and trace 80300 traces the Master Control Process (MCP); combining them gives trace 480300, which traces both the Master Control Process (MCP) and the Worker process(es). That usually serves the purpose.
So to troubleshoot a Data Pump export problem, run:
expdp DUMPFILE=expdp.dmp LOGFILE=expdp.log TRACE=480300
To troubleshoot a Data Pump import problem, run:
impdp DUMPFILE=expdp.dmp LOGFILE=expdp.log TRACE=480300
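Combining component levels is just hexadecimal addition. If you prefer not to add the values by hand, the sum can be computed in SQL*Plus; a small sketch using the MCP and Worker levels from above:

-- 80300 (MCP) + 400300 (Worker) = 480300
SELECT TO_CHAR(TO_NUMBER('80300', 'XXXXXXX') + TO_NUMBER('400300', 'XXXXXXX'),
               'FMXXXXXXX') AS combined_trace
  FROM dual;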
Cleaning up stopped impdp/expdp jobs
A stopped impdp/expdp job leaves a row in DBA_DATAPUMP_JOBS, shown with a state of NOT RUNNING.
There are two cases when cleaning up a stopped job:
1) The job can be attached to
If the job can be attached to, attach to it and then kill it.
For example: expdp system/**** attach=SYS_EXPORT_TABLE_01
kill_job
2) The job cannot be attached to
If the job cannot be attached to, drop the master table owned by the user that ran the Data Pump job.
For example: conn system/*****
drop table SYS_EXPORT_TABLE_01 (the master table name is usually the same as the job name)
Both the user name and the job name can be obtained from DBA_DATAPUMP_JOBS.
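A quick way to list the leftover jobs and their owners; a minimal sketch against DBA_DATAPUMP_JOBS (a stopped job shows up as NOT RUNNING):

-- List orphaned Data Pump jobs together with their master-table owners
SELECT owner_name, job_name, operation, job_mode, state, attached_sessions
  FROM dba_datapump_jobs
 WHERE state = 'NOT RUNNING'
 ORDER BY owner_name, job_name;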
Excluding table partitions during export
There are two ways to exclude table partitions during export: using exclude=table_data, or using the Data Pump API.
(1) The exclude=table_data approach
The test table has 2 partitions:
SQL> select partition_name,table_name from user_tab_partitions;

PARTITION_NAME                 TABLE_NAME
------------------------------ ------------------------------
DATA_PART1                     TEST
DATA_PART2                     TEST

Exclude partition DATA_PART2 in the parfile:
$ more parfile.par
userid='/ as sysdba'
directory=DATA_PUMP_DIR
dumpfile=test.dmp
logfile=test.log
tables=jyu.test
exclude=table_data:"in ('DATA_PART2')"

$ expdp parfile=parfile.par

Export: Release 10.2.0.3.0 - 64bit Production on Friday, 30 May, 2014 15:44:28

Copyright (c) 2003, 2005, Oracle. All rights reserved.

Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - 64bit Production
With the Partitioning, OLAP and Data Mining options
Starting "SYS"."SYS_EXPORT_TABLE_01": parfile=parfile.par
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
. . exported "JYU"."TEST":"DATA_PART1" 5.218 KB 1 rows
Master table "SYS"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYS.SYS_EXPORT_TABLE_01 is:
/app/oracle/product/server_ee/10.2.0.2/rdbms/log/test.dmp
Job "SYS"."SYS_EXPORT_TABLE_01" successfully completed at 15:44:42
Only the data in partition DATA_PART1 was exported; the data in partition DATA_PART2 was not.
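If you want to double-check the source row counts against the export log, the partition extension clause can be used; a small sketch for this example's table and partition names:

-- Per-partition row counts in the source table, for comparison with the log
SELECT 'DATA_PART1' AS partition_name, COUNT(*) AS row_cnt FROM jyu.test PARTITION (DATA_PART1)
UNION ALL
SELECT 'DATA_PART2', COUNT(*) FROM jyu.test PARTITION (DATA_PART2);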
(2) The Data Pump API approach
Run the following script in SQL*Plus:
declare
  rvalue number;
begin
  -- Open a table-mode export job and keep its handle
  rvalue := dbms_datapump.open (operation => 'EXPORT',
                                job_mode  => 'TABLE');

  -- Dump file and log file in DATA_PUMP_DIR
  dbms_datapump.add_file (handle    => rvalue,
                          filename  => 'TEST.DMP',
                          directory => 'DATA_PUMP_DIR',
                          filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);

  dbms_datapump.add_file (handle    => rvalue,
                          filename  => 'TEST.LOG',
                          directory => 'DATA_PUMP_DIR',
                          filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  -- Restrict the job to schema JYU and table TEST
  dbms_datapump.metadata_filter (handle => rvalue,
                                 name   => 'SCHEMA_EXPR',
                                 value  => 'IN (''JYU'')');

  dbms_datapump.metadata_filter (handle => rvalue,
                                 name   => 'NAME_EXPR',
                                 value  => 'IN (''TEST'')');

  -- Export only partition DATA_PART1
  dbms_datapump.data_filter (handle      => rvalue,
                             name        => 'PARTITION_LIST',
                             value       => '''DATA_PART1''',
                             table_name  => 'TEST',
                             schema_name => 'JYU');

  dbms_datapump.start_job (handle => rvalue);
  dbms_datapump.detach (handle => rvalue);
end;
/

Check the export log; only partition DATA_PART1 was exported:
$ more TEST.LOG
Starting "SYS"."SYS_EXPORT_TABLE_01":
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
. . exported "JYU"."TEST":"DATA_PART1" 5.218 KB 1 rows
Master table "SYS"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYS.SYS_EXPORT_TABLE_01 is:
/app/oracle/product/server_ee/10.2.0.2/rdbms/log/TEST.DMP
Job "SYS"."SYS_EXPORT_TABLE_01" successfully completed at 16:26:58