Data Management

Objectives
By the end of this module, you should be able to:
1.Explain typical data management operations.
2.Describe typical use cases for inserting system fields.
3.List the ways to obtain record IDs.
4.Perform mass transfers of records.
5.Describe external IDs.
6.Explain the basics of object relationships.

Data Management Operations
1.Data management:
- Is an ongoing process.
- Is not a one-time task.
2.Data management operations include:
- Exporting data
- Deleting data
- Inserting data
- Updating data
- Upserting data

Exporting Data
1.Data can be exported from Salesforce into a set of CSV files.
2.Data is exported to:
- Create backups.
- Get the IDs of records for reference (a minimal export sketch follows below).
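
Exports are usually done with the Data Loader or the weekly export service; purely as an illustration, the sketch below pulls records into a CSV file through the API. It assumes the third-party simple_salesforce Python library and placeholder credentials.

```python
import csv
from simple_salesforce import Salesforce  # third-party library; assumed available

# Placeholder credentials -- replace with real values for your org.
sf = Salesforce(username="user@example.com", password="secret",
                security_token="TOKEN")

# Query the records to export; the result includes each record's ID.
result = sf.query_all("SELECT Id, Name FROM Account")

with open("accounts_export.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Id", "Name"])
    for rec in result["records"]:
        writer.writerow([rec["Id"], rec["Name"]])
```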

Deleting Data
1.Data can be deleted from Salesforce using the delete operation.
2.This operation is used to:
- Free up the space used by out-of-date or bad data.
- Fix mistakes.
3.On deletion, data is moved to the Recycle Bin.

Inserting Data
1.The insert operation is used to load new data into Salesforce.
2.This operation can be used for:
- Initial Salesforce setup.
- Migrating data from legacy systems.
- Loading data into a sandbox. (A minimal insert sketch follows below.)
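
In practice inserts are usually performed with the import wizards or the Data Loader; as an illustration only, this sketch inserts one record through the API, again assuming the third-party simple_salesforce library and placeholder credentials and data.

```python
from simple_salesforce import Salesforce  # third-party library; assumed available

sf = Salesforce(username="user@example.com", password="secret",
                security_token="TOKEN")  # placeholder credentials

# Insert a single Contact; Salesforce generates and returns the new record's ID.
result = sf.Contact.create({
    "FirstName": "Ada",
    "LastName": "Lovelace",
    "Email": "ada@example.com",  # placeholder data
})
print(result["id"], result["success"])
```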

Inserting System Fields
1.Salesforce provides the Inserting System Fields feature to set system (audit) fields, such as CreatedDate, when records are loaded.
2.Inserting System Fields:
- Works only at the time a record is inserted (created).
- Is accessible only through API-based data management tools.
- Works for all custom objects.
- Is restricted to the Account, Opportunity, Contact, Lead, Case, Task, and Event standard objects. (A hedged sketch follows below.)
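
As an illustration only, and assuming the org has been granted the permission to set audit fields on record creation (plus the third-party simple_salesforce library and placeholder credentials and values), an insert that carries a system field might look like this.

```python
from simple_salesforce import Salesforce  # third-party library; assumed available

sf = Salesforce(username="user@example.com", password="secret",
                security_token="TOKEN")  # placeholder credentials

# Insert a migrated record while preserving its original creation timestamp.
# This succeeds only if the org allows setting audit fields on create;
# on an update, CreatedDate would be rejected as a read-only field.
result = sf.Contact.create({
    "LastName": "LegacyContact",            # placeholder data
    "CreatedDate": "2012-05-01T09:30:00Z",  # timestamp carried over from the legacy system
})
print(result)
```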

Updating Data
1.The update operation is used to:
- Add data to existing records.
- Transfer ownership of records to a different user.
2.Salesforce record IDs are required to perform this operation. (A minimal sketch follows below.)
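
As a hedged sketch (same simple_salesforce assumption as above), an update addresses an existing record by its Salesforce record ID; the IDs and field values below are placeholders.

```python
from simple_salesforce import Salesforce  # third-party library; assumed available

sf = Salesforce(username="user@example.com", password="secret",
                security_token="TOKEN")  # placeholder credentials

# Update an existing Contact identified by its 18-character record ID.
contact_id = "0035g00000XXXXXAAA"  # placeholder ID obtained from an export or report
sf.Contact.update(contact_id, {
    "Phone": "555-0100",
    "OwnerId": "0055g00000YYYYYAAA",  # placeholder user ID to transfer ownership
})
```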

Salesforce Record IDs
1.A Salesforce record ID is:
- An ID value generated by Salesforce when a new record is created.
- A unique identifier of the record.
- Analogous to a primary or foreign key field in a database table.
2.Salesforce record IDs exist in two forms:
- The 15-character case-sensitive form.
For example: 005E0000000KF38
- The 18-character case-insensitive form (see the conversion sketch below).
For example: 005E0000000KF38IAG
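
Both forms refer to the same record: the last three characters of the 18-character form are a checksum that encodes which of the first 15 characters are uppercase, so the longer form stays unambiguous in case-insensitive tools such as spreadsheets. A self-contained Python sketch of the standard 15-to-18 conversion:

```python
def to_18_char_id(id_15: str) -> str:
    """Convert a 15-character case-sensitive Salesforce ID to the
    18-character case-insensitive form by appending a 3-character checksum."""
    if len(id_15) != 15:
        raise ValueError("expected a 15-character Salesforce ID")
    alphabet = "ABCDEFGHIJKLMNOPQRSTUVWXYZ012345"
    suffix = ""
    # Process the ID in three chunks of five characters.
    for chunk_start in range(0, 15, 5):
        chunk = id_15[chunk_start:chunk_start + 5]
        # Build a 5-bit number: bit i is set if character i is uppercase.
        bits = sum(1 << i for i, ch in enumerate(chunk) if ch.isupper())
        suffix += alphabet[bits]
    return id_15 + suffix

print(to_18_char_id("005E0000000KF38"))  # -> 005E0000000KF38IAG
```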

Obtaining Record IDs
You can obtain record IDs in four ways:
1.From the URL (15-character ID).
2.From a report:
- In a report, record IDs can be displayed in a separate column.
- Reports return the 15-character ID.
3.Through the SOAP-based Web Services API:
- Record IDs are returned when records are extracted using the Web Services API.
4.Through formulas:
- You can access a record's ID by creating a formula field (15-character ID).

Mass Transferring Records
1.The Mass Transfer tool is used to transfer multiple accounts, leads, or custom object records from one user to another.
2.To transfer records, the following permissions are required:
- "Transfer Records" or "Transfer Leads".
- "Edit" on the specified object.
- "Read" on the records being transferred.

Upserting Data
1.The upsert operation migrates new and existing records from a legacy system to Salesforce in a single load.
2.Upsert avoids creating duplicate records for the same ID.
3.This operation uses a Salesforce ID or an external ID to create a new record or update an existing record as follows:
- If the ID is not matched, a new record is created.
- If the ID is matched once, the existing record is updated.
- If the ID is matched multiple times, an error is reported. (A minimal sketch follows below.)
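
As a hedged illustration (same simple_salesforce assumption as above), an upsert addresses the record through an external ID field, described in the next section; the field name Legacy_Id__c and all values are placeholders.

```python
from simple_salesforce import Salesforce  # third-party library; assumed available

sf = Salesforce(username="user@example.com", password="secret",
                security_token="TOKEN")  # placeholder credentials

# Upsert a Contact keyed by a hypothetical external ID field Legacy_Id__c.
# If no Contact has Legacy_Id__c = "C-1001", a new record is created;
# if exactly one matches, it is updated; multiple matches raise an error.
sf.Contact.upsert("Legacy_Id__c/C-1001", {
    "LastName": "Hopper",
    "Email": "grace@example.com",  # placeholder data
})
```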

External IDs
1.External IDs are useful when migrating data or integrating data between a legacy system and Salesforce.
2.An external ID:
- Is a flag that can be set on a custom field.
- Can be created for any custom field of type text, number, or email.
3.Each object can have up to three external ID fields.

Upsert with Object Relationships
1.Relationships exist between objects.
2.Object relationships affect the order in which data can be managed.
3.Relationships are expressed through:
- Related lists and lookup fields in the application.
- Foreign keys or IDs in the database.
4.The upsert operation allows records to be loaded together with their relationships by using external IDs. (A minimal sketch follows below.)
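
A hedged sketch of point 4, with the same simple_salesforce assumption as above: the child record refers to its parent through the parent's external ID rather than a Salesforce ID, so the parent IDs do not have to be exported first. Legacy_Contact_Id__c and Legacy_Account_Id__c are hypothetical external ID fields.

```python
from simple_salesforce import Salesforce  # third-party library; assumed available

sf = Salesforce(username="user@example.com", password="secret",
                security_token="TOKEN")  # placeholder credentials

# Upsert a Contact and link it to its parent Account in one call.
# The Account is referenced by its external ID, so its Salesforce ID
# does not need to be known in advance.
sf.Contact.upsert("Legacy_Contact_Id__c/C-2002", {
    "LastName": "Torres",
    "Account": {"Legacy_Account_Id__c": "A-0451"},  # parent lookup via external ID
})
```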

Steps to Load Data
1.Object relationships introduce data dependencies.
2.Dependencies dictate the order of the data load; for example, parent records such as Accounts are loaded before child records such as Contacts and Opportunities that refer to them.

Summary
1.Data management is an ongoing process to keep the data in your application up to date, accurate, and clean.
2.To manage data effectively and efficiently, you perform data management operations such as exporting, inserting, updating, upserting, and deleting data.
3.When inserting data, you have the option of using the Inserting System Fields feature.
4.Salesforce generates an ID value when you create a new record.
5.Record IDs express the object relationships that define the order in which data can be managed.

Data Management Tools

Objectives
By the end of this module, you should be able to:
1.List available tools to perform data management operations.
2.Use the Data Loader to perform data management operations.
3.Define the Bulk API and its use cases.

Tools for Data Migration
Data can be migrated into Salesforce using:
1.Import wizards, or
2.The Web Services API.

Import Wizards
1.Are easy to use.
2.Can be used to load up to 50,000 records.
3.Load accounts, contacts, leads, solutions, or custom objects.

API-Based Tools
1.Load any object supported by the API.
2.Load more than 50,000 records.
3.Schedule regular data loads, such as nightly feeds.
4.Export data for backup.
5.Delete multiple supported objects at the same time.
6.Tools that work through the Web Services API include:
- Data Loader
- Partner tools
- Custom-built tools
- Open-source tools

Data Loader
The Data Loader:
1.Is a Salesforce product.
2.Supports data import from and export to CSV files.
3.Supports data load from and export to a database through JDBC.
4.Supports custom relationships for upsert.
5.Can be run from the command line.
6.Can be run in batch mode.

Obtaining the Data Loader
The Data Loader:
1.Can be downloaded by system administrators.
2.Is available in Unlimited Edition (UE), Enterprise Edition (EE), and Developer Edition (DE) orgs.

Other Available API Data Management Tools
1.Additional data management tools can be obtained from http://developer.force.com.

Bulk API
The Bulk API:
1.Is used to load high-volume data.
2.Is optimized to perform insert, update, upsert, or delete operations on large numbers of records.
3.Improves throughput when loading large data sets, thanks to parallel processing.
4.Can be monitored by navigating to the Monitoring section in the Setup menu. (A minimal sketch follows below.)
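
Bulk loads are normally driven from the Data Loader, but as a hedged illustration the sketch below creates and monitors a bulk ingest job directly over HTTP. It assumes the newer Bulk API 2.0 REST endpoints (a later evolution of the Bulk API described here), the Python requests library, and a placeholder instance URL and access token.

```python
import time
import requests

INSTANCE = "https://yourInstance.my.salesforce.com"   # placeholder org URL
HEADERS = {"Authorization": "Bearer ACCESS_TOKEN",     # placeholder OAuth token
           "Content-Type": "application/json"}

# 1. Create an ingest job that will insert Contact records from CSV data.
job = requests.post(f"{INSTANCE}/services/data/v57.0/jobs/ingest",
                    headers=HEADERS,
                    json={"object": "Contact", "operation": "insert"}).json()

# 2. Upload the CSV data set to the job.
csv_data = "LastName,Email\nCurie,marie@example.com\n"  # placeholder data
requests.put(f"{INSTANCE}/{job['contentUrl']}",
             headers={"Authorization": HEADERS["Authorization"],
                      "Content-Type": "text/csv"},
             data=csv_data)

# 3. Close the job so Salesforce starts processing the batches in parallel.
requests.patch(f"{INSTANCE}/services/data/v57.0/jobs/ingest/{job['id']}",
               headers=HEADERS, json={"state": "UploadComplete"})

# 4. Poll the job until it finishes (also visible from the Setup monitoring pages).
while True:
    status = requests.get(f"{INSTANCE}/services/data/v57.0/jobs/ingest/{job['id']}",
                          headers=HEADERS).json()
    if status["state"] in ("JobComplete", "Failed", "Aborted"):
        print(status["state"], status.get("numberRecordsProcessed"))
        break
    time.sleep(5)
```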

How the Bulk API Works
1.Data is transferred at full network speed, reducing dropped connections.
2.The whole data set is managed as a job that can be monitored and controlled from the Setup menu.
3.The data set can be processed faster by allocating multiple servers to process it in parallel.

Using Data Loader with the Bulk API
1.The Data Loader uses the SOAP-based Web services by default.
2.To use the Bulk API, enable the Bulk API option in the Data Loader settings.
3.Salesforce provides an additional serial mode for the Bulk API.
4.In serial mode, batches are processed one at a time.
5.Hard deletes (bypassing the Recycle Bin) can be performed using the Hard Delete operation.

Summary
1.Data can be managed either with the import wizards or through the API.
2.The import wizards do not require any programming or developer skills.
3.API-based tools can be used to schedule regular data loads such as nightly feeds, export data for backup, and delete data for multiple supported objects.
