
Preparing for data migration

5 Tasks

25 mins

Beginner
Pega Customer Decision Hub 8.6
Pega Customer Decision Hub 8.7
English

Scenario

U+ Bank, a retail bank, plans to prepare its systems for data migration by generating the artifacts and configuring the settings that are required to migrate sampled data from the production environment to the business operations environment (BOE).

Use the following credentials to log in to the exercise system:

Role               User name          Password
System Architect   SystemArchitect    rules

Your assignment consists of the following tasks:

Task 1: Create and add a new ruleset to the application ruleset stack

Create a new ruleset to host all artifacts that are generated by the system, such as data sets and data flows, and then add that ruleset to the application stack.

Note: In standard use cases, you complete this task in the development environment that acts as the system of record and contains the product rule that defines the application package. For training purposes, you perform the tasks in a single environment.

Task 2: Configure data migration settings

Configure data migration settings to define the inbound (20%) and outbound (20%) sample size, and then generate the data migration artifacts.

Note: In standard use cases, you complete this task in the development environment that acts as the system of record and contains the product rule that defines the application package. For training purposes, you perform the tasks in a single environment.

Task 3: Enable inbound and outbound sampling

Enable inbound and outbound sampling, and then enable scenario planner actuals migration by updating the respective dynamic system settings.

Note: In standard use cases, you complete this task in the development environment that acts as the system of record and contains the product rule that defines the application package. For training purposes, you perform the tasks in a single environment.

Task 4: Create a product rule

Create a product rule to package the generated data migration artifacts.

Note: In standard use cases, you complete this task in the development environment that acts as the system of record and contains the product rule that defines the application package. For training purposes, you perform the tasks in a single environment.

Task 5: Configure service packages

Set up the MigrateSimulationData, DataSyncPipeline, api, and cicd service packages appropriately, because the security requirements vary between environments.

Note: In standard use cases, you complete this task in the BOE and production environments to ensure that you meet the security requirements. For training purposes, you perform the tasks in a single environment.

You must initiate your own Pega instance to complete this Challenge.

Initialization may take up to 5 minutes, so please be patient.

Challenge Walkthrough

Detailed Tasks

1 Create and add a new ruleset to the application ruleset stack

Note: In standard use cases, you complete this task in the development environment that acts as the system of record and contains the product rule that defines the application package. For training purposes, you perform the tasks in a single environment.
  1. On the exercise system landing page, click Pega CRM suite to log in to Dev Studio.
  2. Log in as System Architect with User name SystemArchitect and Password rules.
  3. In the header of Dev Studio, click Create > SysAdmin > RuleSet to create a new ruleset.
    Create ruleset
  4. On the Create RuleSet Version tab, in the Ruleset Name field, enter DataMigration.
    new ruleset
  5. In the upper right, click Create and open.
  6. Click Save.
  7. In the header of Dev Studio, click Application > Definition to open the application definition.
    App definition
  8. In the application definition, in the Application rulesets section, click Add ruleset to add the new ruleset.
    Add ruleset
  9. In the new row, enter DataMigration, and then select the ruleset from the list.
  10. Enter the ruleset version.
    ruleset version
  11. Click Save.
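
For context, the order of the application ruleset stack matters because Pega resolves rules by searching the stack from top to bottom, so adding DataMigration to the application definition makes the generated data sets and data flows visible to the application. The Python sketch below is a conceptual illustration of that lookup only; the stack contents and the MigrateInboundSample rule name are assumptions made for the example, not artifacts from your system.

# Conceptual sketch of top-down rule resolution across a ruleset stack.
# The stack entries below are illustrative, not an exact U+ Bank stack.
ruleset_stack = [
    "DataMigration:01-01-01",       # new ruleset added in this task
    "PegaCRM_Marketing:08-01-01",   # assumed application ruleset version
    "Pega-DecisionEngine:08-06-01", # assumed platform ruleset version
]

def resolve(rule_name, candidates_by_ruleset):
    # Return the first ruleset, walking down the stack, that defines the rule.
    for ruleset in ruleset_stack:
        if rule_name in candidates_by_ruleset.get(ruleset, set()):
            return ruleset
    return None

# Hypothetical example: a generated artifact lives in the new ruleset.
candidates = {"DataMigration:01-01-01": {"MigrateInboundSample"}}
print(resolve("MigrateInboundSample", candidates))  # -> DataMigration:01-01-01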

2 Configure data migration settings

Note: In standard use cases, you complete this task in the development environment that acts as the system of record and contains the product rule that defines the application package. For training purposes, you perform the tasks in a single environment.
  1. In the header of Dev Studio, click Configure > Decisioning > Infrastructure > Data migration to open the Data Migration tab.
    Data migration option
  2. On the Data Migration tab, confirm that the Data to transfer field is set to Data-Decision-Request-Customer.
  3. In the General section, move the Inbound sample size to transfer (Max 20%) slider to 20%.
  4. Move the Outbound sample size to transfer (Max 20%) slider to 20%.
    data migration sampling
  5. In the Storage section, in the Repository field, enter or select DMRepo.
    data migration repo
  6. In the Supporting artifacts section, in the Ruleset field, select DataMigration.
  7. In the Version list, confirm that the ruleset version is 01-01-01.
    data migration datasets
  8. Click Save to generate the supporting artifacts.
  9. In the Supporting artifacts section, click Show artifacts to view the generated supporting data migration artifacts.
    Data migration artifacts
  10. View the generated artifacts, and then click Close to close the window.
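
Conceptually, the inbound and outbound sample sizes that you configured mean that only about 20 percent of the Data-Decision-Request-Customer records are transferred. The generated data sets and data flows perform the actual sampling inside the platform; the short Python sketch below only illustrates the idea, and the record identifiers in it are hypothetical.

import random

def sample_records(records, fraction=0.20, seed=None):
    # Return a random subset of roughly `fraction` of the input records.
    # Illustrative only; the generated data flows do the real sampling.
    rng = random.Random(seed)
    return [record for record in records if rng.random() < fraction]

# Hypothetical usage: 10,000 customer IDs reduced to roughly 2,000 (20%).
customer_ids = ["CUST-{0:05d}".format(i) for i in range(10000)]
sampled = sample_records(customer_ids, fraction=0.20, seed=42)
print("Sampled {0} of {1} records".format(len(sampled), len(customer_ids)))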

3 Enable inbound and outbound sampling

Note: In standard use cases, you complete this task in the development environment that acts as the system of record and contains the product rule that defines the application package. For training purposes, you perform the tasks in a single environment.
  1. In the navigation pane of Dev Studio, click Records > SysAdmin > Dynamic System Settings to open the DynamicSystemSettings tab.
    Dynamic system settings
  2. In the Setting Purpose column, click the Filter icon to filter by purpose.
  3. In the search box, enter simulation/enableSampling, and then click Apply.
    Simulation DSS
  4. Open simulation/enableSampling, and then click Edit.
    Edit ruleset
  5. In the Associated RuleSet field, enter or select DataMigration.
    Select ruleset
  6. On the Settings tab, in the Value field, enter true.
    DSS settings
  7. Click Save.
  8. Repeat steps 2-7 to update the CDHMigrateOutboundSample and CDHMigrateScenarioPlannerActuals dynamic system settings.
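
As a quick self-check, the three settings that this task touches are summarized in the Python sketch below. The expected value of true for all three settings is an assumption that follows step 6; adjust the comparison if your environment uses different values.

# Dynamic system settings enabled in this task, all associated with the
# DataMigration ruleset; the expected value of true follows step 6 above.
expected_dss = {
    "simulation/enableSampling": "true",
    "CDHMigrateOutboundSample": "true",
    "CDHMigrateScenarioPlannerActuals": "true",
}

def missing_or_wrong(actual):
    # Return the setting purposes whose value does not match the expectation.
    return [purpose for purpose, value in expected_dss.items()
            if actual.get(purpose) != value]

# Hypothetical usage with values read back from the Dynamic System Settings tab.
print(missing_or_wrong({"simulation/enableSampling": "true"}))
# -> ['CDHMigrateOutboundSample', 'CDHMigrateScenarioPlannerActuals']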

4 Create a product rule

Note: In standard use cases, you complete this task in the development environment that acts as the system of record and contains the product rule that defines the application package. For training purposes, you perform the tasks in a single environment.
  1. In Dev Studio, click Create > SysAdmin > Product to create a new product rule.
    New product
  2. On the Create Product tab, enter the following information:
    1. Label: DataMigrationArtifacts
    2. Product Version: 01-01-01
    3. Context: PegaCRM_Marketing
    4. Add to ruleset: DataMigration
      product rule
  3. In the upper right, click Create and open to open the product rule.
  4. In the RuleSets to include section, in the Name field, enter or select DataMigration.
  5. Select the Include associated data check box to ensure that the Data Migration landing page where the settings were configured is also migrated.
    include a ruleset
  6. In the Individual instances to include section, in the Select a class and press 'Query' field, enter Rule-Application.
  7. Click Query to view the individual instances of the class.
    application query
    1. Filter the list by the PEGACRM_MARKETING 08.01 name, and then select PEGACRM_MARKETING!08.01.
    2. Click Submit to add the application instance to the product rule.
      select app
  8. Click Save to save the product rule.
  9. In the File details section, click Preview product rule to view the rules that are included.
    Preview product file
  10. Click Submit to close the Product preview window.
    rules in product rule
  11. In the header of the product rule form, click RS DataMigration:01-01-01 to open the DataMigration ruleset version.
    Open a ruleset
  12. Click Lock and Save to lock the ruleset version with a password.
    Lock and save ruleset
    1. In the Lock Ruleset version window, enter rules as the password twice.
    2. Click Submit.
  13. On the DataMigrationArtifacts 01-01-01 tab, click Create product file to generate the product file.
    Create product rule
     
  14. In the Create Product File window, enter DataMigrationArtifacts as the name of the product file.
  15. Click OK to begin generating the product file.
  16. When the file generation is complete, a message confirms that the archive was created. Click the Click here to save DataMigrationArtifacts.zip file locally link to download the product file.
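
As an optional local check, you can list the contents of the downloaded archive to confirm that the DataMigration ruleset and its generated artifacts are packaged. The Python sketch below uses only the standard library; the entry names inside the archive vary by system, so the DataMigration filter is just an assumption for illustration.

import zipfile

# Inspect the product file downloaded in step 16.
with zipfile.ZipFile("DataMigrationArtifacts.zip") as product_file:
    entries = product_file.namelist()
    print("{0} entries in the archive".format(len(entries)))
    for name in entries:
        # Print the entries that appear to belong to the DataMigration ruleset.
        if "DataMigration" in name:
            print(name)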

5 Configure service packages

Note: Typically, this task is performed in the BOE and production environments to ensure the security requirements are met. For training purposes, you perform the tasks in a single environment.
  1. In the navigation pane of Dev Studio, click Records > Integration-Resources > Service Package to open the ServicePackage tab.
    Service package
  2. In the list of service packages, open each of the following service packages, and then ensure that the Require TLS/SSL for REST services in this package check box is cleared:
    1. DataSyncPipeline
    2. MigrateSimulationData
    3. api
    4. cicd
      TLSSSL option
    Note: Typically, as the last step in preparing the environments for data migration, you import the generated DataMigrationArtifacts product file into all environments, by using the enterprise pipeline when possible. This step keeps each environment in sync and ensures that both the BOE and production environments have the artifacts that are required to migrate the sample data from the production system to the BOE system. However, for training purposes, you complete all the tasks in a single system, so you do not need to import the product file as part of this challenge.
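
For background, the Require TLS/SSL for REST services in this package option controls whether REST requests to the services in that package must arrive over HTTPS. The Python sketch below is a hypothetical client-side illustration of the difference; the host name and URL path are placeholders, not the actual endpoints of the MigrateSimulationData, DataSyncPipeline, api, or cicd packages.

import requests

BOE_HOST = "boe.example.com"  # placeholder host name for illustration

def call_rest_service(use_tls):
    # Build a plain-HTTP or HTTPS URL to an assumed REST endpoint and call it
    # with the exercise credentials. With the TLS/SSL requirement selected on
    # the service package, only the HTTPS variant would be accepted; with the
    # check box cleared (as in this training environment), both can succeed.
    scheme = "https" if use_tls else "http"
    url = "{0}://{1}/prweb/api/v1/example".format(scheme, BOE_HOST)
    response = requests.get(url, auth=("SystemArchitect", "rules"), timeout=30)
    return response.status_code

# Example call (uncomment on a system where the placeholder URL is replaced):
# print(call_rest_service(use_tls=True))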
