
Exporting adaptive model data for external analysis

3 Tasks

15 mins

Beginner | Pega Customer Decision Hub 8.7 | English
Verify the version tags to ensure you are consuming the intended content, or complete the latest version.

Scenario

U+ Bank is implementing cross-selling of their credit cards on the web by using Pega Customer Decision Hub™. Self-learning, adaptive models drive the predictions that support the Next Best Actions for each customer.

Export the data from the Adaptive Decision Management (ADM) datamart to further analyze the performance of your online models over time, across channels, issues, and groups in your external data analysis tool of choice.

To limit the scope and size of the data set, create data flows that select the data for the models that interest you.

Use the following credentials to log in to the exercise system:

Role: System Architect
User name: SystemArchitect
Password: rules

Your assignment consists of the following tasks:

Task 1: Export the ADM data sets

As a system architect, export the pyModelSnapshots and pyADMPredictorSnapshots data sets as JSON files that contain all available data on the adaptive models in the system. Next, verify the JSON export. You can use the exported JSON files for further analysis.

Task 2: Create data flows to export selected models

Limit the size of the data set by creating a data flow that composes the model snapshots and the predictor snapshots of models based on the Web Click Through Rate model configuration and stores the data in a new data set.

Task 3: Export the subset of the adaptive model data

Run the data flow to populate the data set that contains model and predictor snapshots of the selected models. Verify the JSON format.

You must initiate your own Pega instance to complete this Challenge.

Initialization may take up to 5 minutes, so please be patient.

Challenge Walkthrough

Detailed Tasks

1 Export the ADM data sets

  1. On the exercise system landing page, click Pega CRM suite to log in to Dev Studio.
  2. Log in as a System Architect with the user name SystemArchitect and the password rules.
  3. In the search field, enter pyModelSnapshots, and then press Enter to search for the pyModelSnapshots data set.
  4. Click pyModelSnapshots to open the data set.
    This image shows the search result
  5. In the upper-right corner, click Actions > Export to start the export process.
  6. In the Export data set dialog box, click Export.
  7. After the export process finishes, click Download file to download the file to your local system.
    This image shows the Export data set dialog box
  8. Click Done to close the dialog box.
  9. Repeat steps 3-8 for the pyADMPredictorSnapshots data set.
  10. Open the data.json files from the downloaded ZIP files to confirm that you have exported the model data and the predictor data.
    This image shows the model data in the JSON file

    This image shows the predictor data in the JSON file
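
If you prefer to verify the export programmatically rather than by eye, the following Python sketch loads one of the downloaded data.json files and prints the number of snapshot records and their field names. This is a minimal sketch, not part of the challenge: the file path is hypothetical, and it assumes the export is either a single JSON array or newline-delimited JSON objects, so adjust it to match the file you extracted from the ZIP.

    import json
    from pathlib import Path

    # Hypothetical path to the data.json extracted from the downloaded ZIP file.
    path = Path.home() / "Downloads" / "pyModelSnapshots" / "data.json"
    text = path.read_text(encoding="utf-8")

    try:
        # Case 1: the export is a single JSON array of snapshot records.
        records = json.loads(text)
        if isinstance(records, dict):
            records = [records]
    except json.JSONDecodeError:
        # Case 2: the export is newline-delimited JSON, one record per line.
        records = [json.loads(line) for line in text.splitlines() if line.strip()]

    print(len(records), "records")
    print(sorted(records[0].keys()))  # field names of the first snapshot record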

2 Create data flows to export selected models

  1. In the menu bar of Dev Studio, click Create > Data model > Data flow to create a new data flow.
  2. In the Data Flow Record Configuration section, in the Label field, enter SelectedModelSnapshots.
  3. In the Context section, in the Apply to field, enter Data-Decision-ADM-ModelSnapshot to define the context of the data flow.
  4. In the Context section, in the Add to ruleset field, enter or select CDH-Rules.
    This image shows the data flow configuration
    Tip: As a best practice, use the CDH-Rules ruleset, not CDH-Artifacts, to create such rules.
  5. In the upper-right corner, click Create and open.
  6. Right-click the source component, and then click Properties to configure the source data set.
    This image shows the source data set used in the data flow
    1. In the Source configurations dialog box, in the Source field, select Data set.
    2. In the Data set field, enter or select pyModelSnapshots.
    3. Click Submit to close the dialog box.
  7. Click the Add icon, and then click Filter to add a filter component to the data flow.
    This image shows how to add a component to a data flow
  8. Right-click the filter component, and then click Properties to configure the component.
    1. In the Filter configurations dialog box, in the Name field, enter Selected models only.
    2. In the Filter conditions section, click Add condition.
    3. Enter the condition to read: .pyConfigurationName = "Web_Click_Through_Rate".
      This image shows the Filter configurations dialog box
    4. Click Submit to close the dialog box.
  9. In the upper-right corner, click Save.
  10. In the menu bar of Dev Studio, click Create > Data model > Data flow to create a new data flow.
  11. In the Data Flow Record Configuration section, in the Label field, enter SelectedPredictorSnapshots.
  12. In the Context section, in the Apply to field, enter Data-Decision-ADM-PredictorBinningSnapshot to define the context of the data flow.
  13. In the Context section, in the Add to ruleset field, ensure CDH-Rules is selected.
  14. In the upper-right corner, click Create and open.
  15. Right-click the source component, and then click Properties to configure the source data set.
    1. In the Source configurations dialog box, in the Source field, select Data set.
    2. In the Data set field, enter or select pyADMPredictorSnapshots.
    3. Click Submit to close the dialog box.
  16. Click the Add icon, and then click Compose to add a compose component to the data flow.
  17. Right-click the second source component, and then click Properties to configure the component.
    This image shows the addition of a Compose component
    1. In the Source configurations dialog box, in the Source field, select Data flow.
    2. In the Input class field, enter or select Data-Decision-ADM-ModelSnapshot.
    3. In the Data flow field, enter or select SelectedModelSnapshots.
    4. Click Submit to close the dialog box.
  18. Right-click the Compose component, and then click Properties to configure the component.
    1. In the Compose configurations dialog box, in the Name field, enter Add model data.
    2. In the Property field, enter .ModelData, and then click the gear icon on the right of the Property field to create a new property.
      This image shows the location of the Gear icon
    3. In the upper-right corner, click Create and open.
    4. In the Property type section, click Change.
    5. In the Page column, click Single page to change the property type.
    6. In the Page definition field, enter or select Data-Decision-ADM-ModelSnapshot.
      This image shows how to set the property type
    7. In the upper-right corner, click Save, and then close the ModelData tab.
    8. In the Compose configurations dialog box, in the Compose when conditions below are met section, enter or select the condition to read When pyModelID = .pyModelID.
      This image shows the Compose configurations dialog box
    9. Click Submit to close the dialog box.
  19. Click the Add icon on the Compose component, and then select Filter.
    This image shows how to add a new component to the data flow
  20. Right-click the Filter component, and then click Properties to configure the component.
    1. In the Name field, enter Selected models only.
    2. In the Filter conditions section, click Add condition.
    3. Enter the condition to read When .ModelData.pyModelID != "" to filter out predictor snapshots that do not match a selected model and are therefore irrelevant (see the sketch after this procedure for the equivalent join-and-filter logic).
      This image shows the filter condition
    4. Click Submit to close the dialog box.
  21. Right-click the destination component, and then click Properties to configure the destination data set.
    This image shows the destination component
    1. In the Destination configurations dialog box, in the Destination field, select Data set.
    2. In the Data set field, enter SelectedSnapshots, and then click the gear icon on the right of the Data set field to create a new data set.
      This image shows to create a new data set
    3. In the Data Set Record Configuration dialog box, in the Type field, select Decision Data Store.
    4. In the upper-right corner, click Create and open.
    5. In the upper-right corner, click Save, and then close the SelectedSnapshots tab.
    6. In the Destination configurations dialog box, click Submit to close the dialog box.
    7. In the upper-right corner, click Save.
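
Together, the Compose component and the final filter behave like an inner join of the predictor snapshots with the selected model snapshots on pyModelID. If you want to sanity-check that logic outside Pega, the pandas sketch below reproduces it on the two JSON files exported in Task 1. It is only an illustration under assumptions: the file paths are hypothetical, pd.read_json may need lines=True if the export is newline-delimited, and the column names are taken from the conditions configured above.

    import pandas as pd

    # Hypothetical paths to the data.json files exported in Task 1.
    # Add lines=True if the export is newline-delimited JSON.
    models_df = pd.read_json("pyModelSnapshots/data.json")
    predictors_df = pd.read_json("pyADMPredictorSnapshots/data.json")

    # Filter: keep only models based on the Web Click Through Rate configuration.
    selected_models = models_df[models_df["pyConfigurationName"] == "Web_Click_Through_Rate"]

    # Compose + filter: keep only predictor snapshots whose pyModelID matches a
    # selected model, which has the same effect as the Compose component followed
    # by the "Selected models only" filter in the data flow.
    selected_snapshots = predictors_df.merge(
        selected_models, on="pyModelID", how="inner", suffixes=("", "_model")
    )

    print(len(selected_snapshots), "predictor snapshot rows for the selected models")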

3 Export the subset of the adaptive model data

  1. In the upper-right corner, click Actions > Run to run the SelectedPredictorSnapshots data flow.
  2. In the upper-right corner, click Submit.
  3. In the Data Flow Run DF-Test-1001 tab, in the upper-right corner, click Start.
  4. After the data flow run finishes, return to the SelectedPredictorSnapshots data flow.
  5. Right-click the destination component, and then select Open data set.
    This image shows how to open the destination data set
  6. In the upper-right corner, click Actions > Export to start the export process.
  7. In the Export data set dialog box, click Export.
  8. After the export process finishes, click Download file to download the file to your local system.
  9. Click Done to close the dialog box.
    This image shows the Export data set dialog box
  10. Open the ZIP file, and then confirm that the data.json file contains model and predictor data for the adaptive models based on the Web Click Through Rate model configuration.
    This image shows the model and predictor data in the JSON file
    Note: You can use this JSON file for analysis in external tools. Alternatively, you can store the results of the data flow in a File data set that is configured to use the repository of your choice.
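
As a minimal sketch of such an external analysis, the Python example below loads the SelectedSnapshots export and averages model performance over time, split by channel, issue, and group. The file path is hypothetical, and the column names used here (pySnapshotTime, pyPerformance, pyChannel, pyIssue, pyGroup) are assumptions; check them against the fields you actually see in your data.json before running.

    import pandas as pd

    # Hypothetical path; add lines=True if the export is newline-delimited JSON.
    df = pd.read_json("SelectedSnapshots/data.json")

    # Assumed column names; verify them against your own data.json.
    df["pySnapshotTime"] = pd.to_datetime(df["pySnapshotTime"])

    # Average model performance per day, per channel, issue, and group.
    performance = (
        df.groupby([pd.Grouper(key="pySnapshotTime", freq="D"),
                    "pyChannel", "pyIssue", "pyGroup"])["pyPerformance"]
          .mean()
          .reset_index()
    )
    print(performance.head())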

