
Creating a Case from Kafka data

4 Tasks

45 mins

Pega Platform '24.2
Advanced

Scenario

MDC plans to grow its business significantly by partnering with other subsidiaries. To accomplish this, MDC wants to enable asynchronous registrations with higher throughput, while ensuring that the systems remain loosely coupled.

The following table provides the credentials you need to complete the challenge:

Role     User name               Password
Admin    admin@deliveryservice   rules

You must initiate your own Pega instance to complete this challenge.

Initialization may take up to 5 minutes, so please be patient.

Detailed Tasks

1 Identify design options

There are two options to implement the given requirement:

  • Option 1: Use an external Kafka service to read messages and create a Case asynchronously.
  • Option 2: Use a message queue to read messages and create a Case asynchronously.
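To make Option 1 concrete, the following sketch shows the producing side in plain Java. The topic name, broker address, and payload are assumptions for illustration only; in this scenario, a subsidiary system would publish a registration message like this, and Pega would consume it asynchronously:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class PartnerRegistrationProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Hypothetical broker address; in this challenge it is the Stream node IP.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "203.0.113.10:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Fire-and-forget: the producing system does not wait for Pega,
            // which keeps the two systems loosely coupled.
            producer.send(new ProducerRecord<>(
                "partner-registrations",   // hypothetical topic name
                "BP-1001",                 // key, e.g. a BusinessPartnerId
                "{\"BusinessPartnerId\":\"BP-1001\",\"Name\":\"Acme Logistics\"}"));
        }
    }
}
```

Because the producer only needs the broker address and the topic name, neither system needs to know anything else about the other, which is exactly the loose coupling the scenario asks for.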

2 Evaluate design options

The following table outlines the pros and cons of the two design options:

External Kafka

  Pros:
  • Can handle millions of messages per second.
  • Retains messages, which allows for reactive programming.
  • High throughput and low latency.

  Cons:
  • Higher complexity: requires an understanding of partitions, brokers, and offset management.
  • Requires more effort to manage clusters, monitor brokers, and handle partition rebalancing.

Message queue

  Pros:
  • Useful for task queues, background job processing, and simple event distribution.
  • Simpler setup and usage, making straightforward messaging tasks more intuitive.
  • Easier to maintain, with built-in tools for monitoring and management.

  Cons:
  • Message queues are lightweight buffers that store messages only temporarily.
  • Often struggles in high-throughput scenarios.

3 Recommend the best design option

Because MDC requires a loosely coupled, asynchronous integration with higher throughput, external Kafka is the better option.

4 Implement solution details

  1. In the Pega Platform instance for the challenge, enter the following credentials:
    1. In the User name field, enter admin@deliveryservice.
    2. In the Password field, enter rules.
  2. In the navigation pane of Dev Studio, click Configure > Decisioning > Infrastructure > Stream, and then copy the IP address of the Kafka instance.
IP address of Kafka service
  3. In the navigation pane of Dev Studio, click Records > SysAdmin > Create > Kafka to establish a connection with the Kafka service.
  4. On the Connection tab, add host details with the IP address that you captured in step 2, and then save the Rule form.
    Kafka Connection rule.
  5. In the navigation pane of Dev Studio, click Records > Data Model > Data Set, and then create a Data Set record that reads messages from Kafka:
    1. In the Label field, enter Create Partner From Kafka.
    2. In the Type list, select Kafka.
    3. In the Context list, select Delivery Service.
    4. In the Apply to field, enter MDC-DS-Work-Partner.
    5. Click Create and open.
  6. Configure the connection details:
    1. In the Kafka configuration instance field, enter or select ConnectKafka.
    2. In the Define a new topic or select one from the cluster section, select Create new.
    3. In the Key field, enter or select BusinessPartnerId.
    4. Click Save.
      Create partner data set rule.
  7. In the navigation pane of Dev Studio, click Records > Data Model > Data Flow, and then create a Data Flow record that uses the Data Set to read messages from Kafka and outputs them to create a Case:
    1. In the Label field, enter Create Partner Case.
    2. In the Context list, select Delivery Service.
    3. In the Apply to field, enter MDC-DS-Work-Partner.
    4. Click Create and open.
      Create partner case data flow
  8. Click Save.
  9. Configure the source of the CreatePartnerCase Data Flow rule:
    1. In the Import data from section, in the Source list, select Data set.
    2. In the Data set field, enter or select CreatePartnerFromKafka.
    3. In the Read options section, select Only read new records.
    4. Click Submit.
      Source configurations of data flow rule.
  10. Click Save.
  11. Configure the destination of the CreatePartnerCase Data Flow rule:
    1. In the Output data to section, in the Destination list, select Case.
    2. In the Case list, select Partner.
    3. In the Add mapping section, in the Set field, enter or select BusinessPartner, and then in the equal to field, enter or select BusinessPartner.
    4. Click Submit, and then click Save.
      Destination configurations of data flow rule.
  12. Run the CreatePartnerCase Data Flow rule by clicking Actions > Run, and then click Run.
Run data flow rule
Note: In the Pega lab environment and the virtual machines provided in the introductory part of this course, only background nodes are available, so select a background node rather than a batch node. In a real-world setting, however, use a batch node.

The system begins processing the Data Flow Run:

Running the data flow rule.
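Pega handles the Kafka consumption internally, but as a rough mental model, the Data Set and Data Flow together behave like the plain-Java consumer loop sketched below. The topic name, broker address, and consumer group are assumptions for illustration, and the case-creation step is a stand-in for what the Data Flow destination does:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class CreatePartnerCaseFlow {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "203.0.113.10:9092"); // Stream node IP
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "create-partner-case");
        // "Only read new records" roughly corresponds to starting from the latest offset.
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "latest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("partner-registrations")); // hypothetical topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // Messages sharing a BusinessPartnerId key land on the same
                    // partition, so events for one partner arrive in order.
                    createPartnerCase(record.value());
                }
            }
        }
    }

    // Stand-in for the Data Flow destination, which maps the payload to a new Partner case.
    private static void createPartnerCase(String json) {
        System.out.println("Would create Partner case from: " + json);
    }
}
```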

Confirm your work

  1. Run the PublishMessageToKafka rule by clicking Actions > Run to push a test message into Kafka and create the Partner Case. (A plain-Java equivalent of this push appears in the sketch after this list.)
PushMessage to Kafka utility
  2. Navigate to the Data Flow run instance to view the statistics.
Successful execution of data flow rule.
  3. In the navigation pane of Dev Studio, click App > Partner to view or open the Partner Case instances.
Partner case instances
  4. Confirm that you can see the details of the Partner Case.
    Partner case created from Kafka
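If you wanted to push an equivalent test message from outside Pega instead of using the PublishMessageToKafka rule, a sketch along the following lines would work. The topic name and payload shape are assumptions based on the key and mapping configured earlier; the reported partition and offset should line up with the Data Flow run statistics:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;
import org.apache.kafka.common.serialization.StringSerializer;

public class PublishTestMessage {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "203.0.113.10:9092"); // Stream node IP
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record = new ProducerRecord<>(
                "partner-registrations",  // hypothetical topic
                "BP-1001",                // BusinessPartnerId key
                "{\"BusinessPartner\":{\"BusinessPartnerId\":\"BP-1001\",\"Name\":\"Acme Logistics\"}}");
            // Block on the acknowledgement so we can confirm where the message landed.
            RecordMetadata meta = producer.send(record).get();
            System.out.printf("Delivered to partition %d at offset %d%n",
                meta.partition(), meta.offset());
        }
    }
}
```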

