Every business these days uses enterprise systems and numerous applications to operate effectively. However, with a tremendous amount of operational data across the organization and increasing demand from various business units, there is pressure on the IT department to deliver better services at a reasonable cost. Traditionally, IT teams integrated data by moving datasets via SOAP and REST APIs, and web service calls via Apex were used for business process integration. As enterprise systems became more complex, IT teams added diverse development lifecycles and independent services to stream data and events between applications. This architecture led to multifaceted business process integration and reached a point where it became critical to decouple the services. Hence, a shift to an event-driven architecture with asynchronous interaction is needed to tame the complexity.
For any business to succeed, the key elements to look for in a solution are data scalability, real-time operation, durability, and upgradability. Traditional data platforms built on a database, data warehouse, ETL solution, or messaging solution do not meet these standards. This is where an open-source, enterprise-grade solution, Kafka (notably Kafka Connect), comes into play. It integrates a range of business applications such as SAP and Salesforce, as well as traditional databases.
Common e-commerce business scenario
Let us understand with an example from the pre-streaming era of retail brands: showrooms, exclusive interiors, and gracious sales executives personally guiding you through a purchase. The in-store experience is all a consumer could ask for: a personal touch. However, as retail brands go through a digital shift, they are concerned with offering a personalized shopping experience and numerous touchpoints for customers before a purchase. But how can this be achieved?
The simple solution is data. Whether it is the company’s marketing and advertising approach, promotional offers for customers, or customer interactions from product search to criteria selection and feature comparison, everything is data-driven. In developers’ language, all these company and customer actions are called events. A customer browses through various channels looking for a preferred product and decides based on specific criteria. Events such as product preference, selection criteria, and budget happen in real time and are stored in a database for future use. This data is required to process requests and provide a personalized shopping experience in real time.
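Concretely, such an event is usually a small, timestamped record. As a minimal sketch (the field names here are hypothetical, not a fixed schema), a product-browsing event serialized as JSON might look like this:

```python
import json
from datetime import datetime, timezone

def make_browse_event(customer_id, product_id, channel, budget):
    """Build a hypothetical product-browsing event as a plain dict."""
    return {
        "event_type": "product_browse",  # what happened
        "customer_id": customer_id,      # who did it
        "product_id": product_id,        # what they looked at
        "channel": channel,              # e.g. web, mobile app, in-store kiosk
        "budget": budget,                # a stated purchase criterion
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

event = make_browse_event("cust-42", "sku-1001", "web", 499.0)
payload = json.dumps(event)  # this JSON string is what a producer would publish
```

Each such record can then be published to a stream in real time and consumed later by recommendation or analytics systems.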
The challenges associated with shifting to online business:
- Legacy business processes rely on batch processing and maze-like architectures to stream events, which hinders company processes and real-time data streaming.
- Any e-commerce company needs to track consumer data and push it to analytical systems for personalized recommendations and insights.
- A large volume of data is generated (by both producers and consumers), which must be processed by multiple applications across various business units.
- Manual data-transfer requests need to be replaced with a self-service process.
- The sales unit needs to update data in the database regularly.
- ETL batch processes need to be replaced.
The company’s IT team tried to address these challenges by writing custom code for several applications and by building applications on the Salesforce platform used by the sales and operations teams. But with the growth of the online business and customer data that was so tightly coupled, they ended up with a very complex system that was difficult to operate. We suggested addressing the challenges with Kafka, whose “publish & subscribe” model assists in
- modeling data streams in real-time
- transport and publish data to all applications
- building rich real-time streaming applications that react to data events
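The decoupling behind this model can be illustrated with a toy in-memory stand-in (this is an illustration of the publish & subscribe idea, not the Kafka client API): producers publish events to a named topic without knowing who consumes them, and any number of subscribers react to the same stream independently.

```python
from collections import defaultdict

class ToyBroker:
    """Minimal in-memory publish/subscribe broker; a stand-in for Kafka's model."""
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, event):
        # The producer does not know or care who is listening.
        for callback in self._subscribers[topic]:
            callback(event)

broker = ToyBroker()
seen_by_analytics, seen_by_recommender = [], []

# Two independent downstream applications react to the same event stream.
broker.subscribe("orders", seen_by_analytics.append)
broker.subscribe("orders", seen_by_recommender.append)

broker.publish("orders", {"order_id": "o-1", "status": "shipped"})
```

Because the publisher only knows the topic name, new consumers can be added without touching the producer, which is exactly the decoupling the custom point-to-point integrations lacked.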
How Kafka Connect on Heroku works
Kafka is a distributed platform that can run on multiple servers, creating a scalable system that can handle and store vast amounts of data (messages) in real time. For clients, this is one of the best integrations to adopt, but setting up and operating Kafka can be difficult: it requires a great deal of expertise and sometimes months of fine-tuning to establish a reliable cluster. That is why we suggest Kafka Connect for Heroku, which provides a private Kafka cluster and frees the IT team to focus on services. Instead of maintaining numerous applications for continuous data updates, Kafka Connect provides a unified solution for on-demand data, recurring data transfers, and data updates.
Kafka Connect also enables the IT team to create new self-service processes for faster response times, and a Kafka Connect integration is quicker and easier to set up. For example, suppose there are two SQL databases (one for order fulfillment, the other for CRM), and the CRM database needs to be updated whenever an order’s status changes. In a typical scenario, a developer writes the changes into the “order status” table of the CRM database to reflect the updates. With the Kafka Connect REST API, you instead create a source and a sink connector to read from the first database and write to the CRM database. Once these connectors are in place, new order updates are written to the CRM table automatically.
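Sketching the scenario above: assuming the Confluent JDBC source and sink connectors are installed on the Connect workers, the two connectors could be registered by POSTing JSON definitions to the Connect REST API. The endpoint URL, database hostnames, table, and column names below are placeholders for illustration.

```python
import json
import urllib.request

# Hypothetical endpoint; on Heroku the URL and credentials would come from
# the add-on's config vars rather than being hard-coded.
CONNECT_URL = "http://localhost:8083/connectors"

# Source connector: stream changed rows from the fulfillment database to a topic.
source_connector = {
    "name": "orders-source",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:postgresql://fulfillment-db:5432/orders",
        "mode": "timestamp",                 # pick up rows by last-modified time
        "timestamp.column.name": "updated_at",
        "table.whitelist": "order_status",
        "topic.prefix": "fulfillment-",      # rows land on "fulfillment-order_status"
    },
}

# Sink connector: write that topic into the CRM database.
sink_connector = {
    "name": "crm-sink",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
        "connection.url": "jdbc:postgresql://crm-db:5432/crm",
        "topics": "fulfillment-order_status",
        "insert.mode": "upsert",             # update existing CRM rows by key
        "pk.mode": "record_value",
        "pk.fields": "order_id",
    },
}

def register(connector):
    """POST a connector definition to the Kafka Connect REST API (not called here)."""
    request = urllib.request.Request(
        CONNECT_URL,
        data=json.dumps(connector).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    return urllib.request.urlopen(request)
```

With both connectors registered, no application code sits between the two databases: the source connector publishes each order-status change, and the sink connector upserts it into the CRM table.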
The Concluding View
To sum up, Kafka with Salesforce integration has a bright future. Its architecture provides many benefits, from external data assimilation and pipeline releases to push notifications and more. At Prowess, our team delivers real-time data insights across business functions, and Kafka has been a vital tool in attaining this goal. It is a streaming platform and a powerful asynchronous messaging technology that delivers excellent performance. If you are interested in becoming an event-streaming, data-driven organization, connect with us!