Salesforce and Databricks send data to each other’s lakehouse

Salesforce and Databricks are expanding their partnership to bring zero-ETL data sharing to the Salesforce Data Cloud. This eliminates the cost of copying data between platforms while also delivering security benefits.

As of today, companies can merge data from Salesforce’s Data Cloud with data from Databricks’ Lakehouse Platform, thanks to an expanded partnership that both parties announced today.

More specifically, the integration follows a Bring Your Own Lake (BYOL) model, through which data from Salesforce’s data lakehouse integrates with Databricks’, and vice versa. This happens with zero ETL (Extract, Transform, Load): the shared data arrives already combined, cleansed and normalized, and is therefore ready for data analysis.
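To make the zero-ETL idea concrete, here is a minimal conceptual sketch (not the actual Salesforce or Databricks API; table names and columns are illustrative placeholders). Once each platform can read the other’s tables directly, combining CRM data with lakehouse data reduces to a plain join, with no export/import pipeline in between:

```python
# Conceptual sketch of zero-ETL data sharing, using pandas as a stand-in.
# The "crm" frame plays the role of a Salesforce Data Cloud table and
# "events" a Databricks lakehouse table; both are illustrative.
import pandas as pd

crm = pd.DataFrame({"customer_id": [1, 2, 3],
                    "segment": ["enterprise", "smb", "smb"]})
events = pd.DataFrame({"customer_id": [1, 1, 2, 3],
                       "purchases": [2, 1, 4, 0]})

# With zero ETL, the shared data is already normalized, so analysis
# is a direct aggregation plus join rather than a copy pipeline.
combined = crm.merge(events.groupby("customer_id", as_index=False).sum(),
                     on="customer_id")
print(combined)
```

The point of the sketch is the absence of any intermediate extract/load step: both inputs are queried where they live and merged in place.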

Solving AI problems

Salesforce frames the new capabilities as part of the AI story. According to the CRM specialist, 80 percent of IT teams see AI tools as the right way to extract value from data. At the same time, about 40 percent indicate that enterprise data is not currently proving useful due to its complexity.

The integration allows more data to flow toward AI models. Users can process the data and train their own AI models with Databricks Machine Learning, then deploy them in any application on the Salesforce platform.

Also read: Salesforce Genie makes a real-time personalized customer experience possible