
    Salesforce to Databricks: A Deep Dive into Integration Strategies

    July 15, 2025

    Supplementing Salesforce with Databricks as an enterprise Lakehouse solution brings advantages to various personas across an organization. Customer experience data is especially valuable for driving personalized customer journeys across company-wide applications beyond Salesforce. From improved customer satisfaction to tailored engagements and offerings that drive renewals and expansions, the advantages are hard to miss. Databricks maps data from a variety of enterprise apps, including those used by Sales, Marketing, and Finance. Layering Databricks generative AI and predictive ML capabilities on top of that data provides easily accessible, best-fit recommendations that help eliminate challenges and highlight areas of success within your company’s customer base.

    In this blog, I elaborate on the different methods by which Salesforce data can be made accessible from within Databricks. Accessing Databricks data from Salesforce is also possible, but it is not the topic of this post and may be covered in a later blog. I have focused on the built-in capabilities of Salesforce and Databricks and have therefore excluded third-party data integration platforms. There are three main ways to achieve this integration:

    1. Databricks Lakeflow Ingestion from Salesforce
    2. Databricks Query Federation from Salesforce Data Cloud
    3. Databricks Files Sharing from Salesforce Data Cloud

    The best approach depends on your use case. The decision is driven by several factors, such as the expected latency of accessing the latest Salesforce data, the complexity of the data transformations needed, and the volume of Salesforce data of interest. It may very well be that more than one method is implemented to cater to different requirements.

    While the first method copies the raw Salesforce data over to Databricks, methods 2 and 3 are no-copy alternatives that use Salesforce Data Cloud itself as the raw data layer. The no-copy options are attractive because they leverage Salesforce’s native ability to manage its own data lake, eliminating the overhead of duplicating that effort. However, they come with limitations, depending on the use case. The matrix below shows how each method compares across the key integration criteria.

    Type
      • Lakeflow Ingestion: Data Ingestion
      • Query Federation: Zero-Copy
      • File Sharing: Zero-Copy

    Supports Salesforce Data Cloud as a Source?
      • Lakeflow Ingestion: ✔︎ Yes
      • Query Federation: ✔︎ Yes
      • File Sharing: ✔︎ Yes

    Incremental Data Refreshes
      • Lakeflow Ingestion: ✔︎ Automated processing into Databricks based on standard Salesforce timestamp fields; formula fields always require a full refresh.
      • Query Federation: ✔︎ Automated in SF Data Cloud (requires custom handling if copying to Databricks)
      • File Sharing: ✔︎ Automated in SF Data Cloud (requires custom handling if copying to Databricks)

    Processing of Soft Deletes
      • Lakeflow Ingestion: ✔︎ Yes, supported incrementally
      • Query Federation: ✔︎ Automated in SF Data Cloud (requires custom handling if copying to Databricks)
      • File Sharing: ✔︎ Automated in SF Data Cloud (requires custom handling if copying to Databricks)

    Processing of Hard Deletes
      • Lakeflow Ingestion: ✘ Requires a full refresh
      • Query Federation: ✔︎ Automated in SF Data Cloud (requires custom handling if copying to Databricks)
      • File Sharing: ✔︎ Automated in SF Data Cloud (requires custom handling if copying to Databricks)

    Query Response Time
      • Lakeflow Ingestion: ✔︎ Best, as data is queried from a local copy and processed within Databricks
      • Query Federation: ⚠ Slower, as the response depends on SF Data Cloud and data travels across networks
      • File Sharing: ⚠ Slower, as data travels across networks

    Supports Real-Time Querying?
      • Lakeflow Ingestion: ✘ No. The pipeline runs on a schedule to copy data (for example, hourly or daily).
      • Query Federation: ✔︎ Yes. Live query execution on SF Data Cloud (the Data Cloud DLO is refreshed from Salesforce modules in batches, via streaming every 3 minutes, or in real time).
      • File Sharing: ✔︎ Yes. Live data sourced from SF Data Cloud (the Data Cloud DLO is refreshed from Salesforce modules in batches, via streaming every 3 minutes, or in real time).

    Supports Databricks Streaming Pipelines?
      • Lakeflow Ingestion: ✔︎ Yes, with Declarative Pipelines into streaming tables (DLT), running as micro-batch jobs
      • Query Federation: ✘ No
      • File Sharing: ✘ No

    Suitable for High Data Volume?
      • Lakeflow Ingestion: ✔︎ Yes. The SF Bulk API is called for high data volumes such as initial loads, and the SF REST API is used for lower volumes such as incremental loads.
      • Query Federation: ✘ No. Reliant on JDBC query pushdown limitations and SF performance.
      • File Sharing: ⚠ Moderate. More suitable than Query Federation for zero-copy access to high data volumes.

    Supports Data Transformation?
      • Lakeflow Ingestion: ⚠ No direct transformation; SF objects are ingested as-is, and transformation happens downstream in the Declarative Pipeline.
      • Query Federation: ✔︎ Yes. Databricks pushes queries over to Salesforce using the JDBC protocol.
      • File Sharing: ✔︎ Yes. Transformations execute on Databricks compute.

    Protocol
      • Lakeflow Ingestion: SF REST API and Bulk API over HTTPS
      • Query Federation: JDBC over HTTPS
      • File Sharing: Salesforce Data Cloud DaaS APIs over HTTPS (file-based access)

    Scalability
      • Lakeflow Ingestion: Up to 250 objects per pipeline; multiple pipelines are allowed
      • Query Federation: Depends on SF Data Cloud performance when running transformations across multiple objects
      • File Sharing: Up to 250 Data Cloud objects per data share; up to 10 data shares

    Salesforce Prerequisites
      • Lakeflow Ingestion: An API-enabled Salesforce user with access to the desired objects
      • Query Federation: Salesforce Data Cloud must be available; Data Cloud DMOs mapped to DLOs with Streams or other methods for Data Lake population; JDBC API access to Data Cloud enabled
      • File Sharing: Salesforce Data Cloud must be available; Data Cloud DMOs mapped to DLOs with Streams or other methods for Data Lake population; a data share target created in SF with the shared objects
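    To make the first two methods more concrete, here are two minimal sketches in Python. They are illustrative only and not taken from the original article: the catalog, schema, connection, and object names are hypothetical placeholders, and the Lakeflow ingestion pipeline and the Data Cloud connection are assumed to have been configured separately (for example, through the Databricks UI).

    The first sketch shows the downstream transformation pattern for Lakeflow Ingestion: a Declarative Pipeline (DLT) table that streams incrementally from a bronze table assumed to be populated by the ingestion pipeline for the Salesforce Account object.

    ```python
    import dlt
    from pyspark.sql import functions as F

    # Hypothetical three-level name of a table populated by a Lakeflow ingestion
    # pipeline from the Salesforce "Account" object; substitute your own names.
    BRONZE_ACCOUNTS = "main.salesforce_bronze.account"

    @dlt.table(comment="Cleaned Salesforce accounts, refreshed incrementally")
    def account_silver():
        return (
            spark.readStream.table(BRONZE_ACCOUNTS)     # incremental reads from the ingested copy
            .where(F.col("IsDeleted") == F.lit(False))  # drop soft-deleted records
            .withColumn("_processed_at", F.current_timestamp())
        )
    ```

    The second sketch shows the Query Federation path, assuming a Unity Catalog connection to Salesforce Data Cloud (here named salesforce_dc) has already been created by an administrator. The foreign catalog exposes Data Cloud objects so that queries run live and are pushed down over JDBC instead of copying data.

    ```python
    # Depending on the connection type, CREATE FOREIGN CATALOG may also require
    # an OPTIONS (...) clause; check your workspace's federation settings.
    spark.sql("""
        CREATE FOREIGN CATALOG IF NOT EXISTS salesforce_dc_catalog
        USING CONNECTION salesforce_dc
    """)

    # Placeholder object name; each query runs live against SF Data Cloud,
    # so no Salesforce data is copied into Databricks ahead of time.
    recent_cases = spark.sql("""
        SELECT Id, Status, LastModifiedDate
        FROM salesforce_dc_catalog.default.case_dlm
        WHERE LastModifiedDate >= current_date() - INTERVAL 7 DAYS
    """)
    recent_cases.show()
    ```

    Both sketches assume a Unity Catalog-enabled workspace; adjust the names and filters to your own data model.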

    If you’re looking for guidance on leveraging Databricks with Salesforce, reach out to Perficient for a discussion with Salesforce and Databricks specialists.
