https://raw.githubusercontent.com/ajmaradiaga/feeds/main/scmt/topics/SAP-Datasphere-blog-posts.xmlSAP Community - SAP Datasphere2026-02-17T12:12:37.199872+00:00python-feedgenSAP Datasphere blog posts in SAP Communityhttps://community.sap.com/t5/technology-blog-posts-by-sap/rewiring-of-sap-datasphere-to-sap-business-data-cloud/ba-p/14292149Rewiring of SAP Datasphere to SAP Business Data Cloud2025-12-19T09:58:19.950000+01:00kpsauerhttps://community.sap.com/t5/user/viewprofilepage/user-id/14110<P> </P><P>The rewiring of existing SAP Datasphere tenants to SAP Business Data Cloud became available in 2025. Since then, I have been in many discussions with customers, partners, and SAP colleagues at SAP TechEd in Berlin, Sydney and at other events and meetings.</P><P>I observed that people unintentionally misinterpret the term “rewiring”. It sounds technical, hence there are associations with data, object or even full tenant migrations. This is not the case!</P><P>Let me use this blog to clarify what rewiring is and what it is not, and to give you some guidance on the steps you need to take.</P><P> </P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="2025-12.-.SAP BDC Rewiring Blog.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/353637i077376378FF54FBB/image-size/large?v=v2&px=999" role="button" title="2025-12.-.SAP BDC Rewiring Blog.png" alt="2025-12.-.SAP BDC Rewiring Blog.png" /></span></P><P> </P><H2 id="toc-hId-1767356814">When do I need to rewire my existing SAP Datasphere tenant?</H2><P>You have decided to sign up for SAP Business Data Cloud (BDC) and to use your existing tenants for SAP Datasphere and / or SAP Analytics Cloud as part of your new landscape with SAP BDC.</P><P>If you have not made any decision on SAP BDC yet, you can certainly use your existing tenant(s) until the end of your current contract without SAP BDC. However, you will miss out on new capabilities offered by SAP BDC, such as SAP-managed data products and the enhanced open partner ecosystem options.</P><P> </P><H2 id="toc-hId-1570843309">What is tenant rewiring?</H2><P>The term “rewiring” sounds quite technical. Therefore, there are immediate associations with data, object or even full tenant migrations. <STRONG>This is not the case!</STRONG></P><OL><LI>The term rewiring mainly refers to a change of your commercial model from a stand-alone tenant to a tenant which is managed as part of a larger SAP BDC landscape.</LI><LI>Rewiring moves the management of your tenant lifecycle to the SAP for Me portal.</LI><LI>Prerequisites are that the tenant is moved commercially under your existing customer ID and that your current data center is also available for SAP Business Data Cloud. Please check both points!</LI><LI>If these conditions are met, a lot will stay the same: you stay on your existing hyperscaler and data center.
Hence, your tenant and its URL will remain intact, just under a new commercial model, and no downtime is required.</LI></OL><P> </P><H2 id="toc-hId-1374329804">What is tenant rewiring <U>not</U>?</H2><P>Rewiring is therefore not a tenant migration, as your existing tenant stays the same and is reused as part of an SAP BDC landscape.</P><P>Rewiring does not migrate your tenant between hyperscalers or data centers: under the conditions mentioned before, your current data center – given that it is available for SAP BDC – will stay the same on your existing customer ID.</P><P>Rewiring does not migrate your data or any objects, as the full existing tenant is only moved commercially, as described before.</P><P>Rewiring does not require any tenant downtime.</P><P> </P><H2 id="toc-hId-1177816299">How can I initiate a tenant rewiring?</H2><P>Before you can initiate the rewiring, you need to change the commercial model of your SAP Datasphere tenant to a Business Data Cloud contract – in short: a subscription to SAP Business Data Cloud must exist.</P><H4 id="toc-hId-1239468232">For tenants provisioned via BTP cockpit</H4><P>After the commercial change, your SAP BTP administrator can initiate the tenant rewiring directly through the SAP BTP platform global account as a self-service. <U>You control the timing of the rewiring yourself, as you trigger it via self-service</U>.</P><P>Once the request is received, the process to update the tenant and move it to SAP Business Data Cloud is initiated. The tenant will be visible in SAP Business Data Cloud when the rewiring is completed.</P><H4 id="toc-hId-1042954727">For subscription tenants provisioned by SAP</H4><P>After the commercial change, your subscription is moved from standalone to SAP Business Data Cloud. Be aware that the SAP Business Data Cloud cockpit feature must be activated for your data center through the SAP for Me portal <U>prior to rewiring</U>.</P><P>The rewiring itself is initiated as part of the subscription change request.
That means your tenant is automatically moved to SAP Business Data Cloud <U>as of the contract start date</U>.</P><P> </P><H2 id="toc-hId-588275784">Where do I find more detailed information?</H2><P>For more details I recommend these three resources:</P><P>SAP Community Blog by <a href="https://community.sap.com/t5/user/viewprofilepage/user-id/6335">@Ammar_Naji</a>:<BR /><span class="lia-unicode-emoji" title=":link:">🔗</span> <SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/update-sap-datasphere-and-sap-analytics-cloud-availability-via-sap-business/ba-p/14201985" target="_blank">UPDATE: SAP Datasphere and SAP Analytics Cloud Availability via SAP Business Data Cloud</A></SPAN></P><P>SAP Help documentation for detailed steps:<BR /><span class="lia-unicode-emoji" title=":link:">🔗</span> <SPAN><A href="https://help.sap.com/docs/business-data-cloud/administering-sap-business-data-cloud/rewiring-of-standalone-sap-datasphere-and-sap-analytics-cloud-subscriptions-to-sap-business-data-cloud?locale=en-US&version=LATEST" target="_blank" rel="noopener noreferrer">Rewiring of Standalone SAP Datasphere and SAP Analytics Cloud Subscriptions to SAP Business Data Cloud</A></SPAN></P><P><span class="lia-unicode-emoji" title=":link:">🔗</span> SAP Note <SPAN><A href="https://me.sap.com/notes/3650483" target="_blank" rel="noopener noreferrer">3650483</A></SPAN></P><P> </P><P>Let me know if there are still questions about rewiring and post them below <span class="lia-unicode-emoji" title=":backhand_index_pointing_down:">👇</span> in the comments section. I will try to answer them and loop in colleagues in case I do not know the answer myself.</P><P> </P>2025-12-19T09:58:19.950000+01:00https://community.sap.com/t5/technology-blog-posts-by-sap/connecting-sap-analytics-cloud-to-databricks-model-serving-endpoint/ba-p/14290451Connecting SAP Analytics Cloud to Databricks model serving endpoint2025-12-31T16:15:03.063000+01:00Ian_Henryhttps://community.sap.com/t5/user/viewprofilepage/user-id/239<H2 id="toc-hId-1767300138">Scenario</H2><P>Imagine your data scientists have worked their magic and built a predictive (I should call it AI now) model that you wish to access from an SAP Analytics Cloud (SAC) story. The model may require parameters to analyze specific data, or you may need to retrieve the latest output.<BR /><BR />This can be achieved using the SAC Multi Action feature. Databricks and SAC both support OAuth 2.0 for authentication. A service principal is used for authorization and authentication outside of the standard Databricks UI. The SAC Multi Action feature offers different actions (steps), including an API step which utilises a secure HTTP API connection.<BR /><BR /></P><H2 id="toc-hId-1570786633">Prerequisites</H2><H4 id="toc-hId-1632438566">Required Solutions</H4><P class="lia-indent-padding-left-30px" style="padding-left : 30px;">SAP Analytics Cloud<BR />Databricks</P><H4 id="toc-hId-1435925061">Required user access</H4><P class="lia-indent-padding-left-30px" style="padding-left : 30px;">SAP Analytics Cloud BI Admin or SAP Analytics Cloud Planning<BR />Databricks Workspace Admin</P><H4 id="toc-hId-1239411556">Required artefacts</H4><P class="lia-indent-padding-left-30px" style="padding-left : 30px;">Databricks Model Serving Endpoint</P><H2 id="toc-hId-784732613">OAuth with Databricks / SAP Databricks</H2><P>The steps are the same with native Databricks and SAP Databricks.
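</P><P>Behind the scenes, the SAC HTTP API connection performs the standard OAuth 2.0 client-credentials flow against Databricks. As a minimal sketch – assuming the documented workspace token endpoint <EM>/oidc/v1/token</EM> and the <EM>all-apis</EM> scope, with placeholder values for workspace host, client ID and secret – the token request looks roughly like this:</P><pre class="lia-code-sample language-python"><code>import requests

# Hedged sketch of the OAuth 2.0 client-credentials flow behind the SAC
# HTTP API connection. Workspace host, client ID and secret are placeholders.
WORKSPACE = "https://my-workspace.cloud.databricks.com"  # placeholder workspace host
TOKEN_URL = WORKSPACE + "/oidc/v1/token"  # Databricks machine-to-machine token endpoint

response = requests.post(
    TOKEN_URL,
    auth=("my-client-id", "my-oauth-secret"),  # service principal client ID + OAuth secret
    data={"grant_type": "client_credentials", "scope": "all-apis"},
    timeout=30,
)
response.raise_for_status()
access_token = response.json()["access_token"]  # short-lived bearer token</code></pre><P>SAC performs this exchange automatically once the connection is configured – the parameters it asks for map directly onto this request. 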
The Databricks documentation describes the generic steps and required parameters for OAuth 2.0 authorization. <BR /><A title="Authorize user access to Databricks with OAuth" href="https://docs.databricks.com/aws/en/dev-tools/auth/oauth-u2m" target="_blank" rel="noopener nofollow noreferrer">https://docs.databricks.com/aws/en/dev-tools/auth/oauth-u2m</A> <BR />The SAC HTTP API connection requires 5 parameters from Databricks.</P><UL><LI><STRONG>Data Service URL</STRONG></LI><LI><STRONG>OAuth Client ID</STRONG></LI><LI><STRONG>Secret</STRONG></LI><LI><STRONG>Token URL</STRONG></LI><LI><STRONG>Scope</STRONG></LI></UL><H2 id="toc-hId-588219108">Databricks Service Principal</H2><P>We must first create service principal in Databricks, this can be at either the account or the workspace level. <BR />Here I have created the service principal within the workspace.</P><P>Navigate to User Settings -> Identity and Access</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Identity and Access" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/352698i4C25221D374F4D8B/image-size/large?v=v2&px=999" role="button" title="Screenshot 2025-12-16 at 12.05.53.png" alt="Identity and Access" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Identity and Access</span></span></P><P>Add Service Principal</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Add Service Principal" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/352702i4055BFCC56799B8D/image-size/large?v=v2&px=999" role="button" title="Screenshot 2025-12-12 at 17.23.20.png" alt="Add Service Principal" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Add Service Principal</span></span></P><P> </P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Add Service Principal Continued" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/352704i9378BBAE1DD51735/image-size/large?v=v2&px=999" role="button" title="Screenshot 2025-12-12 at 17.23.29.png" alt="Add Service Principal Continued" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Add Service Principal Continued</span></span></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Add Service Principal Name" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/352706i3B46BE4A8EE663D2/image-size/large?v=v2&px=999" role="button" title="Screenshot 2025-12-12 at 17.24.15.png" alt="Add Service Principal Name" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Add Service Principal Name</span></span></P><H2 id="toc-hId-391705603">OAuth Secret</H2><P>With the Service Principal created we can generate the secret required for authentication</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Service Principal" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/352709i12102ED834CC383B/image-size/large?v=v2&px=999" role="button" title="DBX Service Principal.png" alt="Service Principal" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Service Principal</span></span></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Service Principal Secrets" style="width: 999px;"><img 
src="https://community.sap.com/t5/image/serverpage/image-id/352711iF1A720F945CBA38A/image-size/large?v=v2&px=999" role="button" title="Screenshot 2025-12-12 at 17.24.49.png" alt="Service Principal Secrets" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Service Principal Secrets</span></span></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Service Principal Generate OAuth Secret" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/352714i5AB793A1DA7355B1/image-size/large?v=v2&px=999" role="button" title="Screenshot 2025-12-12 at 17.25.33.png" alt="Service Principal Generate OAuth Secret" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Service Principal Generate OAuth Secret</span></span></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Generate OAuth Secret - Lifetime" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/352715iD4CB712F8407D5A2/image-size/large?v=v2&px=999" role="button" title="Screenshot 2025-12-12 at 17.25.48.png" alt="Generate OAuth Secret - Lifetime" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Generate OAuth Secret - Lifetime</span></span></P><P>We need the secret and client details for the SAC connection, copy them somewhere safe</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Generate Secret" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/352717i460C4918C3F2C78F/image-size/large?v=v2&px=999" role="button" title="Screenshot 2025-12-12 at 17.26.15.png" alt="Generate Secret" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Generate Secret</span></span></P><H2 id="toc-hId-195192098">Serving Endpoint Permissions</H2><P>The service principal requires execute/query permission on the model serving end point.<BR />We can add the permission via code or the UI as below</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Model Serving Endpoint" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357270i61AFE5486502A670/image-size/large?v=v2&px=999" role="button" title="Serve2.png" alt="Model Serving Endpoint" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Model Serving Endpoint</span></span></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Model Serving Endpoint Details" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357271i59D2D27690DEFB32/image-size/large?v=v2&px=999" role="button" title="Model Serving End Point.png" alt="Model Serving Endpoint Details" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Model Serving Endpoint Details</span></span></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Model Serving Endpoint Permissions" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357272i0E5AEA0DB5A94187/image-size/large?v=v2&px=999" role="button" title="Model Serving Permissions v2.png" alt="Model Serving Endpoint Permissions" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Model Serving Endpoint Permissions</span></span></P><H2 id="toc-hId--1321407">SAP Analytics Cloud Connection</H2><P>Switch to SAC, we need to define a new HTTP API 
connection</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="SAP Analytics Cloud HTTP API Connection" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357277iC6547E4A223D3D34/image-size/large?v=v2&px=999" role="button" title="SAC HTTP API Connection.png" alt="SAP Analytics Cloud HTTP API Connection" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">SAP Analytics Cloud HTTP API Connection</span></span><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="SAP Analytics Cloud HTTP API Connection Dialogue" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357278i664E788A3249442B/image-size/large?v=v2&px=999" role="button" title="SAC HTTP API Connection Details.png" alt="SAP Analytics Cloud HTTP API Connection Dialogue" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">SAP Analytics Cloud HTTP API Connection Dialogue</span></span><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="SAP Analytics Cloud HTTP API Connection with Placeholders" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357279iDBCDBEEE37E46534/image-size/large?v=v2&px=999" role="button" title="SAC HTTP API Connection Placeholders.png" alt="SAP Analytics Cloud HTTP API Connection with Placeholders" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">SAP Analytics Cloud HTTP API Connection with Placeholders</span></span></P><P>When we click "Create" it validates the connection details.</P><H2 id="toc-hId-149419445">SAC Multi Action</H2><P>Define a new SAC Multi Action using the connection defined.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="SAP Analytics Cloud New Multi Action" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357281iA380E1908900AC64/image-size/large?v=v2&px=999" role="button" title="Multi1.png" alt="SAP Analytics Cloud New Multi Action" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">SAP Analytics Cloud New Multi Action</span></span><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Multi Action - Add API Step" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357282iDE30765E352AFE9D/image-size/large?v=v2&px=999" role="button" title="Multi2.png" alt="Multi Action - Add API Step" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Multi Action - Add API Step</span></span><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Multi Action API Step" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357283iBD8D6D5FAECD56B3/image-size/large?v=v2&px=999" role="button" title="Multi3.png" alt="Multi Action API Step" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Multi Action API Step</span></span></P><P>Within the Multi Action properties we can select the previously defined connection. 
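</P><P>Before pasting the endpoint URL into the API step, it can be worth verifying the service principal's access outside SAC. Below is a minimal sketch, assuming a placeholder endpoint name, a token obtained via the client-credentials flow shown earlier, and a <EM>dataframe_records</EM> payload whose fields must match your model's input signature:</P><pre class="lia-code-sample language-python"><code>import requests

# Hedged sketch: call the model serving endpoint directly to check that the
# service principal can query it. Endpoint name and features are placeholders.
WORKSPACE = "https://my-workspace.cloud.databricks.com"
INVOCATIONS_URL = WORKSPACE + "/serving-endpoints/my-model-endpoint/invocations"

access_token = "token-from-the-oauth-step"  # bearer token for the service principal
payload = {"dataframe_records": [{"feature_1": 42.0, "feature_2": "A"}]}

response = requests.post(
    INVOCATIONS_URL,
    headers={"Authorization": "Bearer " + access_token},
    json=payload,
    timeout=60,
)
response.raise_for_status()
print(response.json())  # model predictions</code></pre><P>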
<BR />The API Request URL is required again here, we can paste in the Model Serving Endpoint here.</P><H2 id="toc-hId--47094060">SAP Analytics Cloud Story</H2><P>The multi action defined above needs to be included within an SAC story</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="SAC Story Add Multi Action Starter Component" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357408iF1B06605AF643514/image-size/large?v=v2&px=999" role="button" title="Story Multi Action.png" alt="SAC Story Add Multi Action Starter Component" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">SAC Story Add Multi Action Starter Component</span></span><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="SAC Multi Action Starter Properties" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357412iB9B12DC5AAAD8CBB/image-size/large?v=v2&px=999" role="button" title="Story Multi Action Properties.png" alt="SAC Multi Action Starter Properties" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">SAC Multi Action Starter Properties</span></span></P><P>We can test the Multi Action call by clicking on the play button. </P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="SAC Execute Multi Action" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357414i9E017140E41F87E8/image-size/large?v=v2&px=999" role="button" title="Story Multi Action Execute.png" alt="SAC Execute Multi Action" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">SAC Execute Multi Action</span></span><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="SAC Multi Action Complete" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357413i86BBF172F48D1F7A/image-size/large?v=v2&px=999" role="button" title="Story Multi Action Complete.png" alt="SAC Multi Action Complete" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">SAC Multi Action Complete</span></span></P><P>All being well, once completed we can see an additional notification available</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="SAC Multi Action Notifications" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357415iC813BE436B442E67/image-size/large?v=v2&px=999" role="button" title="Story Multi Action Notification.png" alt="SAC Multi Action Notifications" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">SAC Multi Action Notifications</span></span></P><H2 id="toc-hId--243607565">SAP Analytics Cloud Job Monitor</H2><P>The Multi Action history is captured within the Job Monitor</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="SAP Analytics Cloud - System Job Monitor" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357417iC791F1F7FE62AA32/image-size/large?v=v2&px=999" role="button" title="SAC Job Monitor.png" alt="SAP Analytics Cloud - System Job Monitor" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">SAP Analytics Cloud - System Job Monitor</span></span><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Job Monitor History" style="width: 999px;"><img 
src="https://community.sap.com/t5/image/serverpage/image-id/357416iC4096BB7B5107F9E/image-size/large?v=v2&px=999" role="button" title="SAC Job Monitor Details.png" alt="Job Monitor History" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Job Monitor History</span></span></P><H2 id="toc-hId--440121070">Conclusion</H2><P>In this blogpost I have shown that integrating a Databricks model serving endpoint can be achieved with only a few steps using out of the box features from Databricks and SAP Analytics Cloud.</P>2025-12-31T16:15:03.063000+01:00https://community.sap.com/t5/data-and-analytics-blog-posts/c-purchaseorderitemdex-add-the-creation-date-field-of-the-purchase-order/ba-p/14302591C_PURCHASEORDERITEMDEX - Add the creation date field of the purchase order item2026-01-07T08:30:47.905000+01:00miquelfornieleshttps://community.sap.com/t5/user/viewprofilepage/user-id/151594<P>Hello everyone,</P><P>I recently raised a case with SAP regarding the standard CDS View for purchase order items,<SPAN> </SPAN><STRONG>C_PURCHASEORDERITEMDEX</STRONG>, which currently returns the header creation date for all items. After reviewing the case, the SAP support team confirmed that this is the expected behavior and suggested submitting an idea on their Influence platform. My ERP is S/4HANA Public Cloud.</P><P><STRONG><U><EM>The goal of this blog post is to gather support from the community so that this improvement can be prioritized.</EM></U></STRONG><SPAN> </SPAN>If you agree that having the correct item-level creation date would add value, please take a moment to vote for my idea on the SAP Influence website. Your vote can make a real difference!</P><P class="lia-indent-padding-left-30px" style="padding-left : 30px;"><A title="https://influence.sap.com/sap/ino/#/idea/362226" href="https://influence.sap.com/sap/ino/#/idea/362226" target="_blank" rel="noopener noreferrer">https://influence.sap.com/sap/ino/#/idea/362226</A></P><P>Thank you in advance for your support.</P><P>Best regards,</P><P>Miquel</P>2026-01-07T08:30:47.905000+01:00https://community.sap.com/t5/technology-blog-posts-by-sap/sap-datasphere-news-in-december/ba-p/14299695SAP Datasphere News in December2026-01-09T10:11:58.169000+01:00kpsauerhttps://community.sap.com/t5/user/viewprofilepage/user-id/14110<P><STRONG>SAP Datasphere News in December</STRONG></P><P>What a year it has been. Wow! 2025 was packed with great events, lots of new features and enhancements to extend existing features, and the launch of SAP Business Data Cloud. 
<BR />Explore the latest updates in our community news blogs and enjoy watching my top feature highlights for 2025 on YouTube <span class="lia-unicode-emoji" title=":television:">📺</span> below.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kpsauer_0-1767108057334.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357255iEF04F58BA9D64138/image-size/large?v=v2&px=999" role="button" title="kpsauer_0-1767108057334.png" alt="kpsauer_0-1767108057334.png" /></span></P><P> </P><H2 id="toc-hId-1767570307">SAP Datasphere 2025 highlights in a nutshell</H2><P>What a great ride it has been in 2025 with SAP Datasphere!<BR />Let me share a few of my personal highlights.</P><H3 id="toc-hId-1700139521">SAP Business Data Cloud</H3><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kpsauer_0-1767108122051.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357256iA5A51A9559792A99/image-size/large?v=v2&px=999" role="button" title="kpsauer_0-1767108122051.png" alt="kpsauer_0-1767108122051.png" /></span></P><P>The announcement of SAP Business Data Cloud in February this year made quite some noise and put data, analytics and AI at the center of our portfolio at SAP. We delivered SAP Business Data Cloud in May 2025 as a fully managed software-as-a-service solution that unifies and governs all SAP data and seamlessly connects with third-party data.</P><P>It is an evolution of the business data fabric journey we started with SAP Datasphere and SAP Analytics Cloud some years ago and also offers a path forward for your investments in SAP BW.</P><P>Delivering out-of-the-box, always-on data products is the next pivotal step in our data strategy: built on a harmonized data model, they provide scalable access without duplication using zero-copy delta share technology.</P><P>SAP BDC serves as the foundation for your data and AI projects leveraging SAP-managed and custom data products, including our growing open partner ecosystem with Databricks and the newly announced partnerships with Google BigQuery, Snowflake and Microsoft Fabric.</P><P> </P><H2 id="toc-hId-1374543297">My top features in 2025</H2><P>Picking my top features for all of 2025 was not easy; here is my top feature list in no particular order.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kpsauer_1-1767108160895.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357257i86BEAE9602989FB6/image-size/large?v=v2&px=999" role="button" title="kpsauer_1-1767108160895.png" alt="kpsauer_1-1767108160895.png" /></span></P><P> </P><H3 id="toc-hId-1307112511">Object store (general availability)</H3><P><SPAN>In 2025 we delivered the option to include an object store next to the well-known SAP HANA Cloud based persistency for cost-efficient storage. You can use the existing tooling you already know to create local tables on files and to use replication flows and transformation flows, which then use Spark as the compute engine – all orchestrated by task chains. You can load large quantities of data via replication flows, prepare the data using transformation flows, and then share data to other spaces to be used as a source for additional flows, views, and of course analytic models. 
A really great enhancement!</SPAN></P><P> </P><H3 id="toc-hId-1110599006">Versioning of data builder objects</H3><P>Object versioning has been introduced for views, analytic models, data access controls, and local tables. New object versions are created automatically at deployment time, which now allows you to list the past versions of objects, open a past version in read-only mode in a new tab, download a past version to a CSN/JSON file, and restore a past version to replace the current version.</P><P>Be aware that saving an error-free restored version overwrites the current object. It is also important to understand that versioning applies to the metadata of an object, not its data.</P><P> </P><H3 id="toc-hId-914085501">Efficiency features</H3><P>Then there are tons of smaller enhancements that have been delivered in 2025 to increase efficiency and productivity overall. Let me just name a few here for the different components of Datasphere.</P><P><STRONG>Analytic model</STRONG></P><UL><LI>stacking and reusing of models</LI><LI>support for structures and data access controls for analytic models</LI><LI>a measure-dependency graph, the fiscal time dimension and unit conversion</LI><LI>many enhancements with variables and parameters</LI></UL><P><STRONG>Replication flows</STRONG></P><UL><LI>delta-only load type</LI><LI>support for views</LI><LI>reusing of source objects in multiple flows</LI><LI>many new sources and targets</LI></UL><P><STRONG>Transformation flows</STRONG></P><UL><LI>batch processing</LI><LI>parameter support</LI><LI>HANA tables as source in Spark <SPAN>transformation flows</SPAN></LI><LI>other features bringing Spark <SPAN>transformation flows </SPAN>almost on par with HANA Cloud based ones</LI></UL><P><STRONG>Task chains</STRONG></P><UL><LI>deep retry of failed tasks in nested task chains</LI><LI>REST API</LI><LI>parameter support</LI><LI>notification tasks</LI><LI>amongst other improvements</LI></UL><P><STRONG>Catalog</STRONG></P><UL><LI>support for SAP BW/4HANA, seamless planning, SAP BDC data products, Databricks via BDC Connect</LI><LI>GenAI-assisted generation of catalog asset descriptions and classifications</LI><LI>mass import of existing glossaries, terms, KPIs, etc. from flat files</LI></UL><P> </P><H3 id="toc-hId-717571996">SAP Cloud application lifecycle management</H3><P>SAP Datasphere is now integrated into SAP Cloud application lifecycle management (ALM) for health, job and automation monitoring.</P><P>Health monitoring enables you to check the health of one or more Datasphere tenants from the Health Monitoring app in Cloud ALM, covering</P><UL><LI>memory and disk usage</LI><LI>out-of-memory events</LI><LI>task failures, and</LI><LI>admission control events</LI></UL><P>Job and automation monitoring enables you to monitor the task runs and executions in one or more SAP Datasphere tenants from the Job and Automation Monitoring app in Cloud ALM.</P><P>Overall, the integration eases the monitoring of an SAP Datasphere tenant in a complex customer landscape with a centralized, consistent way of tracking the tenants.</P><P>In addition, the SAP EarlyWatch Alert (EWA) reports are now also available for SAP Datasphere. 
EWA is an automatic service analyzing the essential administrative areas of your SAP systems and covered by your maintenance agreement.</P><P> </P><H3 id="toc-hId-521058491">Data access controls based on IdP attributes</H3><P>Data access controls based on identity provider (IdP) attributes are a small enhancement, but it is very powerful. You can now use values in custom attributes provided by your IdP as identifiers in your data access controls and define conditions that apply to users with these values. It is powerful because using these attributes allows you to apply a single condition to all users whose accounts contain that specific attribute value. In other words, you can group users with attributes to assign permissions to a group instead of individual users.</P><P> </P><H3 id="toc-hId-324544986">YouTube feature highlights of 2025</H3><P> </P><P><div class="video-embed-center video-embed"><iframe class="embedly-embed" src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FEmFNGpL13ow%3Ffeature%3Doembed&display_name=YouTube&url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DEmFNGpL13ow&image=https%3A%2F%2Fi.ytimg.com%2Fvi%2FEmFNGpL13ow%2Fhqdefault.jpg&type=text%2Fhtml&schema=youtube" width="200" height="112" scrolling="no" title="SAP Datasphere: Top New Features | Highlights from 2025" frameborder="0" allow="autoplay; fullscreen; encrypted-media; picture-in-picture;" allowfullscreen="true"></iframe></div></P><P> </P><P> </P><H2 id="toc-hId--1051238"><SPAN>And more feature highlights of 2025</SPAN></H2><P>There are so many more great features which we delivered in 2025, so in addition to my 2025 top features, I pulled together the 2025 feature highlights in all different areas of SAP Datasphere for you. Enjoy the ride!</P><H4 id="toc-hId--437116400">Data Integration</H4><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="01 2025 Datasphere Top Features - Data Integration.gif" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357267i1FD8FDB53C005E30/image-size/large?v=v2&px=999" role="button" title="01 2025 Datasphere Top Features - Data Integration.gif" alt="01 2025 Datasphere Top Features - Data Integration.gif" /></span> </P><P> </P><H4 id="toc-hId--633629905">Catalog & Monitoring</H4><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="02 2025 Datasphere Top Features - Catalog.gif" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357268i3438E0436D080AD0/image-size/large?v=v2&px=999" role="button" title="02 2025 Datasphere Top Features - Catalog.gif" alt="02 2025 Datasphere Top Features - Catalog.gif" /></span></P><P> </P><H4 id="toc-hId--830143410"><SPAN>Data Modelling</SPAN></H4><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="03 2025 Datasphere Top Features - Modeling.gif" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357269iFD0AC6C19F787525/image-size/large?v=v2&px=999" role="button" title="03 2025 Datasphere Top Features - Modeling.gif" alt="03 2025 Datasphere Top Features - Modeling.gif" /></span></P><P> </P><H4 id="toc-hId--1026656915">Miscellaneous</H4><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="04 2025 Datasphere Top Features - Misc.gif" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357265i36D626A9EA14AF5E/image-size/large?v=v2&px=999" role="button" 
title="04 2025 Datasphere Top Features - Misc.gif" alt="04 2025 Datasphere Top Features - Misc.gif" /></span></P><P> </P><H2 id="toc-hId--636364406"><SPAN>What exactly is rewiring for SAP Datasphere?</SPAN></H2><P>I have heard that question so many times <SPAN>over the last few months from customers, partners, but also SAP colleagues at events like SAP TechEd in Berlin or Sydney. <BR /></SPAN><SPAN>During these discussions I observed that people unintentionally misinterpret the term “rewiring”. It sounds technical, hence there are associations with data, object or even full tenant migrations. This is not the case!</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="2025-12.-.SAP BDC Rewiring Blog.gif" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357264iC185AF2E2611255F/image-size/medium?v=v2&px=400" role="button" title="2025-12.-.SAP BDC Rewiring Blog.gif" alt="2025-12.-.SAP BDC Rewiring Blog.gif" /></span></P><P><SPAN>Therefore, I wrote a blog to clarify what rewiring is, and what it is not. Moreover, give you some guidance on the steps which you need to take. <BR /></SPAN><span class="lia-unicode-emoji" title=":link:">🔗</span><SPAN> <A href="https://community.sap.com/t5/technology-blog-posts-by-sap/rewiring-of-sap-datasphere-to-sap-business-data-cloud/ba-p/14292149" target="_blank">https://community.sap.com/t5/technology-blog-posts-by-sap/rewiring-of-sap-datasphere-to-sap-business-data-cloud/ba-p/14292149</A></SPAN></P><P><SPAN> </SPAN></P><H2 id="toc-hId--832877911">More blogs about SAP BDC and Datasphere to check out <span class="lia-unicode-emoji" title=":backhand_index_pointing_down:">👇</span></H2><UL><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/connecting-sap-analytics-cloud-to-databricks-model-serving-endpoint/ba-p/14290451" target="_blank">Connecting SAP Analytics Cloud to Databricks model serving endpoint</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/enterprise-resource-planning-blog-posts-by-sap/sap-for-me-adoption-report-drive-more-business-value-realization/ba-p/14299842" target="_blank">SAP for Me – Adoption Report – Drive More Business Value Realization</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/sap-analytics-cloud-%E3%83%A9%E3%82%A4%E3%82%BB%E3%83%B3%E3%82%B9%E8%AA%AC%E6%98%8E%E3%81%8A%E3%82%88%E3%81%B3%E3%82%B7%E3%83%9F%E3%83%A5%E3%83%AC%E3%83%BC%E3%82%B7%E3%83%A7%E3%83%B3%E6%A9%9F%E8%83%BD/ba-p/14228641" target="_blank">SAP Analytics Cloud ライセンス説明およびシミュレーション機能</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/partner-learning-accelerator-certified-academy-for-sap-business-data-cloud/ba-p/14297305" target="_blank">Partner learning accelerator: Certified academy for SAP Business Data Cloud</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/enterprise-resource-planning-blog-posts-by-sap/random-forest-iteration-1/ba-p/14298357" target="_blank">RANDOM FOREST ITERATION 1</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/leverage-data-products-in-the-context-of-planning-amp-forecasting/ba-p/14296774" target="_blank">Leverage data products in the context of planning & forecasting</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/what-s-new-in-sap-hana-cloud-december-2025/ba-p/14295366" target="_blank">What’s New in SAP HANA Cloud – December 
2025</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/sap-business-technology-platform-blog-posts/sap-bw-modernization-a-smarter-path-to-a-future-ready-data-foundation/ba-p/14280566" target="_blank">SAP BW Modernization: A Smarter Path to a Future-Ready Data Foundation</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/rewiring-of-sap-datasphere-to-sap-business-data-cloud/ba-p/14292149" target="_blank">Rewiring of SAP Datasphere to SAP Business Data Cloud</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/data-masking-data-scrambling-and-data-anonymization-in-business-data-cloud/ba-p/14288934" target="_blank">Data masking, data scrambling and data anonymization in Business Data Cloud with SAP Datasphere</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-members/datasphere-dsp-and-sac-data-and-metadata-versioning-backup-and-restore/ba-p/14291938" target="_blank">Datasphere (DSP) and SAC data and metadata versioning / backup and restore</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/data-and-analytics-learning-group-blog-posts/sap-datasphere-the-cloud-based-data-warehouse-data-fabric-solution/ba-p/14289054" target="_blank">SAP Datasphere – The Cloud-Based Data Warehouse / Data Fabric Solution</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-members/how-to-build-bw-bex-user-customer-exits-in-datasphere-dsp-sac/ba-p/14290016" target="_blank">How-To build BW BEx User/Customer Exits in Datasphere (DSP) / SAC</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-members/bw-vs-datasphere-dsp-amp-sac/ba-p/14289847" target="_blank">BW vs. Datasphere (DSP) & SAC</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/the-question-of-full-loading-large-cds-views-from-s-4-hana-problems-and/ba-p/14284880" target="_blank">The Question of Full Loading Large CDS VIews from S/4 HANA: Problems and Solutions</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/key-updates-from-teched-2025-that-you-will-want-to-know-as-a-data-engineer/ba-p/14288000" target="_blank">Key Updates from TechEd 2025 That You Will Want to Know as a Data Engineer</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-members/modernizing-sap-bw-with-sap-business-data-cloud/ba-p/14283938" target="_blank">Modernizing SAP BW with SAP Business Data Cloud</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/financial-management-blog-posts-by-sap/accelerating-financial-close-integrating-sap-datasphere-with-sap-group/ba-p/14288032" target="_blank">Accelerating Financial Close: Integrating SAP Datasphere with SAP Group Reporting</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/contribution-margin-forecast-with-sap-business-data-cloud/ba-p/14261075" target="_blank">Contribution Margin Forecast with SAP Business Data Cloud</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/financial-management-blog-posts-by-sap/modern-finance-data-modeling-in-sap-s-4hana/ba-p/14287318" target="_blank">Modern Finance Data Modeling in SAP S/4HANA</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/sap-for-utilities-blog-posts/sap-utilities-roadmap-2026-onwards/ba-p/14285848" target="_blank">SAP Utilities Roadmap 2026 Onwards</A></SPAN></LI><LI><SPAN><A 
href="https://community.sap.com/t5/data-and-analytics-learning-group-blog-posts/becoming-a-data-architect-your-first-step-in-the-data-architecture-learning/ba-p/14285865" target="_blank">Becoming a Data Architect: Your First Step in the Data Architecture Learning Journey</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/expose-bw-7-5-objects-as-data-products-in-databricks/ba-p/14282351" target="_blank">Expose BW 7.5 objects as Data products in Databricks</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/bdc-data-products-to-data-insights-journey/ba-p/14285433" target="_blank">BDC : Data Products to Data Insights Journey</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/sap-datasphere-amp-google-bigquery-3-integration-strategies-before-zero/ba-p/14284254" target="_blank">SAP Datasphere & Google BigQuery: 3 Integration Strategies Before Zero Copy via BDC Connect Arrives</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/understanding-average-exception-aggregation-in-sap-analytics-cloud/ba-p/14281352" target="_blank">Understanding Average Exception Aggregation in SAP Analytics Cloud</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/sap-learning-blog-posts/are-you-ready-to-unlock-the-full-potential-of-your-organization-s-data-with/ba-p/14283843" target="_blank">Are you ready to unlock the full potential of your organization’s data with SAP?</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/real-time-steering-with-live-planning/ba-p/14280932" target="_blank">Real-time steering with LIVE planning</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/how-to-scale-ai-the-importance-of-modernizing-data-and-integrating/ba-p/14283392" target="_blank">How to Scale AI: The importance of modernizing data and integrating processes</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/artificial-intelligence-blogs-posts/prompt-optimization-unlocking-hidden-power-for-enterprise-ai-success/ba-p/14283344" target="_blank">Prompt Optimization: Unlocking Hidden Power for Enterprise AI Success</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/fast-track-ai-driven-insights-with-zero-copy-data-access-between-sap-bdc/ba-p/14283360" target="_blank">Fast-track AI-driven Insights with Zero-Copy Data Access between SAP BDC and Google BigQuery</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/positioning-sap-cloud-erp-and-finance-intelligence-with-sap-business-data/ba-p/14283185" target="_blank">Positioning SAP Cloud ERP and Finance Intelligence with SAP Business Data Cloud</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-members/sap-business-data-cloud-bdc-a-beginner-s-guide-part-2/ba-p/14282101" target="_blank">SAP Business Data Cloud (BDC): A Beginner's Guide - Part 2</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-members/sap-datasphere-how-to-define-offsets-for-restriction-variables-in-analytic/ba-p/14281431" target="_blank">SAP Datasphere: How to define Offsets for Restriction Variables in Analytic Models</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/financial-management-blog-posts-by-sap/ifrs-18-explained-what-sap-s-4hana-customers-need-to-know-before-2027/ba-p/14282339" 
target="_blank">IFRS 18 Explained: What SAP S/4HANA Customers Need to Know Before 2027</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/sap-for-oil-gas-and-energy-blog-posts/hidden-crisis-ageing-infrastructure-reactive-o-amp-m-and-the-data-problem/ba-p/14281905" target="_blank">Hidden Crisis: Ageing Infrastructure, Reactive O&M, and the Data Problem No One Is Solving</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-members/building-power-bi-analytical-reports-using-sap-datasphere/ba-p/14275180" target="_blank">Building Power BI Analytical Reports using SAP Datasphere</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/maximising-ai-potential-a-blueprint-for-business-success/ba-p/14281058" target="_blank">Maximising AI Potential: A Blueprint for Business Success</A></SPAN></LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kpsauer_0-1767108426454.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/357263iE9BCC29B65EE5859/image-size/large?v=v2&px=999" role="button" title="kpsauer_0-1767108426454.png" alt="kpsauer_0-1767108426454.png" /></span></P><P>Find more information and related blog posts on the <SPAN><A href="https://pages.community.sap.com/topics/datasphere" target="_blank" rel="noopener noreferrer">topic page for SAP Datasphere</A></SPAN>. You will find further product information on our Community with various subpages about <SPAN><A href="https://pages.community.sap.com/topics/datasphere/business-content" target="_blank" rel="noopener noreferrer">Business Content</A></SPAN>, the <SPAN><A href="https://pages.community.sap.com/topics/datasphere/bw-bridge" target="_blank" rel="noopener noreferrer">SAP BW Bridge</A></SPAN> as well as content for <SPAN><A href="https://pages.community.sap.com/topics/datasphere/best-practices-troubleshooting" target="_blank" rel="noopener noreferrer">Best Practices & Troubleshooting</A></SPAN>. Also check out the new <SPAN><A href="https://help.sap.com/docs/SUPPORT_CONTENT/datasphere/4181116697.html?locale=en-US" target="_blank" rel="noopener noreferrer">support content for SAP Datasphere</A></SPAN> on SAP Help for troubleshooting and analysis guides, how-to guides, technical details, and more.</P><P>Find out how to unleash the power of your business data with SAP’s free learning content on <SPAN><A href="https://learning.sap.com/learning-journey/explore-sap-datasphere?source=social-meta-prdteng-ExploreSAPDatasphere" target="_blank" rel="noopener noreferrer">SAP Datasphere</A></SPAN>. It’s designed to help you enrich your data projects, simplify the data landscape, and make the most out of your investment. Check out even more role-based learning resources and opportunities to get certified in one place on <SPAN><A href="https://learning.sap.com/?url_id=text-sapcommunity-prdteng" target="_blank" rel="noopener noreferrer"> SAP Learning site.</A></SPAN></P><P> </P>2026-01-09T10:11:58.169000+01:00https://community.sap.com/t5/learner-stories/sap-business-data-cloud-certification-c-bcbdc-the-ultimate-guide-to-passing/ba-p/14305009SAP Business Data Cloud Certification (C_BCBDC): The Ultimate Guide to Passing SAP’s New AI Scenario2026-01-12T13:03:12.892000+01:00DheerajHJhttps://community.sap.com/t5/user/viewprofilepage/user-id/2197618<P><STRONG>Introduction</STRONG></P><P>Certification is changing, and it is a good thing. 
Instead of cramming facts and guessing at multiple-choice questions, you are invited to think like you do on the job. You will work through real scenarios, connect the dots, and show how you deliver outcomes. In this post we will talk about what is new and how to prepare for the SAP BDC (C_BCBDC) certification.</P><P><STRONG>Understanding the New BDC Certification Format - Scenario Based</STRONG></P><P>You will interact with an AI avatar to respond to real customer or project situations. For learners who prefer not to use AI, there is an alternative option to upload a video for human review. Links for more details: <A href="https://community.sap.com/t5/learner-stories/sap-certification-is-changing-what-you-need-to-know-about-the-new-format/ba-p/14274190" target="_blank"> SAP Certification is Changing What You Need to Know about the New Format</A>, <A href="https://learning.sap.com/get-certified/reimagining-certification" target="_blank" rel="noopener noreferrer">Reimagining Certification</A></P><P><STRONG>Preparation</STRONG></P><P><STRONG>SAP BDC Learning Journey</STRONG></P><P>For the best chance of success, complete the full SAP Business Data Cloud learning journey from start to finish. Treat every module, hands-on exercise, and quiz as required to build real exam readiness. Give special attention to the core topics below.</P><P><STRONG>Core Topics in SAP Business Data Cloud</STRONG></P><P><STRONG>1. BDC</STRONG></P><UL><LI><SPAN>One domain model</SPAN></LI><LI><SPAN>SAP for Me</SPAN></LI><LI><SPAN>Formations</SPAN></LI><LI><SPAN>Intelligent Applications</SPAN></LI><LI><SPAN>Data Products</SPAN></LI><LI><SPAN>Data Security</SPAN></LI><LI><SPAN>Architecture</SPAN></LI><LI><SPAN>BDC Cockpit</SPAN></LI><LI><SPAN>Zero-copy</SPAN></LI><LI><SPAN>ORD / CSN</SPAN></LI><LI><SPAN>Data Lake</SPAN></LI></UL><P><STRONG>2. SAP Datasphere Integration</STRONG></P><UL><LI><SPAN>BDC-generated spaces</SPAN></LI><LI><SPAN>Harmonization through modeling</SPAN></LI><LI><SPAN>Semantic enrichment</SPAN></LI><LI><SPAN>Extending SAP-delivered models</SPAN></LI><LI><SPAN>Installing data products to a space</SPAN></LI></UL><P><STRONG>3. SAP Analytics Cloud (SAC)</STRONG></P><UL><LI><SPAN>Role of SAC in BDC</SPAN></LI><LI><SPAN>Extending and discovering intelligent applications</SPAN></LI><LI><SPAN>Use cases for intelligent applications</SPAN></LI></UL><P><STRONG>4. BW Modernization</STRONG></P><UL><LI><SPAN>Lift, shift, and innovate strategies</SPAN></LI><LI><SPAN>Data Product Generator</SPAN></LI><LI><SPAN>Subscriptions and versions</SPAN></LI><LI><SPAN>Why BW is included in BDC</SPAN></LI></UL><P><STRONG>5. SAP Databricks Integration</STRONG></P><UL><LI><SPAN>MLflow, Delta Sharing, notebooks, Unity Catalog</SPAN></LI><LI><SPAN>Exploratory Data Analysis</SPAN></LI><LI><SPAN>Industry use cases</SPAN></LI><LI><SPAN>Sharing data products and ML results between Databricks and BDC</SPAN></LI></UL><P><STRONG>Sample Scenarios to Understand the Exam Format</STRONG></P><P><STRONG>Scenario 1</STRONG>: Global Manufacturing Company – Unified Analytics & Governance</P><P>Context: A global manufacturing company operates across Europe and Asia. They run SAP S/4HANA, legacy SAP BW, and several non-SAP systems for logistics and procurement. 
Leadership wants a single, governed analytics platform with reusable data products and ready-made insights.</P><P><STRONG>Question</STRONG>: How does BDC help</P><P><STRONG>Scenario 2</STRONG>: Financial Services Firm – AI-Driven Risk & Compliance Reporting</P><P><STRONG>Context</STRONG>: A financial services firm needs real-time risk reporting, strict compliance, and AI-driven forecasting. They must integrate SAP finance data with external market feeds and apply advanced analytics.</P><P>Question: How can Financial Services Firm integrate third-party data with its SAP data using SAP Business Data Cloud?</P><P><STRONG>Exam Questions Breakdown from My Attempt</STRONG></P><P>Based on my recent experience, here’s how the exam questions were distributed:</P><UL><LI><SPAN>Technical Knowledge of BDC:</SPAN><SPAN> 40%</SPAN></LI><LI><SPAN>Integration with SAP Business Warehouse:</SPAN><SPAN> 15%</SPAN></LI><LI><SPAN>Technical Knowledge of SAP Analytics Cloud:</SPAN><SPAN> 15%</SPAN></LI><LI><SPAN>Technical Knowledge of SAP Datasphere:</SPAN><SPAN> 15%</SPAN></LI><LI><SPAN>Technical Knowledge of SAP Databricks:</SPAN><SPAN> 15%</SPAN></LI></UL><P><STRONG>Helpful Links</STRONG></P><OL><LI><A href="https://learning.sap.com/learning-journeys/exploring-sap-business-data-cloud" target="_blank" rel="noopener noreferrer"><SPAN>Exploring SAP Business Data Cloud Learning Journey</SPAN></A></LI><LI><A href="https://community.sap.com/t5/learner-stories/sap-certification-is-changing-what-you-need-to-know-about-the-new-format/ba-p/14274190" target="_blank"><SPAN>SAP Certification is Changing What You Need to Know about the New Format</SPAN></A></LI><LI><A href="https://learning.sap.com/get-certified/reimagining-certification" target="_blank" rel="noopener noreferrer"><SPAN>Reimagining Certification</SPAN></A></LI><LI><A href="https://learning.sap.com/certifications/sap-certified-associate-sap-business-data-cloud" target="_blank" rel="noopener noreferrer"><SPAN>SAP Certified Associate SAP Business Data Cloud</SPAN></A></LI></OL>2026-01-12T13:03:12.892000+01:00https://community.sap.com/t5/technology-blog-posts-by-members/working-with-bw-authorizations-in-datasphere-on-enterprise-level/ba-p/14302361Working with BW Authorizations in Datasphere on Enterprise Level2026-01-14T22:07:10.971000+01:00Alex_Reckendreeshttps://community.sap.com/t5/user/viewprofilepage/user-id/2253235<H1 id="toc-hId-1658593533"><SPAN>Related resources</SPAN></H1><UL><LI><SPAN>Blog Post: <A href="https://community.sap.com/t5/technology-blog-posts-by-sap/introducing-remote-authorizations-from-sap-bw-4hana-for-sap-datasphere/ba-p/13518819" target="_self">Introducing Remote Authorizations from SAP BW/4HANA for SAP Datasphere</A> </SPAN></LI><LI><SPAN>Blog Post by <a href="https://community.sap.com/t5/user/viewprofilepage/user-id/275733">@Martin_Kuma</a>: <A href="https://community.sap.com/t5/technology-blog-posts-by-members/bw-like-authorizations-in-datasphere-dsp/ba-p/14153918" target="_self">BW-Like Authorizations in Datasphere (DSP)</A> </SPAN></LI><LI><SPAN>Documentation: <A href="https://help.sap.com/docs/SAP_DATASPHERE/be5967d099974c69b77f4549425ca4c0/f56e4271dc4943aa9f21223ce5c93873.html?locale=en-US" target="_self" rel="noopener noreferrer">Import SAP BW and SAP BW∕4HANA Analysis Authorizations</A> </SPAN></LI><LI><SPAN>Note: <A href="https://me.sap.com/notes/3062381" target="_self" rel="noopener noreferrer">3062381</A></SPAN></LI></UL><P><SPAN> </SPAN></P><H1 id="toc-hId-1462080028"><SPAN>Why resolve BW authorizations in 
Datasphere?</SPAN></H1><P><SPAN>In enterprise analytics, SAP Datasphere often needs to federate the <STRONG>SAP BW</STRONG> analysis authorization table <STRONG>RSDWC_RSEC_DAC</STRONG> and resolve it into data access controls (DACs). We build on this <STRONG>SAP standard</STRONG> (no custom BW developments are required beyond generation/extraction of the standard table) and bring the resolution step into the HANA layer for scale and performance.<BR /><BR /><STRONG>Performance matters</STRONG>: The default runtime-only resolution evaluates long WHERE predicate strings per session user and iterates over them at query time. With many users and complex rules, that becomes <STRONG>slow</STRONG> and unpredictable in terms of latency. By <STRONG>pre‑resolving</STRONG> per object into a persisted table, we decouple expensive checks from BI queries, achieve consistent, auditable results, and enable scheduled refreshes.</SPAN></P><P> </P><H1 id="toc-hId-1265566523"><SPAN>Prerequisites</SPAN></H1><OL><LI><STRONG><SPAN>RSDWC_RSEC_DAC as a Remote Table in Datasphere. </SPAN></STRONG><SPAN>Generate/refresh it in BW with transaction <EM>RSDWC_DAC_RSEC_GEN</EM> and the BAdI <EM>RSDWC_DAC_RSEC_USER_UPDATE</EM> (and optionally schedule report RSDWC_DAC_RSEC_EXTRACT). The table can then be exposed to Datasphere. <span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="2026-01-06 19_49_48-Downloads – Datei-Explorer.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/358722i79F0F298AC19D47B/image-size/large/is-moderation-mode/true?v=v2&px=999" role="button" title="2026-01-06 19_49_48-Downloads – Datei-Explorer.png" alt="2026-01-06 19_49_48-Downloads – Datei-Explorer.png" /></span><BR /></SPAN></LI><LI><STRONG><SPAN>A HANA DB user (Open SQL schema) in Datasphere. </SPAN></STRONG><SPAN>This user owns the function, procedure and table, and is granted to the space technical user so the refresh can run in task chains.<BR /></SPAN></LI><LI><STRONG><SPAN>Optional: an Entities/"booked values" table/view. </SPAN></STRONG><SPAN>Applying the predicates to the set of actually booked combinations avoids generating all possible cross‑combinations, which would otherwise multiply runtime and inflate result sizes.<BR /></SPAN></LI></OL><P> </P><H1 id="toc-hId-1069053018"><SPAN>Where the default approach falls short</SPAN></H1><UL><LI><SPAN>Runtime resolution per current user leads to high CPU usage and long runtimes due to long predicate strings.</SPAN></LI><LI><SPAN>SQL views in Datasphere lack loops/cursors; scaling to many users is cumbersome.</SPAN></LI><LI><SPAN>Duplicating views per user and UNIONing them is not maintainable and limits parallelization.</SPAN></LI></UL><P> </P><H1 id="toc-hId-872539513">Example at a glance</H1><P>In this blog’s walkthrough, <STRONG>OBJECT01</STRONG> is our <EM>authorization object</EM> that groups the key BW characteristics used for data access in the BWAUTH02 mapping; a minimal sketch of the resolution idea follows below.
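</P><P>As a purely illustrative warm-up – host, credentials, the sample user and the predicate value below are hypothetical, while view and column names follow this blog's naming; the actual implementation uses the HANA-native <EM>APPLY_FILTER</EM> shown later – the core idea is simply to read the stored WHERE-predicate string for one user and apply it to the booked entity combinations as dynamic SQL:</P><pre class="lia-code-sample language-python"><code>from hdbcli import dbapi  # SAP HANA Python client

# Hypothetical sketch of the pre-resolution idea. Host, credentials and the
# sample user are placeholders; view and column names follow the blog's naming.
conn = dbapi.connect(address="my-hana-host", port=443, user="HDB_USER01", password="secret")
cur = conn.cursor()

# 1. Fetch the per-user predicate string from the federated RSEC view.
cur.execute(
    'SELECT "FILTERFIELDNM" FROM "DEMO_AR_001"."GV_RSDWC_RSEC_DAC_BWAUTH02" '
    'WHERE "BWUSERID" = ? AND "OBJECTNAME" = ?',
    ("SOME_BW_USER", "OBJECT01_BWAUTH02"),
)
predicate = cur.fetchone()[0]  # e.g. "/BIC/0COMP_CODE" = '1000' AND "/BIC/0PLANT" = '0001'

# 2. Apply it to the booked combinations only; the SQLScript table function
#    below does the same natively via APPLY_FILTER.
cur.execute(
    'SELECT * FROM "DEMO_AR_001"."GV_OBJECT01_DAC_ENTITIES_BOOKED_BWAUTH02" '
    "WHERE " + predicate
)
print(cur.fetchall())  # authorized entity combinations for SOME_BW_USER</code></pre><P>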
Concretely, OBJECT01 controls combinations of the following standard InfoObjects:</P><UL><LI><STRONG>/BIC/0COMP_CODE</STRONG> (Company Code)</LI><LI><STRONG>/BIC/0PLANT</STRONG> (Plant)</LI><LI><STRONG>/BIC/0SALESORG</STRONG> (Sales Organization)</LI><LI><STRONG>/BIC/0PROFIT_CTR</STRONG> (Profit Center)</LI><LI><STRONG>/BIC/0CS_PLEVEL</STRONG></LI></UL><P>The <STRONG>SAP standard table RSDWC_RSEC_DAC</STRONG> (federated as a remote table to Datasphere) holds per BW user and per object the dynamic WHERE‑predicate string that expresses allowed values/ranges for these characteristics.</P><P> </P><H1 id="toc-hId-676026008"><SPAN>Solution overview: Function + Procedure + Table + Task Chain + View</SPAN></H1><P><SPAN>We implement five building blocks in an Open SQL schema and orchestrate them from Datasphere:</SPAN></P><OL><LI><SPAN><STRONG>Table Function</STRONG> `TF_AUTH_RESOLVE_OBJECT01_BWAUTH02`: reads normalized WHERE predicate from RSEC view and applies it to booked entities via `APPLY_FILTER`.</SPAN></LI><LI><SPAN><STRONG>Procedure</STRONG> `SP_AUTH_OBJECT01_REFRESH_BWAUTH02`: loops over all BW users for OBJECT01 and persists the union of results into a target table (full refresh).</SPAN></LI><LI><SPAN><STRONG>Persisted table</STRONG> `AUTH_OBJECT01_RESULT_BWAUTH02`: mirrors function output for /BIC/0* fields.</SPAN></LI><LI><SPAN><STRONG>View</STRONG> `GV_AUTH_OBJECT01_RESULT_BWAUTH02`: exposes the table for DACs and downstream pipelines.</SPAN></LI><LI><SPAN><STRONG>Task Chain</STRONG> `TC_AUTH_OBJECT01_REFRESH_BWAUTH02`: schedules the refresh procedure regularly.</SPAN></LI></OL><H3 id="toc-hId-737677941"><SPAN>Source artifacts and naming</SPAN></H3><UL><LI><SPAN>RSEC (clean) view: "DEMO_AR_001"."GV_RSDWC_RSEC_DAC_BWAUTH02" (column FILTERFIELDNM holds the normalized predicate)</SPAN></LI><LI><SPAN>Entities view (booked combinations): "DEMO_AR_001"."GV_OBJECT01_DAC_ENTITIES_BOOKED_BWAUTH02"</SPAN></LI><LI><SPAN>Target table: "DEMO_AR_001#HDB_USER01"."AUTH_OBJECT01_RESULT_BWAUTH02"</SPAN></LI><LI><SPAN>Exposing view: "DEMO_AR_001"."GV_AUTH_OBJECT01_RESULT_BWAUTH02"</SPAN></LI><LI><SPAN>Task Chain: TC_AUTH_OBJECT01_REFRESH_BWAUTH02</SPAN></LI></UL><H3 id="toc-hId-541164436"><SPAN>Target table </SPAN></H3><pre class="lia-code-sample language-sql"><code>CREATE COLUMN TABLE "DEMO_AR_001#HDB_USER01"."AUTH_OBJECT01_RESULT_BWAUTH02" (
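-- One row per user and authorized characteristic combination.
-- The refresh procedure below rebuilds this table in full (no history is kept).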
"User" NVARCHAR(256),
"OBJECTNAME" NVARCHAR(40),
"/BIC/0PLANT" NVARCHAR(20),
"/BIC/0SALESORG" NVARCHAR(20),
"/BIC/0CS_PLEVEL" NVARCHAR(20),
"/BIC/0PROFIT_CTR" NVARCHAR(20),
"/BIC/0COMP_CODE" NVARCHAR(20)
);</code></pre><H3 id="toc-hId-344650931"><SPAN>Table Function </SPAN></H3><pre class="lia-code-sample language-sql"><code>/* --------------------------------------------------------------------------------------------------
Object: FUNCTION "DEMO_AR_001#HDB_USER01"."TF_AUTH_RESOLVE_OBJECT01_BWAUTH02"
Purpose: For a given BW user, returns the authorized ENTITY combinations
for OBJECT01 (fixed here; can be parameterized later).
The WHERE predicate string is read dynamically from the RSEC view
and applied to the Entities view via APPLY_FILTER.
Input: P_USER (NVARCHAR(256)) – BWUSERID for which the authorization resolution is executed.
Output: Result set with the following columns (aligned to the new standard InfoObjects):
"User", "OBJECTNAME",
"/BIC/0PLANT", "/BIC/0SALESORG", "/BIC/0CS_PLEVEL",
"/BIC/0PROFIT_CTR", "/BIC/0COMP_CODE"
Assumptions & behavior:
- Exactly ONE record exists in the RSEC view for (BWUSERID = P_USER, OBJECTNAME = 'OBJECT01').
-> SELECT ... INTO expects exactly 1 row. More than one -> "row not unique"; none -> "no data found".
- INFOOBJ is intentionally fixed to 'OBJECT01' for this scenario.
- APPLY_FILTER expects a syntactically valid WHERE string (without the "WHERE" keyword).
Security/runtime:
- SQL SECURITY INVOKER: Privilege checks are executed in the caller's context.
-> Caller must have SELECT on the views used in the space/schema.
- APPLY_FILTER is safer than EXEC/EXECUTE IMMEDIATE regarding SQL injection,
however FILTERFIELDNM must be maintained reliably in the RSEC view.
---------------------------------------------------------------------------------------------------*/
CREATE FUNCTION "DEMO_AR_001#HDB_USER01"."TF_AUTH_RESOLVE_OBJECT01_BWAUTH02"(
IN P_USER NVARCHAR(256)
)
RETURNS TABLE (
"User" NVARCHAR(256),
"OBJECTNAME" NVARCHAR(40),
"/BIC/0PLANT" NVARCHAR(20),
"/BIC/0SALESORG" NVARCHAR(20),
"/BIC/0CS_PLEVEL" NVARCHAR(20),
"/BIC/0PROFIT_CTR" NVARCHAR(20),
"/BIC/0COMP_CODE" NVARCHAR(20)
)
LANGUAGE SQLSCRIPT
SQL SECURITY INVOKER
AS
BEGIN
/* Local variables:
- FILTER_CLAUSE: holds the dynamic WHERE predicate read from the RSEC view
- INFOOBJ: fixed to OBJECT01 (current scenario)
*/
DECLARE FILTER_CLAUSE NCLOB;
DECLARE INFOOBJ NVARCHAR(40);
/* Fixed authorization object; can be modeled as IN parameter if needed */
INFOOBJ = 'OBJECT01';
/* 1) Load filter string for the user.
Expects exactly one match. If SELECT returns 0 or >1 rows, HANA raises an exception.
This should be ensured in the RSEC view content. */
SELECT FILTERFIELDNM
INTO FILTER_CLAUSE
FROM "DEMO_AR_001"."GV_RSDWC_RSEC_DAC_BWAUTH02"
WHERE BWUSERID = :P_USER
AND OBJECTNAME = :INFOOBJ;
/* 2) APPLY_FILTER:
- Applies the dynamic WHERE expression to the entities view.
- The result is materialized in the implicit table variable lt_result. */
lt_result = APPLY_FILTER(
"DEMO_AR_001"."GV_OBJECT01_DAC_ENTITIES_BOOKED_BWAUTH02",
:FILTER_CLAUSE
);
/* 3) Assemble result:
- DISTINCT -> avoid duplicates.
- P_USER and INFOOBJ are carried as constant columns. */
RETURN
SELECT DISTINCT
:P_USER AS "User",
INFOOBJ AS "OBJECTNAME",
"/BIC/0PLANT",
"/BIC/0SALESORG",
"/BIC/0CS_PLEVEL",
"/BIC/0PROFIT_CTR",
"/BIC/0COMP_CODE"
FROM :lt_result;
END;</code></pre><P><SPAN>You can check how it is working: </SPAN></P><pre class="lia-code-sample language-sql"><code>SELECT * FROM "DEMO_AR_001#HDB_USER01"."TF_AUTH_RESOLVE_OBJECT01_BWAUTH02"('ABC02')</code></pre><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="2026-01-06 19_48_01-Greenshot.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/358721i22521C24E31A6CD6/image-size/large?v=v2&px=999" role="button" title="2026-01-06 19_48_01-Greenshot.png" alt="2026-01-06 19_48_01-Greenshot.png" /></span></P><H3 id="toc-hId-148137426"><SPAN>Procedure</SPAN></H3><pre class="lia-code-sample language-sql"><code>/* --------------------------------------------------------------------------------------------------
Object: PROCEDURE "DEMO_AR_001#HDB_USER01"."SP_AUTH_OBJECT01_REFRESH_BWAUTH02"
Purpose: Full-refresh build of table "AUTH_OBJECT01_RESULT_BWAUTH02":
- Collect all BWUSERID relevant for OBJECT01 (from GV_RSDWC_RSEC_DAC_BWAUTH02)
- For each user, call table function TF_AUTH_RESOLVE_OBJECT01_BWAUTH02
- Persist the union of results into the target table
Flow:
1) Cursor over DISTINCT BWUSERID from the RSEC view (OBJECTNAME = 'OBJECT01')
2) DELETE the target table (clear all rows for a full rebuild)
3) FOR-loop: INSERT ... SELECT from the table function for each user
Security:
- SQL SECURITY INVOKER: privilege checks run in caller context
- Caller requires:
* SELECT on source RSEC view
* INSERT/DELETE on target table
Performance:
- Loop calls the TF once per user (N calls).
For large N, consider a set-based approach (bulk table function) later.
Notes:
- INSERT lists columns explicitly to guard against future schema changes
---------------------------------------------------------------------------------------------------*/
CREATE PROCEDURE "DEMO_AR_001#HDB_USER01"."SP_AUTH_OBJECT01_REFRESH_BWAUTH02" ()
LANGUAGE SQLSCRIPT
SQL SECURITY INVOKER
AS
BEGIN
/* 1) Cursor: distinct list of relevant users for OBJECT01 */
DECLARE CURSOR cur_users FOR
SELECT DISTINCT BWUSERID
FROM "DEMO_AR_001"."GV_RSDWC_RSEC_DAC_BWAUTH02"
WHERE OBJECTNAME = 'OBJECT01'
AND BWUSERID IS NOT NULL;
/* 2) Full rebuild: clear target table */
DELETE FROM "DEMO_AR_001#HDB_USER01"."AUTH_OBJECT01_RESULT_BWAUTH02";
/* 3) Implicit FOR-loop: HANA opens/closes the cursor automatically */
FOR cur_row AS cur_users DO
INSERT INTO "DEMO_AR_001#HDB_USER01"."AUTH_OBJECT01_RESULT_BWAUTH02"
( "User", "OBJECTNAME",
"/BIC/0PLANT", "/BIC/0SALESORG", "/BIC/0CS_PLEVEL",
"/BIC/0PROFIT_CTR", "/BIC/0COMP_CODE" )
SELECT
"User", "OBJECTNAME",
"/BIC/0PLANT", "/BIC/0SALESORG", "/BIC/0CS_PLEVEL",
"/BIC/0PROFIT_CTR", "/BIC/0COMP_CODE"
FROM "DEMO_AR_001#HDB_USER01"."TF_AUTH_RESOLVE_OBJECT01_BWAUTH02"( :cur_row.BWUSERID );
END FOR;
END;</code></pre><P>Execute via Task Chain step or manually:</P><pre class="lia-code-sample language-sql"><code>CALL "DEMO_AR_001#HDB_USER01"."SP_AUTH_OBJECT01_REFRESH_BWAUTH02"();
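-- Optional sanity check after the refresh: row counts per user
SELECT "User", COUNT(*) AS "ROW_COUNT"
FROM "DEMO_AR_001#HDB_USER01"."AUTH_OBJECT01_RESULT_BWAUTH02"
GROUP BY "User";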
SELECT * FROM "DEMO_AR_001#HDB_USER01"."AUTH_OBJECT01_RESULT_BWAUTH02";</code></pre><H3 id="toc-hId--123607448"><SPAN>Security, view exposure, and task chain</SPAN></H3><UL><LI><SPAN>Create Datasphere view: "DEMO_AR_001"."GV_AUTH_OBJECT01_RESULT_BWAUTH02" over the table to feed DACs and downstream processes. </SPAN></LI></UL><P><SPAN><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="2026-01-06 19_27_16-Downloads – Datei-Explorer.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/358719iE7204E8EDF5AD5FF/image-size/large/is-moderation-mode/true?v=v2&px=999" role="button" title="2026-01-06 19_27_16-Downloads – Datei-Explorer.png" alt="2026-01-06 19_27_16-Downloads – Datei-Explorer.png" /></span></SPAN></P><UL><LI><SPAN>Schedule a Task Chain "TC_AUTH_OBJECT01_REFRESH_BWAUTH02" that calls the procedure for regular refresh.</SPAN></LI></UL><P><SPAN><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="2026-01-06 19_35_52-Downloads – Datei-Explorer.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/358720i054EFDE8AAEE6E59/image-size/large/is-moderation-mode/true?v=v2&px=999" role="button" title="2026-01-06 19_35_52-Downloads – Datei-Explorer.png" alt="2026-01-06 19_35_52-Downloads – Datei-Explorer.png" /></span></SPAN></P><P>To add the procedure your user needs to be granted: </P><pre class="lia-code-sample language-sql"><code>-- Minimal grants (adapt to your technical user / space)
GRANT EXECUTE ON PROCEDURE "DEMO_AR_001#HDB_USER01"."SP_AUTH_OBJECT01_REFRESH_BWAUTH02" TO "DEMO_AR_001";
GRANT SELECT, INSERT, DELETE ON "DEMO_AR_001#HDB_USER01"."AUTH_OBJECT01_RESULT_BWAUTH02" TO "DEMO_AR_001";
-- Optional: schema-wide
GRANT EXECUTE ON SCHEMA "DEMO_AR_001#HDB_USER01" TO "DEMO_AR_001" WITH GRANT OPTION;</code></pre><P>If you do not know the schema name, you can list all schemas: </P><pre class="lia-code-sample language-sql"><code>SELECT SCHEMA_NAME, SCHEMA_OWNER
FROM SYS.SCHEMAS</code></pre><P> </P><H1 id="toc-hId-266685061"><SPAN>Performance & scalability</SPAN></H1><UL><LI><SPAN>Pre-resolving moves string evaluation out of query-time; persisted results enable indexes and faster joins.</SPAN></LI><LI><SPAN>Keep the entities view "booked" to avoid generating all combinations (Cartesian blow-up).</SPAN></LI><LI><SPAN>For very large user counts, consider a set-based bulk table function to avoid N function calls.</SPAN></LI></UL><P> </P><H1 id="toc-hId-70171556"><SPAN>Troubleshooting</SPAN></H1><UL><LI><SPAN>Error 260 invalid column name: fix duplicated /BIC/ prefixes or mismatched InfoObject names.</SPAN></LI><LI><SPAN>No data found / row not unique: enforce exactly 1 row per (BWUSERID, OBJECTNAME) in RSEC view.</SPAN></LI><LI><SPAN>Permissions: INVOKER needs SELECT on sources and INSERT/DELETE on target; scheduler user needs EXECUTE on procedure.</SPAN></LI></UL><P> </P><H1 id="toc-hId--126341949"><SPAN>What you get</SPAN></H1><P><SPAN>A maintainable, enterprise-ready pattern to pre-resolve BW authorizations in SAP Datasphere, fully based on the SAP standard table RSDWC_RSEC_DAC: reproducible results, faster queries and a scheduling hook via Task Chains. From here, you can attach DACs to exposed views or plug the table into harmonization and governance pipelines.</SPAN></P>2026-01-14T22:07:10.971000+01:00https://community.sap.com/t5/technology-blog-posts-by-sap/generating-and-integrating-automated-predictive-library-apl-forecasts-in-a/ba-p/14309857Generating and Integrating Automated Predictive Library (APL) Forecasts in a Seamless Planning Model2026-01-20T11:01:19.924000+01:00Max_Ganderhttps://community.sap.com/t5/user/viewprofilepage/user-id/14553<H1 id="toc-hId-1658806850"><SPAN>Introduction</SPAN><SPAN> </SPAN></H1><P><SPAN>SAP Analytics Cloud has always been the one solution for BI, planning and predictive analytics. As such, it has powerful built-in capabilities for regression, classification and time-series forecasting. You know them as </SPAN><A href="https://help.sap.com/docs/SAP_ANALYTICS_CLOUD/00f68c2e08b941f081002fd3691d86a7/37db2128dab44d15b46e1918829c1ff1.html" target="_blank" rel="noopener noreferrer"><I><SPAN>Predictive Scenarios</SPAN></I></A><SPAN> and many of you have used them to support your planning processes. Our predictive scenarios are perfect for business users: they choose the best available algorithm for your data and explain the results while maintaining the semantics of the model throughout the process (e.g., hierarchies). With SAP Business Data Cloud and seamless planning, data scientists on the other hand can now leverage HANA’s Predictive Analysis Library (PAL) and Automated Predictive Library (APL) directly on the HANA database of SAP Datasphere and nicely integrate the results into planning processes. This lets them tweak predictive models by picking and choosing the algorithm of their choice and use code instead of a UI. SAP BDC would also allow them to share data with SAP Databricks or another Databricks instance using data products if this was their preferred environment.</SPAN><SPAN> </SPAN></P><P><SPAN>This blogpost was created with <a href="https://community.sap.com/t5/user/viewprofilepage/user-id/187920">@marc_daniau</a></SPAN><SPAN>, a development expert for our predictive engine.</SPAN><STRONG><SPAN> </SPAN></STRONG><SPAN>We want to demonstrate the usage of HANA APL in combination with a seamless planning model and live versions. We do this using a straightforward prediction based on actual data. 
</SPAN><SPAN> </SPAN></P><H1 id="toc-hId-1462293345"> </H1><H1 id="toc-hId-1265779840"><SPAN>High-level overview</SPAN><SPAN> </SPAN></H1><P><SPAN>This is what we are working with:</SPAN><SPAN> </SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Overview.png" style="width: 364px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/362649iCE5D9CB2ED7F1CEE/image-dimensions/364x296/is-moderation-mode/true?v=v2" width="364" height="296" role="button" title="Overview.png" alt="Overview.png" /></span></P><P><SPAN>In SAP Analytics Cloud, we have a planning model deployed to an SAP Datasphere space. That makes it a seamless planning model which means that its data is not stored on the SAP Analytics Cloud database but only on the SAP Datasphere database. </SPAN><SPAN> </SPAN></P><P><SPAN>In the SAP Datasphere space, we find the planning model data and a table with actuals. We are not using an SAP BDC data product, but you surely could!</SPAN><SPAN> </SPAN></P><P><SPAN>We create our prediction directly on the underlying HANA Cloud database of SAP Datasphere. We created a DB user to access the database. There, we consume the actuals, create a stored procedure which we can trigger via a task chain, and surface the result in a view in the space. This fact data can then be added to the seamless planning model as a live version. </SPAN><SPAN> </SPAN></P><H1 id="toc-hId-1069266335"> </H1><H1 id="toc-hId-872752830"><SPAN>Step-by-step</SPAN><SPAN> </SPAN></H1><H2 id="toc-hId-805322044"><SPAN>1. Seamless planning model</SPAN><SPAN> </SPAN></H2><P><SPAN>We do not cover the full creation and set-up of the model. Let’s just check out its key characteristics: </SPAN><SPAN> </SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="SACModel1.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/362654i9EC38A28AA5AD0CE/image-size/medium/is-moderation-mode/true?v=v2&px=400" role="button" title="SACModel1.png" alt="SACModel1.png" /></span></P><UL><LI><SPAN>It is a seamless planning model, deployed to the space </SPAN><I><SPAN>Sales Planning Demo</SPAN></I><SPAN> </SPAN></LI></UL><UL><LI><SPAN>Its fact table is exposed in the SAP Datasphere space (actually, this is not decisive for the use case described here but could be useful if you want to add budget data as influence for your predictive model, for instance)</SPAN><SPAN> </SPAN></LI></UL><UL><LI><SPAN> We want to create a forecast version and predict the measure </SPAN><I><SPAN>SALES_REVENUE</SPAN></I><SPAN> along the product and region dimension</SPAN><SPAN> </SPAN></LI></UL><H2 id="toc-hId-608808539"><SPAN>2. SAP Datasphere space</SPAN></H2><P><SPAN>Again, we do not look at the creation of the space and all its artefacts. 
The following Actuals view is key as we extrapolate our forecast based on this data:</SPAN><SPAN> </SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Max_Gander_0-1768822158426.png" style="width: 619px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/362662i6996F8E0D37BD0C9/image-dimensions/619x271/is-moderation-mode/true?v=v2" width="619" height="271" role="button" title="Max_Gander_0-1768822158426.png" alt="Max_Gander_0-1768822158426.png" /></span></P><UL><LI><SPAN>Measures and attributes nicely match the planning model structure (which is handy for our simple demo scenario)</SPAN><SPAN> </SPAN></LI></UL><UL><LI><SPAN>We excluded columns that we do not need in a projection</SPAN><SPAN> </SPAN></LI></UL><UL><LI><SPAN>We filtered on date as we do not want to use the entire history</SPAN><SPAN> </SPAN></LI></UL><H2 id="toc-hId-412295034"><SPAN>3. Setting up database access</SPAN><SPAN> </SPAN></H2><P><SPAN>We are now ready to learn how to create the forecast on the HANA database. First of all, we need to set up database access. </SPAN><SPAN> </SPAN></P><UL><LI><SPAN>Prerequisite: </SPAN><A href="https://help.sap.com/docs/SAP_DATASPHERE/9f804b8efa8043539289f42f372c4862/287194276a7d4d778ec98fdde5f61335.html" target="_blank" rel="noopener noreferrer"><SPAN>Enable the SAP HANA Cloud Script Server on Your SAP Datasphere Tenant</SPAN></A><SPAN> </SPAN></LI></UL><UL><LI><SPAN>Navigate to </SPAN><I><SPAN>Space Management</SPAN></I><SPAN>, find your space and </SPAN><I><SPAN>Edit</SPAN></I><SPAN>. </SPAN><SPAN> </SPAN><BR /><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="SpaceMgmt.png" style="width: 457px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/362657i0A298C4981D816C6/image-dimensions/457x200?v=v2" width="457" height="200" role="button" title="SpaceMgmt.png" alt="SpaceMgmt.png" /></span></LI></UL><UL><LI><SPAN>Navigate to </SPAN><I><SPAN>Database Access </SPAN></I><SPAN>and create a new user</SPAN><SPAN> <BR /></SPAN><SPAN><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="DBUser.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/362658iB867E87C1DDBE070/image-size/medium?v=v2&px=400" role="button" title="DBUser.png" alt="DBUser.png" /></span> </SPAN></LI><LI><SPAN>Name your user and make the needed settings as highlighted. Your user’s name will be a concatenation of your space name, ‘#’ and the suffix you provide here.</SPAN><SPAN> <BR /></SPAN><SPAN><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="DBUserCreate.png" style="width: 280px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/362659i050398DA197C82A6/image-dimensions/280x340/is-moderation-mode/true?v=v2" width="280" height="340" role="button" title="DBUserCreate.png" alt="DBUserCreate.png" /></span> </SPAN></LI><LI><SPAN>Mark your user and open the database explorer. The password can be retrieved in the details of the user (information symbol). </SPAN><SPAN> <BR /><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="new.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/363619i0142F140E02D3149/image-size/medium?v=v2&px=400" role="button" title="new.png" alt="new.png" /></span><BR /></SPAN></LI></UL><H2 id="toc-hId-215781529"><SPAN>4. 
Database explorer</SPAN><SPAN> </SPAN></H2><P><SPAN>Let’s first have an overview of what we are creating in the database explorer:</SPAN><SPAN> </SPAN></P><P><SPAN> <span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="DBSchema.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/362661i25E08F2AFE92D1B1/image-size/medium/is-moderation-mode/true?v=v2&px=400" role="button" title="DBSchema.png" alt="DBSchema.png" /></span></SPAN></P><UL><LI><SPAN>We create a view that consumes the actual data from our space schema. Note the following:</SPAN><SPAN> </SPAN></LI></UL><UL><LI><SPAN>We concatenate </SPAN><I><SPAN>Product </SPAN></I><SPAN>and </SPAN><I><SPAN>Region</SPAN></I><SPAN> into one </SPAN><I><SPAN>Entity </SPAN></I><SPAN>column that we will use to segment our prediction. As we work directly on flat fact tables/views, we do not have the luxury of keeping all semantics as we do in the SAP Analytics Cloud predictive scenarios. </SPAN><SPAN> </SPAN></LI></UL><P><SPAN> </SPAN></P>
"Product" || '|' || "Regions" as "Entity", "Date", "SalesRevenue"
from
SALES_PLANNING_DEMO."V_Actual_Sales_Data"
order by 1, 2) </code></pre><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Max_Gander_8-1768812813328.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/362496iFF783DA8C0F6A65C/image-size/medium?v=v2&px=400" role="button" title="Max_Gander_8-1768812813328.png" alt="Max_Gander_8-1768812813328.png" /></span></P><UL><LI><SPAN>We now create the prediction task as a stored procedure. </SPAN><SPAN> </SPAN></LI></UL><pre class="lia-code-sample language-sql"><code>create procedure "APL_FORECAST_TASK"
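-- Full refresh: truncates the three target tables, runs APL FORECAST_AND_DEBRIEF
-- segmented by "Entity", then persists accuracy metrics and the task status.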
as BEGIN
declare header "SAP_PA_APL"."sap.pa.apl.base::BASE.T.FUNCTION_HEADER";
declare config "SAP_PA_APL"."sap.pa.apl.base::BASE.T.OPERATION_CONFIG_DETAILED";
declare var_desc "SAP_PA_APL"."sap.pa.apl.base::BASE.T.VARIABLE_DESC_OID";
declare var_role "SAP_PA_APL"."sap.pa.apl.base::BASE.T.VARIABLE_ROLES_WITH_COMPOSITES_OID";
declare apl_log "SAP_PA_APL"."sap.pa.apl.base::BASE.T.OPERATION_LOG";
declare apl_sum "SAP_PA_APL"."sap.pa.apl.base::BASE.T.SUMMARY";
declare apl_indic "SAP_PA_APL"."sap.pa.apl.base::BASE.T.INDICATORS";
declare apl_metr "SAP_PA_APL"."sap.pa.apl.base::BASE.T.DEBRIEF_METRIC_OID";
declare apl_prop "SAP_PA_APL"."sap.pa.apl.base::BASE.T.DEBRIEF_PROPERTY_OID";
truncate table "SALES_PLANNING_DEMO#AI_USER"."APL_SERIES_OUT";
truncate table "SALES_PLANNING_DEMO#AI_USER"."APL_FORECAST_ACCURACY";
truncate table "SALES_PLANNING_DEMO#AI_USER"."APL_FORECAST_STATUS";
:header.insert(('Oid', 'DSP APL'));
:header.insert(('LogLevel', '2'));
:header.insert(('MaxTasks', '4')); -- PARALLEL TASKS
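-- Forecast configuration: one model per "Entity" segment, 12 periods ahead,
-- forced positive forecasts, influencer decomposition and error bars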
:config.insert(('APL/SegmentColumnName', 'Entity',null));
:config.insert(('APL/Horizon', '12',null));
:config.insert(('APL/TimePointColumnName', 'Date',null));
:config.insert(('APL/ForcePositiveForecast', 'true',null));
:config.insert(('APL/DecomposeInfluencers', 'true',null));
:config.insert(('APL/ApplyExtraMode', 'First Forecast with Stable Components and Residues and Error Bars',null));
:var_role.insert(('Date', 'input', null, null, null));
:var_role.insert(('SalesRevenue', 'target', null, null, null));
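-- Run the forecast: read APL_SERIES_IN, write the result to APL_SERIES_OUT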
"SAP_PA_APL"."sap.pa.apl.base::FORECAST_AND_DEBRIEF" (
:header, :config, :var_desc, :var_role,
'SALES_PLANNING_DEMO#AI_USER', 'APL_SERIES_IN',
'SALES_PLANNING_DEMO#AI_USER', 'APL_SERIES_OUT', apl_log, apl_sum, apl_indic, apl_metr, apl_prop);
insert into "SALES_PLANNING_DEMO#AI_USER"."APL_FORECAST_ACCURACY"
select "Oid" as "Entity", "MAE", "MAPE"
from "SAP_PA_APL"."sap.pa.apl.debrief.report::TimeSeries_Performance" (:apl_prop, :apl_metr)
where "Partition" = 'Validation';
insert into "SALES_PLANNING_DEMO#AI_USER"."APL_FORECAST_STATUS"
select "OID" as "Entity", "VALUE" as "Task Status"
from :apl_sum
where key = 'AplTaskStatus';
END</code></pre><P><SPAN>For our demo scenario we use the default APL forecasting method that automatically tries different hypotheses for trend, cycles and fluctuations, and eventually selects the combination that gives the best accuracy. For a faster processing on many segments, an option is to force the Exponential Smoothing method by adding to the procedure this line of code below:</SPAN><SPAN> </SPAN></P><pre class="lia-code-sample language-abap"><code>:config.insert(('APL/ForecastMethod','ExponentialSmoothing',null)); </code></pre><P><SPAN>This is the code to prepare the target tables of the procedure:</SPAN><SPAN> </SPAN></P><pre class="lia-code-sample language-sql"><code>drop table APL_SERIES_OUT;
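-- Column names follow the APL forecast output convention: "kts_1" is the
-- forecast of the target measure, the Trend/Cycles/Fluctuations/Residues
-- columns are its decomposition, and the 95% limit columns are the error bars.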
create table APL_SERIES_OUT (
"Entity" nvarchar(180),
"Date" DATE,
"SalesRevenue" DOUBLE,
"kts_1" DOUBLE,
"kts_1Trend" DOUBLE,
"kts_1Cycles" DOUBLE,
"kts_1_lowerlimit_95%" DOUBLE,
"kts_1_upperlimit_95%" DOUBLE,
"kts_1ExtraPreds" DOUBLE,
"kts_1Fluctuations" DOUBLE,
"kts_1Residues" DOUBLE
);</code></pre><P><SPAN> </SPAN></P><pre class="lia-code-sample language-sql"><code>drop table APL_FORECAST_ACCURACY;
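-- MAE/MAPE per Entity, filled by the procedure above from the
-- TimeSeries_Performance debrief (Validation partition)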
create table APL_FORECAST_ACCURACY (
"Entity" nvarchar(180),
"MAE" DOUBLE,
"MAPE" DOUBLE
);</code></pre><P> </P><pre class="lia-code-sample language-sql"><code>drop table APL_FORECAST_STATUS;
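-- Task status per Entity, filled by the procedure above from the summary
-- rows with key 'AplTaskStatus'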
create table APL_FORECAST_STATUS (
"Entity" nvarchar(180),
"Task Status" nvarchar(180)
);</code></pre><UL><LI><SPAN>You can trigger this procedure manually using the command below:</SPAN><SPAN> </SPAN></LI></UL><pre class="lia-code-sample language-sql"><code>call "APL_FORECAST_TASK" ; </code></pre><P><SPAN>However, in the next chapter, we will also create a task chain in SAP Datasphere to trigger it, which can be embedded in real workflows. </SPAN><SPAN> </SPAN></P><UL><LI><SPAN>You see we generate and write data into three different tables:</SPAN><SPAN> </SPAN></LI><LI><SPAN>The prediction result goes into our results table called </SPAN><I><SPAN>APL_SERIES_OUT</SPAN></I><SPAN>. This is the data that we want for our seamless planning model. The table has the concatenated entity, date and the predicted revenue. It also comes with upper and lower limit predictions (95%) as well as with fluctuations, extra predictions etc. </SPAN><SPAN> <BR /></SPAN><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Max_Gander_9-1768812813328.png" style="width: 495px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/362498iE8B9D977F6B7BB67/image-dimensions/495x213?v=v2" width="495" height="213" role="button" title="Max_Gander_9-1768812813328.png" alt="Max_Gander_9-1768812813328.png" /></span><SPAN> </SPAN></LI><LI><SPAN>Optionally, we create the table </SPAN><I><SPAN>APL_FORECAST_ACCURACY</SPAN></I><SPAN> to store the MAPE (</SPAN><SPAN>Mean Absolute Percentage Error) and </SPAN><SPAN>MAE (</SPAN><SPAN>Mean Absolute Error) per entity. You could filter on the entities you are interested in, the best/worst entities etc. or you could get the MSE (Mean Squared Error) or RMSE (Root Mean Squared Error) as well. </SPAN><SPAN> </SPAN></LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Max_Gander_10-1768812813329.png" style="width: 561px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/362497iC10EECAEBD81A76A/image-dimensions/561x205?v=v2" width="561" height="205" role="button" title="Max_Gander_10-1768812813329.png" alt="Max_Gander_10-1768812813329.png" /></span></P><UL><LI><SPAN>Optionally, we create the table </SPAN><I><SPAN>APL_FORECAST_STATUS</SPAN></I><SPAN> where we log the prediction status per entity. All our entities were successful. </SPAN><SPAN> </SPAN></LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Max_Gander_11-1768812813329.png" style="width: 553px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/362499i7316AAB127FEFD38/image-dimensions/553x202?v=v2" width="553" height="202" role="button" title="Max_Gander_11-1768812813329.png" alt="Max_Gander_11-1768812813329.png" /></span><SPAN> </SPAN></P><H2 id="toc-hId-19268024"><SPAN>5. Task chain</SPAN><SPAN> </SPAN></H2><P><SPAN>Stored procedures can be executed via task chains. As such, you can execute them from the SAP Datasphere UI, schedule them or trigger them via an external API. Check out the task chain </SPAN><A href="https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/d1afbc2b9ee84d44a00b0b777ac243e1.html" target="_blank" rel="noopener noreferrer"><SPAN>documentation</SPAN></A><SPAN> to learn more about prerequisites such as required roles. Soon, you should be able to trigger this API via multi actions in SAP Analytics Cloud as well. 
</SPAN><SPAN> </SPAN></P><P><SPAN>We must allow the execution of the stored procedure via the SAP Datasphere UI, including the creation and deletion of data in the database user schema: </SPAN><SPAN> </SPAN></P><pre class="lia-code-sample language-sql"><code>CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (
OPERATION => 'GRANT',
PRIVILEGE => 'INSERT',
SCHEMA_NAME => 'SALES_PLANNING_DEMO#AI_USER',
OBJECT_NAME => '',
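  -- An empty OBJECT_NAME applies the privilege to the whole DB-user schema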
SPACE_ID => 'SALES_PLANNING_DEMO'); </code></pre><P><SPAN> </SPAN><SPAN> </SPAN></P><pre class="lia-code-sample language-sql"><code>CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (
OPERATION => 'GRANT',
PRIVILEGE => 'DELETE',
SCHEMA_NAME => 'SALES_PLANNING_DEMO#AI_USER',
OBJECT_NAME => '',
SPACE_ID => 'SALES_PLANNING_DEMO'); </code></pre><P> </P><pre class="lia-code-sample language-sql"><code>CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (
OPERATION => 'GRANT',
PRIVILEGE => 'EXECUTE',
SCHEMA_NAME => 'SALES_PLANNING_DEMO#AI_USER',
OBJECT_NAME => '',
SPACE_ID => 'SALES_PLANNING_DEMO'); </code></pre><P><SPAN>Setting up task chains is simple. To add a stored procedure, you select </SPAN><I><SPAN>Others</SPAN></I><SPAN> and browse through the procedures that are available for your space. Then you drag the procedure onto the canvas to add it as a task. </SPAN><SPAN> </SPAN></P><P><SPAN>You can add replication flows, transformation flows, intelligent lookups etc. to your task chains. By that, you could for instance first refresh actuals and get them into shape so you can use them for your prediction. Or, you add an email notification task to receive updates after the execution of the task chain (I did that in the example below). </SPAN><SPAN> </SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="TaskChain.png" style="width: 594px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/362663i911EC4E35C9385D5/image-dimensions/594x254?v=v2" width="594" height="254" role="button" title="TaskChain.png" alt="TaskChain.png" /></span></P><H2 id="toc-hId-170008876"><SPAN>6. Consuming results in the SAP Datasphere space</SPAN><SPAN> </SPAN></H2><P><SPAN>Now that we have the predictive logic and can execute it via the SAP Datasphere UI, we of course need to consume the forecast data in the planning model. To do that, we first need the results in the SAP Datasphere space. We create a view on top of the results table. We used a graphical view but depending on your preferences and skills, you may use an SQL view instead. </SPAN><SPAN> </SPAN></P><UL><LI><SPAN>Pull the results table from the DB user schema in the </SPAN><I><SPAN>Sources</SPAN></I><SPAN> tab.</SPAN><SPAN> </SPAN></LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="OUT.png" style="width: 612px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/362664iA354A704835CDFF0/image-dimensions/612x258/is-moderation-mode/true?v=v2" width="612" height="258" role="button" title="OUT.png" alt="OUT.png" /></span></P><UL><LI><SPAN>Add a calculation node to split the </SPAN><I><SPAN>Entity</SPAN></I><SPAN> column into regions and products again. Create two calculated columns (</SPAN><I><SPAN>Product</SPAN></I><SPAN> and </SPAN><I><SPAN>Region</SPAN></I><SPAN>) and use string functions. </SPAN><SPAN> The functions <EM>SUBSTR_BEFORE() </EM>and <EM>SUBSTR_AFTER() </EM>can be used to split a string using the first occurrence of a specified pattern (in our case '|' as the format of our <EM>Entity</EM> column is Product|Region). </SPAN></LI></UL>
Standard time tables and dimensions can be generated in Space Management (</SPAN><A href="https://help.sap.com/docs/SAP_DATASPHERE/be5967d099974c69b77f4549425ca4c0/c5cfce4d22b04650b2fd6078762cdeb9.html" target="_blank" rel="noopener noreferrer"><SPAN>link</SPAN></A><SPAN>). </SPAN><SPAN> </SPAN></LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Time.png" style="width: 645px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/362666i7A94C9C0FC601E05/image-dimensions/645x284/is-moderation-mode/true?v=v2" width="645" height="284" role="button" title="Time.png" alt="Time.png" /></span></P><UL><LI><SPAN>Add a projection and only keep the columns you need in the planning model.</SPAN><SPAN> </SPAN><UL><LI><SPAN>Calendar Month</SPAN><SPAN> </SPAN></LI><LI><SPAN>SalesRevenue</SPAN><SPAN> </SPAN></LI><LI><SPAN>Product</SPAN><SPAN> </SPAN></LI><LI><SPAN>Region</SPAN><SPAN> </SPAN></LI></UL></LI></UL><UL><LI><SPAN>Make sure to expose the view for consumption and select </SPAN><I><SPAN>Fact</SPAN></I><SPAN> as the Semantic Usage Type. </SPAN><SPAN> </SPAN></LI><LI><SPAN>Name the view and deploy. </SPAN><SPAN> </SPAN></LI></UL><H2 id="toc-hId--26504629"><SPAN>7. Adding the forecast result as live version in the seamless planning model</SPAN><SPAN> </SPAN></H2><P><SPAN>We now move to SAP Analytics Cloud and add the forecast result to the seamless planning model as a live version. </SPAN><SPAN> </SPAN></P><UL><LI><SPAN>Connect external data source.</SPAN><SPAN> <BR /><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Connect.png" style="width: 586px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/362667i54B2C6A3F13E948D/image-dimensions/586x255?v=v2" width="586" height="255" role="button" title="Connect.png" alt="Connect.png" /></span><BR /></SPAN></LI></UL><UL><LI><SPAN>Create a version to map the data into (or use an existing un-used version).</SPAN><SPAN> <BR /></SPAN><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Max_Gander_17-1768812813330.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/362505iA7A123C563C84B3A/image-size/medium?v=v2&px=400" role="button" title="Max_Gander_17-1768812813330.png" alt="Max_Gander_17-1768812813330.png" /></span></LI></UL><UL><LI><SPAN>Select the view.</SPAN><SPAN> <BR /></SPAN><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Max_Gander_18-1768812813330.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/362507iD29E391283A3301F/image-size/medium?v=v2&px=400" role="button" title="Max_Gander_18-1768812813330.png" alt="Max_Gander_18-1768812813330.png" /></span></LI><LI><SPAN>Map the columns.</SPAN><SPAN> <BR /></SPAN><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Max_Gander_19-1768812813330.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/362509i58654A5A68B6A2BF/image-size/medium?v=v2&px=400" role="button" title="Max_Gander_19-1768812813330.png" alt="Max_Gander_19-1768812813330.png" /></span><SPAN> </SPAN></LI><LI><SPAN>Preview the data. You see that we have a live connection to the forecast results in SAP Datasphere. 
So every time that the forecast is updated, it will be reflected in the planning model in real time!</SPAN><SPAN> <BR /><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="LiveVers1.png" style="width: 567px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/362668i30038E377BF43A39/image-dimensions/567x238/is-moderation-mode/true?v=v2" width="567" height="238" role="button" title="LiveVers1.png" alt="LiveVers1.png" /></span><BR /></SPAN></LI></UL><P> </P><H1 id="toc-hId-70384873">What (else) can/could you do with it? <SPAN> </SPAN></H1><P><SPAN>In SAP Analytics Cloud: </SPAN><SPAN> </SPAN></P><UL><LI><SPAN>You can display the live version data in the model data foundation and in tables, charts, etc. in stories.</SPAN><SPAN> </SPAN></LI></UL><UL><LI><SPAN>You can reference the live version data in model and story calculations as well as data actions (incl. advanced formulas).</SPAN><SPAN> </SPAN></LI></UL><UL><LI><SPAN>You cannot write back into the referenced views (or rather their underlying tables). But, you can copy the live version data into planning versions via copy/paste in the table, data actions and version management.</SPAN><SPAN> </SPAN></LI></UL><P><SPAN>In SAP Datasphere:</SPAN><SPAN> </SPAN></P><UL><LI><SPAN>You can of course leverage the forecast result in your views and analytic models and compare it to actuals, budgets, etc. </SPAN><SPAN> </SPAN></LI></UL><UL><LI><SPAN>You can run further transformations and calculations and report on the results or use them in planning. </SPAN><SPAN> </SPAN></LI></UL><UL><LI><SPAN>…</SPAN><SPAN> </SPAN></LI></UL><H1 id="toc-hId--126128632"> </H1><H1 id="toc-hId--322642137"><SPAN>Which new features can improve the workflow in the future?</SPAN><SPAN> </SPAN></H1><P><SPAN>We are working on a couple of features that can make this scenario even better:</SPAN><SPAN> </SPAN></P><UL><LI><SPAN>Push data from SAP Datasphere into the seamless planning model via task chains: </SPAN><SPAN> <BR /></SPAN><SPAN>Live versions are awesome. But, sometimes you want to use the prediction as a proposal and then edit it. You could copy the live version data into an editable version easily (see above) but with a push of data into the planning model, you could directly bring the data into an editable forecast version. That push could be nicely added to the same task chain that triggers the APL procedure.</SPAN><SPAN> </SPAN></LI></UL><UL><LI><SPAN>Trigger task chains from SAP Analytics Cloud:</SPAN><SPAN><BR /></SPAN>Task chains shall soon offer a public API for triggering task chain runs from outside of SAP Datasphere.<BR />We are getting ready on the SAP Analytics Cloud side to let you call this API via API steps in multi actions. With that, you could trigger the stored procedures from SAP Analytics Cloud. <SPAN> <BR /></SPAN>Some day, we may have dedicated task chain steps in multi actions to ease this cross-orchestration. We also want to enable cross-orchestration in the opposite direction. <SPAN> </SPAN></LI></UL><H1 id="toc-hId--519155642"> </H1><H1 id="toc-hId--715669147">Conclusion<SPAN> </SPAN></H1><P><SPAN>In this blogpost, Marc and I demonstrated how to integrate predictive forecast results from SAP HANA APL in a seamless planning model. We combined the power of SAP Datasphere, SAP HANA and SAP Analytics Cloud to achieve that in a quite straight-forward architecture. You could achieve more complex scenarios, use PAL instead to tweak your prediction more etc. 
Or you could use data products from SAP BDC to get you started even quicker. </SPAN><SPAN> </SPAN></P><P><SPAN>We are looking forward to the future enhancements that shall improve such workflows and the overall integration of planning into SAP BDC!</SPAN><SPAN> </SPAN></P><H1 id="toc-hId--912182652"> </H1><H1 id="toc-hId--1108696157"><SPAN>Learn More</SPAN><SPAN> </SPAN></H1><UL><LI><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/seamless-planning-integration-between-sap-analytics-cloud-and-sap/ba-p/13877679" target="_blank"><SPAN>Seamless Planning - Product FAQ</SPAN></A><SPAN> </SPAN></LI></UL><UL><LI><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/unlocking-the-next-chapter-of-seamless-planning-in-sap-business-data-cloud/ba-p/14243864" target="_blank"><SPAN>Seamless Planning – Live Versions</SPAN></A><SPAN> </SPAN></LI></UL><UL><LI><A href="https://help.sap.com/docs/apl" target="_blank" rel="noopener noreferrer">SAP HANA Automated Predictive Library (APL)</A><SPAN> </SPAN></LI></UL><UL><LI><A href="https://help.sap.com/docs/SAP_HANA_PLATFORM/2cfbc5cf2bc14f028cfbe2a2bba60a50/c9eeed704f3f4ec39441434db8a874ad.html?version=2.0.07" target="_blank" rel="noopener noreferrer"><SPAN>SAP HANA Predictive Analysis Library (PAL)</SPAN></A><SPAN> </SPAN><SPAN> </SPAN></LI></UL><UL><LI><A href="https://help.sap.com/docs/SAP_ANALYTICS_CLOUD/00f68c2e08b941f081002fd3691d86a7/37db2128dab44d15b46e1918829c1ff1.html" target="_blank" rel="noopener noreferrer"><SPAN>SAP Analytics Cloud Predictive Scenarios</SPAN></A><SPAN> </SPAN></LI></UL>2026-01-20T11:01:19.924000+01:00https://community.sap.com/t5/crm-and-cx-blog-posts-by-members/connecting-sap-sales-cloud-v2-to-a-standalone-sap-analytics-cloud-a/ba-p/14310439Connecting SAP Sales Cloud V2 to a Standalone SAP Analytics Cloud - A Practical Alternative2026-01-21T05:54:01.435000+01:00jaripiehttps://community.sap.com/t5/user/viewprofilepage/user-id/515129<P><U><STRONG><FONT size="5">The Customer Challenge</FONT></STRONG></U></P><P>A customer approached me with what seemed like a straightforward request: "We want to analyze our <a href="https://community.sap.com/t5/c-khhcw49343/SAP+Sales+Cloud+Version+2/pd-p/73555000100800003822" class="lia-product-mention" data-product="1240-1">SAP Sales Cloud Version 2</a> data in <a href="https://community.sap.com/t5/c-khhcw49343/SAP+Analytics+Cloud/pd-p/67838200100800006884" class="lia-product-mention" data-product="3-1">SAP Analytics Cloud</a>."</P><P>Their situation was common for mid-sized companies adopting SAP's CX portfolio:</P><UL><LI><STRONG><a href="https://community.sap.com/t5/c-khhcw49343/SAP+Sales+Cloud+Version+2/pd-p/73555000100800003822" class="lia-product-mention" data-product="1240-2">SAP Sales Cloud Version 2</a> </STRONG>was live and running well</LI><LI><STRONG><a href="https://community.sap.com/t5/c-khhcw49343/SAP+Analytics+Cloud/pd-p/67838200100800006884" class="lia-product-mention" data-product="3-2">SAP Analytics Cloud</a> </STRONG>was licensed separately (not the embedded version)</LI><LI><STRONG>No SAP BTP</STRONG> subscription yet – it wasn't part of the initial scope</LI><LI><STRONG>No <a href="https://community.sap.com/t5/c-khhcw49343/SAP+Integration+Suite/pd-p/73554900100800003241" class="lia-product-mention" data-product="23-1">SAP Integration Suite</a> </STRONG>– same reason</LI><LI><STRONG>No <a href="https://community.sap.com/t5/c-khhcw49343/SAP+Datasphere/pd-p/73555000100800002141" class="lia-product-mention" data-product="16-1">SAP Datasphere</a></STRONG>, 
they wanted to avoid another data platform</LI></UL><P>The goal was simple: import Accounts, Contacts, and Opportunities into SAC for custom dashboards that combined CX data with data from other sources.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="jaripie_0-1768847989030.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/363033iDB190F632455A48F/image-size/large/is-moderation-mode/true?v=v2&px=999" role="button" title="jaripie_0-1768847989030.png" alt="jaripie_0-1768847989030.png" /></span></P><P><U><STRONG><FONT size="5">What SAP Officially Recommends</FONT></STRONG></U></P><P>Before building anything custom, I researched SAP's official guidance. The recommended architecture for Sales Cloud V2 analytics integration involves:</P><OL><LI><STRONG><a href="https://community.sap.com/t5/c-khhcw49343/SAP+Datasphere/pd-p/73555000100800002141" class="lia-product-mention" data-product="16-2">SAP Datasphere</a> </STRONG>as a central data layer</LI><LI><STRONG><a href="https://community.sap.com/t5/c-khhcw49343/SAP+Integration+Suite/pd-p/73554900100800003241" class="lia-product-mention" data-product="23-2">SAP Integration Suite</a> </STRONG>for event-driven data replication</LI><LI><STRONG>SAP Advanced Event Mesh</STRONG> (optional but recommended) for resilience</LI></OL><P>SAP published a detailed two-part blog series on this: <A title="Event-Driven Data Integration from SAP Sales and Service Cloud V2 to SAP Datasphere" href="https://community.sap.com/t5/crm-and-cx-blog-posts-by-sap/event-driven-data-integration-from-sap-sales-and-service-cloud-v2-to-sap/ba-p/14003914" target="_blank">Event-Driven Data Integration from SAP Sales and Service Cloud V2 to SAP Datasphere</A></P><P>This architecture is elegant and production-grade. It's also <STRONG>a significant additional investment</STRONG> for a customer who just wants to run a few reports.</P><P><U><STRONG><FONT size="5">Cost Implications</FONT></STRONG></U></P><P><SPAN>The official path requires more than you might expect:</SPAN></P><TABLE border="1" width="99.78858350951374%"><TBODY><TR><TD><U><STRONG>Component</STRONG></U></TD><TD><U><STRONG>Status for this Customer</STRONG></U></TD></TR><TR><TD width="45.87737843551797%">SAP Datasphere</TD><TD width="53.911205073995774%">Not licensed (€€€/month)</TD></TR><TR><TD width="45.87737843551797%">SAP Integration Suite</TD><TD width="53.911205073995774%">Not licensed (€€/month)</TD></TR><TR><TD width="45.87737843551797%">SAP BTP subscription</TD><TD width="53.911205073995774%">Not available</TD></TR><TR><TD width="45.87737843551797%">Implementation effort</TD><TD width="53.911205073995774%">Significant (weeks)</TD></TR></TBODY></TABLE><P><BR />For a customer with ~24,000 accounts who wants three dashboards, this felt like using a sledgehammer to crack a nut.</P><P><STRONG><FONT size="5">The Obvious First Attempt: SAC's Built-in Connectors</FONT></STRONG></P><P>SAC has connectors for various SAP sources. Surely there's something for Sales Cloud?</P><P><STRONG>Attempt 1: The "SAP Cloud for Customer" Connector</STRONG></P><P>SAC includes a dedicated connector called "SAP Cloud for Customer" that works beautifully with the older C4C (Cloud for Customer) system. It connects to OData services like:</P><pre class="lia-code-sample language-markup"><code>https://<tenant>/sap/c4c/odata/v1/c4codataapi/AccountCollection</code></pre><P><STRONG>The problem: </STRONG>Sales Cloud V2 is not C4C. 
So the OData connector won’t work:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="jaripie_1-1768847293558.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/363030i9FFD16C2D8EF8F2C/image-size/medium/is-moderation-mode/true?v=v2&px=400" role="button" title="jaripie_1-1768847293558.png" alt="jaripie_1-1768847293558.png" /></span></P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="jaripie_2-1768847303349.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/363031i061589235888315A/image-size/medium/is-moderation-mode/true?v=v2&px=400" role="button" title="jaripie_2-1768847303349.png" alt="jaripie_2-1768847303349.png" /></span></P><P><STRONG>Attempt 2: The Generic OData Connector</STRONG></P><P>SAC also has a generic "OData Services" connector. Maybe that works?</P><P><STRONG>The problem:</STRONG> Sales Cloud V2 doesn't expose OData endpoints. The APIs on the <A title="SAP APIs" href="https://api.sap.com/package/SAPSalesServiceCloudV2/rest" target="_self" rel="noopener noreferrer">SAP Business Accelerator Hub</A> are REST, not OData.</P><pre class="lia-code-sample language-markup"><code>GET /sap/c4c/api/v1/account-service/accounts ← REST, not OData
GET /sap/c4c/api/v1/contact-service/contacts ← REST, not OData</code></pre><P><STRONG>Attempt 3: Live Connection</STRONG></P><P>Sales Cloud V2 has embedded SAC capabilities. Could we establish a live connection?</P><P><STRONG>The problem:</STRONG> The embedded analytics are designed for "embedded" use cases. Connecting a standalone SAC tenant to Sales Cloud V2 for custom reporting seems to be... let's say "not a focus area" right now.</P><P>I opened a support ticket to clarify the expected behavior and, after a little back and forth, the issue was handed over to development.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="jaripie_3-1768847352603.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/363032i9299CBF8774B4C83/image-size/medium/is-moderation-mode/true?v=v2&px=400" role="button" title="jaripie_3-1768847352603.png" alt="jaripie_3-1768847352603.png" /></span></P><P><STRONG><FONT size="5">The Hidden Challenge: Real-Time vs. Import</FONT></STRONG></P><P>At this point, I had to step back and discuss architecture with the customer.</P><P>Even if a direct connector existed, there are implications to consider:</P><P> </P><TABLE border="1" width="100%"><TBODY><TR><TD width="33.333333333333336%"><U><STRONG>Approach</STRONG></U></TD><TD width="33.333333333333336%"><U><STRONG>Pros</STRONG></U></TD><TD width="33.333333333333336%"><U><STRONG>Cons</STRONG></U></TD></TR><TR><TD width="33.333333333333336%"><STRONG>Live Connection</STRONG></TD><TD width="33.333333333333336%">Always current data</TD><TD width="33.333333333333336%">Query performance depends on source; cannot combine with other data sources easily</TD></TR><TR><TD width="33.333333333333336%"><STRONG>Import Connection</STRONG></TD><TD width="33.333333333333336%">Fast queries; can blend with other sources</TD><TD width="33.333333333333336%">Data has latency; storage costs</TD></TR></TBODY></TABLE><P>The customer wanted to combine Sales Cloud V2 data with <a href="https://community.sap.com/t5/c-khhcw49343/SAP+S%25252F4HANA/pd-p/73554900100800000266" class="lia-product-mention" data-product="799-1">SAP S/4HANA</a> data in the same dashboard. 
This essentially requires import connections for both sources, plus a data model in SAC that joins them.</P><P><STRONG>Real-time was never really an option</STRONG> as they needed data at rest in <a href="https://community.sap.com/t5/c-khhcw49343/SAP+Analytics+Cloud/pd-p/67838200100800006884" class="lia-product-mention" data-product="3-3">SAP Analytics Cloud</a>.</P><P><STRONG>The Solution: A Lightweight REST-to-OData Proxy</STRONG></P><P>Since the customer had:</P><UL><LI>No appetite for <a href="https://community.sap.com/t5/c-khhcw49343/SAP+Datasphere/pd-p/73555000100800002141" class="lia-product-mention" data-product="16-3">SAP Datasphere</a></LI><LI>No CPI license</LI><LI>A clear, limited scope (three entity types)</LI><LI>Tolerance for daily data refresh (not real-time)</LI></UL><P>I proposed building a thin proxy server that:</P><OL><LI>Authenticates with Sales Cloud V2</LI><LI>Fetches data via REST APIs</LI><LI>Transforms responses to OData format</LI><LI>Exposes endpoints that SAC can consume natively</LI></OL><P><FONT size="5"><STRONG>Architecture<BR /></STRONG></FONT></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="jaripie_0-1768849266375.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/363037i9170806DBF5BDABA/image-size/large/is-moderation-mode/true?v=v2&px=999" role="button" title="jaripie_0-1768849266375.png" alt="jaripie_0-1768849266375.png" /></span></P><P><SPAN>This sounds straightforward... until you start building it...</SPAN></P><H2 id="toc-hId-1788541067">Challenge 1: The 999 vs. 1000 Mismatch</H2><P>This one deserves its own section because it’s so unexpected.</P><P><STRONG>SAP Sales Cloud V2</STRONG> REST APIs have a maximum page size of <STRONG>999 records</STRONG>. You can request <CODE>$top=999</CODE>, but not more.</P><P><STRONG>SAP Analytics Cloud</STRONG> OData import has a minimum batch size of <STRONG>1000 records</STRONG>.</P><P>Yes, you read that correctly. The source maxes out at 999. The target starts at 1000. I stared at the logs for longer than I'd like to admit before I realized what was happening. The first 999 records imported perfectly. Then nothing. It took me a while to work out that SAC was waiting for a 1000-record batch that would never arrive.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="jaripie_0-1768849762690.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/363038i94545C28BB7CCB17/image-size/large/is-moderation-mode/true?v=v2&px=999" role="button" title="jaripie_0-1768849762690.png" alt="jaripie_0-1768849762690.png" /></span></P><H3 id="toc-hId-1721110281">The Solution</H3><P>The proxy implements <STRONG>on-demand parallel fetching</STRONG>:</P><pre class="lia-code-sample language-javascript"><code>async function fetchAllRecords(entityType: string): Promise<any[]> {
// Get first page to determine total count
const firstPage = await fetchFromSalesCloud(entityType, {
$top: 999,
$skip: 0
});
if (firstPage.value.length < 999) {
// All records fit in first page
return firstPage.value;
}
// Calculate remaining pages and fetch in parallel
const totalEstimate = firstPage.totalCount || 10000;
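  // Note: 10000 is only a fallback guess when the API returns no total count;
  // tune it to your data volume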
const remainingPages = Math.ceil((totalEstimate - 999) / 999);
const pagePromises = Array.from({ length: remainingPages }, (_, i) =>
fetchFromSalesCloud(entityType, {
$top: 999,
$skip: 999 * (i + 1)
})
);
const results = await Promise.all(pagePromises);
return [firstPage.value, ...results.map(r => r.value)].flat();
}</code></pre><P><SPAN>The proxy fetches pages in parallel (not sequentially), caches the combined result, and serves it to SAC in appropriately sized batches. The cache has a configurable TTL (e.g., 1 hour) and automatically refreshes on the next request after expiry.</SPAN></P><H2 id="toc-hId-1395514057">Challenge 2: Nested Objects and Flattening</H2><P>Sales Cloud V2 returns nicely structured JSON with nested objects:</P><pre class="lia-code-sample language-javascript"><code>{
"value": [{
"accountId": "A001",
"accountName": "ACME Corp",
"primaryContact": {
"contactId": "C001",
"firstName": "John",
"lastName": "Doe"
},
"addresses": [
{ "type": "BILL_TO", "city": "Berlin" },
{ "type": "SHIP_TO", "city": "Munich" }
]
}]
}</code></pre><P><STRONG>SAC’s OData importer cannot handle this.</STRONG> The SAP Help documentation states clearly: Embedded Complex types are not supported.</P><H3 id="toc-hId-1328083271">The Solution</H3><P>The proxy must flatten nested structures into separate entity sets:</P><pre class="lia-code-sample language-markup"><code>/odata/Accounts → Flat account records
/odata/Contacts → Flat contact records
/odata/AccountAddresses → One row per address, linked by accountId</code></pre><H2 id="toc-hId-1002487047">Challenge 3: REST APIs Behave Differently Than OData</H2><P>Coming from the OData world, I expected certain conventions:</P><UL><LI><CODE>$filter</CODE> for filtering</LI><LI><CODE>$select</CODE> for field selection</LI><LI><CODE>$expand</CODE> for related entities</LI><LI>Standardized metadata endpoint</LI></UL><P>Sales Cloud V2 REST APIs have their own conventions:</P><UL><LI>Different query parameter syntax</LI><LI>No <CODE>$metadata</CODE> endpoint</LI><LI>Pagination via custom headers or response properties</LI></UL><P>The proxy needs to translate between these worlds, implementing just enough OData semantics to satisfy SAC’s importer.</P><H2 id="toc-hId-805973542">Challenge 4: The $skip 10,000 Limit</H2><P>Just when you think you’ve handled pagination, there’s another surprise waiting for datasets with more than 10,000 records.</P><P><STRONG>Sales Cloud V2 returns an error if <CODE>$skip</CODE> is 10,000 or greater.</STRONG></P><pre class="lia-code-sample language-markup"><code>// This works
<H2 id="toc-hId-1002487047">Challenge 3: REST APIs Behave Differently Than OData</H2><P>Coming from the OData world, I expected certain conventions:</P><UL><LI><CODE>$filter</CODE> for filtering</LI><LI><CODE>$select</CODE> for field selection</LI><LI><CODE>$expand</CODE> for related entities</LI><LI>Standardized metadata endpoint</LI></UL><P>Sales Cloud V2 REST APIs have their own conventions:</P><UL><LI>Different query parameter syntax</LI><LI>No <CODE>$metadata</CODE> endpoint</LI><LI>Pagination via custom headers or response properties</LI></UL><P>The proxy needs to translate between these worlds, implementing just enough OData semantics to satisfy SAC’s importer.</P><H2 id="toc-hId-805973542">Challenge 4: The $skip 10,000 Limit</H2><P>Just when you think you’ve handled pagination, there’s another surprise waiting for datasets with more than 10,000 records.</P><P><STRONG>Sales Cloud V2 returns an error if <CODE>$skip</CODE> is 10,000 or greater.</STRONG></P><pre class="lia-code-sample language-markup"><code>// This works
GET /accounts?$top=999&$skip=9000 ✅
// This fails
GET /accounts?$top=999&$skip=10000 ❌ Error!</code></pre><P><SPAN>So even if you’ve cleverly worked around the 999 page size limit, you hit a wall at record 10,001. Simple offset-based pagination (</SPAN><CODE>$skip</CODE><SPAN>) just stops working.</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="jaripie_1-1768850122802.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/363039i449B3D34E0148933/image-size/large/is-moderation-mode/true?v=v2&px=999" role="button" title="jaripie_1-1768850122802.png" alt="jaripie_1-1768850122802.png" /></span></P><H3 id="toc-hId-738542756">The Solution</H3><P>Instead of offset-based pagination, the proxy uses <STRONG>keyset pagination</STRONG> with <CODE>$skiptoken</CODE>:</P><pre class="lia-code-sample language-javascript"><code>// src/constants.ts
export const SC_MAX_SKIP = 10000;
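// Offset requests at or beyond this boundary fail, so the proxy switches to $skiptoken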
// When approaching the skip limit, switch to skiptoken
async function fetchWithKeysetPagination(entityType: string): Promise<any[]> {
  const allRecords: any[] = [];
  let nextLink: string | null = buildInitialUrl(entityType);
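  // buildInitialUrl (helper not shown) composes the first request URL for the entity set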
  while (nextLink) {
    const response = await fetch(nextLink);
    const data = await response.json();
    allRecords.push(...data.value);
    // Use the skiptoken from the response for next page
    nextLink = data['@odata.nextLink'] || null;
  }
  return allRecords;
}</code></pre><P>Keyset pagination uses a token (typically based on the last record’s ID or timestamp) rather than a numeric offset. This approach:</P><UL><LI>Has no upper limit on the number of records</LI><LI>Is actually more performant than large offsets</LI><LI>Is the pattern SAP recommends for large datasets</LI></UL><P>If you have more than 10,000 records in any entity, you <EM>must</EM> use <CODE>$skiptoken</CODE>-based pagination. Simple <CODE>$skip</CODE> won’t cut it.</P><H2 id="toc-hId-412946532">Implementation Notes</H2><H3 id="toc-hId-345515746">Technology Choice</H3><P>I built the proxy with:</P><UL><LI><STRONG>Node.js + Fastify: </STRONG>Lightweight, fast, excellent TypeScript support</LI><LI><STRONG>On-demand caching: </STRONG>No scheduled jobs; cache fills dynamically on first request</LI><LI><STRONG>Docker deployment: </STRONG>Runs on any container platform (I use Coolify on a VPS)</LI></UL><P>This is a production-ready setup. The proxy has been running reliably for the customer with no issues.</P><H3 id="toc-hId-149002241">Smart Caching: On-Demand Parallel Fetching</H3><P>Rather than running scheduled batch jobs, the proxy uses an on-demand approach:</P><OL><LI><STRONG>First page served instantly</STRONG>: When SAC requests data, the proxy immediately returns the first page (up to 999 records) from Sales Cloud V2</LI><LI><STRONG>Parallel fetching in background</STRONG>: While SAC processes the first page, the proxy fetches remaining pages in parallel</LI><LI><STRONG>Cache populated for subsequent requests</STRONG>: By the time SAC requests page 2, the data is already cached</LI></OL><pre class="lia-code-sample language-javascript"><code>async function handleODataRequest(req: Request): Promise<Response> {
  // Derive the entity set from the request path (parsing helper not shown, hypothetical)
  const entityType = parseEntitySet(req.url);
  // Check cache first
  if (cache.has(entityType) && !cache.isExpired(entityType)) {
    return serveFromCache(entityType, req.query);
  }
  // Fetch first page immediately
  const firstPage = await fetchPage(entityType, 0, 999);
  // Start parallel fetching of remaining pages (non-blocking)
  fetchRemainingPagesInBackground(entityType, firstPage.totalCount);
  // Return first page immediately
  return formatAsOData(firstPage);
}</code></pre><P>This approach has several advantages:</P><UL><LI><STRONG>No unnecessary background jobs</STRONG>: Data is only fetched when actually needed</LI><LI><STRONG>Fast initial response</STRONG>: User doesn’t wait for full dataset</LI><LI><STRONG>Self-healing cache</STRONG>: Expired cache automatically refreshes on next request</LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="jaripie_0-1768854131159.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/363045iD5F6B2D8C7AAC998/image-size/large/is-moderation-mode/true?v=v2&px=999" role="button" title="jaripie_0-1768854131159.png" alt="jaripie_0-1768854131159.png" /></span></P><H3 id="toc-hId--122742633">Optional: Deploying to SAP BTP Cloud Foundry</H3><P>If you already have a BTP subscription with available credits, deploying there can make sense. You’re using infrastructure you’re already paying for. The migration is straightforward:</P><TABLE><TBODY><TR><TD width="272.867px" height="30px"><STRONG>Standalone Setup</STRONG></TD><TD width="370.578px" height="30px"><STRONG>Cloud Foundry Setup</STRONG></TD></TR><TR><TD width="272.867px" height="30px">Environment variables for credentials</TD><TD width="370.578px" height="30px">Destination Service binding</TD></TR><TR><TD width="272.867px" height="30px">In-memory cache</TD><TD width="370.578px" height="30px">PostgreSQL Hyperscaler Option (free tier available)</TD></TR><TR><TD width="272.867px" height="30px">Direct HTTP endpoint</TD><TD width="370.578px" height="30px">XSUAA + AppRouter</TD></TR></TBODY></TABLE><P><STRONG>Why PostgreSQL if you move to BTP?</STRONG></P><P>On a single-container deployment (e.g., a VPS with Coolify), in-memory caching works perfectly. But Cloud Foundry typically runs multiple app instances for high availability. If you need persistence across instances, <STRONG>PostgreSQL Hyperscaler Option</STRONG> is the lightweight choice. It has a <A href="https://www.sap.com/germany/products/technology-platform/postgresql-on-sap-btp-hyperscaler-option.html#pricing-section" target="_blank" rel="noopener noreferrer">free tier</A> (since November 2025) and requires no additional management.</P><P>The additional effort for Cloud Foundry deployment is roughly:</P><UL><LI>Add <CODE>mta.yaml</CODE> with service bindings (~1 hour)</LI><LI>Configure Destination Service for Sales Cloud V2 credentials (~30 min)</LI><LI>Add XSUAA for authentication (~1 hour)</LI><LI>Test and deploy (~2 hours)</LI></UL><P><STRONG>Total: Less than one additional day</STRONG> if you decide BTP is the right fit.</P><P><STRONG>Don’t default to BTP just because it’s SAP.</STRONG> Evaluate what you already have and what makes sense for your situation. A €5/month VPS running Docker might be exactly right for your use case.</P><H3 id="toc-hId--319256138">OData Response Format</H3><P>SAC expects OData V2 or V4 responses. The minimum viable format:</P><pre class="lia-code-sample language-javascript"><code>{
"d": {
"results": [
{ "AccountID": "A001", "AccountName": "ACME Corp", ... },
{ "AccountID": "A002", "AccountName": "Globex", ... }
]
}
}</code></pre><P>Plus a <CODE>$metadata</CODE> endpoint returning EDMX that describes your entity types.</P>
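<P>A minimal sketch of such a document, assuming a single Accounts entity set (TypeScript; the namespace and property names are illustrative placeholders):</P><pre class="lia-code-sample language-javascript"><code>// Illustrative only: minimal OData V2 EDMX for one entity set.
// Extend with one EntityType + EntitySet per exposed entity.
export function buildMetadataXml(): string {
  return `<?xml version="1.0" encoding="utf-8"?>
<edmx:Edmx Version="1.0" xmlns:edmx="http://schemas.microsoft.com/ado/2007/06/edmx">
  <edmx:DataServices xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata" m:DataServiceVersion="2.0">
    <Schema Namespace="SalesCloudProxy" xmlns="http://schemas.microsoft.com/ado/2008/09/edm">
      <EntityType Name="Account">
        <Key><PropertyRef Name="AccountID"/></Key>
        <Property Name="AccountID" Type="Edm.String" Nullable="false"/>
        <Property Name="AccountName" Type="Edm.String"/>
      </EntityType>
      <EntityContainer Name="Default" m:IsDefaultEntityContainer="true">
        <EntitySet Name="Accounts" EntityType="SalesCloudProxy.Account"/>
      </EntityContainer>
    </Schema>
  </edmx:DataServices>
</edmx:Edmx>`;
}</code></pre>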
<H3 id="toc-hId--515769643">Authentication</H3><P>Sales Cloud V2 supports:</P><UL><LI>Basic Authentication (simplest for server-to-server)</LI><LI>OAuth 2.0 (more secure, recommended for production)</LI></UL><P>The proxy stores credentials securely and handles token refresh for OAuth.</P><HR /><H2 id="toc-hId--418880141">Results</H2><P>After implementing the proxy:</P><UL><LI>SAC successfully imports Accounts, Contacts, and Opportunities</LI><LI>Data refreshes nightly via scheduled import</LI><LI>Customer can blend CX data with S/4HANA data in unified dashboards</LI><LI>No additional SAP platform licenses required</LI><LI>Total implementation time: ~2 days</LI></UL><HR /><H2 id="toc-hId--615393646">When This Approach Makes Sense</H2><P>The right solution depends on what you have and what you need. Here’s how I think about it:</P><P><STRONG>A lightweight proxy makes sense when:</STRONG></P><UL><LI>Limited scope (specific entities, not the entire data model)</LI><LI>Import/batch refresh is acceptable (not real-time requirements)</LI><LI>You don’t have Datasphere or CPI licensed and don’t need them for other use cases</LI><LI>You want to validate the integration quickly before investing in bigger infrastructure</LI></UL><P><STRONG>Datasphere + CPI makes sense when:</STRONG></P><UL><LI>You need event-driven, near real-time data replication</LI><LI>You’re integrating many entity types with complex relationships</LI><LI>You already have Datasphere or CPI licensed; in that case, use what you’re paying for</LI><LI>You have a broader data strategy where Datasphere is the central hub anyway</LI></UL><P><STRONG>The key question:</STRONG> What do you already have, and does this use case justify adding more?</P><P>SAP’s platforms are excellent, no question! But sometimes a focused solution that solves exactly your problem is better than pulling in the big guns for a straightforward requirement.</P><HR /><H2 id="toc-hId--811907151">Lessons Learned</H2><H3 id="toc-hId--1301823663">1. Don’t Assume SAP Products Integrate Seamlessly</H3><P>Sales Cloud V2 and SAC are both modern SAP cloud products. The integration gap exists because V2’s move away from OData wasn’t accompanied by a new native SAC connector (yet).</P><H3 id="toc-hId--1498337168">2. Check API Patterns Before Committing to Architecture</H3><P>The shift from OData to REST in Sales Cloud V2 is documented, but easy to miss if you’re coming from C4C experience. Always verify on <A href="https://api.sap.com" target="_blank" rel="noopener noreferrer">api.sap.com</A>.</P><H3 id="toc-hId--1694850673">3. Test with Realistic Data Volumes</H3><P>The 999/1000 mismatch only surfaces with more than 999 records. Demo systems with 50 accounts work fine. Production breaks.</P><H3 id="toc-hId--1891364178">4. Use What You Have, Validate What You Need</H3><P>The official architecture is powerful. But if you don’t already have Datasphere and CPI licensed, ask yourself: does this specific use case justify adding them? Sometimes a focused solution that solves exactly your problem is the right call. You can always scale up later if requirements grow.</P><HR /><H2 id="toc-hId--1626290985">What’s Next?</H2><P>I’ve published a <STRONG>reference implementation</STRONG> on GitHub that demonstrates the core patterns:</P><UL><LI>REST-to-OData transformation</LI><LI>Parallel fetching with caching</LI><LI>The 999→1000 batch bridging logic</LI><LI>Keyset pagination with <CODE>$skiptoken</CODE> for large datasets (>10k records)</LI></UL><P><span class="lia-unicode-emoji" title=":backhand_index_pointing_right:">👉</span> <A title="GitHub Repo" href="https://github.com/JariPie/sap-sales-cloud-v2-proxy" target="_blank" rel="noopener nofollow noreferrer">GitHub: SAP Sales Cloud V2 OData Proxy</A></P><P><STRONG>A note on scope:</STRONG> The repository shows <EM>how</EM> the solution works, but doesn’t include everything from my production implementation. Some aspects are tied to specific client configurations that I can’t share. If you’re facing a similar challenge and want help getting to production faster, feel free to reach out.</P><P>If you’ve built something similar or found a better approach, I’d love to hear about it in the comments.</P>2026-01-21T05:54:01.435000+01:00https://community.sap.com/t5/technology-blog-posts-by-members/filling-the-gab-between-bw-and-datasphere-a-bit-implementing-a-few-standard/ba-p/14312749Filling the gap between BW and Datasphere (a bit): Implementing a few standard variables2026-01-22T15:38:53.605000+01:00CLTGravesenhttps://community.sap.com/t5/user/viewprofilepage/user-id/768390<P><ul><li style="list-style-type:none; margin-left:15px; margin-bottom:1px;"><a href="https://community.sap.com/t5/technology-blog-posts-by-members/filling-the-gab-between-bw-and-datasphere-a-bit-implementing-a-few-standard/ba-p/14312749#toc-hId-1788603563">Introduction</a></li><li style="list-style-type:none; margin-left:15px; margin-bottom:1px;"><a href="https://community.sap.com/t5/technology-blog-posts-by-members/filling-the-gab-between-bw-and-datasphere-a-bit-implementing-a-few-standard/ba-p/14312749#toc-hId-1592090058">How to install:</a></li><li style="list-style-type:none; margin-left:15px; margin-bottom:1px;"><a href="https://community.sap.com/t5/technology-blog-posts-by-members/filling-the-gab-between-bw-and-datasphere-a-bit-implementing-a-few-standard/ba-p/14312749#toc-hId-1395576553">How were they implemented:</a></li><li style="list-style-type:none; margin-left:15px; margin-bottom:1px;"><a href="https://community.sap.com/t5/technology-blog-posts-by-members/filling-the-gab-between-bw-and-datasphere-a-bit-implementing-a-few-standard/ba-p/14312749#toc-hId-1199063048">Example: Analytical Models</a></li><li style="list-style-type:none; margin-left:0px; margin-bottom:1px;"><a href="https://community.sap.com/t5/technology-blog-posts-by-members/filling-the-gab-between-bw-and-datasphere-a-bit-implementing-a-few-standard/ba-p/14312749#toc-hId-873466824">Implemented Variables</a></li><li style="list-style-type:none; margin-left:30px; margin-bottom:1px;"><a href="https://community.sap.com/t5/technology-blog-posts-by-members/filling-the-gab-between-bw-and-datasphere-a-bit-implementing-a-few-standard/ba-p/14312749#toc-hId-935118757">0CALDAY: Calendar Day</a></li><li style="list-style-type:none; margin-left:30px; margin-bottom:1px;"><a href="https://community.sap.com/t5/technology-blog-posts-by-members/filling-the-gab-between-bw-and-datasphere-a-bit-implementing-a-few-standard/ba-p/14312749#toc-hId-738605252">0CALMONTH: Calendar Year / Month</a></li><li style="list-style-type:none;
margin-left:30px; margin-bottom:1px;"><a href="https://community.sap.com/t5/technology-blog-posts-by-members/filling-the-gab-between-bw-and-datasphere-a-bit-implementing-a-few-standard/ba-p/14312749#toc-hId-542091747">0CALQUARTER:</a></li><li style="list-style-type:none; margin-left:30px; margin-bottom:1px;"><a href="https://community.sap.com/t5/technology-blog-posts-by-members/filling-the-gab-between-bw-and-datasphere-a-bit-implementing-a-few-standard/ba-p/14312749#toc-hId-345578242">0CALWEEK:</a></li><li style="list-style-type:none; margin-left:30px; margin-bottom:1px;"><a href="https://community.sap.com/t5/technology-blog-posts-by-members/filling-the-gab-between-bw-and-datasphere-a-bit-implementing-a-few-standard/ba-p/14312749#toc-hId-149064737">0CALYEAR:</a></li></ul></P><H2 id="toc-hId-1788603563"><SPAN>Introduction</SPAN></H2><P><SPAN>Welcome to the first blog in a series where I'll try my best to close some of the functionality gaps between the old SAP BW system and <a href="https://community.sap.com/t5/c-khhcw49343/SAP+Datasphere/pd-p/73555000100800002141" class="lia-product-mention" data-product="16-1">SAP Datasphere</a>.<BR /></SPAN><SPAN>With the end of life for the old SAP BW systems in sight, companies and developers have been diving deep into <a href="https://community.sap.com/t5/c-khhcw49343/SAP+Datasphere/pd-p/73555000100800002141" class="lia-product-mention" data-product="16-2">SAP Datasphere</a> to build new, exciting solutions, in addition to migrating existing reports. <BR /></SPAN></P><P><SPAN>Personally, I've been involved with several Datasphere implementations now, and one thing I have found myself missing a lot was the standard Exit variables for things like "Current month", "Current year" and so on. </SPAN></P><P><SPAN>To make my life easier, and hopefully yours as well, I've recreated a set of the old date / calendar variables.<BR />All the views are bundled in a nice package, which you can download from <A href="https://github.com/ChrisSorensen91/DatasphereStandardVariables/blob/main/NTT_DATA_STANDARD_CALENDER_VARIABLES%20(1.0.0).package" target="_blank" rel="noopener nofollow noreferrer">this</A> GitHub repo. </SPAN></P><H2 id="toc-hId-1592090058"><SPAN>How to install:</SPAN></H2><P><SPAN><STRONG>Note:</STRONG> Your Datasphere tenant does <EM>not</EM> need to be connected to the Content Network for this to work. </SPAN></P><P><SPAN>Once the package is installed, you may or may not need to change the source of the view 2VR_CALENDER_WRAPPER, which is the basis of all the other views. <BR />This wrapper was introduced to make sure that even if your space does not contain the Time Objects, you only have to remap one object.
</SPAN></P><OL><LI>Go to this <A href="https://github.com/ChrisSorensen91/DatasphereStandardVariables/blob/main/NTT_DATA_STANDARD_CALENDER_VARIABLES%20(1.0.0).package" target="_blank" rel="noopener nofollow noreferrer">GitHub Repo</A> and download the package</LI><LI>Go to your Datasphere tenant and select "Import":<BR /><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="CLTGravesen_0-1769091832931.png" style="width: 202px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364150i17B8C20774503FBD/image-dimensions/202x235?v=v2" width="202" height="235" role="button" title="CLTGravesen_0-1769091832931.png" alt="CLTGravesen_0-1769091832931.png" /></span></LI><LI>Click Upload and select the package downloaded in step 1.</LI><LI>After the import is complete, click the package:<BR /><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="CLTGravesen_1-1769092087558.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364154i2F06B0B1CC26C9CF/image-size/medium/is-moderation-mode/true?v=v2&px=400" role="button" title="CLTGravesen_1-1769092087558.png" alt="CLTGravesen_1-1769092087558.png" /></span></LI><LI>Under "Import Options", map the space:<BR /><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="CLTGravesen_3-1769092195062.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364156i16D30B00858CD82A/image-size/medium/is-moderation-mode/true?v=v2&px=400" role="button" title="CLTGravesen_3-1769092195062.png" alt="CLTGravesen_3-1769092195062.png" /></span><P> </P></LI><LI>And click "Import". </LI><LI>(Certain cases): If the view "SAP.TIME.VIEW_DIMENSION_DAY" is not present in the target space, open the view 2VR_CALENDER_WRAPPER and replace the source.</LI></OL><P> </P><H2 id="toc-hId-1395576553"><SPAN>How were they implemented:</SPAN></H2><P>All the views are implemented as SQL views and, depending on the kind of view, return a single column rather than an interval, as some SAP BW variables do. The reason for that is explained in the example below. </P><P>All views have an input parameter, IP_OFFSET, which takes an integer and will offset the result by one "unit" of time. <BR />So the CALYEAR views will offset one year at a time, the month views will offset one month at a time, and so on. </P><H2 id="toc-hId-1199063048">Example: Analytical Models</H2><P>In Datasphere, an <A title="Datasphere Documentation - Analytical Models" href="https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/e5fbe9e2cb93484dab8b1963145e565f.html?locale=en-US" target="_blank" rel="noopener noreferrer">Analytical Model</A> is the final (kind of, that's not important now) step in the data modelling process. <BR />In your analytical model, you can define different kinds of <A href="https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/cdd8fa0fd74b495584dca343432f2814.html?locale=en-US" target="_blank" rel="noopener noreferrer">variables</A> that allow you to derive a value from a view, or a dynamic default. <BR /><STRONG>Note:</STRONG> Deriving select options and intervals is not supported.
</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="CLTGravesen_0-1769090898406.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364142i4F198269935A966F/image-size/medium/is-moderation-mode/true?v=v2&px=400" role="button" title="CLTGravesen_0-1769090898406.png" alt="CLTGravesen_0-1769090898406.png" /></span></P><P>In the above example, the user will be prompted to fill in an interval, but only the first value will be derived:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="CLTGravesen_1-1769091046228.png" style="width: 582px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364143i9376F61C0C5F32CD/image-dimensions/582x86/is-moderation-mode/true?v=v2" width="582" height="86" role="button" title="CLTGravesen_1-1769091046228.png" alt="CLTGravesen_1-1769091046228.png" /></span></P><P>If you want an interval, you'll need "Multiple single values".</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="CLTGravesen_2-1769091188555.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364144iF3112F5083ADB158/image-size/medium/is-moderation-mode/true?v=v2&px=400" role="button" title="CLTGravesen_2-1769091188555.png" alt="CLTGravesen_2-1769091188555.png" /></span></P><P>In the above example, we've provided a view that returns several values, which will act as an interval.<BR /><STRONG>Note:</STRONG> We have also provided the value -1 for the offset.</P><P>This means that each of the values 202601, 202512, 202511, and 202510 is offset by -1:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="CLTGravesen_3-1769091307296.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364145i42AE7FF4CDFB84B0/image-size/medium?v=v2&px=400" role="button" title="CLTGravesen_3-1769091307296.png" alt="CLTGravesen_3-1769091307296.png" /></span></P><H1 id="toc-hId-873466824">Implemented Variables</H1><P>I've created variables for the following date objects:</P><UL><LI><STRONG>0CALDAY</STRONG>: Date (YYYYMMDD)</LI><LI><STRONG>0CALMONTH</STRONG>: Cal. Year / Month (YYYYMM)</LI><LI><STRONG>0CALQUARTER</STRONG>: Cal. Year / Quarter (YYYYQ)</LI><LI><STRONG>0CALWEEK</STRONG>: Cal.
Year / Week (YYYYWW)</LI><LI><STRONG>0CALYEAR</STRONG>: Calendar Year (YYYY)</LI></UL><P>Because - well, because those were the ones that had been giving me headaches.</P><H3 id="toc-hId-935118757">0CALDAY: Calendar Day </H3><TABLE border="1" width="100%"><TBODY><TR><TD width="33.333333333333336%" height="30px"><STRONG>View ID</STRONG></TD><TD width="33.333333333333336%" height="30px"><STRONG>SAP BW Variable ID</STRONG></TD><TD width="33.333333333333336%" height="30px"><STRONG>Description</STRONG></TD></TR><TR><TD width="33.333333333333336%" height="50px"><SPAN>2VR_0DATE</SPAN></TD><TD width="33.333333333333336%" height="50px">0DATE</TD><TD width="33.333333333333336%" height="50px"><P>Current Calendar Day</P></TD></TR><TR><TD width="33.333333333333336%" height="50px"><SPAN>2VR_0CEPO_CDPDCY</SPAN></TD><TD width="33.333333333333336%" height="50px">0CEPO_CDPDCY</TD><TD width="33.333333333333336%" height="50px"><P>Last Day</P></TD></TR><TR><TD width="33.333333333333336%" height="50px"><SPAN>2VR_0CEIO_CDL7CDCY</SPAN></TD><TD width="33.333333333333336%" height="50px">0CEIO_CDL7CDCY</TD><TD width="33.333333333333336%" height="50px"><P>Last 7 Days</P></TD></TR><TR><TD width="33.333333333333336%" height="50px"><SPAN>2VR_0CEIO_CDL14CDCY</SPAN></TD><TD width="33.333333333333336%" height="50px">0CEIO_CDL14CDCY</TD><TD width="33.333333333333336%" height="50px"><P>Last 14 Days</P></TD></TR><TR><TD width="33.333333333333336%" height="50px"><SPAN>2VR_0CXIO_CDCMCY</SPAN></TD><TD width="33.333333333333336%" height="50px">0CXIO_CDCMCY</TD><TD width="33.333333333333336%" height="50px"><P>All days in current Month</P></TD></TR><TR><TD width="33.333333333333336%" height="50px"><SPAN>2VR_0CXIO_CDCQCY</SPAN></TD><TD width="33.333333333333336%" height="50px">0CXIO_CDCQCY</TD><TD width="33.333333333333336%" height="50px"><P>All days in current quarter</P></TD></TR><TR><TD width="33.333333333333336%" height="50px"><SPAN>2VR_0CXIM_CDFYCY</SPAN></TD><TD width="33.333333333333336%" height="50px">0CXIM_CDFYCY</TD><TD width="33.333333333333336%" height="50px"><P>All days in current year</P></TD></TR><TR><TD width="33.333333333333336%" height="50px"><SPAN>2VR_0CXIM_CDFYPY</SPAN></TD><TD width="33.333333333333336%" height="50px">0CXIM_CDFYPY</TD><TD width="33.333333333333336%" height="50px"><P>All days in last year</P></TD></TR></TBODY></TABLE><H3 id="toc-hId-738605252">0CALMONTH: Calendar Year / Month</H3><TABLE border="1" width="100%"><TBODY><TR><TD width="33.333333333333336%" height="30px"><STRONG>View ID</STRONG></TD><TD width="33.333333333333336%" height="30px"><STRONG>SAP BW Variable ID</STRONG></TD><TD width="33.333333333333336%" height="30px"><STRONG>Description</STRONG></TD></TR><TR><TD width="33.333333333333336%" height="50px"><SPAN>2VR_0CXPO_CMPMPY</SPAN></TD><TD width="33.333333333333336%" height="50px">0CXPO_CMPMPY</TD><TD width="33.333333333333336%" height="50px"><P>Last Cal. Year / Month previous year</P></TD></TR><TR><TD width="33.333333333333336%" height="50px"><SPAN>2VR_0CXPO_CMCMPY</SPAN></TD><TD width="33.333333333333336%" height="50px">0CXPO_CMCMPY</TD><TD width="33.333333333333336%" height="50px"><P>Current Cal. 
Year / Month previous year</P></TD></TR><TR><TD width="33.333333333333336%" height="50px"><SPAN>2VR_0CXIO_CMPQPY</SPAN></TD><TD width="33.333333333333336%" height="50px">0CXIO_CMPQPY</TD><TD width="33.333333333333336%" height="50px"><P>Last Quarter Previous Year</P></TD></TR><TR><TD width="33.333333333333336%" height="50px"><SPAN>2VR_0CXIO_CMPQCY</SPAN></TD><TD width="33.333333333333336%" height="50px">0CXIO_CMPQCY</TD><TD width="33.333333333333336%" height="50px"><P>Last Quarter</P></TD></TR><TR><TD width="33.333333333333336%" height="77px"><SPAN>2VR_0CXIO_CML12PMCY</SPAN></TD><TD width="33.333333333333336%" height="77px">0CXIO_CML12PMCY</TD><TD width="33.333333333333336%" height="77px"><P>Last 12 Cal. Year / Months excluding Current</P></TD></TR><TR><TD width="33.333333333333336%" height="77px"><SPAN>2VR_0CXIO_CML12CMCY</SPAN></TD><TD width="33.333333333333336%" height="77px">0CXIO_CML12CMCY</TD><TD width="33.333333333333336%" height="77px"><P>Last 12 Cal. Year / Months Including Current</P></TD></TR><TR><TD width="33.333333333333336%" height="50px"><SPAN>2VR_0CXIO_CMCQPY</SPAN></TD><TD width="33.333333333333336%" height="50px">0CXIO_CMCQPY</TD><TD width="33.333333333333336%" height="50px"><P>Current Quarter Previous Year</P></TD></TR><TR><TD width="33.333333333333336%" height="50px"><SPAN>2VR_0CXIO_CMCQCY</SPAN></TD><TD width="33.333333333333336%" height="50px">0CXIO_CMCQCY</TD><TD width="33.333333333333336%" height="50px"><P>Current Quarter</P></TD></TR><TR><TD width="33.333333333333336%" height="50px"><SPAN>2VR_0CMONTH</SPAN></TD><TD width="33.333333333333336%" height="50px">0CMONTH</TD><TD width="33.333333333333336%" height="50px"><P>Current Cal. Year / Month</P></TD></TR><TR><TD width="33.333333333333336%" height="50px"><SPAN>2VR_0CEPO_CMPMCY</SPAN></TD><TD width="33.333333333333336%" height="50px">0CEPO_CMPMCY</TD><TD width="33.333333333333336%" height="50px"><P>Last Cal. Year / Month</P></TD></TR><TR><TD width="33.333333333333336%" height="50px"><SPAN>2VR_0CEIO_CML6CMCY</SPAN></TD><TD width="33.333333333333336%" height="50px">0CEIO_CML6CMCY</TD><TD width="33.333333333333336%" height="50px"><P>Last 6 Cal. Year / Months Including Current</P></TD></TR><TR><TD width="33.333333333333336%" height="50px"><SPAN>2VR_0CEIO_CML3CMCY</SPAN></TD><TD width="33.333333333333336%" height="50px">0CEIO_CML3CMCY</TD><TD width="33.333333333333336%" height="50px"><P>Last 3 Cal. 
Year / Months Including Current</P></TD></TR></TBODY></TABLE><H3 id="toc-hId-542091747">0CALQUARTER:</H3><TABLE border="1" width="100%"><TBODY><TR><TD width="33.333333333333336%" height="30px"><STRONG>View ID</STRONG></TD><TD width="33.333333333333336%" height="30px"><STRONG>SAP BW Variable ID</STRONG></TD><TD width="33.333333333333336%" height="30px"><STRONG>Description</STRONG></TD></TR><TR><TD width="33.333333333333336%" height="26"><SPAN>2VR_0CXPO_CQQ1CY</SPAN></TD><TD width="33.333333333333336%">0CXPO_CQQ1CY</TD><TD width="33.333333333333336%"><SPAN>First Quarter of Current Year</SPAN></TD></TR><TR><TD width="33.333333333333336%" height="26"><SPAN>2VR_0CXPO_CQCQPY</SPAN></TD><TD width="33.333333333333336%">0CXPO_CQCQPY</TD><TD width="33.333333333333336%"><SPAN>Current Quarter of Previous Year</SPAN></TD></TR><TR><TD width="33.333333333333336%" height="26"><SPAN>2VR_0CQUART</SPAN></TD><TD width="33.333333333333336%">0CQUART</TD><TD width="33.333333333333336%"><SPAN>Current Quarter</SPAN></TD></TR></TBODY></TABLE><H3 id="toc-hId-345578242">0CALWEEK:</H3><TABLE border="1" width="100%"><TBODY><TR><TD width="33.333333333333336%" height="30px"><STRONG>View ID</STRONG></TD><TD width="33.333333333333336%" height="30px"><STRONG>SAP BW Variable ID</STRONG></TD><TD width="33.333333333333336%" height="30px"><STRONG>Description</STRONG></TD></TR><TR><TD width="33.333333333333336%" height="26">2VR_0CWEEK</TD><TD width="33.333333333333336%"><SPAN>0CWEEK</SPAN></TD><TD width="33.333333333333336%"><SPAN>Current calendar week</SPAN></TD></TR></TBODY></TABLE><H3 id="toc-hId-149064737">0CALYEAR:</H3><TABLE border="1" width="100%"><TBODY><TR><TD width="33.333333333333336%" height="30px"><STRONG>View ID</STRONG></TD><TD width="33.333333333333336%" height="30px"><STRONG>SAP BW Variable ID</STRONG></TD><TD width="33.333333333333336%" height="30px"><STRONG>Description</STRONG></TD></TR><TR><TD width="33.333333333333336%" height="30px"><SPAN>2VR_0PYEAR</SPAN></TD><TD width="33.333333333333336%" height="30px"><SPAN>0PYEAR</SPAN></TD><TD width="33.333333333333336%" height="30px">Last calendar year</TD></TR><TR><TD><SPAN>2VR_0CYEAR</SPAN></TD><TD><SPAN>0CYEAR</SPAN></TD><TD>Current calendar year</TD></TR></TBODY></TABLE>2026-01-22T15:38:53.605000+01:00https://community.sap.com/t5/data-and-analytics-blog-posts/the-consequences-of-implementing-strictly-shift-left-architectures/ba-p/14313699The consequences of implementing strictly Shift-left architectures2026-01-25T08:00:00.027000+01:00Phil_from_Madridhttps://community.sap.com/t5/user/viewprofilepage/user-id/1886<P class="">Over the last year or so, the dialogue around <STRONG>Shift Left architectures</STRONG> and their role as an enabler for <STRONG>Data Products</STRONG> has intensified across the industry. What was once perhaps a niche architectural preference in the modern data stack is now increasingly a topic for enterprises aiming to scale their data operations.
For decades, most data architecture followed a "Shift Right" pattern, based mostly on table-oriented replication patterns where backend applications were not held responsible for defining and implementing extraction logic, and centralized data teams were left to act as "data janitors"—cleaning, modeling, and fixing data in a downstream lakehouse.</P><P class="">The idea of <STRONG>Shift Left</STRONG> is an architectural evolution for modern data management and platforms that treats data as a governed, high-quality data extract on the level of a curated and well-defined business entity or even a Data Product at the moment of its birth. By moving responsibility for quality and semantic definition to the "left", orienting it to real business entities and closer to the place where the data is really generated, the burden of data preparation is addressed where the context is strongest. With that, it’s easy to imagine how the downstream "cleanup" cycle can become unnecessary or significantly streamlined under this model.</P><P class="">Several key benefits have fuelled this recent momentum:</P><P class=""> </P><UL><LI><STRONG>Faster Time-to-Insight:</STRONG> Data is born ready for consumption, removing the weeks or months spent on logical (re-)definition and build, as well as on creating ingestion and data-cleaning pipelines.</LI><LI><STRONG>Semantic Fidelity:</STRONG> Business logic is defined by those who understand the context of the data best—the source domain experts.</LI><LI><STRONG>Improved Data Quality:</STRONG> Automated validation at the source prevents structural and logical "garbage" from entering the analytical ecosystem.</LI></UL><P> </P><P class=""><STRONG>SAP</STRONG> has made <STRONG>Data Products</STRONG> a central pillar of its analytics and data management portfolio, particularly through SAP Business Data Fabric. This approach, or so the promise goes, allows enterprises to quickly leverage pre-defined, semantically rich products directly from the source by using the capabilities and rich semantics of the so-called CDS views.</P><P class=""><STRONG>1. Previously in this theatre: Extractors as the Original Data Products</STRONG></P><P class="">While the term "Data Product" feels modern, the principle has existed for decades within the SAP ecosystem. <STRONG>Standard Extractors</STRONG> (in SAP BW) were the original proto-data products. They abstracted technical table and join complexities, managed reliable deltas, and preserved business logic (semantics) during extraction. While remaining a vendor-specific approach, they paved the way for SAP BW as the central DW solution for SAP-centric data environments. Can 0MATERIAL also be seen as a data product, with (most of) its extraction logic encapsulated in the 0MATERIAL* extractors?</P><P class="">The Shift Left movement is essentially an extension of this principle. We can now imagine, also for the “modern data stack”, that these "extractors" are moving from proprietary silos to a world where every system—SAP or otherwise—is expected to publish its data via an open, governed <STRONG>Data Contract on business entity level</STRONG>.</P><P class=""><STRONG>2. The System-agnostic Catalog as a consequence</STRONG></P><P class="">As the definition of a data product hence resides in the source system, in a Shift Left world, the Data Catalog should no longer be perceived as a component exclusive to the analytics environment.
It becomes a globally valid, <STRONG>Cross-System Registry</STRONG>, centered where the business logic actually lives: if the approach is applied consistently, that is in the ERP and backend systems.</P><P class="">In this model, the backend systems act as the <STRONG>Semantic Anchor</STRONG>. Definitions, business rules, and schemas are registered at the source. The Enterprise Catalog then acts as a discovery layer, allowing the analytical team to "subscribe" to these definitions rather than attempting to recreate them through guesswork.</P><P class="">While a Corporate Knowledge Graph can technically be built using traditional "Shift Right" methods—by retroactively mapping data in a central hub—the <STRONG>Shift Left approach acts as a fundamental accelerator</STRONG>. By mapping business entities and their relationships directly at the source, the organization moves toward a "Data Network" or <STRONG>Corporate Ontology</STRONG>. Again, SAP is going this way.</P><P class=""><STRONG>3. Effects on the Medallion Architecture</STRONG></P><P class="">A consistent Shift Left implementation matures the traditional Medallion Architecture (Bronze -> Silver -> Gold) into a <STRONG>Readiness Hierarchy</STRONG>:</P><P class=""> </P><UL><LI><STRONG>The Fading of Bronze:</STRONG> The raw, messy landing zone becomes obsolete. Because data is validated against a contract at the source, it bypasses the "dump" phase.</LI><LI><STRONG>Silver as the "Data Product Entry":</STRONG> The Silver layer is no longer a processing stage for data engineers. It is the <STRONG>published interface</STRONG> of the backend team—a collection of high-quality, documented and understandable Data Products.</LI><LI><STRONG>Gold as the space of synergy and joint interpretation:</STRONG> Gold remains the space for interpretation and combination. Here, products from different domains (e.g., Finance and Sales) are joined to create holistic, cross-functional metrics, and analytical applications of many types are brought to life.</LI></UL><P> </P><P class=""><STRONG>4. Shift Left architectures also imply a significant shift in roles and responsibility</STRONG></P><P class="">The most significant challenge in consistently implementing a shift-left architecture is the <STRONG>migration of accountability</STRONG>. This requires a fundamental change in team dynamics.</P><P class="">Specifically for SAP environments, it is important to note that SAP has effectively taken on this responsibility. As the owner and architect of the underlying backend application logic, SAP is increasingly expected to deliver corresponding data products and semantic definitions directly. This shifts the burden of initial structural definition from the customer's internal data team back to the software vendor, who must deliver accordingly to support the modern data fabric.</P><P class="">However, for <STRONG>customer-owned data producers</STRONG> (custom-built applications and microservices), the consequences are more demanding. These teams can no longer view data as a byproduct of their application.
They must take on the role of a data provider, which implies:</P><P class=""> </P><UL><LI><STRONG>Increased risk of Technical Debt:</STRONG> Development teams must build and maintain the infrastructure for data contracts and delta determination.</LI><LI><STRONG>Skill Gap Challenges:</STRONG> Backend developers must acquire data modeling and semantic layering skills that were previously the domain of data engineers.</LI><LI><STRONG>Operational Overhead:</STRONG> Data quality incidents are no longer "downstream problems" but production bugs that must be resolved by the source team.</LI></UL><P> </P><P class="">A major strategic advantage of this shift is that <STRONG>Data Quality (DQ) and DQ fulfillment metrics</STRONG> are now located exactly where the data is born. By addressing DQ problems at the source, organizations can identify and resolve the root causes of errors in real-time, rather than applying reactive patches downstream. This proximity inherently leads to higher data quality and more accurate business reporting.</P><P class="">How other software vendors will act, and how quickly internal development teams can pivot to this new standard of accountability, remains to be seen.</P><P class=""><STRONG>5. A critical technical requirement: Trustworthy Deltas</STRONG></P><P class="">For Shift Left to sustain larger datasets, it relies on <STRONG>reliable delta determination, often with cross-table dependencies in rather complex relations</STRONG>. Table-oriented delta mechanisms introduce complexities on the receiver side when it comes to rebuilding the correct delta image of the business entity. Hence, the backend must provide a stream of changes on business entity level, not just a dump of data. Whether through transaction-like Log-Based CDC or the Transactional Outbox Pattern, the backend must guarantee that updates to complex, multi-table business objects are captured correctly. SAP's multi-table CDC mechanism is right on track, even if, for very large scenarios, more basic techniques could become necessary.</P>
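<P class="">To make the Transactional Outbox Pattern concrete, here is a minimal, framework-agnostic sketch (TypeScript; the <CODE>db</CODE> client, table and column names are hypothetical placeholders). The business update and the entity-level change event are committed in the same transaction, so downstream consumers never receive a delta for a change that was rolled back:</P><pre class="lia-code-sample language-javascript"><code>// Minimal transactional-outbox sketch (illustrative; `db`, `tx`, and the
// table/column names are hypothetical placeholders for your ORM/driver).
interface OutboxEvent {
  entity: string;     // business entity, e.g. "SalesOrder"
  entityId: string;
  payload: object;    // full entity image, not just one changed table row
  occurredAt: Date;
}

async function updateOrderWithOutbox(db: any, order: { id: string; status: string }) {
  await db.transaction(async (tx: any) => {
    // 1. Apply the business change (possibly across several tables)
    await tx.execute('UPDATE sales_order SET status = ? WHERE id = ?', [order.status, order.id]);
    // 2. Record the entity-level change event in the SAME transaction
    const event: OutboxEvent = { entity: 'SalesOrder', entityId: order.id, payload: order, occurredAt: new Date() };
    await tx.execute(
      'INSERT INTO outbox (entity, entity_id, payload, occurred_at) VALUES (?, ?, ?, ?)',
      [event.entity, event.entityId, JSON.stringify(event.payload), event.occurredAt]
    );
  });
  // A separate relay process polls the outbox table and publishes events downstream.
}</code></pre>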
<P class=""><STRONG>6. Dealing with Fragmented Sources</STRONG></P><P class="">A common real-world problem occurs when multiple systems describe the same global entity. For example, three different regional SAP instances may each manage their own "Material" data. In a strict Shift-Left model, you cannot force a single backend to own the "Global Material" if it doesn't actually manage the other regions' data. Instead of defaulting to a centralized "Gold Layer" cleanup, consider these alternatives:</P><P class=""> </P><UL><LI><STRONG>Regional Responsibility:</STRONG> Each regional system stays responsible for its own data product (the "Silver" layer). North America and Europe each publish their own high-quality Material products according to a shared contract.</LI><LI><STRONG>Federated Identity Resolution:</STRONG> The challenge of unification is addressed by establishing a shared <STRONG>Semantic Standard</STRONG> across contracts. By mandating that each regional source provides a common global identifier (e.g., a GTIN or an MDM-supplied Global ID) as a mandatory field in its data contract, the technical "merger" is downgraded from a complex transformation to a simple join or union. This virtualization allows for a global view where source-specific attributes are treated as extensions of a core, harmonized entity, preserving local ownership while enabling global interpretability.</LI><LI><STRONG>MDM Products:</STRONG> A dedicated Master Data Management (MDM) team publishes a "Linkage Product." Consumers then "subscribe" to both the regional Data Product and the Linkage Product to build their own global view, keeping ownership decentralized. This is a solution approach for many customers with spread-out but logically equivalent master data objects.</LI><LI><STRONG>Analytical Extensions (ML-driven Attributes):</STRONG> Not all master data attributes are generated in the operational source. Advanced attributes like "Customer Lifetime Value" or "Churn Risk Clustering" are often the result of analytical processes or ML models. These should be treated as <STRONG>Data Product Extensions</STRONG>: they exist as specialized "Sidecar Products" that share the same global identifier. Consumers combine the operational "Silver" product with the analytical "Extension" product via a join. This is not contradictory to so-called "Closed loop scenarios"; rather, it provides a clean architectural separation. While the analytical insight is consumed as a product, the actual write-back of results into the ERP remains an operational task, ensuring the semantic definition is enriched without burdening the operational backend with analytical overhead.</LI></UL><P> </P><P class=""><STRONG>7. Strategic Alignment: Adapting to Pre-Defined Semantics</STRONG></P><P class="">SAP has taken the lead by pre-defining data products in its SAP <STRONG>Business Data Fabric</STRONG>. While this provides instant maturity (starting at "Silver"), it implies that SAP provides the core business semantics and definitions. For the customer, the choice is no longer about building the definition from scratch, but about accepting and adapting to these standardized SAP semantics. This shifts the organizational responsibility from "designing the truth" to "adhering to the standard" provided by the software vendor.</P><P class=""><STRONG>Conclusion</STRONG></P><P class="">Applied consistently, "Shift Left" is a clear organizational direction, but it is rarely a binary switch. It demands that backend teams stop viewing data as a byproduct and start treating it as a first-order deliverable. For many organizations, the trade-off remains a strategic choice: you either invest in upstream discipline and source-side engineering to achieve high-velocity scalability, or you rely on the proven, defensive strengths of a traditional Shift Right model to manage fragmented legacy landscapes.</P><P class="">The path forward will usually be a hybrid one, shaped by the constraints of each environment. While Shift Left provides the semantic foundation and quality required for AI-ready platforms in the place where data is generated, it succeeds only when the burden of stewardship is either automated by vendors like SAP or embraced as a core technical competency by internal development teams.
Ultimately, the "chaos" of modern data is not solved by shifting it from one team to another, but by establishing a shared culture of accountability across the entire data value chain.<BR /><BR /><BR /></P><P class="">#ShiftLeft #DataArchitecture #DataProducts #DataStrategy #DataQuality #DataEngineering #Semantics #DataTeams</P>2026-01-25T08:00:00.027000+01:00https://community.sap.com/t5/technology-blog-posts-by-sap/building-a-graph-native-ai-chatbot-on-sap-btp/ba-p/14313174Building a Graph Native AI Chatbot on SAP BTP2026-01-25T11:36:50.119000+01:00ArunKumar_Balakrishnanhttps://community.sap.com/t5/user/viewprofilepage/user-id/11312<P class="lia-align-justify" style="text-align : justify;">The true intelligence and accuracy of an enterprise AI chatbot depends on one thing: <STRONG>Data Access</STRONG>. While most chatbots rely on standard API calls to fetch information, they hit a wall when dealing with complex, interconnected business logic.</P><P class="lia-align-justify" style="text-align : justify;"><STRONG>The Problem:</STRONG> Chatbots rely on API calls to extract data from the source system. APIs are designed for transactions, not data exploration. Consider a user asking, 'Show me the total revenue impact of C4C service tickets related to S/4HANA orders delayed by supplier issues.'</P><P class="lia-align-justify" style="text-align : justify;">A traditional chatbot is stuck. It would need to orchestrate a number of cross-system OData calls, join the data in memory, and calculate the results, often resulting in high latency or timeouts.</P><P><FONT size="5"><STRONG>The Solution: SAP HANA Graph</STRONG></FONT></P><P class="lia-align-justify" style="text-align : justify;">The better approach: by combining the LLM with the graph schema in SAP HANA, we can bypass these limitations entirely. In this blog, I will show you how to build a Graph-Native Agentic AI that understands your data relationships intuitively.</P><P class="lia-align-justify" style="text-align : justify;">Instead of viewing data as API endpoints, we model it as a <STRONG>Property Graph</STRONG>. Here, the business entities (like Customers, Sales Orders, Service Tickets, Products) are represented as vertices (nodes) and their relations as edges.</P><P><STRONG>Deep-Node Traversals:</STRONG></P><P class="lia-align-justify" style="text-align : justify;">The graph allows the chatbot to traverse relationships instantly. In a single query it can hop from a <EM>Customer</EM> node --> <EM>Sales Order</EM> node --> <EM>Line Item</EM> node --> <EM>Product</EM> node, collect the necessary data, and pass it to the LLM, whose answer is then sent to the user.</P><P class="lia-align-justify" style="text-align : justify;">The graph engine natively supports aggregations (SUM, COUNT) directly in the database layer during node traversal.</P><P class="lia-align-justify" style="text-align : justify;">We can physically link a <STRONG>C4C Service Ticket</STRONG> node to an <STRONG>S/4HANA Sales Order</STRONG> node. The graph schema identifies only the data relation and not the system of origin.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Representation of interconnected Business Objects that form a graph." style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364455iF09B1CD161B3A7CE/image-size/large/is-moderation-mode/true?v=v2&px=999" role="button" title="Blog Pic_Chart.png" alt="Representation of interconnected Business Objects that form a graph."
/><span class="lia-inline-image-caption" onclick="event.preventDefault();">Representation of interconnected Business Objects that form a graph.</span></span></P><P> </P><P><FONT size="5"><STRONG>Let's See it in Action - Demo Video</STRONG></FONT></P><P>Conceptualizing a graph traversal can be difficult, so let’s look at how the chatbot handles a query involving a Sales Cloud scenario:</P><P><FONT size="5"><STRONG><div class="video-embed-center video-embed"><iframe class="embedly-embed" src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2F4j0guZeZU4U%3Ffeature%3Doembed&display_name=YouTube&url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3D4j0guZeZU4U&image=https%3A%2F%2Fi.ytimg.com%2Fvi%2F4j0guZeZU4U%2Fhqdefault.jpg&type=text%2Fhtml&schema=youtube" width="400" height="225" scrolling="no" title="Graph Native Agentic AI Chatbot" frameborder="0" allow="autoplay; fullscreen; encrypted-media; picture-in-picture;" allowfullscreen="true"></iframe></div></STRONG></FONT></P><P><FONT size="5"><STRONG>Why I Chose a Property Graph</STRONG></FONT></P><UL><LI><STRONG>vs. Knowledge Graphs:</STRONG> While Knowledge Graphs focus on semantics and inference (great for unstructured data), Property Graphs focus on structure and attributes, which fits enterprise data better.</LI><LI><STRONG>Rich Context:</STRONG> Relationships between Business Objects carry properties. For example, a relationship between a Customer and a Product isn't just a link; it contains transaction details like: CUSTOMER --> BOUGHT {ProductID: 10005824, Date: 20/01/2026, Quantity: 5, Price: $100} --> PRODUCT</LI><LI><STRONG>Mathematical Aggregations: </STRONG>Property Graph engines are optimized to perform aggregations (Sum Revenue, Count Tickets, Avg) on the fly during data traversal. </LI><LI><STRONG>Accuracy:</STRONG> We rely on strict graph schema definitions to ensure the LLM generates accurate, hallucination-free queries.</LI></UL><P><STRONG><FONT size="5">Architecture:</FONT></STRONG></P><P>To enable this real-time graph agent, I implemented a robust data fabric on SAP BTP.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Blog Pic_Architecture_2.png" style="width: 584px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364916iEA7495CEA3B8C605/image-dimensions/584x338/is-moderation-mode/true?v=v2" width="584" height="338" role="button" title="Blog Pic_Architecture_2.png" alt="Blog Pic_Architecture_2.png" /></span></P><P><STRONG>Real-time data fabric</STRONG></P><P class="lia-align-justify" style="text-align : justify;">We utilize SAP Datasphere as the pipeline for data sourcing & replication. By replicating the raw tables into SAP HANA via Datasphere, we ensure the data is real-time. Once the data is replicated to HANA, the Graph Workspace is created on top of these tables, defining the vertices (nodes) and edges (relationships) virtually.</P><P><STRONG>Handling Federated Data Scenarios</STRONG></P><P class="lia-align-justify" style="text-align : justify;">In cases where data cannot be replicated to SAP HANA (volume constraints), the architecture adapts seamlessly. The Graph LLM retains its logic and identifies the necessary nodes (vertices) and query parameters based on the user prompt. Instead of generating an openCypher query, these parameters are mapped to optimized OData calls to the source system. This ensures a streamlined, real-time federation strategy that preserves the precision of a graph traversal.</P><P><STRONG>Schema Injection</STRONG></P><P class="lia-align-justify" style="text-align : justify;">The graph schema (the definition of nodes and edges) is injected into the LLM at runtime. Based on the user query, the LLM first identifies the nodes, the data traversal path, and the logic.</P><P class="lia-align-justify" style="text-align : justify;">For the query: <EM>“Who are my top 5 customers by sales volume who also have open critical tickets?”</EM></P><P>The LLM determines the execution plan:</P><UL><LI><STRONG>Nodes Identified:</STRONG> BusinessPartner (Customer), SalesOrder, ServiceRequest</LI><LI><STRONG>Traversal Paths:</STRONG><UL><LI>BusinessPartner --> SalesOrder (To calculate volume)</LI><LI>BusinessPartner <-- ServiceRequest (To find the open tickets)</LI></UL></LI><LI><STRONG>Filters Applied:</STRONG><UL><LI>ServiceRequest.LifeCycleStatus = 'Open'</LI><LI>ServiceRequest.Priority = 'Critical'</LI></UL></LI><LI><STRONG>Aggregation:</STRONG> Sum of SalesOrder.NetAmount grouped by Customer.</LI><LI><STRONG>Ranking:</STRONG> Order by Sum Descending, Limit 5.</LI></UL>
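<P class="lia-align-justify" style="text-align : justify;">To make this concrete, a plan like the one above could be turned into an openCypher string as in the following sketch. This is illustrative only: the actual implementation uses Python (see below), the edge labels PLACED / RAISED_FOR are placeholders that must match your Graph Workspace, and HANA's openCypher dialect may require syntax adjustments:</P><pre class="lia-code-sample language-javascript"><code>// Sketch only: assembling openCypher from the LLM's query plan.
// Edge labels (PLACED, RAISED_FOR) are illustrative placeholders.
interface QueryPlan {
  filters: string[];
  limit: number;
}

function buildTopCustomersQuery(plan: QueryPlan): string {
  return [
    'MATCH (c:BusinessPartner)-[:PLACED]->(o:SalesOrder)',
    'MATCH (t:ServiceRequest)-[:RAISED_FOR]->(c)',
    `WHERE ${plan.filters.join(' AND ')}`,
    'RETURN c.BusinessPartnerID AS customer, SUM(o.NetAmount) AS volume',
    'ORDER BY volume DESC',
    `LIMIT ${plan.limit}`,
  ].join('\n');
}

// The execution plan above maps to:
const query = buildTopCustomersQuery({
  filters: ["t.LifeCycleStatus = 'Open'", "t.Priority = 'Critical'"],
  limit: 5,
});</code></pre>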
<P><STRONG>openCypher Query Generation</STRONG></P><P class="lia-align-justify" style="text-align : justify;">Once the LLM identifies the graph query plan, the actual openCypher query is generated programmatically using Python. I did not use the LLM here, as openCypher wrapped in HANA is tricky and requires precise syntax.</P><P class="lia-align-justify" style="text-align : justify;">The openCypher query is executed directly against the SAP HANA Graph engine, and the result is passed to the LLM to generate a natural language summary.</P><H3 id="toc-hId-1917710395"><STRONG>Transactional Updates (Write Scenarios)</STRONG></H3><P class="lia-align-justify" style="text-align : justify;">For maintaining transaction data, the Agentic AI brings in a mandatory human-in-the-loop confirmation step before any data is written via a POST request. You can see the technical implementation steps for this human intervention logic in my <A href="https://community.sap.com/t5/technology-blog-posts-by-sap/the-power-of-ai-agents-my-journey-with-langgraph/ba-p/14032724" target="_blank">earlier blog post</A>.</P><H3 id="toc-hId-1721196890"><STRONG>User Interface & Deployment</STRONG></H3><P class="lia-align-justify" style="text-align : justify;">The core chatbot agent is deployed on SAP BTP. For the front end, I utilized SAP Build Apps to create a clean, low-code user interface. The app connects to the BTP agent and provides a seamless chat experience for the business user.</P><H2 id="toc-hId-1395600666">Conclusion</H2><P class="lia-align-justify" style="text-align : justify;">By moving from API loops to Graph Traversals, we unlock the true potential of Enterprise AI.
We get the speed of database-level aggregation with the flexibility of natural language processing.</P><H5 id="toc-hId-1586335318">Reference:</H5><P class="lia-align-justify" style="text-align : justify;">Refer to this blog to understand the implementation of an Agentic AI chatbot using LangGraph - <A class="" href="https://community.sap.com/t5/technology-blog-posts-by-sap/the-power-of-ai-agents-my-journey-with-langgraph/ba-p/14032724" target="_blank">The power of AI Agents: My Journey with LangGraph</A></P>2026-01-25T11:36:50.119000+01:00https://community.sap.com/t5/technology-blog-posts-by-members/how-to-call-as-abap-from-sap-datasphere/ba-p/14313818How to call AS ABAP from SAP Datasphere2026-01-27T08:11:27.946000+01:00vitvashttps://community.sap.com/t5/user/viewprofilepage/user-id/136079<H2 id="toc-hId-1788634221">Introduction</H2><P>For communication between SAP systems based on AS ABAP, RFC connections are the standard method. For example, when extracting from SAP ERP, the SAP BW system first launches the extractor in the source system via a BW->ERP RFC connection. The source system then returns the data collected by the extractor via an ERP->BW RFC connection.</P><P>For SAP Datasphere (hereinafter referred to as DSP), the natural successor to this approach is the use of BW Bridge. However, not all projects justify using BW Bridge in terms of system landscape, architecture, project scope, and implementation costs.</P><P>This blog proposes an approach that allows DSP to execute code in a system running AS ABAP (without using BW Bridge). We will call this code the target code. The idea is to leverage the CDS functionality known as Virtual Elements. A Virtual Element is a field in a CDS View Entity, specifically defined using annotations. This field is not retrieved from the database table level and is not calculated by formulas within the CDS View itself. Instead, accessing the CDS View triggers execution of specialized ABAP code that implements the calculation logic for this field. For BW specialists this may bring to mind the well-known virtual characteristics and key figures. On the other hand, the CDS View is an object accessible to DSP, for example, as a Source for a Data Flow. Thus, by running the Data Flow on the DSP side, we automatically initiate execution of the code that calculates the Virtual Element values in AS ABAP. We can, in turn, include calls to any desired programs, function modules, or class methods in this code—that is, achieve the execution of the target code.</P><P>Unfortunately, this approach has one significant drawback that complicates its implementation: Virtual Elements only function within OData services generated from CDS views. This means that we cannot use connections of type SAP S/4HANA On-Premise, SAP ABAP, or SAP BW on the DSP side. You will need to create a Generic OData connection.</P><P>The implementation plan is as follows:</P><OL><LI>Create a CDS View Entity in S/4HANA using a Virtual Element.</LI><LI>Create an ABAP class that implements the calculation of this Virtual Element. The class code will include the target code call.</LI><LI>Register and test the OData Service.</LI><LI>Review the permissions required for the technical user on whose behalf the connection will be made. These permissions should be broader than the standard permissions for connections of type SAP S/4HANA On-Premise or SAP ABAP.</LI><LI>Notes regarding creating a Generic OData connection in DSP.</LI><LI>Create a simple Data Flow in DSP using the OData connection to the created CDS View.
Running this Data Flow (manually, scheduled, or as part of a Task Chain) will trigger our target code in S/4HANA.</LI></OL><H2 id="toc-hId-1592120716">CDS View Entity with Virtual Element</H2><P>We will create a dummy CDS View. It doesn't need to provide any meaningful data. We only need the ability to add a Virtual Element field to it. Therefore, as an example, we'll use a selection from the TKA01 (Controlling Areas) table. Typically, a single Controlling Area with a value of 1000 is used. Based on this assumption, we'll create the CDS View. Of course, you can use other selections that are guaranteed to return a single record in your system. The following code is proposed:</P><pre class="lia-code-sample language-abap"><code>@EndUserText.label: 'CDS Virtual Element Test'
@analytics.dataCategory: #DIMENSION
@AccessControl.authorizationCheck: #NOT_REQUIRED
@vdm.viewType: #BASIC
@Metadata.ignorePropagatedAnnotations:true
@ObjectModel.usageType.serviceQuality: #A
@ObjectModel.usageType.sizeCategory: #S
@ObjectModel.usageType.dataClass: #MASTER
@analytics.dataExtraction.enabled: true
@analytics.dataExtraction.delta.changeDataCapture.automatic: true
@ObjectModel.supportedCapabilities:[#EXTRACTION_DATA_SOURCE]
@AbapCatalog.viewEnhancementCategory: [#NONE]
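// generates an OData (V2) service for this view; the service still has to be activated (see below)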
@odata.publish: true
define view entity Z_CDS_VIRT_ELEM as select from tka01
{
key kokrs,
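// 'dat' is a simple log field: the first 8 characters (YYYYMMDD) of the current UTC timestamp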
substring( cast( tstmp_current_utctimestamp() as abap.char(17) ), 1, 8 ) as dat,
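// the annotations below declare 'flag' as a read-only Virtual Element calculated in ABAP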
@ObjectModel.readOnly: true
@ObjectModel.virtualElement: true
@ObjectModel.virtualElementCalculatedBy: 'ABAP:ZCL_CDS_VIRT_ELEM_CALC'
cast('' as abap.sstring( 1 )) as flag
}
where kokrs = '1000'</code></pre><UL><LI>The <EM>@odata.publish: true</EM> annotation enables access to the CDS View via an OData Service.</LI><LI>The ‘dat’ field is added as a simple log: it shows the date of the last calculation run.</LI><LI>The ‘flag’ field is the Virtual Element. It has the simplest form of a flag field. By default, as can be seen from the code, this field is empty. However, upon completion of the calculations on the AS ABAP side, it will be filled with the value 'X'. This flag confirms on the DSP side the successful completion of the ABAP calculations.</LI><LI>The <EM>@ObjectModel.readOnly</EM>, <EM>@ObjectModel.virtualElement</EM> and <EM>@ObjectModel.virtualElementCalculatedBy</EM> annotations define the "flag" field as a Virtual Element. In particular, the<SPAN> </SPAN><EM>@ObjectModel.virtualElementCalculatedBy</EM><SPAN> </SPAN>annotation specifies the name of the ABAP class that calculates the ‘flag’ field.</LI></UL><P>When you initially activate this code, you will receive two warnings. For the <EM>@odata.publish</EM> annotation, the system reports that the corresponding OData service is not active. For the <EM>@ObjectModel.virtualElementCalculatedBy</EM> annotation, the system reports that the referenced class doesn't exist. In the next steps, we'll resolve both issues.</P><P>If we use the context menu command 'Open With -> Data Preview' on our CDS View at this point, the result will be as follows:<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vitvas_0-1769288322366.jpeg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364869i9093A3B6E8D1A137/image-size/large/is-moderation-mode/true?v=v2&px=999" role="button" title="vitvas_0-1769288322366.jpeg" alt="vitvas_0-1769288322366.jpeg" /></span>Here the 'kokrs' and 'dat' fields are populated using standard CDS syntax. The virtual 'flag' field is left blank.</P><H2 id="toc-hId-1395607211">Creation of the ABAP class implementing our calculations</H2><P>A class implementing a Virtual Element calculation must implement two standard interfaces, IF_SADL_EXIT and IF_SADL_EXIT_CALC_ELEMENT_READ:</P><pre class="lia-code-sample language-abap"><code>class ZCL_CDS_VIRT_ELEM_CALC definition
public
final
create public .
public section.
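" IF_SADL_EXIT is a marker interface; IF_SADL_EXIT_CALC_ELEMENT_READ provides the methods GET_CALCULATION_INFO and CALCULATE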
interfaces IF_SADL_EXIT .
interfaces IF_SADL_EXIT_CALC_ELEMENT_READ .
protected section.
private section.
ENDCLASS.</code></pre><P>The second of these interfaces provides two standard methods, GET_CALCULATION_INFO and CALCULATE. A detailed description of these methods and their parameters can be found in the official documentation:<SPAN> </SPAN><A href="https://help.sap.com/docs/ABAP_PLATFORM_BW4HANA/cc0c305d2fab47bd808adcad3ca7ee9d/4b5e56ec6f28453f81ac370bd91fb06d.html" target="_blank" rel="noopener noreferrer">https://help.sap.com/docs/ABAP_PLATFORM_BW4HANA/cc0c305d2fab47bd808adcad3ca7ee9d/4b5e56ec6f28453f81a...</A></P><P>In our example, the following code is proposed for the GET_CALCULATION_INFO method:</P><pre class="lia-code-sample language-abap"><code>METHOD if_sadl_exit_calc_element_read~get_calculation_info.
IF iv_entity = 'Z_CDS_VIRT_ELEM'.
"Here we'll list which fields of CDS View will be used in virtual element calculation
IF line_exists( it_requested_calc_elements[ table_line = 'FLAG' ] ).
APPEND 'KOKRS' TO et_requested_orig_elements.
ENDIF.
ELSE.
" Potential errors should be processed by public constants,
" described in exceptional class, inherited from CX_SADL_EXIT.
" For example, if CDS View is unknown:
RAISE EXCEPTION TYPE zcx_ds_virt_elem_exceptions EXPORTING textid = zcx_ds_virt_elem_exceptions=>c_entity_not_known.
ENDIF.
ENDMETHOD.</code></pre><P><SPAN>Errors occurring during execution of the class described above must be handled using system messages. For this purpose, public constants are created in a separate exception class. The class must be a child of the standard CX_SADL_EXIT class. An example of such an exception class is provided in the official documentation:</SPAN><SPAN> </SPAN><A href="https://help.sap.com/docs/SAP_NETWEAVER_AS_ABAP_751_IP/cc0c305d2fab47bd808adcad3ca7ee9d/8cf4517994764710be0486af54cfb8ab.html" target="_blank" rel="noopener noreferrer">https://help.sap.com/docs/SAP_NETWEAVER_AS_ABAP_751_IP/cc0c305d2fab47bd808adcad3ca7ee9d/8cf451799476...</A><SPAN> </SPAN></P><P>As we can see, after specifying the parent class, a small constructor override is required. Each error is then declared as a public constant attribute. Naturally, such an attribute contains a system message, previously defined in the Message Class relevant to the task.</P><P>Since our CDS View is designed to be as simple as possible, the IT_REQUESTED_CALC_ELEMENTS table will contain a single row with the ‘FLAG’ field. The ET_REQUESTED_ORIG_ELEMENTS table of regular source fields also contains only one record with the ‘KOKRS’ field.</P>
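<P>Following the pattern from the documentation linked above, a minimal sketch of such an exception class could look like this. The message class ZVIRT_ELEM and message number 001 are assumptions; replace them with a Message Class relevant to your task:</P><pre class="lia-code-sample language-abap"><code>CLASS zcx_ds_virt_elem_exceptions DEFINITION
  PUBLIC
  INHERITING FROM cx_sadl_exit
  CREATE PUBLIC.

  PUBLIC SECTION.
    INTERFACES if_t100_message.

    " One public constant per error, mapped to a message
    " of the (assumed) message class ZVIRT_ELEM
    CONSTANTS:
      BEGIN OF c_entity_not_known,
        msgid TYPE symsgid VALUE 'ZVIRT_ELEM',
        msgno TYPE symsgno VALUE '001',
        attr1 TYPE scx_attrname VALUE '',
        attr2 TYPE scx_attrname VALUE '',
        attr3 TYPE scx_attrname VALUE '',
        attr4 TYPE scx_attrname VALUE '',
      END OF c_entity_not_known.

    " Small constructor override so that the T100 key can be passed,
    " as in the RAISE EXCEPTION statement shown above
    METHODS constructor
      IMPORTING
        textid   LIKE if_t100_message=>t100key OPTIONAL
        previous LIKE previous OPTIONAL.
ENDCLASS.

CLASS zcx_ds_virt_elem_exceptions IMPLEMENTATION.
  METHOD constructor ##ADT_SUPPRESS_GENERATION.
    super->constructor( previous = previous ).
    CLEAR me->textid.
    IF textid IS INITIAL.
      if_t100_message~t100key = if_t100_message=>default_textid.
    ELSE.
      if_t100_message~t100key = textid.
    ENDIF.
  ENDMETHOD.
ENDCLASS.</code></pre><P>Let's move on to the CALCULATE method:</P><pre class="lia-code-sample language-abap"><code>METHOD if_sadl_exit_calc_element_read~calculate.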
CHECK NOT it_original_data IS INITIAL.
DATA lit_calculated_data TYPE STANDARD TABLE OF z_cds_virt_elem WITH DEFAULT KEY.
MOVE-CORRESPONDING it_original_data TO lit_calculated_data.
LOOP AT lit_calculated_data ASSIGNING FIELD-SYMBOL(<lfs_calculated_data>).
"Here we need to add a call to the code that we wanted to initiate from the DSP side.
"For example, we can submit some program:
" SUBMIT z_some_program AND RETURN.
<lfs_calculated_data>-flag = 'X'. "Formal flag confirming that the calculation has run
EXIT. "Additional guarantee that the calculation is performed only once
ENDLOOP.
MOVE-CORRESPONDING lit_calculated_data TO ct_calculated_data.
ENDMETHOD.</code></pre><P>Our CDS View is intentionally designed to return exactly one record. This means that in the CALCULATE method, the IT_ORIGINAL_DATA table also contains exactly one record. Consequently, the LOOP/ENDLOOP block will only be executed once, and the target code will also be called only once.</P><P>After the ZCL_CDS_VIRT_ELEM_CALC class is activated, the warning on the <EM>@ObjectModel.virtualElementCalculatedBy</EM> annotation of our CDS View will disappear.</P><H2 id="toc-hId-1199093706">OData Service activation</H2><P>Our CDS View annotation specifies that it can be accessed via an OData Service. When we activate the CDS code, the corresponding OData Service will be created but not activated. Activation must be performed manually in the /IWFND/MAINT_SERVICE transaction. The OData Service name consists of the name of our CDS View followed by the _CDS suffix. In our example, the resulting name is Z_CDS_VIRT_ELEM_CDS.</P><P>Click the 'Add Service' button:<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vitvas_4-1769288322417.jpeg" style="width: 540px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364872i49DF386BD4A8539A/image-dimensions/540x264?v=v2" width="540" height="264" role="button" title="vitvas_4-1769288322417.jpeg" alt="vitvas_4-1769288322417.jpeg" /></span></P><P> <SPAN>An ‘Add Selected Services’ window will appear. Enter Z_CDS_VIRT_ELEM_CDS as the technical name, and select LOCAL from the list in the 'System Alias' field. Then click 'Get Services'. A line with our service should appear:<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Screenshot 2026-01-22 144602.jpg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364903iCE9192EFEDAD4077/image-size/large/is-moderation-mode/true?v=v2&px=999" role="button" title="Screenshot 2026-01-22 144602.jpg" alt="Screenshot 2026-01-22 144602.jpg" /></span></SPAN></P><P> <SPAN>Select the row with the service description and press the ‘Add Selected Services’ button:<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Screenshot 2026-01-22 144916.jpg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364904iBA1DDB808023FFBA/image-size/large/is-moderation-mode/true?v=v2&px=999" role="button" title="Screenshot 2026-01-22 144916.jpg" alt="Screenshot 2026-01-22 144916.jpg" /></span></SPAN><SPAN>In the 'Package Assignment' field, specify the appropriate package for the subsequent transport of the developed objects to the production system. In the 'ICF Node' section, select the 'SAP Gateway OData V2' setting. An important setting is the 'Enable OAuth for Service' checkbox. Check with your BASIS Team to determine whether OAuth will be used when connecting the DSP to the ABAP system. If so, enable this checkbox. After confirming the settings, service activation will begin. 
Upon completion, the following message will appear: </SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vitvas_7-1769288322396.jpeg" style="width: 590px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364875i0DC57C6BA63E46F0/image-dimensions/590x204?v=v2" width="590" height="204" role="button" title="vitvas_7-1769288322396.jpeg" alt="vitvas_7-1769288322396.jpeg" /></span></P><P>Now you need to return to the main transaction window and filter the newly activated service in the list:<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Screenshot 2026-01-22 151814.jpg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364905iE2AFDB6B44E248EB/image-size/large/is-moderation-mode/true?v=v2&px=999" role="button" title="Screenshot 2026-01-22 151814.jpg" alt="Screenshot 2026-01-22 151814.jpg" /></span></P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Picture7.jpg" style="width: 903px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364906i35FBCA17C138A5F7/image-size/large/is-moderation-mode/true?v=v2&px=999" role="button" title="Picture7.jpg" alt="Picture7.jpg" /></span></P><P> <SPAN>Check the traffic light in the 'Status' field (the leftmost field in the ICF Nodes section). If it's yellow, select the drop-down list from the 'ICF Node' button directly above it and execute the 'Activate' command:</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vitvas_10-1769288322438.jpeg" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364878iCD05D2C4FD8B5D16/image-size/medium/is-moderation-mode/true?v=v2&px=400" role="button" title="vitvas_10-1769288322438.jpeg" alt="vitvas_10-1769288322438.jpeg" /></span></P><P>The window for selecting a package for importing activated objects into production will appear again:<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vitvas_11-1769288322453.jpeg" style="width: 513px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364879i5E60655091582319/image-dimensions/513x108?v=v2" width="513" height="108" role="button" title="vitvas_11-1769288322453.jpeg" alt="vitvas_11-1769288322453.jpeg" /></span></P><P>After this, the traffic light will change to green. You can now test the service by clicking the 'SAP Gateway Client' button:<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vitvas_12-1769288322381.jpeg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364881i8D27B1695748C09C/image-size/large?v=v2&px=999" role="button" title="vitvas_12-1769288322381.jpeg" alt="vitvas_12-1769288322381.jpeg" /></span></P><P> By default, the test transaction specifies a URI that reads the service description:<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vitvas_13-1769288322424.jpeg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364884iFD190E19E523142F/image-size/large?v=v2&px=999" role="button" title="vitvas_13-1769288322424.jpeg" alt="vitvas_13-1769288322424.jpeg" /></span></P><P>But we're interested in what entities this service can provide. 
This can be found by clicking the 'Entity Set' button:<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vitvas_14-1769288322350.jpeg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364883iA33215136874B20E/image-size/large?v=v2&px=999" role="button" title="vitvas_14-1769288322350.jpeg" alt="vitvas_14-1769288322350.jpeg" /></span></P><P>Here we see that the service can read data for CDS View Z_CDS_VIRT_ELEM. By double-clicking this name, we can change the URI:<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vitvas_15-1769288322373.jpeg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364886i2B5637354B48010F/image-size/large?v=v2&px=999" role="button" title="vitvas_15-1769288322373.jpeg" alt="vitvas_15-1769288322373.jpeg" /></span></P><P>If you now click the 'Execute' button, data will be read from the CDS View Z_CDS_VIRT_ELEM via the OData Service. This is precisely the operation that should trigger the calculation of the 'flag' field via the ABAP class described above. If you need to study or debug this class, you can set breakpoints there before clicking 'Execute'. The calculation results will be presented here in XML format:<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Screenshot 2026-01-22 161759.jpg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364914i69CCC583187E5E8E/image-size/large/is-moderation-mode/true?v=v2&px=999" role="button" title="Screenshot 2026-01-22 161759.jpg" alt="Screenshot 2026-01-22 161759.jpg" /></span><SPAN>Here you can see that the 'flag' field is filled with the value 'X'. This means that the CALCULATE method was executed along with the target code call.</SPAN></P>
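<P>For reference, the URI of this entity set request follows the standard SAP Gateway pattern sketched below; verify the exact entity set name via the 'Entity Set' button in your system:</P><pre class="lia-code-sample"><code>/sap/opu/odata/sap/Z_CDS_VIRT_ELEM_CDS/Z_CDS_VIRT_ELEM</code></pre><H2 id="toc-hId-1002580201">System user authorizations</H2><P>When creating a DSP connection to an ABAP system, a special user of the System type is used. Typically, the BASIS Team assigns this user a standard set of permissions sufficient for connections of SAP S/4HANA On-Premise, SAP ABAP, or SAP BW types. Experience has shown that a Generic OData connection requires a number of additional permissions.</P><P>First, you need to add authorizations to launch the created OData Service. This is the S_SERVICE authorization object. Editing it differs from most standard authorization objects. After adding the object to the role profile, click the edit button for the SRV_NAME field. In the window that appears, select 'TADIR Service' from the Type drop-down list, which will display the list of services allowed to be launched:<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vitvas_17-1769288322449.jpeg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364885i8650D66A295FF367/image-size/large?v=v2&px=999" role="button" title="vitvas_17-1769288322449.jpeg" alt="vitvas_17-1769288322449.jpeg" /></span></P><P>We need to add two types of objects for our service: IWSG and IWSV. First, select the appropriate type from the list in the 'Object Type' column. Then, use the search (for example, by pressing F4) in the 'Object Name' column. In both cases, you can use the name of our service (i.e.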
- Z_CDS_VIRT_ELEM_CDS) to search for the desired value:<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vitvas_18-1769288322501.jpeg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364889i87FCE802AE976638/image-size/large?v=v2&px=999" role="button" title="vitvas_18-1769288322501.jpeg" alt="vitvas_18-1769288322501.jpeg" /></span>The name of the object being searched consists of the service name and the ordinal number of the active version of this service (four digits). For the first version of the service in our example, the IWSG type value is 'Z_CDS_VIRT_ELEM_CDS_0001', and the IWSV type value is 'Z_CDS_VIRT_ELEM_CDS 0001'.</P><P>After saving the selected values, you will see that the value 'HT' is automatically added to the SRV_TYPE field. Thus, the S_SERVICE authorizations have been added and the values specified.</P><P>It is also important to note that additional authorizations may be required, depending on the specific target code you want to run in the CALCULATE method described above. For example, if you plan to use a command like 'SUBMIT z_some_program AND RETURN' to launch a program, you will need the corresponding authorizations in the S_PROGRAM object. If the target code performs, for example, calculations related to the FI module, the system user may require authorizations specific to that module. If the target code accesses the contents of database tables, authorizations in the S_TABU_DIS and S_TABU_NAM objects may be required. And so on. Most likely, during the OData service testing phase (as described in the previous section), you will need to collaborate with the BASIS Team to monitor missing authorizations and add them to the system user's roles. The simplest solution is to temporarily (and only in the development system!) change the user type from system to dialog. In this case, you can log in to the SAP GUI with this user's login and manually run the OData service test while simultaneously monitoring the authorizations being checked.</P><P>The security issue of assigning such non-typical authorizations to a system user is mitigated by the fact that this user is created only to connect DSP to the ABAP system and cannot be used for manual data access, like a dialog user.</P><H2 id="toc-hId-806066696">DSP OData Connection</H2><P>Creating a Generic OData connection should be performed by BASIS Team specialists, as the required settings depend on the system landscape used in your specific project, as well as the security settings applied (for example, whether OAuth is enabled). Here's an example of the settings used in our project:<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vitvas_19-1769288322316.jpeg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364890iBB98D5C13345890F/image-size/large?v=v2&px=999" role="button" title="vitvas_19-1769288322316.jpeg" alt="vitvas_19-1769288322316.jpeg" /></span><SPAN>The address entered in the ‘URL’ field ends with the technical name of our OData service. Therefore, a separate connection will need to be created for each OData service.</SPAN></P><P>In the ‘Version’ field, V2 is selected. 
This is because we specified OData V2 when activating the service above and didn't have the option to select V4:<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vitvas_20-1769288322477.jpeg" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364888i9FAF1FC8B0406D18/image-size/medium?v=v2&px=400" role="button" title="vitvas_20-1769288322477.jpeg" alt="vitvas_20-1769288322477.jpeg" /></span></P><P>The example also shows that OAuth is used. That's why the corresponding checkbox was enabled earlier when activating the service:<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vitvas_21-1769288322383.jpeg" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364891i4E5A1CDB1A371F13/image-size/medium?v=v2&px=400" role="button" title="vitvas_21-1769288322383.jpeg" alt="vitvas_21-1769288322383.jpeg" /></span></P><P>At the bottom of the connection settings window, click the ‘Show Advanced Properties’ button:<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vitvas_22-1769288322357.jpeg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364892iAB63E00A23049750/image-size/large?v=v2&px=999" role="button" title="vitvas_22-1769288322357.jpeg" alt="vitvas_22-1769288322357.jpeg" /></span></P><P>Here you need to specify the client number of the ABAP system to which the OData service connects:<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vitvas_23-1769288322479.jpeg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364893iB6A73B22B92EA7E5/image-size/large?v=v2&px=999" role="button" title="vitvas_23-1769288322479.jpeg" alt="vitvas_23-1769288322479.jpeg" /></span></P><P> </P><H2 id="toc-hId-609553191">DSP Data Flow to use the OData connection and trigger the target code</H2><P>We need to create a Data Flow with our OData service as a source. To do this, open Data Builder (in the same Space where you created the OData service connection). In the left pane, select the 'Sources' tab and open the 'Connections' folder:<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vitvas_24-1769288322401.jpeg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364896i7546FBCE3CF163EA/image-size/large/is-moderation-mode/true?v=v2&px=999" role="button" title="vitvas_24-1769288322401.jpeg" alt="vitvas_24-1769288322401.jpeg" /></span></P><P>If all connection settings and system user authorizations were configured correctly, you will see the connection, and when you expand it, you will see the name of our CDS View:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vitvas_25-1769288322407.jpeg" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364894i654970E6A0C0836C/image-size/medium/is-moderation-mode/true?v=v2&px=400" role="button" title="vitvas_25-1769288322407.jpeg" alt="vitvas_25-1769288322407.jpeg" /></span></P><P><SPAN>The icon next to the CDS View is the same as for a regular table. You can drag and drop this View into the Data Flow being created. It will be automatically recognized as a Source. 
At this step, you can select the 'Preview Data' command for this table:</SPAN><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vitvas_26-1769288322381.jpeg" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364895i5C83380C45F8F4B1/image-size/medium/is-moderation-mode/true?v=v2&px=400" role="button" title="vitvas_26-1769288322381.jpeg" alt="vitvas_26-1769288322381.jpeg" /></span></P><P>The result should look like this:<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vitvas_27-1769288322289.jpeg" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364897iEEB3120BABBCC2AC/image-size/medium?v=v2&px=400" role="button" title="vitvas_27-1769288322289.jpeg" alt="vitvas_27-1769288322289.jpeg" /></span></P><P>The presence of an 'X' in the 'flag' field means that the target code has been executed on the ABAP system side. If the target code execution takes a long time, you will also notice the delay when previewing the data in DSP.</P><P>As a final step, you need to complete the Data Flow by adding the target table to it, which in this case only serves as a dummy. Save the Data Flow and deploy it.</P><H2 id="toc-hId-413039686">Result</H2><P>Now you can initiate execution of the target code in the ABAP system in any way convenient for you: by manually starting the Data Flow, scheduling it, or including the Data Flow in a Task Chain.</P><P> </P><P> </P><H3 id="toc-hId-542122405"><EM>Note: Data Preview does not work for Virtual Elements</EM></H3><P>As a result of our implementation, the OData Service is activated and the implementing ABAP class exists for our Virtual Element. We can therefore repeat the attempt at a CDS data preview in ABAP Development Tools (CDS editor context menu command 'Open With -> Data Preview'). Unfortunately, the result will be the same: the ‘flag’ field is empty: </P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vitvas_28-1769288322404.jpeg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/364898i13E8077EA80F571A/image-size/large/is-moderation-mode/true?v=v2&px=999" role="button" title="vitvas_28-1769288322404.jpeg" alt="vitvas_28-1769288322404.jpeg" /></span>This is explained by the fact that the Virtual Element calculation is performed only when accessing the CDS View through the OData Service.</P><P> </P>2026-01-27T08:11:27.946000+01:00https://community.sap.com/t5/technology-blog-posts-by-members/bw-like-authorizations-multiple-dim-dac-vs-single-dim-dacs/ba-p/14320103BW-Like Authorizations Multiple-Dim DAC vs Single-Dim DACs2026-02-02T22:45:26.816000+01:00Martin_Kumahttps://community.sap.com/t5/user/viewprofilepage/user-id/275733<P><STRONG>One Multiple-Dimensions DAC or Multiple Single-Dimension DACs</STRONG><BR />Basically:<BR />If the Restriction is equal, then the DAC considers the Criterion entries as an "AND".<BR />If the Restriction is not equal, then the DAC considers the Criterion entries as an "OR".</P><P><STRONG>Use Case:</STRONG><BR />We have multiple RSIOBJNM-like dimensions, e.g. 0PLANT and 0COMP_CODE, and we would like to implement RSINFOCUBE-like access as well.</P><P><BR /><STRONG>One Multiple-Dimensions DAC: </STRONG><BR />In DSP it is not mandatory to have all Criterions used in a DAC populated. 
One Restriction/Criterion row is enough.</P><P><U>Example:</U><BR />A user has maintained access to 0PLANT, but 0COMP_CODE is missing.<BR />DSP: the user will see the maintained PLANTs and ALL COMP_CODEs.<BR />BW: the user will get a "No authorizations" message.</P><P>DSP does not check all Restriction columns used in the DAC the way BW does. If a user has access to one Dimension, the user has access to * for all other Dimensions used in the DAC.</P><P>If the source for the authorization view/table is BW (loaded regularly from the BW system into a secure DSP space), then both dimensions should always be correctly populated. But if they are not, we bypass security.</P><P><BR /><STRONG>Multiple Single-Dimension DACs:</STRONG><BR />The better approach is to use multiple DACs --> an implicit "AND", with the mandatory requirement to have at least one Restriction/Criterion row. All DACs must have the user populated, else nothing will be shown.<BR />Example:<BR />A user has maintained access to 0PLANT, but 0COMP_CODE is missing.<BR />DSP: the user will get a "No authorizations" message.<BR />BW: the user will get a "No authorizations" message.</P><P>We implicitly ensure that the user has to be authorized for every Dimension used in the DAC. BW-like behavior.</P><P> </P><P><STRONG>How-to simulate multiple RSIOBJNM (compounded IOs as well):</STRONG><BR />Create multiple (Single-Dimension) DACs, one for each RSIOBJNM.<BR />The implicit "AND" ensures that security will always fail in case the user has no values assigned to a DAC-relevant Dimension.</P><P><STRONG>How-to simulate RSINFOCUBE with RSIOBJNM:</STRONG><BR />Create multiple (Single-Dimension) DACs for each RSIOBJNM and RSINFOCUBE.<BR />For the RSINFOCUBE: create a new dimension in each DAC-relevant View, simulating 0INFOPROV. Assign PROVIDER_DAC to the new dimension.<BR /><STRONG>How-to simulate RSINFOAREA:</STRONG><BR />Visibility cannot be adjusted within a Space, but if RSINFOCUBE-like access is implemented, the users will not be able to access the data anyway.</P><P><BR /><STRONG>Possible use cases for a Multiple-Dimensions DAC:</STRONG><BR />Master-data-based DACs: based on a single Master Data object; for example, all of its attributes can be moved into a single DAC. Easier maintenance/support. The user is not expected to have values maintained for each MD attribute. 
</P><P>Tuple-based Dimension DACs: two different Dimensions (same master data / different views of the data) from the same source field.</P><P> </P><P>BTW: <STRONG>Nice new feature in DSP (RSECADMIN </STRONG>Execute-As): now possible, however only in Views (not in Models / AMs yet).</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Martin_Kuma_0-1770067826568.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/368347iAA7A94F07F054F47/image-size/medium?v=v2&px=400" role="button" title="Martin_Kuma_0-1770067826568.png" alt="Martin_Kuma_0-1770067826568.png" /></span></P><P> </P><P> </P><P><STRONG>Initial BW-Sec to DSP-Sec Blog: </STRONG><A href="https://community.sap.com/t5/technology-blog-posts-by-members/bw-like-authorizations-in-datasphere-dsp/ba-p/14153918" target="_blank">https://community.sap.com/t5/technology-blog-posts-by-members/bw-like-authorizations-in-datasphere-dsp/ba-p/14153918</A></P><P><STRONG>Main Blog:</STRONG><SPAN> </SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-members/bw-vs-datasphere-dsp-amp-sac/ba-p/14289847" target="_blank">https://community.sap.com/t5/technology-blog-posts-by-members/bw-vs-datasphere-dsp-amp-sac/ba-p/1428...</A></P><P> </P>2026-02-02T22:45:26.816000+01:00https://community.sap.com/t5/technology-blog-posts-by-sap/a-i-assisted-transition-at-your-own-pace-bw7-5-to-bdc-pilot-experiences/ba-p/14320734A.I Assisted Transition at your own Pace: BW7.5 to BDC Pilot Experiences2026-02-05T12:03:31.021000+01:00Rob_Lightfoothttps://community.sap.com/t5/user/viewprofilepage/user-id/130784<P>This blog is part of a series related to a complex BW7.5 customer case where we explored together the opportunity to transition to BDC. You can find the intro blog below, plus the links to the subsequent blogs will be listed here -</P><UL><LI><STRONG> Link 1</STRONG></LI><LI><STRONG>Link 2 </STRONG></LI><LI><STRONG>Link 3</STRONG></LI></UL><P class="lia-indent-padding-left-30px" style="padding-left : 30px;"><STRONG>The Time is Now </STRONG></P><P>At the time of typing this, it is still early 2026. The year’s priorities and goals are being set. For many of our BW customers, the end-of-maintenance deadline set at the end of 2027 for NW 7.5 is looming large. Questions such as the following are being asked:</P><UL><LI>How can we maintain what we have built over many years without rebuilding from the ground up?</LI><LI>How can we make use of the latest and greatest from A.I?</LI><LI>Can we continue to use our existing skills?</LI><LI>Can we remain compliant with our regulatory requirements?</LI></UL><P>The goal of this series of blogs is to outline what customers on BW 7.5 can do to modernise their mission-critical system of engagement at their own pace whilst embracing A.I, ensuring they avoid expensive regret-cost migrations to third-party platforms that can often spiral, seldom delivering on the initial promise. These blogs are based on first-hand experiences in a pilot project for BW7.5 to BDC.</P><P>Interested to learn more? 
Read on.</P><P class="lia-indent-padding-left-30px" style="padding-left : 30px;"><STRONG>The Problem Statement</STRONG></P><P>BW 7.5 is the last version of the ABAP-stack SAP Business Warehouse initially released in the late 1990s. It delivers out-of-the-box capabilities (business content, extractor logic, authorisation concepts etc.) to manage complex third-normal-form data structures from SAP ECC (the legacy SAP application around finance, HR, supply chain etc.) across a layered scalable architecture (LSA). In 2016, the ‘new’ BW, called BW/4HANA, was released with a roadmap to 2040+, triggering a timeline that will culminate in 2027 with the end of mainstream maintenance of the ABAP version BW7.5.</P><P>However, many customers for a variety of reasons chose not to move to BW4. Either they have too much logic built into objects not easily ported (for example APDs, BEx) or they were enticed by promises made by best-of-breed data vendors in cloud-native data lake architectures to migrate their BW workload. Often these projects spiral in costs and effort without delivering sufficient value.</P><P class="lia-indent-padding-left-30px" style="padding-left : 30px;"><STRONG>The Solution </STRONG></P><P>SAP Business Data Cloud (BDC), released in February 2025, is SAP’s fully managed SaaS solution built for modern workloads and to natively handle data-as-a-service from across the SAP Business Suite applications (S/4HANA Public and Private, SuccessFactors, Concur etc). This is achieved at the click of a button whereby ‘Data Products’ are loaded and kept in sync with backend transactional systems, removing the legwork for customers looking to get rapid value from their data. Business Content and Intelligent Applications are installed via the BDC Cockpit, surfacing KPIs for decision making across a host of business areas, with a vast amount more to come in 2026. With BDC, SAP Databricks, SAP Snowflake and other best-of-breed solutions work in tandem with SAP Datasphere on the semantically rich data products, providing customers a unified data architecture to deliver next-generation A.I use cases.</P><P>What adds to the excitement of BW customers setting their new year goals? BDC is the go-to solution for SAP Business Warehouse. BDC provides a strong roadmap for customers to modernise their BW7.5 environment at their own pace, retaining their investments and their skills, keeping their data available, and opening their rich sets of historic business data for A.I. This is achieved through the ‘Lift, Shift & Innovate’ approach that allows large, complex BWs to move to a managed cloud service (Private Cloud Edition) of BW within the formation of SAP Business Data Cloud.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Picture1 replace.jpg" style="width: 743px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/369170i8C9BCE896DC26D62/image-dimensions/743x388?v=v2" width="743" height="388" role="button" title="Picture1 replace.jpg" alt="Picture1 replace.jpg" /></span></P><P>The <STRONG>Lift</STRONG> phase moves existing SAP BW systems as-is to SAP-managed cloud infrastructure, extending maintenance timelines (2030 instead of 2027) and reducing operational overhead. The <STRONG>Shift</STRONG> phase exposes SAP BW data as Data Products using the Data Product Generator tool, enabling integration with SAP Datasphere and external data sources while preserving existing use cases. 
The <STRONG>Innovate</STRONG> phase explores SAP-managed Data Products and Intelligent Applications to replace legacy SAP BW scenarios with modern analytics, AI capabilities, and access to an open partner ecosystem. This phased approach mitigates risk by maintaining existing investments while providing a clear path to cloud-native analytics and reduced Total Cost of Ownership.</P><P>There are some prerequisites for your BW system to fulfil prior to embarking on the journey with BDC: the minimum version supporting the Data Product Generator is SPS 24, and the system needs to be running on HANA.</P><P>Many large global organisations reliant on BW for their strategic reporting and planning/business steering have already begun the transition to BDC to realise the benefits of modernisation to cloud native at their own pace.</P><P class="lia-indent-padding-left-30px" style="padding-left : 30px;"><STRONG>‘</STRONG><STRONG>Vibe Migration</STRONG><STRONG>’</STRONG></P><P>In Autumn/Winter 2025 the SAP EMEA and Global Solution Advisory teams worked cohesively on a pilot to demonstrate the BW to BDC move for a specific customer case.</P><P>This case assessed all aspects of a complex BW7.5 system after the Lift to PCE, demonstrating how the Shift and Innovate phases can look using the modern tools available in SAP BDC, with a focus on SAP Datasphere, SAP Analytics Cloud and Artificial Intelligence. Throughout the subsequent blog posts we will share the findings, with specifics on how end-of-life (EoL) objects such as APDs and BEx, and other scenarios, can be covered either through automated migration to SAP BDC Datasphere, A.I-assisted conversion, or rebuild.</P><P>Please stay tuned for the next in this series of blog posts, related to the Data Product Generator. </P><P> </P>2026-02-05T12:03:31.021000+01:00https://community.sap.com/t5/technology-blog-posts-by-sap/sap-business-data-cloud-and-datasphere-news-in-january/ba-p/14320094SAP Business Data Cloud and Datasphere News in January2026-02-05T23:27:35.276000+01:00kpsauerhttps://community.sap.com/t5/user/viewprofilepage/user-id/14110<P><STRONG>SAP Business Data Cloud and Datasphere News in January</STRONG></P><P>This year we really started with a bang!<BR />You will see many great features and enhancements being delivered with our three releases already in the first month of the year.</P><P>Learn more in my short top-features video for January 2026 on YouTube <span class="lia-unicode-emoji" title=":television:">📺</span></P><P>In addition, explore the latest updates in our community news blogs and more.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="2026-01 Datasphere News Blog.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/368864i813A0D00CCB06529/image-size/large/is-moderation-mode/true?v=v2&px=999" role="button" title="2026-01 Datasphere News Blog.png" alt="2026-01 Datasphere News Blog.png" /></span></P><P> </P><H2 id="toc-hId-1789460925">My top features in January</H2><P>2026 really starts with fireworks of excellent new features. Therefore, this list is a lot longer than usual.</P><P> </P><H3 id="toc-hId-1722030139">Task Chains & Scheduling</H3><H4 id="toc-hId-1654599353">Task ports of tasks in Task Chains</H4><P>You can now use task ports to provide enhanced flexibility and error handling in task chains. Task ports allow you to control the flow of tasks based on success or failure outcomes. 
Additionally, the new <EM>Ignore Error</EM> feature allows you to exclude the status of a specific task from the overall task chain status evaluation. This feature is beneficial for maintaining workflow continuity, managing errors effectively, and customizing task execution paths.<BR /><SPAN><span class="lia-unicode-emoji" title=":link:">🔗</span> </SPAN><SPAN><A href="https://help.sap.com/docs/PRODUCTS/d4185d7d9a634d06a5459c214792c67e/04dcfa7fcf374e8798c9807cfe612c0c.html?version=cloud&locale=en-US" target="_blank" rel="noopener noreferrer">More information</A></SPAN></P><P> </P><H4 id="toc-hId-1458085848">Enable Auto-Retry for a Task in a Task Chain</H4><P>The auto-retry feature allows up to three automatic retries for a task in a task chain. It automatically retries failed tasks to help the task chain complete and to improve resilience. <BR /><SPAN><span class="lia-unicode-emoji" title=":link:">🔗</span> </SPAN><SPAN><A href="https://help.sap.com/docs/PRODUCTS/d4185d7d9a634d06a5459c214792c67e/d1afbc2b9ee84d44a00b0b777ac243e1.html?version=cloud&locale=en-US" target="_blank" rel="noopener noreferrer">More information</A></SPAN></P><P> </P><H4 id="toc-hId-1261572343">Technical user for scheduling </H4><P>You can now assign a technical user to schedule data integration tasks. This allows you to minimize the number of users that manage schedules. In addition, this avoids failed schedules and tasks related to expired consent and authorization.<BR /><span class="lia-unicode-emoji" title=":link:">🔗</span><SPAN> More information: </SPAN><SPAN><A href="https://help.sap.com/docs/PRODUCTS/d4185d7d9a634d06a5459c214792c67e/4b660c0395454bd0923f732eef4ee4b2.html?version=cloud&locale=en-US" target="_blank" rel="noopener noreferrer">Modify the Owner of a Schedule</A></SPAN><SPAN> and </SPAN><SPAN><A href="https://help.sap.com/docs/PRODUCTS/d4185d7d9a634d06a5459c214792c67e/7fa07621d9c0452a978cb2cc8e4cd2b1.html?version=cloud&locale=en-US" target="_blank" rel="noopener noreferrer">Scheduling Data Integration Tasks</A></SPAN><SPAN>.</SPAN></P><P> </P><H4 id="toc-hId-1065058838">Manage Tasks with REST APIs </H4><P><SPAN>You can use REST APIs to run task chains and retrieve task logs and existing task log history without using the Command Line Interface (CLI). 
In addition, you can now also use a new CLI command to get flexible and controlled log information.<BR /><span class="lia-unicode-emoji" title=":link:">🔗</span></SPAN><SPAN> More information </SPAN><A href="https://help.sap.com/docs/PRODUCTS/d4185d7d9a634d06a5459c214792c67e/274f2736465c4c48a091c675880502a2.html?version=cloud&locale=en-US" target="_blank" rel="noopener noreferrer">Manage Tasks Using REST APIs</A><SPAN> and </SPAN><A href="https://help.sap.com/docs/PRODUCTS/d4185d7d9a634d06a5459c214792c67e/2b26a31f197444dea314495bc0008eae.html?version=cloud&locale=en-US" target="_blank" rel="noopener noreferrer">Manage Tasks and Task Chains via the Command Line</A><SPAN>.</SPAN></P><P> </P><H3 id="toc-hId-739462614">Analytic Model: New functions in the Expression Editor (YTD, QTD, MTD)</H3><P><SPAN>For calculated and restricted measures, as well as for calculated and restricted structures, the new standard functions YTD, QTD, MTD and %GrandTotal are now supported in the expression editor of the Analytic Model.</SPAN></P><P> </P><H3 id="toc-hId-542949109">Replication Flow</H3><H4 id="toc-hId-475518323">Reuse a target table of one replication as a source of another replication flow </H4><P>You can reuse a target local table (with or without delta capture enabled) of an existing replication flow as a source table in another replication flow. This automatically creates a relationship between the two replication flows and now allows multi-step replication scenarios. It simplifies the distribution of data from SAP Datasphere to multiple downstream systems.<BR /><SPAN><span class="lia-unicode-emoji" title=":link:">🔗</span> </SPAN><SPAN><A href="https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/d570eee0045a4b9ab5d47ac70140d60a.html?ai=true&locale=en-US&version=LATEST" target="_blank" rel="noopener noreferrer">More information</A></SPAN><SPAN>.</SPAN></P><P> </P><H4 id="toc-hId-279004818">Support parquet format <SPAN>for cloud storage sources</SPAN></H4><P>You can now replicate data stored in parquet files when using the following cloud storage providers as the source of your replication flow:</P><UL><LI>Amazon Simple Storage Service</LI><LI>Google Cloud Storage</LI><LI>Microsoft Azure Data Lake Gen2</LI></UL><P><span class="lia-unicode-emoji" title=":link:">🔗</span> <SPAN><A href="https://help.sap.com/docs/PRODUCTS/d4185d7d9a634d06a5459c214792c67e/4d481a2c620f4b52ba65b360299d7719.html?version=cloud&locale=en-US" target="_blank" rel="noopener noreferrer">More information</A></SPAN>.</P><P> </P><H3 id="toc-hId--121822775">Partitioning of local tables with data</H3><P>You can create partitions on existing local tables that contain data and are stored in a standard space (HANA Cloud storage and compute). That way you can break down your data into smaller tables and better manage tables with a large volume of data.</P><P>At deployment you can decide when the partitioning task happens:</P><UL><LI><U>Automatically After Deployment</U>: Once the deployment is complete, the partition creation task will start automatically.</LI><LI><U>Automatically at a Scheduled Time</U>: You define a timestamp when the partitions must be created. 
Once deployment is complete, the partition creation task will start automatically at the scheduled timestamp.</LI><LI><U>Manually at a Later Time</U>: Once the deployment is complete, you will be able to create the partitions whenever you want.</LI></UL><P><span class="lia-unicode-emoji" title=":link:">🔗</span> <SPAN><A href="https://help.sap.com/docs/PRODUCTS/d4185d7d9a634d06a5459c214792c67e/03191f36e9144b2aaa47b8c9eea039c1.html?version=cloud&locale=en-US" target="_blank" rel="noopener noreferrer">More information</A></SPAN>.</P><P> </P><H3 id="toc-hId--318336280">Longer technical names of data builder objects</H3><P>We increased the maximum length of technical names for data builder objects and elements, which now allows for more descriptive and detailed naming conventions.<BR />The maximum length has changed for the following objects and elements:</P><UL><LI>Views, tables, E/R models: 100</LI><LI>Views with semantic usage Analytical Dataset (Deprecated): 60</LI><LI>Analytic Models: 60</LI><LI>Hierarchies created inside a dimension view: 20</LI><LI>Columns: 100</LI><LI>Input parameters: 100</LI></UL><P><span class="lia-unicode-emoji" title=":link:">🔗</span> <SPAN><A href="https://help.sap.com/docs/PRODUCTS/d4185d7d9a634d06a5459c214792c67e/982f9a30d4ab49c8b019cfaf3dc08391.html?version=cloud&locale=en-US" target="_blank" rel="noopener noreferrer">More information</A></SPAN>.</P><P> </P><H3 id="toc-hId--514849785">BW Modernization</H3><P>We also delivered major milestones for the BW modernization scenarios. Metadata extraction of SAP BW 7.5 is now enabled for the catalog, plus data lineage from an SAC story down to the BW 7.5 objects.</P><P>The data product generator has been extended with:</P><UL><LI>Introduction of the Scenario Generator: semantic import of BW 7.5 and BW/4HANA InfoProviders and their dependencies, including attributes and texts</LI><LI>Multiple target support: select any Datasphere tenant and space</LI><LI>Transport of BW subscriptions</LI><LI>Mass handling of BW InfoProviders</LI></UL><P>Requires BWMT 1.27 and SAP Note <SPAN><A href="https://me.sap.com/notes/3692414" target="_blank" rel="noopener noreferrer">3692414</A></SPAN></P><P><span class="lia-unicode-emoji" title=":link:">🔗</span> More information <SPAN><A href="https://help.sap.com/docs/PRODUCTS/d4185d7d9a634d06a5459c214792c67e/b350d7c6e88c4f089d381dfe6feee792.html?version=cloud&locale=en-US" target="_blank" rel="noopener noreferrer">Connecting to BW Systems</A></SPAN> and <SPAN><A href="https://help.sap.com/docs/PRODUCTS/d4185d7d9a634d06a5459c214792c67e/afccc581146542c485a52563167e23cc.html?version=cloud&locale=en-US" target="_blank" rel="noopener noreferrer">Catalog Asset Details</A>.</SPAN></P><P> </P><P><div class="video-embed-center video-embed"><iframe class="embedly-embed" src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2Ft6KoDcATK-M%3Ffeature%3Doembed&display_name=YouTube&url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3Dt6KoDcATK-M&image=https%3A%2F%2Fi.ytimg.com%2Fvi%2Ft6KoDcATK-M%2Fhqdefault.jpg&type=text%2Fhtml&schema=youtube" width="200" height="112" scrolling="no" title="SAP Datasphere: Top New Features | January 2026" frameborder="0" allow="autoplay; fullscreen; encrypted-media; picture-in-picture;" allowfullscreen="true"></iframe></div></P><P> </P><P><span class="lia-unicode-emoji" title=":television:">📺</span> Check out the Top Features Playlist: <SPAN><A href="https://www.youtube.com/playlist?list=PL3ZRUb1AKkpS8bwmAREXo78O2xccXzoiy" target="_blank" rel="noopener nofollow 
noreferrer">https://www.youtube.com/playlist?list=PL3ZRUb1AKkpS8bwmAREXo78O2xccXzoiy</A></SPAN></P><P> </P><H2 id="toc-hId--417960283">Webinar Series: The Next Era of Business Data</H2><P>SAP Business Data Cloud is part of a comprehensive strategy for enterprise data designed to address complex enterprise data management challenges. We provide an overview of SAP Business Data Cloud and guide you through its integral pillars and strategic vision.<BR /><SPAN><span class="lia-unicode-emoji" title=":backhand_index_pointing_right:">👉</span> Sign up now: </SPAN><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/sap-bdc-the-next-era-of-business-data/ba-p/14305870" target="_blank">SAP BDC: The Next Era of Business Data</A></SPAN></P><P><span class="lia-unicode-emoji" title=":small_blue_diamond:">🔹</span>Jan 29 | Discover SAP Business Data Cloud in 2026 - <A href="https://www.youtube.com/live/iRdyj4g80gk" target="_blank" rel="noopener nofollow noreferrer">recording</A></P><P><span class="lia-unicode-emoji" title=":small_blue_diamond:">🔹</span>Feb 05 | The Smart Path to SAP BW Modernization - <A href="https://www.youtube.com/live/S6H8fJBSuF8" target="_blank" rel="noopener nofollow noreferrer">recording</A></P><P><span class="lia-unicode-emoji" title=":small_blue_diamond:">🔹</span>Feb 19 | Empowering SAP Datasphere Users with SAP Business Data Cloud Innovations</P><P><span class="lia-unicode-emoji" title=":small_blue_diamond:">🔹</span>Feb 26 | From Data to Decisions: Analytics with SAP Analytics Cloud and SAP BDC</P><P><span class="lia-unicode-emoji" title=":small_blue_diamond:">🔹</span>Mar 12 | From Data Silos to Business Insight: The Power of BDC Connect</P><P><span class="lia-unicode-emoji" title=":small_blue_diamond:">🔹</span>Mar 19 | An Architect's Blueprint for SAP Business Data Cloud as the Foundation for Enterprise AI</P><P><span class="lia-unicode-emoji" title=":small_blue_diamond:">🔹</span>Mar 26 | Design. Govern. Deliver. Data Products - Data Product Studio</P><P><span class="lia-unicode-emoji" title=":small_blue_diamond:">🔹</span>Apr 16 | Breaking Down the SAP BDC Pricing Model</P><P><SPAN> </SPAN></P><P><SPAN> </SPAN></P><H2 id="toc-hId--614473788">My favorite blogs in January</H2><P><SPAN>What a great blog about seamless planning by <a href="https://community.sap.com/t5/user/viewprofilepage/user-id/14553">@Max_Gander</a> and <a href="https://community.sap.com/t5/user/viewprofilepage/user-id/187920">@marc_daniau</a>.<BR />The blog shows how to run SAP HANA APL time‑series forecasts in SAP Datasphere and feed the results into a seamless SAP Analytics Cloud (SAC) planning model as a live forecast version:<BR /></SPAN><span class="lia-unicode-emoji" title=":link:">🔗</span><SPAN> </SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/generating-and-integrating-automated-predictive-library-apl-forecasts-in-a/ba-p/14309857" target="_blank">Generating and Integrating Automated Predictive Library (APL) Forecasts in a Seamless Planning Model</A></P><P> </P><P><a href="https://community.sap.com/t5/user/viewprofilepage/user-id/1886">@Phil_from_Madrid</a> wrote a nice piece about implementing a "Shift-Left" architecture within the SAP business data fabric shifts responsibility for data quality to the source, aiming for data to originate as a governed, high-quality asset. 
While this approach accelerates time-to-insight through the creation of reusable data products, it necessitates development teams taking on new data-modeling tasks and managing increased technical debt upstream.<BR /><span class="lia-unicode-emoji" title=":link:">🔗</span><SPAN> </SPAN><A href="https://community.sap.com/t5/data-and-analytics-blog-posts/the-consequences-of-implementing-strictly-shift-left-architectures/ba-p/14313699" target="_blank">The consequences of implementing strictly Shift-left architectures</A></P><P> </P><P><SPAN>This blog from <a href="https://community.sap.com/t5/user/viewprofilepage/user-id/136079">@vitvas</a> details a method to execute ABAP code directly from an SAP Datasphere Data Flow without relying on the BW Bridge. This approach uses virtual elements in CDS views and a generic OData connection to trigger complex logic or external processes in the source system. <BR /></SPAN><span class="lia-unicode-emoji" title=":link:">🔗</span><SPAN> </SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-members/how-to-call-as-abap-from-sap-datasphere/ba-p/14313818" target="_blank">How to call AS ABAP from SAP Datasphere</A></P><P> </P><H2 id="toc-hId--810987293">More blogs about SAP BDC and Datasphere to check out <span class="lia-unicode-emoji" title=":backhand_index_pointing_down:">👇</span></H2><UL><LI><SPAN><A href="https://community.sap.com/t5/artificial-intelligence-learning-group-blog-posts/sap-business-ai-amp-sustainability-what/ba-p/14318237" target="_blank">SAP Business AI & Sustainability: WHAT</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/bdc-sac-ip-whitelist-and-access-control-in-sap-analytics-cloud-sac/ba-p/14318656" target="_blank">[BDC-SAC] IP Whitelist and Access Control in SAP Analytics Cloud (SAC)</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/bdc-sac-akamai-cdn-metadata-understanding-data-download-when-access-sac/ba-p/14318643" target="_blank">[BDC-SAC] Akamai CDN , Metadata Understanding Data Download When Access SAC</A></SPAN></LI><LI><A href="https://community.sap.com/t5/technology-blog-posts-by-members/how-to-call-as-abap-from-sap-datasphere/ba-p/14313818" target="_blank">How to call AS ABAP from SAP Datasphere</A></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/sneak-peek-in-to-sap-analytics-cloud-release-for-q1-2026/ba-p/14317657" target="_blank">Sneak Peek in to SAP Analytics Cloud release for Q1 2026</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/what-s-new-on-my-metrics-2026-q1-qrc-schedule-a-metrics-report/ba-p/14314522" target="_blank">What’s New on My Metrics 2026 Q1 QRC: Schedule a Metrics Report</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/crm-and-cx-blog-posts-by-sap/the-integration-of-sap-business-data-cloud-and-customer-data-platform/ba-p/14315287" target="_blank">The Integration of SAP Business Data Cloud and Customer Data Platform Multiplies the Value</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/sap-business-technology-platform-blog-posts/sap-datasphere-and-sac-modernization-a-commercial-move-with-rewiring/ba-p/14280547" target="_blank">SAP Datasphere and SAC Modernization: A Commercial Move with Rewiring</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/sap-business-technology-platform-blog-posts/the-future-of-sap-analytics-smart-modernization-journey-powered-by-sap-bdc/ba-p/14279002" target="_blank">The 
Future of SAP Analytics - Smart Modernization Journey Powered by SAP BDC</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/guiding-customers-from-sap-business-data-cloud-discovery-workshop-to-a/ba-p/14314260" target="_blank">Guiding Customers from SAP Business Data Cloud Discovery Workshop to a Clear Adoption Roadmap</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/essential-preparation-and-expert-tips-data-analyst-sap-analytics-cloud/ba-p/14312916" target="_blank">Essential preparation and expert tips: Data analyst - SAP Analytics Cloud certification</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/sap-rpt-1-why-is-it-essential-for-predicting-business-outcomes-in-today-s/ba-p/14314375" target="_blank">SAP RPT‑1: Why is it Essential for Predicting Business Outcomes in Today’s and Future Generative AI?</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/integrating-sap-business-data-cloud-data-products-with-sap-hana-central/ba-p/14312943" target="_blank">Integrating SAP Business Data Cloud data products with SAP HANA central</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/building-a-graph-native-ai-chatbot-on-sap-btp/ba-p/14313174" target="_blank">Building a Graph Native AI Chatbot on SAP BTP</A></SPAN></LI><LI><A href="https://community.sap.com/t5/data-and-analytics-blog-posts/the-consequences-of-implementing-strictly-shift-left-architectures/ba-p/14313699" target="_blank">The consequences of implementing strictly Shift-left architectures</A></LI><LI><SPAN><A href="https://community.sap.com/t5/sap-learning-blog-posts/unlock-faster-insights-modernizing-sap-bw-with-sap-business-data-cloud/ba-p/14309721" target="_blank">Unlock Faster Insights: Modernizing SAP BW with SAP Business Data Cloud</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-members/filling-the-gab-between-bw-and-datasphere-a-bit-implementing-a-few-standard/ba-p/14312749" target="_blank">Filling the gab between BW and Datasphere (a bit): Implementing a few standard variables</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/what-s-new-in-sap-btp-q4-2025-innobytes/ba-p/14311120" target="_blank">What's New in SAP BTP - Q4 2025 Innobytes</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/crm-and-cx-blog-posts-by-members/connecting-sap-sales-cloud-v2-to-a-standalone-sap-analytics-cloud-a/ba-p/14310439" target="_blank">Connecting SAP Sales Cloud V2 to a Standalone SAP Analytics Cloud - A Practical Alternative</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/sap-business-data-cloud-and-ai-telling-the-story-to-drive-adoption/ba-p/14312022" target="_blank">SAP Business Data Cloud and AI: Telling the story to drive adoption</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/introducing-sap-business-data-cloud-test-demo-and-development-service/ba-p/14310464" target="_blank">Introducing SAP Business Data Cloud test, demo and development service</A></SPAN></LI><LI><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/generating-and-integrating-automated-predictive-library-apl-forecasts-in-a/ba-p/14309857" target="_blank">Generating and Integrating Automated Predictive Library (APL) Forecasts in a Seamless Planning Model</A></LI><LI><SPAN><A 
href="https://community.sap.com/t5/technology-blog-posts-by-members/working-with-bw-authorizations-in-datasphere-on-enterprise-level/ba-p/14302361" target="_blank">Working with BW Authorizations in Datasphere on Enterprise Level</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/error-message-when-trying-to-access-quot-my-products-packages-quot-for-sap/ba-p/14305999" target="_blank">Error message when trying to access "My Products Packages" for SAP BDC</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/sap-bdc-the-next-era-of-business-data/ba-p/14305870" target="_blank">SAP BDC: The Next Era of Business Data</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/beyond-data-mesh-entering-the-data-product-economy-with-sap-business-data/ba-p/14304334" target="_blank">Beyond Data Mesh: Entering the Data Product Economy with SAP Business Data Cloud</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/learner-stories/sap-business-data-cloud-certification-c-bcbdc-the-ultimate-guide-to-passing/ba-p/14305009" target="_blank">SAP Business Data Cloud Certification (C_BCBDC): The Ultimate Guide to Passing SAP’s New AI Scenario</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/adopting-ux-best-practices-for-intelligent-applications-in-sap-business/ba-p/14305271" target="_blank">Adopting UX best practices for intelligent applications in SAP Business Data Cloud</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/leverage-system-job-live-model-to-display-acquired-model-last-refreshed/ba-p/14304570" target="_blank">Leverage System Job Live Model to display Acquired Model Last Refreshed Time in Story</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/developer-news/sap-developer-news-january-8th-2026/ba-p/14303979" target="_blank">SAP Developer News January 8th, 2026</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/sap-business-data-cloud-customer-adoption-teaming-up-for-success/ba-p/14303260" target="_blank">SAP Business Data Cloud customer adoption: Teaming up for success</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/selling-sap-business-data-cloud-sales-plays-bill-of-material-and-sizing/ba-p/14303218" target="_blank">Selling SAP Business Data Cloud: sales plays, bill of material and sizing</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/enterprise-resource-planning-blog-posts-by-sap/beyond-the-buzz-making-sap-business-ai-work-for-the-real-world/ba-p/14303013" target="_blank">Beyond the Buzz: Making SAP Business AI Work for the Real World</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/sap-business-data-cloud-dev-inspiration-blueprints-for-next-gen-intelligent/ba-p/14301769" target="_blank">SAP Business Data Cloud dev inspiration: blueprints for next-gen intelligent apps</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/communicating-the-value-of-sap-business-data-cloud-using-personas/ba-p/14301730" target="_blank">Communicating the value of SAP Business Data Cloud using personas</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/positioning-sap-business-data-cloud-with-sap-snowflake/ba-p/14300805" target="_blank">Positioning SAP Business Data Cloud with SAP
Snowflake</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/data-and-analytics-learning-group-blog-posts/how-to-access-your-sap-business-data-cloud-free-trial/ba-p/14296178" target="_blank">How to Access your SAP Business Data Cloud Free Trial</A></SPAN></LI><LI><SPAN><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/enhancing-sap-hana-database-development-with-generative-ai-support-for-hdi/ba-p/14294318" target="_blank">Enhancing SAP HANA Database Development with Generative AI Support for HDI Artifacts</A></SPAN></LI></UL><P> </P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kpsauer_0-1770052206768.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/368227i9C61D3925DDFBA9C/image-size/large/is-moderation-mode/true?v=v2&px=999" role="button" title="kpsauer_0-1770052206768.png" alt="kpsauer_0-1770052206768.png" /></span></P><P>Find more information and related blog posts on the topic pages for <A href="https://pages.community.sap.com/topics/business-data-cloud" target="_blank" rel="noopener noreferrer">SAP Business Data Cloud</A> and <SPAN><A href="https://pages.community.sap.com/topics/datasphere" target="_blank" rel="noopener noreferrer">SAP Datasphere</A></SPAN>. Also check out the documentation on SAP Help for <A href="https://help.sap.com/docs/business-data-cloud" target="_blank" rel="noopener noreferrer">SAP Business Data Cloud</A> and <A href="https://help.sap.com/docs/SAP_DATASPHERE" target="_blank" rel="noopener noreferrer">SAP Datasphere</A>, including the <A href="https://help.sap.com/docs/SUPPORT_CONTENT/datasphere/4181116697.html" target="_blank" rel="noopener noreferrer">troubleshooting and analysis guides</A>, <A href="https://help.sap.com/docs/SAP_BUSINESS_DATA_CLOUD/9b36d0ac59f24cbeb45617e36a7680fc?locale=en-US&state=PRODUCTION&version=SHIP" target="_blank" rel="noopener noreferrer">onboarding guides</A>, and more.</P>2026-02-05T23:27:35.276000+01:00https://community.sap.com/t5/technology-blog-posts-by-members/currency-conversion-in-sap-datasphere-using-graphical-views/ba-p/14322689Currency Conversion in SAP Datasphere Using Graphical Views2026-02-09T07:20:44.359000+01:00Lokesh_Kumar_Pothapolahttps://community.sap.com/t5/user/viewprofilepage/user-id/880828<P><STRONG>Introduction: Currency Conversion in SAP</STRONG></P><P>In SAP landscapes such as S/4HANA, SAP Datasphere, and SAP Analytics Cloud (SAC), handling multiple currencies is a fundamental requirement due to global operations and diverse organizational structures.
Business transactions are usually posted in the local currency of the country where the transaction takes place, but reporting and analysis often require additional currency perspectives.</P><P>SAP systems can support multiple organizational and reporting currencies, depending on system configuration and business requirements, such as:</P><P>• <SPAN>Company Code Currency - </SPAN>The legal reporting currency used for statutory and financial reporting.</P><P>• <SPAN>Group (Global) Currency - </SPAN>Used for consolidation and group-level reporting across multiple company codes.</P><P>• <SPAN>Profit Center Currency - </SPAN>Enables profitability analysis when a dedicated profit center currency is configured.</P><P>• <SPAN>Cost Center Currency - </SPAN>Supports internal cost tracking and management accounting scenarios.</P><P>• <SPAN>Transaction Currency - </SPAN>The currency in which the original business transaction is posted.</P><P>• <SPAN>Controlling Area Currency - </SPAN>Used for management accounting and controlling across company codes.</P><P>• <SPAN>Document Currency - </SPAN>The currency stored at the accounting document level for audit and traceability.</P><P>• <SPAN>Functional Currency - </SPAN>Represents the primary currency of business operations, commonly used in management reporting.</P><P>To ensure accurate and comparable reporting across entities and regions, these currencies must be converted using well-defined exchange rates.</P><HR /><P><SPAN><STRONG>Currency conversion in SAP Datasphere</STRONG> plays a key role by:</SPAN></P><UL><LI><SPAN>Centralizing currency conversion logic</SPAN></LI><LI><SPAN>Enabling consistent currency handling across models</SPAN></LI><LI><SPAN>Providing harmonized financial data for analytics and planning in SAC</SPAN></LI></UL><P>This makes SAP Datasphere the foundation for reliable multi-currency reporting in modern SAP analytics scenarios.</P><P>In SAP Datasphere, currency conversion can be implemented in <SPAN>Graphical Views, Analytical Models, and SQL Views</SPAN>, and can also be applied dynamically at runtime in SAP Analytics Cloud.</P><P>In this post, I will cover currency conversion in a graphical view with BW as the source system.</P><P><STRONG>Prerequisite:</STRONG></P><P>Depending on the source system (BW, S/4HANA, etc.), an ABAP connection must be configured from the source to the Datasphere system.</P><P><STRONG>Step 1: Import TCUR* tables, views, data flows and remote tables into Datasphere Tenant</STRONG></P><P>Go to Data Builder</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Lokesh_Kumar_Pothapola_0-1770356286322.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/369385i01E6335315C2524B/image-size/medium/is-moderation-mode/true?v=v2&px=400" role="button" title="Lokesh_Kumar_Pothapola_0-1770356286322.png" alt="Lokesh_Kumar_Pothapola_0-1770356286322.png" /></span></P><P>Select '+' symbol</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Lokesh_Kumar_Pothapola_1-1770356286326.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/369386i04C1E2A46F19431A/image-size/medium/is-moderation-mode/true?v=v2&px=400" role="button" title="Lokesh_Kumar_Pothapola_1-1770356286326.png"
alt="Lokesh_Kumar_Pothapola_1-1770356286326.png" /></span></P><P>Select 'Currency Conversion Views'</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Lokesh_Kumar_Pothapola_2-1770356286326.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/369387i3232BC4E296F8250/image-size/medium/is-moderation-mode/true?v=v2&px=400" role="button" title="Lokesh_Kumar_Pothapola_2-1770356286326.png" alt="Lokesh_Kumar_Pothapola_2-1770356286326.png" /></span></P><P><SPAN>Note: Depending on the source type selected in the Source drop-down list, different artifacts will be created in Datasphere.</SPAN></P><P><EM>Manual - Local tables and views</EM></P><P><EM>SAP S/4 HANA Cloud / On Premise - Local tables, views and data flows will be imported<BR />SAP ABAP connection - Local tables, views, data flows and remote tables will be imported.</EM></P><P>Select Connection as ABAP</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Lokesh_Kumar_Pothapola_3-1770356286327.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/369388i26C3A7F0051E57AE/image-size/medium/is-moderation-mode/true?v=v2&px=400" role="button" title="Lokesh_Kumar_Pothapola_3-1770356286327.png" alt="Lokesh_Kumar_Pothapola_3-1770356286327.png" /></span></P><P>By selecting ABAP as a source, local tables, views, data flows and remote tables will be imported.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Lokesh_Kumar_Pothapola_4-1770356286328.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/369389iD36B3CDECE37F442/image-size/medium?v=v2&px=400" role="button" title="Lokesh_Kumar_Pothapola_4-1770356286328.png" alt="Lokesh_Kumar_Pothapola_4-1770356286328.png" /></span></P><P>Click on 'Create' - all the objects will be imported into the Datasphere tenant; organize the objects in a folder to use them globally across different spaces.</P><P><EM>Note: The default naming convention is as follows; the Business Name can be changed.</EM></P><P><EM>Remote tables - SAP.CURRENCY.RTABLE.TCUR*</EM></P><P><EM>Data Flows - SAP.CURRENCY.DATAFLOW.TCUR*</EM></P><P><EM>Tables - SAP.CURRENCY.TABLE.TCUR*</EM></P><P><EM>Views - SAP.CURRENCY.VIEW.TCUR*</EM></P><P><EM>Once you’ve created the currency conversion views and their supporting objects, you can share the TCUR* tables with other spaces.
The spaces to which you share these tables will then see your space as an alternative source in the Source field above.</EM></P><P><STRONG>Step 2: Make TCUR* data available from BW to SAP Datasphere</STRONG></P><P>Since the connection type is ABAP with BW, the existing source tables in the BW system must be mapped to their corresponding remote tables in SAP Datasphere.</P><P>Select the remote tables which were imported automatically into the system.</P><P>Example: TCURR is configured below; the same steps must be repeated for all imported TCUR* tables.</P><P>Go to 'Data Builder'</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Lokesh_Kumar_Pothapola_5-1770356630740.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/369390iA04ABB1992D5ED8B/image-size/medium/is-moderation-mode/true?v=v2&px=400" role="button" title="Lokesh_Kumar_Pothapola_5-1770356630740.png" alt="Lokesh_Kumar_Pothapola_5-1770356630740.png" /></span></P><P><SPAN>Search for the '</SPAN><SPAN>SAP.CURRENCY.RTABLE.</SPAN><SPAN>TCURR' table</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Lokesh_Kumar_Pothapola_6-1770356630741.png" style="width: 900px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/369392i445432DF77AB621D/image-dimensions/900x209?v=v2" width="900" height="209" role="button" title="Lokesh_Kumar_Pothapola_6-1770356630741.png" alt="Lokesh_Kumar_Pothapola_6-1770356630741.png" /></span></P><P>Open the table and set up the 'Remote' connection details (BW_ABAP: the configured ABAP connection from BW to Datasphere)</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Lokesh_Kumar_Pothapola_7-1770356630743.png" style="width: 554px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/369391iD63BF9D3F467B4D1/image-dimensions/554x241?v=v2" width="554" height="241" role="button" title="Lokesh_Kumar_Pothapola_7-1770356630743.png" alt="Lokesh_Kumar_Pothapola_7-1770356630743.png" /></span></P><P><EM>Note: There are different methods to replicate data into a remote table:</EM></P><P><EM>Start Data Replication: Fetches data from the source BW ABAP table (TCURR) into the corresponding remote table in SAP Datasphere and stages it. This step can be included in a task chain. (STAGED)</EM></P><P><EM>Enable Real-Time Replication: Reads data virtually from BW into SAP Datasphere on demand, without staging.
(VIRTUAL)</EM></P><P>Once this step is completed, go to 'Data Builder' -> search for data flow 'SAP.CURRENCY.DATAFLOW.TCURR' -> Select 'Run'</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Lokesh_Kumar_Pothapola_8-1770356630744.png" style="width: 576px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/369394iA1ABC2CF0C96CE41/image-dimensions/576x192?v=v2" width="576" height="192" role="button" title="Lokesh_Kumar_Pothapola_8-1770356630744.png" alt="Lokesh_Kumar_Pothapola_8-1770356630744.png" /></span></P><P><EM>Note: The data flow can be scheduled to run at regular intervals or included in a task chain for daily execution.</EM></P><P><SPAN>Once the data flow is completed (which can be monitored using the icon below), the data can be viewed in the currency view </SPAN><SPAN>SAP.CURRENCY.VIEW.TCURR</SPAN><SPAN>, a default SQL-based view provided by SAP.</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Lokesh_Kumar_Pothapola_9-1770356630744.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/369393i96CB575A23E4AE2A/image-size/medium?v=v2&px=400" role="button" title="Lokesh_Kumar_Pothapola_9-1770356630744.png" alt="Lokesh_Kumar_Pothapola_9-1770356630744.png" /></span></P><P><SPAN>SAP.CURRENCY.VIEW.TCURR (provided by SAP)</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Lokesh_Kumar_Pothapola_10-1770356630745.png" style="width: 772px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/369395iEE4594710996CCD8/image-dimensions/772x248?v=v2" width="772" height="248" role="button" title="Lokesh_Kumar_Pothapola_10-1770356630745.png" alt="Lokesh_Kumar_Pothapola_10-1770356630745.png" /></span></P><P> </P><P><STRONG>Step 3: Currency conversion in Datasphere views.</STRONG></P><P>Go to 'Data Builder' (select a space if applicable)</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Lokesh_Kumar_Pothapola_12-1770357055415.png" style="width: 332px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/369399i0ADE3D5B3F47450B/image-dimensions/332x68/is-moderation-mode/true?v=v2" width="332" height="68" role="button" title="Lokesh_Kumar_Pothapola_12-1770357055415.png" alt="Lokesh_Kumar_Pothapola_12-1770357055415.png" /></span></P><P>Select 'New Graphical View', or open an existing view which requires currency conversion.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Lokesh_Kumar_Pothapola_13-1770357055416.png" style="width: 749px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/369401iD635775C8CF1C248/image-dimensions/749x359/is-moderation-mode/true?v=v2" width="749" height="359" role="button" title="Lokesh_Kumar_Pothapola_13-1770357055416.png" alt="Lokesh_Kumar_Pothapola_13-1770357055416.png" /></span></P><P>There are three relevant columns:</P><P>'Sale_Amount_CompCode' - the measure containing the amount to be converted</P><P><SPAN> 'Company_Code_Currency' - holds the source currency</SPAN></P><P> 'Profit_Center_Currency' - holds the target currency</P><P>Go to View -> Select the node where the currency conversion needs to be applied -> Add 'fx' step</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Lokesh_Kumar_Pothapola_14-1770357055418.png" style="width: 400px;"><img
src="https://community.sap.com/t5/image/serverpage/image-id/369400iD7B759EAF3D26443/image-size/medium/is-moderation-mode/true?v=v2&px=400" role="button" title="Lokesh_Kumar_Pothapola_14-1770357055418.png" alt="Lokesh_Kumar_Pothapola_14-1770357055418.png" /></span></P><P>Select 'fx' step -> Details -> select '+' -> select 'Currency Conversion Column'</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Lokesh_Kumar_Pothapola_15-1770357055418.png" style="width: 799px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/369404i163C8342A5EE42A0/image-dimensions/799x262?v=v2" width="799" height="262" role="button" title="Lokesh_Kumar_Pothapola_15-1770357055418.png" alt="Lokesh_Kumar_Pothapola_15-1770357055418.png" /></span></P><P><STRONG>Set the Element Properties</STRONG></P><P>Provide the Business Name, Technical Name, Data Type, and Decimals settings</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Lokesh_Kumar_Pothapola_16-1770357055420.png" style="width: 493px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/369403i0E28C72C407038D8/image-dimensions/493x537?v=v2" width="493" height="537" role="button" title="Lokesh_Kumar_Pothapola_16-1770357055420.png" alt="Lokesh_Kumar_Pothapola_16-1770357055420.png" /></span></P><P><STRONG>Currency Properties</STRONG></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Lokesh_Kumar_Pothapola_17-1770357055421.png" style="width: 536px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/369402i5421D094C55AAF71/image-dimensions/536x438?v=v2" width="536" height="438" role="button" title="Lokesh_Kumar_Pothapola_17-1770357055421.png" alt="Lokesh_Kumar_Pothapola_17-1770357055421.png" /></span></P><P>Source Amount Column - The column which requires conversion, in this case 'Sale_Amount_CompCode'</P><P>Source Currency - The column holding the source currency, in this case 'Company_Code_Currency'</P><P>Target Currency - The column holding the target currency, in this case 'Profit_Center_Currency'</P><P>Reference Date - The year/date used as a reference during conversion.</P><P>Steps: Several options are available; this is one of the key elements.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Lokesh_Kumar_Pothapola_18-1770357055421.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/369405i7745ED12170F2F49/image-size/medium?v=v2&px=400" role="button" title="Lokesh_Kumar_Pothapola_18-1770357055421.png" alt="Lokesh_Kumar_Pothapola_18-1770357055421.png" /></span></P><P> </P><TABLE><TBODY><TR><TD width="80.8333px" height="50px"><P>Shift</P></TD><TD width="637.904px" height="50px"><P>The amount is multiplied to remove decimals. Example 123.45 USD → 12345</P></TD></TR><TR><TD width="80.8333px" height="50px"><P>Convert</P></TD><TD width="637.904px" height="50px"><P>The shifted amount is converted using the exchange rate.
Example 12345 × exchange rate</P></TD></TR><TR><TD width="80.8333px" height="50px"><P>Round</P></TD><TD width="637.904px" height="50px"><P>The converted value is rounded based on the <STRONG>target currency’s decimal rules</STRONG>.</P></TD></TR><TR><TD width="80.8333px" height="50px"><P>Shift Back</P></TD><TD width="637.904px" height="50px"><P>The value is divided to restore the correct decimal places.</P></TD></TR></TBODY></TABLE><P><EM>Note: For a simple currency conversion, it is recommended to select only 'Convert' and 'Round'.</EM></P>
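<P><EM>To make these four steps concrete, here is a small worked example (illustrative values only - assuming an exchange rate of 0.90 and a target currency with two decimals):</EM></P><PRE><CODE>Input:      123.45 (source currency, 2 decimals)
Shift:      123.45 * 100  = 12345
Convert:    12345  * 0.90 = 11110.5   (exchange rate 0.90)
Round:      11110.5       -> 11111    (target currency decimal rules)
Shift Back: 11111  / 100  = 111.11    (converted amount)</CODE></PRE>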
<P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Lokesh_Kumar_Pothapola_19-1770357055422.png" style="width: 486px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/369407iB4B8FE478B25B61D/image-dimensions/486x429/is-moderation-mode/true?v=v2" width="486" height="429" role="button" title="Lokesh_Kumar_Pothapola_19-1770357055422.png" alt="Lokesh_Kumar_Pothapola_19-1770357055422.png" /></span></P><P><STRONG>Conversion properties</STRONG></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Lokesh_Kumar_Pothapola_20-1770357055422.png" style="width: 446px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/369406iD0439002ABA9F39D/image-dimensions/446x279?v=v2" width="446" height="279" role="button" title="Lokesh_Kumar_Pothapola_20-1770357055422.png" alt="Lokesh_Kumar_Pothapola_20-1770357055422.png" /></span></P><P>Client - Client 000 is typically used for currency tables in SAP systems</P><P>Conversion Type - Exchange rate type maintained in TCURR (for example, M – Standard rate, P – Budget rate)</P><P>Error Handling - Defines the value to be generated if no valid exchange rate is found.</P><P><STRONG>Advanced Properties</STRONG></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Lokesh_Kumar_Pothapola_21-1770357055423.png" style="width: 448px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/369409i24CE61A822C0C156/image-dimensions/448x634/is-moderation-mode/true?v=v2" width="448" height="634" role="button" title="Lokesh_Kumar_Pothapola_21-1770357055423.png" alt="Lokesh_Kumar_Pothapola_21-1770357055423.png" /></span></P><P>Lookup: Regular and Reverse (to be selected if a reverse exchange rate is maintained in the system)</P><P>Accuracy: Compatibility (matches classic SAP (ECC/BW) currency results) and Highest (provides maximum precision for accurate financial reporting)</P><P>Date Format: Auto Detect (automatically picks the correct date format from source data and applies the right exchange rate)</P><P>All remaining views relevant to TCUR* will be auto-populated by default, as configured at the beginning.</P><P><STRONG>Activate and Deploy the view</STRONG></P><P><STRONG>Result Output:</STRONG></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Lokesh_Kumar_Pothapola_22-1770357055424.png" style="width: 972px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/369408i2163FB282D4485DD/image-dimensions/972x173/is-moderation-mode/true?v=v2" width="972" height="173" role="button" title="Lokesh_Kumar_Pothapola_22-1770357055424.png" alt="Lokesh_Kumar_Pothapola_22-1770357055424.png" /></span></P><P><SPAN>The exchange rate from AUD to GBP is maintained as </SPAN><STRONG>1.75 in the BW system</STRONG></P><P>Data From BW System:</P><DIV><TABLE border="1" cellspacing="0" cellpadding="0"><TBODY><TR><TD><P>Client</P></TD><TD><P>Exchange Type</P></TD><TD><P>From</P></TD><TD><P>To</P></TD><TD><P>Valid From</P></TD><TD><P>Exchange Rate</P></TD></TR><TR><TD><P>600</P></TD><TD><P>P</P></TD><TD><P>AUD</P></TD><TD><P>GBP</P></TD><TD><P>01.01.2025</P></TD><TD><P>/1.75</P></TD></TR></TBODY></TABLE></DIV><P><STRONG>Values are converted using the applicable exchange rate and published in a new column called </STRONG><STRONG>“Profit Center Amount.”</STRONG></P><P><SPAN>Conclusion: </SPAN><SPAN>By centralizing currency conversion logic in SAP Datasphere, organizations can ensure consistent, auditable, and scalable multi-currency reporting across analytical and planning scenarios. This approach simplifies downstream consumption in SAP Analytics Cloud while maintaining alignment with SAP’s standard currency configuration.</SPAN></P><P> </P><P>Thank you for taking the time to read this article. I hope this overview helps you understand and implement currency conversion in SAP Datasphere more effectively. Your feedback, questions, and suggestions are welcome in the comments section.</P><P> </P><P>Regards</P><P>Lokesh Kumar Pothapola</P><P> </P>2026-02-09T07:20:44.359000+01:00https://community.sap.com/t5/technology-blog-posts-by-sap/triggering-sap-datasphere-bdc-task-chains-from-sap-btp-abap-cloud-using-the/ba-p/14325528Triggering SAP Datasphere/BDC Task Chains from SAP BTP ABAP Cloud using the Task Chain REST API2026-02-10T16:44:24.061000+01:00stefan_geiselhart2https://community.sap.com/t5/user/viewprofilepage/user-id/200897<P>Our modern data landscapes thrive on automation. In this post, we’ll walk end-to-end through a real-life integration scenario that tackles one of the building blocks in this context:</P><P class="lia-align-center" style="text-align: center;"><STRONG>Starting a SAP Datasphere Task Chain remotely from SAP BTP ABAP Cloud using OAuth 2.0 Client Credentials and the Datasphere Task Chain REST API</STRONG></P><P>The goal is to orchestrate Datasphere data processing flows directly from ABAP Cloud — securely, programmatically, and fully automated.</P><P>We will cover the following points:</P><UL><LI>Creating a Task Chain in Datasphere</LI><LI>Configuring OAuth clients in Datasphere</LI><LI>Setting up Communication System & Arrangement in ABAP Cloud</LI><LI>Executing the Task Chain remotely: Consume REST API from ABAP Cloud</LI></UL><P><FONT size="5"><STRONG>1 Architecture Overview</STRONG></FONT></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Solution Architecture" style="width: 550px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371166iB8C9DA6C7F6550B3/image-dimensions/550x406?v=v2" width="550" height="406" role="button" title="image.png" alt="Solution Architecture" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Solution Architecture</span></span></P><P>ABAP Cloud Task Chain Execution: ABAP Cloud acts as the caller.
Datasphere exposes the Task Chain API protected by OAuth.</P><P><FONT size="5"><STRONG>2 Prerequisites</STRONG></FONT></P><P>You must have:</P><UL><LI>SAP Datasphere tenant with developer + admin access</LI><LI>A Task Chain to trigger</LI><LI>SAP BTP ABAP Environment (ABAP Cloud) with developer access</LI><LI>BTP ABAP: Authorization to configure OAuth clients and Communication Arrangements</LI></UL><P><FONT size="5"><STRONG>3 Creating a Task Chain in SAP Datasphere</STRONG></FONT></P><P>The first step is defining a Task Chain that encapsulates the processing logic you want to execute remotely.</P><P>In the given example:</P><UL><LI>The Task Chain starts with a <STRONG>Begin</STRONG> node</LI><LI>Executes a <STRONG>Replication Flow</STRONG> for loading dimension data</LI><LI>(Can be extended with transformations, views, or further flows)</LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="SAP Datasphere Task Chain with Replication Flow inside" style="width: 592px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371084i85FF39979A8A1F3B/image-dimensions/592x320?v=v2" width="592" height="320" role="button" title="image.png" alt="SAP Datasphere Task Chain with Replication Flow inside" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">SAP Datasphere Task Chain with Replication Flow inside</span></span></P><P><EM>Main takeaways:</EM></P><UL><LI>Task Chain must be <EM>Deployed</EM></LI><LI>Note the <EM>Technical Name</EM> (used later by API)</LI><LI>Verify it runs manually before you automate via the API<STRONG> </STRONG></LI></UL><P><FONT size="5"><STRONG>4 Configuring OAuth Client in SAP Datasphere</STRONG></FONT></P><P>To allow ABAP Cloud to call Datasphere APIs securely, we create an OAuth Client in Datasphere Administration.</P><P><EM>Navigation Path</EM></P><P>Datasphere → Administration → App Integration → OAuth Clients</P><P><EM>OAuth Endpoints Provided</EM></P><P>Amongst others, SAP Datasphere automatically exposes:</P><UL><LI>Authorization URL</LI><LI>Token URL</LI></UL><P>These are required for OAuth2 token retrieval. Moreover you must create a technical user that has access to the Datasphere space and is able to run the task chains that are in scope. Typically this is done by assigning a scoped role.</P><P><EM>Configured OAuth Client (based on Technical User)</EM></P><P><STRONG><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Menu Path Datasphere" style="width: 172px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371014i26654901577A6E67/image-size/large?v=v2&px=999" role="button" title="image.png" alt="Menu Path Datasphere" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Menu Path Datasphere</span></span></STRONG></P><P>This is where you see the OAuth Client specific URLs - take note of them for later configuration in ABAP BTP:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="OAuth Client Generation" style="width: 764px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371054i4BB7492B30B71B30/image-dimensions/764x458?v=v2" width="764" height="458" role="button" title="image.png" alt="OAuth Client Generation" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">OAuth Client Generation</span></span></P><P>Your specific OAuth client for consumption from ABAP BTP. 
The secret key is only shown once upon creation - keep this in mind!</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="OAuth Client Details" style="width: 404px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371055i19C15B5494614179/image-dimensions/404x580?v=v2" width="404" height="580" role="button" title="image.png" alt="OAuth Client Details" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">OAuth Client Details</span></span></P><P><EM>Important settings:</EM></P><TABLE><TBODY><TR><TD><P><STRONG>Parameter</STRONG></P></TD><TD><P><STRONG>Value</STRONG></P></TD></TR><TR><TD><P>Grant Type</P></TD><TD><P>Client Credentials</P></TD></TR><TR><TD><P>Purpose</P></TD><TD><P>Technical User</P></TD></TR><TR><TD><P>Roles</P></TD><TD><P>space access must be given (e.g. by scope role in DSP)</P></TD></TR><TR><TD><P>Token Lifetime</P></TD><TD><P>e.g. 60 minutes</P></TD></TR></TBODY></TABLE><P>Save the <EM>Client ID</EM> and <EM>Client Secret </EM>— this is used in ABAP Cloud later on.</P><P><FONT size="5"><STRONG> 5 </STRONG><STRONG>Creating Communication Artifacts in ABAP Cloud</STRONG></FONT></P><P>Now we configure the outbound technical connection from ABAP Cloud to Datasphere.</P><P>First create an Outbound Service of type HTTP:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Create Outbound Service" style="width: 498px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371664i6B7CF4506456A6F6/image-dimensions/498x184?v=v2" width="498" height="184" role="button" title="image.png" alt="Create Outbound Service" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Create Outbound Service</span></span></P><P>Create a Communication Scenario with Authentication Methods = OAuth 2.0 only, bind it to your outbound service and publish it:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Communication Scenario Creation" style="width: 618px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371665iF1366B36E6FF0402/image-dimensions/618x251?v=v2" width="618" height="251" role="button" title="image.png" alt="Communication Scenario Creation" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Communication Scenario Creation</span></span></P><P><EM>Navigation: </EM>ABAP Environment Web Access → Communication Management → Communication Systems</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ABAP Cloud Web Access" style="width: 671px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371057i5B672EC77D24D22E/image-dimensions/671x269?v=v2" width="671" height="269" role="button" title="image.png" alt="ABAP Cloud Web Access" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">ABAP Cloud Web Access</span></span></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ABAP Cloud Communication Systems" style="width: 672px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371058i1807DF17EA76120F/image-dimensions/672x113?v=v2" width="672" height="113" role="button" title="image.png" alt="ABAP Cloud Communication Systems" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">ABAP Cloud Communication Systems</span></span></P><P>The hostname is your Datasphere tenant URL. 
Outbound OAuth URLs (like Token/Authentication) can be found on the Datasphere side under Administration -> App Integration (previous step).</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ABAP Cloud Communication Systems Details" style="width: 673px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371059iF14B49A880B877F1/image-dimensions/673x497?v=v2" width="673" height="497" role="button" title="image.png" alt="ABAP Cloud Communication Systems Details" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">ABAP Cloud Communication Systems Details</span></span></P><P>You have to add a user "OAuth 2.0 (Basic)":</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ABAP Cloud Communication Systems Users" style="width: 672px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371061i0EC89B23ACC3A7F5/image-dimensions/672x269?v=v2" width="672" height="269" role="button" title="image.png" alt="ABAP Cloud Communication Systems Users" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">ABAP Cloud Communication Systems Users</span></span></P><P><EM>Key fields:</EM></P><TABLE><TBODY><TR><TD><P><EM>Field</EM></P></TD><TD><P><EM>Example</EM></P></TD></TR><TR><TD><P>System ID</P></TD><TD><P>DSP_REST</P></TD></TR><TR><TD><P>Host Name</P></TD><TD><P><your-datasphere-host> (tenant URL)</P></TD></TR><TR><TD><P>Port</P></TD><TD><P>443</P></TD></TR><TR><TD><P>OAuth 2.0 Endpoints</P></TD><TD><P>From Datasphere for Token & Authorization URL (Audience URL not required)</P></TD></TR><TR><TD><P>Client ID</P></TD><TD><P>Datasphere OAuth Client ID</P></TD></TR><TR><TD><P>Client Secret</P></TD><TD><P>Datasphere Secret (you must take note of this when creating the OAuth Client)</P></TD></TR></TBODY></TABLE><P><FONT size="5"><STRONG>6 Creating a Communication Arrangement in SAP ABAP Cloud</STRONG></FONT></P><P>Now we bind the Communication System to a Communication Scenario that allows REST calls.</P><P><EM>Navigation: </EM>Communication Management → Communication Arrangements</P><P><EM>Arrangement Example: ZDSP_CS_REST</EM></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ABAP Cloud Communication Arrangements" style="width: 675px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371062i5D6E57F823E5F6EB/image-dimensions/675x138?v=v2" width="675" height="138" role="button" title="image.png" alt="ABAP Cloud Communication Arrangements" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">ABAP Cloud Communication Arrangements</span></span></P><P>Note the service path: The dynamic parts of the path are set from ABAP (this is to keep the space + task chain flexible and controllable from the actual caller: ABAP).</P><P><EM><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ABAP Cloud Communication Arrangements Detailed Configuration" style="width: 673px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371063i82A3F766212E7AA8/image-dimensions/673x334?v=v2" width="673" height="334" role="button" title="image.png" alt="ABAP Cloud Communication Arrangements Detailed Configuration" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">ABAP Cloud Communication Arrangements Detailed Configuration</span></span></EM></P><P><EM>What this provides:</EM></P><UL><LI>Preconfigured service URL</LI><LI>OAuth token handling by ABAP 
runtime</LI><LI>Secure outbound API consumption</LI></UL><P><FONT size="5"><STRONG>7 Calling the Datasphere Task Chain API from ABAP Cloud</STRONG></FONT></P><P>Now comes the fun part: consuming the REST API. Datasphere exposes endpoints such as:</P><UL><LI>POST /api/v1/datasphere/tasks/chains/<space_id>/run/<task_chain_technical_name></LI><LI>GET /api/v1/datasphere/tasks/logs/<space_id>/<log_id></LI></UL><P>These trigger a new execution and read the status of a task chain run, respectively.</P>
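<P><EM>To make the call sequence concrete, here is a sketch of the two requests together with the fields the ABAP snippet below relies on (values are illustrative; the exact response payload may vary by tenant and API version):</EM></P><PRE><CODE>POST /api/v1/datasphere/tasks/chains/BDCPOC/run/TC_DIM2_A001
Authorization: Bearer <access_token>     (obtained via OAuth 2.0 Client Credentials)

Response: { "logId": 4711 }              (4711 is an illustrative LogId)

GET /api/v1/datasphere/tasks/logs/BDCPOC/4711
Authorization: Bearer <access_token>

Response: { "status": "RUNNING" }        (later SUCCESS / FAILED)</CODE></PRE>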
<P><EM>ABAP Cloud Implementation Pattern</EM></P><P>In ABAP Cloud, when working with REST APIs to integrate external services, you typically:</P><OL><LI>Create HTTP destination automatically from the created Communication Arrangement</LI><LI>Use CL_WEB_HTTP_CLIENT_MANAGER</LI><LI>Perform POST/GET requests</LI><LI>Parse response (run ID, status)</LI></OL><P>The code snippet performs the following parts in more detail:</P><OL><LI><EM>Run as a class-run program</EM> (IF_OO_ADT_CLASSRUN) -> you can execute directly from ADT and print to the console.</LI><LI><EM>Locate the configured Communication Arrangement</EM> (bound to our Communication System DSP_REST) using CL_COM_ARRANGEMENT_FACTORY.</LI><LI><EM>Create an HTTP destination</EM> from the Communication Arrangement using CL_HTTP_DESTINATION_PROVIDER=>CREATE_BY_COMM_ARRANGEMENT.</LI><LI><EM>Create an HTTP client</EM> from the destination (CL_WEB_HTTP_CLIENT_MANAGER).</LI><LI><EM>POST</EM> request to the Task Chain “run” endpoint:</LI><UL><LI>triggers a new run</LI><LI>receives a JSON response containing a <EM>LogId</EM></LI></UL><LI><EM>WAIT</EM> a few seconds.</LI><LI><EM>GET</EM> the log endpoint using the returned <EM>LogId</EM>:</LI><UL><LI>retrieves the execution status (RUNNING, later SUCCESS / FAILED, etc.)</LI></UL><LI>Print everything in the <EM>ABAP Console</EM>.</LI></OL><P>ABAP-Code Snippet:</P><pre class="lia-code-sample language-abap"><code>CLASS zcl_dsp_rest_api DEFINITION PUBLIC FINAL CREATE PUBLIC.
  PUBLIC SECTION.
    INTERFACES if_oo_adt_classrun.
    CLASS-DATA:
      out TYPE REF TO if_oo_adt_classrun_out.
    CLASS-METHODS:
      call_dsp
        RAISING
          cx_http_dest_provider_error
          cx_web_http_client_error.
ENDCLASS.

CLASS zcl_dsp_rest_api IMPLEMENTATION.

  METHOD if_oo_adt_classrun~main.
    " Keep the console handle so call_dsp( ) can write its output
    zcl_dsp_rest_api=>out = out.
    TRY.
        call_dsp( ).
      CATCH cx_web_http_client_error
            cx_http_dest_provider_error INTO DATA(exception).
        out->write( exception->get_text( ) ).
    ENDTRY.
  ENDMETHOD.

  METHOD call_dsp.
    " Target structure for the JSON response of the "run" endpoint
    TYPES: BEGIN OF ty_log,
             logId TYPE i,
           END OF ty_log.
    DATA: ls_log TYPE ty_log.

    DATA(communication_system) = 'DSP_REST'.

    " Find the Communication Arrangement bound to Communication System DSP_REST
    DATA(arrangement_factory) = cl_com_arrangement_factory=>create_instance( ).
    DATA(comm_arrangement_range) = VALUE if_com_arrangement_factory=>ty_query-cs_id_range(
        ( sign = 'I' option = 'EQ' low = communication_system ) ).
    arrangement_factory->query_ca(
      EXPORTING
        is_query           = VALUE #( cs_id_range = comm_arrangement_range )
      IMPORTING
        et_com_arrangement = DATA(arrangements) ).
    DATA(arrangement) = arrangements[ 1 ].

    " Derive the HTTP destination from the arrangement - OAuth 2.0 token
    " handling is done by the ABAP runtime
    DATA(destination) = cl_http_destination_provider=>create_by_comm_arrangement(
        comm_scenario  = 'ZDSP_CS_REST'
        service_id     = 'ZDSP_REST_SRV_REST'
        comm_system_id = arrangement->get_comm_system_id( ) ).
    DATA(http_client) = cl_web_http_client_manager=>create_by_http_destination( destination ).

    " POST .../chains/<space_id>/run/<task_chain> starts a new Task Chain run
    DATA(request) = http_client->get_http_request( ).
    request->set_uri_path( '/chains/BDCPOC/run/TC_DIM2_A001' ).
    DATA(response) = http_client->execute( if_web_http_client=>post ).

    " The response body contains the LogId of the newly started run
    /ui2/cl_json=>deserialize(
      EXPORTING
        json = response->get_text( )
      CHANGING
        data = ls_log ).
    DATA(log_str) = CONV string( ls_log-logId ).
    out->write( 'Task Chain started. LogId created:' && log_str ).

    " Give the run some time, then poll its status via the log endpoint
    WAIT UP TO 10 SECONDS.
    DATA(uri) = '/logs/BDCPOC/' && log_str.
    CLEAR: request, response, http_client.
    http_client = cl_web_http_client_manager=>create_by_http_destination( destination ).
    request = http_client->get_http_request( ).
    request->set_uri_path( uri ).
    response = http_client->execute( if_web_http_client=>get ).
    out->write( 'Task Chain Log REST API - via GET - Task Chain Status:' && response->get_text( ) ).
  ENDMETHOD.
ENDCLASS.</code></pre><P>Once triggered:</P><UL><LI>Datasphere creates a new Task Chain run</LI><LI>Execution can be monitored in Datasphere UI</LI><LI>API returns run ID and status</LI></UL><P>Follow-up calls can fetch:</P><UL><LI>Execution status</LI><LI>Logs</LI><LI>Completion result</LI></UL><P><FONT size="5"><STRONG>8 Execution Results</STRONG></FONT></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ABAP Cloud Console Log: Results" style="width: 964px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371044iA43931A0457AB770/image-size/large?v=v2&px=999" role="button" title="image.png" alt="ABAP Cloud Console Log: Results" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">ABAP Cloud Console Log: Results</span></span></P><P>The Task Chain in Datasphere is in Running status:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="SAP Datasphere Task Chain Running Status" style="width: 871px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371046i89C5349410537884/image-size/large?v=v2&px=999" role="button" title="image.png" alt="SAP Datasphere Task Chain Running Status" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">SAP Datasphere Task Chain Running Status</span></span></P><P><FONT size="5"><STRONG>9 Conclusion/Outlook & Further aspects:</STRONG></FONT></P><P><STRONG>Overview of the major building blocks:</STRONG></P><TABLE><TBODY><TR><TD><P><STRONG>Capability</STRONG></P></TD><TD><P><STRONG>Result</STRONG></P></TD></TR><TR><TD><P>Secure Auth</P></TD><TD><P>OAuth 2.0 Client Credentials</P></TD></TR><TR><TD><P>No hardcoded secrets</P></TD><TD><P>via Communication Arrangements/System</P></TD></TR><TR><TD><P>Remote orchestration</P></TD><TD><P>Creating an end-to-end orchestrated chain of tasks</P></TD></TR><TR><TD><P>Cloud-native ABAP</P></TD><TD><P>Using appropriate ABAP Class APIs</P></TD></TR><TR><TD><P>Datasphere automation</P></TD><TD><P>Using Task Chain API</P></TD></TR></TBODY></TABLE><P><STRONG>Use Cases that the implementation enables:</STRONG></P><UL><LI>Nightly batch orchestration</LI><LI>Event-driven data loads</LI><LI>Cross-system workflows (this was our ultimate focus)</LI><LI>CI/CD data pipelines</LI><LI>API-driven analytical data refresh</LI></UL><P><STRONG>References</STRONG></P><UL><LI><SPAN>SAP Datasphere Task Chain API Documentation: </SPAN><A href="https://api.sap.com/api/DatasphereTasks/overview" target="_blank" rel="noopener noreferrer"><SPAN>https://api.sap.com/api/DatasphereTasks/overview</SPAN></A></LI><LI><SPAN>SAP Datasphere Task Chain Help Page: </SPAN><A href="https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/274f2736465c4c48a091c675880502a2.html?locale=en-US" target="_blank" rel="noopener noreferrer"><SPAN>https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/274f2736465c4c48a091c675880502a2.html?locale=en-US</SPAN></A></LI><LI><SPAN>ABAP Cloud - how to call external APIs: </SPAN><A href="https://jacekw.dev/blog/2022/oauth-client-credentials-from-abap-cloud/" target="_blank" rel="noopener nofollow noreferrer"><SPAN>https://jacekw.dev/blog/2022/oauth-client-credentials-from-abap-cloud/</SPAN></A></LI></UL><P><STRONG>These are your main takeaways to further <U>sharpen</U> the solution:</STRONG></P><UL><LI>Use short token lifetimes</LI><LI>Separate technical OAuth users</LI><LI>Monitor task runs via API from ABAP</LI><LI>Handle retries & failures in ABAP</LI><LI>Log 
run IDs for observability</LI></UL><P>If you have any questions, comments and/or concerns, please raise them and let's discuss. I'm happy and proud that this article was written by at least 90% human and only 10% AI <span class="lia-unicode-emoji" title=":slightly_smiling_face:">🙂</span></P>2026-02-10T16:44:24.061000+01:00https://community.sap.com/t5/integration-blog-posts/automating-invoice-predictions-from-sap-databricks-with-sap-datasphere-task/ba-p/14325688Automating Invoice Predictions from SAP Databricks with SAP Datasphere Task Chain2026-02-11T13:01:18.220000+01:00BabacarThttps://community.sap.com/t5/user/viewprofilepage/user-id/125204<P><SPAN>In our previous </SPAN><A href="https://community.sap.com/t5/integration-blog-posts/integrating-non-sap-semi-structured-invoices-data-with-sap-bdc-using-sap/ba-p/14272179" target="_blank">blog</A><SPAN>, we demonstrated how to process semi-structured PDF invoices, parse them and store the results in a </SPAN><CODE>Delta table</CODE><SPAN> before applying machine learning algorithms to </SPAN><CODE>predict payment delays</CODE><SPAN>. These predictions are hosted in </SPAN><CODE>SAP Databricks</CODE><SPAN> as part of a </SPAN><CODE>customer managed data product</CODE><SPAN>, with results written to a target Delta table in Unity Catalog and </SPAN><CODE>shared</CODE><SPAN> via Delta Sharing with SAP Business Data Cloud (BDC).</SPAN></P><P class="">This is what our customer managed data product holding the <STRONG>first</STRONG> processed 1.000 invoices looks like before any updates:<SPAN> </SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="BabacarT_1-1770739992447.png" style="width: 864px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371101iC06ACD042AAFF373/image-dimensions/864x244?v=v2" width="864" height="244" role="button" title="BabacarT_1-1770739992447.png" alt="BabacarT_1-1770739992447.png" /></span></P><P class=""><SPAN>The next challenge on our journey was </SPAN><CODE>automation</CODE><SPAN>. Once </SPAN><CODE>new invoices</CODE><SPAN> land in the </SPAN><CODE>S3 bucket</CODE><SPAN>, we need a </SPAN><CODE>fully automated process</CODE><SPAN> that parses them, updates predictions, and refreshes the Delta share, without any manual intervention.
Since SAP consumes the data federatively, business users expect up‑to‑date predictions in near real time.</SPAN></P><DIV><SPAN>To meet this requirement, my colleague </SPAN><a href="https://community.sap.com/t5/user/viewprofilepage/user-id/1650687">@Mostafa_Shaarawy1</a><SPAN> </SPAN><SPAN><SPAN>and I designed and implemented an <CODE>end‑to‑end orchestration flow</CODE> using <STRONG>SAP Datasphere (DSP) Task Chains</STRONG> and the <STRONG>Databricks API</STRONG> to:</SPAN></SPAN></DIV><UL class=""><LI>trigger the Databricks notebook on schedule or on demand,</LI><LI>monitor its execution,</LI><LI>and ensure updated predictions flow instantly to SAP BDC.</LI></UL><P class="">To achieve this, we are going to utilize the<SPAN> </SPAN><A href="https://docs.databricks.com/api/workspace/jobs/runnow" target="_blank" rel="noopener nofollow noreferrer">run-now</A><SPAN> </SPAN>and the<SPAN> </SPAN><A href="https://docs.databricks.com/api/workspace/jobs/getrun" target="_blank" rel="noopener nofollow noreferrer">get run</A><SPAN> </SPAN>APIs described in the<SPAN> </SPAN><A href="https://docs.databricks.com/api/workspace/introduction" target="_blank" rel="noopener nofollow noreferrer">Databricks API reference</A>, with authentication handled via a Databricks<SPAN> </SPAN><CODE>Service Principal</CODE>.</P><P class=""><STRONG><CODE>The result</CODE></STRONG>: a fully automated pipeline that delivers fresh, reliable insights to business users, improving decision‑making, operational efficiency, and payment‑risk visibility.</P><HR /><H2 id="business-use-case-finance-managers-perspective" id="toc-hId-1789615619">Business Use Case: Finance Manager’s Perspective</H2><P class="">Imagine you’re a finance manager responsible for optimizing cash flow. You rely on accurate predictions of payment delays to decide which customers to follow up with and when. With Datasphere orchestration, the moment new invoices arrive, the system automatically processes them, updates predictions, and makes the latest insights available in SAP BDC, without any manual steps.
This means you can act faster, reduce overdue payments, and improve working capital, all while saving time and effort.</P><P class="">We’ll achieve this using Datasphere Task Chains, leveraging their API Task (REST) capability and a Generic HTTP connection to seamlessly integrate with SAP Databricks.</P><BLOCKQUOTE dir="auto"><P class=""><FONT color="#FF0000">Disclaimer</FONT>: The scenario we are covering in this blog is technically working fine but is not officially supported in the SAP Databricks documentation.</P></BLOCKQUOTE><HR /><H2 id="solution-overview" id="toc-hId-1593102114">Solution Overview</H2><P class="">The architecture looks like this:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="BabacarT_2-1770741673564.png" style="width: 867px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371131i8B7FED589119F1F6/image-dimensions/867x403?v=v2" width="867" height="403" role="button" title="BabacarT_2-1770741673564.png" alt="BabacarT_2-1770741673564.png" /></span></P><BLOCKQUOTE dir="auto"><P class=""><STRONG>What makes this work?</STRONG></P></BLOCKQUOTE><UL class=""><LI><P class="">The architecture consists of three components:</P><UL class=""><LI>DSP Task Chains orchestrate Databricks jobs using REST API calls.</LI><LI>Databricks Jobs API triggers and monitors the notebook execution.</LI><LI>Delta Sharing + Unity Catalog exposes the updated prediction table live to SAP BDC/DSP.</LI></UL></LI></UL><P class="">This ensures new data flows from invoice ingestion → prediction → Delta share → SAP consumption end‑to‑end, without manual steps.</P><HR /><H2 id="prerequisites" id="toc-hId-1396588609">Prerequisites</H2><P class="">Before building the orchestration, we need to first complete these steps:</P><OL class=""><LI><P class="">Create a Service Principal in SAP Databricks Workspace at the workspace level to generate:</P><UL class=""><LI>Client ID</LI><LI>Client Secret</LI><LI>OAuth Token Endpoint.</LI></UL></LI><LI><P class="">Assign the service principal permissions to run the job and access the target Delta table.</P><UL class=""><LI>This avoids using personal tokens and ensures secure automated API calls.</LI></UL></LI><LI><P class="">Prepare the Databricks Job</P><UL class=""><LI>Identify the Job ID of the notebook that:<UL class=""><LI>Processes new invoices,</LI><LI>Runs ML predictions,</LI><LI>Writes results into the target Delta table in Unity Catalog.</LI></UL></LI></UL></LI><LI><P class="">Update the Delta Share and the Share CSN document</P></LI></OL><HR /><H1 id="step-by-step-implementation" id="toc-hId-1070992385">Step-by-Step Implementation</H1><H2 id="1-set-up-databricks-and-create-service-principal" id="toc-hId-1003561599">1) Set-up Databricks and create service Principal</H2><BLOCKQUOTE dir="auto"><P class=""><SPAN>A </SPAN><CODE>service principal</CODE><SPAN> in Databricks is a dedicated, non-human identity built for automation. 
It enables secure, API‑only access to Databricks resources for scripts, automated workflows, and CI/CD pipelines, all without depending on personal user credentials.</SPAN></P></BLOCKQUOTE><P class="">a- Go to<SPAN> </SPAN><CODE>Settings</CODE><SPAN> </SPAN>><SPAN> </SPAN><CODE>Identity and Access</CODE><SPAN> </SPAN>><SPAN> </SPAN><CODE>Service Principals</CODE><SPAN> </SPAN>><SPAN> </SPAN><CODE>Add service principal</CODE>:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="BabacarT_4-1770740521350.png" style="width: 866px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371108i0D85193D8B0D1A3F/image-dimensions/866x368?v=v2" width="866" height="368" role="button" title="BabacarT_4-1770740521350.png" alt="BabacarT_4-1770740521350.png" /></span></P><P><SPAN>b- </SPAN><CODE>Generate a secret</CODE><SPAN> for this </SPAN><CODE>Service Principal</CODE><SPAN> and copy it as well as the </SPAN><CODE>Client ID</CODE><SPAN> (you'll need them for the definition of the </SPAN><CODE>Http</CODE><SPAN> connection in SAP Datasphere).</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="BabacarT_5-1770740624632.png" style="width: 854px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371110i06D3E270A501EC6B/image-dimensions/854x427?v=v2" width="854" height="427" role="button" title="BabacarT_5-1770740624632.png" alt="BabacarT_5-1770740624632.png" /></span></P><P><SPAN>c- Go to the notebook under </SPAN><CODE>Schedule</CODE><SPAN> and create a new one:</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="BabacarT_6-1770740684519.png" style="width: 854px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371113i35E445AADE1CE348/image-dimensions/854x427?v=v2" width="854" height="427" role="button" title="BabacarT_6-1770740684519.png" alt="BabacarT_6-1770740684519.png" /></span></P><P><SPAN>d- Switch to the </SPAN><STRONG><CODE>Jobs & Pipelines</CODE></STRONG><SPAN> section, select your scheduled job, and copy the job ID (you'll need it in the API Invocation </SPAN><CODE>request body</CODE><SPAN> of the </SPAN><CODE>task chain</CODE><SPAN> in SAP Datasphere to call the right job): </SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="BabacarT_10-1770741243271.png" style="width: 854px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371126i4A3DCA39B20C9EE7/image-dimensions/854x427?v=v2" width="854" height="427" role="button" title="BabacarT_10-1770741243271.png" alt="BabacarT_10-1770741243271.png" /></span></P>
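<P><EM>For orientation, this is roughly what the two Databricks Jobs API calls configured in the following steps exchange (a sketch with illustrative job and run IDs; the request body field <CODE>job_id</CODE> and the response fields follow the Databricks API reference linked above, so double-check them against your workspace):</EM></P><PRE><CODE>POST run-now
Body:     { "job_id": 913489571293 }        (the job ID copied in step d)

Response: { "run_id": 52754 }               (run_id is used later for status polling)

GET runs/get?run_id=52754
Response: { "status": { "state": "TERMINATED",
                        "termination_details": { "code": "SUCCESS" } } }</CODE></PRE>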
<H2 id="2-set-up-your-http-connection-in-your-datasphere-space" id="toc-hId-807048094">2) Set-up your Http Connection in your Datasphere space</H2><UL class=""><LI><SPAN>In your space, navigate to </SPAN><CODE>Connections</CODE><SPAN> in Datasphere and choose </SPAN><CODE>Http</CODE><SPAN> as connection type.</SPAN></LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="BabacarT_3-1770742009737.png" style="width: 856px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371133i25AAB02A5E55897E/image-dimensions/856x461?v=v2" width="856" height="461" role="button" title="BabacarT_3-1770742009737.png" alt="BabacarT_3-1770742009737.png" /></span></P><UL class=""><LI>Enter the connection details.</LI></UL><PRE><CODE>Host: <your_tenant_url without 'https://'>
Port: 443
Protocol: HTTPS
Use Cloud Connector: False
Authentication Type: OAuth2.0
OAuth Grant Type: Client Credentials
OAuth Token Endpoint: <Your_token_Url_endpoint>
OAuth Scope: all-apis
OAuth Response Type: Token
OAuth Token Request Content Type: URL Encoded
Client ID: <Client_Id_of_DBX_Service_Principle>
<P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="BabacarT_0-1770742508224.png" style="width: 856px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371137i287F8209CC423DAF/image-dimensions/856x601?v=v2" width="856" height="601" role="button" title="BabacarT_0-1770742508224.png" alt="BabacarT_0-1770742508224.png" /></span></P><UL class=""><LI>Save the connection.</LI></UL><H2 id="3-create-task-chain-with-api-task" id="toc-hId-610534589">3) Create Task Chain with API Task</H2><UL class=""><LI><P class="">Navigate to<SPAN> </SPAN><STRONG><CODE>Data Builder</CODE></STRONG><SPAN> </SPAN>and create a new<SPAN> </SPAN><CODE>Task Chain</CODE>.</P></LI><LI><P class="">From the top of the canvas, drag and drop<SPAN> </SPAN><CODE>API Task</CODE><SPAN> </SPAN>onto the editor.</P></LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="BabacarT_1-1770742766093.png" style="width: 826px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371138iC4D3137176FECC02/image-dimensions/826x593?v=v2" width="826" height="593" role="button" title="BabacarT_1-1770742766093.png" alt="BabacarT_1-1770742766093.png" /></span></P><P><SPAN>From the properties of the </SPAN><CODE>Task Chain</CODE><SPAN>, select under </SPAN><CODE>Connection</CODE><SPAN> the </SPAN><CODE>Http</CODE><SPAN> connection created in step 2.</SPAN></P><DIV class=""><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="BabacarT_0-1770743726280.png" style="width: 830px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371144i403F6F3048E2283D/image-dimensions/830x415?v=v2" width="830" height="415" role="button" title="BabacarT_0-1770743726280.png" alt="BabacarT_0-1770743726280.png" /></span><P class="">In the<SPAN> </SPAN><STRONG><CODE>API Invocation</CODE></STRONG><SPAN> </SPAN>section, choose the<SPAN> </SPAN><CODE>POST</CODE><SPAN> </SPAN>method and set the mode to<SPAN> </SPAN><CODE>Asynchronous</CODE>.</P><UL class=""><LI>The base URL will be generated automatically from the connection details.</LI><LI>In the API Path, write<SPAN> </SPAN><CODE>run-now</CODE><SPAN> </SPAN>(more information can be found in the Databricks API documentation<SPAN> </SPAN><A href="https://docs.databricks.com/api/workspace/jobs/runnow" target="_blank" rel="noopener nofollow noreferrer">https://docs.databricks.com/api/workspace/jobs/runnow</A>).</LI><LI>The body should be a JSON object with the<SPAN> </SPAN><CODE>job_id</CODE><SPAN> </SPAN>of the Databricks job, as shown in the example below.</LI></UL><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="BabacarT_3-1770743846230.png" style="width: 479px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371145i4622654923388711/image-dimensions/479x620?v=v2" width="479" height="620" role="button" title="BabacarT_3-1770743846230.png" alt="BabacarT_3-1770743846230.png" /></span>
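<P>For illustration, a minimal request body for the <CODE>run-now</CODE> call could look as follows (the job ID is a placeholder; use the ID copied in step 1d):</P><pre class="lia-code-sample language-json"><code>{
  "job_id": 123456789
}</code></pre>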
This maps the<SPAN> </SPAN><CODE>run_id</CODE><SPAN> </SPAN>property of the API response so that it can be used for status retrieval, which allows the task chain to monitor the Databricks job.</LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Status_retrieval_Path.png" style="width: 527px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371149iF6F16BED8E7C0E91/image-dimensions/527x223?v=v2" width="527" height="223" role="button" title="Status_retrieval_Path.png" alt="Status_retrieval_Path.png" /></span></P><P class="">In the<SPAN> </SPAN><STRONG><CODE>Status</CODE></STRONG><SPAN> </SPAN>API section, we will use the<SPAN> </SPAN><CODE>get single run</CODE><SPAN> </SPAN>API for status retrieval, as documented in the Databricks API documentation<SPAN> </SPAN><A href="https://docs.databricks.com/api/workspace/jobs/getrun" target="_blank" rel="noopener nofollow noreferrer">https://docs.databricks.com/api/workspace/jobs/getrun</A>.</P><UL class=""><LI>Set the API path to<SPAN> </SPAN><CODE>runs/get?run_id={id}</CODE>. Since we have configured the path to<SPAN> </SPAN><CODE>run_id</CODE><SPAN> </SPAN>in the previous step, the API task will dynamically use its value in the query parameter at run time.</LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="StatusAPI_API_Task.png" style="width: 527px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371159iFF072334BBD881AC/image-dimensions/527x705?v=v2" width="527" height="705" role="button" title="StatusAPI_API_Task.png" alt="StatusAPI_API_Task.png" /></span></P><UL class=""><LI>In the response section, choose<SPAN> </SPAN><CODE>Get result from HTTP status code and response body</CODE><SPAN> </SPAN>to determine, based on the response body, whether the Databricks job was successful or not. An abridged example response is shown below.</LI><LI>As success indicator use:<SPAN> </SPAN><CODE>status.termination_details.code</CODE><SPAN> </SPAN>equals<SPAN> </SPAN><CODE>SUCCESS</CODE></LI><LI>As error indicator use:<SPAN> </SPAN><CODE>status.state</CODE><SPAN> </SPAN>equals<SPAN> </SPAN><CODE>TERMINATED</CODE></LI></UL><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="BabacarT_0-1770810886930.png" style="width: 543px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371551i6E2FD01D5A450337/image-dimensions/543x352?v=v2" width="543" height="352" role="button" title="BabacarT_0-1770810886930.png" alt="BabacarT_0-1770810886930.png" /></span><P> </P>
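<P>To make the two indicators concrete, here is an abridged, illustrative shape of a <CODE>runs/get</CODE> response for a finished run (only the fields relevant to the indicators are shown; see the Databricks documentation linked above for the full schema):</P><pre class="lia-code-sample language-json"><code>{
  "job_id": 123456789,
  "run_id": 987654321,
  "status": {
    "state": "TERMINATED",
    "termination_details": {
      "code": "SUCCESS"
    }
  }
}</code></pre>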
<H2 id="4-run-the-api-task-to-trigger-the-databricks-job" id="toc-hId-414021084">4) Run the API task to trigger the Databricks job</H2><P class="">Now the API task is ready to be triggered. Once triggered, the<SPAN> </SPAN><CODE>Invocation API</CODE><SPAN> </SPAN>will be called with the<SPAN> </SPAN><CODE>job_id</CODE><SPAN> </SPAN>in the request body, and the<SPAN> </SPAN><CODE>status</CODE><SPAN> </SPAN>API will be called periodically to evaluate the completion of the Databricks job.</P><P class="">For testing purposes, you can conveniently run the API task on its own without having to run the whole task chain. Click on<SPAN> </SPAN><CODE>Test Run</CODE><SPAN> </SPAN>in the API task to run it.</P><P class="">You can now click on the<SPAN> </SPAN><CODE>Data Integration Monitor</CODE><SPAN> </SPAN>icon to see the run details.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Test_Run_API_Task.png" style="width: 549px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371151i8292B6F12E5A2E10/image-dimensions/549x261?v=v2" width="549" height="261" role="button" title="Test_Run_API_Task.png" alt="Test_Run_API_Task.png" /></span></P><P><SPAN>When you switch to the monitoring, you can see the run details of the job:</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="DSP_Monitoring.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371154iE6CC80034766CD29/image-size/large?v=v2&px=999" role="button" title="DSP_Monitoring.png" alt="DSP_Monitoring.png" /></span></P><P><SPAN>This is also reflected in SAP Databricks:</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="DBX_Monitoring.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371152iB0168105D2421E61/image-size/large?v=v2&px=999" role="button" title="DBX_Monitoring.png" alt="DBX_Monitoring.png" /></span></P><P><SPAN>Once the job has completed successfully, you can see the 5 invoices newly added to the S3 bucket now reflected in the custom managed data product shared from SAP Databricks to SAP Business Data Cloud:</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="DBX_DeltaTable_DSP.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371153i433527275406C88E/image-size/large?v=v2&px=999" role="button" title="DBX_DeltaTable_DSP.png" alt="DBX_DeltaTable_DSP.png" /></span></P><HR /><H2 id="key-benefits-for-business-users" id="toc-hId-217507579">Key Benefits for Business Users</H2><UL class=""><LI>Real-time insights: Data is refreshed immediately as new invoices are processed.</LI><LI>Zero manual effort: A fully automated end‑to‑end pipeline.</LI><LI>Improved cash flow management: Faster decisions supported by fresh insights.</LI><LI>Operational efficiency: No delays between data processing and business consumption.</LI><LI>Federated access: Data is consumed directly in SAP BDC—no physical copies or complex integrations required.</LI></UL></DIV>2026-02-11T13:01:18.220000+01:00https://community.sap.com/t5/technology-blog-posts-by-sap/a-sap-btp-kyma-encryption-decryption-microservice-for-all-contexts-e-g-sap/ba-p/14326277A SAP BTP Kyma Encryption/Decryption Microservice for ALL Contexts (e.g. SAP Datasphere/BDC/ABAP)2026-02-11T16:12:01.099000+01:00stefan_geiselhart2https://community.sap.com/t5/user/viewprofilepage/user-id/200897<P>G’day!</P><P><STRONG>Warning</STRONG>: This is only for readers who are really interested in this subject <span class="lia-unicode-emoji" title=":smiling_face_with_heart_eyes:">😍</span> The article is quite comprehensive and has some cross-article links inside that also need to be considered. 
Therefore: a lot of reading and a lot of hands-on + thinking if you want to rebuild the thing.</P><P><U>Below is the structure of this article:</U></P><UL><LI><FONT size="3">1 Status Quo on En-/Decryption in BTP</FONT></LI><LI><FONT size="3">2 Motivation</FONT></LI><LI><FONT size="3">3 Fundamentals</FONT></LI><LI><FONT size="3">4 Architecture</FONT></LI><LI><FONT size="3">5 HDLFS Configuration</FONT></LI><LI><FONT size="3">6 Insights into Python</FONT></LI><LI><FONT size="3">7 Containerization & Deployment in Kyma</FONT></LI><LI><FONT size="3">8 Achievements</FONT></LI><LI><FONT size="3">9 More Scope</FONT></LI><LI><FONT size="3">10 References</FONT></LI></UL><P>Let's begin:</P><P><STRONG><FONT size="5">1 Status Quo on En-/Decryption in BTP</FONT></STRONG></P><P><FONT size="3">When it comes to en-/decryption on SAP BTP, there are a couple of pitfalls to be considered. </FONT>The following roughly outlines the situation, platform- and application-wise:</P><UL><LI>Individual applications (e.g. SAP SuccessFactors) have proprietary encryption/decryption mechanisms available. They are typically <STRONG>not reusable</STRONG>.</LI><LI><STRONG>SAP Integration Suite has got PGP En-/Decryptor nodes</STRONG> available. However, it is <STRONG><U>not</U> advisable to run</STRONG> large volume scenarios using this approach. It has been built for data sizes on the levels represented within message handling scenarios (I‘m not an expert therein, but I guess that's typically not in the area of GBs).</LI><LI>A policy/rule for data to remain encrypted until it lands in BTP: you face challenges to <STRONG>securely onboard large volume files</STRONG> containing sensitive data <STRONG>into BTP</STRONG>, because there is no out-of-the-box solution to keep them encrypted in motion and decrypt them at rest once available in BTP (and vice versa for an outbound flow).</LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="The Challenge" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371589i1ED7B676DB32AA4F/image-size/large?v=v2&px=999" role="button" title="image.png" alt="The Challenge" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">The Challenge</span></span></P><P><STRONG><FONT size="5">2 Motivation</FONT></STRONG></P><P>Due to the above limitations, we were eager to find an appropriate solution approach that first of all satisfies our specific scenario, but can also be adapted to other contexts. This is why the following motivation arose:</P><UL><LI>Especially en-/decryption for larger files on the BTP side seems to be a missing piece in the puzzle.</LI><LI>Our ambition is to encapsulate such a mechanism centrally as a reusable service. <STRONG>Multiple kinds of contexts can be served</STRONG> by exposing this service: applications, data flows & transformations etc. – in our context the consumer will be SAP Datasphere.</LI><LI>There is another consequence of finally being able to load decrypted large data sets into BTP. To be more precise: since Base64 encoded data must be handled in our context too when onboarding this data into SAP Datasphere for further processing and transformation, why not also expose or <STRONG>incorporate columnar Base64 En-/Decoding</STRONG> into the microservice?</LI><LI>The intended solution is <STRONG>custom-built</STRONG> and must be managed individually. 
However, the baseline that will be set <STRONG>can be reused and adapted</STRONG> to various customer needs and application contexts.</LI></UL><P><STRONG><FONT size="5">3 Fundamentals</FONT></STRONG></P><P>The following represent some fundamentals and success criteria we've established for our solution:</P><UL><LI>Use HANA Data Lake Files (<STRONG>HDLFS</STRONG>) as a file store for inbound/outbound persistency</LI><LI>Implement the service as <STRONG>containerized Python</STRONG> code logic</LI><LI>Service capabilities must be <STRONG>controllable and schedulable within Kyma</STRONG></LI><LI>Service can handle <STRONG>PGP encrypted</STRONG> files inbound, but can also encrypt for outbound delivery</LI><LI>Service can decode <STRONG>Base64</STRONG> columns for inbound source files</LI><LI>Service <STRONG>manages and cleans up</STRONG> directories and file in-/output on HDLFS</LI><LI>Service can handle <STRONG>files with sizes in the GB range</STRONG></LI><LI>Deploy the service into a <STRONG>Kyma Cluster</STRONG> (Cloud Foundry? -> no, memory/sizing limitations)</LI><LI><STRONG>Security</STRONG>: Facilitate simple and secure handling of all required secrets and certificates</LI><LI>Service <STRONG>not exposed via HTTP endpoints</STRONG> (this is kind of a contradiction to a true microservice, however this requirement applies in our context. Under "More Scope" you will find guidelines/directions on how to enable HTTP-based microservices)</LI></UL><P><STRONG><FONT size="5">4 Architecture</FONT></STRONG></P><P>The following architectural components represent what we've picked for our solution:</P><UL><LI>Usage of HANA Data Lake Files (<STRONG>HDLFS</STRONG>) as a file store for <STRONG>inbound/outbound persistency</STRONG></LI><LI>Use <STRONG>BTP Kyma for the service runtime</STRONG> (Kyma deployed in the 2nd smallest T-shirt size)</LI><LI><STRONG>SAP Datasphere</STRONG> represents the <STRONG>persistency layer</STRONG> (DSP as a consumer is exchangeable, but in our case relevant for the business scenario)</LI><LI>The other, more general blue box "other BTP Solutions" indicates that the service implementation on Kyma isn't necessarily limited to SAP Datasphere only, but should rather be considered as something agnostic. This of course heavily depends on how it is built and finally implemented.</LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="BTP Solution Architecture Kyma Microservice" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371516i29EF2B1C74333CEB/image-size/large?v=v2&px=999" role="button" title="image.png" alt="BTP Solution Architecture Kyma Microservice" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">BTP Solution Architecture Kyma Microservice</span></span></P><P><STRONG><FONT size="5">5 HDLFS Configuration</FONT></STRONG></P><P><FONT size="4">I strongly recommend reading the <A href="https://developers.sap.com/tutorials/data-lake-file-containers-hdlfscli.html" target="_self" rel="noopener noreferrer">tutorial by Jason Hinsperger</A> to familiarize yourself with the HDLFS REST API dependencies and the essential steps to spin up the instance. You must follow all the steps described therein.</FONT></P><P><FONT size="4">It's important to note that the <STRONG>generated client key and client certificate</STRONG> are what you will use from the Python level. Moreover, <STRONG>you should take note of the REST API endpoint</STRONG>. 
This is the endpoint you run all your requests against (you can find it in HANA Cloud Central -> Instances -> HDLFS Instance -> "Files REST API Endpoint").</FONT></P><P><FONT size="4">To familiarize yourself in general with the HDLFS REST API documentation, you can use this <A href="https://help.sap.com/doc/9d084a41830f46d6904fd4c23cd4bbfa/2025_3_QRC/en-US/index.html#tag/WebHDFS" target="_self" rel="noopener noreferrer">link</A>.</FONT></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="HDLFS REST API Endpoint" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371618i735BEFA7C02FCA36/image-size/large?v=v2&px=999" role="button" title="image.png" alt="HDLFS REST API Endpoint" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">HDLFS REST API Endpoint</span></span></P><P><FONT size="4">One essential detail I don't want to hide is the IP whitelisting of the Kyma environment. To do that, navigate to the start page of your cluster dashboard. There is a section called "Cluster Overview --> Metadata". Copy the (typically three) IPs listed there under "NAT Gateway IP Addresses".</FONT></P><P><FONT size="4">Go to HANA Cloud Central and navigate to your HDLFS instance. Click "Manage Configuration" and go to the section "Connections": if you have enabled the setting to only allow specific IP addresses, likewise maintain the ones of the Kyma Cluster that you previously copied:</FONT></P><P><FONT size="4"><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="HDLFS Configuration Connections" style="width: 706px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371638i636055B083947539/image-dimensions/706x435?v=v2" width="706" height="435" role="button" title="image.png" alt="HDLFS Configuration Connections" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">HDLFS Configuration Connections</span></span></FONT></P>
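<P><FONT size="4">As a small illustration of how the endpoint, the file container and the client certificate play together, here is a minimal sketch (my own addition, assuming the WebHDFS-style <CODE>LISTSTATUS</CODE> operation from the REST API guide linked above; host, container ID, directory and certificate paths are placeholders) that lists the contents of an HDLFS directory:</FONT></P><pre class="lia-code-sample language-python"><code># Minimal sketch: list a directory on HDLFS via the WebHDFS-style REST API.
# Assumptions/placeholders: endpoint host, file container ID, directory name,
# and the client certificate/key generated while following the tutorial above.
import http.client
import json
import ssl

FILES_REST_API = "<your-files-rest-api-endpoint>"  # host only, without 'https://'
CONTAINER = "<your-file-container-id>"

# Mutual TLS: authenticate with the generated client certificate and key
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.load_cert_chain(certfile="client.crt", keyfile="client.key")

conn = http.client.HTTPSConnection(FILES_REST_API, context=ctx)
conn.request("GET", "/webhdfs/v1/inbound?op=LISTSTATUS",
             headers={"x-sap-filecontainer": CONTAINER})
resp = conn.getresponse()
print(resp.status, json.loads(resp.read() or b"{}"))
conn.close()</code></pre>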
<P><STRONG><FONT size="5">6 Insights into Python</FONT></STRONG></P><P><FONT size="3">The Python coding part is split into several modules, which are all described in detail further below:</FONT></P><OL><LI><FONT size="3">Inbound processing</FONT></LI><LI><FONT size="3">Outbound processing</FONT></LI><LI><FONT size="3">Common functions</FONT></LI><LI><FONT size="3">Crypto functions for encryption/decryption</FONT></LI></OL><P><FONT size="3"><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Python Modules Sketch" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371586i15DBFF5CC94401F9/image-size/large?v=v2&px=999" role="button" title="image.png" alt="Python Modules Sketch" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Python Modules Sketch</span></span></FONT></P><P><FONT size="4"><STRONG>Inbound Module (1)</STRONG></FONT></P><P>The inbound part scans a given data lake path for <CODE>.gpg</CODE> files, then decrypts each file locally in parallel using a configurable worker count. After decryption, it samples the CSV (semicolon-delimited), detects which columns contain base64-encoded data, and runs a transformation step to produce a cleaned output file. The transformed result is uploaded to a decrypted output path, and the original encrypted input is moved to an archive folder. It logs success or errors with timing details at file level and exits with a failure code if any file fails.</P><P>This code snippet describes the parallel processing of files:</P><pre class="lia-code-sample language-python"><code>with ThreadPoolExecutor(max_workers=MAX_PARALLEL_JOBS) as exe:
    futs = {exe.submit(processFile, f): f for f in files}
    for fut in as_completed(futs):
        f = futs[fut]
        try:
            res = fut.result()
            ok += 1
            print(f"[OK] {res['input']} -> {res['output']} sec={res['sec']:.2f} base64_cols={res['base64_cols']}")
        except Exception as ex:
            err.append((f, str(ex)))
            print(f"[ERR] {f}: {ex}")</code></pre><P>Sample code of processing an individual file:</P><pre class="lia-code-sample language-python"><code>def processFile(file_name: str):
    t0 = timer()
    out_name = normalize_output_name(file_name)
    with tempfile.TemporaryDirectory() as td:
        enc_path = os.path.join(td, file_name)
        dec_path = os.path.join(td, out_name + ".decrypted.csv")
        out_path = os.path.join(td, out_name)
        download_to_file(f"{PATH_IN_DATALAKE}/{file_name}", enc_path)
        decrypt_symmetric(enc_path, dec_path)
        header, sample = sample_csv(dec_path, delimiter=";")
        base64_cols = detect_base64_columns(out_name, header, sample)
        transform_csv(dec_path, out_path, base64_cols, delimiter=";")
        mkdirs(PATH_OUT_DATALAKE)
        upload_file_chunked(f"{PATH_OUT_DATALAKE}/{out_name}", out_path)
        mkdirs(PATH_ARCHIVE)
        rename(f"{PATH_IN_DATALAKE}/{file_name}", f"{PATH_ARCHIVE}/{file_name}")
    return {"input": file_name, "output": out_name, "base64_cols": base64_cols, "sec": timer() - t0}</code></pre>
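<P>The helper <CODE>detect_base64_columns</CODE> is not shown in full in this article. As a rough illustration of the idea (a simplified sketch with a reduced signature, not the productive implementation), a column can be treated as Base64-encoded when all sampled, non-empty values survive a strict decode/re-encode round trip:</P><pre class="lia-code-sample language-python"><code>import base64

def _is_base64(value: str) -> bool:
    # Strict round trip: decoding must succeed and re-encoding must reproduce the input.
    try:
        return base64.b64encode(base64.b64decode(value, validate=True)).decode("ascii") == value
    except Exception:
        return False

def detect_base64_columns_sketch(header, sample):
    # Return the indices of columns whose sampled, non-empty values all look like Base64.
    cols = []
    for idx in range(len(header)):
        values = [row[idx] for row in sample if idx < len(row) and row[idx]]
        if values and all(_is_base64(v) for v in values):
            cols.append(idx)
    return cols</code></pre>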
<P><FONT size="4"><STRONG>Outbound Module (2)</STRONG></FONT></P><P><FONT size="3">It scans the HDLFS outbound folder for .csv files and subdirectories containing Parquet parts, then processes them in parallel using a configurable worker count. CSV files are downloaded locally, encrypted, and uploaded back again to an encrypted outbound path. The originals are moved to an archive folder. Parquet-based directories on HDLFS (when enabled) are downloaded, consolidated into a single CSV using PyArrow, then encrypted and uploaded similarly. As a last step, they are marked with an _SUCCESS file to prevent reprocessing and moved to the archive. </FONT></P><P><FONT size="4"><STRONG>Crypto Module (3)</STRONG></FONT></P><P>It provides helper functions for symmetric file encryption and decryption using the <CODE>gpg</CODE> command-line tool with an AES-256 cipher. It requires a passphrase supplied via the <CODE>PASSPHRASE</CODE> environment variable (from a Kyma Secret). Both <CODE>encrypt_symmetric</CODE> and <CODE>decrypt_symmetric</CODE> execute GPG in batch mode through <CODE>subprocess</CODE>, capturing stdout/stderr and validating the return code.</P><P>The <CODE>decrypt_symmetric</CODE> function is implemented as follows:</P><pre class="lia-code-sample language-python"><code>def decrypt_symmetric(enc_path: str, dec_path: str):
    _require_passphrase()
    cmd = [
        "gpg", "--batch", "--yes",
        "--pinentry-mode", "loopback",
        "--passphrase", PASSPHRASE,
        "--output", dec_path,
        "--decrypt", enc_path
    ]
    r = subprocess.run(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    if r.returncode != 0:
        raise RuntimeError(f"GPG decrypt failed: {r.stderr.decode('utf-8', 'replace')}")</code></pre><P><FONT size="4"><STRONG>Common Functions Module (4)</STRONG></FONT></P><DIV class=""><P>All major control variables are built from environment variables (defined as Secrets in Kyma) for endpoint/container, TLS certificates, verification behavior, timeouts, retries, and chunk size.</P><P>One function builds a reusable SSL context. API requests are wrapped with exponential-backoff retries for transient HTTP statuses (e.g., 429/5xx) and network errors. The module provides HDLFS-related helpers to list files/directories, check existence, create directories, rename paths, download remote content to disk in chunks, and upload files.</P><P>The below part builds the SSL context to connect to HDLFS via its REST API:</P></DIV><pre class="lia-code-sample language-python"><code>def buildSSLContext() -> ssl.SSLContext:
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    if os.path.exists(CRT_PATH) and os.path.exists(KEY_PATH):
        ctx.load_cert_chain(certfile=CRT_PATH, keyfile=KEY_PATH)
    if TLS_VERIFY:
        if CA_CERT_PATH:
            ctx.load_verify_locations(CA_CERT_PATH)
        else:
            ctx.load_default_certs()
        ctx.verify_mode = ssl.CERT_REQUIRED
        ctx.check_hostname = True
    else:
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
    return ctx</code></pre><P><FONT size="4">This part establishes a connection to the HDLFS REST API endpoint as defined in the environment variable FILES_REST_API:</FONT></P><pre class="lia-code-sample language-python"><code>def _conn():
    if not FILES_REST_API:
        raise RuntimeError("FILES_REST_API is empty (set env FILES_REST_API).")
    return http.client.HTTPSConnection(FILES_REST_API, timeout=HTTP_TIMEOUT, context=_SSL_CTX)</code></pre><P><FONT size="4">The below function represents the upload (HDLFS REST API PUT) of a file in streaming mode to an HDLFS directory:</FONT></P><pre class="lia-code-sample language-python"><code>def _put_to_location(location: str):
    # Note: file_path, size, remote_path_no_leading_slash and CHUNK_SIZE are
    # captured from the enclosing upload function's scope.
    url = urlparse(location)
    if url.scheme and url.netloc:
        host = url.netloc
        path = (url.path or "/") + (("?" + url.query) if url.query else "")
        conn = http.client.HTTPSConnection(host, timeout=HTTP_TIMEOUT, context=_SSL_CTX)
    else:
        path = location
        conn = _conn()
    try:
        conn.putrequest("PUT", path)
        conn.putheader("x-sap-filecontainer", CONTAINER)
        conn.putheader("Content-Type", "application/octet-stream")
        conn.putheader("Content-Length", str(size))
        conn.putheader("Connection", "close")
        conn.endheaders()
        with open(file_path, "rb") as f:
            while True:
                chunk = f.read(CHUNK_SIZE)
                if not chunk:
                    break
                conn.send(chunk)
        r = conn.getresponse()
        data = r.read()
        status = r.status
        r.close()
        if status in _RETRYABLE_STATUSES:
            raise RetryableHttpError(f"PUT retryable: {status} {data[:200]!r}")
        if status not in (200, 201):
            raise RuntimeError(f"PUT failed {remote_path_no_leading_slash}: {status} {data[:200]!r}")
    finally:
        conn.close()</code></pre><P><STRONG><FONT size="5">7 Containerization & Deployment in Kyma</FONT></STRONG></P><P>I won't describe step by step how to create a Docker container. I'll just provide some essentials and hints to make sure you succeed and don't stumble upon the same issues I had faced. The following prerequisites must be met in order to create a runnable Docker container based on your Dockerfile + Python code:</P><UL><LI>Local installation of Docker (Docker Desktop)</LI><LI>Docker registry (e.g. GitHub Docker Registry) – this is where the Pod pulls the Docker image from</LI><LI>kubectl CLI</LI><LI>IDE such as Visual Studio Code</LI></UL><P>For an overall procedure, including the part in Kyma, you can refer to this <A href="https://community.sap.com/t5/technology-blog-posts-by-members/develop-and-deploy-python-rest-api-with-kubernetes-docker-in-sap-btp-kyma/ba-p/13533279" target="_blank" rel="noopener nofollow noreferrer">blog entry on SCN</A>. I strongly recommend that you try to rebuild the example the author walks through first, BEFORE you tackle your actual project.</P><P>Recommendations out of my personal learnings:</P><UL><LI>Try not to create any Kyma artifacts via the UI, but rather define those within descriptor yaml files of the corresponding kinds (e.g. kind: Deployment, Service or APIRule).</LI><LI><STRONG>!!!</STRONG> A minor but very important thing to consider: when running the docker build command, run it as follows:</LI></UL><pre class="lia-code-sample language-bash"><code>docker build . --tag ghcr.io/<github_user>/<git_repo_name>:latest --platform linux/amd64</code></pre><UL><LI>The deployment.yaml in the blog specified above is based on three Kyma/K8S artifacts that are created: <STRONG>Deployment, Service + APIRule</STRONG>. Which kind of artifacts you actually require strongly depends on your Python implementation and how your service/logic runs. In the event of a purely job-schedule-based execution of your processing logic, <STRONG>you don't need a Service or APIRule</STRONG> artifact, of course. Instead, you'd only need a <STRONG>deployment of kind: CronJob</STRONG>. This will spin up all dependent artifacts implicitly. In a CronJob-based deployment you will have to specify the following details in the yaml file: scheduling details (time, frequency, time zone), required volumes, container spec including secrets to pull the image, container-level mounts & env variables, and resource allocation.</LI><LI>A template for a CronJob-based deployment can look like this:</LI></UL><pre class="lia-code-sample language-yaml"><code>apiVersion: batch/v1
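# Illustrative note (my addition): once the placeholders below are replaced,
# this manifest can be deployed and inspected with kubectl, e.g.:
#   kubectl apply -f cronjob.yaml -n <kyma_namespace>
#   kubectl get cronjobs,jobs,pods -n <kyma_namespace>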
kind: CronJob
metadata:
  name: <job_name>
  namespace: <kyma_namespace>
spec:
  suspend: false
  schedule: "* * * * *"
  timeZone: "Europe/Berlin"
  jobTemplate:
    spec:
      backoffLimit: 3
      template:
        spec:
          restartPolicy: Never
          imagePullSecrets:
            - name: ghcr-pull-secret
          volumes:
            - name: gnupg-home
              emptyDir: {}
          initContainers:
            - name: init-gnupg
              image: busybox:1.36
              command: ["sh","-c","mkdir -p /tmp/gnupg && chmod 700 /tmp/gnupg"]
              volumeMounts:
                - name: gnupg-home
                  mountPath: /tmp/gnupg
              resources:
                requests:
                  cpu: "10m"
                  memory: "32Mi"
                limits:
                  cpu: "50m"
                  memory: "64Mi"
          containers:
            - name: inbound
              image: <container_registry_path>
              imagePullPolicy: Always
              command: [<custom_os_level_command_if_required>]
              volumeMounts:
                - name: gnupg-home
                  mountPath: /tmp/gnupg
              env:
                - name: KEY_PATH
                  value: "/keys/keyfile.key"
                - name: PASSPHRASE
                  valueFrom:
                    secretKeyRef:
                      name: kyma-secret
                      key: PASSPHRASE
              resources:
                requests:
                  cpu: "100m"
                  memory: "1Gi"
                limits:
                  cpu: "200m"
                  memory: "2Gi"</code></pre><P><STRONG><FONT size="5">8 Achievements</FONT></STRONG></P><P>The following functionality and service details were delivered:</P><UL><LI>The Python code was improved multiple times: handling approx. 30 files in one go with a total of 10 GB in encrypted state could be accelerated to < 30 minutes of processing time</LI><LI>Cron Jobs in the Kyma cluster can be used to schedule individual services, e.g. inbound processing of files that reside in a specific folder on HDLFS</LI><LI>SAP Datasphere writes local tables via Replication Flows back into HDLFS, on which the outbound service performs encryption and marks the files as "collectible"</LI></UL><P><STRONG><FONT size="5">9 More Scope</FONT></STRONG></P><P><FONT size="4">...to come soon. I will cover aspects of how to further mature the microservice/Python logic and, moreover, give examples of HTTP-based encryption/decryption service endpoints exposed on the Kyma Cluster.</FONT></P><P><FONT size="4"><STRONG><FONT size="5">10 References</FONT></STRONG></FONT></P><P><A href="https://community.sap.com/t5/technology-blog-posts-by-members/develop-and-deploy-python-rest-api-with-kubernetes-docker-in-sap-btp-kyma/ba-p/13533279" target="_self">K8S/Kyma + Python SCN Blog</A></P><P><A href="https://developers.sap.com/tutorials/data-lake-file-containers-hdlfscli.html" target="_self" rel="nofollow noopener noreferrer">Getting Started with Data Lake Files HDLFSCLI</A></P><P><A href="https://help.sap.com/doc/9d084a41830f46d6904fd4c23cd4bbfa/2025_3_QRC/en-US/index.html#tag/WebHDFS" target="_self" rel="nofollow noopener noreferrer">HDLFS REST API Guide</A></P><P><FONT size="4"><STRONG><FONT size="5">Thx...</FONT></STRONG></FONT></P><P><FONT size="4">...to my fellow colleagues at SAP who provided meaningful input and discussed solution options. Furthermore, I am especially grateful to our implementation partner, who supported, implemented and showed strong endurance in building up this scenario.</FONT></P><P><FONT size="4">I hope you enjoyed reading this article and gained some deeper insights into what we did. If you have comments or suggestions of any kind, don't hesitate to comment and start off a discussion with me and hopefully other SMEs.</FONT></P>2026-02-11T16:12:01.099000+01:00https://community.sap.com/t5/technology-blog-posts-by-members/deep-dive-into-sap-datasphere-object-store-of-bdc-benefits-architecture-and/ba-p/14328318Deep Dive into SAP Datasphere Object Store of BDC: Benefits, Architecture and implementation2026-02-15T17:50:03.644000+01:00SreekanthSurampallyhttps://community.sap.com/t5/user/viewprofilepage/user-id/8543<P>In this document, I will go through the architecture and benefits of the HANA Data Lake Files service in the BDC Datasphere application. 
In SAP Business Data Cloud there are two instances of the HDLF object store:</P><OL><LI>Foundation services (FOS) HDLF object store - managed by SAP</LI><LI>Datasphere HDLF object store - managed by customers</LI></OL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ObjectStores_BDC.png" style="width: 740px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/372730i0E83FA0C35DBC3C5/image-dimensions/740x417?v=v2" width="740" height="417" role="button" title="ObjectStores_BDC.png" alt="ObjectStores_BDC.png" /></span></P><P>An object store is file storage from hyperscalers like Amazon S3, used to store large amounts of data files (CSV, Parquet and Delta tables).</P><P>The FOS HDLF object store is used for SAP standard data products. When an intelligent application is activated in the BDC Cockpit, the required standard data products are installed and their data is loaded into the FOS HDLF object store. These data products are available as Delta tables in the FOS HDLF and can be shared with a Datasphere HANA space or Databricks for data analytics and data science models. FOS HDLF is managed by SAP; customers don't have access to it. </P><P>The Datasphere HDLF object store is used for storing large data files from SAP and non-SAP systems. These files are available as Delta tables. Customers can configure ETL to replicate and transform data into the Datasphere HDLF space. Also, custom data products can be generated in the Datasphere HDLF from BW PCE in BDC. </P><P><FONT color="#0000FF"><STRONG>In this article, we focus on the Datasphere HDLF object store.</STRONG></FONT></P><P>The HDLF object store was first made available in standalone Datasphere (BTP) as part of SAP's data fabric solution; later it was introduced as part of BDC.</P><P><STRONG><FONT color="#800000">Benefits:</FONT></STRONG></P><UL><LI>The HDLF object store is a cost-effective solution: 1 TB is 0.1 CU per hour (see the worked example below).</LI><LI>Spark compute for Delta table creation: a scalable runtime for transformation flows in the HDLF space.</LI><LI>"SQL on Files" feature: read data from the HDLF space into HANA Cloud models for analytics.</LI></UL><P>You can check the price quote for the object store and compute in the BDC price estimator <A href="https://bdc-pricing-estimator-sac-sacus10.cfapps.us10.hana.ondemand.com/" target="_blank" rel="noopener nofollow noreferrer">https://bdc-pricing-estimator-sac-sacus10.cfapps.us10.hana.ondemand.com/</A>.</P>
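<P>As a rough, illustrative calculation based on the 0.1 CU per hour figure above (my own arithmetic, assuming a ~730-hour month; use the price estimator for exact numbers): 1 TB × 0.1 CU/hour × 730 hours ≈ 73 CU per month, so 10 TB of Delta table files would consume roughly 730 CU per month for storage alone, excluding Spark compute.</P>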
<P><STRONG><FONT color="#800000">Architecture:</FONT></STRONG></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ObjectStore_Spark_SQLonFiles.png" style="width: 738px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/372732i54FE2B7C5AD2D9C7/image-dimensions/738x532?v=v2" width="738" height="532" role="button" title="ObjectStore_Spark_SQLonFiles.png" alt="ObjectStore_Spark_SQLonFiles.png" /></span></P><UL><LI>The embedded HDLF object store in BDC Datasphere is optional.</LI><LI>It requires <FONT color="#993300">additional capacity units</FONT>; the BDC pricing estimator can provide the number.</LI><LI>Data is replicated from source systems and stored as Delta tables using <FONT color="#993300">Replication flows</FONT>.</LI><LI><FONT color="#993300">Delta tables</FONT> are Parquet files with transaction logs, maintained in the HDLF object store.</LI><LI>The <FONT color="#993300">SQL on Files engine</FONT> can read data from the Delta tables of the HDLF object store in HANA Cloud without loading them into the HANA database. It is a native HANA feature and is explained in detail in this SAP blog: <A href="https://community.sap.com/t5/technology-blog-posts-by-sap/unlocking-the-true-potential-of-data-in-files-with-sap-hana-database-sql-on/ba-p/13861585" target="_blank">SQL on Files</A>.</LI><LI><FONT color="#993300">Apache Spark compute</FONT> provides a scalable runtime for the transformation flows to process Delta tables (Parquet files) from Bronze to Silver to Gold layer Delta tables (medallion architecture).</LI></UL><P><STRONG><FONT color="#800080">Configuration of the Datasphere object store</FONT></STRONG></P><P><SPAN>The minimum capacity units required to activate the object store in BDC is</SPAN><SPAN> </SPAN><U><STRONG>4150.05 CU</STRONG></U><SPAN> </SPAN><SPAN>(128 GB of SAP HANA memory). You can simulate the price in the BDC estimator; the steps are provided in the SAP blog <A title="Objectstore CUs" href="https://community.sap.com/t5/technology-blog-posts-by-sap/what-is-the-minimum-capacity-units-cu-to-activate-object-store-in-sap/ba-p/14181475#:~:text=Resolution%3A%20The%20minimum%20capacity%20units,Data%20Cloud%20Capacity%20Unit%20Estimator." target="_blank">Objectstore CUs</A>.</SPAN></P><P>After that, you can configure the tenant in Datasphere --> System --> Configuration --> Tenant configuration and add the object store units. You can find the detailed steps in the SAP blog <A href="https://community.sap.com/t5/technology-blog-posts-by-sap/how-to-activate-the-object-store-in-sap-datasphere/ba-p/14279053" target="_self">Configuring the Tenant</A>.</P><P>Afterwards, when we create a new space in Datasphere, the option to choose a HANA Cloud or HDLF space will be enabled.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="SreekanthSurampally_0-1771160115109.png" style="width: 290px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/372794iEC5DED0C9C956CCC/image-dimensions/290x282?v=v2" width="290" height="282" role="button" title="SreekanthSurampally_0-1771160115109.png" alt="SreekanthSurampally_0-1771160115109.png" /></span></P><P><STRONG><FONT color="#800080">HANA Space vs. HDLF Space in Datasphere</FONT>:</STRONG></P><P>After configuring the HDLF object store, we have two types of spaces in Datasphere:</P><UL><LI>HANA space</LI><LI>HANA Data Lake Files space</LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Datasphere HDLF object Store.png" style="width: 733px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/372734i7A70DEA2306B3C69/image-size/large?v=v2&px=999" role="button" title="Datasphere HDLF object Store.png" alt="Datasphere HDLF object Store.png" /></span></P><P>The HANA Cloud space is the traditional space available in the standalone Datasphere tenant (BTP); it requires HANA memory and the HANA compute engine to store and process the data. On the other side, the HDLF space is available in BDC Datasphere and uses Spark compute and object storage.</P><P>We must choose the space in which we intend to work and build the objects accordingly. Monitoring of jobs is specific to the space. Overall, the look and feel for object development and job monitoring is the same in both spaces.</P><P>In the HDLF space, replication flows and transformation flows are used to replicate and transform the large data files. Modeling objects like graphical or SQL views and analytic models can't be developed in this space.</P><P>We can share the HDLF local Delta tables (files) into the HANA space and use them in transformation flows, and vice versa: 
that is, share the HANA space local tables into an HDLF space and use them in transformation flows.</P><P><FONT color="#800080"><STRONG>End-to-end process flow: Replication flows + Transformation flows</STRONG></FONT></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Workflow_ObjectStore.png" style="width: 812px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/372797iC069A3F7681D1550/image-dimensions/812x374?v=v2" width="812" height="374" role="button" title="Workflow_ObjectStore.png" alt="Workflow_ObjectStore.png" /></span></P><P>Replication flows are used to integrate/replicate data from source systems into the inbound buffer first; then a merge task needs to be executed to write the data to the Delta table file. Transformation flows are used to transform the data into the Silver and Gold layers of Delta table files in the HDLF object store. These Delta tables can be shared into the HANA space of Datasphere in the view builder.</P><P><STRONG><FONT color="#993366">Importance of the merge task:</FONT></STRONG></P><P>Since the local tables (files) contain large data volumes, the merge task updates the data by allocating the required amount of Spark compute resources. You can enable the merge task directly in the replication flow configuration or run it manually from the monitor. You can schedule it as a task in task chains, and you can now cancel a running merge task in the monitor tab. The merge task is also required for BW data product generator Delta table files.</P><P><FONT color="#0000FF">Note:</FONT> If the merge task is not run, the data is not available for reporting.</P><P>Python scripts can be written in the transformation flows and run with Spark compute to process large data files.</P><P><STRONG><FONT color="#993366">Data product generator for the object store in Datasphere:</FONT></STRONG></P><P class="">Data products are created as local Delta tables (files) in the object store of Datasphere from BW InfoProviders. For the subscriptions that are created in BW PCE, even delta data can be pushed from a BW ADSO into the object store. These delta local tables can be shared with other applications.</P><P><STRONG><FONT color="#993366"><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Source from SAP" style="width: 715px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/372796iE79D0379A6D6D612/image-dimensions/715x314?v=v2" width="715" height="314" role="button" title="BWinBDC.png" alt="Source from SAP" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Source from SAP</span></span></FONT></STRONG></P><P>A more detailed explanation of the BW data products can be found in this blog: <A href="https://community.sap.com/t5/technology-blog-posts-by-sap/the-sap-bw-data-product-generator-for-sap-business-data-cloud/ba-p/14072413" target="_blank">Data Product Generator</A></P><P><STRONG><FONT color="#993366">Limitations of the object store in Datasphere:</FONT></STRONG></P><UL><LI>Replication flows currently do not support local tables (files) as a source; 
they can only be used as an inbound target.</LI><LI>Only 5 HDLF spaces can be created in a Datasphere tenant.</LI><LI>Objects without a primary key cannot be loaded into the embedded object store.</LI><LI>The load type "initial load only" is not supported in replication flows.</LI><LI>Some data types are not supported in replication flows of the HDLF space.</LI><LI>You will need to check the connection documentation to make sure whether it is supported for data lake files or not.</LI></UL><P><STRONG><FONT color="#993300">Summary:</FONT> </STRONG>The SAP Datasphere object store is a cost-effective solution to store large data files as Delta tables, transform the data with the Spark runtime, and share them as data products with SAP Databricks.</P>2026-02-15T17:50:03.644000+01:00