https://raw.githubusercontent.com/ajmaradiaga/feeds/main/scmt/topics/SAP-Datasphere-blog-posts.xml SAP Community - SAP Datasphere 2024-05-20T11:12:45.229747+00:00 python-feedgen SAP Datasphere blog posts in SAP Community https://community.sap.com/t5/technology-blogs-by-sap/sap-datasphere-news-in-april/ba-p/13690421 SAP Datasphere News in April 2024-05-03T10:45:11.774000+02:00 kpsauer https://community.sap.com/t5/user/viewprofilepage/user-id/14110 <P>April was for real <span class="lia-unicode-emoji" title=":hundred_points:">💯</span>&nbsp;with regards to great new features for SAP Datasphere. There are also lots of excellent new blogs in the community and in case you missed it, our What’s New in SAP Datasphere session covering Q1 just ran in early April.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="2024-04 News.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/105373i3756111B2D2B511C/image-size/large?v=v2&amp;px=999" role="button" title="2024-04 News.png" alt="2024-04 News.png" /></span></P><H2 id="toc-hId-994312968">What’s New in SAP Datasphere Q1 2024 Webinar</H2><P>We just ran our Q1&nbsp;<A href="https://events.sap.com/2024-1231-sap-btp-whatsnew-datashere-webinar-global/en/home" target="_blank" rel="noopener noreferrer">What’s New in SAP Datasphere session</A>&nbsp;on April 10th 2024. No worries, in case you missed it. The session covering the features and functions delivered in Q1 2024 was recorded and is available on YouTube.<BR />Watch the recording <span class="lia-unicode-emoji" title=":backhand_index_pointing_down:">👇</span><BR /><div class="video-embed-center video-embed"><iframe class="embedly-embed" src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FDnDRVTEkqEA%3Ffeature%3Doembed&amp;display_name=YouTube&amp;url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DDnDRVTEkqEA&amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2FDnDRVTEkqEA%2Fhqdefault.jpg&amp;key=b0d40caa4f094c68be7c29880b16f56e&amp;type=text%2Fhtml&amp;schema=youtube" width="400" height="225" scrolling="no" title="What's New in SAP Datasphere in Q1-2024" frameborder="0" allow="autoplay; fullscreen; encrypted-media; picture-in-picture;" allowfullscreen="true"></iframe></div></P><P>&nbsp;</P><H2 id="toc-hId-797799463">My top features in April</H2><H3 id="toc-hId-730368677"><span class="lia-unicode-emoji" title=":right_arrow:">➡️</span>&nbsp;Confluent Kafka as Target for Replication Flows</H3><P>After delivering the native Apache Kafka connection already back in November 2023, you can now use Confluent Kafka and the corresponding connection type to add targets to replication flows.&nbsp;<BR /><SPAN>For more information: </SPAN><A href="https://help.sap.com/docs/PRODUCTS/42ab3fb5d0d5462b8d917bef60e5364d/74b3c95464f246aa8c3fd510661daa6d.html?locale=en-US&amp;state=DRAFT&amp;version=STABI" target="_blank" rel="noopener noreferrer">Using Confluent Kafka As the Target</A><SPAN> and </SPAN><A href="https://help.sap.com/docs/PRODUCTS/42ab3fb5d0d5462b8d917bef60e5364d/eb85e157ab654152bd68a8714036e463.html?locale=en-US&amp;state=DRAFT&amp;version=STABI" target="_blank" rel="noopener noreferrer">Integrating Data via Connections</A></P><H3 id="toc-hId-533855172"><STRONG><span class="lia-unicode-emoji" title=":right_arrow:">➡️</span></STRONG>&nbsp;Cataloging offers ability to add an&nbsp;SAP Datasphere, SAP BW bridge&nbsp;as a Source System</H3><P>You can add an&nbsp;SAP Datasphere, SAP BW bridge&nbsp;system to the 
catalog and have the metadata objects monitored and extracted to the catalog. Once extracted, you can publish these objects as assets to the catalog and use them in the impact and lineage diagram.<BR />For more information: <A href="https://help.sap.com/docs/PRODUCTS/d4185d7d9a634d06a5459c214792c67e/287ad2d8640449d5a62c6bb5b4a37d4d.html?locale=en-US&amp;version=cloud" target="_blank" rel="noopener noreferrer">Connecting to Data Sources</A> and <A href="https://help.sap.com/docs/PRODUCTS/d4185d7d9a634d06a5459c214792c67e/dc061a23484241b1b791f5540b1f38e3.html?locale=en-US&amp;version=cloud" target="_blank" rel="noopener noreferrer">Evaluating Catalog Assets</A></P><H3 id="toc-hId-337341667"><span class="lia-unicode-emoji" title=":right_arrow:">➡️</span>&nbsp;Measure Type&nbsp;Non-Cumulative Measure&nbsp;for Analytic Models</H3><P>You can now create non-cumulative measures for analytic models. This type of measure is used for cases where measure values describe a state that is not changing every day. For example, you can use non-cumulative measures to calculate the stock in a warehouse, the bank balance of an account or certain values in HR use cases.<BR />For more information: <A href="https://help.sap.com/docs/PRODUCTS/d4185d7d9a634d06a5459c214792c67e/58fcee02df8044119777cf060000aca8.html?locale=en-US&amp;version=cloud" target="_blank" rel="noopener noreferrer">Create a Non-Cumulative Measure</A>&nbsp;.</P><H3 id="toc-hId-140828162"><STRONG><span class="lia-unicode-emoji" title=":right_arrow:">➡️</span></STRONG>&nbsp;Add a Visual Tenant Type Indicator</H3><P>When you are working with multiple tenants you can now add a visual tenant type indicator to distinguish which system you are using. For example, between a test or production system. When enabled, a colored information bar is visible to all users of the tenant, and the browser favicon is updated with the matching color.&nbsp;<BR />For more information: <A href="https://help.sap.com/docs/PRODUCTS/d4185d7d9a634d06a5459c214792c67e/6bdd79878afa4ec5bcd9d3502158a06e.html?locale=en-US&amp;version=cloud" target="_blank" rel="noopener noreferrer">Display Your System Information</A></P><P>&nbsp;</P><P><div class="video-embed-center video-embed"><iframe class="embedly-embed" src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2F9_vIaKbqqng%3Ffeature%3Doembed&amp;display_name=YouTube&amp;url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3D9_vIaKbqqng&amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2F9_vIaKbqqng%2Fhqdefault.jpg&amp;key=b0d40caa4f094c68be7c29880b16f56e&amp;type=text%2Fhtml&amp;schema=youtube" width="200" height="112" scrolling="no" title="SAP Datasphere Top Features in April" frameborder="0" allow="autoplay; fullscreen; encrypted-media; picture-in-picture;" allowfullscreen="true"></iframe></div></P><P>&nbsp;</P><H2 id="toc-hId--184768062">Blogs for our Japanese Community</H2><UL><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/sap-datasphere-%E3%82%A4%E3%83%B3%E3%83%86%E3%83%AA%E3%82%B8%E3%82%A7%E3%83%B3%E3%83%88%E3%83%AB%E3%83%83%E3%82%AF%E3%82%A2%E3%83%83%E3%83%97-%E5%90%8D%E5%AF%84%E3%81%9B%E6%A9%9F%E8%83%BD%E3%81%AE%E3%81%94%E7%B4%B9%E4%BB%8B/ba-p/13679229" target="_blank">SAP Datasphere インテリジェントルックアップ : 名寄せ機能のご紹介</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/sap-datasphere-%E3%83%87%E3%83%BC%E3%82%BF%E3%82%A2%E3%82%AF%E3%82%BB%E3%82%B9%E5%88%B6%E5%BE%A1-%E8%A1%8C%E3%83%AC%E3%83%99%E3%83%AB%E3%82%BB%E3%82%AD%E3%83%A5%E3%83%AA%E3%83%86%E3%82%A3/ba-p/13668264" 
target="_blank">SAP Datasphere : データアクセス制御 (行レベルセキュリティ)</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/sap-datapshere%E3%81%A7%E3%81%AE-hana-cloud-data-lake-%E3%81%AE%E5%88%A9%E7%94%A8%E6%96%B9%E6%B3%95/ba-p/13637646" target="_blank">SAP Datapshereでの HANA Cloud, Data Lake の利用方法</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/sap-s-4hana%E3%81%A8sap-datasphere%E3%81%AE%E3%83%87%E3%83%BC%E3%82%BF%E9%80%A3%E6%90%BA-%E3%83%87%E3%83%BC%E3%82%BF%E3%83%95%E3%83%AD%E3%83%BC%E3%81%AE%E4%BD%9C%E6%88%90/ba-p/13656186" target="_blank">SAP S/4HANAとSAP Datasphereのデータ連携 : データフローの作成</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/sap-s-4hana%E3%81%A8sap-datasphere%E3%81%AE%E3%83%87%E3%83%BC%E3%82%BF%E9%80%A3%E6%90%BA-%E3%83%AA%E3%83%A2%E3%83%BC%E3%83%88%E3%83%86%E3%83%BC%E3%83%96%E3%83%AB-%E4%BB%AE%E6%83%B3%E3%83%86%E3%83%BC%E3%83%96%E3%83%AB-%E3%81%AE%E4%BD%9C%E6%88%90/ba-p/13655824" target="_blank">SAP S/4HANAとSAP Datasphereのデータ連携 : リモートテーブル(仮想テーブル)の作成</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/sap-s-4hana%E3%81%A8sap-datasphere%E3%81%AE%E3%83%87%E3%83%BC%E3%82%BF%E9%80%A3%E6%90%BA-%E6%8E%A5%E7%B6%9A-%E3%83%AA%E3%83%A2%E3%83%BC%E3%83%88%E3%82%BD%E3%83%BC%E3%82%B9-%E3%81%AE%E8%A8%AD%E5%AE%9A/ba-p/13655609" target="_blank">SAP S/4HANAとSAP Datasphereのデータ連携 : 接続(リモートソース)の設定</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/sap-s-4hana%E3%81%A8sap-datasphere%E3%81%AE%E3%83%87%E3%83%BC%E3%82%BF%E9%80%A3%E6%90%BA-sap-s-4hana-%E3%82%B7%E3%82%B9%E3%83%86%E3%83%A0%E3%81%AE%E8%A8%AD%E5%AE%9A/ba-p/13655508" target="_blank">SAP S/4HANAとSAP Datasphereのデータ連携 : SAP S/4HANA システムの設定</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/sap-s-4hana%E3%81%A8sap-datasphere%E3%81%AE%E3%83%87%E3%83%BC%E3%82%BF%E9%80%A3%E6%90%BA-cloud-connector%E3%81%AE%E8%A8%AD%E5%AE%9A/ba-p/13655271" target="_blank">SAP S/4HANAとSAP Datasphereのデータ連携 : Cloud Connectorの設定</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/sap-s-4hana%E3%81%A8sap-datasphere%E3%81%AE%E3%83%87%E3%83%BC%E3%82%BF%E9%80%A3%E6%90%BA-dp-agent%E3%81%AE%E8%A8%AD%E5%AE%9A/ba-p/13655244" target="_blank">SAP S/4HANAとSAP Datasphereのデータ連携 : DP Agentの設定</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/sap-s-4hana%E3%81%A8sap-datasphere%E3%81%AE%E3%83%87%E3%83%BC%E3%82%BF%E9%80%A3%E6%90%BA/ba-p/13655121" target="_blank">SAP S/4HANAとSAP Datasphereのデータ連携</A></LI></UL><P>&nbsp;</P><H2 id="toc-hId--381281567">My top blogs in April</H2><P>Excellent blog from <a href="https://community.sap.com/t5/user/viewprofilepage/user-id/1426829">@AlexLePape</a>&nbsp;outlining the integration of SAP Datasphere with the Customer Data Platform (CDP) through Direct Data Access<BR /><span class="lia-unicode-emoji" title=":right_arrow:">➡️</span>&nbsp;<A href="https://community.sap.com/t5/data-and-analytics-blog-posts/unlocking-the-power-of-customer-data-sap-customer-data-platform-and-sap/ba-p/13651446" target="_blank">Unlocking the Power of Customer Data –&nbsp;SAP Customer Data Platform and SAP Analytics Cloud</A>.</P><P><a href="https://community.sap.com/t5/user/viewprofilepage/user-id/1506">@ClaudiaFiess</a>&nbsp;started a blog series about SAP extractors and SAP Datasphere with many great details on the different scenario options:<BR /><span class="lia-unicode-emoji" title=":right_arrow:">➡️</span>&nbsp;<A 
href="https://community.sap.com/t5/technology-blogs-by-sap/exploring-integration-options-in-sap-datasphere-with-the-focus-on-using-sap/ba-p/13658329" target="_blank">Exploring Integration Options in SAP Datasphere with the focus on using SAP extractors</A><BR /><span class="lia-unicode-emoji" title=":right_arrow:">➡️</span>&nbsp;<A href="https://community.sap.com/t5/technology-blogs-by-sap/exploring-integration-options-in-sap-datasphere-with-the-focus-on-using-sap/ba-p/13680387" target="_blank">Exploring Integration Options in SAP Datasphere with the focus on using SAP extractors - Part II</A></P><P>Another favorite in April is <a href="https://community.sap.com/t5/user/viewprofilepage/user-id/39263">@jaigupta</a>&nbsp;excellent deep dive into currency conversion and the derivation of variables<BR /><span class="lia-unicode-emoji" title=":right_arrow:">➡️</span>&nbsp;<A href="https://community.sap.com/t5/technology-blogs-by-sap/sap-datasphere-using-variable-derivation-for-currency-conversion-measures/ba-p/13660714" target="_blank">Using Variable derivation for currency conversion measures within Analytic Model</A></P><P>The last blogs I’d like to highlight is about the command line interface. <a href="https://community.sap.com/t5/user/viewprofilepage/user-id/1445338">@Yves_Kipfer</a>&nbsp;checks on the new feature of CRUD operations&nbsp;<BR /><span class="lia-unicode-emoji" title=":right_arrow:">➡️</span>&nbsp;<A href="https://community.sap.com/t5/technology-blogs-by-sap/use-crud-opreations-with-sap-datasphere-command-line-interface/ba-p/13682690" target="_blank">Use CRUD Operations with SAP Datasphere Command-Line-Interface</A><BR /><BR /></P><H2 id="toc-hId--577795072">More blogs to check out</H2><UL><LI><A href="https://community.sap.com/t5/data-and-analytics-discussions/hybrid-scenario-usage-of-sap-hana-calculation-view-on-sap-datasphere/td-p/13687212/jump-to/first-unread-message" target="_blank">Hybrid Scenario: Usage of SAP HANA Calculation View on SAP Datasphere</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/sap-datasphere-sap-s-4hana-your-guide-to-seamless-data-integration/ba-p/13662817" target="_blank">SAP Datasphere + SAP S/4HANA: Your Guide to Seamless Data Integration</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/hybrid-architectures-a-modern-approach-for-sap-data-integration/ba-p/13685389" target="_blank">Hybrid Architectures: A Modern Approach for SAP Data Integration</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-members/sap-datasphere-s-updated-pricing-amp-packaging-lower-costs-amp-more/ba-p/13682679" target="_blank">SAP Datasphere's updated Pricing &amp; Packaging: Lower Costs &amp; More Flexibility</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-members/quick-amp-easy-datasphere-when-to-use-data-flow-transformation-flow-sql/ba-p/13678235" target="_blank">Quick &amp; Easy Datasphere - When to use Data Flow, Transformation Flow, SQL View?</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/enhance-your-sap-datasphere-experience-with-api-access/ba-p/13585094" target="_blank">Enhance your SAP Datasphere Experience with API Access</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/use-crud-opreations-with-sap-datasphere-command-line-interface/ba-p/13682690" target="_blank">Use CRUD Operations&nbsp;with SAP Datasphere Command-Line-Interface</A></LI><LI><A 
href="https://community.sap.com/t5/technology-blogs-by-sap/exploring-integration-options-in-sap-datasphere-with-the-focus-on-using-sap/ba-p/13680387" target="_blank">Exploring Integration Options in SAP Datasphere with the focus on using SAP extractors - Part II</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/sap-analytics-cloud-add-in-for-microsoft-excel-and-sap-datasphere/ba-p/13676027" target="_blank">SAP Analytics Cloud, add-in for Microsoft Excel and SAP Datasphere connection</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/10-ways-to-reshape-your-sap-landscape-with-sap-business-technology-platform/ba-p/13614282" target="_blank">10+ ways to reshape your SAP landscape with SAP Business Technology Platform – Blog 4</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-members/data-flows-the-python-script-operator-and-why-you-should-avoid-it/ba-p/13664408" target="_blank">Data Flows - The Python Script Operator and why you should avoid it</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/enhanced-data-analysis-of-fitness-data-using-hana-vector-engine-datasphere/ba-p/13666367" target="_blank">Enhanced Data Analysis of Fitness Data using HANA Vector Engine, Datasphere and SAP Analytics Cloud</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/join-and-innovate-with-the-sap-enterprise-support-advisory-council-esac/ba-p/13666646" target="_blank">Join and innovate with the SAP Enterprise Support Advisory Council (ESAC) Program</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/pilot-sap-datasphere-fundamentals/ba-p/13663830" target="_blank">Pilot: SAP Datasphere Fundamentals</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/customers-and-partners-are-using-sap-btp-to-innovate-and-extend-their-sap/ba-p/13656355" target="_blank">Customers and Partners are using SAP BTP to Innovate and Extend their SAP Applications</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/sap-datasphere-is-ready-to-take-over-the-role-of-sap-bw/ba-p/13661635" target="_blank">SAP Datasphere is ready to take over the role of SAP BW</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/sap-datasphere-using-variable-derivation-for-currency-conversion-measures/ba-p/13660714" target="_blank">SAP Datasphere: Using Variable derivation for currency conversion measures within Analytic Model</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-members/replication-flows-sap-datasphere-to-google-bigquery/ba-p/13654502" target="_blank">Replication flows: SAP Datasphere to Google BigQuery</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/exploring-integration-options-in-sap-datasphere-with-the-focus-on-using-sap/ba-p/13658329" target="_blank">Exploring Integration Options in SAP Datasphere with the focus on using SAP extractors</A></LI><LI><A href="https://community.sap.com/t5/technology-blogs-by-sap/sap-datasphere-space-data-integration-and-data-modeling-best-practices/ba-p/13651889" target="_blank">SAP Datasphere - Space, Data Integration, and Data Modeling Best Practices</A></LI><LI><A href="https://community.sap.com/t5/artificial-intelligence-and-machine-learning-blogs/knowledge-graphs-on-datasphere-and-hana-cloud-the-differences/ba-p/13654058" target="_blank">Knowledge Graphs on Datasphere and HANA Cloud. 
The differences</A></LI><LI><A href="https://community.sap.com/t5/data-and-analytics-blog-posts/unlocking-the-power-of-customer-data-sap-customer-data-platform-and-sap/ba-p/13651446" target="_blank">Unlocking the Power of Customer Data –&nbsp;SAP Customer Data Platform and SAP Analytics Cloud</A></LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="gold-line-2.png" style="width: 902px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/105377iC086D7409EC4235B/image-size/large?v=v2&amp;px=999" role="button" title="gold-line-2.png" alt="gold-line-2.png" /></span></P><P>Find more information and related blog posts on the&nbsp;<A href="https://pages.community.sap.com/topics/datasphere" target="_blank" rel="noopener noreferrer">topic page for SAP Datasphere</A>. You will find further product information on our Community with various subpages about&nbsp;<A href="https://pages.community.sap.com/topics/datasphere/getting-started" target="_blank" rel="noopener noreferrer">Getting Started</A>,&nbsp;<A href="https://pages.community.sap.com/topics/datasphere/business-content" target="_blank" rel="noopener noreferrer">Business Content</A>, the&nbsp;<A href="https://pages.community.sap.com/topics/datasphere/bw-bridge" target="_blank" rel="noopener noreferrer">SAP BW Bridge</A> as well as content for&nbsp;<A href="https://pages.community.sap.com/topics/datasphere/best-practices-troubleshooting" target="_blank" rel="noopener noreferrer">Best Practices &amp; Troubleshooting</A>&nbsp;and the <A href="https://pages.community.sap.com/topics/datasphere/faq" target="_blank" rel="noopener noreferrer">FAQ for SAP Datasphere</A>.</P><P>Find out how to unleash the power of your business data with SAP’s free learning content on <A href="https://learning.sap.com/learning-journey/explore-sap-datasphere?source=social-meta-prdteng-ExploreSAPDatasphere" target="_blank" rel="noopener noreferrer">SAP Datasphere</A>. It’s designed to help you enrich your data projects, simplify the data landscape, and make the most out of your investment. Check out even more role-based learning resources and opportunities to get certified in one place on <A href="https://learning.sap.com/?url_id=text-sapcommunity-prdteng" target="_blank" rel="noopener noreferrer">&nbsp;SAP Learning site.</A></P> 2024-05-03T10:45:11.774000+02:00 https://community.sap.com/t5/technology-blogs-by-members/create-sap-s-4hana-external-gl-account-hierarchy-within-sap-datasphere/ba-p/13692445 Create SAP S/4HANA External GL Account Hierarchy within SAP Datasphere using standard CDS Views 2024-05-07T09:51:03.951000+02:00 JoelleS https://community.sap.com/t5/user/viewprofilepage/user-id/1431336 <H2 id="toc-hId-994372616">1.&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; Introduction</H2><P>This blogpost is inspired by this Blogpost (<A href="https://community.sap.com/t5/technology-blogs-by-sap/guide-create-sap-s-4hana-external-gl-account-hierarchy-within-sap/ba-p/13574648" target="_blank">https://community.sap.com/t5/technology-blogs-by-sap/guide-create-sap-s-4hana-external-gl-account-hierarchy-within-sap/ba-p/13574648</A>). It is described how you can create an SAP S/4HANA External GL Account Hierarchy within SAP Datasphere.</P><P>However, sometimes it might be not possible to use the community content package, therefore we want to hand you a guide which shows you how the hierarchy is being set up. If you don’t have any deviations, we still recommend using the community package, since it will save you a lot of time and effort. 
Please be aware that this logic can also be used for other use cases: even if the data changes, the logic remains the same.</P><P>First, you need to set up a connection between your S/4HANA Cloud System and your space in SAP Datasphere. A guide series provided by my colleague Simone and me will follow soon. I will keep this blog post up to date and add the link as soon as our series starts.</P><P>You should know some basics of SAP Datasphere (building views in general and creating a replication flow).</P><H2 id="toc-hId-797859111">2.&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; Basic Logic and Setup</H2><P>&nbsp;</P><P>We will build the hierarchy and the final view of a balance sheet with the G/L hierarchy applied to it. The logic shown below is applied to it and is the same as in the community content. The purple arrows represent the associations.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="GL Account Model.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106170i235D3063E013AECD/image-size/large?v=v2&amp;px=999" role="button" title="GL Account Model.png" alt="GL Account Model.png" /></span></P><P>Therefore, we will need the following CDS views, which will be pulled via a replication flow. In this step we will create new target tables. These Local Tables keep the same names as the original CDS views. You can choose existing target tables as well if you have them, but I assume you are starting from scratch.</P><P>For the views, we will use different business and technical names. We stuck with the naming convention of the original blog post with the community content to avoid confusion.</P><P>&nbsp;</P><TABLE><TBODY><TR><TD width="156"><P><STRONG>Name CDS View and DSP Local Table</STRONG></P></TD><TD width="151"><P><STRONG>Technical Name View</STRONG></P></TD><TD width="151"><P><STRONG>Business Name View</STRONG></P></TD><TD width="151"><P><STRONG>Usage</STRONG></P></TD></TR><TR><TD width="156"><P>I_GLACCOUNTHIERARCHY</P></TD><TD width="151"><P>SAP_CC_FI_HRF_IL_I_GLACCOUNTHIERARCHY</P></TD><TD width="151"><P>HRF: G/L Account Hierarchy Directory (DIM)</P></TD><TD width="151"><P>Local Table for GL Account Hierarchy View</P></TD></TR><TR><TD width="156"><P>I_GLACCOUNTHIERARCHYTEXT</P></TD><TD width="151"><P>SAP_CC_FI_HRF_IL_I_GLACCOUNTHIERARCHYTEXT</P></TD><TD width="151"><P>HRF: G/L Account Hierarchy Directory - Text (IL)</P></TD><TD width="151"><P>Local Table for GL Account Hierarchy Text</P></TD></TR><TR><TD width="156"><P>I_GLACCOUNTHIERARCHYNODE</P></TD><TD width="151"><P>SAP_CC_FI_HRF_IL_I_GLACCOUNTHIERARCHYNODE</P></TD><TD width="151"><P>HRF: G/L Account Hierarchy Node (IL)</P></TD><TD width="151"><P>Local Table for GL Account Hierarchy Node</P></TD></TR><TR><TD width="156"><P>I_GLACCOUNTHIERARCHYNODET</P></TD><TD width="151"><P>SAP_CC_FI_HRF_IL_I_GLACCOUNTHIERARCHYNODET</P></TD><TD width="151"><P>HRF: G/L Account Group (Hierarchy Node) - Text (IL)</P></TD><TD width="151"><P>Local Table for GL Account Hierarchy Node Text</P></TD></TR><TR><TD width="156"><P>I_GLACCOUNTINCHARTOFACCOUNTS</P></TD><TD width="151"><P>SAP_CC_FI_HRF_IL_I_GLACCOUNTINCHARTOFACCOUNTS</P></TD><TD width="151"><P>HRF: G/L Account (DIM)</P></TD><TD width="151"><P>Local Table for GL Account Master data Attributes</P></TD></TR><TR><TD width="156"><P>I_GLACCOUNTTEXTRAWDATA</P></TD><TD width="151"><P>SAP_CC_FI_HRF_IL_I_GLACCOUNTTEXTRAWDATA</P></TD><TD width="151"><P>HRF: G/L Account – Text (IL)</P></TD><TD width="151"><P>Local Table
for GL Account Master data Text</P></TD></TR></TBODY></TABLE><P>&nbsp;</P><P>In addition to these, you should have a view containing your transaction data. In our example we have a custom CDS View with Journal Entries. We are aware that there are Standard CDS Views that contain the same data, but in our case, we had to add some measures.</P><P>After deploying and running your replication flow, you should now have several Local Tables. Within the Data Builder you can filter after them using the top row.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="image.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106171iBB33829FB3E2EF1B/image-size/large?v=v2&amp;px=999" role="button" title="image.png" alt="image.png" /></span></P><P>Starting from the bottom, we will first create the hierarchy. Afterwards we will build the dimension to which the hierarchy is applied to. This dimension is our master data, which will be associated with our transaction data (Fact View).&nbsp; Based on the Fact View, we will create our Analytic Model to make our data available for consumption in SAP Analytics Cloud.</P><P>&nbsp;</P><H2 id="toc-hId-601345606">3.&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; Building the Hierarchy with Directory</H2><H3 id="toc-hId-533914820">3.1 View (Text) for GL Account Hierarchy Text</H3><P>We will first build this Text View:</P><TABLE><TBODY><TR><TD width="156"><P><STRONG>Name CDS View and DSP Local Table</STRONG></P></TD><TD width="151"><P><STRONG>Technical Name View</STRONG></P></TD><TD width="151"><P><STRONG>Business Name View</STRONG></P></TD><TD width="151"><P><STRONG>Usage</STRONG></P></TD></TR><TR><TD width="156"><P>I_GLACCOUNTHIERARCHYTEXT</P></TD><TD width="151"><P>SAP_CC_FI_HRF_IL_I_GLACCOUNTHIERARCHYTEXT</P></TD><TD width="151"><P>HRF: G/L Account Hierarchy Directory &nbsp;- Text (IL)</P></TD><TD width="151"><P>GL Account Hierarchy Text</P></TD></TR></TBODY></TABLE><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Blogpost2.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106173i74C9B062EB5F6307/image-size/large?v=v2&amp;px=999" role="button" title="Blogpost2.png" alt="Blogpost2.png" /></span></P><P>Create the view and adjust the following Colums:</P><UL><LI>ValidityEndDate with the expression TO_DATE(ValidityEndDate)</LI><LI>ValidityStartDate with the expression TO_DATE(ValidityStartDate)<span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_29-1714986697637.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106174i30F6850B414B8B29/image-size/large?v=v2&amp;px=999" role="button" title="JoelleS_29-1714986697637.png" alt="JoelleS_29-1714986697637.png" /></span></LI></UL><P>&nbsp;</P><P>In the newly created View (Text) change the semantic type from “relational dataset” to “Text”.</P><P>Work on the Semantic Types for the Attributes accordingly.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_31-1714987030392.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106177i76A77B50ABEBA390/image-size/large?v=v2&amp;px=999" role="button" title="JoelleS_31-1714987030392.png" alt="JoelleS_31-1714987030392.png" /></span></P><P>There is no need to set a representative key in this step.</P><H3 id="toc-hId-337401315">3.2 View (Dimension) for GL Account Hierarchy View</H3><P>Next up is the 
related Dimension:</P><TABLE><TBODY><TR><TD width="156"><P><STRONG>Name CDS View and DSP Local Table</STRONG></P></TD><TD width="151"><P><STRONG>Technical Name View</STRONG></P></TD><TD width="151"><P><STRONG>Business Name View</STRONG></P></TD><TD width="151"><P><STRONG>Usage</STRONG></P></TD></TR><TR><TD width="156"><P>I_GLACCOUNTHIERARCHY</P></TD><TD width="151"><P>SAP_CC_FI_HRF_IL_I_GLACCOUNTHIERARCHY</P></TD><TD width="151"><P>HRF: G/L Account Hierarchy Directory (DIM)</P></TD><TD width="151"><P>GL Account Hierarchy View</P></TD></TR></TBODY></TABLE><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Blogpost3.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106178i44AD44A3E1E24FD3/image-size/large?v=v2&amp;px=999" role="button" title="Blogpost3.png" alt="Blogpost3.png" /></span></P><P>Create the view and adjust the following Colums:</P><UL><LI>ValidityEndDate with the expression TO_DATE(ValidityEndDate)</LI><LI>ValidityStartDate with the expression TO_DATE(ValidityStartDate)</LI></UL><DIV class="">&nbsp;</DIV><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_32-1714987208817.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106179iC62577B076085429/image-size/large?v=v2&amp;px=999" role="button" title="JoelleS_32-1714987208817.png" alt="JoelleS_32-1714987208817.png" /></span></P><P>In the newly created View (Dimension) change the semantic type from “relational dataset” to “Dimension”.</P><P>Create an Association between HRF: G/L Account Hierarchy Directory&nbsp; - Text (IL) (SAP_CC_FI_HRF_IL_I_GLACCOUNTHIERARCHYTEXT) and the newly created Dimension.</P><DIV class="">&nbsp;</DIV><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_33-1714987314331.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106181iAAB53C68C91D3B06/image-size/large?v=v2&amp;px=999" role="button" title="JoelleS_33-1714987314331.png" alt="JoelleS_33-1714987314331.png" /></span>Work on the Semantic Types for the Attributes accordingly. 
Please ignore the error warning.&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_34-1714987387752.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106182i57A54A1B844F60C5/image-size/large?v=v2&amp;px=999" role="button" title="JoelleS_34-1714987387752.png" alt="JoelleS_34-1714987387752.png" /></span></P><P>There is no need to set a representative key in this step.</P><H3 id="toc-hId-140887810">3.3&nbsp;&nbsp; View (Text) for GL Account Hierarchy Node Text</H3><P>We will first build this Text View:</P><TABLE><TBODY><TR><TD width="156"><P><STRONG>Name CDS View and DSP Local Table</STRONG></P></TD><TD width="151"><P><STRONG>Technical Name View</STRONG></P></TD><TD width="151"><P><STRONG>Business Name View</STRONG></P></TD><TD width="151"><P><STRONG>Usage</STRONG></P></TD></TR><TR><TD width="156"><P>I_GLACCOUNTHIERARCHYNODET</P></TD><TD width="151"><P>SAP_CC_FI_HRF_IL_I_GLACCOUNTHIERARCHYNODET</P></TD><TD width="151"><P>HRF: G/L Account Group (Hierarchy Node) - Text (IL)</P></TD><TD width="151"><P>GL Account Hierarchy Node Text</P></TD></TR></TBODY></TABLE><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Blogpost4.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106183i8C54AD523E8D90C5/image-size/large?v=v2&amp;px=999" role="button" title="Blogpost4.png" alt="Blogpost4.png" /></span></P><P> Create the view and adjust the following Colums:</P><UL><LI>ValidityEndDate with the expression TO_DATE(ValidityEndDate)</LI><LI>ValidityStartDate with the expression TO_DATE(ValidityStartDate)</LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_0-1714987642614.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106184iCB26D7606577C49F/image-size/large?v=v2&amp;px=999" role="button" title="JoelleS_0-1714987642614.png" alt="JoelleS_0-1714987642614.png" /></span></P><P>In the newly created View (Text) change the semantic type from “relational dataset” to “Text”.</P><P>Work on the Semantic Types for the Attributes accordingly.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_1-1714987704630.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106185i8A4FF9505E035128/image-size/large?v=v2&amp;px=999" role="button" title="JoelleS_1-1714987704630.png" alt="JoelleS_1-1714987704630.png" /></span></P><P>Set “Hierarchy Node” (HierarchyNode) representative key.</P><H3 id="toc-hId--55625695">3.4&nbsp;&nbsp; View (Relational Dataset) GL Account Hierarchy Node</H3><P>We will now create the related Dimension:</P><TABLE><TBODY><TR><TD width="156"><P><STRONG>Name CDS View and DSP Local Table</STRONG></P></TD><TD width="151"><P><STRONG>Technical Name View</STRONG></P></TD><TD width="151"><P><STRONG>Business Name View</STRONG></P></TD><TD width="151"><P><STRONG>Usage</STRONG></P></TD></TR><TR><TD width="156"><P>I_GLACCOUNTHIERARCHYNODE</P></TD><TD width="151"><P>SAP_CC_FI_HRF_IL_I_GLACCOUNTHIERARCHYNODE</P></TD><TD width="151"><P>HRF: G/L Account Hierarchy Node (IL)</P></TD><TD width="151"><P>GL Account Hierarchy Node</P></TD></TR></TBODY></TABLE><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Blogpost2.4.png" style="width: 999px;"><img 
src="https://community.sap.com/t5/image/serverpage/image-id/106188i18E4868FB406BC16/image-size/large?v=v2&amp;px=999" role="button" title="Blogpost2.4.png" alt="Blogpost2.4.png" /></span></P><P>Create the view and adjust the following Colums:</P><UL><LI>ValidityEndDate with the expression TO_DATE(ValidityEndDate)</LI><LI>ValidityStartDate with the expression TO_DATE(ValidityStartDate)</LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_0-1714988031991.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106192i1143700499BC74F5/image-size/large?v=v2&amp;px=999" role="button" title="JoelleS_0-1714988031991.png" alt="JoelleS_0-1714988031991.png" /></span></P><P>Create two more calculated Columns:</P><UL><LI>Node ID (NODEID) with the expression:</LI></UL><P>CASE NodeType</P><P>WHEN 'L' then</P><P>CONCAT(ChartOfAccounts,CONCAT('/',GLAccount))</P><P>else HierarchyNode</P><P>end</P><DIV class=""><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_1-1714988093251.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106195iCFFC7F1CCB6841F8/image-size/medium?v=v2&amp;px=400" role="button" title="JoelleS_1-1714988093251.png" alt="JoelleS_1-1714988093251.png" /></span></DIV><UL><LI>Hierarchy Name (HIENAME) with the expression</LI></UL><P>GLAccountHierarchy</P><DIV class=""><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_2-1714988139033.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106197i85EE9DB9C2646125/image-size/medium?v=v2&amp;px=400" role="button" title="JoelleS_2-1714988139033.png" alt="JoelleS_2-1714988139033.png" /></span><P>&nbsp;</P></DIV><P>In the newly created View (relational Dataset) leave everything else as created by default.</P><H3 id="toc-hId--252139200">3.5 View (Dimension) for GL Account Hierarchy Node</H3><P>Next up is the Hierarchy with Directory View:</P><TABLE><TBODY><TR><TD width="156"><P><STRONG>Relational Dataset</STRONG></P></TD><TD width="151"><P><STRONG>Technical Name View</STRONG></P></TD><TD width="151"><P><STRONG>Business Name View</STRONG></P></TD><TD width="151"><P><STRONG>Usage</STRONG></P></TD></TR><TR><TD width="156"><P>HRF: G/L Account Hierarchy Node (IL)</P><P>&nbsp;</P><P>(SAP_CC_FI_HRF_IL_I_GLACCOUNTHIERARCHYNODE)</P></TD><TD width="151"><P>SAP_CC_FI_HRF_GLACCOUNTHIERARCHYNODE</P></TD><TD width="151"><P>HRF: G/L Account Hierarchy Node (IL)</P></TD><TD width="151"><P>GL Account Hierarchy Node</P></TD></TR></TBODY></TABLE><H3 id="toc-hId--448652705">&nbsp;</H3><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Blogpost5.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106206i8F8016B866669044/image-size/large?v=v2&amp;px=999" role="button" title="Blogpost5.png" alt="Blogpost5.png" /></span></P><P>Create this view and project only the following four columns</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_0-1714993684849.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106239i287A9CE468CDC0D9/image-size/large?v=v2&amp;px=999" role="button" title="JoelleS_0-1714993684849.png" alt="JoelleS_0-1714993684849.png" /></span></P><P>Associate the Text View HRF: G/L Account Group (Hierarchy Node) -Text (IL) (SAP_CC_FI_HRF_IL_I_GLACCOUNTHIERARCHYNODET) with the newly 
created view.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_1-1714994074790.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106242i4652D00CA57C3928/image-size/large?v=v2&amp;px=999" role="button" title="JoelleS_1-1714994074790.png" alt="JoelleS_1-1714994074790.png" /></span></P><P>Adjust the semantics for the attributes</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_0-1714995308222.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106244i7E13603EAE2DF90B/image-size/large?v=v2&amp;px=999" role="button" title="JoelleS_0-1714995308222.png" alt="JoelleS_0-1714995308222.png" /></span></P><P>Choose "HierarchyNode" (HierarchyNode) as representative key.</P><P> </P><H3 id="toc-hId--645166210">3.6 View (Hierarchy with Directory) for GL Account Hierarchy View</H3><P>Next up is the Hierarchy with Directory View:</P><TABLE><TBODY><TR><TD width="156"><P><STRONG>Relational Dataset</STRONG></P></TD><TD width="151"><P><STRONG>Technical Name View</STRONG></P></TD><TD width="151"><P><STRONG>Business Name View</STRONG></P></TD><TD width="151"><P><STRONG>Usage</STRONG></P></TD></TR><TR><TD width="156"><P>HRF: G/L Account Hierarchy Node (IL)</P><P>&nbsp;</P><P>(SAP_CC_FI_HRF_IL_I_GLACCOUNTHIERARCHYNODE)</P></TD><TD width="151"><P>SAP_CC_FI_HRF_GLAccountHierarchyView</P></TD><TD width="151"><P>HRF: G/L Account Hierarchy (Hierarchy with Directory)</P></TD><TD width="151"><P>GL Account Hierarchy</P></TD></TR></TBODY></TABLE><P>&nbsp;</P><DIV class=""><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Blogpost6.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106245i1E8641F154859A93/image-size/large?v=v2&amp;px=999" role="button" title="Blogpost6.png" alt="Blogpost6.png" /></span><P>&nbsp;</P></DIV><P>Like before, create the view based on the relational dataset HRF: G/L Account Hierarchy Node (IL) (SAP_CC_FI_HRF_IL_I_GLACCOUNTHIERARCHYNODE).</P><P>In the newly created View (Hierarchy with Directory) change the semantic type from “relational dataset” to “Hierarchy with Directory”.</P><P>Create an Association between HRF: G/L Account Hierarchy Directory (DIM)(SAP_CC_FI_HRF_IL_I_GLACCOUNTHIERARCHY) and the newly created Hierarchy with Directory.</P><P>&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_4-1714995982111.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106250iAE1546A244AAD3C5/image-size/large?v=v2&amp;px=999" role="button" title="JoelleS_4-1714995982111.png" alt="JoelleS_4-1714995982111.png" /></span></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_5-1714996036000.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106251iBC70C902DBA8C46B/image-size/large?v=v2&amp;px=999" role="button" title="JoelleS_5-1714996036000.png" alt="JoelleS_5-1714996036000.png" /></span></P><P>Create a second association between HRF: G/L Account Hierarchy Node (IL) (SAP_CC_FI_HRF_IL_I_GLACCOUNTHIERARCHYNODE ) and the newly created Hierarchy with Directory.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_3-1714995923164.png" style="width: 999px;"><img 
src="https://community.sap.com/t5/image/serverpage/image-id/106248iFF73D29BA4778537/image-size/large?v=v2&amp;px=999" role="button" title="JoelleS_3-1714995923164.png" alt="JoelleS_3-1714995923164.png" /></span></P><P>Work on the Semantic Types for the Attributes accordingly.</P><DIV class=""><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_6-1714996110791.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106252i7CBA090DB7A368BB/image-size/large?v=v2&amp;px=999" role="button" title="JoelleS_6-1714996110791.png" alt="JoelleS_6-1714996110791.png" /></span><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_7-1714996166362.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106253i8C75941F84BD5927/image-size/large?v=v2&amp;px=999" role="button" title="JoelleS_7-1714996166362.png" alt="JoelleS_7-1714996166362.png" /></span><P>&nbsp;</P></DIV><P>Change the keys to ValidityEndDate (ValidityEndDate), Node ID (NODEID) and Hierarchy Name (HIENAME). There is no need to set a representative key in this step.</P><P>Open the Hierarchy with Directory Settings and fill in the following information.</P><DIV class=""><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_8-1714996221714.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106254i3090B476D9A4AE02/image-size/large?v=v2&amp;px=999" role="button" title="JoelleS_8-1714996221714.png" alt="JoelleS_8-1714996221714.png" /></span><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_9-1714996264739.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106255i504F910428633BC1/image-size/large?v=v2&amp;px=999" role="button" title="JoelleS_9-1714996264739.png" alt="JoelleS_9-1714996264739.png" /></span><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_10-1714996299349.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106256iE03FA8E6B36988B0/image-size/large?v=v2&amp;px=999" role="button" title="JoelleS_10-1714996299349.png" alt="JoelleS_10-1714996299349.png" /></span><P>&nbsp;</P></DIV><H2 id="toc-hId--623508077">4.&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; Apply the Hierarchy</H2><P>You are almost there <span class="lia-unicode-emoji" title=":slightly_smiling_face:">🙂</span>&nbsp;</P><H2 id="toc-hId--820021582">4.1 View (Text) for GL Account Master Data Text</H2><TABLE><TBODY><TR><TD width="156"><P><STRONG>Name CDS View and DSP Local Table</STRONG></P></TD><TD width="151"><P><STRONG>Technical Name View</STRONG></P></TD><TD width="151"><P><STRONG>Business Name View</STRONG></P></TD><TD width="151"><P><STRONG>Usage</STRONG></P></TD></TR><TR><TD width="156"><P>I_GLACCOUNTTEXTRAWDATA</P></TD><TD width="151"><P>SAP_CC_FI_HRF_IL_I_GLACCOUNTTEXTRAWDATA</P></TD><TD width="151"><P>HRF: G/L Account – Text (IL)</P></TD><TD width="151"><P>GL Account Master data Text</P></TD></TR></TBODY></TABLE><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Blogpost7.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106257iBC5D4784A6A59D79/image-size/large?v=v2&amp;px=999" role="button" title="Blogpost7.png" alt="Blogpost7.png" /></span></P><P> </P><P>In the newly created View (Text) change the semantic type from “relational dataset” 
to “Text”.</P><P>Work on the Semantic Types for the Attributes accordingly.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_11-1714996486203.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106258i0E735A997DFE7B6E/image-size/large?v=v2&amp;px=999" role="button" title="JoelleS_11-1714996486203.png" alt="JoelleS_11-1714996486203.png" /></span></P><P>Set GLAccount (GLAccount) as representative key.</P><P>&nbsp;</P><H2 id="toc-hId--1016535087">4.2 View (Dimension) for GL Account Master Data Attributes</H2><P>Next up is the related Dimension:</P><TABLE><TBODY><TR><TD width="156"><P><STRONG>Name CDS View and DSP Local Table</STRONG></P></TD><TD width="151"><P><STRONG>Technical Name View</STRONG></P></TD><TD width="151"><P><STRONG>Business Name View</STRONG></P></TD><TD width="151"><P><STRONG>Usage</STRONG></P></TD></TR><TR><TD width="156"><P>I_GLACCOUNTINCHARTOFACCOUNTS</P></TD><TD width="151"><P>SAP_CC_FI_HRF_IL_I_GLACCOUNTINCHARTOFACCOUNTS</P></TD><TD width="151"><P>HRF: G/L Account (DIM)</P></TD><TD width="151"><P>GL Account Master data Attributes</P></TD></TR></TBODY></TABLE><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Blogpost8.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106260i0E5C5CA757B7AAD6/image-size/large?v=v2&amp;px=999" role="button" title="Blogpost8.png" alt="Blogpost8.png" /></span></P><P>In the newly created View (Dimension) change the semantic type from “relational dataset” to “Dimension”.</P><P>Create an Association between HRF: G/L Account&nbsp; - Text (IL) (SAP_CC_FI_HRF_IL_I_GLACCOUNTTEXTRAWDATA) &nbsp;and the newly created Dimension.</P><DIV class=""><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_12-1714996698179.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106261i390391B34373A77B/image-size/large?v=v2&amp;px=999" role="button" title="JoelleS_12-1714996698179.png" alt="JoelleS_12-1714996698179.png" /></span><P>&nbsp;</P></DIV><P>Create a second association between HRF: G/L Account Hierarchy (Hierarchy with Directory) (SAP_CC_FI_HRF_GLAccountHierarchyView)</P><DIV class=""><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="JoelleS_13-1714996747405.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106262i6EDC42E5CF912E7E/image-size/large?v=v2&amp;px=999" role="button" title="JoelleS_13-1714996747405.png" alt="JoelleS_13-1714996747405.png" /></span><P>&nbsp;</P></DIV><P>There is no need to work on the semantic types of the attributes.</P><P>Set “GLAccount” (GLAccount) as representative key.</P><H2 id="toc-hId--1213048592">5.&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; Associate the Dimension</H2><P>In our case we can now associate the Dimension with our Journal Entries, that are deployed as Fact View. We define the measures and create an Analytic Model. In the Data Preview we choose GL Account and can now apply several Hierarchies.</P><P>&nbsp;</P><P>We hope that this guide helps. 
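</P><P>For reference, the expressions used throughout this guide can also be written as plain SQL. The snippet below is only a minimal sketch of the calculated columns from section 3.4, assuming the local table and column names listed in the table in section 2; the graphical view you build in SAP Datasphere generates the equivalent logic, so this is not something you need to run.</P><PRE>-- Sketch only: calculated columns of the GL Account Hierarchy Node view (section 3.4)
SELECT
  GLAccountHierarchy AS HIENAME,                                   -- Hierarchy Name
  CASE NodeType
    WHEN 'L' THEN CONCAT(ChartOfAccounts, CONCAT('/', GLAccount))  -- leaf nodes: chart of accounts + G/L account
    ELSE HierarchyNode                                             -- inner nodes: keep the hierarchy node
  END AS NODEID,                                                   -- Node ID
  TO_DATE(ValidityStartDate) AS ValidityStartDate,
  TO_DATE(ValidityEndDate)   AS ValidityEndDate,
  HierarchyNode
  -- further columns can be kept as delivered by the replication flow
FROM SAP_CC_FI_HRF_IL_I_GLACCOUNTHIERARCHYNODE;</PRE><P>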
Feel free to reach out to us <span class="lia-unicode-emoji" title=":slightly_smiling_face:">🙂</span></P><P>&nbsp;</P><P>&nbsp;</P> 2024-05-07T09:51:03.951000+02:00 https://community.sap.com/t5/technology-blogs-by-members/currency-translation-in-sap-datasphere/ba-p/13688008 Currency Translation in SAP Datasphere 2024-05-07T09:57:57.155000+02:00 suraj_yadav13 https://community.sap.com/t5/user/viewprofilepage/user-id/191142 <P>Currency translation refers to the process of converting financial data from one currency to another.</P><P>This is particularly important for multinational companies that operate in multiple countries and use different currencies for their financial transactions.</P><P>Currency translation in SAP typically involves the following steps:</P><P>1. <STRONG>Setting up currency exchange rates</STRONG>: Before performing any currency translation, exchange rates need to be maintained in SAP. These rates can be manually entered or automatically updated from external sources.</P><P>2. <STRONG>Defining translation ratios</STRONG>: Companies may have specific rules for currency translation, such as using average rates, closing rates, or historical rates. These rules need to be defined within SAP.</P><P>3. <STRONG>Executing currency translation</STRONG>: Once the exchange rates and translation rules are set up, currency translation can be executed. SAP usually provides standard functionality and reports to perform this task.</P><P>4. <STRONG>Reviewing translated financial data</STRONG>: After translation, it's essential to review the translated financial data to ensure accuracy and compliance with accounting standards.</P><P>Currency translation in SAP is crucial for financial reporting, consolidation, and analysis purposes, allowing companies to view their financial data in a common currency, facilitating comparisons and decision-making across different regions.</P><P>To perform <STRONG>currency translation</STRONG> in <STRONG>SAP Datasphere</STRONG>, the following tables must be available in your space:</P><UL><LI><STRONG>TCURV</STRONG>&nbsp;- Exchange rate types</LI><LI><STRONG>TCURW</STRONG>&nbsp;- Exchange rate type text</LI><LI><STRONG>TCURX</STRONG>&nbsp;- Decimal places in currencies</LI><LI><STRONG>TCURN</STRONG>&nbsp;- Quotations</LI><LI><STRONG>TCURR</STRONG>&nbsp;- Exchange rates</LI><LI><STRONG>TCURF</STRONG>&nbsp;- Conversion factors</LI><LI><STRONG>TCURC</STRONG>&nbsp;- Currency codes</LI><LI><STRONG>TCURT</STRONG>&nbsp;- Currency text</LI></UL><P><STRONG>Creating currency translation related tables in SAP Datasphere:</STRONG></P><OL><LI>Log in to SAP Datasphere, go to the Data Builder menu, and click on the create icon.&nbsp;<span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="1.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/104267iCAB9FE16CBF01E0C/image-size/large?v=v2&amp;px=999" role="button" title="1.png" alt="1.png" /></span><P>&nbsp;</P></LI><LI>Select 'Currency Conversion View' from the drop-down.&nbsp;<P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="2.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/104268i3D97DF30AE28697C/image-size/large?v=v2&amp;px=999" role="button" title="2.png" alt="2.png" /></span></P></LI><LI>Select the source, which can be manual or any SAP connection in your SAP Datasphere space that is compatible with replicating the currency translation tables.&nbsp;<P><span class="lia-inline-image-display-wrapper 
lia-image-align-inline" image-alt="3.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/104274iC0DB3C51F2D9BAC0/image-size/large?v=v2&amp;px=999" role="button" title="3.png" alt="3.png" /></span></P>&nbsp;</LI><LI>If source is manual, upload the data to all the required tables either manually or via data flows.&nbsp;</LI><LI>If SAP source is selected, respective remote tables and data flows will also get created.&nbsp;<P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="3.JPG" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/104266i2E9D9786853C1BAD/image-size/large?v=v2&amp;px=999" role="button" title="3.JPG" alt="3.JPG" /></span></P></LI><LI>After clinking on create option, 32 objects will get deployed including below 8 data flows&nbsp; &nbsp; &nbsp; &nbsp; &nbsp;<span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="5.JPG" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/104279i65069AF7912EF989/image-size/large?v=v2&amp;px=999" role="button" title="5.JPG" alt="5.JPG" /></span></LI><LI>Create a task chain and add all these data flows for uploading the data to required TCUR* tables. You can also schedule the task chain to keep your data updated.&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;<span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="6.JPG" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/104282iCB8EBA695B8F5756/image-size/large?v=v2&amp;px=999" role="button" title="6.JPG" alt="6.JPG" /></span></LI><LI>Execute the task chain to update the data.&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;&nbsp;<span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="10.JPG" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/105836i448E2B3CC553F647/image-size/large?v=v2&amp;px=999" role="button" title="10.JPG" alt="10.JPG" /></span></LI><LI>Create '<STRONG>currency conversion column</STRONG>' in required object (for instance in view)&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;<span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="11.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/105834iAE45EBD43031C50B/image-size/large?v=v2&amp;px=999" role="button" title="11.png" alt="11.png" /></span></LI><LI>Enter business name, technical name, data type, precision and scale.&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;<span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="8.png" style="width: 615px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/104305i96079EFD07B7B39D/image-size/large?v=v2&amp;px=999" role="button" title="8.png" alt="8.png" /></span></LI><LI>In currency properties, select source amount column, source currency (fixed value or columns), target currency (fixed value, columns, or input parameter) and reference date (input parameters, 
columns, expressions, fixed date) for conversion.&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;<span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="12.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/105833i19845A3DEECC6FE9/image-size/large?v=v2&amp;px=999" role="button" title="12.png" alt="12.png" /></span></LI><LI>Enter client, conversion type and error handling&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;&nbsp;<span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="9.png" style="width: 592px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/104304i08D09614B4D5258F/image-size/large?v=v2&amp;px=999" role="button" title="9.png" alt="9.png" /></span></LI><LI>Preview the data and verify the rate maintained:&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;<span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="13.JPG" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/105835i1D125F7C79F21E76/image-size/large?v=v2&amp;px=999" role="button" title="13.JPG" alt="13.JPG" /></span>&nbsp;<span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="15.JPG" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/105837i98DF6D918A8ECD69/image-size/large?v=v2&amp;px=999" role="button" title="15.JPG" alt="15.JPG" /></span></LI></OL><P>Reference:&nbsp;<A href="https://youtu.be/m4m1TNJ5Ekw" target="_blank" rel="nofollow noopener noreferrer">https://youtu.be/m4m1TNJ5Ekw</A></P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P> 2024-05-07T09:57:57.155000+02:00 https://community.sap.com/t5/technology-blogs-by-sap/replication-flow-blog-part-6-confluent-as-replication-target/ba-p/13693888 Replication Flow Blog Part 6 – Confluent as Replication Target 2024-05-07T15:03:22.063000+02:00 DanielKraemer https://community.sap.com/t5/user/viewprofilepage/user-id/40769 <P><SPAN>This blog is part of a blog series from SAP Datasphere product management with the focus on the Replication Flow capabilities in SAP Datasphere: </SPAN><SPAN>&nbsp;</SPAN></P><P><SPAN><A href="https://blogs.sap.com/2023/11/16/replication-flow-blog-series-part-1-overview/" target="_blank" rel="noopener noreferrer">Replication Flow Blog Series Part 1 – Overview | SAP Blogs</A></SPAN><SPAN> </SPAN><SPAN>&nbsp;</SPAN></P><P><SPAN><A href="https://blogs.sap.com/2023/11/16/replication-flow-blog-series-part-2-premium-outbound-integration/" target="_blank" rel="noopener noreferrer">Replication Flow Blog Series Part 2 – Premium Outbound Integration | SAP Blogs</A></SPAN><SPAN> </SPAN><SPAN>&nbsp;</SPAN></P><P><SPAN><A href="https://blogs.sap.com/2023/12/04/sap-datasphere-replication-flows-blog-series-part-3-integration-with-kafka/" target="_blank" rel="noopener 
noreferrer">Replication Flows Blog Series Part 3 – Integration with Kafka</A></SPAN><SPAN> </SPAN><SPAN>&nbsp;</SPAN></P><P><SPAN><A href="https://blogs.sap.com/2023/12/15/replication-flow-blog-series-part-4-sizing/" target="_blank" rel="noopener noreferrer">Replication Flows Blog Series Part 4 – Sizing</A></SPAN><SPAN>   </SPAN></P><P><SPAN><A href="https://community.sap.com/t5/technology-blogs-by-sap/replication-flow-blog-series-part-5-integration-of-sap-datasphere-and/ba-p/13604976" target="_blank">Replication&nbsp;Flows Blog Series Part 5 – Integration between SAP Datasphere and Databricks</A> </SPAN><SPAN>&nbsp;</SPAN></P><P><U>Replication Flows Blog Series Part 6 – Confluent as a Replication Target</U></P><P><SPAN>Data Integration is an essential topic in a Business Data Fabric like SAP Datasphere. Replication Flow is the cornerstone to fuel SAP Datasphere with data,</SPAN><SPAN>&nbsp;especially from SAP ABAP sources. There is also a big need to move enriched data from SAP Datasphere into external environments to succeed certain use cases.</SPAN><SPAN>&nbsp;</SPAN></P><P>In this part of the Replication Flow Blog series we focus on the usage of Confluent as a target in a Replication Flow. We will explain in detail the new capabilities that have been introduced with SAP Datasphere release 2024.08. The content of this blog is structured as follows.</P><OL><LI>Introduction</LI><LI>Confluent as a new Connection Type in SAP Datasphere</LI><LI>Configuration options for Replication Flows</LI><LI>Details on Kafka Message and Schema creation<OL><LI>Schema &amp; Message creation</LI><LI>Data Type mappings</LI></OL></LI><LI>Scenarios or what happens if...</LI></OL><P>&nbsp;</P><H1 id="toc-hId-865323659">1. Introduction</H1><P>The purpose of the additional features that are described in this Blog, next to the generic Kafka integration that has been available since end of last year, is to provide tailor-made integration with the managed Kafka offerings Confluent Cloud and Confluent Platform of our dedicated partner Confluent.</P><P>The Confluent integration described in this blog is only usable in SAP Datasphere Replication Flows.</P><P>For the examples and step-by-step instructions in the blog, we assume that a properly configured SAP Datasphere tenant and a Confluent Cloud cluster are available. In addition, we assume that the reader is familiar with the basic concepts around Replication Flows and Connection Management in SAP Datasphere as well as with the Kafka capabilities of Confluent Cloud.</P><P>&nbsp;</P><H1 id="toc-hId-668810154">2. Confluent as a new Connectio<SPAN>n Type in SAP Datasphere</SPAN></H1><P>Beginning of this year a new connection type was introduced for Apache Kafka to offer basic integration with Apache Kafka based sinks. 
With the 2024.08 release of SAP Datasphere there is a new dedicated connection type introduced that is tailored to Confluent in order to support Confluent specific capabilities like Confluents built-in schema-registry.</P><H1 id="toc-hId-472296649"><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure 1 - New connection type for Confluent" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106289i2C16E5431C0BDF37/image-size/large?v=v2&amp;px=999" role="button" title="Figure 1.jpg" alt="Figure 1 - New connection type for Confluent" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure 1 - New connection type for Confluent</span></span></H1><P>Here are the different configuration options for Confluent.</P><P><STRONG>Connection Details</STRONG></P><UL><LI><STRONG>System Type </STRONG><BR />Possibility to choose between <EM>Confluent Cloud</EM> and <EM>Confluent Platform</EM></LI><LI><STRONG>Kafka Brokers<BR /></STRONG>A comma-separated list of Kafka brokers in the format &lt;host&gt;:&lt;port&gt;.</LI></UL><P><STRONG>Cloud Connector (only available if System Type Confluent Platform is chosen)</STRONG></P><UL><LI><STRONG>Use Cloud Connector<BR /></STRONG>This setting configures whether a&nbsp;<SPAN><A href="https://help.sap.com/docs/connectivity/sap-btp-connectivity-cf/cloud-connector" target="_blank" rel="noopener noreferrer">Cloud Connector</A></SPAN>&nbsp;is used</LI></UL><P><STRONG>Authentication</STRONG></P><P>The following SASL based authentication methods are supported.</P><P>For <EM>Confluent Cloud:</EM></P><TABLE width="100%"><TBODY><TR><TD width="33%" height="50px"><P><STRONG>Authentication Type</STRONG></P></TD><TD width="33%" height="50px"><P><STRONG>SASL Authentication Type</STRONG></P></TD><TD width="34%" height="50px"><P><STRONG>Properties</STRONG></P></TD></TR><TR><TD width="33%" height="50px"><P>No Authentication</P></TD><TD width="33%" height="50px"><P>n/a</P></TD><TD width="34%" height="50px">&nbsp;</TD></TR><TR><TD width="33%" height="77px"><P>API Key and Secret</P></TD><TD width="33%" height="77px"><P>PLAIN</P></TD><TD width="34%" height="77px"><P>API Key*<BR />API Secret*</P></TD></TR></TBODY></TABLE><P>*mandatory</P><P>For <EM>Confluent Platform</EM>:</P><TABLE width="100%"><TBODY><TR><TD width="33%"><P><STRONG>Authentication Type</STRONG></P></TD><TD width="33%"><P><STRONG>SASL Authentication Type</STRONG></P></TD><TD width="47%"><P><STRONG>Properties</STRONG></P></TD></TR><TR><TD width="33%"><P>No Authentication</P></TD><TD width="33%"><P>n/a</P></TD><TD width="47%">&nbsp;</TD></TR><TR><TD width="33%"><P>User Name and Password</P></TD><TD width="33%"><P>PLAIN</P></TD><TD width="47%"><P>Kafka SASL User Name*<BR />Kafka SASL Password*</P></TD></TR><TR><TD width="33%"><P>Salted Challenge Response Authentication Mechanism (256)</P></TD><TD width="33%"><P>SCRAM256</P></TD><TD width="47%"><P>Kafka SASL User Name*<BR />Kafka SASL Password*</P></TD></TR><TR><TD width="33%"><P>Salted Challenge Response Authentication Mechanism (512)</P></TD><TD width="33%"><P>SCRAM 512</P></TD><TD width="47%"><P>Kafka SASL User Name*<BR />Kafka SASL Password*</P></TD></TR><TR><TD width="33%"><P>Kerberos with User Name and Password</P></TD><TD width="33%"><P>GSSAPI</P></TD><TD width="47%"><P>Kafka Kerberos Service Name*<BR />Kafka Kerberos Realm*<BR />Kafka Kerberos Config*<BR />User Name*<BR />Password*</P></TD></TR><TR><TD width="33%"><P>Kerberos with Keytab File</P></TD><TD 
width="33%"><P>GSSAPI</P></TD><TD width="47%"><P>Kafka Kerberos Service Name*<BR />Kafka Kerberos Realm*<BR />Kafka Kerberos Config*<BR />User Name*<BR />Keytab File*</P></TD></TR></TBODY></TABLE><P>*mandatory<BR /><BR /></P><P><STRONG>Security</STRONG></P><P>Transport Layer Security (TLS) settings for encryption as well as server certification validation are supported for system type Confluent Cloud. In addition, Confluent Platform offers client certificate validation via mTLS.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure 2 - TLS configuration for Confluent Cloud and Confluent Platform" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106308i3CD06B5836F85BF7/image-size/large?v=v2&amp;px=999" role="button" title="Figure 2.jpg" alt="Figure 2 - TLS configuration for Confluent Cloud and Confluent Platform" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure 2 - TLS configuration for Confluent Cloud and Confluent Platform</span></span></P><P><STRONG>Schema Registry</STRONG></P><P>Additional configuration options to specify schema registry endpoint and credentials.</P><UL><LI><STRONG>URL<BR /></STRONG>Endpoint of the schema registry in the format &lt;host&gt;:&lt;port&gt;</LI><LI><STRONG>Authentication Type<BR /></STRONG>Authentication mechanism that is supposed to be used for schema registry. <EM>User Name and Password</EM> and <EM>No Authentication</EM> are supported.</LI><LI><STRONG>User Name </STRONG>+ <STRONG>Password<BR /></STRONG>User Name and Password in case the corresponding authentication type is chosen.</LI></UL><P>Remark: A schema registry configuration is mandatory when creating connections of type <EM>Confluent</EM>.</P><P>Connections of connection type Confluent can currently only be used as targets in Replication Flows.<BR /><BR /></P><P><STRONG>Example: Creating a connection to Confluent Cloud in SAP Datasphere</STRONG></P><P>We are now showcasing how to create a connection to Confluent Cloud including schema registry configuration. The case of Confluent Platform is similar assuming that a Cloud Connector has been configured properly.</P><P>Use the Bootstrap Server URL(s) of your Confluent cluster as an entry in the <EM>Kafka Brokers </EM>property<EM>.</EM></P><P><EM><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure 3 - Specify Kafka Brokers in the SAP Datasphere connection creation wizard." style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106326i1A98CA7BAB61160D/image-size/large?v=v2&amp;px=999" role="button" title="Figure 4.jpg" alt="Figure 3 - Specify Kafka Brokers in the SAP Datasphere connection creation wizard." 
/><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure 3 - Specify Kafka Brokers in the SAP Datasphere connection creation wizard.</span></span></EM></P><P>Choose <EM>User Name And Password </EM>as authentication type and use an API key and secret pair as <EM>Kafka SASL User Name</EM> and <EM>Kafka SASL Password.</EM></P><P><EM><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure 4 - API Key and Secret as authentication credentials" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106332iE7C81E5247E5643D/image-size/large?v=v2&amp;px=999" role="button" title="Figure 5.jpg" alt="Figure 4 - API Key and Secret as authentication credentials" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure 4 - API Key and Secret as authentication credentials</span></span></EM></P><P>Remark: It is assumed that the owner of the API key has sufficient rights to create/delete and access topics.</P><P>Use the Stream Governance API Endpoint as the Schema Registry URL.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure 5 - Specify schema registry URL in the SAP Datasphere connection creation wizard." style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106336iCAEDEB29B59BB412/image-size/large?v=v2&amp;px=999" role="button" title="Figure 6.png" alt="Figure 5 - Specify schema registry URL in the SAP Datasphere connection creation wizard." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure 5 - Specify schema registry URL in the SAP Datasphere connection creation wizard.</span></span></P><P>Choose <EM>User Name And Password </EM>as the authentication type for the Schema Registry and use an API key and secret pair as <EM>User Name</EM> and <EM>Password</EM>, respectively.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure 6 - Specify API Key Secret pair for Schema registry access." style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106340iB85BA44D6DAF5F1F/image-size/large?v=v2&amp;px=999" role="button" title="Figure 7.png" alt="Figure 6 - Specify API Key Secret pair for Schema registry access." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure 6 - Specify API Key Secret pair for Schema registry access.</span></span></P><P>Remark: It is assumed that the owner of the API credentials has sufficient rights to create and update schemas.</P><P>&nbsp;</P><H1 id="toc-hId-275783144">3. Configuration Options for Replication Flows</H1><P>For the remainder of this blog post, we will describe the capabilities of the Confluent integration with Replication Flows alongside the following source data set that is assumed to be stored in a local table (Business Name: <EM>Demo Table</EM>) in SAP Datasphere.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure 7 - Example source dataset Demo Table" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106341iCFE53AD48412B0CE/image-size/large?v=v2&amp;px=999" role="button" title="Figure 8.png" alt="Figure 7 - Example source dataset Demo Table" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure 7 - Example source dataset Demo Table</span></span></P><P>Let’s assume we want to replicate this table into a Confluent Cloud instance. 
We create a corresponding Replication Flow design time artifact, select SAP Datasphere and the local table <EM>Demo table</EM>, respectively, as source and chose the connection CONFLUENT_DEMO (see section 1) as the sink for the Replication Flow.</P><P>The following screenshots highlights the configuration options for the Confluent sink.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure 8 - Replication Flow configuration options for Confluent" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106342i82BBB847FF30DF1E/image-size/large?v=v2&amp;px=999" role="button" title="Figure 9.png" alt="Figure 8 - Replication Flow configuration options for Confluent" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure 8 - Replication Flow configuration options for Confluent</span></span></P><P>Compared to the usage of the Apache Kafka connection type there are the following additional configuration options <EM>Use Schema Registry, Record Name, Subject Name Strategy, Compatibility Type </EM>and<EM> Clamp Decimal Floating Point Data Types. </EM>They are described in the following table together with the other properties.</P><TABLE><TBODY><TR><TD width="200px" height="50px"><P><STRONG>Setting</STRONG></P></TD><TD width="149px" height="50px"><P><STRONG>Value</STRONG></P></TD><TD width="251px" height="50px"><P><STRONG>Explanation</STRONG></P></TD></TR><TR><TD width="200px" height="159px"><P>Replication Thread Limit</P></TD><TD width="149px" height="159px"><P>Number</P></TD><TD width="251px" height="159px"><P>The number of parallel replication threads that can be executed during the replication process.&nbsp;<STRONG>Only available in Global configuration.</STRONG></P></TD></TR><TR><TD width="200px" height="187px"><P>Number of Partitions</P></TD><TD width="149px" height="187px"><P>Number</P></TD><TD width="251px" height="187px"><P>The number of Kafka Partitions for the target Kafka topic.<BR /><STRONG>Only used for new topics that don’t yet exist in the Kafka Cluster, otherwise the setting is ignored.</STRONG></P></TD></TR><TR><TD width="200px" height="187px"><P>Replication Factor</P></TD><TD width="149px" height="187px"><P>Number</P></TD><TD width="251px" height="187px"><P>The Kafka replication factor for the Kafka topic.<BR /><STRONG>Only used for new topics that don’t yet exist in the Kafka Cluster, otherwise the setting is ignored.</STRONG></P></TD></TR><TR><TD width="200px" height="77px"><P>Message Encoder</P></TD><TD width="149px" height="77px"><P>AVRO or JSON</P></TD><TD width="251px" height="77px"><P>The message format for the Kafka topic.</P></TD></TR><TR><TD width="200px" height="159px"><P>Message Compression</P></TD><TD width="149px" height="159px"><P>No Compression<BR />Gzip<BR />Snappy<BR />LZ4<BR />Zstandard</P></TD><TD width="251px" height="159px"><P>The compression method for the Kafka messages that are sent to a Kafka topic.</P></TD></TR><TR><TD width="200px" height="262px"><P>Use Schema Registry</P></TD><TD width="149px" height="262px"><P>True or False</P></TD><TD width="251px" height="262px"><P><BR />True: Schema Registry is used</P><P>False: Schema Registry is not used<BR /><BR /><STRONG>Schema registry is mandatory in case Message Encoder AVRO is chosen.</STRONG></P></TD></TR><TR><TD width="200px" height="207px"><P>Topic Name</P></TD><TD width="149px" height="207px"><P>string</P></TD><TD width="251px" height="207px"><P>The name of the Kafka topic to be used as the target.<BR />The 
topic name is always based on the target object name.</P><P><STRONG>It can only be changed by renaming the target object.</STRONG></P></TD></TR><TR><TD width="200px" height="159px"><P>Record Name</P></TD><TD width="149px" height="159px"><P>string</P></TD><TD width="251px" height="159px"><P>The record name that is used for the schema registry entry when the subject name strategy is applied. It is also referenced in the schema definition itself.</P></TD></TR><TR><TD width="200px" height="132px"><P>Subject Name Strategy</P></TD><TD width="149px" height="132px"><P>Topic<BR />Record<BR />Topic-Record</P></TD><TD width="251px" height="132px"><P>Choose the subject name strategy for the schema.<STRONG><BR />Only available if <EM>Use Schema Registry</EM> is used/true.</STRONG></P></TD></TR><TR><TD width="200px" height="269px"><P>Compatibility Type</P></TD><TD width="149px" height="269px"><P>Default<BR />Backward<BR />Backward Transitive<BR />Forward<BR />Forward Transitive<BR />Full<BR />Full Transitive<BR />None</P></TD><TD width="251px" height="269px"><P>Choose the compatibility type for the schema registry subject.<STRONG><BR />Only available if <EM>Use Schema Registry</EM> is used/true.</STRONG></P></TD></TR><TR><TD width="200px" height="159px"><P>Clamp Decimal Floating Point Data Types</P></TD><TD width="149px" height="159px"><P>True</P></TD><TD width="251px" height="159px"><P><STRONG>Immutable setting (always True)<BR /></STRONG>Decimal values that do not fit into the target type (see section 4.2) are automatically clamped</P></TD></TR><TR><TD width="200px" height="242px"><P>Overwrite Target Settings at Object Level</P></TD><TD width="149px" height="242px"><P>True or False</P></TD><TD width="251px" height="242px"><P><BR /><BR />True: The global configuration overwrites the configurations made at task level.<BR /><BR /><STRONG>Only available in global configuration</STRONG></P></TD></TR></TBODY></TABLE><P>The settings on Replication Task level take precedence over the settings on Replication Flow level unless the <EM>Overwrite Target Settings at Object Level </EM>checkbox is checked in the settings on Replication Flow level.</P><P>After the Replication Flow has been deployed and run, the Kafka topic and a corresponding schema registry entry are created (see section 4 for details on schema and message creation).</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure 9 - Example of Target Kafka Topic" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106347i8124BEAECC015954/image-size/large?v=v2&amp;px=999" role="button" title="Figure 10.png" alt="Figure 9 - Example of Target Kafka Topic" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure 9 - Example of Target Kafka Topic</span></span></P><P><STRONG>Remark:</STRONG><BR />As was the case with the generic Kafka integration, a <EM>Truncate</EM> flag is also supported for Confluent. If it is set and the target Kafka topic for the replication already exists in the Confluent cluster, the topic is deleted and recreated. 
This also deletes all messages that are assigned to the topic.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure 10 - Truncate flag configuration setting in Replication Flows" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106350iAA362951B8E1E236/image-size/large?v=v2&amp;px=999" role="button" title="Figure 11.png" alt="Figure 10 - Truncate flag configuration setting in Replication Flows" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure 10 - Truncate flag configuration setting in Replication Flows</span></span></P><P>In the following section we will have a closer look on schema and message creation.</P><P>&nbsp;</P><H1 id="toc-hId-79269639">4. Details on Kafka Message and Schema creation</H1><P>Before describing the details regarding schema and message creation, we start with a listing of the producer configurations that are used in SAP Datasphere Replication Flows. The parameter values are fixed and cannot be changed in SAP Datasphere.</P><TABLE width="527px"><TBODY><TR><TD width="269.99px"><P><STRONG>Kafka Producer Configuration Parameter</STRONG></P></TD><TD width="129.719px"><P><STRONG>Value used by Replication Flows</STRONG></P></TD><TD width="126.625px"><P><STRONG>Remark</STRONG></P></TD></TR><TR><TD width="269.99px"><P><SPAN><A href="https://docs.confluent.io/platform/current/installation/configuration/producer-configs.html#max-request-size" target="_blank" rel="noopener nofollow noreferrer">max.request.size</A></SPAN></P></TD><TD width="129.719px"><P>1048576 (1MB)</P></TD><TD width="126.625px"><P>Confluent Default</P></TD></TR><TR><TD width="269.99px"><P><SPAN><A href="https://docs.confluent.io/platform/current/installation/configuration/producer-configs.html#request-timeout-ms" target="_blank" rel="noopener nofollow noreferrer">request.timeout.ms</A></SPAN></P></TD><TD width="129.719px"><P>30 seconds</P></TD><TD width="126.625px"><P>Confluent Default</P></TD></TR><TR><TD width="269.99px"><P><SPAN><A href="https://docs.confluent.io/platform/current/installation/configuration/producer-configs.html#max-in-flight-requests-per-connection" target="_blank" rel="noopener nofollow noreferrer">max.in.flight.requests.per.connection</A></SPAN></P></TD><TD width="129.719px"><P>5</P></TD><TD width="126.625px"><P>Confluent Default</P></TD></TR><TR><TD width="269.99px"><P><SPAN><A href="https://docs.confluent.io/platform/current/installation/configuration/producer-configs.html#retry-backoff-ms" target="_blank" rel="noopener nofollow noreferrer">retry.backoff.ms</A></SPAN></P></TD><TD width="129.719px"><P>100</P></TD><TD width="126.625px"><P>Confluent Default</P></TD></TR><TR><TD width="269.99px"><P><SPAN><A href="https://docs.confluent.io/platform/current/installation/configuration/producer-configs.html#acks" target="_blank" rel="noopener nofollow noreferrer">acks</A></SPAN></P></TD><TD width="129.719px"><P>all (-1)</P></TD><TD width="126.625px"><P>Confluent Default</P></TD></TR><TR><TD width="269.99px"><P><SPAN><A href="https://docs.confluent.io/platform/current/installation/configuration/producer-configs.html#retries" target="_blank" rel="noopener nofollow noreferrer">retries</A></SPAN></P></TD><TD width="129.719px"><P>3</P></TD><TD width="126.625px"><P>&nbsp;</P></TD></TR></TBODY></TABLE><H2 id="toc-hId-11838853">&nbsp;</H2><H2 id="toc-hId--184674652">4.1 Message &amp; Schema Creation</H2><P>The following section is only applicable in case the <EM>Use Schema registry</EM> toggle 
is activated (true) in the Replication Flow/Task properties panel. In this case, schemas are written to the schema registry as described in the following paragraphs.</P><P>No schema definition for the Kafka message key (e.g. […]-key) is created in Confluent. Instead, the message key is a string consisting of the primary key values of the source dataset separated by underscores.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure 11 - Kafka message key generation" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106360iA349DC432291181D/image-size/large?v=v2&amp;px=999" role="button" title="Figure 12.png" alt="Figure 11 - Kafka message key generation" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure 11 - Kafka message key generation</span></span></P><P>If, in our example above, the two columns <EM>ID</EM> and <EM>First_Name</EM> together formed the primary key, the generated message keys would look as follows.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure 12 - Second example for Kafka message key generation" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106361iA797626F7103523D/image-size/large?v=v2&amp;px=999" role="button" title="Figure 13.png" alt="Figure 12 - Second example for Kafka message key generation" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure 12 - Second example for Kafka message key generation</span></span></P><P>The schema definition for the Kafka message body is constructed based on the structure/schema of the source dataset, the configuration settings of the executed Replication Flow and the change data capture mechanisms of SAP Datasphere. The following screenshot shows how the different settings are fed into the Kafka schema.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure 13 - Schema creation for the Kafka message body" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106376iB66FCD382949FE6E/image-size/large?v=v2&amp;px=999" role="button" title="Figure 14.png" alt="Figure 13 - Schema creation for the Kafka message body" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure 13 - Schema creation for the Kafka message body</span></span></P><P>The corresponding <EM>Confluent schema</EM> and <EM>subject</EM> names, the <EM>serialization format</EM> and the <EM>compatibility mode</EM> are derived from the configuration values of the Replication Flow and Replication Task definition, respectively.</P><P>The schema definition for the message body itself is derived from the source dataset definition and the metadata for the delta information of the Replication Flow. In our example, the schema of the <EM>Demo Table</EM> is translated into an AVRO schema that contains the four fields <EM>ID</EM>, <EM>First_Name</EM>, <EM>Second_Name</EM> and <EM>Age</EM> as well as the three delta columns that are always added and which contain the Replication Flow specific delta information. The AVRO type of the message body is always <EM>record</EM>.</P><P>If the <EM>serialization format</EM> is set to <EM>JSON</EM>, a corresponding JSON schema that follows the <A href="https://json-schema.org/" target="_blank" rel="noopener nofollow noreferrer">JSON Schema standard</A> is registered in the Confluent schema registry. 
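</P><P>As a side note, once the Replication Flow has produced messages into the target topic, they can be read with any standard Kafka client. The following is purely an illustrative sketch (not part of the original configuration) using the confluent-kafka Python package and assuming the JSON message encoder; the broker address, credentials, consumer group and topic name are placeholders.</P><pre class="lia-code-sample language-python"><code># Illustrative sketch: consume the replicated messages from the target topic.
# All connection values below are placeholders, not values from this post.
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "pkc-xxxxx.eu-central-1.aws.confluent.cloud:9092",  # placeholder bootstrap server
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "YOUR_API_KEY",      # placeholder
    "sasl.password": "YOUR_API_SECRET",   # placeholder
    "group.id": "demo-table-reader",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["Demo_Table"])  # topic name is derived from the target object name

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        # Key: underscore-separated primary key values; value: the JSON-encoded row.
        # If the messages carry the schema-registry wire-format prefix, use the
        # confluent_kafka JSONDeserializer instead of plain json.loads.
        print(msg.key().decode("utf-8"), json.loads(msg.value()))
finally:
    consumer.close()</code></pre>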
<P>Coming back to the schema itself: if the JSON serialization format had been used in our example above, the following schema definition would have been registered in the Confluent schema registry:</P><P>&nbsp;</P><pre class="lia-code-sample language-json"><code>{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "properties": {
    "Age": {
      "anyOf": [
        { "maximum": 9223372036854776000, "minimum": -9223372036854776000, "type": "integer" },
        { "type": "null" }
      ]
    },
    "First_Name": {
      "anyOf": [
        { "type": "string" },
        { "type": "null" }
      ]
    },
    "ID": {
      "anyOf": [
        { "maximum": 9223372036854776000, "minimum": -9223372036854776000, "type": "integer" },
        { "type": "null" }
      ]
    },
    "Second_Name": {
      "anyOf": [
        { "type": "string" },
        { "type": "null" }
      ]
    },
    "__operation_type": {
      "anyOf": [
        { "type": "string" },
        { "type": "null" }
      ]
    },
    "__sequence_number": {
      "anyOf": [
        { "maximum": 18446744073709552000, "minimum": 0, "type": "integer" },
        { "type": "null" }
      ]
    },
    "__timestamp": {
      "anyOf": [
        { "format": "date-time", "type": "string" },
        { "type": "null" }
      ]
    }
  },
  "title": "Demo_Table",
  "type": "object"
}</code></pre><P>&nbsp;</P><P>The topic itself is created and configured based on the Replication Flow/Task settings <EM>Number of Partitions, Replication Factor </EM>and <EM>Topic Name. </EM>All other topic-specific configuration parameters are inherited from the Confluent Cluster settings.</P><P>For initial or delta loads during a Replication Flow run, exactly one Kafka message is added to the target Kafka topic for each row in the source data set.</P><P>In the case of our example (initial load of an SAP Datasphere table), the final Kafka messages look as follows (an overview of the Kafka topic is shown in Figure 9 above).</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure 14 - Example Kafka messages for Demo Table replication" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106377iF4E478D74250AAA3/image-size/large?v=v2&amp;px=999" role="button" title="Figure 15.png" alt="Figure 14 - Example Kafka messages for Demo Table replication" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure 14 - Example Kafka messages for Demo Table replication</span></span></P><P>The next section contains an overview of type mappings between selected source systems and JSON/AVRO schema types.</P><P>&nbsp;</P><H2 id="toc-hId--381188157">4.2 Data Type Mappings</H2><P>The following two tables contain the data type mappings for the scenarios where SAP HANA/SAP Datasphere or ABAP artifacts are chosen as a source.</P><P><STRONG>Scenario: SAP HANA/SAP Datasphere to Confluent</STRONG></P><TABLE width="628px"><TBODY><TR><TD width="130.073px"><P><STRONG>SAP HANA Type</STRONG></P></TD><TD width="241.417px"><P><STRONG>JSON Type</STRONG></P></TD><TD width="255.844px"><P><STRONG>AVRO Type</STRONG></P></TD></TR><TR><TD width="130.073px"><P><SPAN>TINYINT</SPAN></P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.844px"><P>int</P></TD></TR><TR><TD width="130.073px"><P><SPAN>SMALLINT</SPAN></P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.844px"><P>int</P></TD></TR><TR><TD width="130.073px"><P><SPAN>INTEGER</SPAN></P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.844px"><P>int</P></TD></TR><TR><TD width="130.073px"><P><SPAN>BIGINT</SPAN></P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.844px"><P>long</P></TD></TR><TR><TD width="130.073px"><P><SPAN>REAL</SPAN></P></TD><TD width="241.417px"><P>number</P></TD><TD 
width="255.844px"><P>float</P></TD></TR><TR><TD width="130.073px"><P><SPAN>DOUBLE</SPAN></P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.844px"><P>double</P></TD></TR><TR><TD width="130.073px"><P><SPAN>SMALLDECIMAL</SPAN></P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.844px"><P>type:"bytes",logical type:"decimal",scale:6,precision:28</P></TD></TR><TR><TD width="130.073px"><P><SPAN>DECIMAL</SPAN></P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.844px"><P>type:"bytes",logical type:"decimal",scale:6,precision:38</P></TD></TR><TR><TD width="130.073px"><P><SPAN>DECIMAL(p,s)</SPAN></P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.844px"><P>type:"bytes", logical type:"decimal",scale:s,precision:p</P></TD></TR><TR><TD width="130.073px"><P><SPAN>FLOAT</SPAN></P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.844px"><P>double</P></TD></TR><TR><TD width="130.073px"><P><SPAN>FLOAT(n), 1&lt;=n&lt;=24</SPAN></P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.844px"><P>float</P></TD></TR><TR><TD width="130.073px"><P><SPAN>FLOAT(n), 25&lt;=n&lt;=53</SPAN></P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.844px"><P>double</P></TD></TR><TR><TD width="130.073px"><P><SPAN>BOOLEAN</SPAN></P></TD><TD width="241.417px"><P>boolean</P></TD><TD width="255.844px"><P>boolean</P></TD></TR><TR><TD width="130.073px"><P><SPAN>DATE</SPAN></P></TD><TD width="241.417px"><P>string ('YYYY-MM-DD')</P></TD><TD width="255.844px"><P>type:"int",logical type:"date" (days from UNIX 0)</P></TD></TR><TR><TD width="130.073px"><P><SPAN>TIME</SPAN></P></TD><TD width="241.417px"><P>string ('HH:MM:SS.NNNNNNNNN')</P></TD><TD width="255.844px"><P>type:"long",logical type:"time-micros"</P></TD></TR><TR><TD width="130.073px"><P><SPAN>CLOB/NCLOB</SPAN></P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.844px"><P>string</P></TD></TR><TR><TD width="130.073px"><P><SPAN>SECONDDATE</SPAN></P></TD><TD width="241.417px"><P>string ('YYYY-MM-DDTHH:MM:SS.NNNNNNNNNZ')</P></TD><TD width="255.844px"><P>type:"long",logical type:"timestamp-micros" (microseconds after UNIX 0)</P></TD></TR><TR><TD width="130.073px"><P><SPAN>TIMESTAMP</SPAN></P></TD><TD width="241.417px"><P>string ('YYYY-MM-DDTHH:MM:SS.NNNNNNNNNZ')</P></TD><TD width="255.844px"><P>type:"long",logical type:"timestamp-micros" (microseconds after UNIX 0)</P></TD></TR><TR><TD width="130.073px"><P><SPAN>VARCHAR(n)</SPAN></P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.844px"><P>string</P></TD></TR><TR><TD width="130.073px"><P><SPAN>NVARCHAR(n)</SPAN></P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.844px"><P>string</P></TD></TR><TR><TD width="130.073px"><P><SPAN>ALPHANUM(n)</SPAN></P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.844px"><P>string</P></TD></TR><TR><TD width="130.073px"><P><SPAN>SHORTTEXT(n)</SPAN></P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.844px"><P>string</P></TD></TR><TR><TD width="130.073px"><P><SPAN>VARBINARY(n)</SPAN></P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.844px"><P>bytes</P></TD></TR></TBODY></TABLE><P><STRONG>Scenario: ABAP to Confluent</STRONG></P><TABLE width="632px"><TBODY><TR><TD width="135.167px"><P><STRONG>ABAP Data Dictionary Type (DDIC)</STRONG></P></TD><TD width="241.417px"><P><STRONG>JSON Type</STRONG></P></TD><TD width="255.521px"><P><STRONG>AVRO Type</STRONG></P></TD></TR><TR><TD width="135.167px"><P>INT1</P></TD><TD width="241.417px"><P>number</P></TD><TD 
width="255.521px"><P>int</P></TD></TR><TR><TD width="135.167px"><P>INT2</P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.521px"><P>int</P></TD></TR><TR><TD width="135.167px"><P>INT4</P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.521px"><P>int</P></TD></TR><TR><TD width="135.167px"><P>INT8</P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.521px"><P>long</P></TD></TR><TR><TD width="135.167px"><P>DEC</P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.521px"><P>type:"bytes", logical type:"decimal",scale:s,precision:d</P></TD></TR><TR><TD width="135.167px"><P>DF16_DEC / D16D</P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.521px"><P>type:"bytes", logical type:"decimal",scale:6,precision:28</P></TD></TR><TR><TD width="135.167px"><P>DF34_DEC / D34D</P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.521px"><P>type:"bytes",logical type:"decimal",scale:6,precision:38</P></TD></TR><TR><TD width="135.167px"><P>FLTP</P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.521px"><P>double</P></TD></TR><TR><TD width="135.167px"><P>CURR</P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.521px"><P>type:"bytes", logical type:"decimal",scale:s,precision:d</P></TD></TR><TR><TD width="135.167px"><P>QUAN</P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.521px"><P>type:"bytes", logical type:"decimal",scale:s,precision:d</P></TD></TR><TR><TD width="135.167px"><P>DECFLOAT16 / D16N</P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.521px"><P>type:"bytes", logical type:"decimal",scale:6,precision:28</P></TD></TR><TR><TD width="135.167px"><P>DECFLOAT34 / D34N</P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.521px"><P>type:"bytes",logical type:"decimal",scale:6,precision:38</P></TD></TR><TR><TD width="135.167px"><P>DF16_RAW / D16R</P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.521px"><P>type:"bytes", logical type:"decimal",scale:6,precision:28</P></TD></TR><TR><TD width="135.167px"><P>DF34_RAW/D34R</P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.521px"><P>type:"bytes",logical type:"decimal",scale:6,precision:38</P></TD></TR><TR><TD width="135.167px"><P>RAW</P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.521px"><P>bytes</P></TD></TR><TR><TD width="135.167px"><P>LRAW</P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.521px"><P>bytes</P></TD></TR><TR><TD width="135.167px"><P>RAWSTRING / RSTR</P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.521px"><P>bytes</P></TD></TR><TR><TD width="135.167px"><P>SRAWSTRING / SRST</P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.521px"><P>bytes</P></TD></TR><TR><TD width="135.167px"><P>CHAR</P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.521px"><P>string</P></TD></TR><TR><TD width="135.167px"><P>LCHR</P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.521px"><P>string</P></TD></TR><TR><TD width="135.167px"><P>SSTRING / SSTR</P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.521px"><P>string</P></TD></TR><TR><TD width="135.167px"><P>DATS</P></TD><TD width="241.417px"><P>string ('YYYY-MM-DD')</P></TD><TD width="255.521px"><P>type:"int",logical type:"date" (days from UNIX 0)</P></TD></TR><TR><TD width="135.167px"><P>DATN</P></TD><TD width="241.417px"><P>string ('YYYY-MM-DD')</P></TD><TD width="255.521px"><P>type:"int",logical type:"date" (days from UNIX 0)</P></TD></TR><TR><TD width="135.167px"><P>TIMS</P></TD><TD 
width="241.417px"><P>string ('HH:MM:SS.NNNNNNNNN')</P></TD><TD width="255.521px"><P>type:"long",logical type:"time-micros"</P></TD></TR><TR><TD width="135.167px"><P>ACCP</P></TD><TD width="241.417px"><P>number</P></TD><TD width="255.521px"><P>???</P></TD></TR><TR><TD width="135.167px"><P>NUMC</P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.521px"><P>string</P></TD></TR><TR><TD width="135.167px"><P>CLNT</P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.521px"><P>string</P></TD></TR><TR><TD width="135.167px"><P>LANG</P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.521px"><P>string</P></TD></TR><TR><TD width="135.167px"><P>UTCLONG / UTCL</P></TD><TD width="241.417px"><P>string ('YYYY-MM-DDTHH:MM:SS.NNNNNNNNNZ')</P></TD><TD width="255.521px"><P>type:"long",logical type:"timestamp-micros" (microseconds after UNIX 0)</P></TD></TR><TR><TD width="135.167px"><P>CUKY</P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.521px"><P>string</P></TD></TR><TR><TD width="135.167px"><P>UNIT</P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.521px"><P>string</P></TD></TR><TR><TD width="135.167px"><P>STRING / STRG</P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.521px"><P>string</P></TD></TR><TR><TD width="135.167px"><P>GEOM_EWKB / GGM1</P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.521px"><P>string</P></TD></TR><TR><TD width="135.167px"><P>TIMN</P></TD><TD width="241.417px"><P>string ('HH:MM:SS.NNNNNNNNN')</P></TD><TD width="255.521px"><P>type:"long",logical type:"time-micros"</P></TD></TR><TR><TD width="135.167px"><P>Domain TZNTSTMPL</P></TD><TD width="241.417px"><P>string ('YYYY-MM-DDTHH:MM:SS.NNNNNNNNNZ')</P></TD><TD width="255.521px"><P>type:"long",logical type:"timestamp-micros" (microseconds after UNIX 0)</P></TD></TR><TR><TD width="135.167px"><P>Domain TZNTSTMPS</P></TD><TD width="241.417px"><P>string ('YYYY-MM-DDTHH:MM:SS.NNNNNNNNNZ')</P></TD><TD width="255.521px"><P>type:"long",logical type:"timestamp-micros" (microseconds after UNIX 0)</P></TD></TR><TR><TD width="135.167px"><P>Domain SYSUUID_X16 and SYSUUID</P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.521px"><P>string</P></TD></TR><TR><TD width="135.167px"><P>Domain SYSUUID_C22 SYSUUID_22</P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.521px"><P>string</P></TD></TR><TR><TD width="135.167px"><P>Domain SYSUUID_C26</P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.521px"><P>string</P></TD></TR><TR><TD width="135.167px"><P>Domain SYSUUID_C32 and SYSUUID_C</P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.521px"><P>string</P></TD></TR><TR><TD width="135.167px"><P>Domain SYSUUID_C36</P></TD><TD width="241.417px"><P>string</P></TD><TD width="255.521px"><P>string</P></TD></TR></TBODY></TABLE><P>In the final section we will have a look on the behavior of SAP Datasphere Replication Flows when they are scheduled and a Confluent cluster is selected as the target system.</P><P>&nbsp;</P><H1 id="toc-hId--706784381">5. Scenarios or what happens if...</H1><P>In this section we assume that the Kafka topic a Replication Flow/Task is supposed to write to already exists in the target Confluent Cluster. By making different assumptions, we explain the behaviour of a Replication Flow/Task that is configured to use the already existing topic. We will again leverage our small <EM>Demo Table</EM> setup and we assume that the target Kafka topic <EM>Demo Table</EM> already exists. 
</P><P>In general, the concepts of the Kafka schema registry are always applied when a new schema is registered during a Replication Flow run.<BR /><BR /></P><P><STRONG>Scenario 1: A schema entry in the schema registry does not exist.</STRONG></P><P>A schema is registered in the schema registry using the subject name strategy that is specified in the Replication Flow. The messages are written by the Replication Flow into the already existing topic.<BR /><BR /></P><P><STRONG>Scenario 2: A schema entry in the schema registry exists, but with a different subject name strategy than the one specified in the Replication Flow</STRONG></P><P>A new schema is registered in the schema registry using the subject name strategy that is specified in the Replication Flow. The messages are written by the Replication Flow into the already existing topic.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure 15 - Demo Table Example for Scenario 2" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106379i563BE2CD42DEE8A1/image-size/large?v=v2&amp;px=999" role="button" title="Figure 16.png" alt="Figure 15 - Demo Table Example for Scenario 2" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure 15 - Demo Table Example for Scenario 2</span></span></P><P><STRONG>Scenario 3:</STRONG> <STRONG>A schema entry in the schema registry exists with the same subject name strategy as the one specified in the Replication Flow</STRONG></P><P>Assumption A: The compatibility type that is specified in the Replication Flow coincides with the compatibility type of the already existing schema/subject definition, and the new schema definition is different from the already existing one but complies with the compatibility type.</P><UL><LI>A new schema version is registered based on the compatibility type that was specified for the existing schema/subject. The messages are written by the Replication Flow into the already existing topic.<BR /><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure 16 - Demo Table Example for Scenario 3 Assumption A" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106393i3B54E1E285F636A3/image-size/large?v=v2&amp;px=999" role="button" title="Figure 17.png" alt="Figure 16 - Demo Table Example for Scenario 3 Assumption A" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure 16 - Demo Table Example for Scenario 3 Assumption A</span></span></LI></UL><P> </P><P>Assumption B: The compatibility type that is specified in the Replication Flow is different from the compatibility type of the already existing schema/subject definition, but the schema definition is the same.</P><UL><LI>The compatibility type is overwritten, and no new schema is registered in the schema registry. 
The messages are written by the Replication Flow into the already existing topic.<BR /><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure 17 - Demo Table Example for Scenario 3 Assumption B Part I" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106384i6C1A964F4630C35E/image-size/large?v=v2&amp;px=999" role="button" title="Figure 18.png" alt="Figure 17 - Demo Table Example for Scenario 3 Assumption B Part I" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure 17 - Demo Table Example for Scenario 3 Assumption B Part I</span></span></LI><LI>Watch out that such a change may produce schema version chains with changes that do not comply with the compatibility type that is specified in the schema definition.<BR /><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure 18 - Demo Table Example for Scenario 3 Assumption B Part II" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106387iEF788E7F2AF2888E/image-size/large?v=v2&amp;px=999" role="button" title="Figure 19.png" alt="Figure 18 - Demo Table Example for Scenario 3 Assumption B Part II" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure 18 - Demo Table Example for Scenario 3 Assumption B Part II</span></span></LI></UL><P>Assumption C: The compatibility type that is specified in the Replication Flow is different from the compatibility type of the already existing schema/subject definition and the schema definition is different but complies with the new compatibility type.</P><UL><LI>The old compatibility type is overwritten and a new schema version for the already existing subject is registered in the schema registry applying the new compatibility type<BR /><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure 19 - Example: Compatibility type Backward is overwritten by Forward and a new schema version is registered that introduces a forward compatible change." style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106392i3B73A8251D76594E/image-size/large?v=v2&amp;px=999" role="button" title="Figure 20.png" alt="Figure 19 - Example: Compatibility type Backward is overwritten by Forward and a new schema version is registered that introduces a forward compatible change." 
/><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure 19 - Example: Compatibility type Backward is overwritten by Forward and a new schema version is registered that introduces a forward compatible change.</span></span></LI></UL><P>Assumption D : The compatibility type that is specified in the Replication Flow/Task is the same as the one that is specified in the already existing Schema/Subject definition but the schema in the Replication Flow is different or introduces a change that does not comply with the compatibility type.</P><UL><LI>The Replication Flow fails with an error message that indicated that the new message schema that is supposed to be registered in the schema registry is incompatible with the already existing one.<BR /><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure 20 - Demo Table Example for Scenario C Assumption D" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106395i8217E853BF9C9C80/image-size/large?v=v2&amp;px=999" role="button" title="Figure 21.png" alt="Figure 20 - Demo Table Example for Scenario C Assumption D" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure 20 - Demo Table Example for Scenario C Assumption D</span></span></LI></UL><H1 id="toc-hId--903297886">Summary</H1><P>In this blog post we introduced new integration capabilities of SAP Datasphere with Confluent Cloud and Confluent Platform.&nbsp;The intention was to provide as many details as possible and provide step-by-step guides.</P> 2024-05-07T15:03:22.063000+02:00 https://community.sap.com/t5/technology-blogs-by-sap/tracking-hana-machine-learning-experiments-with-mlflow-a-technical-deep/ba-p/13692481 Tracking HANA Machine Learning experiments with MLflow: A technical Deep Dive 2024-05-08T17:00:00.007000+02:00 martinboeckling https://community.sap.com/t5/user/viewprofilepage/user-id/43098 <H1 id="toc-hId-865290017">Introduction</H1><P><SPAN><SPAN class="">This blog post is part of a series describing the usage of MLflow with HANA Machine Learning co-authored by &nbsp;<a href="https://community.sap.com/t5/user/viewprofilepage/user-id/39047">@stojanm</a>&nbsp;and <a href="https://community.sap.com/t5/user/viewprofilepage/user-id/43098">@martinboeckling</a>.</SPAN> In this blog post we provide a more technical deep dive on the setup of a MLFlow instance and provide a general introduction how Machine Learning models trained with HANA ML can be logged with MLflow. The first blog post of the blog series is called&nbsp;<A href="https://community.sap.com/t5/technology-blogs-by-sap/tracking-hana-machine-learning-experiments-with-mlflow-a-conceptual-guide/ba-p/13688478" target="_self">Tracking HANA Machine Learning experiments with MLflow: A Conceptual Guide for MLOps</A>&nbsp;and gives an introduction to the topic of MLOps with MLflow.</SPAN></P><P>Starting with the python HANA ML package version 2.13, HANA Machine Learning added support for tracking of experiments with the MLflow package, which makes the incorporation of models developed using HANA Machine Learning into a comprehensive MLOps pipeline easy to achieve.&nbsp;</P><P>In this blog post we will provide an overview how MLflow can be used together with HANA ML. MLflow, which manages the experiment tracking and artefact management can run as a managed service at a hyperscaler platform, deployed locally or on a remote infrastructure. 
In the following we describe how to deploy MLflow on SAP Business Technology Platform and how to track your HANA machine learning experiments with MLflow. In addition, we present which methods and algorithms in the hana-ml package currently support the experiment tracking feature. Finally, we touch on the possibility to use logged models in MLflow for prediction.</P><H1 id="toc-hId-668776512">Prerequisites</H1><P>In this blog post we solely focus on the technical integration of HANA ML and MLflow as a logging platform. Generally, we assume that Python is already installed together with an already established development environment. Furthermore, we will not explain all details of Docker and Cloud Foundry, but simply focus on the essential parts for HANA ML and MLflow within this blog post.</P><H1 id="toc-hId-472263007">Set up MLFlow on BTP</H1><P><SPAN>MLflow is leveraged and integrated in different solutions. For example, Databricks as well as Machine Learning in Microsoft Fabric provide a managed MLflow instance. In case MLflow is not yet provided, we outline in this section a possibility to deploy MLflow on SAP BTP. For simplicity, we focus on the SQLite-based deployment of MLflow. However, for productive environments it is recommended to separate the storage from the runtime of the MLflow instance. A detailed explanation for setting up your own MLflow server with alternatives to SQLite can be found under the following&nbsp;</SPAN><SPAN>link:&nbsp;<A href="https://mlflow.org/docs/latest/tracking/server.html" target="_blank" rel="noopener nofollow noreferrer">https://mlflow.org/docs/latest/tracking/server.html</A>. In the following paragraphs, we give a step-by-step overview of how to set up your own MLflow instance using SQLite on BTP.</SPAN></P><P>As a first step, we create a local Dockerfile from which we can build an image for our BTP environment. The following code snippet shows the instructions used to construct your own Docker image locally. Paste the following code into a file called Dockerfile within your desired local folder.</P><P>&nbsp;</P><P>&nbsp;</P><pre class="lia-code-sample language-bash"><code># Retrieve a Python version as the base runtime for our Docker container
FROM python:3.10-slim

# Install the mlflow package
RUN pip install mlflow

# Create a folder within our Docker container to store the backend database and artifacts
RUN mkdir -p /mlflow

# Expose port 7000 to make the application running within Docker accessible over the defined port
EXPOSE 7000

# Define the environment variables BACKEND_URI and ARTIFACT_ROOT for the backend URI as well as the artifact root
ENV BACKEND_URI sqlite:///mlflow//mlflow.db
ENV ARTIFACT_ROOT /mlflow/artifacts

# Run the shell command to start the MLflow server within our Docker container
CMD mlflow server --backend-store-uri ${BACKEND_URI} --host 0.0.0.0 --port 7000</code></pre><P>&nbsp;</P><P>&nbsp;</P><P>After successfully creating the Dockerfile, you can run&nbsp;<STRONG>docker build -t {tagname} .</STRONG>&nbsp;to build your Docker image locally. To make the image available, we push it to a Docker registry. In our example, we assume that you already have a Docker registry set up where you can push your image to. 
For that step, you can run the following commands: <STRONG>docker tag {tagname} {dockerhub repository tag}</STRONG>,&nbsp;<STRONG>docker push {</STRONG>dockerhub repository tag<STRONG>&nbsp;}.&nbsp;</STRONG>After the successful run of the command, you see within your private docker hub the newly published docker container, which contains MLflow and all its dependencies inside of it.</P><P>After the successful publishing of your docker image to your registry, we can run the following command to create a BTP app based on the published docker image:</P><P>&nbsp;</P><P>&nbsp;</P><pre class="lia-code-sample language-bash"><code>cf push APP-NAME --docker-image REPO/IMAGE:TAG</code></pre><P>&nbsp;</P><P>&nbsp;</P><P>After successfully publishing the docker image on our BTP Cloud Foundry environment, we can find our published app within our BTP account and are able to access&nbsp;it under the published URL.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="MLFlow Initial UI.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106232iB3F8015926221314/image-size/large?v=v2&amp;px=999" role="button" title="MLFlow Initial UI.png" alt="MLFlow Initial UI.png" /></span></P><H2 id="toc-hId-404832221"><SPAN>Set up tracking for MLflow</SPAN></H2><P>With MLflow users have the possibility to track their trained HANA ML models. In the following paragraph, we introduce the aspects that are needed to be able to log HANA ML models into MLflow itself.&nbsp;</P><P>To be able to use MLFlow together with HANA ML, we need to first install besides the HANA ML package also the MLFlow package. Therefore, you need to run the following command in your virtual environment, to be able to run the following scripts.</P><P>&nbsp;</P><P>&nbsp;</P><pre class="lia-code-sample language-bash"><code>pip install mlflow hana-ml</code></pre><P>&nbsp;</P><P>&nbsp;</P><P>As a general setup, we first need to run the following command to set up our tracking with MLFlow to our available MLFlow instance. Therefore, place into the following two lines first your personal MLFlow tracking URI and your own custom experiment. In case you do not want to create a separate experiment, the different runs together with the MLFlow model are stored under the default experiment.</P><P>The method that allows us to track HANA ML models is implemented in the HANA ML package and is called enable_mlflow_autologging(schema=None, meta=None, is_exported=False, registered_model_name=None). 
This method can be used for initialised HANA ML models that are under the following methods:</P><UL><LI><A href="https://help.sap.com/doc/1d0ebfe5e8dd44d09606814d83308d4b/2.0.07/en-US/pal/algorithms/hana_ml.algorithms.pal.auto_ml.AutomaticClassification.html#hana_ml.algorithms.pal.auto_ml.AutomaticClassification.enable_mlflow_autologging" target="_blank" rel="noopener noreferrer">AutomaticClassification</A></LI><LI><A href="https://help.sap.com/doc/1d0ebfe5e8dd44d09606814d83308d4b/2.0.07/en-US/pal/algorithms/hana_ml.algorithms.pal.auto_ml.AutomaticRegression.html#hana_ml.algorithms.pal.auto_ml.AutomaticRegression.enable_mlflow_autologging" target="_blank" rel="noopener noreferrer">AutomaticRegression</A></LI><LI><A href="https://help.sap.com/doc/1d0ebfe5e8dd44d09606814d83308d4b/2.0.07/en-US/pal/algorithms/hana_ml.algorithms.pal.auto_ml.AutomaticTimeSeries.html#hana_ml.algorithms.pal.auto_ml.AutomaticTimeSeries.enable_mlflow_autologging" target="_blank" rel="noopener noreferrer">AutomaticTimeSeries</A></LI><LI><A href="https://help.sap.com/doc/1d0ebfe5e8dd44d09606814d83308d4b/2.0.07/en-US/pal/algorithms/hana_ml.algorithms.pal.pipeline.Pipeline.html#hana_ml.algorithms.pal.pipeline.Pipeline.enable_mlflow_autologging" target="_blank" rel="noopener noreferrer">Pipeline</A></LI><LI><A href="https://help.sap.com/doc/1d0ebfe5e8dd44d09606814d83308d4b/2.0.07/en-US/pal/algorithms/hana_ml.algorithms.pal.unified_classification.UnifiedClassification.html#hana_ml.algorithms.pal.unified_classification.UnifiedClassification.enable_mlflow_autologging" target="_blank" rel="noopener noreferrer">Unified_Classification</A></LI><LI><A href="https://help.sap.com/doc/1d0ebfe5e8dd44d09606814d83308d4b/2.0.07/en-US/pal/algorithms/hana_ml.algorithms.pal.unified_regression.UnifiedRegression.html#hana_ml.algorithms.pal.unified_regression.UnifiedRegression.enable_mlflow_autologging" target="_blank" rel="noopener noreferrer">Unified_Regression</A></LI></UL><P>Within the method enable_mlflow_autologging the user has different keywords that can be filled that allows us to influence the behaviour of our MLFlow autologging in HANA ML.</P><UL><LI>schema: Defines the HANA database schema for MLFlow autologging where the MLflow logging table is stored</LI><LI>meta: Defines the name of the model storage table in HANA database</LI><LI>is_exported: Determines if the hana model binaries should be exported to MLflow</LI><LI>registered_model_name: Name of the model stored in MLflow</LI></UL><P>In the following section we provide an overview for the Unified Interface how the logging of MLflow can be used.</P><H1 id="toc-hId-79235997">Run HANA ML Algorithms with MLflow</H1><P>As we have explained and outlined in the sections above, we have created a MLFlow instance and have introduced the syntax that is needed for the logging of HANA ML models in MLflow. In the following sections we will provide based on an example how the logging of HANA ML models on MLflow is done.</P><H2 id="toc-hId-11805211">Model training of HANA ML with MLFlow</H2><P>For the training of HANA ML in combination with MLflow, we focus in this blog post on the Unified Method. 
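</P><P>As a quick reference before we walk through the example, the following is an illustrative sketch (not taken from the original post) of how the enable_mlflow_autologging keywords described earlier could be filled for the Unified Classification used in the remainder of this post; the schema, table and model names are placeholders.</P><pre class="lia-code-sample language-python"><code># Illustrative sketch: enable_mlflow_autologging with all keywords set.
# The schema, table and registered model names are placeholders.
from hana_ml.algorithms.pal.unified_classification import UnifiedClassification

uc = UnifiedClassification(func="HybridGradientBoostingTree")
uc.enable_mlflow_autologging(
    schema="MLFLOW_LOGGING",                  # placeholder HANA schema holding the MLflow logging table
    meta="HANAML_MODEL_STORAGE",              # placeholder name of the model storage table in the HANA database
    is_exported=True,                         # export the HANA model binaries to MLflow
    registered_model_name="bank-marketing-classifier",  # placeholder name of the model registered in MLflow
)</code></pre>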
We apply for the respective elements a Classification on the sample bank dataset which can be found under the&nbsp;<A href="https://github.com/SAP-samples/hana-ml-samples/blob/main/Python-API/pal/datasets/bank-additional-full-with-header.csv" target="_blank" rel="noopener nofollow noreferrer">HANA ML sample dataset folder on GitHub</A>.</P><P>The dataset can either be uploaded directly to the SAP HANA database or you could also use SAP Datasphere as your starting point. Generally, to use HANA ML directly you would need to store the dataset in a HANA database. However, HANA ML also provides methods to integrate third party files/ data structures. This involves Pandas, Spark as also shapefiles. In addition also HANA Data Lake file tables can be integrated with HANA ML functionalities. An overview of the different methods can be found under the&nbsp;<A href="https://help.sap.com/doc/1d0ebfe5e8dd44d09606814d83308d4b/2.0.07/en-US/hana_ml.dataframe.html" target="_blank" rel="noopener noreferrer">following page</A>. In the following paragraphs, we will go through the sample code that we have created to combine HANA ML and MLFlow.</P><H2 id="toc-hId--184708294">Connect to HANA database (Deployed under SAP Datasphere)</H2><P>To be able to connect to the HANA database instance, we first need to build up a connection to the HANA database. In our example, we load the data from the data samples provided by HANA ML. During the time of this blog post, the OpenSQL schema of Datasphere only supports Basic Authentication. Therefore, in this blog post we only elaborate how the connection is done over basic authentication. SAP HANA standalone supports however non-basic authentication, which are also supported in the HANA ML package to connect certificate based to the SAP HANA instance.</P><P>To establish the connection to the HANA database, we make use of the implemented HANA ML dataframe class and call the method ConnectionContext. We store the instance of the connection in the variable&nbsp;<STRONG>conn</STRONG>. To now be able to establish the connection to the HANA database view or table, we will need to specify over the method table the connection. The beautiful aspect is that overall, the dataset is not going to be loaded to the Python runtime, but will only be represented with a proxy to the actual table in the HANA database. All transformations, if done over the methods of HANA ML, are then pushed down to the database itself and executed there if the training is executed. In our case, we load the sample dataset into our database by making use of the provided methods of the HANA ML package.</P><P>&nbsp;</P><P>&nbsp;</P><pre class="lia-code-sample language-python"><code>""" This script provides a short example how HANA ML and MLFlow can be integrated together. The credentials to the database are following the currently supported authentification (Basic) of the SAP Datasphere OpenSQL schema. Overall, HANA Cloud standalone is also able to support multiple other authentification methods. We have used an abstraction python file (constants) where we retrieve the securely stored authentification properties. 
To get more details about the exact method structure needed, please have a look at the documentation: https://help.sap.com/doc/1d0ebfe5e8dd44d09606814d83308d4b/2.0.07/en-US/hana_ml.dataframe.html#hana_ml.dataframe.ConnectionContext """ from hana_ml import dataframe from hana_ml.algorithms.pal.unified_classification import UnifiedClassification from hana_ml.algorithms.pal.auto_ml import AutomaticClassification import mlflow from hana_ml.algorithms.pal.auto_ml import Preprocessing from hana_ml.algorithms.pal.partition import train_test_val_split from constants import db_url, db_user, db_password # dataset retrieval conn = dataframe.ConnectionContext(address=db_url, port=443, user=db_user, password=db_password) dataset_data, training_data, _, test_data = DataSets.load_bank_data(connection=conn, schema=schema_name, train_percentage=0.7, valid_percentage=0, test_percentage=0.3, seed=43)</code></pre><P>&nbsp;</P><P>&nbsp;</P><P>After connecting to the database, the user is able to use the preprocessing methods implemented in HANA ML. Generally, the different changes are pushed down to the HANA database and are not executed within the Python runtime. In our use case, we do not need to use the data preprocessing as we directly retrieve a sample dataset which we can directly use for our ML training.</P><P>After finishing the potentially needed transformations, we are now able to implement the tracking of our HANA ML runs with the possibility of MLFlow. Similar to the normal usage of MLFlow, we set up first our tracking uri under which we want to store our HANA ML runs and models. In your case you would need to change the keyword <STRONG>mlflow_tracking_uri</STRONG> with your respective MLflow tracking URL. Furthermore, we then are able to specify the experiment name under which the runs are tracked. If we do not specify a specific experiment, the runs are tracked under the Default experiment.</P><P>&nbsp;</P><P>&nbsp;</P><pre class="lia-code-sample language-python"><code># set up MLFlow mlflow.set_tracking_uri(mlflow_tracking_uri) mlflow.set_experiment("HANA ML Experiment")</code></pre><P>&nbsp;</P><P>&nbsp;</P><P>In the following chapters, we will provide an outline how the exact training is performed and what components are logged to MLflow.</P><H3 id="toc-hId--252139080">Unified Method</H3><P>For the example we use the implemented Hybrid Gradient Boosting Tree as a classification algorithm for our Classification. In order to perform the classification, we use the Unified Classification in order to be able to run our algorithm. On the defined variable, we then use the implemented enable_mlflow_autologging method. This allows us to directly log the model using implemented auto logging behaviour.</P><P>&nbsp;</P><P>&nbsp;</P><pre class="lia-code-sample language-python"><code>uc = UnifiedClassification(func="HybridGradientBoostingTree") uc.enable_mlflow_autologging() uc.fit(training_data, key="ID", label="responded")</code></pre><P>&nbsp;</P><P>&nbsp;</P><P>We call the fit method once we have initiated the HANA ML model variable and the associated autologging for MLflow. For the fit method, we have in total two different options. Firstly, the non-partitioned training dataset where we only use the training dataset. If we decide to partition our training dataset, we allow to create a validation dataset for which we can log metrics automatically during training.</P><P>If we do not define for our fit function the partitioning, we will not log metrics within MLFlow. 
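To make this difference concrete, the following sketch shows both variants of the fit call; it is our own illustration, and the partitioning parameters simply mirror the full script shown later in this post.</P><P>&nbsp;</P><P>&nbsp;</P><pre class="lia-code-sample language-python"><code># Assumption: uc, training_data and the MLflow setup are defined as in the snippets above.
# In practice you would call only one of the two variants.

# Variant 1: no partitioning -- the model is logged, but no metrics appear in MLflow.
uc.fit(training_data, key="ID", label="responded")

# Variant 2: let fit() partition the data itself -- classification metrics such as
# AUC, precision, recall or F1 score are then logged to the tracked run as well.
uc.fit(training_data,
       key="ID",
       label="responded",
       partition_method="stratified",
       stratified_column="ID",
       partition_random_state=43)</code></pre><P>&nbsp;</P><P>&nbsp;</P><P>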
In the following image, you can see how a potential HANA ML tracked run looks like in MLFlow together with the stored HANA ML model in MLflow.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="MLflow Initial Model.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107009i7D8C6FCD09145880/image-size/large?v=v2&amp;px=999" role="button" title="MLflow Initial Model.png" alt="MLflow Initial Model.png" /></span></P><P><SPAN>If we decide to partition our dataset, here for instance to partition the dataset along the defined primary key, we are able to directly log evaluation metrics relevant for the Classification we have used. This includes the following metrics: AUC, Recall, Precision, F1 Score, Accuracy, Kappa coefficient and the Mathews Correlation Coefficient (MCC). This would directly allow us to compare multiple runs within our MLFlow project to one another and measure the different performances.</SPAN></P><P><FONT face="inherit"><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="MLFlow Metric Model.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107015i7FB44324969988A4/image-size/large?v=v2&amp;px=999" role="button" title="MLFlow Metric Model.png" alt="MLFlow Metric Model.png" /></span></FONT></P><P><FONT face="inherit">&nbsp;In addition to the general run, HANA ML also logs the model to MLflow. What is logged to MLflow depends on the parameters set&nbsp;for the method </FONT><STRONG>enable_mlflow_autologging</STRONG><FONT face="inherit">. If for instance everything is set to the default settings, we will see the following yaml file to be logged </FONT>to MLflow.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="MLflow model.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107027i73AC9FF7204BAD74/image-size/large?v=v2&amp;px=999" role="button" title="MLflow model.png" alt="MLflow model.png" /></span></P><P>&nbsp;<SPAN>If within the method&nbsp;</SPAN><STRONG>enable_mlflow_autologging </STRONG><FONT face="inherit">the parameter is set to is_exported, the model binaries stored in the model storage on HANA are exported to MLflow. This setting would allow us to </FONT>retrieve<FONT face="inherit">&nbsp;the trained model from MLflow and use it in a different HANA database for prediction purposes. In addition to the yaml file containing the metadata we now can see a created subfolder called models which contains the necessary model </FONT>artefacts<FONT face="inherit">&nbsp;normally stored in the HANA database now in MLflow.</FONT></P><P><FONT face="inherit"><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="MLflow exported model.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107028i5B476B0754A68F6C/image-size/large?v=v2&amp;px=999" role="button" title="MLflow exported model.png" alt="MLflow exported model.png" /></span></FONT></P><P><FONT face="inherit">After the training is finished, we have </FONT><SPAN>besides the auto logging capabilities of HANA ML for MLflow the possibility to track further artefacts in MLflow. 
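</SPAN></P><P>Before moving on to those, it may help to see the autologging keywords described earlier combined in a single call. The following sketch is purely illustrative: only the keyword names come from the documented signature, while the schema, table and model names are our own assumptions.</P><P>&nbsp;</P><P>&nbsp;</P><pre class="lia-code-sample language-python"><code># Illustrative sketch -- schema, table and model names are assumptions.
uc.enable_mlflow_autologging(
    schema="ML_SCHEMA",                         # HANA schema holding the MLflow logging table
    meta="HANAML_MLFLOW_MODEL_STORAGE",         # name of the model storage table in the HANA database
    is_exported=True,                           # also export the HANA model binaries to MLflow
    registered_model_name="bank_marketing_uc"   # register the model under this name in MLflow
)</code></pre><P>&nbsp;</P><P>&nbsp;</P><P><SPAN>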
In the following section we will outline a few possibilities that exist with the additional tracking.</SPAN></P><H2 id="toc-hId--577735304">Additional logging possibilities</H2><P>Besides the outlined auto logging capabilities, we can track with MLFlow additional artefacts to the respective run. In the following chapters, we outline selected possibilities to further enrich the auto logging for HANA ML runs tracked in MLFlow.</P><H3 id="toc-hId--645166090">Adding run and experiment description</H3><P><SPAN>The description in the experiment section can be handy once the number of your experiments grows in the repository. In addition, mlflow allows to also add individual description to each run of an experiment. Using the following methods you can set up both:</SPAN></P><P>&nbsp;</P><P>&nbsp;</P><pre class="lia-code-sample language-python"><code>from mlflow.tracking import MlflowClient current_experiment=dict(mlflow.get_experiment_by_name("HANA ML Experiment")) experiment_id=current_experiment['experiment_id'] run = mlflow.active_run() MlflowClient().set_experiment_tag(experiment_id,"mlflow.note.content", "This experiment shows the automated methods of HANA machine learning and how to track them with MLFLOW") MlflowClient().set_tag(run.info.run_id, "mlflow.note.content", "This is a run tracked with Unified Classification from HANA Machine Learning")</code></pre><P>&nbsp;</P><P>&nbsp;</P><H3 id="toc-hId--916910964">Logging input datasets</H3><P><SPAN>Sometimes it is important to keep the input dataset also as part of the tracking with MLflow. Since HANA machine learning datasets are located in HANA, they need to be converted to pandas DataFrames to be tracked as shown in the following code:</SPAN></P><P>&nbsp;</P><P>&nbsp;</P><pre class="lia-code-sample language-python"><code># Store training dataset in MLFlow itself pandas_training_dataset = training_data.collect() mlflow_dataset = mlflow.data.from_pandas(pandas_training_dataset, name="Customer data", targets="LABEL") mlflow.log_input(mlflow_dataset, context='training')</code></pre><P>&nbsp;</P><P>&nbsp;</P><P><SPAN>This results in the change, that the respective state of the training data is logged to the current run. The logged dataset can be found in the associated MLflow run, where the schema of the dataset is provided together with some metadata information about the number of rows and number of elements. In addition, also the provided context is marked in the UI of MLflow.</SPAN></P><P><SPAN><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="2024-05-07_16-02-25.png" style="width: 799px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/106929iF8F82FDDFDF435A7/image-size/large?v=v2&amp;px=999" role="button" title="2024-05-07_16-02-25.png" alt="2024-05-07_16-02-25.png" /></span></SPAN></P><H3 id="toc-hId--1113424469"><SPAN>Logging a model report</SPAN></H3><P><SPAN>In addition to the logging of the dataset, it might also be important to add a model report to MLFlow. HANA ML generally provides different interactive visualisations for the trained model artefact, which can be stored as an HTML file. After the storing of the model report to your local repository, we can log the input of the model report to our current run. This allows us to interactively explore the model report automatically generated by HANA ML and make it accessible in MLFlow. 
To log the HANA ML model report, you can use the following code snippet.</SPAN></P><P>&nbsp;</P><P>&nbsp;</P><pre class="lia-code-sample language-python"><code># create additional model report in MLFlow UnifiedReport(uc).display(save_html="UnifiedReport") mlflow.log_artifact("UnifiedReport_unified_classification_model_report.html")</code></pre><P>&nbsp;</P><P>&nbsp;</P><P><SPAN>After the Model report is stored successfully under the current run, we can see in the artefact&nbsp;tab in MLFlow the interactive model report:</SPAN></P><P><SPAN><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="artifact_mlflow.gif" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107064i48983C52AA76D768/image-size/large?v=v2&amp;px=999" role="button" title="artifact_mlflow.gif" alt="artifact_mlflow.gif" /></span></SPAN></P><P>&nbsp;<SPAN>The complete script used for this section can be found in the following code snippet:</SPAN></P><P>&nbsp;</P><P>&nbsp;</P><pre class="lia-code-sample language-python"><code>from hana_ml import dataframe from hana_ml.algorithms.pal.unified_classification import UnifiedClassification from hana_ml.visualizers.unified_report import UnifiedReport import mlflow from hana_ml.algorithms.pal.utility import DataSets from constants import db_url, db_user, db_password # dataset retrieval conn = dataframe.ConnectionContext(address=db_url, port=443, user=db_user, password=db_password) dataset_data, training_data, _, test_data = DataSets.load_bank_data(connection=conn, schema=schema_name, train_percentage=0.7, valid_percentage=0, test_percentage=0.3, seed=43) # set up MLflow mlflow.set_tracking_uri(tracking_uri) mlflow.set_experiment("HANA ML Experiment") # set up classification uc = UnifiedClassification(func="HybridGradientBoostingTree") uc.enable_mlflow_autologging(is_exported=True) # train model uc.fit(training_data, key="ID", label="LABEL", partition_method="stratified", stratified_column="ID", partition_random_state=43, build_report=True) # create additional model report in MLFlow UnifiedReport(uc).display(save_html="UnifiedReport") mlflow.log_artifact("UnifiedReport_unified_classification_model_report.html") # Store training dataset in MLFlow itself pandas_training_dataset = training_data.collect() mlflow_dataset = mlflow.data.from_pandas(pandas_training_dataset, name="Customer data", targets="LABEL") mlflow.log_input(mlflow_dataset, context='training')</code></pre><P>&nbsp;</P><P>&nbsp;</P><H2 id="toc-hId--1016534967">Apply of trained model</H2><P>After we have finished our training, we are able with HANA ML to retrieve the model from MLFLow and use it for our prediction purposes. For this purpose, we will create a separate Python script where we will provide an overview to retrieve the trained MLflow model.</P><P>Similar to our training script, we first set up our connection to the HANA database and establish the connection to our table. 
In our case, we simply use the sample dataset provided by HANA ML.</P><P>&nbsp;</P><P>&nbsp;</P><pre class="lia-code-sample language-python"><code>from hana_ml import dataframe from hana_ml.algorithms.pal.unified_classification import UnifiedClassification from hana_ml.visualizers.unified_report import UnifiedReport import mlflow from hana_ml.algorithms.pal.utility import DataSets from constants import db_url, db_user, db_password # dataset retrieval conn = dataframe.ConnectionContext(address=db_url, port=443, user=db_user, password=db_password) dataset_data, training_data, _, test_data = DataSets.load_bank_data(connection=conn, schema=schema_name, train_percentage=0.7, valid_percentage=0, test_percentage=0.3, seed=43)</code></pre><P>&nbsp;</P><P>&nbsp;</P><P><SPAN>Similar to our training script, we need to set the tracking url for MLflow and need to initiate the model storage of HANA. If we have decided to not export the HANA ML model to MLflow, we need to specify the same schema for the model storage where our HANA ML model is stored after the successful run. In case we have exported our model, we are able to specify a different schema. In the following, you can see the necessary script in order to retrieve the logged HANA ML model from MLflow.</SPAN></P><P>&nbsp;</P><P>&nbsp;</P><pre class="lia-code-sample language-python"><code># set up MLFlow and model storage mlflow.set_tracking_uri(tracking_url) model_storage = ModelStorage(connection_context=conn, schema=schema_name)</code></pre><P>&nbsp;</P><P>&nbsp;</P><P>After the model storage has been initiated, we are able to retrieve the stored HANA ML model from MLflow. In order to select the correct model, you need to extract the correct run id associated to the model you would like to apply for your prediction dataset. In our case, this is the test dataset we have received from the sample dataset method. The model_uri needed for the model retrieval is consisting of the following pattern 'runs:/{run id}/model', in which you would need to exchange the run id with your respective run. For the actual retrieval of the model, we use the initiated model storage, in our case called <STRONG>model_storage</STRONG>&nbsp;and call the method <STRONG>load_mlflow_model</STRONG> to load the MLflow model to our HANA database and assign the respective proxy to our variable <STRONG>mymodel</STRONG>. The variable <STRONG>mymodel</STRONG> is then used to call the predict method in order to apply our model to our dataset. In the end we transform our prediction dataset into a Pandas DataFrame to look at the content of the created DataFrame. 
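As a small, hedged addition, the prediction result could also be written back to a persistent table instead of only being collected; the target table name in the sketch below is our own assumption.</P><P>&nbsp;</P><P>&nbsp;</P><pre class="lia-code-sample language-python"><code># Sketch only: persist the prediction result as a table in the connected schema.
# dataset_data_predict is the prediction DataFrame created in the snippet below;
# "BANK_PREDICTIONS" is an assumed target table name.
dataset_data_predict.save("BANK_PREDICTIONS")</code></pre><P>&nbsp;</P><P>&nbsp;</P><P>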
Normally, we could directly persist the created temporary table with the <A href="https://help.sap.com/doc/1d0ebfe5e8dd44d09606814d83308d4b/2.0.07/en-US/hana_ml.dataframe.html#hana_ml.dataframe.DataFrame.save" target="_self" rel="noopener noreferrer">save method</A> and therefore make the dataset available for further processing.</P><P>&nbsp;</P><P>&nbsp;</P><pre class="lia-code-sample language-python"><code># load logged run from MLflow to HANA ML logged_model = 'runs:/d8a763b7b81940598633605e447cd880/model' mymodel = model_storage.load_mlflow_model(connection_context=conn, model_uri=logged_model) dataset_data_predict = mymodel.predict(data=test_data, key="ID") # collect the predicted dataset to see content in dataframe print(dataset_data_predict.collect())</code></pre><P>&nbsp;</P><P>&nbsp;</P><P><SPAN>After running the script, you should be able to see the following terminal output, for which we can see the download of the artefact stored in MLflow and the created prediction dataset, which consists in our case of 4 columns: ID (primary key), SCORE (predicted label), CONFIDENCE (prediction confidence for applied row) and REASON_CODE (influence of individual variables to prediction output). </SPAN></P><P><SPAN><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Terminal output MLflow HANA model.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107043i774317219AD52360/image-size/large?v=v2&amp;px=999" role="button" title="Terminal output MLflow HANA model.png" alt="Terminal output MLflow HANA model.png" /></span></SPAN></P><P>&nbsp;<SPAN>In case we have exported our model, the output of our terminal look slightly different indicating that we also download the respective model artefacts stored additionally to the yaml file. In the following you see the complete script used for applying the model to a new dataset.</SPAN></P><P>&nbsp;</P><P>&nbsp;</P><pre class="lia-code-sample language-python"><code>from hana_ml import dataframe from hana_ml.model_storage import ModelStorage from hana_ml.algorithms.pal.utility import DataSets import mlflow from constants import db_url, db_user, db_password conn = dataframe.ConnectionContext(address=db_url, port=443, user=db_user, password=db_password) # full_set, diabetes_train, diabetes_test, _ = DataSets.load_diabetes_data(conn) dataset_data, training_data, _, test_data = DataSets.load_bank_data(connection=conn, schema=schema_name, train_percentage=0.7, valid_percentage=0, test_percentage=0.3, seed=43) # set up MLFlow and model storage mlflow.set_tracking_uri(tracking_uri) model_storage = ModelStorage(connection_context=conn, schema=schema_name) # load logged run from MLflow to HANA ML logged_model = 'runs:/ed7b8d4734cb42ca90c417f932957b40/model' mymodel = model_storage.load_mlflow_model(connection_context=conn, model_uri=logged_model) dataset_data_predict = mymodel.predict(data=test_data, key="ID") # collect the predicted dataset to see content in dataframe print(dataset_data_predict.collect())</code></pre><P>&nbsp;</P><P>&nbsp;</P><H1 id="toc-hId--919645465"><SPAN>Key take aways</SPAN></H1><P><SPAN>In this blog post we have showcased an end to end example how MLflow can be integrated in your HANA ML workload by providing the possibility to share and compare multiple tracked runs in MLflow. 
If the data is already stored in HANA, this allows you to directly interact with MLflow while being able to run your Machine Learning algorithms on data stored in the HANA database without the need to transfer your data between multiple systems. This blog covered an essential part of the automated logging capabilities of HANA ML models into MLflow.&nbsp;</SPAN></P><P>We highly appreciate your thoughts, comments and questions under this blog post. In case you want to reach out for general questions around HANA, or specifically HANA ML, don't hesitate to use the <A href="https://community.sap.com/t5/technology-q-a/qa-p/technology-questions" target="_self">Q&amp;A tool</A>&nbsp;with the respective tags that describe your question.</P> 2024-05-08T17:00:00.007000+02:00 https://community.sap.com/t5/technology-blogs-by-sap/sap-datasphere-catalog-harvesting-from-sap-datasphere-sap-bw-bridge/ba-p/13694333 SAP Datasphere catalog - Harvesting from SAP Datasphere, SAP BW bridge 2024-05-08T17:48:25.809000+02:00 GaetanSaulnier https://community.sap.com/t5/user/viewprofilepage/user-id/255390 <P style=" text-align : justify; ">In March 2023, we enhanced SAP Datasphere with catalog capabilities to democratize data access and strengthen data-driven decision-making <SPAN>(</SPAN><A href="https://community.sap.com/t5/technology-blogs-by-sap/unlock-the-full-potential-of-your-enterprise-data-with-sap-datasphere/ba-p/13562220" target="_blank"><SPAN>launch blog</SPAN></A><SPAN>)</SPAN>. This feature facilitates the curation and organization of metadata harvested from various SAP data sources, assists in discovering curated data assets, and enables search and classification of these assets by users and data stewards. The catalog further enriches assets with business metadata, aiding organizations in their data governance journey. It connects to sources such as SAP Datasphere and SAP Analytics Cloud from its inception, retrieving metadata from a variety of assets within these systems, including but not limited to stories, models, insights, predictive scenarios, local tables, remote tables, views, and data actions.</P><P>Today we are happy to announce that we recently delivered a new <A href="https://roadmaps.sap.com/board?PRODUCT=73555000100800002141&amp;range=CURRENT-LAST#;INNO=B73262CB7CD41EDDB3E6586A0A692931" target="_blank" rel="noopener noreferrer">enhancement</A> that extends the coverage of our metadata harvesting with the support to connect and retrieve technical metadata from SAP Datasphere, SAP BW bridge.</P><P style=" text-align : justify; ">SAP Datasphere, SAP BW bridge provides a path that eases the transition from existing SAP BW systems to an innovative infrastructure supported by SAP Datasphere. 
This helps organizations to reuse existing investments in SAP BW by allowing you to transfer and enable the rich feature set of extractors and ABAP code for access to legacy SAP on-premises systems, and to seamlessly transfer existing ETL processes within a dedicated space.</P><H3 id="toc-hId-1123513923">Feature Description</H3><P>You can add an SAP Datasphere, SAP BW bridge system to the catalog <SPAN>as a source to harvest metadata of the following objects:</SPAN></P><UL><LI>Advanced Data Store Objects</LI><LI>Info Areas</LI><LI>Composite Providers</LI><LI>Info Objects</LI><LI>Source Systems</LI><LI>Data Sources</LI><LI>Info Sources</LI><LI>Transformations</LI></UL><P>Once extracted, these new objects can be enriched with business metadata and published to the catalog like any other assets.</P><H3 id="toc-hId-927000418">Create a Connection to an SAP Datasphere, SAP BW bridge system</H3><P>In this section, we will briefly show how to connect to an SAP Datasphere, SAP BW bridge system to harvest the metadata for the catalog.</P><P>Connect to your SAP Datasphere tenant and access the Monitoring page under the Catalog section in the left menu.</P><P>On the monitoring page, click “+” to create a new entry for a system.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="gaetan_saulnier_0-1715118020013.jpeg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107067i11B890D2D4950F2A/image-size/large?v=v2&amp;px=999" role="button" title="gaetan_saulnier_0-1715118020013.jpeg" alt="gaetan_saulnier_0-1715118020013.jpeg" /></span></P><P>Select SAP Datasphere, SAP BW bridge.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="gaetan_saulnier_1-1715118020019.jpeg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107066i0981B647EFBE3736/image-size/large?v=v2&amp;px=999" role="button" title="gaetan_saulnier_1-1715118020019.jpeg" alt="gaetan_saulnier_1-1715118020019.jpeg" /></span></P><P>Click Create.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="gaetan_saulnier_2-1715118020023.jpeg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107065iFF12FF5418B01CCA/image-size/large?v=v2&amp;px=999" role="button" title="gaetan_saulnier_2-1715118020023.jpeg" alt="gaetan_saulnier_2-1715118020023.jpeg" /></span></P><P>You can now configure which Info Areas you want to select and then save the configuration for this connected system.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="gaetan_saulnier_3-1715118020027.jpeg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107068iE95BC44E634A64B6/image-size/large?v=v2&amp;px=999" role="button" title="gaetan_saulnier_3-1715118020027.jpeg" alt="gaetan_saulnier_3-1715118020027.jpeg" /></span></P><P>Now you can run a sync to request the harvesting of the metadata related to the selected Info Areas that define the connection to the SAP Datasphere, SAP BW bridge system.</P><H3 id="toc-hId-730486913">Interact with the catalog</H3><P>In this section, we will briefly show how to search and interact with assets in the catalog that are from the SAP Datasphere, SAP BW bridge connected system.</P><P>As usual, navigate to the Catalog and search for any content, but now you can also leverage the Info Areas as well as the Source Systems associated to the SAP Datasphere, SAP BW bridge connected 
system as a structure to narrow down your search results.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="gaetan_saulnier_4-1715118020034.jpeg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107069i85975B220D4340D1/image-size/large?v=v2&amp;px=999" role="button" title="gaetan_saulnier_4-1715118020034.jpeg" alt="gaetan_saulnier_4-1715118020034.jpeg" /></span></P><P>The support of metadata harvesting also introduces new asset types which can be used to filter and narrow down your discovery of the available content.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="gaetan_saulnier_5-1715118020041.jpeg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107070i21574C6E96BD34A6/image-size/large?v=v2&amp;px=999" role="button" title="gaetan_saulnier_5-1715118020041.jpeg" alt="gaetan_saulnier_5-1715118020041.jpeg" /></span></P><P>Once you find the asset you were looking for, you can review its properties and the column details. Column details, for example for attributes, also allow you to easily navigate to other referenced info objects in the catalog to further discover the data.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="gaetan_saulnier_6-1715118020048.jpeg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107071i376FB00F7459EFC0/image-size/large?v=v2&amp;px=999" role="button" title="gaetan_saulnier_6-1715118020048.jpeg" alt="gaetan_saulnier_6-1715118020048.jpeg" /></span></P><P>Like any other asset in the catalog, you can also look at the lineage and impact analysis graph to see the other assets associated to it.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="gaetan_saulnier_7-1715118020055.jpeg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107072i3F709B20F12EBA8A/image-size/large?v=v2&amp;px=999" role="button" title="gaetan_saulnier_7-1715118020055.jpeg" alt="gaetan_saulnier_7-1715118020055.jpeg" /></span></P><P>The lineage and impact analysis graph allows you to navigate to the different assets to further discover the available data and their relationships.</P><P>Any of the existing features to enrich assets with business metadata such as descriptions, business glossary terms, KPIs, and Classification Tags can be used to further enhance the SAP Datasphere, SAP BW bridge harvested metadata, as described in this <A href="https://community.sap.com/t5/technology-blogs-by-sap/maximizing-communication-efficiency-with-sap-datasphere-catalog-s-business/ba-p/13632678" target="_blank">blog</A>.</P><H3 id="toc-hId-533973408">What’s Next?</H3><P>Soon, for the SAP Datasphere, SAP BW bridge metadata harvesting, we will enable scheduling of the metadata harvesting to remove the need to manually sync that process. 
Additionally, we will add column groupings for the column details associated to the assets.</P><P>We will continue to extend the coverage of the SAP metadata we can harvest to continuously support organizations on their data governance journey.</P><P>I also want to share some additional resources for you around SAP Datasphere:&nbsp;&nbsp;</P><UL><LI>Access to your free 30 day&nbsp;<A href="https://www.sap.com/products/technology-platform/datasphere/guided-experience.html" target="_blank" rel="noopener noreferrer">trial</A>&nbsp;of SAP Datasphere&nbsp;</LI><LI>Join&nbsp;<A href="https://pages.community.sap.com/topics/datasphere" target="_blank" rel="noopener noreferrer">SAP Datasphere community</A>&nbsp;to see what’s new, ask questions, share your feedback, and more.&nbsp;</LI><LI>SAP Datasphere product&nbsp;<A href="https://www.sap.com/products/technology-platform/datasphere/features.html" target="_blank" rel="noopener noreferrer">overview</A>&nbsp;&nbsp;</LI><LI><A href="https://help.sap.com/docs/SAP_DATASPHERE/aca3ccb4b2f84eb8b6154e8fd2812c0e/de29b96a9438439682715a93212ae4f4.html" target="_blank" rel="noopener noreferrer">Documentation</A>&nbsp;for SAP Datasphere catalog</LI></UL> 2024-05-08T17:48:25.809000+02:00 https://community.sap.com/t5/technology-blogs-by-sap/tracking-hana-machine-learning-experiments-with-mlflow-a-conceptual-guide/ba-p/13688478 Tracking HANA Machine Learning experiments with MLflow: A conceptual guide for MLOps 2024-05-08T19:17:01.849000+02:00 stojanm https://community.sap.com/t5/user/viewprofilepage/user-id/39047 <H2 id="toc-hId-993627937">Introduction</H2><P>MLflow is an open-source platform, which is the de facto standard, when it comes to managing and streamlining machine learning lifecycles, including experimentation, reproducibility, and deployment. It offers a centralized repository to track experiments, share projects, and collaborate effectively, making it a common choice among data scientists and can be used with most open-source machine learning frameworks (e.g. scikit-learn, Tensorflow, etc.).</P><P>Some providers offer MLflow as a managed service (e.g. Databricks) or integrate it (e.g. Azure ML). In addition, it is possible for the user to deploy the service manually on their platform of choice (e.g. on SAP Business Technology Platform).</P><P>Starting with version 2.13 HANA Machine Learning added support for tracking of experiments with the mlflow package. This makes models, which were developed using hanaml, easily incorporated into an extensive MLOps pipeline.</P><P><SPAN>This blog post is part of a series describing the usage of MLflow with HANA Machine Learning co-authored by <a href="https://community.sap.com/t5/user/viewprofilepage/user-id/43098">@martinboeckling</a>&nbsp;and <a href="https://community.sap.com/t5/user/viewprofilepage/user-id/39047">@stojanm</a>. In the first part</SPAN><SPAN> we present an conceptual guide on how to use MLflow with SAP Datasphere and HANA Machine Learning (through the hanaml package). The objective is to provide to the reader a high level template for machine learning operations (MLOps) for HANA ML specifically with MLflow. 
In the second part of the series, called <A href="https://community.sap.com/t5/technology-blogs-by-sap/tracking-hana-machine-learning-experiments-with-mlflow-a-technical-deep/ba-p/13692481" target="_blank">Tracking HANA Machine Learning experiments with MLflow: A technical Deep Dive</A>, we provide a more technical deep dive on how to setup an MLflow instance and a general introduction on how Machine Learning models trained with HANA ML can be logged with MLflow.</SPAN></P><P><SPAN>It is important to mention that SAP offers an extensive MLOps platform for managing ML experiments, AI Core / AI Launchpad, which is out of the scope of this post. For more information on AI Core please refer to the <A href="https://community.sap.com/t5/technology-blogs-by-sap/ai-foundation-sap-s-all-in-one-ai-toolkit-for-developers/ba-p/13581014" target="_blank">blog post here</A>. </SPAN></P><P>Ok, let's start reviewing our example. We will work our way along a simplified Machine Learning pipeline as shown below and will comment on the architectural patterns for each task.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="PipelineV3.png" style="width: 707px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/105718i25B379A47143C54A/image-dimensions/707x67?v=v2" width="707" height="67" role="button" title="PipelineV3.png" alt="PipelineV3.png" /></span></P><P>To simplify the use case, we will assume that the<SPAN>&nbsp;gravity of the data required for our model lies within SAP. This means that majority of the data used for model training is in an SAP application either on-premise or in the cloud. </SPAN></P><H3 id="toc-hId-926197151"><SPAN>Data Modeling</SPAN></H3><P><SPAN>Typically&nbsp;data landscapes in enterprises are quite complex and data is distributed across numerous systems. So, even though the majority of the data for our example comes from an SAP source, it is realistic to assume that for the modeling a portion of that data could come from another system. It is the task of a Data Engineer to connect to the data and to prepare it for the algorithm training (e.g. feature engineering). As shown in the picture below SAP Datasphere can help unify data sources in a central repository either via federation or, where not supported, via replication.&nbsp;</SPAN></P><P><SPAN><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="DataModeling.jpg" style="width: 614px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107828i0DD1F43F1EC03698/image-dimensions/614x268?v=v2" width="614" height="268" role="button" title="DataModeling.jpg" alt="DataModeling.jpg" /></span></SPAN></P><P><SPAN>In addition to the data modeling features, SAP Datasphere also</SPAN>&nbsp;offers<SPAN>&nbsp;a runtime for Machine Learning tasks thanks to the embedded HANA Cloud instance. This runtime can be utilized by Data Scientists and since it is embedded it allows to perform ML without the need for data movement and replication. This brings several benefits related to security, execution speed, business context preservation and compliance aspects. 
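As a small, hedged illustration, reaching this embedded runtime from a Python environment is essentially just a hana-ml connection to the Datasphere Open SQL schema; the host, credentials and table name below are placeholders.</SPAN></P><P>&nbsp;</P><P>&nbsp;</P><pre class="lia-code-sample language-python"><code># Sketch under assumptions: address, user, password and table name are placeholders
# for an SAP Datasphere Open SQL schema (basic authentication).
from hana_ml import dataframe

conn = dataframe.ConnectionContext(
    address="<your-datasphere-host>",
    port=443,
    user="<open-sql-schema-user>",
    password="<open-sql-schema-password>",
    encrypt=True
)

# Tables exposed in the Open SQL schema are referenced as DataFrame proxies,
# so the data itself stays in the embedded HANA Cloud instance.
sales_df = conn.table("SALES_HISTORY")
print(sales_df.count())</code></pre><P>&nbsp;</P><P>&nbsp;</P><P><SPAN>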
For more information about those benefits&nbsp;check out&nbsp;</SPAN><A href="https://community.sap.com/t5/technology-blogs-by-sap/sap-datasphere-seamless-extraction-of-business-insights-in-multi-cloud/ba-p/13563518" target="_blank">this blog post.</A></P><P><SPAN>Ok, let's move to the data science tasks and model training.</SPAN></P><H3 id="toc-hId-729683646">Model Training</H3><P>During this phase Data Scientists experiment and iteratively develop the ML model. Most Data Science experts have their preferred platform for ML prototyping. The HANA Machine Learning Python package, called hana-ml, can be used with any Python IDE available. The development environment can be either deployed manually by the Data Scientist or hosted centrally on a dedicated platform. The following blog posts show examples how HANA Machine Learning code can be developed using different platforms: <A href="https://community.sap.com/t5/technology-blogs-by-sap/azure-machine-learning-triggering-calculations-ml-in-sap-data-warehouse/ba-p/13523316" target="_blank">Azure ML</A> and <A href="https://community.sap.com/t5/technology-blogs-by-sap/databricks-triggering-calculations-ml-in-sap-data-warehouse-cloud/ba-p/13540177" target="_blank">Databricks</A>.&nbsp;</P><P><SPAN><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="ModelTraining.jpg" style="width: 624px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107829i4F918A3F5D5FF3B1/image-dimensions/624x254?v=v2" width="624" height="254" role="button" title="ModelTraining.jpg" alt="ModelTraining.jpg" /></span></SPAN></P><P>&nbsp;<SPAN>Already during training and experimentation, MLflow plays an important role. It helps evaluate the progress and log the details of each experiment run for later reference. Several algorithms from the hana-ml package (e.g. Automated* or Unified* methods) support the automatic logging of model key performance indicators during training.&nbsp;This is seamless for the user and uses the same interface as open source frameworks. It allows to track hyper parameters, model performance KPIs and also log training activities with usernames and timestamps for auditibility. For more technical details about these features please refer to the </SPAN><A href="https://community.sap.com/t5/technology-blogs-by-sap/tracking-hana-machine-learning-experiments-with-mlflow-a-technical-deep/ba-p/13692481" target="_self">second part</A><SPAN> of our blog post.</SPAN></P><H3 id="toc-hId-533170141">Model Deployment</H3><P>Once a suitable model has been selected and trained, it needs to be deployed. For our example with HANA Machine Learning, we will do this in two steps. In the first step, hana-ml is used to store the artifacts into the built-in model repository of SAP HANA Cloud. In the second step the model is exposed via an API to be consumed by other applications. This can be achieved in several ways, but a lean approach is to use a deployed Flask application (e.g. on SAP Business Technology Platform). 
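A very rough sketch of such a wrapper is shown below; it is our own illustration rather than the implementation from the referenced post, and the endpoint, model name, temporary table and constants module are assumptions.</P><P>&nbsp;</P><P>&nbsp;</P><pre class="lia-code-sample language-python"><code># Hedged sketch of a Flask prediction endpoint around a model kept in the
# HANA model repository. Names, routes and the constants module are assumptions.
import pandas as pd
from flask import Flask, request, jsonify
from hana_ml import dataframe
from hana_ml.dataframe import create_dataframe_from_pandas
from hana_ml.model_storage import ModelStorage

from constants import db_url, db_user, db_password, schema_name

app = Flask(__name__)

conn = dataframe.ConnectionContext(address=db_url, port=443,
                                   user=db_user, password=db_password)
model_storage = ModelStorage(connection_context=conn, schema=schema_name)
model = model_storage.load_model(name="bank_marketing_uc")  # assumed model name


@app.route("/predict", methods=["POST"])
def predict():
    # Push the request payload into a temporary HANA table so scoring runs in-database.
    payload = pd.DataFrame(request.get_json())
    hana_df = create_dataframe_from_pandas(conn, payload, "PREDICT_INPUT", force=True)
    result = model.predict(data=hana_df, key="ID").collect()
    return jsonify(result.to_dict(orient="records"))


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)</code></pre><P>&nbsp;</P><P>&nbsp;</P><P>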
To see the details on this process please refer to <A href="https://community.sap.com/t5/technology-blogs-by-sap/scheduling-python-code-on-cloud-foundry/ba-p/13503697" target="_blank">this blog post</A>.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="ModelDeployment.jpg" style="width: 622px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107830iFD1AC1F838D1E95C/image-dimensions/622x276?v=v2" width="622" height="276" role="button" title="ModelDeployment.jpg" alt="ModelDeployment.jpg" /></span></P><H3>Model Performance Tracking</H3><P>In addition to tracking experiments while training the model, it is also important to track the model performance after deployment, e.g. in order to monitor prediction quality and detect effects like data drift. Some information about those concepts can be found in <A href="https://community.sap.com/t5/technology-blogs-by-sap/sap-data-intelligence-as-an-mlops-platform/ba-p/13441314" target="_blank">this blog post</A>.</P><P>In our example we achieve the monitoring of the model during operations as follows: since our model is deployed and exposed via a Flask application, as proposed above, we use the mlflow package in the application code to log incoming data as well as predictions. This allows us to run validation tests (compare actual vs. predicted) once validation data becomes available.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="ModelPerformanceTracking.jpg" style="width: 629px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107831iD540D3392AA56977/image-dimensions/629x256?v=v2" width="629" height="256" role="button" title="ModelPerformanceTracking.jpg" alt="ModelPerformanceTracking.jpg" /></span></P><P>&nbsp;<SPAN>Let's now review how to detect deterioration in model performance and how to perform retraining.&nbsp;</SPAN></P><H3 id="toc-hId-336656636">Model Re-Training</H3><P>Model retraining could be either scheduled or triggered based on a condition or an event. There are several ways in which this can be achieved, including automation flows such as SAP Build Process Automation, Airflow (<A href="https://airflow.apache.org/" target="_blank" rel="noopener nofollow noreferrer">https://airflow.apache.org/</A>) or Kubeflow (<A href="https://www.kubeflow.org/" target="_blank" rel="noopener nofollow noreferrer">https://www.kubeflow.org/</A>), or by simple helper applications deployed by the user.</P><P>For the sake of simplicity, in our case we use a simple application deployed on SAP Business Technology Platform (e.g. on Cloud Foundry), which can schedule model retraining runs (e.g. if new training data becomes available regularly). This same application can periodically run a check on the model performance via the APIs from mlflow as described in the previous section. If there is any performance deterioration (e.g. a high deviation of predicted vs. actual values), a new retraining run can be performed; a small sketch of such a check is shown below. As discussed already, during the new training run the model parameters will be logged via hana-ml and mlflow. 
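The sketch below illustrates such a periodic check; the MLflow URL, experiment name, metric key and threshold are our own assumptions and depend on what the training runs actually log.</P><P>&nbsp;</P><P>&nbsp;</P><pre class="lia-code-sample language-python"><code># Hedged sketch: fetch the latest tracked run and compare one logged metric
# against a quality threshold. Metric key and threshold are assumptions.
import mlflow

mlflow.set_tracking_uri("https://<your-mlflow-instance>")

runs = mlflow.search_runs(experiment_names=["HANA ML Experiment"],
                          order_by=["attributes.start_time DESC"],
                          max_results=1)

if not runs.empty and runs.iloc[0].get("metrics.ACCURACY", 1.0) < 0.8:
    # At this point the helper application would start a new hana-ml training run.
    print("Performance below threshold -- schedule a retraining run.")</code></pre><P>&nbsp;</P><P>&nbsp;</P><P>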
In addition, the model will be updated in the model repository in SAP HANA Cloud.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="ModelRetraining.jpg" style="width: 630px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107832i01B06E252BF6ED67/image-dimensions/630x314?v=v2" width="630" height="314" role="button" title="ModelRetraining.jpg" alt="ModelRetraining.jpg" /></span></P><P>&nbsp;<SPAN>This closes the cycle for the example ML pipeline in the first section. Let's put all pieces together to see the big picture.</SPAN></P><H3 id="toc-hId-140143131">Key takeaways</H3><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Overview.jpg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107833iE812F4DCDFFACB49/image-size/large?v=v2&amp;px=999" role="button" title="Overview.jpg" alt="Overview.jpg" /></span></P><P>&nbsp;<SPAN>In this blog post we showcased an example conceptual and architectural blueprint on how to realize MLOps pipelines using SAP HANA Machine Learning and the open-source framework MLflow. We discussed the end-to-end process and the advantages of integrating these tools to streamline the machine learning lifecycle, especially in the part of model lifecycle management. To see the technical details and example code to achieve the described steps please refer to the second part of the blog series </SPAN><A href="https://community.sap.com/t5/technology-blogs-by-sap/tracking-hana-machine-learning-experiments-with-mlflow-a-technical-deep/ba-p/13692481" target="_blank">here</A><SPAN>. Happy reading!</SPAN></P> 2024-05-08T19:17:01.849000+02:00 https://community.sap.com/t5/technology-blogs-by-members/what-s-new-in-sap-datasphere-version-2024-10-may-7-2024/ba-p/13696439 What’s New in SAP Datasphere Version 2024.10 — May 7, 2024 2024-05-09T15:55:33.407000+02:00 TuncayKaraca https://community.sap.com/t5/user/viewprofilepage/user-id/137163 <P>SAP has released version 2024.10 of SAP Datasphere. Enhancements in administration, data integration, data marketplace and data modeling.</P><P class="">There are 3 data integration, 2 data modeling, 1 administration, and 1 data marketplace enhancements in version 2024.10 of SAP Datasphere.</P><P class="">&nbsp;</P><H2 id="toc-hId-994491753">Administration</H2><DIV class=""><DIV class=""><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="TuncayKaraca_0-1715262414325.jpeg" style="width: 748px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108045i3ED3573686D81DA4/image-dimensions/748x561?v=v2" width="748" height="561" role="button" title="TuncayKaraca_0-1715262414325.jpeg" alt="TuncayKaraca_0-1715262414325.jpeg" /></span><FONT size="2"><SPAN>Photo by</SPAN><SPAN>&nbsp;</SPAN><A class="" href="https://unsplash.com/@impatrickt?utm_source=medium&amp;utm_medium=referral" target="_blank" rel="noopener ugc nofollow noreferrer">Patrick Tomasso</A><SPAN>&nbsp;</SPAN><SPAN>on</SPAN><SPAN>&nbsp;</SPAN><A class="" href="https://unsplash.com/?utm_source=medium&amp;utm_medium=referral" target="_blank" rel="noopener ugc nofollow noreferrer">Unsplash</A></FONT></DIV></DIV><P class=""><SPAN class=""><FONT size="6">1</FONT>&nbsp;</SPAN><FONT color="#808080"><EM>For resilience purposes, we are now limiting the maximum CPU resources for workload generated by spaces, user group users and agent users to 80%. 
The remaining 20% of CPU resources are reserved to ensure that the system can respond under heavy load.</EM></FONT></P><P class=""><FONT color="#808080"><EM>You can now configure the total amount of threads that each space can consume up to a maximum of 100% of this 80% of the threads available in your tenant.</EM></FONT></P><P class=""><FONT color="#808080"><EM>Also, statements are now queued when the CPU usage reaches 90% (instead of 80%) of the database capacity. This value, which you cannot change, applies to all spaces, including spaces created before this version.<SPAN>&nbsp;</SPAN></EM><A class="" href="https://help.sap.com/docs/PRODUCTS/d4185d7d9a634d06a5459c214792c67e/d66ac1efb5054068a104c4559b72d272.html?locale=en-US&amp;version=cloud" target="_blank" rel="noopener ugc noreferrer"><EM>Set a Priority and Statement Limits for a Space</EM></A></FONT></P><P class="">It’s all about how you manage resources in<SPAN>&nbsp;</SPAN><STRONG>Space Management</STRONG><SPAN>&nbsp;</SPAN>&gt;<SPAN>&nbsp;</SPAN><STRONG>Workload Management</STRONG>. Good luck!</P><P class="">The ADMISSION CONTROL QUEUE CPU THRESHOLD parameter is set to 90% and cannot be changed.</P><P class="">The TOTAL STATEMENT MEMORY LIMIT parameter is set to 80% by default. You can change it by entering the maximum number (or percentage) of GBs of memory that concurrently executing statements can consume in the space.</P><P class="">&nbsp;</P><H2 id="toc-hId-797978248">Data Integration</H2><DIV class=""><DIV class=""><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="TuncayKaraca_1-1715262414227.jpeg" style="width: 764px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108046iD80968163DD29C7B/image-dimensions/764x510?v=v2" width="764" height="510" role="button" title="TuncayKaraca_1-1715262414227.jpeg" alt="TuncayKaraca_1-1715262414227.jpeg" /></span><FONT size="2"><SPAN>Photo by</SPAN><SPAN>&nbsp;</SPAN><A class="" href="https://unsplash.com/@manu_mnvx?utm_source=medium&amp;utm_medium=referral" target="_blank" rel="noopener ugc nofollow noreferrer">Manuel Mnvx</A><SPAN>&nbsp;</SPAN><SPAN>on</SPAN><SPAN>&nbsp;</SPAN><A class="" href="https://unsplash.com/?utm_source=medium&amp;utm_medium=referral" target="_blank" rel="noopener ugc nofollow noreferrer">Unsplash</A></FONT></DIV></DIV><P class=""><SPAN class=""><FONT size="6">1</FONT>&nbsp;</SPAN><FONT color="#808080"><EM>For transformation flows, if the source table is not a delta capture table, you can now switch delta capture on for the target table. The system adds the delta capture columns<SPAN>&nbsp;</SPAN></EM><STRONG><EM>Change Date</EM></STRONG><EM><SPAN>&nbsp;</SPAN>and<SPAN>&nbsp;</SPAN></EM><STRONG><EM>Change Type</EM></STRONG><EM><SPAN>&nbsp;</SPAN>to the target table.<SPAN>&nbsp;</SPAN></EM><A class="" href="https://help.sap.com/docs/PRODUCTS/d4185d7d9a634d06a5459c214792c67e/0950746ab4444e5ca6a665ee1b0380a1.html?locale=en-US&amp;version=cloud" target="_blank" rel="noopener ugc noreferrer"><EM>Add or Create a Target Table</EM></A></FONT></P><P class="">It’s a handy addition to the transformation flows. Yes, if the source table is not a delta, then the default value of the Delta Capture<STRONG><SPAN>&nbsp;</SPAN></STRONG>property for the target table is Off. 
Now you can enable delta capture and add delta capture columns<SPAN>&nbsp;</SPAN><STRONG>Change Date</STRONG><SPAN>&nbsp;</SPAN>and<SPAN>&nbsp;</SPAN><STRONG>Change Type</STRONG><SPAN>&nbsp;</SPAN>to the target table.</P><P class=""><SPAN class=""><FONT size="6">2</FONT>&nbsp;</SPAN><FONT color="#808080"><EM>If email notification is set up for a task chain, you’ll now receive email notifications if an error occurs during initialization or preparation to run the task chain, before the task chain run actually starts.<SPAN>&nbsp;</SPAN></EM><A class="" href="https://help.sap.com/docs/PRODUCTS/d4185d7d9a634d06a5459c214792c67e/d1afbc2b9ee84d44a00b0b777ac243e1.html?locale=en-US&amp;version=cloud" target="_blank" rel="noopener ugc noreferrer"><EM>Creating a Task Chain</EM></A></FONT></P><P class="">You can now set up email notifications when task chains are completed. These options are available:</P><UL class=""><LI>Send email notification only if the run failed.</LI><LI>Send email notification only when the run completes successfully.</LI><LI>Send email notification when the run completes.</LI></UL><DIV class=""><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="TuncayKaraca_2-1715262414018.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108047iFE5B29B9759EF611/image-size/medium?v=v2&amp;px=400" role="button" title="TuncayKaraca_2-1715262414018.png" alt="TuncayKaraca_2-1715262414018.png" /></span><BR /><FONT size="2">SAP Datasphere — Task Chain Email Notification</FONT></DIV><P class=""><SPAN class=""><FONT size="6">3</FONT>&nbsp;</SPAN><FONT color="#808080"><EM>Make your SAP Analytics Cloud stories available to the world by translating metadata from SAP Datasphere’s Translation tool. With the Translation privilege, you can access the dashboard and translate metadata such as business names and column names for dimensions and analytic models, and hierarchy dimension labels for stories to a wide range of languages. You can translate manually or via an XLIFF file, but also manage already existing translations, and update them.<SPAN>&nbsp;</SPAN></EM><A class="" href="https://help.sap.com/docs/PRODUCTS/d4185d7d9a634d06a5459c214792c67e/fe829debe389450394cf7a15860e2caa.html?locale=en-US&amp;version=cloud" target="_blank" rel="noopener ugc noreferrer"><EM>Translating Metadata for SAP Analytics Cloud</EM></A></FONT></P><P class="">It’s another handy feature for multilingual environments when using SAP Datasphere with SAP Analytics Cloud stories and reports. 
Use the translation tool in Datasphere and translate object metadata, and your SAP Analytics Cloud story can be viewed in the language of your choice.</P><P class="">&nbsp;</P><H2 id="toc-hId-601464743">Data Marketplace</H2><DIV class=""><DIV class=""><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="TuncayKaraca_3-1715262414291.jpeg" style="width: 760px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108048iA8EA82D737BEDAE8/image-dimensions/760x570?v=v2" width="760" height="570" role="button" title="TuncayKaraca_3-1715262414291.jpeg" alt="TuncayKaraca_3-1715262414291.jpeg" /></span><FONT size="2"><SPAN>Photo by</SPAN><SPAN>&nbsp;</SPAN><A class="" href="https://unsplash.com/@ragonesco?utm_source=medium&amp;utm_medium=referral" target="_blank" rel="noopener ugc nofollow noreferrer">Raul Gonzalez Escobar</A><SPAN>&nbsp;</SPAN><SPAN>on</SPAN><SPAN>&nbsp;</SPAN><A class="" href="https://unsplash.com/?utm_source=medium&amp;utm_medium=referral" target="_blank" rel="noopener ugc nofollow noreferrer">Unsplash</A></FONT></DIV></DIV><P class=""><SPAN class=""><FONT size="6">1</FONT>&nbsp;</SPAN><EM>Y<FONT color="#808080">ou can now duplicate existing data products.<SPAN>&nbsp;</SPAN></FONT></EM><FONT color="#808080"><A class="" href="https://help.sap.com/docs/PRODUCTS/d4185d7d9a634d06a5459c214792c67e/14232081d7444195b1f66c16e56f6d09.html?locale=en-US&amp;version=cloud" target="_blank" rel="noopener ugc noreferrer"><EM>Duplicating a Data Product</EM></A></FONT></P><P class="">It’s simple. My Data Products &gt; Duplicate Product and make necessary changes and Save.</P><P class="">&nbsp;</P><H2 id="toc-hId-404951238">Data Modeling</H2><DIV class=""><DIV class=""><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="TuncayKaraca_4-1715262414064.jpeg" style="width: 757px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108049i7133095D95177A39/image-dimensions/757x505?v=v2" width="757" height="505" role="button" title="TuncayKaraca_4-1715262414064.jpeg" alt="TuncayKaraca_4-1715262414064.jpeg" /></span><FONT size="2"><SPAN>Photo by</SPAN><SPAN>&nbsp;</SPAN><A class="" href="https://unsplash.com/@josholalde?utm_source=medium&amp;utm_medium=referral" target="_blank" rel="noopener ugc nofollow noreferrer">Josh Olalde</A><SPAN>&nbsp;</SPAN><SPAN>on</SPAN><SPAN>&nbsp;</SPAN><A class="" href="https://unsplash.com/?utm_source=medium&amp;utm_medium=referral" target="_blank" rel="noopener ugc nofollow noreferrer">Unsplash</A></FONT></DIV></DIV><P class=""><SPAN class=""><FONT size="6">1</FONT>&nbsp;</SPAN><FONT color="#808080"><EM>We have improved the user experience:</EM></FONT></P><UL class=""><LI><FONT color="#808080"><EM>You can now copy existing measures to create a new measure.</EM></FONT></LI><LI><FONT color="#808080"><EM>For measures, attributes and associated dimensions there is now a<SPAN>&nbsp;</SPAN></EM><STRONG><EM>Select All/Unselect All</EM></STRONG><EM><SPAN>&nbsp;</SPAN>checkbox.</EM></FONT></LI></UL><P class="">Thanks again for handy features!</P><P class=""><SPAN class=""><FONT size="6">2</FONT>&nbsp;</SPAN><FONT color="#808080"><EM>The Data Preview shows only the first 1,000 rows to increase performance. 
You can use filters to display the data relevant to you.</EM></FONT></P><P class="">Okay, that works.</P><P class="">&nbsp;</P><H1 id="toc-hId-79355014">References</H1><OL class=""><LI><A class="" href="https://help.sap.com/whats-new/48017b2cc4834fc6b6cae87097bd9e4d?locale=en-US&amp;Version=2024.10" target="_blank" rel="noopener ugc noreferrer"><EM>What’s New in SAP Datasphere</EM></A><EM><SPAN>&nbsp;</SPAN>May 7, 2024. Version<SPAN>&nbsp;</SPAN></EM><STRONG><EM>2024.10</EM></STRONG></LI><LI><EM><A href="https://medium.com/@tncykarc/whats-new-in-sap-datasphere-version-2024-10-may-7-2024-98d9a589e63d" target="_self" rel="nofollow noreferrer noopener">What’s New in SAP Datasphere Version 2024.10— May 7, 2024</A><SPAN>&nbsp;</SPAN>at&nbsp;<A href="https://medium.com/@tncykarc/" target="_self" rel="nofollow noopener noreferrer">medium.com/@tncykarc</A></EM></LI></OL> 2024-05-09T15:55:33.407000+02:00 https://community.sap.com/t5/technology-blogs-by-members/datasphere-delta-extraction/ba-p/13699391 Datasphere – Delta Extraction 2024-05-13T13:54:30.330000+02:00 Kuma https://community.sap.com/t5/user/viewprofilepage/user-id/275733 <P>Optimizing Dataloads requires delta functionality.</P><P>Here a How-To for a Generic Delta-like extractor within DSP.</P><P style=" text-align : justify; ">Prerequisite is a ‘last change date/timestamp’ or a ‘unique document counter’ in the source view, based on which the delta will be determined.</P><P style=" text-align : justify; ">&nbsp;</P><P style=" text-align : justify; "><STRONG>In DSP two components will be used: </STRONG></P><UL><LI>The Delta table, which contains the ‘Extractor’ and ‘Last Extraction’ Date. This date provides the info to the Extractor itself on what to load – the Delta. It always loads data from the Last Extraction Date till Today. The Delta Table is general for all Extractors.</LI><LI>Actual Extractor, from where we load the Delta data. 
The Extractor has several steps, which include the Extraction View, the Datasphere transparent table and the Extractor Delta updater.<SPAN> Each Extractor has its own Task Chain and its own 3 Task Chain steps.&nbsp;</SPAN></LI></UL><P style=" text-align : justify; "><STRONG>Objects Used: </STRONG></P><TABLE><TBODY><TR><TD width="162"><P><STRONG><SPAN>Type </SPAN></STRONG></P></TD><TD width="291"><P><STRONG><SPAN>Usage&nbsp; </SPAN></STRONG></P></TD></TR><TR><TD width="162"><P><SPAN>Table </SPAN></P></TD><TD width="291"><P><SPAN>Extractors and Last Extraction Dates</SPAN></P></TD></TR><TR><TD width="162"><P><SPAN>Extractor </SPAN></P></TD><TD width="291"><P><SPAN>Task Chain controls the actual Extractor </SPAN></P></TD></TR><TR><TD width="162"><P><SPAN>SDA Table</SPAN></P></TD><TD width="291"><P><SPAN>Remote Table </SPAN></P></TD></TR><TR><TD width="162"><P><SPAN>View</SPAN></P></TD><TD width="291"><P><SPAN>Data Load View used for the Delta Extraction</SPAN></P></TD></TR><TR><TD width="162"><P><SPAN>DSP Table</SPAN></P></TD><TD width="291"><P><SPAN>Local Table used to persist all data</SPAN></P></TD></TR><TR><TD width="162"><P><SPAN>Data Flow</SPAN></P></TD><TD width="291"><P><SPAN>Flow pushes the Delta data into Local table</SPAN></P></TD></TR><TR><TD width="162"><P><SPAN>Transformation Flow</SPAN></P></TD><TD width="291"><P><SPAN>Flow updates the Last Extraction Date</SPAN></P></TD></TR></TBODY></TABLE><P style=" text-align : justify; "><SPAN>&nbsp;</SPAN></P><P style=" text-align : justify; "><STRONG>How To: </STRONG></P><P style=" text-align : justify; ">Add a new entry to the Delta Table</P><P style=" text-align : justify; ">Create a Delta View with a selection using the entry from the Delta Table</P><P style=" text-align : justify; ">Create a Local Table (type FACT so it can be used in Queries and Analytical Models) with all Dimensions as Key</P><P style=" text-align : justify; ">Create a Task Chain with 3 steps:</P><OL style=" text-align : justify; "><LI>The first step is the Data Load View persisting the Delta Data (using objects from #2 and #3)</LI><LI>The second step is the Data Flow loading data from the Delta View to the Local Table</LI><LI>The third step is the Transformation Flow updating the Delta Table</LI></OL><P style=" text-align : justify; "><STRONG><SPAN>&nbsp;</SPAN></STRONG></P><P style=" text-align : justify; "><STRONG><SPAN>&nbsp;</SPAN></STRONG><STRONG><SPAN>#1 Delta Table</SPAN></STRONG></P><P style=" text-align : justify; "><SPAN>Use: Extractors and Last Extraction Dates</SPAN></P><P style=" text-align : justify; "><SPAN>Type: Local Table</SPAN></P><P style=" text-align : justify; "><SPAN>Semantic Usage: Dimension</SPAN></P><P style=" text-align : justify; "><SPAN>Key: Extractor </SPAN></P><P style=" text-align : justify; "><SPAN>Fields: Date, Extractor Description, Extractor Type, Upper and Lower Safety Interval and a Last Changed timestamp</SPAN></P><P style=" text-align : justify; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="martin_kuma_0-1715600867285.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109537i911D335A5FDDBE42/image-size/medium?v=v2&amp;px=400" role="button" title="martin_kuma_0-1715600867285.png" alt="martin_kuma_0-1715600867285.png" /></span></P><P style=" text-align : justify; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="martin_kuma_1-1715600867289.png" style="width: 400px;"><img 
src="https://community.sap.com/t5/image/serverpage/image-id/109538i5F659F86681883E2/image-size/medium?v=v2&amp;px=400" role="button" title="martin_kuma_1-1715600867289.png" alt="martin_kuma_1-1715600867289.png" /></span></P><P style=" text-align : justify; ">&nbsp;</P><P style=" text-align : justify; "><SPAN>&nbsp;</SPAN><STRONG>#2 Delta View with selection using the entry from Delta Table</STRONG></P><P style=" text-align : justify; "><SPAN>Use: Loading the Delta Data</SPAN></P><P style=" text-align : justify; "><SPAN>Type: View</SPAN></P><P style=" text-align : justify; "><SPAN>Semantic Usage: Relational Dataset</SPAN></P><P style=" text-align : justify; "><SPAN>Key: Not relevant / none as used for the Delta Data only</SPAN></P><P style=" text-align : justify; "><SPAN>View uses an SDA/Remote source</SPAN></P><P style=" text-align : justify; ">Example: SELECT&nbsp;&nbsp;&nbsp; … FROM "[SDA/<STRONG>Remote table</STRONG> with change date/timestamp]"</P><P style=" text-align : justify; ">AND "[delta_field]" &gt;= ( SELECT "DATE" FROM "[DELTA Table]" WHERE "EXTRACTOR" = '[extractor_name]' )</P><P style=" text-align : justify; ">&nbsp;</P><P style=" text-align : justify; "><SPAN>The <STRONG>Remote Table</STRONG> is used as the interface to Source System. The table has to have a Delta pointer. Either a Date, TimeStamp or Document Counter, what can be used to calculate the Delta. </SPAN></P><P style=" text-align : justify; "><SPAN>Type: SDA Table </SPAN></P><P style=" text-align : justify; "><SPAN>Semantic Usage: Relational Dataset</SPAN></P><P style=" text-align : justify; "><SPAN>Key: Not relevant / none as used for the Delta Data only</SPAN></P><P style=" text-align : justify; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="martin_kuma_2-1715600867290.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109539i03D1C7C1A03C78BB/image-size/medium?v=v2&amp;px=400" role="button" title="martin_kuma_2-1715600867290.png" alt="martin_kuma_2-1715600867290.png" /></span></P><P style=" text-align : justify; ">&nbsp;</P><P style=" text-align : justify; "><STRONG>#3 Local Table (type FACT so it can be used in Queries, Analytical Models) with all Dimensions as Key</STRONG></P><P style=" text-align : justify; ">Local table is holding all extracted data. To be used for Query-like Views or Analytical Models. Important is to set the Key to all inbound fields / dimensions including texts. 
&nbsp;</P><P style=" text-align : justify; "><SPAN>Type: Local Table</SPAN></P><P style=" text-align : justify; "><SPAN>Semantic Usage: Fact</SPAN></P><P style=" text-align : justify; "><SPAN>Key: <STRONG>All Fields/Dimensions (including Texts, …) </STRONG>except Key Figures </SPAN></P><P style=" text-align : justify; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="martin_kuma_3-1715600867293.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109542iDE15244140D774DF/image-size/medium?v=v2&amp;px=400" role="button" title="martin_kuma_3-1715600867293.png" alt="martin_kuma_3-1715600867293.png" /></span></P><P style=" text-align : justify; ">&nbsp;</P><P style=" text-align : justify; "><STRONG>#4 Task Chain: </STRONG></P><P style=" text-align : justify; ">The Extractor is controlled via a Task Chain:</P><P style=" text-align : justify; ">View providing the Delta Data from the Extractor is based on the Delta table (View is persisted)</P><P style=" text-align : justify; ">The Delta Data (Persisted data) are then moved to a local (FACT) table with UPSERT (Data Flow)</P><P style=" text-align : justify; ">Delta Date is updated after success (Transformation Flow)</P><P style=" text-align : justify; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="martin_kuma_4-1715600867295.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109541iE72CA2D1A3756FB9/image-size/medium?v=v2&amp;px=400" role="button" title="martin_kuma_4-1715600867295.png" alt="martin_kuma_4-1715600867295.png" /></span></P><P style=" text-align : justify; ">&nbsp;</P><P style=" text-align : justify; "><STRONG>Data Flow</STRONG></P><P style=" text-align : justify; "><SPAN>Pushing the Delta Data from Delta View into the Local Table holding all the data</SPAN></P><P style=" text-align : justify; "><SPAN>Type: Data Flow</SPAN></P><P style=" text-align : justify; "><SPAN>Uses: </SPAN></P><P style=" text-align : justify; ">Delta view <SPAN>persisting the Delta data</SPAN></P><P style=" text-align : justify; ">Local Table <SPAN>holding all data</SPAN></P><P style=" text-align : justify; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="martin_kuma_5-1715600867297.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109543i1E26A734ADE55197/image-size/medium?v=v2&amp;px=400" role="button" title="martin_kuma_5-1715600867297.png" alt="martin_kuma_5-1715600867297.png" /></span></P><P style=" text-align : justify; ">&nbsp;</P><P style=" text-align : justify; "><SPAN>&nbsp;</SPAN><STRONG>Transformation flow</STRONG></P><P style=" text-align : justify; "><SPAN>Updates the Last Extraction Date of the Extractor</SPAN></P><P style=" text-align : justify; "><SPAN>Type: Transformation Flow</SPAN></P><P style=" text-align : justify; "><SPAN>Uses (same object as source and as a target): Delta Table for the Delta Extraction</SPAN></P><P style=" text-align : justify; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="martin_kuma_6-1715600867300.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109544iEABDB46D6F249B0B/image-size/medium?v=v2&amp;px=400" role="button" title="martin_kuma_6-1715600867300.png" alt="martin_kuma_6-1715600867300.png" /></span></P><P style=" text-align : justify; "><SPAN>&nbsp;</SPAN></P><P style=" text-align : justify; "><span 
class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="martin_kuma_7-1715600867303.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109545iF2EF4F3FB135FA57/image-size/medium?v=v2&amp;px=400" role="button" title="martin_kuma_7-1715600867303.png" alt="martin_kuma_7-1715600867303.png" /></span></P><P style=" text-align : justify; ">&nbsp;</P><P style=" text-align : justify; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="martin_kuma_8-1715600867305.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109546i06829A96C2442C7C/image-size/medium?v=v2&amp;px=400" role="button" title="martin_kuma_8-1715600867305.png" alt="martin_kuma_8-1715600867305.png" /></span></P><P style=" text-align : justify; ">&nbsp;</P><P style=" text-align : justify; ">&nbsp;</P><P style=" text-align : justify; ">&nbsp;</P> 2024-05-13T13:54:30.330000+02:00 https://community.sap.com/t5/technology-blogs-by-sap/technovation-summit-2024-expand-your-ai-and-btp-horizons/ba-p/13697831 Technovation Summit 2024: Expand your AI and BTP Horizons 2024-05-13T15:21:27.786000+02:00 ChrisGrundy https://community.sap.com/t5/user/viewprofilepage/user-id/171629 <P>This week the SAPinsider <A href="https://reg.eventmobi.com/technovation-summit-barcelona-2024-ai-sap-btp" target="_blank" rel="noopener nofollow noreferrer">Technovation Summit</A> will commence its inaugural event, with the topics of <A href="https://www.sap.com/products/artificial-intelligence.html" target="_blank" rel="noopener noreferrer">Artificial Intelligence</A> (AI) and <A href="https://www.sap.com/products/technology-platform.html" target="_blank" rel="noopener noreferrer">SAP Business Technology Platform</A> (BTP) the stars of the show. And following on just a few short months from the last SAPinsider EMEA conference, it will be great to see if the buzz and excitement around these two topics is just as palpable in Barcelona as it was in <A href="https://community.sap.com/t5/technology-blogs-by-sap/wonderful-wonderful-sap-btp-your-conference-guide-to-sapinsider-copenhagen/ba-p/13576363" target="_blank">Copenhagen</A>.</P><P>Technovation Summit 2024 is a 1.5-day event designed for anyone interested in AI or SAP BTP and offers an agenda rich in informational sessions and learning opportunities for any technology enthusiast seeking to expand their knowledge and understanding of the development, application and best practices in these two hot topics. And in addition to hearing from a wide range of speakers at the event, there will also be ample opportunity to network and dive into deeper technology discussions with technology experts and industry peers.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-left" image-alt="280278_GettyImages-500050785_medium.jpg" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108736i77B471D5BA85A9E6/image-size/medium?v=v2&amp;px=400" role="button" title="280278_GettyImages-500050785_medium.jpg" alt="280278_GettyImages-500050785_medium.jpg" /></span>Barcelona of course is a wonderful city location for an event like this. 
After spending a long day inside, absorbing all the sessions that one can fit into a day, building your knowledge and expanding your horizons of the potential that AI and BTP has to offer your business, what better place could there be to then relax and reflect on the day by taking the time to visit the beach, enjoy a city walk, or unwind and eat Tapas in one of the many tavernas on Las Ramblas, while marveling at the diverse and unique architecture.</P><H3 id="toc-hId-1123608099">6 of the Best for Barcelona</H3><P>The trouble with attending any event, is selecting which sessions to see that will prove to be of most value to you, especially when there are so many opportunities to learn new things. My advice is simple, try and plan as much in advance, and create your personal agenda as early as possible, so that when you get to the event you can spend less time planning and more time networking and seeking answers to the burning questions that are most important to you and your business. &nbsp;</P><P>So why don’t I help a little and start you off with some suggestions for your agenda. Let’s simply assume that everyone will plan to attend the day-1 and day-2 keynotes, to be delivered by <STRONG><EM>Iver van de Zand (14 May, 9:30am)</EM></STRONG> and <STRONG><EM>Prof. Marek Kowalkiewicz</EM> <EM>(15 May, 9:30am)</EM> </STRONG>who will share their own insights and experiences of AI, and in the case of Iver BTP too, setting the scene and the tone for each day of the summit. But after the keynotes, what next? Here are my personal 6 picks for Barcelona:</P><OL><LI><EM>Generating the Future: SAP Business AI and the Generative AI Roadmap, 14 May, 10:45am to 11:30am</EM></LI><LI><EM>AI Ethics – A Primer to Guide your Organization on the Opportunities and Risks, 14 May, 11:45am to 12:30pm</EM></LI><LI><EM>How to Automate Document Streams using SAP BTP and Machine Learning, 14 May, 1:30pm to 2:30pm</EM></LI><LI><EM>Case Study: Moving From On-Premises to The Cloud with SAP Integration Suite, 14 May, 3:00pm to 4:00pm</EM></LI><LI><EM>What's on your Mind? Maximize the Potential of SAP BTP &amp; AI, 14 May, 4:15pm to 5:00pm</EM></LI><LI><EM>What's on your Mind? Unveiling the Magic of Digital Transformation, 15 May, 11:45am to 12:30pm</EM></LI></OL><P>And, if you don’t mind, I’ll take a liberty and add one more to see. This isn’t strictly a session, but it will cover many topics that are BTP and AI related, and this is delivered in the form of a <STRONG><EM>BTP: Technology News Update</EM></STRONG><EM>, </EM>which will take place at the SAP booth during the<EM> <STRONG>networking break at 10:15 on 14 May</STRONG>. </EM>Please head to the booth to hear from our presenter Carles, and then network with the SAP onsite team to ask them any questions that spring to mind. Don’t be afraid to challenge them – they’ll appreciate you spending the time to talk with them!</P><P>Of course, these are just a few suggestions, and there are many more sessions available for you to select from the agenda. 
Why not take a look at the SAPinsider Technovation Summit website and look at the event <A href="https://reg.eventmobi.com/technovation-summit-barcelona-2024-ai-sap-btp/pages/Sessions" target="_blank" rel="noopener nofollow noreferrer">sessions list</A>, or even better download the SAPinsider2024 event app to your smart device (available on the App Store and Google Play) and to see a full list of available sessions, and to create your own personal agenda.</P><H3 id="toc-hId-927094594">Continue Your Learning After the Event</H3><P>While I’m sure you’ll enjoy your experience and learning opportunities in Barcelona, I expect that you’ll also want to continue your own personal learning journey after the event too. In this case why not explore other learning opportunities from SAP. For example, find out how to build software applications, side-by-side extensions, and integrations to and from cloud applications by exploring <A href="https://learning.sap.com/learning-journeys/discover-sap-business-technology-platform?url_id=text-sapcommunity-prdteng-BTP" target="_blank" rel="noopener noreferrer">SAP’s free learning content on SAP BTP</A>. It is made for both integration designers and extension developers from all levels of expertise and will help you stay up to date with the latest <A href="https://learning.sap.com/learning-journey/discover-sap-business-technology-platform?url_id=text-sapcommunity-prdteng-BTP" target="_blank" rel="noopener noreferrer">SAP BTP</A> innovations. And check out even more role-based learning resources and opportunities to get certified in one place on <A href="https://learning.sap.com/?url_id=text-sapcommunity-prdteng" target="_blank" rel="noopener noreferrer">SAP Learning site</A>.</P><P>For those of you travelling to Barcelona this week, I do hope that you find the SAPinsider Technovation Summit to be an informative and stimulating experience. My best wishes for safe travels, and I hope that you enjoy the event!</P><P>Chris Grundy</P><P>Product Marketing, SAP BTP</P> 2024-05-13T15:21:27.786000+02:00 https://community.sap.com/t5/technology-blogs-by-sap/watch-the-sap-bw-modernization-webinar-series/ba-p/13699559 Watch the SAP BW Modernization Webinar Series 2024-05-13T15:49:24.460000+02:00 SavannahVoll https://community.sap.com/t5/user/viewprofilepage/user-id/13466 <P><SPAN><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="296603_GettyImages-1338373325_medium_jpg.jpg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109686i487761C94D205674/image-size/large?v=v2&amp;px=999" role="button" title="296603_GettyImages-1338373325_medium_jpg.jpg" alt="296603_GettyImages-1338373325_medium_jpg.jpg" /></span></SPAN></P><P><SPAN>Step into the future with our SAP Business Warehouse Modernization Webinar Series. Designed with the goal of revolutionizing the way you approach your SAP Business Warehouse (SAP BW) with SAP Datasphere, this series provides you with an unparalleled opportunity to learn, grow, and take your SAP BW environment to the next level.</SPAN><SPAN>&nbsp;</SPAN></P><P><SPAN>Each session comes packed with valuable information, transformative guidance, and real-life insights from seasoned SAP BW and SAP Datasphere experts. 
The series includes: </SPAN><SPAN>&nbsp;<BR /></SPAN><SPAN>&nbsp;<BR /></SPAN></P><UL><LI><STRONG><SPAN>Industry-Expert Guidance</SPAN></STRONG><SPAN>: Our sessions are hosted by technical experts from SAP, offering you the unique opportunity to learn from those shaping the future of business data technology.</SPAN><SPAN>&nbsp;<BR /></SPAN><SPAN>&nbsp;<BR /></SPAN></LI><LI><STRONG><SPAN>Practical Tools and Applications</SPAN></STRONG><SPAN>: We'll demonstrate the SAP Readiness Check, the SAP Expert Evaluation Tool, and SAP BW Bridge&nbsp;to ensure you have a robust toolbox.</SPAN><SPAN>&nbsp;<BR /></SPAN><SPAN>&nbsp;<BR /></SPAN></LI><LI><STRONG><SPAN>In-Depth Case Studies and Demonstrations</SPAN></STRONG><SPAN>: We’ll share how other customers have successfully modernized their SAP BW environment. Each session concludes with thorough demonstrations that you can apply in your practice.</SPAN><SPAN><BR /></SPAN></LI></UL><H2 id="toc-hId-994582149"><STRONG><SPAN><BR />Access the webinar series</SPAN></STRONG><SPAN>&nbsp;</SPAN><SPAN><BR /></SPAN></H2><P><SPAN>The overall length of the seven units is close to 7 hours. We have divided the recordings into several parts of 2 to 20 minutes for ease of use. Find the materials below: </SPAN><SPAN>&nbsp;</SPAN></P><P><STRONG><SPAN>Unit 1: Modernizing SAP Business Warehouse, Explore Your Options</SPAN></STRONG><SPAN>&nbsp;</SPAN></P><UL><LI><SPAN>Lesson 1: Exploring Transition Option to SAP Datasphere –&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_d7104w4j" target="_blank" rel="noopener nofollow noreferrer"><SPAN>part 1</SPAN></A><SPAN>&nbsp;,&nbsp;</SPAN><A href="https://www.kaltura.com/api_v3/service/attachment_attachmentasset/action/serve/attachmentAssetId/1_n4924i5w" target="_blank" rel="noopener nofollow noreferrer"><SPAN>PDF</SPAN></A><SPAN>&nbsp;|</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_1zrazwkq" target="_blank" rel="noopener nofollow noreferrer"><SPAN>part 2</SPAN></A><SPAN>&nbsp;,&nbsp;</SPAN><A href="https://www.kaltura.com/api_v3/service/attachment_attachmentasset/action/serve/attachmentAssetId/1_djdovu16" target="_blank" rel="noopener nofollow noreferrer"><SPAN>PDF</SPAN></A><SPAN>&nbsp;</SPAN></LI><LI><SPAN>Lesson 2: Embedded Analytics in SAP S/4HANA Cloud -&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_o6licnwt" target="_blank" rel="noopener nofollow noreferrer"><SPAN>video</SPAN></A><SPAN>&nbsp;,&nbsp;</SPAN><A href="https://www.kaltura.com/api_v3/service/attachment_attachmentasset/action/serve/attachmentAssetId/1_ipy9lt15" target="_blank" rel="noopener nofollow noreferrer"><SPAN>PDF</SPAN></A><SPAN>&nbsp;</SPAN></LI><LI><SPAN>Lesson 3: Modernization Customer Stories -&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_spo3pz61" target="_blank" rel="noopener nofollow noreferrer"><SPAN>video</SPAN></A><SPAN>&nbsp;,&nbsp;</SPAN><A href="https://www.kaltura.com/api_v3/service/attachment_attachmentasset/action/serve/attachmentAssetId/1_140ysy6u" target="_blank" rel="noopener nofollow noreferrer"><SPAN>PDF</SPAN></A><SPAN>&nbsp;</SPAN></LI></UL><P><STRONG><SPAN>Unit 2: Preparation for Modernization – Your Source System</SPAN></STRONG><SPAN>&nbsp;</SPAN></P><UL><LI><SPAN>Lesson 1: Preparation for Modernization -&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_24t8g70j" target="_blank" rel="noopener nofollow 
noreferrer"><SPAN>video</SPAN></A><SPAN>&nbsp;,&nbsp;</SPAN><A href="https://www.kaltura.com/api_v3/service/attachment_attachmentasset/action/serve/attachmentAssetId/1_apcm1kt9" target="_blank" rel="noopener nofollow noreferrer"><SPAN>PDF</SPAN></A><SPAN>&nbsp;</SPAN></LI><LI><SPAN>Lesson 2: Exploring the Transition from BW7.x and BW4 HANA Systems to BW Bridge Conversion -&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_f3a1qfex" target="_blank" rel="noopener nofollow noreferrer"><SPAN>video</SPAN></A><SPAN>&nbsp;,&nbsp;</SPAN><A href="https://www.kaltura.com/api_v3/service/attachment_attachmentasset/action/serve/attachmentAssetId/1_1t6mxf2v" target="_blank" rel="noopener nofollow noreferrer"><SPAN>PDF</SPAN></A><SPAN>&nbsp;</SPAN></LI><LI><SPAN>Lesson 3: System Analysis Tooling -&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_3qolkzsg" target="_blank" rel="noopener nofollow noreferrer"><SPAN>video</SPAN></A><SPAN>&nbsp;,&nbsp;</SPAN><A href="https://www.kaltura.com/api_v3/service/attachment_attachmentasset/action/serve/attachmentAssetId/1_q6vk4hpy" target="_blank" rel="noopener nofollow noreferrer"><SPAN>PDF</SPAN></A><SPAN>&nbsp;</SPAN></LI><LI><SPAN>Lesson 4: Additional Resources -&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_qwqywlxd" target="_blank" rel="noopener nofollow noreferrer"><SPAN>video</SPAN></A><SPAN>&nbsp;,&nbsp;</SPAN><A href="https://www.kaltura.com/api_v3/service/attachment_attachmentasset/action/serve/attachmentAssetId/1_2bfg4qqi" target="_blank" rel="noopener nofollow noreferrer"><SPAN>PDF</SPAN></A><SPAN>&nbsp;</SPAN></LI></UL><P><STRONG><SPAN>Unit 3: Using SAP Tooling for a Smooth Transition</SPAN></STRONG><SPAN>&nbsp;</SPAN></P><UL><LI><SPAN>Lesson 1: Exploring Tools for Evaluating SAP BW System Readiness for</SPAN><SPAN>&nbsp;</SPAN><SPAN>Transition to SAP Datasphere -&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_nkf9rygr" target="_blank" rel="noopener nofollow noreferrer"><SPAN>video</SPAN></A><SPAN>&nbsp;,&nbsp;</SPAN><A href="https://www.kaltura.com/api_v3/service/attachment_attachmentasset/action/serve/attachmentAssetId/1_k4jlensf" target="_blank" rel="noopener nofollow noreferrer"><SPAN>PDF</SPAN></A><SPAN>&nbsp;</SPAN></LI><LI><SPAN>Lesson 2: Understanding the Conversion Paths to SAP Datasphere and SAP Datasphere, BW Bridge -&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_vwm4uq3i" target="_blank" rel="noopener nofollow noreferrer"><SPAN>video</SPAN></A><SPAN>&nbsp;,&nbsp;</SPAN><A href="https://www.kaltura.com/api_v3/service/attachment_attachmentasset/action/serve/attachmentAssetId/1_obtubygh" target="_blank" rel="noopener nofollow noreferrer"><SPAN>PDF</SPAN></A><SPAN>&nbsp;</SPAN></LI><LI><SPAN>Lesson 3: Leveraging SAP Services for Successful Deployment of SAP Datasphere -&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_1d0c6r04" target="_blank" rel="noopener nofollow noreferrer"><SPAN>video</SPAN></A><SPAN>&nbsp;,&nbsp;</SPAN><A href="https://www.kaltura.com/api_v3/service/attachment_attachmentasset/action/serve/attachmentAssetId/1_6a9lyf3c" target="_blank" rel="noopener nofollow noreferrer"><SPAN>PDF</SPAN></A><SPAN>&nbsp;</SPAN></LI><LI><SPAN>Lesson 4: Expert Evaluation Report -&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_8x7usuol" target="_blank" rel="noopener nofollow 
noreferrer"><SPAN>video</SPAN></A><SPAN>&nbsp;</SPAN></LI><LI><SPAN>Lesson 5: Assessment Tools by SAP Partners -&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_al1yqvoa" target="_blank" rel="noopener nofollow noreferrer"><SPAN>video</SPAN></A><SPAN>&nbsp;,&nbsp;</SPAN><A href="https://www.kaltura.com/api_v3/service/attachment_attachmentasset/action/serve/attachmentAssetId/1_xhbo8pos" target="_blank" rel="noopener nofollow noreferrer"><SPAN>PDF</SPAN></A><SPAN>&nbsp;</SPAN></LI></UL><P><STRONG><SPAN>Unit 4: Sizing Matters for the Cloud</SPAN></STRONG><SPAN>&nbsp;</SPAN></P><UL><LI><SPAN>Lesson 1: tools and methods for sizing of SAP Datasphere -&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_xthexms0" target="_blank" rel="noopener nofollow noreferrer"><SPAN>video</SPAN></A><SPAN>&nbsp;,&nbsp;</SPAN><A href="https://www.kaltura.com/api_v3/service/attachment_attachmentasset/action/serve/attachmentAssetId/1_pbvwnol1" target="_blank" rel="noopener nofollow noreferrer"><SPAN>PDF</SPAN></A><SPAN>&nbsp;</SPAN></LI></UL><P><STRONG><SPAN>Unit 5: Preparation for Modernization: Your target system</SPAN></STRONG><SPAN>&nbsp;</SPAN></P><UL><LI><SPAN>Lesson 1: SAP Datasphere, BW Bridge as your target system -&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_v2ornfoq" target="_blank" rel="noopener nofollow noreferrer"><SPAN>video</SPAN></A><SPAN>&nbsp;,&nbsp;</SPAN><A href="https://www.kaltura.com/api_v3/service/attachment_attachmentasset/action/serve/attachmentAssetId/1_we79i9g3" target="_blank" rel="noopener nofollow noreferrer"><SPAN>PDF</SPAN></A><SPAN>&nbsp;</SPAN></LI><LI><SPAN>Lesson 2: Strategic Directions and Options for Data Warehousing Solutions -&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_zpauer3z" target="_blank" rel="noopener nofollow noreferrer"><SPAN>video</SPAN></A><SPAN>&nbsp;,&nbsp;</SPAN><A href="https://www.kaltura.com/api_v3/service/attachment_attachmentasset/action/serve/attachmentAssetId/1_u740hwsc" target="_blank" rel="noopener nofollow noreferrer"><SPAN>PDF</SPAN></A><SPAN>&nbsp;</SPAN></LI><LI><SPAN>Lesson 3: &nbsp;Exploring the Value and Features of SAP Datasphere -&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_9y7e3oql" target="_blank" rel="noopener nofollow noreferrer"><SPAN>video</SPAN></A><SPAN>&nbsp;,&nbsp;</SPAN><A href="https://www.kaltura.com/api_v3/service/attachment_attachmentasset/action/serve/attachmentAssetId/1_ikl3xhpf" target="_blank" rel="noopener nofollow noreferrer"><SPAN>PDF</SPAN></A><SPAN>&nbsp;</SPAN></LI><LI><SPAN>Lesson 4: Additional Resources -&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_w7sxe4vd" target="_blank" rel="noopener nofollow noreferrer"><SPAN>video</SPAN></A><SPAN>&nbsp;,&nbsp;</SPAN><A href="https://www.kaltura.com/api_v3/service/attachment_attachmentasset/action/serve/attachmentAssetId/1_uom1ax5f" target="_blank" rel="noopener nofollow noreferrer"><SPAN>PDF</SPAN></A><SPAN>&nbsp;</SPAN></LI></UL><P><STRONG><SPAN>Unit 6: A Deep Dive on Transition Approaches</SPAN></STRONG><SPAN>&nbsp;</SPAN></P><UL><LI><SPAN>Lesson 1: Remote conversion and shell conversion options -&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_15kwvztb" target="_blank" rel="noopener nofollow noreferrer"><SPAN>video</SPAN></A><SPAN>&nbsp;,&nbsp;</SPAN><A 
href="https://www.kaltura.com/api_v3/service/attachment_attachmentasset/action/serve/attachmentAssetId/1_gxawzpvr" target="_blank" rel="noopener nofollow noreferrer"><SPAN>PDF</SPAN></A><SPAN>&nbsp;</SPAN></LI><LI><SPAN>Lesson 2: Metadata and Business Data Transfers in SAP Datasphere BW Bridge -&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_ocgveqdo" target="_blank" rel="noopener nofollow noreferrer"><SPAN>video</SPAN></A><SPAN>&nbsp;,&nbsp;</SPAN><A href="https://www.kaltura.com/api_v3/service/attachment_attachmentasset/action/serve/attachmentAssetId/1_w04wo38a" target="_blank" rel="noopener nofollow noreferrer"><SPAN>PDF</SPAN></A><SPAN>&nbsp;</SPAN></LI><LI><SPAN>Lesson 3: Transition Approaches - Demo -&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_yfk165t9" target="_blank" rel="noopener nofollow noreferrer"><SPAN>part 1</SPAN></A><SPAN>&nbsp;|&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_poaer3hd" target="_blank" rel="noopener nofollow noreferrer"><SPAN>part 2</SPAN></A><SPAN>&nbsp;</SPAN></LI><LI><SPAN>Lesson 4: Additional resources -&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_xl128163" target="_blank" rel="noopener nofollow noreferrer"><SPAN>video</SPAN></A><SPAN>&nbsp;,&nbsp;</SPAN><A href="https://www.kaltura.com/api_v3/service/attachment_attachmentasset/action/serve/attachmentAssetId/1_85qhrcbt" target="_blank" rel="noopener nofollow noreferrer"><SPAN>PDF</SPAN></A><SPAN>&nbsp;</SPAN></LI></UL><P><STRONG><SPAN>Unit 7: BW Modernization Series - Working with SAP Business Warehouse Elements in SAP Datasphere</SPAN></STRONG><SPAN>&nbsp;</SPAN></P><UL><LI><SPAN>Lesson 1: Best practices for working with SAP Business Warehouse elements in SAP Datasphere -&nbsp;</SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_2sh706ds" target="_blank" rel="noopener nofollow noreferrer"><SPAN>video</SPAN></A><SPAN>&nbsp;,&nbsp;</SPAN><A href="https://www.kaltura.com/api_v3/service/attachment_attachmentasset/action/serve/attachmentAssetId/1_myd314bq" target="_blank" rel="noopener nofollow noreferrer"><SPAN>PDF</SPAN></A><SPAN>&nbsp;</SPAN></LI></UL><H2 id="toc-hId-798068644"><STRONG><SPAN><BR />Get Started </SPAN></STRONG><SPAN>&nbsp;</SPAN></H2><P><SPAN>We’ve designed the series to offer a comprehensive overview, from the basics of starting your <A href="https://www.sap.com/products/technology-platform/datasphere/migration.html#na" target="_self" rel="noopener noreferrer">SAP BW modernization</A> journey to successfully implementing the transition.</SPAN><SPAN>&nbsp;</SPAN></P><P><SPAN>Don’t miss the opportunity to prepare your operations, strategies, and mindset for the future with the SAP BW Modernization Webinar series. Take the first step towards empowering your business' digital transformation journey with the </SPAN><A href="https://sapvideo.cfapps.eu10-004.hana.ondemand.com/?entry_id=1_d7104w4j" target="_blank" rel="noopener nofollow noreferrer"><SPAN>first lesson</SPAN></A><SPAN>. 
</SPAN><SPAN>&nbsp;</SPAN></P><P>&nbsp;</P> 2024-05-13T15:49:24.460000+02:00 https://community.sap.com/t5/enterprise-architecture-blog-posts/s-4hana-public-cloud-integration-with-sap-datasphere/ba-p/13699159 S/4HANA Public Cloud Integration with SAP Datasphere 2024-05-14T05:54:48.426000+02:00 xsnat https://community.sap.com/t5/user/viewprofilepage/user-id/159410 <H1 id="toc-hId-865495586">Create a Connection between SAP S/4HANA Public Cloud and SAP Datasphere</H1><H2 id="toc-hId-798064800">1.&nbsp;&nbsp;&nbsp; Introduction</H2><P>In the fast-paced digital landscape, businesses seek synergy between technology solutions to thrive. SAP S/4HANA Public Cloud offers robust ERP capabilities, while Datasphere empowers data-driven insights. Together, they form a dynamic partnership, driving innovation and efficiency.</P><P>In this blog, we unravel the connection between SAP S/4HANA Public Cloud and Datasphere. From integration strategies to real-world applications, discover how this convergence revolutionizes operational excellence and strategic decision-making.</P><P>Join us on a journey towards unlocking the full potential of this transformative partnership, where data-driven insights fuel unprecedented success in the digital era.</P><P>To make the same steps you must have enough permissions on the S4 side and on the Datasphere side.</P><H2 id="toc-hId-601551295">2.&nbsp;&nbsp;&nbsp; Basic Logic and Setup</H2><P>We will setup the Connection to work with the S4 data in the Datasphere. To make sure that you have enough permissions to do that connection check the following steps:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="xsnat_18-1715591994057.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109395i5EACBB742393DD15/image-size/large?v=v2&amp;px=999" role="button" title="xsnat_18-1715591994057.png" alt="xsnat_18-1715591994057.png" /></span>On the Datasphere side you must have the permissions to setup a Connection with a System.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="xsnat_19-1715592030625.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109396i1CF328828F6350E4/image-size/large?v=v2&amp;px=999" role="button" title="xsnat_19-1715592030625.png" alt="xsnat_19-1715592030625.png" /></span></P><P>On the S4 side make sure that you have access to these apps.&nbsp;</P><P>If you have all these requirements, then you are to go with the blog. <span class="lia-unicode-emoji" title=":slightly_smiling_face:">🙂</span></P><H2 id="toc-hId-405037790">3.&nbsp;&nbsp;&nbsp; Setup the Communication Settings in the S4</H2><P>To build up the Connection between the S4 and Datasphere we must enable on the S4 side the communication between the S4 and other Systems. The first app that we we need to open is the App “Communication Arrangements.”</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="xsnat_20-1715592106972.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109397iA6D620625B7EFA39/image-size/large?v=v2&amp;px=999" role="button" title="xsnat_20-1715592106972.png" alt="xsnat_20-1715592106972.png" /></span>After that you can click on “New” to create a new Communication Arrangement. Then a pop up will appear. 
There you can search for the scenario “SAP Datasphere - ABAP CDS Extraction - WebSocket Integration”</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="xsnat_21-1715592146664.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109398i6B374ED488CE0480/image-size/large?v=v2&amp;px=999" role="button" title="xsnat_21-1715592146664.png" alt="xsnat_21-1715592146664.png" /></span>After that you will jump into the Communication Arrangement. Then you can create a new Communication System. This is needed to declare which system will take the data from the S4. For that, click on New, enter a Name for your system and then click Create.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="xsnat_22-1715592170028.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109399iD46443ADCFCCCCAF/image-size/large?v=v2&amp;px=999" role="button" title="xsnat_22-1715592170028.png" alt="xsnat_22-1715592170028.png" /></span></P><P>Now you are in the Communication System. There you must enter the URL of your Datasphere tenant, but make sure to remove the https:// from the URL. Your URL should be something like this: “<STRONG><EM>your-datasphere-tenant.location.hcs.cloud.sap</EM></STRONG>”</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="xsnat_23-1715592189945.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109400i598C3C643942F995/image-size/large?v=v2&amp;px=999" role="button" title="xsnat_23-1715592189945.png" alt="xsnat_23-1715592189945.png" /></span></P><P>Then you can scroll down until you reach the options for the Inbound Communication User. There you can click on the + and then press the button to create a new User.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="xsnat_24-1715592209047.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109401i56584B3EF2D61AF5/image-size/large?v=v2&amp;px=999" role="button" title="xsnat_24-1715592209047.png" alt="xsnat_24-1715592209047.png" /></span>Now you can create your own user. Enter a Name and Description for the User, and then either enter your own password or let the system propose one. After you have done that, you can click on Create.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="xsnat_25-1715592245041.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109402iF3E22D01C5EA34C0/image-size/large?v=v2&amp;px=999" role="button" title="xsnat_25-1715592245041.png" alt="xsnat_25-1715592245041.png" /></span>After you have created your Inbound Communication User, you will be sent back to the Communication System app. You will see that your User has been selected, and you can confirm it by clicking “OK”.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="xsnat_26-1715592280355.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109403iAC9505590732BF06/image-size/large?v=v2&amp;px=999" role="button" title="xsnat_26-1715592280355.png" alt="xsnat_26-1715592280355.png" /></span>Now you can scroll down to the Outbound Communication User. There you can click on the + to create a new User. In the pop-up you can select the Authentication Method of the User. 
There you can select “None” and then you can create the User.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="xsnat_27-1715592312002.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109404i97C89E91018650F2/image-size/large?v=v2&amp;px=999" role="button" title="xsnat_27-1715592312002.png" alt="xsnat_27-1715592312002.png" /></span>Now you can save your Communication System.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="xsnat_28-1715592333316.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109405iFDBF8150890CDCE0/image-size/large?v=v2&amp;px=999" role="button" title="xsnat_28-1715592333316.png" alt="xsnat_28-1715592333316.png" /></span></P><P>Now you are back where you started. For the Authorization Group ID, select the value “SAP_DSP_ALL” to have access to all Views from the S4.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="xsnat_29-1715592357997.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109406iD9DD7D29172DB8D4/image-size/large?v=v2&amp;px=999" role="button" title="xsnat_29-1715592357997.png" alt="xsnat_29-1715592357997.png" /></span>Click on Save, and then you have set up everything on the S4 side.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="xsnat_30-1715592376784.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109407i5686AD9F0A7279D2/image-size/large?v=v2&amp;px=999" role="button" title="xsnat_30-1715592376784.png" alt="xsnat_30-1715592376784.png" /></span></P><H2 id="toc-hId-208524285">4.&nbsp;&nbsp;&nbsp; Make the Connection to S4 in Datasphere</H2><P>Now that we are back in Datasphere, we can create the connection to the S4. 
Go to the Connections and click on “Create”. There you select the Connection “SAP ABAP.”</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="xsnat_31-1715592409660.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109408i1AABBD00ACBB1CB1/image-size/large?v=v2&amp;px=999" role="button" title="xsnat_31-1715592409660.png" alt="xsnat_31-1715592409660.png" /></span></P><P>Then you can enter your credentials:</P><P>&nbsp;</P><P>Protocol: Web Socket RFC</P><P>SAP Logon Connection Type: Application Server</P><P>Application Server: your-s4-tenant-api.s4hana.cloud.sap</P><P>Port: 443</P><P>Client: 100</P><P>&nbsp;</P><P>Username: YourUsernamefromtheInboundConnection</P><P>Password: YourPasswordfromtheInboundConnection</P><P>&nbsp;</P><P>Then you can click on “Next Step”.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="xsnat_32-1715592445550.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109409i8BBA46607E89269F/image-size/large?v=v2&amp;px=999" role="button" title="xsnat_32-1715592445550.png" alt="xsnat_32-1715592445550.png" /></span></P><P>Enter a Name for the Connection and click on “Create Connection”.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="xsnat_33-1715592490368.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109411iF163822C46F248E9/image-size/large?v=v2&amp;px=999" role="button" title="xsnat_33-1715592490368.png" alt="xsnat_33-1715592490368.png" /></span></P><P>The connection is created. If you want to check whether the connection works, you can select the Connection and click on the Validate button. There your connection will be validated.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="xsnat_34-1715592509543.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109412iFBD868605DBD02BD/image-size/large?v=v2&amp;px=999" role="button" title="xsnat_34-1715592509543.png" alt="xsnat_34-1715592509543.png" /></span></P><P>If it has worked, you will see this message:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="xsnat_35-1715592529558.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/109413iF6EB73ED988620B5/image-size/large?v=v2&amp;px=999" role="button" title="xsnat_35-1715592529558.png" alt="xsnat_35-1715592529558.png" /></span></P><P>Now you can work with the S4 Views in Datasphere. <span class="lia-unicode-emoji" title=":slightly_smiling_face:">🙂</span></P> 2024-05-14T05:54:48.426000+02:00 https://community.sap.com/t5/technology-blogs-by-members/sap-datasphere-space-management/ba-p/13694454 SAP Datasphere - Space Management 2024-05-14T10:37:35.395000+02:00 vagarwal1 https://community.sap.com/t5/user/viewprofilepage/user-id/491751 <P><SPAN>This blog helps you get started with SAP Datasphere. 
After getting a Datasphere tenant, we need to define a space in which the team can define connections, data access modes, models and views.&nbsp;</SPAN></P><P>Log in to the Datasphere Tenant:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_0-1715127854726.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107151i88DDA34A8E1C6C37/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_0-1715127854726.png" alt="vagarwal1_0-1715127854726.png" /></span></P><P>Navigate to the given icon / Space Management:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_1-1715127876689.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107152iF564FAC5F73DF148/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_1-1715127876689.png" alt="vagarwal1_1-1715127876689.png" /></span></P><P>Choose the Menu option ‘Create’:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_2-1715127903027.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107153i4243BADEE47D378E/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_2-1715127903027.png" alt="vagarwal1_2-1715127903027.png" /></span></P><P>Provide the Name of the space:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_3-1715127925604.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107154iCE20A45CC5E77521/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_3-1715127925604.png" alt="vagarwal1_3-1715127925604.png" /></span></P><P>Allocate Disk and Memory Storage:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_4-1715127949021.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107155iE4A96D37700C1DB4/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_4-1715127949021.png" alt="vagarwal1_4-1715127949021.png" /></span></P><P>Workload Management:</P><P>Set the appropriate priority based on space usage.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_5-1715127969202.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107156i1EA57DB0F666F915/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_5-1715127969202.png" alt="vagarwal1_5-1715127969202.png" /></span></P><P>Users:</P><P>Add new users to the space by clicking on the Menu option ‘Add’:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_6-1715128001783.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107157i6836F601015950F5/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_6-1715128001783.png" alt="vagarwal1_6-1715128001783.png" /></span></P><P>*Scopes need to be defined for the users before they can be added to a space.</P><P>Database Access:</P><P>Check the tick box for ‘Expose for Consumption by Default’ if you want to expose data stored in this space for reporting.</P><P>Database Users:</P><P>Create a space database user to connect to the space at the DB layer or from an external tool like Power BI:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_7-1715128040733.png" style="width: 400px;"><img 
src="https://community.sap.com/t5/image/serverpage/image-id/107158iC12364DEDA21F3C8/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_7-1715128040733.png" alt="vagarwal1_7-1715128040733.png" /></span></P><P>provide username and select the option as required.</P><P>&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_9-1715128175606.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107160i04EB29655AF8A6CF/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_9-1715128175606.png" alt="vagarwal1_9-1715128175606.png" /></span></P><P>Auditing:</P><P>We can keep Logs for Read and Change operation by default 7 days as required by the Audit team.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_10-1715128200546.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107161i8F28D9C0C353737B/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_10-1715128200546.png" alt="vagarwal1_10-1715128200546.png" /></span></P><P>Multiple space can be created based on the requirement.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_11-1715128225418.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107162i8014220D980D042D/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_11-1715128225418.png" alt="vagarwal1_11-1715128225418.png" /></span></P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P> 2024-05-14T10:37:35.395000+02:00 https://community.sap.com/t5/technology-blogs-by-members/sap-bw-bridge-sap-cloud-connector-configuration/ba-p/13695621 SAP BW bridge – SAP Cloud connector configuration 2024-05-14T10:40:01.329000+02:00 vagarwal1 https://community.sap.com/t5/user/viewprofilepage/user-id/491751 <P>Login to cloud connector with Administrator account</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_0-1715192738368.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107667i64C2E08EE5D46851/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_0-1715192738368.png" alt="vagarwal1_0-1715192738368.png" /></span></P><P>Get the sub account and location ID created on SAP DataSphere. 
Sub-account can be created by following the documentation at below link: <A href="https://help.sap.com/docs/SAP_DATASPHERE/9f804b8efa8043539289f42f372c4862/6de74f7731c54ce88f2883df8f8671a8.html" target="_blank" rel="noopener noreferrer">https://help.sap.com/docs/SAP_DATASPHERE/9f804b8efa8043539289f42f372c4862/6de74f7731c54ce88f2883df8f8671a8.html</A></P><P>(SAP DataSphere –&gt; System –&gt; Administration-&gt; Data Source Configuration)</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_2-1715192828774.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107669i85E05BFD8414EF71/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_2-1715192828774.png" alt="vagarwal1_2-1715192828774.png" /></span></P><P>On SAP Cloud connector, add the sub account and location ID created in SAP DataSphere:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_1-1715192779107.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107668iDBDEFEB5B0ABCCA5/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_1-1715192779107.png" alt="vagarwal1_1-1715192779107.png" /></span></P><P>fill the required information and save.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_4-1715192989437.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107672iAF821EA150F5C47F/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_4-1715192989437.png" alt="vagarwal1_4-1715192989437.png" /></span></P><P>Next, get the information of BW Bridge server from SAP DataSphere (Connections -&gt; BW Bridge)</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_7-1715193233895.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107676i1036DEF5F7942DA7/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_7-1715193233895.png" alt="vagarwal1_7-1715193233895.png" /></span></P><P>On SAP Cloud connector, add SAP BW Bridge as ‘On-Premise To Cloud’&nbsp; by select option as below.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_5-1715193049806.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107673i90EC399ADC396125/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_5-1715193049806.png" alt="vagarwal1_5-1715193049806.png" /></span></P><P>Click on '+' Symbol to add new server:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_6-1715193178600.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107674i3DE01BFBD03334DB/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_6-1715193178600.png" alt="vagarwal1_6-1715193178600.png" /></span></P><P>Provide the input as shown below: (Type - ABAP Cloud System and Description about server)</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_8-1715193377057.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107677i64F10AB520ED99BD/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_8-1715193377057.png" alt="vagarwal1_8-1715193377057.png" /></span></P><P>In next screen, input Host address, Instance No., and No# of connections and Click on 
Finish.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_9-1715193489715.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107678i8B443313F7233E83/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_9-1715193489715.png" alt="vagarwal1_9-1715193489715.png" /></span></P><P>BW Bridge is added to SAP Cloud connector and we can see status as 'GREEN' as shown below:</P><P>&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_11-1715193566187.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107680i5930CE85845070AC/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_11-1715193566187.png" alt="vagarwal1_11-1715193566187.png" /></span></P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P> 2024-05-14T10:40:01.329000+02:00 https://community.sap.com/t5/technology-blogs-by-members/sap-bw-bridge-integrating-with-on-premises-ecc-bw-servers/ba-p/13695784 SAP BW Bridge – Integrating with On-Premises ECC / BW servers 2024-05-14T10:47:37.351000+02:00 vagarwal1 https://community.sap.com/t5/user/viewprofilepage/user-id/491751 <P>To begin with, first we need to add on-Premises server to SAP Cloud connector.</P><P>Login to SAP Cloud connector as Administrator.</P><P>Go to the path: Sub-account -&gt; Cloud To On-Premise</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_9-1715209118581.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107749iE09B747569518D7F/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_9-1715209118581.png" alt="vagarwal1_9-1715209118581.png" /></span></P><P>&nbsp;</P><P>Click on ‘+’ symbol to add new server.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_10-1715209145973.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107750iC9B0DF02D2573196/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_10-1715209145973.png" alt="vagarwal1_10-1715209145973.png" /></span></P><P>&nbsp;</P><P>In Add System Mapping, select Back-end Type as ‘ABAP System’</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_2-1715208927423.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107742i9CDCC11A76C6E482/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_2-1715208927423.png" alt="vagarwal1_2-1715208927423.png" /></span></P><P>Select protocol as ‘RFC’.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_3-1715208944850.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107743i2D4E21E20C901384/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_3-1715208944850.png" alt="vagarwal1_3-1715208944850.png" /></span></P><P>Choose the option with / without load balancing, since mine is single instance server, I have chosen second option.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_4-1715208964242.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107744i93B395C15878BFC9/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_4-1715208964242.png" alt="vagarwal1_4-1715208964242.png" 
/></span></P><P>Enter the application name as FQDN “&lt;host&gt;.&lt;domain&gt;.com” and server instance no.,</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_5-1715208988355.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107745i7BC4498B93F116B7/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_5-1715208988355.png" alt="vagarwal1_5-1715208988355.png" /></span></P><P>Give a virtual name of the server, that will be used as reference in SAP BW Bridge.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_6-1715209009815.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107746i78D23EA1C8FC350D/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_6-1715209009815.png" alt="vagarwal1_6-1715209009815.png" /></span></P><P>Provide meaningful description.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_7-1715209027876.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107747iA02BFC669A21C2A6/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_7-1715209027876.png" alt="vagarwal1_7-1715209027876.png" /></span></P><P>Check the box ‘Check internal Host’ and click on finish.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_8-1715209044546.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107748i4CAD9C6C48D0FF60/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_8-1715209044546.png" alt="vagarwal1_8-1715209044546.png" /></span></P><P>Check the server status, it will show as reachable.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_11-1715209202461.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107752i5A8664ECB6121A41/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_11-1715209202461.png" alt="vagarwal1_11-1715209202461.png" /></span></P><P>Download the resources for R/3 On-premises server from below snote: &nbsp;<A href="https://help.sap.com/docs/link-disclaimer?site=https://me.sap.com/notes/3112568" target="_blank" rel="noopener noreferrer">3112568</A></P><P>Upload the resources for newly added server by clicking on up arrow.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_12-1715209236886.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107753iB6DFA6EF8BE22879/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_12-1715209236886.png" alt="vagarwal1_12-1715209236886.png" /></span></P><P>In the Import Scenario, browse and select the zip file for ODP resources.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_13-1715209260260.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107754i8B8410C80CA23E7C/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_13-1715209260260.png" alt="vagarwal1_13-1715209260260.png" /></span></P><P>Once imported status will be ‘Green’ and function name will be like below.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_14-1715209276647.png" style="width: 400px;"><img 
src="https://community.sap.com/t5/image/serverpage/image-id/107756iCFF89D562CDD5DF1/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_14-1715209276647.png" alt="vagarwal1_14-1715209276647.png" /></span></P><P>Login to SAP BW Bridge with administrator privileges and select Administration - &gt; ‘Communication Management’</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_15-1715209297180.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107757i1A975818C62BD003/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_15-1715209297180.png" alt="vagarwal1_15-1715209297180.png" /></span></P><P>In Communication Management select ‘Maintain communication Users’</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_16-1715209313917.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107758i581EEFBDD04CF026/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_16-1715209313917.png" alt="vagarwal1_16-1715209313917.png" /></span></P><P>Select ‘New’ communication user to be added.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_17-1715209333830.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107759iC2ABB4C98DA12F21/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_17-1715209333830.png" alt="vagarwal1_17-1715209333830.png" /></span></P><P>Provide the Name and description of the user. For password either you can provide complex password or select random generated password with option of ‘Propose Password’. Click on create.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_18-1715209353258.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107760iD0737C69064A7559/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_18-1715209353258.png" alt="vagarwal1_18-1715209353258.png" /></span></P><P>A new user ID will be created for the given user.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_19-1715209371238.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107761i8E8C8731ABFAEFE8/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_19-1715209371238.png" alt="vagarwal1_19-1715209371238.png" /></span></P><P>Return the Communication Management page and click on Communication Systems.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_20-1715209386042.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107762i17CC237C9CF639F0/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_20-1715209386042.png" alt="vagarwal1_20-1715209386042.png" /></span></P><P>Select the ‘New’ option to create new communication server</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_21-1715209399918.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107764i7A4C60F5A9757B83/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_21-1715209399918.png" alt="vagarwal1_21-1715209399918.png" /></span></P><P>Provide the reference name and system ID and click on create.</P><P><span class="lia-inline-image-display-wrapper 
lia-image-align-inline" image-alt="vagarwal1_22-1715209415379.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107765iA0389ED696DA8273/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_22-1715209415379.png" alt="vagarwal1_22-1715209415379.png" /></span></P><P>For the new system, under Technical Data under general, provide the server’s name and port no., as created in SAP Cloud connector.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_23-1715209431345.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107766iFB93B5CDEC16B163/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_23-1715209431345.png" alt="vagarwal1_23-1715209431345.png" /></span></P><P>Switch ON the cloud connector and provide the location ID.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_24-1715209444275.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107767iE1B958A340B509CC/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_24-1715209444275.png" alt="vagarwal1_24-1715209444275.png" /></span></P><P>For RFC settings, provide target host name as virtual host name in SAP cloud connector, for Client – 100 and Instance no# 00.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_25-1715209458837.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107768iDC8B64A4ECD5E999/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_25-1715209458837.png" alt="vagarwal1_25-1715209458837.png" /></span></P><P>Add the user for Inbound communication</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_26-1715209473942.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107769iFC4862A0FCD3DBE1/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_26-1715209473942.png" alt="vagarwal1_26-1715209473942.png" /></span></P><P>Select the user created as communication user and click OK.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_27-1715209491044.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107770iCA33EA5A6959FA36/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_27-1715209491044.png" alt="vagarwal1_27-1715209491044.png" /></span></P><P>Similarly for the outbound communication provide the user ID created in R/3 server.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_28-1715209508020.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107771i7A829C1BC218BFAA/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_28-1715209508020.png" alt="vagarwal1_28-1715209508020.png" /></span></P><P>Save the configuration.</P><P>Return to Communication Management and select “Communication Arrangements”.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_29-1715209532955.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107772i9DA4AF068710CC70/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_29-1715209532955.png" alt="vagarwal1_29-1715209532955.png" /></span></P><P>Select new communication 
arrangement.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_30-1715209547552.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107773i88BA889F47BD27D1/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_30-1715209547552.png" alt="vagarwal1_30-1715209547552.png" /></span></P><P>Select the Scenario as ‘SAP_COM_0692’ (SAP BW Bridge – ODP RFC Source system Integration) and give a name and click on ‘Create’.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_31-1715209568467.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107774i79C61DE998074585/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_31-1715209568467.png" alt="vagarwal1_31-1715209568467.png" /></span></P><P>Select the newly created Communication system.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_32-1715209582754.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107775i637AE684D6B79CEB/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_32-1715209582754.png" alt="vagarwal1_32-1715209582754.png" /></span></P><P>Select the backend system ID as created in eclipse tool as source system.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_33-1715209596113.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107776i4556544A8B6E2F16/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_33-1715209596113.png" alt="vagarwal1_33-1715209596113.png" /></span></P><P>Validate the Inbound and Outbound user and click on save.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_34-1715209612552.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107777iF9BA302AA9D4CD98/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_34-1715209612552.png" alt="vagarwal1_34-1715209612552.png" /></span></P><P>Login to backend R/3 server and transaction code ‘SM59’ and create a new RFC connection.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_35-1715209641758.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107778i997D349C1517DC19/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_35-1715209641758.png" alt="vagarwal1_35-1715209641758.png" /></span></P><P>Create the RFC connection of type ‘3 – ABAP ‘ as shown below.</P><P>RFC Destination : ‘Own SAP Cloud system’ name of BW bridge.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_36-1715209661527.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107779i3BF43FE6BE75A185/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_36-1715209661527.png" alt="vagarwal1_36-1715209661527.png" /></span></P><P>Target host: SAP Cloud connector FQDN (without https and port no#).</P><P>Instance no# 00</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_37-1715209688308.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107780i7EB31F55B30D09B5/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_37-1715209688308.png" 
alt="vagarwal1_37-1715209688308.png" /></span></P><P>Under logon &amp; Security enter the Inbound User ID &amp; Password created for Communication user and save the settings.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_38-1715209708861.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107781i1CA56D5F7569030C/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_38-1715209708861.png" alt="vagarwal1_38-1715209708861.png" /></span></P><P>Click on Test RFC connection and it will show connected.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_39-1715209721452.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107782i3BE9AA84521471D4/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_39-1715209721452.png" alt="vagarwal1_39-1715209721452.png" /></span></P><P>We have successfully added backed R/3 server to SAP BW Bridge.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_40-1715209734635.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/107783iFD0E6B67F94BC75F/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_40-1715209734635.png" alt="vagarwal1_40-1715209734635.png" /></span></P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P> 2024-05-14T10:47:37.351000+02:00 https://community.sap.com/t5/technology-blogs-by-members/sap-datasphere-sap-hana-database-monitoring/ba-p/13696750 SAP DataSphere – SAP HANA Database Monitoring 2024-05-14T10:49:02.161000+02:00 vagarwal1 https://community.sap.com/t5/user/viewprofilepage/user-id/491751 <P>Login to SAP DataSphere and navigate to path System -&gt; Configuration. Select the Menu option 'Database Access' -&gt; Database Analysis Users.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_0-1715279836432.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108160i98D7E112FD94AF79/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_0-1715279836432.png" alt="vagarwal1_0-1715279836432.png" /></span></P><P>Select the option Create for new DB user.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_1-1715279908449.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108161iD17A3B48B282E28C/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_1-1715279908449.png" alt="vagarwal1_1-1715279908449.png" /></span></P><P>Provide the Name of the user along with No# of days the user will be active ( after which the user ID will get deleted) and click on 'Create'.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_2-1715280024179.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108162i8E536E46017EA847/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_2-1715280024179.png" alt="vagarwal1_2-1715280024179.png" /></span></P><P>HAND DB Analysis User will be created and access details will pop up in next screen. 
Please make not of User Name, Host, Port and Password.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_3-1715280193539.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108163iEAE06031152158D0/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_3-1715280193539.png" alt="vagarwal1_3-1715280193539.png" /></span></P><P>We can see the newly created user under 'Database Analysis Users' along with its status and Validity date.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_4-1715280316554.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108164i2C41B40EFDE89D44/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_4-1715280316554.png" alt="vagarwal1_4-1715280316554.png" /></span></P><P>To Access the SAP HANA Cockpit, check the User Name and click on 'Open SAP HANA Cockpit'</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_5-1715280400968.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108165iF1152AEE7E80A50C/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_5-1715280400968.png" alt="vagarwal1_5-1715280400968.png" /></span></P><P>This will take us to new webpage/ tab for HANA DB Cockpit and prompt us to enter the Login details.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_6-1715280570365.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108166i17FC9CBFE4F9400E/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_6-1715280570365.png" alt="vagarwal1_6-1715280570365.png" /></span></P><P>After successful login we land on the Database Overview page of SAP DataSphere HANA DB.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_7-1715280703977.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108167i9E4C5C665D9552A5/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_7-1715280703977.png" alt="vagarwal1_7-1715280703977.png" /></span></P><P>Here we have option to select different view from the drop down menu.&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_8-1715280873819.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108168i8E66A1611982C5E1/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_8-1715280873819.png" alt="vagarwal1_8-1715280873819.png" /></span></P><P>Select Monitoring the see the overall status of the SAP HANA DB. Now, we will explore some key monitoring artifacts like Storage, CPU, Threads, Transactions, etc.,.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_9-1715281011392.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108169i39DA6217528F39B9/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_9-1715281011392.png" alt="vagarwal1_9-1715281011392.png" /></span></P><P>1. 
Storage Utilization will provide statistic of used disk space against total allocated space of the SAP DataSphere tenant.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_10-1715281110858.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108170iC8BBAE7D21E06B3C/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_10-1715281110858.png" alt="vagarwal1_10-1715281110858.png" /></span></P><P>Analyze Workloads will provide details view of 'Top SQL Statements', 'Background Jobs', 'Timeline' &amp; active&nbsp; 'Threads'.&nbsp;&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_12-1715281444992.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108174i4DCC0B4171163FAA/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_12-1715281444992.png" alt="vagarwal1_12-1715281444992.png" /></span></P><P>We also have option to select the time period and Filter the view.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_11-1715281389496.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108172iEC083F01E1D245D5/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_11-1715281389496.png" alt="vagarwal1_11-1715281389496.png" /></span></P><P>Background jobs will show the status of background job, which were schedule for the selected period.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_13-1715281615690.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108178i27F7C4013C53B5D7/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_13-1715281615690.png" alt="vagarwal1_13-1715281615690.png" /></span></P><P>Under 'Timeline', we can monitor different application threads along with duration and status.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_14-1715281857789.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108182iF3522618AAB0A334/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_14-1715281857789.png" alt="vagarwal1_14-1715281857789.png" /></span></P><P>In Menu option 'Threads', we can select the different monitoring option for the process threads and see the overall performance. Here I've selected option as 'Thread State', we can also select different option for more detailed view.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_15-1715282090863.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108186iAE8ED390F5CCD71D/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_15-1715282090863.png" alt="vagarwal1_15-1715282090863.png" /></span></P><P>It also has option to select 'Secondary Dimension' and monitor the statistics based on selected criteria.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_16-1715282223647.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108187iF6F0BC3698D85415/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_16-1715282223647.png" alt="vagarwal1_16-1715282223647.png" /></span></P><P>2. 
Manage Services, will show the overall status of core HANA DB services for the SAP DataSphere tenant.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_17-1715282353948.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108191i88A1BC686585D6A4/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_17-1715282353948.png" alt="vagarwal1_17-1715282353948.png" /></span></P><P>&nbsp;we can see the service status, Database SID and overall status. Below image shows HANA DB services are running with one high priority alert with in the 'indexserver'. We can also access 'indexserver' trace file by selecting option ' Go to Trace file'.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_18-1715282587214.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108192i1408ED82EA14753E/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_18-1715282587214.png" alt="vagarwal1_18-1715282587214.png" /></span></P><P>3. Memory Usage: This menu option provide us high level view of Memory utilization.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_19-1715282815409.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108194iFDD9C32DF0EDA42D/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_19-1715282815409.png" alt="vagarwal1_19-1715282815409.png" /></span></P><P>To access detailed information on memory utilization, click on option ' Monitoring Performance', This will show us detailed statistical view. we can see memory allocation to different DB components and services.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_21-1715283023039.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108196i993939696C6FC469/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_21-1715283023039.png" alt="vagarwal1_21-1715283023039.png" /></span></P><P>with in the performance Monitor, we also have option to view performance metrics CPU &amp; Memory as well.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_22-1715283323803.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108201i247FAC2FE648C52A/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_22-1715283323803.png" alt="vagarwal1_22-1715283323803.png" /></span></P><P>More detailed view of Memory usage with respect to Workloads, Buffer Cache and Paging can also be accessed by selecting the option for menu (...).</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_23-1715283522037.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108203i7CEF334F66D84D4D/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_23-1715283522037.png" alt="vagarwal1_23-1715283522037.png" /></span></P><P>4. 
Monitoring Table usage:&nbsp; We can access table space usage statistics from 'Monitoring'</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_24-1715283651427.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108204i6CF1AECB2595CFA2/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_24-1715283651427.png" alt="vagarwal1_24-1715283651427.png" /></span></P><P>we can monitor the tables and its space utilization and statistics with different spaces.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_25-1715283910163.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108205iE222D9DCFFC32AC4/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_25-1715283910163.png" alt="vagarwal1_25-1715283910163.png" /></span></P><P>&nbsp;</P><P>5. Blocked Transactions: Performance of HANA DB get severely impacted if we have any blocked transaction at the DB level.&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_26-1715284072019.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108206i26D11C744720167F/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_26-1715284072019.png" alt="vagarwal1_26-1715284072019.png" /></span></P><P>We can see the active transaction in blocked status, if any.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_27-1715284152231.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108208i88F37C43C79AE587/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_27-1715284152231.png" alt="vagarwal1_27-1715284152231.png" /></span></P><P>6. CPU Usage: We can monitoring overview of CPU utilization and details view can be access with options like 'Monitoring Performance' and 'Analyze Workloads'.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vagarwal1_28-1715284317715.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/108209iB3E7FB1E769F79D4/image-size/medium?v=v2&amp;px=400" role="button" title="vagarwal1_28-1715284317715.png" alt="vagarwal1_28-1715284317715.png" /></span></P><P>&nbsp;</P><P>In case we observe any issue, which requires SAP attention than we need to open a support ticket with SAP to get the required help.</P><P>&nbsp;&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P> 2024-05-14T10:49:02.161000+02:00 https://community.sap.com/t5/technology-blogs-by-members/sap-datasphere-architecture-and-security-concept/ba-p/13702030 SAP Datasphere – Architecture and Security Concept 2024-05-15T11:52:30.219000+02:00 Kuma https://community.sap.com/t5/user/viewprofilepage/user-id/275733 <P><SPAN>SAP Datasphere (DSP) design differs from classic BW and therefore, a new approach is needed.</SPAN></P><P><SPAN>Space is the point of entry. Consider Spaces as Info Areas. Creating spaces and assigning users to specific spaces allows easier security maintenance and more straight forwards consumption. Row-Level data security is still applied however. 
</SPAN></P><P><STRONG><SPAN>Roles Used: </SPAN></STRONG></P><TABLE><TBODY><TR><TD width="162"><P><STRONG><SPAN>Role</SPAN></STRONG></P></TD><TD width="330"><P><STRONG><SPAN>Description&nbsp;&nbsp; </SPAN></STRONG></P></TD><TD width="132"><P><STRONG><SPAN>Space&nbsp;&nbsp; </SPAN></STRONG></P></TD></TR><TR><TD width="162"><P><SPAN>System Architect&nbsp; </SPAN></P></TD><TD width="330"><P><SPAN>System and Cloud Administration</SPAN></P></TD><TD width="132"><P><SPAN>All</SPAN></P></TD></TR><TR><TD width="162"><P><SPAN>Data Architect&nbsp; </SPAN></P></TD><TD width="330"><P><SPAN>Knows inbound data from all sources </SPAN></P></TD><TD width="132"><P><SPAN>Central </SPAN></P></TD></TR><TR><TD width="162"><P><SPAN>Solution Architect </SPAN></P></TD><TD width="330"><P><SPAN>Works on specific reporting requirements </SPAN></P></TD><TD width="132"><P><SPAN>Functional</SPAN></P></TD></TR></TBODY></TABLE><P><SPAN>&nbsp;</SPAN></P><P><STRONG><SPAN>Spaces Used: </SPAN></STRONG></P><TABLE><TBODY><TR><TD width="102"><P><STRONG><SPAN>Space</SPAN></STRONG></P></TD><TD width="522"><P><STRONG><SPAN>Description</SPAN></STRONG></P></TD></TR><TR><TD width="102"><P><SPAN>Central </SPAN></P></TD><TD width="522"><P><SPAN>Connections, Data Persistency, DACs, Time and Conversion Tables, Private-Like Views and Reusable-like Views for sharing with Functional spaces</SPAN></P></TD></TR><TR><TD width="102"><P><SPAN>Security </SPAN></P></TD><TD width="522"><P><SPAN>Security Tables</SPAN></P></TD></TR><TR><TD width="102"><P><SPAN>Functional</SPAN></P></TD><TD width="522"><P><SPAN>Reusable and Query-like views for consumption </SPAN></P></TD></TR><TR><TD width="102"><P><SPAN>CSV (optional)</SPAN></P></TD><TD width="522"><P><SPAN>For CSV Uploads (if necessary for key users / like BW-Workspace)</SPAN></P></TD></TR></TBODY></TABLE><P><SPAN>&nbsp;</SPAN></P><P><STRONG><SPAN>Access Used: </SPAN></STRONG></P><TABLE><TBODY><TR><TD width="162"><P><STRONG><SPAN>Role</SPAN></STRONG></P></TD><TD width="132"><P><STRONG><SPAN>Space&nbsp;&nbsp; </SPAN></STRONG></P></TD><TD width="132"><P><STRONG><SPAN>Access Type</SPAN></STRONG></P></TD></TR><TR><TD width="162"><P><SPAN>System Architect&nbsp; </SPAN></P></TD><TD width="132"><P><SPAN>All Spaces</SPAN></P></TD><TD width="132"><P><SPAN>Full</SPAN></P></TD></TR><TR><TD width="162"><P><SPAN>Data Architect&nbsp; </SPAN></P></TD><TD width="132"><P><SPAN>Central </SPAN></P></TD><TD width="132"><P><SPAN>Full</SPAN></P></TD></TR><TR><TD width="162"><P><SPAN>Data Architect&nbsp; </SPAN></P></TD><TD width="132"><P><SPAN>Functional</SPAN></P></TD><TD width="132"><P><SPAN>Read</SPAN></P></TD></TR><TR><TD width="162"><P><SPAN>Solution Architect</SPAN></P></TD><TD width="132"><P><SPAN>Central</SPAN></P></TD><TD width="132"><P><SPAN>Read</SPAN></P></TD></TR><TR><TD width="162"><P><SPAN>Solution Architect </SPAN></P></TD><TD width="132"><P><SPAN>Functional</SPAN></P></TD><TD width="132"><P><SPAN>Full</SPAN></P></TD></TR><TR><TD width="162"><P><SPAN>Security Team</SPAN></P></TD><TD width="132"><P><SPAN>Security</SPAN></P></TD><TD width="132"><P><SPAN>Full</SPAN></P></TD></TR><TR><TD width="162"><P><SPAN>Support Team </SPAN></P></TD><TD width="132"><P><SPAN>Central</SPAN></P></TD><TD width="132"><P><SPAN>DIM only</SPAN></P></TD></TR></TBODY></TABLE><P>&nbsp;</P><P><SPAN>Security: </SPAN></P><UL><LI><SPAN>Security Tables will be placed in a separate Space to ensure, that only specific users have access to them, but the DACs are built and assigned in the Central Space.</SPAN></LI></UL><P><SPAN>Central: 
</SPAN></P><UL><LI><SPAN>All connections – full control of all connections to all source systems</SPAN></LI><LI><SPAN>Persist the data (with Task Chains) – full control of the quota</SPAN></LI><LI><SPAN>Assign DACs (based on shared tables from the Security Space) to the persisted data / views / remote tables</SPAN></LI><LI><SPAN>Conversion (TCUR*, T006*) tables will be shared as Views</SPAN></LI><LI><SPAN>Time Tables can be shared directly or as Views</SPAN></LI><LI>The Central space shares Reusable-like views with the Functional spaces.</LI></UL><P><SPAN>Functional: </SPAN></P><UL><LI><SPAN>Only functionally relevant (virtual) objects</SPAN></LI><LI><SPAN>No data persisted</SPAN></LI><LI><SPAN>Possible to expand to branch/country spaces. Proper technical names and self-explanatory business names are necessary</SPAN></LI><LI>Functional spaces use shared Reusable-like views from the Central space.</LI></UL><P><SPAN>CSV:</SPAN></P><UL><LI><SPAN>Purely optional and only used if it is necessary to upload CSV data (similar to a BW Workspace)</SPAN></LI><LI><SPAN>A separate space, as the Functional spaces should not persist any data</SPAN></LI></UL> 2024-05-15T11:52:30.219000+02:00 https://community.sap.com/t5/technology-blogs-by-sap/transformation-flow-partition-generation-with-sap-datasphere-cli/ba-p/13702553 Transformation Flow partition generation with SAP Datasphere CLI 2024-05-15T17:16:21.756000+02:00 Sefan_Linders https://community.sap.com/t5/user/viewprofilepage/user-id/140128 <P>Do you want to partition a Transformation Flow in SAP Datasphere to reduce memory consumption? That’s what we discuss in this blog post, and we provide the Python code to realize it. It makes use of the SAP Datasphere Command Line Interface (CLI) to read and write Transformation Flows, and to write a Task Chain that holds them together as one.</P><P>Recently I tried loading a large amount of data from the SAP Datasphere BW Bridge into SAP Datasphere Core, using a Transformation Flow in Delta Mode. As you might know, the Bridge runs its own SAP HANA database, separate from the SAP HANA database that SAP Datasphere “Core” runs on. When using a Transformation Flow to load from Bridge to Core, the data transfer goes over SDA. Both Core and Bridge were minimally sized, so I needed to be careful with the memory consumption of the operation. That is usually what you achieve with partitioning, so that the data is transferred in smaller packets using a pre-defined set of criteria. However, Transformation Flows do not support partitioning yet.</P><P>The alternative? Creating multiple Transformation Flows, each with a WHERE clause in the SQL definition, effectively creating your own partitioned Flows. And since I didn’t feel like creating 20 of these manually, I thought it would be a nice opportunity to give the CLI another swing. And some Python code, of course.</P><H2 id="toc-hId-1014691068">The original Transformation Flow</H2><P>The original Transformation Flow is rather straightforward: we read from one Delta (remote) Table and write it into another Delta (local) Table. In this case we are writing from and to a Local Delta Table, which means that the Transformation Flow is in Delta Mode. 
Below you see the screenshots of the main work area and the source SQL definition.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Sefan_Linders_0-1715784753392.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/111084iD0A14E7A0E904609/image-size/large?v=v2&amp;px=999" role="button" title="Sefan_Linders_0-1715784753392.png" alt="Sefan_Linders_0-1715784753392.png" /></span></P><P>Figure 1: The original un-partitioned Transformation Flow</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Sefan_Linders_1-1715784753399.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/111083i53E1B466A8C28B91/image-size/large?v=v2&amp;px=999" role="button" title="Sefan_Linders_1-1715784753399.png" alt="Sefan_Linders_1-1715784753399.png" /></span></P><P>Figure 2: The SQL definition of the source node</P><P>&nbsp;</P><pre class="lia-code-sample language-sql"><code>SELECT "Change_Type", "Change_Date", "ID", "SOMESTRING" FROM "UC2_DLT_SRC_Delta"</code></pre><P>&nbsp;</P><P>What we want to accomplish is to copy this Transformation Flow for each partition that we want to create, add a WHERE clause to the SQL definition, and bind these Flows together in one Task Chain.</P><H2 id="toc-hId-818177563">The code (and running it)</H2><P>I’ve uploaded the code into this Git repo: <A href="https://github.com/SAP-samples/btp-global-center-of-excellence-samples/tree/main/DSP_object_generation" target="_blank" rel="noopener nofollow noreferrer">https://github.com/SAP-samples/btp-global-center-of-excellence-samples/tree/main/DSP_object_generation</A>. The repo is shared with a few other code samples, only the folder “DSP_object_generation” is needed for what we achieve in this blog. The folder also contains the code for Remote Table to View generation, which is described in <A href="https://community.sap.com/t5/technology-blogs-by-sap/sap-datasphere-view-generation-with-python-and-the-command-line-interface/ba-p/13558181" target="_self">a previous blog.</A></P><P>Requirements:</P><UL><LI>Python to run this. I developed this using 3.10.6.</LI><LI>The latest SAP Datasphere CLI. I used 2024.9.0.</LI></UL><P>To run it:</P><UL><LI>Create a dsp_logon_data.json file in the root of the folder. You can use “dsp_logon_data.json.template” as a template. The hdb* parameters do not have to be set for transformation flow generation, they only apply to the previous blog where we read remote table definitions from the database.</LI><LI>Edit the parameters in main.py:<UL><LI>Technical space name</LI><LI>Technical transformation flow name (this one should have been created already)</LI><LI>Column to partition on (in my example this is a String column)</LI><LI>Partition definition (in my example these are Strings)</LI></UL></LI><LI>Run main.py</LI></UL><H2 id="toc-hId-621664058">The output</H2><P>For each partition, a Transformation Flow is generated, as you can see in below screenshot. Also depicted is the WHERE statement added to one of the generated Transformation Flows. 
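</P><P>To make this more tangible, here is a simplified Python sketch of how such per-partition SQL statements can be derived from the original SELECT statement. This is an illustration only and not the code from the repository linked above; the partition column and values are made up, and the real script also rewrites the flow definitions and generates the Task Chain via the CLI.</P><pre class="lia-code-sample language-python"><code># Illustration only: derive one SQL statement per partition by appending a WHERE clause.
# The column and partition values are hypothetical; the linked repository contains the
# actual generation logic for the Transformation Flows and the Task Chain.
base_sql = (
    'SELECT "Change_Type", "Change_Date", "ID", "SOMESTRING" '
    'FROM "UC2_DLT_SRC_Delta"'
)
partition_column = "SOMESTRING"       # column to partition on (assumption)
partition_values = ["A", "B", "C"]    # hypothetical partition definition


def partitioned_statements(sql, column, values):
    """Return one SELECT statement per partition, each with its own WHERE clause."""
    return [f"{sql} WHERE \"{column}\" = '{value}'" for value in values]


statements = partitioned_statements(base_sql, partition_column, partition_values)
for number, statement in enumerate(statements, start=1):
    print(f"-- Transformation Flow partition {number}")
    print(statement)</code></pre><P>Each of these statements then feeds its own Transformation Flow, which keeps the memory footprint of a single run small. 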
Besides that, a Task Chain is generated which runs all generated Transformation Flows in sequence.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Sefan_Linders_2-1715784753405.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/111085i81F4CB446D114E29/image-size/large?v=v2&amp;px=999" role="button" title="Sefan_Linders_2-1715784753405.png" alt="Sefan_Linders_2-1715784753405.png" /></span></P><P>Figure 3: An overview of the generated objects</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Sefan_Linders_3-1715784753411.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/111086i5B9A2D5616141931/image-size/large?v=v2&amp;px=999" role="button" title="Sefan_Linders_3-1715784753411.png" alt="Sefan_Linders_3-1715784753411.png" /></span></P><P>Figure 4: The WHERE statement added to the generated Transformation Flow SQL definition</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Sefan_Linders_4-1715784753419.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/111087i0733B636AFD949BE/image-size/large?v=v2&amp;px=999" role="button" title="Sefan_Linders_4-1715784753419.png" alt="Sefan_Linders_4-1715784753419.png" /></span></P><P>Figure 5: The generated Task Chain that binds all generated Transformation Flows together</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Sefan_Linders_5-1715784753429.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/111088iA0BD51A4A69FEA7A/image-size/large?v=v2&amp;px=999" role="button" title="Sefan_Linders_5-1715784753429.png" alt="Sefan_Linders_5-1715784753429.png" /></span></P><P>Figure 6: Running the generated Task Chain</P><H2 id="toc-hId-425150553">Conclusion</H2><P>This blog post describes how to use the SAP Datasphere CLI and object generation using Python to partition a Transformation Flow. This is merely another example of what you can achieve with the CLI and some coding. This example might help you directly but applies to a rather straightforward Transformation Flow where we only want to add a WHERE clause, repeat this for each partition, and bind them together with a Task Chain. In anyway, I hope you enjoyed the read and are inspired to build something yourself.</P><P>&nbsp;</P> 2024-05-15T17:16:21.756000+02:00 https://community.sap.com/t5/technology-blogs-by-sap/sap-datasphere-multilingual-support-translation-dashboard/ba-p/13702743 SAP Datasphere Multilingual Support - Translation Dashboard 2024-05-16T09:39:56.336000+02:00 senthurshree https://community.sap.com/t5/user/viewprofilepage/user-id/535907 <P><STRONG>Introduction</STRONG></P><P>The SAP Datasphere Multilingual Support Series is intended to provide you with useful guidance on how to utilize the new multilingual translation dashboard. 
The translation dashboard allows you to translate large amounts of metadata in SAP Datasphere that can be leveraged by the SAP Analytics Cloud to display the stories or dashboard in several available languages.&nbsp;</P><P>This article is the second in the blog post series on multilingual capabilities in SAP Datasphere.</P><P>Blog Post #1: <A href="https://community.sap.com/t5/technology-blogs-by-sap/sap-datasphere-multilingual-support-master-data-translations/ba-p/13700657" target="_self"><SPAN>Master Data Translations in SAP Datasphere</SPAN></A></P><P>Blog Post #2: Meta Data Translations using the Translation dashboard in SAP Datasphere</P><P><STRONG>Prerequisites</STRONG></P><P style=" padding-left : 30px; ">To manage translation, you must be assigned to a scoped role that inherits a template (such as DW Space Administrator), which grants the Translation privilege.</P><P style=" padding-left : 30px; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="senthurshree_0-1715800206629.png" style="width: 1151px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/111196i3A75A09C611849DB/image-dimensions/1151x55?v=v2" width="1151" height="55" role="button" title="senthurshree_0-1715800206629.png" alt="senthurshree_0-1715800206629.png" /></span></P><P>&nbsp;</P><P><STRONG>Understading the Translations Workflow</STRONG></P><P>The below diagram clearly represents the activities that needs to be performed by different roles to enable, create and consume translations</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="senthurshree_1-1715800261534.png" style="width: 592px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/111198iF15B84226A4936E9/image-dimensions/592x430?v=v2" width="592" height="430" role="button" title="senthurshree_1-1715800261534.png" alt="senthurshree_1-1715800261534.png" /></span></P><P>&nbsp;</P><P><STRONG>Steps to setup and consume Translations</STRONG></P><OL><LI><STRONG>Enable the Space for Translation </STRONG></LI></OL><UL><LI>In the Main Menu, click&nbsp; (Space Management) and Select a space.</LI><LI>In the Overview tab, in the General Settings section, check the Enabling Translation box. Select a language as Source Language. For more information on available languages, see <A href="https://help.sap.com/docs/SAP_ANALYTICS_CLOUD/00f68c2e08b941f081002fd3691d86a7/8310076258a94a4194a926506b80c390.html#supported-data-access-languages" target="_blank" rel="noopener noreferrer">Supported Data Access Languages in SAP Analytics Cloud</A></LI></UL><P style=" padding-left : 30px; "><STRONG>Note: </STRONG></P><P style=" padding-left : 30px; ">Translation toggle cannot be disabled and the source language cannot be changed after the space is deployed.</P><P style=" padding-left : 30px; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="senthurshree_2-1715800360362.png" style="width: 757px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/111200iFCE0E7B09E5D06E5/image-dimensions/757x351?v=v2" width="757" height="351" role="button" title="senthurshree_2-1715800360362.png" alt="senthurshree_2-1715800360362.png" /></span></P><P>&nbsp;</P><P style=" padding-left : 30px; "><STRONG>2. Select Analytic Models to Translate in the Translation Dashboard</STRONG></P><P style=" padding-left : 30px; ">The user with the Translator privilege is able to launch the Translation Dashboard. 
All the spaces enabled with the Translation will be visible in the translation dashboard. Select the required space and you would be able to add the analytic model</P><P style=" padding-left : 30px; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="senthurshree_3-1715800426034.png" style="width: 776px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/111201i4B6DCFF1F26A9E10/image-dimensions/776x380?v=v2" width="776" height="380" role="button" title="senthurshree_3-1715800426034.png" alt="senthurshree_3-1715800426034.png" /></span></P><UL><LI>Click + (Add). The Add Objects dialog opens. Select the objects you want to translate. Click Add Objects.</LI></UL><P style=" padding-left : 30px; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="senthurshree_4-1715800460338.png" style="width: 750px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/111202i7D20F831EF2A0D73/image-dimensions/750x375?v=v2" width="750" height="375" role="button" title="senthurshree_4-1715800460338.png" alt="senthurshree_4-1715800460338.png" /></span></P><UL><LI>The selected Analytic Models and their dependencies are added to the tool.</LI></UL><P style=" padding-left : 30px; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="senthurshree_5-1715800485184.png" style="width: 756px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/111203i7EE6A86D9B54F918/image-dimensions/756x131?v=v2" width="756" height="131" role="button" title="senthurshree_5-1715800485184.png" alt="senthurshree_5-1715800485184.png" /></span></P><P style=" padding-left : 30px; "><STRONG>Note</STRONG></P><P style=" padding-left : 30px; ">Only Analytic Models can be selected in the Add Objects dialog, but the dimensions and hierarchies attached to added Analytic Models are automatically selected to the Translation tool.</P><P style=" padding-left : 30px; ">&nbsp;</P><P style=" padding-left : 30px; "><STRONG>3. Provide Translations for required languages.</STRONG></P><P style=" padding-left : 30px; ">The Translator can translate the metadata using two methods:</P><P style=" padding-left : 60px; "><STRONG>a. Translate Metadata via XLIFF Files</STRONG></P><P style=" padding-left : 60px; ">You can translate large amounts of metadata with the help of XLIFF files.</P><P style=" padding-left : 60px; "><STRONG>Download</STRONG></P><P style=" padding-left : 60px; ">In the Translation tool, select at least one entity that you want to translate. Click (Download) to download the XLIFF files.</P><P style=" padding-left : 60px; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="senthurshree_6-1715800604236.png" style="width: 738px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/111204i4B25E3472F04CDAF/image-dimensions/738x131?v=v2" width="738" height="131" role="button" title="senthurshree_6-1715800604236.png" alt="senthurshree_6-1715800604236.png" /></span></P><P style=" padding-left : 60px; ">You can choose to select All Strings or Outstanding Strings while exporting the XLIFF file(s). The Outstanding Strings option is enabled only for partially translated objects. The XLIFF file is downloaded to the default location in your system.</P><P style=" padding-left : 60px; ">The downloaded XLIFF file contains the source locale and the list of text that needs translation. 
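</P><P style=" padding-left : 60px; ">If no dedicated translation tool is available, the downloaded file can also be handled with a short script. Below is a rough sketch that assumes the export uses the common XLIFF 1.2 structure (trans-unit elements with source and target children); the file name and the sample translation are placeholders, so adapt them to your own objects and languages.</P><pre class="lia-code-sample language-python"><code># Rough sketch: list translation units in a downloaded XLIFF file and pre-fill targets.
# Assumes XLIFF 1.2; the file name and the sample dictionary are placeholders.
import xml.etree.ElementTree as ET

XLIFF_NS = "urn:oasis:names:tc:xliff:document:1.2"
NS = {"x": XLIFF_NS}
ET.register_namespace("", XLIFF_NS)

tree = ET.parse("MyAnalyticModel_de.xlf")        # hypothetical downloaded file
root = tree.getroot()

translations = {"Sales Volume": "Absatzmenge"}   # placeholder translations

for unit in root.iter(f"{{{XLIFF_NS}}}trans-unit"):
    source = unit.find("x:source", NS)
    target = unit.find("x:target", NS)
    if target is None:
        target = ET.SubElement(unit, f"{{{XLIFF_NS}}}target")
    if source is not None and source.text in translations:
        target.text = translations[source.text]
    print(unit.get("id"), "|", source.text if source is not None else "", "|", target.text)

tree.write("MyAnalyticModel_de_translated.xlf", encoding="utf-8", xml_declaration=True)</code></pre><P style=" padding-left : 60px; ">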
You can use this XLIFF file in any external translation tool to generate the translated content as a separate XLIFF file per locale.</P><P style=" padding-left : 60px; "><STRONG>Upload</STRONG></P><P style=" padding-left : 60px; ">Once the translation is done, click (Upload) to open the Import Translations dialog. Click Upload Files to upload the translated XLIFF files back into the Translation tool and Click Import to upload the XLIFF files.</P><P style=" padding-left : 60px; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="senthurshree_7-1715800725936.png" style="width: 734px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/111205iC4DA34F940B33BE3/image-dimensions/734x222?v=v2" width="734" height="222" role="button" title="senthurshree_7-1715800725936.png" alt="senthurshree_7-1715800725936.png" /></span></P><P style=" padding-left : 60px; ">You can import multiple XLIFF files. Each XLIFF file contains the translated content in one language, which is automatically fetched from the language code embedded in the name of the uploaded XLIFF files. These XLIFF files can be uploaded together since they belong to one object.</P><P style=" padding-left : 60px; ">Note: Name the file in the format &lt;<EM>Analytical Model Name</EM>&gt;_&lt;<EM>Language (2 char)</EM>&gt; as shown below,</P><P style=" padding-left : 60px; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="senthurshree_8-1715800761416.png" style="width: 727px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/111206i7C988F8E6A2DE3AB/image-dimensions/727x100?v=v2" width="727" height="100" role="button" title="senthurshree_8-1715800761416.png" alt="senthurshree_8-1715800761416.png" /></span></P><P>&nbsp;</P><P>&nbsp;</P><P style=" padding-left : 60px; "><STRONG>b. Translate Metadata Manually</STRONG></P><P style=" padding-left : 60px; ">You can view the source and the translated text in the Translation tool, and add or edit the translated text in-line.</P><P style=" padding-left : 60px; ">Select the object you want to translate manually and click (Edit).</P><P style=" padding-left : 60px; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="senthurshree_9-1715800810087.png" style="width: 704px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/111207i617170794AD739D1/image-dimensions/704x122?v=v2" width="704" height="122" role="button" title="senthurshree_9-1715800810087.png" alt="senthurshree_9-1715800810087.png" /></span></P><P style=" padding-left : 60px; ">The list of source text appears. Choose the Target Language and start adding the translated text in the empty field and save.</P><P style=" padding-left : 60px; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="senthurshree_10-1715800837489.png" style="width: 689px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/111208i8C28E94828740484/image-dimensions/689x380?v=v2" width="689" height="380" role="button" title="senthurshree_10-1715800837489.png" alt="senthurshree_10-1715800837489.png" /></span></P><P>&nbsp;</P><P style=" padding-left : 30px; "><STRONG>4. 
Consume Model with Data Access Language </STRONG></P><P style=" padding-left : 30px; ">Once the translations have been maintained for the Analytic Model in the Translation Dashboard in SAP Datasphere, you can see that the Analytical Model Preview displays the data in the Data Access Language.</P><P style=" padding-left : 30px; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="senthurshree_11-1715800882939.png" style="width: 729px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/111209i6F037B4B5F660A0E/image-dimensions/729x588?v=v2" width="729" height="588" role="button" title="senthurshree_11-1715800882939.png" alt="senthurshree_11-1715800882939.png" /></span></P><P style=" padding-left : 30px; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="senthurshree_12-1715800899768.png" style="width: 727px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/111210i3B6D6E2F41034F60/image-dimensions/727x380?v=v2" width="727" height="380" role="button" title="senthurshree_12-1715800899768.png" alt="senthurshree_12-1715800899768.png" /></span></P><P style=" padding-left : 30px; ">Similarly, users can access the SAP Analytic Cloud stories built on top of this Analytic Model in one of the available Data Access Languages defined in Translation Dashboard.</P><P style=" padding-left : 30px; ">&nbsp;</P><P><STRONG>Managing Translations </STRONG></P><UL><LI><STRONG>Different Status of Translations per object</STRONG> - To see the status of the translation of an object per language, click on the object. A dialog appears with the list of translated languages and corresponding status in which the object was translated.</LI><UL><LI>Translated: Once all strings are translated completely, the status is shown as Translated.</LI><LI>Not Translated: For a new object, or for objects whose translation is expired or deleted completely, the status is shown as Not Translated.</LI><LI>Partially Translated: When the strings in an object are not translated completely, the status is shown as Partially Translated. Once it is translated again, the status changes back to Translated.</LI></UL></UL><P style=" padding-left : 90px; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="senthurshree_13-1715800931018.png" style="width: 767px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/111212i776DC362992888DE/image-dimensions/767x117?v=v2" width="767" height="117" role="button" title="senthurshree_13-1715800931018.png" alt="senthurshree_13-1715800931018.png" /></span></P><P>&nbsp;</P><UL><LI><STRONG>Analytical Model Updated with New Columns</STRONG> - If the model has changed, new strings will automatically appear in the translation tool as Not Translated. Expired strings will still be available in the Translation tool. You can upload the XLIFF files even if there is a mismatch in the HTML tags between the source and target language strings.</LI><LI><STRONG>Deleting Translations</STRONG> - To delete translations, select one or more objects in the Translation tool and click (Delete). Deleting translations removes the object and attached translated strings from the Translation tool. 
<UL><LI><STRONG>Analytical Model Updated with New Columns</STRONG> - If the model has changed, new strings automatically appear in the Translation tool as Not Translated. Expired strings remain available in the Translation tool. You can upload the XLIFF files even if there is a mismatch in the HTML tags between the source and target language strings.</LI><LI><STRONG>Deleting Translations</STRONG> - To delete translations, select one or more objects in the Translation tool and click (Delete). Deleting translations removes the object and its attached translated strings from the Translation tool. You can add the object back to the tool, but no translations will be attached to it any longer.</LI></UL><P><STRONG>Summary</STRONG></P><P>You should now be able to create translations for the metadata of Analytic Models and consume them in SAP Analytics Cloud stories.</P><P>Please like, comment or post a question!</P> 2024-05-16T09:39:56.336000+02:00 https://community.sap.com/t5/technology-blogs-by-sap/sap-datasphere-multilingual-support-master-data-translation/ba-p/13700657 SAP Datasphere Multilingual Support - Master Data Translation 2024-05-16T10:22:57.119000+02:00 mona_durai https://community.sap.com/t5/user/viewprofilepage/user-id/542031 <P><SPAN>In this blog we provide an overview of master data translation. We will walk you through each case and show how it can be visualized in the Analytical Model Preview.</SPAN></P><P><SPAN>This article is the first in a blog post series on the multilingual capabilities of SAP Datasphere.</SPAN></P><P><SPAN>Blog Post #1: SAP Datasphere Multilingual Support - Master Data Translations (this blog)</SPAN></P><P><SPAN>Blog Post #2: <A title="Translating Metadata using the Translation Dashboard in SAP Datasphere" href="https://community.sap.com/t5/technology-blogs-by-sap/sap-datasphere-multilingual-support-translation-dashboard/ba-p/13702743" target="_self">Translating Metadata using the Translation Dashboard in SAP Datasphere</A></SPAN></P><P><STRONG><SPAN>Introduction</SPAN></STRONG></P><P><SPAN>Master data text is the descriptive information associated with the respective master data. For example, customers, products, vendors, materials and their hierarchies can each have their own descriptions in their respective text tables. In general, the master data or fact entity itself contains only IDs.
In the analytical representation, fetching the textual data adds value and additional context, making the data easier to understand.</SPAN></P><P><SPAN>In SAC stories, displaying product descriptions rather than their IDs makes the visuals much easier to understand.</SPAN></P><P><SPAN>Master Data Translation within SAP Datasphere can be accomplished through several methods:</SPAN></P><OL><LI><SPAN>Semantic mapping from an attribute within the entity</SPAN></LI><LI><SPAN>Text Association</SPAN></LI><LI><SPAN>Dimension Association</SPAN></LI><LI>Hierarchy Association</LI></OL><P>&nbsp;</P><P style=" padding-left : 30px; "><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Figure1: Dependent entities of the Analytical Model" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/110109i17C52EDE6A8D1048/image-size/large?v=v2&amp;px=999" role="button" title="Figure1.png" alt="Figure1: Dependent entities of the Analytical Model" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure1: Dependent entities of the Analytical Model</span></span></P><P>&nbsp; &nbsp; &nbsp;<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Figure2: Text/Dimension Associations in the entity – SALES (fact)" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/110082i3B2E1011B0AFFE2E/image-size/large?v=v2&amp;px=999" role="button" title="Figure2.jpg" alt="Figure2: Text/Dimension Associations in the entity – SALES (fact)" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure2: Text/Dimension Associations in the entity – SALES (fact)</span></span></P><P><SPAN>1. </SPAN><STRONG>Semantic mapping from an attribute within the entity</STRONG></P><P style=" padding-left : 30px; ">Here the <STRONG>Product ID</STRONG> attribute fetches the corresponding description from the <STRONG>Description</STRONG> attribute of the same fact. When an attribute is marked with the semantic type <STRONG>Text</STRONG>, it can be referenced in another attribute for Text/Association. <STRONG>Description</STRONG>, with semantic type <STRONG>Text</STRONG>, is referenced at Product ID for Text/Association, as shown in Figure2.</P><P style=" padding-left : 30px; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure3: Analytical Model Preview - Fetching the description from the same entity." style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/110092iEF6122273834E861/image-size/large?v=v2&amp;px=999" role="button" title="Figure3.jpg" alt="Figure3: Analytical Model Preview - Fetching the description from the same entity." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure3: Analytical Model Preview - Fetching the description from the same entity.</span></span></P>
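<P style=" padding-left : 30px; ">Before moving on to text associations, it can help to picture the two table shapes involved in a simplified form. The sketch below is purely illustrative (the column names and values are made up); it mirrors the idea that the fact entity carries only IDs and measures, while a language-dependent text table carries the descriptions.</P><PRE>
# Illustrative sketch only: simplified shapes of a fact entity and a text table.
# Column names and values are invented for this example.

# Fact entity (like the SALES fact above): IDs and measures, no descriptions.
sales = [
    {"product_id": "P100", "customer_id": "C01", "quantity": 3},
    {"product_id": "P200", "customer_id": "C02", "quantity": 5},
]

# Language-dependent text table: one description per (ID, language code).
customer_texts = [
    {"customer_id": "C01", "language": "en", "description": "Retail customer"},
    {"customer_id": "C01", "language": "de", "description": "Einzelhandelskunde"},
    {"customer_id": "C02", "language": "en", "description": "Wholesale customer"},
    {"customer_id": "C02", "language": "de", "description": "Grosshandelskunde"},
]
</PRE>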
<P>2. <STRONG>Text Association</STRONG></P><P style=" padding-left : 30px; ">The <STRONG>Customer ID</STRONG> attribute has a text association to the table <STRONG>Description</STRONG>.</P><P style=" padding-left : 30px; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure4: SALES – Description Text Association" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/110095i1757C74993489874/image-size/large?v=v2&amp;px=999" role="button" title="Figure4.jpg" alt="Figure4: SALES – Description Text Association" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure4: SALES – Description Text Association</span></span></P><P style=" padding-left : 30px; ">&nbsp;</P><P style=" padding-left : 30px; ">The description is retrieved based on the Data Access Language. Here the Data Access Language is set to German (Deutsch), so the German texts are fetched.</P><P style=" padding-left : 30px; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure5: Data Access Language Settings" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/110097i522BF0E0FDC566FB/image-size/large?v=v2&amp;px=999" role="button" title="Figure5.jpg" alt="Figure5: Data Access Language Settings" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure5: Data Access Language Settings</span></span></P><P style=" padding-left : 30px; ">&nbsp;</P><P style=" padding-left : 30px; ">To display the descriptions, enable ID and Description or Description under Presentations.</P><P style=" padding-left : 30px; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure6: Analytical Model Preview - Retrieving the description via Direct text association." style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/110100i0A364203D28F7868/image-size/large?v=v2&amp;px=999" role="button" title="Figure6.jpg" alt="Figure6: Analytical Model Preview - Retrieving the description via Direct text association." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure6: Analytical Model Preview - Retrieving the description via Direct text association.</span></span></P>
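<P style=" padding-left : 30px; ">The following sketch only illustrates the idea behind this lookup; it is not how SAP Datasphere implements it internally. Reusing the simplified text-table shape from the earlier sketch, it resolves a description for the current Data Access Language and, purely for illustration, falls back to the plain ID when no text is maintained in that language.</P><PRE>
# Conceptual sketch only (not SAP Datasphere internals): resolve a description
# from a language-dependent text table based on the Data Access Language.
def resolve_description(entity_id, data_access_language, text_rows, id_column):
    """Return the description of entity_id in the requested language,
    falling back to the plain ID when no text is maintained (illustration only)."""
    for row in text_rows:
        if row[id_column] == entity_id and row["language"] == data_access_language:
            return row["description"]
    return entity_id

# Hypothetical rows, matching the earlier text-table sketch.
customer_texts = [
    {"customer_id": "C01", "language": "en", "description": "Retail customer"},
    {"customer_id": "C01", "language": "de", "description": "Einzelhandelskunde"},
]
print(resolve_description("C01", "de", customer_texts, "customer_id"))  # Einzelhandelskunde
print(resolve_description("C01", "fr", customer_texts, "customer_id"))  # C01
</PRE>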
<P>3. <STRONG>Dimension Association</STRONG></P><P style=" padding-left : 30px; "><STRONG>Purchase Date</STRONG> is associated with the Time Dimension – Day.</P><P style=" padding-left : 30px; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure7: Purchase Date – Time Dimension - Day Association" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/110101i68BB19C88A4B6C96/image-size/large?v=v2&amp;px=999" role="button" title="Figure7.jpg" alt="Figure7: Purchase Date – Time Dimension - Day Association" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure7: Purchase Date – Time Dimension - Day Association</span></span></P><P style=" padding-left : 30px; ">&nbsp;</P><P style=" padding-left : 30px; "><FONT size="2"><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure8: Time Dimension – Day. The associated text tables are highlighted." style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/110103iFDCC254C3770BEAB/image-size/large?v=v2&amp;px=999" role="button" title="Figure8.jpg" alt="Figure8: Time Dimension – Day. The associated text tables are highlighted." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure8: Time Dimension – Day. The associated text tables are highlighted.</span></span></FONT></P><P style=" padding-left : 30px; ">Since the Day dimension has a text association for months, the month descriptions are fetched accordingly (see Figure10), based on the Data Access Language settings.</P><P style=" padding-left : 30px; ">Select the hierarchy as shown below and enable the display of ID and Description under Presentations.</P><P style=" padding-left : 30px; "><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure9: Hierarchy Selection" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/110104iC137DB1237792535/image-size/medium?v=v2&amp;px=400" role="button" title="Figure9.jpg" alt="Figure9: Hierarchy Selection" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure9: Hierarchy Selection</span></span></P><P style=" padding-left : 30px; ">&nbsp;</P><P style=" padding-left : 30px; "><FONT size="2"><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Figure10: Analytical Model Preview - Retrieving the description via Dimension text association." style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/110105i0211A1F79C1DB414/image-size/large?v=v2&amp;px=999" role="button" title="Figure10.jpg" alt="Figure10: Analytical Model Preview - Retrieving the description via Dimension text association." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Figure10: Analytical Model Preview - Retrieving the description via Dimension text association.</span></span></FONT></P><P>4. <STRONG>Hierarchy Association</STRONG></P><P style=" padding-left : 30px; ">SAP Datasphere has released external hierarchies with directory support. The language-dependent texts for the hierarchy label and the nodes can be retrieved here as well. This topic is already covered in <A href="https://community.sap.com/t5/technology-blogs-by-sap/an-introduction-to-hierarchy-with-directory-in-sap-datasphere/ba-p/13575573" target="_blank">An Introduction to Hierarchy with Directory in SAP Datasphere</A>.</P><P><STRONG>Summary</STRONG></P><P>Text tables are the core of master data translation. When importing master data, note that if you use ABAP language codes such as E or D (rather than i18n language codes such as en or de), the string length of the language column must be set to 1 to ensure that the master data is translated correctly. Clean data without duplicate texts is also recommended for effective translation.</P><P style=" padding-left : 30px; ">&nbsp;</P> 2024-05-16T10:22:57.119000+02:00