https://raw.githubusercontent.com/ajmaradiaga/feeds/main/scmt/topics/Big-Data-blog-posts.xml SAP Community - Big Data 2026-02-20T12:10:15.498804+00:00 python-feedgen Big Data blog posts in SAP Community https://community.sap.com/t5/technology-blog-posts-by-sap/5-steps-to-a-business-data-fabric/ba-p/13580358 5 Steps to a Business Data Fabric 2023-10-17T19:31:56+02:00 SavannahVoll https://community.sap.com/t5/user/viewprofilepage/user-id/13466 <P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/10/291513_GettyImages-1146500457_small.jpg" /></P>
<P>Managing and leveraging data can be a daunting task. Businesses grapple with complex datasets from many different and unconnected sources, including operations, finance, marketing, customer success, and more. Many organizations are also geographically dispersed and have complicated use cases or specific needs, such as storing data across cloud, hybrid, multi-cloud, and on-premises environments.</P>
<P>A <A href="https://blogs.sap.com/2023/09/15/what-is-a-business-data-fabric/" target="_blank" rel="noopener noreferrer">business data fabric</A> offers a solution.</P>
<P>This data management architecture provides an integrated, semantically rich data layer over underlying data landscapes to deliver scalable access to data without duplication. In other data platforms, when data is extracted from core systems, much of its original context is lost. A business data fabric preserves this context, helping ensure the data remains meaningful and relevant for decision-making, regardless of its origin.</P>
<P>This approach offers a number of <A href="https://blogs.sap.com/2023/09/18/3-benefits-of-a-business-data-fabric/" target="_blank" rel="noopener noreferrer">benefits</A>, including enhanced data accessibility, improved data governance, and accelerated insights.</P>
<P>But how do you get started?</P>
<H2><STRONG>Implementation framework</STRONG></H2>
<P>Let's look at a high-level framework for implementing a business data fabric architecture in your organization.</P>
<H3><STRONG>1. Data Ingestion</STRONG></H3>
<P>The first step is to ensure that all your data, whether structured or unstructured, can be easily ingested into the system. A business data fabric, with its open data ecosystem, allows for simple data ingestion, regardless of the source or format of the data.</P>
<H3><STRONG>2. Data Integration</STRONG></H3>
<P>Data from various sources must be integrated and transformed into a unified format easily consumed by data users. The interoperability of a business data fabric enables data from different sources to be combined and connected rather than moved around.</P>
<H3><STRONG>3. Data Governance</STRONG></H3>
<P>With the growing complexity and volume of data, governance becomes an increasingly important topic. This includes ensuring data quality, privacy, and compliance with various regulations.
A business data fabric ensures effective governance by maintaining metadata, lineage, and control measures.</P>
<H3><STRONG>4. Data Cataloging</STRONG></H3>
<P>This involves creating an inventory of data assets and their metadata. The catalog serves as a single source of truth for users to find, understand, and trust the data they need. It's a critical component of the business data fabric that allows data consumers to understand the business semantics.</P>
<H3><STRONG>5. Data Consumption</STRONG></H3>
<P>This is about delivering the right data, in the right format, at the right time, to the right people. The business data fabric supports data federation, which enables unified and consistent access to data across diverse sources, reducing redundancy. It ensures data is presented in business-friendly terms and contexts, making it simple for data consumers to interpret and use the data for their specific use cases.</P>
<P>This is where SAP provides the foundation for a business data fabric: SAP Datasphere.</P>
<H2><STRONG>Transform your organization with SAP Datasphere</STRONG></H2>
<P><A href="https://www.sap.com/canada/products/technology-platform/datasphere.html" target="_blank" rel="noopener noreferrer">SAP Datasphere</A> is a comprehensive data service that enables users to provide seamless and scalable access to mission-critical business data.</P>
<P>It makes it easy for organizations to deliver meaningful data to every data consumer with business context and logic intact. Because organizations need accurate data that is quickly available and described in business-friendly terms, this approach lets data professionals carry the clarity that business semantics provide into every use case.</P>
<P>In a major moment for our industry and customers, SAP is partnering with other open data partners — <A href="https://www.databricks.com/" target="_blank" rel="nofollow noopener noreferrer">Databricks</A>, <A href="https://collibra.com/sap-partnership" target="_blank" rel="nofollow noopener noreferrer">Collibra</A>, <A href="https://www.confluent.io/" target="_blank" rel="nofollow noopener noreferrer">Confluent</A>, <A href="https://www.datarobot.com/blog/datarobot-and-sap-partner-to-deliver-joint-enterprise-ai-solution/" target="_blank" rel="nofollow noopener noreferrer">DataRobot</A>, and <A href="https://discover.sap.com/google/en-us/index.html" target="_blank" rel="noopener noreferrer">Google Cloud</A> — to radically simplify customers' data landscapes. By closely integrating these data and AI platforms with SAP Datasphere, organizations can access their mission-critical business data across any cloud infrastructure.</P>
<P>SAP Datasphere and its open data ecosystem are the technology foundation that enables a <A href="https://news.sap.com/2023/03/sap-datasphere-power-of-business-data/" target="_blank" rel="noopener noreferrer">business data fabric</A>.</P>
<H2><STRONG>Learn more</STRONG></H2>
<P>Read the new <A href="https://www.sap.com/documents/2023/10/4675c6a3-927e-0010-bca6-c68f7e60039b.html" target="_blank" rel="noopener noreferrer">e-book</A> to learn more about the practical applications of a business data fabric, including:</P>
<UL>
<LI>Why you need a business data fabric</LI>
<LI>How to implement a business data fabric</LI>
<LI>Five business data fabric use cases</LI>
</UL>
noreferrer"><SPAN data-contrast="none">Five Steps to a Business Data Fabric Architecture</SPAN></A><SPAN data-contrast="auto"> e-book today.&nbsp;&nbsp;</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:1,&quot;335551620&quot;:1,&quot;335559685&quot;:0,&quot;335559737&quot;:0,&quot;335559738&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN><BR /> <P style="overflow: hidden;margin-bottom: 0px"><A href="https://www.sap.com/documents/2023/10/4675c6a3-927e-0010-bca6-c68f7e60039b.html" target="_blank" rel="noopener noreferrer"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/10/Five-Steps-to-a-BDF_Paid_1200x627_V2.png" /></A></P><BR /> &nbsp;<BR /> <BR /> &nbsp; 2023-10-17T19:31:56+02:00 https://community.sap.com/t5/technology-blog-posts-by-sap/why-business-data-is-fundamental-to-artificial-intelligence/ba-p/13572569 Why Business Data is Fundamental to Artificial Intelligence 2023-10-30T23:09:56+01:00 i032821 https://community.sap.com/t5/user/viewprofilepage/user-id/148569 The introduction of cloud computing has enabled organisations all over the world to store vast amounts of data in a cost-effective way as they digitally transform their business operations.&nbsp; Data has commonly been referred to as the 'new oil' and is where companies are looking to help increase their productivity going forward.&nbsp; However, in order to harness this technology the data needs to be relevant, reliable and responsible.&nbsp; As the adage goes, garbage in, garbage out.<BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/10/Image-01.png" /></P><BR /> &nbsp;<BR /> <BR /> AI serves as a powerful tool for extracting actionable insights from the vast amount of reliable data generated and stored within SAP systems.&nbsp; Combining AI with SAP BTP, advanced data analytics and machine learning algorithms becomes possible, allowing organisations to tap into the potential of their SAP data along with the AI technologies available in the market.<BR /> <BR /> Five pillars support SAP BTP: App development, automation, integration, data analytics, and AI. These pillars interplay with embedded intelligent technologies like situation handling, machine learning, and analytics, all fully integrated within the SAP S/4HANA Cloud. Furthermore, side-by-side capabilities through SAP BTP offer additional intelligent industry functionalities like Intelligent Situation Automation, SAP Build Process Automation, and chatbot technology.<BR /> <BR /> By harnessing artificial intelligence, SAP BTP combines business data from S4 with external data, enabling the creation of increasingly precise models in real-time.&nbsp; This ensures a versatile and agile platform that propels innovation while retaining a clean digital core. 
This demonstrates a shift from traditional systems of record to systems of intelligence.<BR /> <BR /> &nbsp;<BR /> <BR /> <STRONG>AI-Powered Capabilities&nbsp;</STRONG><BR /> <BR /> SAP has embedded AI into its products for many years, from journal reconciliations in S4 to AI-powered writing assistants aimed to streamline HR-related tasks in Success Factors.<BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/10/Image-02.png" /></P><BR /> &nbsp;<BR /> <BR /> These innovative functionalities are not merely theoretical but are practically applicable, ensuring HR admins, managers, and employees can operate more efficiently.&nbsp; In fact, these innovations exist across all the functions in your SAP landscape such as procurement, finance, and human resources.<BR /> <BR /> &nbsp;<BR /> <BR /> <STRONG>How to get started?</STRONG><BR /> <BR /> Automation is pivotal for managing manual and repetitive tasks, especially those involving the consolidation and manipulation of data from diverse sources like MS Excel, vendor portals, and SAP systems. High-volume processes, often exceeding 1000 steps a day—such as data migrations and approvals—and those requiring access to multiple applications, can be streamlined, ensuring seamless operation across your SAP environment.<BR /> <BR /> SAP has provided templates across all the business functions to accelerate these initiatives, as shown below.<BR /> <BR /> &nbsp;<BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/10/Image-03.png" /></P><BR /> &nbsp;<BR /> <BR /> <STRONG>A More Advanced AI Use Case</STRONG><BR /> <BR /> The transition from a rules-based approach to an AI-empowered, data-driven model is illustrated through an example case study of an Australian customer.&nbsp; A decade ago, an employee scripted manual “if:then” statements for road upgrades; a process that has now been revolutionised by AI. AI can now analyse these rules and infuse them with real-time data like weather, road usage, and vehicle types. As a part of their operations, this customer assesses road conditions using specialised trucks called profilometers, generating colossal data volumes that outpace their storage capacities. SAP BTP, however, can house this data in expansive lakes, giving AI the agility to model exponentially precise “if:then” statements.<BR /> <BR /> &nbsp;<BR /> <BR /> The shift will allow this customer to manage large datasets from disparate sources seamlessly, scaling memory and compute capabilities to handle big data without losing granularity. 
Moreover, unlike fixed rules, the AI algorithms continually evolve based on data, thereby ensuring maintenance and road upgrade strategies that are timely, relevant, and efficient.<BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/10/Image-05-2.png" /></P><BR /> In the realm of road maintenance, AI’s practical application is manifest, where even a small percentage improvement can result in significant savings for this customer.&nbsp; This financial efficacy, combined with the potential to extend the useful life of assets, underscores the tangible, impactful benefits of combining AI with SAP BTP.<BR /> <BR /> <STRONG>In Summation</STRONG><BR /> <BR /> Early AI integration can offer businesses a decisive advantage. SAP’s AI vision isn’t just about pioneering technology; it's about tangible, real-world applications. From simple tools deployable within days to intricate endeavours with broad impact.<BR /> <BR /> If you’d like to find out about the value AI can bring to businesses through automation and explore other use cases, then visit the <A href="https://www.sap.com/australia/products/artificial-intelligence.html" target="_blank" rel="noopener noreferrer"><STRONG>SAP Business AI</STRONG></A> website. 2023-10-30T23:09:56+01:00 https://community.sap.com/t5/technology-blog-posts-by-sap/sap-and-dxc-team-plan-to-deliver-rise-with-sap-s-4hana-cloud-in-customer/ba-p/13578185 SAP and DXC team plan to deliver RISE with SAP S/4HANA Cloud, in customer data centers and co-location facilities, creating a new and powerful platform for digital transformation 2023-11-06T18:39:34+01:00 j_zarb https://community.sap.com/t5/user/viewprofilepage/user-id/631199 SAP and DXC aim to deliver RISE with SAP S/4HANA Cloud, private edition, customer data center option as a turn-key service delivered by DXC.&nbsp; The new service is ideally suited for Private Cloud customers and other managed services customers who wish to run SAP either from their own data center or a DXC managed data center and get the transformational benefits of RISE with SAP.<BR /> <BR /> In this blog, DXC reaffirms its commitment as a Partner Managed Cloud (PMC) service provider of RISE with SAP by expanding its distinct deployment capabilities already announced with DXC Hyperscaler solutions to support the customer data center option.<BR /> <BR /> This partnership empowers DXC and SAP to offer a comprehensive catalog of managed services and extraordinary opportunities that surpass what each entity could achieve independently, benefiting our mutual customers.<BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/DXC.png" /></P><BR /> <STRONG>RISE with SAP S/4HANA Cloud, private edition, customer data center option</STRONG> or “CDC” represents the hybridization of SAP’s strategic RISE cloud solution, where a customer can run S/4HANA as a cloud service from their data center; while accessing SAP BTP and SAP Signavio, and all the other innovative RISE components from the public cloud – aka a mixture of internal and external cloud services.&nbsp; For more information about CDC, please visit this dedicated web page: &nbsp;<A href="https://www.sap.com/products/erp/rise/customer-data-center.html" target="_blank" rel="noopener noreferrer">Customer data center option | RISE with SAP</A><BR /> <BR /> Similar to SAP’s cloud solutions, CDC provides 
<P>Similar to SAP's cloud solutions, CDC provides diverse deployment options on Lenovo, HPE, and Dell infrastructure. DXC enhances this offering further by enabling delivery of SAP and non-SAP Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS), all backed by a top service-level agreement.</P>
<P><STRONG>With DXC, SAP's strategic RISE cloud offering deployable in customer data centers</STRONG> is specifically designed to:</P>
<OL>
<LI>Enhance RISE with SAP by harnessing DXC's expertise in delivering SAP and non-SAP managed services, enabling mutual customers to become more agile and alleviating the challenges often associated with digital transformation projects. This is achieved through best-of-breed solutions and a wealth of experience and skills.</LI>
<LI>Establish a secure infrastructure and platform for SAP within the customer's data center, backed, managed, and designed to perform by SAP and DXC.</LI>
<LI>Empower customers who want to run in their own data center (on-premise) while staying aligned with SAP's cloud innovation agenda, including ML/AI (machine learning / artificial intelligence), LLMs (large language models), etc.</LI>
<LI>Meet the needs of industries with strict regulatory compliance requirements (utilities, public sector, healthcare, pharmaceuticals, aerospace &amp; defense, etc.) that may prevent running SAP in a shared public hyperscaler.</LI>
<LI>Provide extended services to address specific data sovereignty needs of customers, governments, and industry stakeholders by keeping sensitive data within national boundaries, governed by local laws.</LI>
<LI>Offer an innovative approach for those seeking a cloud OpEx model while benefiting from a high-performance, dedicated on-premise system with minimal latency.</LI>
<LI>Benefit from a dedicated on-premise setup without the customer having to manage the data center environment.</LI>
</OL>
<P>To learn more about the DXC &amp; SAP cloud solutions and the DXC Premier Services for RISE with SAP, please visit the following link: <A href="https://dxc.com/us/en/offerings/applications/eas-sap/dxc-premier-services-for-rise-with-sap" target="_blank" rel="nofollow noopener noreferrer">DXC Premier Services for RISE with SAP</A></P>
<P>All thoughts and questions are welcome; please share your comments below to contribute to this discussion.</P>
<P>Joseph Zarb<BR />Head of RISE with SAP – Customer Data Center<BR />SAP RISE Global GTM Execution<BR />10 Hudson Yards, 51st Floor, New York NY 10001 USA<BR />j.zarb@sap.com</P> 2023-11-06T18:39:34+01:00 https://community.sap.com/t5/technology-blog-posts-by-members/how-to-connect-sap-ecc-s-4hana-on-prem-and-private-cloud-with-confluent/ba-p/13575214 How to connect SAP ECC + S/4HANA on-prem and private cloud with Confluent Cloud or Confluent Platform (1) 2023-12-01T15:27:08+01:00 FlorianFarr https://community.sap.com/t5/user/viewprofilepage/user-id/176163 <P>This blog post explains how to connect SAP ECC or S/4HANA on-prem and private cloud editions with Confluent Cloud or Confluent Platform. Whether you're using the SAP NetWeaver Event-enablement Add-on or the ASAPIO Integration Add-on, this step-by-step guide provides all you need as a first step to enable SAP systems for communication with a Confluent broker.</P>
<H2>Architecture / Connection types</H2>
<P>When connecting SAP with Confluent, it is important to understand that there are two very different approaches in terms of connection architecture.</P>
<H3>Using a REST proxy</H3>
<P>"Confluent REST Proxy for Kafka" or a similar product is currently the standard approach and can therefore be considered mandatory for this connectivity. Please see <A href="https://github.com/confluentinc/kafka-rest" target="_blank" rel="nofollow noopener noreferrer">https://github.com/confluentinc/kafka-rest</A> for details.</P>
<P>The reason for using REST instead of AMQP is that SAP ECC does not support streaming protocols, and third-party libraries cannot be used in SAP-certified add-ons.</P>
<P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/aia_confluent_architecture_2024.png" height="334" width="261" /></P>
<P class="image_caption" style="text-align: center;font-style: italic">SAP-to-Confluent Architecture</P>
<H3>Direct connect/V3</H3>
<P>A connector for Confluent Cloud direct connect (V3 REST API) is available as a pre-release in the download section for registered ASAPIO customers. This connector can only handle outbound connectivity at the time of this blog being published (November 2023). For inbound connectivity, the REST proxy approach above is still required.</P>
<H3>AMQP support</H3>
<P>AMQP support is planned to be released by ASAPIO in 2024, for S/4HANA systems.</P>
<H2>System prerequisites</H2>
<P>Before diving into the integration process, make sure you have the following components available:</P>
<H3>Software components required on your SAP system</H3>
<UL>
<LI>SAP NetWeaver Event-enablement Add-on (SAP Event Mesh edition)</LI>
<LI>or, alternatively, ASAPIO Integration Add-on – Framework (full version)</LI>
<LI>ASAPIO Connector for Confluent (using the REST proxy)</LI>
</UL>
<P>For the direct connect/V3 REST API, a pre-release Connector for Confluent Cloud is available for registered ASAPIO customers.</P>
<H3>Non-SAP components</H3>
<P>Please make sure you have the endpoint URI and authorization data at hand for the Confluent components:</P>
<UL>
<LI>Confluent REST Proxy for Kafka (<A href="https://docs.confluent.io/platform/current/kafka-rest/index.html" target="_blank" rel="nofollow noopener noreferrer">more info</A>)</LI>
<LI>Confluent Cloud</LI>
<LI>or, alternatively, Confluent Platform</LI>
</UL>
<P><STRONG>Licensing</STRONG></P>
<P>All software above requires the purchase of appropriate licenses.</P>
<H2>Set up connectivity</H2>
<H3>1. Create RFC destinations to the Confluent REST proxy</H3>
<P>Transaction: <CODE>SM59</CODE></P>
<P>Type: "G" (HTTP Connection to External Server)</P>
<P>Target Host: endpoint of the Confluent REST proxy</P>
<P>Save and perform a "Connection Test" to ensure HTTP status code 200.</P>
<P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/confluent_blog1.png" height="344" width="331" /></P>
<H3>2. Set up authentication to the REST proxy</H3>
<P>Prerequisites: obtain a user and password for the REST proxy, or exchange certificates with the SAP system.</P>
<P>Transaction: <CODE>SM59</CODE></P>
<P>Choose the correct RFC destination, go to "Logon &amp; Security," and select the authentication method:</P>
<UL>
<LI>"Basic Authentication" with username and password</LI>
<LI>SSL certificate-based authentication</LI>
</UL>
<P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/confluent_blog2.png" height="243" width="422" /></P>
<H3>3. Set up basic settings</H3>
<P>Activate BC sets: use <CODE>SCPR20</CODE> to activate the BC sets for the cloud adapter and codepages.</P>
<P>Configure the cloud adapter: in <CODE>SPRO</CODE>, go to ASAPIO Cloud Integrator, Maintain Cloud Adapter, and add an entry for the Confluent connector.</P>
<H3>4. Set up the connection instance</H3>
<P>Transaction: <CODE>SPRO</CODE> or <CODE>/ASADEV/68000202</CODE></P>
<P>Add a new entry specifying the connection details, RFC destination, ISO code, and cloud type.</P>
<H3>5. Set up error type mapping</H3>
<P>Create an entry mapping response codes to message types.</P>
<P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/confluent_blog5-1.png" height="243" width="492" /></P>
<H3>6. Set up connection values</H3>
<P>Maintain the default values for the connection to Confluent in <CODE>Connections -&gt; Default values</CODE>.</P>
<TABLE>
<TBODY>
<TR><TH>Default Attribute</TH><TH>Default Attribute Value</TH></TR>
<TR><TD>KAFKA_ACCEPT</TD><TD>application/vnd.kafka.v2+json</TD></TR>
<TR><TD>KAFKA_CALL_METHOD</TD><TD>POST</TD></TR>
<TR><TD>KAFKA_CONTENT_TYPE</TD><TD>application/vnd.kafka.json.v2+json<BR />(or application/vnd.kafka.jsonschema.v2+json)</TD></TR>
</TBODY>
</TABLE>
<P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/confluent_blog6.png" height="194" width="490" /></P>
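<P>For illustration, the following minimal Python sketch reproduces the kind of HTTP request that results from these connection values: a produce call against Confluent REST Proxy's v2 API. It is not part of the add-on or the original post; the endpoint, credentials, topic name, and payload are hypothetical placeholders.</P>
<PRE><CODE>
import requests

REST_PROXY = "https://rest-proxy.example.com"  # SM59 target host (placeholder)
TOPIC = "sap-salesorder-events"                # hypothetical topic name

# One record in the "json" embedded format; key and value are free-form JSON.
body = {
    "records": [
        {"key": "0000004711",
         "value": {"VBELN": "0000004711", "EVENT": "CHANGED"}}
    ]
}

resp = requests.post(
    f"{REST_PROXY}/topics/{TOPIC}",
    json=body,
    headers={
        # These mirror KAFKA_CONTENT_TYPE and KAFKA_ACCEPT from the table above.
        "Content-Type": "application/vnd.kafka.json.v2+json",
        "Accept": "application/vnd.kafka.v2+json",
    },
    auth=("rest-proxy-user", "secret"),  # basic auth, as configured in SM59
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # partition/offset information for the produced record
</CODE></PRE>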
<H2>Set up outbound messaging</H2>
<H3>1. Create a message type</H3>
<P>Transaction: <CODE>WE81</CODE></P>
<P>Add a new entry specifying a unique name and description for the integration.</P>
<P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/confluent_blog7.png" height="70" width="490" /></P>
<H3>2. Activate the message type</H3>
<P>Transaction: <CODE>BD50</CODE></P>
<P>Activate the created message type.</P>
<P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/confluent_blog8.png" height="76" width="283" /></P>
<H3>3. Set up additional settings in "Header Attributes"</H3>
<P>Configure the topic, the fields for the key, and the schema IDs for the key/value schemas.</P>
<P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/confluent_blog9.png" height="152" width="529" /></P>
<H3>4. Set up "Business Object Event Linkage"</H3>
<P>Link the configuration of the outbound object to a Business Object event.</P>
<H2>Send a "Simple Notifications" event for testing</H2>
<H3>1. Create the outbound object configuration</H3>
<P>Transaction: <CODE>SPRO</CODE> or <CODE>/ASADEV/68000202</CODE></P>
<P>Select the created connection and go to Outbound Objects. Add a new entry specifying the object, extraction function module, message type, load type, and response function.</P>
<H3>2. Test outbound event creation</H3>
<P>In the example above, pick any test sales order in transaction <CODE>/nVA02</CODE> and force a change event, e.g., by changing the requested delivery date at header level.</P>
<H3>3. Check the monitor transaction for the actual message and payload</H3>
<P>To access the monitor application, the user must have the PFCG role <CODE>/ASADEV/ACI_ADMIN_ROLE</CODE>.</P>
<P>Use transaction <CODE>/n/ASADEV/ACI_MONITOR</CODE> to start the monitor. You will see the entry screen with a selection form on top.</P>
<P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/confluent_blog10.png" /></P>
<P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/confluent_blog12.png" /></P>
<P><STRONG>Congrats, you are now able to send data out to Confluent.</STRONG></P>
<P>In the next blog, we will create a custom payload for the event.</P>
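<P>The post verifies the message in the add-on monitor on the SAP side. If you also want to confirm that the test event arrived in the topic on the Confluent side, here is a minimal sketch (not from the original post) using the REST proxy's v2 consumer API; the endpoint, group, and topic names are placeholders.</P>
<PRE><CODE>
import requests

REST_PROXY = "https://rest-proxy.example.com"   # placeholder
V2 = {"Content-Type": "application/vnd.kafka.v2+json"}

# 1. Create a consumer instance in a throw-away consumer group.
inst = requests.post(
    f"{REST_PROXY}/consumers/sap-event-check",
    json={"name": "check-1", "format": "json", "auto.offset.reset": "earliest"},
    headers=V2, timeout=30,
).json()
base_uri = inst["base_uri"]

# 2. Subscribe it to the topic configured in "Header Attributes".
requests.post(f"{base_uri}/subscription",
              json={"topics": ["sap-salesorder-events"]},
              headers=V2, timeout=30)

# 3. Fetch records (may need more than one poll before data shows up).
records = requests.get(
    f"{base_uri}/records",
    headers={"Accept": "application/vnd.kafka.json.v2+json"},
    timeout=30,
).json()
for rec in records:
    print(rec["key"], rec["value"])

# 4. Delete the consumer instance when done.
requests.delete(base_uri, headers=V2, timeout=30)
</CODE></PRE>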
 2023-12-01T15:27:08+01:00 https://community.sap.com/t5/technology-blog-posts-by-members/event-driven-architecture-simplifying-payload-creation-with-payload/ba-p/13577526 Event-driven architecture: Simplifying Payload Creation with Payload Designer 2023-12-11T08:42:28+01:00 Benedikt_Sprung https://community.sap.com/t5/user/viewprofilepage/user-id/661488 <H2><STRONG>Overview</STRONG></H2>
<P>Configuring payloads for the SAP NetWeaver Add-On for Event Enablement has become remarkably straightforward, thanks to the Payload Designer. This powerful tool lets you effortlessly add tables, define relationships with inner joins and left outer joins, and rename tables and fields, all through simple configuration. In this blog post, we will guide you through the process of configuring your payload in just a few easy steps.</P>
<P>If you are new to SAP Enterprise Messaging in SAP ERP systems and the Integration Add-on, have a look at the following blog posts:</P>
<UL>
<LI><A title="Event-driven architecture – now available for SAP ECC users" href="https://blogs.sap.com/?p=1132020" target="_blank" rel="noopener noreferrer">Event-driven architecture – now available for SAP ECC users</A></LI>
<LI><A title="SAP Enterprise Messaging for SAP ERP: HowTo-Guide (Part 1 - Connectivity)" href="https://blogs.sap.com/?p=1185933" target="_blank" rel="noopener noreferrer">SAP Enterprise Messaging for SAP ERP: HowTo-Guide (Part 1 - Connectivity)</A></LI>
<LI><A title="SAP Enterprise Messaging for SAP ERP: HowTo-Guide (Part 2 - First use case)" href="https://blogs.sap.com/?p=1179612" target="_blank" rel="noopener noreferrer">SAP Enterprise Messaging for SAP ERP: HowTo-Guide (Part 2 - First use case)</A></LI>
<LI><A title="Data Events scenario With SAP Event Enablement Add-on for SAP S/4HANA, SAP Event Mesh and SAP Cloud Integration: Step-by-Step Guide" href="https://blogs.sap.com/2022/03/04/data-events-scenario-with-sap-event-enablement-add-on-for-sap-s-4hana-sap-event-mesh-and-sap-cloud-integration-step-by-step-guide/" target="_blank" rel="noopener noreferrer">Data Events scenario With SAP Event Enablement Add-on for SAP S/4HANA, SAP Event Mesh and SAP Cloud Integration: Step-by-Step Guide</A></LI>
<LI><A href="https://blogs.sap.com/2021/08/13/emit-data-events-from-sap-s-4hana-or-sap-ecc-through-sap-netweaver-add-on-for-event-enablement/" target="_blank" rel="noopener noreferrer">Emit Data Events from SAP S/4HANA or SAP ECC through SAP NetWeaver Add-On for Event Enablement</A></LI>
<LI><A href="https://blogs.sap.com/2023/12/01/how-to-connect-sap-ecc-s-4hana-on-prem-and-private-cloud-with-confluent-cloud-or-confluent-platform-1/" target="_blank" rel="noopener noreferrer">How to connect SAP ECC + S/4HANA on-prem and private cloud with Confluent Cloud or Confluent Platform (1)</A></LI>
</UL>
<H2><STRONG>System Prerequisites</STRONG></H2>
<P>One of the following software components needs to be available on your system:</P>
<UL>
<LI>SAP NetWeaver Event-enablement Add-on (SAP Event Mesh edition)</LI>
<LI>or, alternatively, ASAPIO Integration Add-on – Framework (full version)</LI>
</UL>
<P><STRONG>Licensing</STRONG></P>
<P>All software above requires the purchase of appropriate licenses.</P>
<H2><STRONG>Creating Custom Payloads with Payload Designer</STRONG></H2>
<H3><STRONG>Step 1: Create the payload in Payload Designer</STRONG></H3>
<P>1. Navigate to transaction <CODE>/n/ASADEV/DESIGN</CODE>. Click the "Create Payload Designer" button on the main screen and fill in the necessary fields. This action creates the initial version of the payload and takes you to the main screen.</P>
<P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/1-41.png" height="365" width="468" /></P>
<P class="image_caption" style="text-align: center;font-style: italic">Creating a Payload Designer version in transaction /n/ASADEV/DESIGN</P>
<P>2. Use the join builder to establish table joins:</P>
<UL>
<LI>Insert new tables or custom views.</LI>
<LI>Adjust table joins through field connections.</LI>
<LI>Return to the main screen.</LI>
<LI>Note: parent relationships in the Table section and key fields in the Field section are automatically determined based on hierarchical sorting.</LI>
</UL>
<P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/10/PD__join.png" /></P>
<P>3. Add additional payload fields from the tables:</P>
<UL>
<LI>Double-click on the preferred table.</LI>
<LI>Select one or multiple fields.</LI>
<LI>Fields can be reordered using sequence numbers.</LI>
</UL>
<P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/10/2-85.png" /></P>
<H3><STRONG>Step 2: Outbound configuration using Payload Designer</STRONG></H3>
<P>To configure outbound objects using Payload Designer, follow these steps (a sketch of a possible resulting payload follows the screenshot below):</P>
<OL>
<LI>Access transaction <CODE>SPRO</CODE>.</LI>
<LI>Navigate to IMG &gt; Cloud Integrator – Connection and Replication Object Customizing, or directly to transaction <CODE>/ASADEV/68000202</CODE>.</LI>
<LI>Select the created connection.</LI>
<LI>Go to the "Outbound Objects" section.</LI>
<LI>Add a new entry and specify the following:
<UL>
<LI>Object: name of the outbound configuration.</LI>
<LI>Extraction Func. Module: /ASADEV/ACI_GEN_PDVIEW_EXTRACT.</LI>
<LI>Load Type: Incremental Load.</LI>
<LI>Trace: activate for testing purposes.</LI>
<LI>Formatting Func.: /ASADEV/ACI_GEN_VIEW_FORM_CB.</LI>
<LI>Field Payload View Name: payload name.</LI>
<LI>Field Payload View Version: payload version.</LI>
</UL>
</LI>
</OL>
<P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/3-12.png" height="396" width="629" /></P>
<H3><STRONG>Step 3: See your payload in the ACI Monitor</STRONG></H3>
<P>Navigate to transaction <CODE>/n/ASADEV/ACI_MONITOR</CODE>.</P>
<P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/10/ACI_Monitor.png" /></P>
<H2><STRONG>Conclusion</STRONG></H2>
<P>Payload Designer simplifies SAP interface configuration by providing a user-friendly, code-free approach to defining payloads. With its intuitive interface and powerful features, it enables organizations to streamline their data integration processes and improve efficiency in managing payloads for event messages sent to SAP Event Mesh.</P> 2023-12-11T08:42:28+01:00 https://community.sap.com/t5/technology-blog-posts-by-sap/sap-hana-cloud-data-lake-files-%E3%81%B8%E3%81%AE%E6%9C%80%E5%88%9D%E3%81%AE%E3%82%A2%E3%82%AF%E3%82%BB%E3%82%B9%E8%A8%AD%E5%AE%9A/ba-p/13574382 SAP HANA Cloud, data lake Files への最初のアクセス設定 [Setting up initial access to SAP HANA Cloud, data lake Files] 2023-12-19T07:37:51+01:00 Sawa_Ito https://community.sap.com/t5/user/viewprofilepage/user-id/7449 <P>This blog was originally published on the official SAP Japan blog on November 15, 2022, and was moved here when that blog was closed.</P>
<HR />
<P>This is a translation of the blog "<A href="https://blogs.sap.com/2021/08/05/setting-up-initial-access-to-hana-cloud-data-lake-files/" target="_blank" rel="noopener noreferrer"><STRONG>Setting Up Initial Access to HANA Cloud data lake Files</STRONG></A>" (August 5, 2021) written by jason.hinsperger. For the latest information, please refer to the <A href="https://blogs.sap.com/tags/7efde293-f35d-4737-b40f-756b6a798216/" target="_blank" rel="noopener noreferrer">latest blogs in SAP Community</A> and the <A href="https://help.sap.com/docs/SAP_HANA_DATA_LAKE?locale=en-US" target="_blank" rel="noopener noreferrer">documentation</A>.</P>
<HR />
<P>SAP HANA Cloud, data lake supports storing data of any type and format in its native form. Managed file storage provides secure storage for files of any type, without having to set up storage in an external hyperscaler account. This is very convenient when you need to load data quickly into SAP HANA Cloud, data lake for fast SQL analytics, or when you need to extract data for some other purpose.</P>
<P>Setting up access to SAP HANA Cloud, data lake Files for the first time can be a slightly tricky process, especially if you come from a database background and are not familiar with object storage or REST APIs. Below is the process I used to test SAP HANA Cloud, data lake Files.</P>
<P>Because data lake Files manages user security and access via certificates, setting up user access requires generating signed certificates. If you do not have access to a certificate authority, you can use the following OpenSSL-based process to create a CA and a signed client certificate and update your data lake Files configuration. I have tested this many times, so you should be able to do the same.</P>
<P>First, you need to create and upload a CA bundle. You can generate a CA with the following OpenSSL command:</P>
<P><CODE>openssl genrsa -out ca.key 2048</CODE></P>
<P>Next, create the CA's public certificate (valid for 200 days in this case). Provide at least the common name, and fill in the other fields as needed:</P>
<P><CODE>openssl req -x509 -new -key ca.key -days 200 -out ca.crt</CODE></P>
<P>Then create a signing request for the client certificate. Again, provide at least the common name and fill in the other fields as needed:</P>
<P><CODE>openssl req -new -nodes -newkey rsa:2048 -out client.csr -keyout client.key</CODE></P>
<P>Finally, create the client certificate (valid for 100 days in this case):</P>
<P><CODE>openssl x509 -days 100 -req -in client.csr -CA ca.crt -CAkey ca.key -CAcreateserial -out client.crt</CODE></P>
<BLOCKQUOTE><STRONG>*<EM>Note</EM></STRONG><EM> – make sure the fields of the CA and client certificates are not all exactly the same. Otherwise the client certificate is treated as self-signed, and the verification below will fail.</EM></BLOCKQUOTE>
<P>To verify that the certificate was signed by your CA (and therefore can be used to validate the client certificate once the CA certificate has been uploaded to SAP HANA Cloud, data lake):</P>
<P><CODE>openssl verify -CAfile ca.crt client.crt</CODE></P>
<P>Next, open your instance in SAP HANA Cloud Central, select "Manage File Container", and configure the data lake Files user.</P>
<P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2021/08/HDLFiles_EditConfig.jpg" border="0" /></P>
<P>Edit the configuration and select "Add" in the "Trusts" section. Copy or upload the ca.crt generated earlier and click "Apply". Do not close the "Manage File Container" screen yet.</P>
<P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2021/08/HDLFiles_AddTrust.jpg" border="0" /></P>
<P>Now you can configure a user to access the managed file storage. Scroll down to the "Authorizations" section and select "Add". A new entry row appears.</P>
<P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2021/08/HDLFiles_AddUser-1.jpg" border="0" /></P>
<P>Select the user's role from the drop-down list (by default there are admin and user roles).</P>
<P>Here comes the slightly tricky part. You need to add a pattern string from the client certificate so that the storage gateway (the entry point to data lake Files) can determine which user to authorize when a request comes in. There are two options for generating the pattern string. You can generate it with the following OpenSSL command (omitting the "subject=" prefix that appears in the output):</P>
<P><CODE>openssl x509 -in client.crt -nameopt RFC2253 -subject -noout</CODE></P>
<P>Alternatively, you can use the "generate pattern" option on the screen. This opens a dialog box where you upload or paste the client certificate, and the pattern is generated automatically. Note that the certificate itself is not saved, only the pattern string.</P>
<P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2021/08/HDLFiles_GeneratePattern.jpg" border="0" /></P>
<P>Click "Apply" to add the pattern string to the authorization entry.</P>
<BLOCKQUOTE>Note that pattern strings can contain wildcards, so you can authorize a whole class of certificates for a particular role. If a certificate pattern matches multiple authorizations, the one used is controlled by the "Rank" value set on each authorization entry.</BLOCKQUOTE>
<P>You can now access and use SAP HANA Cloud, data lake Files via the REST API. Here is a sample curl command that worked in my tests, which you can use to validate that the connection is successful (the instance ID and the Files REST API endpoint can be copied from the instance details in SAP HANA Cloud Central). Use the client certificate and key generated above and used to create the authorization.</P>
<BLOCKQUOTE><EM>Note that curl can be a little tricky. I was testing on Windows and could not get the Windows 10 version of curl to work. I eventually downloaded a newer curl version (7.75.0), which worked, but since I could not figure out how to access the certificate store from curl on Windows, I had to use the '--insecure' option to skip verification of the SAP HANA Cloud server certificate.</EM></BLOCKQUOTE>
<P><CODE>curl --insecure -H "x-sap-filecontainer: &lt;instance_id&gt;" --cert ./client.crt --key ./client.key "https://&lt;Files REST API endpoint&gt;/webhdfs/v1/?op=LISTSTATUS" -X GET</CODE></P>
<P>The command above returns the following (for an empty data lake):</P>
<P><CODE>{"FileStatuses":{"FileStatus":[]}}</CODE></P>
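<P>For readers who prefer scripting the same check, here is a minimal Python sketch of the identical LISTSTATUS call using the requests library (not part of the original blog; the placeholders are the same as in the curl command):</P>
<PRE><CODE>
import requests

INSTANCE_ID = "&lt;instance_id&gt;"                    # from SAP HANA Cloud Central
ENDPOINT = "https://&lt;Files REST API endpoint&gt;"   # from the instance details

resp = requests.get(
    f"{ENDPOINT}/webhdfs/v1/?op=LISTSTATUS",
    headers={"x-sap-filecontainer": INSTANCE_ID},
    cert=("./client.crt", "./client.key"),  # client certificate and key from above
    verify=False,  # mirrors curl --insecure; use a proper CA bundle in production
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # {"FileStatuses": {"FileStatus": []}} on an empty data lake
</CODE></PRE>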
<P>This completes the setup for storing files of any type in SAP HANA Cloud using data lake Files. For the full set of REST APIs and parameters supported for managing files, see the <A href="https://help.sap.com/doc/9d084a41830f46d6904fd4c23cd4bbfa/QRC_2_2021/en-US/html/index.html" target="_blank" rel="noopener noreferrer">documentation</A>.</P>
<HR />
<P>This is the end of the original blog.</P>
<HR />
<P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/12/data_pyramid_1-2.jpg" border="0" /></P>
<P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/12/data_pyramid_2-2.jpg" border="0" /></P> 2023-12-19T07:37:51+01:00 https://community.sap.com/t5/technology-blog-posts-by-sap/sap-hana-cloud-hana-%E3%83%87%E3%83%BC%E3%82%BF%E3%83%99%E3%83%BC%E3%82%B9%E3%81%8B%E3%82%89-sap-hana-cloud-data-lake/ba-p/13574477 SAP HANA Cloud, HANA データベースから SAP HANA Cloud, data lake リレーショナルエンジンへの最速のデータ移動方法と移動速度テスト結果 [The fastest way to move data from the SAP HANA Cloud, HANA database to the SAP HANA Cloud, data lake relational engine, with speed test results] 2023-12-19T07:47:55+01:00 Sawa_Ito https://community.sap.com/t5/user/viewprofilepage/user-id/7449 <P>This blog was originally published on the official SAP Japan blog on November 17, 2022, and was moved here when that blog was closed.</P>
<HR />
<P>This is a translation of the blog "<A href="https://blogs.sap.com/2022/03/08/the-fastest-way-to-load-data-from-hana-cloud-hana-into-hana-cloud-hana-data-lake/" target="_blank" rel="noopener noreferrer"><STRONG>The fastest way to load data from HANA Cloud, HANA into HANA Cloud, HANA Data Lake</STRONG></A>" (March 8, 2022) written by douglas.hoover. Please also see the comment thread on the original blog page. For the latest information, please refer to the <A href="https://blogs.sap.com/tags/7efde293-f35d-4737-b40f-756b6a798216/" target="_blank" rel="noopener noreferrer">latest blogs in SAP Community</A> and the <A href="https://help.sap.com/docs/SAP_HANA_DATA_LAKE?locale=en-US" target="_blank" rel="noopener noreferrer">documentation</A>.</P>
<HR />
<P>This blog is part of the SAP HANA data strategy blog series: <A href="https://blogs.sap.com/2019/10/14/sap-hana-data-strategy/" target="_blank" rel="noopener noreferrer">https://blogs.sap.com/2019/10/14/sap-hana-data-strategy/</A></P>
<H1>Overview</H1>
<P>As more customers move larger tables from the SAP HANA Cloud, HANA database into the SAP HANA Cloud, data lake relational engine, we have started being asked about the fastest way to move data from the HANA database into the data lake relational engine. More precisely, we are asked whether there is anything faster than simply running a HANA INSERT against a data lake relational engine virtual table.</P>
<P>You might wonder why customers move large tables from the HANA database to the data lake relational engine at all. The most common use cases are the initial materialization of a large database, or archiving older data into the data lake relational engine. Most of these customers perform this materialization with SAP HANA Smart Data Integration (SDI), and keep those tables up to date through the same interface using SDI flowgraphs or SDI real-time replication with <STRONG>change data capture</STRONG>.</P>
<P><STRONG>For more detail on SAP HANA SDI, see the following blogs:</STRONG></P>
<P><STRONG>SAP HANA Data Strategy: Fast Data Ingestion Including Real-Time Change Data Capture</STRONG><BR /><A href="https://blogs.sap.com/2020/06/18/hana-data-strategy-data-ingestion-including-real-time-change-data-capture/" target="_blank" rel="noopener noreferrer">https://blogs.sap.com/2020/06/18/hana-data-strategy-data-ingestion-including-real-time-change-data-capture/</A></P>
<P><STRONG>SAP HANA Data Strategy: Fast Data Ingestion – Virtualization</STRONG><BR /><A href="https://blogs.sap.com/2020/03/09/hana-data-strategy-data-ingestion-virtualization/" target="_blank" rel="noopener noreferrer">https://blogs.sap.com/2020/03/09/hana-data-strategy-data-ingestion-virtualization/</A></P>
<P>The three simple data movement methods tested here are:</P>
<UL>
<LI>a simple HANA INSERT into a data lake relational engine virtual table</LI>
<LI>a simple data lake INSERT, run from the data lake relational engine, pulling from a HANA virtual (proxy) table</LI>
<LI>a HANA export followed by a data lake LOAD</LI>
</UL>
<P>Some may ask: "Why go through the SAP HANA Cloud, HANA database at all? Why not load the data directly into the data lake relational engine?" Again, these customers are using HANA Enterprise Information Management (EIM) tools, which require a HANA object (local or virtual) as the target. In future blogs I would like to cover loading data directly into the data lake relational engine via SAP IQ client-side loads, Data Services, and Data Intelligence.</P>
<P>The fastest way to load data from the HANA database into the data lake relational engine is to run an INSERT with a SELECT from the HANA table on the data lake relational engine side, using "create existing local temporary table" to create a proxy table that points at the physical table in the HANA database (see the table below for details).</P>
<TABLE>
<TBODY>
<TR><TH>Method</TH><TH>Rows</TH><TH>Data size</TH><TH>Time (seconds)</TH></TR>
<TR><TD>HANA Cloud, data lake/IQ INSERT..SELECT</TD><TD>28,565,809</TD><TD>3.3 GB</TD><TD><STRONG>52.86</STRONG></TD></TR>
<TR><TD>*HANA Cloud, data lake/IQ LOAD from Azure file system</TD><TD>28,565,809</TD><TD>3.3 GB</TD><TD>116 (1 min 56 s)</TD></TR>
<TR><TD>*HANA Cloud, data lake/IQ LOAD from data lake file system</TD><TD>28,565,809</TD><TD>3.3 GB</TD><TD>510 (8 min 30 s)</TD></TR>
<TR><TD>HANA INSERT..SELECT</TD><TD>28,565,809</TD><TD>3.3 GB</TD><TD>1277 (21 min 17 s)</TD></TR>
</TBODY>
</TABLE>
<P>* Does not include the time to export the data from the HANA database to the file system.</P>
<P>The tests load the TPC-D ORDERS table (28,565,809 rows, roughly 3.3 GB) into a fairly small data lake relational engine configuration.</P>
<H2>The tests used the following SAP HANA Cloud configuration</H2>
<P>SAP HANA Cloud, HANA database: 60 GB / 200 GB, 4 vCPUs<BR />SAP HANA Cloud, data lake relational engine: 16 TB, 8 vCPU worker / 8 vCPU coordinator</P>
<P>In the data lake relational engine, configure more vCPUs to get more parallelism (especially for larger tables). Adding more TB to the data lake relational engine provides more disk I/O throughput.</P>
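<P>As a rough sanity check on the numbers in the table above (this computation is mine, not the original author's): 3.3 GB in 52.86 s is about 3.3 &times; 1024 / 52.86 &asymp; 64 MB/s, or roughly 540,000 rows per second, while the HANA-side INSERT..SELECT achieves about 3.3 &times; 1024 / 1277 &asymp; 2.6 MB/s. The data-lake-side pull through the proxy table is therefore roughly 1277 / 52.86 &asymp; 24 times faster.</P>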
id="toc-hId--737234659">テストで使用した詳細設定と構文</H1><P><BR />&nbsp;<BR /><BR />SAP HANA Cockpit をスタートして SAP HANA Cloud を管理します。<BR /><BR />&nbsp;<BR /><BR /><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2022/03/HANACockpitManager.png" border="0" /><BR /><BR />&nbsp;<BR /><BR />SAP HANA Cloud, data lake リレーショナルエンジンから「Open in SAP HANA Database Explorer」を選択します。<BR /><BR />もしこれが初回であれば、SAP HANA Cloud, data lake リレーショナルエンジンの ADMIN パスワードを求められます。<BR /><BR />&nbsp;<BR /><BR /><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2022/03/HANACockpitManager2.png" border="0" /><BR /><BR />&nbsp;<BR /><BR />SQL コマンドを入力し、クリックして実行します。<BR /><BR />&nbsp;<BR /><BR /><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2022/03/HANADBExplorer.png" border="0" /><BR /><BR />&nbsp;</P><H2 id="toc-hId--804665445">&nbsp;</H2><P>&nbsp;</P><H2 id="toc-hId--653924593">以下を作成するための SAP HANA Cloud, data lake コマンド</H2><P><BR />&nbsp;<BR /><BR /></P><OL><OL><LI>SAP HANA Cloud, data lake リレーショナルエンジンから、SAP HANA Cloud HANA データベースへ接続しているサーバー</LI></OL></OL><P>&nbsp;</P><OL><OL><LI>データをロードして作成するためのローカルの SAP HANA Cloud, data lake リレーショナルエンジンテーブル</LI></OL></OL><P>&nbsp;</P><OL><OL><LI>SAP HANA Cloud インスタンスのテーブルを指定するローカルのテンポラリープロキシーテーブル</LI></OL></OL><P><BR /><BR />&nbsp;<BR /><BR /><STRONG>CREATE SERVER</STRONG><BR /><BR />–DROP SERVER DRHHC2_HDB<BR /><BR />CREATE SERVER DRHHC2_HDB CLASS ‘HANAODBC’ USING ‘Driver=libodbcHDB.so;ConnectTimeout=60000;ServerNode=xyxy.hana.prod-us10.hanacloud.ondemand.com:443;ENCRYPT=TRUE;ssltruststore=xyxy.hana.prod-us10.hanacloud.ondemand.com;ssltrustcert=Yes;UID=DBADMIN;PWD=xyxyx;’<BR /><BR />&nbsp;<BR /><BR />&nbsp;<BR /><BR /><STRONG>CREATE TARGET TABLE</STRONG><BR /><BR />CREATE&nbsp; TABLE REGIONPULL (<BR /><BR />R_REGIONKEY&nbsp;&nbsp; bigint&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; not null,<BR /><BR />R_NAME&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; varchar(25)&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; not null,<BR /><BR />R_COMMENT&nbsp;&nbsp;&nbsp; varchar(152)&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; not null,<BR /><BR />primary key (R_REGIONKEY)<BR /><BR />);<BR /><BR />&nbsp;<BR /><BR /><STRONG>CREATE local temporary PROXY</STRONG><BR /><BR />create existing local temporary table REGION_PROXY (<BR /><BR />R_REGIONKEY&nbsp;&nbsp; bigint&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; not null,<BR /><BR />R_NAME&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; varchar(25)&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; not null,<BR /><BR />R_COMMENT&nbsp;&nbsp;&nbsp; varchar(152)&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; not null,<BR /><BR />primary key (R_REGIONKEY)<BR /><BR />)<BR /><BR />at ‘DRHHC2_HDB..TPCD.REGION’;<BR /><BR />&nbsp;<BR /><BR /><STRONG>INSERT DATA</STRONG><BR /><BR />INSERT into REGIONPULL SELECT * from REGION_PROXY;<BR /><BR />Commit;<BR /><BR />–1.9s</P><H2 id="toc-hId--850438098">&nbsp;</H2><P><BR />&nbsp;</P><H2 id="toc-hId--1046951603">ORDERS テーブルテストコマンド</H2><P><BR />&nbsp;<BR /><BR />–DROP TABLE ORDERSPULL;<BR /><BR />create table ORDERSPULL (<BR /><BR />O_ORDERKEY&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; BIGINT&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; not null,<BR /><BR 
<PRE>--DROP TABLE ORDERSPULL;
create table ORDERSPULL (
  O_ORDERKEY        BIGINT         not null,
  O_CUSTKEY         BIGINT         not null,
  O_ORDERSTATUS     VARCHAR(1)     not null,
  O_TOTALPRICE      DECIMAL(12,2)  not null,
  O_ORDERDATE       DATE           not null,
  O_ORDERPRIORITY   VARCHAR(15)    not null,
  O_CLERK           VARCHAR(15)    not null,
  O_SHIPPRIORITY    INTEGER        not null,
  O_COMMENT         VARCHAR(79)    not null,
  primary key (O_ORDERKEY)
);

create existing local temporary table ORDERS_PROXY (
  O_ORDERKEY        BIGINT         not null,
  O_CUSTKEY         BIGINT         not null,
  O_ORDERSTATUS     VARCHAR(1)     not null,
  O_TOTALPRICE      DECIMAL(12,2)  not null,
  O_ORDERDATE       DATE           not null,
  O_ORDERPRIORITY   VARCHAR(15)    not null,
  O_CLERK           VARCHAR(15)    not null,
  O_SHIPPRIORITY    INTEGER        not null,
  O_COMMENT         VARCHAR(79)    not null
) at 'DRHHC2_HDB..TPCD.ORDERS';

INSERT into ORDERSPULL SELECT * from ORDERS_PROXY;
Commit;
-- 59s
-- 52.86 s

SELECT COUNT(*) FROM ORDERSPULL;
-- 28,565,809</PRE><H2 id="toc-hId--1439978613">LINEITEM table test commands</H2>
<PRE>create table LINEITEM (
  L_ORDERKEY        BIGINT         not null,
  L_PARTKEY         BIGINT         not null,
  L_SUPPKEY         BIGINT         not null,
  L_LINENUMBER      INTEGER        not null,
  L_QUANTITY        DECIMAL(12,2)  not null,
  L_EXTENDEDPRICE   DECIMAL(12,2)  not null,
  L_DISCOUNT        DECIMAL(12,2)  not null,
  L_TAX             DECIMAL(12,2)  not null,
  L_RETURNFLAG      VARCHAR(1)     not null,
  L_LINESTATUS      VARCHAR(1)     not null,
  L_SHIPDATE        DATE           not null,
  L_COMMITDATE      DATE           not null,
  L_RECEIPTDATE     DATE           not null,
  L_SHIPINSTRUCT    VARCHAR(25)    not null,
  L_SHIPMODE        VARCHAR(10)    not null,
  L_COMMENT         VARCHAR(44)    not null,
  primary key (L_ORDERKEY, L_LINENUMBER)
);</PRE>
<PRE>create existing local temporary table LINEITEM_PROXY (
  L_ORDERKEY        BIGINT         not null,
  L_PARTKEY         BIGINT         not null,
  L_SUPPKEY         BIGINT         not null,
  L_LINENUMBER      INTEGER        not null,
  L_QUANTITY        DECIMAL(12,2)  not null,
  L_EXTENDEDPRICE   DECIMAL(12,2)  not null,
  L_DISCOUNT        DECIMAL(12,2)  not null,
  L_TAX             DECIMAL(12,2)  not null,
  L_RETURNFLAG      VARCHAR(1)     not null,
  L_LINESTATUS      VARCHAR(1)     not null,
  L_SHIPDATE        DATE           not null,
  L_COMMITDATE      DATE           not null,
  L_RECEIPTDATE     DATE           not null,
  L_SHIPINSTRUCT    VARCHAR(25)    not null,
  L_SHIPMODE        VARCHAR(10)    not null,
  L_COMMENT         VARCHAR(44)    not null
) at 'DRHHC2_HDB..TPCD.LINEITEM';

INSERT into LINEITEM SELECT * from LINEITEM_PROXY;
Commit;
-- Rows affected:       114,129,863
-- Client elapsed time: 4 m 52 s</PRE><H1 id="toc-hId--1539602616"><STRONG>Summary</STRONG></H1><P>The fastest way to load data from the SAP HANA Cloud, HANA database into the SAP HANA Cloud, data lake Relational Engine is to use "create existing local temporary table" on the data lake side to create a proxy table pointing at the physical table in the HANA database, and then run an INSERT statement that SELECTs from the HANA table.</P><P>This is very easy to do with the commands shown in this blog. Alternatively, it becomes even easier if you create a procedure that generates these commands (see Daniel's blog below).</P><H2 id="toc-hId-1844091344">See also</H2><P>Jason Hinsperger's blog on loading data into SAP HANA Cloud, data lake explains how increasing the number of vCPUs and the database size of the SAP HANA Cloud, data lake Relational Engine affects load performance.<BR /><A href="https://blogs.sap.com/?p=1866471" target="_blank" rel="noopener noreferrer">https://blogs.sap.com/?p=1866471</A></P><P>Daniel Utvich's blog on fast data movement from the SAP HANA Cloud, HANA database to SAP HANA Cloud, data lake shows an example procedure that generates this SQL code from system table information.<BR /><A href="https://blogs.sap.com/?p=1867099" target="_blank" rel="noopener noreferrer">https://blogs.sap.com/?p=1867099</A></P>
<H3 id="toc-hId-1157661327"><STRONG>SAP HANA Data Strategy blog index</STRONG></H3><P><A href="https://blogs.sap.com/2019/10/14/sap-hana-data-strategy/" target="_blank" rel="noopener noreferrer">SAP HANA Data Strategy</A></P><UL><LI><A href="https://blogs.sap.com/2019/11/14/sap-hana-data-strategy-hana-data-modeling-a-detailed-overview/" target="_blank" rel="noopener noreferrer">SAP HANA Data Strategy: HANA Data Modeling a Detailed Overview</A></LI><LI><A href="https://blogs.sap.com/2020/06/18/hana-data-strategy-data-ingestion-including-real-time-change-data-capture/?update=updated" target="_blank" rel="noopener noreferrer">HANA Data Strategy: Data Ingestion including Real-Time Change Data Capture</A><UL><LI><A href="https://blogs.sap.com/2020/03/16/access-sap-erp-data-from-sap-hana-through-sdi-abap-adapter-2/" target="_blank" rel="noopener noreferrer">Access SAP ERP data from SAP HANA through SDI ABAP Adapter by Maxime Simon</A></LI></UL></LI><LI><A href="https://blogs.sap.com/2020/03/09/hana-data-strategy-data-ingestion-virtualization/" target="_blank" rel="noopener noreferrer">HANA Data Strategy: Data Ingestion – Virtualization</A></LI><LI><A href="https://blogs.sap.com/2020/02/12/hana-data-strategy-hana-data-tiering/" target="_blank" rel="noopener noreferrer">HANA Data Strategy: HANA Data Tiering</A><UL><LI><A href="https://blogs.sap.com/2019/06/19/store-more-with-sps04/" target="_blank" rel="noopener noreferrer">Store More with SPS04 – NSE BLOG</A></LI></UL></LI></UL><HR /><P>This is the end of the original blog.</P><HR /><P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/12/data_pyramid_1-4.jpg" border="0" /><BR /><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/12/data_pyramid_2-4.jpg" border="0" /></P> 2023-12-19T07:47:55+01:00 https://community.sap.com/t5/technology-blog-posts-by-members/using-data-lake-and-sql-to-create-custom-reporting-models/ba-p/13579874 Using Data lake and SQL to create custom reporting models 2023-12-20T10:29:19+01:00 Roney_Mathew https://community.sap.com/t5/user/viewprofilepage/user-id/147426 <P><STRONG>Overview:</STRONG> Through a series of blogs, I would like to share scripts that use data lakes built from SAP tables to create reporting models representing certain sections of SAP screens/transactions or areas of analysis.
Hopefully, these scripts serve as an accelerator for multiple use cases. For this first script we'll look at building User Status using JCDS and JEST.</P><P><STRONG><U>Background:</U></STRONG> Most structured reporting tools (e.g., BW) or ETL processes don't bring in all the fields available in source systems. They are deployed using a predefined data model (dimensions/measures) that collects fields from different tables and limits what's initially available for reporting, restricting analysts' ability to explore additional fields.</P><P>For example, financial reporting models built using the ACDOCA, BSEG, or FAGLFLEXA tables: irrespective of the approach (CDS views or BW models), these don't bring all fields from the source, as they mostly focus on meeting the initial requirements of primary stakeholders.</P><P>Additional fields may be available in SAP transaction systems, but making them available for reporting takes multiple cycles of enhancements, reflecting a dependency on different support teams and the time involved to meet these requirements.</P><P><STRONG><U>Solution:</U></STRONG> With a data lake that replicates tables from SAP, analysts working with functional resources can build models that meet their specific needs. If replications are managed through SAP SLT, this enables near real-time reporting (with a possible delay of a few seconds). A review must be done with functional consultants to ensure that the tables being replicated don't contain confidential content.</P><P>As part of this blog series, we shall see some models that reflect SAP transactions or commonly used reporting metrics.</P><P><STRONG><U>Factors that are not addressed in this blog but must be considered:</U></STRONG></P><OL><LI>Organization of reporting models and data lake tables, if not following a reference similar to SAP application components. This becomes important for managing confidentiality and ensuring that the personal information of customers, employees, and vendors is available only to those who need it as part of their business roles.</LI>
<LI>Security models needed for:<OL><LI>Functional areas of reporting (multiple tables grouped into an area of reporting)</LI><LI>Row-based access</LI><LI>Any additional configuration needed to secure fields in tables</LI></OL></LI></OL><P>Here's the first script:</P><P><STRONG><U>1. Script for Plant Maintenance object status</U></STRONG></P><P><U>Need:</U> Near real-time availability of object statuses for Plant Maintenance. For example, for an emergency order created to address a critical equipment failure, the status and progress of the investigation need to be communicated through the manufacturing channels so that production bottlenecks can be managed.</P><P><U>Solution:</U> The layout below provides a simplified overview of how the different tables are joined together, with their respective fields.</P><P><STRONG>Tables used:</STRONG><BR />JEST – Individual Object Status<BR />JCDS – Change Documents for System/User Statuses (Table JEST)<BR />JSTO – Status Object Information<BR />TJ02 – System Status<BR />TJ02T – System Status Texts<BR />TJ04 – Status Control for Object Type<BR />TJ30 – User Status<BR />TJ30T – Texts for User Status</P><P><IMG src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/Object-status-table-overview-1.jpg" border="0" /></P><P class="">Object status tables relationship overview</P><P>The script below provides the active statuses for all Plant Maintenance objects. To view all instances of status changes, remove the JEST.INACT IS NULL clause/restriction. Each table and filter condition starts with a comment (beginning with --) to show what it represents.
You may have to tweak the formatting based on the tool being used, especially the comments.</P><BLOCKQUOTE><PRE>SELECT
  JEST.OBJNR AS OBJECT_NUMBER,
  JSTO.OBTYP AS OBJECT_CATEGORY,
  SUBSTR(JEST.OBJNR, 3) AS OBJECT,
  JEST.STAT AS OBJECT_STATUS,
  (CASE WHEN LEFT(JEST.STAT, 1) = 'I' THEN 'SYSTEM' ELSE 'USER' END) AS STATUS_TYPE,
  (CASE WHEN LEFT(JEST.STAT, 1) = 'I' THEN TJ02T.TXT04 ELSE TJ30T.TXT04 END) AS STATUS_SHORT_TEXT,
  (CASE WHEN LEFT(JEST.STAT, 1) = 'I' THEN TJ02T.TXT30 ELSE TJ30T.TXT30 END) AS STATUS_LONG_TEXT,
  JSTO.STSMA AS STATUS_PROFILE,
  JCDS.USNAM AS STATUS_CHANGED_BY,
  JCDS.UDATE AS STATUS_CHANGED_DATE,
  JCDS.UTIME AS STATUS_CHANGED_TIME,
  JCDS.CHIND AS STATUS_CHANGED_TYPE,
  TJ04.INIST AS SYSTEM_STATUS_INITIAL_STATUS_FLAG,
  TJ04.STATP AS SYSTEM_STATUS_DISPLAY_PRIORITY,
  TJ04.LINEP AS SYSTEM_STATUS_LINE_POSITION,
  TJ02.NODIS AS SYSTEM_STATUS_NO_DISPLAY_INDICATOR,
  TJ02.SETONLY AS SYSTEM_STATUS_SET_ONLY_INDICATOR,
  TJ30.STONR AS USER_STATUS_WITH_NUMBER,
  TJ30.INIST AS USER_STATUS_INITIAL_STATUS_FLAG_INDICATOR,
  TJ30.STATP AS USER_STATUS_DISPLAY_PRIORITY,
  TJ30.LINEP AS USER_STATUS_LINE_POSITION,
  CASE WHEN TJ30.LINEP = '01' THEN TJ30T.TXT04 END AS POSITION1_USER_STATUS
FROM JEST                 -- Individual object status
INNER JOIN JCDS           -- Change documents for system/user statuses (table JEST)
  ON JEST.OBJNR = JCDS.OBJNR AND JEST.STAT = JCDS.STAT AND JEST.CHGNR = JCDS.CHGNR
LEFT JOIN JSTO            -- Status profile information for objects
  ON JEST.OBJNR = JSTO.OBJNR
LEFT JOIN TJ02T           -- System status texts
  ON JEST.STAT = TJ02T.ISTAT AND TJ02T.SPRAS = 'E'
LEFT JOIN TJ04            -- System status control config table 2
  ON JEST.STAT = TJ04.ISTAT AND TJ04.OBTYP = JSTO.OBTYP
LEFT JOIN TJ30T           -- User status texts
  ON JSTO.STSMA = TJ30T.STSMA AND JEST.STAT = TJ30T.ESTAT AND TJ30T.SPRAS = 'E'
LEFT JOIN TJ02            -- System status config table 1
  ON JEST.STAT = TJ02.ISTAT
LEFT JOIN TJ30            -- User status config table 1
  ON JSTO.STSMA = TJ30.STSMA AND JEST.STAT = TJ30.ESTAT
WHERE JEST.INACT IS NULL  -- remove this to see when a status was set inactive or to get timelines for all statuses</PRE></BLOCKQUOTE><P><STRONG>Conclusion:</STRONG> Using the above code we can get the active statuses, and their respective times, for all operational objects that have been configured for status tracking. A similar approach can be used to get statuses for CRM using the tables CRM_JEST and CRM_JCDS.
Remove the inactive filter to also get statuses that are currently not active (depending on how values are mapped in the data lake, i.e., whether default blanks are loaded as NULLs, NULL may need to be replaced with '').</P><P><STRONG>Possible variations based on need</STRONG>:</P><OL><LI>To plot a timeline of how an operational object moved between statuses, use JCDS (see the sketch after this list)</LI><LI>Restrict to certain status profile(s) in table JSTO when the requirement is to focus on certain types or groups of objects</LI><LI>Restrict using change date and time if the need is to focus on recent changes within the hour or day(s)</LI></OL>
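<P>As a quick illustration of variation 1 (combined with the date restriction from variation 3), the sketch below pulls a status-change timeline from JCDS in a Spark-based environment. It is a minimal, hypothetical example: it assumes the replicated tables are exposed to Spark SQL under their SAP names and that UDATE is replicated as a yyyyMMdd string; adapt the session setup and syntax to your own data lake and query tool.</P><PRE># Minimal sketch, assuming replicated SAP tables are queryable via Spark SQL
# and JCDS.UDATE is stored as a yyyyMMdd string (typical for SAP DATS fields).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

timeline = spark.sql("""
    SELECT OBJNR AS object_number,
           STAT  AS object_status,
           CHIND AS change_type,
           USNAM AS changed_by,
           UDATE AS changed_date,
           UTIME AS changed_time
    FROM JCDS
    -- variation 3: keep only changes from the last 7 days
    WHERE UDATE >= date_format(date_sub(current_date(), 7), 'yyyyMMdd')
    ORDER BY OBJNR, UDATE, UTIME
""")
timeline.show(20, truncate=False)</PRE>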
<P>The next blog will look at combining the details of orders and their related operational tasks.</P> 2023-12-20T10:29:19+01:00 https://community.sap.com/t5/technology-blog-posts-by-sap/empowering-businesses-with-new-insights-the-google-cloud-and-sap-analytics/ba-p/13580348 Empowering Businesses with New Insights: The Google Cloud and SAP Analytics Partnership 2023-12-28T12:35:19+01:00 Thisgaard https://community.sap.com/t5/user/viewprofilepage/user-id/4350 A year ago, the tech giants Google Cloud and SAP embarked on a journey to revolutionize data analytics for businesses. Their goal: to bring together SAP systems and data with Google's data cloud, offering customers better insights for decision making and innovation. The new SAP Datasphere Replication Flow connector for Google BigQuery is now <A href="https://blogs.sap.com/2023/11/16/replication-flows-sap-datasphere-and-google-big-query/" target="_blank" rel="noopener noreferrer">available</A>.<BR /><BR />From the outset, customers have been excited about the potential of this partnership: integrating SAP's robust data models and real-time processes with Google BigQuery's comprehensive real-time data streams, including search engine data, weather data, marketing data, and customer event data, to inspire new and better ways to do business.<BR /><BR />Prior to this collaboration, businesses found it challenging to merge data models from either side due to cost, effort, and time. This partnership aims to eliminate these hurdles, providing real-time data streams that adjust dynamically to changes from Google Cloud and SAP. Haridas Nair, the Head of Cross Product Management for Database and Analytics at SAP, stated, "Customers using SAP Business Technology Platform can now extend the reach of curated and modeled SAP business data for downstream consumption with SAP Datasphere Replication Flow. The integration with BigQuery now enables customers to combine SAP business data with Google BigQuery data. This enables new use cases that can unleash significant business value."<BR /><BR />For example, while enterprises rely on SAP S/4HANA for their financial planning, reporting, and budgeting, many also have finance data coming from other systems. Joint ventures, new acquisitions, or decentralized business models are common cases where finance data resides in non-SAP S/4HANA systems. Early adopters among such companies are leveraging Google Cloud and the SAP Datasphere Replication Flow connector to unify accounting data insights in SAP Datasphere, gaining a single financial dashboard across all their financial sources, enabling secure, self-service access to reusable data models, and streamlining financial reporting. The result is enhanced analytics that allow new market correlations to emerge, along with improved reporting efficiency and reduced data management costs. As a result, finance and operations experts receive new insights that improve their business planning.<BR /><BR />The other common use case relates to Consumer Products and Retail companies. A prime example is a North American consumer products company selling through retailers as well as its own online platform. Like many other companies in this space, they're investing in brand loyalty and scaling their product portfolio to target different customer segments. The company strives to correlate online customer trends with their retail channel sales using demographics and other consumer data.<BR /><BR />Their business goals involve improving channel inventory turns, trade promotion management, shelf availability, SKU margins, and the overall understanding of customer buying behavior. Simultaneously, they aim to reduce the cost, effort, and time for accessing SAP data and to enable richer SAP ERP data in real time.<BR /><BR />To achieve these goals, they have connected Google Cloud and SAP data to gain better insight into their retail channels and to improve their demand forecasting and supply chain algorithms. The more real-time the connectivity between their SAP data and Google BigQuery data, the more confident they'll be in the predictive algorithms they adopt for their supply chain.<BR /><BR />But this is just the beginning. The Google Cloud and SAP Analytics partnership opens the door to a wide range of strategic customer and supply chain programs that leverage data for advanced predictive demand models. Early examples of customer innovations include true customer 360 insight, improved sales performance, yield-driven pricing, new product introductions, unifying external accounting with SAP, manufacturing automation, and operationalizing sustainability.<BR /><BR />Both the Google Cloud and SAP development organizations are excited to see that their work is making a difference. Future releases plan to include more advanced features for managing enterprise-scale federation, replication, and data catalogs between their respective data platforms. Honza Fedak, Director of BigQuery Engineering at Google Cloud, stated, "The combination of Google Cloud's data and AI expertise and SAP's deep understanding of business data is a powerful force that can help businesses unlock the full potential of their data."<BR /><BR />As more enterprises utilize this partnership, we are confident that the Google Cloud and SAP analytics partnership can provide better insights, enable better decisions, and foster innovation. This partnership is a significant step towards creating more Intelligent Enterprises, and we hope your enterprise will be one of them. 2023-12-28T12:35:19+01:00 https://community.sap.com/t5/technology-blog-posts-by-members/integration-options-for-moving-data-from-sap-into-databricks/ba-p/13793459 Integration Options for moving data from SAP into Databricks 2024-08-13T19:58:39.925000+02:00 STALANKI https://community.sap.com/t5/user/viewprofilepage/user-id/13911 <P><FONT size="5"><STRONG>Background</STRONG></FONT></P><P class="lia-align-justify" style="text-align : justify;">This blog delves into the various methods for integrating data from your SAP systems into Databricks.
This exploration is particularly relevant given SAP's announcement of SAP Datasphere in March 2023 and the accompanying partnership with Databricks. This collaboration aims to provide businesses with the power of federated, AI-driven analytics, allowing them to effortlessly analyze structured and unstructured data from both SAP and non-SAP sources within a single, unified platform.</P><P class="lia-align-justify" data-unlink="true" style="text-align : justify;">However, I am not going to discuss how to integrate SAP systems into Databricks via Datasphere or BW/4HANA, as we already have great blogs on <A href="https://community.sap.com/t5/technology-blogs-by-sap/replication-flow-blog-series-part-5-integration-of-sap-datasphere-and/ba-p/13604976" target="_self">Datasphere</A> and <A href="https://community.sap.com/t5/technology-blogs-by-members/sap-bw-hana-to-databricks-via-sap-di-i/ba-p/13580075" target="_self">BW/4HANA</A>.</P><P class="lia-align-justify" data-unlink="true" style="text-align : justify;">This blog will explore options for migrating data from SAP to Databricks without relying on Datasphere or BW/4HANA, even though licensing for transferring SAP data to non-SAP systems might still be necessary.</P><P class="lia-align-justify" data-unlink="true" style="text-align : justify;"><FONT size="4"><SPAN><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="datalake.jpg" style="width: 470px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/151264i13412C5FE72F646F/image-dimensions/470x527?v=v2" width="470" height="527" role="button" title="datalake.jpg" alt="datalake.jpg" /></span></SPAN></FONT></P><P data-unlink="true"><FONT size="5"><STRONG>Integration Options</STRONG></FONT></P><P data-unlink="true">In this blog, I discuss four different options for moving data from SAP into Databricks.</P><P data-unlink="true"><FONT size="5"><STRONG>SAP Data Services ETL Integration</STRONG></FONT></P><P>We can leverage the popular ETL tool SAP Data Services to move data between SAP and Databricks.</P><P>While a direct integration between SAP Data Services and Databricks might not be readily available, you can establish a connection using intermediary stages and data transfer mechanisms. Here are a few approaches:</P><P><STRONG>File-Based Integration:</STRONG> Initiate the integration by designing and running data extraction jobs within SAP Data Services. These jobs should be configured to export your SAP data in formats readily consumable by Databricks, such as CSV, Parquet, or Avro. Once exported, these files can be transferred to a storage service (e.g., Azure Blob Storage, AWS S3, or a shared file system) accessible by Databricks; a minimal sketch follows.</P>
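<P>The snippet below is a minimal sketch of the file-based approach: it registers files exported by an SAP Data Services job as a Delta table in Databricks. The storage path, schema, and table names are illustrative assumptions, not values fixed by either tool.</P><PRE># Minimal sketch: turn Parquet files exported by SAP Data Services into a
# Databricks Delta table. The storage path and table name are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

# Read the files the extraction job dropped into cloud storage
orders = spark.read.parquet(
    "abfss://sap-exports@yourstorageaccount.dfs.core.windows.net/orders/"
)

# Persist as a Delta table so downstream consumers can query it with SQL
orders.write.format("delta").mode("overwrite").saveAsTable("sap_staging.orders")</PRE>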
<P><STRONG>Database Staging:</STRONG> Optimize your data pipeline by using SAP Data Services to load extracted and transformed data directly into a staging database readily accessible by Databricks. Suitable options for this staging database include Azure SQL Database, Amazon Redshift, or similar platforms. Once the data is in the staging area, establish a connection between Databricks and the database using Spark JDBC connectors or Azure Synapse native connectors and map the respective tables.</P><P><STRONG>Custom Integration using APIs:</STRONG> Investigate the availability of APIs or SDKs provided by both SAP Data Services and Databricks. Develop custom scripts or applications using languages like Python or Java to extract data from SAP Data Services and transfer it to Databricks using their respective APIs.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="SAP Databricks Integration Options v.1.jpg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/151251iEB94B4428CC11D04/image-size/large?v=v2&amp;px=999" role="button" title="SAP Databricks Integration Options v.1.jpg" alt="SAP Databricks Integration Options v.1.jpg" /></span></P><P class="lia-align-justify" style="text-align : justify;"><FONT face="trebuchet ms,geneva" size="5"><STRONG>SAP SLT Integration</STRONG></FONT></P><P class="lia-align-justify" style="text-align : justify;">Replicating SAP data to external systems using SAP SLT can be complex, but leveraging HANA as a staging area provides a pathway for efficient real-time replication. By establishing connectivity through Spark JDBC or SDI HANA connectors, you can move data into Databricks for AI-based predictive analytics; a minimal read sketch follows the diagram below.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Option2DB.jpg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/151255i608F834D0F1E5492/image-size/large?v=v2&amp;px=999" role="button" title="Option2DB.jpg" alt="Option2DB.jpg" /></span></P>
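<P>The snippet below is a minimal sketch of the read side of this option: it pulls an SLT-replicated table from the HANA staging area into Databricks over JDBC. The hostname, credentials, schema, and table names are placeholders, and it assumes the SAP HANA JDBC driver (ngdbc) is attached to the cluster.</P><PRE># Minimal sketch: read an SLT-replicated table from HANA into Databricks via JDBC.
# Host, user, password, and table names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

hana_df = (spark.read.format("jdbc")
    .option("url", "jdbc:sap://your-hana-host:443/?encrypt=true")
    .option("driver", "com.sap.db.jdbc.Driver")   # requires the ngdbc driver JAR
    .option("dbtable", "SLT_SCHEMA.MARA")         # table kept current by SLT replication
    .option("user", "TECH_USER")
    .option("password", "********")
    .load())

# Land it in Delta for analytics and AI workloads
hana_df.write.format("delta").mode("overwrite").saveAsTable("sap_staging.mara")</PRE>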
<P><FONT face="trebuchet ms,geneva" size="5"><STRONG>Event Based Messaging</STRONG></FONT></P><P class="lia-align-justify" style="text-align : justify;">Set up SAP BTP Integration Platform to capture real-time data changes from your SAP system, leveraging Change Data Capture (CDC) mechanisms or APIs for seamless data extraction. Then, integrate SAP BTP Integration Platform with a message queue or streaming platform like Apache Kafka or Azure Event Hubs to reliably publish these captured data changes. Databricks can then tap into these data streams using its robust streaming capabilities, subscribing to and consuming the data from the message queue.</P><P class="lia-align-justify" style="text-align : justify;">This approach empowers you with near real-time data ingestion and analysis capabilities within Databricks. For additional flexibility, consider incorporating HANA Cloud as an optional staging area to further transform and prepare your data before it is loaded into Databricks.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="DatabricksKafka.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/151258i3AB70371847AAA0A/image-size/large?v=v2&amp;px=999" role="button" title="DatabricksKafka.png" alt="DatabricksKafka.png" /></span></P>
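<P>On the consuming side, a minimal sketch of the streaming ingestion could look like the snippet below. The broker address, topic, checkpoint location, and target table are assumptions; the payload format depends on how your integration platform publishes the change events.</P><PRE># Minimal sketch: consume SAP change events from Kafka as a Databricks stream.
# Broker, topic, checkpoint path, and table name are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.getOrCreate()

events = (spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "sap-material-changes")
    .option("startingOffsets", "latest")
    .load())

# Kafka delivers key/value as binary; cast the payload for downstream parsing
decoded = events.select(col("key").cast("string"), col("value").cast("string"))

(decoded.writeStream
    .format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/sap_material_changes")
    .toTable("sap_streaming.material_changes"))</PRE>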
<P><FONT face="trebuchet ms,geneva" size="5"><STRONG>SNP GLUE</STRONG></FONT></P><P class="lia-align-justify" style="text-align : justify;">SNP Glue is another product that can be used to replicate data from SAP platforms into cloud platforms. While the product might have limitations in terms of advanced transformation capabilities, it is worth investigating its compatibility with other cloud solutions like SuccessFactors and Ariba to ensure a comprehensive integration strategy.</P><P class="lia-align-justify" style="text-align : justify;"><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="SNP Glue.jpg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/151263iB02B6E9BCAC781ED/image-size/large?v=v2&amp;px=999" role="button" title="SNP Glue.jpg" alt="SNP Glue.jpg" /></span></P><P><FONT size="5"><STRONG>Key Considerations</STRONG></FONT></P><P class="lia-align-justify" style="text-align : justify;">We need to consider the following factors when choosing the right tool:</P><P class="lia-align-justify" style="text-align : justify;"><STRONG>Data Volume and Frequency:</STRONG> The chosen integration method should align with the volume of data being transferred and the desired frequency of updates.</P><P class="lia-align-justify" style="text-align : justify;"><STRONG>Data Transformation:</STRONG> Determine whether data transformations are necessary before loading into Databricks and whether these transformations are best performed within SAP Data Services or using Databricks' data manipulation capabilities.</P><P class="lia-align-justify" style="text-align : justify;"><STRONG>Security and Access Control:</STRONG> Implement appropriate security measures to protect data during transfer and storage, ensuring secure access to both SAP Data Services and Databricks.</P><P class="lia-align-justify" style="text-align : justify;"><STRONG>Data Latency Requirements:</STRONG> Determine the acceptable latency for data availability in Databricks. The streaming approach offers near real-time capabilities, while the intermediate database approach might involve some delay.</P><P class="lia-align-justify" style="text-align : justify;">As you embark on your SAP-Databricks integration journey, carefully consider your specific needs, data characteristics, and latency requirements to select the optimal approach for your business. With a well-planned strategy and the right tools in place, you can harness the combined power of SAP and Databricks for AI-powered federated analytics.</P> 2024-08-13T19:58:39.925000+02:00 https://community.sap.com/t5/technology-blog-posts-by-members/consuming-data-from-datasphere-to-azure-data-factory-via-odbc/ba-p/13869551 Consuming Data from Datasphere to Azure Data Factory via ODBC 2024-09-18T13:43:15.587000+02:00 vignesh3027 https://community.sap.com/t5/user/viewprofilepage/user-id/160733 <P><STRONG>Prerequisites:</STRONG></P><UL><LI><STRONG>Access:</STRONG> Access to ADF and Datasphere.</LI><LI><STRONG>Credentials:</STRONG> Datasphere and ADF credential details.</LI></UL><H2 id="toc-hId-1049069880"><STRONG>Connect Datasphere to Azure Data Factory</STRONG></H2><H2 id="toc-hId-852556375"><STRONG>DATASPHERE PART:</STRONG></H2><UL><LI>Log in to Datasphere -&gt; Space Management -&gt; choose the space and select Edit.<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vignesh3027_24-1726656894703.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/167873i670A26213FBEDD04/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_24-1726656894703.png" alt="vignesh3027_24-1726656894703.png" /></span></LI><LI>Click Create and make sure that Expose for consumption by default is enabled.<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vignesh3027_25-1726656921632.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/167874i5A227EA201C8241C/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_25-1726656921632.png" alt="vignesh3027_25-1726656921632.png" /></span></LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vignesh3027_26-1726656953296.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/167875i83CADA8A2CB82EA3/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_26-1726656953296.png" alt="vignesh3027_26-1726656953296.png" /></span></P><UL><LI>Copy the database username, hostname, port, and password.<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vignesh3027_27-1726657059047.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/167877i00A6C9F2E1F77457/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_27-1726657059047.png" alt="vignesh3027_27-1726657059047.png" /></span></LI><LI>Go to System -&gt; Configuration -&gt; IP Allowlist -&gt; Trusted IPs. The EXTERNAL IPv4 address should be added here, not the internal IPv4.<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vignesh3027_28-1726657167838.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/167879iA19DF6F9E2CD2BD7/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_28-1726657167838.png" alt="vignesh3027_28-1726657167838.png" /></span></LI><LI>To get your external IPv4 address, use this URL: <A href="https://whatismyipaddress.com/" target="_blank" rel="noopener nofollow noreferrer">What Is My IP Address - See Your Public Address - IPv4 &amp; IPv6</A></LI><LI><STRONG>Add and Save</STRONG> the external IPv4 address in Datasphere's IP Allowlist.<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vignesh3027_29-1726657199288.png"
style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/167881i950182F9C33AF190/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_29-1726657199288.png" alt="vignesh3027_29-1726657199288.png" /></span></LI></UL><H3 id="toc-hId-785125589"><STRONG>ODBC PART:</STRONG></H3><UL><LI>Need to install SAP HDODBC driver <SPAN><A href="https://tools.eu1.hana.ondemand.com/#hanatools" target="_blank" rel="noopener nofollow noreferrer">SAP Development Tools (ondemand.com)</A></SPAN> in the system.</LI><LI>Open ODBC in the system</LI><LI>Click Add</LI><LI>Select HDODBC<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vignesh3027_30-1726657244135.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/167882i83514388B627B176/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_30-1726657244135.png" alt="vignesh3027_30-1726657244135.png" /></span><P>&nbsp;</P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vignesh3027_31-1726657274364.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/167884iDEF7EEAFE2DED4A1/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_31-1726657274364.png" alt="vignesh3027_31-1726657274364.png" /></span><P class="lia-align-center" style="text-align: center;">&nbsp;</P></LI></UL><UL><UL><LI>Give any meaningful name to Data source name, description.</LI></UL></UL><UL><UL><LI>Database type: SAP HANA Cloud or SAP HANA Single tenant (both will work fine).</LI></UL></UL><UL><UL><LI>Already copied Host URL in datasphere space, Paste the copied Host URL.</LI></UL></UL><UL><UL><LI>Click <STRONG>Test connection</STRONG></LI></UL></UL><UL><UL><LI>Paste the Database username in the Username and Password<span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vignesh3027_32-1726657318730.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/167887iF74C597CF9999BF5/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_32-1726657318730.png" alt="vignesh3027_32-1726657318730.png" /></span><P>&nbsp;</P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vignesh3027_33-1726657370908.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/167892iA02F4188CCA9564E/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_33-1726657370908.png" alt="vignesh3027_33-1726657370908.png" /></span><P>&nbsp;</P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vignesh3027_34-1726657394852.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/167893i173324864AA39077/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_34-1726657394852.png" alt="vignesh3027_34-1726657394852.png" /></span><P>&nbsp;</P></LI></UL></UL><H3 id="toc-hId-588612084"><STRONG>AZURE DATA FACTORY PART:</STRONG></H3><P><STRONG>Open Azure Data Factory</STRONG>:</P><UL><LI>Go to your Azure Data Factory instance via the Azure Portal.</LI></UL><P><STRONG>Create a Linked Service</STRONG>:</P><UL><LI>On the left pane, go to <STRONG>Manage</STRONG> &gt; <STRONG>Linked services</STRONG>.</LI><LI>Click <STRONG>New</STRONG> to create a new Linked Service.</LI><LI>In the search box, search for <STRONG>ODBC</STRONG> or <STRONG>SAP HANA</STRONG>.</LI><LI>Select <STRONG>ODBC</STRONG></LI></UL><P><span 
class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vignesh3027_1-1727088001141.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/169743iA50A4BD9D0CADAF6/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_1-1727088001141.png" alt="vignesh3027_1-1727088001141.png" /></span></P><P>&nbsp;</P><P>&nbsp;</P><P><STRONG>Enter Connection Information</STRONG>:</P><UL><LI>In the <STRONG>Connection String</STRONG> field, enter the connection string:</LI></UL><P>Driver={HDBODBC};ServerNODE=XXXXXXXXXX:443;UID=SAP_CONTENT#XXXX;PWD=XXXXXXXXXXXX;SCHEMA=SAP_CONTENT;</P><P><STRONG>Breakdown of Components:</STRONG></P><UL><LI>Driver: Specifies the SAP HANA ODBC driver needed to connect (e.g., {HDBODBC}).</LI><LI>ServerNODE: Indicates the SAP HANA server address and port to connect to (e.g., rw2922...443).</LI><LI>UID: The username used to authenticate to SAP HANA (e.g., SAP_CONTENT#XXXX).</LI><LI>PWD: The password associated with the provided username for authentication.</LI><LI>SCHEMA: The specific database schema where the data is located (e.g., SAP_CONTENT).</LI><LI>Encrypt: Ensures the connection is encrypted for secure communication (e.g., Encrypt=1).</LI></UL><P><STRONG>Authentication Type</STRONG>:</P><UL><LI>Choose <STRONG>Basic Authentication</STRONG> since you’re using a username and password.</LI><LI>Enter the copied Datasphere’s Space username and password.</LI></UL><P><STRONG>Test the Connection</STRONG>:</P><UL><LI>Click on the <STRONG>Test Connection</STRONG> button to make sure the connection is successful.</LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vignesh3027_0-1727087847159.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/169740i4A19DDC4CBF6B01D/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_0-1727087847159.png" alt="vignesh3027_0-1727087847159.png" /></span></P><P>&nbsp;</P><P>&nbsp;</P><UL><LI>To check the connection, create a copy pipeline and choose the ODBC as the connector we used to get data from Datasphere and the corresponding Destination environment.</LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vignesh3027_3-1727088706719.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/169751iA390520116BC0E28/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_3-1727088706719.png" alt="vignesh3027_3-1727088706719.png" /></span></P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><UL><LI>In this case, I have chosen the <STRONG>Azure Data Lake Storage Gen 2</STRONG> as the Destination environment.</LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vignesh3027_1-1727088427086.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/169746i62CD0EECA0EB3B8F/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_1-1727088427086.png" alt="vignesh3027_1-1727088427086.png" /></span></P><P>&nbsp;</P><UL><LI>Login to Azure Data Lake Storage via Azure platform and check that the data copied</LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vignesh3027_2-1727088654728.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/169749i829313BE7C531B54/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_2-1727088654728.png" 
alt="vignesh3027_2-1727088654728.png" /></span></P><P>&nbsp;</P><P> </P><P>&nbsp;</P><UL><LI>Hence the connection is established successfully from Datasphere to Azure Data Factory.</LI></UL><P>HINTS: <span class="lia-unicode-emoji" title=":grinning_face:">😀</span>🤫</P><P>If your connection failed, then you have to check these two things,</P><OL><LI>Make sure your current IPV4 is added to Datasphere's IP Allowlist.</LI><LI>Ensure you have entered the correct Datasphere's Space credentials in your systems ODBC and test connection.</LI></OL><P>Thank you!</P> 2024-09-18T13:43:15.587000+02:00 https://community.sap.com/t5/technology-blog-posts-by-members/sap-community-question-in-your-opinion-what-s-the-biggest-use-case-for/ba-p/13936650 SAP Community Question: In your opinion, what's the biggest use-case for Blockchain at SAP Customers 2024-11-13T09:09:19.281000+01:00 AndySilvey https://community.sap.com/t5/user/viewprofilepage/user-id/1397601 <P>Good Morning SAP Community,</P><P><STRONG><EM>in your opinion what is _the_ biggest use case for Enterprise Blockchain / Distributed Ledger Technology at SAP Customers ?</EM></STRONG></P><P>Anybody following my blogs will know <A href="https://community.sap.com/t5/technology-blogs-by-members/why-i-love-sap-and-blockchain-databases-and-why-you-should-too/ba-p/13625869" target="_self">I like Enterprise Blockchain</A>, and am interested in <A href="https://community.sap.com/t5/technology-blogs-by-members/sap-enterprise-architecture-positioning-blockchain-database-as-an/ba-p/13629842" target="_self">positioning Enterprise Blockchain as a Technology Standard</A> for <A href="https://community.sap.com/t5/technology-blogs-by-members/sap-enterprise-architecture-let-the-use-case-find-the-blockchain/ba-p/13632458" target="_self">use-cases and business demands</A> at SAP Customers enabling them to <A href="https://community.sap.com/t5/technology-blogs-by-members/running-your-own-blockchain-on-the-sap-btp-kyma-trial-a-hands-on-how-to/ba-p/13724580" target="_self">run</A> Enterprise Blockchain.</P><P>I talk to a lot of people about this and regularly get asked the same question, what is _the_ BIG use case for Blockchain in the Enterprise and at SAP Customers ?</P><P>So, the floor is open, the comments are open, there is no wrong answer, and everybody is invited to give their opinion on,</P><P><STRONG>for SAP Customers, what is the biggest use case for Blockchain ?</STRONG></P><P>Over to you, feel welcome to put your thoughts in the comments and let's see what we discover, and, there is no wrong answer <span class="lia-unicode-emoji" title=":slightly_smiling_face:">🙂</span></P><P>Andy Silvey.</P> 2024-11-13T09:09:19.281000+01:00 https://community.sap.com/t5/technology-blog-posts-by-sap/beyond-the-data-silo-the-convergence-of-sap-systems-and-modern-data-lake/ba-p/14011954 Beyond the Data Silo: The Convergence of SAP Systems and Modern Data Lake Technologies 2025-02-10T04:53:39.311000+01:00 rugved88 https://community.sap.com/t5/user/viewprofilepage/user-id/842167 <P><SPAN>In today's enterprise landscape, a fascinating transformation is underway. While SAP systems continue to serve as the backbone of business operations, organizations are discovering that traditional approaches to data management no longer meet the demands of our rapidly evolving digital economy. 
The challenge isn't just about managing data; it's about turning it into actionable intelligence at the speed of business.</SPAN></P><H2 id="toc-hId-1702688267"><STRONG>The Evolution of SAP Data Architecture</STRONG></H2><P><SPAN>The SAP ecosystem has long been the backbone of enterprise operations, housing critical business data across its various modules. However, today's digital economy demands more than just robust transaction processing; it requires seamless integration of internal and external data sources, real-time analytics, and AI-driven insights.</SPAN></P><P><SPAN>The current state of enterprise data presents an intriguing paradox. Consider a typical global manufacturing Company X: multiple SAP instances span continents, each running critical modules like FI/CO, MM, SD, and PP. In Europe, teams work with one set of customizations and configurations, while their Americas counterparts operate with a different chart of accounts and different business rules. What seems like a simple question about global inventory levels can spiral into a days-long exercise involving multiple teams, countless Excel sheets, and lengthy email threads.</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="system-architecture_.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/223916iA3D5AA05D3AD969A/image-size/large?v=v2&amp;px=999" role="button" title="system-architecture_.png" alt="system-architecture_.png" /></span></P><P class="lia-align-center" style="text-align: center;"><SPAN>Traditional architecture</SPAN></P><P><SPAN>These challenges extend beyond SAP systems. The integration of external data sources – IoT sensors, market data feeds, social media analytics – adds layers of complexity to an already intricate landscape. The time lag between data creation and insight generation has become a critical bottleneck for business agility.</SPAN></P><H2 id="toc-hId-1506174762"><STRONG>The Promise of Modern Data Lakes</STRONG></H2><P><SPAN>Modern data lake technologies represent a paradigm shift in how enterprises can manage and utilize their data. These platforms bring transformative capabilities through their support for industry-standard formats, enabling seamless data exchange across the enterprise. Their flexible schema management adapts to changing business needs, while enterprise-grade reliability is ensured through ACID transactions. The ability to access and restore historical data states, combined with unified processing for both batch and streaming workloads, creates a robust foundation for next-generation enterprise data management.</SPAN></P>
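<P>To make those capabilities concrete, the sketch below uses Delta Lake, one widely adopted open table format with the properties described above; the choice of format, table names, and schema here are illustrative assumptions rather than a prescription of this article. It shows a transactional upsert and a time-travel read over the same table.</P><PRE># Illustrative sketch: ACID upsert and time travel with an open table format
# (Delta Lake here). Assumes the target table finance.postings already exists;
# names and schema are placeholders.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# New postings extracted from an SAP source (schema is illustrative)
updates = spark.createDataFrame(
    [("4500000001", "EUR", 1200.50)],
    ["document_id", "currency", "amount"],
)

target = DeltaTable.forName(spark, "finance.postings")

# MERGE gives transactional upserts that batch and streaming writers can share safely
(target.alias("t")
    .merge(updates.alias("u"), "t.document_id = u.document_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute())

# Time travel: reproduce an earlier state of the table, e.g. for an audit
snapshot = spark.sql("SELECT * FROM finance.postings VERSION AS OF 0")
snapshot.show()</PRE>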
<H2 id="toc-hId-1309661257"><STRONG>The Convergence: A New Enterprise Data Architecture</STRONG></H2><P><SPAN>The future lies in architectures that bring together the best of both worlds – SAP's robust business processes and the flexibility of modern data lakes. This next-generation architecture reimagines how enterprise data flows and interacts:</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Untitled diagram-2025-02-09-090222.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/223917i91AFA408420F800C/image-size/large?v=v2&amp;px=999" role="button" title="Untitled diagram-2025-02-09-090222.png" alt="Untitled diagram-2025-02-09-090222.png" /></span></P><P class="lia-align-center" style="text-align: center;"><SPAN>Future architecture</SPAN></P><P><SPAN>This convergence enables a new approach to data governance, where metadata management, data quality rules, and lineage tracking are centralized across both SAP and non-SAP data. Organizations can achieve zero-latency access to operational data with instant synchronization across systems, while performing real-time analytics without impacting transaction systems.</SPAN></P><P><SPAN>Perhaps most importantly, this convergence creates an AI-ready data foundation. By standardizing data formats for AI/ML workloads and enabling the integration of structured and unstructured data, organizations can fully leverage the power of large language models and generative AI across their enterprise data landscape.</SPAN></P><H2 id="toc-hId-1113147752"><STRONG>Practical Implementation Strategies</STRONG></H2><P><SPAN>Organizations looking to modernize their SAP data architecture should begin with high-value scenarios that require integrated data, particularly in areas where real-time insights drive business value. Projects that combine SAP and external data often yield the most significant returns and provide valuable learning experiences for teams. The foundation of this modernization is implementing modern data lake capabilities alongside existing systems while establishing unified governance frameworks. Creating semantic layers that abstract technical complexity enables broader adoption across the organization. Organizations must also prepare their data structures for AI workloads, implementing robust data quality measures and building pipelines for continuous data refreshes.</SPAN></P><H2 id="toc-hId-916634247"><STRONG>Conclusion: Preparing for Tomorrow</STRONG></H2><P><SPAN>The convergence of SAP systems with modern data lake technologies isn't just a technical evolution – it's a business imperative. Organizations that successfully navigate this transformation will find themselves better positioned to accelerate innovation through integrated data access, improve decision-making with real-time insights, enable AI-driven business processes, and maintain competitive advantage in the digital economy.</SPAN></P><P><SPAN>The question isn't whether to embrace this transformation, but how to implement it in a way that maximizes business value while minimizing disruption. As technology leaders, our role is to guide our organizations through this evolution, ensuring we build data architectures that are not just modern, but future-ready.</SPAN></P><P><SPAN>What steps is your organization taking to modernize its enterprise data architecture?
The journey toward unified, AI-ready data platforms is just beginning, and the decisions we make today will shape our ability to compete tomorrow.</SPAN></P><P><BR /><BR /></P> 2025-02-10T04:53:39.311000+01:00 https://community.sap.com/t5/sap-for-utilities-blog-posts/announcing-the-water-track-at-the-sap-for-energy-and-utilities-conference/ba-p/14012654 Announcing the Water Track at the SAP for Energy and Utilities Conference 2025 2025-02-10T16:01:41.866000+01:00 MiquelCarbo https://community.sap.com/t5/user/viewprofilepage/user-id/176877 <P>For the third consecutive year, we are happy to announce a dedicated track for water and wastewater companies in the <STRONG>SAP for Energy and Utilities Conference</STRONG> <A href="https://tac-insights.com/events/sap-for-energy-and-utilities-conference/agenda/" target="_self" rel="nofollow noopener noreferrer">agenda</A> for this year.</P><P>It will take place on <A href="https://tac-insights.com/events/sap-for-energy-and-utilities-conference/agenda/" target="_blank" rel="noopener nofollow noreferrer">Wednesday afternoon, April 9th</A>,&nbsp;with the aim of recognizing the success of this track in previous conference editions and of continuing to give the floor to a selection of exceptional professionals who are not only presenting, but also building relationships and developing partnerships within this industry group.</P><P>Last year’s agenda showcased the concept architecture built by SAP to support water management processes, named the “Smart Platform for Water”, helping organizations leverage their software investments to tackle processes such as smart meter data management. This year’s agenda focuses on cases where the SAP portfolio of applications and technologies can help organizations confront their challenges and obtain operational benefits.</P><P><STRONG><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="inge opreel.jpg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/224312iAA643991ED6C35D0/image-size/large?v=v2&amp;px=999" role="button" title="inge opreel.jpg" alt="inge opreel.jpg" /></span></STRONG></P><P><STRONG>Mrs Inge Opreel, Farys CIO</STRONG></P><P>I am delighted to inform you about the presentations of the Water Track:</P><UL><LI><STRONG>SAP for Water and Wastewater Services: Building the Resilient Water Company (SAP Keynote)</STRONG></LI><LI><STRONG>ESRI ArcGIS on HANA: Building Next Generation of Integrated Asset Management at FARYS</STRONG></LI><LI><STRONG>Streaming Service Operations: SAP Field Service Management (FSM) Integration at Vitens</STRONG></LI></UL><P>The water-specific agenda will not be restricted to this track: highly recommended water-specific and water-relevant sessions before and after the Water Track make up an entire agenda for delegates from this industry:</P><P><STRONG><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="business data cloud.jpg" style="width: 984px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/224313i4516DF41540B8131/image-size/large?v=v2&amp;px=999" role="button" title="business data cloud.jpg" alt="business data cloud.jpg" /></span></STRONG></P><P><STRONG>SAP Business Data Cloud</STRONG></P><P><STRONG>Wednesday morning, April 9th&nbsp;</STRONG>(before the Water Track): the suggested slots unveil the large ongoing updates in the Asset Management portfolio, including an interesting use case of Asset Performance Management as well as an
excellent presentation by SAP Product Development on the evolution of the SAP analytics portfolio, namely SAP Business Data Cloud:</P><UL><LI><STRONG>Latest Innovations in SAP Asset Management </STRONG>(Asset Management track)</LI><LI><STRONG>Business Data Cloud – Unlocking the Power of Unified Data</STRONG></LI><LI><STRONG>SAP Asset Performance Management (APM) at AkerBP:&nbsp; From Test Environment to Production</STRONG></LI></UL><P><STRONG>Thursday morning, April 10th</STRONG> will continue the water-specific cases, presenting the application of AI with the SAP technology platform at FARYS and a live demonstration at the Pop-up Campus (which will require pre-booking).</P><P><span class="lia-inline-image-display-wrapper lia-image-align-left" image-alt="SAP Pop Up Campus.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/224365iFE8B27A77E990EF6/image-size/medium?v=v2&amp;px=400" role="button" title="SAP Pop Up Campus.png" alt="SAP Pop Up Campus.png" /></span></P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P><EM>Inside of SAP Pop-Up Campus</EM></P><P>There will also be a very interesting use case of a RISE conversion, told from a perspective very close to the reality of many water companies:</P><UL><LI><STRONG>Unlocking the Potential of AI with SAP Business Technology Platform (BTP) and Datasphere</STRONG> (SAP Microforum at 10:30 AM)</LI><LI><STRONG>A Day in a Life of a Water Drop</STRONG>: <STRONG>Deep Dive into the Benefits of Asset Management at the Lifecycle of a Waterdrop</STRONG> (session needs to be pre-booked via the app)</LI><LI><STRONG>Transformation from OnPrem to SAP RISE</STRONG>: <STRONG>Lessons from Andel’s Journey</STRONG> (AI Track)</LI></UL><P>Overall, the complete conference is designed for an era based on<STRONG> networks, relationships, and reputation, where collaboration and developing partnerships are the keys to success.</STRONG> Such relationships cannot be automated, though, <SPAN>and the soul of this track, and of the overall water community, is the inheritance of the Executive Value Network for Water and Wastewater Companies workshops. This is a regular meeting of water companies that has taken place in Europe (already six editions), Latin America (three), and now Africa. Participants highlight common challenges and work through case studies of how SAP customers are leveraging SAP solutions to address them.
</SPAN><STRONG>Take this unique opportunity to connect with your peers, gain insights into industry trends, and chart your development plan with the latest technologies of 2025!</STRONG></P><P>&nbsp;Looking forward to seeing you in Rotterdam!</P><P>&nbsp;<A href="https://tac-insights.com/events/sap-for-energy-and-utilities-conference/" target="_self" rel="nofollow noopener noreferrer"><span class="lia-inline-image-display-wrapper lia-image-align-left" image-alt="EUC Banner Register Now Email Campaign 2.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/224349i21517B4D8ADDD6D3/image-size/medium?v=v2&amp;px=400" role="button" title="EUC Banner Register Now Email Campaign 2.png" alt="EUC Banner Register Now Email Campaign 2.png" /></span></A></P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P><P>&nbsp;</P> 2025-02-10T16:01:41.866000+01:00 https://community.sap.com/t5/technology-blog-posts-by-sap/unlock-the-power-of-cloud-ai-and-business-transformation-at-sap-sapphire/ba-p/14074005 🚀 Unlock the Power of Cloud, AI, and Business Transformation at SAP Sapphire 2025 in Madrid 2025-04-12T09:23:10.254000+02:00 oliverhuschke https://community.sap.com/t5/user/viewprofilepage/user-id/35096 <P><SPAN><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="AdobeStock_1351012730.jpeg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/249980iA3359038D5A989D0/image-size/large?v=v2&amp;px=999" role="button" title="AdobeStock_1351012730.jpeg" alt="AdobeStock_1351012730.jpeg" /></span></SPAN></P><P><SPAN>From <A href="https://www.sap.com/events/sapphire/madrid.html" target="_self" rel="noopener noreferrer"><STRONG>May 26 to 28, SAP Sapphire in Madrid</STRONG></A> will bring together the brightest minds, boldest innovations, and most impactful customer stories across the SAP ecosystem. Whether you're embarking on your RISE with SAP journey, transforming your support strategy with AI, or streamlining operations with SAP Cloud ALM, this is your chance to connect, learn, and get inspired.</SPAN></P><P><SPAN>If you're planning your agenda, make sure to explore the powerful lineup of sessions hosted by our SAP Customer Support &amp; Cloud Lifecycle Management team. These sessions dive deep into real customer use cases, cutting-edge tools, and strategic insights that can help you unlock value, improve resilience, and stay ahead in today’s rapidly evolving business landscape.</SPAN></P><P><SPAN><span class="lia-unicode-emoji" title=":sparkles:">✨</span></SPAN><SPAN><STRONG>Don't miss these highlights from our team at SAP Sapphire Madrid 2025</STRONG></SPAN></P><P><SPAN><span class="lia-unicode-emoji" title=":round_pushpin:">📍</span></SPAN><SPAN><STRONG>Main Theater &amp; Breakout Sessions – Tuesday, May 27</STRONG></SPAN></P><P><A href="https://www.sap.com/events/sapphire/flow/sap/sm25/catalog-inperson/page/catalog/session/1741233073735001KoTL" target="_self" rel="noopener noreferrer"><SPAN><STRONG>SER1211</STRONG></SPAN></A><SPAN> | 01:00 PM | Room 10.17</SPAN><BR /><SPAN><STRONG>Achieving a clean core with your RISE with SAP journey</STRONG></SPAN><BR /><SPAN><EM>Wieland Schreiner</EM></SPAN><BR /><SPAN>Learn how RISE with SAP enables a clean core in your transition to cloud ERP. 
See how SAP Cloud ALM supports transformation across five key dimensions and discover proven strategies for sustainable business success.</SPAN></P><P><A href="https://www.sap.com/events/sapphire/flow/sap/sm25/catalog-inperson/page/catalog/session/1741233279411001vM5u" target="_self" rel="noopener noreferrer"><SPAN><STRONG>SER1213</STRONG></SPAN></A><SPAN> | 01:30 PM | Room 10.17</SPAN><BR /><SPAN><STRONG>Accelerating success: Unlocking value with smooth customer onboarding</STRONG></SPAN><BR /><SPAN><EM>Lee Evans</EM></SPAN><BR /><SPAN>Discover how structured onboarding can fast-track value realization. This session shares strategies to drive adoption, increase customer lifetime value, and build a solid foundation for ongoing success.</SPAN></P><P><A href="https://www.sap.com/events/sapphire/flow/sap/sm25/catalog-inperson/page/catalog/session/1741233186947001Rzhr" target="_self" rel="noopener noreferrer"><SPAN><STRONG>SER1212</STRONG></SPAN></A><SPAN> | 02:00 PM | Room 10.19</SPAN><BR /><SPAN><STRONG>Streamline your RISE with SAP journey with the power of SAP Cloud ALM</STRONG></SPAN><BR /><SPAN><EM>Wieland Schreiner</EM></SPAN><BR /><SPAN>Explore how SAP Cloud ALM works with SAP LeanIX and SAP Signavio to simplify landscape discovery, optimize processes, and enhance visibility. Hear success stories and practical takeaways for smarter transformation.</SPAN></P><P><A href="https://www.sap.com/events/sapphire/flow/sap/sm25/catalog-inperson/page/catalog/session/1742798561047001Ijpk" target="_self" rel="noopener noreferrer"><SPAN><STRONG>SER1372</STRONG></SPAN></A><SPAN> | 02:30 PM | Room 10.15</SPAN><BR /><SPAN><STRONG>Fast, Smart, and Scalable – How Bosch Leverages SAP S/4HANA Public Cloud</STRONG></SPAN><BR /><SPAN><EM>Stefan Steinle, Bastian Kloiber (Bosch)</EM></SPAN><BR /><SPAN>Get an inside look at Bosch’s two-tier ERP strategy using SAP S/4HANA Public Cloud. Learn how automation, SAP Best Practices, and close collaboration with SAP helped ensure smooth rollouts and operational success.</SPAN></P><P><A href="https://www.sap.com/events/sapphire/flow/sap/sm25/catalog-inperson/page/catalog/session/1741233473888001Qwkb" target="_self" rel="noopener noreferrer"><SPAN><STRONG>SER1215</STRONG></SPAN></A><SPAN> | 03:00 PM | Room 10.18</SPAN><BR /><SPAN><STRONG>Elevating your GROW with SAP journey with SAP Cloud ALM</STRONG></SPAN><BR /><SPAN><EM>Tonja Kehrer</EM></SPAN><BR /><SPAN>Discover how SAP Cloud ALM can accelerate your GROW with SAP journey with real-time insights, demo-driven learning, and innovation strategies for scalable growth.</SPAN></P><P><A href="https://www.sap.com/events/sapphire/flow/sap/sm25/catalog-inperson/page/catalog/session/1741233374809001THRG" target="_self" rel="noopener noreferrer"><SPAN><STRONG>SER1214</STRONG></SPAN></A><SPAN> | 03:30 PM | Room 10.18</SPAN><BR /><SPAN><STRONG>Thriving in holiday peaks: How AI transforms system resilience</STRONG></SPAN><BR /><SPAN><EM>Stefan Steinle, Suhaim Mohamed (Douglas)</EM></SPAN><BR /><SPAN>Explore how AI-driven holiday readiness safeguarded €11 billion in GMV for 143 customers. 
Learn how this initiative transforms system resilience and prepares your business for peak performance.</SPAN></P><P><SPAN><span class="lia-unicode-emoji" title=":round_pushpin:">📍</span></SPAN><SPAN><STRONG>Ask the Expert Sessions – Tuesday, May 27 | Room 10.201</STRONG></SPAN></P><P><A href="https://www.sap.com/events/sapphire/flow/sap/sm25/catalog-inperson/page/catalog/session/1741233942773001fAzu" target="_self" rel="noopener noreferrer"><SPAN><STRONG>SER1218</STRONG></SPAN></A><SPAN> | 01:30 PM</SPAN><BR /><SPAN><STRONG>Maximizing business value with AI-driven support tools</STRONG></SPAN><BR /><SPAN><EM>Wilhelm Juette</EM></SPAN><BR /><SPAN>Explore how SAP Cloud ALM, Built-In Support, and the ITSM connector help you tailor support and drive value through intelligent automation.</SPAN></P><P><A href="https://www.sap.com/events/sapphire/flow/sap/sm25/catalog-inperson/page/catalog/session/1741234059084001HOm6" target="_self" rel="noopener noreferrer"><SPAN><STRONG>SER1219</STRONG></SPAN></A><SPAN> | 02:00 PM</SPAN><BR /><SPAN><STRONG>AI – From A to Impact: A case study to measure business value with AI</STRONG></SPAN><BR /><SPAN><EM>Wilhelm Juette</EM></SPAN><BR /><SPAN>See how SAP uses process mining and data insights to create a value measurement framework in the world of agentic AI.</SPAN></P><P><A href="https://www.sap.com/events/sapphire/flow/sap/sm25/catalog-inperson/page/catalog/session/1741234140043001kv1Y" target="_self" rel="noopener noreferrer"><SPAN><STRONG>SER1220</STRONG></SPAN></A><SPAN> | 02:30 PM</SPAN><BR /><SPAN><STRONG>Streamlining migration to SAP S/4HANA with lean selective data transition</STRONG></SPAN><BR /><SPAN><EM>Thorsten Spihlmann</EM></SPAN><BR /><SPAN>Get hands-on strategies using the SAP Business Transformation Center to simplify and automate your SAP S/4HANA migration while focusing on what matters most.</SPAN></P><P><A href="https://www.sap.com/events/sapphire/flow/sap/sm25/catalog-inperson/page/catalog/session/1741234264004001OH16" target="_self" rel="noopener noreferrer"><SPAN><STRONG>SER1221</STRONG></SPAN></A><SPAN> | 03:00 PM</SPAN><BR /><SPAN><STRONG>Safeguarding operations during the smooth transition to SAP S/4HANA Cloud</STRONG></SPAN><BR /><SPAN><EM>Mirja Kempin, Markus Winter</EM></SPAN><BR /><SPAN>Learn how to keep business continuity front and center during your move to SAP S/4HANA Cloud Private Edition.</SPAN></P><P><A href="https://www.sap.com/events/sapphire/flow/sap/sm25/catalog-inperson/page/catalog/session/1741234378182001exJr" target="_self" rel="noopener noreferrer"><SPAN><STRONG>SER1222</STRONG></SPAN></A><SPAN> | 03:30 PM</SPAN><BR /><SPAN><STRONG>Using AI and data-driven insights for a smooth customer support experience</STRONG></SPAN><BR /><SPAN><EM>Vivian Luechau-de la Roche</EM></SPAN><BR /><SPAN>Explore how AI and tools like SAP for Me are streamlining support by surfacing relevant case history, resolution paths, and customer insights.</SPAN></P><P><SPAN><span class="lia-unicode-emoji" title=":direct_hit:">🎯</span></SPAN><SPAN><STRONG>Why You Should Attend</STRONG></SPAN></P><P><SPAN>From onboarding to migration, support to system resilience—these sessions are packed with real-world insight, customer voices, and practical frameworks to drive your transformation forward. 
They also highlight SAP’s ongoing commitment to intelligent, cloud-based innovation that delivers measurable outcomes.</SPAN></P><P><SPAN><span class="lia-unicode-emoji" title=":spiral_calendar:">🗓</span>️ </SPAN><SPAN><STRONG>Join us in Madrid!</STRONG></SPAN></P><P><SPAN>We can’t wait to meet you onsite—stop by our sessions, ask your toughest questions, and connect with our experts to explore how AI, automation, and cloud transformation can take your business to the next level.</SPAN></P><P><SPAN>Let’s build the future together at <A href="https://www.sap.com/events/sapphire/madrid.html" target="_self" rel="noopener noreferrer"><STRONG>SAP Sapphire Madrid 2025</STRONG></A>.</SPAN></P><P><SPAN>#SAPSapphire #SAPCommunity #CustomerSupport #SAPCloudALM #AI #RISEwithSAP #GROWWITHSAP</SPAN></P><P>&nbsp;</P> 2025-04-12T09:23:10.254000+02:00 https://community.sap.com/t5/technology-blog-posts-by-sap/lets-talk-to-the-database-cap-amp-sqlite-powered-by-google-gemini-2-5-pro/ba-p/14077754 Let's Talk to the Database - CAP & SQLite Powered by Google Gemini-2.5-Pro 2025-04-16T14:08:37.441000+02:00 ShivamShuklaSAP https://community.sap.com/t5/user/viewprofilepage/user-id/1859416 <P><EM>I am sure everyone is doing great !!!</EM></P><P class="lia-align-justify" style="text-align : justify;">I am glad and excited to share my learning on talking to the database, I mean literally talking to the database in natural language, powered by Google Gemini-2.5-pro-exp and handled by SAP CAP programming.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ShivamShuklaSAP_3-1744803613376.png" style="width: 434px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/251466i9EA1A1E439C25D98/image-dimensions/434x283?v=v2" width="434" height="283" role="button" title="ShivamShuklaSAP_3-1744803613376.png" alt="ShivamShuklaSAP_3-1744803613376.png" /></span></P><P><STRONG>So what do we need to achieve this end to end?</STRONG></P><UL><LI>Access to the Gemini-2.5-pro experimental version (paid); get your API keys to interact through API calls</LI><LI>A CAP application with the Sales Order Header / Line Item / Deliveries entities deployed</LI><LI>SQLite installed as the data storage for all three tables</LI><LI>Some dummy data in your DB tables, so that later we can validate the data, interact with the database to fetch the requested information, and run SELECT statements on top of it</LI></UL><P>Test out the Gemini API using Postman before you jump into this (a minimal Python equivalent is sketched below).</P>
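<P>If you prefer a script over Postman, here is a minimal Python sketch of the same smoke test. The endpoint and payload shape follow Google's public REST API for Gemini, and the model name is the one used in this post; verify both against the current documentation before relying on them.</P><pre class="lia-code-sample language-python"><code># Quick smoke test of the Gemini REST API (Python stand-in for the Postman call).
# Assumes `requests` is installed and GEMINI_API_KEY is set in the environment.
import os
import requests

MODEL = "gemini-2.5-pro-exp"  # model name as used in this post; verify availability
url = f"https://generativelanguage.googleapis.com/v1beta/models/{MODEL}:generateContent"

payload = {"contents": [{"parts": [{"text": "Reply with the word PONG."}]}]}
headers = {"x-goog-api-key": os.environ["GEMINI_API_KEY"]}

resp = requests.post(url, json=payload, headers=headers, timeout=60)
resp.raise_for_status()

# The generated text sits under candidates -> content -> parts
print(resp.json()["candidates"][0]["content"]["parts"][0]["text"])</code></pre>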
<P><STRONG>Data Prep:</STRONG></P><UL><LI>Extracted data from Sales Order Header (VBAK)</LI><LI>Extracted data from Sales Line Item (VBAP)</LI><LI>Extracted data from Delivery (LIPS)</LI></UL><P><STRONG>Tasks Break Down:</STRONG></P><P><STRONG>Prompt Prep</STRONG>:</P><OL><LI>Prepare the prompt and provide enough context to generate SQL</LI><LI>Example prompt – Give me the number of Sales Orders in the system</LI><LI>The prompt specifies that a count is required from the Sales Order Header table</LI></OL><P><STRONG>Context</STRONG><SPAN> -&nbsp;</SPAN></P><OL><LI>Sales Order Header = SalesOrderH Table</LI><LI>Sales Order Item = SalesOrderItem</LI></OL><P><STRONG>Instruction – </STRONG></P><P>`You are a specialized SQL query assistant for a computer store database. Your primary goal is to answer user questions by retrieving data using the available tools.</P><P>Database Context:</P><P>The key tables you will interact with are:</P><P>&nbsp; &nbsp;'sp_prompts_Delivery': Contains Delivery Data.</P><P>&nbsp; &nbsp;'sp_prompts_SalesOrderHeader': Contains Sales Order Header Data (often referred to as just Sales Orders).</P><P>&nbsp; &nbsp;'sp_prompts_SalesOrderItem': Contains detailed Sales Order Item Data. `</P><P>&nbsp;</P><P><STRONG>Identify Tables</STRONG>:</P><OL><LI>Develop a Node.js API to extract the table names specified</LI><LI>Develop a Node.js API to list all the tables</LI></OL><P>Example&nbsp;- <SPAN><A href="http://localhost:4004/odata/v4/my/listTables" target="_blank" rel="noopener nofollow noreferrer">http://localhost:4004/odata/v4/my/listTables</A></SPAN></P><P class="lia-align-center" style="text-align: center;"><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ShivamShuklaSAP_4-1744804130786.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/251473i26E6BA17CF10B2DD/image-size/medium?v=v2&amp;px=400" role="button" title="ShivamShuklaSAP_4-1744804130786.png" alt="ShivamShuklaSAP_4-1744804130786.png" /></span></P><P><STRONG>Benefit</STRONG> – this helps find the exact matching table while building the SELECT query, and it helps prepare the right context for the model input.</P><P><STRONG>Schema Extraction</STRONG>: Develop a Node.js API to extract the schema of the specified tables (entities); it extracts the schema for any given table input.</P><P>Example –</P><P><A href="http://localhost:4004/odata/v4/my/describeDbTable" target="_blank" rel="noopener nofollow noreferrer">http://localhost:4004/odata/v4/my/describeDbTable</A></P><P>Content-Type: application/json</P><P>{&nbsp;&nbsp;&nbsp;"tableName":"sp_prompts_SalesOrderHeader"&nbsp; }</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ShivamShuklaSAP_5-1744804236785.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/251476i2D5452B019C00C94/image-size/medium?v=v2&amp;px=400" role="button" title="ShivamShuklaSAP_5-1744804236785.png" alt="ShivamShuklaSAP_5-1744804236785.png" /></span></P><P><STRONG>Benefit: </STRONG>this helps prepare the schema for tables in case we need only a few columns or specific columns like Sales Order Types / Delivery Types, etc.</P><P>&nbsp;</P><P><STRONG>Execute DB Query: </STRONG>This is the most important one: once the model has generated the SELECT statement for the input, this executes the query and provides the response.</P><P>Example – Prompt - SELECT * FROM sp_prompts_SalesOrderItem</P><P>Response:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ShivamShuklaSAP_6-1744804267933.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/251477iBE0A13AB39D37A7F/image-size/medium?v=v2&amp;px=400" role="button" title="ShivamShuklaSAP_6-1744804267933.png" alt="ShivamShuklaSAP_6-1744804267933.png" /></span></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ShivamShuklaSAP_7-1744804290070.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/251479i5A807A6998724290/image-size/medium?v=v2&amp;px=400" role="button" title="ShivamShuklaSAP_7-1744804290070.png" alt="ShivamShuklaSAP_7-1744804290070.png" /></span></P><P><STRONG>Benefit: </STRONG>it will run the DB statements and return the results.</P>
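<P>To make these helper endpoints concrete, here is a minimal Python sketch that calls them the same way the screenshots above do. The endpoint paths and the JSON body are taken from the examples above; the CAP service is assumed to be running locally on port 4004.</P><pre class="lia-code-sample language-python"><code># Exercise the CAP helper endpoints shown above (service assumed at localhost:4004).
import requests

BASE = "http://localhost:4004/odata/v4/my"

# 1) List all tables available in the SQLite database
tables = requests.get(f"{BASE}/listTables", timeout=30)
print(tables.json())

# 2) Describe the schema of one table (JSON body as in the example above)
schema = requests.post(
    f"{BASE}/describeDbTable",
    json={"tableName": "sp_prompts_SalesOrderHeader"},
    timeout=30,
)
print(schema.json())</code></pre>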
<P>&nbsp;</P><P><STRONG>Prepare Response: </STRONG></P><P>This comes into the picture when we request some information and expect the response in plain English.</P><P>Example – Give me the number of Sales Orders and Sales Items in the system</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ShivamShuklaSAP_8-1744804320192.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/251480i34FC5246135E18B2/image-size/medium?v=v2&amp;px=400" role="button" title="ShivamShuklaSAP_8-1744804320192.png" alt="ShivamShuklaSAP_8-1744804320192.png" /></span></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ShivamShuklaSAP_9-1744804320195.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/251481iA0268A8D9C44C3B4/image-size/medium?v=v2&amp;px=400" role="button" title="ShivamShuklaSAP_9-1744804320195.png" alt="ShivamShuklaSAP_9-1744804320195.png" /></span></P><P>If you look at the generated SQL, you will get an idea of what is happening behind the scenes.</P><P><STRONG>Generated SQL</STRONG></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ShivamShuklaSAP_10-1744804320199.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/251482i6B82B2E03E48CF08/image-size/medium?v=v2&amp;px=400" role="button" title="ShivamShuklaSAP_10-1744804320199.png" alt="ShivamShuklaSAP_10-1744804320199.png" /></span></P><P><STRONG>CAP Structure:</STRONG></P><P>This is what my CAP structure looks like.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ShivamShuklaSAP_11-1744804381662.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/251484i2E843A3BC29BA6E3/image-size/medium?v=v2&amp;px=400" role="button" title="ShivamShuklaSAP_11-1744804381662.png" alt="ShivamShuklaSAP_11-1744804381662.png" /></span></P><P><STRONG>QueryEngine</STRONG> contains the main API call to Gemini, which subsequently calls all the functions required to get the data back from the database, as shown below.</P><P><STRONG>AI Model Used: &nbsp;</STRONG>Gemini-2.5-pro-exp</P><P>Initialize the GenAI model as below.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ShivamShuklaSAP_12-1744804407189.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/251485i142D09B94C423CD8/image-size/medium?v=v2&amp;px=400" role="button" title="ShivamShuklaSAP_12-1744804407189.png" alt="ShivamShuklaSAP_12-1744804407189.png" /></span></P><P>As you can see, we have passed a set of instructions that contains information about the database; the tools are nothing but the list of functions that provide information about the tables and their schemas, plus the final function for executing the generated query.</P><P><STRONG>Dbquery.cds</STRONG></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ShivamShuklaSAP_13-1744804407201.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/251486i9B5C5B1C7F153E26/image-size/medium?v=v2&amp;px=400" role="button" title="ShivamShuklaSAP_13-1744804407201.png" alt="ShivamShuklaSAP_13-1744804407201.png" /></span></P><P>This action is responsible for prompt handling, and it returns an object of information.</P><P><STRONG>Dbquery.JS</STRONG></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ShivamShuklaSAP_14-1744804551408.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/251487i69DA416D2B273599/image-size/medium?v=v2&amp;px=400" role="button" title="ShivamShuklaSAP_14-1744804551408.png" alt="ShivamShuklaSAP_14-1744804551408.png" /></span></P><P>This handles all the incoming requests: QueryDatabase is the one triggered from outside; the rest of the functions are called internally and are listed for standalone testing.</P>
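<P>Conceptually, QueryDatabase orchestrates a small tool-calling loop. The self-contained Python sketch below shows that flow; it is not the actual CAP/Node.js handler, and the stub "model" always asks for execute_query so the control flow can run end to end. All names here (call_model, list_tables, execute_query) are hypothetical stand-ins for the handlers described above.</P><pre class="lia-code-sample language-python"><code># Hypothetical sketch of the tool-calling loop behind QueryDatabase.
# A real implementation would let Gemini pick the tool from its tool definitions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sp_prompts_SalesOrderHeader (id INTEGER, total REAL)")
conn.execute("INSERT INTO sp_prompts_SalesOrderHeader VALUES (1, 99.5), (2, 10.0)")

def list_tables():
    rows = conn.execute("SELECT name FROM sqlite_master WHERE type='table'")
    return [r[0] for r in rows]

def execute_query(sql):
    return conn.execute(sql).fetchall()

def call_model(messages):
    # Stand-in for the Gemini call: requests a tool once, then answers in English.
    if not any(m["role"] == "tool" for m in messages):
        return {"tool": "execute_query",
                "args": {"sql": "SELECT COUNT(*) FROM sp_prompts_SalesOrderHeader"}}
    return {"answer": f"There are {messages[-1]['result'][0][0]} sales orders."}

def query_database(user_prompt):
    tools = {"list_tables": lambda a: list_tables(),
             "execute_query": lambda a: execute_query(a["sql"])}
    messages = [{"role": "user", "text": user_prompt}]
    while True:
        reply = call_model(messages)
        if "answer" in reply:           # final plain-English response
            return reply["answer"]
        result = tools[reply["tool"]](reply["args"])   # run the requested tool
        messages.append({"role": "tool", "result": result})

print(query_database("Give me the number of Sales Orders in the system"))</code></pre>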
class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ShivamShuklaSAP_14-1744804551408.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/251487i69DA416D2B273599/image-size/medium?v=v2&amp;px=400" role="button" title="ShivamShuklaSAP_14-1744804551408.png" alt="ShivamShuklaSAP_14-1744804551408.png" /></span></P><P>This will handle all the incoming requests ---QueryDatabase is the one which will be triggered from outside rest of functions will be called internally , these are listed for standalone testing.</P><P><STRONG>Note</STRONG></P><P>I am also attaching a small Video Clip where you can see the execution of various prompts for technical implementation DM me directly , i can help you in building this POC.</P><P><div class="video-embed-center video-embed"><iframe class="embedly-embed" src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FC00Ut0QeDng%3Ffeature%3Doembed&amp;display_name=YouTube&amp;url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DC00Ut0QeDng&amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2FC00Ut0QeDng%2Fhqdefault.jpg&amp;type=text%2Fhtml&amp;schema=youtube" width="200" height="112" scrolling="no" title="CAP and SQLite Powered by Google Gemini 2.5 Pro" frameborder="0" allow="autoplay; fullscreen; encrypted-media; picture-in-picture;" allowfullscreen="true"></iframe></div></P><P>#KeepLearningKeepSharing</P><P>&nbsp;</P> 2025-04-16T14:08:37.441000+02:00 https://community.sap.com/t5/sap-teched-blog-posts/sap-teched-2025-slip-on-speed-up-the-next-decade-of-business/ba-p/14204442 SAP TechEd 2025: 👟 Slip On, Speed Up: The Next Decade of Business 2025-09-02T14:16:17.305000+02:00 lauramarwood https://community.sap.com/t5/user/viewprofilepage/user-id/44830 <H3 id="toc-hId-1343831468" id="toc-hId-1888190302">The world is in sprint mode ...</H3><P>.... AI cycles flip every quarter, supply chains get bendy, climate shocks rewrite plans. As Lenin put it, "there are weeks when decades happen." Winning 2030 isn’t about hoarding data or demo sparkle. It’s about <STRONG>cutting friction</STRONG>, <STRONG>staying focused</STRONG>, and <STRONG>moving as one</STRONG>.</P><P>SAP’s 2030 play:</P><UL><LI><STRONG>AI that removes grind -&nbsp;</STRONG>copilots, automation, anomaly detection.</LI><LI><STRONG>Data you can trust -&nbsp;</STRONG>clean, connected, decision-ready.</LI><LI><STRONG>Networks that sync the ecosystem -&nbsp;</STRONG>suppliers, partners, customers moving together</LI></UL><P>The metaphor? <STRONG>Lock laces.</STRONG> Triathletes don’t stop to tie shoes. They slip on and go. In finance, AI is the lock lace: auto-recs, instant cleanup, fewer errors. Hours back. Focus up. Strategy on.</P><P>This is the shift: from tying shoes to breaking records. From reconciling ledgers to reshaping the business.</P><P><A href="https://www.sap.com/events/teched/berlin/flow/sap/te25/catalog-inperson/page/catalog/session/1753971252933001gofy" target="_self" rel="noopener noreferrer">Join <STRONG>Paul Saunders</STRONG> and <STRONG>Laura Marwood Ph.D.</STRONG></A>&nbsp;at SAP TechEd 2025 to see how we’re building for that future - because in 2030, every second (and every decision) will count.<BR />__________________________________<BR /><BR /><STRONG>Can't make it to Berlin? Dial in for SAP TechEd Virtual!</STRONG></P><P>If you can’t make it to Berlin, we’ve still got you covered. 
<A href="https://www.sap.com/events/teched/virtual.html" target="_blank" rel="noopener noreferrer">SAP TechEd Virtual</A> is a free online event that’ll help you broaden your technical expertise and explore SAP’s latest innovations<BR /><BR /><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="asasa.jpg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/307878iC9DEC57B450EDCDE/image-size/large?v=v2&amp;px=999" role="button" title="asasa.jpg" alt="asasa.jpg" /></span></P><P>&nbsp;</P> 2025-09-02T14:16:17.305000+02:00 https://community.sap.com/t5/human-capital-management-blog-posts-by-members/optimizing-dynamic-group-creation-and-simplifying-rule-for-new-hire-data/ba-p/14247584 Optimizing Dynamic Group Creation and simplifying rule for New Hire Data Review 2025-10-17T21:34:56.085000+02:00 saurabhgrover8 https://community.sap.com/t5/user/viewprofilepage/user-id/36894 <P>&nbsp;</P><P><STRONG>Optimizing Dynamic Group Creation and New Hire Data Review</STRONG></P><P>&nbsp;</P><P>My client had a large number of offices and training locations—<STRONG>around 250</STRONG>+—each responsible for reviewing new hire data for incoming employees. This created a need for multiple <STRONG>Dynamic Groups</STRONG> and a complex set of rules to select the correct locations based on predefined criteria.</P><P>&nbsp;</P><P>To simplify this, we implemented a structured solution, which I will explain step by step below.</P><P>&nbsp;</P><P><STRONG>Step 1</STRONG>: <STRONG>Configuring the Training Location Field in Recruiting</STRONG></P><P>&nbsp;</P><P>At the application stage, where the correct location for the employee is selected during the recruiting process, we created a <STRONG>Training Location</STRONG> field with the same picklist values as in Employee Central (EC).</P><P>&nbsp;</P><P>This field carries the value over to EC, which becomes the <STRONG>**key deciding factor**</STRONG> for determining which group will be selected for reviewing new hire data.</P><P>&nbsp;</P><P><STRONG>Note</STRONG>: This blog does not cover how to create a field in the RCM application or how to map that field to ONB.</P><P>&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="saurabhgrover8_0-1760725352503.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/329375i04C86476E9F17DB0/image-size/medium?v=v2&amp;px=400" role="button" title="saurabhgrover8_0-1760725352503.png" alt="saurabhgrover8_0-1760725352503.png" /></span></P><P>&nbsp;</P><P>&nbsp;</P><P><STRONG>Step 2: Creating and Configuring the Object</STRONG></P><P>&nbsp;</P><P>Once the field was set up, we created an object as shown below.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="saurabhgrover8_1-1760725352508.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/329376iC157702BACEED998/image-size/medium?v=v2&amp;px=400" role="button" title="saurabhgrover8_1-1760725352508.png" alt="saurabhgrover8_1-1760725352508.png" /></span></P><P>&nbsp;</P><P>Please ensure that the following fields are configured as indicated:</P><P>&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="saurabhgrover8_2-1760725352511.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/329374i4FD1E22BF60CFFCB/image-size/medium?v=v2&amp;px=400" role="button" title="saurabhgrover8_2-1760725352511.png" 
alt="saurabhgrover8_2-1760725352511.png" /></span></P><P>&nbsp;</P><P>Object: cust_ONB2ResponsibilityDetailConfig</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="saurabhgrover8_3-1760725352514.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/329377i53A59966A5AB2C9B/image-size/medium?v=v2&amp;px=400" role="button" title="saurabhgrover8_3-1760725352514.png" alt="saurabhgrover8_3-1760725352514.png" /></span></P><P>&nbsp;</P><P>* The second field must match the value from Recruiting. It should have the same data type and field configuration as in Recruiting to enable proper mapping and data transfer to Onboarding.</P><P>&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="saurabhgrover8_4-1760725352524.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/329379i1FCBA95E99FB95CD/image-size/medium?v=v2&amp;px=400" role="button" title="saurabhgrover8_4-1760725352524.png" alt="saurabhgrover8_4-1760725352524.png" /></span></P><P>&nbsp;</P><P><STRONG>Step 3: Setting Permissions</STRONG></P><P>It’s essential to assign the correct permissions to the object before entering data.</P><P>&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="saurabhgrover8_5-1760725352525.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/329378i0D94C4837D6396C2/image-size/medium?v=v2&amp;px=400" role="button" title="saurabhgrover8_5-1760725352525.png" alt="saurabhgrover8_5-1760725352525.png" /></span></P><P>&nbsp;</P><P>Once the permissions are set, the object should appear as shown below:</P><P>&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="saurabhgrover8_6-1760725352527.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/329381i4D624A06E97E4C2C/image-size/medium?v=v2&amp;px=400" role="button" title="saurabhgrover8_6-1760725352527.png" alt="saurabhgrover8_6-1760725352527.png" /></span></P><P>&nbsp;</P><P>---</P><P>&nbsp;</P><P><STRONG>Step 4: Creating the Business Rule</STRONG></P><P>&nbsp;</P><P>Next, we designed a **business rule** to control the group selection logic. 
<HR /><P>&nbsp;</P><P><STRONG>Data Collection Object Configuration</STRONG></P><P>&nbsp;</P><P>The second part of the requirement involved handling <STRONG>various data collection UIs</STRONG>, which the customer wanted to display based on specific conditions.</P><P>&nbsp;</P><P>To achieve this, we created a <STRONG>new object definition</STRONG> as illustrated below:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="saurabhgrover8_8-1760725352540.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/329382iB2DF34C85F8554A9/image-size/medium?v=v2&amp;px=400" role="button" title="saurabhgrover8_8-1760725352540.png" alt="saurabhgrover8_8-1760725352540.png" /></span></P><P>&nbsp;</P><P>&nbsp;</P><P>We added the <STRONG>foundation object: Location Group</STRONG>, since each combination of <STRONG>Location Group + Company + Employee Group</STRONG> determines the type of object to be displayed.</P><P>&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="saurabhgrover8_9-1760725352548.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/329385i5B24669FE05021F8/image-size/medium?v=v2&amp;px=400" role="button" title="saurabhgrover8_9-1760725352548.png" alt="saurabhgrover8_9-1760725352548.png" /></span></P><P>&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="saurabhgrover8_10-1760725352554.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/329383i90DFD296CCD0DD26/image-size/medium?v=v2&amp;px=400" role="button" title="saurabhgrover8_10-1760725352554.png" alt="saurabhgrover8_10-1760725352554.png" /></span></P><P>&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="saurabhgrover8_11-1760725352561.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/329384i7A748BB118BA1D9B/image-size/medium?v=v2&amp;px=400" role="button" title="saurabhgrover8_11-1760725352561.png" alt="saurabhgrover8_11-1760725352561.png" /></span></P><P>&nbsp;</P><P>Additionally, the <STRONG>Object UI</STRONG> must include the following configuration:</P><P>&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="saurabhgrover8_12-1760725352567.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/329387i25CD32F42DCECCCF/image-size/medium?v=v2&amp;px=400" role="button" title="saurabhgrover8_12-1760725352567.png" alt="saurabhgrover8_12-1760725352567.png" /></span></P><P>&nbsp;</P><P>Once the object is uploaded, it should look like this:</P><P>&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="saurabhgrover8_13-1760725352575.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/329386iF57F50A60A16EC85/image-size/medium?v=v2&amp;px=400" role="button" title="saurabhgrover8_13-1760725352575.png" alt="saurabhgrover8_13-1760725352575.png" /></span></P>
alt="saurabhgrover8_13-1760725352575.png" /></span></P><P>&nbsp;</P><P>&nbsp;</P><P>Finally, we designed a <STRONG>Rule</STRONG> to pick the appropriate lookup table configuration**. The rule should resemble the structure shown below:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="saurabhgrover8_14-1760725352582.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/329388iDAB2F545653FBF48/image-size/medium?v=v2&amp;px=400" role="button" title="saurabhgrover8_14-1760725352582.png" alt="saurabhgrover8_14-1760725352582.png" /></span></P><P>&nbsp;</P><P>&nbsp;</P><P><STRONG>Conclusion</STRONG></P><P>&nbsp;</P><P>By standardizing the location field, creating a well-structured object, and designing targeted business rules, we were able to <STRONG>eliminate the need for lengthy line items</STRONG> in dynamic group creation. This approach not only improves system performance but also provides <STRONG>flexibility in data collection UI configuration</STRONG> based on location and organizational attributes.</P> 2025-10-17T21:34:56.085000+02:00 https://community.sap.com/t5/technology-blog-posts-by-members/sap-rpt-1-context-model-vs-training-classical-models-the-models-battle/ba-p/14268507 SAP RPT-1 Context Model vs. Training Classical Models: The Models Battle (Python Hands-on) 2025-11-20T07:50:27.670000+01:00 nicolasestevan https://community.sap.com/t5/user/viewprofilepage/user-id/1198632 <H2 id="toc-hId-1764768715"><span class="lia-unicode-emoji" title=":collision:">💥</span>The Models Battle</H2><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="nicolasestevan_5-1763206328497.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/341535i2A2C9A98D24BF43B/image-size/large/is-moderation-mode/true?v=v2&amp;px=999" role="button" title="nicolasestevan_5-1763206328497.png" alt="nicolasestevan_5-1763206328497.png" /></span></P><P>Predictive modeling is becoming a built-in capability across SAP, improving how teams handle forecasting, pricing, and planning. <STRONG>Many SAP professionals, however, aren’t machine-learning specialists</STRONG>, and traditional models often demand extensive setup, tuning, and repeated training, which slows down new ideas.</P><P><STRONG>SAP RPT-1</STRONG> offers a simpler path. 
It’s a pretrained model from SAP, also available in an OSS version, that lets developers and consultants produce predictions with far less technical effort; no deep ML background required.</SPAN></P><P>I've explored SAP RPT-1 hands-on, comparing it with traditional regressors using Python and a real public vehicle price dataset.&nbsp;</P><BLOCKQUOTE><P><STRONG>Goal:</STRONG> To see (as a non-Data Scientist) how <STRONG>SAP RPT-1</STRONG> behaves in practice, what advantages and limits it shows, and when it could make sense in a predictive scenario.</P></BLOCKQUOTE><P>Usually, for a real-world scenario, the right approach would be to consume SAP RPT-1 through the available, simplified API; but for study purposes and a fair comparison with other traditional ML models, the <STRONG>OSS</STRONG> version fits perfectly:</P><HR /><H2 id="toc-hId-1568255210"><span class="lia-unicode-emoji" title=":thinking_face:">🤔</span>&nbsp;SAP RPT-1 vs Traditional Machine Learning - Core Differences</H2><P>Before diving into the code, let’s quickly revisit how<STRONG> traditional ML</STRONG> models work:</P><UL><LI>Training-based models like Random Forest, LightGBM, and Linear Regression learn patterns directly from data.&nbsp;</LI><LI>They require hundreds or thousands of examples to tune their internal parameters.</LI><LI>Their performance depends heavily on data quantity and quality.</LI><LI>The more relevant examples they see, the smarter they get.</LI></UL><P>On the other hand, <STRONG>SAP RPT-1</STRONG> follows a different philosophy. It’s part of the RPT (Representational Predictive Transformer) family, pretrained on a wide variety of business and contextual data. This means:</P><UL><LI>You don’t "train" it in the traditional sense. Instead, it uses context embeddings to predict outcomes.</LI><LI>It can be used immediately, even with smaller datasets.</LI><LI>The OSS version allows developers to experiment directly in Python (see the minimal sketch after this list).</LI><LI>No special SAP backend required.</LI></UL><BLOCKQUOTE><P><STRONG>Outcome:</STRONG> Traditional ML models learn from large amounts of data. SAP RPT-1 already knows how to work from a small amount of context data.</P></BLOCKQUOTE><HR />
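<P>Here is the whole "no training" workflow in a few lines. The constructor parameters are the ones used in the experiment code later in this post; <CODE>X</CODE> and <CODE>y</CODE> are assumed to come from the preprocessing shown in the next section.</P><pre class="lia-code-sample language-python"><code># Minimal taste of the scikit-learn-style SAP RPT-1 OSS workflow.
# Parameters mirror the experiment code later in this post; X and y are the
# feature frame and target series prepared in the next section.
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sap_rpt_oss import SAP_RPT_OSS_Regressor

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = SAP_RPT_OSS_Regressor(max_context_size=8192, bagging=8)
model.fit(X_train, y_train)    # no gradient training: this builds the prediction context
preds = model.predict(X_test)  # predictions come from the pretrained transformer
print(r2_score(y_test, preds))</code></pre>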
<H2 id="toc-hId-1371741705"><span class="lia-unicode-emoji" title=":desktop_computer:">🖥</span>&nbsp;The Experiment - Setup &amp; Dataset&nbsp;</H2><P><EM>Don't worry about "playing puzzles" by copying and pasting the code below: the full version is available for download at the end!</EM></P><P>To make this comparison tangible, I built a simple yet realistic Python experiment to predict vehicle selling prices using a public dataset containing car attributes like make, model, year, transmission, and mileage.</P><P>Why vehicle pricing? Because it’s an intuitive example where both traditional machine learning and pretrained AI models can be applied, and it helps visualize how prediction quality evolves as the sample size grows.</P><P>This entire analysis runs on a local Python environment&nbsp;with the following stack:</P><pre class="lia-code-sample language-python"><code>import os
import gc
import warnings
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from datetime import datetime

from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score
from sklearn.ensemble import RandomForestRegressor
from sklearn.preprocessing import LabelEncoder
from sklearn.linear_model import LinearRegression
from sap_rpt_oss import SAP_RPT_OSS_Regressor
import lightgbm as lgb</code></pre><UL><LI><STRONG>pandas</STRONG> and <STRONG>numpy</STRONG> for data manipulation</LI><LI><STRONG>scikit-learn</STRONG> for classical ML regressors (<STRONG>Random Forest, Linear Regression</STRONG>)</LI><LI><STRONG>LightGBM</STRONG> for gradient-<STRONG>boosting</STRONG> comparison</LI><LI><STRONG>sap_rpt_oss</STRONG> — the open-source Python version of <STRONG>SAP’s RPT-1 model</STRONG></LI><LI><STRONG>matplotlib</STRONG> for all <STRONG>visualizations</STRONG></LI></UL><BLOCKQUOTE><P><STRONG>SAP RPT-1 OSS </STRONG>can be downloaded and installed by following the official Hugging Face page:&nbsp;<A title="https://huggingface.co/SAP/sap-rpt-1-oss?library=sap-rpt-1-oss" href="https://huggingface.co/SAP/sap-rpt-1-oss?library=sap-rpt-1-oss" target="_blank" rel="noopener nofollow noreferrer">https://huggingface.co/SAP/sap-rpt-1-oss?library=sap-rpt-1-oss</A>&nbsp;. Python can be installed with the executable download on Windows, via <STRONG>Homebrew</STRONG> on Mac, or with <STRONG>apt</STRONG> commands on Linux. Library dependencies can be installed with <STRONG>pip</STRONG> commands. A quick Google search should clear any road blocker.</P></BLOCKQUOTE>
<P>We use a sample&nbsp;vehicle sales dataset. The complete file is about 88 MB, but for this experiment a restricted sample of 20k rows is more than enough to prove the concept, while running faster and consuming fewer computing resources.</P><DIV class=""><DIV class=""><TABLE border="1" width="498px"><TBODY><TR><TD><STRONG>Feature</STRONG></TD><TD><STRONG>Description</STRONG></TD></TR><TR><TD><CODE>year</CODE></TD><TD>Vehicle model year</TD></TR><TR><TD><CODE>make</CODE></TD><TD>Brand (e.g., Toyota, Ford, BMW)</TD></TR><TR><TD><CODE>model</CODE></TD><TD>Specific model name</TD></TR><TR><TD><CODE>body</CODE></TD><TD>Type (SUV, Sedan, etc.)</TD></TR><TR><TD><CODE>transmission</CODE></TD><TD>Gear type</TD></TR><TR><TD><CODE>odometer</CODE></TD><TD>Vehicle mileage</TD></TR><TR><TD><CODE>color</CODE>, <CODE>interior</CODE></TD><TD>Visual attributes</TD></TR><TR><TD><CODE>sellingprice</CODE></TD><TD>The target variable to predict</TD></TR></TBODY></TABLE><P><STRONG><span class="lia-unicode-emoji" title=":bar_chart:">📊</span>&nbsp;Dataset Download:</STRONG>&nbsp;<A title="https://www.kaggle.com/datasets/syedanwarafridi/vehicle-sales-data?resource=download" href="https://www.kaggle.com/datasets/syedanwarafridi/vehicle-sales-data?resource=download" target="_blank" rel="noopener nofollow noreferrer">https://www.kaggle.com/datasets/syedanwarafridi/vehicle-sales-data?resource=download</A>&nbsp;</P><P>The dataset is loaded and preprocessed in a few simple steps:</P></DIV></DIV><pre class="lia-code-sample language-python"><code>df = pd.read_csv("car_prices.csv").sample(n=20000, random_state=42)

# Fill missing values for categorical columns
fill_defaults = {
    'make': 'Other',
    'model': 'Other',
    'color': 'Other',
    'interior': 'Unknown',
    'body': 'Unknown',
    'transmission': 'Unknown'
}
for col, val in fill_defaults.items():
    df[col] = df[col].fillna(val)

X = df[["year", "make", "model", "body", "transmission", "odometer", "color", "interior"]]
y = df["sellingprice"]</code></pre><P>At this point, the stage is set:</P><UL><LI>The data is clean.</LI><LI>The environment is ready.</LI><LI>All models, traditional and SAP RPT-1, are ready to be tested under identical conditions.</LI></UL><HR /><H2 id="toc-hId-1175228200"><span class="lia-unicode-emoji" title=":robot_face:">🤖</span>&nbsp;Training the Models - Three different ones</H2><P>With the dataset ready, the <STRONG>next step</STRONG> is to run each model under the same conditions: <STRONG>same features, same target, same train/test split and same random seed</STRONG>. This ensures the comparison is fair and repeatable.</P><P>We evaluate prediction performance using <STRONG>R² (coefficient of determination)</STRONG>, which indicates how much of the price variation the model can explain (R² = 1 − Σ(yᵢ − ŷᵢ)² / Σ(yᵢ − ȳ)²; 1.0 = perfect prediction).</P><HR /><H3 id="toc-hId-1107797414">Training Model #1 - Random Forest</H3><P>Random Forest is often the first model used in tabular ML. It works by creating <STRONG>many decision trees</STRONG> and averaging their predictions.
Before training, categorical variables need to be <STRONG>label-encoded</STRONG> into numbers, a common requirement for classical ML models:</SPAN></P><pre class="lia-code-sample language-python"><code>def train_random_forest(X, y):
    X = X.copy()
    cat_cols = ["make", "model", "body", "transmission", "color", "interior"]
    le = LabelEncoder()
    for col in cat_cols:
        X[col] = le.fit_transform(X[col].astype(str).fillna("Unknown"))

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=default_test_size, random_state=42
    )
    model = RandomForestRegressor(
        n_estimators=150,
        max_depth=20,
        random_state=42,
        n_jobs=-1
    )
    try:
        model.fit(X_train, y_train)
        preds = model.predict(X_test)
        r2 = r2_score(y_test, preds)
    except Exception as e:
        preds, r2 = np.zeros_like(y_test), 0
    return [preds, r2, y_test]</code></pre><H3 id="toc-hId-911283909">Up to 50 rows:</H3><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="nicolasestevan_3-1763206176248.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/341502i82216AA724092E03/image-size/large?v=v2&amp;px=999" role="button" title="nicolasestevan_3-1763206176248.png" alt="nicolasestevan_3-1763206176248.png" /></span></P><H3 id="toc-hId-714770404">Up to 7067 rows:</H3><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="nicolasestevan_8-1763206511155.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/341538iF2A25E0C0EBE0612/image-size/large?v=v2&amp;px=999" role="button" title="nicolasestevan_8-1763206511155.png" alt="nicolasestevan_8-1763206511155.png" /></span></P><H3 id="toc-hId-518256899">Live view</H3><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="RandomForest_20251115_092355.gif" style="width: 960px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/341551i3A2C874AFAF47388/image-size/large?v=v2&amp;px=999" role="button" title="RandomForest_20251115_092355.gif" alt="RandomForest_20251115_092355.gif" /></span></P><P>&nbsp;</P><HR /><H3 id="toc-hId-321743394">Training Model #2 - LightGBM</H3><P>LightGBM is one of the most powerful models for tabular data. Unlike Random Forest (many independent trees), LightGBM builds trees <STRONG>sequentially</STRONG>, each correcting the errors of the previous one.
It supports categorical features natively, which simplifies preprocessing.</SPAN></P><pre class="lia-code-sample language-python"><code>def train_lightgbm(X, y):
    X = X.copy()
    cat_cols = ["make", "model", "body", "transmission", "color", "interior"]
    for col in cat_cols:
        X[col] = X[col].astype(str).fillna("Unknown").astype("category")

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=default_test_size, random_state=42
    )
    model = lgb.LGBMRegressor(
        n_estimators=500,
        learning_rate=0.05,
        num_leaves=31,
        subsample=0.8,
        colsample_bytree=0.8,
        random_state=42
    )
    try:
        model.fit(X_train, y_train, categorical_feature=cat_cols)
        preds = model.predict(X_test)
        r2 = r2_score(y_test, preds)
    except Exception:
        preds, r2 = np.zeros_like(y_test), 0
    return [preds, r2, y_test]</code></pre><H3 id="toc-hId-125229889">Up to 50 rows:</H3><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="nicolasestevan_2-1763205951324.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/341474i1AAB214E2D01C2B2/image-size/large?v=v2&amp;px=999" role="button" title="nicolasestevan_2-1763205951324.png" alt="nicolasestevan_2-1763205951324.png" /></span></P><H3 id="toc-hId--146514985">Up to 7067 rows:</H3><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="nicolasestevan_7-1763206474860.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/341537i0ACD453B96C87ADF/image-size/large?v=v2&amp;px=999" role="button" title="nicolasestevan_7-1763206474860.png" alt="nicolasestevan_7-1763206474860.png" /></span></P><H3 id="toc-hId--343028490">Live view</H3><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="LightGBM_20251115_092355.gif" style="width: 960px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/341552i30BC4DE94C4988F6/image-size/large?v=v2&amp;px=999" role="button" title="LightGBM_20251115_092355.gif" alt="LightGBM_20251115_092355.gif" /></span></P><HR /><H3 id="toc-hId--539541995">Training Model #3 - Linear Regression</H3><P>Neither fancy nor complex, Linear Regression provides a baseline that answers one question:&nbsp;<SPAN>“If the relationship between attributes and price is roughly linear, how well can a simple model perform?”</SPAN></P><pre class="lia-code-sample language-python"><code>def train_linear_model(X, y):
    X = X.copy()
    cat_cols = ["make", "model", "body", "transmission", "color", "interior"]
    for col in cat_cols:
        X[col] = LabelEncoder().fit_transform(X[col].astype(str).fillna("Unknown"))

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=default_test_size, random_state=42
    )
    model = LinearRegression()
    X_train = X_train.fillna(X_train.mean(numeric_only=True))
    X_test = X_test.fillna(X_test.mean(numeric_only=True))
    try:
        model.fit(X_train, y_train)
        preds = model.predict(X_test)
        r2 = r2_score(y_test, preds)
    except Exception:
        preds, r2 = np.zeros_like(y_test), 0
    return [preds, r2, y_test]</code></pre><H3 id="toc-hId--736055500"><STRONG>Up to 50 rows:</STRONG></H3><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="nicolasestevan_1-1763205857765.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/341472i81AFB2D0BE770F90/image-size/large?v=v2&amp;px=999" role="button" title="nicolasestevan_1-1763205857765.png" alt="nicolasestevan_1-1763205857765.png" /></span></P><H3 id="toc-hId--932569005"><STRONG>Up to 7067 rows:</STRONG></H3>
class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="nicolasestevan_6-1763206428099.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/341536iC708165AEAE11D46/image-size/large?v=v2&amp;px=999" role="button" title="nicolasestevan_6-1763206428099.png" alt="nicolasestevan_6-1763206428099.png" /></span></P><H3 id="toc-hId--1129082510">Live view</H3><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="LinearModel_20251115_092355.gif" style="width: 960px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/341553i0849B4C842A417EE/image-size/large?v=v2&amp;px=999" role="button" title="LinearModel_20251115_092355.gif" alt="LinearModel_20251115_092355.gif" /></span></P><H2 id="toc-hId--1032193008"><span class="lia-unicode-emoji" title=":chequered_flag:">🏁</span>&nbsp;<SPAN>SAP RPT-1 OSS: Context Model</SPAN></H2><P>This is where things get interesting. SAP RPT-1 does <STRONG>not</STRONG> rely on learning patterns from the dataset. Instead, it uses a pretrained transformer architecture to infer relationships directly through <STRONG>context embeddings</STRONG>. Lean and simple, "for non-Data Science PhD":</P><pre class="lia-code-sample language-python"><code>def train_sap_rpt1(X, y): X_train, X_test, y_train, y_test = train_test_split( X, y, test_size=default_test_size, random_state=42 ) model = SAP_RPT_OSS_Regressor(max_context_size=8192, bagging=8) model.fit(X_train, y_train) preds = model.predict(X_test) r2 = r2_score(y_test, preds) return [preds, r2, y_test]</code></pre><H3 id="toc-hId--1522109520"><STRONG>Up to 50 rows:</STRONG></H3><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="nicolasestevan_0-1763205729558.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/341471i4AC7007DCA5A0F76/image-size/large?v=v2&amp;px=999" role="button" title="nicolasestevan_0-1763205729558.png" alt="nicolasestevan_0-1763205729558.png" /></span></P><H3 id="toc-hId--1718623025"><STRONG>Up to 2055 rows:</STRONG></H3><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="nicolasestevan_4-1763206228416.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/341505i9ADE9D2D2B38C363/image-size/large?v=v2&amp;px=999" role="button" title="nicolasestevan_4-1763206228416.png" alt="nicolasestevan_4-1763206228416.png" /></span></P><H3 id="toc-hId--1915136530">Live view</H3><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="SAP_RPT1_20251115_092355.gif" style="width: 960px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/341566i0BE0E0D666836951/image-size/large?v=v2&amp;px=999" role="button" title="SAP_RPT1_20251115_092355.gif" alt="SAP_RPT1_20251115_092355.gif" /></span></P><P>&nbsp;</P><HR /><H2 id="toc-hId--1650063337"><STRONG><span class="lia-unicode-emoji" title=":magnifying_glass_tilted_right:">🔎</span>&nbsp;Running Experiments at Multiple Sample Sizes</STRONG></H2><P>This section breaks down how the iterative experiment loop works, why the SAP RPT-1 OSS model has a max-context limit, and how performance changes as we scale up the dataset. 
By running the same models across several sample sizes, we can see where traditional ML shines, where RPT-1 stays competitive, and how both behave as the data grows.</P><pre class="lia-code-sample language-python"><code>sample_sizes = np.linspace(50, len(X), 200, dtype=int)
results, max_r2_rpt1, max_sample_rpt1 = [], 0, 0

for n in sample_sizes:
    idx = np.random.choice(len(X), n, replace=False)
    X_sample, y_sample = X.iloc[idx], y.iloc[idx]

    # SAP RPT-1 OSS (limited sample size)
    if n &lt;= rpt1_limit:
        rpt_res = train_sap_rpt1(X_sample, y_sample)
        fn = plot_predictions(rpt_res[2], rpt_res[0], rpt_res[1], "SAP_RPT1", n)
        video_frames["SAP_RPT1"].append(fn)
        r2_rpt1 = rpt_res[1]
        max_r2_rpt1 = max(max_r2_rpt1, r2_rpt1)
    else:
        # Past the context limit: carry forward the best score seen so far
        r2_rpt1 = max_r2_rpt1
        if max_sample_rpt1 == 0:
            max_sample_rpt1 = n

    # Train and plot the traditional models on the same slice
    rf_res = train_random_forest(X_sample, y_sample)
    fn = plot_predictions(rf_res[2], rf_res[0], rf_res[1], "RandomForest", n)
    video_frames["RandomForest"].append(fn)

    lgb_res = train_lightgbm(X_sample, y_sample)
    fn = plot_predictions(lgb_res[2], lgb_res[0], lgb_res[1], "LightGBM", n)
    video_frames["LightGBM"].append(fn)

    lin_res = train_linear_model(X_sample, y_sample)
    fn = plot_predictions(lin_res[2], lin_res[0], lin_res[1], "LinearModel", n)
    video_frames["LinearModel"].append(fn)

    results.append((n, rf_res[1], r2_rpt1, lgb_res[1], lin_res[1]))

    # Early stop once a traditional model reaches SAP RPT-1's best score
    if rf_res[1] &gt;= max_r2_rpt1 or lgb_res[1] &gt;= max_r2_rpt1 or lin_res[1] &gt;= max_r2_rpt1:
        break

    gc.collect()</code></pre><P>This loop compares SAP RPT-1 OSS with traditional ML models as sample sizes increase. Each iteration randomly selects a subset of the data and trains all models on the same slice for a fair comparison. SAP RPT-1 can only run up to its max-context limit, so once the sample size exceeds that threshold, it stops retraining and simply carries forward its best R². The traditional models continue training at every step. The loop ends early when any traditional model matches or surpasses RPT-1’s best score, keeping the experiment efficient while showing how performance evolves as data grows.</P>
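<P>For completeness: the loop above calls a <STRONG>plot_predictions</STRONG> helper and collects file names in <STRONG>video_frames</STRONG>, neither of which is shown in this excerpt. A minimal sketch of what they might look like, assuming matplotlib and imageio; the frame layout, file naming, and output directory are illustrative assumptions, not the exact code behind the screenshots:</P><pre class="lia-code-sample language-python"><code>import os

import imageio
import matplotlib
matplotlib.use("Agg")  # render off-screen; frames are saved, never shown
import matplotlib.pyplot as plt

FRAME_DIR = "frames"
os.makedirs(FRAME_DIR, exist_ok=True)

def plot_predictions(y_true, y_pred, r2, model_name, n_rows):
    """Save a predicted-vs-actual scatter for one run; return the file path."""
    fig, ax = plt.subplots(figsize=(6, 6))
    ax.scatter(y_true, y_pred, s=8, alpha=0.5)
    lims = [min(y_true.min(), y_pred.min()), max(y_true.max(), y_pred.max())]
    ax.plot(lims, lims, linestyle="--")  # perfect-prediction diagonal
    ax.set_xlabel("Actual price")
    ax.set_ylabel("Predicted price")
    ax.set_title(f"{model_name} | n={n_rows} | R²={r2:.3f}")
    fname = os.path.join(FRAME_DIR, f"{model_name}_{n_rows}.png")
    fig.savefig(fname, dpi=100)
    plt.close(fig)  # free memory inside the experiment loop
    return fname

def frames_to_gif(frame_files, out_path, fps=2):
    """Stitch one model's saved frames into a 'live view' style GIF."""
    imageio.mimsave(out_path, [imageio.imread(f) for f in frame_files], fps=fps)

# e.g. after the loop: frames_to_gif(video_frames["LightGBM"], "LightGBM.gif")</code></pre>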
<HR /><H2 id="toc-hId--1846576842"><STRONG><span class="lia-unicode-emoji" title=":end_arrow:">🔚</span>&nbsp;Conclusion and Final Thoughts</STRONG></H2><P>SAP RPT-1 OSS stands out because it performs well with small datasets, requires minimal code, and can generate useful predictions with just an API call and a bit of context. This makes it ideal for jump-starting predictive use cases early on, delivering fast business value without a full ML pipeline. Traditional models, however, still shine when projects mature, data grows, and fine-tuned control becomes important. It’s not about choosing one over the other, but about understanding where each approach brings the most value.</P><TABLE border="1" width="100%"><TBODY><TR><TD><STRONG>Aspect</STRONG></TD><TD><STRONG>SAP RPT-1 OSS</STRONG></TD><TD><STRONG>Traditional ML (RF, LGBM, Linear)</STRONG></TD></TR><TR><TD>Data Requirements</TD><TD>Low (performs well with small samples)</TD><TD>Medium/High (performance scales with data)</TD></TR><TR><TD>Setup Effort</TD><TD>Minimal (API call + context)</TD><TD>Higher (preprocessing, encoding, tuning)</TD></TR><TR><TD>Training Process</TD><TD>None (pretrained context model)</TD><TD>Full training pipeline required</TD></TR><TR><TD>Speed to Insights</TD><TD>Very fast</TD><TD>Moderate to slow</TD></TR><TR><TD>Best Use Case</TD><TD>Early-stage predictive cases, quick baselines</TD><TD>Mature pipelines, high control and customization</TD></TR><TR><TD>Flexibility</TD><TD>Limited tuning / plug-and-play</TD><TD>Highly customizable</TD></TR><TR><TD>Business Value</TD><TD>Immediate, fast, accessible</TD><TD>Strong when optimized and scaled</TD></TR></TBODY></TABLE><P>This experiment highlights a simple truth: <STRONG>SAP RPT-1 isn’t here to replace traditional ML; it’s here to jump-start it.&nbsp;</STRONG>With a pretrained, context-driven approach, RPT-1 delivers fast, reliable insights with very little data and almost no setup. Traditional models still excel in mature, data-rich scenarios, but RPT-1 shines as a rapid accelerator and early-value generator inside SAP landscapes.</P><HR /><H3 id="toc-hId-1958473942"><STRONG><span class="lia-unicode-emoji" title=":speech_balloon:">💬</span>Open for Exchange</STRONG></H3><P>If you're testing RPT-1, exploring predictive cases, or want the full code, feel free to reach out.<BR /><STRONG>Happy to connect, compare experiences, and push this topic forward together.</STRONG></P> 2025-11-20T07:50:27.670000+01:00 https://community.sap.com/t5/technology-blog-posts-by-members/sap-btp-data-fabric/ba-p/14288089 SAP BTP: Data Fabric 2025-12-12T12:50:49.920000+01:00 SekhuteTK https://community.sap.com/t5/user/viewprofilepage/user-id/15314 <P>Irrespective of where your source system resides, SAP provides the capabilities to ingest, harmonize, and unlock real-time and predictive insights on your target system with confidence. This is supported by resilient, enterprise-grade frameworks, tools, and methodologies that safeguard data quality, integrity, and consistency throughout the entire data lifecycle.</P>
<P class="lia-align-center" style="text-align: center;"><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="SAP. (2023). Figure:1 Source Systems for Data Provisioning in SAP" style="width: 590px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/350863i677834D7174C2EA3/image-size/large?v=v2&amp;px=999" role="button" title="Data source nice.jpg" alt="Data source nice.jpg" /></span><SPAN>&nbsp;</SPAN><EM>SAP. (2023). Figure:1</EM></P><P class="lia-align-center" style="text-align: center;"><EM>Source Systems for Data Provisioning in SAP</EM></P><P class="lia-align-center" style="text-align: center;"><EM>URL: <A href="https://learning.sap.com/courses/implementing-data-provisioning-in-sap-bw-4hana/identifying-sap-bw-4hana-source-systems-and-their-use-cases" target="_blank" rel="noopener noreferrer">https://learning.sap.com/courses/implementing-data-provisioning-in-sap-bw-4hana/identifying-sap-bw-4hana-source-systems-and-their-use-cases</A></EM></P><P>&nbsp;</P><P>The ability to ingest raw data from enterprise warehouses, data lakes, relational &amp; in-memory databases, SaaS applications, and even social media into SAP HANA (on-premises and/or cloud) opens a world of opportunity for data engineers to acquire rich, accurate, and complete master and transactional data from virtually any source imaginable. Responsibility for managing the data acquisition processes depends on the provisioning type being used. Common approaches include:</P><TABLE><TBODY><TR><TD><P><STRONG>Provisioning Type</STRONG></P></TD><TD><P><STRONG>Data Flow Responsibility</STRONG></P></TD><TD><P><STRONG>Metadata Management</STRONG></P></TD><TD><P><STRONG>Typical Use Cases</STRONG></P></TD></TR><TR><TD><P><STRONG>Application-Controlled</STRONG></P></TD><TD><P>Source <STRONG>application pushes data</STRONG> to target using APIs or connectors.</P></TD><TD><P>Managed in <STRONG>application layer</STRONG> (schema, mappings, transformations).</P></TD><TD><P>Custom logic, application-driven workflows, SaaS integrations.</P></TD></TR><TR><TD><P><STRONG>Change Data Capture (CDC)</STRONG></P></TD><TD><P><STRONG>CDC engine streams changes</STRONG> (insert/update/delete) from source to target in near real-time.</P></TD><TD><P>Managed by <STRONG>CDC tool</STRONG> (change logs, timestamps, transaction markers).</P></TD><TD><P>Real-time replication, incremental updates, event-driven architectures.</P></TD></TR><TR><TD><P><STRONG>Database-Controlled</STRONG></P></TD><TD><P><STRONG>Target database pulls data</STRONG> from source using adapters or SDI.</P></TD><TD><P>Centralized in <STRONG>database layer</STRONG> (schemas, lineage, governance).</P></TD><TD><P>Governance-heavy environments, centralized data lakes, enterprise ETL.</P></TD></TR></TBODY></TABLE><P class="lia-align-center" style="text-align: center;"><EM>Figure:2</EM></P><P class="lia-align-center" style="text-align: center;"><EM>Provisioning Types: Data Flow &amp; Metadata Management</EM></P><P class="lia-align-center" style="text-align: center;">&nbsp;</P>
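<P>To make the middle row of this table concrete, here is a deliberately simplified sketch of timestamp-based delta extraction, a common lightweight stand-in for a full CDC engine. It assumes the <STRONG>hdbcli</STRONG> Python client; the host, schema, table, and column names are purely illustrative:</P><pre class="lia-code-sample language-python"><code># Simplified delta pull: a timestamp watermark stands in for a real CDC
# engine's change log. All connection details and names are illustrative.
from hdbcli import dbapi  # SAP HANA Python client

conn = dbapi.connect(address="hana.example.com", port=443,
                     user="INGEST_USER", password="***")
cursor = conn.cursor()

def pull_delta(last_watermark):
    """Fetch only rows changed since the previous run and upsert them."""
    cursor.execute(
        'SELECT ID, PAYLOAD, CHANGED_AT FROM "SRC_SCHEMA"."ORDERS" '
        "WHERE CHANGED_AT &gt; ? ORDER BY CHANGED_AT",
        (last_watermark,),
    )
    rows = cursor.fetchall()
    for row in rows:
        cursor.execute(
            'UPSERT "TGT_SCHEMA"."ORDERS" (ID, PAYLOAD, CHANGED_AT) '
            "VALUES (?, ?, ?) WITH PRIMARY KEY",
            tuple(row),
        )
    conn.commit()
    # The newest change seen becomes the next run's watermark
    return rows[-1][2] if rows else last_watermark</code></pre><P>A real CDC tool would read the database change log instead of a timestamp column, which also captures deletes; the watermark pattern above is the minimal version of the same idea.</P>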
<P>The utilization of high-performance ETL/ELT tools ensures that the consistency, quality, and reliability of the data are upheld; such tools are therefore often used to unify the data to best meet the objectives of the stakeholder’s request, whether from executives, decision makers, marketing, or product. A well-thought-out data provisioning strategy should always take precedence, as this ensures a smooth-flowing data process. Depending on the usage of the data, a data engineer may explore the different data provisioning approaches below (a short federation sketch follows the list):</P><UL><LI><STRONG>Data federation – </STRONG>Data is neither loaded nor persisted into the target system (SAP HANA) but rather read directly from the source system only when the data is needed for use.</LI><LI><STRONG>Data replication –</STRONG> Data from the source system is loaded, persisted into the target system, and kept in sync using various mechanisms<UL><LI>Real-time (synchronous) replication</LI><LI>Delta load (asynchronous) replication</LI></UL></LI><LI><STRONG>Data transformation – </STRONG>The raw data is enhanced with additional features, e.g. calculated columns, renamed fields, or new fields derived from other existing fields</LI></UL>
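<P>As promised, a short federation sketch: with Smart Data Access (introduced in the table below), the data stays in the source system and SAP HANA exposes it through a virtual table, delegating queries at read time. The adapter, DSN, and object names are examples only, and the exact <EM>CREATE REMOTE SOURCE</EM> syntax varies by source type, so treat this as a shape rather than a recipe (again assuming hdbcli):</P><pre class="lia-code-sample language-python"><code># Federation sketch: no data is copied into HANA; queries against the
# virtual table are pushed down to the remote system at read time.
# Adapter, DSN, and object names are illustrative - check the SDA docs
# for the exact syntax for your source.
from hdbcli import dbapi

conn = dbapi.connect(address="hana.example.com", port=443,
                     user="ADMIN_USER", password="***")
cursor = conn.cursor()

# One-time registration of the remote system
cursor.execute(
    'CREATE REMOTE SOURCE "EXT_SRC" ADAPTER "odbc" '
    "CONFIGURATION 'DSN=external_src' "
    "WITH CREDENTIAL TYPE 'PASSWORD' USING 'user=reader;password=***'"
)

# Virtual table pointing at a remote table; nothing is persisted locally
cursor.execute(
    'CREATE VIRTUAL TABLE "TGT_SCHEMA"."V_CUSTOMERS" '
    'AT "EXT_SRC"."&lt;NULL&gt;"."public"."customers"'
)

# Consumers query it like any local table
cursor.execute('SELECT COUNT(*) FROM "TGT_SCHEMA"."V_CUSTOMERS"')
print(cursor.fetchone()[0])</code></pre>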
<P>SAP provides, among others, the below high-performance native data integration technologies:</P><TABLE width="609"><TBODY><TR><TD><P><STRONG>Tool</STRONG></P></TD><TD><P><STRONG>Primary Function</STRONG></P></TD><TD><P><STRONG>Position in Landscape Architecture</STRONG></P></TD></TR><TR><TD><P><STRONG>SLT (SAP Landscape Transformation Replication)</STRONG></P></TD><TD><P>Real-time replication (CDC)</P></TD><TD><P>Integration layer between SAP sources and HANA DB</P></TD></TR><TR><TD><P><STRONG>SDA (Smart Data Access)</STRONG></P></TD><TD><P>Virtual access to remote sources (federation)</P></TD><TD><P>Integration layer enabling federated queries, with processing pushed down to the remote source</P></TD></TR><TR><TD><P><STRONG>SDI (Smart Data Integration)</STRONG></P></TD><TD><P>ETL/ELT pipelines, adapters, dataflows</P></TD><TD><P>Integration layer moving and transforming data before loading into the HANA DB</P></TD></TR></TBODY></TABLE><P class="lia-align-center" style="text-align: center;"><EM>Figure 3:</EM></P><P class="lia-align-center" style="text-align: center;"><EM>SAP Native ETL/ELT tools</EM></P><P>Furthermore, for data quality, cleansing, and enrichment before persisting into the HANA DB, SAP provides the below:</P><UL><LI><STRONG><EM>Smart Data Quality (SDQ)</EM></STRONG> for SAP HANA on-premises</LI><LI><STRONG><EM>Data Quality Management Microservices (DQMm)</EM></STRONG> as a subscription-based microservice for SAP HANA Cloud</LI></UL><P>&nbsp;</P><P>SAP HANA provides a variety of consolidated tools, both on-premises and in the cloud, offering various functionality for different use cases such as:</P><UL><LI>Supporting data provisioning</LI><LI>Developing database artefacts</LI><LI>Monitoring the performance of the developed artefacts</LI><LI>Monitoring the database and its system performance</LI><LI>Administering and managing the database</LI></UL><P>One of the key strengths of the SAP ecosystem is the ability to seamlessly combine cloud functionality with on-premises capabilities and still import specific applications as microservices into common IDEs such as Microsoft Visual Studio Code. This is commendable because it:</P><UL><LI>Provides business with a comprehensive 360-degree view of<UL><LI>Database health overview.</LI><LI>Data life cycle from end to end.</LI><LI>Application life cycle from end to end.</LI></UL></LI><LI>Empowers end users, data engineers, developers, data scientists, data analysts, and support engineers with greater power and flexibility over the data being used.</LI></UL><P>Below are native tools used by SAP for various purposes:</P><UL><LI><STRONG>SAP Business Application Studio</STRONG> – Cloud development tool.</LI><LI><STRONG>Web IDE</STRONG> – On-premises development tool.</LI><LI><STRONG>SAP HANA cockpit</STRONG> – A web-based tool for monitoring, administering, and managing SAP HANA databases, offering a centralized, browser-accessible interface with tile-based dashboards for performance, resource usage (CPU, memory, disk), alerts, and security. It allows tasks like starting/stopping services, managing users, viewing alerts, analyzing performance, and running SQL queries through its integrated Database Explorer.</LI></UL><P>SAP SaaS applications are imported into common IDEs such as Microsoft Visual Studio Code via extensions, libraries, etc., each for a specific purpose, e.g.:</P><UL><LI><STRONG>Supportability tools for SAP HANA</STRONG> – For both online and offline performance analysis and troubleshooting. With a focus on the needs of support engineers, the performance analysis tools provide enhanced analysis options for trace files and runtime dump files, as well as allowing in-depth analysis on extensive datasets.</LI><LI><STRONG>The SQL analyzer tool for SAP HANA</STRONG> – A performance analysis tool that helps developers visualize and understand SQL execution plans. Designed for query tuning and optimization, it offers detailed insights into how queries run within the SAP HANA database.</LI><LI><STRONG>SAP HANA Database Explorer</STRONG> – Provides functionality for accessing SAP HANA databases, browsing the database catalog, and executing SQL from a&nbsp;SQL console.</LI></UL><P class="lia-align-center" style="text-align: center;"><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="SAP Dev and Adm tool.jpg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/350856iD98C53CD26472527/image-size/large?v=v2&amp;px=999" role="button" title="SAP Dev and Adm tool.jpg" alt="SAP Dev and Adm tool.jpg" /></span><EM>Figure 4:</EM></P><P class="lia-align-center" style="text-align: center;"><EM>SAP Development and Administration Tools Overview</EM></P><P class="lia-align-left" style="text-align : left;">&nbsp;</P><P>The consolidated toolset further enhances productivity and convenience for end users, data engineers, developers, data scientists, data analysts, and support engineers by enabling them to access all essential capabilities from a single unified platform. The choice and flexibility to work with the desired programming languages, ranging from but not limited to SQL, SQLScript, Python, Node.js, JavaScript, and React Native, enable a cross-functional team with a diverse skill set to collaborate effectively while striving towards the same common goal.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Language.jpg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/350858i3C91624D1DC9DDE2/image-size/large?v=v2&amp;px=999" role="button" title="Language.jpg" alt="Language.jpg" /></span></P><P class="lia-align-center" style="text-align: center;">&nbsp;<EM>Figure 5:</EM></P><P class="lia-align-center" style="text-align: center;"><EM>Core Programming Languages and Frameworks for SAP HANA Cloud and On-Premises</EM></P><P class="lia-align-center" style="text-align: center;">&nbsp;</P><P>After successfully completing the Explore phase, with a solid fit-to-standard analysis of the proposed use cases, extrinsic data pipelines are then implemented for reliable and consistent data flow. These pipelines support a wide range of business needs such as decision making, go-to-market activities, or ML/AI use cases requested by various stakeholders. The process is typically executed and maintained by data engineers, developers, data scientists, data analysts, and/or support engineers.</P><P>The harmonized data is then governed, analyzed, and visualized through multiple front-end applications. These applications are powered by various integration frameworks, e.g. REST APIs, OData v4, and JDBC/ODBC, each serving different user needs and functional objectives (a small consumption sketch follows Figure 6):</P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="visual.jpg" style="width: 743px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/350859i916A48C2451B6519/image-size/large?v=v2&amp;px=999" role="button" title="visual.jpg" alt="visual.jpg" /></span></P><P class="lia-align-center" style="text-align: center;">&nbsp;<EM>Figure: 6</EM></P><P class="lia-align-center" style="text-align: center;"><EM>SAP-Compatible Applications and Their Integration Frameworks</EM></P><P class="lia-align-center" style="text-align: center;">&nbsp;</P>
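<P>As a small consumption example for the first of these frameworks, a front-end or script could read harmonized data through a hypothetical OData v4 service like this (the service URL, entity set, fields, and credentials are illustrative assumptions, using the <STRONG>requests</STRONG> library):</P><pre class="lia-code-sample language-python"><code># Reading harmonized data over a hypothetical OData v4 service; the URL,
# entity set, and credentials below are illustrative assumptions.
import requests

BASE = "https://my-app.example.com/odata/v4/sales"

resp = requests.get(
    f"{BASE}/Orders",
    params={"$top": "10", "$select": "OrderID,Amount",
            "$filter": "Region eq 'EMEA'"},
    headers={"Accept": "application/json"},
    auth=("api_user", "***"),
    timeout=30,
)
resp.raise_for_status()

# OData v4 wraps the result set in a "value" array
for order in resp.json()["value"]:
    print(order["OrderID"], order["Amount"])</code></pre>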
<P class="lia-align-center" style="text-align: center;">Mutatis mutandis, data is the inception of all there is, as it has been in the beginning, is now, and ever shall be. Thank you all for taking a glimpse into the above content. Please don’t forget to like, comment, and share.</P><P class="lia-align-left" style="text-align : left;">&nbsp;</P><P><STRONG>Abbreviations:</STRONG></P><P><STRONG>API</STRONG>: Application Programming Interface</P><P><STRONG>BTP</STRONG>: Business Technology Platform</P><P><STRONG>CDC</STRONG>: Change Data Capture</P><P><STRONG>DB</STRONG>: Database</P><P><STRONG>DQMm</STRONG>: Data Quality Management Microservices</P><P><STRONG>ELT</STRONG>: Extract, Load, Transform</P><P><STRONG>ETL</STRONG>: Extract, Transform, Load</P><P><STRONG>IDE</STRONG>: Integrated Development Environment</P><P><STRONG>ML/AI</STRONG>: Machine Learning/Artificial Intelligence</P><P><STRONG>ODBC</STRONG>: Open Database Connectivity</P><P><STRONG>OData</STRONG>: Open Data Protocol</P><P><STRONG>REST</STRONG>: Representational State Transfer</P><P><STRONG>SaaS</STRONG>: Software as a Service</P><P><STRONG>SDI</STRONG>: Smart Data Integration</P><P><STRONG>SDA</STRONG>: Smart Data Access</P><P><STRONG>SDQ</STRONG>: Smart Data Quality</P><P><STRONG>SLT</STRONG>: SAP Landscape Transformation Replication</P><P><STRONG>SQL</STRONG>: Structured Query Language</P><P>&nbsp;</P><P><STRONG>Reference:</STRONG></P><OL><LI><EM>SAP. (2023). Figure:1 Source Systems for Data Provisioning in SAP, URL: <A href="https://learning.sap.com/courses/implementing-data-provisioning-in-sap-bw-4hana/identifying-sap-bw-4hana-source-systems-and-their-use-cases" target="_blank" rel="noopener noreferrer">https://learning.sap.com/courses/implementing-data-provisioning-in-sap-bw-4hana/identifying-sap-bw-4hana-source-systems-and-their-use-cases</A></EM></LI></OL> 2025-12-12T12:50:49.920000+01:00