https://raw.githubusercontent.com/ajmaradiaga/feeds/main/scmt/topics/SAP-Analytics-Hub-blog-posts.xml SAP Community - SAP Analytics Hub 2026-02-21T12:11:39.562744+00:00 python-feedgen SAP Analytics Hub blog posts in SAP Community https://community.sap.com/t5/technology-blog-posts-by-members/data-catalog-with-sap-quo-vadis/ba-p/13545875 Data Catalog with SAP - Quo Vadis? 2022-09-17T12:56:55+02:00 pbaumann https://community.sap.com/t5/user/viewprofilepage/user-id/942 <SPAN style="color: #ff0000">Update 03.11.2022 - current roadmap items for data cataloging</SPAN><BR /> <BR /> As mentioned in my <A href="https://blogs.sap.com/2022/08/19/data-architecture-with-sap-data-fabric/" target="_blank" rel="noopener noreferrer">Data Fabric</A> blog, the data and analytics landscape in companies is becoming more and more distributed, and decentralized concepts such as <A href="https://blogs.sap.com/2022/06/04/more-than-just-a-hype-data-mesh-as-a-new-approach-to-increase-agility-in-value-creation-from-data/" target="_blank" rel="noopener noreferrer">Data Mesh</A> are on the rise. If we see data as an asset - as the new oil - or want to make our company data-driven, tools that provide an overview become more and more important.<BR /> <BR /> Data Catalogs could be the kind of software delivering what is needed here.<BR /> <BR /> A good definition of a Data Catalog, from my perspective, comes from Gartner:<BR /> <BLOCKQUOTE><I>"A data catalog maintains an inventory of data assets through the discovery, description and organization of datasets. The catalog provides context to enable data analysts, data scientists, data stewards and other data consumers to find and understand a relevant dataset for the purpose of extracting business value."</I></BLOCKQUOTE><BR /> <BLOCKQUOTE>- Gartner, 2017</BLOCKQUOTE><BR /> Now, if you screen the market, you will find a lot of - often marketing-driven - terms, concepts and meanings here. 
Some speak about Data Discovery as a main task to gain value from (distributed) data. Lately, I have seen a study challenging the term Data Intelligence, which some vendors see as an advancement of Data Catalogs.<BR /> <BR /> In practice, introducing a data catalog in a company can require a high effort, depending on the sources, the complexity of data assets, people skilling up and moving into new roles (Data Owner, Data Stewardship, …), processes for data validation and data protection, and so on.<BR /> <BR /> A lot of companies start just with a spreadsheet or tools like MS OneNote to provide information about their data and analytics assets in the company. And this could be sufficient, depending on your complexity and organization.<BR /> <BR /> On the other hand, we have solution-specific metadata handling and search capabilities as well as specialized solutions. If we look into the SAP-specific portfolio, we find different approaches from a (non-technical) user perspective:<BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2022/09/DC1-1.jpg" /></P><BR /> <EM><STRONG>Fig. 1: Overview of end user metadata usage in SAP's Data &amp; Analytics solutions</STRONG></EM><BR /> <BR /> For sure, there are further solutions that work heavily with metadata, like SAP Information Steward or SAP PowerDesigner. They are relevant in this context but are more bound to a specific group of users, like Data Stewards or Data Modelers.<BR /> <BR /> Maybe you know that SAP has communicated new developments integrated into SAP's cloud-based Unified Data &amp; Analytics Solution Portfolio. Currently, there is some discussion about project <A href="https://twitter.com/gangadharansind/status/1567887673351880704" target="_blank" rel="nofollow noopener noreferrer">Data Suite</A>, which would redesign the full portfolio and break it down into a more flexible, service-based offering. 
One effect we could see is the usage of Data Flow capabilities from SAP Data Intelligence within SAP Data Warehouse Cloud. But not much further information is publicly available here. If we have a look at<BR /> <UL><BR /> <LI>SAP Analytics Cloud</LI><BR /> <LI>SAP Data Warehouse Cloud</LI><BR /> <LI>SAP HANA Cloud</LI><BR /> <LI>SAP Data Intelligence Cloud</LI><BR /> </UL><BR /> we see that all these solutions have their own approaches, terms and concepts to handle data assets today. Therefore, bringing that together for unified management with a streamlined Data Catalog approach could be interesting. The last information I found is:<BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2022/09/DC2.jpg" /></P><BR /> &nbsp;<EM><STRONG>Fig. 2: ONE Catalog Vision of SAP</STRONG></EM><BR /> <BR /> From time to time, I have a look into the <A href="https://roadmaps.sap.com/board?CB=000D3AAC9DD21EECBB916A79BF223BBB&amp;range=2023Q1-2023Q4#Q1%202023" target="_blank" rel="noopener noreferrer">roadmap</A> for these solutions. You can find that for the current Data Catalog offering within SAP Data Intelligence (DI) there is not much on the roadmap besides the <A href="https://roadmaps.sap.com/board?range=CURRENT-LAST&amp;q=Data%20Intelligence&amp;PRODUCT=73554900100800002671#Q3%202022;INNO=000D3ABE772D1EECBCACBC91CFD89FF3" target="_blank" rel="noopener noreferrer">tighter integration</A> of capabilities with DWC, SAC and S/4HANA. But my understanding is that DI Data Catalog capabilities will be transferred to the new Data Catalog solution over time. The roadmap announces the Business Data Catalog based on SAP Data Warehouse Cloud:<BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2022/09/DC3.jpg" /></P><BR /> <EM><STRONG>Fig. 
3: SAP road map for Data Cataloging capabilities in SAP Data Warehouse Cloud</STRONG></EM><BR /> <BR /> This means that for 2023 we can look forward to new features and functionalities, and to the chance of making better use of our data assets in an overarching approach with a new Data Catalog solution. This approach seems to focus on SAP solutions, especially those in the cloud. But let's see what comes.<BR /> <BR /> <SPAN style="color: #ff0000">Update 03.11.2022 - new items and differentiation in the roadmap</SPAN><BR /> <BR /> <IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2022/09/DC4.jpg" /><BR /> <BR /> For Data Catalogs, many things have to be considered. Companies do not only have SAP systems, and there are a lot of dedicated Data Catalog products out there. SAP is not going into a blue ocean.<BR /> <BR /> What SAP has always done best is to integrate SAP with SAP. The question is: will this be enough? Are there approaches like leveraging add-ons and partnerships, as SAP already does with <A href="https://bigid.com/partner/sap/" target="_blank" rel="nofollow noopener noreferrer">BigID</A>, or is a Catalog of Catalogs concept via <A href="https://blogs.sap.com/2021/11/29/open-catalog-of-sap-data-intelligence/" target="_blank" rel="noopener noreferrer">open interfaces</A> a doable way?<BR /> <BR /> &nbsp;<BR /> <BR /> <EM>What is your perspective? 
What is your approach to keeping an overview of the data and analytics assets in your company to create value from data?</EM> 2022-09-17T12:56:55+02:00 https://community.sap.com/t5/technology-blog-posts-by-sap/discover-the-sap-cloud-alm-analytics-api/ba-p/13568744 Discover the SAP Cloud ALM Analytics API 2023-01-13T09:00:25+01:00 CyMac https://community.sap.com/t5/user/viewprofilepage/user-id/92235 <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/01/header_blog-1-scaled.jpg" height="114" width="848" /></P><BR /> To learn about and experiment with the <A href="https://help.sap.com/docs/CloudALM/fe419bfabbdc46dfbddbfd78b21483d5/d012b5cdb3744372ada8018c6a570358.html" target="_blank" rel="noopener noreferrer">SAP Cloud ALM Analytics API</A>, you can check out this <A href="https://github.com/SAP-samples/cloud-alm-api-examples/tree/main/postmancollections/analytics" target="_blank" rel="nofollow noopener noreferrer">Postman Collection</A>.<BR /> <BR /> You just need to retrieve your apiKey from the <A href="https://api.sap.com/api/CALM_ANALYTICS/overview" target="_blank" rel="noopener noreferrer">SAP API Business Hub</A> to try the OData and REST interfaces with the Postman client.<BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/01/postman_example.jpg" /></P><BR /> Another way to discover the interface, using just the mouse, is to use the <A href="https://github.com/SAP/alm-plug-in-for-grafana" target="_blank" rel="nofollow noopener noreferrer">SAP ALM Plug-in</A> for Grafana with a data source pointing to the SAP API Business Hub Sandbox.<BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/01/grafana_plugin_config.jpg" /></P><BR /> <P style="overflow: 
hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/01/grafana_dashboard-scaled.jpg" /></P><BR /> The SAP API Business Hub's sandbox for the SAP Cloud ALM Analytics API is the <A href="https://support.sap.com/en/alm/demo-systems/cloud-alm-demo-system.html" target="_blank" rel="noopener noreferrer">SAP Cloud ALM Public Demo Tenant</A>.<BR /> <BR /> &nbsp; 2023-01-13T09:00:25+01:00 https://community.sap.com/t5/technology-blog-posts-by-members/customer-receipts-automation-through-payment-service-provider-integration/ba-p/13556176 Customer receipts automation through payment service provider integration 2023-03-29T01:22:10+02:00 sivamothukuri84 https://community.sap.com/t5/user/viewprofilepage/user-id/129785 <P class="reader-text-block__paragraph">This document explains how to create an inbound interface between payment service providers and SAP. The solution focuses on the automation of customer payment transactions made via various payment methods on the client's e-commerce website.</P><BR /> <P class="reader-text-block__paragraph">This solution would be cost-effective and appropriate for customers who run their businesses in SAP ECC.</P><BR /> <BR /> <H3 class="reader-text-block__heading2" id="toc-hId-1091248475"><STRONG>Business case:</STRONG></H3><BR /> <H3 class="reader-text-block__heading2" id="toc-hId-894734970">Key Drivers for the automation of the process: -</H3><BR /> <UL><BR /> <LI>When processing the transaction into SAP, the client followed a manual process that was prone to errors.</LI><BR /> <LI>The Finance Business user logs into the payment provider website (PayPal, WorldPay, Amex &amp; Klarna) and downloads a file containing payment transactions from a particular date, converts the file into a WinShuttle format, and then loads those transactions into SAP. 
A file downloaded today would contain sales from yesterday, since the user tends to check for and process the files on Mondays and Fridays, but with additional sensitivity around month-ends.</LI><BR /> <LI>Before uploading the files into SAP, the user must follow certain steps to manipulate the files downloaded from the payment provider's portal.</LI><BR /> <LI>The uploadable file should include a few standard transaction codes of the payment providers.</LI><BR /> <LI>Manual processing and file manipulation would depend on the solution designed to process the uploaded file.</LI><BR /> <LI>Payment transactions will be posted by processing the file, calling SAP transaction code F-28, debiting the bank GL account and crediting the customer account. This process is not the same for all payment providers. The receipt amount is net of the fee paid to the payment service providers.</LI><BR /> <LI>Additional transactions such as refunds, chargebacks, chargeback fees, etc., will also be captured and posted into finance along with receipt postings.</LI><BR /> </UL><BR /> <H3 class="reader-text-block__heading2" id="toc-hId-698221465">Key Rationale:</H3><BR /> <UL><BR /> <LI>Reduce manual effort and remove the risk of human error: the incoming transaction records are processed, manipulated and transformed into a loadable format with minimal human effort.</LI><BR /> <LI>Provide future-proofing in the event that sales transaction volumes increase.</LI><BR /> <LI>Provide a daily stock reconciliation process similar to the one already in place for the various integrated 3PLs.</LI><BR /> <LI>Improve stock and payment reconciliation (removing write-offs where data cannot currently be reconciled) by aligning BRUSA with Business Intelligence processes, especially in capturing data at the order level (rather than at the summary level).</LI><BR /> </UL><BR /> <H3 class="reader-text-block__heading2" id="toc-hId-501707960"><STRONG>Components of Integration –</STRONG></H3><BR /> 
<LI>In order to automate and integrate the payment service providers with SAP, the development of SAG, PI, and SAP ECC is proposed.</LI><BR /> <LI>SAP PI can be integrated with the payment service provider's portal to retrieve transaction files from their SFTP folders. Based on the country in which the transactions originated, the PSP will make the files available at pre-defined intervals, and they will be transmitted to SAP AL11 folders for the creation of IDocs.</LI><BR /> <LI>PI (Process Integration) integrates with SAP to process transaction files automatically.</LI><BR /> <LI>Through context types and the mapping of source values to target values, the value mapping table can be used to derive certain important target parameters in the inbound interface. When mapping a transaction file to an IDoc, SAP PI will perform a lookup.</LI><BR /> <LI>Duplicate check functionality prevents the processing of transaction files more than once. A bespoke table was created to record all processed files, and before processing any file, PI checks the bespoke table to see if any other files with the same name exist. 
If PI identifies a duplicate file, the file will not produce an IDoc.</LI><BR /> </UL><BR /> <H3 class="reader-text-block__heading2" id="toc-hId-305194455"><STRONG>PI -</STRONG></H3><BR /> <UL><BR /> <LI>Sender File Adapter with Content Conversion to maintain node names and field names as per the payment transaction file.</LI><BR /> <LI>Create the structure of the file in the Enterprise Service Repository, with field names based on the File Content Conversion parameters defined in the Integration Builder.</LI><BR /> <LI>Create the mapping transformation (node- &amp; field-level mapping logic) from the payment transaction file to the IDoc structure in the Enterprise Service Repository.</LI><BR /> <LI>Create the Integrated Configuration with the IDoc Receiver Adapter.</LI><BR /> </UL><BR /> <H3 class="reader-text-block__heading2" id="toc-hId-108680950"><STRONG>Value Mapping –</STRONG></H3><BR /> <UL><BR /> <LI>In the payment provider integration, the value mapping functionality plays a pivotal role in converting a parameter from the source format to the target value format to populate the parameters in the FI IDocs.</LI><BR /> <LI>Context types are created to group the source information on the origin of the data.</LI><BR /> <LI>In the current model, the value mapping replication happens from the source system to the target PI (Process Integration) system.</LI><BR /> </UL><BR /> <H3 class="reader-text-block__heading2" id="toc-hId--87832555"><STRONG>File Formats -</STRONG></H3><BR /> <UL><BR /> <LI>The payment transaction file from each payment service provider will be distinct.</LI><BR /> <LI>A complex data mapping should be designed to read, transform and translate the source file format into a target application document (SAP IDoc &amp; SAP document).</LI><BR /> <LI>Each field and column of the payment transaction files provided by the payment providers should be read by the data mapping design.</LI><BR /> <LI>Payment files supplied by payment providers could be in any format, for example, CSV or TXT.</LI><BR /> 
<LI>PSP file formats are very distinct, so it is preferred to have separate PI mapping logic for each file format from the PSPs (e.g. PayPal, Worldpay, Amex, Klarna).</LI><BR /> <LI>Each field in the transaction file should be translated and populated into the fields of the IDoc segments.</LI><BR /> <LI>Because the fields are interpreted differently by each payment provider, the data mapping will be complex.</LI><BR /> <LI>A “unique transaction type or event type code” identifies each transaction in the file, and this code is different for each payment provider.</LI><BR /> <LI>PSP portals and SFTPs create different file formats, and payment providers are very reluctant to modify the formats.</LI><BR /> <LI>Despite the apparent simplicity of the overall solution design, mapping the fields and realizing the business need will be extremely challenging.</LI><BR /> </UL><BR /> <H3 class="reader-text-block__heading2" id="toc-hId--284346060"><STRONG>Business Impact: -</STRONG></H3><BR /> <UL><BR /> <LI>Through the integration of payment service providers, the customer receipts process will be automated, reducing the manual effort in clearing receipt items during month-end closing by matching the same reference field from the O2C sales order automation.</LI><BR /> <LI>The financial impact on recording the receipts, tax and fees is unique to each payment service provider.</LI><BR /> <LI>Users can reconcile the total receipts posted through automation with those in the file downloaded from the portal.</LI><BR /> <LI>When a PSP sends a file with new transactions, it may include a new event code that has never been mapped to a business rule in PI. 
To allow the business to identify these and post correction journals, we have designed the solution to post such transactions to the same bank GL account with a unique line-item text.</LI><BR /> </UL><BR /> <H3 class="reader-text-block__heading2" id="toc-hId--480859565"><STRONG>S/4HANA road map (SAP Digital Payments Add-on solution): -</STRONG></H3><BR /> <UL><BR /> <LI>The SAP digital payments add-on integration offers an out-of-the-box alternative to current custom payment service provider (PSP) integrations. This integration makes use of SAP's digital payments add-on with ready-to-use PSP connectivity.</LI><BR /> </UL><BR /> <H3 id="toc-hId--677373070">Summary: -</H3><BR /> <P class="reader-text-block__paragraph">By establishing an inbound interface between payment service providers and SAP, customer payment transactions can be automated in a simple manner, especially for customers who run their businesses in SAP ECC.</P><BR /> <BR /> <H3 id="toc-hId--949117944">Related topics –</H3><BR /> <UL><BR /> <LI>For information about the supported payment service providers, refer to the documentation at&nbsp;<A href="https://help.sap.com/docs/DIGITALPAYMENTS" target="_blank" rel="noopener noreferrer">https://help.sap.com/docs/DIGITALPAYMENTS</A></LI><BR /> <LI>For information on the Enterprise Management Layer for SAP S/4HANA, refer to the topic <A href="https://community.sap.com/topics/s4hana-cloud/enterprise-management-layer" target="_blank">https://community.sap.com/topics/s4hana-cloud/enterprise-management-layer</A></LI><BR /> <LI>Ask questions about SAP Analytics Hub and follow (<A href="https://answers.sap.com/tags/73555000100800000638" target="_blank" rel="noopener noreferrer">https://answers.sap.com/tags/73555000100800000638</A>)</LI><BR /> <LI>Read other SAP Analytics Hub blog posts and follow (<A href="https://blogs.sap.com/tags/73555000100800000638" target="_blank" rel="noopener noreferrer">https://blogs.sap.com/tags/73555000100800000638</A>)</LI><BR /> </UL><BR /> Please follow my profile for 
future posts - <SPAN class="mention-scrubbed">sivakumarmothukuri84</SPAN> 2023-03-29T01:22:10+02:00 https://community.sap.com/t5/technology-blog-posts-by-members/stop-working-start-flowing-with-sap-sac-workflow-management-input-task/ba-p/13555161 Stop Working, Start flowing with SAP SAC Workflow Management: Input Task 2023-06-08T00:03:57+02:00 monica_sugeth https://community.sap.com/t5/user/viewprofilepage/user-id/852059 I am Monica Elam Parithi, a Senior SAP Consultant working in SAC and ByD.<BR /> <BR /> In this article, I would like to provide comprehensive insights into the “Input Task” in SAC.<BR /> <BR /> We will generate an Input Task rather than share the version with each responsible member and risk losing track.<BR /> <BR /> In most cases, we share the version with responsible members and don’t know which task is performed by which member. This is where the Input Task plays the key role.<BR /> <BR /> <STRONG>What is an Input task?</STRONG><BR /> <UL><BR /> <LI>An Input Task is used to obtain feedback, value changes or other additional information from colleagues.</LI><BR /> <LI>It can be assigned to one or more colleagues and can be used to work iteratively on different assignments.</LI><BR /> <LI>Each Assignee works individually on the assigned task in a private version, so it will not disturb the actual values (the original Story remains unchanged) and will also not be visible to any other users.</LI><BR /> <LI>Once the task is completed and reviewed by the owner of the Input Task, it will be published from the private version to the public version.</LI><BR /> </UL><BR /> <STRONG>What are the Prerequisites to Perform an Input Task?</STRONG><BR /> <UL><BR /> <LI>You need to start by creating a Story using a Planning Model.</LI><BR /> <LI>The model must have at least one dimension by which the responsible users (Assigner/Assignee) can be identified.</LI><BR /> <LI>There must be an active Private Version to create a task.</LI><BR /> <LI>To perform 
this process, the Story must have the actual values for the data which has to be updated.</LI><BR /> </UL><BR /> <STRONG>Process Flow of an Input Task</STRONG><BR /> <P style="overflow: hidden;margin-bottom: 0px">Assign the responsible person in the dimension of the Data Model.<IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/06/Input-Task1.png" /></P><BR /> A brief example of the Input Task: let us consider that the owner of the model needs to assign the task to “NJACOB” (the Assignee). In this case, the owner is the Sales Head, who wants to update the sales team’s working hours and cost, and for that reason he is using the “Input Task.”<BR /> <BR /> <STRONG>Step-1: </STRONG><BR /> <UL><BR /> <LI>Open the Story where the Input Task is to be created.</LI><BR /> <LI>The Story must have a private version for task allocation; here I have used the private version named “Input Task.”</LI><BR /> <LI>Choose the “More” icon and select “Create Input Task.”</LI><BR /> </UL><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/06/Input-Task2.png" /></P><BR /> <STRONG>Step-2</STRONG><EM>:</EM> The summary page tab will be created on the left side of the story page.<BR /> <UL><BR /> <LI>Specify the due date.</LI><BR /> <LI>Enable “Cancel task on the due date automatically.” This is needed if you wish the task to be cancelled automatically on the due date.</LI><BR /> <LI>You can also add a reminder to inform the Assignee about the task.</LI><BR /> <LI>Enable the “Models” check box.</LI><BR /> <LI>In “Versions,” select the private version against which we are going to assign the task.</LI><BR /> <LI>Under “Distribution,” select the dimension against which the person responsible is assigned in the data model.</LI><BR /> </UL><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG 
class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/06/Input-Task3.png" /></P><BR /> <BR /> <UL><BR /> <LI>Enable the checkbox for the members the Assignee is going to work on. Here I have assigned the “Refining” and “Corporate” members.</LI><BR /> <LI>Before clicking on “Send,” make sure there are no other versions in edit mode.</LI><BR /> <LI>Click on “Send.” (The task will be sent by mail, and a notification appears in SAC.)</LI><BR /> </UL><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/06/Input-Task4.png" /></P><BR /> <STRONG>Step-3</STRONG><EM>:</EM> Action to be performed by the Assignee NJACOB:<BR /> <UL><BR /> <LI>The Assignee receives the task in the notifications and selects the task.</LI><BR /> </UL><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/06/Input-Task5.png" /></P><BR /> <BR /> <UL><BR /> <LI>The Assignee will be able to see the task for the active members. 
Initially, the status of the task will show “In Process.”</LI><BR /> <LI>Choose one of the “Active Members” to switch to the active Story you wish to edit.</LI><BR /> <LI>On selecting the member “Sales,” we are navigated to the Sales cost center on the Story page.</LI><BR /> </UL><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/06/Input-Task6.png" /></P><BR /> <BR /> <UL><BR /> <LI>Initially, the value of Sales is “0.”</LI><BR /> </UL><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/06/Input-Task7.png" /></P><BR /> <BR /> <UL><BR /> <LI>The value is specified by the owner of the cost center “Sales.”</LI><BR /> </UL><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/06/Input-Task8.png" /></P><BR /> <BR /> <UL><BR /> <LI>After completing the task, click on “Submit.” Add a comment if needed and then submit.</LI><BR /> </UL><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/06/Input-Task9.png" /></P><BR /> <STRONG>Step-4</STRONG><EM>:</EM> The owner of the task gets the notification in both mail and SAC. 
The owner opens the Input Task, which changes the task status to “Submitted.”<BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/06/Input-Task10.png" /></P><BR /> <BR /> <UL><BR /> <LI>After checking the values updated by the Assignee, the Assigner can either approve or cancel the submitted task.</LI><BR /> <LI>Once the task is approved or cancelled by the owner of the task, the Assignee will be notified through mail and the SAC tenant.</LI><BR /> <LI>Now the status is changed to “Successful.”</LI><BR /> </UL><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/06/Input-Task11.png" /></P><BR /> <BR /> <UL><BR /> <LI>In this case, the task is approved, and after the approval, the updated values are available in the original Story. Now you can convert the private version to the public version, which makes it visible to all users.</LI><BR /> </UL><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/06/Input-Task12.png" /></P><BR /> In conclusion, the owner of the Story can easily assign tasks and track the work.<BR /> <BR /> Always try to simplify and streamline the work. Make it easy, rather than complicated.<BR /> <BR /> Thanks for investing your time in reading this blog. Please feel free to ask a question in the comments below if you need any further information.<BR /> <BR /> If you liked my blog and thought it was informative, follow me at Monica Parithi. 
2023-06-08T00:03:57+02:00 https://community.sap.com/t5/technology-blog-posts-by-sap/sap-analytics-cloud-changes-with-managing-licenses-q3-or-q4-2023/ba-p/13552931 SAP Analytics Cloud - Changes with managing licenses (Q3 or Q4 2023) 2023-07-05T16:47:08+02:00 Matthew_Shaw https://community.sap.com/t5/user/viewprofilepage/user-id/70553 &nbsp;<BR /> <BR /> I'm pleased to announce many updates to my article on managing licenses with teams and roles for SAP Analytics Cloud. This update is needed because many significant changes are coming. This blog provides a high-level summary of what is covered in depth in my <A href="https://blogs.sap.com/2020/03/10/sap-analytics-cloud-managing-licenses-with-roles-and-teams/" target="_blank" rel="noopener noreferrer">original, now updated article</A>.<BR /> <H1 id="toc-hId-832971432">Improvements in SAP Analytics Cloud, from early November 2023:</H1><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/07/deactivate-users-in-SAC.jpg" height="260" width="404" /></P><BR /> Recent improvements:<BR /> <UL><BR /> <LI><STRONG>Users can be deactivated</STRONG>, freeing them from all license consumption</LI><BR /> <LI>This means there is no need to delete dormant users to remain compliant with SAP licensing</LI><BR /> <LI>Introduced in Q2 QRC, May 2023, for all customers</LI><BR /> <LI><A href="https://blogs.sap.com/2023/04/12/sap-analytics-cloud-activate-deactivate/" target="_blank" rel="noopener noreferrer">https://blogs.sap.com/2023/04/12/sap-analytics-cloud-activate-deactivate/</A></LI><BR /> </UL><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/07/The-number-of-available-Business-Intelligence-licenses-has-been-exceeded.jpg" height="101" width="460" /></P><BR /> Changes now in place, effective early November 2023:<BR /> <UL><BR /> 
<LI><STRONG>‘Named-user license entitlement’ will be enforced</STRONG><BR /> <UL><BR /> <LI>SAP Analytics Cloud will prevent the number of named-user licenses from exceeding their entitlement</LI><BR /> <LI>This will be enforced at the time of consuming a new license when:<BR /> <UL><BR /> <LI>creating named-users</LI><BR /> <LI>assigning roles</LI><BR /> <LI>updating team membership of roles and users</LI><BR /> </UL><BR /> </LI><BR /> </UL><BR /> </LI><BR /> <LI><DEL>Planned change is coming in Q3 QRC, August 2023, for all customers</DEL></LI><BR /> <LI><DEL><SPAN style="color: #000000">The planned change is for the end of Q3 or early Q4 2023, for all customers. There is no longer an alignment with the QRC release, meaning the planned change will be applied mid-way through the update cycle. The exact dates are still not confirmed, however, they might be</SPAN></DEL><BR /> <UL><BR /> <LI><DEL><SPAN style="color: #000000">Thursday 19th October 2023 for Asia-Pac,</SPAN></DEL></LI><BR /> <LI><DEL><SPAN style="color: #000000">Thursday 26th October 2023 for EMEA,</SPAN></DEL></LI><BR /> <LI><DEL><SPAN style="color: #000000">and Thursday 2nd November 2023 for the Americas.</SPAN></DEL></LI><BR /> </UL><BR /> <DEL><SPAN style="color: #000000">The dates remain subject to change and they could be changed without notice</SPAN></DEL></LI><BR /> <LI><STRONG>The license entitlement is now enforced for all customers worldwide. 
There are no exceptions.</STRONG></LI><BR /> <LI><A href="https://blogs.sap.com/2023/04/12/sap-analytics-cloud-license-usage-limits/" target="_blank" rel="noopener noreferrer">https://blogs.sap.com/2023/04/12/sap-analytics-cloud-license-usage-limits/</A></LI><BR /> </UL><BR /> &nbsp;<BR /> <H1 id="toc-hId-636457927">Important steps needed to prepare</H1><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/07/License-exceeded-entitlement.jpg" height="472" width="473" /></P><BR /> Essential steps may be required in preparation for this change<BR /> <UL><BR /> <LI>Action is needed if the license limit has been exceeded, as shown in the ‘Security-Monitor’ interface<BR /> <UL><BR /> <LI>a toast warning is also shown in the ‘Security-Users’ interface</LI><BR /> </UL><BR /> </LI><BR /> </UL><BR /> <UL><BR /> <LI><STRONG>If no action is taken</STRONG>:<BR /> <UL><BR /> <LI>Existing users will still be able to log in, and any schedules will keep running</LI><BR /> <LI>Users will not be locked out; they can continue to log in and enjoy SAP Analytics Cloud as they did before</LI><BR /> </UL><BR /> </LI><BR /> </UL><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/07/user-and-team-update-fail-due-to-license-limit-reached.jpg" /></P><BR /> <BR /> <UL><BR /> <LI>However:<BR /> <UL><BR /> <LI>New user creation and team updates <STRONG>will fail</STRONG> if a new license assignment exceeds the entitlement</LI><BR /> <LI>For example, the following operations <STRONG>will fail</STRONG>:<BR /> <UL><BR /> <LI>new user creation, if there are not enough available licenses</LI><BR /> <LI>any role-to-user, or role-to-team assignment change, if there are not enough available licenses</LI><BR /> <LI>any team change, if any user of that team requires a license that isn’t 
available</LI><BR /> </UL><BR /> </LI><BR /> </UL><BR /> </LI><BR /> </UL><BR /> <H1 id="toc-hId-439944422">What named-user license enforcement means</H1><BR /> <IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/07/5-what-license-enforcement-means.jpg" height="180" width="180" /><BR /> <UL><BR /> <LI>Previously, SAP Analytics Cloud allowed organisations to consume more licenses than they were entitled to<BR /> <UL><BR /> <LI>For example, you could create 100 named users on an SAP Analytics Cloud Service that only had 50 named-user licenses, although a warning would be shown in the System-Monitor screen</LI><BR /> </UL><BR /> </LI><BR /> <LI>Previously, this meant you, the customer, had the responsibility to prevent over-provisioning</LI><BR /> <LI>These changes mean:<BR /> <UL><BR /> <LI>SAP is taking on the role of preventing the over-provisioning of license entitlement</LI><BR /> <LI>Workflows that would over-provision the license entitlement will now fail, whereas before, they would not</LI><BR /> </UL><BR /> </LI><BR /> </UL><BR /> <H1 id="toc-hId-243430917">Scope</H1><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/07/6-landscape.jpg" /></P><BR /> Scope of the changes:<BR /> <UL><BR /> <LI>Applicable for all Enterprise Edition service types:<BR /> <UL><BR /> <LI>i.e.
regular production service/tenant (non-Test), Test &amp; Test Preview<BR /> <UL><BR /> <LI>A regular production service (non-Test) can be used as ‘Dev’ or ‘QA’.</LI><BR /> </UL><BR /> </LI><BR /> <LI>It makes no difference whether the service is ‘public’ or ‘private’</LI><BR /> </UL><BR /> </LI><BR /> </UL><BR /> &nbsp;<BR /> <UL><BR /> <LI>This applies only to named users; it is not applicable to users with a ‘Business Intelligence concurrent session’ license</LI><BR /> </UL><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/07/7-import-users-as-file.jpg" /></P><BR /> <BR /> <UL><BR /> <LI>Applicable for all forms of user management<BR /> <UL><BR /> <LI>Using the user interface, including the ‘import-users’ file option</LI><BR /> <LI>Includes ‘dynamic user creation’ available with SAML SSO</LI><BR /> <LI>SCIM API and user provisioning jobs</LI><BR /> </UL><BR /> </LI><BR /> </UL><BR /> &nbsp;<BR /> <UL><BR /> <LI>Not applicable for the Business Technology Platform Embedded Edition<BR /> <UL><BR /> <LI>This is the cut-down edition and provides live connectivity to SAP HANA Cloud only</LI><BR /> <LI>This edition has a fixed 150 concurrent user session limit</LI><BR /> </UL><BR /> </LI><BR /> </UL><BR /> &nbsp;<BR /> <UL><BR /> <LI>To understand more about Test and regular production service/tenant (non-Test) services, please visit the blog <A href="https://blogs.sap.com/2020/04/20/sap-analytics-cloud-landscape-architecture-life-cycle-management/#MRPT" target="_blank" rel="noopener noreferrer">https://blogs.sap.com/2020/04/20/sap-analytics-cloud-landscape-architecture-life-cycle-management/#MRPT</A></LI><BR /> </UL><BR /> &nbsp;<BR /> <H1 id="toc-hId-46917412">Resolving exceeded license limits</H1><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image"
src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/07/8-activate-deactivate-users-delete-users.jpg" /></P><BR /> Any currently exceeded license limit can be resolved by<BR /> <OL><BR /> <LI>Deactivating users, which means they don’t consume a license</LI><BR /> <LI>Assigning users so they consume a different license not yet at capacity<BR /> <UL><BR /> <LI>Any users consuming a ‘Business Intelligence’ license could be assigned a ‘Business Intelligence concurrent session’ license if you have one of these now-legacy licenses. There are no limits to the number of users that can be assigned such licenses</LI><BR /> <LI>Users currently consuming a Planning license could be assigned a ‘Business Intelligence’ license</LI><BR /> </UL><BR /> </LI><BR /> <LI>Deleting users</LI><BR /> <LI>Or, for some services and license types, adding more licenses by purchasing them from SAP</LI><BR /> </OL><BR /> <H1 id="toc-hId--149596093">Understanding new license consumption rules</H1><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/07/9-user-license-types.jpg" height="317" width="482" /></P><BR /> <BR /> <UL><BR /> <LI>There are new license calculation rules</LI><BR /> <LI>These rules that govern license consumption may not be entirely obvious</LI><BR /> </UL><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/07/10-license-consumption-calculation.jpg" height="211" width="412" /></P><BR /> <BR /> <UL><BR /> <LI>For example, some licenses may need to consume a higher license type<BR /> <UL><BR /> <LI>For instance, in this example, there are 11 Business Intelligence licenses assigned</LI><BR /> <LI>This 11 exceeds the 10 licensed by 1</LI><BR /> <LI>Since there are Planning Standard licenses available, and given such licenses include Business
Intelligence licenses, this excess of 1 license can be consumed from the Planning Standard license entitlement as only 2 of the 5 are assigned, making 3 available</LI><BR /> </UL><BR /> </LI><BR /> <LI>Comprehensive details and many more examples are provided in the updated article</LI><BR /> </UL><BR /> <H1 id="toc-hId--346109598">Current errors may disappear after the update</H1><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/07/13-error-could-disappear-scaled.jpg" /></P><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/07/14-old-rules-do-not-use-higher-license-scaled.jpg" /></P><BR /> <BR /> <UL><BR /> <LI>These new license calculation rules are <STRONG>not</STRONG> applicable before the named-user license enforcement</LI><BR /> <LI>This means existing warnings <STRONG>could</STRONG> disappear since the new calculation rules can consume a license from a ‘higher’ license type<BR /> <UL><BR /> <LI>In this instance:<BR /> <UL><BR /> <LI>There are 100 ‘Planning Professional’ licenses, of which 46 are consumed, leaving 54 available</LI><BR /> <LI>There are no ‘Planning Standard’ licenses</LI><BR /> <LI>The screenshot shows an ‘error’ even though there are 54 ‘Planning Professional’ licenses available</LI><BR /> <LI>Once the new rules are applied, this one assigned ‘Planning Standard’ license will be consumed from the ‘Planning Professional’ license type, resolving the error message</LI><BR /> </UL><BR /> </LI><BR /> </UL><BR /> </LI><BR /> </UL><BR /> <H1 id="toc-hId--542623103">Important next steps</H1><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/07/11-next-steps.jpg" height="233" width="407" /></P><BR /> <BR /> <UL><BR /> <LI>The article
‘Managing License with Roles and Teams’ has received a significant update to reflect these changes<BR /> <UL><BR /> <LI>This updated article strives to answer all your questions or possible concerns this blog might raise</LI><BR /> <LI>All the thinking has been done for you</LI><BR /> <LI>All you need to do is read this article carefully and take the appropriate action</LI><BR /> </UL><BR /> </LI><BR /> </UL><BR /> &nbsp;<BR /> <UL><BR /> <LI>It provides comprehensive details, including:<BR /> <UL><BR /> <LI>How licenses are assigned to users</LI><BR /> <LI>What the default license is, and how it is consumed when no roles are assigned</LI><BR /> <LI>How licenses are calculated and examples of consuming ‘higher’ licenses</LI><BR /> <LI>License assignment examples, including the new ‘deactivate’ users feature</LI><BR /> <LI>Examples of user creation failure due to lack of licenses, with suggested solutions</LI><BR /> <LI>Examples of team and user update failure, including suggested solutions related to Analytics Hub roles</LI><BR /> </UL><BR /> </LI><BR /> </UL><BR /> &nbsp;<BR /> <UL><BR /> <LI>It has also received an update related to the Business Intelligence concurrent session licenses, as there is a subtle but important change planned that is a common ask from customers.
This change means that when best practices are adopted, removing previously assigned Planning roles will revert the user back to a concurrent session license, unlike before, thus saving the effort to re-assign the concurrent license</LI><BR /> </UL><BR /> &nbsp;<BR /> <UL><BR /> <LI>Other updates have been made to best practices to reflect subtle changes to the SCIM API, including the updated SCIM2 API that supports the PATCH method and what this means for user provisioning with SAP IPS</LI><BR /> </UL><BR /> &nbsp;<BR /> <UL><BR /> <LI>Finally, a significant number of new FAQs have been added, including many on ‘deactivated users’</LI><BR /> </UL><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/07/12-rating.jpg" height="179" width="239" /></P><BR /> <BR /> <UL><BR /> <LI><A href="https://blogs.sap.com/2020/03/10/sap-analytics-cloud-managing-licenses-with-roles-and-teams/" target="_blank" rel="noopener noreferrer">Please proceed to this blog that introduces the updated article for further information and best practices to prepare you for these upcoming changes</A></LI><BR /> </UL><BR /> &nbsp;<BR /> <BR /> Many thanks<BR /> <BR /> Matthew Shaw&nbsp;<A href="https://twitter.com/MattShaw_On_BI" target="_blank" rel="noopener nofollow noreferrer">@MattShaw_on_BI</A> 2023-07-05T16:47:08+02:00 https://community.sap.com/t5/technology-blog-posts-by-members/my-success-with-hana-express/ba-p/13559675 My Success with HANA Express 2023-07-20T10:38:44+02:00 P281512 https://community.sap.com/t5/user/viewprofilepage/user-id/161179 Some friendly queries on embedded analytics, data science, and machine learning made me interested in SAP HANA ML.<BR /> <BR /> First, by sheer chance, I chose PAL (not APL)<BR /> <BR /> <A href="https://blogs.sap.com/2021/02/25/hands-on-tutorial-leverage-sap-hana-machine-learning-in-the-cloud-through-the-predictive-analysis-library/"
target="_blank" rel="noopener noreferrer"><SPAN style="font-weight: 400">Hands-On Tutorial: Leverage SAP HANA Machine Learning in the Cloud through the</SPAN><B> Predictive Analysis Library</B><SPAN style="font-weight: 400"> | SAP Blogs</SPAN></A><BR /> <BR /> I asked the blog author for advice on how to try this.<BR /> <STRONG>He suggested HANA Express</STRONG><BR /> <BR /> <STRONG>Install SAP HANA 2.0, express edition on a Preconfigured Virtual Machine</STRONG><BR /> <A href="https://developers.sap.com/group.hxe-install-vm.html" target="_blank" rel="noopener noreferrer">https://developers.sap.com/group.hxe-install-vm.html</A><BR /> <BR /> <EM>Excellent link</EM><BR /> <BR /> My home PC runs CentOS 8 with 16GB RAM.<BR /> I prefer Linux, as one can spin up a server with 24 or 32GB RAM for an experiment and, after the task is over, delete the instance in any cloud provider like GCP or AWS. I prefer GCP (Google Cloud Platform).<BR /> <BR /> The PAL example crashed on my own CentOS 8 machine, as it faced a lot of memory problems,<BR /> though I ran the VMware Linux X-Windows version.<BR /> I was not keen on X-Windows on GCP, as I thought nested VMs might not work.<BR /> Docker is not a VM; it is a container.<BR /> <BR /> <EM>VMware/VirtualBox .ova files can be converted to a Docker image</EM><BR /> <BR /> SAP Labs provides a Docker image, but it does not support ML:<BR /> <A href="https://hub.docker.com/r/saplabs/hanaexpress" target="_blank" rel="nofollow noopener noreferrer">https://hub.docker.com/r/saplabs/hanaexpress</A><BR /> <BR /> I therefore converted hxe.ova to the Docker image <STRONG>hxesave:latest</STRONG> using<BR /> <A href="https://medium.com/@roberto.fernandez.perez/create-docker-base-image-for-legacy-linux-system-3f5f77acd740" target="_blank" rel="nofollow noopener noreferrer">https://medium.com/@roberto.fernandez.perez/create-docker-base-image-for-legacy-linux-system-3f5f77acd740</A><BR /> <BR /> <STRONG>hxesave:latest</STRONG> was my image built from hxe.ova<BR /> <BR /> docker run -p 8090:8090 -p 4390:4390 -p
39013:39013 -p 39017:39017 -p 39041-39045:39041-39045 -p 1128-1129:1128-1129 -p 59013-59014:59013-59014 -it --name <STRONG>hxecnt</STRONG> -h hxehost hxesave:latest bash<BR /> <BR /> hxecnt is the container<BR /> docker container start hxecnt&nbsp; &nbsp; # To start<BR /> docker exec -it hxecnt bash&nbsp; &nbsp; &nbsp; # HDB start&nbsp; HDB stop<BR /> docker container stop hxecnt&nbsp; &nbsp; # To stop<BR /> <BR /> <SPAN style="text-decoration: underline">AFTER docker exec -it hxecnt bash</SPAN><BR /> hxehost:/ # <STRONG>su - hxeadm</STRONG><BR /> hxeadm@hxehost:/usr/sap/HXE/HDB90&gt; source jncbash.sh # My fav bash settings<BR /> hxeadm@hxehost:/usr/sap/HXE/HDB90&gt; cat jncstart.sh<BR /> <BR /> HDB start<BR /> HDB stop<BR /> HDB start<BR /> <BR /> hxeadm@hxehost:/usr/sap/HXE/HDB90&gt; ./jncstart.sh<BR /> The start-stop-start is needed because an init issue stops the nameserver the first time.<BR /> <BR /> On the 24GB GCP server, Yannick Schaper's PAL example ran fine!<BR /> This was a demo of a large example to bring out the miracle of SAP Data Science!<BR /> <BR /> Unknown to me, there was a pre-existing AFL plugin:<BR /> <SPAN style="font-weight: 400">SAP HANA AFL (incl. <STRONG>PAL</STRONG>, BFL, OFL)</SPAN><SPAN style="font-weight: 400"><BR /> </SPAN><SPAN style="font-weight: 400">Version 2.00.054.0000.1611928859</SPAN><BR /> <BR /> The challenge came after I tried APL:<BR /> <BR /> <A href="https://blogs.sap.com/2020/07/27/hands-on-tutorial-automated-predictive-apl-in-sap-hana-cloud/" target="_blank" rel="noopener noreferrer"><SPAN style="font-weight: 400">Hands-On Tutorial: </SPAN><B>Automated Predictive (APL) in SAP HANA Cloud</B><SPAN style="font-weight: 400"> | SAP Blogs</SPAN></A><BR /> <BR /> I did not have the APL plugin.<BR /> Install Optional Packages:<BR /> <A href="https://developers.sap.com/tutorials/hxe-ua-packages-vm.html" target="_blank" rel="noopener noreferrer">https://developers.sap.com/tutorials/hxe-ua-packages-vm.html</A><BR /> <BR /> Unfortunately
<STRONG>apl.tgz</STRONG>, the APL download, <EM>did not work even though it installed OK</EM>.<BR /> <BR /> <SPAN style="font-weight: 400">I read Installing the Automated Predictive Library (APL) on SAP HANA Express 2.0:<BR /> </SPAN><A href="https://blogs.sap.com/2018/01/11/installing-the-automated-predictive-library-apl-on-sap-hana-express-2.0/" target="_blank" rel="noopener noreferrer">https://blogs.sap.com/2018/01/11/installing-the-automated-predictive-library-apl-on-sap-hana-express-2.0/</A><BR /> <BR /> <STRONG>I downloaded the official APL plugin and this worked fine.</STRONG><BR /> <P style="overflow: hidden;margin-bottom: 0px">There is SAP Note <STRONG>2871252</STRONG> - What does AFL, APL, PAL stand for?<BR /> With the picture that explains it all</P><BR /> <IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/07/2871252-1.jpg" height="221" width="374" /><BR /> <P class="image_caption" style="text-align: center;font-style: italic">AFL PAL APL Explained</P><BR /> The way I have understood it:<BR /> AFL is the library tightly coupled with the HANA database.<BR /> PAL and APL are the two libraries that contain the data science algorithms.<BR /> PAL is part of AFL, while APL is to be installed additionally.<BR /> <A href="https://help.sap.com/doc/1d0ebfe5e8dd44d09606814d83308d4b/2.0.07/en-US/hana_ml.html" target="_blank" rel="noopener noreferrer">https://help.sap.com/doc/1d0ebfe5e8dd44d09606814d83308d4b/2.0.07/en-US/hana_ml.html</A><BR /> <BR /> To summarize:<BR /> HANA Express is simply an amazing learning vehicle.<BR /> Thanks to SAP for sharing: you can have the ScriptServer and both PAL &amp; APL to study data science fast in HANA Express, with total control.<BR /> <BR /> If you have GCP (Google Cloud Platform), you can create and drop VMs as
you wish, of whatever size and power you wish.<BR /> Just upload a Docker image (saved as a tar.gz) and you are up and running.<BR /> Delete the server when you are done.<BR /> <BR /> 90% of the time my 16GB RAM CentOS 8 machine is fine.<BR /> <BR /> I would encourage ALL developers to try HANA Express.<BR /> The free HANA in BTP has a very nice interface but is highly restricted.<BR /> I noticed SAP has now renamed AFL to PAL in BTP.<BR /> <BR /> <STRONG>If you want to experience HANA Express in full glory, do try</STRONG><BR /> at least the VMware version on Windows Pro or Linux 2023-07-20T10:38:44+02:00 https://community.sap.com/t5/technology-blog-posts-by-members/sap-datasphere-data-flow-series-introduction-and-sample-example/ba-p/13578048 Sap Datasphere Data Flow Series – Introduction and sample example 2023-11-09T21:58:07+01:00 Kunal_Mohanty https://community.sap.com/t5/user/viewprofilepage/user-id/172190 <P style="text-align: left">Hey Champs,</P><BR /> <P style="text-align: left">Let’s understand one of the important features of Datasphere, the data flow. Before moving on to <SPAN style="font-size: 1rem">the topic, we first have to understand three things well: the “What”, the “Why”, and the “How”.</SPAN></P><BR /> This article is the first in the blog post series; it introduces data flow and one small example that we’ll use along the entire blog series listed below:<BR /> <UL><BR /> <LI>Blog Post #1:&nbsp;<A href="https://blogs.sap.com/2023/11/09/sap-datasphere-data-flow-series-introduction-and-sample-example/" target="_blank" rel="noopener noreferrer">Introduction to dataflow and example</A></LI><BR /> <LI>Blog Post #2:&nbsp;<A href="https://blogs.sap.com/2023/11/17/sap-datasphere-data-flow-series-operators-joins-projection-aggregation-union/" target="_blank" rel="noopener noreferrer">Sap Datasphere Data Flow Series – Operators (Joins, Projection, Aggregation, Union)</A></LI><BR /> <LI>Blog Post #3:&nbsp;<A
href="https://blogs.sap.com/2023/11/25/sap-datasphere-data-flow-series-script-operator-part-1/" target="_blank" rel="noopener noreferrer">Sap Datasphere Data Flow Series – Script Operator Part 1</A></LI><BR /> <LI>Blog Post #4: Sap Datasphere Data Flow Series - Script Operator Part 2 (still cooking)</LI><BR /> </UL><BR /> <P style="text-align: left"><STRONG>Why:</STRONG></P><BR /> <P style="text-align: left">Data flow is an exciting new feature of SAP Datasphere because it provides a visually-based modeling experience for data integration. This makes it easier to combine and load data from various sources, including structured and unstructured data. Just imagine having BODS-style functionality integrated here; that is awesome.</P><BR /> <P style="text-align: left"><STRONG>What:</STRONG></P><BR /> <P style="text-align: left">Data flow allows you to create and manage data pipelines using a graphical interface. These pipelines can be used to perform a variety of tasks, such as extracting data from different sources, transforming it, and loading it into target destinations.</P><BR /> <P style="text-align: left"><STRONG>How:</STRONG></P><BR /> <P style="text-align: left">Well, this answer will not be short and simple; to understand it, let’s watch the full movie together.</P><BR /> <P style="text-align: left"><STRONG>User Interface:</STRONG></P><BR /> <P style="text-align: left">Champs, let’s see what tools and options SAP Datasphere provides us.</P><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/DataFlow-User-Interface.png" /></P><BR /> <P class="image_caption" style="text-align: center;font-style: italic">DataFlow User Interface</P><BR /> &nbsp;<BR /> <P style="text-align: left">Now let’s cover a simple scenario: getting data from an Excel file source and loading it into a target table.</P><BR /> <P style="text-align: left">Wait..
Do we need to create the target table again? Not at all. Thanks to SAP for this wonderful feature: we just have to click ‘Create and Deploy Target Table’ and the target table is created automatically.</P><BR /> <P style="text-align: left"><STRONG>USE CASE 1:</STRONG></P><BR /> <P style="text-align: left">We got a requirement: while getting the order information from a different source table, we want to add leading zeros to the item column. So how do we do it?</P><BR /> <P style="text-align: left">Let’s jump into the data flow to add the leading zeros:</P><BR /> <P style="text-align: left">First we will create the source table using an Excel file. After that, we drag and drop the Sales Order table into our play area:</P><BR /> <P style="overflow: hidden;margin-bottom: 0px"><STRONG><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/Adding-Source-Table.png" /></STRONG></P><BR /> <P class="image_caption" style="text-align: center;font-style: italic">Adding Source Table : Sales Order Table</P><BR /> <P style="text-align: left">Well, now let’s write one small Python script to add the leading zeros. “What? Datasphere supports Python??”.
Yes, that’s correct: Datasphere supports many Python functions, and it is still evolving.</P><BR /> <P style="text-align: left">Drag and drop the script operator as shown in the image below and write the following code.</P><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/Adding-Script-Operator.png" /></P><BR /> <P class="image_caption" style="text-align: center;font-style: italic">Adding Script Operator</P><BR /> <P style="text-align: left">Once we have finished writing the script, we then click ‘Add Table’ to create the target table.</P><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/Adding-Target-Table.png" /></P><BR /> <P class="image_caption" style="text-align: center;font-style: italic">Adding Target Table</P><BR /> <P style="text-align: left">Once we have added the target table, click on it; in the right side we will get the details panel. For the time being, select the mode ‘Append’.
Then, in the top-left corner, just click the ‘Deploy’ option.</P><BR /> <P style="text-align: left">I will discuss the rest of the mode types in detail in further blogs.</P><BR /> <P style="text-align: left"><STRONG>Append</STRONG>: Write the data obtained from the data flow as new records appended to the end of the target table.</P><BR /> <P style="text-align: left">Now go to the target table and click on data preview, and it’s done: as we can see, all the items have leading zeros.</P><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/Running-the-Dataflow.png" /></P><BR /> <P class="image_caption" style="text-align: center;font-style: italic">Running the Dataflow</P><BR /> That's not the end. Stay tuned for more blogs that I will be adding to the Data Flow Series. 2023-11-09T21:58:07+01:00 https://community.sap.com/t5/technology-blog-posts-by-members/sap-datasphere-data-flow-series-operators-joins-projection-aggregation/ba-p/13580214 Sap Datasphere Data Flow Series – Operators (Joins, Projection, Aggregation, Union) 2023-11-17T17:49:34+01:00 Kunal_Mohanty https://community.sap.com/t5/user/viewprofilepage/user-id/172190 Hey Champs,<BR /> <BR /> Hope you enjoyed the last blog on&nbsp; <A href="https://blogs.sap.com/2023/11/09/sap-datasphere-data-flow-series-introduction-and-sample-example/" target="_blank" rel="noopener noreferrer">Sap Datasphere Data Flow Series – Introduction and sample example</A>; well, it was just the trailer. Today let’s jump into the different types of operators in Datasphere in detail.<BR /> <P data-sourcepos="1:1-1:164">Operators in SAP Data Sphere are used to manipulate and transform data. They can be used to perform a wide variety of tasks, such as filtering, sorting, aggregating, and joining data.
Operators can be used in both data flows and data jobs.</P><BR /> <P data-sourcepos="3:1-3:62">Here are some of the most common operators in SAP DataSphere:</P><BR /> <BR /> <UL><BR /> <LI>Projection Operator</LI><BR /> <LI>Join Operator</LI><BR /> <LI>Union Operator</LI><BR /> <LI>Aggregation Operator</LI><BR /> <LI>Script Operator (We will cover in a separate blog)</LI><BR /> </UL><BR /> <P style="text-align: left"><STRONG>Join Operator &amp; Projection Operator in Data Flow:</STRONG></P><BR /> <P style="text-align: left" data-sourcepos="1:1-1:152">In SAP Data Sphere, the Join Operator is used to combine data from two or more sources based on a common column between them</P><BR /> <P style="text-align: left" data-sourcepos="3:1-3:72">There are several types of joins available in SAP Data Sphere, each with its own purpose:</P><BR /> <BR /> <OL style="text-align: left" data-sourcepos="5:1-5:116"><BR /> <LI data-sourcepos="5:1-5:116"><BR /> <P data-sourcepos="5:4-5:205"><STRONG>Inner Join:</STRONG> This is the most common type of join, and it combines data from two sources where the values in a related column match. Only rows with matching values are included in the result dataset.</P><BR /> </LI><BR /> <LI data-sourcepos="7:1-7:66"><BR /> <P data-sourcepos="7:4-7:66"><STRONG>Left Join:</STRONG> This join includes all rows from the left source, regardless of whether there is a matching row in the right source. If there is no matching row, the values for the right source's columns are displayed as null.</P><BR /> </LI><BR /> <LI data-sourcepos="9:1-10:0"><BR /> <P data-sourcepos="9:4-9:225"><STRONG>Right Join:</STRONG> This join is similar to the left join, but it includes all rows from the right source instead of the left source. 
If there is no matching row, the values for the left source's columns are displayed as null.</P><BR /> </LI><BR /> </OL><BR /> <P style="text-align: left" data-sourcepos="15:1-15:330">To use the Join Operator in SAP Data Sphere, you can drag and drop it onto the data flow canvas, connect it to the desired data sources, and configure the join type and join condition. The join condition specifies the related column between the sources and the values that must match for rows to be included in the result dataset.</P><BR /> <STRONG>Projection Operator:</STRONG><BR /> <BR /> It allows us to select the required fields for the output. For example, if we have 100 columns in a table and want to see only 50 of them, we can use a projection operator and select our required fields.<BR /> <BR /> Let’s see one example where we join two datasets, the Order table and the Order Item table.<BR /> We can combine the Order table and the Order Item table using the order ID as the join condition between the two datasets.<BR /> <BR /> <EM>A bit of knowledge (thoda gyan):</EM><BR /> <EM>The size limit for files being processed by the join operator is 10 GB.</EM><BR /> <BR /> <STRONG>Steps:</STRONG><BR /> <OL><BR /> <LI>From the left-side repository panel, drag the Order Header table and the Order Item table to the play area as sources.</LI><BR /> <LI>Drag and drop the join operator from the operator panel to the play area.</LI><BR /> <LI>Now drag the projection operator in front of the Order Header table and connect the Order Header table to the projection.</LI><BR /> <LI>Now connect a line from the projection and the Order Item table to the join operator.</LI><BR /> <LI>Click on the projection and change the name to Customer Name.
Now click on the join operator and select inner join as shown below.</LI><BR /> </OL><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/Adding-as-projection.png" /></P><BR /> <P class="image_caption" style="text-align: center;font-style: italic">Adding a projection</P><BR /> Now click ‘Add Target Table’; in the right-side details pane, click ‘Create and Deploy Target Table’. From the General tab, select the mode ‘Append’, then save and activate. Now, in the top-left corner, click ‘Run’ as shown in the image below.<BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/Running-the-Data-Flow.png" /></P><BR /> <P class="image_caption" style="text-align: center;font-style: italic">Running the Data Flow</P><BR /> <STRONG>Aggregation Operator:</STRONG><BR /> <BR /> Aggregation refers to the function whereby key figure values on the detail level are automatically summed up at runtime and shown or planned on an aggregated level.<BR /> <P data-sourcepos="3:1-3:23"><STRONG>SUM:</STRONG> Calculates the total sum of values for a particular column.</P><BR /> <P data-sourcepos="5:1-5:62"><STRONG>AVG:</STRONG> Calculates the average value for a particular column.</P><BR /> <P data-sourcepos="7:1-7:41"><STRONG>MIN:</STRONG> Determines the minimum value for a particular column.</P><BR /> <P data-sourcepos="9:1-9:62"><STRONG>MAX:</STRONG> Determines the maximum value for a particular column.</P><BR /> <P data-sourcepos="11:1-11:59"><STRONG>COUNT:</STRONG> Counts the total number of non-null values for a particular column.</P><BR /> <P data-sourcepos="13:1-13:92">These aggregate functions can be applied in various scenarios within Data Sphere, including:</P><BR /> <BR /> <OL data-sourcepos="15:1-21:31"><BR /> <LI data-sourcepos="15:1-16:0"><BR /> <P
data-sourcepos="15:4-15:200"><STRONG>Analyzing sales data:</STRONG> By aggregating sales figures across different time periods, product categories, or customer segments, you can identify trends, patterns, and outliers in sales performance.</P><BR /> </LI><BR /> <LI data-sourcepos="17:1-18:0"><BR /> <P data-sourcepos="17:4-17:174"><STRONG>Calculating performance metrics:</STRONG> Aggregated functions can be used to assess employee performance, track website traffic, or evaluate the impact of marketing campaigns.</P><BR /> </LI><BR /> <LI data-sourcepos="19:1-20:0"><BR /> <P data-sourcepos="19:4-19:191"><STRONG>Identifying trends in sensor data:</STRONG> By aggregating sensor readings over time, you can detect patterns and anomalies in environmental conditions, equipment performance, or user behavior.</P><BR /> </LI><BR /> <LI data-sourcepos="21:1-21:31"><BR /> <P data-sourcepos="21:4-21:31"><STRONG>Generating summary reports for companies:</STRONG> Aggregated data can be used to create&nbsp; informative reports that highlight key findings and trends.</P><BR /> </LI><BR /> </OL><BR /> <STRONG>Steps:</STRONG><BR /> <OL><BR /> <LI>Using the previous flow after Order item table, drag a projection to the play area and remove the item column from the projection. 
Keep those that you want to aggregate and those that you want to group the aggregations by.</LI><BR /> <LI>Click the Aggregation tool, drag it onto the diagram canvas, and release it where you want to create the aggregation.</LI><BR /> <LI>Click on the aggregation node, click the column named Item, and then select the aggregation type ‘Sum’.</LI><BR /> <LI>Connect the flow between the projection, the aggregation, and the join.</LI><BR /> </OL><BR /> <P style="overflow: hidden;margin-bottom: 0px">&nbsp; <IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/Adding-Aggregation-into-the-Play-Area.png" height="26" width="1016" /></P><BR /> <P class="image_caption" style="text-align: center;font-style: italic">Adding the Aggregation Node</P><BR /> &nbsp; &nbsp; &nbsp; 5. Now save, deploy, and run the data flow. Then click on the target table and do a data preview.<BR /> <P style="overflow: hidden;margin-bottom: 0px">&nbsp; <IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/Data-Preview-of-Aggregation-Node.png" /></P><BR /> <P class="image_caption" style="text-align: center;font-style: italic">Data Preview of Aggregation Node</P><BR /> <STRONG>Union in Data Flow:</STRONG><BR /> <BR /> Union in SAP Datasphere Data Flow is used to combine two datasets. Let’s take one example: I am taking the Order Header table and the Sales Order table, just to show the union functionality, and I will take only the order number from both datasets.<BR /> <BR /> <STRONG>Steps:</STRONG><BR /> <OL><BR /> <LI>Drag and drop the Order Header table into the play area as a source, add a projection, and connect the source table and projection. Now enable the Order Id column.</LI><BR /> <LI>Drag and drop the Sales Order table into the play area as a source, add a projection, and connect the source table and projection.
Now enable the Order Id column.</LI><BR /> <LI>Drag and drop the union node into the play area.</LI><BR /> <LI>Now connect the flow from each projection to the union node.</LI><BR /> <LI>Click the union node and map the Order Id columns of both tables. Then go to the unmapped section and delete the columns that are not mapped.</LI><BR /> <LI>Now save and deploy it, and click run to see the data preview.</LI><BR /> </OL><BR /> <P style="overflow: hidden;margin-bottom: 0px">But first, let's look at the ingredients for the dish:</P><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/Order-Table-Header-5.png" /></P><BR /> <P class="image_caption" style="text-align: center;font-style: italic">Order Header Table</P><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/Order-Item-Table.png" /></P><BR /> <P class="image_caption" style="text-align: center;font-style: italic">Order Item Table</P><BR /> Now we can preview the data and see the complete result.<BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/Adding-the-union-Operator.png" /></P><BR /> <P class="image_caption" style="text-align: center;font-style: italic">Adding the Union Operator</P><BR /> &nbsp;<BR /> <BR /> <STRONG>Conclusion:</STRONG><BR /> <BR /> This blog introduced the different types of operators available in the Datasphere data model that we'll use along the entire blog series, and I have explained each topic in detail. We can do a lot more with these operators. I have skipped the Script Operator and will cover it soon in another blog.<BR /> <BR /> Thanks for reading! I hope you find this post helpful. For any questions or feedback, just leave a comment below this post. 
Feel free to also check out the other blog posts in the series and follow me to learn and master SAP analytics. Let me know if you find anything that can be improved or added.<BR /> <BR /> Best wishes,<BR /> <BR /> Kunal Mohanty<BR /> <BR /> &nbsp;<BR /> <P style="overflow: hidden;margin-bottom: 0px"></P> 2023-11-17T17:49:34+01:00 https://community.sap.com/t5/technology-blog-posts-by-members/sap-datasphere-data-flow-series-script-operator-part-1/ba-p/13573999 Sap Datasphere Data Flow Series – Script Operator Part 1 2023-11-25T10:15:33+01:00 Kunal_Mohanty https://community.sap.com/t5/user/viewprofilepage/user-id/172190 Hey Champs,<BR /> <BR /> Hope you enjoyed the last blog,&nbsp;<A href="https://blogs.sap.com/2023/11/17/sap-datasphere-data-flow-series-operators-joins-projection-aggregation-union/" target="_blank" rel="noopener noreferrer">Sap Datasphere Data Flow Series – Operators (Joins, Projection, Aggregation, Union)</A>; well, that was just the trailer. Today let's jump into the script operator of Datasphere in detail.<BR /> <BR /> The <STRONG>Script Operator</STRONG> seamlessly integrates the functionality of the popular Python libraries Pandas and NumPy with SAP Data Warehouse Cloud, enabling the creation of <STRONG>Data Flows</STRONG> and tailored information views. 
This versatile operator caters to a diverse range of tasks, including data cleansing, data transformation, and more.<BR /> <BR /> <STRONG>Syntax for script operator:</STRONG><BR /> <BR /> -----------------------------------------------------------------------<BR /> <BR /> <EM>def transformation(data):</EM><BR /> <BR /> <EM>&nbsp; &nbsp; &nbsp; &nbsp;# Fill in your scripts here with data as a source</EM><BR /> <BR /> <EM>&nbsp; &nbsp; &nbsp; &nbsp;# and save the values into data as output</EM><BR /> <BR /> <EM>&nbsp; &nbsp; &nbsp; &nbsp;return data</EM><BR /> <BR /> ------------------------------------------------------------------------<BR /> <BR /> The included libraries are currently restricted to <STRONG>Pandas, NumPy</STRONG> and several built-in module operations. Even within those, not all functions are supported; check the official documentation for what is available.<BR /> <BR /> <A href="https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/73e8ba1a69cd4eeba722b458a253779d.html#loio73e8ba1a69cd4eeba722b458a253779d__section_modules." target="_blank" rel="noopener noreferrer">Sap Official Documentation</A><BR /> <BR /> <STRONG>Limitations of the SAP Datasphere Script Operator:</STRONG><BR /> <UL><BR /> <LI>We have a dedicated Python editor where we can write our code, but it has no syntax validator. That hurts a lot in practice; hopefully one will be added soon.</LI><BR /> <LI>Due to the maximum batch size of 100,000 entries, a Pandas function cannot reliably remove duplicates, because each batch is processed separately. 
Using SQL-Script is the recommended approach for this task to ensure accurate duplicate detection, even across batches.</LI><BR /> </UL><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/Limitation-of-script-operator-batch.png" /></P><BR /> <P class="image_caption" style="text-align: center;font-style: italic">Limitation due to batch size</P><BR /> <STRONG>Importance of the DataSphere Script Operator in the real world:</STRONG><BR /> <BR /> The <STRONG>Script Operator</STRONG> in SAP DataSphere offers several valuable use cases and is crucial for real-world data manipulation and analysis tasks. Here are some key uses and advantages of the Script Operator:<BR /> <UL><BR /> <LI data-sourcepos="3:4-3:338"><STRONG>Data Cleaning and Preprocessing:</STRONG> The Script Operator works very well for cleaning and preparing raw data for analysis. It helps users handle tasks like removing invalid characters, imputing missing values, standardizing data formats, and detecting outliers. This ensures data quality and reliability for analytical processes.</LI><BR /> <LI data-sourcepos="5:4-5:361"><STRONG>Complex Data Transformations:</STRONG> The Script Operator extends the capabilities of SAP DataSphere by enabling complex data transformations which are not feasible with built-in operators. It gives users the flexibility to perform advanced data manipulation techniques like feature extraction from text data, data enrichment through external sources, and custom aggregation functions.</LI><BR /> <LI data-sourcepos="7:4-7:284"><STRONG>Integration with Python Libraries:</STRONG> The Script Operator seamlessly integrates with popular Python libraries like Pandas, NumPy, and built-in modules. 
This allows users to leverage the extensive functionality of these libraries for data manipulation, analysis, and exploration.</LI><BR /> <LI data-sourcepos="9:4-9:315"><STRONG>Handling Unstructured Data:</STRONG> The Script Operator effectively handles unstructured data formats like text, JSON, and XML. Users can write Python code to parse, extract, and transform unstructured data, making it usable for downstream analysis.</LI><BR /> <LI data-sourcepos="13:4-13:308"><STRONG>Custom Data Validation and Checks:</STRONG> The Script Operator facilitates custom data validation and checks. Users can write Python code to implement custom business rules, data quality checks, and anomaly detection mechanisms, ensuring data integrity and consistency.</LI><BR /> </UL><BR /> <STRONG>Note:</STRONG> <EM>We will learn from use cases how to apply the Python operators and get familiar with the syntax. Not every use case will be useful in itself; focus on how the operators are used and how the code is written.</EM><BR /> <BR /> <STRONG>Use Case 1:</STRONG><BR /> <BR /> There will always be scenarios where we want to count, say, the number of employees in each department or the number of items in an order. So here we take a similar example, where I will use the script operator to find the count of each payment mode. 
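<BR /> <BR /> To make the grouping logic concrete before the screenshots, here is a rough plain-Pandas sketch of such a transform (a sketch only, not the blog's exact code, which is shown in the screenshot below; the column names 'Payment Mode' and 'Count of Payment mode' are taken from the expected output):

```python
import pandas as pd

def transform(data):
    # Inside the script operator, 'data' arrives as a pandas DataFrame.
    # Convert the column to a categorical type for cheaper grouping.
    data["Payment Mode"] = pd.Categorical(data["Payment Mode"])
    # Count the rows per payment mode.
    counts = data.groupby("Payment Mode", observed=True).size()
    # Shape the result into the two output columns of the target table.
    # Note: the operator processes data in batches of up to 100,000 rows,
    # so on large tables these would be per-batch counts that still need
    # a final aggregation in the target.
    return pd.DataFrame({
        "Payment Mode": counts.index.astype(str),
        "Count of Payment mode": counts.to_numpy(),
    })
```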
Along the way we will see how efficient the script operator is and what flexibility it gives us when writing Python code.<BR /> <BR /> Let’s have a look at our data:<BR /> <BR /> This is our order table data; from it, I want to count the payment modes.<BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/Sample-data-Order-table.png" /></P><BR /> <P class="image_caption" style="text-align: center;font-style: italic">Order Table Sample Data</P><BR /> <P style="overflow: hidden;margin-bottom: 0px">So we will expect output like this:</P><BR /> <IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/Expected-output-after-using-script-operator.png" /><BR /> <P class="image_caption" style="text-align: center;font-style: italic">Expected Output Script Operator</P><BR /> &nbsp;<BR /> <BR /> <STRONG>Let’s have a look at the code and understand it step by step:</STRONG><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/Python-Script-to-group-the-data.png" /></P><BR /> <P class="image_caption" style="text-align: center;font-style: italic">Python Script to group Data</P><BR /> This code defines a function called transform that takes a Pandas DataFrame as input and returns a new DataFrame with two columns: 'Payment Mode' and 'Count of Payment mode'. The function performs the following steps:<BR /> <OL><BR /> <LI data-sourcepos="34:4-34:303">Convert the <EM>'Payment Mode'</EM> column to a categorical data type: The Payment Mode column is converted to a categorical data type using the <STRONG>pd.Categorical()</STRONG> function. 
This is useful for tasks like data analysis and machine learning, as it allows for more efficient data manipulation and representation.</LI><BR /> <LI data-sourcepos="36:4-36:242">Group the data by the <EM>'Payment Mode'</EM> column: The data is grouped by the <EM>'Payment Mode'</EM> column using the <STRONG>DataFrame.groupby()</STRONG> method. This creates a grouping object that allows us to iterate over the data grouped by payment mode.</LI><BR /> <LI data-sourcepos="38:4-38:214">Create a new DataFrame to store the desired output: An empty DataFrame is created with two columns: <EM>'Payment Mode'</EM> and <EM>'Count of Payment mode'.</EM> This DataFrame will store the final output of the function.</LI><BR /> </OL><BR /> Now let's get our hands dirty in Datasphere and build the data flow.<BR /> <OL><BR /> <LI>Add a table as a source in the play area.</LI><BR /> <LI>Add a projection, keep only the Payment Mode column, and remove the other columns.</LI><BR /> <LI>Add a script operator, write the script shown above, and connect everything.</LI><BR /> <LI>Add a target table and connect the script operator to the target table. Now save and deploy it.</LI><BR /> </OL><BR /> <P style="overflow: hidden;margin-bottom: 0px"><IMG class="migrated-image" src="https://community.sap.com/legacyfs/online/storage/blog_attachments/2023/11/Usage-of-script-operator-in-DataFlow.png" /></P><BR /> <P class="image_caption" style="text-align: center;font-style: italic">Dataflow Script Operator</P><BR /> &nbsp;<BR /> <BR /> <STRONG>Conclusion:</STRONG><BR /> <BR /> This blog introduced the script operator available in the Datasphere data flow that we'll use along the entire blog series, and I have explained each part of the script operator in detail. The script operator is very handy for all kinds of data manipulation and analysis. 
Most of what we can do using SQL script we can also do using the script operator, but since Python is the language, it gives us the flexibility of many built-in functions for additional operations on data. In the second part of this script operator series, I will show what is easily possible with the script operator that we can't do using SQL script.<BR /> <BR /> Thanks for reading! I hope you find this post helpful. For any questions or feedback, just leave a comment below this post. Feel free to also check out the other blog posts in the series and follow me to learn and master SAP analytics. Let me know if you find anything that can be improved or added.<BR /> <BR /> Best wishes,<BR /> <BR /> Kunal Mohanty 2023-11-25T10:15:33+01:00 https://community.sap.com/t5/technology-blog-posts-by-members/data-flows-the-python-script-operator-and-why-you-should-avoid-it/ba-p/13664408 Data Flows - The Python Script Operator and why you should avoid it 2024-04-16T12:48:44.641000+02:00 christian_willi https://community.sap.com/t5/user/viewprofilepage/user-id/678327 <H1 id="toc-hId-862578795"><SPAN>Introduction</SPAN></H1><P><SPAN>When using SAP Datasphere to transform data for persistence, the Data Flow provides the necessary functionality. We recently compared various basic transformation tasks using different modeling approaches. 
Therefore, we tried four different approaches to implement a certain logic:</SPAN></P><OL><LI><SPAN>Modelling with the Standard Operators in the Data Flow</SPAN></LI><LI><SPAN>Modelling with a Graphical View as a source to be consumed in the Data Flow</SPAN></LI><LI><SPAN>Modelling with a SQL View as a source to be consumed in the Data Flow</SPAN></LI><LI><SPAN>Modelling with the Script Operator in the Data Flow.</SPAN></LI></OL><P><SPAN>The goal was to give a recommendation about which approach might be best for various scenarios in terms of runtime, maintenance, and other categories, and whether every scenario can even be modelled with every approach. We implemented the following scenarios:</SPAN></P><UL><LI><SPAN>String to Date Conversion</SPAN></LI><LI><SPAN>Join Data</SPAN></LI><LI><SPAN>Concatenate Columns</SPAN></LI><LI><SPAN>Aggregate Data</SPAN></LI><LI><SPAN>Transpose Data and Aggregate</SPAN></LI><LI><SPAN>Regex</SPAN></LI><LI><SPAN>Unnesting Data</SPAN></LI><LI><SPAN>Generate a Hash</SPAN></LI><LI><SPAN>Generate a Rank Column</SPAN></LI><LI><SPAN>Calculate a moving Average</SPAN></LI></UL><H1 id="toc-hId-666065290"><SPAN>Setup</SPAN></H1><P><SPAN>To have a comparable setup, we performed each action on an identical dataset, which contains the following columns:</SPAN></P><UL><LI>Region</LI><LI>Country</LI><LI>Item Type</LI><LI>Sales Channel</LI><LI>Order Priority</LI><LI>Order Date</LI><LI>Order ID</LI><LI>Ship Date</LI><LI>Unit Sold</LI><LI>Unit Price</LI></UL><P><SPAN>We uploaded this dataset (a CSV file) into a table. The table then contained 10 million records. The reason is that we wanted to get a feeling for how Data Flows and Datasphere handle large amounts of data.</SPAN></P><H1 id="toc-hId-469551785"><SPAN>Results and Interpretation</SPAN></H1><P><SPAN>The outcome of our tests is displayed in the table below. 
Note that the runtimes are displayed in MM:SS format, with seconds rounded to minutes if the runtime exceeds a few minutes.</SPAN></P><TABLE><TBODY><TR><TD width="120px"><P><STRONG><SPAN>Scenario</SPAN></STRONG></P></TD><TD width="120px"><P><STRONG><SPAN>Python (Script Operator)</SPAN></STRONG></P></TD><TD width="120px"><P><STRONG><SPAN>Standard Operator</SPAN></STRONG></P></TD><TD width="120px"><P><STRONG><SPAN>Graphical View</SPAN></STRONG></P></TD><TD width="120px"><P><STRONG><SPAN>SQL View</SPAN></STRONG></P></TD></TR><TR><TD width="120px"><P><SPAN>String to Date</SPAN></P></TD><TD width="120px"><P><SPAN>45:00</SPAN></P></TD><TD width="120px"><P><SPAN>00:45</SPAN></P></TD><TD width="120px"><P><SPAN>00:58</SPAN></P></TD><TD width="120px"><P><SPAN>00:49</SPAN></P></TD></TR><TR><TD width="120px"><P><SPAN>Join</SPAN></P></TD><TD width="120px"><P><SPAN>&nbsp;NA</SPAN></P></TD><TD width="120px"><P><SPAN>01:00</SPAN></P></TD><TD width="120px"><P><SPAN>00:53</SPAN></P></TD><TD width="120px"><P><SPAN>00:50</SPAN></P></TD></TR><TR><TD width="120px"><P><SPAN>Concatenate</SPAN></P></TD><TD width="120px"><P><SPAN>36:00</SPAN></P></TD><TD width="120px"><P><SPAN>00:52</SPAN></P></TD><TD width="120px"><P><SPAN>00:51</SPAN></P></TD><TD width="120px"><P><SPAN>00:36</SPAN></P></TD></TR><TR><TD width="120px"><P><SPAN>Aggregation</SPAN></P></TD><TD width="120px"><P><SPAN>23:00</SPAN></P></TD><TD width="120px"><P><SPAN>00:39</SPAN></P></TD><TD width="120px"><P><SPAN>00:25</SPAN></P></TD><TD width="120px"><P><SPAN>00:37</SPAN></P></TD></TR><TR><TD width="120px"><P><SPAN>Transpose and Aggregation</SPAN></P></TD><TD width="120px"><P><SPAN>24:00</SPAN></P></TD><TD width="120px"><P><SPAN>00:50</SPAN></P></TD><TD width="120px"><P><SPAN>00:28</SPAN></P></TD><TD width="120px"><P><SPAN>00:24</SPAN></P></TD></TR><TR><TD width="120px"><P><SPAN>Regex</SPAN></P></TD><TD width="120px"><P><SPAN>36:00</SPAN></P></TD><TD width="120px"><P><SPAN>00:59</SPAN></P></TD><TD 
width="120px"><P><SPAN>01:00</SPAN></P></TD><TD width="120px"><P><SPAN>00:50</SPAN></P></TD></TR><TR><TD width="120px"><P><SPAN>Unnesting Data</SPAN></P></TD><TD width="120px"><P><SPAN>14:00</SPAN></P></TD><TD width="120px"><P><SPAN>NA</SPAN></P></TD><TD width="120px"><P><SPAN>NA</SPAN></P></TD><TD width="120px"><P><SPAN>00:38</SPAN></P></TD></TR><TR><TD width="120px"><P><SPAN>Hash </SPAN></P></TD><TD width="120px"><P><SPAN>234:00</SPAN></P></TD><TD width="120px"><P><SPAN>NA</SPAN></P></TD><TD width="120px"><P><SPAN>01:00</SPAN></P></TD><TD width="120px"><P><SPAN>01:00</SPAN></P></TD></TR><TR><TD width="120px"><P><SPAN>Rank</SPAN></P></TD><TD width="120px"><P><SPAN>40:00</SPAN></P></TD><TD width="120px"><P><SPAN>NA</SPAN></P></TD><TD width="120px"><P><SPAN>00:58</SPAN></P></TD><TD width="120px"><P><SPAN>01:00</SPAN></P></TD></TR><TR><TD width="120px"><P><SPAN>Moving Averages</SPAN></P></TD><TD width="120px"><P><SPAN>23:00</SPAN></P></TD><TD width="120px"><P><SPAN>NA</SPAN></P></TD><TD width="120px"><P><SPAN>NA</SPAN></P></TD><TD width="120px"><P><SPAN>00:21</SPAN></P></TD></TR></TBODY></TABLE><P><SPAN>For better comparison, the chart below provides an overview in logarithmic scale.</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="1_execution_times_plot_log.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/93915i2DA1E0227BB66314/image-size/large?v=v2&amp;px=999" role="button" title="1_execution_times_plot_log.png" alt="1_execution_times_plot_log.png" /></span></P><P><SPAN>One of the first findings is that between the Standard Operator, the Graphical View and the SQL View there is not a huge difference. Given the amount of data, the performance is overall quite pleasant. 
</SPAN></P><P><SPAN>Additionally, some requirements or tasks are not feasible with the Standard Operator or the Graphical View, but an SQL View supports a wide range of possibilities.</SPAN></P><P><SPAN>The elephant in the room is obviously the performance of the Script Operator. The one option that should enhance your possibilities as a developer, using a currently very popular programming language, does not perform acceptably compared to the other options. After we did our tests, we contacted SAP support to verify one of our scenarios. We thought we had missed something in our modelling approach, or that this might even be a bug. Maybe we missed the “Make it fast” setting. But after we posted our incident, we got some insight from SAP Support into why this is slow. Spoiler alert: We did not miss the “Make it fast” setting. The explanation for this is quite simple. When you use the Standard Operators (without the Script Operator), the Graphical View or the SQL View, everything can be performed directly on the database. However, when you use the Script Operator, all the data processed in the Script Operator needs to be transferred to a separate SAP DI cluster, which performs the Python operation; afterwards the result needs to be transferred back. In our case that is 10 million records, which is roughly 1 GB of data. Based on the feedback from SAP, we tried to illustrate the process at a high level in the picture below.</SPAN></P><P><SPAN><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="2_data_flow_matrix.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/93917iD1EE16A7163729C0/image-size/large?v=v2&amp;px=999" role="button" title="2_data_flow_matrix.png" alt="2_data_flow_matrix.png" /></span></SPAN></P><P><SPAN>Also, the recommendation from support was that the Script Operator should only be used if the requirement cannot be implemented with one of the other options. 
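As a back-of-envelope sanity check of that 1 GB figure (the ~100 bytes per record is our assumption, roughly matching ten short columns per row):

```python
# Rough sanity check of the transfer volume quoted above.
# Assumption: 10 columns averaging ~10 bytes each, i.e. ~100 bytes per record.
records = 10_000_000
bytes_per_record = 100
total_gb = records * bytes_per_record / 1_000_000_000
print(f"~{total_gb:.1f} GB per direction (to the DI cluster and back)")
```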
However, given how the Script Operator is advertised by SAP, this can be an unpleasant surprise. Currently we think the Script Operator should be used very carefully, because in the end it might become a bottleneck when processing data during a transformation. Now one could argue that 10 million records is not something which is transferred on a regular basis in data warehouses, but we think this statement is not correct. In current SAP BW warehouses, we regularly see data volumes growing, and transferring at least 1 million records daily is not uncommon. Initially we were very excited to use Python, but currently we would generally advise against its use unless absolutely necessary. Even then, be prepared for potential performance issues during the runtime of your Data Flows.</SPAN></P><H1 id="toc-hId-273038280"><SPAN>Conclusion</SPAN></H1><P><SPAN>To reiterate, the primary takeaway is the recommendation to avoid using the Script Operator in a Data Flow. Through our tests and the incident we submitted to SAP, we gained insights into how the data is processed in the background. We also searched the Datasphere documentation to see whether SAP already provides this information somewhere, but could not find it; documenting it there would help users gain a better understanding. It might be slightly misleading how the Script Operator is advertised. It's important to be aware of its limitations, making SQL the preferred option for now.</SPAN></P> 2024-04-16T12:48:44.641000+02:00 https://community.sap.com/t5/technology-blog-posts-by-members/sac-filter-multiple-dimension-with-one-input-field/ba-p/13752778 SAC: Filter Multiple Dimension with One Input Field 2024-07-05T15:04:54.895000+02:00 HLatif https://community.sap.com/t5/user/viewprofilepage/user-id/1454436 <P><SPAN>Hello All,</SPAN></P><P>In this blog, I want to show how you can filter multiple dimensions with one input parameter. 
Classical filtering means adding a separate filter for each dimension on the report; with this approach, you need only one input field and can filter several different dimensions individually through it.<BR />The GIF below shows how it works. There are three different dimensions on the table; when I enter a value and push the filter button, the script filters the dimension that contains my input, so there is no need to type the full member name. The matching is also case-insensitive.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="0704-ezgif.com-video-to-gif-converter (2).gif" style="width: 800px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/132992iC5B583124C44A8F1/image-size/large?v=v2&amp;px=999" role="button" title="0704-ezgif.com-video-to-gif-converter (2).gif" alt="0704-ezgif.com-video-to-gif-converter (2).gif" /></span><BR /><BR />For example, when I enter "Turk", the script detects that this input is contained in the "Country" dimension as "Turkey" and filters by country. When I enter "Eur" it filters the "Region" dimension, and likewise "soda" filters the "Product Group" dimension.<BR />The script does not address dimensions by name; it works with the order of the dimensions. Therefore, when I add a new dimension, it works for that one too. I added three loops to the script, so it covers only the first three dimensions; you can add as many loops as you need.</P><P>How to do it? 
(for the multiple filtering only)<BR />- Necessary widgets: Table, Input Field, Button (* I filtered only the table in this example; you can add other widgets too as you wish)<BR />- Scripting (you only have to add scripting to the Button)</P><P>&nbsp;</P><pre class="lia-code-sample language-javascript"><code>// Get the value from the input field (lowercased for case-insensitive matching).
var INPUTTAKI_DEGER = InputField_1.getValue().toLowerCase();

// Get the dimensions that you want to filter. This example uses the first
// three dimensions on the rows; for more dimensions, add more variables
// and more loops below.
var BOYUT1 = Table_1.getDimensionsOnRows()[0];
var BOYUT2 = Table_1.getDimensionsOnRows()[1];
var BOYUT3 = Table_1.getDimensionsOnRows()[2];

// Remove the existing filters on these dimensions. You could also put this
// part on a separate "remove filters" button to clear all filters.
Table_1.getDataSource().removeDimensionFilter(BOYUT1);
Table_1.getDataSource().removeDimensionFilter(BOYUT2);
Table_1.getDataSource().removeDimensionFilter(BOYUT3);

// If the input field is empty, the script stops here and the filters on
// these first three dimensions stay removed.
if (INPUTTAKI_DEGER.length === 0) { return; }

// First loop: check every member of the first dimension.
var SECILENLER = Table_1.getDataSource().getDataSelections();
for (var i = 0; i &lt; SECILENLER.length; i++) {
    var Uyeler = Table_1.getDataSource().getResultMember(BOYUT1, SECILENLER[i]);
    var buyuk_kucuk_harf_esiteleme = Uyeler.description.toLowerCase();
    var ESAS_UYE = Uyeler.description; // keep the original description for the filter
    if (buyuk_kucuk_harf_esiteleme.includes(INPUTTAKI_DEGER)) {
        // The input is contained in a member of the first dimension:
        // filter the table and stop the script here.
        Table_1.getDataSource().setDimensionFilter(BOYUT1, ESAS_UYE);
        console.log(ESAS_UYE);
        return;
    }
}

// Second loop: the same logic for the second dimension, using member IDs.
// In my case IDs and descriptions are identical; in general, prefer IDs.
var SECILENLER2 = Table_1.getDataSource().getDataSelections();
for (var a = 0; a &lt; SECILENLER2.length; a++) {
    var Uyeler2 = Table_1.getDataSource().getResultMember(BOYUT2, SECILENLER2[a]);
    var buyuk_kucuk_harf_esiteleme2 = Uyeler2.id.toLowerCase();
    var ESAS_UYE2 = Uyeler2.id;
    if (buyuk_kucuk_harf_esiteleme2.includes(INPUTTAKI_DEGER)) {
        Table_1.getDataSource().setDimensionFilter(BOYUT2, ESAS_UYE2);
        console.log(ESAS_UYE2);
        return;
    }
}
console.log(INPUTTAKI_DEGER);

// Third loop: the same logic for the third dimension.
var SECILENLER3 = Table_1.getDataSource().getDataSelections();
for (var e = 0; e &lt; SECILENLER3.length; e++) {
    var Uyeler3 = Table_1.getDataSource().getResultMember(BOYUT3, SECILENLER3[e]);
    var buyuk_kucuk_harf_esiteleme3 = Uyeler3.id.toLowerCase();
    var ESAS_UYE3 = Uyeler3.id;
    if (buyuk_kucuk_harf_esiteleme3.includes(INPUTTAKI_DEGER)) {
        Table_1.getDataSource().setDimensionFilter(BOYUT3, ESAS_UYE3);
        console.log(ESAS_UYE3);
        return;
    }
}
</code></pre><P>&nbsp;</P><P>&nbsp;</P><P><STRONG>Summary</STRONG></P><P>In this post, I aimed to show you how to filter multiple dimensions. This way, the user can manage several dimensions with only one input field.&nbsp;</P><P>When I was looking for analytic application examples, I saw a great blog from&nbsp;<a href="https://community.sap.com/t5/user/viewprofilepage/user-id/14634">@JBARLOW</a>&nbsp;and wanted to build on it; I was inspired by his work "Use a text search to filter a dashboard". 
Thanks.&nbsp;</P> 2024-07-05T15:04:54.895000+02:00 https://community.sap.com/t5/technology-blog-posts-by-sap/introducing-data-amp-analytics-maturity-assessment-of-sap-data-and/ba-p/13864842 Introducing Data & Analytics Maturity Assessment of “SAP Data and Analytics Advisory Methodology" 2024-09-13T11:47:15.067000+02:00 abange https://community.sap.com/t5/user/viewprofilepage/user-id/677769 <P>In the last blog, I announced the <A href="https://community.sap.com/t5/technology-blogs-by-sap/new-release-of-sap-data-and-analytics-advisory-methodology/ba-p/13725453" target="_self">new release of the “SAP Data and Analytics Advisory Methodology”</A>. One of the improvements is the introduction of a Data and Analytics Maturity Assessment used in phase IV. This tool helps to evaluate data governance and organizational aspects that are essential for a successful realization of the value proposition defined for the business outcomes in phase II.<BR />Thus, this blog provides guidance on how to apply the maturity assessment to identify improvement actions that are then considered in the roadmap.</P><P><FONT size="4"><STRONG>Overview of the Data and Analytics Maturity Assessment</STRONG></FONT></P><P>The “SAP Data and Analytics Advisory Methodology” focuses on data &amp; analytics architecture development. But even the best architecture is ineffective when critical data governance processes, policies, key roles, and competencies are not established within the organization. Therefore, the methodology provides a general investigation of data governance-related topics to identify critical gaps that impact the architecture’s ability to deliver effective value.</P><P>This maturity assessment offers a total of 15 data governance and organizational focus topics across five dimensions, which can be assessed first from the current perspective and then for the desired future state. 
The assessment comprises five maturity levels, ranging from “initial” to “data-driven”.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Maturity Assessment Overview.png" style="width: 640px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/166002i1404AB29923FF037/image-dimensions/640x496?v=v2" width="640" height="496" role="button" title="Maturity Assessment Overview.png" alt="Maturity Assessment Overview.png" /></span></P><P>Not all governance topics are equally important in the context of the architectural investigation. It is sufficient to select and analyze the most relevant aspects.<BR />The maturity assessment is used in phase IV of the methodology and is organized in two parts:</P><UL><LI>Part 1 covers critical data governance topics like data strategy, data quality &amp; KPI’s, data architecture and processes.</LI><LI>Part 2 addresses typical data governance and management roles, their responsibilities, and interactions in an organization.</LI></UL><P>Let’s have a look into the two parts of the assessment in more detail.</P><P><FONT size="4"><STRONG>Part 1: Assessing data governance focus topics</STRONG></FONT></P><P>Data governance is a crucial framework that outlines the rules, processes, and accountabilities necessary for organizations to effectively manage their data. This comprehensive approach treats data as a product, ensuring that it meets specific standards and delivers value to both internal and external stakeholders. 
When data is handled in this manner, it is referred to as a "data product".<BR />The key components of data governance are:</P><UL><LI>Rules: the standards, policies, and guidelines that set the foundation for data management.</LI><LI>Processes: the mechanisms that facilitate data management and decision-making on relevant topics.</LI><LI>Roles &amp; Accountability: the responsibilities and decision rights for data management.</LI></UL><P>Managing "data as a product" involves overseeing its availability, usability, security, and integrity to deliver business value. The goal of data governance is to ensure the right data reaches the right person at the right time, and at the expected quality, enabling informed decision-making that drives business outcomes.</P><P>The first part of the maturity assessment starts by investigating the <STRONG>data strategy</STRONG> as a baseline to establish data &amp; analytics as a core competency. This is followed by looking into <STRONG>KPI’s</STRONG> to track data strategy execution and assess <STRONG>data quality</STRONG>. Then <STRONG>data architecture</STRONG> is analyzed, which is important to provide a common understanding of data. Finally, we investigate data governance&nbsp;<STRONG>processes</STRONG> to ensure sufficient data quality and deliver tailored reporting, analytics and business AI solutions.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Maturity Assessment Dimensions I.png" style="width: 752px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/166007i9CAEAEBF0210B57C/image-dimensions/752x394?v=v2" width="752" height="394" role="button" title="Maturity Assessment Dimensions I.png" alt="Maturity Assessment Dimensions I.png" /></span></P><P><FONT size="4"><STRONG>Part 2: Assessing organizational focus topics </STRONG></FONT></P><P>Part two of this maturity assessment focuses on the dimension <STRONG>“organization”</STRONG>. 
Here you evaluate the readiness of the organization regarding the roles and teams contributing to data governance and identify improvement potential.</P><P>In a mature data-driven organization, data governance and management teams consist of several roles, each with distinct responsibilities and skills.<BR />Examples include:</P><UL><LI><STRONG>Chief Data Officer:</STRONG> responsible for maintaining data quality, security, and compliance, as well as promoting innovation and data-driven decision-making within the organization.</LI><LI><STRONG>Data Steward:</STRONG> enforces data governance policies and procedures and ensures compliance with data regulations.</LI><LI><STRONG>Data Domain Owner:</STRONG> manages a specific area or domain of data within an organization to ensure accuracy, integrity, security, and availability.</LI></UL><P>Such roles are part of data governance, which is recognized as an overarching responsibility within the organization. Nevertheless, it is crucial to define the structure of governance boards and teams to establish and enforce data governance rules and policies effectively.<BR />This structure includes:</P><UL><LI><STRONG>Data Governance Council:</STRONG> provides strategic guidance for data governance programs, prioritizes data governance projects and initiatives, and approves organization-wide data policies and standards.</LI><LI><STRONG>Data Governance Team(s):</STRONG> define data governance policies, roles, methods, processes, tools, etc. (the framework), enforce data standards, and collaborate with operational teams to resolve governance issues.</LI><LI><STRONG>Data Domain Teams:</STRONG> apply data governance policies, rules, methods, tools etc. 
on a daily basis, manage data product demand &amp; requirements, and ensure their data quality and security.</LI></UL><P>The "organization" dimension of the maturity assessment evaluates to what extent such roles and teams are established in the company.</P><P>The following five focus topics are assessed:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Maturity Assessment Dimensions II.png" style="width: 757px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/166011iBB35678591BF6C73/image-dimensions/757x313?v=v2" width="757" height="313" role="button" title="Maturity Assessment Dimensions II.png" alt="Maturity Assessment Dimensions II.png" /></span></P><P><FONT size="4"><STRONG>Procedure of the maturity assessment</STRONG></FONT></P><P>Although the maturity assessment is divided into two parts, it can be executed in one session, e.g. in a workshop with relevant stakeholders. To perform the maturity assessment, the following steps are carried out:</P><UL><LI>Select data &amp; analytics governance focus topics from the five dimensions relevant to the scope of the assessment and target architecture.</LI><LI>Assess current maturity for the selected topics, either collectively or by the individual stakeholders.</LI><LI>Assess future maturity by discussing whether the current level of maturity is sufficient or whether a higher level is required.</LI><LI>If the current and required maturity levels diverge, the necessary actions must be agreed upon and incorporated into the final roadmap.<BR />The Data &amp; Analytics Advisory Methodology provides a template with a detailed description of each focus topic and its maturity levels.</LI></UL><P>Here is one example where four relevant focus topics were assessed:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Maturity Assessment Example.png" style="width: 735px;"><img 
src="https://community.sap.com/t5/image/serverpage/image-id/166014iB3DC644CA383C6F5/image-dimensions/735x261?v=v2" width="735" height="261" role="button" title="Maturity Assessment Example.png" alt="Maturity Assessment Example.png" /></span></P><P> </P><P>If the current and desired maturity levels differ, you should document the gap and the actions needed to reach the to-be maturity level.</P><P>In the example above, a data strategy did not exist but was considered essential to provide the guardrails and guidelines to improve overall data &amp; analytics governance. Also, the skills and knowledge in the organization required to work with data and create business value (data literacy) were not sufficient. Thus, follow-up actions were defined to organize a data strategy development initiative and define a training plan to improve data literacy.</P><P>If this general assessment leads to the conclusion that data governance needs to be investigated in more detail, please refer to related frameworks from SAP (SAP Data Management Framework) or SAP partners.</P><P>In the next blog I will explain phase IV of the “SAP Data and Analytics Advisory Methodology” in its entirety with a focus on developing the architecture roadmap.</P> 2024-09-13T11:47:15.067000+02:00 https://community.sap.com/t5/financial-management-blog-posts-by-members/the-exchange-rate-maze-a-journey-from-confusion-to-clarity/ba-p/13947696 The exchange rate maze: A journey from confusion to clarity 2024-11-29T10:52:02.517000+01:00 SohiniRay https://community.sap.com/t5/user/viewprofilepage/user-id/1881113 <DIV class=""><DIV class=""><DIV class=""><DIV class=""><P>When I started hanging out in the risk management space in the SAP world, one topic that always left me flabbergasted was Forex rates used in Mark to Market and Profit and Loss reporting. Neither did I have an idea about the different rates, nor did I understand the calculation of forward exchange rates. 
After successfully surviving in this commodity business for some time now, I have had the opportunity to delve into this topic numerous times, finally reaching the Eureka moment, and now I would like to pass this on to my peers or anybody who needs clarity.</P><P>To start with, we are going to talk about 3 different types of FX rates:<BR />Spot rates: The exchange rate for immediate transactions, often used for buying/selling commodities promptly. Maintained in tables like TCURR, they guide current valuation.</P><P>Swap rates: A differential added to or subtracted from a spot rate to derive a forward rate, reflecting interest rate differences between two currencies. Stored in the AT15 table, these rates are defined by maintaining interest differentials for different time buckets.</P><P>Forward FX rates: The agreed rate for a currency exchange on a future date, calculated as<SPAN>&nbsp;</SPAN><EM>Spot Rate ± Swap Rate</EM>. To calculate the forward exchange rate and derive it in risk reporting, we must execute tcode CMM_LOAD_GMDAFX.<BR />Obviously, this doesn’t clarify anything, so we will start with examples.</P><P>Spot rates need to be maintained in tcode OB08. 
We will take the currency pair EUR/USD and the below rate type M (standard translation at average rate) for our calculation:</P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="SohiniRay_0-1732537095704.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/195195i1B05586D58624033/image-size/medium?v=v2&amp;px=400" role="button" title="SohiniRay_0-1732537095704.png" alt="SohiniRay_0-1732537095704.png" /></span><P>Swap rates are maintained via tcode TMDFXFP:</P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="SohiniRay_1-1732537095709.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/195197i4CE8A61AC7D7AA5C/image-size/medium?v=v2&amp;px=400" role="button" title="SohiniRay_1-1732537095709.png" alt="SohiniRay_1-1732537095709.png" /></span><P>Now we will execute tcode CMM_LOAD_GMDAFX to check the Swap point interpolation and Forward FX rates calculation based on that. 
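</P><P>Before walking through the program logic, the derivation of a forward rate from a spot rate and linearly interpolated swap rates can be sketched in a few lines of Python. This is a simplified illustration only (function and variable names are mine, not SAP’s); it uses the same example values that appear in the screenshots below:</P>

```python
# Illustrative sketch of forward-rate derivation via linear swap-rate
# interpolation (not SAP's actual implementation; names are invented).

def interpolate_swap_rate(swap_short, swap_long, term_short, term_long, term_days):
    # Linear interpolation between the two nearest maintained term buckets:
    # swaprate = (swap_lang - swap_kurz) * (save_lfz - save_lfz_kurz)
    #            / (save_lfz_lang - save_lfz_kurz) + swap_kurz
    return (swap_long - swap_short) * (term_days - term_short) \
        / (term_long - term_short) + swap_short

def forward_rate(spot, swap):
    # Forward FX rate = Spot rate +/- Swap rate (rounded to 5 decimals here)
    return round(spot + swap, 5)

# 1 day between spot value date and maturity; buckets at 0 and 30 days:
swap_1d = interpolate_swap_rate(0.0, 0.00188, 0, 30, 1)
print(forward_rate(1.07360, swap_1d))   # 1.07366

# 31 days; buckets at 30 days (0.00188) and 61 days (0.00376):
swap_31d = interpolate_swap_rate(0.00188, 0.00376, 30, 61, 31)
print(forward_rate(1.07360, swap_31d))  # 1.07554
```

<P>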
Since, as mentioned previously, Forward Exchange rate = Spot rate<SPAN>&nbsp;</SPAN><EM>±</EM><SPAN>&nbsp;</SPAN>Swap rate, it is imperative to understand the swap rate interpolation first.</P><P>The image below shows the result retrieved after execution of CMM_LOAD_GMDAFX; after we see the numbers, we will explain the logic behind them:</P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="SohiniRay_2-1732537095939.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/195196i5F613A65F9134399/image-size/medium?v=v2&amp;px=400" role="button" title="SohiniRay_2-1732537095939.png" alt="SohiniRay_2-1732537095939.png" /></span><P>&nbsp;</P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="SohiniRay_3-1732537095934.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/195199i170657D8DA27331C/image-size/medium?v=v2&amp;px=400" role="button" title="SohiniRay_3-1732537095934.png" alt="SohiniRay_3-1732537095934.png" /></span><P>The program flow is as follows:</P><UL><LI>Depending on the evaluation date and maturity key dates, it first checks the spot rates available in the system. It picks up the latest spot rate using Function Module READ_EXCHANGE_RATE; for our calculation, Spot rate = 1.07360.<BR />Based on whether it’s a direct or indirect notation, the spot rate might be maintained as negative, in which case an inversion is performed by FM FOREX_RATE_INVERT; this is not relevant for our current scenario.</LI><LI>Next comes the swap rate interpolation. Before that, it is necessary to check the leading currency using FM FTR_GET_LEADING_CURRENCY to determine the corresponding sign to be used for the swap rate. We use FM TB_EVALUATION_SWAPRATE_INTEROLATE to interpolate the swap rates. 
For our example, we will take a few lines and explain the calculation below:</LI></UL><P>For the 1st line:</P><UL><LI>Evaluation date = 20.11.2024</LI><LI>Maturity date = 20.11.2024</LI><LI>Leading currency = EUR</LI><LI>Following currency = USD</LI></UL><P>Given that the spot value date is always two days ahead, we have the swap rate as 0.00 for the first line, which makes the Forward FX rate = Spot rate.<BR />Moving to the 2nd line, where the evaluation and maturity dates are different, the calculation is as follows:</P><UL><LI>Evaluation date = 20.11.2024</LI><LI>Maturity date = 21.11.2024</LI><LI>Swap rate = 0.00188 for 30 days from 01.11.2024</LI></UL><P>The swap rate interpolation formula from SAP is:<BR /><STRONG>swaprate = ( swap_lang - swap_kurz ) * ( save_lfz - save_lfz_kurz ) / ( save_lfz_lang - save_lfz_kurz ) + swap_kurz</STRONG></P><P>save_lfz = Maturity date (21.11.2024) – Real spot date (20.11.2024) = 1 day</P><P>When we don’t have a swap rate maintained for the specific date (21.11.2024), we consider:</P><P>swap_kurz = 0.000 (as we don’t have a direct swap rate maintained for 21.11.2024)</P><P>swap_lang = 0.00188</P><P>save_lfz_lang = 30 days</P><P>save_lfz_kurz = 0 (as we don’t have 1 day maintained in the term days in AT15)</P><P>Replacing the values in the formula we get:<BR />Swap rate = ( 0.00188 – 0 ) * ( 1 – 0 ) / ( 30 – 0 ) + 0 = 0.00006</P><P>So Forward FX rate = 1.07360 (Spot rate) + 0.00006 (Interpolated Swap rate) = 1.07366 (same as our screenshot)</P><P>To get a hang of this calculation, we will do another one where the difference between spot date and maturity date is more than 30 days, so that the swap duration of 61 days is also taken into 
consideration.<SPAN>&nbsp;</SPAN><STRONG>A very important point to remember: the term for the swap is not between the evaluation date and the maturity date, but between the spot value date and the maturity date.</STRONG></P><UL><LI>Evaluation date = 20.11.2024</LI><LI>Maturity date = 21.12.2024</LI><LI>Swap rate = 0.00188 for 30 days and 0.00376 for 61 days</LI></UL><P>The swap rate interpolation formula from SAP is:<BR /><STRONG>swaprate = ( swap_lang - swap_kurz ) * ( save_lfz - save_lfz_kurz ) / ( save_lfz_lang - save_lfz_kurz ) + swap_kurz</STRONG></P><P>save_lfz = Maturity date (21.12.2024) – Real spot date (20.11.2024) = 31 days</P><P>When we don’t have a swap rate maintained for the specific date (21.12.2024), we consider:</P><P>swap_kurz = 0.00188 (the last swap rate before the new term days)</P><P>swap_lang = 0.00376 (the latest swap rate based on the term days)</P><P>save_lfz_lang = 61 days (based on the latest term days value)</P><P>save_lfz_kurz = 30 days (the last term days duration)</P><P>Replacing the values in the formula we get:<BR />Swap rate = ( 0.00376 – 0.00188 ) * ( 31 – 30 ) / ( 61 – 30 ) + 0.00188 = 0.00194</P><P>So Forward FX rate = 1.07360 (Spot rate) + 0.00194 (Interpolated Swap rate) = 1.07554 (same as our screenshot)</P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="SohiniRay_4-1732537095941.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/195198i8B5F94BE8A5DC57C/image-size/medium?v=v2&amp;px=400" role="button" title="SohiniRay_4-1732537095941.png" alt="SohiniRay_4-1732537095941.png" /></span><P>We can keep reconciling these results based on the calculations specified above for 
validating the formula for different cases.<BR />Surely this is not an easy calculation, and remembering it every time you reconcile risk reporting values can be frustrating, so hopefully this blog can be your one-stop memory refresher rather than hours of debugging.</P><P>Happy Forex-ing!</P></DIV></DIV></DIV></DIV> 2024-11-29T10:52:02.517000+01:00 https://community.sap.com/t5/technology-blog-posts-by-members/connectivity-bw4hana-amp-sac-configuration-of-import-connection/ba-p/13953864 Connectivity-BW4HANA & SAC:: Configuration of Import Connection 2024-12-04T11:49:26.529000+01:00 hemant2004 https://community.sap.com/t5/user/viewprofilepage/user-id/157161 <P><SPAN>Integration between SAC and BW offers a dynamic and comprehensive solution for accessing, analyzing, and visualizing data stored in BW.</SPAN></P><P><SPAN>Types Of Connections:</SPAN></P><P>1.) Import: When there is a need to bring data from BW into SAP SAC for further analysis, modeling, and visualization, an import connection between SAP SAC and BW is necessary. Below we will explore the step-by-step process of configuring an import connection and setting it up in SAP SAC.</P><P>2.) Live: A live connection between SAP SAC and BW is preferred when real-time access to BW data is critical for decision-making. This blog, however, focuses on the import connection.</P><P>Configuration: Import Connection between BW4HANA &amp; SAC:</P><H6 id="toc-hId-1592930557">To configure the data source in SAP SAC, follow these simple steps.</H6><P>1. First, navigate to the “system” tab in the administration section. Next, click on the “data source configuration” tab, where you will encounter a variety of data sources to choose from. Scroll down until you find the SAP BTP core account option. 
Take note of the region host and sub-account user details displayed there, as they will be essential for the next step.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="hemant2004_1-1733206126073.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/197731iBA2784CB6A40F31E/image-size/medium?v=v2&amp;px=400" role="button" title="hemant2004_1-1733206126073.png" alt="hemant2004_1-1733206126073.png" /></span></P><P>2.To establish a secure connection between your on-premise application and the cloud, a cloud connector is essential. This crucial tool acts as a handshaking application, facilitating a seamless link between the two environments. Before proceeding with the installation of the cloud connector, ensure that you have a SAP JVM ready. Locate or download the SAP JVM and make note of its file path, as you will need to specify this during the installation process.</P><P>3.Once you have the SAP JVM, extract it onto the server where you plan to install the Cloud Connector. As an example, you can unzip the file into the directory C:\SAP JVM\sapjvm_8. You can download it from this link:&nbsp;<A href="https://tools.hana.ondemand.com/%23cloud" target="_blank" rel="noopener nofollow noreferrer">https://tools.hana.ondemand.com/#cloud</A></P><P>4.After obtaining the SAP JVM from the provided link, download the cloud connector from the same source. 
During installation, specify the folder location of the SAP JVM.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="hemant2004_2-1733206209960.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/197732iB987E220C36E2A45/image-size/medium?v=v2&amp;px=400" role="button" title="hemant2004_2-1733206209960.png" alt="hemant2004_2-1733206209960.png" /></span></P><P><SPAN>5. Once the installation process is complete, open your web browser and enter the following URL:&nbsp;</SPAN><A href="https://localhost:8443/" target="_blank" rel="noopener nofollow noreferrer">https://localhost:8443</A><SPAN>. This will take you to the logon screen. Here, use the default credentials: “Administrator” for the username and “manage” for the password. It is recommended to change the password after your initial logon. At this point, you will be prompted to choose between a master and shadow installation. Opt for the “Master” option if you are installing a single Cloud Connector instance or the primary instance from a pair of Cloud Connector instances.</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="hemant2004_4-1733206343894.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/197735i8C9253E868DB9907/image-size/medium?v=v2&amp;px=400" role="button" title="hemant2004_4-1733206343894.png" alt="hemant2004_4-1733206343894.png" /></span></P><P><SPAN>6. Next, enter the subaccount details that you noted earlier. Specify the proxy host and port if you are using a proxy; otherwise, leave these fields blank. Optionally, enter a Location ID if you operate multiple Cloud Connectors. 
Remember to save the configuration.</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="hemant2004_5-1733206386891.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/197736iEAA46775D0525D5D/image-size/medium?v=v2&amp;px=400" role="button" title="hemant2004_5-1733206386891.png" alt="hemant2004_5-1733206386891.png" /></span></P><P><SPAN>7. Once you have completed the necessary configuration, click on the “Connect” button. After doing so, verify that the connector state is displayed as “Connected.” This confirmation ensures that the Cloud Connector has successfully established a connection.</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="hemant2004_6-1733209529675.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/197742iC0788D5A3A22338A/image-size/medium?v=v2&amp;px=400" role="button" title="hemant2004_6-1733209529675.png" alt="hemant2004_6-1733209529675.png" /></span></P><P>8. When establishing connections with SAP BPC MS, SAP BW, SAP UNX, or SAP ERP systems, it is necessary to install the SAP Analytics Cloud agent in addition to configuring the Cloud Connector. However, for connections with SAP BPC NW, SAP BPC for BW/4HANA, OData, or SAP S/4HANA, configuring the Cloud Connector alone is sufficient.</P><P>9. To proceed with the installation of the SAP Analytics Cloud Agent, there is a preliminary step to follow. It involves downloading Apache Tomcat, a widely-used web server and servlet container, from the official website at&nbsp;<A href="https://tomcat.apache.org/download-80.cgi" target="_blank" rel="noopener nofollow noreferrer">https://tomcat.apache.org/download-80.cgi</A>. Once the download is complete, simply double click on the Tomcat executable file to initiate the installation procedure. During this step, you will be prompted to accept the license agreement. 
Moving forward, when presented with the Choose Components screen, it is recommended to stick with the default options, which are already configured optimally. Then click Next.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="hemant2004_7-1733209568680.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/197743i1AFFF2E82E10B286/image-size/medium?v=v2&amp;px=400" role="button" title="hemant2004_7-1733209568680.png" alt="hemant2004_7-1733209568680.png" /></span></P><P><SPAN>10. Next, specify the ports to be used by Apache Tomcat. It is essential to ensure that there are no conflicts with existing applications already running on your system.</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="hemant2004_8-1733209633165.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/197744i4E4E4F6C9A241306/image-size/medium?v=v2&amp;px=400" role="button" title="hemant2004_8-1733209633165.png" alt="hemant2004_8-1733209633165.png" /></span></P><P>11. Next, specify the required installation path and install.</P><P>12. Once you have successfully installed Apache Tomcat, the next vital step in optimizing the performance of SAP Analytics Cloud Agent is to configure the Java heap space allocation. Launching the Tomcat configuration allows you to modify these settings and allocate more memory accordingly. To initiate the configuration process, navigate to the Tomcat installation directory and locate the Tomcat8w.exe file. Simply double click on it to open the configuration window. By default, the initial and maximum heap space values (-Xms and -Xmx) are usually set to 128MB and 256MB respectively, which often prove insufficient for efficient data acquisition in SAC, leading to timeout errors. It is crucial to increase these values to prevent such issues. 
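</P><P>When Tomcat runs as a Windows service, these values are set via the Tomcat8w dialog as described. If you instead start Tomcat from the command line via catalina.bat, the same heap settings can be supplied through CATALINA_OPTS in a setenv script (an illustrative sketch; the file does not exist by default and must be created, and it is not read by the service wrapper):</P>

```bat
rem %CATALINA_HOME%\bin\setenv.bat - picked up by catalina.bat at startup
rem Equivalent to raising the initial/maximum heap in the Tomcat8w dialog
set "CATALINA_OPTS=-Xms1024m -Xmx2048m"
```

<P>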
For enhanced performance, it is recommended to set the values to 1024MB and 2048MB respectively.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="hemant2004_9-1733209682549.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/197745i10F802F5C7D021CA/image-size/medium?v=v2&amp;px=400" role="button" title="hemant2004_9-1733209682549.png" alt="hemant2004_9-1733209682549.png" /></span></P><P><SPAN>13.To install the SAP SAC (Analytics Cloud) agent, begin by downloading it from the SAP Support Portal:&nbsp;</SPAN><STRONG><A href="https://support.sap.com/swdc" target="_blank" rel="noopener noreferrer">https://support.sap.com/swdc</A></STRONG><SPAN>. Access the SAP Software Downloads page and navigate to “By Category.” Choose “SAP Cloud Solutions” and then select “SAP ANALYTICS CLOUD CONN SAP ANALYTICS CLOUD CONN 1.0 SAP ANALYTICS CLOUD AGENT 1.0.” Proceed to download the latest version available.</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="hemant2004_10-1733209719634.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/197746i99E029010CD65FF5/image-size/medium?v=v2&amp;px=400" role="button" title="hemant2004_10-1733209719634.png" alt="hemant2004_10-1733209719634.png" /></span></P><P><SPAN>14.Once downloaded, unzip the file and rename the WAR file to C4A_AGENT.war. Extract the package and copy the C4A_AGENT.war file to the webapps directory in your Tomcat installation. 
When you restart Tomcat, the agent will automatically deploy.</SPAN></P><DIV class="">&nbsp;</DIV><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="hemant2004_11-1733209755429.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/197749i3AD3531C83829270/image-size/medium?v=v2&amp;px=400" role="button" title="hemant2004_11-1733209755429.png" alt="hemant2004_11-1733209755429.png" /></span></P><P><SPAN>15.To establish the authentication credentials needed for configuring the SAC Agent, follow these steps. First, locate the Tomcat conf directory and open the tomcat-users.xml file. Inside this file, you can add a new user with the Services role. This user will have the necessary permissions to ensure smooth communication between SAP Analytics Cloud and the SAC Agent. Simply provide a unique username and password for the new user, and make sure to save the changes.</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="hemant2004_12-1733209824959.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/197750i3E308AF4688D845A/image-size/medium?v=v2&amp;px=400" role="button" title="hemant2004_12-1733209824959.png" alt="hemant2004_12-1733209824959.png" /></span></P><P>16.For establishing a connection with SAP BW, an additional component called the SAP JCO connector is required. To obtain it, you can download the connector from the official SAP website at&nbsp;<A href="http://support.sap.com/swdc" target="_blank" rel="noopener noreferrer">http://support.sap.com/swdc</A>. Once the download is complete, extract the contents of the downloaded zip file. Within the extracted files, locate the sapjco3.dll and sapjco3.jar files. Copy these two files and paste them into the tomcat/lib directory. 
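</P><P>For reference, the user entry described in step 15 might look like the following in tomcat-users.xml (the username and password here are placeholders; choose your own):</P>

```xml
<!-- conf/tomcat-users.xml: illustrative user entry for the SAC agent -->
<tomcat-users>
  <role rolename="Services"/>
  <user username="sacagent" password="ChangeThisPassword" roles="Services"/>
</tomcat-users>
```

<P>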
After completing the file transfer, it is essential to restart the Tomcat application for the changes to take effect.</P><P>17. Now go to your cloud connector at&nbsp;<A href="https://localhost:8443/" target="_blank" rel="noopener nofollow noreferrer">https://localhost:8443</A>. Enter your login credentials to access the Cloud Connector administration console. Once logged in, select the “Cloud To On-Premise” option. Next, locate the “+” icon and click on it to add a new Access Control entry. This step is crucial for defining the access permissions for on-premises systems.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="hemant2004_13-1733209876996.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/197751iB266521220385B6E/image-size/medium?v=v2&amp;px=400" role="button" title="hemant2004_13-1733209876996.png" alt="hemant2004_13-1733209876996.png" /></span></P><P><SPAN>18. The next step is to specify the Back-end Type. Select “Other SAP System” from the available options.</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="hemant2004_14-1733209918045.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/197752iD6A643B9F0E30EF3/image-size/medium?v=v2&amp;px=400" role="button" title="hemant2004_14-1733209918045.png" alt="hemant2004_14-1733209918045.png" /></span></P><P><SPAN>19. Then select the HTTP protocol. 
Choose HTTPS if SSL is configured on the Tomcat instance where the SAC Agent is deployed.</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="hemant2004_15-1733209953893.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/197753iAEB1E1F9677223AE/image-size/medium?v=v2&amp;px=400" role="button" title="hemant2004_15-1733209953893.png" alt="hemant2004_15-1733209953893.png" /></span></P><P><SPAN>20. Enter the hostname (Internal Host) and port number (Internal Port) of the Tomcat server where the SAC Agent is running. The default HTTP port for Tomcat is 8080, while the default HTTPS port is 8443. Therefore, when specifying the Internal Host, provide the hostname or IP address of the Tomcat server. For the Internal Port, enter the corresponding port number based on whether you are using HTTP or HTTPS (8080 for HTTP or 8443 for HTTPS).</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="hemant2004_16-1733209984200.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/197755iECE6CDBD8A7589F6/image-size/medium?v=v2&amp;px=400" role="button" title="hemant2004_16-1733209984200.png" alt="hemant2004_16-1733209984200.png" /></span></P><P>21. Now provide the virtual host and port that will be used on the SAC side.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="hemant2004_17-1733210013685.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/197756i534EC7002525E89C/image-size/medium?v=v2&amp;px=400" role="button" title="hemant2004_17-1733210013685.png" alt="hemant2004_17-1733210013685.png" /></span></P><P><SPAN>22. Select “None” as the Principal Type. This selection indicates that there is no specific principal or user identity associated with the access control settings. 
By choosing “None,” you are specifying that the access control applies to all users or principals attempting to access the specified resources. This can be useful in scenarios where you want to grant general access to the SAP SAC Agent without requiring specific user authentication or identity verification. However, please note that depending on your specific security requirements, you may need to consider alternative options or additional authentication mechanisms to ensure secure access to the SAC Agent.</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="hemant2004_18-1733210067421.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/197757i0E18E6BC98906EAD/image-size/medium?v=v2&amp;px=400" role="button" title="hemant2004_18-1733210067421.png" alt="hemant2004_18-1733210067421.png" /></span></P><P><SPAN>23. Select the option “Check availability of internal host” to ensure the HANA Cloud Connector can access the C4A Agent, and click Finish.</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="hemant2004_19-1733210102529.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/197758i3402C0033BB9E573/image-size/medium?v=v2&amp;px=400" role="button" title="hemant2004_19-1733210102529.png" alt="hemant2004_19-1733210102529.png" /></span></P><P>24. The next step is to add a resource in the Cloud Connector configuration. Set the URL PATH to “/C4A_AGENT/” and choose the option “Path and all sub-paths” to allow access to this specific path and its sub-paths. This URL PATH represents the endpoint or route where the SAC Agent is accessible within the Tomcat server. By adding this resource and specifying the URL PATH, you are granting the necessary permissions for SAP SAC (Analytics Cloud) to communicate with the SAC Agent and retrieve the required data. 
Once the URL PATH and access settings are configured, click on the “Save” button to save the resource configuration.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="hemant2004_20-1733210136147.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/197759i24FF96536175E935/image-size/medium?v=v2&amp;px=400" role="button" title="hemant2004_20-1733210136147.png" alt="hemant2004_20-1733210136147.png" /></span></P><H2 id="toc-hId-880086176"><STRONG>Creating an import connection in SAC:</STRONG></H2><P>Follow these simple steps within your SAC tenant. Firstly, navigate to the administration section and access the data configuration tab. Here, you can establish a connection to your on-premise data sources by adding a new location. Enter the required information to create the connection. Specify the Host as the virtual hostname that you previously defined during the HANA Cloud Connector configuration. Next, input the virtual port that you set during the HANA Cloud Connector configuration in the Port field. In the Username and Password fields, enter the credentials that were specified in the tomcat-users.xml file. These credentials ensure secure authentication for the connection between SAC and your on-premise data sources. Once you have entered all the necessary information, simply click on the Create button. 
This will establish the connection, allowing you to access and analyze your on-premises data.</P><P>Happy Learning!!</P><P>&nbsp;</P><P>&nbsp;</P><P><SPAN>&nbsp;</SPAN></P> 2024-12-04T11:49:26.529000+01:00 https://community.sap.com/t5/technology-blog-posts-by-members/boosting-performance-in-sap-analytics-cloud-simple-and-efficient-strategies/ba-p/13990231 Boosting Performance in SAP Analytics Cloud: Simple and Efficient Strategies 2025-01-19T16:56:34.177000+01:00 vignesh3027 https://community.sap.com/t5/user/viewprofilepage/user-id/160733 <P><STRONG>Objective:</STRONG> Optimize performance in SAP Analytics Cloud using simple and efficient steps.</P><P>I strongly believe data modeling is best optimized in <STRONG>Datasphere</STRONG> rather than in <STRONG>HANA modeling</STRONG>. In addition, you can enhance the performance of dashboards in SAC by following the steps mentioned below:</P><OL><LI>Follow the widget weightage rules from SAP.</LI><LI>Implement Apply buttons for input control, measure control, and dimension control.</LI><LI>Filter Optimization: Use Story Filters Instead of Page Filters.</LI><LI>Collapse Page Filters and Input Controls.</LI></OL><P>Let's break down each of these.</P><P><STRONG>Follow Widget Weightage Rules by SAP</STRONG></P><UL><LI><STRONG>Overview:</STRONG> SAP Analytics Cloud assigns widget weightage based on complexity and resource consumption. To ensure optimal performance, developing dashboards while adhering to the widget weightage rule is advisable.
If multiple KPIs need to be developed, they should be distributed across separate pages, maintaining a recommended total weight of 5 units per page.</LI><LI><STRONG>Performance Considerations:</STRONG> Developing 14 to 20 KPIs and multiple input controls on a single page may be <STRONG><EM>optimized for design</EM></STRONG> but <STRONG><EM>could impact performance.</EM></STRONG></LI><LI><STRONG>Best Practice:</STRONG> Instead of placing all KPIs on one page, split them according to the widget weightage rule, ensuring no more than 5 units per page. This will help in achieving an optimized, high-performing dashboard.</LI><LI><STRONG>For better understanding visit:</STRONG> <A href="https://help.sap.com/docs/SAP_ANALYTICS_CLOUD/00f68c2e08b941f081002fd3691d86a7/fbe339efda1241b5a3f46cf17f54cdff.html?locale=en" target="_blank" rel="noopener noreferrer">Apply Best Practices for Performance Optimization in Story Design | SAP Help Portal</A></LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vignesh3027_0-1737299725422.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/214141iF0EC3C586494EABB/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_0-1737299725422.png" alt="vignesh3027_0-1737299725422.png" /></span></P><P>&nbsp;</P><UL><LI><STRONG>Optimization Tips:</STRONG></LI><UL><LI>Use <STRONG>Optimized Design Experiences</STRONG> to create stories.</LI><LI>Limit the <STRONG>number of widgets</STRONG> on a single page.</LI><LI>Go with <STRONG>optimized charts</STRONG> such as bar or line charts instead of complex visualizations where possible.</LI><LI>Minimize the use of <STRONG>widgets with high rendering weights</STRONG>, such as maps or complex hierarchical tables.</LI><LI>Leverage the <STRONG>Performance Analyzer</STRONG> to assess whether the story is optimized and identify any issues. 
It will display which widgets or scripts take longer to load.</LI></UL></UL><P><STRONG>Implement Apply Buttons for Input Control, Measure Control, Dimension Control</STRONG></P><UL><LI>Implementing an <STRONG>Apply Button</STRONG> for input, measure, and dimension controls is an effective way to optimize performance in SAP Analytics Cloud (SAC) dashboards.</LI><LI>By using an Apply button, you allow users to make multiple selections or changes without triggering data recalculations or page refreshes with every single input. Instead, the system will <STRONG>only process the changes</STRONG> when the user <STRONG><EM>explicitly clicks the Apply button</EM></STRONG>, minimizing unnecessary data reloads and improving dashboard performance.</LI></UL><P><STRONG>Benefits of Using an Apply Button:</STRONG></P><OL><LI><STRONG>Improved Performance:</STRONG> By delaying the data refresh until the user clicks the Apply button, you reduce the load on the system, especially for complex stories with multiple filters.</LI><LI><STRONG>User Control:</STRONG> Users can select multiple filters or options at once and see the effect of their selections only after applying them, leading to a more controlled and predictable user experience.</LI><LI><STRONG>Avoid Redundant Queries:</STRONG> Without an Apply button, the system may trigger data queries for every change made to the input controls, which can slow down performance. The Apply button ensures that only the necessary queries are triggered when needed.</LI><LI>In this case, the dashboard triggers a data query every time the user changes a filter without the apply button and with the apply button. 
Here's an example of how SQL queries would be triggered:</LI></OL><P>&nbsp;</P><TABLE width="555px"><TBODY><TR><TD width="164.8px" height="30px"><STRONG>Scenario</STRONG></TD><TD width="389.4px" height="30px"><STRONG>SQL Queries</STRONG></TD></TR><TR><TD width="164.8px" height="142px"><STRONG>Without Apply</STRONG></TD><TD width="389.4px" height="57px"><P><EM><STRONG>Query 1:</STRONG></EM> SELECT * FROM sales_data WHERE Invoice_Date IN ('2020', '2021', '2022')</P><P><EM><STRONG>Query 2:</STRONG></EM> SELECT * FROM sales_data WHERE Invoice_Date IN ('2020', '2021', '2022') AND Region IN ('South', 'Southeast')</P></TD></TR><TR><TD width="164.8px" height="85px"><STRONG>With Apply</STRONG></TD><TD width="389.4px" height="85px"><STRONG>Query (After Apply):</STRONG> SELECT * FROM sales_data WHERE Invoice_Date IN ('2020', '2021', '2022') AND Region IN ('South', 'Southeast')</TD></TR></TBODY></TABLE><P><STRONG><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Apply Button" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/214142iB1E989717C012658/image-size/large?v=v2&amp;px=999" role="button" title="ApplyFilter.gif" alt="Apply Button" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Apply Button</span></span></STRONG></P><P><STRONG>Impact on Performance:</STRONG></P><UL><LI><STRONG>Without Apply Button:</STRONG> Multiple queries are sent to the data source for each filter change, causing <STRONG>extra load and slowing down the dashboard performance</STRONG>, especially with larger datasets or complex queries.</LI><LI><STRONG>With Apply Button:</STRONG> Only a <STRONG>single query is triggered</STRONG> after the user applies their filter selections, improving performance by reducing the number of queries and data retrieval times.</LI></UL><P><STRONG>Filter Optimization: Use Story Filters Instead of Page
Filters</STRONG></P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vignesh3027_3-1737300757410.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/214144i8FB731083C1ABEBA/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_3-1737300757410.png" alt="vignesh3027_3-1737300757410.png" /></span></P><P>&nbsp;</P><P>If the same filter/input control is repeated across all pages in a story, use a Story Filter (Global Filter) instead of Page Filters. Here's why:</P><OL><LI><STRONG>Consistency Across Pages:</STRONG><BR />A Story Filter applies the same filter to all pages, ensuring uniformity in data across the story.</LI><LI><STRONG>Simplified User Interaction:</STRONG><BR />Users don’t need to set the same filter on every page; changing it once updates all pages.</LI><LI><STRONG>Reduced Complexity:</STRONG><BR />Managing one Story Filter is simpler than maintaining multiple Page Filters with identical logic.</LI><LI><STRONG>Improved Performance for Shared Filters:</STRONG><BR />Since the filter is applied globally, it reduces redundancy in querying and ensures a smoother user experience.</LI></OL><P><STRONG>Usage:<BR /></STRONG>Apply a Story Filter when the same filter (e.g., Region or Date) is required across all pages to maintain a consistent context and improve usability.</P><P><STRONG>Collapse Page Filters and Input Controls</STRONG></P><P class="">Expanded page filters or input controls are convenient because you can quickly select members or search for members in the control. 
However, an expanded input control refreshes often and that can affect the story's performance.</P><P class="">When the input control is collapsed, you must select the control to display the member list before you can search or make your selections.</P><DIV class="">&nbsp;</DIV><DIV class="">&nbsp;</DIV><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="loio2a3a7f11cc5c4b77a2b4c33b04fc60d5_LowRes.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/214143i8BFD7CA7EAEEC730/image-size/large?v=v2&amp;px=999" role="button" title="loio2a3a7f11cc5c4b77a2b4c33b04fc60d5_LowRes.png" alt="loio2a3a7f11cc5c4b77a2b4c33b04fc60d5_LowRes.png" /></span></P><P>Hope this helps you guys.</P><P>Thank you.</P><P>&nbsp;</P><P>&nbsp;</P> 2025-01-19T16:56:34.177000+01:00 https://community.sap.com/t5/technology-blog-posts-by-members/end-to-end-designing-the-multilevel-hierarchy-in-datasphere/ba-p/14009535 End to End Designing the Multilevel Hierarchy in Datasphere 2025-02-06T19:33:50.934000+01:00 kartheekkkota https://community.sap.com/t5/user/viewprofilepage/user-id/227849 <P>In this blog we would like to cover how to design a multilevel hierarchy in Datasphere on CDS views.</P><P>I have used Profit Center in my scenario, and below is the list of standard CDS views required to design the hierarchy.</P><TABLE><TBODY><TR><TD><P><SPAN>CDS View Name</SPAN><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>SQL VIEW NAME</SPAN><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>CDC</SPAN><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>Support DataFlow</SPAN><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>Support Replication
flow</SPAN><SPAN>&nbsp;</SPAN></P></TD></TR><TR><TD><P><SPAN>I_PROFITCENTERTEXT</SPAN><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>IFIPROFITCENTERT</SPAN><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>NO</SPAN><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>Yes</SPAN><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>&nbsp;</SPAN></P></TD></TR><TR><TD><P><SPAN>I_PROFITCENTER</SPAN><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>IFIPROFITCENTER</SPAN><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>YES</SPAN><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>Yes</SPAN><SPAN>&nbsp;</SPAN></P></TD></TR><TR><TD><P><SPAN>I_PROFITCENTERHIERARCHY</SPAN><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>IFIPROFITCENTERH</SPAN><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>Yes</SPAN><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>&nbsp;</SPAN></P></TD></TR><TR><TD><P><SPAN>I_PROFITCENTERHIERARCHYTEXT</SPAN><SPAN>&nbsp;</SPAN></P><P><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>IFIPRFTCTRHT</SPAN><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>Yes</SPAN><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>&nbsp;</SPAN></P></TD></TR><TR><TD><P><SPAN>I_PROFITCENTERHIERARCHYNODE</SPAN><SPAN>&nbsp;</SPAN></P><P><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>IPRFTCTRHNODE</SPAN><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>Yes</SPAN><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>No</SPAN><SPAN>&nbsp;</SPAN></P></TD></TR><TR><TD><P><SPAN>I_PROFITCENTERHIERARCHYNODET</SPAN><SPAN>&nbsp;</SPAN></P><P><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>IFIPRFTCTRHNODET</SPAN><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>No</SPAN><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>Yes</SPAN><SPAN>&nbsp;</SPAN></P></TD><TD><P><SPAN>No</SPAN><SPAN>&nbsp;</SPAN></P></TD></TR></TBODY></TABLE><P>To Achieve a hierarchy Representative key plays a key role. 
If we forget to define the representative key, the hierarchy will not be visible.</P><P>Unlike BW, where transaction RSH1 lets you check whether a hierarchy is enabled, Datasphere offers no such option at the master data dimension level or the fact association level.</P><P>What is a representative key?</P><P><SPAN class=""><SPAN class="">In an analytical scenario where data is exposed as a dimension, the master data can have multiple key fields, but a single key is required to access the data. That's why this field must be identified as the representative key field.</SPAN></SPAN><SPAN class="">&nbsp;</SPAN></P><P><SPAN class="">For better understanding, I refer to the standard CDS view names here; while designing, replace each standard view name with your custom view name in Datasphere.</SPAN></P><P><STRONG>I_Profitcentertext :&nbsp;</STRONG></P><P>Semantics Usage as<STRONG> Text&nbsp;&nbsp;</STRONG></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kartheekkk_0-1738859505816.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/222853i91F2E28CAF29A358/image-size/medium?v=v2&amp;px=400" role="button" title="kartheekkk_0-1738859505816.png" alt="kartheekkk_0-1738859505816.png" /></span></P><P>Preparing the Hierarchy Directory:&nbsp;</P><P>I_PROFITCENTERHIERARCHYTEXT:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kartheekkk_2-1738859845798.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/222855i95C4027C5B87DDB0/image-size/medium?v=v2&amp;px=400" role="button" title="kartheekkk_2-1738859845798.png" alt="kartheekkk_2-1738859845798.png" /></span></P><P>I_PROFITCENTERHIERARCHY: This CDS view will help us choose among multiple hierarchy versions.</P><P><STRONG>Semantics Usage Type :</STRONG> Dimension</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kartheekkk_1-1738859728142.png" style="width:
400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/222854i03881B84C81D8030/image-size/medium?v=v2&amp;px=400" role="button" title="kartheekkk_1-1738859728142.png" alt="kartheekkk_1-1738859728142.png" /></span></P><P>Add&nbsp;I_PROFITCENTERHIERARCHYTEXT&nbsp;as a <STRONG>Text Association</STRONG>.</P><P>Now moving on to preparing the hierarchy node:&nbsp;</P><P><STRONG>I_PROFITCENTERHIERARCHYNODETEXT</STRONG> :</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kartheekkk_3-1738860020316.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/222857i59F3A89CCB457246/image-size/medium?v=v2&amp;px=400" role="button" title="kartheekkk_3-1738860020316.png" alt="kartheekkk_3-1738860020316.png" /></span></P><P><STRONG>I_PROFITCENTERHIERARCHYNODE</STRONG> :</P><P>Add the below <STRONG>associations</STRONG> before setting the semantics to&nbsp;Hierarchy with Directory:</P><P>I_PROFITCENTERHIERARCHY</P><P>I_PROFITCENTERHIERARCHYTEXT</P><P>Choose the semantics usage as Hierarchy with Directory.&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kartheekkk_4-1738860165825.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/222861i7D89F74AE478A70C/image-size/medium?v=v2&amp;px=400" role="button" title="kartheekkk_4-1738860165825.png" alt="kartheekkk_4-1738860165825.png" /></span></P><P>Once you click on the Hierarchy with Directory setting, you have to provide the following fields:&nbsp;</P><P><STRONG>Parent</STRONG>: click on the field to open the dropdown and choose the parent node.&nbsp;</P><P><STRONG>Child</STRONG>: Hierarchy Node</P><P><STRONG>Hierarchy Directory :</STRONG> I_PROFITCENTERHIERARCHY (for ease of understanding, I have given the standard CDS view name.
In your model you will see the business name of the view as per your design)</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kartheekkk_5-1738860232736.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/222862iA5FA35483FCBA30A/image-size/medium?v=v2&amp;px=400" role="button" title="kartheekkk_5-1738860232736.png" alt="kartheekkk_5-1738860232736.png" /></span></P><P>Node Type: we will see three unique values under Node Type: <STRONG>R - Root, N - Node, L - Leaves</STRONG></P><P>Node type values: initially this will be blank; click on the + icon next to Node type values to add the values shown in my screenshot.&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kartheekkk_6-1738860641528.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/222865i34B81C7AF20D75BD/image-size/medium?v=v2&amp;px=400" role="button" title="kartheekkk_6-1738860641528.png" alt="kartheekkk_6-1738860641528.png" /></span></P><P>&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kartheekkk_7-1738860652574.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/222866iC93EF862D5FFA9E8/image-size/medium?v=v2&amp;px=400" role="button" title="kartheekkk_7-1738860652574.png" alt="kartheekkk_7-1738860652574.png" /></span></P><P>Now click on this icon to assign the representative key.&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kartheekkk_8-1738860778953.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/222867i4EA92E270BF87615/image-size/medium?v=v2&amp;px=400" role="button" title="kartheekkk_8-1738860778953.png" alt="kartheekkk_8-1738860778953.png" /></span></P><P>Add Hierarchy Node as the Representative Key to this view.&nbsp;</P><P>&nbsp;</P><P><span class="lia-inline-image-display-wrapper
lia-image-align-inline" image-alt="kartheekkk_9-1738860797193.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/222868iF98ADBDF6BF4E93C/image-size/medium?v=v2&amp;px=400" role="button" title="kartheekkk_9-1738860797193.png" alt="kartheekkk_9-1738860797193.png" /></span></P><P>Now the final step is to add this hierarchy view to the Profit Center dimension:&nbsp;</P><P>I_PROFITCENTER</P><P>Add I_PROFITCENTERHIERARCHYNODE as an association to I_PROFITCENTER.</P><P>Add I_PROFITCENTERTEXT as an association to I_PROFITCENTER (we can skip this step, as the description field is available in I_PROFITCENTER itself; you can set the semantics to Text and assign the text to Profit Center).</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kartheekkk_10-1738861027807.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/222870i30E1CAF60992A271/image-size/medium?v=v2&amp;px=400" role="button" title="kartheekkk_10-1738861027807.png" alt="kartheekkk_10-1738861027807.png" /></span></P><P>Now add&nbsp;I_PROFITCENTER (dimension) as an association to your fact view and deploy.</P><P>Go to your analytical view and click on the fact source.&nbsp;</P><P>In the Details tab, search for Profit Center and click on the arrow on the right side.&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kartheekkk_12-1738861333989.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/222873iEE8BEBF02CE00CF1/image-size/medium?v=v2&amp;px=400" role="button" title="kartheekkk_12-1738861333989.png" alt="kartheekkk_12-1738861333989.png" /></span></P><P>Enable the check box <STRONG>Use Associated Dimension</STRONG> to see the hierarchy.&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kartheekkk_13-1738861351830.png" style="width: 400px;"><img
src="https://community.sap.com/t5/image/serverpage/image-id/222875i0E08982E8E8F0586/image-size/medium?v=v2&amp;px=400" role="button" title="kartheekkk_13-1738861351830.png" alt="kartheekkk_13-1738861351830.png" /></span></P><P>Once deployed, click on preview.</P><P>You will see the Hierarchy option if all went well.&nbsp;</P><P>&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kartheekkk_14-1738861470589.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/222876i7D5168D459DE17F4/image-size/medium?v=v2&amp;px=400" role="button" title="kartheekkk_14-1738861470589.png" alt="kartheekkk_14-1738861470589.png" /></span></P><P>Select your hierarchy from the dropdown.&nbsp;</P><P>At the moment there is no option to apply a <STRONG>Hierarchy Variable</STRONG>&nbsp;at the analytical model level. I tried applying an input parameter at the hierarchy dimension level, which ended up with the following error message:</P><P><SPAN>Review the following errors: " - CSN Compilation failed with following messages: Errors(ORIGINAL): (in entity:“IT.2VD_RTR_PROFITCENTERHIERARCHY_01”) Table-like entities with parameters are not supported for conversion to SQL [Ln 3459] Errors(FORMATTED): (in entity:“IT.2VD_RTR_PROFITCENTERHIERARCHY_01”) Table-like entities with parameters are not supported for conversion to SQL [Ln 3459], Correlation ID: 4c684267-8ea9-42d6-44fd-3adf8d7587b3</SPAN><SPAN>&nbsp;</SPAN></P><P>SAP plans to deliver this in an upcoming release; until then, we have to filter the required hierarchy in the I_PROFITCENTERHIERARCHY view.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kartheekkk_15-1738861581901.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/222877i28CC3D47D0B7EB5F/image-size/medium?v=v2&amp;px=400" role="button" title="kartheekkk_15-1738861581901.png" alt="kartheekkk_15-1738861581901.png" /></span></P><P>Here is the final output at
the analytical model level:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kartheekkk_1-1738863374531.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/222893i8CE7A959EFEE6DE1/image-size/medium?v=v2&amp;px=400" role="button" title="kartheekkk_1-1738863374531.png" alt="kartheekkk_1-1738863374531.png" /></span></P><P>Still not able to see the hierarchy? Execute the steps below.&nbsp;</P><P><SPAN>Step 1: Execute&nbsp;tcode&nbsp;KCH2 to display the profit center hierarchy and find your hierarchy.</SPAN><SPAN>&nbsp;</SPAN></P><P><SPAN>Step 2: Execute&nbsp;tcode&nbsp;</SPAN><SPAN>HRY_REPRELEV.&nbsp;</SPAN><SPAN>&nbsp;</SPAN></P><P><SPAN>SETCLASS = 0106</SPAN><SPAN>&nbsp;(in my example)</SPAN></P><P><SPAN>Execute.</SPAN><SPAN>&nbsp;</SPAN></P><P><SPAN>Step 3: Enable the following checkboxes:</SPAN><SPAN>&nbsp;</SPAN></P><UL><LI><STRONG><SPAN>Report Relevant</SPAN></STRONG><SPAN>&nbsp;</SPAN><SPAN>&nbsp;</SPAN></LI><LI><STRONG><SPAN>Auto Replicate</SPAN></STRONG><SPAN>&nbsp;</SPAN></LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kartheekkk_18-1738862153744.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/222880i26919C279CD24BCE/image-size/medium?v=v2&amp;px=400" role="button" title="kartheekkk_18-1738862153744.png" alt="kartheekkk_18-1738862153744.png" /></span></P><P><STRONG><SPAN>Step 4: Click on Save.</SPAN></STRONG><SPAN>&nbsp;</SPAN></P><P><STRONG><SPAN>Step 5: There are no entries before executing the hierarchy replication tcode.</SPAN></STRONG><SPAN>&nbsp;</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kartheekkk_19-1738862174345.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/222881i739CFC71C9A6737E/image-size/medium?v=v2&amp;px=400" role="button"
title="kartheekkk_19-1738862174345.png" alt="kartheekkk_19-1738862174345.png" /></span></P><P><SPAN>Step 6: Execute tcode HRRP_REP and select your hierarchy name in the Hierarchy ID field.</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kartheekkk_20-1738862253586.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/222882i33ABF8E9F4E834D9/image-size/medium?v=v2&amp;px=400" role="button" title="kartheekkk_20-1738862253586.png" alt="kartheekkk_20-1738862253586.png" /></span></P><P><SPAN>Step 7: Refresh the CDS view I_PROFITCENTERHIERARCHY.&nbsp;</SPAN><SPAN>&nbsp;</SPAN></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kartheekkk_22-1738862366341.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/222886i30633B46695DA576/image-size/medium?v=v2&amp;px=400" role="button" title="kartheekkk_22-1738862366341.png" alt="kartheekkk_22-1738862366341.png" /></span></P><P>&nbsp;</P> 2025-02-06T19:33:50.934000+01:00 https://community.sap.com/t5/technology-blog-posts-by-members/dynamic-measure-filtering-using-input-fields-in-sap-analytics-cloud-sac/ba-p/14053985 Dynamic Measure Filtering Using Input Fields in SAP Analytics Cloud (SAC) 2025-03-23T15:19:36.294000+01:00 vignesh3027 https://community.sap.com/t5/user/viewprofilepage/user-id/160733 <P><STRONG><span class="lia-unicode-emoji" title=":rocket:">🚀</span>Problem Statement</STRONG></P><P>In <STRONG>SAP Analytics Cloud (SAC)</STRONG>, users often need to filter data in tables with both <STRONG>dimension</STRONG> and <STRONG>measure</STRONG> input controls.
Typically, you can use:</P><UL><LI><STRONG>Dimension Filters</STRONG> to filter based on text or categorical data.</LI><LI><STRONG>Measure Filters</STRONG> to filter numeric data like totals, averages, or sales values.</LI></UL><P>However, when tables are dynamic (due to dimension and measure input controls), a <STRONG>direct measure filter</STRONG> using scripting is not available. This limitation creates the need for a custom solution.</P><P><STRONG>🧩 Challenges Faced</STRONG></P><OL><LI><STRONG>No Direct Measure Filtering:</STRONG> SAC supports filtering dimensions using setDimensionFilter() but lacks a direct way to filter measures through scripts.</LI><LI><STRONG>Limited Scripting Functions:</STRONG> Functions like isNaN(), null, and ! are not available, making error handling difficult.</LI><LI><STRONG>Dynamic Table Complexity:</STRONG> Since the table structure is not static, traditional filters cannot be applied easily.</LI></OL><P><STRONG><span class="lia-unicode-emoji" title=":white_heavy_check_mark:">✅</span>Output</STRONG></P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Filter.gif" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/241534i5E51D027DB7B20B6/image-size/large?v=v2&amp;px=999" role="button" title="SACMeasure_Filter (2).gif" alt="Filter.gif" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Filter.gif</span></span></P><P>&nbsp;</P><P>&nbsp;<STRONG><span class="lia-unicode-emoji" title=":hammer_and_wrench:">🛠</span>️ Solution Approach</STRONG></P><P>To overcome these challenges, we use a <STRONG>calculated measure</STRONG> combined with scripting. 
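</P><P><EM>As an aside (not part of the original post), the core idea can be sketched in plain JavaScript. This is an illustration of the filtering logic only, not SAC's scripting API; the function and field names (<CODE>filterByThreshold</CODE>, <CODE>totalSales</CODE>) are hypothetical:</EM></P>

```javascript
// Illustrative sketch of the dynamic measure filter (hypothetical names,
// not SAC API). It mirrors the calculated measure described below,
// IF("Total Sales" >= ScriptVariable_1, "Total Sales"), and defaults an
// empty input to 0 so the table is never blank.
function filterByThreshold(rows, rawInput) {
  // Empty input defaults to 0, matching the empty-input handling script
  const threshold = rawInput === "" ? 0 : Number(rawInput);
  return rows.map(function (row) {
    return {
      region: row.region,
      // Below-threshold rows get no measure value, like the calculated measure
      totalSales: row.totalSales >= threshold ? row.totalSales : null,
    };
  });
}

const rows = [
  { region: "South", totalSales: 1200 },
  { region: "Southeast", totalSales: 300 },
];

console.log(JSON.stringify(filterByThreshold(rows, "500")));
// → [{"region":"South","totalSales":1200},{"region":"Southeast","totalSales":null}]
console.log(JSON.stringify(filterByThreshold(rows, "")));
// → [{"region":"South","totalSales":1200},{"region":"Southeast","totalSales":300}]
```

<P>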
Here’s a structured approach:</P><P><STRONG><span class="lia-unicode-emoji" title=":white_heavy_check_mark:">✅</span>Step 1: Create a Script Variable</STRONG></P><UL><LI><STRONG>Navigate to the Story in SAC.</STRONG></LI><LI>Create a <STRONG>Script Variable</STRONG> to capture user input.</LI><LI>Provide the required details:</LI><LI>&nbsp;Name: ScriptVariable_1</LI><LI>Description: CaptureIP</LI><LI>Type: Integer (since we are going to capture a measure value)</LI><LI>Default value: 0</LI><LI>Enable Expose variable and enable dynamic URL.</LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vignesh3027_3-1742736909468.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/241523i05F20F1DAB24C346/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_3-1742736909468.png" alt="vignesh3027_3-1742736909468.png" /></span></P><P>&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vignesh3027_4-1742736909472.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/241525i1E1E674A3B494486/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_4-1742736909472.png" alt="vignesh3027_4-1742736909472.png" /></span></P><P><STRONG><span class="lia-unicode-emoji" title=":white_heavy_check_mark:">✅</span>Step 2: Create an Input Field and Link to Script Variable</STRONG></P><OL><LI><STRONG>Add an Input Field:</STRONG></LI><UL><LI>The input field is used to get the value from the user.</LI></UL></OL><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vignesh3027_5-1742736909474.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/241524iA301BDD18D1E36B4/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_5-1742736909474.png" alt="vignesh3027_5-1742736909474.png" /></span></P><P>&nbsp;</P><OL><UL><LI><STRONG>Data Source:</STRONG> Script
variable</LI><LI><STRONG>Binding display:</STRONG> Map the created script variable.</LI><LI>Make sure to enable write-back at runtime (it will dynamically fetch the value without an Apply button).</LI><LI>Map the script variable to the write-back runtime.</LI></UL></OL><P>&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vignesh3027_6-1742736909477.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/241526i18B180AF2628D6B0/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_6-1742736909477.png" alt="vignesh3027_6-1742736909477.png" /></span></P><P><STRONG>Ensure Real-Time Update:</STRONG></P><OL><UL><LI>The input field should update the script variable whenever the user enters a value.</LI></UL></OL><P><STRONG><span class="lia-unicode-emoji" title=":white_heavy_check_mark:">✅</span>Step 3: Create a Calculated Measure</STRONG></P><UL><LI>Go to your <STRONG>SAC Model</STRONG> and create a <STRONG>Calculated Measure</STRONG> using the formula:</LI></UL><P><STRONG>&nbsp;</STRONG></P><TABLE><TBODY><TR><TD width="601px" height="50px"><P><STRONG>IF ("Total Sales" &gt;= ScriptVariable_1, "Total Sales")</STRONG></P></TD></TR></TBODY></TABLE><P><STRONG>&nbsp;</STRONG></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vignesh3027_7-1742736909481.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/241528iBD01A52644485735/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_7-1742736909481.png" alt="vignesh3027_7-1742736909481.png" /></span></P><P><STRONG>Explanation:</STRONG></P><UL><LI>ScriptVariable_1 stores the user input.</LI><LI>Displays the <STRONG>Total Sales</STRONG> value only if it is <STRONG>greater than or equal to</STRONG> the input threshold.</LI></UL><P><STRONG><span class="lia-unicode-emoji" title=":white_heavy_check_mark:">✅</span>Step 4: Add the Calculated Measure to the
Table</STRONG></P><UL><LI><STRONG>Add a Table:</STRONG></LI><UL><LI>Connect it to your data model.</LI><LI>Display the calculated measure in a column.</LI></UL></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vignesh3027_8-1742736909485.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/241527iD5E7D7E760EEC863/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_8-1742736909485.png" alt="vignesh3027_8-1742736909485.png" /></span></P><P><STRONG><span class="lia-unicode-emoji" title=":white_heavy_check_mark:">✅</span>Step 5: Handle Empty Input with Scripting</STRONG></P><P>If no input is provided, SAC won't display any data since the calculated measure in the table awaits input. To address this, the following script ensures the input value defaults to <STRONG>0</STRONG> if left empty.</P><UL><LI><STRONG>Forgotten Reset:</STRONG> If users apply a filter and forget to enter a new input, no data will appear. 
The script ensures the input resets to <STRONG>0</STRONG> in such cases.</LI><LI><STRONG>Continuous Monitoring:</STRONG> After applying the filter, users can still adjust the threshold dynamically without facing data unavailability.</LI></UL><P>&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vignesh3027_9-1742736909488.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/241531i099D28D7FADDB714/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_9-1742736909488.png" alt="vignesh3027_9-1742736909488.png" /></span></P><P>&nbsp;</P><P><STRONG>Script Code:</STRONG></P><TABLE><TBODY><TR><TD width="601"><P>// Get the input value</P><P>var inputValue = InputField_1.getValue();</P><P>&nbsp;</P><P>// Check if the input field is empty using a boolean comparison</P><P>var isEmpty = inputValue === "";</P><P>&nbsp;</P><P>// Set the input to "0" if it is empty</P><P>if (isEmpty) {</P><P>&nbsp;&nbsp;&nbsp; inputValue = "0";</P><P>&nbsp;&nbsp;&nbsp; InputField_1.setValue("0"); // Reflect 0 in the input field</P><P>}</P><P>&nbsp;</P><P>// Show a success message</P><P>Application.showMessage(ApplicationMessageType.Success, "Total Sales filter value set to: " + inputValue);</P><P>&nbsp;</P><P>&nbsp;</P></TD></TR></TBODY></TABLE><P><STRONG><span class="lia-unicode-emoji" title=":light_bulb:">💡</span>Explanation:</STRONG></P><UL><LI><STRONG>InputField_1.getValue():</STRONG> Captures user input.</LI><LI><STRONG>inputValue === "":</STRONG> Checks if the input field is empty.</LI><LI><STRONG>InputField_1.setValue():</STRONG> Ensures the input is set to <STRONG>0</STRONG> if empty.</LI><LI><STRONG>Application.showMessage():</STRONG> Provides feedback to users.</LI></UL><P><STRONG><span class="lia-unicode-emoji" title=":gear:">⚙️</span>UI/UX enhancement:</STRONG></P><UL><LI>Add a clear-filter control; in my case, I used an image to clear the value.</LI></UL><P><span 
class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vignesh3027_10-1742736909490.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/241530i261BA5C2E1595DAB/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_10-1742736909490.png" alt="vignesh3027_10-1742736909490.png" /></span></P><P>&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="vignesh3027_11-1742736909491.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/241529i6816A6E187BA0CCB/image-size/large?v=v2&amp;px=999" role="button" title="vignesh3027_11-1742736909491.png" alt="vignesh3027_11-1742736909491.png" /></span></P><P>&nbsp;</P><P><STRONG><span class="lia-unicode-emoji" title=":direct_hit:">🎯</span>Conclusion</STRONG></P><P>This approach effectively handles dynamic measure filtering using input fields and scripting in SAC. Users can confidently filter data based on their criteria, while ensuring a smooth user experience with clear feedback.</P> 2025-03-23T15:19:36.294000+01:00 https://community.sap.com/t5/technology-blog-posts-by-members/sap-analytics-cloud-sample-poc-dashboard-story/ba-p/14066947 SAP Analytics Cloud - Sample PoC Dashboard Story 2025-04-08T10:32:06.625000+02:00 Keyur_N_Shukla https://community.sap.com/t5/user/viewprofilepage/user-id/1385163 <P><STRONG>Sample Data Snapshot used in this Blog :</STRONG></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Keyur_N_Shukla_0-1743830297541.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/246982iEDED90EC031260C8/image-size/medium?v=v2&amp;px=400" role="button" title="Keyur_N_Shukla_0-1743830297541.png" alt="Keyur_N_Shukla_0-1743830297541.png" /></span></P><P><STRONG>Landing Page - SAP Analytics Cloud</STRONG></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" 
image-alt="Keyur_N_Shukla_1-1743830504986.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/246983i6D45B406B8362C73/image-size/medium?v=v2&amp;px=400" role="button" title="Keyur_N_Shukla_1-1743830504986.png" alt="Keyur_N_Shukla_1-1743830504986.png" /></span></P><P>On this landing page, click the "Files" option in the left-hand frame. I uploaded the Datasets (snapshot above) manually. You can see the two Datasets as the last two entries in the following screenshot.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Keyur_N_Shukla_3-1743830815721.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/246985i1B9BE290ECFE4B9A/image-size/medium?v=v2&amp;px=400" role="button" title="Keyur_N_Shukla_3-1743830815721.png" alt="Keyur_N_Shukla_3-1743830815721.png" /></span></P><P>These Datasets can now be used to prepare a Dashboard Story.</P><P>On the home page, click "Stories", and you land at the following:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Keyur_N_Shukla_2-1743830605506.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/246984i27D50B55E4FA4C0E/image-size/medium?v=v2&amp;px=400" role="button" title="Keyur_N_Shukla_2-1743830605506.png" alt="Keyur_N_Shukla_2-1743830605506.png" /></span></P><P>Here, I've used the "Smart Discovery" option. 
Then, you get the following screen.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Keyur_N_Shukla_0-1743831059150.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/246986i1DA24F37411F5CCE/image-size/medium?v=v2&amp;px=400" role="button" title="Keyur_N_Shukla_0-1743831059150.png" alt="Keyur_N_Shukla_0-1743831059150.png" /></span></P><P>Here, you select "existing data model" -&gt; "other data model", and there you see the uploaded Datasets.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Keyur_N_Shukla_1-1743831180847.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/246987iB35B046C61059C18/image-size/medium?v=v2&amp;px=400" role="button" title="Keyur_N_Shukla_1-1743831180847.png" alt="Keyur_N_Shukla_1-1743831180847.png" /></span></P><P>After selecting the Dataset(s), you need to select at least one dimension and one measure, and also filters if needed.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Keyur_N_Shukla_2-1743831403603.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/246988i9C016827817700E8/image-size/medium?v=v2&amp;px=400" role="button" title="Keyur_N_Shukla_2-1743831403603.png" alt="Keyur_N_Shukla_2-1743831403603.png" /></span></P><P>I selected "City" as the Dimension and "Order Amount" as the Measure.&nbsp;</P><P>Then, click "Run". 
Smart Discovery will generate the following KPI screens.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Keyur_N_Shukla_3-1743831597965.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/246989iB32A60B707CDA0D7/image-size/medium?v=v2&amp;px=400" role="button" title="Keyur_N_Shukla_3-1743831597965.png" alt="Keyur_N_Shukla_3-1743831597965.png" /></span></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Keyur_N_Shukla_4-1743831620483.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/246990i78997C163B54F900/image-size/medium?v=v2&amp;px=400" role="button" title="Keyur_N_Shukla_4-1743831620483.png" alt="Keyur_N_Shukla_4-1743831620483.png" /></span></P><P>Smart Discovery also generates "Smart Insights" &#8211; for example, the top contributing customer for your revenue generation.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Keyur_N_Shukla_5-1743831742106.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/246991i6827219AB3D79D84/image-size/medium?v=v2&amp;px=400" role="button" title="Keyur_N_Shukla_5-1743831742106.png" alt="Keyur_N_Shukla_5-1743831742106.png" /></span><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Keyur_N_Shukla_6-1743831756326.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/246992i2049D058E9AA52C0/image-size/medium?v=v2&amp;px=400" role="button" title="Keyur_N_Shukla_6-1743831756326.png" alt="Keyur_N_Shukla_6-1743831756326.png" /></span></P><P>I would like to conclude with the following note and bottom line.</P><P>Different critical business areas and their respective monitoring parameters can be presented in this way as a Digital Boardroom for CXO executives or senior management.</P><P>Thank you, Readers!</P><P>&nbsp;</P><P>&nbsp;</P> 
2025-04-08T10:32:06.625000+02:00 https://community.sap.com/t5/technology-blog-posts-by-members/sap-work-zone-vs-analytics-cloud-vs-fiori-launchpad-unified-analytics/ba-p/14263714 SAP Work Zone vs. Analytics Cloud vs. Fiori Launchpad – Unified Analytics Comparison (Decision Tree) 2025-11-10T01:03:45.137000+01:00 GovindaRaoBanothu45 https://community.sap.com/t5/user/viewprofilepage/user-id/828979 <P><FONT face="arial,helvetica,sans-serif">In today's data driven world, enterprises need an intuitive and efficient digital workspace that unites business processes, analytics, and collaboration. SAP offers several powerful solutions like SAP Work Zone, SAP Analytics Cloud (SAC), and SAP Fiori Launchpad, while many organizations also extend their ecosystems with non-SAP BI tools such as Microsoft Power BI and Tableau. Selecting the right entry point depends on factors like data architecture, user experience, and the level of integration needed between SAP and non-SAP systems.</FONT></P><P><FONT face="arial,helvetica,sans-serif">With more than 15 years of experience across SAP Analytics and Business Intelligence implementations, I have witnessed how organizations evolve their analytics strategies from siloed reporting to fully integrated decision environments. Through these diverse engagements, it is clear that tools like SAP Work Zone, SAC, Fiori Launchpad, and external BI platforms can coexist to deliver a unified, connected, and scalable analytics landscape. 
When combined with modern data platforms like Snowflake or SAP Datasphere / SAP Business Data Cloud(BDC), they empower enterprises to turn information into real-time insight.</FONT></P><P><FONT face="arial,helvetica,sans-serif">Let's explore how each of these tools fits within a unified analytics ecosystem and how organizations can determine the right combination to maximize business value.</FONT></P><P><FONT face="arial,helvetica,sans-serif">The Contenders: A Quick Overview</FONT></P><UL><LI><FONT face="arial,helvetica,sans-serif">SAP Fiori Launchpad: The gateway to SAP applications, offering a role based, personalized user experience.</FONT></LI><LI><FONT face="arial,helvetica,sans-serif">SAP Analytics Cloud (SAC): SAP's flagship analytics platform, combining BI, planning, and predictive capabilities in one solution.</FONT></LI><LI><FONT face="arial,helvetica,sans-serif">SAP Work Zone: A digital workplace that integrates SAP and non-SAP applications, fostering collaboration and productivity.</FONT></LI><LI><FONT face="arial,helvetica,sans-serif">Non-SAP BI Tools (Power BI, Tableau): Industry leading analytics platforms known for flexibility and advanced visualization.</FONT></LI></UL><P><FONT face="arial,helvetica,sans-serif"><STRONG>SAP Fiori Launchpad: The Trusted Entry Point</STRONG></FONT></P><P><FONT face="arial,helvetica,sans-serif">SAP Fiori Launchpad has been a cornerstone of SAP's user experience strategy. 
It is the go to interface for accessing SAP S/4HANA FICO and other SAP applications.</FONT></P><UL><LI><FONT face="arial,helvetica,sans-serif">Strengths:</FONT></LI><UL><LI><FONT face="arial,helvetica,sans-serif">Role based access to transactional applications and operational data.</FONT></LI><LI><FONT face="arial,helvetica,sans-serif">Seamless integration with SAP systems.</FONT></LI><LI><FONT face="arial,helvetica,sans-serif">Consistent, intuitive user experience.</FONT></LI></UL><LI><FONT face="arial,helvetica,sans-serif">Limitations:</FONT></LI><UL><LI><FONT face="arial,helvetica,sans-serif">Limited to SAP applications, no native support for non-SAP tools.</FONT></LI><LI><FONT face="arial,helvetica,sans-serif">Lacks advanced analytics and collaboration features.</FONT></LI></UL><LI><FONT face="arial,helvetica,sans-serif">Best For: Organizations that primarily use SAP systems and need a streamlined interface for transactional workflows.</FONT></LI></UL><P><FONT face="arial,helvetica,sans-serif"><STRONG>SAP Analytics Cloud (SAC): The All-in-One Analytics Powerhouse</STRONG></FONT></P><P><FONT face="arial,helvetica,sans-serif">SAP Analytics Cloud is SAP's answer to modern analytics needs. 
It combines BI, planning, and predictive analytics in a single cloud based platform.</FONT></P><UL><LI><FONT face="arial,helvetica,sans-serif">Strengths:</FONT></LI><UL><LI><FONT face="arial,helvetica,sans-serif">Unified platform for reporting, dashboards and predictive analytics.</FONT></LI><LI><FONT face="arial,helvetica,sans-serif">Deep integration with SAP source systems like SAP S/4HANA .</FONT></LI><LI><FONT face="arial,helvetica,sans-serif">Built-in collaboration features for team based planning.</FONT></LI></UL><LI><FONT face="arial,helvetica,sans-serif">Limitations:</FONT></LI><UL><LI><FONT face="arial,helvetica,sans-serif">Limited flexibility for non-SAP data sources compared to tools like Power BI or Tableau.</FONT></LI><LI><FONT face="arial,helvetica,sans-serif">Steeper learning curve for non-SAP users.</FONT></LI></UL><LI><FONT face="arial,helvetica,sans-serif">Best For: Organizations heavily invested in SAP systems that want a single platform for analytics and planning.</FONT></LI></UL><P><FONT face="arial,helvetica,sans-serif"><STRONG>SAP Work Zone: The Digital Workplace Revolution</STRONG></FONT></P><P><FONT face="arial,helvetica,sans-serif">SAP Work Zone is a relatively new addition to the SAP ecosystem, designed to create a unified digital workplace. 
It integrates SAP and non-SAP applications, making it a versatile tool for modern enterprises.</FONT></P><UL><LI><FONT face="arial,helvetica,sans-serif">Strengths:</FONT></LI><UL><LI><FONT face="arial,helvetica,sans-serif">Unified access to SAP and non-SAP applications.</FONT></LI><LI><FONT face="arial,helvetica,sans-serif">Built-in collaboration tools like chat, news feeds, and document sharing.</FONT></LI><LI><FONT face="arial,helvetica,sans-serif">Customizable workspaces for teams and projects.</FONT></LI></UL><LI><FONT face="arial,helvetica,sans-serif">Limitations:</FONT></LI><UL><LI><FONT face="arial,helvetica,sans-serif">May be overkill for organizations that only need analytics or transactional access.</FONT></LI><LI><FONT face="arial,helvetica,sans-serif">Requires additional configuration for seamless integration with non-SAP tools.</FONT></LI></UL><LI><FONT face="arial,helvetica,sans-serif">Best For: Organizations looking to enhance collaboration and provide a single point of access for SAP and non-SAP tools.</FONT></LI></UL><P><FONT face="arial,helvetica,sans-serif">SAP Work Zone, together with the SAP Analytics Cloud Catalog, now carries forward and enhances the capabilities of the earlier SAP Analytics Hub. 
While Analytics Hub focused primarily on report cataloging, Work Zone extends that vision into a full digital workplace integrating analytics, applications, and collaboration in a single experience.</FONT></P><P><FONT face="arial,helvetica,sans-serif"><STRONG>Non-SAP BI Tools: Power BI and Tableau</STRONG></FONT></P><P><FONT face="arial,helvetica,sans-serif">While SAP tools excel in their native ecosystem, non-SAP BI tools like Microsoft Power BI and Tableau offer unmatched flexibility and visualization capabilities.</FONT></P><UL><LI><FONT face="arial,helvetica,sans-serif">Strengths:</FONT></LI><UL><LI><FONT face="arial,helvetica,sans-serif">Superior data visualization and user friendly interfaces.</FONT></LI><LI><FONT face="arial,helvetica,sans-serif">Extensive connectivity to non-SAP data sources like Snowflake, AWS, and Google BigQuery.</FONT></LI><LI><FONT face="arial,helvetica,sans-serif">Strong community support and third party integrations.</FONT></LI></UL><LI><FONT face="arial,helvetica,sans-serif">Limitations:</FONT></LI><UL><LI><FONT face="arial,helvetica,sans-serif">Limited native integration with SAP systems.</FONT></LI><LI><FONT face="arial,helvetica,sans-serif">May require additional middleware for seamless data flow.</FONT></LI></UL><LI><FONT face="arial,helvetica,sans-serif">Best For: Organizations with diverse data sources that prioritize visualization and flexibility over deep SAP integration.</FONT></LI></UL><P><FONT face="arial,helvetica,sans-serif"><STRONG>The Hybrid Approach: SAC for SAP, Power BI for Non-SAP</STRONG></FONT></P><P><FONT face="arial,helvetica,sans-serif">Across several SAP Analytics implementations, a hybrid approach has consistently proven effective for unifying enterprise reporting.<BR />SAP Analytics Cloud (SAC) is typically used for SAP source systems, leveraging its deep integration with SAP S/4HANA, along with robust planning capabilities.<BR />Microsoft Power BI complements this by connecting to non-SAP systems such as 
Snowflake, offering flexibility, advanced visualization and ease of consumption for broader business audiences.</FONT></P><P><FONT face="arial,helvetica,sans-serif">This approach allows us to:</FONT></P><OL><LI><FONT face="arial,helvetica,sans-serif">Maintain a single source of truth for SAP data within SAC.</FONT></LI><LI><FONT face="arial,helvetica,sans-serif">Leverage Power BI's advanced visualization capabilities for non-SAP data.</FONT></LI><LI><FONT face="arial,helvetica,sans-serif">Use Snowflake as the central data warehouse to integrate data across the organization.</FONT></LI></OL><P><FONT face="arial,helvetica,sans-serif"><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="GovindaRaoBanothu45_0-1762730811825.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/337659iF9AB853539875D5E/image-size/large/is-moderation-mode/true?v=v2&amp;px=999" role="button" title="GovindaRaoBanothu45_0-1762730811825.png" alt="GovindaRaoBanothu45_0-1762730811825.png" /></span></FONT></P><P><FONT face="arial,helvetica,sans-serif"><STRONG>Which Tool Should Be the First Point of Enterprise Access?</STRONG></FONT></P><P><FONT face="arial,helvetica,sans-serif">The answer depends on your organization's priorities:</FONT></P><UL><LI><FONT face="arial,helvetica,sans-serif">If SAP integration is key: Start with SAP Fiori Launchpad or SAP Work Zone, depending on whether you need transactional access or a collaborative digital workplace.</FONT></LI><LI><FONT face="arial,helvetica,sans-serif">If analytics is the focus: Use SAP Analytics Cloud for SAP data and Power BI or Tableau for non-SAP data.</FONT></LI><LI><FONT face="arial,helvetica,sans-serif">If collaboration and integration are critical: SAP Work Zone is the clear winner, providing a unified interface for SAP and non-SAP tools.</FONT></LI></UL><P><FONT face="arial,helvetica,sans-serif"><STRONG>Visualizing the Decision: A Comparison Chart &amp; Decision 
Tree</STRONG></FONT></P><P><FONT face="arial,helvetica,sans-serif">To make this easier, here’s a quick comparison chart:</FONT></P><TABLE width="630"><TBODY><TR><TD width="156.009px"><P><FONT face="arial,helvetica,sans-serif"><STRONG>Feature</STRONG></FONT></P></TD><TD width="110.014px"><P><FONT face="arial,helvetica,sans-serif"><STRONG>SAP Fiori Launchpad</STRONG></FONT></P></TD><TD width="121.009px"><P><FONT face="arial,helvetica,sans-serif"><STRONG>SAP Analytics Cloud</STRONG></FONT></P></TD><TD width="113.011px"><P><FONT face="arial,helvetica,sans-serif"><STRONG>SAP Work Zone</STRONG></FONT></P></TD><TD width="129.048px"><P><FONT face="arial,helvetica,sans-serif"><STRONG>Power BI/Tableau</STRONG></FONT></P></TD></TR><TR><TD width="156.009px"><P><FONT face="arial,helvetica,sans-serif"><STRONG>SAP Integration</STRONG></FONT></P></TD><TD width="110.014px"><P><FONT face="arial,helvetica,sans-serif">Excellent</FONT></P></TD><TD width="121.009px"><P><FONT face="arial,helvetica,sans-serif">Excellent</FONT></P></TD><TD width="113.011px"><P><FONT face="arial,helvetica,sans-serif">Excellent</FONT></P></TD><TD width="129.048px"><P><FONT face="arial,helvetica,sans-serif">Limited</FONT></P></TD></TR><TR><TD width="156.009px"><P><FONT face="arial,helvetica,sans-serif"><STRONG>Non-SAP Integration</STRONG></FONT></P></TD><TD width="110.014px"><P><FONT face="arial,helvetica,sans-serif">None</FONT></P></TD><TD width="121.009px"><P><FONT face="arial,helvetica,sans-serif">Limited</FONT></P></TD><TD width="113.011px"><P><FONT face="arial,helvetica,sans-serif">Excellent</FONT></P></TD><TD width="129.048px"><P><FONT face="arial,helvetica,sans-serif">Excellent</FONT></P></TD></TR><TR><TD width="156.009px"><P><FONT face="arial,helvetica,sans-serif"><STRONG>Analytics Capabilities</STRONG></FONT></P></TD><TD width="110.014px"><P><FONT face="arial,helvetica,sans-serif">Basic</FONT></P></TD><TD width="121.009px"><P><FONT face="arial,helvetica,sans-serif">Advanced</FONT></P></TD><TD 
width="113.011px"><P><FONT face="arial,helvetica,sans-serif">Basic</FONT></P></TD><TD width="129.048px"><P><FONT face="arial,helvetica,sans-serif">Advanced</FONT></P></TD></TR><TR><TD width="156.009px"><P><FONT face="arial,helvetica,sans-serif"><STRONG>Collaboration Features</STRONG></FONT></P></TD><TD width="110.014px"><P><FONT face="arial,helvetica,sans-serif">None</FONT></P></TD><TD width="121.009px"><P><FONT face="arial,helvetica,sans-serif">Moderate</FONT></P></TD><TD width="113.011px"><P><FONT face="arial,helvetica,sans-serif">Advanced</FONT></P></TD><TD width="129.048px"><P><FONT face="arial,helvetica,sans-serif">Limited</FONT></P></TD></TR><TR><TD width="156.009px"><P><FONT face="arial,helvetica,sans-serif"><STRONG>Best For</STRONG></FONT></P></TD><TD width="110.014px"><P><FONT face="arial,helvetica,sans-serif">SAP Transactions</FONT></P></TD><TD width="121.009px"><P><FONT face="arial,helvetica,sans-serif">SAP Analytics</FONT></P></TD><TD width="113.011px"><P><FONT face="arial,helvetica,sans-serif">Unified Workplace</FONT></P></TD><TD width="129.048px"><P><FONT face="arial,helvetica,sans-serif">Non-SAP Analytics</FONT></P></TD></TR></TBODY></TABLE><P>&nbsp;</P><P><FONT face="arial,helvetica,sans-serif"><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="GovindaRaoBanothu45_1-1762730938848.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/337660i57F662DABBB2BADE/image-size/large/is-moderation-mode/true?v=v2&amp;px=999" role="button" title="GovindaRaoBanothu45_1-1762730938848.png" alt="GovindaRaoBanothu45_1-1762730938848.png" /></span>&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; <EM>&nbsp;Figure: Decision Tree among the available Tools</EM></FONT></P><P><FONT 
face="arial,helvetica,sans-serif"><STRONG>Final Thoughts</STRONG></FONT></P><P><FONT face="arial,helvetica,sans-serif">In today's complex analytics environment, there is no single solution that fits every organization. Success lies in aligning technology choices with business objectives, data strategy, and user needs. The most forward-thinking enterprises design flexible analytics ecosystems, balancing governance, agility, and usability to ensure analytics becomes an enabler, not a barrier.</FONT></P><P><FONT face="arial,helvetica,sans-serif">Based on experience across multiple SAP programs, hybrid architectures often offer the strongest foundation. SAP Analytics Cloud (SAC) provides deep integration and governance for SAP data, while platforms like Power BI or Tableau enhance visualization and flexibility for non-SAP data sources. Combined with centralized data hubs like Snowflake or SAP Datasphere or SAP Business Data Cloud (BDC), this model ensures consistency, scalability, and end-to-end insight.</FONT></P><P><FONT face="arial,helvetica,sans-serif">As SAP continues to innovate, solutions such as SAP Work Zone and the SAC Catalog are transforming how users access, share, and collaborate on analytics content. They bridge the gap between SAP and non-SAP systems, creating a truly unified workspace for both business and IT teams.</FONT></P><P><FONT face="arial,helvetica,sans-serif">Ultimately, the goal is not to choose a single tool but to build an integrated analytics ecosystem,&nbsp;one that empowers users at every level to make data driven decisions confidently. 
Whether an organization adopts a purely SAP centric approach or a hybrid model, the future of analytics lies in seamless connectivity, collaboration and intelligent decision making.</FONT></P><P>&nbsp;</P> 2025-11-10T01:03:45.137000+01:00 https://community.sap.com/t5/technology-blog-posts-by-members/displaying-sap-analytics-cloud-kpi-tiles-from-stories-using-rest-apis/ba-p/14320914 Displaying SAP Analytics Cloud KPI Tiles from Stories Using REST APIs 2026-02-04T08:51:58.622000+01:00 Ajay105 https://community.sap.com/t5/user/viewprofilepage/user-id/2102459 <H2 id="toc-hId-1789469326">Introduction</H2><P class="lia-align-justify" style="text-align : justify;">SAP Analytics Cloud (SAC) is widely used by organizations to provide interactive storytelling and track the business KPIs with advanced visualizations. However, companies often need to obtain KPI information in ways other than the SAC user interface, including in custom web apps. Although SAC does not support the direct embedding of KPI tiles into external apps, it does provide REST APIs that allow programmatic access to widget-level data from SAC stories. These APIs can be used to collect and display KPI tile data, including number (value), number state (status), title, and subtitle, in a bespoke user interface.</P><P class="lia-align-justify" style="text-align : justify;">In this blog, I walk through a detailed, end-to-end implementation that illustrates how to fetch KPI tile data from a SAP Analytics Cloud story using the <FONT face="courier new,courier">widgetquery/getWidgetData</FONT> REST API. The approach uses Python for backend processing and Flask as a lightweight web framework to securely call SAC APIs and output KPI values on a web page.</P><H2 id="toc-hId-1592955821">Configuration of the Project</H2><P class="lia-align-justify" style="text-align : justify;">We may test API access and retrieve KPI tile data using a straightforward Python script before developing the Flask application. 
This program shows you how to:</P><OL class="lia-align-justify" style="text-align : justify;"><LI>Use OAuth 2.0 to authenticate with SAC</LI><LI>Acquire an access token</LI><LI>Call the SAC widgetquery/getWidgetData REST API</LI><LI>Show the KPI values on the console</LI></OL><pre class="lia-code-sample language-python"><code>import requests
import webbrowser
import urllib.parse

# ---------------- CONFIG ----------------
TENANT_URL = "https://&lt;your-tenant&gt;.hanacloudservices.cloud.sap"
CLIENT_ID = "&lt;YOUR_CLIENT_ID&gt;"
CLIENT_SECRET = "&lt;YOUR_CLIENT_SECRET&gt;"
AUTHORIZATION_ENDPOINT = "https://&lt;your-tenant&gt;.hana.ondemand.com/oauth/authorize"
TOKEN_ENDPOINT = "https://&lt;your-tenant&gt;.hana.ondemand.com/oauth/token"
REDIRECT_URI = "https://your-app-domain.com/oauth/callback"  # used only to capture the code manually
STORY_ID = "&lt;your-storyid&gt;"
WIDGET_IDS = ["Chart_1", "Chart_2", "Chart_3", "Chart_4", "Chart_5", "Chart_6", "Chart_8"]

# ---------------- STEP 1: LOGIN ----------------
params = {
    "response_type": "code",
    "client_id": CLIENT_ID,
    "redirect_uri": REDIRECT_URI
}
auth_url = AUTHORIZATION_ENDPOINT + "?" + urllib.parse.urlencode(params)
print("\n Opening browser for SAC login...")
webbrowser.open(auth_url)
code = input("\n Paste the authorization code here: ").strip()

# ---------------- STEP 2: TOKEN ----------------
payload = {
    "grant_type": "authorization_code",
    "code": code,
    "redirect_uri": REDIRECT_URI,
    "client_id": CLIENT_ID,
    "client_secret": CLIENT_SECRET
}
token_resp = requests.post(
    TOKEN_ENDPOINT,
    data=payload,
    headers={"Content-Type": "application/x-www-form-urlencoded"}
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]
print("\n Access token received")

# ---------------- STEP 3: FETCH KPI ----------------
headers = {"Authorization": f"Bearer {access_token}", "Accept": "application/json"}
print("\n KPI VALUES\n" + "-" * 40)
for widget_id in WIDGET_IDS:
    url = f"{TENANT_URL}/widgetquery/getWidgetData"
    params = {"storyId": STORY_ID, "widgetId": widget_id, "type": "kpiTile"}
    r = requests.get(url, headers=headers, params=params)
    if r.ok:
        data = r.json()
        number = data.get("number", "N/A")
        title = data.get("title", widget_id)
        print(f"{title}: {number}")
    else:
        print(f"{widget_id}: Error")
print("\n Done")</code></pre><P class="lia-align-justify" style="text-align : justify;"><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="pythonkpi_output_terminal.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/368640iF530E352835892E6/image-size/large?v=v2&amp;px=999" role="button" title="pythonkpi_output_terminal.png" alt="pythonkpi_output_terminal.png" /></span></P><P class="lia-align-justify" style="text-align : justify;">&nbsp;<SPAN>How this operates</SPAN></P><OL class="lia-align-justify" style="text-align : justify;"><LI>Login Step:&nbsp;Launches a web browser so you can log into SAC and obtain the authorization code.</LI><LI>Token Step: Exchanges the authorization code for an access token.</LI><LI>Fetch KPI Step: Prints the KPI number and title after contacting 
the SAC REST API for each widget ID.</LI></OL><P class="lia-align-justify" style="text-align : justify;"><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="RESTAPI_Flow_diagram.jpg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/368634i0FF9C05120008397/image-size/large?v=v2&amp;px=999" role="button" title="RESTAPI_Flow_diagram.jpg" alt="RESTAPI_Flow_diagram.jpg" /></span></P><P class="lia-align-justify" style="text-align : justify;">&nbsp;</P><H2 id="toc-hId-1396442316">Code of Application</H2><P class="lia-align-justify" style="text-align : justify;">The full Flask application that is used to retrieve KPI tile data and authenticate with SAP Analytics Cloud is shown below. This code manages widget data fetching, token retrieval, login, and creates an eye-catching KPI dashboard in the browser.</P><pre class="lia-code-sample language-markup"><code>HTML_TEMPLATE = """ &lt;!DOCTYPE html&gt; &lt;html lang="en"&gt; &lt;head&gt; &lt;meta charset="UTF-8"&gt; &lt;meta name="viewport" content="width=device-width, initial-scale=1.0"&gt; &lt;title&gt;SAC KPI Dashboard&lt;/title&gt; &lt;style&gt; body { font-family: 'Segoe UI', Tahoma, Geneva, Verdana, sans-serif; background: #f4f6f8; margin: 0; padding: 0; } h1 { text-align: center; margin-top: 20px; color: #333; } .container { display: flex; flex-wrap: wrap; justify-content: center; margin: 40px auto; max-width: 1200px; gap: 20px; } .card { color: #fff; width: 250px; height: 150px; border-radius: 16px; box-shadow: 0 10px 20px rgba(0,0,0,0.2); display: flex; flex-direction: column; justify-content: center; align-items: center; transition: transform 0.3s, box-shadow 0.3s; cursor: pointer; text-align: center; padding: 10px; } .card:hover { transform: translateY(-10px); box-shadow: 0 20px 30px rgba(0,0,0,0.3); } .number { font-weight: bold; white-space: nowrap; overflow: hidden; text-overflow: ellipsis; max-width: 90%; } .title { font-size: 1em; margin-top: 
10px; color: #e0e0e0; }
/* Rainbow colors for cards */
.rainbow-0 { background: linear-gradient(135deg, #ff6b6b, #f06595); }
.rainbow-1 { background: linear-gradient(135deg, #feca57, #ff9f43); }
.rainbow-2 { background: linear-gradient(135deg, #1dd1a1, #10ac84); }
.rainbow-3 { background: linear-gradient(135deg, #54a0ff, #2e86de); }
.rainbow-4 { background: linear-gradient(135deg, #5f27cd, #341f97); }
.rainbow-5 { background: linear-gradient(135deg, #ee5253, #c0392b); }
.rainbow-6 { background: linear-gradient(135deg, #48dbfb, #00d2d3); }
&lt;/style&gt;
&lt;script&gt;
// Adjust font size based on length
function adjustFontSize() {
    const numbers = document.querySelectorAll('.number');
    numbers.forEach(num =&gt; {
        const length = num.innerText.length;
        if(length &lt;= 5) num.style.fontSize = '2.5em';
        else if(length &lt;= 8) num.style.fontSize = '2em';
        else num.style.fontSize = '1.5em';
    });
}
window.onload = adjustFontSize;
&lt;/script&gt;
&lt;/head&gt;
&lt;body&gt;
&lt;h1&gt;SAP Analytics Cloud - RESTAPI Fetched Sales KPI Dashboard&lt;/h1&gt;
&lt;div class="container"&gt;
{% for kpi in kpis %}
&lt;div class="card rainbow-{{ loop.index0 % 7 }}"&gt;
&lt;div class="number"&gt;{{ kpi.number }}&lt;/div&gt;
&lt;div class="title"&gt;{{ kpi.title }}&lt;/div&gt;
&lt;/div&gt;
{% endfor %}
&lt;/div&gt;
&lt;/body&gt;
&lt;/html&gt;
"""</code></pre><pre class="lia-code-sample language-python"><code>from flask import Flask, render_template_string
import requests
import urllib.parse
import webbrowser

# ---------------- CONFIG ----------------
TENANT_URL = "https://yourtenant.hanacloudservices.cloud.sap"
CLIENT_ID = "&lt;YOUR_CLIENT_ID&gt;"
CLIENT_SECRET = "&lt;YOUR_CLIENT_SECRET&gt;"
AUTHORIZATION_ENDPOINT = "https://yourtenant.hana.ondemand.com/oauth/authorize"
TOKEN_ENDPOINT = "https://yourtenant.hana.ondemand.com/oauth/token"
REDIRECT_URI = "https://your-app-domain.com/oauth/callback"
STORY_ID = "&lt;your_storyid&gt;"
WIDGET_IDS = [
    "Chart_1", "Chart_2", "Chart_3", "Chart_4",
"Chart_5", "Chart_6", "Chart_8" ] # ---------------- FLASK APP ---------------- app = Flask(__name__) def get_access_token(): # Step 1: login manually params = {"response_type": "code", "client_id": CLIENT_ID, "redirect_uri": REDIRECT_URI} auth_url = AUTHORIZATION_ENDPOINT + "?" + urllib.parse.urlencode(params) print("\nOpen this URL in browser to login to SAC:") print(auth_url) webbrowser.open(auth_url) code = input("\nPaste the authorization code here: ").strip() # Step 2: get token payload = { "grant_type": "authorization_code", "code": code, "redirect_uri": REDIRECT_URI, "client_id": CLIENT_ID, "client_secret": CLIENT_SECRET } r = requests.post(TOKEN_ENDPOINT, data=payload, headers={"Content-Type": "application/x-www-form-urlencoded"}) r.raise_for_status() return r.json()["access_token"] def fetch_kpis(token): headers = {"Authorization": f"Bearer {token}", "Accept": "application/json"} kpis = [] for widget_id in WIDGET_IDS: url = f"{TENANT_URL}/widgetquery/getWidgetData" params = {"storyId": STORY_ID, "widgetId": widget_id, "type": "kpiTile"} r = requests.get(url, headers=headers, params=params) if r.ok: data = r.json() kpis.append({ "title": data.get("title", widget_id), "number": data.get("number", "N/A") }) else: kpis.append({"title": widget_id, "number": "Error"}) return kpis # HTML Code will be written Here @app.route("/") def dashboard(): token = get_access_token() kpis = fetch_kpis(token) return render_template_string(HTML_TEMPLATE, kpis=kpis) if __name__ == "__main__": app.run(debug=True,port=8000)</code></pre><P class="lia-align-justify" style="text-align : justify;"><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="webpageoutput_terminal.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/368641iF1104ACD8CBAC86C/image-size/large?v=v2&amp;px=999" role="button" title="webpageoutput_terminal.png" alt="webpageoutput_terminal.png" /></span></P><P class="lia-align-justify" 
style="text-align : justify;">&nbsp;<SPAN>An explanation of the code</SPAN></P><OL class="lia-align-justify" style="text-align : justify;"><LI><FONT face="courier new,courier">/</FONT> route logic<UL><LI>Calls <FONT face="courier new,courier">get_access_token()</FONT> to obtain an OAuth token manually.</LI><LI>Calls <FONT face="courier new,courier">fetch_kpis()</FONT> to retrieve the KPI tile data.</LI><LI>Renders all KPIs as rainbow-colored cards using HTML_TEMPLATE.</LI></UL></LI><LI>OAuth handling (<FONT face="courier new,courier">get_access_token</FONT>)<UL><LI>Opens the SAC login page in the browser.</LI><LI>The user pastes the authorization code, which is then exchanged for an access token used to call the SAC REST APIs.</LI></UL></LI><LI>Widget data retrieval (<FONT face="courier new,courier">fetch_kpis</FONT>)<UL><LI>Iterates over all IDs in WIDGET_IDS.</LI><LI>Calls the <FONT face="courier new,courier">widgetquery/getWidgetData</FONT> endpoint for every widget.</LI><LI>Collects the title and number for display.</LI></UL></LI><LI>HTML template for UI rendering<UL><LI>Tiles are laid out with flexbox.</LI><LI>Each KPI card has a rainbow gradient background.</LI><LI>Shadow and hover effects give a modern look.</LI><LI>The font size adjusts automatically to the length of the number.</LI></UL></LI></OL><H2 id="toc-hId-1199928811">Result/Output</H2><P class="lia-align-justify" style="text-align : justify;">Once the application is running and the user has logged in via SAP Analytics Cloud OAuth, the KPI data is retrieved and displayed in a custom web dashboard.</P><H6 id="toc-hId-1519746182">SAP Analytics Cloud Story KPI Tiles</H6><P class="lia-align-justify" style="text-align : justify;">The image below shows the original KPI tiles as they appear in the SAP Analytics Cloud story.
These KPIs are configured and maintained by business users in SAC.<BR /><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="SAC_Story_pic.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/368642i28BB476CF6F1D331/image-size/large?v=v2&amp;px=999" role="button" title="SAC_Story_pic.png" alt="SAC_Story_pic.png" /></span></P><P class="lia-align-justify" style="text-align : justify;">&nbsp;</P><H6 id="toc-hId-1323232677">Custom Web Dashboard with KPI Tiles</H6><P class="lia-align-justify" style="text-align : justify;">The same KPI values are retrieved via the <FONT face="courier new,courier">widgetquery/getWidgetData</FONT> REST API and displayed in a custom-built web application, shown in the screenshot below.<BR /><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="webpage_pic_restapi.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/368643i9989EF78FB6413D9/image-size/large?v=v2&amp;px=999" role="button" title="webpage_pic_restapi.png" alt="webpage_pic_restapi.png" /></span></P><H3 id="toc-hId-739471015">Important Findings</H3><P class="lia-align-justify" style="text-align : justify;">The KPI numbers in the SAC story and the web dashboard are identical.<BR /><STRONG>Every KPI tile shows:</STRONG></P><OL class="lia-align-justify" style="text-align : justify;"><LI>The value (number)</LI><LI>The title</LI></OL><P class="lia-align-justify" style="text-align : justify;">The web dashboard improves the visualization with:</P><OL class="lia-align-justify" style="text-align : justify;"><LI>Rainbow gradient cards</LI><LI>Hover animations and shadows</LI><LI>Responsive design</LI></OL><P class="lia-align-justify" style="text-align : justify;">This demonstrates secure consumption and reuse of SAC KPI data outside the SAC user interface without duplicating business
logic.</P><H2 id="toc-hId-413874791">Conclusion</H2><P class="lia-align-justify" style="text-align : justify;">In this blog post, we demonstrated a simple yet effective technique: using REST APIs to extract SAP Analytics Cloud KPI tile data and display it on a custom webpage.</P><H6 id="toc-hId-733692162">Important findings:</H6><UL><LI><P>SAC KPIs can be consumed in custom dashboards outside of the SAC UI.</P></LI><LI><P>Python provides a lightweight and adaptable backend for API integration. REST APIs enable near-real-time KPI tracking, while front-end styling improves visibility.</P></LI><LI><P>Proper environment and security setup is essential for safe execution.</P></LI><LI><P class="lia-align-justify" style="text-align : justify;">By embedding SAC insights into executive dashboards, intranet portals, or external web apps, businesses can give users a consolidated view of key performance indicators without launching SAC directly.</P></LI></UL> 2026-02-04T08:51:58.622000+01:00
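<P class="lia-align-justify" style="text-align : justify;">The template-rendering step above can also be exercised without a live SAC tenant. The following stdlib-only sketch is illustrative and not part of the original application: <FONT face="courier new,courier">render_kpi_cards</FONT> and the sample data are hypothetical stand-ins for a real SAC response. It maps KPI dicts, as returned by a <FONT face="courier new,courier">fetch_kpis</FONT>-style function, onto the rainbow-card markup:</P>

```python
# Illustrative sketch (not from the original post): render KPI dicts into the
# rainbow-card markup used by HTML_TEMPLATE. render_kpi_cards and the sample
# data below are hypothetical stand-ins for a live SAC response.
from html import escape

def render_kpi_cards(kpis):
    """Render a list of {'title': ..., 'number': ...} dicts into card markup."""
    cards = []
    for i, kpi in enumerate(kpis):
        cards.append(
            f'<div class="card rainbow-{i % 7}">'  # cycle through the 7 gradients
            f'<div class="number">{escape(str(kpi["number"]))}</div>'
            f'<div class="title">{escape(str(kpi["title"]))}</div>'
            '</div>'
        )
    return '<div class="container">' + ''.join(cards) + '</div>'

# Canned data standing in for a widgetquery/getWidgetData response
sample = [{"title": "Net Revenue", "number": "3.4M"},
          {"title": "Gross Margin", "number": "1.2M"}]
html = render_kpi_cards(sample)
print("rainbow-0" in html, "rainbow-1" in html)  # True True
print("Net Revenue" in html)                     # True
```

<P class="lia-align-justify" style="text-align : justify;">Testing the rendering in isolation like this keeps the OAuth flow out of the loop while you iterate on the dashboard's look and feel.</P>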