SAP Community - SAP Data Services blog posts
Feed: https://raw.githubusercontent.com/ajmaradiaga/feeds/main/scmt/topics/SAP-Data-Services-blog-posts.xml

Join vs LookUp - Which function is efficient in SAP CPI-DS?
By _Hemant_Sharma (https://community.sap.com/t5/user/viewprofilepage/user-id/167445), 2023-11-08
https://community.sap.com/t5/technology-blogs-by-members/join-vs-lookup-which-function-is-efficient-in-sap-cpi-ds/ba-p/13580903

Hi SAP,

In this blog I present a detailed comparison of two functions in SAP Cloud Platform Integration Data Services (CPI-DS) that can fairly be called two sides of the same coin: "JOIN" (Inner Join) and "LOOKUP".

In my work with CPI-DS these are the two most common functions I have used to build dataflows that load master data and key figure data across various business lines.

First, a little background on the functionality of both functions.

"JOIN" function:

The "JOIN" function in CPI-DS has a tab of its own under Transform Details when creating a dataflow. "JOIN" returns results by matching columns between Table A and Table B. Two types of join are available in CPI-DS:

1. Left Outer Join - Say Table A is the left-side input and Table B is the right-side input. A left outer join returns all records from Table A (the left-side input) and only the matching records from Table B (the right-side input).
2. Inner Join - An inner join returns only the records that match in both Table A and Table B, regardless of which side each table is placed on.

When defining the Join function, we have to do three things:

1. Select the left and right-side input tables you want to use.
2. Select the type of join: Left Outer Join or Inner Join.
3. Enter a proper matching condition in the Join Condition box and save it.

We can define multiple join pairs under the Join tab. Each join should be relevant and make logical sense.

Additionally, the Join tab offers a feature called Join Rank. Join Rank comes in handy when a single dataflow has multiple joins and an order of priority must be set, determining which join executes first and which last. Join Rank also has a "Cache" option that caches data according to the value set against it (the cache options are Automatic, Yes, and No).
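To make the two join types concrete, here is a plain SQL sketch of the same semantics (TABLE_A, TABLE_B, and the columns are hypothetical; CPI-DS builds the equivalent logic from the Join tab rather than from hand-written SQL):

```sql
-- Left outer join: all rows from the left input (TABLE_A), matching rows
-- from the right input (TABLE_B), NULLs where no match exists.
SELECT a.PRODUCT_ID, a.PRODUCT_NAME, b.PLANT
FROM TABLE_A a
LEFT OUTER JOIN TABLE_B b
  ON b.PRODUCT_ID = a.PRODUCT_ID;

-- Inner join: only rows whose PRODUCT_ID exists in both inputs,
-- regardless of which table sits on the left or right side.
SELECT a.PRODUCT_ID, a.PRODUCT_NAME, b.PLANT
FROM TABLE_A a
INNER JOIN TABLE_B b
  ON b.PRODUCT_ID = a.PRODUCT_ID;
```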
"LOOKUP" function:

The LookUp function retrieves results from a second table based on a comparison of columns between the original table and the lookup table. In addition to matching columns between the tables, CPI-DS also allows matching on specific strings or characters present in the columns of the lookup table. For example, a user may want to look up only records where the column "PRODUCT DESCRIPTION" of the lookup table contains the string 'CHEMICALS'; only records whose "PRODUCT DESCRIPTION" contains 'CHEMICALS' will then qualify. (Please note that any given string is case sensitive.) A rough SQL analogue is sketched below.

LookUp also has three different cache options:

1. NO_CACHE - Reads values from the lookup table without caching records.
2. PRE_LOAD_CACHE - Loads the result column and the compare column into memory after applying filters but before executing the function. (By default, when applying the LookUp function, PRE_LOAD_CACHE is selected.)
3. DEMAND_LOAD_CACHE - Loads the result column and compare column values into memory as the function identifies them.
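As a rough SQL analogue (table and column names assumed; CPI-DS expresses this through the LookUp dialog, not hand-written SQL), a lookup behaves like a left join that returns a default value when no match is found:

```sql
-- Look up PRODUCT_DESCRIPTION from the reference table, returning 'N/A'
-- (the default value) when no row matches. The LIKE predicate mirrors the
-- string match on the lookup column; in CPI-DS that match is always case
-- sensitive, whereas in SQL it depends on the database collation.
SELECT s.PRODUCT_ID,
       COALESCE(r.PRODUCT_DESCRIPTION, 'N/A') AS PRODUCT_DESCRIPTION
FROM SOURCE_PRODUCTS s
LEFT OUTER JOIN REF_PRODUCTS r
  ON  r.PRODUCT_ID = s.PRODUCT_ID
  AND r.PRODUCT_DESCRIPTION LIKE '%CHEMICALS%';
```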
As mentioned previously, the LookUp and Join (Inner Join) functions are two sides of the same coin. What I mean is that both are used to validate the data set of a first table (most likely a source table from an SAP or non-SAP source) against a secondary table (most likely a reference or index table), so that only the required data is loaded into the system and garbage data is eliminated, decreasing unnecessary processing and increasing the efficiency of each dataflow in terms of runtime and system resources.

Scenario - I will use a very simple scenario to analyse the use case of both functions. Say XYZ Company has five business lines - A, B, C, D, and E - for which it wants to create separate planning areas, so that demand and supply planning can be performed for each line individually without the lines bearing on each other. The IBP team can define the various master data objects (Product, Location, Customer, etc.) by applying filters when loading data from source tables for each planning area. The catch is that the transactional source tables contain data from all five business lines, and the IBP team needs a way to bring in only the data relevant to each line. To tackle this, the team decides to create extraction tables for each master data object in MySQL and use them as reference tables when loading key figure data.

(Note: the Join and LookUp functions have the "Cache" feature enabled in all sub-scenarios discussed below. For the sake of simplicity, PRODUCT master data load tasks are used to discuss the sub-scenarios.)

[Image: TSK_SQL_PRODUCT_TASKWITHJOIN - Cache set to "YES" for both source tables]

[Image: TSK_SQL_PRODUCT_TASKWITHLOOKUP - Cache set to "PRE_LOAD_CACHE"]

Sub-Scenario A: In this scenario we load data in both tasks without adding any other function or logic before the Join or LookUp function.

[Image A]

[Image B]

In the Monitor Log for TSK_SQL_PRODUCT_TASKWITHJOIN, with absolutely no other logic or function performed before the Join (as in "Image A"), the Absolute Time taken to complete the run is 7.421 secs.

Similarly, for TSK_SQL_PRODUCT_TASKWITHLOOKUP, with no other logic or function performed before the LookUp (as in "Image B"), the Absolute Time taken to complete the run is 8.359 secs. (Note: after the LookUp, the default value - in this case 'N/A' - has to be filtered out.)

Findings - The difference between the two is negligible here; LookUp took about a second longer because it has to filter out the default value 'N/A', since we do not want to load such records.

Sub-Scenario B: But what happens when we employ a simple function just before the LookUp and Join? Here we take a scenario where the data has to be cleansed using the ltrim and rtrim functions, as below.

[Image C]
[Image D]

In the Monitor Log for TSK_SQL_PRODUCT_TASKWITHJOIN, with simple logic performed before the Join (as in "Image C"), the Absolute Time taken to complete the run is 10.468 secs.

Similarly, in the Monitor Log for TSK_SQL_PRODUCT_TASKWITHLOOKUP, with simple logic performed before the LookUp (as in "Image D"), the Absolute Time taken to complete the run is 8.843 secs.

Findings - Again the difference between Join and LookUp is barely 2 seconds, which can be called negligible, but this time the LookUp task finished faster.

Let's explore sub-scenarios where more than 2 tables are involved in the Join and LookUp functions. (For the sake of simplicity, I will load data into Location Product master data here.)

[Image E]

[Image F]

In the Monitor Log for TSK_SQL_LOCATIONPRODUCT_TASKWITHJOIN, with more than 2 tables involved in an Inner Join (as in "Image E"), the Join function starts building a cartesian product across the 3 tables used here. This results in more than 337 million rows and counting, as seen in the Monitor Log. At some point this task will fail if it keeps building the cartesian product, because it will time out and/or lose its connection to the server - making the Join function unreliable in such scenarios.

In the Monitor Log for TSK_SQL_LOCATIONPRODUCT_TASKWITHLOOKUP, with more than 2 tables involved in a LookUp (as in "Image F"), the Absolute Time taken to complete the run is 9.281 secs - unlike the Join task, which will most likely fail even if left to run for a full day.

Findings - So we observed that a Join involving more than 2 tables can build a cartesian product between the tables, creating millions of rows. This wastes system resources, time, and money, while the LookUp function performs the same task within seconds and succeeds in loading the data.
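For intuition, here is a sketch of one plausible way a multi-table join degenerates into a cartesian product: when the join conditions do not link every table involved (hypothetical table names, plain SQL):

```sql
-- LOCATIONS is not tied to the other tables by any join condition, so
-- every matched product row is paired with every LOCATIONS row, e.g.
-- 10,000 products x 50,000 locations = 500 million output rows.
SELECT p.PRODUCT_ID, l.LOCATION_ID
FROM PRODUCTS p
INNER JOIN REF_PRODUCTS r ON r.PRODUCT_ID = p.PRODUCT_ID
CROSS JOIN LOCATIONS l;   -- no ON condition constrains LOCATIONS
```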
In conclusion, although Join and LookUp can often be used interchangeably, in complex scenarios it is best practice to proactively use the LookUp function over an Inner Join when building dataflows involving 2 or more tables. At the end of the day, a data integration specialist should use the most efficient logic and functions to build dataflows, which helps build an efficient system and saves resources and time in the process.

I hope I have added some value with this blog. Thank you for your time and patience in going through this write-up.

Success Factor Adapter Issue in Data Services
By Sruti_P (https://community.sap.com/t5/user/viewprofilepage/user-id/14063), 2023-11-14
https://community.sap.com/t5/technology-blogs-by-members/success-factor-adapter-issue-in-data-services/ba-p/13579788

Issue:

We have SuccessFactors as a source and BW as target. Sometimes we would get the error below, and the jobs would hang or run very slowly. We were advised that a system reboot would help, but we had to let the jobs run slowly, because other DB jobs were running and would have been affected.

[Image: Success Factor Adapter in DSMC during issue]

Even a reboot did not help, and the adapter kept going into a stopped state.

[Image: Adapter status in DSMC after server reboot]

Solution:

Adapters in BODS write their trace and error logs to a server path, like all other jobs. Example:

Z:\ProgramData\SAP BusinessObjects\Data Services\adapters\log
File name: SuccessFactorAdapter_trace.txt

When the adapter file path was investigated, the trace file had grown very large and filled the whole drive. We renamed the existing file as a backup and started the SuccessFactors adapter; it started working and created a new file in the adapter path on the server. The old file was kept in a shared path for any further investigation, to be removed after the retention period.

Conclusion:

We have to figure out which files are necessary and always clear up logs after checking their importance. Cleaning them up within a retention period is best practice. Instead of a reboot, this workaround fixed the issue and let our jobs complete within SLA. After this there were no more hangs or slowness in the jobs.
New free Learning Journey: Exploring SAP Data Services
By Johann (https://community.sap.com/t5/user/viewprofilepage/user-id/137238), 2023-11-20
https://community.sap.com/t5/technology-blogs-by-sap/new-free-learning-journey-exploring-sap-data-services/ba-p/13572496

SAP Data Services is a comprehensive data integration solution enabling businesses to extract, transform, and load (ETL) data from various sources. With features like data quality management, text data processing, and real-time data integration, SAP Data Services addresses the challenges of a heterogeneous data environment.

The SAP Data Services Learning Journey provides a structured and comprehensive learning experience for both beginners and professionals. Embark on a focused learning journey designed for developers and consultants seeking in-depth expertise in SAP Data Services. Dive into key aspects like installation, scheduling, security, and configuration.

[Image: What SAP Data Services is for]

Upon completion, participants will have the skills to create, configure, and execute jobs seamlessly using the SAP Data Services Designer.

Get started with the new learning journey on learning.sap.com, Exploring SAP Data Services (https://learning.sap.com/learning-journey/exploring-sap-data-services?url_id=text-blogs-LSCPLCoE-DATASER), to take the first steps in creating your own jobs in SAP's Data Services Designer.

Fill the gap through upskilling and enjoy SAP's learning offerings on the SAP Learning site. This article is created and brought to you by SAP Product Learning CoE experts!

Partner Function update in PS Project WBS (Table IHPA)
By Sumonta (https://community.sap.com/t5/user/viewprofilepage/user-id/149523), 2023-12-11
https://community.sap.com/t5/enterprise-resource-planning-blogs-by-sap/partner-function-update-in-ps-project-wbs-table-ihpa/ba-p/13577689

Introduction:

Projects involve various people: project managers, project owners, division heads, finance, and so on. Most consultants use the "Pers. Resp. No." (VERNR) field to maintain this information. However, the requirement here is to add multiple IDs with different roles.
This is where the concept of Partner Functions in Project Systems comes into the picture.

Partner functions can be added both at Project Definition level and at WBS level.

[Image: partner function overview]

Issue:

During PS Project System data migration there is no standard template for this. Partner function data is reflected in the IHPA table, and there is no direct way to update that table.

Solution:

We can use the function module "PM_PARTNER_UPDATE" to do so. The same FM can be called through Data Services as well.

[Image: IHPA structure]

A sample dataset looks like this:

[Image: sample data]

Please keep in mind to populate the field UPDKZ; otherwise the change will not be reflected in the table. The possible values are:

'D'  Delete
'L'  Physically delete
'E'  Inherited
'I'  New
' '  No alteration
'U'  Update

UPDKZ = 'D' --> Sets the field KZLOESCH to 'X' in table IHPA.
UPDKZ = 'U' --> Updates the entry in the table.
UPDKZ = 'I' --> Inserts a new record into the table.
UPDKZ = 'L' --> Physically deletes the record from the table.

Transformation Summary in SAP Data Services (BODS) - Part 1 - Validation and Case Transformation
By pallab_haldar (https://community.sap.com/t5/user/viewprofilepage/user-id/594699), 2023-12-28
https://community.sap.com/t5/technology-blogs-by-members/transformation-summary-in-sap-data-service-bods-part1-validation-and-case/ba-p/13580082

Today I am going to discuss different transformation scenarios in SAP Data Services (BODS) that are required for transformations in dataflows.

1. Validation Transformation:

This transformation is used to validate data. It filters data based on a validation rule: rows that pass the rule go to one target, and rows that fail go to another target set up to receive the failed items.
We need to create a validation rule on the Validation transformation; based on that rule it filters the data and sends it to the destinations specified as targets.

There is a standard project structure that we will use for all of our transformations, given below:

[Image: project structure]

For example, say the Datastore of the source table contains the rows below:

[Image: source table]

Sample dataflow diagram for the Validation transformation:

[Image: validation dataflow]

[Image: validation rule definition]

[Image: "Send to fail" option]

In the PS1 table (set to receive passed rows):

[Image: passed rows]

In the FL1 table (set to receive failed rows):

[Image: failed rows]

In short, the Validation transformation filters data based on defined conditions and sends it to different targets depending on rule validation, as the SQL sketch below illustrates.
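A conceptual SQL sketch of the pass/fail split the Validation transform performs (hypothetical CUSTOMER source and a simple not-null/non-negative rule; in BODS the rule is defined in the transform rather than written by hand):

```sql
-- Rows satisfying the validation rule go to the pass target ...
INSERT INTO PS1
SELECT * FROM CUSTOMER
WHERE CREDIT_AMOUNT IS NOT NULL AND CREDIT_AMOUNT >= 0;

-- ... and rows violating it go to the fail target.
INSERT INTO FL1
SELECT * FROM CUSTOMER
WHERE CREDIT_AMOUNT IS NULL OR CREDIT_AMOUNT < 0;
```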
2. Case Transformation: Similar to a case statement in JAVA, HANA, or ABAP. You can direct each source data row into multiple (N) targets based on different conditions.

For example, say the Datastore of the source table contains the rows below:

[Image: source table]

Sample dataflow diagram for the Case transformation:

[Image: case transform dataflow]

Here we apply 3 conditions in the case statement: if Cond 1 passes, the row goes to CUST1; if Cond 2 passes, to CUST2; and if Cond 3 passes, to CUST3.

CASE 1: CUST.CREDIT_AMOUNT > 400000
CASE 2: CUST.CREDIT_AMOUNT > 300000
CASE 3: CUST.CREDIT_AMOUNT > 1000000

Add the cases and enter all the conditions, similar to the below (the SQL sketch after the results shows what this routing amounts to):

[Image: case conditions]

As a result:

1. The CUST1 table is generated with the 3 rows below:

[Image: CUST1]

2. The CUST2 table is generated with the 1 row below:

[Image: CUST2]

3. The CUST3 table is generated with the 2 rows below:

[Image: CUST3]
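Conceptually, the routing amounts to one filtered insert per target (plain SQL sketch using the blog's conditions; in BODS the Case transform's settings decide whether a row can match one case only or every case whose condition it satisfies):

```sql
INSERT INTO CUST1 SELECT * FROM CUST WHERE CREDIT_AMOUNT > 400000;
INSERT INTO CUST2 SELECT * FROM CUST WHERE CREDIT_AMOUNT > 300000;
INSERT INTO CUST3 SELECT * FROM CUST WHERE CREDIT_AMOUNT > 1000000;
-- Note: as written, a row with CREDIT_AMOUNT > 400000 satisfies the first
-- two conditions; the transform's one-case-only option resolves the overlap.
```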
# What is the difference between the Validation and Case transforms, and in which scenario should each be used to build a better, more cost-effective, faster dataflow design?

The execution-time difference between the Case and Validation transforms is usually very small. But Validation is limited to two main target tables/files, based on pass and fail, while Case can route data to multiple targets based on different conditions. In some business scenarios the load-failure reason is required by the business as a log; the Validation transform can gather statistics about the validation process, which is very useful there.

Transformation Summary in SAP Data Services - BODS - Part 2 - Merge and Query Transformation
By pallab_haldar (https://community.sap.com/t5/user/viewprofilepage/user-id/594699), 2023-12-28
https://community.sap.com/t5/technology-blogs-by-members/transformation-summary-in-sap-data-service-bods-part-2-merge-and-query/ba-p/13572306

In this blog I am going to discuss the Merge transformation and the Query transformation in SAP Data Services.

Merge Transformation:

The Merge transformation is used to merge multiple data sources, i.e. tables or files, into a single data source. All the sources must have the same number of fields, with the same data types, in the same sequence. Basically, the Merge transformation acts as a UNION.

For example, say we want to merge the source tables below, containing the Indian and Pakistani employees of an organization that operates in both countries:

Table 1: IND_EMP:

[Image: IND_EMP]

Table 2: PAK_EMP:

[Image: PAK_EMP]

Design of a merge scenario:

[Image: merge dataflow]

The target table EMPLOYEE should contain all the data, due to the union:

[Image: result]
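In SQL terms the Merge transform corresponds to a UNION ALL over identically structured inputs (a sketch using the blog's table names; the column list is assumed, and UNION ALL is used because Merge does not deduplicate rows):

```sql
-- Merge requires identical column count, data types, and order
-- in every input.
INSERT INTO EMPLOYEE (EMP_ID, EMP_NAME, SAL)
SELECT EMP_ID, EMP_NAME, SAL FROM IND_EMP
UNION ALL
SELECT EMP_ID, EMP_NAME, SAL FROM PAK_EMP;
```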
Query Transformation:

The Query transform is the most used transformation in BODS dataflows. It projects the data source fields to a subset or to the same set, and it is used alone or with other transformations depending on the business scenario.

It is used to:

1. Narrow down the column set by projection.
2. Narrow down the data using filters.
3. Combine data from multiple sources using a join or union.
4. Perform various other operations.

For example, say there are two data sources as below:

Data Source Table 1:

[Image: EMP]

Data Source Table 2:

[Image: IND_EMP]

The business wants to know the highest-paid employees outside the India region. We can use a Query transformation for this:

[Image: dataflow with Query transform]

Now double-click the Query transform and it takes you to the query editor:

[Image: query editor]

In the FROM tab, enter the join condition:

[Image: join condition]

In the WHERE tab, add the filter below:

PAK_EMP.SAL >= 10000

Save the job and execute it. The data preview of the target table should look like this:

[Image: target table]
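The whole Query transform above collapses into a single SQL statement (a sketch; join keys and column names are assumed from the screenshots):

```sql
-- Projection, join, and filter in one step: employees outside the
-- India region (taken here from PAK_EMP) earning 10000 or more.
SELECT e.EMP_ID, e.EMP_NAME, p.SAL
FROM EMP e
INNER JOIN PAK_EMP p ON p.EMP_ID = e.EMP_ID
WHERE p.SAL >= 10000;
```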
# In which scenarios are the Merge and Query transformations preferable?

The Merge transform can only merge data sources that have the same number of fields, the same data types, and the same field sequence; it acts like a UNION operation. The Query transformation, by contrast, can combine arbitrary data sets, project a subset of columns, filter data, and add parameters to the data. The Query transformation has a very large set of uses and is commonly used together with other transformations.

Transformation Summary in SAP Data Services - BODS - Part 3 - SQL and Map Transformation
By pallab_haldar (https://community.sap.com/t5/user/viewprofilepage/user-id/594699), 2024-01-03
https://community.sap.com/t5/technology-blogs-by-members/transformation-summary-in-sap-data-service-bods-part-3-sql-and-map/ba-p/13572503

SQL and Map are two important transformations used in different business scenarios.

SQL Transform: Similar to the Query transform, it projects the data source fields to a subset or the same set, and it is used alone or with other transformations depending on the business scenario. The difference is that the Query transform generates optimized SQL, while the SQL transform lets you enter SQL queries directly: the query you write to project or filter the data is executed and mapped to the output. When complex logic needs to extract data from procedures or table functions based on certain input parameters, the SQL transformation is beneficial.

In our case, we will implement the same business scenario, selecting the employees outside India with a salary of more than 10000.

Source table:

[Image: source table]

The design:

[Image: dataflow diagram]

The editor:

[Image: SQL transform editor]

The result should be the same as what we got using the Query transform:

[Image: result]
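For reference, the kind of statement typed into the SQL transform's editor for this scenario might look like the following (table and column names assumed from the screenshots):

```sql
-- Entered directly in the SQL transform; Data Services maps the
-- result columns to the transform's output schema.
SELECT EMP_ID, EMP_NAME, REGION, SAL
FROM EMP
WHERE REGION <> 'India'
  AND SAL > 10000;
```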
Map Transformation:

The Map Operation transform is a very important transformation: it modifies opcodes (Insert/Update/Delete). It converts between the opcodes supplied as input by the previous transformation; after a Map Operation you can only use transformations that understand opcodes, or load a permanent table. A Query transform before it is optional. In simple words: if a row changes in the source table, the Map transformation lets you update or insert that row in the target data set.

Important topic: when do we use Map without Table Comparison, and when with it?

The Map Operation does not identify new or updated source rows automatically. The Table Comparison transform does: it compares each source row with the target table data and returns an opcode (Insert, Update, or Delete) for it as output. So if we use Map together with Table Comparison, we can implement insert, update, and delete operations for a data source in a single flow; with a Map Operation alone, we need parallel flows, one per operation (insert, update, and delete).

Table Comparison identifies the records that need to be updated or inserted in the target table by comparing it with the source. If the requirement is a mix of inserts and updates, use Table Comparison together with a Map Operation. But if you know that all records need updates, use Map alone; likewise, if you want to delete a set of records based on some condition, Map alone works better.

Scenario 1:

A. Map Operation with Table Comparison:

In this scenario, if a row changes in the source, the target is updated; if a row is inserted into or deleted from the source, it is inserted into or deleted from the target.

[Image: Map with Table Comparison dataflow]

[Image: result]

A similar result can be obtained from History Preservation and Row Generation together with Table Comparison:

[Image: History Preservation dataflow]

[Image: Map with Table Comparison flow]

For a simple update, insert, and delete, we can use the flow below with the default Map opcode settings:

[Image: flow with default opcode settings]

[Image: result]
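What Table Comparison plus Map Operation accomplish in one flow is, in SQL terms, close to an upsert via MERGE (a sketch with assumed table and key names; MERGE syntax varies by database, and MySQL for instance does not support it):

```sql
-- Compare source to target on the key, then apply the resulting opcode
-- per row: UPDATE on a match, INSERT when no match exists.
MERGE INTO EMP_TARGET t
USING EMP_SOURCE s
   ON (t.EMP_ID = s.EMP_ID)
WHEN MATCHED THEN
  UPDATE SET t.EMP_NAME = s.EMP_NAME, t.SAL = s.SAL
WHEN NOT MATCHED THEN
  INSERT (EMP_ID, EMP_NAME, SAL) VALUES (s.EMP_ID, s.EMP_NAME, s.SAL);
```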
Without Table Comparison and Key Generation, we can achieve the same update, insert, and delete using the flow and opcode settings below:

[Image: parallel flows with Map Operations]

Normal opcode setting:

[Image: Normal opcode settings]

Insert opcode setting:

[Image: Insert opcode settings]

Update opcode setting:

[Image: Update opcode settings]

Delete opcode setting:

[Image: Delete opcode settings]

In the next blog I will discuss the Data Integration transforms, such as History Preservation, Table Comparison, and Row Generation.

Transformation Summary in SAP Data Services - BODS - Part 4 - Table Comparison, History Preservation and Row Generation Transformation
By pallab_haldar (https://community.sap.com/t5/user/viewprofilepage/user-id/594699), 2024-01-05
https://community.sap.com/t5/technology-blogs-by-members/transformation-summary-in-sap-data-service-bods-part-4-table-comparison-and/ba-p/13573955

Today I will discuss three useful Data Integration transformations. Used together, they can produce a number of the transformation outputs required by the business: Table Comparison, History Preservation, and Row Generation.

These major transformations are given below.

1. Table Comparison: Used to compare data between source and target tables/flat files, returning an opcode for insert or update based on the compare columns you define in the transformation. It is not mandatory but good practice to use a Query transform before Table Comparison to project and select the specific data. Key Generation and Map Operation are not mandatory with Table Comparison, but in scenarios where an update must be written as an insert, i.e. where history must be preserved, Key Generation and Map Operation are useful.

[Image: Table Comparison dataflow]

[Image: Table Comparison editor]

2. History Preservation: This transformation keeps historical data: the superseded version of an updated row is kept as a separate row with its flag set to 'N', while the new row gets the flag 'Y'. History Preservation gives better results together with the Key Generation transform. When a source row carries an Insert or Update opcode, it inserts a new record into the target table.

The 4 fields below are preferable in the target table structure for history preservation (see the SQL sketch after this list):

- S-key (surrogate key): acts as the primary key, since you will get duplicate natural keys
- STRT_DATE: holds the valid-from date
- END_DATE: holds the change date
- Flag: indicates whether this is the current record or an old record
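A minimal SQL sketch of what such an SCD Type 2 style load does per changed row (assumed table, key, and sequence names; in BODS this is produced by Table Comparison, History Preservation, and Key Generation rather than hand-written SQL):

```sql
-- 1. Close the current version of the changed row.
UPDATE PRODUCT_DIM
   SET END_DATE = CURRENT_DATE,
       FLAG     = 'N'
 WHERE PRODUCT_ID = 4711
   AND FLAG = 'Y';

-- 2. Insert the new version with a fresh surrogate key and open validity.
--    NEXT_SKEY stands for the next surrogate key value, e.g. from a
--    sequence; generating it is Key Generation's job.
INSERT INTO PRODUCT_DIM
  (S_KEY, PRODUCT_ID, PRODUCT_DESC, STRT_DATE, END_DATE, FLAG)
VALUES
  (NEXT_SKEY, 4711, 'New description', CURRENT_DATE, DATE '9999-12-31', 'Y');
```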
It is also preferable to have start and end dates in the source columns. Using the History Preservation transform together with Table Comparison and Key Generation is preferable.

[Image: History Preservation dataflow]

[Image: History Preservation editor]

3. Key Generation: This transformation generates an extra key in incremental order to distinguish the old records from the newly inserted ones. Again, using History Preservation together with Table Comparison and Key Generation is preferable.

[Image: Key Generation editor]

These three transformations are used together in different scenarios to achieve different goals, including the different delta mechanisms in BODS. The most common and popular delta mechanisms are the SCD types and CDC (source-based and target-based), which build on the transforms above.

[Image: combined dataflow]

In the next session I will discuss some of the Data Quality transformations that a BODS developer needs on a day-to-day basis.
Important Quality Transformation Summary in SAP Data Services - BODS - Part 5 - Match, Data Cleanse and Global Address Cleanse
By pallab_haldar (https://community.sap.com/t5/user/viewprofilepage/user-id/594699), 2024-01-12
https://community.sap.com/t5/technology-blogs-by-members/important-quality-transformation-summary-in-sap-data-service-bods-part-5/ba-p/13574322

Before pushing data through a dataflow, it is better to profile and cleanse it, i.e. to maintain the quality of the data. For data profiling, Information Steward is the best tool; we will discuss it later in a different blog.

My intention in every blog is a to-the-point explanation of an object, just what a developer needs to start development; for more detail, there are many places where each topic is described at length.

Today I will discuss three important Quality transformations in SAP Data Services, given below.

1. Match:

From the name you might predict that this transformation matches and identifies records, but the Match transformation does more than that. For example, it can identify potentially identical rows as duplicates in data that has no primary key to identify duplicates by. Based on the matching method and criteria, it identifies those duplicates and can then:

- classify rows into master and subordinate records,
- fill the master record from the subordinate records,
- remove the subordinates and keep only the master row.

Source table:

[Image: source table]

There are several kinds of Match transformation, such as Match Base, Address Match, and different types of consumer match. Through the configuration you set the rules for matching records, then establish the thresholds and scores that BODS uses to determine near matches and matching records, based on one of the methods below:

- Rule Based method
- Weighted Scoring method
- Combination method

We will use a simple Base_Match here. Configuration in the Match wizard:

[Image: Match wizard]

[Image: Match configuration]

Expected result table:

[Image: result table]

There are multiple scenarios in which the different Match transformation types can be used; the sketch below illustrates the simplest one, exact deduplication.
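As a rough SQL illustration of the master/subordinate idea (table and column names assumed; the real Match transform scores near matches on configurable criteria, not exact equality):

```sql
-- Treat rows agreeing on normalized name and address as one duplicate
-- group; rank them and keep the lowest CUST_ID as the "master".
WITH ranked AS (
  SELECT c.*,
         ROW_NUMBER() OVER (
           PARTITION BY UPPER(TRIM(NAME)), UPPER(TRIM(ADDRESS))
           ORDER BY CUST_ID
         ) AS rn
  FROM CUSTOMERS c
)
SELECT * FROM ranked
WHERE rn = 1;   -- rn > 1 rows are subordinates and are dropped
```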
2. Data Cleanse:

Used to clean the data and prepare it for other transformations. Its options let you clean data and replace string patterns, for example:

- Remove diacritical characters
- Information and status codes
- Phone parser
- Parse discrete input
- New Data Cleanse output fields
- New Data Cleanse input fields

For more detail, the SAP documentation below describes it very clearly:

https://help.sap.com/docs/SAP_DATA_SERVICES/9f1b4472ec98409682d91953b9e68c92/578e652a6d6d1014b3fc9283b0e91070.html

3. Global_Address_Cleanse:

The Global Address Cleanse transformation is used to parse input data, cleanse it, and correct and standardize address data for different countries. The proposed address fields include:

- Street number
- Street name
- Apartment numbers
- Cardinal directions
- Locality
- Region
- Zip codes

In the panel we configure the scenario and customize it accordingly:

[Image: Global Address Cleanse configuration]

[Image: Global Address Cleanse options]

Each scenario needs to be customized per the requirement.

Repository, Job Server and Job Server Group Creation in SAP Data Services
By pallab_haldar (https://community.sap.com/t5/user/viewprofilepage/user-id/594699), 2024-01-16
https://community.sap.com/t5/technology-blogs-by-members/repository-job-server-and-job-server-group-creation-in-sap-data-service/ba-p/13575349

Today I am going to discuss Repository, Job Server, and Job Server Group creation in SAP Data Services. Part of this is administration work, but it is better to know all the steps before starting design and development in the BODS Designer.

SAP BODS system preparation:

1. Database creation:

This is purely DB work, and the DB admin usually creates the database. It is the first part of the preparation process. The steps are documented in many places, so I will not discuss them in detail here.

2. Repository creation (Local, Central, Profiler):

What is a repository? A repository is a place used to store the metadata of objects used in BODS. Every repository should be:

1. Registered in the Central Management Console (CMC).
2. Linked to one or more Job Servers, which are responsible for executing the jobs that are created.

Different types of repository:

A. Local Repository: The local repository stores the metadata of all objects you develop, such as jobs, workflows, dataflows, and projects.

B. Central Repository: This repository is like the main branch of a Git repo: developers build their artifacts in local branches (local repositories here) and push versions to the main branch. It is used for version management of objects, and it stores all versions of an application object.
C. Profiler Repository: Data profiling means cleansing, preparing, and standardizing data for use across the enterprise landscape. This repository manages all the metadata related to profiler tasks performed in the SAP BODS Designer. Information Steward is used for data profiling, and this repository stores the metadata of profiling tasks and objects created in Information Steward. (The CMS repository stores the metadata of all tasks performed in the CMC on the BI platform.)

Steps to create a repository:

A. Go to SAP Business Objects Data Services 4.2 SP3 -> Data Services Repository Manager.

B. Create the repository by providing the details below:

- Repository type, database type, DB name, user name, and password.
- Click the Create button.
- If a repo with the same name was already created, you will get a warning and will not be allowed to create another one.

[Image: Repository Manager]

3. Attach the local repository to a Job Server or Job Server group:

The created local repo needs to be attached to a Job Server. Objects such as dataflows created in the local repo need to be executed on a server; the Job Server is the server where the developed objects run.

Note: in the popup wizard you can add an existing Job Server, or create a new Job Server and associate it. In our case we will create a new Job Server.

A. Go to SAP BusinessObjects Data Services 4.2 SP3 -> Data Services Server Manager.

[Image: Server Manager]

B. Fill in the details: Job Server name, Job Server port, database type, database server name, database name, username and password, and check the default repository box.

[Image: Job Server configuration]

C. Test the created Job Server:

[Image: Job Server test]

4. Repository registration in CMC:
A. After logging into the BO CMC URL, go to Data Services -> Repositories:

[Image: CMC Data Services]

B. Right-click the Repositories link -> Configure repository (or Repositories -> Manage -> Configure repository) and fill in the connection details.

[Image: repository configuration panel]

5. Job Server creation:

This was already covered while attaching the repo to the Job Server in section 3B.

6. Job Server group creation:

Before creating one: what is a Job Server group, and why do we need it?

A server group is a logical SAP Data Services component used to group Job Servers on different computers or the same computer. A Job Server group automatically measures resource availability on each Job Server in the group and distributes scheduled batch jobs to the Job Server with the lightest load at runtime.

All the Job Servers in a server group must be associated with the same repository, which must be defined as their default repository. The Job Servers in the server group must also have:

- identical SAP Data Services versions,
- identical database server versions,
- an identical locale.

Each computer can contribute only one Job Server to a server group.

[Image: server group diagram. Note: this diagram is taken from SAP.]

Steps to create a Job Server group:

A Job Server group requires at least two Job Servers. From the SAP Data Services Management Console:

[Image: Server Groups page]

1. Select Server Groups -> All Server Groups.
2. Click the Server Group Configuration tab.
3. Add a server group using the "Add" button.
4. When you select a repository, all Job Servers registered with that repository are displayed. You can create one server group per repository. Keep the default server group name: it is the name of your repository with the prefix SG_ (for "server group"). You can change the default name, but labeling a server group with the repository name is recommended.
5. Only one Job Server per computer can be added to a server group.
Use the Host and Port column to verify that the Job Servers you select are each installed on a different host.</LI><BR /> <LI>After you select the Job Servers for a server group. Job server Group required at least two or more Job servers.</LI><BR /> </OL><BR /> For more details, you can visit the reference URL -<BR /> <BR /> <A href="https://help.sap.com/docs/SAP_DATA_SERVICES/4d2e69d5c5c2492681f0c64062ba66f1/574f3d336d6d1014b3fc9283b0e91070.html" target="_blank" rel="noopener noreferrer">https://help.sap.com/docs/SAP_DATA_SERVICES/4d2e69d5c5c2492681f0c64062ba66f1/574f3d336d6d1014b3fc9283b0e91070.html</A> 2024-01-16T02:01:05+01:00 https://community.sap.com/t5/technology-blogs-by-members/transport-in-sap-data-services/ba-p/13588143 Transport in SAP Data Services 2024-01-31T04:56:40.675000+01:00 pallab_haldar https://community.sap.com/t5/user/viewprofilepage/user-id/594699 <P>Today I will discuss about the different transport techniques in SAP data services. Based on the project scenario and configuration we will select appropriate techniques.</P><P><STRONG>1. Normal Import ATL file and export:</STRONG></P><OL class="lia-list-style-type-upper-alpha"><LI>Taking Backups of ATL Files.</LI><LI>Select the object that needs to move in the local object repository.</LI></OL><P>&nbsp; &nbsp;Right-click -&gt;export-&gt;export to ATL File. Select the Target REPO of QA or Prod Environment. Then -&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;Right click-&gt;Import-&gt;Import from the backup ATL File. change the Datastore connections according to the environment.</P><P><STRONG>2. Using Central Repository:</STRONG></P><P>By using Check in/check out concept.</P><P>1. Create a central repository and connect it to the local repository and activate the central repository.</P><P>2. From Local repository: Select Local Object -&gt; Right Click. Add to central repository.</P><P>&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_0-1706667052118.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/56673i2CDC0606E1EAC957/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_0-1706667052118.png" alt="pallab_haldar_0-1706667052118.png" /></span></P><P>In another way -</P><P>1. You need to go to tools and select Central Repository:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_1-1706667052122.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/56671i342E1E4855664918/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_1-1706667052122.png" alt="pallab_haldar_1-1706667052122.png" /></span></P><P>&nbsp;</P><P>2. From the dialog box we need to select the object which we want to promote.</P><P>&nbsp;</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_2-1706667052124.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/56672i5D699F2824351B6F/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_2-1706667052124.png" alt="pallab_haldar_2-1706667052124.png" /></span></P><P>3. From Target Repository in different environment - Check out the PROJP from central repo "with objects and dependents without replacement" option and check in again "with objects and dependents" OR</P><P>4. Right click on the Object à Get Latest Version à With Filtering.</P><P>&nbsp;</P><P>3. 
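<P>Because an exported ATL file is plain text, the environment-specific parts of the datastore connections can also be adjusted with a small script before the import. The sketch below is purely illustrative: the host and schema names and the file names are assumptions, and in practice you would normally maintain separate datastore configurations in the Designer instead.</P>
<pre class="lia-code-sample language-python"><code># Illustrative only: swap environment-specific connection values in an
# exported ATL file before importing it into the QA repository.
# Host/schema names and file names below are invented for the example.
replacements = {
    "DEV_DB_HOST": "QA_DB_HOST",     # hypothetical database host names
    "DEV_SCHEMA":  "QA_SCHEMA",      # hypothetical schema names
}

with open("job_backup_dev.atl", encoding="utf-8") as f:
    atl_text = f.read()

for old, new in replacements.items():
    atl_text = atl_text.replace(old, new)

with open("job_backup_qa.atl", "w", encoding="utf-8") as f:
    f.write(atl_text)
</code></pre>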
<P><STRONG>2. Using a Central Repository:</STRONG></P><P>This technique uses the check-in/check-out concept.</P><P>1. Create a central repository, connect it to the local repository, and activate the central repository.</P><P>2. From the local repository: select the local object -&gt; right-click -&gt; Add to central repository.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_0-1706667052118.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/56673i2CDC0606E1EAC957/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_0-1706667052118.png" alt="pallab_haldar_0-1706667052118.png" /></span></P><P>Alternatively -</P><P>1. Go to Tools and select Central Repository:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_1-1706667052122.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/56671i342E1E4855664918/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_1-1706667052122.png" alt="pallab_haldar_1-1706667052122.png" /></span></P><P>2. From the dialog box, select the object that we want to promote.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_2-1706667052124.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/56672i5D699F2824351B6F/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_2-1706667052124.png" alt="pallab_haldar_2-1706667052124.png" /></span></P><P>3. From the target repository in the other environment, check out the project from the central repository with the "with objects and dependents without replacement" option and check it in again "with objects and dependents", OR</P><P>4. Right-click on the object -&gt; Get Latest Version -&gt; With Filtering.</P><P><STRONG>3. Transport or promote the object with CTS+ :</STRONG></P><P>Here I will not discuss the <STRONG>CTS+</STRONG> configuration. You can go through the link below – <A href="https://help.sap.com/doc/f6e6dd2617e44e1fa06e8a6e023b63fa/4.2.14/en-US/ds_42_cts_en.pdf" target="_blank" rel="noopener noreferrer">https://help.sap.com/doc/f6e6dd2617e44e1fa06e8a6e023b63fa/4.2.14/en-US/ds_42_cts_en.pdf</A></P><P>I will discuss how to transport or promote an object with CTS+. Below are the steps. The basic architecture of the CTS+ configuration with BODS is given below -</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_3-1706667052127.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/56675i736DC5459FDF580F/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_3-1706667052127.png" alt="pallab_haldar_3-1706667052127.png" /></span></P><P><STRONG>Steps to Transport:</STRONG></P><P><STRONG>1.</STRONG> Navigate to the Administrator screen of the SAP Data Services development system and click on Object Promotion.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_4-1706667052128.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/56676i2C22E449AA1E0912/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_4-1706667052128.png" alt="pallab_haldar_4-1706667052128.png" /></span></P><P>2. Select the repository and the object type -</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_6-1706667052133.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/56678iF6605D97B4BFA792/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_6-1706667052133.png" alt="pallab_haldar_6-1706667052133.png" /></span></P><P>3. Select the objects you want to add and press the Export button:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_7-1706667052136.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/56679iE96104D3BFC52906/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_7-1706667052136.png" alt="pallab_haldar_7-1706667052136.png" /></span></P><P>4. Check that the <STRONG>Job/Object</STRONG> export completes :</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_9-1706667052138.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/56681iA06BF12FF581804F/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_9-1706667052138.png" alt="pallab_haldar_9-1706667052138.png" /></span></P><P>5.
When the export completes, release the transport for the next system :</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_10-1706667052141.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/56682i52E5E3ED9EA19E4B/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_10-1706667052141.png" alt="pallab_haldar_10-1706667052141.png" /></span></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_11-1706667052142.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/56680iF1B634C362C0A8A5/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_11-1706667052142.png" alt="pallab_haldar_11-1706667052142.png" /></span></P><P>6. The import transport will appear in your target queue :</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_12-1706667052143.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/56683iCEFD7A85C395CBC2/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_12-1706667052143.png" alt="pallab_haldar_12-1706667052143.png" /></span></P><P>In the next blog I will discuss job scheduling in SAP Data Services.</P> 2024-01-31T04:56:40.675000+01:00 https://community.sap.com/t5/technology-blogs-by-members/load-data-to-sftp-folder-using-authorization-type-as-public-key-in-sap-data/ba-p/13589319 Load data to SFTP folder Using Authorization Type as Public Key in SAP Data Service 2024-01-31T13:18:53.136000+01:00 vinay_lohakare5 https://community.sap.com/t5/user/viewprofilepage/user-id/191828 <P><STRONG>Requirement:</STRONG></P><P>Load data to an SFTP folder using Authorization Type as Public Key in SAP Data Services.</P><P><STRONG>Description:</STRONG></P><P>This blog provides information about the use of the File Location object to upload a file to an SFTP server using <STRONG>Authorization Type as Public Key</STRONG>.</P><P>We already have a blog that gives you a good idea of how to use the File Location object to access files over SFTP and FTP as of Document Version: 4.2 Support Package 6 (14.2.6.0).
That blog covers the file configuration using the authorization type Password.</P><P><A title="Export File to SFTP/FTP" href="https://community.sap.com/t5/technology-blogs-by-members/use-file-location-object-to-read-import-and-write-export-files-in-ftp-and/ba-p/13192794" target="_self">Export File to SFTP/FTP</A></P><P>I will be focusing only on the SFTP file upload using the <EM><U>authorization type Public Key:</U></EM></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vinay_lohakare5_0-1706703194318.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/57449iC1A220212631078D/image-size/medium?v=v2&amp;px=400" role="button" title="vinay_lohakare5_0-1706703194318.png" alt="vinay_lohakare5_0-1706703194318.png" /></span></P><P>Follow the steps to create the file location for SFTP as mentioned in the above blog.</P><P>Select the Authorization Type as “Public Key” and you will get the below options:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vinay_lohakare5_1-1706703209636.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/57450iD825537F51002EBC/image-size/medium?v=v2&amp;px=400" role="button" title="vinay_lohakare5_1-1706703209636.png" alt="vinay_lohakare5_1-1706703209636.png" /></span></P><P>You will be provided the username and the RSA private key by the SFTP team. With the provided RSA private key alone, you won’t be able to create the connection and will get the below error message:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vinay_lohakare5_2-1706703244667.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/57451i6DEDAF4206E76100/image-size/medium?v=v2&amp;px=400" role="button" title="vinay_lohakare5_2-1706703244667.png" alt="vinay_lohakare5_2-1706703244667.png" /></span></P><P>To resolve the error, use a tool that can generate the public and private keys from the provided RSA private key. We have used “<STRONG>PuTTY Key Generator</STRONG>” (you can use any other available software).</P><P>Open PuTTY Key Generator, go to File, and select “Load Private Key”.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vinay_lohakare5_3-1706703297681.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/57452i6AF37DC9AFF4B35A/image-size/medium?v=v2&amp;px=400" role="button" title="vinay_lohakare5_3-1706703297681.png" alt="vinay_lohakare5_3-1706703297681.png" /></span></P><P>Once loaded, select “Save public key” followed by “Save private key”. The public key should be saved with the file extension “.pub” and the private key with the file extension “.pem”.</P><P><STRONG>Note </STRONG>– You can also paste the generated public/private key into a text editor and save it with the extensions mentioned above.</P>
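<P>If you prefer scripting this step, the public key can also be derived with Python's cryptography package, assuming the provided private key is PEM-encoded; the file names are examples.</P>
<pre class="lia-code-sample language-python"><code># Derive an OpenSSH-format public key (.pub) from a PEM-encoded RSA
# private key, as a scripted alternative to PuTTY Key Generator.
# File names are examples; pass password=b"..." if the key is protected.
from cryptography.hazmat.primitives import serialization

with open("sftp_key.pem", "rb") as f:
    private_key = serialization.load_pem_private_key(f.read(), password=None)

public_bytes = private_key.public_key().public_bytes(
    encoding=serialization.Encoding.OpenSSH,
    format=serialization.PublicFormat.OpenSSH,
)

with open("sftp_key.pub", "wb") as f:
    f.write(public_bytes)
</code></pre>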
<P><STRONG>File Location Configuration:</STRONG></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vinay_lohakare5_4-1706703355205.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/57453iC00AF4CD159C19A6/image-size/medium?v=v2&amp;px=400" role="button" title="vinay_lohakare5_4-1706703355205.png" alt="vinay_lohakare5_4-1706703355205.png" /></span></P><UL><LI>Name: the name of the file location.</LI><LI>Protocol: the type of protocol (SFTP in our example).</LI><LI>Port: you can leave it at the default. Usually for SFTP it is 22 and for FTP it is 21.</LI><LI>Host key fingerprint: required for SFTP, not for FTP. Don't worry, the system will propose this key once the user and the other provided details are passed.</LI><LI>SSH Authorization Private Key file path: the file path where the PuTTY-generated private key is saved.</LI><LI>SSH Authorization Private Key Passphrase: the passphrase set when the key was generated (if any).</LI><LI>SSH Authorization Public Key file path: the file path where the PuTTY-generated public key is saved.</LI><LI>Remote Directory: the SFTP path where the generated file is to be pushed.</LI><LI>Local directory: required only if you are copying the file to a local system folder.</LI><LI>Connection retry count and interval: can be left as they are.</LI></UL><P>Once the file location is configured, go to the file in the data flow and choose the file location from the drop-down.</P><P>Provide the file name and you are good to go&nbsp;<span class="lia-unicode-emoji" title=":slightly_smiling_face:">🙂</span>.</P>
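<P>To verify the key pair and the remote directory outside Data Services, a short paramiko script can replicate the same public-key SFTP upload; the host, user name, and paths below are placeholders.</P>
<pre class="lia-code-sample language-python"><code># Test the public-key SFTP connection outside Data Services.
# Host, username, and file paths are placeholders for this example.
import paramiko

key = paramiko.RSAKey.from_private_key_file("sftp_key.pem")  # add password=... if set
transport = paramiko.Transport(("sftp.example.com", 22))     # 22 is the usual SFTP port
transport.connect(username="ds_user", pkey=key)

sftp = paramiko.SFTPClient.from_transport(transport)
sftp.put("extract.csv", "/inbound/extract.csv")              # local file to remote directory
sftp.close()
transport.close()
</code></pre>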
2024-01-31T13:18:53.136000+01:00 https://community.sap.com/t5/technology-blogs-by-members/job-scheduling-in-sap-data-services/ba-p/13593966 Job Scheduling in SAP Data services 2024-02-04T18:35:05.210000+01:00 pallab_haldar https://community.sap.com/t5/user/viewprofilepage/user-id/594699 <P>In this blog, I will discuss the steps to schedule a job using the Data Services Management Console.</P><P><STRONG>A. The steps for batch job scheduling are given below -</STRONG></P><P>1. Log in to the Data Services Management Console.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_0-1707067762958.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/59987i494B18E191292738/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_0-1707067762958.png" alt="pallab_haldar_0-1707067762958.png" /></span></P><P>2. Now go to '<STRONG>Administrator</STRONG>' and expand the 'Batch' menu –</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_1-1707067762965.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/59985iD0EAC5B76C94BF08/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_1-1707067762965.png" alt="pallab_haldar_1-1707067762965.png" /></span></P><P>3. Select the repository where the job you want to schedule has been developed.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_2-1707067762971.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/59986iD5BC7A31D517FC6B/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_2-1707067762971.png" alt="pallab_haldar_2-1707067762971.png" /></span></P><P>4. Go to the 'Batch Job Configuration' tab and select the project of your job. On the right-hand side, click 'Schedules', which is under the 'Other Information' column of the job for which we want to create a schedule –</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_3-1707067762981.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/59989i9AAF504DD732FA2A/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_3-1707067762981.png" alt="pallab_haldar_3-1707067762981.png" /></span></P><P>5. Then click on the 'Add' button. Give a name to your schedule and select Date-Day for the schedule. There is a 'recurring' option if you want to schedule the job on a recurring basis. In our case, we will schedule it once a day at 11:23 PM. Click on the 'Apply' button and a schedule for your job is created.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_4-1707067762992.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/59990i81B68C2E2ADB6B86/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_4-1707067762992.png" alt="pallab_haldar_4-1707067762992.png" /></span></P><P>6. If we want to check, activate, or deactivate the created schedule, we can go to the <STRONG>'Repository Schedules'</STRONG> tab, where we can see the scheduled job.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_5-1707067762997.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/59988i85D153028A635A77/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_5-1707067762997.png" alt="pallab_haldar_5-1707067762997.png" /></span></P><P><STRONG>B. Real-time job scheduling: </STRONG></P><P>Real-time jobs do not wait for an internal trigger or a schedule like batch jobs do. A real-time job waits for a message from the Access Server and starts automatically as soon as the message arrives.</P><P>Real-time jobs are suitable for XML messages. The basic architecture is shown below –</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_6-1707067762999.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/59992i21D3F6137C566CDD/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_6-1707067762999.png" alt="pallab_haldar_6-1707067762999.png" /></span></P><P>Before starting development, we need to configure the Access Server.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_7-1707067762999.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/59991iF91F67125E9DDBAC/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_7-1707067762999.png" alt="pallab_haldar_7-1707067762999.png" /></span></P><P>Real-time jobs start with a message and end with a message: the input message must be the first step in the job and the output message the last one. After the real-time job is published, DS checks the job's tables, syntax, and objects, and if everything is OK, the job starts and can accept requests.</P><P>Hope this helps. In the next blog I will discuss the different performance tuning techniques in SAP Data Services.</P> 2024-02-04T18:35:05.210000+01:00 https://community.sap.com/t5/technology-blogs-by-members/moving-a-task-from-one-ci-ds-organization-to-another/ba-p/13599397 Moving a Task from One CI DS Organization to Another 2024-02-13T09:39:40.899000+01:00 Harshavardhan_N https://community.sap.com/t5/user/viewprofilepage/user-id/124498 <P>The process of importing a task from one SAP Cloud Integration for Data Services (CI DS) client to another is like copying a job or task.
You can transfer a single task or all tasks within a project by exporting them and subsequently importing them into a different organization or a new datacenter.</P><P><STRONG>Export Task:</STRONG></P><P>Log in to the CI DS organization from which you wish to copy the task.</P><P>Choose the specific task or the project that includes the tasks you want to export.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Harshavardhan_N_0-1707396072974.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/62764i5E1D4E804D9D2C42/image-size/medium?v=v2&amp;px=400" role="button" title="Harshavardhan_N_0-1707396072974.png" alt="Harshavardhan_N_0-1707396072974.png" /></span></P><P>Click&nbsp;<STRONG>More Actions &gt; Export</STRONG>.</P><P>A file is saved to your local Downloads directory. Single tasks are exported to a flat file in XMI format and saved with a&nbsp;.xml&nbsp;file extension. All tasks in a project are exported in a zip file.</P>
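<P>As a quick sanity check before importing, an export in either of the formats just described (a .xml XMI file for a single task, a .zip for a whole project) can be inspected with a few lines of Python; the file name is an example.</P>
<pre class="lia-code-sample language-python"><code># Inspect a CI DS export before importing it: a single task is a .xml
# (XMI) file, a whole project is a .zip. The file name is an example.
import zipfile
import xml.etree.ElementTree as ET

def inspect_export(path):
    if path.endswith(".zip"):
        with zipfile.ZipFile(path) as zf:
            for name in zf.namelist():      # one entry per exported task
                print("contains:", name)
    else:
        root = ET.parse(path).getroot()     # XMI is plain XML
        print("root element:", root.tag)

inspect_export("MyProject_export.zip")
</code></pre>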
<P><STRONG>Import Task:</STRONG></P><P>After exporting a single task or all tasks in a project, finalize the transfer by importing them into the new organization or datacenter. To import tasks, you must have the Administrator role. Tasks are imported into a project, so identify the project where you intend to import the tasks.</P><OL><LI>Choose the project where you intend to import the individually exported task or the group of tasks from an exported project, then click on "More Actions" &gt; "Import."</LI></OL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Harshavardhan_N_1-1707396072979.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/62763iBA7D062D16491418/image-size/medium?v=v2&amp;px=400" role="button" title="Harshavardhan_N_1-1707396072979.png" alt="Harshavardhan_N_1-1707396072979.png" /></span></P><P>Once you click "Import," a new window will pop up. Follow the instructions on the screen to choose the file containing the task or project you want to import.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Harshavardhan_N_2-1707396072980.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/62762iC681819C8187675B/image-size/medium?v=v2&amp;px=400" role="button" title="Harshavardhan_N_2-1707396072980.png" alt="Harshavardhan_N_2-1707396072980.png" /></span></P><P>Browse to the location where you saved the exported task or project.</P><UL><LI>If you exported a single task, the file has a .xml extension.</LI><LI>If you exported a project, the file has a .zip extension.</LI></UL><P>Additionally, provide the task name, source datastore name, and target datastore name. If the datastore belongs to a File Format Group, check the box located on the right side of the datastore details.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Harshavardhan_N_3-1707396072981.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/62766i0BA58F5BE8986887/image-size/medium?v=v2&amp;px=400" role="button" title="Harshavardhan_N_3-1707396072981.png" alt="Harshavardhan_N_3-1707396072981.png" /></span></P><P>Click OK.</P><P>The task will be imported successfully. Now you can change the target or source by editing the dataflow. You can modify the transformations, filters, and mappings as required.</P> 2024-02-13T09:39:40.899000+01:00 https://community.sap.com/t5/technology-blogs-by-members/sap-bi-4-x-installation-quot-error-quot-an-install-is-already-running-at/ba-p/13603909 SAP BI 4.x Installation - "Error" An install is already running at this location on Linux 2024-02-14T09:34:56.200000+01:00 Abhishek_Sinha https://community.sap.com/t5/user/viewprofilepage/user-id/172574 <P>Recently I was tasked with installing SAP Data Services on Linux, and while doing so I faced an issue where the installer failed with the error - <STRONG>"Install already in progress"</STRONG>.</P><P><U><STRONG>Environment:</STRONG></U></P><P style=" padding-left : 30px; "><STRONG>Application:</STRONG> Business Objects Data Services</P><P style=" padding-left : 30px; "><STRONG>OS:</STRONG> Linux</P><P><U><STRONG>Steps to reproduce the error</STRONG>:</U></P><OL><LI>Log on as a non-root user.</LI><LI>Set the environment variables.</LI><LI>Run <STRONG>./setup.sh</STRONG> of the BI installer (IPS in this case).</LI><LI>Enter the install directory.</LI><LI>The installer fails with the error message - <STRONG>"Install already in progress"</STRONG>.</LI></OL><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="BODS_Issue.png" style="width: 508px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/64688i02F5B0C1AFAB9A64/image-size/large?v=v2&amp;px=999" role="button" title="BODS_Issue.png" alt="BODS_Issue.png" /></span></P><P>I tried the below steps to resolve it, without success:</P><P style=" padding-left : 30px; ">1. Deleting and recreating the installation folder</P><P style=" padding-left : 30px; ">2. Rebooting the server</P><P>I found a few notes and blogs related to this error that pointed towards deleting all <STRONG>".mutex"</STRONG> files in the <STRONG>/tmp</STRONG> folder.&nbsp;<SPAN>While trying to follow one of the SAP notes, <A href="https://me.sap.com/notes/0002790905" target="_self" rel="noopener noreferrer">2790905</A>, I found no <STRONG>".mutex"&nbsp;</STRONG>file in the <STRONG>/tmp</STRONG>&nbsp;folder, as no installer was running in parallel and no failed installation had been performed by the root user.</SPAN></P><P><SPAN>The logs were not helpful at this point, as they kept pointing to an installer running in parallel or a failed installer as the cause.&nbsp;</SPAN></P><P><SPAN>To gather more helpful logs I ran a trace as follows:</SPAN></P><pre class="lia-code-sample language-abap"><code>strace -f -o install.out ./setup.sh (followed by your normal patch install command arguments)</code></pre><P style=" padding-left : 30px; ">(Note: install.out needs to be written to a location this user has permission to write to.
So you might need to run it like: strace -f -o /tmp/install.out, for example.)</P><P>Once the installer fails, check the install.out file for "mutex" by running the below command:</P><pre class="lia-code-sample language-abap"><code>grep mutex install.out</code></pre><P>In my case, the output showed that the user was unable to create the "mutex" file due to a lack of permissions; the installer assumed that the presence of another such file was preventing this, hence the error message read - "Install already in progress".</P><P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="mutex.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/64709iEDA4B354E43AC6B6/image-size/large?v=v2&amp;px=999" role="button" title="mutex.png" alt="mutex.png" /></span></P><P><SPAN>After setting the correct permissions I ran the installer again, and this time it moved past the error, resulting in a successful installation.</SPAN></P><P><SPAN><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="IPS.png" style="width: 852px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/64712i5F1A2CA12A816D1E/image-size/large?v=v2&amp;px=999" role="button" title="IPS.png" alt="IPS.png" /></span></SPAN></P><P><STRONG><SPAN>Conclusion:</SPAN></STRONG></P><P style=" padding-left : 30px; "><SPAN>If anyone comes across this issue and is unable to find any ".mutex" file to delete in the /tmp folder, check the write permissions of the non-root user being used to perform the installation. The strace command can also be run to ascertain the root cause.</SPAN></P><P><SPAN>Refer to the note:&nbsp;<A href="https://me.sap.com/notes/0002790905" target="_self" rel="noopener noreferrer"><SPAN>2790905 - An install is already running at this location on Linux in BI 4.x while performing update</SPAN></A></SPAN></P> 2024-02-14T09:34:56.200000+01:00 https://community.sap.com/t5/technology-blogs-by-members/performance-tuning-in-sap-data-service/ba-p/13608857 Performance tuning in SAP Data Service 2024-02-19T20:07:03.690000+01:00 pallab_haldar https://community.sap.com/t5/user/viewprofilepage/user-id/594699 <P>Today I will discuss performance tuning in SAP Data Services.</P><P><STRONG>Performance tuning in SAP Data Services can be divided into two parts :</STRONG></P><P><STRONG>Configuration/Settings Changes :</STRONG></P><UL><LI><STRONG>1.</STRONG> Set the Degree of Parallelism (DOP) option in the data flow to a value greater than one (typically 3-4). Use Data Integrator features like table partitioning and file multithreading together with DOP for data flows to get better performance.</LI></UL><P><STRONG>2.</STRONG> In the source database, increasing the size of database I/O and the size of the shared buffer to cache more data will help <STRONG>SELECT</STRONG> statements perform quickly.</P><P><STRONG>3.</STRONG> In the target database, disable archive logging and disable redo logging for all tables. This will help <STRONG>INSERT</STRONG> and <STRONG>UPDATE</STRONG> statements perform quickly.</P><P><STRONG>4.</STRONG> We need to increase the monitor sample rate to 40K or more, as per the standard recommendation, to get better performance.</P>
<P><STRONG>5.</STRONG> To avoid performance degradation, we need to exclude the Data Services logs from the virus scan if a virus scan is configured on the job server.</P><P><STRONG>6.</STRONG> For the first execution, select the option COLLECT STATISTICS FOR OPTIMIZATION. From the second execution onwards, use the collected statistics, which is selected by default.</P><P><STRONG>7.</STRONG> Design a job in such a way that it runs one <STRONG>'al_engine'</STRONG> process per CPU at a time.</P><P><STRONG>8.</STRONG> Based on the scenario (for large data volumes), increase the Array Fetch Size value to avoid going back to the database repeatedly to fetch the data. Set it to more than 800 (a conceptual sketch of the effect follows the screenshot below).</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_3-1708369509382.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/67867iD584B3A26EDB5820/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_3-1708369509382.png" alt="pallab_haldar_3-1708369509382.png" /></span></P>
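<P>The effect of Array Fetch Size can be pictured with plain database-cursor batching: fetching rows in blocks instead of one by one cuts the number of round trips to the database. Below is a minimal, generic sketch; the connection and table are placeholders, and Data Services handles this internally through the datastore setting.</P>
<pre class="lia-code-sample language-python"><code># Conceptual illustration of Array Fetch Size: pull rows in blocks of
# 1000 instead of one network round trip per row. "conn" stands for any
# DB-API connection; the table name is a placeholder.
def read_in_batches(conn, batch_size=1000):
    cur = conn.cursor()
    cur.execute("SELECT id, name FROM source_table")
    while True:
        rows = cur.fetchmany(batch_size)   # one round trip per batch
        if not rows:
            break
        for row in rows:
            yield row
</code></pre>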
<P><STRONG>Design/Development Changes :</STRONG></P><P><STRONG>1. Push down all the transformation logic implemented in SQL</STRONG> to the database layer to leverage the power of the database engine. To check the optimized code pushed to the database engine, see below -</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_0-1708367188988.png" style="width: 475px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/67851i491FF601A0AF2BAD/image-dimensions/475x112?v=v2" width="475" height="112" role="button" title="pallab_haldar_0-1708367188988.png" alt="pallab_haldar_0-1708367188988.png" /></span></P><P>But there are a few restrictions and clauses for the SQL pushdown operation.</P><UL><LI>When pushdown is not possible in a data flow, enable the Bulk Loader on the target table. The Bulk Loader is much faster than a direct load.</LI></UL><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_1-1708367359361.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/67854i0B97F0FBA010FBFD/image-size/medium?v=v2&amp;px=400" role="button" title="pallab_haldar_1-1708367359361.png" alt="pallab_haldar_1-1708367359361.png" /></span></P><UL><LI>If there is a "<STRONG>SELECT DISTINCT</STRONG>", the code will not be pushed down fully. If it cannot be avoided, use SELECT DISTINCT in the last query transform before the target table.</LI><LI>Use a single Query transform when you want to use GROUP BY and ORDER BY.</LI><LI>Try to avoid data type conversions, which prevent full pushdown.</LI><LI>Always try to avoid parallel execution of Query transforms, which prevents full pushdown.</LI></UL><P><STRONG>2. Join Rank :</STRONG> Define the ranks properly to make execution faster. Open the Query Editor and assign higher ranks to the larger tables; the highest-ranked table will act as the driving table of the join. The monitor log file allows you to see the order in which the Data Services Optimizer performs the joins, which will help you identify the performance improvement. To add the trace, go to the <STRONG>Execution Properties</STRONG> dialog --&gt; <STRONG>Trace</STRONG> tab --&gt; <STRONG>Optimized Data Flow</STRONG>.</P><P><STRONG>3.</STRONG> Set the <STRONG>"Rows per Commit"</STRONG> value between 500 and 2000 (the default is 1000); a batching sketch appears at the end of this list.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pallab_haldar_2-1708368449686.png" style="width: 521px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/67855i287D90E0BFFDA575/image-dimensions/521x249?v=v2" width="521" height="249" role="button" title="pallab_haldar_2-1708368449686.png" alt="pallab_haldar_2-1708368449686.png" /></span></P><P><STRONG>4.</STRONG> Try to split the complex logic of a single data flow into multiple data flows; this is easier to maintain and helps SQL pushdown.</P><P><STRONG>5.</STRONG> Create indexes on the columns used in the WHERE clause as join conditions. This is a major point and can improve performance drastically.</P><P><STRONG>6.</STRONG> Full pushdown is not always possible. In that scenario, if the dataset is large, enable the Bulk Loader on the target table.</P><P><STRONG>7.</STRONG> Use a join instead of the Lookup function if possible.</P>
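<P>The write-side counterpart of point 3 can be sketched the same way: commit once per batch of rows rather than once per row. This is only a generic illustration of the "Rows per Commit" idea; "conn" is any DB-API connection, and the target table and parameter style are placeholders.</P>
<pre class="lia-code-sample language-python"><code># Generic illustration of "Rows per Commit": insert and commit in
# batches (default 1000) instead of row by row. "conn" is any DB-API
# connection; the target table and column count are placeholders.
def load_in_batches(conn, rows, rows_per_commit=1000):
    cur = conn.cursor()
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == rows_per_commit:
            cur.executemany("INSERT INTO target_table VALUES (?, ?)", batch)
            conn.commit()                  # one commit per batch
            batch = []
    if batch:                              # flush the remainder
        cur.executemany("INSERT INTO target_table VALUES (?, ?)", batch)
        conn.commit()
</code></pre>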
2024-02-19T20:07:03.690000+01:00 https://community.sap.com/t5/technology-blogs-by-sap/data-quality-audit/ba-p/13613355 Data Quality Audit 2024-02-21T12:46:34.652000+01:00 amitharayil https://community.sap.com/t5/user/viewprofilepage/user-id/160304 <P><FONT face="helvetica" size="2">One of the <STRONG>indispensable, essential, and dominant parts of our lives is our smartphone</STRONG>. You can almost say there is nothing a smartphone cannot do nowadays. However, <STRONG>technological advancements do not stop there</STRONG>; continuous innovations keep being introduced into our surroundings.</FONT></P><P><FONT face="helvetica" size="2"><STRONG>Over 20 years</STRONG>, we saw the first mobile phones with antennas <STRONG>evolve into faster, lighter, graphical, and more intuitive devices</STRONG>. This is a splendid <STRONG>example of technological transformation</STRONG>.</FONT></P><P><FONT face="helvetica" size="2">We started off with just voice communication over analog signals, which transformed into so much more. We can now access the <STRONG>world’s news and information at our fingertips</STRONG>, <STRONG>communicate</STRONG> and <STRONG>network</STRONG> within our circle - even with strangers based on our geographic location - <STRONG>shop</STRONG>, <STRONG>trade</STRONG>, <STRONG>manage finances</STRONG>, and organize our lives globally from wherever we may be.</FONT></P><P><FONT face="helvetica" size="2"><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="amitharayil_0-1708442681695.png" style="width: 741px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/68401i0C90B2C259070682/image-dimensions/741x278?v=v2" width="741" height="278" role="button" title="amitharayil_0-1708442681695.png" alt="amitharayil_0-1708442681695.png" /></span></FONT></P><P><FONT face="helvetica" size="2">As mobiles get smarter and smarter, think about the <STRONG>data</STRONG> that has been transferred from one device to another. And these transfers came with several implications and adjustments to the data over the years. For instance,</FONT></P><UL><LI><FONT face="helvetica" size="2"><STRONG>Compliance wise</STRONG> – with the onset of international calling, phone numbers were suddenly required to be stored in international format.</FONT></LI><LI><FONT face="helvetica" size="2"><STRONG>Competition wise</STRONG> – amongst varied brands, more features kept pouring in, in terms of speed, additional fields, and memory to store more contacts and addresses.</FONT></LI><LI><FONT face="helvetica" size="2">Looking at <STRONG>technology trends</STRONG> – devices now have the power of internet telephony, connecting themselves to self-driving cars and comprehensive mobile applications to become a pocket computer.</FONT></LI></UL><P><FONT face="helvetica" size="2">Data quality is the least of our concerns, as we are focused on the <STRONG>outcome</STRONG> of moving to the new device and getting started, by just ensuring that we retain the data in the new device in some form. The advanced intelligent machines assume implicitly that the data transferred to them is suitable for their intended purposes. We all know that transfers across platforms and vendors are limited by data models and technical limitations.</FONT></P><P><FONT face="helvetica" size="2">Consider this scenario: during the transfer of data between different device platforms, envision a situation where contact numbers or addresses are jumbled. You may have been consuming this data over the years across devices manually, knowing the limitations and issues in the data and deciding accordingly, <EM>e.g., the Mobile phone and Office phone fields have their data interchanged during transfer, and the postcode is missed in the addresses during a transfer from the Apple to the Google platform</EM>. Now, imagine relying on this data on your mobile device with a self-driving car to navigate you to your destination, e.g., the AI considering only the street name to navigate and deciding between multiple options!</FONT></P><P><FONT face="helvetica" size="2">How can you <STRONG>trust the data</STRONG> that has been <STRONG>moved from your old devices into the newest device via multiple devices and platforms</STRONG>?</FONT></P><P><FONT face="helvetica" size="2">Would you be able to <STRONG>use the full capability of the new AI self-driving cars</STRONG>?</FONT></P><P><FONT face="helvetica" size="2">Is the data that we have been maintaining for decades <STRONG>fit to be used in these innovations without human intervention</STRONG>?</FONT></P><P><FONT face="helvetica" size="2">Or more simply, is the <STRONG>data fit for the new intended purpose</STRONG>?</FONT></P><P><FONT face="helvetica" size="2"><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="amitharayil_1-1708442703186.png" style="width: 743px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/68404i2285C3E599EF4F08/image-dimensions/743x339?v=v2" width="743" height="339" role="button" title="amitharayil_1-1708442703186.png" alt="amitharayil_1-1708442703186.png" /></span></FONT></P><P><FONT face="helvetica" size="2">Now coming back to the corporate world, consider the business point of view of customers who have moved their enterprise data assets across multiple platforms over the last few decades to SAP ECC or any other ERP system. Most of these customers are now in the process of transforming their technical landscape and business processes to get ready for the future, i.e., the cloud!
Whether these organizations are moving to a private or public cloud, RISE with SAP or GROW with SAP, going greenfield or brownfield, the transition mirrors the evolution we discussed earlier with mobile phones; but in this case, businesses are shifting from their familiar ERP ECC systems to the futuristic landscape of SAP S/4HANA. It involves migrating from mixed landscapes, including SAP, non-SAP, and legacy systems, onto the innovative <STRONG>SAP S/4HANA platform</STRONG>. The <STRONG>AI and Generative AI</STRONG> capabilities embedded in SAP S/4HANA mimic the <STRONG>driverless cars </STRONG>we mentioned before. So, we find ourselves asking the same questions once again:</FONT></P><P><FONT face="helvetica" size="2"><STRONG>Does the data make sense for the S/4HANA system? Can we take advantage of the GenAI capabilities?</STRONG></FONT></P><P><FONT face="helvetica" size="2"><STRONG>Can we rely on and trust the data for the system to predict your business outcomes or take business decisions?</STRONG></FONT></P><P><FONT face="helvetica" size="2"><STRONG>In the end, is the data fit for the intended purpose?</STRONG></FONT></P><P><FONT face="helvetica" size="2"><STRONG>Data quality (DQ) is always a key focus and concern of the business at the beginning of the transformation; however, as the program evolves, the DQ topic fades as the business priority moves to ensuring that processes are designed and tested. </STRONG>The challenge is that no one could really quantify and establish the problem to define a business case to invest in data cleansing to save costs for the organization. By the time business users realize the scope of the issues through migration, <STRONG>it is too late to fix this problem</STRONG>, and they are already testing the processes. If in the end they are successful in the migration, they still end up with a <STRONG>technically correct but not “business correct” data set</STRONG>.</FONT></P><P><FONT face="helvetica" size="2">This helps us see that businesses, like individuals with mobile phones, need to make sure their data is ready for the changes and innovations in their systems.</FONT></P><P><FONT face="helvetica" size="4" color="#3366FF"><STRONG>How do we ensure data is fit for purpose?</STRONG></FONT></P><P><FONT face="helvetica" size="2">Let us address this topic again with the same mobile phone analogy. Consider this scenario: you are moving from an old Nokia phone to an iPhone. The former mobile phone saved contacts separately and did not have any tagging features to link the landline and mobile numbers of the same person together. However, the new iPhone has the feature to add multiple numbers with tags to a single contact. Now, when you are moving to the new phone, the contacts can quite literally become redundant in your new device and cause confusion. <STRONG>How would you be able to tell apart which number is a landline and which is a mobile?</STRONG></FONT></P><P><FONT face="helvetica" size="2">Similarly, what if some mobile numbers were <STRONG>transferred incorrectly</STRONG> and led to <STRONG>missing information</STRONG> in your contacts?
Or what if, during transfers and the conversion of formats to comply with international codes, <STRONG>incorrect patterns</STRONG> were transferred or the numbers were truncated?</FONT></P><P><FONT face="helvetica" size="2"><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="amitharayil_2-1708442703201.png" style="width: 749px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/68403iEF12553534FFCE79/image-dimensions/749x277?v=v2" width="749" height="277" role="button" title="amitharayil_2-1708442703201.png" alt="amitharayil_2-1708442703201.png" /></span></FONT></P><P><FONT face="helvetica" size="2">This would affect <STRONG>your new device’s ability to function as expected and your ability to utilise innovative features</STRONG> on your contacts.</FONT></P><P><FONT face="helvetica" size="2"><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="amitharayil_3-1708442703207.png" style="width: 746px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/68402i5C9599FB57C072EF/image-dimensions/746x360?v=v2" width="746" height="360" role="button" title="amitharayil_3-1708442703207.png" alt="amitharayil_3-1708442703207.png" /></span></FONT></P><P><FONT face="helvetica" size="2">Many such examples can come up in various parts of the transfer journey between the devices, especially when you are <STRONG>moving between technically diverse landscapes</STRONG>. How do we ensure we <STRONG>do not move junk or bad data into our systems</STRONG>? Or, if we <STRONG>have already moved without auditing the data first</STRONG>, how can we <STRONG>identify inferior-quality data</STRONG> in our current landscapes?</FONT></P><P><FONT face="helvetica" size="2">The first step to check if data is fit for purpose is to <STRONG>understand the depth and extent of your data issues</STRONG>. To <STRONG>understand</STRONG> your data issues, you need to <STRONG>audit the data</STRONG>. The data may appear clean, but the question is whether it’s fit for the intended purpose.</FONT></P><P><FONT face="helvetica" size="2"><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="amitharayil_4-1708442703211.png" style="width: 752px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/68406i6BE5E47D31DD8152/image-dimensions/752x109?v=v2" width="752" height="109" role="button" title="amitharayil_4-1708442703211.png" alt="amitharayil_4-1708442703211.png" /></span></FONT></P><P><FONT face="helvetica" size="2">In the business world, the information spread across your systems is undeniably your most valuable asset. Achieving 100% data quality is a challenging task for any organization, but depending on how crucial each aspect is to the business, you can prioritize and make it feasible over time. For example, if there are 100 fields or attributes that require attention, and only 70% of them are essential for your business operations, you can identify key areas to concentrate on and begin planning for improvement. So, where do we begin?</FONT></P><P><FONT color="#3366FF"><STRONG><FONT face="helvetica" size="4">What is Data Quality Audit?</FONT></STRONG></FONT></P><P><FONT face="helvetica" size="2">Data Quality Audit, a service offered by SAP, assesses and checks the actual data stored in systems to uncover existing issues and determine fitness for target systems like S/4HANA, C4C, ARIBA, MDG, etc.
Irrespective of the number of systems you have or the stage you are at in your transformation journey, Data Quality Audit checks the ‘quality’ of the data against ideal principles set by your business and generates a ‘scorecard’ along with a business-impact ‘cost analysis’. This allows organizations to assess the status of their data and then helps them formulate their cleansing, resource, and migration plans accordingly.</FONT></P><P><FONT face="helvetica" size="2">Let us briefly look at the steps that this service follows to achieve these results:</FONT></P><P><FONT face="helvetica" size="2"><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="amitharayil_5-1708442703219.png" style="width: 758px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/68407iFB4962C60C63248F/image-dimensions/758x192?v=v2" width="758" height="192" role="button" title="amitharayil_5-1708442703219.png" alt="amitharayil_5-1708442703219.png" /></span></FONT></P><P><FONT face="helvetica" size="2">Once scoping is established, data quality ‘rules’ are defined on the data with business or IT. The ‘rules’ originate from the ideal principles of data quality: Accuracy, Completeness, Conformity, Consistency, Integrity, Timeliness, Uniqueness. SAP offers best-practice content of data quality rules as a QuickStart for any quality audit.</FONT></P><P><FONT face="helvetica" size="2">These ‘rules’ are then executed on the data, and ‘scorecards’ are presented in a dashboard-like view. The scorecards also depict the associated cost impact of the poor data quality. This helps enterprises assess their data and constantly review it during the cleansing activities, supporting control and governance.</FONT></P><P><FONT face="helvetica" size="2"><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="amitharayil_6-1708442703220.png" style="width: 341px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/68405i032C52909D0955F6/image-dimensions/341x179?v=v2" width="341" height="179" role="button" title="amitharayil_6-1708442703220.png" alt="amitharayil_6-1708442703220.png" /></span></FONT></P>
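<P>To make the ‘rules’ and ‘scorecards’ tangible, here is a minimal sketch of a completeness/conformity check that produces a score. The records and the rule (phone numbers must be in international format, echoing the mobile-phone analogy above) are invented for illustration and are far simpler than the SAP best-practice rule content.</P>
<pre class="lia-code-sample language-python"><code># Minimal illustration of a data-quality rule and scorecard: the records
# and the rule (phone numbers must be in international format) are
# invented for this example.
import re

records = [
    {"name": "A. Sharma",  "phone": "+49 170 1234567"},
    {"name": "B. Singh",   "phone": "0170 1234567"},    # fails conformity
    {"name": "C. Mueller", "phone": None},              # fails completeness
]

def conforms(phone):
    return phone is not None and re.match(r"^\+\d", phone) is not None

passed = sum(1 for r in records if conforms(r["phone"]))
score = 100 * passed / len(records)
print(f"phone-number rule score: {score:.0f}%")   # 33% for this sample
</code></pre>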
<P><FONT face="helvetica" size="4"><STRONG><FONT color="#3366FF">To conclude,</FONT></STRONG></FONT></P><P><FONT face="helvetica" size="2">Data Quality Audit is a service that is pivotal in the initial 'Get Clean' phase of your transformation. Unfortunately, these aspects are often underestimated, leading to the failure or delay of numerous transformations. Taking these steps seriously is crucial, as neglecting them can jeopardize the success of your large-scale investment, impede growth, and hinder the identification of business issues.</FONT></P><P><FONT face="helvetica" size="2">Consider these three key takeaways:</FONT></P><UL><LI><FONT face="helvetica" size="2">Recognize the vital role of Reference Data Quality Audit and Data Quality Audit in achieving a clean transformation. <FONT color="#339966"><EM>Look out for a similar blog on <STRONG>Reference Data Quality Audit</STRONG> - stay tuned!</EM></FONT></FONT></LI><LI><FONT face="helvetica" size="2">Be mindful of the commonly underestimated nature of these steps, as it can significantly impact the success of your initiative.</FONT></LI><LI><FONT face="helvetica" size="2">Formulate robust 'Stay Clean' and ‘Get Clean’ plans based on the insights gained, avoiding pitfalls that have derailed many transformations.</FONT></LI></UL> 2024-02-21T12:46:34.652000+01:00 https://community.sap.com/t5/technology-blogs-by-members/farewell-to-an-era-the-end-of-an-iconic-certification-of-sap-bods-c-ds-42/ba-p/13654158 Farewell to an Era: The End of an Iconic Certification of SAP BODS - C_DS_42 2024-03-31T19:16:30.842000+02:00 venkateshgolla https://community.sap.com/t5/user/viewprofilepage/user-id/226126 <P>Hello All,</P><P>Continuing from the previous blogs on SAP DS 4.2 and 4.3:</P><OL><LI><A class="" title="End of ERA - SAP BODS 4.2" href="https://community.sap.com/t5/technology-blogs-by-members/end-of-era-sap-bods-4-2/ba-p/13562152" target="_self">End of ERA - SAP BODS 4.2</A></LI><LI><A href="https://blogs.sap.com/2021/10/06/sap-bods-data-services-road-map-ds-4.3/" target="_blank" rel="noopener noreferrer">SAP BODS / Data services road map DS 4.3 | SAP Blogs</A></LI><LI><A href="https://blogs.sap.com/2022/05/14/sap-data-services-bods-4.3-release-installation-new-features-of-bods-4.3/" target="_blank" rel="noopener noreferrer">SAP Data services (BODS) 4.3 release / Installation / new features of BODS 4.3 | SAP Blogs</A></LI></OL><P>Today, we're diving into a topic near and dear to the hearts of many in the data services realm: the legendary certification, C_DS_42, for BODS 4.2. For over a decade, this certification has been the holy grail for data services consultants, a badge of honor symbolizing expertise and proficiency in SAP's Data Services platform.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="venkateshgolla_0-1711905285942.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/88896i25AFD8410A3FED61/image-size/large?v=v2&amp;px=999" role="button" title="venkateshgolla_0-1711905285942.png" alt="venkateshgolla_0-1711905285942.png" /></span></P><P><SPAN>However, as with all good things, its time has come to an end. On the 28th of March 2024, the C_DS_42 certification officially expired, marking the end of an era for many professionals who have cherished and pursued it.</SPAN></P><P>What makes this transition even more surprising is the fact that its successor, C_DS_43, the certification for the latest version of Data Services, has also met the same fate, entering the realm of expired certifications.</P><P>For those who have dedicated countless hours to mastering BODS 4.2 and preparing for the C_DS_42 exam, this news may come as a bittersweet moment. It's a reminder of the ever-evolving nature of technology and the necessity of staying current in a field where innovation waits for no one.</P><P>But fear not, for every ending marks a new beginning. While bidding farewell to C_DS_42 may evoke a sense of nostalgia, it also presents an opportunity for renewal and growth. As the industry moves forward, so must we.</P><P>So what's next? For those who held C_DS_42 with pride, it's time to embark on a new journey.
Whether it's upgrading to the latest version of Data Services or exploring other avenues within the data management landscape, there are plenty of opportunities to expand our horizons and continue making a meaningful impact.</P><P>As we reflect on the legacy of C_DS_42, let's remember the skills and knowledge it has imparted upon us. Let's honor the hard work and dedication that went into achieving this certification, and let's carry that same passion into the next chapter of our professional lives.</P><P>Farewell, C_DS_42. You may be expired, but your legacy lives on in the countless professionals you've inspired along the way.</P><P>Here's to new beginnings and the endless possibilities that lie ahead.</P><P>Thanks,</P><P>Venkatesh Golla</P> 2024-03-31T19:16:30.842000+02:00 https://community.sap.com/t5/enterprise-resource-planning-blogs-by-members/futuristic-aerospace-or-defense-btp-data-mesh-layer-using-collibra-next/ba-p/13666113 Futuristic Aerospace or Defense BTP Data Mesh Layer using Collibra, Next Labs ABAC/DAM, IAG and GRC 2024-04-11T19:00:57.747000+02:00 STALANKI https://community.sap.com/t5/user/viewprofilepage/user-id/13911 <H1 id="toc-hId-862635520">Background</H1><P>In this blog, we will explore a few ideas for creating a domain-centric data mesh using SAP components, without engaging in the debate of whether data fabric or data mesh is the right approach.</P><P>We will focus on a futuristic use case within the aerospace or defense industry, which demands stringent data governance due to compliance requirements such as ITAR, EAR, BAFA, DOE 810, NERC/CIP, and SEC. Additionally, safeguarding intellectual property is a critical concern, as business growth often relies on increased collaboration, both internally and externally, spanning product and engineering, supply chain, cross-industry partnerships, and joint ventures. I am happy to hear your ideas too!</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="aero.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/95260iED73CA1C7AC54354/image-size/large?v=v2&amp;px=999" role="button" title="aero.png" alt="aero.png" /></span></P><H1 id="toc-hId-666122015">Business Problem</H1><P>Aerospace and defense customers often face challenges in keeping up with rapid technological advancements due to the burden of data debts and the need to comply with stringent legal and regulatory data security requirements.</P><P>This challenge is further amplified when dealing with legacy ERP systems, as identifying and restricting sensitive data becomes complex, hindering end-to-end data lifecycle management.&nbsp;Moreover, the aerospace and defense industry grapples with the formidable challenge of reducing IT costs, as the presence of significant data risks renders offshore operations an impractical option.</P><P>Let's consider the example of a product manager based in the United States, working for a US corporation. Their product is subject to ITAR regulations but has both government and commercial applications.</P><P>In order to comply with the business rule, access to ITAR data in SAP should only be granted to US persons while they are in US locations.
However, when this product manager is on a business trip to Singapore, meeting with suppliers at their APAC regional headquarters, exposing material data, CAD drawings, or BOMs stored in SAP would violate ITAR regulations.</P><P>In the context of a UK energy company establishing a joint venture with a local company in China to cater to the emerging market, an added layer of data security is required based on location. This ensures that access to BOM items and intellectual property not related to the joint venture is restricted, safeguarding sensitive information and preserving the integrity of the collaboration.</P><P><FONT face="georgia,palatino" color="#3366FF"><EM><STRONG>In the context of aerospace and defense, "Data is not only the ammunition that fuels engines, but also the armor that protects international and national peace."</STRONG></EM></FONT></P><H1 id="toc-hId-469608510">BTP SAP S/4 HANA Data Mesh Pattern</H1><P>To solve the problem described above, we will hypothetically integrate Collibra, NextLabs ABAC/DAM, IAG, GRC, and SAP S/4HANA to provide data insights to users without compromising data security requirements. Please note that this is a hypothetical pattern, and we have to review and apply it according to client-specific data security requirements.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-left" image-alt="BTP Data Mesh Architecture.jpg" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/95266i6D71E029D628C0A9/image-size/large?v=v2&amp;px=999" role="button" title="BTP Data Mesh Architecture.jpg" alt="BTP Data Mesh Architecture.jpg" /></span></P><TABLE border="1" width="100%"><TBODY><TR><TD width="50%" height="30px"><STRONG><FONT color="#3366FF">Solution Components</FONT></STRONG></TD><TD width="50%" height="30px"><STRONG><FONT color="#3366FF">Usage</FONT></STRONG></TD></TR><TR><TD width="50%" height="139px">Collibra</TD><TD width="50%" height="139px">This component can be used to document and define data governance policies, the metadata catalogue, end-to-end data lineage, data quality KPIs, and data protection and data privacy rules for bills of materials.</TD></TR><TR><TD height="116px">SAP IAG</TD><TD height="116px">SAP IAG can handle the user access request and provisioning workflows to authenticate and authorize the user identity with the client's identity provider (e.g., Active Directory).</TD></TR><TR><TD height="139px">SAP GRC</TD><TD height="139px">SAP GRC can provide the necessary controls and policies for access management and perform risk analysis and segregation of duties (SoD) checks, ensuring compliance with regulatory requirements across all SAP applications in the landscape.</TD></TR><TR><TD height="850px">NextLabs ABAC/DAM</TD><TD height="850px"><P>NextLabs ABAC/DAM provides robust protection against unauthorized access to sensitive SAP data by implementing fine-grained access controls.</P><P>These controls can be applied at the level of individual data attributes or data ranges, enabling customers to safeguard their data while meeting compliance requirements.
By examining the attributes of the data being accessed, the context of the request, and the user's identity, Next Labs ABAC/DAM allows organizations to control access to data, business transactions, and batch processes based on defined policies.</P><P>With SAP DAM, any changes in the attributes of the data or the user are dynamically considered, and the relevant policies are applied in real time to enforce fine-grained access controls across various business functions. For example, a rule may specify that only US-based employees can access ITAR-classified materials from US locations. When a user attempts to access such materials, this rule is validated in real time, ensuring that access is granted only to authorized individuals who meet the specified criteria.</P><P>Through the integration of Next Labs ABAC/DAM with SAP systems, organizations can effectively protect their sensitive data, maintain compliance, and enforce granular access controls across a wide range of business operations.</P></TD></TR><TR><TD height="50px">SAP Datasphere</TD><TD height="50px"><P>This is optional, but it can be used to provide flexible predictive analytics to users.</P></TD></TR><TR><TD>SAP BTP AI Launch Pad</TD><TD><P>This is optional and can be used to identify repeat breach patterns, detect security data anomalies in advance, and add further access controls.</P></TD></TR></TBODY></TABLE><H1 id="toc-hId-273095005">MVP - Bill of Material Data Mesh</H1><P>In the world of aerospace and defense, organizations face the challenge of managing bill of material (BOM) data across multiple systems. By integrating diverse systems such as SAP S/4HANA, Team Center, and Siemens, they can create a unified, interconnected data mesh. This enables seamless collaboration among internal and external engineering, supply chain, and product sales teams. With real-time visibility into BOMs, teams can make informed decisions, optimize designs, synchronize manufacturing, and tailor offerings. The BOM data mesh can empower organizations to achieve faster product development cycles, reduced costs, and improved customer satisfaction.</P><P><STRONG>Step 1:</STRONG> Define Role Requirements, the Metadata Catalogue, Data Governance and Access Policies for BOM in Collibra</P><UL><LI>Identify the specific role requirements based on your organization's needs and compliance regulations.</LI><LI>Determine the attributes that will be used for access control, such as user roles, data sensitivity, and contextual factors.</LI></UL>
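<P>To make Step 1 more concrete, here is a minimal, hypothetical sketch (in Python) of the kind of catalogue entry a governance team might record for the BOM data product. Every key and value below is an illustrative assumption; this is not Collibra's API or schema.</P><PRE># Hypothetical governance catalogue entry for the BOM data product.
# All keys and values are illustrative assumptions, NOT Collibra's schema.
bom_catalogue_entry = {
    "data_product": "Bill of Material",
    "source_systems": ["SAP S/4HANA", "Team Center", "Siemens"],
    "access_attributes": ["User Role", "Data Sensitivity Level", "Contextual Factors"],
    "sensitivity_levels": ["Public", "Confidential", "ITAR"],
    "compliance_frameworks": ["ITAR", "EAR"],
    "data_owner": "Engineering Data Governance",
}</PRE>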
<P><STRONG>Step 2:</STRONG> Understand BOM Creation and Editing Requirements</P><UL><LI>The organization requires that only users with specific engineering roles can create and edit BOMs. Additionally, certain fields in the BOM may be restricted for external supply chain users based on data sensitivity, such as pricing information.</LI><LI>Determine the functional access (the actions a user can perform), the data access (the data records or fields a user can see), and the governance (the rules for access).</LI></UL><P><STRONG>Step 3:</STRONG> Define the SAP S/4HANA Role</P><UL><LI>Create a custom role named "BOM Specialist", or copy a standard role, in SAP S/4HANA.</LI><LI>Assign the authorization object M_BOM_GRP to the role, which controls access to BOM groups.</LI><LI>Assign transaction codes CS01 (Create BOM) and CS02 (Change BOM) to allow users with this role to perform BOM creation and editing tasks.</LI></UL><P><STRONG>Step 4: Configure Next Labs ABAC/DAM</STRONG></P><P>Next Labs ABAC/DAM works natively with SAP and manages authorization logic through an externalized, standards-based policy framework. For instance, a rule may state, “Allow only US-based employees to access ITAR-classified materials from US locations.” When a user attempts to access materials, this rule is validated in real time before access is granted.</P><UL><LI>Configure Next Labs ABAC with attributes such as "User Role," "Data Sensitivity Level," and "Contextual Factors."</LI><LI>Define "User Role" as an attribute to determine the user's role in the organization.</LI><LI>Define "Data Sensitivity Level" as an attribute to classify BOMs based on their sensitivity, such as "Confidential" or "Public."</LI><LI>Define "Contextual Factors" as attributes to consider additional factors, such as project or department.</LI></UL>
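<P>As a thought experiment, the rule above can be reduced to a tiny attribute-based check. The sketch below is plain Python with invented attribute names (user_role, citizenship, location, data_sensitivity); it does not use Next Labs ABAC/DAM policy syntax or APIs, it only illustrates how a decision is derived from attributes.</P><PRE>from dataclasses import dataclass

# Hypothetical access request; the attribute names are illustrative
# assumptions, NOT Next Labs ABAC/DAM policy syntax.
@dataclass
class AccessRequest:
    user_role: str         # e.g. "BOM Specialist"
    citizenship: str       # e.g. "US"
    location: str          # country code the request originates from
    data_sensitivity: str  # e.g. "ITAR", "Confidential", "Public"

def is_access_allowed(req: AccessRequest) -> bool:
    """Evaluate the example rule: ITAR data only for US persons in US locations."""
    if req.data_sensitivity == "ITAR":
        return req.citizenship == "US" and req.location == "US"
    # Non-ITAR data only requires the engineering role from Step 3.
    return req.user_role == "BOM Specialist"

# The product manager from the business problem, travelling in Singapore:
print(is_access_allowed(AccessRequest("BOM Specialist", "US", "SG", "ITAR")))  # False
print(is_access_allowed(AccessRequest("BOM Specialist", "US", "US", "ITAR")))  # True</PRE><P>In the real products this evaluation happens centrally and in real time; the point of the sketch is only that access is computed from attributes rather than from static roles, which is what avoids the "role explosion" discussed in the next step.</P>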
<P><STRONG>Step 5: Integrate SAP GRC and Next Labs ABAC</STRONG></P><P>This MVP use case leverages SAP GRC Access Control and SAP authorizations for governance and functional authorization, and leverages ABAC for data authorization. It combines the features and fully integrated capabilities of SAP GRC Access Control and SAP authorizations, such as ease of user assignment and role management, to efficiently support data attributes and avoid the “role explosion” and costly custom development that would otherwise be necessary.</P><UL><LI>Integrate SAP GRC and Next Labs ABAC to synchronize roles and access control policies.</LI><LI>Map the "BOM Specialist" role in SAP GRC to the corresponding role in Next Labs ABAC, ensuring consistency in access control.</LI></UL><P><STRONG>Step 6: Define ABAC Policies</STRONG></P><UL><LI>Define ABAC policies in Next Labs ABAC to enforce attribute-based access control for BOM creation and editing.</LI><LI>Create a policy that allows users with the "BOM Specialist" role (User Role attribute) to create and edit BOMs.</LI><LI>Create a policy that restricts access to certain fields in the BOM based on the "Data Sensitivity Level" attribute and the user profile (e.g., Manufacturing, Engineering, Supply Chain); see the sketch at the end of this post.</LI></UL><P><STRONG>Step 7: Test and Validate the Role Using AI</STRONG></P><UL><LI>Train the model on different user profiles to validate data access using SAP BTP AI Launch Pad.</LI><LI>Train the model further to auto-correct access issues and detect patterns to suggest changes to role access profiles (don’t let AI implement dynamic role access changes, as that can be dangerous <span class="lia-unicode-emoji" title=":smiling_face_with_smiling_eyes:">😊</span>).</LI><LI>Perform supervised automated tests of the "BOM Specialist" role by assigning it to a user and verifying that they can successfully create and edit BOMs.</LI><LI>Validate that the ABAC policies are enforced as expected.</LI></UL><P><STRONG>Step 8: Integrate SAP BTP Identity Access Governance with Active Directory</STRONG></P><P>By integrating SAP BTP Identity Access Governance directly, users can seamlessly access data from multiple systems, including SAP S/4HANA, Team Center, Siemens, and other engineering, manufacturing, and supply chain systems. This integration enables a cohesive data mesh approach, allowing users to view and manage bills of material across various systems.</P><P><STRONG>Step 9: Integrate Collibra and Datasphere to Monetize and Publish Bill of Material Insights to the Engineering, Supply Chain and Product Sales Teams</STRONG></P><P>You can define and design self-service analytic insight reports, which can be monetized and shared with both your internal and external engineering, supply chain, and product sales teams.</P>
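<P>Finally, here is the sketch referenced in Step 6: a minimal, hypothetical illustration in plain Python (with invented field and profile names, not Next Labs policy syntax) of how field-level restriction by user profile could behave for an external supply chain user.</P><PRE># Hypothetical field-level masking for BOM records; field and profile
# names are invented for illustration, NOT Next Labs policy syntax.
RESTRICTED_FIELDS = {
    "Supply Chain": {"unit_price", "total_cost"},  # hide pricing from external users
}

def mask_bom_record(record: dict, user_profile: str) -> dict:
    """Return a copy of the BOM record with restricted fields masked."""
    hidden = RESTRICTED_FIELDS.get(user_profile, set())
    return {key: ("***" if key in hidden else value) for key, value in record.items()}

bom_item = {"material": "BRKT-100", "quantity": 4, "unit_price": 129.50}
print(mask_bom_record(bom_item, "Supply Chain"))   # pricing masked
print(mask_bom_record(bom_item, "Engineering"))    # full record</PRE>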
2024-04-11T19:00:57.747000+02:00 https://community.sap.com/t5/data-and-analytics-blog-posts/why-don-t-we-use-data-and-analytics-group-on-sap-community/ba-p/13666201 Why Don't We Use Data and Analytics Group on SAP Community? 2024-04-18T18:28:14.375000+02:00 TuncayKaraca https://community.sap.com/t5/user/viewprofilepage/user-id/137163 <P>Hey Data and Analytics Folks! Why Don't We Use the <A href="https://community.sap.com/t5/data-and-analytics/gh-p/data-analytics" target="_self">Data and Analytics</A> Group on SAP Community instead of <A href="https://community.sap.com/t5/technology/ct-p/technology" target="_self">Technology</A>?</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="TuncayKaraca_1-1712778519559.png" style="width: 761px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/94808iB9F1693F79E6207B/image-dimensions/761x194?v=v2" width="761" height="194" role="button" title="TuncayKaraca_1-1712778519559.png" alt="TuncayKaraca_1-1712778519559.png" /></span></P><P>Of course, in the end it does not matter much: as long as you tag your posts with SAP Managed Tags, everybody can find them. But there is one huge usability difference: here in <A href="https://community.sap.com/t5/interest-groups/ct-p/interests" target="_self">Interest Groups</A> we can <STRONG>reply to a reply</STRONG>, but under <A href="https://community.sap.com/t5/products-and-technology/ct-p/products" target="_self">Products and Technology</A> we cannot! Check out the blog post <A href="https://community.sap.com/t5/technology-blogs-by-sap/sap-datasphere-is-ready-to-take-over-the-role-of-sap-bw/bc-p/13666176" target="_self">SAP Datasphere is ready to take over the role of SAP BW</A> by <a href="https://community.sap.com/t5/user/viewprofilepage/user-id/705968">@kenneth_dalvik</a>, where all the responses are flat and sequential --there is no <STRONG>reply-to-a-reply</STRONG> option! <span class="lia-unicode-emoji" title=":upside_down_face:">🙃</span></P><P>So let's start using this group. The <A href="https://community.sap.com/t5/data-and-analytics/gh-p/data-analytics" target="_self">Data and Analytics</A> group is already available, and it will definitely be a perfect fit for <a href="https://community.sap.com/t5/c-khhcw49343/SAP+Analytics+Cloud/pd-p/67838200100800006884" class="lia-product-mention" data-product="3-1">SAP Analytics Cloud</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/SAP+Datasphere/pd-p/73555000100800002141" class="lia-product-mention" data-product="16-1">SAP Datasphere</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/SAP+Analytics+Cloud%25252C+hybrid+analytics/pd-p/464f79a9-d5e9-4113-8e9f-7ff61b577b4f" class="lia-product-mention" data-product="6-1">SAP Analytics Cloud, hybrid analytics</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/SAP+HANA+Cloud%25252C+SAP+HANA+database/pd-p/ada66f4e-5d7f-4e6d-a599-6b9a78023d84" class="lia-product-mention" data-product="40-1">SAP HANA Cloud, SAP HANA database</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/SAP+Analytics+Cloud%25252C+connectivity/pd-p/0db4caf8-3039-4a93-9d11-543de33255a4" class="lia-product-mention" data-product="193-1">SAP Analytics Cloud, connectivity</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/SAP+Analytics+Cloud%25252C+analytics+designer/pd-p/3f33380c-8914-4b7a-af00-0e9a70705a32" class="lia-product-mention" data-product="97-1">SAP Analytics Cloud, analytics designer</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/SAP+S%25252F4HANA+Cloud%25252C+extended+edition/pd-p/73b37c4d-a4aa-4de9-aeb9-a5dc59710b26" class="lia-product-mention" data-product="244-1">SAP S/4HANA Cloud, extended edition</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/SAP+S%25252F4HANA+Embedded+Analytics/pd-p/8492b555-b489-4972-8e37-83f2f27ae399"
class="lia-product-mention" data-product="1067-1">SAP S/4HANA Embedded Analytics</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/SAP+Business+Planning+and+Consolidation%25252C+version+for+SAP+NetWeaver/pd-p/01200615320800001016" class="lia-product-mention" data-product="460-1">SAP Business Planning and Consolidation, version for SAP NetWeaver</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/SAP+Business+Warehouse+Accelerator/pd-p/01200615320800000698" class="lia-product-mention" data-product="722-1">SAP Business Warehouse Accelerator</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/BW+%252528SAP+Business+Warehouse%252529/pd-p/242586194391178517100436979900901" class="lia-product-mention" data-product="1-1">BW (SAP Business Warehouse)</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/SAP+BW%25252F4HANA/pd-p/73554900100800000681" class="lia-product-mention" data-product="466-1">SAP BW/4HANA</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/BW+SAP+BEx+Analyzer/pd-p/720735023700999283551380474299965" class="lia-product-mention" data-product="919-1">BW SAP BEx Analyzer</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/BW+SAP+BEx+Web/pd-p/835872679136515185293228681234825" class="lia-product-mention" data-product="920-1">BW SAP BEx Web</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/SAP+ASE+-+BW+Enablement/pd-p/758617099728293421716080695502398" class="lia-product-mention" data-product="1038-1">SAP ASE - BW Enablement</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/BW+SAP+HANA+Data+Warehousing/pd-p/337684911283545157914465705009179" class="lia-product-mention" data-product="921-1">BW SAP HANA Data Warehousing</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/SAP+Data+Services/pd-p/01200314690800000395" class="lia-product-mention" data-product="527-1">SAP Data Services</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/SAP+Data+Intelligence/pd-p/73555000100800000791" class="lia-product-mention" data-product="15-1">SAP Data Intelligence</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/SAP+Data+Custodian/pd-p/73554900100700002051" class="lia-product-mention" data-product="1204-1">SAP Data Custodian</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/SAP+Big+Data+Services/pd-p/73555000100800000691" class="lia-product-mention" data-product="439-1">SAP Big Data Services</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/SAP+BusinessObjects+-+Platform+Administration/pd-p/493706448058243238508632186627562" class="lia-product-mention" data-product="1051-1">SAP BusinessObjects - Platform Administration</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/SAP+BusinessObjects+-+Semantic+Layer/pd-p/280909257853820289811451573728573" class="lia-product-mention" data-product="1053-1">SAP BusinessObjects - Semantic Layer</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/SAP+BusinessObjects+Mobile/pd-p/01200314690800000346" class="lia-product-mention" data-product="337-1">SAP BusinessObjects Mobile</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/SAP+BusinessObjects+-+Semantic+Layer+-+SDK/pd-p/724814917412511087954547042734363" class="lia-product-mention" data-product="1054-1">SAP BusinessObjects - Semantic Layer - SDK</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/SAP+BusinessObjects+-+Web+Intelligence+%252528WebI%252529/pd-p/907900296036854683333078008146613" class="lia-product-mention" data-product="1055-1">SAP BusinessObjects - Web 
Intelligence (WebI)</a>&nbsp;</P><P>These are welcome too: <a href="https://community.sap.com/t5/c-khhcw49343/SAP+Master+Data+Governance/pd-p/67837800100800004488" class="lia-product-mention" data-product="697-1">SAP Master Data Governance</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/SAP+NetWeaver+Master+Data+Management/pd-p/01200615320800000588" class="lia-product-mention" data-product="739-1">SAP NetWeaver Master Data Management</a>&nbsp;<a href="https://community.sap.com/t5/c-khhcw49343/SAP+Data+Quality+Management+for+SAP+NetWeaver+MDM/pd-p/01200615320800002027" class="lia-product-mention" data-product="526-1">SAP Data Quality Management for SAP NetWeaver MDM</a></P><P>One thing I've noticed: we cannot publish blog posts here directly; we can only "Submit for Review". You can still post discussions (aka questions) directly.</P><P>Thanks to the group owners @ColinC, @meganhoy, @moshenaveh, @caroleighdeneen, @craigcmehil for their support!</P><P>Regards,<BR />Tuncay Karaca</P> 2024-04-18T18:28:14.375000+02:00