https://raw.githubusercontent.com/ajmaradiaga/feeds/main/scmt/topics/SAP-Data-Services-qa.xmlSAP Community - SAP Data Services2026-02-20T00:12:28.441028+00:00python-feedgenSAP Data Services Q&A in SAP Communityhttps://community.sap.com/t5/technology-q-a/sap-data-services/qaq-p/14164501SAP Data Services2025-07-28T14:38:57.267000+02:00Amruta4https://community.sap.com/t5/user/viewprofilepage/user-id/1658699<P>Hi team, I am creating a job in BODS to capture the execution-log history of a particular job. The start time, end time and error log are defined as global variables, and the execution-history table has been created in HANA and is populated by a post-load script. I have created a data flow containing a SQL transform and a target table, and in the SQL transform I wrote an INSERT statement to map the variables to the target table columns. While validating it, I get an error along the lines of "cannot use parameter variable".</P><P> </P><P>Please help me solve this error. Thank you.</P>2025-07-28T14:38:57.267000+02:00https://community.sap.com/t5/technology-q-a/sybase-ase-client-software-unable-to-find/qaq-p/14166638Sybase ASE Client Software - Unable to Find2025-07-30T11:04:36.287000+02:00SAPSupporthttps://community.sap.com/t5/user/viewprofilepage/user-id/121003<P>Hi Team,</P><P>We have a requirement to install the Sybase ASE client for our users. Unfortunately, we couldn't find the client software in the SMP.
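On the "cannot use parameter variable" error in the first question above: the SQL transform validates its SQL text as-is, and a common workaround is to run the INSERT from a script step (for example, after the data flow) using the sql() function, where global variables can be concatenated into the statement. A minimal sketch; the datastore name DS_HANA, the table JOB_EXEC_HISTORY, and the column names are illustrative assumptions, not from the original post:

```
# DS script sketch: insert one execution-history row via sql().
# DS_HANA, JOB_EXEC_HISTORY and the column names are assumptions.
# || concatenates strings; \' escapes a quote inside a string literal.
sql('DS_HANA',
    'INSERT INTO JOB_EXEC_HISTORY (JOB_NAME, START_TIME, END_TIME, ERROR_LOG) VALUES (\''
    || job_name() || '\', \'' || $G_StartTime || '\', \''
    || $G_EndTime || '\', \'' || $G_ErrorLog || '\')');
```

Alternatively, the variables can be exposed as mapped columns in a Query transform feeding an ordinary target table, avoiding the SQL transform altogether.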
Could you please assist us or share the path where we can download the Sybase client.</P><P>Best regards</P><BR />------------------------------------------------------------------------------------------------------------------------------------------------<BR /><B>Learn more about the SAP Support user and program <A target="_blank" href="https://community.sap.com/t5/enterprise-resource-planning-blogs-by-sap/maximizing-the-power-of-sap-community-at-product-support/ba-p/13501276">here</A>.</B>2025-07-30T11:04:36.287000+02:00https://community.sap.com/t5/technology-q-a/extract-data-from-s-4hana-cloud-public-edition-using-sap-data-services-bods/qaq-p/14171257Extract data from S/4HANA Cloud Public Edition using SAP Data Services (BODS)2025-08-04T17:53:07.036000+02:00MSSAP07https://community.sap.com/t5/user/viewprofilepage/user-id/1668040<P>I assume the "Migrate your data" app is the only option to perform the initial data load into <a href="https://community.sap.com/t5/c-khhcw49343/SAP+S%25252F4HANA+Cloud+Public+Edition/pd-p/08e2a51b-1ce5-4367-8b33-4ae7e8b702e0" class="lia-product-mention" data-product="1199-1">SAP S/4HANA Cloud Public Edition</a>. Does this mean that IDoc or BAPI-based data loads (by calling them from SAP Data Services) are not supported?</P><P>Additionally, I'm hoping to extract SAP table data from <a href="https://community.sap.com/t5/c-khhcw49343/SAP+S%25252F4HANA+Cloud+Public+Edition/pd-p/08e2a51b-1ce5-4367-8b33-4ae7e8b702e0" class="lia-product-mention" data-product="1199-2">SAP S/4HANA Cloud Public Edition</a> using SAP Data Services with a datastore connection established via a cloud connector.
Is this a feasible approach?</P>2025-08-04T17:53:07.036000+02:00https://community.sap.com/t5/technology-q-a/data-service-schema-header/qaq-p/14184094Data Service - Schema Header2025-08-19T16:38:59.689000+02:00Julia_Bernardihttps://community.sap.com/t5/user/viewprofilepage/user-id/2145374<P>Hi, everyone.</P><P>I need a small help on SAP Data Services with the schema's header on query or looking at a table.</P><P>When looking at a table, my header is "Type", "Type" and "Content Type". (Image 3)</P><P>I would like to change it do include Description, as it will help me a lot during mapping for migration. (Image 2)</P><P>I found an option that I thought it would do the trick, but even setting it exactly as another project (print shown above), it didn't change. (Image 1)</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="mstsc_1hkbiLGl4m.png" style="width: 654px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/302514i04AE198A1CA96D92/image-size/large?v=v2&px=999" role="button" title="mstsc_1hkbiLGl4m.png" alt="mstsc_1hkbiLGl4m.png" /></span></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="nkMUdENUa9.png" style="width: 802px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/302511i9F4683D54E8ACA66/image-size/large?v=v2&px=999" role="button" title="nkMUdENUa9.png" alt="nkMUdENUa9.png" /></span></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ms-teams_HvlvsqX7AW.png" style="width: 636px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/302508i355F659776B0A5ED/image-size/large?v=v2&px=999" role="button" title="ms-teams_HvlvsqX7AW.png" alt="ms-teams_HvlvsqX7AW.png" /></span></P>2025-08-19T16:38:59.689000+02:00https://community.sap.com/t5/technology-q-a/json-not-supported-using-the-datastore-connection-webservicerest/qaq-p/14207243JSON not supported using the Datastore connection 
WebServiceREST2025-09-04T12:54:50.794000+02:00Kranthi_1987https://community.sap.com/t5/user/viewprofilepage/user-id/1728456<P>Hi SAP Experts, </P><P><BR />We are using SAP Data Services version 4.3 SP2. The database is Microsoft SQL Server 2022.<BR />We are trying to read data using the WebServiceREST connection. Following the SAP article, the web-service endpoint is reachable; however, the response is received as JSON, not XML. We contacted the vendor; they said that the API can only return JSON.<BR />Can you please advise how to load the data if the response is JSON?<BR /><BR /><BR /></P>2025-09-04T12:54:50.794000+02:00https://community.sap.com/t5/technology-q-a/extract-selected-data-from-very-large-hana-table-using-sap-data-services/qaq-p/14223822Extract selected data from a very large HANA table using SAP Data Services2025-09-22T10:35:53.757000+02:00ugjusthttps://community.sap.com/t5/user/viewprofilepage/user-id/717189<P>We would like to use SAP Data Services to copy data from the SAP HANA Production database to the Test system, for a very large table (over 1 billion records), but only for specific selections. We currently have a job with a dataflow containing a query with a WHERE clause and a Data_Transfer transform. This takes about 3 hours to select around 6 million records. Currently, all the data in the S/4 table is selected first, and the WHERE clause is applied afterwards.
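On pushing such a filter down to HANA: Data Services generally performs a full push-down only when the WHERE clause sits in the first Query transform directly after the database source, uses functions it can translate to the target database, and is not separated from the source by transforms such as Data_Transfer. The generated statement can be inspected in the Designer via Validation > Display Optimized SQL; with a full push-down it should be a single SELECT of roughly this shape (schema, table and column names below are illustrative assumptions, not from the post):

```sql
-- Illustrative "Display Optimized SQL" output for a fully
-- pushed-down read; all names here are assumptions.
SELECT "MATNR", "WERKS", "EKGRP"
FROM "SAPS4"."MARC"                 -- source table read inside HANA
WHERE "WERKS" IN ('DE01', 'DE02')   -- filter executed by HANA, not by DS
```

If the optimized SQL shows a bare SELECT without the filter, the filter is being applied on the Data Services engine after transferring all rows, which matches the 3-hour behaviour described.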
Is there anyway that we can "push" the Where down to the SAP Hana database select?</P><P>We are using Data Services version 14.3.3</P>2025-09-22T10:35:53.757000+02:00https://community.sap.com/t5/technology-q-a/cms-connection-error-during-data-services-client-installation/qaq-p/14231116CMS connection error during Data Services Client installation2025-09-30T05:22:43.988000+02:00venkataramana_paidihttps://community.sap.com/t5/user/viewprofilepage/user-id/183887<P>Hi,</P><P>We are attempting to install SAP DS 2005 Designer on the client machine; however, the installation is resulting in an error.</P><P><EM><SPAN>"<STRONG>Please check the CMS connection information, make sure they are correct and the CMS is up and running, then try again</STRONG>."</SPAN></EM></P><P>The server installation was completed successfully, and the CMS is up and running.<BR />Additionally, the CMC is accessible from the client machine without any issues.</P><P><BR /><SPAN>--- Steps to Reproduce ---</SPAN></P><OL><LI>The server was successfully installed on Windows Server 2022 and is up and running.</LI><LI>Access to the CMC from the client machine was tested and is working correctly.</LI><LI>However, an error occurs when attempting to install the client tools on the client machine — a CMS connectivity error is being thrown. </LI></OL>2025-09-30T05:22:43.988000+02:00https://community.sap.com/t5/technology-q-a/user-id-locked-bods-are-automated-with-that-id/qaq-p/14231205User Id locked BODS are automated with that id2025-09-30T07:39:59.232000+02:00BANDLAVENKATESHhttps://community.sap.com/t5/user/viewprofilepage/user-id/2231339<P>For integrating data from outside api to s4hana using bods (sap data services). With one user id the jobs are automated now that user is locked due to inactive for past 45 days now jobs running but data is not replicating in the s4hana. 
My assumption is that the locked user ID is the cause. Is that true, or could there be other reasons?<BR />Your views and suggestions would help me a lot.<BR /><BR /><a href="https://community.sap.com/t5/c-khhcw49343/SAP+Data+Services/pd-p/01200314690800000395" class="lia-product-mention" data-product="527-1">SAP Data Services</a> </P><P><a href="https://community.sap.com/t5/c-khhcw49343/SAP+BusinessObjects+Dashboards/pd-p/01200314690800000342" class="lia-product-mention" data-product="338-1">SAP BusinessObjects Dashboards</a> </P><P><a href="https://community.sap.com/t5/c-khhcw49343/SAP+S%25252F4HANA+for+advanced+ATP/pd-p/314fb51c-b3d3-4169-a015-fc9e9e510969" class="lia-product-mention" data-product="1068-1">SAP S/4HANA for advanced ATP</a> </P>2025-09-30T07:39:59.232000+02:00https://community.sap.com/t5/technology-q-a/bods-job-running-but-data-is-not-integration-from-api-to-s4hana/qaq-p/14231211BODS job running but data is not integrating from API to S/4HANA2025-09-30T07:50:32.543000+02:00BANDLAVENKATESHhttps://community.sap.com/t5/user/viewprofilepage/user-id/2231339<P>BODS jobs are automated with one user ID in S/4HANA to integrate data from an API into S/4HANA. Recently that user ID was locked due to inactivity, and now the data is not being integrated from the API into S/4HANA even though the BODS jobs are running. My assumption is that the locked user ID is the cause.
Is this the main reason, or could other causes be preventing the data from being integrated from the API into S/4HANA?<BR />Any knowledge about this issue would help me a lot.<BR /><BR /><a href="https://community.sap.com/t5/c-khhcw49343/SAP+Data+Services/pd-p/01200314690800000395" class="lia-product-mention" data-product="527-1">SAP Data Services</a> </P><P><a href="https://community.sap.com/t5/c-khhcw49343/SAP+BusinessObjects+-+Platform+Administration/pd-p/493706448058243238508632186627562" class="lia-product-mention" data-product="1051-1">SAP BusinessObjects - Platform Administration</a> </P><P><a href="https://community.sap.com/t5/c-khhcw49343/SAP+Job+Scheduling+service/pd-p/26cb1f2e-ad6d-4ec2-a9f1-80470a28f744" class="lia-product-mention" data-product="1253-1">SAP Job Scheduling service</a> </P>2025-09-30T07:50:32.543000+02:00https://community.sap.com/t5/technology-q-a/sap-information-platform-services-installation-for-sap-ds-client/qaq-p/14234705SAP Information Platform Services installation for SAP DS Client Installation2025-10-04T07:18:42.913000+02:00venkataramana_paidihttps://community.sap.com/t5/user/viewprofilepage/user-id/183887<P>Hi,</P><P>I am trying to install the SAP DS client on a laptop, and it throws the warning below after I enter the CMS information.</P><P>No valid local SIA nodes exist for the given CMS connection information. You will not be able to install APS Services (RFC Server, Administrator, Metadata, Viewdata, and Job Launcher).</P><P>If we click Continue, the options below are greyed out. I could not find any DS Client 2025 installation steps.
Please let me know prerequisites.</P><P>Adaptive Processing Server (APS) services, and Management Console features.</P><P>Thanks & Regards,</P><P>Ramana</P>2025-10-04T07:18:42.913000+02:00https://community.sap.com/t5/technology-q-a/sap-data-services-pre-installation-check-failing/qaq-p/14242700SAP Data Services pre-installation check failing2025-10-13T18:50:03.243000+02:00andreapancottihttps://community.sap.com/t5/user/viewprofilepage/user-id/711432<DIV>Hello everyone</DIV><DIV>I'm having a problem installing SAP Data Services 4.3 SP3 PL08</DIV><DIV> </DIV><DIV>The operating system is:</DIV><DIV>Red Hat Enterprise Linux release 9.5 (Plow)</DIV><DIV> </DIV><DIV>I installed IPS 4.3 SP05 Patch 1 without any problem using the "IPS4305P_100-70002777.tar" package.</DIV><DIV>I can access the CMC without any problem and, from there, the product version is confirmed:</DIV><DIV>SAP BusinessObjects BI Platform 4.3 Support Pack 5 Patch 1</DIV><DIV>Version: 14.3.5.5429</DIV><DIV> </DIV><DIV>As a database I chose SQL Anywhere installed directly from the IPS installer.</DIV><DIV> </DIV><DIV>When I try to install Data Services, the pre-installation checks give me the following errors:</DIV><DIV> </DIV><DIV>***pasted text ***</DIV><DIV> </DIV><DIV>Summary of any missing critical or optional prerequisites.</DIV><DIV> </DIV><DIV>Failed: Check KSH (Critical)</DIV><DIV>Information: The ksh(/bin/ksh) was not installed. You must install ksh before the installation program can continue.</DIV><DIV> </DIV><DIV>Failed: </DIV><DIV>Minimum patch level requirements for OS (Critical) Information: [CheckPatchLevelFailReason]</DIV><DIV> </DIV><DIV>Failed: SAP Information platform services Version (Critical)</DIV><DIV>Information: SAP Information platform services or SAP BusinessObjects Business Intelligence platform is detected on this machine, but the version is not compatible. 
Data Services 4.3 requires SAP Information platform services or BusinessObjects Business Intelligence platform 4.3 SP1 or above versions.</DIV><DIV> </DIV><DIV>Failed: Disk space in /var (Critical)</DIV><DIV>Information: The /var folder does not have enough free disk space. You must make available 20 MB or more of disk space before the installation can continue.</DIV><DIV> </DIV><DIV>***End of pasted text ***</DIV><DIV> </DIV><DIV>Initially, indeed, ksh was not present on the Linux system but, even after installing it, I got the same errors (I also rebooted the Linux machine and tried again... same error).<BR />I tried redownloading the Data Services package (same error).<BR />I tried installing Data Services with a lower patch level installation package (same error).<BR />I tried reinstalling everything (including IPS) to a different folder (not sure what to do)... same error again !</DIV><DIV> </DIV><DIV>I checked and double-checked that, both the Linux version (on the PAM) and the IPS version (in note 3197694) are correct for Data Services 4.3 SP3 PL08.</DIV><DIV>As for the last error, even, in the fs "/var" there are 19GB free!</DIV><DIV> </DIV><DIV>Does anyone have any idea why these checks fail?</DIV><DIV>Is it possible that ksh was not installed correctly (I didn't do it but the Linux system administrator) and the subsequent checks fail because of this?</DIV><DIV> </DIV><DIV>Please help me.<BR />Thank you very much.</DIV><DIV>Andrea.</DIV>2025-10-13T18:50:03.243000+02:00https://community.sap.com/t5/technology-q-a/ds-designer-4-3-sp03-pl8-does-not-connect/qaq-p/14252830DS Designer 4.3 SP03 PL8 does not connect2025-10-24T18:37:14.645000+02:00andreapancottihttps://community.sap.com/t5/user/viewprofilepage/user-id/711432<P>Hello everyone</P><P>A colleague of mine is unable to connect, with Data Services Designer 4.3 SP03 PL8 installed on his notebook, with Windows 11, to a DS server at the exact same version installed in a customer's network.</P><P>After filling 
in the server parameters, port, username and password, clicking on "log on", instead of getting the list of repositories, gets this error:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="andreapancotti_0-1761323613782.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/332119i8A06405C1614B522/image-size/medium?v=v2&px=400" role="button" title="andreapancotti_0-1761323613782.png" alt="andreapancotti_0-1761323613782.png" /></span></P><P>I succeed without any problem.</P><P>Consider that:</P><OL><LI>We used exactly the same installation file to install the DS Designer.</LI><LI>We installed exactly the same DS Designer components.</LI><LI>We both connect to the customer's network, from our notebooks (subject to the same company policies) via a VPN client and get IP addresses in the same subnet.</LI><LI>The CMS is, by default, listening on port 6400 and the CMS request port is set on 6405. We are both able to telnet the IP address of the DS server from our notebooks on both port 6400 and port 6405 of the server, successfully.</LI><LI>He gets the same error even when starting DS Designer "as Administrator".</LI><LI>He also gets the same error when turning off the Windows Defender firewall before trying to connect the DS Designer.</LI><LI>If I do a traffic capture with Wireshark, I see communications, both on port 6400 and on 6405. On his notebook, there are communications on the 6400 but nothing on the 6405.</LI><LI>He gets the same error trying to connect with two different users (both work for me).</LI><LI>We both reach, via browser, the CMC on port 8080.</LI><LI>We tried to configure the request port of the CMS to 6408 but the result is exactly the same.</LI><LI>It gets this error on two different DS servers (development and production). I can log in, without problems, on both.</LI></OL><P>What do you think? 
Does anyone have something to suggest?</P><P>Thank you all!</P>2025-10-24T18:37:14.645000+02:00https://community.sap.com/t5/technology-q-a/fill-bods-variables-from-a-sql-database/qaq-p/14260089Fill BODS variables from a SQL database2025-11-04T10:04:55.841000+01:00thomas_altnickel99https://community.sap.com/t5/user/viewprofilepage/user-id/802089<P>Hi,</P><P>I have a SQL datastore called "EM_DATASERVICES" containing the database "EM_DATASERVICES_DEV".</P><P>I have created the table "<SPAN>SELECTION_EXPRESSION_RESULT" with the columns "Variable" and "Expression".</SPAN></P><P><SPAN>I have imported this table into BODS. Now I want to update a variable in BODS with the value from the "Expression" column.</SPAN></P><P><SPAN>I created a script in the workflow:</SPAN></P><P><SPAN>$G_WERKS = sql('EM_DATASERVICES','SELECT Expression FROM SELECTION_EXPRESSION_RESULT WHERE Variable = ''$G_WERKS''');</SPAN></P><P><SPAN>And I got this error:</SPAN></P><P>[Script:Script530]<BR />(Ln 1) : Syntax error : near <'> found <;> expecting <')', ','></P><P>What did I do wrong?</P><P>Thanks, TA</P><P> </P><P> </P>2025-11-04T10:04:55.841000+01:00https://community.sap.com/t5/technology-q-a/sap-data-services-job-quot-hangs-quot-occasionally-requires-quot-abort-quot/qaq-p/14278302SAP Data Services Job "Hangs" Occasionally. Requires "Abort"2025-11-26T17:29:46.127000+01:00dunncrewhttps://community.sap.com/t5/user/viewprofilepage/user-id/7404<P>We have one SAP DS job, scheduled every 30 minutes, that hangs occasionally for no apparent reason. We have to abort it manually so that it will run at the next interval. I looked at the Windows Task Scheduler entry for the job, and it shows "Task Completed" after about 1 minute, so the Windows task is not hung.
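On the sql() syntax error above: in the Data Services script language, a quote inside a string literal is escaped with a backslash (\'), not by doubling it as in SQL, so the '' in the original script closes the string early and the parser then fails on what follows. A sketch of the corrected script, keeping the names from the question and assuming the Variable column stores the literal text $G_WERKS:

```
# \' keeps the quote inside the DS string literal, so the quoted
# name $G_WERKS is passed through to the database as-is.
$G_WERKS = sql('EM_DATASERVICES',
    'SELECT Expression FROM SELECTION_EXPRESSION_RESULT WHERE Variable = \'$G_WERKS\'');
```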
Apparently it is stuck within the SAP job application.</P><P>In the Management Console, the "End Time" is blank, although the steps have completed.</P><P>Any thoughts on how to solve this?</P><P>SAP Data Services<BR />Version: 14.2.13.25270479</P>2025-11-26T17:29:46.127000+01:00https://community.sap.com/t5/technology-q-a/error-80101/qaq-p/14281874ERROR 801012025-12-02T11:53:49.119000+01:00mo-shabeerhttps://community.sap.com/t5/user/viewprofilepage/user-id/2268688<P>Hi,</P><P> </P><P>I am having an issue when executing my job in SAP DS. I am pulling in an Excel file and executing my job, but I keep getting error message 80101.</P><P>I have checked the file path, and it looks correct; I have also checked the permissions on the Excel file, and they look correct as well. So I am a bit stumped as to what to do. If it helps, I am currently running SAP DS in a remote desktop session from my local machine.</P><PRE>080101: |Data flow DF_Players_Stat<BR />Cannot open file <C:/Users/mhussain/football_dataset_for_sap_designer.xlsx>.
Please check the files path and permissions.</PRE><P> </P>2025-12-02T11:53:49.119000+01:00https://community.sap.com/t5/enterprise-resource-planning-q-a/is-hyphen-supported-for-server-name-in-sap-data-services/qaq-p/14298979Is Hyphen supported for Server Name in SAP Data Services?2025-12-29T18:14:52.872000+01:00SAPSupporthttps://community.sap.com/t5/user/viewprofilepage/user-id/121003<P>Is using a <STRONG>hyphen (-)</STRONG> in the hostname fully supported for DS/IPS installation and operation?</P>2025-12-29T18:14:52.872000+01:00https://community.sap.com/t5/technology-q-a/is-sap-bods-capable-to-write-to-microsoft-fabric-warehouse/qaq-p/14305912Is SAP BODS capable of writing to Microsoft Fabric Warehouse?2026-01-12T17:32:43.720000+01:00albertosimeonihttps://community.sap.com/t5/user/viewprofilepage/user-id/829608<P>Hello Experts,<BR />I'm trying to write to Microsoft Fabric with SAP BODS.</P><P>I cannot create a working BODS datastore for Fabric (datastore type: database, database type: Microsoft).</P><P>I can create an ODBC-based datastore, and it works for reading data.</P><P>When I try to write to the MS Fabric warehouse, BODS hangs during the write. The problem seems to be on the commit, as the files under the delta tables are created.</P><P>Have you experienced this?
(BODS 4.2.14, ODBC driver for SQL Server version 17).</P><P>Thanks,<BR />Alberto</P>2026-01-12T17:32:43.720000+01:00https://community.sap.com/t5/technology-q-a/sap-data-services-generated-abap-df-program-can-not-be-transported-with/qaq-p/14308625SAP Data Services: generated ABAP DF program cannot be transported with variables2026-01-16T11:54:27.053000+01:00albertosimeonihttps://community.sap.com/t5/user/viewprofilepage/user-id/829608<P>Hello experts,</P><P>I generate an ABAP program for an ABAP dataflow that uses a global variable (or a parameter) => upload OK.</P><P>I release the CR in an SAP S/4HANA 2021 system => CR release OK.</P><P>When I transport it, there are errors related to how global variables and parameters in the ABAP dataflow are translated into PARAMETER $PARAM# in the ABAP program (# = 1, 2, 3... positional parameter identifier).<BR /><BR />During STMS transport of an ABAP program generated from an ABAP dataflow, an ABAP consistency check is performed (parameter names must start with a letter or '_').<BR />By default, the code generated by SAP DS uses the '$' symbol as the first character.</P><P>This prevents transporting any ABAP dataflow that uses global variables or parameters.</P><P>There is no workaround or mention in SAP Help (I tested an export of the PDF and a query to NotebookLM), and I can't find any workaround in SAP Notes.</P><P> </P><P>This limitation prevents the use of "Execute Preloaded" with an SAP S/4 or SAP ECC production datastore, as the ABAP programs cannot be transported because of the '$' issue, and with a closed client we cannot directly or manually upload these programs to a productive S/4HANA tenant.</P><P>So the only workaround I found is to open the client to changes and use "Generate and Execute" against the productive S/4HANA system.</P><P> </P><P>Do you have any workaround?</P><P>best
regards,</P><P>Alberto</P>2026-01-16T11:54:27.053000+01:00https://community.sap.com/t5/technology-q-a/clarification-on-row-type-quot-normal-quot-behavior/qaq-p/14318210Clarification on Row Type "NORMAL" behavior2026-01-30T01:32:35.442000+01:00Dawonhttps://community.sap.com/t5/user/viewprofilepage/user-id/2052005<P>I am currently studying SAP Data Services (BODS) and have a question regarding how the <STRONG>Operation Code (Row Type)</STRONG> changes throughout a Dataflow. Specifically, I am confused about the different behaviors of the <STRONG>"NORMAL"</STRONG> flag depending on whether a <STRONG>Table_Comparison (TC)</STRONG> transform is used.</P><P>Based on the official documentation, I understand that:</P><OL><LI><P>All rows extracted from a source are initially flagged as <STRONG>NORMAL</STRONG>.</P></LI><LI><P>When a row flagged as <STRONG>NORMAL</STRONG> is loaded into a target, it is treated as a new row and an <STRONG>INSERT</STRONG> operation is performed.</P></LI></OL><P>I have designed a test Dataflow as shown in the attached image to compare two scenarios where the source and target data are identical (no changes).</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Dawon_0-1769732879182.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/367197i6348EBC66FF62725/image-size/medium?v=v2&px=400" role="button" title="Dawon_0-1769732879182.png" alt="Dawon_0-1769732879182.png" /></span></P><P> </P><P><STRONG>Scenario 1: Source -> Map_Operation (MO) -> Target (DW_DATA_COMP2)</STRONG></P><UL><LI><P><STRONG>MO Setting:</STRONG> Normal to Normal.</P></LI><LI><P><STRONG>Result:</STRONG> It triggers a Unique Constraint Violation error because the <STRONG>NORMAL</STRONG> flag from the source is treated as an <STRONG>INSERT</STRONG>. 
I understand this part as it aligns with the documentation.</P></LI></UL><P><STRONG>Scenario 2: Source -> Table_Comparison (TC) -> Map_Operation (MO) -> Target (DW_DATA_COMP)</STRONG></P><UL><LI><P><STRONG>TC Setting:</STRONG> Comparing Source and Target.</P></LI><LI><P><STRONG>MO Setting:</STRONG> Normal to Normal.</P></LI><LI><P><STRONG>Result:</STRONG> The rows are <STRONG>Discarded</STRONG> (no SQL is executed) by the target.</P></LI></UL><P><STRONG>My Questions:</STRONG> In <STRONG>Scenario 2</STRONG>, how is each row flagged at each stage of the flow when the source and target data match exactly?</P><UL><LI><P><STRONG>Source Table Stage:</STRONG> Is the row flagged as <STRONG>NORMAL</STRONG>?</P></LI><LI><P><STRONG>Table_Comparison Stage:</STRONG> Does the TC transform maintain the <STRONG>NORMAL</STRONG> flag because it's a "Match with no difference"?</P></LI><LI><P><STRONG>Map_Operation Stage:</STRONG> If the MO receives a <STRONG>NORMAL</STRONG> flag and outputs it as <STRONG>NORMAL</STRONG>, why does the Target treat this differently compared to Scenario 1?</P></LI></UL><P>Does the <STRONG>Table_Comparison</STRONG> transform add some internal metadata or "instruction" to the <STRONG>NORMAL</STRONG> flag that tells the Target to discard it instead of inserting it? I would appreciate a clear explanation of how the engine distinguishes between these two "Normal" states.</P><P> </P><P>Here is the screenshot. 
</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Dawon_0-1769765428359.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/367317i39C67DAEFCB4302F/image-size/medium?v=v2&px=400" role="button" title="Dawon_0-1769765428359.png" alt="Dawon_0-1769765428359.png" /></span></P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Dawon_1-1769765487545.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/367318iF620157BC49A5F2F/image-size/medium?v=v2&px=400" role="button" title="Dawon_1-1769765487545.png" alt="Dawon_1-1769765487545.png" /></span></P><P> </P><P>Thank you in advance for your help!</P>2026-01-30T01:32:35.442000+01:00https://community.sap.com/t5/technology-q-a/installation-sap-data-services-4-3/qaq-p/14320083Installation SAP Data Services 4.32026-02-02T17:53:02.718000+01:00rouchad_abderrazak1042https://community.sap.com/t5/user/viewprofilepage/user-id/764953<P>I would like to install SAP Data Services 4.3, and I want to make sure I fully understand the prerequisites before getting started.<BR />So far, I know that I need to:</P><P>Download and install the SAP Installation Planner (IPS) to generate the installation files.</P><P>Install the Data Services 4.3 Server on the target server.</P><P>Install the Data Services 4.3 Client (Designer) on the user workstations.</P><P>However, I am confused about the CMS (Central Management Server). 
I’m not exactly sure what it is:</P><OL><LI>Is it a standalone product?</LI><LI>Is it part of IPS or of the Data Services server/client?</LI><LI>Is it mandatory for using SAP Data Services, or only required if one wants to integrate with the SAP BusinessObjects Management Console?</LI></OL><P>I would like to clarify the role of the CMS in the SAP Data Services 4.3 architecture and understand whether I can install and use DS 4.3 independently for my ETL jobs and database connections without installing the CMS.</P><P>Thank you in advance for your clarifications!</P>2026-02-02T17:53:02.718000+01:00