https://raw.githubusercontent.com/ajmaradiaga/feeds/main/scmt/topics/SAP-Datasphere-qa.xmlSAP Community - SAP Datasphere2026-02-15T12:12:36.621784+00:00python-feedgenSAP Datasphere Q&A in SAP Communityhttps://community.sap.com/t5/technology-q-a/sap-datasphere-monitoring-in-clouldalm/qaq-p/14307046SAP Datasphere Monitoring in Cloud ALM2026-01-14T07:50:13.188000+01:00SAPSupporthttps://community.sap.com/t5/user/viewprofilepage/user-id/121003<P> I am trying to set up SAP Datasphere monitoring in Cloud ALM, which includes the backend HANA database. I was able to add SAP Datasphere systems to Cloud ALM; however, when I try to retrieve the backend HANA DB details, I don't get the setup details in Cloud ALM. </P><P>From this link <A href="https://support.sap.com/en/alm/sap-cloud-alm/operations/expert-portal/setup-managed-services/calm-setup-datasphere.html" target="_blank" rel="noopener noreferrer">https://support.sap.com/en/alm/sap-cloud-alm/operations/expert-portal/setup-managed-services/calm-setup-datasphere.html</A> I am able to open the SAP-managed connectivity to use OpenTelemetry-based data to report monitoring data to SAP Cloud ALM. </P><P>Could you please advise whether it is possible to monitor the backend HANA DB information in Cloud ALM for the respective SAP Datasphere services/instances? 
</P><BR />------------------------------------------------------------------------------------------------------------------------------------------------<BR /><B>Learn more about the SAP Support user and program <A target="_blank" href="https://community.sap.com/t5/enterprise-resource-planning-blogs-by-sap/maximizing-the-power-of-sap-community-at-product-support/ba-p/13501276">here</A>.</B>2026-01-14T07:50:13.188000+01:00https://community.sap.com/t5/technology-q-a/dynamically-pivot-table/qaq-p/14308069Dynamically pivot Table2026-01-15T14:18:14.951000+01:00NicolasRivas1991https://community.sap.com/t5/user/viewprofilepage/user-id/1976185<P>Hello,<BR /><BR />We need to dynamically pivot a table. We have a table with equipment and its type, and each equipment type has certain characteristic names and values. We need to dynamically pivot this table so that each characteristic name becomes a column with its own value for each piece of equipment. I have read a lot in the community and with Gemini, but I can't find anything dynamic.<BR /><BR />Here is an example </P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="NicolasRivas1991_0-1768482816778.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/361664i9B15BA355986E1F7/image-size/medium/is-moderation-mode/true?v=v2&px=400" role="button" title="NicolasRivas1991_0-1768482816778.png" alt="NicolasRivas1991_0-1768482816778.png" /></span></P><P>This is a migration from QlikView to SAP Analytics Cloud. In QlikView, we had a table in which, when an equipment type was selected, certain columns were selected. We need to replicate that in SAP Analytics Cloud.<BR /><BR />Thank you.</P>2026-01-15T14:18:14.951000+01:00https://community.sap.com/t5/technology-q-a/hierarchy-with-directory-in-sap-datasphere-description-vs-id-shown/qaq-p/14308429Hierarchy with Directory in SAP Datasphere – Description vs. 
ID shown inconsistently across folders2026-01-16T08:47:22.210000+01:00tn85https://community.sap.com/t5/user/viewprofilepage/user-id/2274964<P> </P><P>Hello SAP Datasphere Community,</P><P>I am currently facing an issue with hierarchies created in SAP Datasphere, and I would like to understand whether this is a known behavior or a modeling issue on my side.</P><P>I created a product hierarchy with directory using a SQL View. The hierarchy is based on a remote table and exposed into Datasphere via a SQL View. The implementation follows exactly the approach described in the official SAP Community blog post “Modeling a basic hierarchy with directory in SAP Datasphere”. Overall, the setup works as expected: the hierarchy is created correctly, all levels are visible, and it can be used in an Analytic Model. Descriptions are maintained and initially displayed correctly.</P><P>However, I am experiencing a problem when reusing the same hierarchy with directory in a different folder. In this case, the hierarchy is still visible and structurally correct in the Analytic Model, but when switching the display setting from ID to Description, the system continues to show the ID instead of the description. This happens even though the descriptions are clearly maintained and work correctly when the hierarchy is used in its original folder.</P><P>What makes this behavior particularly confusing is that my modeling approach is always identical, yet the outcome differs. When the hierarchy is used directly on top of the SQL View without persistence, the descriptions are displayed correctly. As soon as the SQL View is persisted, the hierarchy still works structurally, but only IDs are shown and the descriptions are no longer respected in the Analytic Model. The same inconsistency can also be observed when reusing the identical hierarchy definition in a different folder. In some cases, descriptions are displayed correctly, while in others only IDs appear. 
This inconsistent behavior makes it difficult to identify a clear root cause, as the same hierarchy definition behaves differently depending on persistence and folder context.</P><P>Hierarchies coming from BW are currently not an option. The issue occurs repeatedly across multiple hierarchies, not just a single one.</P><P>Has anyone experienced similar behavior with hierarchies and directories in SAP Datasphere? Are there known limitations or dependencies when reusing hierarchies across folders or when persisting SQL Views? Could this be related to metadata handling, caching, folder context, or semantic usage in Analytic Models? Any hints, explanations, or best practices to ensure that descriptions are consistently displayed instead of IDs would be highly appreciated.</P><P>Thank you in advance for your support.</P>2026-01-16T08:47:22.210000+01:00https://community.sap.com/t5/technology-q-a/left-join-lateral-in-datasphere-sql-view/qaq-p/14311007LEFT JOIN LATERAL In DataSphere SQL View2026-01-20T11:44:44.409000+01:00hleboeufhttps://community.sap.com/t5/user/viewprofilepage/user-id/882900<P>Hello,</P><P>Being originally an MS SQL writer now switching to Datasphere, I'm struggling to find the alternative for a CROSS APPLY.</P><P>I've found some samples of LEFT JOIN LATERAL, but when I use this code DS always gives errors.</P><pre class="lia-code-sample language-sql"><code>SELECT FT_CURR."FROM_CURR", FT_CURR."TO_CURR"
, DATES."DATE_SQL"
, EXCH_DT."EXCH_RATE"
, EXCH_PRVDT."EXCH_RATE"
FROM (SELECT DISTINCT "FROM_CURR", "TO_CURR" FROM "EXCHANGERATES") FT_CURR
CROSS JOIN (SELECT "DATE_SQL" FROM "CENTRAL.2VS_DS_Dates" WHERE "YEAR" = 2025) DATES
LEFT OUTER JOIN "EXCHANGERATES" EXCH_DT ON EXCH_DT."FROM_CURR" = FT_CURR."FROM_CURR"
AND EXCH_DT."TO_CURR" = FT_CURR."TO_CURR"
AND EXCH_DT."EXCH_DATE" = DATES."DATE_SQL"
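-- Lateral subquery below: for each currency pair and date, fetch the most
-- recent exchange rate strictly before DATES."DATE_SQL" (the CROSS APPLY
-- pattern from MS SQL, expressed as LEFT JOIN LATERAL ... LIMIT 1).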
LEFT JOIN LATERAL (SELECT "FROM_CURR", "TO_CURR", "EXCH_DATE", "EXCH_RATE"
FROM "EXCHANGERATES"
WHERE "FROM_CURR" = "FT_CURR"."FROM_CURR"
AND "TO_CURR" = "FT_CURR"."TO_CURR"
AND "EXCH_DATE" < "DATES"."DATE_SQL"
ORDER BY "EXCH_DATE" DESC
LIMIT 1) EXCH_PRVDT ON EXCH_PRVDT."FROM_CURR" = FT_CURR."FROM_CURR"
AND EXCH_PRVDT."TO_CURR" = FT_CURR."TO_CURR"
AND EXCH_PRVDT."EXCH_DATE" < "DATES"."DATE_SQL"</code></pre><P>The first error is: <SPAN>‘SELECT’ is a reserved word - write ‘![SELECT]’ instead if you want to use it as name</SPAN></P><P>So I'm not sure DS allows this LATERAL, and if not, what would be an alternative?</P>2026-01-20T11:44:44.409000+01:00https://community.sap.com/t5/technology-q-a/sap-datasphere-abap-connection-issues-post-bdc-re-wire-upgrade-process/qaq-p/14311909SAP Datasphere ABAP connection issues post BDC re-wire upgrade process2026-01-21T14:09:43.743000+01:00JayConnellhttps://community.sap.com/t5/user/viewprofilepage/user-id/95367<P>I had a standalone SAP Datasphere tenant where the source connection was ABAP and the target was ADLS2. I had over 100 replication flows designed and running for over 6 months with no issues. We decided to convert our Datasphere tenant to BDC. The BDC cockpit was enabled, the re-wire upgrade process was initiated, and all showed green and ready. The next day the replications all failed. We checked the ABAP system and SAP Cloud Connector (SCC); no issues were found. We found that when validating the ABAP connection, it was now failing with a CPIC error message. It is the exact same error message you get if you do a connection test in sm59 from an ABAP instance where the target computer name is wrong or cannot be resolved. In the error message the name it had was wrong and not the virtual name defined in SCC. We confirmed the configuration in SCC and restarted the services. The subaccount tunnel connection showed good, and the internal connection check was also good. We have SCC installed on the same ABAP server. In Datasphere we removed and re-added the SCC Location in the Data Source Connection config and still had no luck. We turned on SCC tracing, including tunnel tracing, and nothing was captured in the tunnel trace when doing a validation; it was blank. 
We created a new Datasphere free tenant in the exact same area, and using the exact same SCC and ABAP the connection validation worked; we could see data in the tunnel trace for that subaccount. We also upgraded SCC and SAPJVM to the latest versions, and still nothing. This went on for almost 2 weeks, while SAP support was checking it out as well across the re-wire, DI, and SCC teams. We worked on troubleshooting this every day, and then one day the connection just started working again. We changed nothing in SCC or ABAP. The DI team thinks the issue is with SCC, and we are waiting on their input. I don't see how, since we proved the SCC works with the other Datasphere tenant. My assumption is that the SAP Java Connector in DI is used to perform an RFC_PING, and it's failing without ever reaching the SCC tunnel because of the wrong name being used. We just don't know where or how it's getting the wrong name.</P><P>Has anyone had any similar issues like this, or was it just my bad luck? This was just our QA tenant, and we are now concerned about doing our production upgrade to BDC, because we can't afford to be down.</P>2026-01-21T14:09:43.743000+01:00https://community.sap.com/t5/technology-q-a/sap-business-data-cloud-landscape-q-prod/qaq-p/14311974SAP Business Data Cloud landscape (Q/Prod)2026-01-21T15:47:28.508000+01:00benhaddouhttps://community.sap.com/t5/user/viewprofilepage/user-id/891003<DIV>Hi everyone,</DIV><DIV> </DIV><DIV>I have a question about the BDC landscape. 
Do you typically run separate BDC instances for Q and PROD (BDC‑Q and BDC‑PROD), or is there a single BDC where Datasphere (DSP) and SAP Analytics Cloud (SAC) each have Q and PROD tenants?</DIV><DIV> </DIV><DIV><DIV>Thanks in advance!</DIV></DIV>2026-01-21T15:47:28.508000+01:00https://community.sap.com/t5/technology-q-a/how-to-configure-cloud-alm-to-monitor-datasphere-or-sac-connections/qaq-p/14312130How to configure Cloud ALM to monitor Datasphere or SAC Connections?2026-01-21T19:10:07.449000+01:002023_kurthttps://community.sap.com/t5/user/viewprofilepage/user-id/892582<P>Greetings experts. I have searched the interwebs, and my Frenemy AI tells me Cloud ALM can monitor Datasphere or SAC connections, but the instructions listed only show an overview of connecting Datasphere or SAC to Cloud ALM. <BR /><BR />Can this be done? If so, how can it be done? I have not found this info anywhere on SAP websites.</P>2026-01-21T19:10:07.449000+01:00https://community.sap.com/t5/technology-q-a/cds-extraction-container-missing-in-replication-flow/qaq-p/14312532CDS_EXTRACTION container Missing in Replication flow2026-01-22T11:06:02.384000+01:00kadiyala_venkatesulu5e7f8fhttps://community.sap.com/t5/user/viewprofilepage/user-id/2276187<P>Hi Experts,</P><P>The CDS_EXTRACTION container is missing in a replication flow when choosing the source container; only the CDS_VIEW container is visible, so I am unable to select standard CDS views. 
</P>2026-01-22T11:06:02.384000+01:00https://community.sap.com/t5/technology-q-a/changing-the-default-row-limit-in-sap-datasphere-data-preview/qaq-p/14314366Changing the Default Row Limit in SAP Datasphere Data Preview2026-01-26T01:32:07.051000+01:00vijihttps://community.sap.com/t5/user/viewprofilepage/user-id/1474656<P>I am using an <STRONG>Analytic Model</STRONG> in SAP Datasphere and want to increase the default row limit in the Data Preview from <STRONG>1,000 to 5,000 records</STRONG>.</P><P>Currently, when I open the preview settings, I only see options for <STRONG>filters</STRONG>.</P><P>In SAP HANA Studio, there is a global setting under <CODE>Preferences > SAP HANA > Runtime > Result View</CODE> to change this. Is there a similar setting in the SAP Datasphere web interface to change the default row count for all previews, or is the 1,000-row limit fixed?</P>2026-01-26T01:32:07.051000+01:00https://community.sap.com/t5/technology-q-a/sap-datasphere-sql-views-as-sql-script-automatic-generation-of-columns-in/qaq-p/14315801SAP Datasphere - SQL Views as SQL Script – Automatic generation of columns in the left side pane2026-01-27T14:53:08.701000+01:00anna_sch-25https://community.sap.com/t5/user/viewprofilepage/user-id/2277169<P>Hi everyone,</P><P>I am working with <STRONG>SQL Views using SQLScript (Table Function)</STRONG> in <STRONG>SAP Datasphere</STRONG> and noticed that the <STRONG>output columns must always be created manually</STRONG> in the <EM>Model Properties → Columns</EM> section before the SQLScript can be validated or deployed.</P><P>Even when the SQLScript contains a simple <CODE>RETURN SELECT ...</CODE> statement, Datasphere does <STRONG>not</STRONG> automatically generate the column definitions based on the SQL result. 
As far as I can see, this is different from standard SQL views, where column inference works automatically.</P><H3 id="toc-hId-1917776484"><STRONG>My questions:</STRONG></H3><OL><LI><STRONG>Is there any way to automatically generate the output columns</STRONG> in the left side panel when using SQLScript (Table Function) in SAP Datasphere?</LI><LI>If not:<UL><LI>Is this a current technical limitation?</LI><LI>Is there any roadmap indicating that automatic column generation will be supported in the future?</LI></UL></LI></OL><H3 id="toc-hId-1721262979"><STRONG>Context / What I tried:</STRONG></H3><UL><LI>SQLScript with <CODE>RETURN SELECT ...</CODE></LI><LI>SQLScript with table variables</LI><LI>Validation to trigger auto‑generation</LI><LI>Switching between SQL Standard and SQLScript</LI></UL><P>In all cases, Datasphere requires me to manually maintain the business/technical names and data types in the Model Properties panel before validation succeeds.</P><P>Example SQLScript:</P><pre class="lia-code-sample language-sql"><code>return
with "cte_customers" as (
select
"id",
"name"
from "customers"
)
, "cte_customers_ranked" as (
select
"id",
"name",
1 as "rank_number"
from "cte_customers"
)
select
"id",
"name",
"rank_number"
from "cte_customers_ranked";</code></pre><H3 id="toc-hId-1524749474"><STRONG>Expected behavior (wish):</STRONG></H3><P>Some way to auto‑populate the column list based on the SQLScript output (similar to SQL Standard views).</P><H3 id="toc-hId-1328235969"><STRONG>Actual behavior:</STRONG></H3><P>Columns must be fully defined manually, otherwise validation fails.</P><P>If anyone has additional insight, workaround ideas, or official SAP statements regarding this behavior, I would really appreciate it.</P><P>Thank you.</P><P>Kind regards,<BR />Anna</P><P> </P><DIV>Clicking <STRONG>Validate</STRONG> for the following statement <STRONG>automatically generates</STRONG> the column <EM>ID</EM>:</DIV><pre class="lia-code-sample language-sql"><code>tab =
select
"id" AS ID
from "customers";
return
select ID from :tab;</code></pre><P> </P><DIV>Clicking <STRONG>Validate</STRONG> for the following statement <STRONG>does not automatically generate</STRONG> the column <EM>ID</EM>:</DIV><pre class="lia-code-sample language-sql"><code>tab = with "cte1" (ID) as (
select
"id" AS ID
from "customers"
)
select ID from "cte1";
return
select ID from :tab;</code></pre><P> </P><DIV>Is there an issue with the <CODE>WITH</CODE> clause?</DIV>2026-01-27T14:53:08.701000+01:00https://community.sap.com/t5/technology-q-a/minimum-cu-for-datasphere-with-bdc-connector/qaq-p/14319393Minimum CU for Datasphere with BDC Connector?2026-02-02T02:10:04.402000+01:00k_taniohttps://community.sap.com/t5/user/viewprofilepage/user-id/115792<P><SPAN>Hello Community,</SPAN></P><DIV><P>We are currently evaluating the architecture for performing <STRONG>zero-copy data sharing from SAP Data Products to Databricks</STRONG> using the <STRONG>SAP BDC Connector</STRONG>.</P><P>According to my understanding, when using the BDC Connector for zero-copy data sharing, an <STRONG>SAP Datasphere tenant is still required</STRONG>. I would like to confirm whether the <STRONG>minimum Capacity Units (CU)</STRONG> required for Datasphere in this scenario follows the same logic described in the following blog post, which explains the <STRONG>minimum CU needed to activate the Object Store</STRONG>:</P><DIV><A href="https://community.sap.com/t5/technology-blog-posts-by-sap/what-is-the-minimum-capacity-units-cu-to-activate-object-store-in-sap/ba-p/14181475" target="_blank">https://community.sap.com/t5/technology-blog-posts-by-sap/what-is-the-minimum-capacity-units-cu-to-activate-object-store-in-sap/ba-p/14181475</A></DIV><DIV><P>When performing zero-copy sharing of SAP Data Products to Databricks via the SAP BDC Connector, is the minimum CU requirement for Datasphere simply the minimum CU needed to activate the Object Store, as described in the blog above?</P><P>Thank you!</P></DIV></DIV>2026-02-02T02:10:04.402000+01:00https://community.sap.com/t5/technology-q-a/extract-a-measure-depending-on-a-max-measure/qaq-p/14319968Extract a measure depending on a max measure2026-02-02T15:29:00.243000+01:00FrancescoPetrocellihttps://community.sap.com/t5/user/viewprofilepage/user-id/1579799<P>I have data built this way.</P><P>I want to create another 
measure that extracts the measure "tempo lordo lavorato" where the measure "qta colli" is at its maximum, aggregating on "centro di lavoro" and "codice materiale".</P><P>For example, in this case the max "qta colli" for "centro" = M155 and "codice materiale" = 150.IG074.01 means the new measure will be 330.</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="FrancescoPetrocelli_0-1770042449257.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/368147i683ED7809CBF4D5A/image-size/medium?v=v2&px=400" role="button" title="FrancescoPetrocelli_0-1770042449257.png" alt="FrancescoPetrocelli_0-1770042449257.png" /></span></P><P>For the combination "centro" = M155 and "codice materiale" = 1C2P.F0424014 it is 480.</P><P>The report will be queried filtering on the period of analysis.<BR />_________________________________________________________<BR />These are the data, and the problem with the partition is that the value is calculated day by day. 
For example, if the user analyses this period, they should obtain 397</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="FrancescoPetrocelli_0-1770195748246.png" style="width: 999px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/368835iDCBE6AA3E29642ED/image-size/large?v=v2&px=999" role="button" title="FrancescoPetrocelli_0-1770195748246.png" alt="FrancescoPetrocelli_0-1770195748246.png" /></span></P><P>For the quantity I can make a formula in the analytical model this way:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="FrancescoPetrocelli_1-1770195838019.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/368839iAA0F876EBA8C9FB8/image-size/medium?v=v2&px=400" role="button" title="FrancescoPetrocelli_1-1770195838019.png" alt="FrancescoPetrocelli_1-1770195838019.png" /></span></P><P>With Exception Aggregation dimensions: WORKCENTER – MATERIAL – ID_RILEVAZIONE</P><P> </P><P>And I obtain the "BEST" value (for the days selected)</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="FrancescoPetrocelli_2-1770195874658.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/368840i721075D94DE87208/image-size/medium?v=v2&px=400" role="button" title="FrancescoPetrocelli_2-1770195874658.png" alt="FrancescoPetrocelli_2-1770195874658.png" /></span></P><P>But I'm wondering how I can obtain the related value, 397.</P>2026-02-02T15:29:00.243000+01:00https://community.sap.com/t5/technology-q-a/bw-modernization-shift-infoproviders-from-bw-pce-to-dsp/qaq-p/14320497BW Modernization - Shift Infoproviders from BW PCE to DSP2026-02-03T10:30:27.612000+01:00fcasuhttps://community.sap.com/t5/user/viewprofilepage/user-id/816634<P>What will happen with Infoproviders shifted to DSP as Local Table Files after shutting down BW PCE? 
</P>2026-02-03T10:30:27.612000+01:00https://community.sap.com/t5/technology-q-a/replication-flow-not-being-selected-as-analytic-model-dependency/qaq-p/14321648Replication Flow not being selected as Analytic model Dependency2026-02-04T16:01:33.460000+01:00NicolasRivas1991https://community.sap.com/t5/user/viewprofilepage/user-id/1976185<P>Hello everyone,</P><P>I was creating a new package for an analytic model and noticed that neither the replication flows that replicate the tables used in the pipeline nor the task chain that executes those RFs is being selected as a dependency of that analytic model. </P><P>Why is this happening?</P><P>Does this happen with every asset?</P><P>How can I achieve this?</P><P>Thank you.</P>2026-02-04T16:01:33.460000+01:00https://community.sap.com/t5/technology-q-a/read-data-from-ec-pca-3/qaq-p/14323096Read data from EC_PCA_32026-02-06T15:26:06.233000+01:00Davide_Ricotti45https://community.sap.com/t5/user/viewprofilepage/user-id/2279530<P>Hi everyone,</P><P>we are trying to read data from the extractor <SPAN>EC_PCA_3, but every time we get an error saying that the field DABRZ cannot be empty. 
To be honest, it is right, because the field is empty in the GLPCA table.</SPAN></P><P><SPAN>The problem is that we have SAP R3 and we are limited in what we can do,</SPAN><SPAN> and if we try to load the GLPCA table directly, we get an error because it's too big.</SPAN></P><P><SPAN>We wanted to use an extractor to benefit from delta loads and improve performance.</SPAN></P><P><SPAN>Do you have any suggestions?</SPAN></P><P> </P>2026-02-06T15:26:06.233000+01:00https://community.sap.com/t5/technology-q-a/connection-between-s-4hana-on-prem-and-datasphere/qaq-p/14326840Connection Between S/4HANA ON PREM and Datasphere2026-02-12T06:54:33.941000+01:00ege_yelkovan20https://community.sap.com/t5/user/viewprofilepage/user-id/2211713<P>Hi SAP Gurus,</P><P>We are facing an error like the screenshot below while trying to connect S/4HANA on-prem and Datasphere.</P><P> </P><P>Our current S/4HANA version is 1709 and we don't plan to upgrade it in the near future. </P><P>Do you have any suggestions for this issue?</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ege_yelkovan20_2-1770875192646.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371768i46BAFCD83367E39F/image-size/medium?v=v2&px=400" role="button" title="ege_yelkovan20_2-1770875192646.png" alt="ege_yelkovan20_2-1770875192646.png" /></span></P><P> </P><P>BR,</P><P>Ege</P><P> </P>2026-02-12T06:54:33.941000+01:00https://community.sap.com/t5/technology-q-a/custom-data-product/qaq-p/14326886Custom Data Product2026-02-12T08:23:20.746000+01:00Venky999https://community.sap.com/t5/user/viewprofilepage/user-id/155467<DIV>Hi, can I create a custom data product on top of a view that was created in a customer-managed space with storage type “SAP HANA Database (Disk and In‑Memory)”? When I try to create a data product, the system only shows spaces that use object storage. 
Why is this happening?</DIV><DIV> </DIV><DIV><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Venky999_0-1770880921040.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/371789i9B3B18E2614AC8F4/image-size/medium?v=v2&px=400" role="button" title="Venky999_0-1770880921040.png" alt="Venky999_0-1770880921040.png" /></span><P><a href="https://community.sap.com/t5/c-khhcw49343/SAP+Business+Data+Cloud/pd-p/73554900100700003531" class="lia-product-mention" data-product="1249-1">SAP Business Data Cloud</a> <a href="https://community.sap.com/t5/c-khhcw49343/SAP+Datasphere/pd-p/73555000100800002141" class="lia-product-mention" data-product="16-1">SAP Datasphere</a> </P></DIV>2026-02-12T08:23:20.746000+01:00https://community.sap.com/t5/technology-q-a/is-any-cds-view-available-in-s4-pce-which-is-compatible-with-0uc-sales/qaq-p/14327416Is any CDS View available in S4 PCE which is compatible with 0UC_SALES_STATS_02 data source2026-02-12T21:00:15.887000+01:00askrajeevhttps://community.sap.com/t5/user/viewprofilepage/user-id/2004276<P>Hi All,</P><P>Is there any CDS view available in S/4HANA PCE which has the same structure as the <SPAN> </SPAN>0UC_SALES_STATS_02 data source, for SAP Datasphere? 
</P><P> </P><P>Thanks,</P><P>Raj</P>2026-02-12T21:00:15.887000+01:00https://community.sap.com/t5/technology-q-a/programmatically-retrieving-lineage-source-object-info-for-analytical/qaq-p/14327625Programmatically retrieving lineage/source object info for Analytical Models in SAP Datasphere2026-02-13T07:54:19.363000+01:00Shubhamniftyhttps://community.sap.com/t5/user/viewprofilepage/user-id/1579646<P>Hi Experts,</P><P>I am trying to understand whether it is possible to programmatically retrieve lineage metadata for an <STRONG>Analytical Model</STRONG> in SAP Datasphere.</P><P><STRONG>Use case:</STRONG><BR />Within the Datasphere UI, the <EM>Impact & Lineage</EM> view allows us to trace a field from an Analytical Model back to its underlying objects (views and source tables). I am currently developing a small web application intended to list AMs automatically along with a few technical details (for example: fields and their originating tables/views).</P><P>Using the Consumption OData API, I am already able to extract the list of fields of an Analytical Model:</P><pre class="lia-code-sample language-markup"><code>GET https://<tenant_url>/api/v1/datasphere/consumption/analytical/<space_id>/<asset_id>/$metadata</code></pre><P>This provides the semantic fields exposed by the model.</P><P><BR />However, my requirement is to enrich this information with the <STRONG>technical source information</STRONG>, specifically:</P><P>Analytical Model Field → Datasphere View Column → Underlying Table (or Remote Table)</P><P>Example:</P><P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Shubhamnifty_0-1770964938832.png" style="width: 400px;"><img src="https://community.sap.com/t5/image/serverpage/image-id/372104iD7786724D0480BE9/image-size/medium?v=v2&px=400" role="button" title="Shubhamnifty_0-1770964938832.png" alt="Shubhamnifty_0-1770964938832.png" /></span></P><P>I would like to know:</P><OL><LI><P>Is there a REST API available to retrieve lineage 
information similar to what is shown in the Datasphere <EM>Impact & Lineage</EM> diagram?</P></LI><LI><P>If not, are there any supported system metadata tables/views (for example via SQL access to the space) that can be used to obtain this mapping?</P></LI></OL><P>Any guidance or recommended approach would be greatly appreciated.</P><P>Thank you!</P><P><a href="https://community.sap.com/t5/c-khhcw49343/SAP+Datasphere/pd-p/73555000100800002141" class="lia-product-mention" data-product="16-1">SAP Datasphere</a> <a href="https://community.sap.com/t5/c-khhcw49343/API/pd-p/b31da0dd-f79a-4a1e-988c-af0755c2d184" class="lia-product-mention" data-product="123-1">API</a> </P>2026-02-13T07:54:19.363000+01:00https://community.sap.com/t5/technology-q-a/unable-to-filter-by-a-temporal-attr-in-a-datasphere-s-am/qaq-p/14327930Unable to filter by a temporal ATTR in a DataSphere's AM2026-02-13T12:37:12.460000+01:00mlan_cuatrecasas-1https://community.sap.com/t5/user/viewprofilepage/user-id/2281201<P>Hi everyone,</P><P>Has anyone experienced issues when filtering by an employee’s temporal attribute in an Analytical Model in Datasphere?</P><P>The issue occurs regardless of whether a reference date is used. It seems that the Analytical Model first applies the reference date filter and then the prompt filter, which prevents me from adding Year and Month prompts.</P><P>Has anyone faced a similar behavior or found a workaround?</P><P>Thanks in advance!</P>2026-02-13T12:37:12.460000+01:00