Breaking New Ground: NCUM Data Modeling in SAP Datasphere (DSP) vs. BW/4HANA – Part 3: NCUM in SAP Datasphere (DSP)

In Part 2, we explored how SAP BW/4HANA handles non-cumulative (NCUM) key figures through inventory-enabled aDSOs, leveraging built-in features like record types, time references, and automated delta handling to calculate accurate stock levels. The system processes movements through inbound, active, and reference tables, ensuring efficient and reliable inventory reporting.

Now in Part 3, we shift to SAP Datasphere to replicate this logic in a cloud-native context. Unlike BW/4HANA, Datasphere requires explicit modeling of record types and flow measures but offers greater flexibility in data integration and semantic design. We’ll walk through a step-by-step example—from uploading data and defining semantic types to creating an analytic model with non-cumulative behavior.

With that context, let’s bring the example into SAP Datasphere and see how it compares to SAP BW/4HANA.

Setting Up Our NCUM Example in SAP Datasphere

To get started, switch to your SAP Datasphere space and navigate to the Data Builder.

1. Click on the upload symbol and add a CSV file.

2. Configure the file as needed.

3. In the window that appears, you can perform transformations such as:

  • Renaming columns
  • Defining keys
  • Adding new columns
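
For orientation, the sketch below builds a small stock-movement file of the kind this example assumes. The column names (PLANT, MATERIAL, CALDAY, RECORDTP, QUANTITY, UNIT) and the sample values are illustrative choices for this blog, not names required by SAP Datasphere.

```python
# Illustrative only: creates a small stock-movement CSV like the one uploaded above.
# Column names and values are assumptions for this example, not Datasphere defaults.
import pandas as pd

movements = pd.DataFrame(
    [
        # PLANT, MATERIAL, CALDAY,      RECORDTP, QUANTITY, UNIT
        ("1000", "MAT01", "2024-01-01", 1,         100,     "EA"),  # reference point (assumed opening stock)
        ("1000", "MAT01", "2024-01-05", 0,          20,     "EA"),  # goods receipt (delta)
        ("1000", "MAT01", "2024-01-10", 0,         -15,     "EA"),  # goods issue (delta)
        ("1000", "MAT02", "2024-01-01", 1,          50,     "EA"),  # reference point
        ("1000", "MAT02", "2024-01-07", 0,         -10,     "EA"),  # goods issue (delta)
    ],
    columns=["PLANT", "MATERIAL", "CALDAY", "RECORDTP", "QUANTITY", "UNIT"],
)
movements.to_csv("stock_movements.csv", index=False)
```

The RECORDTP values already anticipate the requirements discussed in the next section.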

Requirements for Non-Cumulative Key Figures in SAP Datasphere

Non-cumulative key figures can only be created if the following conditions are met:

1. Flow Measure Configuration

  • The key figure acting as the flow measure must be of type "Fact".

2. Record Type Field

  • A field must store the Record Type of the measure.
  • It must have a data type of "Integer".
  • It can only contain the values:
    • 0 (Delta),
    • 1 (Reference Point) or
    • 2 (Delta Included in Reference Point)

3. Time Dimension

  • A date field must exist in the fact source.
  • It must have a data type of "Date".
  • It must be associated with a time dimension.
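
Before maintaining the semantics in the Data Builder, it can help to verify that the uploaded file already meets these three conditions. The following sketch checks them against the illustrative sample file from above; the column names are still assumptions of this example.

```python
# Sanity check of the three non-cumulative prerequisites against the sample file.
# QUANTITY, RECORDTP, and CALDAY are the assumed column names from this example.
import pandas as pd

df = pd.read_csv("stock_movements.csv")

# 1. Flow measure: the movement quantity must be numeric.
assert pd.api.types.is_numeric_dtype(df["QUANTITY"]), "QUANTITY must be numeric"

# 2. Record type: integers restricted to 0 (Delta), 1 (Reference Point),
#    2 (Delta Included in Reference Point).
assert df["RECORDTP"].isin([0, 1, 2]).all(), "RECORDTP may only contain 0, 1 or 2"

# 3. Time dimension: the calendar day must parse cleanly as a date.
pd.to_datetime(df["CALDAY"], errors="raise")

print("All non-cumulative prerequisites look satisfied.")
```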


Open the newly created local table and perform the following steps:

  • Change the Semantic Usage from "Relational Dataset" to "Fact".
  • Go to Attributes and create a Unit column (optional but recommended).
  • Change its Semantic Type to "Unit of Measure" and set its constant value to "EA" (Each).
  • Drag and drop the column containing the movement key figure into the „Measures“ section.
  • Set its Semantic Type to „Quantity with Unit“.
  • Choose the „Unit Column“ that holds the unit information.

Click on “+” near Associations and:

  • Add a time dimension.
  • Map the time dimension accordingly.
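
The semantic settings themselves are maintained in the Data Builder UI. If the source file carries no unit column at all, it can alternatively be added before the upload; a minimal sketch, assuming the file and column names from above:

```python
# Optional: add a constant unit column before uploading, mirroring the constant
# value "EA" set on the Unit of Measure attribute in Datasphere above.
import pandas as pd

df = pd.read_csv("stock_movements.csv")
if "UNIT" not in df.columns:
    df["UNIT"] = "EA"  # Each; assumed unit for this example
df.to_csv("stock_movements.csv", index=False)
```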

Creating an Analytic Model

Now that all requirements are met, let’s create an Analytic Model:

1. Drag and drop the newly created table into the working area of the Analytic Model.

2. Confirm the prompt to add associations.

3. Click on “+” next to Measures and select “Non-Cumulative Measure”.

4. Provide a Description and a Technical Name.

5. Assign:
    ○ The dimension that holds the Record Type data.
    ○ The Time Dimension required for exception aggregation.

6. (Optional) Define a time frame for non-cumulative reporting.
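
To make the behavior of the new measure tangible, here is a simplified Python re-implementation of the calculation as I read it from Part 2: start from the reference point, add type-0 deltas up to the reporting date, and roll back type-2 deltas dated after it. It mimics the logic for the illustrative sample data only; it is not how the Analytic Model is implemented internally.

```python
# Simplified sketch of non-cumulative stock per plant, material, and calendar day.
# Assumptions: one reference point (RECORDTP = 1) per plant/material; type-0 deltas
# are not yet contained in it, type-2 deltas already are.
import pandas as pd

df = pd.read_csv("stock_movements.csv", parse_dates=["CALDAY"])

def stock_on(group: pd.DataFrame, day: pd.Timestamp) -> float:
    ref = group.loc[group["RECORDTP"] == 1, "QUANTITY"].sum()  # reference point
    deltas = group.loc[(group["RECORDTP"] == 0) & (group["CALDAY"] <= day), "QUANTITY"].sum()
    rollback = group.loc[(group["RECORDTP"] == 2) & (group["CALDAY"] > day), "QUANTITY"].sum()
    return ref + deltas - rollback

days = pd.date_range(df["CALDAY"].min(), df["CALDAY"].max(), freq="D")
result = pd.DataFrame(
    [
        (plant, material, day.date(), stock_on(group, day))
        for (plant, material), group in df.groupby(["PLANT", "MATERIAL"])
        for day in days
    ],
    columns=["PLANT", "MATERIAL", "CALDAY", "STOCK"],
)
print(result.to_string(index=False))
```

Drilling down on Plant, Material, and Calendar Date in the Preview (next section) should yield the same per-day stock levels for this sample data.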

Finalizing & Validating

  • Click "Preview" in the top-right corner.
  • Drill down on Plant, Material, and Calendar Date.
  • Apply filters if necessary.
  • If everything is configured correctly, the Preview shows the correct stock level for each plant, material, and calendar date over time.

Summary of Part 3

In this part, we demonstrated how to:

  • Upload and structure stock movement data in SAP Datasphere.
  • Configure semantic properties such as units, measures, and time dimensions.
  • Define and assign record types (0, 1, 2) to enable non-cumulative behavior.
  • Create an Analytic Model that calculates stock levels accurately across time.

This approach replicates the logic of BW/4HANA’s inventory handling—adapted for the flexibility and openness of a cloud-native data environment.

Outlook Part 4

Coming up in Part 4, we’ll take this model one step further by integrating delta loads from an S/4HANA source system—laying out the technical steps required for end-to-end inventory data flow in a real-world enterprise scenario.
