From REST to Datasphere: A CAP-based Integration Approach

We recently wrote a blog about how we integrate data from a REST service into SAP Datasphere. For that, we deployed a small service on SAP BTP that used the Open SQL schema and FastAPI to write the data directly into Datasphere. That service could also be triggered via a REST endpoint directly from SAP Datasphere (ZPARTNER Link, SAP Community Link).

In this blog, we choose a different setup to retrieve data and make it consumable in SAP Datasphere.

We created a CAP application on SAP BTP.

This CAP service connects to the SCIM API of SAP Datasphere, extracts users and roles, and makes the data consumable for analytics. It uses the following BTP services: 

  • HDI Container 
    To persist our data from the REST calls. The HDI container is deployed to the HANA Cloud instance of our Datasphere tenant, so the data can be consumed directly. 
  • BTP Credential Store 
    We store the credentials for the SCIM API in the BTP Credential Store and retrieve them securely in our CAP application.
  • XSUAA Authorization and Trust Management 
    To secure our REST endpoints and enforce token-based access control. 
  • Cloud Foundry Space 
    Provides the runtime environment for deploying the CAP application. 

Use Case 

In our use case, we extract user information from SAP Datasphere’s own SCIM API. 

We retrieve: 

  • the list of users, 
  • all available roles, and 
  • the assignments between users and roles. 

This can be helpful for reporting on the number of users in a tenant and their corresponding roles. However, the same setup can be reused to extract and process any other REST API data into SAP Datasphere. 

Credential Store 

We create a dedicated namespace in the Credential Store and store four secrets for the SCIM API authentication flow: 

  • dsp-scim-token-url 
  • dsp-scim-base-url 
  • dsp-scim-client-id 
  • dsp-scim-client-secret 

The CAP application retrieves these secrets at runtime, ensuring no credentials are hardcoded. 
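
As an illustration, here is a minimal sketch of how such a secret could be read from the Credential Store REST API in a CAP (Node.js) application. The service label "credstore", the namespace "dsp-scim", and the endpoint path are assumptions based on this setup and SAP's Credential Store REST API; adjust them to your own binding.

```js
// Sketch: read one secret from the BTP Credential Store REST API (Node.js 18+).
// The service label "credstore" and the namespace "dsp-scim" are assumptions.
const xsenv = require('@sap/xsenv');

async function readCredential(name) {
  const { credstore } = xsenv.getServices({ credstore: { label: 'credstore' } });
  const basicAuth = Buffer.from(`${credstore.username}:${credstore.password}`).toString('base64');

  const res = await fetch(`${credstore.url}/password?name=${encodeURIComponent(name)}`, {
    headers: {
      Authorization: `Basic ${basicAuth}`,
      'sapcp-credstore-namespace': 'dsp-scim',
    },
  });
  if (!res.ok) throw new Error(`Credential Store returned ${res.status}`);

  // With payload encryption enabled (the default), the response body is a JWE
  // that must be decrypted with the private key from the binding (for example
  // using the "jose" library) before the credential value can be read.
  return res.text();
}

// e.g. const clientId = await readCredential('dsp-scim-client-id');
```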

XSUAA Authorization and Trust Management Service 

We create a dedicated XSUAA service instance, which provides a token URL and client credentials. This instance is bound to our CAP application so that every incoming request is authenticated and authorized.
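
Inside the CAP service, the incoming JWT can then be checked in a handler. A minimal sketch, assuming a scope/role named "Sync" that would be defined in xs-security.json:

```js
// Sketch: reject requests that do not carry a valid JWT with the expected role.
// The role name "Sync" is an assumption and would be defined in xs-security.json.
const cds = require('@sap/cds');

module.exports = cds.service.impl(function () {
  this.before('*', (req) => {
    // req.user is populated by CAP's XSUAA-based authentication middleware
    if (!req.user || !req.user.is('Sync')) {
      req.reject(403, 'Missing required scope');
    }
  });
});
```

The same requirement can also be expressed declaratively with a @requires annotation on the service definition.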

HDI Container 

We create an HDI container where the database artifacts (tables and associations) are deployed. This container is provisioned in our SAP Datasphere tenant, making the synchronized data directly available for consumption. 

CAP Application 

The heart of the solution is the CAP service DSPUsers.  

It exposes two kinds of endpoints: 

Virtual Views (live from SCIM) 

  • GET /data/UsersVH → List of users with profile info. 
  • GET /data/RolesVH → Roles with aggregated user counts. 
  • GET /data/UserRolesVH → Which users have which roles. 

These views are always fresh, since they query SCIM directly. 
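
To illustrate the idea, below is a rough sketch of how such a live view can be served in a CAP (Node.js) handler. The placeholder environment variables stand in for the Credential Store entries described above, and the /Users path and field mapping are assumptions about the SCIM response.

```js
// Sketch: serve a virtual view directly from the SCIM API on every READ.
// Placeholder environment variables stand in for the Credential Store entries;
// the /Users path and the mapped fields are assumptions.
const cds = require('@sap/cds');

async function getScimToken() {
  // OAuth client-credentials flow against the Datasphere OAuth token endpoint
  const res = await fetch(process.env.DSP_SCIM_TOKEN_URL, {
    method: 'POST',
    headers: {
      Authorization: 'Basic ' + Buffer.from(
        `${process.env.DSP_SCIM_CLIENT_ID}:${process.env.DSP_SCIM_CLIENT_SECRET}`
      ).toString('base64'),
      'Content-Type': 'application/x-www-form-urlencoded',
    },
    body: 'grant_type=client_credentials',
  });
  return (await res.json()).access_token;
}

module.exports = cds.service.impl(function () {
  this.on('READ', 'UsersVH', async () => {
    const token = await getScimToken();
    const res = await fetch(`${process.env.DSP_SCIM_BASE_URL}/Users`, {
      headers: { Authorization: `Bearer ${token}` },
    });
    const body = await res.json();
    // Map the SCIM user resources to the flat structure of the UsersVH view
    return (body.Resources || []).map((u) => ({
      ID: u.id,
      userName: u.userName,
      displayName: u.displayName,
    }));
  });
});
```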

Persistent Entities (stored in HDI) 

  • GET /data/Users → Users persisted in HANA. 
  • GET /data/Roles → Roles persisted in HANA. 
  • GET /data/UserRoles → User-to-role assignments persisted in HANA. 

Actions (to sync SCIM → HDI) 

  • POST /data/SyncUsersVHToUsers → Upserts SCIM users into the Users table. 
  • POST /data/SyncRolesFromSCIM → Stores aggregated roles in the Roles table. 
  • POST /data/SyncUserRolesFromSCIM → Refreshes both Roles and UserRoles. 

This design lets you both query SCIM data live and persist snapshots into HANA for modeling in Datasphere. 
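
The sync actions can reuse the live views and write their result into the persisted entities. A rough sketch of the first action, with entity and field names following the endpoints listed above:

```js
// Sketch: upsert the live SCIM users into the persisted Users table in the
// HDI container. Entity names follow the endpoints listed above.
const cds = require('@sap/cds');
const { UPSERT } = cds.ql;

module.exports = cds.service.impl(function () {
  const { Users } = this.entities;

  this.on('SyncUsersVHToUsers', async () => {
    // Read the live SCIM data through the virtual view handler shown earlier
    const scimUsers = await this.read('UsersVH');
    // UPSERT inserts new users and updates already persisted ones by key
    await UPSERT.into(Users).entries(scimUsers);
    return `Synchronized ${scimUsers.length} users`;
  });
});
```

The actions for roles and user-role assignments follow the same pattern against the Roles and UserRoles entities.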

The complete CAP service code is available here: Link to GitHub Repository

SAP Datasphere 

Once deployed, the HDI container is automatically integrated with our SAP Datasphere tenant. This allows: 

  • Consuming the Users, Roles, and UserRoles tables directly in Datasphere, 
  • Building reporting scenarios on top of user and role data, 
  • Joining with other business data to enrich analytics. 

To consume the entities, we add the HDI container to a specific space. We can also create a generic HTTP connection, which allows us to call the sync endpoints from a Task Chain.
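
In Datasphere, the connection and the Task Chain step are configured declaratively, but the call effectively boils down to two HTTP requests, sketched here with placeholder hosts and credentials:

```js
// Sketch of what a Task Chain step effectively does: fetch a JWT from the XSUAA
// instance bound to the CAP app, then call one of the protected sync actions.
// Host names and credentials are placeholders.
async function triggerSync() {
  // 1. Client-credentials token from the bound XSUAA instance
  const tokenRes = await fetch('https://<subdomain>.authentication.<region>.hana.ondemand.com/oauth/token', {
    method: 'POST',
    headers: {
      Authorization: 'Basic ' + Buffer.from('<xsuaa-client-id>:<xsuaa-client-secret>').toString('base64'),
      'Content-Type': 'application/x-www-form-urlencoded',
    },
    body: 'grant_type=client_credentials',
  });
  const { access_token } = await tokenRes.json();

  // 2. Call the protected sync endpoint of the CAP service
  const syncRes = await fetch('https://<cap-app-host>/data/SyncUsersVHToUsers', {
    method: 'POST',
    headers: { Authorization: `Bearer ${access_token}`, 'Content-Type': 'application/json' },
    body: '{}',
  });
  console.log('Sync returned', syncRes.status);
}
```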

Conclusion 

The integration of a CAP-based service into SAP Datasphere showcases how SAP BTP services (HDI Container, Credential Store, XSUAA, Cloud Foundry) can be combined to securely retrieve and persist data from external REST APIs. By consuming SAP Datasphere’s own SCIM API, we demonstrated how to expose both live views for real-time insights and persistent tables for analytical modeling, all within a lightweight and extensible architecture. 

Looking ahead, this framework can be reused for any REST API integration into SAP Datasphere, opening the door to richer reporting scenarios, advanced data modeling, and seamless orchestration with Task Chains. With this foundation, organizations gain a simple but powerful pattern for bridging external services with Datasphere while maintaining enterprise-grade security and governance. 

Christian has been working as a Business Intelligence Consultant at ZPARTNER since 2020. He specializes in advanced SAP BW/4HANA, HANA native modeling, and SAP Datasphere solutions. Christian has a strong technical background in ABAP programming, AMDP transformations, and Python-based data processing. He has worked on projects in various industries and has developed solutions for complex data extraction, integration, and modeling.
