SAP's Leading Enterprise Solutions.
Hadoop's Awesome Cluster Power.
HBuilder brings them together.

Make Big Data accessible to SAP users.

SAP, Hadoop and Spark. Even Better Together!

Deliver Big Data Applications using your existing SAP team and skills.

Combining rich functionality with high-speed analytics at scale.

SAP is a premium platform: it provides rich functionality, covering all aspects of an enterprise. Hadoop and Spark are engineered to exploit the power of clusters of commodity hardware. Moving long-running computation from SAP to Hadoop/Spark frees up valuable hardware resources. It also allows you to store virtually unlimited source data and keep a complete history of your business transactions.

HBuilder helps you connect both of these worlds.

HBuilder's SAP module lets you easily connect both worlds and develop SAP-style native Hadoop/Spark applications. Your SAP team can apply its existing skills and become productive right away. This translates directly into higher ROI, and it gives developers a huge motivational boost: they see that their skills are valuable for next-generation applications.

• Connect with SAP ERP, CRM, and BW

• Works with classic SAP, HANA optional

• Load Data from SAP directly into Spark RDDs

• Import DDIC Metadata Into BDA Builder

• Re-Use Your ABAP skills with BDA Builder

Hadoop and Spark are not just for Web Companies.

Manufacturing, Automotive, Retail, Financial Services, Utilities, and Many Other Industries
Need to Master Big Data Technologies, Too!

Connect Systems

HANA & Non-HANA/Classic Systems

HBuilder helps you connect HANA to Hadoop/Spark. This is a perfect combination: it delivers the benefits of a high-value in-memory transactional database together with cost-efficient scale-out clusters.

Clients with a classic set-up, i.e. a traditional, disk-based RDBMS instead of HANA, need not worry: those systems are fully supported. Integrating them gives you the opportunity to improve the performance of your transactional SAP system dramatically, at very low cost.

ERP, CRM, and Business Warehouse

Move long-running batch jobs from SAP ERP and CRM to Hadoop and have Spark execute them on a cluster of computers. This not only speeds up the batch computation massively, but also frees up hardware resources on the SAP transaction system.
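To give a rough picture of such an offloaded job, here is a minimal Spark sketch in Scala: it totals order net values per customer from replicated order headers. The Parquet path and the choice of VBAK fields (KUNNR, NETWR) are assumptions for illustration, not HBuilder specifics.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.sum

    // Illustrative Spark batch job: aggregate sales order net value per customer.
    // Assumes order headers (SAP table VBAK) have already been replicated to the
    // cluster as Parquet files; path and layout are hypothetical.
    object OrderValuePerCustomer {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("order-value-per-customer").getOrCreate()

        val vbak = spark.read.parquet("/data/sap/vbak")    // replicated order headers

        vbak.groupBy("KUNNR")                              // customer number
          .agg(sum("NETWR").alias("total_net_value"))      // net order value
          .write.mode("overwrite").parquet("/data/reports/order_value_per_customer")

        spark.stop()
      }
    }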

Using Hadoop/Spark as the ETL layer for SAP BW reduces latency and lets you store raw source data at the same time: time-consuming steps such as delta detection or lookups are dramatically accelerated, while cluster hardware provides virtually unlimited storage space.
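As an illustration of what these ETL steps look like on the cluster, the following Scala sketch performs a simple snapshot-based delta detection and a broadcast lookup in Spark. The paths, table layouts, and the join key are assumed; HBuilder's own ETL pipeline is not shown here.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.broadcast

    // Sketch of two typical BW-style ETL steps on the cluster:
    // 1. delta detection: keep only rows that changed since the last extract
    // 2. lookup: enrich the delta with attributes from a small context table
    // Paths, layouts and the join key (MATNR) are assumptions for illustration.
    object EtlSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("etl-sketch").getOrCreate()

        val previous = spark.read.parquet("/data/extracts/mseg/previous")
        val current  = spark.read.parquet("/data/extracts/mseg/current")
        val material = spark.read.parquet("/data/master/mara")   // small lookup table

        // Delta detection: rows present in the current extract but not in the previous one.
        val delta = current.except(previous)

        // Lookup: broadcast the small master-data table to every executor.
        val enriched = delta.join(broadcast(material), Seq("MATNR"), "left")

        enriched.write.mode("append").parquet("/data/staging/mseg_delta")
        spark.stop()
      }
    }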

Share Data

Replicate

HBuilder contains adapters to read from and write to SAP systems. Data can be copied and transferred between the two systems. All this requires is a simple setup via an intuitive GUI. Data compression can be enabled with the click of a button, which also shortens transfer times. Beyond a full copy, with some additional customization it is also possible to select only a subset of a source table, such as particular time slices.

HBuilder automatically detects and extracts the data changes ("deltas") and stores only this data. This ensures that the complete history of a data set is stored in a highly efficient manner.

On-Demand

Sometimes a particular data set does not have to be persisted in the cluster. In that case, the table can be kept at its origin, i.e. the SAP database. It is only read and transferred to the cluster when needed. This is often the case for smaller data sets that are accessed to look up information and provide context.

The HBuilder GUI makes this step extremely easy: just drag the table into your application, and HBuilder reads the data from SAP and loads it directly into a Spark DataFrame, where it is available for any kind of cluster processing.
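Conceptually, the result is an ordinary Spark DataFrame. The sketch below approximates the on-demand read with a plain JDBC query against the SAP database, since HBuilder's own connector API is not shown here; the connection URL, schema name, and credentials handling are placeholders.

    import org.apache.spark.sql.SparkSession

    // Sketch of pulling a small SAP table on demand into a Spark DataFrame.
    // A plain JDBC read against the underlying SAP database stands in for the
    // idea; connection details and the schema name (SAPSR3) are placeholders.
    object OnDemandLookup {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("on-demand-lookup").getOrCreate()

        val companyCodes = spark.read
          .format("jdbc")
          .option("url", "jdbc:sap://saphost:30015")    // placeholder connection URL
          .option("dbtable", "SAPSR3.T001")             // company codes: a small lookup table
          .option("user", sys.env("SAP_DB_USER"))
          .option("password", sys.env("SAP_DB_PASSWORD"))
          .load()

        // The table is now an ordinary DataFrame and can join against cluster data.
        companyCodes.show(5)
        spark.stop()
      }
    }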

Integrate Development

Copy Metadata

In order to develop applications, you need the corresponding metadata for a dataset. HBuilder lets you import data elements, structures, and tables from SAP's Data Dictionary (DDIC).

This makes development much more productive. Programmers can work with the same abstractions that they are already familiar with and combine them with new data sources to build next-generation applications.
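The productivity gain is easiest to see in code. The sketch below uses a hand-written case class as a stand-in for a definition derived from DDIC metadata, covering a small subset of the material master (MARA); which fields a real import would generate is an assumption here.

    import org.apache.spark.sql.SparkSession

    // Hand-written stand-in for a definition generated from DDIC metadata:
    // a small subset of the material master (MARA). Field names mirror the
    // DDIC fields; the exact generated shape is an assumption.
    final case class Mara(
      MATNR: String,  // material number
      MTART: String,  // material type
      MATKL: String   // material group
    )

    object TypedMaterialExample {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("typed-material").getOrCreate()
        import spark.implicits._

        // Replicated material master, read with the familiar DDIC field names.
        val materials = spark.read.parquet("/data/sap/mara")
          .select("MATNR", "MTART", "MATKL")
          .as[Mara]

        // Familiar abstractions, combined with ordinary cluster processing.
        materials.filter(_.MTART == "FERT").show(5)   // finished goods, for example
        spark.stop()
      }
    }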

Re-use ABAP skills.

Virtually all SAP experts know and work with ABAP, SAP's programming language. This means that a large resource pool is available, and a huge set of problems has already been codified in ABAP.

HBuilder includes PL/B, which implements core elements of ABAP. This allows SAP experts to become productive immediately and write native Hadoop/Spark code with the skills they have acquired over many years.

Contact

Austrasse 40

9494 Vaduz

Phone: +423 79 46067

Email: info@wireframe.li