SAP HANA Big Data Strategy and Engineering Workshop (5 days)

We consider your key business drivers and objectives and help you identify use cases for Big Data Analytics. We evaluate your technical infrastructure and help design the right Big Data platform for those use cases.

We evaluate architectural options, including Hadoop data lakes, to design an Enterprise Data Warehouse architecture that meets your business goals. We also recommend the visualization option for Big Data Analytics that best fits the needs of your user base.

  • Identify business drivers, objectives and use cases for Big Data Analytics
  • Evaluate Centralized Data Architectures, Logical Data Warehouse across data marts, Big Data Hadoop Architectures, and Real Time Analytics to meet requirements for the identified use cases
  • Assess your current infrastructure
  • Architect and design an Enterprise Data Warehouse driven by SAP HANA-Hadoop integration
  • Recommend data temperature management approaches with SAP HANA and Hadoop
  • Recommend a data visualization approach

Proof of Concept: SAP HANA-Hadoop Integration Common Use Cases

We help you build a Big Data lab within your current technology infrastructure, dedicated to experimenting with use cases and approaches to Big Data Analytics. The result is a “Proof of Concept” that shows how SAP HANA and Hadoop can be integrated into your existing enterprise architecture to deliver Big Data use cases.

We work with you to prove out two common uses of SAP HANA-Hadoop integration: Data Temperature Management and Data Virtualization.
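As an illustration of what the first pattern involves, a data temperature management flow with dynamic tiering might look like the sketch below. All object names, columns, and the two-year threshold are hypothetical placeholders, and the statements represent the general pattern rather than a tested customer configuration.

```python
# Sketch of data temperature management: ageing "cold" rows out of
# SAP HANA's in-memory store into a dynamic-tiering extended-storage
# table. Schema, table, and column names are illustrative placeholders.

# DDL for a warm-store table kept in extended storage (on disk)
# rather than in memory.
CREATE_WARM_TABLE = """\
CREATE TABLE "POC"."SALES_WARM" (
    ORDER_ID   INTEGER,
    ORDER_DATE DATE,
    AMOUNT     DECIMAL(15, 2)
) USING EXTENDED STORAGE
"""

# Copy rows older than two years into the warm store ...
AGE_OUT_ROWS = """\
INSERT INTO "POC"."SALES_WARM"
SELECT * FROM "POC"."SALES_HOT"
WHERE ORDER_DATE < ADD_YEARS(CURRENT_DATE, -2)
"""

# ... then prune them from the hot in-memory table to reclaim memory.
PRUNE_HOT_ROWS = """\
DELETE FROM "POC"."SALES_HOT"
WHERE ORDER_DATE < ADD_YEARS(CURRENT_DATE, -2)
"""

def age_out(conn):
    """Run the tiering statements on an open DB-API connection to
    SAP HANA (for example, one created with SAP's hdbcli package)."""
    cur = conn.cursor()
    cur.execute(AGE_OUT_ROWS)
    cur.execute(PRUNE_HOT_ROWS)
    conn.commit()
```

In practice, the PoC would weigh this hand-rolled movement against the Data Lifecycle Manager functionality evaluated below.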

The Proof of Concept scope includes:

  • Establish PoC Objectives, Scope and Success Criteria
  • Identify a Scenario and Dataset for the PoC
  • Evaluate, Recommend and Build the Big Data Infrastructure
    • Archius Cloud or Customer On-Premise/Cloud platform
    • SAP HANA Installation
    • Apache Hadoop – Hortonworks or Cloudera
    • SDA/SDI configuration
    • Integrate SAP BusinessObjects Visualization and BI tools to the Big Data platform
  • Implement use case 1 – SAP HANA-Hadoop Data Temperature Management using Dynamic Tiering
    • Data Pruning and Space Management
    • Data retrieval and Query Performance
    • Data Lifecycle Manager functionality
    • Storage and Memory Management Capabilities
    • Administration – Security, Backup and Failover
    • Integration of Hadoop to the Dynamic Tiering Solution – Capabilities and Limitations
  • Implement use case 2 – Data Virtualization using SAP HANA-Hadoop
    • Data Provisioning using Smart Data Access and Smart Data Integration
    • Combine SAP HANA tables/views with Hadoop data (Hive and Spark tables) to create virtual data models
    • Single reporting platform using SAP BusinessObjects on SAP HANA and Hadoop
  • Provide a clear understanding of the total cost of ownership of the Big Data platform and its data lakes
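To make the second use case concrete, the sketch below shows the general shape of exposing a Hive table through Smart Data Access and joining it with an in-memory SAP HANA table in a single federated query. All source, schema, table, and credential values are hypothetical placeholders; the actual remote source definition depends on the customer's Hadoop distribution and drivers.

```python
# Sketch of data virtualization via Smart Data Access (SDA).
# Every identifier here (HIVE_SRC, the POC schema, table names,
# credentials) is a placeholder, not a real configuration.

# Register the Hadoop cluster as a remote source (Hive via ODBC).
CREATE_REMOTE_SOURCE = """\
CREATE REMOTE SOURCE "HIVE_SRC" ADAPTER "hiveodbc"
CONFIGURATION 'DSN=HiveDSN'
WITH CREDENTIAL TYPE 'PASSWORD' USING 'user=hive;password=secret'
"""

# Expose a Hive table as a HANA virtual table -- no data is copied.
CREATE_VIRTUAL_TABLE = """\
CREATE VIRTUAL TABLE "POC"."VT_WEB_LOGS"
AT "HIVE_SRC"."HIVE"."default"."web_logs"
"""

# A federated query joining in-memory HANA data with Hadoop data;
# a virtual model like this can then back a single SAP BusinessObjects
# reporting layer across both platforms.
FEDERATED_QUERY = """\
SELECT s.ORDER_ID, s.AMOUNT, l.PAGE_VIEWS
FROM "POC"."SALES" AS s
JOIN "POC"."VT_WEB_LOGS" AS l ON l.ORDER_ID = s.ORDER_ID
"""

def build_virtual_model(conn):
    """Execute the setup statements on an open DB-API connection
    to SAP HANA."""
    cur = conn.cursor()
    cur.execute(CREATE_REMOTE_SOURCE)
    cur.execute(CREATE_VIRTUAL_TABLE)
```

The same virtual-table pattern applies to Spark tables via Smart Data Integration adapters, which is one of the capability comparisons the PoC is designed to surface.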