SAP HANA Tiering: The Pressures of Data Growth
Fri, 01 May 2020 14:24:20 -0000
“Data growth is accelerating!” Quotes like this appear frequently in studies, papers, and blogs. You will not find more data growth quotes in this blog article, however, because I think it is more interesting to look at this topic from a data management investment policy perspective. A data management investment policy has benefits similar to those of a corporate travel policy: the goal is to efficiently maximize the bottom line. In most customer accounts, the SAP HANA licensing investment happens early, and the business must maximize its benefits over the long term. Initial cost is not the sole driver, because volume, variety, veracity, and velocity are all considerations when a company is shaping a strategy. Evaluating a data management investment policy for the long term can be complex. SAP provides Native Storage Extension (NSE) to address both cost pressures and the intelligent placement of data over time.
SAP HANA with NSE can tier data across different storage media based on the age of the data. The NSE data management policy categorizes data into three classes: hot, warm, and cold. This blog post focuses on the hot and warm data tiers. Hot data can use both volatile and nonvolatile memory, as follows:
- DRAM: DRAM is the fastest storage medium. DRAM is volatile, however, meaning the data must be reloaded into memory when the database or server restarts.
- PMEM: Persistent memory (PMEM) is faster than SSD storage but not as fast as DRAM. PMEM is nonvolatile, meaning the data does not have to be reloaded into memory when the database or server restarts. PMEM is used for the SAP HANA Fast Restart option.
SAP HANA on-premises Native Storage Extension
If you are interested in learning more about maximizing your data management investment strategy, the SAP HANA TDI on Dell EMC PowerEdge Servers Validation Guide provides detailed configurations. The hot data tier delivers the fastest performance but is also the most expensive tier (hardware + SAP HANA licensing + annual support). Maximizing the performance-to-cost trade-off of hot data placement requires two considerations:
- Keep actively used data that is critical to the business in the hot data tier
- Migrate less frequently used data out of the hot data tier to contain costs
The first consideration is a performance guideline for when the responsiveness of the database and applications is at a premium for the business. Data that is less frequently used can be placed in the warm tier to minimize the impact on queries against the hot data tier. Another benefit relates to SAP HANA restarts. For example, planned maintenance events involving Linux operating system or SAP HANA database updates can require a restart. An SAP HANA system with NSE can hold less data in the hot tier than the same system without NSE, thus improving restart times.
NOTE: SAP HANA also has a Fast Restart Option that uses file storage to speed up restarts. Fast Restart leverages PMEM to accelerate file storage access, significantly reducing the start time of the database. SAP HANA Fast Restart applies to scenarios in which only SAP HANA is restarted and not the operating system.
The second consideration is a cost-avoidance guideline that protects existing investments while limiting new ones. Success could be defined as a strategy in which performance increases with each new server generation while SAP HANA costs remain constant because the size of the hot data tier stays the same. The business impact is a continual increase in performance combined with efficiently maximizing the bottom line.
The warm data tier is for less frequently accessed data that only occasionally resides in SAP HANA memory. If kept in memory, warm data drives up costs, mainly through the additional licensing needed to increase the memory size. To mitigate the impact of rapid data growth, maximize the usage of the warm data tier. Keep in mind that the warm data tier is limited to four times the size of the hot data tier. For example, a hot data tier of 1 TB means the warm data tier can be up to 4 TB. The warm data tier also cannot exceed 10 TB; the 10 TB maximum is a first-release restriction.
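The two sizing limits above can be expressed as a quick calculation. This is a minimal sketch: the 4x ratio and the 10 TB cap are the first-release limits described above, and the function name and inputs are illustrative, not SAP terminology.

```python
def max_warm_tier_tb(hot_tb: float) -> float:
    """Maximum NSE warm data tier size under the first-release limits:
    up to 4x the hot data tier, capped at 10 TB overall."""
    return min(4 * hot_tb, 10.0)

print(max_warm_tier_tb(1))  # 1 TB hot tier -> up to 4 TB warm
print(max_warm_tier_tb(4))  # 4 TB hot tier -> capped at 10 TB, not 16
```

Note how the cap, not the 4x ratio, becomes the binding limit once the hot tier exceeds 2.5 TB.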
Data in the warm tier is transactionally consistent with the hot data tier. This means that the warm data tier must be protected in conjunction with the hot data tier so that the entire database backup is consistent. While the hot and warm data tiers are transactionally consistent, they differ in how data is loaded into memory. The hot data tier is “column loadable,” meaning entire columnar tables are loaded into memory. In contrast, the warm data tier is “page loadable,” meaning granular portions of data are loaded into memory on demand, so a table can be only partially in memory. The page-loadable design has two key benefits for the warm data tier:
- It does not significantly impact the memory footprint.
- It does not significantly impact the start time of the database.
Use of the warm data tier depends on the SAP HANA NSE buffer cache. This buffer cache is enabled by default and is initially sized at 10 percent of SAP HANA memory (for the sizing reference, see the SAP HANA Administration Guide for SAP HANA Platform 2.0 SPS 04). SAP recommends that the NSE buffer cache be at least 12.5 percent of the total size of the warm data tier. You can modify the NSE buffer cache size by using the ALTER SYSTEM ALTER CONFIGURATION command.
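Those two sizing guidelines can be combined into a quick check. This is a hedged sketch: the 10 percent default and the 12.5 percent recommended minimum come from the guidance above, and the function and variable names are illustrative.

```python
def nse_buffer_cache_check(hana_memory_gb: float, warm_tier_gb: float):
    """Compare the default NSE buffer cache (10% of SAP HANA memory)
    with the recommended minimum (at least 12.5% of the warm data tier)."""
    default_cache = 0.10 * hana_memory_gb
    recommended_min = 0.125 * warm_tier_gb
    return default_cache, recommended_min

# Example: 2 TB of SAP HANA memory with a 4 TB warm data tier.
default, minimum = nse_buffer_cache_check(hana_memory_gb=2048, warm_tier_gb=4096)
print(f"default {default:.1f} GB vs recommended minimum {minimum:.1f} GB")
# default 204.8 GB vs recommended minimum 512.0 GB
```

In a case like this, where the default cache falls short of the recommended minimum, the cache would be enlarged with the ALTER SYSTEM ALTER CONFIGURATION command mentioned above.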
Warm Data Tier
Overall, use of the warm data tier enables customers to balance fast performance with increased data volumes while minimizing cost, thus achieving greater value. Customers have the flexibility to design an amazingly fast warm data tier with storage I/O latencies measured in microseconds, narrowing the difference between the hot and warm data tiers in terms of performance.
Dell Technologies has a team of experienced SAP HANA experts that can assist with accurate sizing and design of an infrastructure solution for your databases. Our goal is to work closely with you to maximize the value of NSE and create an extremely fast warm data tier that narrows the performance gap with the hot data tier. Your Dell Technologies representative can put you in contact with one of our SAP HANA experts.
Related Blog Posts
On-prem vs. Public Cloud: Understanding the true cost of running steady-state workloads in the public cloud
Thu, 15 Apr 2021 13:19:54 -0000
Infrastructure is not the only thing that may be more expensive in the public cloud than on-premises.
Organizations have found value in their public cloud investments. However, we have also heard stories of workloads or use cases that were moved to the public cloud and led to buyer's remorse because public cloud infrastructure costs proved demonstrably higher than on-premises costs. The initial allure of the public cloud was quickly offset not only by higher costs but also by hidden costs that many organizations do not recognize until they have made significant investments, including in systems integrators and consultants.
Software licenses for many popular business applications are not cost-optimized for public cloud environments. In addition to potentially high cloud infrastructure costs, organizations may also face software true-ups in cases of resource inefficiency. One way to solve this is to abandon traditional software licenses in favor of a software subscription with a metering method that is more conducive to cloud environments. However, this approach can raise the overall software TCO: the license conversion exercise may be confusing because metering methods typically differ, usage forecasting may be less predictable, and there are hidden depreciation costs of the old licensing that can further drive the TCO upside down.
To better understand the costs of running a steady-state workload such as SAP in the public cloud vs. on-premises, Dell Technologies commissioned Krystallize Technologies to conduct an evaluation. Many studies have already examined private vs. public cloud costs, as well as data egress charges, cloud support, and staff retraining. Instead, Krystallize focused on comparing hardware costs depreciated over 3 years to the 3-year cost of public cloud IaaS services, even though the useful on-premises server lifespan is 5 years.
The full report: Krystallize Technologies SAP HANA PowerEdge Whitepaper
While the infrastructure cost savings between public cloud and on-premises are significant enough to attract any CFO's attention, consider also license efficiency. Many customers have invested significant CAPEX dollars in enterprise applications that are licensed by the core and, as a result, need to be deployed efficiently.
Krystallize Technologies found that with a similar amount of memory, a 72 vCPU (36 core) PowerEdge server and a 96 vCPU compute instance from a public cloud provider accomplished the same amount of work.1 This finding suggests that comparable workloads licensed by the core, such as SQL Server, Oracle, and in some cases SAP, may require 33% to 167% more software licenses to run in the public cloud.2
Take, for instance, an Oracle Enterprise Edition workload deployed on the PowerEdge server: it would cost $521,550 USD for acquisition and first-year support. The same workload deployed on the 96 vCPU instance could require spending up to 2.6 times as much, $1,390,800 USD, for acquisition and first-year support!3
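The arithmetic behind these figures can be reproduced from the numbers quoted in the comparison. This is a hedged sanity check: mapping the 33% and 167% endpoints to a vCPU-vs-vCPU and a vCPU-vs-core comparison, respectively, is our reading of the range, not something the report spells out.

```python
# Figures quoted in the comparison above.
onprem_vcpus, onprem_cores = 72, 36
cloud_vcpus = 96
onprem_oracle_cost = 521_550   # PowerEdge: acquisition + first-year support
cloud_oracle_cost = 1_390_800  # 96 vCPU cloud instance, same workload

# One plausible reading of the 33%-167% license spread:
# 33% more if both sides are licensed per vCPU,
# 167% more if cloud vCPUs are compared against on-premises cores.
print(round((cloud_vcpus / onprem_vcpus - 1) * 100))  # 33
print(round((cloud_vcpus / onprem_cores - 1) * 100))  # 167

# Cost multiple for the Oracle Enterprise Edition example
# (the post rounds this down to 2.6).
print(round(cloud_oracle_cost / onprem_oracle_cost, 1))  # 2.7
```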
Public cloud may be the right choice for organizations looking for temporary capacity, whether for an unexpected computing demand or an unproven DevOps environment. However, to borrow a phrase I heard a colleague use: it is cheap to fail in the cloud but expensive to succeed.
When an organization's workload has moved past the POC phase into production, that workload provides a valuable outcome or business function and must run 24/7/365. Always-on workloads with a steady-state performance profile, such as SAP or Oracle, no longer benefit from the elasticity of the public cloud.
Organizations that have invested considerable capital in software licenses may find that those licenses cannot be efficiently deployed in the public cloud. The good news is that there are multiple avenues, such as hybrid cloud, for organizations to explore to maintain applications and associated licensing where costs stay low, while using the public cloud for the right workloads. Customers seeking a pay-for-use model can achieve it on-premises with Dell Technologies on Demand and will soon benefit from Dell Technologies Project Apex as well.
To Learn More
- Krystallize Technologies SAP HANA PowerEdge Whitepaper
- Krystallize Technologies SAP HANA PowerEdge Infographic
- Dell Technologies Solutions for SAP
- Dell Technologies PowerEdge Servers
- Dell Technologies on Demand
- Dell Technologies Project Apex
- Dell Technologies Competitive Advantage
1 Based on the Krystallize Technologies whitepaper commissioned by Dell Technologies, “Krystallize Technologies SAP HANA PowerEdge Whitepaper”, comparing cost-performance of SAP HANA running a benchmark load on a physical and cloud provider environment over a 3-year period, Nov. 2019. Actual results may vary.
2 Based on Dell analysis, January 2021, comparing the number of licenses required to run the same SAP and Oracle load with similar resources on-premises on a Dell EMC PowerEdge R940 vs a Cloud Service Provider. Actual results will vary based on configuration, environment, and other variable factors.
3 Based on Dell cost analysis, January 2021, comparing the number of licenses required to run the same workload with similar resources on-premises on a Dell EMC PowerEdge server vs a Cloud Service Provider. Oracle Enterprise Edition licensing costs are in US dollars obtained from a publicly available price list. Actual costs will vary based on configuration, environment, and other variable factors.
Deploying SAP HANA at the Rugged Edge
Mon, 14 Dec 2020 18:38:19 -0000
SAP HANA is one of those demanding workloads that has been steadfastly contained within the clean walls of the core data center. However, this time last year VxRail began to chip away at those walls with SAP HANA certified configurations based on the all-flash VxRail P570F workhorse and the powerful quad-socket all-NVMe P580N. This year, we are once again in a giving mood and are bringing SAP HANA to the edge. Let us explain.
Dell Technologies defines the edge this way: “The edge exists wherever the digital world & physical world intersect. It’s where data is securely collected, generated and processed to create new value.” This broad definition extends the edge from the data center to oil rigs and mobile response centers for natural disasters. It is a bold claim not only to provide compute and storage in such harsh locations, but also to provide enough of both to meet the strict and demanding needs of SAP HANA, all without consuming much physical space. After all, it is at the edge where space is at a premium.
Shrinking the amount of rack space needed was the easier of the two challenges, and our 1U E Series, E for Everything (or should that be E for Everywhere?), was a perfect fit. The all-flash E560F and all-NVMe E560N, both of which can be enhanced with Intel Optane persistent memory, can be thought of as the shorter siblings of our 2U P570F, packing a powerful punch with equivalent processor and memory configurations.
While the E Series fits the bill for space-constrained environments, it still needs data-center-like conditions. That is not the case for the durable D560F, the tough little champion that joined the VxRail family in June of this year and is now the only SAP HANA certified ruggedized platform in the industry. Weighing in at a lightweight 28 lbs. with a short depth of 20 inches, this little fighter will run all day at 45°C, with eight-hour sprints of up to 55°C, all while enduring shock, vibration, dust, humidity, and EMI, because this little box is MIL-STD-810G and DNV-GL Maritime certified. In other words, if your holiday plans involve a trip to hot sandy beaches, a ship cruise through a hurricane, or an alpine climb, and you’re bringing SAP HANA with you (we promise we won’t ask why), then the durable D560F is for you.
The best presents sometimes come in small packages. So, we won’t belabor this blog with anything more than to announce that these two little gems, the E560 and the D560, are now SAP HANA certified.
Author: David Glynn, Sr. Principal Engineer, VxRail Tech Marketing
360° View: VxRail D Series: The Toughest VxRail Yet
Video: HCI Computing at the Edge
Solution brief: Taking HCI to the Edge: Rugged Efficiency for Federal Teams
SAP Certification link: Certified and Supported SAP HANA® Hardware Directory