
Blogs

Short articles related to Dell Technologies solutions for Microsoft SQL Server


Tags: SQL Server, Unity, Red Hat, Microsoft, Kubernetes, OpenShift, Big Data Clusters

Dell Technologies partners with Microsoft and Red Hat running SQL Server Big Data Clusters on OpenShift

Doug Bernhardt and Steve Wanless

Mon, 06 Jul 2020 18:39:56 -0000


Introduced with Microsoft SQL Server 2019, SQL Server Big Data Clusters allow customers to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes. Complete information on Big Data Clusters can be found in the Microsoft documentation. Many Dell Technologies customers use Red Hat OpenShift Container Platform as their Kubernetes platform of choice, and with this and many other solutions Dell Technologies is leading the way on OpenShift applications.


In Cumulative Update 5 (CU5) of Microsoft SQL Server 2019 Big Data Clusters (BDC), OpenShift 4.3+ is supported as a platform for Big Data Clusters. This has been a highly anticipated release: customers not only recognize the power of BDC and OpenShift, but also look to Dell Technologies, Microsoft, and Red Hat for support in running mission-critical workloads. Dell Technologies has been working with Microsoft and Red Hat to develop architecture guidance and best practices for deploying and running BDC on OpenShift.


For this effort we used the Databricks TPC-DS Spark SQL kit to populate a dataset and run a workload on the OpenShift 4.3 BDC cluster to test the various architecture components of the solution. The TPC-DS benchmark is a popular database benchmark used to evaluate performance in decision support and big data environments.


Based on our testing, we were able to achieve linear scaling of our workload while fully exercising our OpenShift cluster, which consisted of 12 Dell EMC PowerEdge R640 servers and a single Dell EMC Unity 880F storage array.

Figure: Total run time of all queries for the 10 TB, 20 TB, and 30 TB datasets


As a result of this testing, a fully detailed OpenShift reference architecture and a best practices paper for running Big Data Clusters on Dell EMC Unity storage are under way and will be published soon. More information on Dell Technologies solutions for OpenShift can be found on our OpenShift Info Hub.  Additional information on Dell Technologies for SQL Server can be found on our Microsoft SQL webpage. 


Read Full Blog
Tags: SQL Server, Microsoft, security

Database security methodologies of SQL Server

Anil Papisetty

Mon, 01 Jun 2020 23:35:25 -0000


In general, security touches every aspect and activity of an information system. The subject of security is vast, and we need to understand that security can never be perfect. Every organization has a unique way of dealing with security based on its requirements. In this blog, I describe database security models and briefly review SQL Server security principles.

A few definitions:

  • Database: A collection of information stored in a computer
  • Security: Freedom from danger
  • Database security: The mechanisms that protect the database against intentional or accidental threats, including malicious attempts to steal (view) or modify data

Database security models

Today’s organizations rely on database systems as the key data management technology for a large variety of tasks ranging from regular business operations to critical decision making. The information in the databases is used, shared, and accessed by various users. It needs to be protected and managed because any changes to the database can affect it or other databases. 

The main role of a security system is to preserve integrity of an operational system by enforcing a security policy that is defined by a security model. These security models are the basic theoretical tools to start with when developing a security system. 

Database security models include the following elements:

  • Subject: Individual who performs some activity on the database
  • Object: Database unit that requires authorization before it can be manipulated
  • Access mode/action: Any activity that might be performed on an object by a subject 
  • Authorization: Specification of access modes for each subject on each object
  • Administrative rights: Specification of who has system administration rights and what responsibilities administrators have
  • Policies: Enterprise-wide accepted security rules
  • Constraint: A more specific rule regarding an aspect of an object and action 

Database security approaches

A typical DBMS supports basic approaches of data security—discretionary control, mandatory control, and role-based access control. 

Discretionary control: A given user typically has different access rights, also known as privileges, for different objects. For discretionary access control, we need a language to support the definition of rights—for example, SQL. 

Mandatory control: Each data object is labeled with a certain classification level, and a given object can be accessed only by a user with a sufficient clearance level. Mandatory access control is applicable to the databases in which data has a rather static or rigid classification structure—for example, military or government environments.

In both discretionary and mandatory control, the unit of data to be protected can range from the entire database down to a single, specific tuple (row).
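As an illustration of protection at the tuple level, SQL Server can filter individual rows with row-level security. The following is a minimal T-SQL sketch, not guidance from this blog: the Security schema, the dbo.Orders table, and the Region column are hypothetical, and it assumes the application places a Region value in SESSION_CONTEXT when it connects.

```sql
-- Hypothetical example: restrict each session to rows for its own region.
CREATE SCHEMA Security;
GO

-- Inline table-valued predicate function evaluated for every row of dbo.Orders.
CREATE FUNCTION Security.fn_RegionPredicate (@Region AS NVARCHAR(50))
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
    SELECT 1 AS allowed
    WHERE @Region = CAST(SESSION_CONTEXT(N'Region') AS NVARCHAR(50));
GO

-- Bind the predicate to the table so reads are filtered per session (tuple-level control).
CREATE SECURITY POLICY Security.OrdersFilter
    ADD FILTER PREDICATE Security.fn_RegionPredicate(Region) ON dbo.Orders
    WITH (STATE = ON);
```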

Role-based access control (RBAC): Permissions are associated with roles, and users are made members of appropriate roles. A role brings together a set of users on one side and a set of permissions on the other, whereas user groups are typically defined as a set of users only.

Role-based security provides the flexibility to define permissions at a high level of granularity in Microsoft SQL, thus greatly reducing the attack surface area of the database system.

RBAC mechanisms are a flexible alternative to mandatory access control (MAC) and discretionary access control (DAC).

RBAC terminology:

  • Objects: Any system resource, such as a file, printer, terminal, or database record
  • Operations: An executable image of a program, which upon invocation performs some function for the user
  • Permissions: An approval to perform an operation on one or more RBAC-protected objects
  • Role: A job function within the context of an organization, with some associated semantics regarding the authority and responsibility conferred on the user assigned to the role

For more information, see Database Security Models — A Case Study

Note: Access control mechanisms regulate who can access which data. The need for such mechanisms follows from the variety of actors that work with a database system—for example, DBAs, application administrators, programmers, and users. Based on actor characteristics, access control mechanisms can be divided into three categories: DAC, RBAC, and MAC.

Principles of SQL Server security

A SQL Server instance contains a hierarchical collection of entities, starting with the server. Each server contains multiple databases, and each database contains a collection of securable objects. Every SQL Server securable has associated permissions that can be granted to a principal, which is an individual, group, or process granted access to SQL Server. 

For each security principal, you can grant rights that allow that principal to access or modify a set of the securables, which are the objects that make up the database and server environment. They can include anything from functions to database users to endpoints. SQL Server scopes the objects hierarchically at the server, database, and schema levels:

  • Server-level securables include databases as well as objects such as logins, server roles, and availability groups.
  • Database-level securables include schemas as well as objects such as database users, database roles, and full-text catalogs.
  • Schema-level securables include objects such as tables, views, functions, and stored procedures.

SQL Server security is built on two core processes, authentication and authorization (a brief T-SQL sketch follows this list):

  • Authentication: Authentication is the SQL Server login process by which a principal requests access by submitting credentials that the server evaluates. Authentication establishes the identity of the user or process being authenticated and helps ensure that only authorized users with valid credentials can access the database server. SQL Server supports two authentication modes, Windows authentication mode and mixed mode:
    • Windows authentication is often referred to as integrated security because this SQL Server security model is tightly integrated with Windows.
    • Mixed mode supports authentication both by Windows and by SQL Server, using usernames and passwords.
  • Authorization: Authorization is the process of determining which securable resources a principal can access and which operations are allowed for those resources. Microsoft SQL-based technologies support this principle by providing mechanisms to define granular object-level permissions and by implementing role-based security. Granting permissions to roles rather than to individual users simplifies security administration:
    • It is a best practice to use server-level roles for managing server-level access and security, and database roles for managing database-level access.
    • Role-based security provides the flexibility to define permissions at a high level of granularity in Microsoft SQL, thus greatly reducing the attack surface area of the database system.
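To make this concrete, here is a minimal T-SQL sketch of the pattern described above: a Windows-authenticated login, a user-defined server role for server-level access, and a database role that holds the database-level permissions. It is an illustrative sketch rather than guidance from this blog; all of the login, database, schema, and role names are hypothetical.

```sql
-- Hypothetical names throughout; assumes the AD account and SalesDB exist.
-- Windows authentication: the login maps to an Active Directory account.
CREATE LOGIN [CONTOSO\sales_app] FROM WINDOWS;

-- Server-level access managed through a user-defined server role.
CREATE SERVER ROLE monitoring_operators;
GRANT VIEW SERVER STATE TO monitoring_operators;
ALTER SERVER ROLE monitoring_operators ADD MEMBER [CONTOSO\sales_app];
GO

-- Database-level access managed through a database role:
-- grant permissions to the role, then add members to the role.
USE SalesDB;
GO
CREATE USER [CONTOSO\sales_app] FOR LOGIN [CONTOSO\sales_app];
CREATE ROLE sales_readers;
GRANT SELECT ON SCHEMA::Sales TO sales_readers;
ALTER ROLE sales_readers ADD MEMBER [CONTOSO\sales_app];
```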

Here are a few additional recommended best practices for SQL Server authentication:

  • Use Windows authentication.
    • Enables centralized management of SQL Server principals via Active Directory
    • Uses the Kerberos security protocol to authenticate users
    • Supports integrated password policy enforcement, including complexity validation for strong passwords, password expiration, and account lockout
  • Use separate accounts to authenticate users and applications.
    • Enables limiting the permissions granted to users and applications
    • Reduces the risks of malicious activity such as SQL injection attacks
  • Use contained database users (see the sketch after this list).
    • Isolates the user or application account to a single database
    • Improves performance, as contained database users authenticate directly to the database without an extra network hop to the master database
    • Supports SQL Server and Azure SQL Database, as well as Azure SQL Data Warehouse
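As a brief illustration of the contained database user recommendation above, here is a minimal T-SQL sketch (not from the blog); the database name, user name, schema, and password are hypothetical.

```sql
-- Enable contained database authentication at the instance level (hypothetical setup).
EXEC sp_configure 'contained database authentication', 1;
RECONFIGURE;
GO

-- Make the database partially contained, then create a user that authenticates
-- directly at the database with its own password (no server-level login).
ALTER DATABASE SalesDB SET CONTAINMENT = PARTIAL;
GO
USE SalesDB;
GO
CREATE USER sales_reporting WITH PASSWORD = '<strong password here>';
GRANT SELECT ON SCHEMA::Sales TO sales_reporting;
```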

Conclusion

Database security is an important goal of any data management system. Each organization should have a data security policy, which is a set of high-level guidelines determined by:

  • User requirements 
  • Environmental aspects
  • Internal regulations
  • Government laws

Database security is based on three important constructs—confidentiality, integrity, and availability. The goal of database security is to protect your critical and confidential data from unauthorized access. 

References

https://docs.microsoft.com/en-us/dotnet/framework/data/adonet/sql/overview-of-sql-server-security

https://sqlsunday.com/2014/07/20/the-sql-server-security-model-part-1/

https://www.red-gate.com/simple-talk/sysadmin/data-protection-and-privacy/introduction-to-sql-server-security-part-1/

 

Read Full Blog
Tags: SQL Server 2019, Microsoft, Big Data Cluster

Manage and analyze humongous amounts of data with SQL Server 2019 Big Data Cluster

Anil Papisetty

Thu, 07 May 2020 18:50:24 -0000


A collection of facts and statistics for reference or analysis is called data, and, in a way, the term “big data” simply refers to a very large amount of data. The big data concept has been around for many years, and the volume of data is growing like never before, which is why data is a hugely valued asset in this connected world. The data is collected from various sources including system logs, social media sites, and call detail records.

The four V's associated with big data are Volume, Variety, Velocity, and Veracity:

  • Volume is about the size—how much data you have.
  • Variety means that the data is very different—that you have very different types of data structures.
  • Velocity is about the speed at which the data arrives.
  • Veracity, the final V, is the most difficult one: big data is often unreliable, so the quality and trustworthiness of the data must be assessed.

SQL Server Big Data Clusters make it easy to manage this complex assortment of data.

You can use SQL Server 2019 to create a secure, hybrid, machine learning architecture starting with preparing data, training a machine learning model, operationalizing your model, and using it for scoring. SQL Server Big Data Clusters make it easy to unite high-value relational data with high-volume big data.

Big Data Clusters bring together multiple instances of SQL Server with Spark and HDFS, making it much easier to unite relational and big data and use them in reports, predictive models, applications, and AI. 

In addition, using PolyBase, you can connect to many different external data sources such as MongoDB, Oracle, Teradata, SAP HANA, and more. Hence, SQL Server 2019 Big Data Cluster is a scalable, performant, and maintainable SQL platform, data warehouse, data lake, and data science platform that doesn’t require compromising between cloud and on-premises. Components include:

Controller

The controller provides management and security for the cluster. It contains the control service, the configuration store, and other cluster-level services such as Kibana, Grafana, and Elasticsearch.

Compute pool

The compute pool provides computational resources to the cluster. It contains nodes running SQL Server on Linux pods. The pods in the compute pool are divided into SQL compute instances for specific processing tasks.

Data pool

The data pool is used for data persistence and caching. The data pool consists of one or more pods running SQL Server on Linux. It is used to ingest data from SQL queries or Spark jobs. SQL Server Big Data Cluster data marts are persisted in the data pool.

Storage pool

The storage pool consists of storage pool pods comprising SQL Server on Linux, Spark, and HDFS. All the storage nodes in a SQL Server Big Data Cluster are members of an HDFS cluster.
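To show how the storage pool's HDFS data surfaces in T-SQL, here is a minimal sketch of an external table over HDFS files, modeled on Microsoft's Big Data Clusters examples, which expose the storage pool through a built-in data source commonly named SqlStoragePool (the data source may need to be created first if it does not already exist). The folder path, columns, and file format here are hypothetical.

```sql
-- Hypothetical CSV files landed in HDFS under /clickstream_data.
CREATE EXTERNAL FILE FORMAT csv_file
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = ',', FIRST_ROW = 2)
);

-- External table over the storage pool; SqlStoragePool is the data source name
-- used in Microsoft's BDC documentation examples.
CREATE EXTERNAL TABLE dbo.web_clickstream
(
    wcs_user_sk  INT,
    wcs_click_ts DATETIME2,
    wcs_item_sk  INT
)
WITH (
    DATA_SOURCE = SqlStoragePool,
    LOCATION = '/clickstream_data',
    FILE_FORMAT = csv_file
);

-- The HDFS data can now be queried with ordinary T-SQL.
SELECT COUNT(*) FROM dbo.web_clickstream;
```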

The reference architecture of SQL Server 2019 Big Data Clusters is described in the Microsoft documentation: https://docs.microsoft.com/en-us/sql/big-data-cluster/big-data-cluster-overview?view=sqlallproducts-allversions

Big data analysis

Data analytics is the science of examining raw data to uncover underlying information. The primary goal is to ensure that the resulting information is of high data quality and accessible for business intelligence as well as big data analytics applications. Big Data Clusters make machine learning easier and more accurate by handling the four Vs of big data:


  • Volume
    • Impact on analytics: The greater the volume of data processed by a machine learning algorithm, the more accurate the predictions will be.
    • How a Big Data Cluster helps: Increases the data volume available for AI by capturing data in scalable, inexpensive big data storage in HDFS and by integrating data from multiple sources using PolyBase connectors (see the sketch after this list).
  • Variety
    • Impact on analytics: The greater the variety of data sources, the more accurate the predictions will be.
    • How a Big Data Cluster helps: Increases the variety of data available for AI by integrating multiple data sources through the PolyBase connectors.
  • Velocity
    • Impact on analytics: Real-time predictions depend on up-to-date data flowing quickly through the data processing pipelines.
    • How a Big Data Cluster helps: Increases the velocity of data available for AI by using elastic compute and caching to speed up queries.
  • Veracity
    • Impact on analytics: Accurate machine learning depends on the quality of the data going into model training.
    • How a Big Data Cluster helps: Increases the veracity of data available for AI by sharing data without copying or moving it, which would otherwise introduce data latency and data quality issues. SQL Server and Spark can both read and write the same data files in HDFS.
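The PolyBase connectors mentioned above are exposed through ordinary T-SQL. Here is a minimal sketch of an external data source and external table over a remote Oracle database; it is illustrative only, the host name, credential, and table definition are hypothetical, and it assumes PolyBase is enabled and a database master key already exists.

```sql
-- Hypothetical credential for the remote Oracle database.
CREATE DATABASE SCOPED CREDENTIAL oracle_cred
    WITH IDENTITY = 'oracle_user', SECRET = '<password>';

-- External data source pointing at the Oracle listener.
CREATE EXTERNAL DATA SOURCE oracle_sales
    WITH (
        LOCATION = 'oracle://oracle-host.example.com:1521',
        CREDENTIAL = oracle_cred
    );

-- External table mapped to a remote Oracle table (names are hypothetical).
CREATE EXTERNAL TABLE dbo.remote_orders
(
    ORDER_ID   INT,
    ORDER_DATE DATE,
    AMOUNT     DECIMAL(18, 2)
)
WITH (
    LOCATION = '[XE].[SALES].[ORDERS]',
    DATA_SOURCE = oracle_sales
);

-- The remote data can now be queried, and joined with local tables, in T-SQL.
SELECT TOP (10) * FROM dbo.remote_orders;
```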

Cluster management

Azure Data Studio is the tool that data engineers, data scientists, and DBAs use to manage databases and write queries. Cluster admins use the admin portal, which runs as a pod inside the same namespace as the cluster and provides information such as the status of all pods and overall storage capacity.

Azure Data Studio is a cross-platform management tool for Microsoft databases. It’s like SQL Server Management Studio on top of the popular VS Code editor engine: a rich T-SQL editor with IntelliSense and plug-in support. Currently, it’s the easiest way to connect to the different SQL Server 2019 endpoints (SQL, HDFS, and Spark). To do so, you need to install Azure Data Studio and the SQL Server 2019 extension.

If you have a Kubernetes infrastructure, even a single-server cluster, you can deploy a Big Data Cluster with a single command and have the cluster running in about 30 minutes.

If you want to install SQL Server 2019 Big Data Cluster on your on-premises Kubernetes cluster, you can find an official deployment guide for Big Data Clusters on Minikube in Microsoft docs.

Conclusion

Planning is everything and good planning will get a lot of problems out of the way, especially if you are thinking about streaming data and real-time analytics. 

When it comes to technology, organizations have many different types of big data management solutions to choose from. Dell Technologies solutions for SQL Server help organizations achieve some of the key benefits of SQL Server 2019 Big Data Clusters:

  • Insights for everyone: Management services, an admin portal, and integrated security, together with Azure Data Studio, make it easy to manage the cluster and create a unified development and administration experience for big data and SQL Server users
  • Enriched data: Data enriched using the advanced analytics and artificial intelligence built into the platform
  • Overall data intelligence:
    • Unified access to all data with unparalleled performance 
    • Easily and securely manage data (big/small)
    • Build intelligent apps and AI with all data 
  • Management of any data, any size, anywhere: Simplified management and analysis through unified deployment, governance, and tooling
  • Easy deployment and management: A Kubernetes-based big data solution built into SQL Server

To make better decisions and to gain insights from data, enterprises large, medium, and small use big data analysis. For information about how the SQL solutions team at Dell Technologies helps customers store, analyze, and protect data with Microsoft SQL Server 2019 Big Data Cluster technologies, see the following links:

https://www.delltechnologies.com/en-us/big-data/solutions.htm#dropdown0=0

https://infohub.delltechnologies.com/t/sql-server/

https://infohub.delltechnologies.com/t/microsoft-sql-server-2019-big-data-clusters-a-big-data-solution-using-dell-emc-infrastructure/


Read Full Blog
Tags: SQL Server 2019, Linux containers, XtremIO, Microsoft, VxFlex, CSI plug-in

SQL Server in containers: Dell EMC CSI plug-in—It's about manageability!

Sam Lucido

Mon, 30 Mar 2020 18:46:49 -0000


A picture can be worth a thousand words; however, not every slide in a presentation is self-explanatory, and sometimes even the speaker notes don’t provide enough real estate to cover the full meaning of the content. That happened to me recently with a slide in a technical presentation that I created.

The unanswered question was, what does this sentence mean: “Get fixes and upgrades faster as Dell EMC’s plug-in doesn’t require Kubernetes updates and upgrades!” I wrote this blog to give more background and details about that statement. Before we can get to that, let’s discuss the value that the CSI plug-in has for customers using XtremIO X2 and VxRack FLEX. CSI (Container Storage Interface) is a standard used by Dell EMC and other storage providers to give container orchestration systems an interface for exposing storage services to containers. Thus, the CSI plug-in enables orchestration between containers and storage via Kubernetes. Other orchestration systems such as Mesos, Docker, and Cloud Foundry also use the same CSI specification for managing containers and storage together.

The CSI plug-in has another advantage for both orchestration systems (like Kubernetes) and storage providers. For example, Kubernetes development can progress independently without requiring storage vendors to check code into the core Kubernetes repository. Similarly, the storage vendors update the CSI plug-in only when required and not with every update or upgrade of Kubernetes. Overall there is less complexity for both Kubernetes developers and storage vendors because the CSI plug-in simplifies the integration between the orchestration and storage layers. Thus, the CSI plug-in enables Dell EMC to deliver fixes and upgrades faster, without waiting on Kubernetes updates and upgrades. I hope that answers the question from above. You can also take a look at this Kubernetes blog that goes into greater detail: Introducing Container Storage Interface (CSI) Alpha for Kubernetes.

We also recently wrote a white paper about SQL Server containers that provides an overview of how the XtremIO X2 features available with our CSI plug-in can be used with SQL Server 2019 Linux containers. Here is a shortcut to the CSI plug-in overview in the paper. With the CSI plug-in, the Kubernetes administrator can:

  • Dynamically provision and decommission volumes
  • Attach and detach volumes from a host node
  • Mount and unmount a volume from a host node

The Kubernetes administrator can even use the XtremIO X2 snapshot capabilities to provision a copy of a SQL Server environment. It’s these capabilities that really make automation and orchestration of SQL Server containers easier and faster. Want to learn more? The SQL Server Containers white paper is the right starting place because it takes you through the technology and shows how the XtremIO X2 CSI plug-in with Kubernetes and Docker can address traditional challenges.

Please rate this blog and provide us with ideas for future solutions. Thanks!


Read Full Blog
Tags: SQL Server, Microsoft

The new DBA role—Time to get your aaS in order

Robert F. Sonders

Mon, 30 Mar 2020 18:46:49 -0000


Yes, it’s a “catchy” little title for a blog post, but aaS is a seriously cool and fun topic, and the new DBA role calls for a skill set that’s incredibly career-enhancing. DBA teams, in many cases, will be leading the data-centric revolution: changing the way data is stored, orchestrated, virtualized, visualized, secured and, ultimately, democratized.

Let’s dive right in to see how aaS and the new Hybrid DBA role are shaking up the industry and what they’re all about!

The Future-ready Hybrid DBA

I chatted about the evolution of the DBA in a previous blog within this series called Introduction to SQL Server Data Estate Modernization. I can remember, only a few short years ago, many of my peers were vocalizing the slow demise of the DBA. I never once agreed with their opinion. Judging from the recent data-centric revolution, the DBA is not going anywhere.

But the million-dollar question is how will the DBA seamlessly manage all the different attributes of data mentioned above? By aligning the role with the skills that will be required to excel in exceptional fashion.

Getting Acquainted with the aaS Shortlist

To begin getting assimilated with what aaS has in store, the DBA will need to get his/her aaS in order and become familiar with the various services. I am not a huge acronym guy (in meetings I often find that folks don’t even know the words behind their own acronyms), so let’s spell out a shortlist of aaS acronyms, so there’s no guesswork at their meaning.

And this is by no means a be-all-end-all list – technology is advancing at hyper-speed – as Heraclitus of Ephesus philosophized, “Change is the only constant in life.” Here is the shortlist of aaS terms used throughout this post:

  • IaaS: Infrastructure as a Service
  • PaaS: Platform as a Service
  • DBaaS: Database as a Service
  • CaaS: Containers as a Service
  • KaaS: Kubernetes as a Service
  • FaaS: Functions as a Service
  • NaaS: Network as a Service
  • VCaaS: Version Control as a Service
  • XaaS: Anything (Everything) as a Service

Why all this aaS, you ask?

Because teams must align with some, if not all of these services, to enable massive scalability, multitenancy, independence and rapid time to value. These services make up the layers in the cake which comprise the true solution.

Let’s take SQL Server 2019, for instance, a new and incomparable version that’s officially generally available and whose awesome goodness will be the topic of many of my blogs in 2020!

SQL 2019 Editions and Feature Sets

I started with SQL Server 6.5, before a slick GUI existed. Now, with SQL 2019, we have the ability to manage a stateful app, SQL Server, in a container, within a cluster, and as a platform.

That is “awesome sauce” spread all around!

And running SQL Server 2019 on Linux? We’re back to scripting again. I love it! Scripting is testable, repeatable, sharable, and can be checked into source control. That, quite simply, enables collaboration, which in turn makes a better product. You’ll need to be VCaaS-enabled to store all these awesome scripts.

Is your entire production database schema in source control? If it isn’t, script it out and put it there. A backup of a database is not the same as version control of its scripts. DBAs often tell me their production code resides in the database, which is backed up. It’s best practice, however, to begin your proven, repeatable pattern by scripting out the DDL and checking it into, and deploying from, a VCaaS. Aligning your database DDL and test data sets with a CI/CD application development pipeline should be addressed ASAP. By the way, try this with ADS (Azure Data Studio) and the dacpac extension. Many of the aaSes can be managed with ADS. And, with the #SQLFamily third-party community, the ADS extension tooling will only continue to grow.

DBA as a DevOps Engineer?

Reading this blog as a DBA, you may be thinking I don’t need to know about all these aaS details. Simply put, yes, you do! In many respects, the DBA is now the new DevOps engineer, utilizing all the services listed above. This hybrid role is becoming more of a full stack developer – providing support for multiple scripting languages, an “IT Polyglot.”

Moreover, as a DBA, you are already executing on all of these attributes, and you may not even know it! It turns out that DBAs have always been a part of DevOps. Think about it: you align with Dev — writing SQL, tuning performance, doing object analysis and reporting — and you already do Ops — configuring servers and VMs, running backups and restores, and tuning the OS, network, and storage. So, you are uniquely positioned to now offer all this as a service.

Congratulations! Your organization is looking to you to lead the charge as the data-centric view of “all things” has arrived!

The Hybrid DBA Role and His or Her Requisite “aaSes”

Let’s look deeper into the chain of services.

As a hybrid DBA, you want to provide DBaaS. What will you need to provide, and understand from a tooling perspective? Well, you are going to need to understand IaaS and infrastructure as code, to define and automate your core database target workload servers. You will be grouping these services into networking namespaces, so there is a bit of NaaS alignment as well. You will need even more services if you have data sources residing in a public or on-premises cloud. Maybe you have a FaaS enabled in a public cloud that is writing to an unstructured data store. A FaaS, remember, can very much be the replacement for older ETL processes.

Next, for agility, let’s containerize these things using CaaS. What will you need to manage and orchestrate those containers? KaaS, or Kubernetes as a Service, which provides the opportunity to build a SQL 2019 Big Data Cluster. In theory, you could say the Kubernetes platform is PaaS with the flexibility of IaaS. Now we are mixing aaS’s together for maybe yet another acronym. :)

Have you started noodling around with SQL 2019 yet? No? Why not? Microsoft has done an incredible job ramping up the product for its launch. I highly recommend jumping in and starting to play with it!

The core Microsoft SQL Server team has put together this excellent reference for SQL 2019 to get you started.

As a DBA, you can then provide PaaS in a few different ways. Again, DBaaS is one form of PaaS. You have a true data virtualization layer with SQL 2019 that can also provide PaaS for sourcing across all the data sources – simply with a T-SQL script and a PolyBase external table. And you are the one who will enable it! There are also many self-service reporting enablement features with Power BI.
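To picture that data virtualization layer, here is a minimal T-SQL sketch of a query that joins a local table with a PolyBase external table defined earlier with CREATE EXTERNAL TABLE over a remote source. It is only an illustration; the table and column names are hypothetical.

```sql
-- Hypothetical cross-source query: dbo.Customers is a local SQL Server table,
-- dbo.remote_orders is a PolyBase external table over a remote data source.
SELECT  c.CustomerID,
        c.Region,
        SUM(o.Amount) AS TotalSpend
FROM dbo.Customers      AS c
JOIN dbo.remote_orders  AS o
    ON o.CustomerID = c.CustomerID
GROUP BY c.CustomerID, c.Region
ORDER BY TotalSpend DESC;
```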

Here is where I like to say XaaS enables data virtualization!

The Beneficial Impact of the Hybrid DBA on Various BUs

I also hear from the app owners and IT decision makers that the database team must react to requests at a much faster rate than the previous norm.

The hybrid DBA will accelerate the cycles for many, if not all, of these business units in the following ways:

Product Managers and Enterprise Architects

  • Reduced/eliminated waits for infrastructure
  • Simultaneous features/projects
  • Improved and reliable software quality
  • Governance around data management and access

Infrastructure Administrators

  • Drastic reduction in turn-around times
  • Operational simplicity
  • Reduced human errors through automation

Developer teams

  • Accelerate iterations of testing and development
  • Secure data sets – infused into the DevOps cycles
  • Production deployments with database alters

LOB Executives

  • Lower CapEx for storing multiple copies
  • Improved employee productivity
  • Faster time to market

Embrace the Containerized World

Big Data Clusters run on a Linux platform in containers within a Kubernetes cluster. Remember, SQL Server on Linux is the same codebase as SQL Server on Windows, with the only difference being SQLPAL (the SQL Platform Abstraction Layer) and a host extension. This host extension allows SQL Server to interact with the Linux kernel. The hybrid DBA will need to understand how containers are managed and provisioned. Ensuring a solid storage foundation is paramount here, before embarking on containers in production. (Hint: Dell Technologies is really good at this!) Additionally, the hybrid DBA should also embrace PowerShell, Bash, R, and Python, all in the name of more intelligent tooling across a wide range of modern development platforms and technologies.

If, as a DBA, you are concerned, worried, or downright scared of a containerized world and an orchestrator like Kubernetes, there is no need to worry at all. In fact, there are similarities that you will absolutely love. With K8s there is a declarative object configuration management technique, which is recommended in production.

The Declarative Perspective and Release of SQL 2019

Something else we know and love is declarative: SQL Server. Declarative languages let users or administrators express a desired state, or the result they want to retrieve, providing broad instructions about what is to be executed and completed rather than how. Then, the declarative engine goes to work. You deal with the results, not the process or automation. Kubernetes and SQL Server, from a declarative perspective, act exactly the same way. I say this because data professionals, especially the DBA, are wired a certain way. They like patterns and they like to declare, “Here is my work to do, go make it happen.”

With the release of SQL 2019, the Hybrid DBA will continue to be in the driver’s seat. You will now administer from the edge (SQL Server Edge) to on-premises SQL Server to additional SQL workloads that may also be in a public cloud. To that end, any cloud, all with SQL Server. Companies large and small will benefit from intelligent SQL Server deployments.

Dell Technologies Is Here to Help You Succeed

Dell Technologies, as your company’s trusted advisor, is positioned and aligned to help you right now with our solutions and services! We also have whitepapers on SQL 2019 available for your reference to help determine which solutions are correct for your workloads.

Dell Technologies offers solutions and services to address hyper-scale data requirements and next-gen hybrid cloud needs.

Many more solutions will become available in the very near future. Why wait? A Dell Technologies Service Expert is ready and able to help you get your aaS in order!

Summary

The hybrid DBA needs to abide by the creed of #neverstoplearning. If you’re stuck and complacent, you will get passed up. There really is no excuse. The cloud playground options are abundant, at minimal or no cost, to play and learn within these environments. Everything in the aaS shortlist above can be vetted and tested within a public or on-premises cloud environment.

Learning a new technology can be simple and fun. Do what I do. Make learning part of your daily life. Carve out a minimum of 2-hour blocks, 5 days a week, and maybe an additional Saturday or Sunday morning to simply play with tech and learn. It works, and you will be much better off in the long run. Personally, some of my best learning happens on a quiet Sunday morning when the email and phone requests are non-existent.

What changes are you willing to incorporate into your daily routine to keep your skills fresh and relevant?

Other Blogs in This Series

Best Practices to Accelerate SQL Server Modernization (Part I)

Best Practices to Accelerate SQL Server Modernization (Part II)

Introduction to SQL Server Data Estate Modernization

Recommended Reading

Running Containerized Applications on Microsoft Azure’s Hybrid Ecosystem – Introduction

Deploy K8s Clusters into Azure Stack Hub User Subscriptions

Deploy a Self-hosted Docker Container Registry on Azure Stack Hub

Read Full Blog
Tags: SQL Server, Microsoft, virtualization, data gravity

In a data-driven world, innovation changes are forcing a new paradigm

Stephen McMaster

Wed, 01 Apr 2020 14:29:17 -0000


For the last two decades, I’ve enjoyed working at Dell Technologies focusing on customer big-picture ideas. Not just focusing on hardware changes, but on a holistic solution including hardware, software, and services that achieves a business objective by addressing customer goals, problems, and needs. I’ve also partnered with my clients on their transformation journey. The concepts of digital transformation and IT transformation have been universal themes and turning these ideas into realities is where the rubber meets the road.

Now as I engage with customers and partners about Microsoft solutions, an incremental awareness of the idea of “data”, and how data is accessed and leveraged, has become evident. A foundational shift around data has occurred.

We are now living in a new era of data management, but many of us were not aware this change was developing. It has crept up on us without the fanfare you might see from a new technology launch. When you take a step back and look at these shifts in their entirety, you see these changes aren’t just isolated updates; instead, they amplify one another’s benefits.

For many, this change started to become apparent with the end of support for SQL Server 2008 earlier this year (along with support for all previous versions of the product). This deadline, coupled with the large install base that still exists on this platform, is helping the conversation along, but it’s not just a matter of replacing the old with the new in a point-by-point swap. The doors opened in this new era force a completely different view and approach. We no longer need to have a SQL, Oracle, SAP, or Hadoop conversation – instead it becomes a holistic “data” point of view.

In our hybrid/multi-cloud world, there is not just one answer for managing data. Regardless of the type of data, where it resides, or the diverse data languages and methods of control, the word “data” can encompass a great deal.

Emerging technologies including IoT, 5G, AI and ML are generating greater amounts and varied types of data. How we access that data and derive insight from it becomes critical, but we have been limited by people, processes, and technology.

People have become stuck in the rut of, “I want it to be this way because it has always been this way.” Therefore, replacing dated/expired architectures becomes a swap-out story versus a re-examine story, and new efficiencies are completely missed. Processes within the organization become rigid with that same mindset and, dare I say, politics, where access to that data becomes path-limited. Technology is influenced by both people and process as “the old way is good enough, right?”

The value/importance of “data” really points back to the insight that you drive from it. Having a bunch of ones and zeros on a hard drive is nice but what you derive from that data is critically important. The conversations I have with customers are not so much, “Where is my data and how is it stored?” The conversation is more commonly, “I have a need to get business analytics from my proprietary data so I can impact my customers in a way I never did before.”

To put my Stephen Covey hat on, we are in a paradigm change. What is occurring is incredibly impactful for how customers should view and treat data. There are three key areas that we will examine with the new paradigm today and we’ll start with data gravity.

Data Gravity

Data gravity is the idea that data has weight. Wherever data is created, it tends to remain. Data stores are getting so big that moving data around is becoming expensive, time constrained, and database performance impacting. This in turn, results in silos of data by location and type. Versioning and lack of upgrade/migration/consolidation of databases also perpetuates these silo challenges.

As with physical gravity, we understand that data’s mass encourages applications and analytics to orbit that data store where it resides. Then, application dependency upon the data’s language version cements the silo requirement even further. We have witnessed the proliferation of intelligent core and edge devices, as well as bringing applications to that place where the data resides – at the customer location.

Silos of data based on language, version, and location can’t be readily accessed from a common interface. If I am a SQL user, how do I get that Oracle data I need? I cannot just pull all my data together into a huge common dataset – it’s just too big. We see these silos in almost every customer environment.

Data Virtualization

This is where data virtualization comes into the story. Please note this is not a virtual machine (a common confusion on the naming). Think instead of this being data democratization: the ability to allow all the people access to all the data – within reason, of course. Data virtualization allows you to access the data where the data is stored without a massive ETL event.  You can see and control the data regardless of language, version, or location. The data remains where it is, but you have real-time source access to this data. You can access data from remote or diverse sources and perform actions on that data from one common point of view.

Data virtualization allows access into the silos that, in the past, have been very rigid, blocking the ability to effectively use that data. From a non-SQL Server point of view, having unstructured data or structured data in a different format (like Oracle) required you to hire a specialized person with a specific skill set to access that data. With data virtualization, that is no longer a barrier, as these silo walls are reduced. Data virtualization becomes data democratization, meaning that all people (with appropriate permissions) can access and do things with that data.

From a Microsoft point of view, that technology came into reality with PolyBase. PolyBase with SQL Server allows access with T-SQL, the most commonly used database language. I started using this resource with the Analytics Platform System (APS) many years ago. After Microsoft placed this tool into SQL Server in 2016 and updated its functionality tremendously in SQL Server 2019, we can now reach into Hadoop and Oracle, and use engines like Spark, to access all these disparate data sources. To visualize this, think of PolyBase with SQL Server 2019 as a wrapper around these diverse silos of data. You can now access all these disparate data sources within one common interface: T-SQL using PolyBase.

Holistic Solution

The final tenet of this fundamental change is the advent of containerization. This enablement technology allows abstraction beyond virtualization and runs just about anywhere. Data becomes nimble and you can move it where needed.

It’s amazing how pervasive containers have become. It’s no longer a science experiment, but is quickly becoming the new normal. In the past, many customers had a forklift perception that when a new technology comes into play, it requires a lift and replace. I’ve heard, “What I am doing today is no longer good, so I have to replace it with whatever your new product is, and it will be painful.”

I’ve been using the phrase that containerization enables “all the things”. Containerization has been adopted by so many architectures that it’s easier to talk about where you can’t do it verses where you can. Traditional SAN, converged, hyperconverged, hybrid cloud — you can place this just about anywhere. There is not just one right path here — do what makes sense for you. It becomes a holistic solution.

There are multiple ways to address the business need that customers have even if it’s leveraging existing designs that they’ve been using for years. Dell Technologies has published details of several architectures supporting SQL Server and has just recently published the first of many papers on SQL Server in containers.

The answer is, you can do all these things with all these architectures. By the way, this isn’t specific to Microsoft and SQL Server. We see similar architectures being created in other databases and technology formats.

These three tenets each support the new paradigm and reinforce one another. Data gravity is supported by data virtualization and containerization. Data virtualization allows silos to remain where they are when needed (gravity) and is enabled by containerization. Containerization gives access to silos (the wrapper) and is the mechanism that activates data virtualization.

From a Dell Technologies point of view, we are aggressively embracing these tenets. Our enablement technologies to support this paradigm are called out in three discrete points – accelerate, protect, and reuse. We will review these points in a separate blog.

There is much more to come as we continue this journey into the new era of data management. Dell Technologies has deeply invested in resources around this topic with several recent publications and reference designs embracing this paradigm change. Our leadership on this topic is the result of our 30+ year relationship with Microsoft and our continuing “better together” story. A detailed white paper that further expands the ideas within this blog is available here.

Read Full Blog
Tags: SQL Server, Microsoft

Introduction to SQL Server data estate modernization

Robert F. Sonders

Mon, 30 Mar 2020 18:46:49 -0000


This blog is the first in a series discussing what’s entailed in modernizing the Microsoft SQL server platform.

I hear the adage “if it ain’t broke don’t fix it” a lot during my conversations with clients about SQL Server, and many times it’s from the database administrators. Many DBAs are reluctant to change—maybe they think their jobs will go away. It’s my opinion that their roles are not going anywhere, but they do need to expand their knowledge to support coming changes. Their role will involve innovation at a different level, merging application technology (think CI/CD pipelines) into the mix. The DBA will also need to trust that their hardware partner has fully tested and vetted a solution for the best possible SQL performance and can provide a future-proof architecture that grows and changes as the needs of the business grow and change.

The New Normal Is Hybrid Cloud

When “public clouds” first became mainstream, the knee-jerk reaction was everything must go to the cloud! If this had happened, then the DBA’s job would have gone away by default. This could not be farther from the truth and, of course, it’s not what happened. With regards to SQL Server, the new normal is hybrid.

Now some years later, will some data stores have a place in public cloud?

Absolutely, they will.

However, it’s become apparent that many of these data stores are better suited to remaining on-premises. Also, there are current trends where data is being repatriated from the public cloud back to on-premises environments due to cost, management, and data gravity (keeping the data close to the business need and/or specific regulatory compliance needs).

The SQL Server data estate can be vast and, in some cases, a hodgepodge of “Rube Goldberg” design processes. In the past, I was the lead architect of many of these designs, I am sad to admit. (I continue to ask forgiveness from the IT gods.) Today, IT departments manage one base set of database architecture technology for operational databases, another potential hardware-partner caching layer, and yet another architecture for data analytics and emerging AI.

Oh wait…one more point…all this data needs to be at the fingertips of a mobile device and edge computing. Managing and executing on all these requirements, in a real-time fashion, can result in a highly complex and specialized platform.

Data Estate Modernization

The new normal everyone wants is to keep it as simple as possible. Remember those “Rube Goldberg” designs referenced above? They’re no longer applicable. Simple, portable, and seamless execution is key. As data volumes increase, DBAs partnering with hardware vendors need to simplify, as security, compliance, and data integrity remain a fixed concern. However, there is a new normal for data estate management; one where large volumes of data can be referenced in place, with push-down compute, or seamlessly snapped and copied in a highly efficient manner to other managed environments. The evolution of SQL Server 2019 will also be a part of the data estate orchestration solution.

Are you an early adopter of SQL 2019?

Get Modern: A Unified Approach of Data Estate Management

A SQL Server Get Modern architecture from Dell EMC can consolidate your data estate, defined with our high-value core pillars that align perfectly with SQL Server.

The pillars will work with any size environment, from the small and agile to the very large and complex SQL database landscape. There IS a valid solution for all environments. All the pillars work in concert to complement each other across all feature sets and integration points.

  • Accelerate – To not only accelerate and Future-Proof the environment but completely modernize your SQL infrastructure. A revamped perspective on storage, leveraging RAM and other memory technologies, to maximum effect.
  • Protect – Protect your database with industry leading backups, replication, resiliency and self-service Deduplicated copies.
  • Reuse – Reuse snapshots. Operational recovery. Dev/Test repurposing. CI/CD pipelines.

Aligning along these pillars will bring efficiency and consistency to a unified approach of data estate management. The combination of a strong, consistent, high-performance architecture supporting the database platform will make your IT team the modernization execution masters.

What Are Some of the Compelling Reasons to Modernize Your SQL Server Data Estate?

Here are some of the pain point challenges I hear frequently in my travels chatting with clients. I will talk through these topics in future blog posts.

1. Our SQL Server environment is running on aging hardware:

  • Dev/Test/Stage/Prod do not have the same performance characteristics making it hard to regression test and performance test.

2. We have modernization challenges:

  • How can I make the case for modernization?
  • My team does not have the cycles to address a full modernization workstream.

3. The Hybrid data estate is the answer… how do we get there?

4. We are at EOL (End of Life) for SQL Server 2008 / 2008R2 and Windows Server but are stuck due to:

  • ISV (Independent Software Vendor) “lock in” requirement to a specific SQL Server engine version.
  • Migration plan to modernize SQL cannot be staffed and executed through to completion.

5. We need to consolidate SQL Server sprawl and standardize on a SQL Server version:

  • Build for the future, where disruptive SQL version upgrades become a thing of the distant past. Think…containerized SQL Server. OH yeah!
  • CI/CD success – the database is a key piece of the puzzle for success.
  • Copies of databases for App Dev / Test / Reporting copies are consuming valuable space on disk.
  • Backups take too long and consume too much space.

6. I want to embrace SQL Server on Linux.

7. Let’s talk modern SQL Server application tuning and performance.

8. Where do you see the DBA role in the next few years?

Summary

I hope you will come along this SQL Server journey with me as I discuss each of these customer challenges in this blog series:

Best Practices to Accelerate SQL Server Modernization (Part I)

Best Practices to Accelerate SQL Server Modernization (Part II)

And, if you’re ready to consider a certified, award-winning Microsoft partner who understands your Microsoft SQL Server endeavors for modernization, Dell EMC’s holistic approach can help you minimize risk and business disruption. To find out more, contact your Dell EMC representative.

 

Read Full Blog
Tags: SQL Server, Microsoft

Recommendations for modernizing the Microsoft SQL Server platform

Robert F. Sonders

Mon, 30 Mar 2020 18:46:49 -0000


This blog follows Introduction to SQL Server Data Estate Modernization and is the second in a series discussing what’s entailed in modernizing the Microsoft SQL Server platform, with recommendations for executing an effective migration.

SQL Server—Love at First Sight Despite Modernization Challenges

SQL Server is my favorite Microsoft product and one of the most prolific databases of our time. I fell in love with SQL Server version 6.5, back in the day when scripting was king! Scripting is making a huge comeback, which is awesome! SQL Server, in fact, now exists in pretty much every environment—sometimes in locations IT organizations don’t even know about.

It can be a daunting task to even kick off a SQL modernization undertaking. Especially if you have End of Life SQL Server and/or Windows Server, running on aging hardware. My clients voice these concerns:

  1. The risk is too high to migrate. (I say, isn’t there a greater risk in doing nothing?)
  2. Our SQL Server environment is running on aging hardware—Dev/Test/Stage/Prod do not have the same performance characteristics, making it hard to regression test and performance test.
  3. How can I make the case for modernization? My team doesn’t have the cycles to address a full modernization workstream.

I will address these concerns, first, by providing a bit of history in terms of virtualization and SQL Server, and second, by explaining how to overcome the challenges to modernization, an undertaking that, in my opinion, should be executed sooner rather than later.

When virtualization first started to appear in data centers, one of its biggest value propositions was to increase server utilization, especially for SQL Server. Hypervisors increased server utilization by allowing multiple enterprise applications to share the same physical server hardware. While the improvement of utilization brought by virtualization is impressive, the amount of unutilized or underutilized resources trapped on each server starts to add up quickly. In a virtual server farm, the data center could have the equivalent of one idle server for every one to three servers deployed.

Fully Leveraging the Benefits of Integrated Copy Data Management (iCDM)

Many of these idle servers were related to Dev/Test/Stage instances, and QoS (Quality of Service) can also be a concern with these instances.

SQL Server leverages Always On Availability Groups as a native way to create replicas that can then be used for multiple use cases. The most typical deployments of AAG replicas are for high availability failover databases and for offloading heavy read operations, such as reporting, analytics, and backup operations.
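For context, here is a minimal T-SQL sketch of an availability group with a read-only secondary for offloading reads. It is illustrative only: the instance names, endpoints, and database are hypothetical, and it assumes the cluster, the database mirroring endpoints, and a full backup of the database already exist.

```sql
-- Hypothetical two-replica availability group; the synchronous replica supports
-- automatic failover, the asynchronous replica accepts read-only connections
-- for reporting, analytics, and backup offload.
CREATE AVAILABILITY GROUP SalesAG
FOR DATABASE SalesDB
REPLICA ON
    N'SQLNODE1' WITH (
        ENDPOINT_URL      = N'TCP://sqlnode1.example.com:5022',
        AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
        FAILOVER_MODE     = AUTOMATIC),
    N'SQLNODE2' WITH (
        ENDPOINT_URL      = N'TCP://sqlnode2.example.com:5022',
        AVAILABILITY_MODE = ASYNCHRONOUS_COMMIT,
        FAILOVER_MODE     = MANUAL,
        SECONDARY_ROLE (ALLOW_CONNECTIONS = READ_ONLY));
```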

iCDM allows for additional use cases like the ones listed below with the benefits of inline data services:

  • Test/Dev for continued application feature development, testing, CI/CD pipelines.
  • Maintenance to present an environment to perform resource intensive database maintenance tasks, such as DBCC and CHECKDB operations.
  • Operational Management to test and execute upgrades, performance tuning, and pre-production simulation
  • Reporting to serve as the data source for any business intelligence system or reporting.

One of the key benefits of iCDM technology is the ability to provide a cost-efficient lifecycle management environment. iCDM provides efficient copy data management at the storage layer to consolidate both primary data and its associated copies on the same scale-out, all-flash array for unprecedented agility and efficiency. When combined with specific Dell EMC products that deliver bullet-proof, consistent IOPS and latency, linear scale-out all-flash performance, and the ability to add more performance and capacity as needed with no application downtime, iCDM offers incredible potential to consolidate both production and non-production applications without impacting production SLAs.

While emerging technologies, such as artificial intelligence, IoT and software-defined storage and networking, offer competitive benefits, their workloads can be difficult to predict and pose new challenges for IT departments.

Traditional architectures (hardware and software) are not designed for modern business and service delivery goals, which is yet another solid use case for a SQL modernization effort.

As I mentioned in my previous post, not all data stores will be migrating to the cloud. Ever. The true answer will always be a Hybrid Data Estate. Full stop. However, we need to modernize for a variety of compelling reasons.

5 Paths to SQL Modernization

Here’s how you can simplify making the case for modernization and what’s included in each option.

Do nothing (not really a path at all!):

  • Risky roll of the dice, especially when coupled with aging infrastructure.
  • Run the risk of a security exploit, regulatory requirement (think GDPR) or a non-compliance mandate.

Purchase Extended Support from Microsoft:

  • Supports SQL 2008 and R2 or Windows Server 2008 and R2 only.
  • Substantial cost using a core model pricing.
  • Costs tens of thousands of dollars per year…and that is for just ONE SQL instance! Ouch. How many are in your environment?
  • Available for 3 additional years.
  • True up yearly—meaning you cannot run unsupported for 6 months then purchase an additional 6 months. Yearly purchase only. More – ouch.
  • Paid annually, only for the servers you need.
  • Tech support is only available if Extended Support is purchased (oh…and that tech support is also a separate cost).

Transform with Azure/Azure Stack:

  • Migrate IaaS application and databases to Azure/ Dell EMC Azure Stack virtual machines (Azure, on-premises…Way Cool!!!).
  • Receive an additional 3 years of extended security updates for SQL Server and Windows Server (versions 2008 and R2) at no additional charge.
  • In both cases, there is a new consumption cost, however, security updates are covered.
  • When Azure Stack (on-premises Azure) is the SQL IaaS target, there are many cases where the appliance cost plus the consumption cost, is still substantially cheaper than #2, Extended Support, listed above.
  • Begin the journey to operate in Cloud Operating Model fashion. Start with the Azure Stack SQL Resource Provider and easily, within a day, be providing your internal subscribers a full SQL PaaS (Platform as a Service).
  • If you are currently configured with a highly available Failover Cluster Instance with SQL 2008, this will be condensed into a single node. The operating system protection you had with Always On FCI is not available with Azure Stack. However, your environment will now be running within a hyper-converged infrastructure, which does offer node-failure (fault domain) protection, but not operating system protection or protection from downtime during an operating system patch. There are trade-offs. It is best to weigh the options for your business use case and recovery procedures.

Move and Modernize (the Best Option!):

  • Migrate IaaS instances to modern all Flash Dell EMC infrastructure.
  • Migrate application workloads on a new Server Operating System – Windows Server 2016 or 2019.
  • Migrate SQL Server databases to SQL Server 2017 and quickly upgrade to SQL 2019 when Generally Available. Currently SQL 2019 is not yet GA. Best guess, before end of year 2019.
  • Enable your IT teams with more efficient and effective operational support processes, while reducing licensing costs, solution complexity and time to delivery for new SQL Server services.
  • Reduce operating and licensing costs by consolidating SQL Server workloads. With the Microsoft SQL Server per-core licensing model in SQL Server 2012 and above, moving workloads to a virtual/cloud environment can often present significant licensing savings. In addition, through Dell Technologies advisory services we typically discover that many SQL Server instances in the enterprise landscape are underutilized, which presents an opportunity to reduce CPU core counts or move SQL workloads to shared infrastructure to maximize hardware resource utilization and reduce licensing costs.

Rehost 2008 VMware Workloads:

  • Run your VMware workloads natively on Azure
  • Migrate VMware IaaS applications and databases to Azure VMware Solution by CloudSimple or Azure VMware solution by Virtustream
  • Get 3 years of extended security updates for SQL Server and Windows Server 2008 / R2 at no additional charge
  • Full vSphere network compatibility

Remember, your Windows Server 2008 and 2008 R2 servers will also be EOL January 14, 2020.

Avoid Risks and Disasters Related to EOL Systems (in Other Words, CYA)

Can your company afford the risk of both an EOL Operating System and an EOL Database Engine? Pretty scary stuff. It really makes sense to look at both. Do your business leaders know this risk? If not, you need to be vocal and explain the risk. Fully. They need to know, understand, and sign off on the risk as something they personally want to absorb when an exploit hits. Somewhat of a CYA for IT, Data professionals and line of business owners. If you need help here, my team can help engage your leadership teams with you!

In my opinion, the larger problem, between SQL EOL and Windows Server EOL, is the latter. Running an unsupported operating system that supports an unsupported SQL Server workload is a serious recipe for disaster. Failover Cluster Instances (Always On FCI) were the normal way to provide operating system high availability with SQL Server 2008 and earlier, which compounds the issue of multiple unsupported environment levels. Highly available FCI environments are now left unprotected.

Summary

Some migrations will be simple, others much more complex, especially with mission critical databases. If you have yet to kick off this modernization effort, I recommend starting today. The EOL clock is ticking. Get your key stakeholders involved. Show them the data points from your environment. Send them the link to this blog!

If you continue to struggle or don’t want to go it alone, Dell Technologies Consulting Services, the only holistic SQL Server workload provider, can help your team every step of the way. Take a moment to connect with your Dell Technologies Service Expert today and begin moving forward to a modern platform.

Other Blogs in This Series:

Best Practices to Accelerate SQL Server Modernization (Part II)

Introduction to SQL Server Data Estate Modernization

Read Full Blog
SQL Server Microsoft

Accelerate SQL Server modernization: Getting started

Robert F. Sonders

Mon, 30 Mar 2020 18:46:49 -0000

|

Read Time: 0 minutes

In Part I of this blog series, I discussed issues concerning EOL SQL Server and Windows Server running on aging hardware, and three pathways to SQL Server modernization. In this blog, I’ll discuss how a combination of Dell Technologies Consulting Services and free tools from Microsoft can help you get started on your SQL migration journey and how to meet your SQL Server data modernization objectives.

Beginning Your SQL Migration Journey

During my client conversations, I hear three repeating pain points: (1) they would love to modernize but don’t have the cycles or staff to drive a solid, iterative SQL migration approach through to completion, (2) they still need to “keep the lights on,” and (3) despite having the skills, they lack the team cycles to execute on the plan.

This is where Dell Technologies Consulting Services can help you plan a solid foundational set of goals and develop a roadmap for the modernization. Here are the key points our SQL modernizations teams address:

  • Discover the as-is SQL environment including the current state of all in scope SQL servers, associated workloads and configurations.
  • Inventory and classify applications (which align to SQL databases) and all dependencies. It is critical here to think through all the connections, reporting, ETL/ELT processes, and so on.
  • Group and prioritize the SQL databases (or entire instance) by application group and develop a near-term modernization plan and roadmap for modernization. Also an excellent time to consider database consolidation.
  • Identify the rough order of magnitude for future state compute, storage and software requirements to support a modernization plan. Here is where our core product teams would collaborate with the SQL modernization teams. This collaboration is a major value add for Dell Technologies.

Additionally, there are excellent, free tools from Microsoft to help your teams, assess, test and begin the migration journey. I will talk about these tools below.

Free Microsoft Tools to Help You Get Started

Microsoft has stated that they will potentially assist if there are performance issues with SQL procedures and queries. It is best to utilize the Query Tuning Assistant while preparing your database environment for a migration; a minimal baselining sketch follows the list below.

Microsoft provides query plan shape protection when:

  • The new SQL Server version (target) runs on hardware that is comparable to the hardware where the previous SQL Server version (source) was running.
  • The same supported database compatibility level is used both at the target SQL Server and source SQL Server.
  • Any query plan shape regression (as compared to the source SQL Server) that occurs in the above conditions will be addressed (contact Microsoft Customer Support if this is the case).
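One documented pattern for preserving plan shapes during an upgrade is to bring the database over at its source compatibility level, enable the Query Store to capture a baseline, and only then raise the compatibility level; the Query Tuning Assistant then compares the before and after states. A minimal T-SQL sketch, assuming a hypothetical database named SalesDB:

    -- Keep the source compatibility level (SQL Server 2008 = 100) immediately after migration.
    ALTER DATABASE SalesDB SET COMPATIBILITY_LEVEL = 100;

    -- Turn on the Query Store and let a representative workload run to capture a baseline.
    ALTER DATABASE SalesDB SET QUERY_STORE = ON;
    ALTER DATABASE SalesDB SET QUERY_STORE (OPERATION_MODE = READ_WRITE);

    -- Once the baseline is captured, move to the new compatibility level (SQL Server 2017 = 140)
    -- and review regressions with the Query Tuning Assistant or the Query Store reports.
    ALTER DATABASE SalesDB SET COMPATIBILITY_LEVEL = 140;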

Here are a few other free Microsoft tools I recommend running ASAP that will enable you to understand your environment fully and provide measurable and actionable data points to feed into your SQL modernization journey. Moreover, proven and reliable upgrades should always start with these tools (also explained in detail within the Database Migration Guide):

  • Discovery – Microsoft Assessment and Planning Toolkit (MAP)
  • Assessment – Data Migration Assistant (DMA)
  • Testing – Database Experimentation Assistant (DEA)

Database engine upgrade methods are listed here. Personally, I am not a fan of any in-place upgrade option. I prefer greenfield and a minimal cutover with an application connection string change only. There are excellent, proven migration paths to minimize the application downtime due to a database migration. This is another place where our Professional Services SQL Server team can provide value and reliability in executing a successful transition.

Here are a few cutover migration options for you to consider:

Feature                              Notes
Log Shipping                         Cutover measured in (typically) minutes
Replication                          Cutover measured in (potentially) seconds
Backup and Restore                   This is going to take a while for larger databases. However, a full, differential, and T-log backup/restore process can be automated
Filesystem/SAN Copy                  Can also take time
Always On Groups (>= SQL 2012)       Cutover measured in (typically) seconds – rolling upgrade
*Future – SQL on Containers          No downtime – Always On Groups rolling upgrade
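To illustrate the Backup and Restore row above, here is a minimal sketch of a low-downtime cutover built from a full, differential, and tail-log backup chain. The database name SalesDB, the share and file paths, and the logical file names are hypothetical placeholders; treat this as an outline, not a hardened runbook:

    -- On the source instance, ahead of the cutover window:
    BACKUP DATABASE SalesDB TO DISK = N'\\backupshare\SalesDB_full.bak' WITH CHECKSUM;
    BACKUP DATABASE SalesDB TO DISK = N'\\backupshare\SalesDB_diff.bak' WITH DIFFERENTIAL, CHECKSUM;

    -- On the target instance, restore WITH NORECOVERY so additional log can still be applied:
    RESTORE DATABASE SalesDB FROM DISK = N'\\backupshare\SalesDB_full.bak'
        WITH NORECOVERY,
             MOVE N'SalesDB'     TO N'E:\Data\SalesDB.mdf',
             MOVE N'SalesDB_log' TO N'F:\Log\SalesDB_log.ldf';
    RESTORE DATABASE SalesDB FROM DISK = N'\\backupshare\SalesDB_diff.bak' WITH NORECOVERY;

    -- At cutover: stop application writes, take a final log backup on the source, apply it, and recover.
    BACKUP LOG SalesDB TO DISK = N'\\backupshare\SalesDB_tail.trn' WITH CHECKSUM;    -- run on the source
    RESTORE LOG SalesDB FROM DISK = N'\\backupshare\SalesDB_tail.trn' WITH NORECOVERY;
    RESTORE DATABASE SalesDB WITH RECOVERY;    -- database is now online on the new instance

After the database is online, repoint the application connection string to the new instance, which matches the minimal-cutover approach described above.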

A consideration for you — all these tools and references are great — but does your team have the cycles and skills to execute the migration?

3 Pillars to Help You with Your SQL Server Data Modernization Objectives

As I mentioned at the start of this blog, it makes perfect sense to use the experts from Dell Technologies Consulting Services for your SQL Server migration. Our Consulting Services teams are seasoned in SQL modernization processes, including pathways to Azure, Azure Stack, the Dell EMC Ready Solutions for SQL Server, existing hardware, or the cloud, and they align perfectly with a server and storage refresh if you have aging hardware (to be covered in a future blog).

Within Dell Technologies Consulting Services, we support 3 pillars to help you with your SQL Server data modernization objectives.

Platform Consulting Services

Dell Technologies Consulting Services provides planning, design, and implementation of resilient data architectures with Microsoft technology both on-premises and in the cloud. Whether you are installing or upgrading a data platform, migrating and consolidating databases to the cloud (public, private, hybrid), or experiencing performance problems with an existing database, our proven methodologies, best practices, and expertise will help you make a successful transition.

Data Modernization Services

Modernizing your data landscape improves the quality of data delivered to stakeholders by distributing workloads in a cost-efficient manner. Platform upgrades and consolidations can lower the total cost of ownership while efficiently delivering data into the hands of stakeholders. Exploring data workloads in the cloud, such as test/development, disaster recovery, or active archives, provides elastic scale and reduced maintenance.

Business Intelligence and Analytics Services

By putting data-driven decisions at the heart of your business, your organization can harness a wealth of information, gain immediate insights, drive innovation, and create competitive advantage. Dell Technologies Consulting Services provides a complete ecosystem of services that enable your organization to implement business intelligence and create insightful analytics for on-premises, public cloud, or hybrid solutions.

We have in place the proven methodologies and focus to drive a repeatable and successful SQL Server migration.

3 Approaches to Drive a Repeatable and Successful SQL Server Migration

Migration Procedures and Validation Approach

A critical success factor for migrations is ensuring the migration team has a well-documented set of migration processes, procedures and validation plan. Dell Technologies Consulting Services ensure that every team member involved in the migration process has a clear set of tasks and success metrics to measure and validate the process throughout the migration lifecycle.

Tools-based Migration and Validation

Our migration approach includes tools-based automation solutions and scripts to ensure the target state environment is right-sized and configured to exacting standards. We leverage industry standard tools to synchronize data from the source to target environments. Lastly, we leverage a scripted approach for validating data consistency and performance in the target environment.

Post Migration Support

Once the migration event is complete, our consultants remain at the ready for up to 5 business days, providing Tier-3 support to assist with troubleshooting and mitigating any issues that may manifest in the SQL environment as a result of the migration.

Considerations for the Data Platform of the Future

There are a number of pathways for modernizing SQL Server workloads. Customers need flexibility when it comes to the choice of platform, programming languages, and data infrastructure to get the most from their data. Why? In most IT environments, platforms, technologies, and skills are as diverse as they have ever been, so the data platform of the future needs to let you build intelligent applications on any data, any platform, any language, on premises and in the cloud.

SQL Server manages your data, across platforms, on-premises and cloud. The goal of Dell Technologies Consulting Services is to meet you where you are, on any platform, anywhere with the tools and languages of your choice.

SQL Server now has support for Windows, Linux & Docker Containers. Kubernetes orchestration with SQL 2019 coming soon!

Additionally, SQL allows you to leverage the language of your choice for advanced analytics – R and Python.
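As an example of what that looks like in practice, with Machine Learning Services installed and external scripts enabled, T-SQL can hand a result set directly to a Python (or R) script. The trivial script below simply echoes its input and is only a sketch; the input query is a placeholder:

    -- One-time configuration (requires Machine Learning Services; a service restart may be needed).
    EXEC sp_configure 'external scripts enabled', 1;
    RECONFIGURE;

    -- Run a Python script against a T-SQL result set.
    EXEC sp_execute_external_script
        @language = N'Python',
        @script = N'OutputDataSet = InputDataSet',
        @input_data_1 = N'SELECT database_id, name FROM sys.databases';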

Where Can You Move and Modernize these SQL Server Workloads?

Microsoft Azure and Dell EMC Cloud for Microsoft Azure Stack

Migrate your SQL Server databases without changing your apps. Azure SQL Database is the intelligent, fully managed relational cloud database service that provides the broadest SQL Server engine compatibility. Accelerate app development and simplify maintenance using the SQL tools you love to use. Reduce the burden of data-tier management and save time and costs by migrating workloads to the cloud.  Azure Hybrid Benefit for SQL Server provides a cost-effective path for migrating hundreds or thousands of SQL Server databases with minimal effort. Use your SQL Server licenses with Software Assurance to pay a reduced rate when migrating to the cloud. SQL 2008 support will be extended for three years if those IaaS workloads are migrated to Azure or Azure Stack.

Another huge value add, with Azure Stack and SQL Server, is the SQL Server Resource Provider. Think SQL PaaS (Platform as a Service). Here is a quick video I put together on the topic.

The SQL RP does not execute the SQL Server engine. However, it does manage the various SQL Server instances that can be provided to tenants as SKUs for SQL database consumption. These managed SQL Server SKUs can exist on, or off, Azure Stack.

Now that is cool if you need big SQL Server horsepower!

Dell EMC Proven SQL Server Solutions

Dell EMC Proven solutions for Microsoft SQL are purpose-designed and validated to optimize performance with SQL Server 2017. Products such as Dell EMC Ready Solutions for Microsoft SQL help you save the hours required to develop a new database and can also help you avoid the costly pitfalls of a new implementation, while ensuring that your company’s unique needs are met. Our solution also helps you reduce costs with hardware resource consolidation and SQL Server licensing savings by consolidating and simplifying SQL systems onto modern infrastructure that supports mixed DBMS workloads, making them future ready for data analytics and next-generation real-time applications.

Leverage Existing Cloud or Physical Infrastructure

Enable your IT teams with more efficient and effective operational support processes, while reducing licensing costs, solution complexity and time to delivery for new SQL Server services. Dell Technologies Consulting Services has experience with all the major cloud platforms. We can assist with planning and implementation services to migrate, upgrade and consolidate SQL Server database workloads on your existing cloud assets.

Consolidate SQL Workloads

Reduce operating and licensing costs by consolidating SQL Server workloads. With the Microsoft SQL Server per-core licensing model in SQL Server 2012 and above, moving workloads to a virtual/cloud environment can often present significant licensing savings. In addition, Dell Technologies Consulting Services will typically discover within the enterprise SQL server landscape, that there are many SQL Server instances which are underutilized, which presents an opportunity to reduce the CPU core counts or move SQL workloads to shared infrastructure to maximize hardware resource utilization and reduce licensing costs.

Summary

Some migrations will be simple, others much more complex, especially with mission critical databases. If you have yet to kick off this modernization effort, I recommend starting today. The EOL clock is ticking. Get your key stakeholders involved. Show them the data points from your environment. Use the free tools from Microsoft. Send them the link to this blog!

If you continue to struggle or don’t want to go it alone, Dell Technologies Consulting Services, the only holistic SQL Server workload provider, can help your team every step of the way. Take a moment to connect with your Dell Technologies Service Expert today and begin moving forward to a modern platform.

Blogs in the Series

Best Practices to Accelerate SQL Server Modernization (Part I)

Introduction to SQL Server Data Estate Modernization

 

Read Full Blog
SQL Server PowerEdge XtremIO Microsoft

New Dell EMC Ready Solution powers SQL Server, the complete performance platform

Sam Lucido

Mon, 30 Mar 2020 18:46:49 -0000

|

Read Time: 0 minutes

Working on the new Dell EMC Ready Solution for SQL Server was like going from 0 to 60 mph in under 3 seconds. The exhilaration of being pushed into the seat as the road roars past in a blur is absolute fun. That’s what the combination of Dell EMC PowerEdge R840 servers and the new Dell EMC XtremIO X2 storage array did for us in our recent tests.

The classic challenge with most database infrastructures is diminishing performance over time. To use an analogy, it’s like gradually increasing the load a supercar must pull until its 0-to-60 time just isn’t impressive anymore. In the case of databases, the load is input/output operations per second (IOPS). As IOPS increase, response times can slow and database performance suffers. What is interesting is how this performance problem happens over time. As more databases are gradually added to an infrastructure, response times slow by a fraction at a time. These incremental hits on performance can condition application users to accept slower performance—until one day someone says, “Performance was good two years ago but today it’s slow.”

When reading about supercars, we usually learn about their 0-to-60 mph time and their top speed. While the top speed is interesting, how many supercars have you seen race by at 200+ mph? Top speeds apply to databases too. Perhaps you have read a third-party study that devoted a massive hardware infrastructure to one database, thereby showing big performance numbers. If only we had the budget to do that for all our databases, right? Top speeds are fun, but scalability is more realistic as most infrastructures will be required to support multiple databases.

Dell EMC Labs took the performance scalability approach in testing the new SQL Server architecture. Our goals were aggressive: Run 8 virtualized databases per server for a total of 16 databases running in parallel, with a focus on generating significant load while maintaining fast response times. To make the scalability tests more interesting, 8 virtualized databases used Windows Server Datacenter on one server and the other 8 databases used Red Hat Enterprise Linux on another server. Figure 1 shows the two PowerEdge R840 servers and the 8-to-1 consolidation ratio (on each server) achieved in the tests.

Figure 1: PowerEdge R840 servers

Quest Benchmark Factory was used to create the same TPC-E OLTP workload across all 16 virtualized databases. On the storage side, XtremIO X2 was used to accelerate all database I/O. The XtremIO X2 configuration included two X-Brick modules, each with 36 flash drives for a total of 72. According to the XtremIO X2 specification sheet, a 72-drive configuration can achieve 220,000 IOPS at .5 milliseconds (ms) of latency with a mixture of 70 percent reads and 30 percent writes using 8K blocks. Figure 2 shows the two X-Brick configuration of the X2 array with some of the key features that make the all-flash system ideal for SQL Server databases.

Figure 2: XtremIO X2

Before we review the performance findings, let’s talk about IOPS and latency. IOPS is a measure that defines the load on a storage system. This measurement has greater context if we understand the maximum recommended IOPS for a storage system for a specific configuration. For example, 16 databases running in parallel don’t represent a significant load if they are only generating 20,000 IOPS. However, if the same databases generated 200,000 IOPS, as they did on the XtremIO X2 array that we used in our tests, then that’s a significant workload. Thus, IOPS are important in understanding the load on a storage system.

Response time and latency are used interchangeably in this blog and refer to the amount of time used to respond to a request to read or write data. Latency is our 0-to-60 metric that tells us how fast the storage system responds to a request. Just like with supercars, the lower the time, the faster the car and the storage system. Our goal was to determine if average read and write latencies remained under .5 ms.
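On your own systems, a quick way to observe the same read and write latency metrics is the sys.dm_io_virtual_file_stats DMV, which accumulates I/O counts and stall times per database file since the instance last started. A simple sketch (average latency here is total stall time divided by I/O count):

    SELECT  DB_NAME(vfs.database_id) AS database_name,
            mf.physical_name,
            vfs.num_of_reads,
            vfs.num_of_writes,
            -- Guard against divide by zero on idle files.
            CASE WHEN vfs.num_of_reads  = 0 THEN 0
                 ELSE vfs.io_stall_read_ms  * 1.0 / vfs.num_of_reads  END AS avg_read_latency_ms,
            CASE WHEN vfs.num_of_writes = 0 THEN 0
                 ELSE vfs.io_stall_write_ms * 1.0 / vfs.num_of_writes END AS avg_write_latency_ms
    FROM    sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
    JOIN    sys.master_files AS mf
            ON mf.database_id = vfs.database_id AND mf.file_id = vfs.file_id
    ORDER BY avg_read_latency_ms DESC;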

Looking at IOPS and latency together brings us to our overall test objective. Can this SQL Server solution remain fast (low latency) under a heavy IOPS load? To answer this question is to understand if the database solution can scale. Scalability is the capability of the database infrastructure to handle increased workload with minimal impact to performance. The greater the scalability of the database solution, the more workload it can support and the greater return on investment it provides to customers. So, for our tests to be meaningful we must show a significant load; otherwise, the database system has not been challenged in terms of scalability.

We broke the achievable IOPS barrier of 220,000 IOPS by more than 55,000 IOPS! In large part, the PowerEdge R840 servers enabled the SQL Server databases to really push the OLTP workload to the XtremIO X2 array. We were able to simulate overloading the system by placing a load that is greater than recommended. In one respect we were impressed that XtremIO X2 supported more than 275,000 IOPS, but then we were concerned that there might have been a trade-off with performance.

The average latency for all physical reads and writes was under .5 ms. So not only did the SQL Server solution generate a large database workload, the XtremIO X2 storage system maintained consistently fast latencies throughout the tests. The test results show that this database solution was designed for performance scalability: The system maintained performance under a large workload across 16 databases. Figure 3 summarizes the test findings.

Figure 3: Summary of test findings

The capability to scale without having to invest in more infrastructure provides greater value to customers. Would I recommend pushing the new SQL Server solution past its limits like Dell EMC Labs did in testing for scalability? No. Running database tests involves achieving a steady state of performance that is uncharacteristic of real-world production databases. Production databases have peak processing times that must be planned for so that the business does not experience any performance issues. Dell EMC has SQL Server experts who can design the Ready Solution for different workloads. In my opinion, one of the key strengths of this solution is that each physical component can be sized to address database requirements. For example, the number of servers might need to be increased, but no additional investment is necessary in XtremIO X2, thus saving the business money.

If I were to address just one other topic, I would pick the space savings achieved with a 1 TB SQL Server database. In figure 4, test results show a 3.52-to-1 data reduction ratio, which translates to a 71.5 percent space savings for a 1 TB database on the XtremIO X2 array. Always-on inline data reduction saves space by writing only unique blocks and then compressing those blocks to storage. The value of inline data reduction is the resulting ability to consolidate more databases to the XtremIO X2 array.

Figure 4: XtremIO X2 inline data reduction

Are you interested in learning how SQL Server performed on Windows Server Datacenter edition and Red Hat Enterprise Linux Server? I recommend reading the design guide for Dell EMC XtremIO X2 with PowerEdge R840 servers. The validation and use case section of that guide takes the reader through all the performance findings. Or schedule a meeting with your local Microsoft expert at Dell EMC to explore the solution.

Why Ready Solutions for Microsoft SQL?

The Ready Solutions for Microsoft SQL Server team at Dell EMC is a group of SQL Server experts who are passionate about building database solutions. All of our solutions are fully integrated, validated, and tested. Figure 5 shows how we approach developing database solutions. Many of us have been on the customer or consulting side of the business, and these priorities reflect our passion to develop specialized database solutions that are faster and more reliable.

Figure 5: Our database solutions development approach

I hope you enjoyed this blog. If you have any questions, please contact me.

Additional Resources:

Read Full Blog
SQL Server Microsoft Live Optics

How IT organizations benefit from the Dell EMC Live Optics monitoring tool

Anil Papisetty

Mon, 30 Mar 2020 18:46:49 -0000

|

Read Time: 0 minutes

How can a monitoring tool bring value to any organization? Why is it, in fact, essential for any IT organization? And do IT organizations need to invest in monitoring to see the value prop?

This blog provides an overview of how organizations benefit from monitoring. Later in the blog, you will get to know about the Live Optics free online software that Dell EMC introduced in 2017 and the value proposition that this monitoring software brings to your organization.

Traditional Monitoring Challenges

Traditionally, infrastructure was managed and monitored by IT engineers who would log in to each device or server and check the disk space, memory, processor, network gear, and so on. This required a lot of manual effort and time to identify the issues. It was difficult for IT engineers to proactively predict the issues, and so their efforts were typically reactive. Later, due to the rapid changes and evolution of technologies, consolidated monitoring tools were introduced to help IT administrators analyze the environment, foresee threats, detect anomalies, and provide end-to-end dashboard reports of the environment.

 Today’s digital transformation has triggered a growth in the number of products, resources, and technologies. The challenge for organizations is investing budget and time into monitoring solutions that can enable greater efficiencies on premises, in the public cloud, or in a blended hybrid environment. The goal is to move away from the traditional labor-intensive monitoring that uses scripts and relies on knowledge experts to automated monitoring that enables IT engineers to focus more on innovation.

Modern Monitoring Tool Benefits

Here are the benefits of monitoring and how it plays a major role in your organization's growth:

  • Never miss a beat: Helps to prevent and reduce downtime and business losses by actively monitoring the heartbeat of the server’s IT infrastructure
  • Faster alerts: Identifies business interruptions and actively monitors and alerts via email, mobile calls, text, and instant messages
  • Comprehensive view: Helps to resolve uncertainty and provides understanding on how end-to-end infrastructure and its applications work and perform
  • Insights: Recommends upgrades, identifies architectural or technical hiccups, and tracks the smooth transitions (technology upgrades, migrations, and third-party integrations)
  • Budgeting and planning: Enables the IT organization to develop a plan for future projects and costs
  • Protecting against threats: Helps to detect early threats or problems to mitigate risks
  • Analytics: Incorporates analytics and machine learning techniques to analyze live data and to bring about greater improvements in productivity and performance
  • Rich dashboard reporting: Powers BI integration and capturing of consolidated dashboard reports for management leads

In a monitoring context, artificial intelligence plays a major role by proactively tuning or fixing issues—by sending notifications to the appropriate team or individual, or even automatically creating a ticket in a service desk and assigning it to a queue.

Dell EMC Live Optics

We live in a world of constantly changing products. New features are added, competitive features are enhanced, new alternatives are introduced, and prices are changed. To address these changes, Dell EMC offers Live Optics, free online software that helps you to collect, visualize, and share data about your IT environment and workloads.  Live Optics is an agentless monitoring tool that you can set up in minutes.


Eliminate overspending and speed decision-making in your IT environment. Live Optics captures performance, software, OS distribution, and VM data for time frames ranging from a few hours to one week. Live Optics lets you share IT performance and workload data characteristics securely and anonymously. You can collaborate with peers, vendors, or channel partners without compromising security. With Live Optics, you can even model project requirements to gain a deeper knowledge of workloads and their resource requirements.

What is the customer value proposition? Live Optics brings together customer intelligence, competitive insight, and product valuation. Here are ways in which Dell EMC Live Optics can bring value to any IT organization:

 Host Optics:

  • Software for data collection—platform and hardware agnostic; physical or virtual
  • Support for all major operating systems and virtualization platforms
  • Quite often, all the information you need to make a recommendation

 Workload or File Optics:

  • Intense workload-specific assessments for diagnostic issues
  • Rapid file characterization of unstructured data
  • Data archival candidacy
  • Data compression estimates using proprietary algorithms

 Hardware Optics:

  • Performance and configuration retrieved on supported platforms via API and/or file processing
  • Custom options for support of proprietary, OEM-specific APIs

Please reach out to our Live Optics support team at https://support.liveoptics.com/hc/en-us for assistance. We can help in the following ways:

  • Consultation
  • Deployment
  • Adoption
  • Support
  • Optimization

One of my favorite idioms is "Health is wealth." In the same way, the wealth of an IT environment is measured by the health of an organization’s IT infrastructure.

Important links:
https://www.liveoptics.com
https://support.liveoptics.com/hc/en-us
https://www.youtube.com/LiveOptics
https://app.liveoptics.com/Account/Login?ReturnUrl=%2f
https://support.liveoptics.com/hc/en-us/community/topics


Read Full Blog
SQL Server Microsoft

How SQL Server can protect your digital currency

Anil Papisetty

Mon, 30 Mar 2020 18:46:49 -0000

|

Read Time: 0 minutes

Do you wonder if your data is under attack? When should we worry about whether our data is safe and secure? What precautionary steps have we taken to protect data? Can we eliminate data breaches? In this article I want to introduce some of the great security features built into SQL Server 2017. No product can prevent all risk of data loss or unauthorized access. The best defense is a combination of good products, knowledgeable people, and rigorous processes designed with data protection in mind at all levels of the organization.

Let us start by understanding what data is.

There were relatively few methods to create and share data before the advent of computers – primarily paper and film. Today there are many ways to create, store, and access digital data (0s and 1s). Data may be a collection of raw facts; it may be numbers or words; it may be recorded information about something or someone; and in digital terms, data is binary. Digital data is much easier to create, share, transfer, and store in forms such as email, digital images, digital movies, and e-books, but it is also much more difficult to secure.

The digital data ecosystem

Most data can be classified as structured or unstructured, and most of the data being created today is unstructured. With the advancement of computer and communication technologies, the rate of data generation and sharing has increased exponentially.

In simple terms, structured data is typically stored using a database management system (DBMS) in rows and columns and is easily searchable by basic algorithms. Unstructured data is pretty much everything else and does not have a predefined data model; its unstructured nature makes it much more difficult to retrieve and process. Numerous sources and techniques (data mining, natural language processing (NLP), and text analysis) are evolving rapidly across the industry to analyze, derive value from, manage, and store both unstructured and structured data.

In 1998, Merrill Lynch cited a rule of thumb that somewhere around 80 to 90 percent of all potentially usable business information may originate in unstructured form. IDC and EMC projected that data will grow to 40 zettabytes by 2020.

https://www.kdnuggets.com/2012/12/idc-digital-universe-2020.html

https://www.emc.com/about/news/press/2012/20121211-01.htm

The chart below shows the amount of data generated every minute on social media, according to Domo's Data Never Sleeps 5.0 report.

https://web-assets.domo.com/blog/wp-content/uploads/2017/07/17_domo_data-never-sleeps-5-01.png

It is not necessary to store all the unwanted data. IDC predicts that by 2025 nearly 20% of the data in the global datasphere will be critical to our daily lives. Organizations should have a plan for storing the right amount of data and for extracting its business value, its value for the human experience, and its personal value. That's the choice, and it's a definitive challenge.

The following chart provides a view of the total number of records containing personal and other sensitive data that were compromised between January 2017 and March 2018.

Gartner forecasts that total spending on cyber security by organizations worldwide will be up 8% from 2017, to a predicted $96 billion in 2018.

What could possibly go wrong?

A data breach is when confidential information is exposed or compromised by intentional or unintentional means.

Malware: Any type of malicious software, including viruses, worms, ransomware, spyware, and Trojans, that gains access to and damages a computer without the knowledge of the owner. Malware is usually injected and installed on a machine by tricking the user into installing or accessing a program from the internet.


Password attack: Brute-force attacks can be very successful at gaining access to systems with insecure passwords. 81% of confirmed data breaches involved weak, default, or stolen passwords.

Phishing: Capitalizing on our apparent human need to click things, phishing campaigns try to get the recipient to open an infected attachment or click an equally infectious link.

Social engineering: Email or phone contact with an authorized user of sensitive data to obtain personal information that can be used in an attack to gain unauthorized access.

An ounce of prevention is worth a pound of cure.

SQL Server 2017 is equipped with many features to help secure and protect your data from breaches. With SQL Server, security is so well integrated that it is largely something you just turn on. For example, it is extremely easy to encrypt data on disk, on the wire, and in memory, which is big.

  • Always Encrypted (secure at rest and in motion): Large amounts of data lead to added complexity. Data is queried, transmitted, backed up, and replicated nearly constantly. With all that activity, any link in the chain could be a potential vulnerability. Always Encrypted enables encryption of sensitive data inside the application and on the wire, while never revealing the encryption keys to the database engine. As a result, Always Encrypted provides a separation between those who own the data and those who manage the data.
  • SQL Dynamic Data Masking prevents the abuse of sensitive data by controlling which users can access the unmasked data (see the sketch after this list).
  • SQL Server Authentication ensures that only authorized users have access by requiring valid credentials to access the data in databases.

https://docs.microsoft.com/en-us/sql/relational-databases/security/choose-an-authentication-mode?view=sql-server-2017

  • SQL Server 2017 Audit is the primary auditing tool in SQL Server, enabling you to track and log server-level events as well as individual database events. It uses extended events to help create and run audit-related events. The SQL Server audit components are the server audit, the server audit specification, and the database audit specification (a brief sketch of these three components appears below).
  • Row-Level Security helps the database administrator restrict access to specific rows in a database table for a given engineer or user (also sketched below). This makes the security system more reliable and robust by reducing the system's surface area.
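To make the Dynamic Data Masking and Row-Level Security bullets concrete, here is a minimal T-SQL sketch. The table dbo.Customers, the columns CustomerEmail and SalesRepName, and the function and policy names are hypothetical; the syntax follows the SQL Server 2017 features named above, but treat this as an illustration rather than a hardened design:

    -- Dynamic Data Masking: mask email addresses for users who lack the UNMASK permission.
    ALTER TABLE dbo.Customers
        ALTER COLUMN CustomerEmail ADD MASKED WITH (FUNCTION = 'email()');

    -- Row-Level Security: each sales rep sees only rows where SalesRepName matches their user name.
    CREATE FUNCTION dbo.fn_SalesRepFilter (@SalesRepName AS sysname)
    RETURNS TABLE
    WITH SCHEMABINDING
    AS
        RETURN SELECT 1 AS allowed
               WHERE @SalesRepName = USER_NAME();
    GO

    CREATE SECURITY POLICY SalesRepPolicy
        ADD FILTER PREDICATE dbo.fn_SalesRepFilter(SalesRepName) ON dbo.Customers
        WITH (STATE = ON);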

SQL Server provides enterprise-grade security capabilities on Windows and on Linux, all built in.
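The three audit components mentioned above map directly to T-SQL objects. A minimal sketch, assuming a hypothetical database SalesDB, table dbo.Orders, and audit file path D:\Audit\:

    -- 1. The server audit defines where audit records are written.
    CREATE SERVER AUDIT Orders_Audit TO FILE (FILEPATH = N'D:\Audit\');
    ALTER SERVER AUDIT Orders_Audit WITH (STATE = ON);

    -- 2. A server audit specification captures server-level events, such as failed logins.
    CREATE SERVER AUDIT SPECIFICATION FailedLogin_Spec
        FOR SERVER AUDIT Orders_Audit
        ADD (FAILED_LOGIN_GROUP)
        WITH (STATE = ON);

    -- 3. A database audit specification captures database-level events, such as SELECTs on a table.
    USE SalesDB;
    GO
    CREATE DATABASE AUDIT SPECIFICATION Orders_Select_Spec
        FOR SERVER AUDIT Orders_Audit
        ADD (SELECT ON OBJECT::dbo.Orders BY public)
        WITH (STATE = ON);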

 

 

Protect Data:

  • Transparent Data Encryption
  • Backup Encryption
  • Cell-level Encryption
  • Transport Layer Security (SSL/TLS)
  • Always Encrypted

Control Access (Database Access/Application Access):

  • SQL Server Authentication
  • Active Directory Authentication
  • Granular Permissions
  • Row-Level Security
  • Dynamic Data Masking

Monitor Access:

  • Tracking Activities (Fine-grained Audit)
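As a brief illustration of two of the Protect Data items above, here is a minimal sketch of enabling Transparent Data Encryption and taking an encrypted backup. The database name SalesDB, the certificate names, the password placeholder, and the backup path are hypothetical; in practice, back up the certificates and keys and follow your key-management policy:

    USE master;
    GO
    -- A database master key and certificates protect the encryption keys.
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password here>';
    CREATE CERTIFICATE TDE_Cert    WITH SUBJECT = 'TDE certificate for SalesDB';
    CREATE CERTIFICATE Backup_Cert WITH SUBJECT = 'Backup encryption certificate';
    GO

    -- Transparent Data Encryption: encrypt the database files at rest.
    USE SalesDB;
    GO
    CREATE DATABASE ENCRYPTION KEY
        WITH ALGORITHM = AES_256
        ENCRYPTION BY SERVER CERTIFICATE TDE_Cert;
    GO
    ALTER DATABASE SalesDB SET ENCRYPTION ON;

    -- Backup encryption: protect the backup file with a separate certificate.
    BACKUP DATABASE SalesDB
        TO DISK = N'D:\Backup\SalesDB_encrypted.bak'
        WITH ENCRYPTION (ALGORITHM = AES_256, SERVER CERTIFICATE = Backup_Cert);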

Summary: Digitization has led to an explosion of new data that is not expected to abate anytime soon. As data continues to play a vital role in our future, cyber criminals are causing organizations to spend ever-increasing amounts of money every year to protect data. It is important that organizations get the most value from these investments in data protection.

DATA IS DIGITAL CURRENCY

Read Full Blog