(SSAS) Tabular Modeling
Beginner | Intermediate | Advanced
When it comes to query languages for relational database management systems, SQL is the industry benchmark. This hands-on course will help you advance your knowledge and keep pace with today’s SQL standards. You will practice designing, writing, and testing complex SQL queries to execute on a practice database using SQL Server Express.
What you will learn
- Write complex SQL queries to execute on a SQL Server database
- Design and write SQL queries to filter, group, and summarize complex data subsets to retrieve specific data
- Design and write SQL sub-queries and queries using functions, joins, and unions
- Explore the benefit of stored procedures as you practice writing SQL user-defined stored procedures and trigger queries
How you will benefit
- Expand your knowledge of SQL, the standard database query language
- Become more confident in your ability to understand and write SQL queries
- Open the door to new career opportunities as a SQL developer
How the course is taught
- Instructor-led or self-paced online course
- 8 weeks or 12 months of access
- 80 course hours
SQL Server Analysis Services – Instructor Led
SQL Server Analysis Services – Self Paced
Add Data
In this lesson, you will use the Table Import Wizard in SQL Server Data Tools to connect to the AdventureWorksDW2012 SQL database; select, preview, and filter the data; and then import the data into your model workspace. By using the Table Import Wizard, you can import data from a variety of relational sources: Access, SQL Server, Oracle, Sybase, Informix, DB2, Teradata, and more. The steps for importing data from each of these relational sources are very similar to what is described below. Additionally, data can be selected using a stored procedure.
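For illustration, if you choose the wizard's option to write a query rather than select whole tables, a source query might look like the following sketch. The table and column names come from AdventureWorksDW2012; the date cutoff is an assumption used only to show row filtering at import time.

```sql
-- Hypothetical source query for the Table Import Wizard's
-- "write a query" option: pick only the needed columns and
-- filter rows before they reach the model.
SELECT
    ProductKey,
    OrderDateKey,
    OrderQuantity,
    SalesAmount
FROM dbo.FactInternetSales
WHERE OrderDateKey >= 20050101;  -- assumed cutoff, for illustration only
```

Filtering in the source query keeps the imported table (and the model's memory footprint) as small as the reporting requirements allow.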
Rename Columns
In this lesson, you will rename many of the columns in each table you imported. Renaming makes columns more identifiable and easier to navigate, both in the model designer and for users selecting fields in a client application.
Mark as Date Table
In Lesson 2: Add Data, you imported a dimension table named DimDate. In Lesson 3: Rename Columns, you then renamed the DimDate table to simply Date. While in your model this table is now named Date, it can also be known as a Date table, in that it contains date and time data. Whenever you use Time Intelligence functions in calculations, as you will do when you create measures a little later, you must specify a Date table and a unique identifier Date column in that table. You can then create valid relationships between other tables and the Date table, which are necessary for calculations using DAX time intelligence functions. In this lesson, you will mark the imported and renamed Date table as the Date table, and the Date column (in the Date table) as the Date column (unique identifier). All this use of the name Date can get confusing, but you’ll soon get the idea.
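Once the Date table is marked, DAX time intelligence functions can rely on it. As a minimal sketch (the measure name and the 'Internet Sales'[Sales Amount] column are illustrative assumptions, not objects defined in this outline):

```dax
-- Year-to-date sales, sketched with TOTALYTD. The second argument must
-- reference the unique-identifier Date column of the marked Date table.
Sales YTD := TOTALYTD ( SUM ( 'Internet Sales'[Sales Amount] ), 'Date'[Date] )
```

If the table is not marked as the Date table, time intelligence functions such as TOTALYTD may return incorrect results when the relationship uses a surrogate key.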
Create Relationships
In this lesson, you will verify the relationships that were created automatically when you imported data and add new relationships between different tables. A relationship is a connection between two tables that establishes how the data in those tables should be correlated. For example, the Product table and the Product Subcategory table have a relationship based on the fact that each product belongs to a subcategory.
Create Calculated Columns
In this lesson, you will create new data in your model by adding calculated columns. A calculated column is based on data that already exists in the model.
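As a minimal sketch, a calculated column added to the Product table might pull a value across an existing relationship; the table and column names here are illustrative assumptions:

```dax
-- Calculated column formula, evaluated row by row in the Product table.
-- RELATED follows the Product -> Product Subcategory relationship and
-- stores the looked-up name alongside each product row.
=RELATED ( 'Product Subcategory'[Product Subcategory Name] )
```

Because the result is computed once per row and stored in the model, calculated columns are best suited for values you want to slice or filter by, rather than for aggregations.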
Create Measures
In this lesson, you will create measures to be included in your model. Like the calculated columns you created in the previous lesson, a measure is essentially a calculation created using a DAX formula. Unlike calculated columns, however, measures are evaluated against a filter set by the user, for example, a particular column or slicer added to the Row Labels field in a PivotTable. A value for each cell in the filter is then calculated by the applied measure. Measures are powerful, flexible calculations that you will want to include in almost all tabular models to perform dynamic calculations on numerical data.
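A minimal measure sketch (the measure, table, and column names are assumptions) shows the difference: the formula is written once, but its result varies with each cell's filter context.

```dax
-- A simple measure: SUM is re-evaluated in the filter context of each
-- PivotTable cell, so one formula yields a different value for every
-- row, column, and slicer combination.
Internet Total Sales := SUM ( 'Internet Sales'[Sales Amount] )
```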
Create Key Performance Indicators
In this lesson, you will create Key Performance Indicators (KPIs). KPIs are used to gauge performance of a value, defined by a Base measure, against a Target value, also defined by a measure or by an absolute value. In reporting client applications, KPIs can provide business professionals a quick and easy way to understand a summary of business success or to identify trends.
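A KPI pairs a Base measure with a Target. As an illustrative sketch (all names, and the 10% growth target, are assumptions):

```dax
-- Base measure for the KPI.
Total Sales := SUM ( 'Internet Sales'[Sales Amount] )

-- Target measure: the prior year's sales plus 10% growth. The KPI
-- definition then compares [Total Sales] to this target against
-- status thresholds you set in the KPI dialog box.
Target Sales := 1.1 * CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )
```

Defining the Target as a measure rather than an absolute value keeps the KPI meaningful at every level of a report, since the target recalculates for each filter context just as the Base does.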
Create Perspectives
In this lesson, you will create an Internet Sales perspective. A perspective defines a viewable subset of a model that provides focused, business-specific, or application-specific viewpoints. When users connect to a model using a perspective, they see only those model objects (tables, columns, measures, hierarchies, and KPIs) defined as fields in that perspective. The Internet Sales perspective you create in this lesson will exclude the Customer table object. When you create a perspective that excludes certain objects from view, those objects still exist in the model; they are simply not visible in a reporting client field list. Calculated columns and measures, whether or not they are included in a perspective, can still calculate from object data that is excluded. The purpose of this lesson is to describe how to create perspectives and to become familiar with the tabular model authoring tools. If you later expand this model to include additional tables, you can create additional perspectives to define different viewpoints of the model, for example, Inventory and Sales Force.
Create Hierarchies
In this lesson, you will create hierarchies. Hierarchies are groups of columns arranged in levels; for example, a Geography hierarchy might have sub-levels for Country, State, County, and City. Hierarchies can appear separate from other columns in a reporting client application field list, making them easier for client users to navigate and include in a report.
Create Partitions
In this lesson, you will create partitions to divide the Internet Sales table into smaller logical parts that can be processed (refreshed) independently of other partitions. By default, every table you include in your model has one partition, which includes all of the table’s columns and rows. For the Internet Sales table, we want to divide the data by year: one partition for each of the table’s five years. Each partition can then be processed independently.
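Each partition is typically defined by a SQL query that selects a non-overlapping slice of the source rows. A sketch for one year might look like the following; the table name is from AdventureWorksDW2012, and the year boundaries are assumptions:

```sql
-- Partition "Internet Sales 2012": one year's slice of the fact table.
-- Adjacent partitions reuse the same query with shifted date bounds,
-- so no row is loaded into more than one partition.
SELECT *
FROM dbo.FactInternetSales
WHERE OrderDate >= '2012-01-01'
  AND OrderDate <  '2013-01-01';
```

With this layout, refreshing only the current year's partition avoids reprocessing the earlier, unchanging years.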
Create Roles
In this lesson, you will create roles. Roles provide model database object and data security by limiting access to only those Windows users who are role members. Each role is defined with a single permission: None, Read, Read and Process, Process, or Administrator. Roles can be defined during model authoring by using the Role Manager dialog box in SQL Server Data Tools. After a model has been deployed, you can manage roles by using SQL Server Management Studio.
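In addition to a permission, a role with Read access can carry DAX row filters that restrict which rows its members see. A hypothetical filter on the Customer table (the column name and value are assumptions):

```dax
-- Row filter entered for the Customer table in the Role Manager dialog:
-- members of this role see only United States customers, and that
-- restriction propagates through relationships to related fact rows.
='Customer'[Country Region Name] = "United States"
```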
Analyze in Excel
In this lesson, you will use the Analyze in Excel feature in SQL Server Data Tools to open Microsoft Excel, automatically create a data source connection to the model workspace, and automatically add a PivotTable to the worksheet. The Analyze in Excel feature is meant to provide a quick and easy way to test the efficacy of your model design prior to deploying your model. You will not perform any data analysis in this lesson. The purpose of this lesson is to familiarize you, the model author, with the tools you can use to test your model design. Unlike using the Analyze in Excel feature, which is meant for model authors, end-users will use client reporting applications such as Excel or Power View to connect to and browse deployed model data. In order to complete this lesson, Excel must be installed on the same computer as SQL Server Data Tools.
Deploy
In this lesson, you will configure deployment properties, specifying a deployment server instance of Analysis Services running in Tabular mode and a name for the model you deploy. You will then deploy the model to that instance. After it is deployed, users can connect to the model by using a reporting client application.