Saturday, February 28, 2015

How to stop users seeing all databases by default

Here is the scenario. You have created a login called "Jane" without adding it to any server role. You are 100% sure that she has not been given any permission on any database and that she has no default database. Here is the statement you use:

CREATE LOGIN Jane
WITH PASSWORD = 'Pa$$w0rd'
, CHECK_POLICY = OFF;

Now she logs into SQL Server as below:



And she sees Object Explorer. Not only that, she can expand the Databases node and see all databases.



Of course, she cannot go into a database and see its content, but this is still an issue in many cases: why should we let Jane see even the names of the databases?

This is possible by default because of the public server role. By default, the public server role has the VIEW ANY DATABASE permission, and all logins are automatically added to the public role.



If you need to stop this, change the permission as below (you can easily change this using the GUI too).

DENY VIEW ANY DATABASE TO Public;

Now Jane cannot see even the names of databases unless she has been granted permission on them.
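
Note that even with the DENY in place, a login can still see master, tempdb, and any database it owns. As a sketch (the database name SalesDB is hypothetical), making Jane the owner of a database brings it back into her view:

```sql
-- Hypothetical database: once Jane becomes its owner, she can see it
-- in Object Explorer even though VIEW ANY DATABASE is denied.
ALTER AUTHORIZATION ON DATABASE::SalesDB TO Jane;
```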


Friday, February 27, 2015

SQL Server Security Concepts: Securables, Principals and Permissions

When discussing and learning security related to SQL Server, it is always better to know the key concepts and terms involved. Some of these concepts are not specific to SQL Server; they apply to almost all applications in terms of security. In its simplest form, security is all about allowing someone or something to access a resource and perform actions on it. Here are some terms used when describing it:

Securables
A securable is a resource to which permissions can be assigned. SQL Server has securables at multiple levels of a hierarchical architecture. The hierarchy starts at the server level, which includes securables such as Endpoints, Logins, Server Roles and Databases. These are called server-level securables, or server-scope securables.

The next level of the hierarchy is the database. Database-level, or database-scope, securables include items like Users, Database Roles, Certificates and Schemas.

SQL Server includes securables at the schema level too. These are called schema-scope securables and include resources like Tables, Views, and Procedures.

Principals
The someone or something that performs actions on securables is called a principal. There are three types of principals related to SQL Server security: Windows principals, SQL Server principals, and database principals.

Windows principals and SQL Server principals are considered server-level principals. Windows principals are generally domain or local server user accounts or groups that are used to connect to a SQL Server instance. Authentication is done by either the local server or the domain controller, and SQL Server trusts the account without performing authentication itself. SQL Server principals are logins created at the SQL Server instance, and authentication is done by SQL Server itself.

Database principals include database users, database roles and application roles.

** Some principals are also securables. As an example, a login is a principal as well as a securable. It is a principal because it can access the SQL Server instance, and it is also a securable because actions that require permission, such as enabling and disabling, can be performed on it.

Permissions
Permissions allow principals to perform actions on securables. There are two types of permissions related to SQL Server: statement permissions and object permissions.

Statement permissions refer to actions that can be performed by executing a statement. A principal creating a table using the CREATE TABLE statement, having been granted the CREATE TABLE permission, is an example of this.

Object permissions refer to actions that can be performed on securables. A principal having the SELECT permission on a table is an example of this.
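
The three concepts meet in a GRANT statement: a permission, on a securable, to a principal. A minimal sketch (the user name Jane and table dbo.Orders are hypothetical):

```sql
-- Object permission: SELECT on the securable dbo.Orders,
-- granted to the database principal Jane
GRANT SELECT ON OBJECT::dbo.Orders TO Jane;

-- Statement permission: allows Jane to execute CREATE TABLE
GRANT CREATE TABLE TO Jane;
```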

This image shows how these concepts work together. Note that the image has been taken from an article in TechNet Magazine. You can refer to the article at: https://technet.microsoft.com/en-us/magazine/2009.05.sql.aspx


Tuesday, February 24, 2015

Gartner Magic Quadrant 2015 for Business Intelligence and Analytics Platforms is published

Gartner has released its Magic Quadrant for Business Intelligence and Analytics Platforms for 2015. Just like the 2014 edition, it shows the business intelligence market leaders, their progress, and their positions in terms of business intelligence and analytics capabilities.

Here is the summary of it:


This is how it was in 2014:


As you can see, Microsoft's position has changed a bit, but it is still in the Leaders quadrant. As per the published document, the main reasons for Microsoft's position are its strong product vision, future road map, clear understanding of market desires, and easy-to-use data discovery capabilities. However, since Power BI is yet to be completed in terms of complexities and limitations, and its stand-alone version, including some key analytics-related capabilities (such as Azure ML), is still in preview, the market acceptance rate is still low.

You can get the published document from various sites, here is one if you need: http://www.birst.com/lp/sem/gartner-lp?utm_source=Birst&utm_medium=Website&utm_campaign=BI_Q115_Gartner%20MQ_WP_Website

Monday, February 23, 2015

Tools for carrying out performance monitoring and tuning of SQL Server

Regular and well-defined monitoring helps guarantee the smooth running of SQL Server. Based on your monitoring goals, you should select the appropriate tools, and for that you have to know the tools supported by SQL Server. Microsoft SQL Server offers a number of tools that can be used for performance monitoring and tuning of SQL Server databases. Here is the list of tools available:

  1. Activity Monitor
    This tool can be opened from Management Studio and gives a detailed view of current activities. It includes five sections: Overview, Processes, Resource Waits, Data File I/O, and Recent Expensive Queries.
    https://msdn.microsoft.com/en-us/library/hh212951.aspx
    http://www.kodyaz.com/sql-server-tools/sql-server-activity-monitor-tool.aspx
  2. Dynamic Management View and Functions
    These views and functions return state information that can be used to monitor the health of an instance, diagnose problems, and tune performance. There are two types of views and functions: server-scoped and database-scoped. They are among the best tools for monitoring, specifically for ad-hoc monitoring.
    https://msdn.microsoft.com/en-us/library/ms188754.aspx
    http://download.red-gate.com/ebooks/SQL/eBook_Performance_Tuning_Davidson_Ford.pdf

  3. Performance Monitor
    This Windows administrative tool allows you to track resource usage on the Microsoft operating system and can be used to monitor information specific to SQL Server. It is used both as a monitoring tool for identifying trends over a period of time and as an ad-hoc monitoring tool for identifying resource bottlenecks responsible for performance issues.
    http://www.brentozar.com/archive/2006/12/dba-101-using-perfmon-for-sql-performance-tuning/
    https://www.youtube.com/watch?v=nAtlan1qgso
  4. SQL Server Profiler
    This graphical tool allows you to capture a trace of events occurring in SQL Server. It is heavily used for seeing current T-SQL activities, and the captured information can be saved for further analysis. The tool also allows captured events to be replayed.
    ** This tool has been deprecated since SQL Server 2012; instead, use Extended Events for capturing events and Distributed Replay for replaying them.
    https://msdn.microsoft.com/en-us/library/ms181091.aspx
    http://www.codeproject.com/Articles/21371/SQL-Server-Profiler-Step-by-Step
  5. SQL Trace
    This is the T-SQL way of getting the same tracing facility as SQL Server Profiler. Since it does not provide a GUI, it is a bit more difficult to set up, but it is not as heavy as Profiler and leverages T-SQL features.
    https://technet.microsoft.com/en-us/library/ms191006(v=sql.105).aspx
    http://blogs.msdn.com/b/sqlsecurity/archive/2008/12/12/how-to-create-a-sql-trace-without-using-sql-profiler.aspx
  6. Database Engine Tuning Advisor
    This tool facilitates analyzing queries or a workload and getting recommendations on indexes and statistics. It is useful for determining the best indexes for queries and for identifying less-efficient indexes that have been added.
  7. Distributed Replay
    This is an advanced tool that supports replaying a captured workload across a distributed set of servers. It is useful for assessing the impact of SQL Server upgrades, hardware upgrades, and operating system upgrades.
    https://msdn.microsoft.com/en-us/library/ff878183.aspx
    http://blogs.msdn.com/b/mspfe/archive/2012/11/08/using-distributed-replay-to-load-test-your-sql-server-part-1.aspx
    https://msdn.microsoft.com/en-us/library/ee210548.aspx

  8. SQL Server Extended Events
    This is a highly scalable and configurable architecture that offers a lightweight system with a UI for collecting information to troubleshoot SQL Server.
    https://technet.microsoft.com/en-us/library/bb630354(v=sql.105).aspx
    https://www.simple-talk.com/sql/database-administration/getting-started-with-extended-events-in-sql-server-2012/

  9. SQL Server Data Collection
    This is an automated system for collecting, storing, and reporting performance data for multiple SQL Server instances.
    https://msdn.microsoft.com/en-us/library/bb677179.aspx
    http://blog.sqlauthority.com/2010/04/13/sql-server-configure-management-data-collection-in-quick-steps-t-sql-tuesday-005/

  10. SQL Server Utility Control Point
    This is a centralized management portal for monitoring multiple instances of SQL Server based on specific collection sets.
    https://technet.microsoft.com/en-us/library/ee210548(v=sql.120).aspx
    http://sqlmag.com/sql-server-2008/introducing-sql-server-utility
  11. Microsoft System Center Operations Manager
    This is an enterprise-level infrastructure management solution that uses management packs to collect performance and health information from Windows and application services such as SQL Server. SQL Server has a management pack that enables you to create exception-driven events for resolving specific issues.
    http://channel9.msdn.com/Events/TechEd/NorthAmerica/2014/DCIM-B419#fbid=
    http://www.microsoft.com/en-us/download/details.aspx?id=10631
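
Of the tools above, Dynamic Management Views are the quickest to try from a query window. A minimal sketch (requires the VIEW SERVER STATE permission):

```sql
-- Server-scoped DMV: top waits accumulated since the instance started
SELECT TOP (10) wait_type, wait_time_ms, waiting_tasks_count
FROM sys.dm_os_wait_stats
ORDER BY wait_time_ms DESC;

-- Database-scoped DMV: how indexes in the current database are being used
SELECT OBJECT_NAME(object_id) AS table_name, index_id, user_seeks, user_scans, user_updates
FROM sys.dm_db_index_usage_stats
WHERE database_id = DB_ID();
```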

Friday, February 20, 2015

Does my table hold invalid data after importing data? Have constraints been violated?

It is rare to see a table without at least one constraint, because constraints are required to enforce data integrity rules. Although we add constraints, in certain scenarios we disable them as a temporary measure to increase the performance of data imports. Keeping constraints enabled ensures that no invalid records are inserted into the table, but it slows down the entire loading process, as each and every value has to be validated against the constraints. Because of this, rather than checking each value during the import, we disable the constraints, complete the import, and then enable the constraints, either checking all loaded values against them or checking none. Enabling without checking the loaded values improves overall performance, but it can leave the table holding invalid data; hence, it should be done only when the data can be trusted.

However, if data has been loaded with constraints disabled and the constraints have later been enabled, and you need to know whether they were enabled without checking the loaded values, specifically for foreign key and check constraints, how do you do it?

It can be done using two catalog views: sys.check_constraints and sys.foreign_keys. Both return a column called is_not_trusted, and a value of 1 indicates that rows may have been inserted or updated without being validated against the constraint. Although this indicator does not say that the table holds invalid data, it points to the possibility of invalid data, hence it is always better to have this check as part of a routine checkup. And do not forget: "not trusted" constraints can lead to poorly optimized execution plans for queries running against their tables.

Here is sample code that shows the way of checking CHECK constraints with the sys.check_constraints catalog view.

-- Create a table with a check constraint
USE tempdb;
GO

CREATE TABLE dbo.Customer
(
 CustomerId int not null
 , CustomerName varchar(10) not null
 , CreditAmount decimal(18,2) not null
 , CONSTRAINT Customer_SalaryCheck CHECK (CreditAmount < 100000)
);
GO

-- Inserting valid data
INSERT INTO dbo.Customer
 VALUES (1, 'Joe Brown', 65000);
INSERT INTO dbo.Customer
 VALUES (2, 'Mary Smith', 75000);


-- Inserting invalid data
-- This will not be possible as it violates the constraint
INSERT INTO dbo.Customer
 VALUES (3, 'Joe Brown', 110000);


-- Disabling the constraint and inserting the same row again
-- Now a row with an invalid value can be inserted
ALTER TABLE dbo.Customer NOCHECK CONSTRAINT Customer_SalaryCheck;
GO

INSERT INTO dbo.Customer
 VALUES (3, 'Joe Brown', 110000);
GO


-- Enabling the constraint.
-- Note that updated records are not validated
ALTER TABLE dbo.Customer CHECK CONSTRAINT Customer_SalaryCheck;
GO
-- Now table contains invalid data.

-- Checking with sys.check_constraints catalog view.
SELECT name, is_not_trusted FROM sys.check_constraints
WHERE parent_object_id  = OBJECT_ID('dbo.Customer');
-- This shows the value of is_not_trusted as 1
-- indicating that table might contain invalid data

-- Query the table and see. This shows 3 records including the invalid row.
SELECT * FROM dbo.Customer;
GO

-- If you want to make sure no invalid data exist
-- and validate all records against the constraint
-- disable it again and enable it with CHECK option
ALTER TABLE dbo.Customer NOCHECK CONSTRAINT Customer_SalaryCheck;
GO

ALTER TABLE dbo.Customer WITH CHECK CHECK CONSTRAINT Customer_SalaryCheck;
GO
-- This will throw an error because the current rowset has an invalid value.
-- This can be corrected by either updating it or deleting it.
DELETE FROM dbo.Customer WHERE CustomerId = 3;

-- Now constraint can be enabled with CHECK option
ALTER TABLE dbo.Customer WITH CHECK CHECK CONSTRAINT Customer_SalaryCheck;
GO
-- And if you query sys.check_constraints again, you will see 0 for the is_not_trusted column.
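
The same check applies to foreign keys via sys.foreign_keys. A quick sketch for listing every untrusted foreign key in the current database:

```sql
-- Foreign keys that were enabled without validating existing rows
SELECT name, OBJECT_NAME(parent_object_id) AS table_name, is_not_trusted
FROM sys.foreign_keys
WHERE is_not_trusted = 1;
```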

Wednesday, February 18, 2015

SSAS: How to find User-Defined Member Properties of a dimension?

User-defined member properties are attribute relationships added to a specific named level in a dimension. Once defined, they can be accessed using either the PROPERTIES keyword or the Properties function. Here is an example of accessing user-defined properties.

-- from MSDN
SELECT 
   CROSSJOIN([Ship Date].[Calendar].[Calendar Year].Members, 
             [Measures].[Sales Amount]) ON COLUMNS,
   NON EMPTY Product.Product.MEMBERS
   DIMENSION PROPERTIES 
              Product.Product.[List Price],
              Product.Product.[Dealer Price]  ON ROWS
FROM [Adventure Works]
WHERE ([Date].[Month of Year].[January]) 


However, if you do not know all the user-defined properties added and need to get a list, you can simply use the Analysis Services Schema Rowsets to query information such as objects, state, sessions, connections, etc. There are many schema rowsets under the OLE DB for OLAP Schema Rowsets, and the one we have to use is MDSCHEMA_PROPERTIES. Here is the query for getting all user-defined properties of a level in a dimension.

SELECT *
FROM $SYSTEM.MDSCHEMA_PROPERTIES
WHERE [DIMENSION_UNIQUE_NAME] = '[Dim Customer]'
 AND [HIERARCHY_UNIQUE_NAME] = '[Dim Customer].[Customer]'
 AND [LEVEL_UNIQUE_NAME] = '[Dim Customer].[Customer].[Customer]'
 AND [PROPERTY_TYPE] = 1;

Read more on Analysis Services Schema Rowsets: https://msdn.microsoft.com/en-us/library/ms126233.aspx
Read more on OLE DB for OLAP Schema Rowsets: https://msdn.microsoft.com/en-us/library/ms126079.aspx

Tuesday, February 10, 2015

Reading the content of the database while restoring backup sets: Restore with STANDBY

Assume that you have multiple transaction log backups taken every hour from 9am to 5pm, and you need to restore the database up to a certain set of transactions that happened during the day. How do you do it? If you do not know the exact time of the transactions and no mark was given for them, then restoring with STOPAT or STOPATMARK is not possible. Since we normally cannot read the content of the database during the restore process, the only way to figure it out is to restore one set (the full database backup and the 9am transaction log backup), complete the restore, and read the content. If the required content is not there, restore again with another set (the full backup plus the 9am and 10am log backups). If it is still not there, restore again with yet another set (the full backup plus the 9am, 10am, and 11am log backups). This unnecessary, time-consuming process has to be continued until you see your transactions in the database. What if SQL Server allowed you to see the content during the restoration, helping you identify the last transaction log backup to be restored?

If you have more than one backup set to restore, all backup sets should be restored with the NORECOVERY option except the last one, which should be restored with RECOVERY. Until then, the database remains in the Restoring state, which blocks users from connecting to it. Simply put, the NORECOVERY option does not allow users to access the database until a backup set is restored with RECOVERY. However, using the STANDBY option instead of NORECOVERY allows you to read the content of the database between transaction log restores.

Generally, SQL Server rolls back uncommitted transactions at the end of a restore with the RECOVERY option to make the database readable. The STANDBY option also rolls back uncommitted transactions, but it keeps all the information needed to "undo" the rollback, which allows further restores. This basically lets you read the content of the database after each transaction log restore without completing the restore process. Since the necessary "undo" information is kept, if the required content is not yet in the database, you can continue the restore process with additional transaction log backups.

This option is the best way of restoring the database for above-mentioned situation.

Here is sample code for understanding the restore operation with the STANDBY option.

Backing up the database with a full backup and multiple transaction log backups:

-- Creating a database for testing
USE master;
GO

IF EXISTS (SELECT * FROM sys.databases WHERE name = 'TestDatabase')
 DROP DATABASE TestDatabase;
GO

CREATE DATABASE TestDatabase
ON PRIMARY
(
 NAME = N'TestDatabase_Data'
 , FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\TestDatabase_Data.mdf'
 , SIZE = 10240KB, FILEGROWTH = 1024KB
)
LOG ON
(
 NAME = N'TestDatabase_Log'
 , FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\TestDatabase_Log.ldf'
 , SIZE = 5120KB, FILEGROWTH = 10%
);
GO

-- Changing the recovery model to full
ALTER DATABASE TestDatabase SET RECOVERY FULL;
GO

-- Take the initial backup
BACKUP DATABASE TestDatabase
 TO DISK = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\Backup\TestDatabase.bak'
WITH INIT;
GO

-- create a table for testing purposes
USE TestDatabase;
GO

CREATE TABLE dbo.TestTable
(
 Id int identity(1,1) PRIMARY KEY
 , StringColumn nvarchar(600)
 , IntColumn bigint
);
GO

-- Insert the first record
INSERT INTO dbo.TestTable
 (StringColumn, IntColumn)
VALUES ('This is the first data', 12345);

-- Check records. You should see one record
SELECT * FROM dbo.TestTable

-- Perform first transaction log backup
-- This backup contains our first record inserted
BACKUP LOG TestDatabase
 TO DISK =
 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\Backup\TestDatabase_log.trn'
WITH INIT;
GO

-- Insert the second record
INSERT INTO dbo.TestTable
 (StringColumn, IntColumn)
VALUES ('This is the second data', 12345);

-- Check records. You should see two records now
SELECT * FROM dbo.TestTable

-- Perform the second transaction log backup
-- This backup contains first and second records inserted
BACKUP LOG TestDatabase
 TO DISK = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\Backup\TestDatabase_log.trn'
WITH NOINIT;
GO

-- Insert the third record
INSERT INTO dbo.TestTable
 (StringColumn, IntColumn)
VALUES ('This is the third data', 12345);

-- Check records. You should see three records now
SELECT * FROM dbo.TestTable

-- Perform the third transaction log backup
-- This backup contains first, second and third records inserted
BACKUP LOG TestDatabase
 TO DISK = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\Backup\TestDatabase_log.trn'
WITH NOINIT;
GO
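
All three log backups were appended to the same media set, so their positions can be confirmed before restoring; the Position column returned below corresponds to the value passed to the FILE option of RESTORE:

```sql
-- List the backup sets in the media set; Position maps to FILE = n
RESTORE HEADERONLY
 FROM DISK = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\Backup\TestDatabase_log.trn';
```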


Assume that we need to restore up to the second record, we do not know the exact time the record was inserted, and we do not know which transaction log backup holds it. Then the only option is the STANDBY option; here is the way of doing it.

-- Restoring the full backup
USE master;
GO

RESTORE DATABASE TestDatabase
 FROM DISK = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\Backup\TestDatabase.bak'
WITH REPLACE, NORECOVERY;
GO

-- Check the state of the database
-- This will show as "Restoring"
SELECT 
 name, state_desc
FROM sys.databases
WHERE name = 'TestDatabase';
GO

-- Restore the first log with STANDBY instead of NORECOVERY
RESTORE LOG TestDatabase
 FROM DISK = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\Backup\TestDatabase_log.trn'
WITH FILE =1, STANDBY = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\Backup\ROLLBACK_UNDO_TestDatabase_.bak';
GO

-- Read the state of the database
-- Now you will see that it is "online".
-- But remember, the database is now read-only (Standby).
-- Have a look at it with Object Explorer; you will see the state.
SELECT 
 name, state_desc
FROM sys.databases
WHERE name = 'TestDatabase';
GO

-- Read the table
USE TestDatabase;
GO

SELECT * FROM dbo.TestTable;
-- We see the first record, but we need the second record.
-- Now let's try with our second log backup.


USE master;
GO

RESTORE LOG TestDatabase
 FROM DISK = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\Backup\TestDatabase_log.trn'
WITH FILE =2, STANDBY = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\Backup\ROLLBACK_UNDO_TestDatabase_.bak';
GO

-- Again, let's read the table
USE TestDatabase;
GO

SELECT * FROM dbo.TestTable;
-- As you see, we have the required record.
-- No need to continue the restore operation.
-- Let's complete the restore process

USE master;
GO

RESTORE DATABASE TestDatabase WITH RECOVERY;

-- Done.