

Recently I came across a situation where queries were loading extremely slowly from a table. After careful analysis we found the root cause: a column with the ntext data type was being inserted with huge amounts of text content. In our case the DATALENGTH T-SQL function came in really handy to find the actual size of the data in this column.

According to Books Online, DATALENGTH (expression) returns the length of the expression in bytes, that is, the number of bytes SQL Server needs to store the expression, which can be of any data type. From my experience this is very handy for calculating the length and size of LOB and variable-length columns (varchar, varbinary, text, image, nvarchar, and ntext), since they store variable-length data. So, unlike the LEN function, which only returns the number of characters, DATALENGTH returns the actual number of bytes needed for the expression.

Here is a small example:

Use AdventureWorksLT2012
GO
Select ProductID, DATALENGTH(Name) AS SizeInBytes, LEN(Name) AS NumberOfCharacters
FROM [SalesLT].[Product]

 

Results:

[Screenshot: query results showing SizeInBytes and NumberOfCharacters for each product]

If your column/expression size is large, like in my case, you can replace DATALENGTH(Name) with DATALENGTH(Name)/1024 to get the size in KB, or with DATALENGTH(Name)/1048576 to get the size in MB.
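For example, here is a minimal sketch (the dbo.Documents table and its DocumentID/Notes columns are hypothetical) that reports the ten largest values in an ntext column, in both bytes and KB:

--Hypothetical table dbo.Documents with an ntext column [Notes]:
--list the ten rows with the largest Notes values, biggest first
SELECT TOP (10)
       DocumentID,
       DATALENGTH(Notes)        AS size_in_bytes,
       DATALENGTH(Notes) / 1024 AS size_in_kb
FROM dbo.Documents
ORDER BY DATALENGTH(Notes) DESC;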



Most people prefer to have the “sa” account as the database owner, the primary reason being that the sa login cannot be removed or deleted (unlike a user or service account), so the databases will never end up in an orphaned state.

I recently came up with the method below to change the ownership to sa on all 40 databases in our environment.

Step 1: Check the databases that do not have the sa account as the owner

SELECT name AS DBName, suser_sname(owner_sid) AS DBOwner  
FROM sys.databases
WHERE suser_sname(owner_sid) <> 'sa'
 

Step 2: Generate the script to make the sa account the owner of all the databases

SELECT 'ALTER AUTHORIZATION ON DATABASE::' + QUOTENAME(name) + ' TO [sa];'
from sys.databases
where name not in ('master', 'model', 'tempdb', 'msdb')
AND suser_sname(owner_sid) <> 'sa'
 

Step 3: Execute the result set from step 2 above to change the ownership to sa

--Sample result set from step2 above
ALTER AUTHORIZATION ON DATABASE::[AdventureWorksLT2012] TO [sa];
ALTER AUTHORIZATION ON DATABASE::[Northwind] TO [sa];
ALTER AUTHORIZATION ON DATABASE::[Pubs] TO [sa];
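If you prefer to apply the change in one pass instead of copying the generated script into a new query window, below is a small sketch using dynamic SQL with the same filters as Step 2 (review the printed statements before executing):

--Build and run all the ALTER AUTHORIZATION statements in one batch
DECLARE @sql nvarchar(max);
SET @sql = N'';

SELECT @sql = @sql + N'ALTER AUTHORIZATION ON DATABASE::' + QUOTENAME(name) + N' TO [sa];' + CHAR(13)
FROM sys.databases
WHERE name NOT IN ('master', 'model', 'tempdb', 'msdb')
AND suser_sname(owner_sid) <> 'sa';

PRINT @sql;              --review the generated statements first
EXEC sp_executesql @sql;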
 

For more information on the sa account, you can check my previous blog post HERE


Most of us know that the default port for SQL Server is 1433, but various other ports are used by SQL Server for other database services and features.

In the table below I have listed the ports that are frequently used by the Database Engine.

Item | Port
Default instance | TCP port 1433
Named instance in default configuration | Dynamic port (named instances can be configured to use a fixed TCP port)
Dedicated Admin Connection (DAC) | TCP port 1434
SQL Server Browser service | UDP port 1434
SQL Server instance running over an HTTP endpoint | TCP port 80 for CLEAR_PORT traffic; TCP port 443 for SSL_PORT traffic
Service Broker | TCP port 4022
Replication | TCP port 1433 for the default instance
Transact-SQL Debugger | TCP port 135
Analysis Services | TCP port 2383 for the default instance
Reporting Services Web Services | TCP port 80
Reporting Services configured for use through HTTPS | TCP port 443
Integration Services (Microsoft remote procedure calls) | TCP port 135
Integration Services runtime | TCP port 135
Microsoft Distributed Transaction Coordinator (MS DTC) | TCP port 135
SQL Server Management Studio browse connection to the Browser service | UDP port 1434

You can find more information about the TCP and UDP ports SQL Server uses from books online here: Configure the Windows Firewall to Allow SQL Server Access
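If you want to verify which TCP port a given instance is actually listening on, one quick check is the query below against sys.dm_exec_connections (it only reports ports for connections that currently exist over TCP):

--Local TCP port(s) used by current connections to this instance
SELECT DISTINCT local_tcp_port
FROM sys.dm_exec_connections
WHERE net_transport = 'TCP'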


One of the best practices for reducing Tempdb contention is to maintain multiple, equally sized Tempdb data files, matching the number of logical processors up to a maximum of 8. In this post I will show you T-SQL scripts to identify the current Tempdb configuration and the number of logical processors, along with adding additional Tempdb data files as required.

Script 1: Find current tempdb configuration


SELECT DB_NAME(mf.database_id) AS database_name
     , mf.name AS logical_name
     , mf.file_id
     , CONVERT(DECIMAL(20,2), (CONVERT(DECIMAL, size) / 128)) AS [file_size_MB]
     , CASE mf.is_percent_growth
           WHEN 1 THEN 'Yes'
           ELSE 'No'
       END AS [is_percent_growth]
     , CASE mf.is_percent_growth
           WHEN 1 THEN CONVERT(VARCHAR, mf.growth) + '%'
           WHEN 0 THEN CONVERT(VARCHAR, mf.growth / 128) + ' MB'
       END AS [growth_in_increment_of]
     , CASE mf.is_percent_growth
           WHEN 1 THEN CONVERT(DECIMAL(20,2), (((CONVERT(DECIMAL, size) * growth) / 100) * 8) / 1024)
           WHEN 0 THEN CONVERT(DECIMAL(20,2), (CONVERT(DECIMAL, growth) / 128))
       END AS [next_auto_growth_size_MB]
     , physical_name
FROM sys.master_files mf
WHERE database_id = 2 AND type_desc = 'ROWS'

Script 2: Find number of logical processors

SELECT cpu_count AS logicalCPUs FROM sys.dm_os_sys_info

Script 3: Add tempdb data files as per processor count from the above query

ALTER DATABASE tempdb ADD FILE ( NAME = N'tempdev2',
FILENAME = N'D:\DBA\Data\tempdev2.ndf' , SIZE =8MB , FILEGROWTH = 5MB) --<<--Update the data file location/Size/AutoGrowth
GO

ALTER DATABASE tempdb ADD FILE ( NAME = N'tempdev3',
FILENAME = N'D:\DBA\Data\tempdev3.ndf' , SIZE =8MB , FILEGROWTH = 5MB)--<<--Update the data file location/Size/AutoGrowth
GO
---ETC, add files as per processors count
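Putting Scripts 1 and 2 together, here is a small sketch that simply compares the current number of tempdb data files against the recommended number (logical processors, capped at 8):

--Compare tempdb data file count with the logical CPU count (max 8)
DECLARE @cpu_count int, @file_count int;

SELECT @cpu_count = cpu_count FROM sys.dm_os_sys_info;

SELECT @file_count = COUNT(*)
FROM sys.master_files
WHERE database_id = 2 AND type_desc = 'ROWS';

SELECT @file_count AS current_tempdb_data_files,
       CASE WHEN @cpu_count > 8 THEN 8 ELSE @cpu_count END AS recommended_data_files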

A reboot/restart of the SQL Server service is not required for these Tempdb changes to take effect. Here is a great post that explains how best to remove extra Tempdb files.



Though there are several ways of importing SSIS packages into SQL Server, either by creating a stored procedure or by using an SSIS package itself, I find this simple one-line command much easier and handier to work with. It uses the “dtutil.exe” utility, which is installed by default with SQL Server. It is capable of importing and exporting packages and is found in SQL Server’s Binn folder (e.g. C:\Program Files\Microsoft SQL Server\110\DTS\Binn\dtutil.exe).

Below is a post from Suresh, which describes how dtutil works for importing and exporting one package at a time.

How to Copy or Export an SSIS Package Using Command Prompt Utility – DTUTIL

In this article we will look at deploying multiple SSIS packages at once.

Demo for deploying multiple packages.

1. Open a command prompt at the folder where the packages (.dtsx) are stored.

Tip: Hold ‘Shift’ and right click to open command prompt window at the desired location.

[Screenshot: Command Prompt opened at the package folder]

2. Execute the dtutil script from cmd.

for %I in (*.dtsx) do dtutil /FILE "%I" /COPY SQL;"/Packages/%~nI" /DESTSERVER localhost

[Screenshot: dtutil output as each package is copied]

Note: make the following changes before you run the above script.

  • Change the package folder location (“/Packages” in this example) to the folder where the SSIS packages should be deployed
  • Change the server name at the end of the command to reflect the destination server. We can use ‘localhost’ when running this on the destination server itself, as shown in the figure below.

[Screenshot: the imported packages listed on the destination server]

As seen in the figure above, all our packages have been imported into the server.
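To double-check from the database side, here is a quick query against msdb (assuming SQL Server 2008 or later, where packages deployed to MSDB are stored in dbo.sysssispackages):

--List SSIS packages stored in msdb, along with their folder
SELECT f.foldername, p.name AS package_name, p.createdate
FROM msdb.dbo.sysssispackages AS p
JOIN msdb.dbo.sysssispackagefolders AS f
ON f.folderid = p.folderid
ORDER BY f.foldername, p.name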

For more dtutil Utility options please see the MSDN article HERE.


Before we get started on this topic, here is a quick fact: in SQL Server 2000 there was a hard limit on the amount of data that could be stored in a single row, which is 8,060 bytes. If the data exceeded this limit, the update or insert operation would fail!

Fortunately, in later SQL Server versions rows are dynamically managed to exceed this limit, and the combined width of a row can now exceed 8,060 bytes. I wanted to refresh this in our memory because it helps us better understand the allocation units concept.

What are Allocation Units in SQL Server:

Every partition in a SQL Server table can contain 3 types of data, each stored on its own set of pages, and each of these sets of pages is called an Allocation Unit. Below are the 3 types of Allocation Units.

  • IN_ROW_DATA
  • ROW_OVERFLOW_DATA
  • LOB_DATA

So, an Allocation Unit is basically just a set of pages of a particular type. Now, let us try to understand each of these allocation units using a demo.

  • IN_ROW_DATA 

When the row size stays within the 8,060-byte limit, SQL Server stores all of the data in the IN_ROW_DATA allocation unit, and usually this unit holds the majority of the data in most applications.

To better explain the concept, I came up with this simple Demo:

--Create a sample db AllocationUnitsDemo
USE master
GO
CREATE DATABASE AllocationUnitsDemo
GO

--Create a sample table ProductDetails in the AllocationUnitsDemo db
--Total length of a row in this table is 1000 + 4000 = 5000 bytes (< 8,060)
Use AllocationUnitsDemo
GO
CREATE TABLE ProductDetails
(
ProductName varchar(1000),
ProductDesc varchar (4000), 
)
GO

--Check the allocation unit type
Use AllocationUnitsDemo
GO
SELECT type_desc, total_pages, used_pages,data_pages 
FROM sys.allocation_units
WHERE container_id = (SELECT partition_id FROM sys.partitions 
WHERE OBJECT_ID = OBJECT_ID('ProductDetails'))

Results:
[Screenshot: the query returns a single IN_ROW_DATA allocation unit]
  • ROW_OVERFLOW_DATA 

Remember the introduction? When a row exceeds the 8,060-byte limit, SQL Server moves one or more of the variable-length columns to pages in the ROW_OVERFLOW_DATA allocation unit.

We still have a limitation on individual column size, though. While the combined width of the row can exceed the 8,060-byte limit, the width of each individual column must stay within the 8,000-byte limit. This means we can have a table with two columns defined as varchar(5000) each, but we are not allowed a single varchar(10000) column (for that we would have to use varchar(max)).

Demo Continued..

--Add an extra column to the above table ProductDetails
--This takes the maximum row length to 5000 + 8000 = 13,000 bytes (> 8,060),
--since nvarchar(4000) can require up to 8,000 bytes
Use AllocationUnitsDemo
GO
ALTER TABLE ProductDetails ADD ProductSummary nvarchar(4000) 

--Now, Check the allocation unit type
Use AllocationUnitsDemo
GO
SELECT type_desc, total_pages, used_pages,data_pages 
FROM sys.allocation_units
WHERE container_id = (SELECT partition_id FROM sys.partitions 
WHERE OBJECT_ID = OBJECT_ID('ProductDetails'))

Results:
[Screenshot: the query now also returns a ROW_OVERFLOW_DATA allocation unit]
  • LOB_DATA 

If a column with a LOB data type is defined, then SQL Server uses the LOB_DATA allocation unit. To know which data types are considered LOB and to get the list of LOB columns from a database, please refer to my previous post: “SQL Server – Find all the LOB Data Type Columns in a Database Using T-SQL Script”

Demo Continued..

--Add LOB data type column to the table ProductDetails
Use AllocationUnitsDemo
GO
ALTER TABLE ProductDetails ADD ProductImage Image

--Again, Check the allocation unit type
Use AllocationUnitsDemo
GO
SELECT type_desc, total_pages, used_pages,data_pages 
FROM sys.allocation_units
WHERE container_id = (SELECT partition_id FROM sys.partitions 
WHERE OBJECT_ID = OBJECT_ID('ProductDetails'))

Results:
[Screenshot: the query now also returns a LOB_DATA allocation unit]

--Cleanup
Use master
GO
DROP DATABASE AllocationUnitsDemo

How many Allocation Units can a Table have?

It actually depends on the number of partitions and indexes on the table.

To simplify the concept, as shown in the picture below, assume there is one table with no indexes (a heap) and no partitions. Having no partitions means all of the table’s contents are stored in a single partition; in other words, every table has at least 1 partition.

[Figure: a single-partition heap and its three possible allocation units]

Based on the above, we can have up to 3 allocation units for a table with no partitions and no indexes. And how about if we have partitions and indexes? Below is the formula I came up with to get the maximum possible number of allocation units per table.

  • No of Allocation Units = No of Partitions × No of Indexes × 3

[Figure: maximum possible number of allocation units per table]

So, as we see from the figures above, a table can have up to 45 million allocation units in SQL Server 2012!
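For the curious, here is a rough sketch of where that 45 million figure comes from (assuming the SQL Server 2012 limits of 15,000 partitions and 1,000 indexes per table), along with a query to see the allocation units that actually exist for a table; replace dbo.YourTable with a real table name:

--Maximum possible allocation units per table (SQL Server 2012 limits assumed)
SELECT 15000 * 1000 * 3 AS max_allocation_units_per_table  --45,000,000

--Allocation units that actually exist for a given table
--(IN_ROW and ROW_OVERFLOW units join on hobt_id, LOB units on partition_id,
-- so the IN() below covers both in this simplified sketch)
SELECT p.index_id, p.partition_number, au.type_desc
FROM sys.partitions AS p
JOIN sys.allocation_units AS au
ON au.container_id IN (p.hobt_id, p.partition_id)
WHERE p.object_id = OBJECT_ID('dbo.YourTable')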


The command-prompt utility dtutil can be very handy when we want to quickly export an SSIS package from the file system to msdb or vice versa.

For a quick demo, I created a package called “ProductPrice” under the file system folder C:\Packages, as shown in the screenshot below.

[Screenshot: ProductPrice.dtsx in the C:\Packages folder]

I also created another package called “UpdatePrice” in SQL Server, which gets stored in the msdb database; below is the screenshot.

[Screenshot: the UpdatePrice package stored in msdb]

Now, let us see how we can quickly import or export these packages using cmd. For this we will be using the COPY option of the dtutil utility.

  • To copy a package from the file system to msdb

Run the below syntax from cmd:

dtutil /FILE C:\Packages\ProductPrice.dtsx /COPY SQL;ProductPrice

This copies/exports the “ProductPrice” package from the file system to the msdb database, as shown in the screenshot below.

[Screenshot: the ProductPrice package now visible in msdb]

  • To copy a package from msdb to file system

Run the below syntax from cmd:

dtutil /SQL UpdatePrice /COPY FILE;C:\Packages\UpdatePrice.dtsx

This copies/exports the “UpdatePrice” package from the msdb database to the file system. Below is the screenshot.

[Screenshot: UpdatePrice.dtsx created in the C:\Packages folder]

I used “Windows Authentication” in this demo. To use mixed-mode authentication or to export packages to a different server, we need to provide the proper dtutil options, which can be found by running dtutil /? from the command prompt.


Sometimes you might want to have more than one column in the primary key. For example, if you have three columns named Last Name, First Name and Address, there can be duplicate Last Names or duplicate First Names, but there can never be duplicates across Last Name, First Name and Address combined.
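To make that concrete, here is a minimal sketch using hypothetical table and column names; the two single-column duplicates succeed, but repeating the full three-column combination violates the primary key:

--Hypothetical table with a three-column primary key
CREATE TABLE dbo.CustomerAddress
(
LastName  varchar(50)  NOT NULL,
FirstName varchar(50)  NOT NULL,
Address   varchar(200) NOT NULL,
CONSTRAINT PK_CustomerAddress PRIMARY KEY CLUSTERED (LastName, FirstName, Address)
)
GO

--Duplicate LastName or FirstName alone is allowed...
INSERT INTO dbo.CustomerAddress VALUES ('Smith', 'John', '1 Main St');
INSERT INTO dbo.CustomerAddress VALUES ('Smith', 'Jane', '1 Main St');

--...but repeating the full combination fails with a primary key violation
INSERT INTO dbo.CustomerAddress VALUES ('Smith', 'John', '1 Main St');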

Adding columns to a primary key constraint using T-SQL script:

USE AdventureWorks2012;
GO
ALTER TABLE Production.TransactionHistoryArchive 
ADD CONSTRAINT PK_TransactionHistoryArchive_TransactionID 
PRIMARY KEY CLUSTERED (TransactionID,[ProductID]);
GO

In order to add columns to an already existing primary key, you need to drop the primary key constraint first and then re-create it with the additional columns.
The drop primary key script is provided below.

USE AdventureWorks2012;
GO
ALTER TABLE Production.TransactionHistoryArchive 
DROP constraint [PK_TransactionHistoryArchive_TransactionID] 

Adding columns to a primary key constraint using SSMS GUI:

Right click on the table name and click on ‘Design’. Hold the Ctrl key and select the column names that you want in the primary key. Then click on ‘Set Primary Key’ as shown below.

[Screenshot: selecting multiple columns in the table designer and clicking ‘Set Primary Key’]

The below screen shot shows three columns as primary key.

[Screenshot: the three selected columns shown as the primary key]

As a quick fact, the maximum number of columns that you can include in a primary key is 16.


This morning, while working on a huge database containing lots of LOB data, I needed to know which tables have LOB data, along with the list of LOB columns by table name and their data types.

I found that starting with SQL Server 2005 we can easily retrieve this information from the INFORMATION_SCHEMA.COLUMNS view. There may be an even better way to do this, but here is what I came up with.

USE [AdventureWorks2012]
GO
SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, DATA_TYPE
FROM INFORMATION_SCHEMA.COLUMNS 
WHERE DATA_TYPE IN ('FILESTREAM','XML','VARBINARY','TEXT','NTEXT','IMAGE') 
OR(DATA_TYPE IN ('VARCHAR', 'NVARCHAR') AND CHARACTER_MAXIMUM_LENGTH = -1)
ORDER BY TABLE_NAME

HERE is an explanation by Pinal Dave of why I had to include CHARACTER_MAXIMUM_LENGTH = -1 in the above query.

Sample Output:

[Screenshot: sample output listing the LOB columns in AdventureWorks2012]

Below column types are considered LOB (Large Objects):
VARCHAR(MAX), NVARCHAR(MAX), FILESTREAM, XML, VARBINARY,
TEXT, NTEXT, IMAGE
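As an alternative to the INFORMATION_SCHEMA query above, here is a sketch using the catalog views directly (max_length = -1 identifies the (MAX) variants; FILESTREAM columns show up as varbinary(max) with is_filestream = 1):

--Alternative: list LOB columns using the catalog views
SELECT s.name AS schema_name, t.name AS table_name, c.name AS column_name, ty.name AS data_type
FROM sys.columns AS c
JOIN sys.tables  AS t  ON t.object_id = c.object_id
JOIN sys.schemas AS s  ON s.schema_id = t.schema_id
JOIN sys.types   AS ty ON ty.user_type_id = c.user_type_id
WHERE ty.name IN ('xml', 'varbinary', 'text', 'ntext', 'image')
OR (ty.name IN ('varchar', 'nvarchar') AND c.max_length = -1)
ORDER BY t.name, c.name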

Note: The data types TEXT, NTEXT and IMAGE are deprecated and are going to be removed in a future version of SQL Server, so Microsoft doesn’t recommend using them in new applications. For more information, please refer HERE


Even though there are more pros than cons, striping database backups is often overlooked by many DBAs. Based on my observations in our environment, striping can significantly benefit larger database backups (roughly 500 GB and up).

As shown in the picture below, striping is nothing but splitting one backup file into multiple backup files (a maximum of 64). These files may or may not end up the same size, depending on the I/O throughput of the storage disks.

[Figure: one backup striped across multiple backup files]

By striping a backup we can:

  • Increase backup throughput and reduce the backup time window
  • Allow backups and restores to be written to or read from all devices in parallel
  • Enable backups to different disks, thus distributing the space usage

Below are the simple T-SQL backup commands using AdventureWorks2012 sample database as an example.

T-SQL command for Striping a database backup

Note: In the script below I used only the C: drive to hold all the striped .bak files; however, we can direct the files to multiple disks if required.

-- Striped Backups -- Backup to multiple files - 4 files in this case
BACKUP DATABASE [AdventureWorks2012]
TO 
DISK='C:\AdventureWorks2012_1.bak', 
DISK='C:\AdventureWorks2012_2.bak', 
DISK='C:\AdventureWorks2012_3.bak',
DISK='C:\AdventureWorks2012_4.bak'
WITH STATS = 10
GO

T-SQL command to restore from Striped database backup 

--Restoring from striped backup -- from multiple files
RESTORE DATABASE [AdventureWorks2012] 
FROM  
DISK='C:\AdventureWorks2012_1.bak', 
DISK='C:\AdventureWorks2012_2.bak', 
DISK='C:\AdventureWorks2012_3.bak',
DISK='C:\AdventureWorks2012_4.bak'
WITH STATS = 10 
GO

We can also apply the same striping concept to log backups. Below is how we do it.

T-SQL command for Striping transaction log backup

--Striped Log backup
BACKUP LOG [AdventureWorks2012]
TO
DISK='C:\AdventureWorks2012_1.trn', 
DISK='C:\AdventureWorks2012_2.trn', 
DISK='C:\AdventureWorks2012_3.trn',
DISK='C:\AdventureWorks2012_4.trn'
WITH STATS = 10
GO

Demo: Normal Backup Vs Striped Backup

Below are the results and screenshots from a live production environment. This once again proves that striping backup files increases the data transfer rate and reduces the backup time.

Results:

Backup Type | Time to Backup | Data Transfer Rate
Normal Backup (1 file) | 537.6 seconds | 111.9 MB/sec
Striped Backup (4 files) | 201.0 seconds | 299.3 MB/sec

Screenshots:

[Screenshot 1 of 2: backup output for the normal single-file backup]

[Screenshot 2 of 2: backup output for the striped four-file backup]
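If you want to compare throughput for yourself without digging through the backup output, here is a sketch against the backup history in msdb (backup_size is in bytes; the file count comes from the media family):

--Rough backup throughput per full backup, from msdb history
SELECT bs.database_name,
       bs.backup_start_date,
       DATEDIFF(SECOND, bs.backup_start_date, bs.backup_finish_date) AS duration_sec,
       CONVERT(DECIMAL(10,1), bs.backup_size / 1048576.0) AS backup_size_mb,
       CONVERT(DECIMAL(10,1), bs.backup_size / 1048576.0
           / NULLIF(DATEDIFF(SECOND, bs.backup_start_date, bs.backup_finish_date), 0)) AS mb_per_sec,
       (SELECT COUNT(*) FROM msdb.dbo.backupmediafamily bmf
        WHERE bmf.media_set_id = bs.media_set_id) AS backup_file_count
FROM msdb.dbo.backupset AS bs
WHERE bs.database_name = 'AdventureWorks2012'
AND bs.type = 'D'   --full database backups
ORDER BY bs.backup_start_date DESC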

However, the major downside of striping a backup is that if even one of the backup files is corrupt or missing, the restore operation cannot be performed using the remaining files.

Also, HERE is why I haven’t discussed striping backups using SSMS GUI.

Technical Reviewer(s): Hareesh Gottipati; Jaipal Vajrala

