
Monthly Archives: February 2015

SQL Server 2014 Cumulative Update package #6 is now available

Hello guys, SQL Server 2014 is the latest version of Microsoft SQL Server, and it comes with awesome enhancements and features. We are still waiting for SP1, but until it is released we can only update our environments with the latest cumulative update package to fix some issues in SQL Server 2014.

Microsoft has now announced SQL Server 2014 Cumulative Update package 6, including nearly 400 fixes and enhancements. For more information about the hotfixes that are included in this cumulative update package, check this link.

And for more information about the latest builds of SQL Server 2014, check this post from the SQLSentry community:

  • KB Article: KB #3031047
  • 64 fixes total (55 fixes listed publicly at time of publication)
  • Build number is 12.0.2480
  • Relevant for @@VERSION 12.0.2000 through 12.0.2479 (see the quick build check below)
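
Before applying the package, you can confirm which build your instance is currently on; a quick check like the one below (using standard server properties) tells you where you stand relative to the range above:

-- Current build of the instance; CU6 brings it to 12.0.2480.
SELECT SERVERPROPERTY('ProductVersion') AS Build,
       SERVERPROPERTY('ProductLevel')   AS Level,
       @@VERSION                        AS FullVersionString;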

Follow the author:

View all my tips, LinkedIn, Website, Slideshare

 

Posted by on February 22, 2015 in General topics

 


Keep your database in safe mode

Hi senior DBAs, hi junior DBAs, hi DBA consultants. Nobody is free of errors or defects; if you agree with me, keep reading, and if not, don't waste your time! So how can you keep your database in safe mode, protected from mistakes that can happen by accident? We should keep our databases in safe mode under any circumstances. What would you do if you dropped a database by mistake and, at the same time, had lost your last backup of that database? It is a very bad situation; it may never happen to you, but it may also happen. To stay calm and keep your database in safe mode, you should take a preventive action. One such preventive action is forcing a DROP DATABASE (or a rename via ALTER DATABASE) to roll back, and you can do that easily, without extra effort or maintenance cost, by using a DDL trigger. To learn more about DML, DDL, and logon triggers, see this link (CREATE TRIGGER (Transact-SQL)). Triggers can be created on tables, on databases, and at the server level; what we will do now is create a trigger at the server level.

Create Trigger 

IF EXISTS (SELECT * FROM sys.server_triggers
           WHERE name = 'KeepYourDBinSafeMode')
    DROP TRIGGER KeepYourDBinSafeMode ON ALL SERVER;
GO

CREATE TRIGGER KeepYourDBinSafeMode
ON ALL SERVER
FOR DROP_DATABASE, ALTER_DATABASE
AS
    PRINT 'You must disable trigger "KeepYourDBinSafeMode" to drop or alter a database!'
    ROLLBACK
GO
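
Keep in mind that while this trigger is enabled it blocks every DROP DATABASE and ALTER DATABASE, including the intentional ones. Here is a minimal sketch of the workflow for a legitimate change (the database name is only a placeholder):

DISABLE TRIGGER KeepYourDBinSafeMode ON ALL SERVER;
GO
-- DROP DATABASE [SomeOldDatabase];   -- hypothetical database, run only on purpose
GO
ENABLE TRIGGER KeepYourDBinSafeMode ON ALL SERVER;
GO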

Check the trigger from Management Studio

[Screenshot: the KeepYourDBinSafeMode trigger shown in Management Studio]

Read the rest of this entry »

 

Posted by on February 17, 2015 in General topics

 


Filling all your tables with test data VII

Introduction

Hi guys, in my last post (Filling all your tables with test data) I explained how we can fill multiple tables with test data using my own stored procedure. After creating the script I ran more tests on our staging environment; it works very well, but I hit one issue: the SP could not cover tables with a composite primary key. So I reviewed the stored procedure from scratch to find where the issue was and how I could solve it. In the end I asked one of my best friends, Mohamed Abdel Kariem, a T-SQL mentor, to help me with this subject, and together we produced an awesome script. Below are the new enhancements to our main stored procedure.

Stored procedure enhancements:

  1. The SP now supports tables with a composite primary key.
  2. The SP now supports tables with a composite primary key whose columns use the uniqueidentifier data type.
  3. Reduced the number of lines in the SP by using CASE expressions.
  4. Used the QUOTENAME function instead of repeated string concatenation (see the sketch after this list).
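
To illustrate that last enhancement, here is a minimal, hypothetical sketch (the schema and table names are placeholders, not taken from the SP) of how QUOTENAME builds a safe dynamic SQL object name without long chains of concatenation:

DECLARE @Schema SYSNAME = N'Sales';          -- hypothetical schema name
DECLARE @Table  SYSNAME = N'Order Details';  -- hypothetical table name containing a space
DECLARE @SQL    NVARCHAR(MAX) =
    N'SELECT COUNT(*) FROM ' + QUOTENAME(@Schema) + N'.' + QUOTENAME(@Table) + N';';
-- QUOTENAME wraps each identifier in brackets, so @SQL becomes:
-- SELECT COUNT(*) FROM [Sales].[Order Details];
EXEC sp_executesql @SQL;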

Read the rest of this entry »

 

Posted by on February 15, 2015 in General topics

 


Filling all your tables with test data

Problem

The database assessment process for any new project or new database has several parts. One of these parts is assessing the most heavily hit stored procedures and stress testing them to make sure they can handle, for example, 500 concurrent users without any issue in I/O, CPU, or execution time. To do this part we need a reasonable volume of data in the tables used by those stored procedures. Two days ago I received a request for a new database that should be deployed to production, so we started our full database assessment process, and when I reached the point of filling the tables I found 50 tables that should each be filled with one million records for the test.

Solution:

I wrote one stored procedure that can fill any table, whatever its schema design, and it covers the worst-case scenarios listed below. The only requirement is that the table already contains at least one record.

  1. Covers tables with identity columns.
  2. Covers tables with computed columns.
  3. Covers tables with primary key columns that are not identity columns.
  4. Covers tables with primary key columns of data type uniqueidentifier.
  5. Covers tables built on different schemas.
  6. Covers tables with special column names, for example a column named KEY.

With this script we can fill any table with any volume of data; a simplified sketch of the core idea follows.
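
The full procedure is in the rest of the entry; purely as an illustration, here is a minimal sketch (dbo.Orders and its columns are hypothetical placeholders, not the actual procedure) of growing a table to a target row count by re-inserting the rows it already has:

DECLARE @TargetRows BIGINT = 1000000;   -- the volume we want for the stress test
DECLARE @Remaining  BIGINT = @TargetRows - (SELECT COUNT_BIG(*) FROM dbo.Orders);

WHILE @Remaining > 0
BEGIN
    -- The identity column is omitted so SQL Server generates new key values;
    -- each pass at most doubles the row count until the target is reached.
    INSERT INTO dbo.Orders (CustomerName, OrderDate)
    SELECT TOP (@Remaining) CustomerName, GETDATE()
    FROM dbo.Orders;

    SET @Remaining = @TargetRows - (SELECT COUNT_BIG(*) FROM dbo.Orders);
END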

Read the rest of this entry »

 

Posted by on February 13, 2015 in General topics

 

Index Statistics for All Databases on Your Server

How do you return the unused indexes across all databases? And how do you save the index statistics? In a previous post I explained how to check the unused, duplicated, and missing indexes, but only within one single database. Yesterday I updated the unused-index script so that it can run on all databases that exist on the server, and the result (the index statistics) is now saved to a table, because if you restart SQL Server or fail over the cluster you will lose the index usage statistics. With the steps below you can save them easily and come back to them at any time. To check all my posts and my friends' posts on SQL Server indexes, GO HERE.


Steps for saving index statistics:

  • Create the Index_Statistics table in the msdb database

USE [msdb]
GO

/****** Object: Table [dbo].[Index_Statistics] Script Date: 01/02/2015 01:58:31 PM ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

CREATE TABLE [dbo].[Index_Statistics](
    [ID] [INT] IDENTITY(1,1) NOT NULL,
    [Database_name] NVARCHAR(200),
    [ObjectName] [sysname] NOT NULL,
    [IndexName] [sysname] NULL,
    [IndexID] [INT] NOT NULL,
    [UserSeek] [BIGINT] NOT NULL,
    [UserScans] [BIGINT] NOT NULL,
    [last_user_seek] [DATETIME] NULL,
    [last_user_scan] [DATETIME] NULL,
    [UserLookups] [BIGINT] NOT NULL,
    [UserUpdates] [BIGINT] NOT NULL,
    [TableRows] [BIGINT] NULL,
    [drop statement] [NVARCHAR](790) NULL,
    [Execution_time] [DATETIME] NOT NULL,
    PRIMARY KEY CLUSTERED
    (
        [ID] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF,
            ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 80,
            DATA_COMPRESSION = PAGE) ON [PRIMARY]
) ON [PRIMARY]

GO

  • Create the UnusedIndex stored procedure

USE msdb
GO

CREATE PROC UnusedIndex
AS
BEGIN
    SET NOCOUNT ON
    DECLARE @name NVARCHAR(200)
    DECLARE db_cursor CURSOR FOR
        SELECT name
        FROM master.dbo.sysdatabases
        WHERE name NOT IN ('master', 'model', 'msdb', 'tempdb')

    OPEN db_cursor
    FETCH NEXT FROM db_cursor INTO @name

    WHILE @@FETCH_STATUS = 0
    BEGIN
        DECLARE @SQL NVARCHAR(MAX) = N'
        USE ' + QUOTENAME(@name) + '
        SELECT ''' + @name + ''' AS Database_name,
               o.name AS ObjectName
             , i.name AS IndexName
             , i.index_id AS IndexID
             , dm_ius.user_seeks AS UserSeek
             , dm_ius.user_scans AS UserScans
             , dm_ius.last_user_seek
             , dm_ius.last_user_scan
             , dm_ius.user_lookups AS UserLookups
             , dm_ius.user_updates AS UserUpdates
             , p.TableRows
             , ''DROP INDEX '' + QUOTENAME(i.name)
               + '' ON '' + QUOTENAME(s.name) + ''.'' + QUOTENAME(OBJECT_NAME(dm_ius.object_id)) AS [drop statement]
             , GETDATE() AS Execution_time
        FROM sys.dm_db_index_usage_stats dm_ius
        INNER JOIN sys.indexes i ON i.index_id = dm_ius.index_id AND dm_ius.object_id = i.object_id
        INNER JOIN sys.objects o ON dm_ius.object_id = o.object_id
        INNER JOIN sys.schemas s ON o.schema_id = s.schema_id
        INNER JOIN (
            SELECT SUM(p.rows) TableRows, p.index_id, p.object_id
            FROM sys.partitions p
            GROUP BY p.index_id, p.object_id
        ) p ON p.index_id = dm_ius.index_id AND dm_ius.object_id = p.object_id
        WHERE OBJECTPROPERTY(dm_ius.object_id, ''IsUserTable'') = 1
          AND dm_ius.database_id = DB_ID()
          AND i.type_desc = ''NONCLUSTERED''
          AND i.is_primary_key = 0
          AND i.is_unique_constraint = 0
        ORDER BY o.name ASC, i.name ASC'
        EXEC sp_executesql @SQL
        FETCH NEXT FROM db_cursor INTO @name
    END

    CLOSE db_cursor
    DEALLOCATE db_cursor
END

  • Create a job to execute the UnusedIndex procedure (optional)

If you do create the job, I recommend running it manually rather than on a schedule, because our goal here is to save the index statistics before the server is restarted or the cluster is failed over; once that has happened, we should not overwrite the data that already exists in the Index_Statistics table.
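
The capture step itself is not shown above, so here is a minimal sketch, assuming the Index_Statistics table and UnusedIndex procedure created earlier, of how to store the procedure's output; run it manually (or from the optional job) just before a planned restart or failover:

-- Capture the result sets returned by UnusedIndex into the Index_Statistics table.
INSERT INTO msdb.dbo.Index_Statistics
    (Database_name, ObjectName, IndexName, IndexID, UserSeek, UserScans,
     last_user_seek, last_user_scan, UserLookups, UserUpdates, TableRows,
     [drop statement], Execution_time)
EXEC msdb.dbo.UnusedIndex;

-- Review what was captured, most recent run first.
SELECT * FROM msdb.dbo.Index_Statistics ORDER BY Execution_time DESC;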

Follow the author:

View all my tips, LinkedIn, Website, Slideshare

 

Posted by on February 2, 2015 in General topics

 
