Wednesday 25 November 2015

How to know who dropped a table in SQL Server by T-SQL

There are a lot of questions about this topic: who deleted an object from the Production environment, and how can we identify the culprit in SQL Server? Can it be traced? The answer is yes, it can. There is an undocumented function, fn_dblog(), which can be used for this purpose.

fn_dblog() allows us to view the records in the active portion of the transaction log file. This function accepts two parameters:

The first is the starting log sequence number (LSN), or NULL.
The second is the ending LSN, or NULL.
When we run the query given below

SELECT * FROM fn_dblog(NULL, NULL) 


it gives us all of the transaction log file records as below




If we need to filter transactions by insert, update and delete operations, we can use a query like the one below:

SQL
SELECT * 
FROM   fn_dblog(NULL, NULL) 
WHERE  Operation IN  
   ('LOP_INSERT_ROWS','LOP_MODIFY_ROW', 
    'LOP_DELETE_ROWS','LOP_BEGIN_XACT','LOP_COMMIT_XACT');
Now, how do we identify the details of a dropped object?
I created a table, TestDrop, to test with. I collected information about the table from sys.objects, which gives me object_id = 295672101 for TestDrop. In the next SQL statement I dropped the table.



SQL
CREATE TABLE [TestDrop] ( 
Id INT IDENTITY, 
Name VARCHAR (100)); 
GO 
INSERT INTO [TestDrop] VALUES ('Test'); 
GO 
SELECT object_id, name FROM sys.objects WHERE type = 'U'; 
GO 
DROP TABLE [TestDrop]; 
GO
After dropping the table we will not get any information about it from sys.objects, so we have to make sure we take a backup of this metadata beforehand; otherwise we have to look up the dropped object's information in a previously taken database backup. Now we will try to get the object id of the dropped table.

SQL
SELECT [Transaction Id], [Begin Time], SUSER_SNAME ([Transaction SID]) AS [User] 
FROM fn_dblog (NULL, NULL) 
WHERE [Transaction Name] = N'DROPOBJ';
The above query gives us the Transaction ID and tells us that a table was dropped and who dropped it, but it does not tell us which object was dropped. To identify that, we use the Transaction ID. In my case it is "0000:00002a3a", which can be used in the following query to get the object id.

SQL
SELECT TOP (1) [Lock Information] FROM fn_dblog (NULL, NULL) 
WHERE [Transaction Id] = '0000:00002a3a' 
AND [Lock Information] LIKE '%SCH_M OBJECT%';
This query gives us "HoBt 0:ACQUIRE_LOCK_SCH_M OBJECT: 1:295672101:0", and in this string 1:295672101:0 contains the database id and the object id: the part before the first ":" is the database id, and the part between the first and second ":" is the object id.

We can then use object id 295672101 with a previous database backup to find out which object was dropped.
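As a shortcut, the database id and object id can be parsed out of the [Lock Information] string with ordinary string functions. Below is a minimal sketch; the @LockInfo value is hard-coded for illustration and would normally come from the query above.

```
DECLARE @LockInfo NVARCHAR(200) = N'HoBt 0:ACQUIRE_LOCK_SCH_M OBJECT: 1:295672101:0 ';

-- Everything after 'OBJECT: ' has the form <database_id>:<object_id>:<lock_partition>
DECLARE @Ids NVARCHAR(100) =
    LTRIM(SUBSTRING(@LockInfo, CHARINDEX('OBJECT: ', @LockInfo) + 8, 100));

SELECT LEFT(@Ids, CHARINDEX(':', @Ids) - 1) AS DatabaseId,
       SUBSTRING(@Ids,
                 CHARINDEX(':', @Ids) + 1,
                 CHARINDEX(':', @Ids, CHARINDEX(':', @Ids) + 1)
                     - CHARINDEX(':', @Ids) - 1) AS ObjectId;
```

For the string above this returns DatabaseId = 1 and ObjectId = 295672101.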

I have created a complete script, "Get_Info_about_dropped_Object.sql", which you can download from the link below. With it you need not pass the object_id manually; the only requirement is that you have a periodic backup of the sys.objects table. In my case I have done it temporarily in a table "#databaseTables".
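A minimal sketch of such a periodic metadata backup is shown below. The table name dbo.DBA_ObjectsBackup is illustrative (my choice, not part of the script); a permanent table is safer than a temp table like #databaseTables, which only survives the session.

```
-- Snapshot user-table metadata so object ids can still be resolved after a drop.
-- Schedule this, e.g. via a SQL Server Agent job.
IF OBJECT_ID('dbo.DBA_ObjectsBackup') IS NULL
    CREATE TABLE dbo.DBA_ObjectsBackup
        (object_id INT, name SYSNAME, snapshot_date DATETIME);

INSERT INTO dbo.DBA_ObjectsBackup (object_id, name, snapshot_date)
SELECT object_id, name, GETDATE()
FROM   sys.objects
WHERE  type = 'U';
```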

Download Link: https://gallery.technet.microsoft.com/How-to-know-who-dropped-a-b5c2a244

cheers,


Thursday 19 November 2015

SSIS Getting Started - Extract, Transform & Load the data




In this post we will learn how to use SQL Server Integration Services 2012 to load data. I created a sample CSV data file, "Employee", which we will load into the table dbo.Employee in the database. During this exercise we will use some of the basic SSIS components, such as the source adapter, the Derived Column transformation, the Lookup transformation, etc.

You can download this paper with the source file, script and SSIS solution from my TechNet link given below

https://gallery.technet.microsoft.com/SSIS-Getting-Started-Guide-d8fb22ac

Open SSDT from Start-->Microsoft SQL Server 2012-->SQL Server Data Tools


Click on File-->New-->Project as in the image below



In the New Project window click on Integration Services Project, and enter details such as the name of the project and the location where you want to create the SSIS solution, as in the image below.


After creating the solution we will get the window shown in the image below, with the Solution Explorer window on the right side; one SSIS package, Package.dtsx, is already created.
The SSIS solution's physical location here is "D:\MSBI\SSIS\Learning\SSIS" and the source CSV file location is "D:\MSBI\SSIS\Learning\SourceFiles".



Rename this package to SSISLoad.dtsx by right-clicking on Package.dtsx.
Now right-click on the Connection Managers pane at the bottom, then click on New Flat File Connection as in the image below.


The Flat File Connection Manager Editor window will appear. Fill in the details as below:

Connection Manager Name : SourceFileConnection (you can choose whatever you want)
File Name : D:\MSBI\SSIS\Learning\SourceFiles\Employee.csv (give the path where the source file exists)



Leave the rest of the settings as they are.

Now click on the Columns tab in the left pane of the Flat File Connection Manager Editor window; you will see that the row and column delimiters are defined as {CR}{LF} and comma {,}. Leave them as is. Here you can also preview the first 100 rows of your source file data.


Now click on the Advanced tab in the left pane of the Flat File Connection Manager Editor window. This is an important part, as here we can define the source data types to match our destination table's data types. The destination table's definition is as below:

CREATE TABLE [dbo].[Employee](
[EmployeeKey] [int] NOT NULL,
[ParentEmployeeKey] [int] NULL,
[EmployeeNationalIDAlternateKey] [nvarchar](15) NULL,
[SalesTerritoryKey] [int] NULL,
[EmployeeFullName] [nvarchar](500) NOT NULL,
[FirstName] [nvarchar](50) NOT NULL,
[LastName] [nvarchar](50) NOT NULL,
[MiddleName] [nvarchar](50) NULL,
[NameStyle] [bit] NOT NULL,
[Title] [nvarchar](50) NULL,
[HireDate] [date] NULL,
[BirthDate] [date] NULL,
[EmailAddress] [nvarchar](50) NULL,
[Phone] [nvarchar](25) NULL,
[MaritalStatus] [nchar](1) NULL
) ON [PRIMARY] 

We need to run the above query to create this table in the database where we will be loading the data, so create the table in your database.

So we need to define the data types on the Advanced tab; as in the image below, I changed EmployeeKey from DT_STR to DT_NUMERIC.


Repeat the above steps for all the columns and then click OK. Now you will see that your source connection "SourceFileConnection" has been created, as in the image below.


Now we will create the database connection. Right-click again on the Connection Managers pane and choose New OLE DB Connection; a new window, "Configure OLE DB Connection Manager", will pop up as in the image below. In this window click the New button and you will get the Connection Manager window, where you can select your SQL Server name, login mechanism and the database into which you need to load the data.


Once you fill in these details, click OK and your destination connection will be created as in the image below.






Now we will design the package's control flow. First drag an Execute SQL Task from the SSIS Toolbox in the left window pane, as in the image below.


Now double-click on the Execute SQL Task and define its properties. First click on the Connection drop-down and select your destination connection, as in the image below.



Now click on the SQLStatement property and paste the following query:

Truncate table [dbo].[Employee]

This query truncates the table every time you run the package, so each load starts from an empty table.


Click OK. Now drag a Data Flow Task from the SSIS Toolbox window and connect the Execute SQL Task to this Data Flow Task with the green arrow, as in the image below.


Now we will design the data flow, which will extract data from the file and insert it into the destination table. Double-click on the Data Flow Task and drag a Source Assistant from the SSIS Toolbox window onto the design surface as in the image below; it will pop up a window named "Source Assistant - Add New Source".




Now click on Flat File in the Select Source Type section on the left of the window, select SourceFileConnection (the source connection we created above) in the right pane, and click OK.


If you check the destination table, it has EmployeeFullName as a column, but the source file does not have this column, so we will generate its values in the SSIS package. To do this, drag a Derived Column transformation from the SSIS Toolbox window and join it to the Source Assistant we just added.

Now double-click the Derived Column transformation, which will pop up the "Derived Column Transformation Editor" window. Click on Columns in the left pane of the window and copy and paste the expression below into the Expression column:

[FirstName]+" "+ [MiddleName]+" "+ [LastName]

Give this column a name (here FullName) under Derived Column Name and click OK.
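One caveat: in the SSIS expression language, concatenating a NULL MiddleName makes the entire derived value NULL. A defensive variant of the expression (a sketch using REPLACENULL, available from SSIS 2012 onwards) is:

```
[FirstName] + " " + REPLACENULL([MiddleName] + " ", "") + [LastName]
```

When MiddleName is NULL, the inner concatenation evaluates to NULL and REPLACENULL substitutes an empty string, so the result is simply first name and last name.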


Drag a Destination Assistant and double-click to configure it. Select DestinationDb (the destination connection we created above) in the right pane of the Destination Assistant window and click OK.



Now double-click on the OLE DB Destination, which will pop up the OLE DB Destination Editor window; here select your table (dbo.Employee) as in the image below.


Now click on Mappings in the left pane of the window; it will automatically map all source columns to the destination columns that have the same name. You will see that EmployeeFullName on the destination side is not mapped to a source column, so click on <ignore> in the source column, select FullName (the column we created in the Derived Column transformation above), and click OK.



Fun time: package development is complete, so now we will execute it. Right-click on SSISLoad.dtsx in the Solution Explorer window and select Execute Package.


Cheers, it runs successfully and loads 296 rows into the dbo.Employee table.

Happy Learning.

Thursday 12 November 2015

Comparing full sentences to illegal keywords in a table by TSQL

With the help of this script we can compare sentences against a table that holds illegal keywords, and based on those keyword values we can mark each sentence as valid or not. In this script I created some temporary tables: one holds the sentences we need to compare and the other holds the illegal keywords.




/*************************************************************************************************/
--First temp table holds the sentences that need to be compared
--you can replace it with the table that holds your actual sentences
/*************************************************************************************************/

CREATE TABLE #SearchSentance (ID INT,Ttext VARCHAR(4000));
INSERT INTO #SearchSentance
SELECT 1,N'Headset Ball Bearings Chainring Nut This is illegal Sentence'
UNION
SELECT 2,N'This is Correct Sentence'


 -- Split each sentence into individual words using the XML nodes() trick
 SELECT * INTO #SearchTable
 FROM (
SELECT A.ID,
     Split.a.value('.', 'VARCHAR(100)') AS Words
 FROM  (SELECT ID,
         CAST ('<M>' + REPLACE(Ttext, ' ', '</M><M>') + '</M>' AS XML) AS Words
     FROM  #SearchSentance) AS A CROSS APPLY Words.nodes ('M') AS Split(a)
)Q

/*************************************************************************************************/
--This temp table holds the illegal keywords
--you can replace it with the table which holds the actual keywords
/*************************************************************************************************/

CREATE TABLE #Illegal_keyword_Master (ID INT,IllegalKeyWords VARCHAR(4000));
INSERT INTO #Illegal_keyword_Master
SELECT 1,'Bearing' UNION  SELECT 2,'Blade' UNION SELECT 3,'Race' UNION SELECT 4,'Ball' UNION SELECT 5,'Nut'




SELECT
DISTINCT Sentence=stt.Ttext,
[Valid or Not]=CASE WHEN st.id is null THEN 'Legal' ELSE 'Illegal' END
FROM
#SearchSentance stt
LEFT JOIN (
SELECT
*
FROM
#SearchTable st
WHERE
EXISTS(SELECT * FROM #Illegal_keyword_Master im WHERE st.Words=im.IllegalKeyWords)
)st ON st.ID=stt.ID

DROP TABLE #SearchTable
DROP TABLE #SearchSentance
DROP TABLE #Illegal_keyword_Master

Converting Comma Separated Column value into multiple rows by T-SQL

DECLARE @SearchTable table(ID int,Ttext varchar(4000));
insert into @SearchTable
Select 1,N'Headset Ball Bearings Chainring Nut This is illegal Sentence'
union
Select 2,N'This is Correct Sentence'

-- Replace each delimiter (here a space; use ',' for comma-separated values)
-- with XML tags, then shred the XML back into one row per value
select A.ID,
     Split.a.value('.', 'VARCHAR(100)') AS Words
 FROM  (SELECT ID,
         CAST ('<M>' + REPLACE(Ttext, ' ', '</M><M>') + '</M>' AS XML) AS Words
FROM  @SearchTable) AS A CROSS APPLY Words.nodes ('M') AS Split(a)
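For reference, on SQL Server 2016 and later the built-in STRING_SPLIT function achieves the same result without the XML workaround. A minimal sketch is below (note that STRING_SPLIT does not guarantee output order before the ordinal option added in SQL Server 2022):

```
DECLARE @SearchTable TABLE (ID INT, Ttext VARCHAR(4000));
INSERT INTO @SearchTable
SELECT 1, N'Headset Ball Bearings Chainring Nut This is illegal Sentence'
UNION
SELECT 2, N'This is Correct Sentence';

-- One output row per word; change the delimiter to ',' for comma-separated input
SELECT st.ID, s.value AS Words
FROM   @SearchTable AS st
CROSS APPLY STRING_SPLIT(st.Ttext, ' ') AS s;
```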

Cheers,

Wednesday 11 November 2015

How to create a PreFilter Dimension in SSAS

In this tutorial we will see how we can filter dimension data and then use it in a cube. Here we will filter the data of the dimension DimPromotion.


  • Open your DSV, then right-click in the middle window and select the option "New Named Query" as in the image below
  • Now in the Create Named Query window write your query; here we are filtering out the rows where the promotion name equals "No Discount". Give the name "DimPromotionNew" to the new named query and then click OK.
                 SELECT *
                  FROM [dbo].[DimPromotion]
                     where EnglishPromotionName<>'No Discount'

  • Now you will find a new named query, "DimPromotionNew", created in your DSV. Now create the relationships between your facts and this newly created dimension and delete the old one.
  • If you explore the data in these two tables you will find a difference of one row: in the new one, the row with the promotion "No Discount" has been filtered out.
DimPromotion Data


DimPromotionNew Data
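You can verify the filter from the relational side with a quick count; below is a sketch against the AdventureWorksDW sample database, where DimPromotion lives. The filtered count should be exactly one less than the base table's count.

```
SELECT (SELECT COUNT(*) FROM dbo.DimPromotion) AS AllRows,
       (SELECT COUNT(*) FROM dbo.DimPromotion
        WHERE EnglishPromotionName <> 'No Discount') AS FilteredRows;
```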

Cheers,







Monday 9 November 2015

Get worst performing queries on your database/server with T-SQL

Finding worst performing queries 

The following query returns information about the worst queries, ranked by average CPU time. It aggregates queries by their query hash so that logically equivalent queries are grouped by their cumulative resource consumption; the query at the top of the list is the costliest by average CPU time. You can edit the query with a TOP (n) clause if you want only the top 5 or 10 worst queries.

SELECT 
MIN(query_text.statement_text) AS [Query Text], 
    SUM(query_text.total_worker_time) / SUM(query_text.execution_count) AS [Avg CPU Time],
SUM(query_text.total_elapsed_time)/SUM(query_text.execution_count) AS [AVG Execution Time],
SUM(total_physical_reads) [Total Physical Reads],
SUM(total_rows) [Total Rows Returned]
FROM 
    (SELECT 
EQS.*, 
SUBSTRING(ST.text, (EQS.statement_start_offset/2) + 1,
((CASE statement_end_offset 
WHEN -1 THEN DATALENGTH(ST.text)
ELSE EQS.statement_end_offset END 
            - EQS.statement_start_offset)/2) + 1) AS statement_text
     FROM sys.dm_exec_query_stats AS EQS
     CROSS APPLY sys.dm_exec_sql_text(EQS.sql_handle) as ST) as query_text
GROUP BY query_text.query_hash

ORDER BY [Avg CPU Time] DESC;


Sunday 8 November 2015

T-SQL to check whether the statistics of tables are up to date or not

With the help of the T-SQL below we can check whether the statistics on our tables are up to date or not:

 SELECT distinct OBJECT_NAME(s.object_id) AS [ObjectName]
      ,s.[name] AS [StatisticName]
      ,STATS_DATE(s.[object_id], [stats_id]) AS [StatisticUpdateDate]
FROM sys.stats s join sys.objects o on s.object_id=o.object_id
where o.type='U' and STATS_DATE(s.[object_id], [stats_id]) is not null;
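If you also want to see how many rows have changed since the last statistics update, sys.dm_db_stats_properties can be joined in. A sketch is below; this DMF requires SQL Server 2008 R2 SP2 / 2012 SP1 or later.

```
SELECT OBJECT_NAME(s.object_id) AS ObjectName,
       s.name                   AS StatisticName,
       sp.last_updated          AS StatisticUpdateDate,
       sp.modification_counter  AS RowsModifiedSinceUpdate  -- since last update
FROM sys.stats AS s
JOIN sys.objects AS o ON o.object_id = s.object_id
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE o.type = 'U';
```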

Cheers,