Find the best tutors and institutes for Microsoft BI (Business Intelligence) Tools


Microsoft BI (Business Intelligence) Tools Updates


Lesson Posted on 06 Aug

Troubleshooting SQL Server Agent failed Jobs - Common daily routine work for DBA

Meetopus



If you would rather not do repetitive manual work at a routine time, SQL Server Agent jobs can help you achieve this. SQL Server Agent jobs automate the execution of recurring tasks in a SQL Server database at a scheduled time.

It is important that a DBA has a clear idea of how to troubleshoot a job when it fails.

SQL Server Agent can automate the execution of multiple tasks within a single job or by creating multiple jobs. Below are a few kinds of work a SQL Agent job can handle (a minimal job-creation sketch follows the list):

1. Execute a T-SQL query

2. Execute a Maintenance Plan (SSIS)

3. SSIS package

4. Execute a PowerShell script

5. Database integrity check

6. Replication agent

7. Log shipping
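
For illustration only (this sketch is mine, not part of the original lesson; the job name, step command and schedule are hypothetical), a daily T-SQL job can be created with the msdb stored procedures:

USE msdb;
GO
-- Create the job shell
EXEC dbo.sp_add_job
     @job_name = N'Nightly_CheckDB';                 -- hypothetical job name

-- Add a single T-SQL step: the work we want to automate
EXEC dbo.sp_add_jobstep
     @job_name  = N'Nightly_CheckDB',
     @step_name = N'Run integrity check',
     @subsystem = N'TSQL',
     @command   = N'DBCC CHECKDB (''master'');';

-- Schedule it to run daily at 01:00
EXEC dbo.sp_add_jobschedule
     @job_name          = N'Nightly_CheckDB',
     @name              = N'Daily at 1 AM',
     @freq_type         = 4,        -- daily
     @freq_interval     = 1,        -- every 1 day
     @active_start_time = 10000;    -- 01:00:00 (HHMMSS)

-- Target the local server so the Agent actually runs it
EXEC dbo.sp_add_jobserver
     @job_name = N'Nightly_CheckDB';
GO

In production you would normally also set the job owner, retry attempts and failure notifications through the same procedures.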

 

How can a DBA troubleshoot the failure of a SQL Agent job?

  1. Open Job Activity Monitor.
  2. Check the last run status.
  3. Find the error that resulted in the failure.

 

Some common reasons for job failure:

1. Insufficient permissions granted to the job owner

2. Script error / incorrect syntax / missing object / missing source file

3. Primary key / foreign key constraint violation

4. Deadlock / blocking

5. Insufficient space on the drive

6. Restricted growth of the database files (data or log file)

7. Linked server permission issue

8. LSN mismatch in the case of log shipping

 

If you are not able to find the error, add logging to the job, which can be done in the Advanced settings of each job step. Apart from the errors listed above, the DBA should also check the SQL Server error log to find the root cause (RCA).
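
If you prefer to check failures from T-SQL rather than Job Activity Monitor, job execution history is kept in msdb. The query below is a small sketch of mine (not part of the original lesson) that lists recently failed job steps; run_status = 0 marks a failure.

USE msdb;
GO
SELECT j.name   AS job_name,
       h.step_id,
       h.step_name,
       h.run_date,                 -- stored as an INT in YYYYMMDD format
       h.run_time,                 -- stored as an INT in HHMMSS format
       h.message                   -- error text returned by the failed step
FROM dbo.sysjobs AS j
JOIN dbo.sysjobhistory AS h
    ON h.job_id = j.job_id
WHERE h.run_status = 0             -- 0 = Failed
ORDER BY h.instance_id DESC;       -- most recent history rows first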


Lesson Posted on 06 Aug

What is database backup? How backup works in SQL Server?

Meetopus



A database backup, in simple terms, is a copy of data that can be used when things go wrong with your system or server. If your role is IT/database administrator, then it is your responsibility to safeguard the data from any disaster, in addition to handling critical security.

MS SQL Server, like other database systems, has the ability to take backups that can be used during disaster recovery. SQL Server allows the database user to decide and take backups based on the business strategy: the Recovery Point Objective (RPO) and the Recovery Time Objective (RTO).

To allow the database to be recovered up to the required point in time, SQL Server provides different recovery models and backup types:

  • Full or Database
  • Differential or Incremental
  • Transactional Log or Log.

Full database backup – Backs up all the data and objects in the data file(s) for a given database.

Differential database backup – Backs up any data and objects in the data file(s) for a given database that have changed since the last full backup. Remember that a differential backup backs up data since the last full backup, even if there have been intervening differential backups.

Transaction log backups – Copy into a backup file all the log records written to the transaction log (LDF) file since the last transaction log backup.

Apart from the above backup types, SQL Server also supports the following backups (example BACKUP statements follow the list):

  • File and filegroup backups - File and filegroup backups back up individual database files and filegroups rather than performing a full database backup. This method is useful for very large databases. You must also back up the transaction log when performing a file and filegroup backup. You cannot use this method if the Truncate Log On Checkpoint option is enabled.
  • Copy-only backups - Copy-only backups are functionally the same as full database or transaction log backups but do not affect the backup sequence. For example, if you took a full backup, a copy-only backup, and then a transaction log backup, all the transaction logs since the full backup would be backed up, and the existence of the copy-only backup would be ignored. Copy-only backups cannot be used as the basis for a differential backup or transaction log backup.
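
The following T-SQL is a minimal sketch of the backup types described above; the database name (SalesDB) and the backup file paths are made up for illustration.

-- Full backup of a hypothetical database named SalesDB
BACKUP DATABASE SalesDB
TO DISK = N'D:\Backups\SalesDB_Full.bak';

-- Differential backup: only data changed since the last full backup
BACKUP DATABASE SalesDB
TO DISK = N'D:\Backups\SalesDB_Diff.bak'
WITH DIFFERENTIAL;

-- Transaction log backup (requires FULL or BULK_LOGGED recovery model)
BACKUP LOG SalesDB
TO DISK = N'D:\Backups\SalesDB_Log.trn';

-- Copy-only full backup: does not affect the differential/log backup chain
BACKUP DATABASE SalesDB
TO DISK = N'D:\Backups\SalesDB_CopyOnly.bak'
WITH COPY_ONLY;

A typical strategy combines these: periodic full backups, more frequent differential backups, and frequent log backups sized to the RPO.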

Happy Learning

Meetopus


Lesson Posted on 06 Aug

Why Windows Administration knowledge is required for SQL DBA.

Meetopus



Because the RDBMS runs on the Windows Server operating system and depends on the Windows OS for most of its functions, a SQL DBA should know basic Windows administration in order to review the various services, events and errors.

Below are some Windows components you should know if you are starting your career as a SQL Server DBA (a small T-SQL check of the SQL Server services is sketched after the list):

  • Remote Desktop
  • Windows Services
  • File System
  • Active Directory
  • Managing Local Users and Groups
  • Local Security Policy
  • Disk Management
  • ODBC Administration
  • Event Viewer
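
As a small illustration of the Windows Services point (an addition of mine, not part of the original lesson), the sys.dm_server_services DMV lets you review the SQL Server-related Windows services without leaving Management Studio:

-- Review the Windows services that belong to this SQL Server instance
SELECT servicename,          -- e.g. SQL Server (MSSQLSERVER), SQL Server Agent
       status_desc,          -- Running / Stopped
       startup_type_desc,    -- Automatic / Manual / Disabled
       service_account,      -- the Windows account the service runs under
       last_startup_time
FROM sys.dm_server_services;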

Happy Learning

Meetopus



Answered on 31 May

I want to know the difference between SharePoint designer and SharePoint administrator. Can anyone please explain the difference to me?

Chandra Moorthy D

Trainer


A SharePoint designer is the person who determines the branding of the whole site; mostly, a developer does this job.

A SharePoint administrator is the person who maintains the environment on which SharePoint is installed: server maintenance, health monitoring and so on.

 

SharePoint Designer is also the name of a Microsoft tool for SharePoint, but that is a different thing altogether.


Answered on 19 Apr

Ramesh KN

Corporate Trainer

No need to have SSAS knowledge, Kumar, but you do need good knowledge of Excel operations, pivot tables and chart creation, as Power BI uses a lot of Excel features. Additional knowledge of any RDBMS (though not mandatory) will be an added plus. Ramesh (Power BI Consultant)

Lesson Posted on 16/09/2017

Cursors In SQL Server

Redbush Technologies Pvt.Ltd


First things first: the use of cursors is not encouraged in SQL Server because they are slow. You may go with a WHILE loop if you need to iterate through a recordset. A cursor is a database object used to retrieve data from a result set one row at a time, instead of the T-SQL commands that operate on all the rows in the result set at once. So it is a row-by-row operation instead of a set-based operation.
 
 
SQL Server cursor components. A cursor includes the following components:

1) DECLARE statements: declare the variables used in the code block.

2) SET/SELECT statements: initialize the variables to specific values.

3) DECLARE CURSOR statement: populate the cursor with the values that will be evaluated.

4) OPEN statement: open the cursor to begin data processing.

5) FETCH NEXT statements: assign the specific values from the cursor to the variables. (This logic is used for the initial fetch before the WHILE statement and then again during each loop iteration.)

6) WHILE statement: the condition to begin and continue data processing.

7) BEGIN...END statement: the start and end of the code block.

8) CLOSE statement: releases the current data and associated locks, but permits the cursor to be re-opened.

9) DEALLOCATE statement: destroys the cursor.
 
Below is an example of a static cursor:
 
DECLARE @Id int
DECLARE @name varchar(50)
DECLARE @salary int

-- Declare a static cursor over the ContractEmployee table
DECLARE cur_emp CURSOR STATIC FOR
    SELECT EmpID, EmpName, Salary FROM ContractEmployee

OPEN cur_emp
FETCH NEXT FROM cur_emp INTO @Id, @name, @salary
--FETCH ABSOLUTE 3 FROM cur_emp INTO @Id, @name, @salary

-- @@FETCH_STATUS = 0 means the last fetch returned a row
WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT 'ID : ' + CONVERT(varchar(20), @Id) + ', Name : ' + @name + ', Salary : ' + CONVERT(varchar(20), @salary)
    --FETCH RELATIVE 3 FROM cur_emp INTO @Id, @name, @salary
    FETCH NEXT FROM cur_emp INTO @Id, @name, @salary
END

CLOSE cur_emp
DEALLOCATE cur_emp
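
As noted at the top, a WHILE loop can often replace a cursor. The sketch below is my illustration (it assumes the same ContractEmployee table with EmpID as a unique key) and walks the rows without declaring a cursor:

DECLARE @Id int, @name varchar(50), @salary int

-- Start from the smallest EmpID and walk forward one row at a time
SELECT @Id = MIN(EmpID) FROM ContractEmployee

WHILE @Id IS NOT NULL
BEGIN
    SELECT @name = EmpName, @salary = Salary
    FROM ContractEmployee
    WHERE EmpID = @Id

    PRINT 'ID : ' + CONVERT(varchar(20), @Id) + ', Name : ' + @name + ', Salary : ' + CONVERT(varchar(20), @salary)

    -- Move to the next EmpID greater than the current one
    SELECT @Id = MIN(EmpID) FROM ContractEmployee WHERE EmpID > @Id
END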

 


Lesson Posted on 19/08/2017

What Is The Future Prospect Of A Career In Ms Sql Server?

Manoj Kumar Vishwakarma



What is the future prospect of a career in MS SQL Server?

You need to get more specific. Are you talking about being a DBA, designing databases, or getting a job with Microsoft on the SQL Server team? However, I don’t think that any of these options would be a career limiting decision. Also, I don’t see any reason to assume that SQL Server is going away any time soon. Here are some specifics on these three choices:

DBA: SQL Server DBAs certainly do require some very product specific knowledge, but much of the knowledge of managing one Relational Database Management System (RDBMS) will transfer to other RDBMS products (Oracle, Sybase, etc.).

Designing databases: If you are designing relational database schemas (writing DDL) then a good 95% of your knowledge would be transferable to other databases (I’ve worked with at least a dozen different RDBMS products over my career and as the SQL DDL and DML standards improve, the more similar the products get to one another).

Working at Microsoft on SQL Server: The knowledge that you gain while working on this product will obviously be mostly general programming experience. And since Microsoft is a big company, if SQL Server were killed off there would undoubtedly be a large number of other positions you could move into.

MS SQL Server DBA Training:

Overview:

Database Administrators (DBAs) are responsible for the design, implementation, support and maintenance of computerized databases in today’s organizations. The role also includes architecting, building and scaling databases for future data growth and capacity. They are also responsible for the security, performance and availability of data to users and customers.

All the above tasks are performed with the help of a Database Management System (DBMS) and among the most widely used DBMS across the world today is the Microsoft SQL Server Data Platform.

Salaries and Job Growth:

DBAs play an important and responsible role in every company’s Information Technology (IT) department. DBAs are also very well paid and the average annual salary is more than $100,000 in the USA.


Lesson Posted on 16/06/2017

How To Minimize Page Splits In SQL Server To Improve Database Performance?

Amitava Majumder



How do you minimize page splits in SQL Server to improve database performance?

Page Splits:

A page is 8 KB of data, which can be index-related, data-related, large object binary (LOB) data, etc.

When you insert rows into a table, they go on a page, into ‘slots’. Your row has a row length, and you can fit only so many rows on the 8 KB page. What happens when that row’s length increases, for instance because you entered a bigger product name in your varchar column? SQL Server needs to move the other rows along in order to make room for your modification. If the combined new length of all the rows on the page no longer fits on that page, SQL Server grabs a new page and moves rows to the right or left of your modification onto it – that is called a ‘page split’.

Page splits arise when records from one page are moved to another page during changes to your table. Suppose a new record (Martin) is being inserted, in sequence, between Adam and Rony. Since there is no room in this page, some records need to shift around; the page split occurs when Rony’s record moves to a second page.

This creates page fragmentation, which is very bad for performance, and it is reported as a page split.

Page splits are considered very bad for performance, and there are a number of techniques to reduce, or even eliminate, the risk of page splits.
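
Before setting up any tracing, a quick first indicator is the Page Splits/sec performance counter, which can be read from T-SQL. This is a small sketch of mine, not part of the original article; note that the counter also includes benign end-of-index splits, which is why the article drills deeper with Extended Events.

SELECT object_name, counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name = 'Page Splits/sec'
  AND object_name LIKE '%Access Methods%';
-- cntr_value is cumulative since the instance started; sample it twice and
-- take the difference over the interval to get an actual rate.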

Example code for tracking page splits:

We can find the bad page splits using the sqlserver.transaction_log event. This event monitors all activity in the transaction log, so we need to use it with caution. We can filter on the ‘operation’ field, looking for the value 11, which means LOP_DELETE_SPLIT. This is the deletion of rows that happens when SQL Server moves rows from one page to another during a page split, i.e. a bad page split.

Extended Events for SQL Server provides a generic tracing and troubleshooting framework which allows deeper and more granular control of tracing than was possible using earlier methods like DBCC, SQL Trace, Profiler, etc. These earlier methods still exist, but Extended Events is intended to replace SQL Trace and Profiler over time.

For this we need to create the session using T-SQL. The code to create the session is:

IF EXISTS (SELECT 1
           FROM sys.server_event_sessions
           WHERE name = 'PageSplits_Tracker')
    DROP EVENT SESSION [PageSplits_Tracker] ON SERVER

CREATE EVENT SESSION PageSplits_Tracker
ON SERVER
ADD EVENT sqlserver.transaction_log(
    WHERE operation = 11  -- LOP_DELETE_SPLIT
)
--Description for transaction_log event is: “Occurs when a record is added to the SQL Server transaction log.
--This is a very high volume event that will affect the performance of the server. Therefore, you should use
--appropriate filtering to reduce the number of events, and only use this event for targeted troubleshooting
--during a short time period.”
-- LOP_DELETE_SPLIT : A page split has occurred. Rows have moved physically.
ADD TARGET package0.histogram(
    SET filtering_event_name = 'sqlserver.transaction_log',
        source_type = 0, source = 'database_id');
GO
--package0.histogram : You can use the histogram target to troubleshoot performance issues.
--filtering_event_name : Any event present in the Extended Events session.
--source_type : The type of object that the bucket is based on.
--    0 for an event
--    1 for an action
--source : The event column or action name that is used as the data source.

-- Start the Event Session
ALTER EVENT SESSION PageSplits_Tracker
ON SERVER
STATE=START;
GO

-- Create the database
CREATE DATABASE Performance_Tracker
GO

USE [Performance_Tracker]
GO

-- Create a bad splitting clustered index table
CREATE TABLE PageSplits
( ROWID UNIQUEIDENTIFIER NOT NULL DEFAULT NEWID() PRIMARY KEY,
  Data INT NOT NULL DEFAULT (RAND()*1000),
  Change_Date DATETIME2 NOT NULL DEFAULT CURRENT_TIMESTAMP);
GO

-- This index should mid-split based on the DEFAULT column value
CREATE INDEX IX_PageSplitsPk_Data ON PageSplits (Data);
GO

-- This index should end-split based on the DEFAULT column value
CREATE INDEX IX_PageSplitsPk_ChangeDate ON PageSplits (Change_Date);
GO

-- Create a table with an increasing clustered index
CREATE TABLE PageSplits_Index
( ROWID INT IDENTITY NOT NULL PRIMARY KEY,
  Data INT NOT NULL DEFAULT (RAND()*1000),
  Change_Date DATETIME2 NOT NULL DEFAULT DATEADD(mi, RAND()*-1000, CURRENT_TIMESTAMP))
GO

-- This index should mid-split based on the DEFAULT column value
CREATE INDEX IX_PageSplits_Index_ChangeDate ON PageSplits_Index (Change_Date);
GO

-- Insert the default values repeatedly into the tables
WHILE 1=1
BEGIN
    INSERT INTO PageSplits DEFAULT VALUES;
    INSERT INTO PageSplits_Index DEFAULT VALUES;
    WAITFOR DELAY '00:00:00.005';
END
GO

--If we start up this workload and allow it to run for a couple of minutes, we can then query the histogram target
--for our session to find the database that has the mid-page splits occurring.

-- Query the target data to identify the worst splitting database_id
with cte as
(
SELECT
    n.value('(value)[1]', 'int') AS database_id,
    DB_NAME(n.value('(value)[1]', 'int')) AS database_name,
    n.value('(@count)[1]', 'bigint') AS split_count
FROM
(SELECT CAST(target_data as XML) target_data
 FROM sys.dm_xe_sessions AS s
 JOIN sys.dm_xe_session_targets t
     ON s.address = t.event_session_address
 WHERE s.name = 'PageSplits_Tracker'
  AND t.target_name = 'histogram' ) as tab
CROSS APPLY target_data.nodes('HistogramTarget/Slot') as q(n)
)
select * from cte

Sample output:

database_id    database_name          split_count
16             Performance_Tracker    123

--With the database_id of the worst splitting database, we can then change our event session configuration
--to only look at this database, and then change our histogram target configuration to bucket on the alloc_unit_id
--so that we can track down the worst splitting indexes in the database experiencing the worst mid-page splits.

-- Drop the Event Session so we can recreate it
-- to focus on the highest splitting database
DROP EVENT SESSION [PageSplits_Tracker]
ON SERVER

-- Create the Event Session to track LOP_DELETE_SPLIT transaction_log operations in the server
CREATE EVENT SESSION [PageSplits_Tracker]
ON SERVER
ADD EVENT sqlserver.transaction_log(
    WHERE operation = 11  -- LOP_DELETE_SPLIT
      AND database_id = 16 -- CHANGE THIS BASED ON TOP SPLITTING DATABASE!
)
ADD TARGET package0.histogram(
    SET filtering_event_name = 'sqlserver.transaction_log',
        source_type = 0, -- Event Column
        source = 'alloc_unit_id');
GO

-- Start the Event Session Again
ALTER EVENT SESSION [PageSplits_Tracker]
ON SERVER
STATE=START;
GO

--With the new event session definition, we can now rerun our problematic workload for another period of ten minutes or so
--and look at the worst splitting indexes based on the alloc_unit_id's that are in the histogram target:
WHILE 1=1
BEGIN
    INSERT INTO PageSplits DEFAULT VALUES;
    INSERT INTO PageSplits_Index DEFAULT VALUES;
    WAITFOR DELAY '00:00:00.005';
END
GO

-- Query Target Data to get the top splitting objects in the database:
SELECT
    o.name AS table_name,
    i.name AS index_name,
    tab.split_count,
    indexstats.index_type_desc AS IndexType,
    indexstats.avg_fragmentation_in_percent,
    i.fill_factor
FROM (  SELECT
            n.value('(value)[1]', 'bigint') AS alloc_unit_id,
            n.value('(@count)[1]', 'bigint') AS split_count
        FROM
        (SELECT CAST(target_data as XML) target_data
         FROM sys.dm_xe_sessions AS s
         JOIN sys.dm_xe_session_targets t
             ON s.address = t.event_session_address
         WHERE s.name = 'PageSplits_Tracker'
          AND t.target_name = 'histogram' ) as tab
        CROSS APPLY target_data.nodes('HistogramTarget/Slot') as q(n)
) AS tab
JOIN sys.allocation_units AS au
    ON tab.alloc_unit_id = au.allocation_unit_id
JOIN sys.partitions AS p
    ON au.container_id = p.partition_id
JOIN sys.indexes AS i
    ON p.object_id = i.object_id
        AND p.index_id = i.index_id
JOIN sys.objects AS o
    ON p.object_id = o.object_id
JOIN sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, NULL) indexstats
    ON i.object_id = indexstats.object_id
    AND i.index_id = indexstats.index_id
WHERE o.is_ms_shipped = 0
ORDER BY indexstats.avg_fragmentation_in_percent DESC

Sample output:

table_name          index_name                        split_count   IndexType            avg_fragmentation_in_percent   fill_factor
PageSplits_Index    IX_PageSplits_Index_ChangeDate    286           NONCLUSTERED INDEX   99.57894737                    0
PageSplits          PK__PageSpli__97BD02EBEA21A6BC    566           CLUSTERED INDEX      99.37238494                    0
PageSplits          IX_PageSplitsPk_Data              341           NONCLUSTERED INDEX   98.98989899                    0
PageSplits          IX_PageSplitsPk_ChangeDate        3             NONCLUSTERED INDEX   1.747572816                    0

--With this information we can now go back and change our FillFactor specifications and retest/monitor the impact
--to determine whether we've had the appropriate reduction in mid-page splits to accommodate the time between
--our index rebuild operations.

Using fill factor, we can minimize page splits:

Fill Factor: When an index is created with a fill factor percentage, this leaves a percentage of the index pages free after the index is created, rebuilt or reorganized. This free space is used to hold additional rows as page splits occur, reducing the chance of a page split in the data page causing a page split in the index structure as well. But even with the free space set to 10% to 20%, index pages eventually fill up and are split the same way that a data page is split.

A page is the basic unit of data storage in SQL Server. Its size is 8 KB (8192 bytes). Data is stored in the leaf-level pages of an index. The percentage of space to be filled with data in a leaf-level page is decided by the fill factor; the remaining space is left for future growth of data in the page. Fill factor is a number from 1 to 100. Its default value is 0, which is the same as 100. So a fill factor of 70 means 70% of the space is filled with data and the remaining 30% is left vacant for future use. The higher the fill factor, the more data is stored in the page. The fill factor setting is applied when we create or rebuild an index.

-- Change FillFactor based on split occurrences to minimize page splits
ALTER INDEX PK__PageSpli__97BD02EBEA21A6BC ON PageSplits REBUILD WITH (FILLFACTOR=70)
ALTER INDEX IX_PageSplitsPk_Data ON PageSplits REBUILD WITH (FILLFACTOR=70)
ALTER INDEX IX_PageSplits_Index_ChangeDate ON PageSplits_Index REBUILD WITH (FILLFACTOR=80)
GO

-- Stop the Event Session to clear the target
ALTER EVENT SESSION [PageSplits_Tracker]
ON SERVER
STATE=STOP;
GO

-- Start the Event Session Again
ALTER EVENT SESSION [PageSplits_Tracker]
ON SERVER
STATE=START;
GO

--Run the workload once again
WHILE 1=1
BEGIN
    INSERT INTO PageSplits DEFAULT VALUES;
    INSERT INTO PageSplits_Index DEFAULT VALUES;
    WAITFOR DELAY '00:00:00.005';
END
GO

--With the reset performed we can again start up our workload generation and
--begin monitoring the effect of the FillFactor specifications on the indexes with our code.
--After another 2-minute period we once again query the target data for the top splitting objects
--in the database: this time no page splits are found on the indexes IX_PageSplitsPk_ChangeDate,
--PK__PageSpli__97BD02EBEA21A6BC and IX_PageSplitsPk_Data.


Lesson Posted on 13/05/2017

History of SQL server

Rahul J Rajesh

I have 7+ years of experience as a SQL Server DBA and 6+ years in the teaching field.


In 1988, Microsoft released its first version of SQL Server. It was designed for the OS/2 platform and was developed jointly by Microsoft and Sybase. During the early 1990s, Microsoft began to develop a new version of SQL Server for the NT platform. While it was under development, Microsoft decided that SQL Server should be tightly coupled with the NT operating system. In 1992, Microsoft assumed core responsibility for the future of SQL Server for NT. In 1993, Windows NT 3.1 and SQL Server 4.2 for NT were released. Microsoft's philosophy of combining a high-performance database with an easy-to-use interface proved to be very successful. Microsoft quickly became the second most popular vendor of high-end relational database software. In 1994, Microsoft and Sybase formally ended their partnership. In 1995, Microsoft released version 6.0 of SQL Server. This release was a major rewrite of SQL Server's core technology. Version 6.0 substantially improved performance, provided built-in replication, and delivered centralized administration. In 1996, Microsoft released version 6.5 of SQL Server. This version brought significant enhancements to the existing technology and provided several new features. In 1997, Microsoft released version 6.5 Enterprise Edition. In 1998, Microsoft released version 7.0 of SQL Server, which was a complete rewrite of the database engine. In 2000, Microsoft released SQL Server 2000. SQL Server version 2000 is Microsoft's most significant release of SQL Server to date. This version further builds upon the SQL Server 7.0 framework. According to the SQL Server development team, the changes to the database engine are designed to provide an architecture that will last for the next 10 years.

 

Prior to version 7.0, the code base for MS SQL Server came from Sybase SQL Server. It was Microsoft's entry into the enterprise-level database market, competing against Oracle, IBM DB2 and, later, Sybase itself.

Rather than listing every new feature and enhancement found in SQL Server 2000, the table below summarizes the release history of SQL Server, with release names, code names and internal version numbers.


Version        Year   Release name                      Code name                             Internal version
1.0 (OS/2)     1989   SQL Server 1.0 (16 bit)           Ashton-Tate / Microsoft SQL Server    -
1.1 (OS/2)     1991   SQL Server 1.1 (16 bit)           -                                     -
4.21 (WinNT)   1993   SQL Server 4.21                   SQLNT                                 -
6              1995   SQL Server 6.0                    SQL95                                 -
6.5            1996   SQL Server 6.5                    Hydra                                 -
7              1998   SQL Server 7.0                    Sphinx                                515
-              1999   SQL Server 7.0 OLAP Tools         Palato mania                          -
8              2000   SQL Server 2000                   Shiloh                                539
8              2003   SQL Server 2000 64-bit Edition    Liberty                               539
9              2005   SQL Server 2005                   Yukon                                 611/612
10             2008   SQL Server 2008                   Katmai                                661
10.25          2010   Azure SQL DB                      Cloud Database or CloudDB             -
10.5           2010   SQL Server 2008 R2                Kilimanjaro (aka KJ)                  665
11             2012   SQL Server 2012                   Denali                                706
12             2014   SQL Server 2014                   SQL14                                 782
13             2016   SQL Server 2016                   -                                     852
14.0           2017   SQL Server 2017                   Helsinki                              -



Answered on 21/02/2017

Techandmate

TechandMate - The Technology Insight

Tableau is the best data visualization tool among all BI tools; you can go for it.
