Wednesday, April 6, 2016

Managing the size of the SQL Server SSIS catalog database

SQL Server

https://www.mssqltips.com/sqlservertip/3307/managing-the-size-of-the-sql-server-ssis-catalog-database/

Scenario

I have implemented the SSIS catalog that comes with SQL Server 2012 and I love the reporting and centralization that come with it. However, after a few months I noticed that the database was growing rapidly. How can I keep the size of this database to a minimum?


Configuration

The SSIS catalog is the central point for working with Integration Services (SSIS) projects that you’ve deployed to the SSIS server. When the SSIS catalog is implemented, the SSISDB database is automatically created. This tip focuses on SSISDB growth rather than on the SSIS catalog itself.
SSIS projects, packages, parameters, environments, and operational history are all stored in the SSISDB database, so if you have hundreds of SSIS packages, or packages that run every few minutes, you can see how the database storing all of this historical information would grow quickly.
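One quick way to keep an eye on that growth is to check the SSISDB file sizes directly; a minimal sketch (size is reported in 8 KB pages, hence the conversion to MB):

-- Current size of the SSISDB data and log files, in MB
SELECT name, type_desc, size * 8 / 1024 AS size_mb
FROM SSISDB.sys.database_files;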
Also included when you enable this feature is a SQL Server Agent job called SSIS Server Maintenance Job.
Inside this job are two steps, SSIS Server Operation Records Maintenance and SSIS Server Max Version Per Project Maintenance, that help clean up the database. By default, this job is scheduled to run nightly at 12:00 AM, which is sufficient for most environments.
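If you want to confirm the job is present and see its steps and schedule, you can ask SQL Server Agent directly (assuming the default job name):

-- Show the maintenance job's steps, schedule, and last run outcome
EXEC msdb.dbo.sp_help_job @job_name = N'SSIS Server Maintenance Job';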
Looking at the first step, SSIS Server Operation Records Maintenance, you will notice that it executes a stored procedure named internal.cleanup_server_retention_window. This sounds like it could be the stored procedure that cleans up history:
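The step's command essentially boils down to calling that procedure. As a minimal sketch, you can run the same cleanup manually and inspect or shorten the retention window it honors (the 30-day value is only an example):

-- Run the same cleanup the job step performs
EXEC SSISDB.internal.cleanup_server_retention_window;

-- Check how many days of history the catalog keeps
SELECT property_name, property_value
FROM SSISDB.catalog.catalog_properties
WHERE property_name = N'RETENTION_WINDOW';

-- Keep less history so the cleanup removes more (example: 30 days)
EXEC SSISDB.catalog.configure_catalog
    @property_name = N'RETENTION_WINDOW',
    @property_value = 30;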

Integration Services Logging Levels in SQL Server 2016

SQL Server

https://www.mssqltips.com/sqlservertip/4034/integration-services-logging-levels-in-sql-server-2016/

Scenario

At the time of writing, the SQL Server 2016 preview (CTP 2.3) has been released, and it includes some changes to the Integration Services (SSIS) catalog logging levels. This tip will explain what is new and how you can use it.


Configuration

Integration Services Logging Levels

In SQL Server 2012, the SSIS catalog was introduced. This catalog provides a central administration platform for the SSIS environment. One of its features is the logging levels, where each logging level specifies which events are logged to the SSIS catalog. Do not confuse this with the log providers of an SSIS package. The log providers log events to a specific destination: SQL Server, a flat file, an XML file and so on. A logging level is a configuration option for the built-in logging of the SSIS catalog: every time you run an SSIS package in the catalog, events are logged to tables in the SSISDB database.
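You can check which logging level the catalog currently uses as its server-wide default by querying its properties (a quick check; on a default installation this returns 1, i.e. Basic):

-- Show the catalog-wide default logging level (1 = Basic)
SELECT property_name, property_value
FROM SSISDB.catalog.catalog_properties
WHERE property_name = N'SERVER_LOGGING_LEVEL';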
In its initial release and in SQL Server 2014, Integration Services had four logging levels (a sketch of selecting one at execution time follows the list):
  • None - logging is turned off.
  • Basic - all events are logged, except custom and diagnostic events. The name is quite misleading, as a lot of information is logged. This is the default logging level of the SSIS catalog.
  • Performance - only performance statistics and the OnError and OnWarning events are logged. This logging level can log less information than the Basic logging level, but it depends on the number of data flows and their complexity. The data flow performance information captured at this level is shown in the catalog report Execution Performance.
  • Verbose - all events are logged. Only use this logging level for advanced troubleshooting or fine tuning.
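As referenced above, here is a minimal sketch of choosing one of these logging levels for a single execution through the catalog's stored procedures; the folder, project, and package names are placeholders:

DECLARE @execution_id BIGINT;

-- Create an execution for a deployed package (the names are hypothetical)
EXEC SSISDB.catalog.create_execution
    @folder_name = N'MyFolder',
    @project_name = N'MyProject',
    @package_name = N'MyPackage.dtsx',
    @execution_id = @execution_id OUTPUT;

-- LOGGING_LEVEL values: 0 = None, 1 = Basic, 2 = Performance, 3 = Verbose
EXEC SSISDB.catalog.set_execution_parameter_value
    @execution_id = @execution_id,
    @object_type = 50,                    -- execution-level parameter
    @parameter_name = N'LOGGING_LEVEL',
    @parameter_value = 2;                 -- Performance

EXEC SSISDB.catalog.start_execution @execution_id;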
In the SQL Server 2016 preview, a new logging level has been added and there is now the possibility to create your own custom logging levels. Both will be explained in the following sections.

SQL Server 2016 preview

As mentioned earlier, SQL Server 2016 is still in preview at the time of writing (CTP 2.3 is the most recent release). This means the features of Integration Services and its catalog are not final: functionality might change, disappear, or be added.

The RuntimeLineage Logging Level

Currently, there is not a lot of information out there on this logging level. The documentation states: "Collects the data required to track lineage in the data flow." My guess is that this logging level might be used by other features or products to track the lineage of the data, i.e. where the data comes from and where it goes. In order to test this logging level, I created a very basic package:
[Image: a simple data flow]
It reads data from a table, adds a custom column, merges two streams, and writes the data to a table. When the package is executed with the RuntimeLineage logging level, we can see in the catalog view catalog.event_message_context that extra information about the data flow has been logged:
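A sketch of pulling that extra context back out for one run; @execution_id stands in for the operation ID of the execution in question:

-- Extra context logged per event message for one execution
DECLARE @execution_id BIGINT = 12345;  -- hypothetical operation ID

SELECT emc.package_path,
       emc.context_source_name,
       emc.property_name,
       emc.property_value
FROM SSISDB.catalog.event_message_context AS emc
JOIN SSISDB.catalog.event_messages AS em
    ON em.event_message_id = emc.event_message_id
WHERE em.operation_id = @execution_id;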