Wednesday, March 28, 2012

Performance Help

I have a 40 gig database (SQL 2005 Standard) and need to run several hundred
queries against it every night. These queries are for reporting purposes and
need to run within certain time constraints. I have added indexes to the
tables, but the queries could potentially include filter criteria on any of
100 or so columns. At this point, I am at a loss as to how to optimize a
database of this size properly to facilitate these requirements. Any
thoughts?
Thanks,
Matt
Message posted via http://www.droptable.com
One easy solution could be to use the Database Engine Tuning Advisor (DTA).
Create a trace using Profiler or (better) a server-side trace to collect all
these queries, and use the trace file as input to DTA. Then follow the index
and statistics recommendations from DTA.
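As a rough sketch, a minimal server-side trace along these lines could
capture the nightly workload (the file path, event choices, and rollover
size are assumptions; adjust them to your environment):

-- Minimal server-side trace sketch (SQL Server 2005).
DECLARE @TraceID int;
DECLARE @maxfilesize bigint;
SET @maxfilesize = 100;  -- MB per rollover file

-- @options = 2 is TRACE_FILE_ROLLOVER; .trc is appended automatically.
EXEC sp_trace_create @TraceID OUTPUT, 2,
     N'C:\Traces\nightly_workload', @maxfilesize;

-- Capture completed batches and RPCs with text, duration, reads, and CPU.
DECLARE @on bit;
SET @on = 1;
EXEC sp_trace_setevent @TraceID, 12, 1,  @on;  -- SQL:BatchCompleted, TextData
EXEC sp_trace_setevent @TraceID, 12, 13, @on;  -- SQL:BatchCompleted, Duration
EXEC sp_trace_setevent @TraceID, 12, 16, @on;  -- SQL:BatchCompleted, Reads
EXEC sp_trace_setevent @TraceID, 12, 18, @on;  -- SQL:BatchCompleted, CPU
EXEC sp_trace_setevent @TraceID, 10, 1,  @on;  -- RPC:Completed, TextData
EXEC sp_trace_setevent @TraceID, 10, 13, @on;  -- RPC:Completed, Duration

EXEC sp_trace_setstatus @TraceID, 1;  -- start; use 0 to stop, 2 to close
SELECT @TraceID AS TraceID;           -- keep this id to manage the trace

The .trc file this produces can then be used directly as the workload file
in DTA.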
A better solution would be to use a tuning methodology like the one
described by Itzik Ben-Gan in his book Inside SQL Server 2005: T-SQL Querying
(Chapter 3, Tuning Methodology).
Hope this helps,
Ben Nevarez
"lotek via droptable.com" wrote:

> I have a 40 gig database (SQL 2005 Standard) and need to run several hundred
> queries against it every night. These queries are for reporting purposes and
> need to run within certain time constraints. I have added indexes to the
> tables, but the queries could potentially include filter criteria for any
> number of 100 or so columns. At this point, i am at a loss of how to optimize
> a database of this size properly to faciliate these requirements. Any
> thoughts?
> Thanks,
> Matt
> --
> Message posted via http://www.droptable.com
>
Some rather high-level and general suggestions:
1. Use SQL Trace to record these queries, and use the recorded queries as
your base workload for performance analysis.
2. Run the workload through the database engine tuning advisor and analyze
the index recommendations. Note that the recommendations from the database
tuning advisor may not always be spot on, but you can take them as input to
your analysis.
3. Concentrate on the top N most expensive queries and focus your tuning
efforts on them first (see the sketch below).
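As a sketch of how to find those top N queries, the plan-cache statistics
DMV can rank cached statements by cumulative cost (ordering by total logical
reads here is just one reasonable metric; CPU or elapsed time work as well):

-- Top 20 statements by cumulative logical reads since the plan was cached.
SELECT TOP 20
    qs.execution_count,
    qs.total_logical_reads,
    qs.total_worker_time,
    qs.total_elapsed_time,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_logical_reads DESC;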
Linchi
"lotek via droptable.com" wrote:

> I have a 40 gig database (SQL 2005 Standard) and need to run several hundred
> queries against it every night. These queries are for reporting purposes and
> need to run within certain time constraints. I have added indexes to the
> tables, but the queries could potentially include filter criteria for any
> number of 100 or so columns. At this point, i am at a loss of how to optimize
> a database of this size properly to faciliate these requirements. Any
> thoughts?
> Thanks,
> Matt
> --
> Message posted via http://www.droptable.com
>
Make sure that the queries are executed at a time when the server is
experiencing the least amount of activity. You may also want to check what
else runs on the server outside of SQL Server and terminate as many
unnecessary processes as possible.
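As a quick sanity check before the batch starts, something like the
following can confirm the instance itself is quiet (a sketch; the
session_id > 50 filter is a common rule of thumb for skipping system
sessions on 2005):

-- Active user requests right now; run before kicking off the report batch.
SELECT r.session_id, r.status, r.command, r.wait_type,
       r.cpu_time, r.total_elapsed_time
FROM sys.dm_exec_requests AS r
WHERE r.session_id > 50          -- skip system sessions
  AND r.session_id <> @@SPID;    -- skip this query itself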
Sincerely,
John K
Knowledgy Consulting
www.knowledgy.org
Atlanta's Business Intelligence and Data Warehouse Experts
"Linchi Shea" <LinchiShea@.discussions.microsoft.com> wrote in message
news:83BE6CA5-2EA6-4177-8559-07675C44F8DE@.microsoft.com...[vbcol=seagreen]
> Some rather high-level and general suggestions:
> 1. Use SQL Trace to record these queries, and use the recorded queries as
> your base workload for performance analysis.
> 2. Run the workload through the database engine tuning advisor and analyze
> the index recommendations. Note that the recommendations from the database
> tuning advisor may not always be spot on, but you can take them as input
> to
> your analysis.
> 3. Concentrate on the top N most expensive queries and focus your tuning
> efforts them first.
> Linchi
> "lotek via droptable.com" wrote:
Also, see if you can partition any of your data and tables and run your
queries in parallel.
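One caveat: table partitioning in SQL Server 2005 requires Enterprise
Edition, so on Standard the usual workaround is a partitioned view. A
minimal sketch, with hypothetical table and column names:

-- Hypothetical example: sales rows split by year into member tables.
-- The CHECK constraints let the optimizer skip member tables that cannot
-- match a filter on SaleDate.
CREATE TABLE dbo.Sales2010 (
    SaleID   int      NOT NULL,
    SaleDate datetime NOT NULL
        CHECK (SaleDate >= '20100101' AND SaleDate < '20110101'),
    Amount   money    NOT NULL,
    CONSTRAINT PK_Sales2010 PRIMARY KEY (SaleID, SaleDate)
);
CREATE TABLE dbo.Sales2011 (
    SaleID   int      NOT NULL,
    SaleDate datetime NOT NULL
        CHECK (SaleDate >= '20110101' AND SaleDate < '20120101'),
    Amount   money    NOT NULL,
    CONSTRAINT PK_Sales2011 PRIMARY KEY (SaleID, SaleDate)
);
GO
CREATE VIEW dbo.Sales
AS
SELECT SaleID, SaleDate, Amount FROM dbo.Sales2010
UNION ALL
SELECT SaleID, SaleDate, Amount FROM dbo.Sales2011;
GO
-- A date-filtered query now reads only the relevant member table:
SELECT SUM(Amount) FROM dbo.Sales WHERE SaleDate >= '20110601';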
Sincerely,
John K
Knowledgy Consulting
www.knowledgy.org
Atlanta's Business Intelligence and Data Warehouse Experts
"lotek via droptable.com" <u16539@.uwe> wrote in message
news:7fc7a9d0d08f9@.uwe...
>I have a 40 gig database (SQL 2005 Standard) and need to run several
>hundred
> queries against it every night. These queries are for reporting purposes
> and
> need to run within certain time constraints. I have added indexes to the
> tables, but the queries could potentially include filter criteria for any
> number of 100 or so columns. At this point, i am at a loss of how to
> optimize
> a database of this size properly to faciliate these requirements. Any
> thoughts?
> Thanks,
> Matt
> --
> Message posted via http://www.droptable.com
>
1) Get professional assistance.
2) Buy better hardware (especially RAM, more CPUs, and fast I/O).
3) Investigate partitioning.
4) If you use lots of char-type fields, investigate Full-Text Indexing.
5) If your queries access large portions of the data (10-20% plus), indexing
(full-text or regular) won't help because the optimizer will choose table
scans (unless you have covering indexes).
6) Check for locking. Consider using NOLOCK hints on the report queries (see
the sketch below).
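For points 5 and 6, a covering index and a NOLOCK hint might look like this
(table and column names are hypothetical; keep in mind NOLOCK allows dirty
reads, which is usually tolerable for reporting but not always):

-- Hypothetical covering index: the key matches the filter column, and
-- INCLUDE carries the selected columns so the query is answered from the
-- index alone, without touching the base table.
CREATE NONCLUSTERED INDEX IX_Orders_OrderDate_Covering
ON dbo.Orders (OrderDate)
INCLUDE (CustomerID, TotalDue);

-- Report query reading past locks (dirty reads are possible):
SELECT CustomerID, TotalDue
FROM dbo.Orders WITH (NOLOCK)
WHERE OrderDate >= '20120301';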
Kevin G. Boles
Indicium Resources, Inc.
SQL Server MVP
kgboles a earthlink dt net
"lotek via droptable.com" <u16539@.uwe> wrote in message
news:7fc7a9d0d08f9@.uwe...
>I have a 40 gig database (SQL 2005 Standard) and need to run several
>hundred
> queries against it every night. These queries are for reporting purposes
> and
> need to run within certain time constraints. I have added indexes to the
> tables, but the queries could potentially include filter criteria for any
> number of 100 or so columns. At this point, i am at a loss of how to
> optimize
> a database of this size properly to faciliate these requirements. Any
> thoughts?
> Thanks,
> Matt
> --
> Message posted via http://www.droptable.com
>
