Reports can take up to several minutes to run due to the complexity of
the queries (most are hundreds of lines long). Each report can be
run by many users, so in effect we have a slow system.
I want to separate the complex part of the queries into a process that
runs each night. The reports will then only have to query
pre-formatted data with minimal parameters, as the hard part will have
been completed overnight while the users are not in. Ideally we will
generate (via a stored procedure, possibly) a set of data for each
report and hold this on the server. We can then query with simpler
parameters, such as by date, and get the data back quite quickly.
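For illustration, something along these lines is what I have in mind
(all the object names here are invented, and the real views are far
bigger):

-- Nightly generation: materialise the expensive view into a plain table.
CREATE PROCEDURE dbo.GenerateSalesReportData
AS
BEGIN
    SET NOCOUNT ON
    -- Rebuild the pre-computed set from scratch each night.
    TRUNCATE TABLE dbo.SalesReportData
    INSERT INTO dbo.SalesReportData (ReportDate, Region, Amount)
    SELECT ReportDate, Region, SUM(Amount)
    FROM dbo.vwComplexSales        -- one of the slow, complex views
    GROUP BY ReportDate, Region
END
GO

-- The report itself then only needs a cheap, indexed lookup.
CREATE PROCEDURE dbo.GetSalesReport
    @StartDate datetime,
    @EndDate   datetime
AS
    SELECT ReportDate, Region, Amount
    FROM dbo.SalesReportData
    WHERE ReportDate BETWEEN @StartDate AND @EndDate
GO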
The whole process of how we obtain the data is very complex. There are
various views which gather data from the back office system. These are
very complex in themselves, and when queries are run against them
joining in other tables to bring in more data, it gets complicated
very quickly.
The only problem is that the users want to have access to LIVE data
from the back office system, specifically the Sales team, who want to
access this remotely. My method only allows for data from the night
before, so is there an option available to me which will allow me to
do this? The queries can't be improved a great deal, so they will
take as long as they take. Running them once is the only way I can
see to improve the performance in any significant way.
True, I could just let them carry on as they are and suffer with the
performance on live data, but I'd like to do something to improve the
situation for them.
Any advice would be appreciated.
Thanks
Ryan

Yep, that definitely makes a difference. It's taken me a while to get
back to this, but we're looking at a solution along these lines.
By taking a copy of the data from the view into a table, replicating
this onto another database (for generation), and then running stored
procedures (one per complex query) against it, we've been able to
reduce a test generation query from 5 minutes to about 3 seconds.
We've timed the other parts and can probably generate all of our data
in less than 5 minutes for 100 complex queries that would normally
take hours.
Once this data is generated we'll simply swap it for the current data,
which should be pretty quick. Our report now loads in less than a
second as opposed to 5 minutes or so.
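Roughly, the swap will look something like this (names invented again;
it's the sort of "seamless switch" Erland mentions below):

-- Generate into a staging table, then swap it in with three renames,
-- so reports querying dbo.SalesReportData see the fresh data at once.
BEGIN TRANSACTION
    EXEC sp_rename 'dbo.SalesReportData',     'SalesReportData_old'
    EXEC sp_rename 'dbo.SalesReportData_new', 'SalesReportData'
    EXEC sp_rename 'dbo.SalesReportData_old', 'SalesReportData_new'
COMMIT TRANSACTION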
This means we can probably update the data once an hour and push it
onto the query database, which will be more than quick enough for the
users. This has the knock-on effect of being able to produce the
reports in seconds instead of minutes.
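If we do go hourly, a SQL Server Agent job along these lines would
drive the regeneration (job and database names are made up):

-- Create the job, give it one step, target this server,
-- and schedule it to run every hour.
EXEC msdb.dbo.sp_add_job
     @job_name = 'Regenerate report data'

EXEC msdb.dbo.sp_add_jobstep
     @job_name      = 'Regenerate report data',
     @step_name     = 'Run generation procedures',
     @command       = 'EXEC dbo.GenerateSalesReportData',
     @database_name = 'ReportStaging'

EXEC msdb.dbo.sp_add_jobserver
     @job_name    = 'Regenerate report data',
     @server_name = '(local)'

EXEC msdb.dbo.sp_add_jobschedule
     @job_name             = 'Regenerate report data',
     @name                 = 'Hourly',
     @freq_type            = 4,   -- daily...
     @freq_interval        = 1,
     @freq_subday_type     = 8,   -- ...every N hours
     @freq_subday_interval = 1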
Erland, thanks for the advice on this; it should make quite a
difference.
Erland Sommarskog <sommar@algonet.se> wrote in message news:<Xns93DEF27F53F61Yazorman@127.0.0.1>...
> Ryan (ryanofford@hotmail.com) writes:
> > The only problem is that the users want to have access to LIVE data
> > from the back office system, specifically the Sales team, who want to
> > access this remotely. My method only allows for data from the night
> > before, so is there an option available to me which will allow me to
> > do this? The queries can't be improved a great deal, so they will
> > take as long as they take. Running them once is the only way I can
> > see to improve the performance in any significant way.
> If users want current data, you cannot pre-compute it for them. You
> know, you can't have your cake and eat it too.
> What you could do is to set up a report server which you replicate
> to, so that you take the load off the back-office system.
> Yet another alternative is to have three databases:
> A - the source system.
> B - the view system that the users use.
> C - a computation system.
> You keep the computation database updated by means of log shipping
> (this is better than replication in this case). Once you have applied
> a log, you pre-compute all the data. When this step is done, you flush
> the pre-computed tables on B and insert the data from C. Then you go
> and pick up a new log from A. A fancier variant is to have a seamless
> switch between B and C which you flip constantly. I believe there are
> such things.

Ryan (ryanofford@hotmail.com) writes:
> By taking a copy of the data from the view into a table, replicating
> this onto another database (for generation), and then running stored
> procedures (one per complex query) against it, we've been able to
> reduce a test generation query from 5 minutes to about 3 seconds.
> We've timed the other parts and can probably generate all of our data
> in less than 5 minutes for 100 complex queries that would normally
> take hours.
300 seconds down to three! That's not bad!
> Erland, thanks for the advice on this; it should make quite a
> difference.
I'm only glad to have helped! And thanks for reporting back!
--
Erland Sommarskog, SQL Server MVP, sommar@algonet.se
Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp