Monday, March 26, 2012
Performance depending on user rights?
I am new to the world of SQL Server. I just installed it (SQL Server 2000) from CD, added some users, created an ODBC data source from my client, and am ready to roll. And it works!
When I run a query from Query Analyzer, some queries are very fast, while the same query from my client (Access) takes much longer. I noticed differences between the execution paths in Profiler. I just can't find out what causes these performance problems.
Someone told me it could have something to do with the user. When I use Query Analyzer I am on the server, with admin rights. ODBC uses a user with as few rights as possible. Can this be the problem?
Hope someone can help me.
Thanks,
Peter|||Does the SQL statement look the same (the one from QA and Access)?
--
Tibor Karaszi, SQL Server MVP
Archive at: http://groups.google.com/groups?oi=djq&as_ugroup=microsoft.public.sqlserver
|||Yes, the SQL statement is the same.
|||Could possibly be different SET settings between the QA and ODBC connections...
--
Tibor Karaszi, SQL Server MVP
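One way to chase the SET-settings suggestion above (a hedged sketch, assuming SQL Server 2000): a classic culprit is ARITHABORT, which ODBC connections typically leave OFF while Query Analyzer sets it ON, which can produce different cached plans for the same statement. Run the following from both clients and compare the output:

```sql
-- Show the current session's SET options.
-- Run once in Query Analyzer and once through the ODBC connection
-- (e.g. an Access pass-through query), then compare line by line.
DBCC USEROPTIONS;

-- To test whether a particular setting is the cause, match the ODBC
-- session's value in Query Analyzer and re-run the slow query:
SET ARITHABORT OFF;
-- ...re-run the query here and compare the duration...
```

If the timings converge after matching the settings, the difference was the cached plan, not the user's rights.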
|||On 23 Oct 2003 04:43:04 -0700, bikkelp@.hotmail.com (P. Bikkel) wrote:
>I am in the world of SQL Server. Just installed it (SQL Server 2000)
>from cd, added some users, created an ODBC datasource from my client
>and am ready to roll. And it works!
I love it when a plan comes together!
>When I run a query from Query Analyzer some queries are very
>fast, while the same query from my client (Access) takes much longer. I
>noticed in the Profiler differences between the execution paths. I
>just can't find out what causes these performance problems.
>Someone told me it could have something to do with the user. When I
>use Query Analyzer I am on the server, with admin rights. ODBC uses a
>user with as little rights as possible. Can this be the problem?
Well, how fast is fast, and how much longer is much longer?
If the query returns a lot of records, it could be something as simple
as the network transfer time. What if you run Query Analyzer on the
client, or Access on the server?
Joshua Stern
Performance degradation on data transfer
Wonder if any of you could help me out here. I have just created a
new, empty database and imported data from another database into it.
This was done with the Import Wizard from the MMC.
First thing that I noticed was the size difference, the old database
was well over 1GB, but the new one was only about 400MB.
Second thing I've noticed, and this is the problem, is that accessing
the new (smaller) database instead of the old one causes a huge speed
degradation, about 5 times slower than the old version.
We are using MS SQL Server 7. Any help would be very gratefully
received.
Regards
Gethyn|||"Gethyn" <geth@.blueyonder.co.uk> wrote in message
news:34b76a2b.0401080147.77d51187@.posting.google.com...
One possibility is that the indexes were not moved over.
Also, the hardware could be different.
Finally, try running UPDATE STATISTICS.
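The suggestions above can be checked directly. A minimal sketch, assuming SQL Server 7.0; the table name `Orders` is hypothetical:

```sql
-- After an import, statistics (and possibly indexes) may be missing
-- or stale, which can easily account for a 5x slowdown.

-- Refresh statistics for one table:
UPDATE STATISTICS Orders;

-- Or refresh statistics for every table in the database:
EXEC sp_updatestats;

-- The import wizard does not always carry indexes over.
-- List the indexes on a table to check whether they survived:
EXEC sp_helpindex 'Orders';
```

The size difference (1 GB vs. 400 MB) is itself a hint: the missing space may well be the old database's indexes.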
Performance degradation after replication rolled out
Hi all.
Any assistance would be greatly appreciated.
We recently set up transactional replication, hoping to improve performance issues we were experiencing. The replication is between two SQL Servers (2000), and since we introduced it, performance has degraded considerably.
I will try and explain the scenario.
We have a primary database that our internal users use, and we also have the newly replicated database that our website and another application use. Users are complaining that the website and the internal application are extremely slow. Is it possible to run index tuning on both the primary and the replicated database, based on trace files, to create new indexes? Or would that have an impact on the replication?
Thanks in advance.
Running the Index Tuning Wizard is a good idea. Also, generate some trace files and look through them: see if there are any CPU spikes, and check your most frequently used stored procedures for missing indexes. Look at the columns in the WHERE clauses of your T-SQL statements. Do you have indexes on those columns? The Index Tuning Wizard can identify most such things. You might have to stop replication while it runs; you could do it during a weekend maintenance window.
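The WHERE-clause advice above can be sketched as follows (hypothetical table and column names; the actual candidates come from your own trace files):

```sql
-- If traces show frequent scans caused by a predicate such as
-- WHERE CustomerID = @id, an index on that column can help:
CREATE INDEX IX_Orders_CustomerID
    ON dbo.Orders (CustomerID);

-- The Index Tuning Wizard can be fed a Profiler trace (.trc file)
-- and will suggest such indexes automatically.
```

Note that every new index on the subscriber adds write cost when the distribution agent applies changes, so test the replication throughput after adding them.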
|||There's also a basic whitepaper on transactional replication performance that you should read through to see if there's anything you can take advantage of: "Transactional Replication Performance Tuning and Optimization" - http://msdn2.microsoft.com/en-us/library/aa902656(sql.80).aspx.
Do you replicate a lot of updates? And do the tables on the subscriber have more than one index on them? If so, I highly recommend reading up on sp_scriptdynamicupdproc.
Saturday, February 25, 2012
Perform upgrade method
hi
I have two versions of the same data flow component, and I need to use the new version in packages created with the old version. How can I do this using the PerformUpgrade method?
Thanks
Mani
An explanation of PerformUpgrade and sample code can be found in Books Online or at:
http://msdn2.microsoft.com/ko-kr/library/microsoft.sqlserver.dts.pipeline.pipelinecomponent.performupgrade.aspx
Hope this helps,
Ovidiu Burlacu
hi
Thanks, I found the solution: the PerformUpgrade method only runs if the CurrentVersion property stored in the package is less than the CurrentVersion property of the component on the local computer.
Regards
Mani
|||hi
If I remove the old DLL from the GAC, I get the following errors while opening a package saved with the old version of the DLL:
Error 1 Validation error. Data Flow Task: DTS.Pipeline: The component metadata for "component "Oracle Destination" (67)" could not be upgraded to the newer version of the component. The PerformUpgrade method failed. Package1.dtsx 0 0
Error 2 Error loading Package1.dtsx: The "component "Oracle Destination" (67)" failed to cache the component metadata object and returned error code 0x80131600. d:\testproject\integration services project5\Package1.dtsx 1 1
Error 3 Error loading Package1.dtsx: Component "component "Oracle Destination" (67)" could not be created and returned error code 0xC0047067. Make sure that the component is registered correctly. d:\testproject\integration services project5\Package1.dtsx 1 1
Error 4 Error loading Package1.dtsx: The component metadata for "component "Oracle Destination" (67)" could not be upgraded to the newer version of the component. The PerformUpgrade method failed. d:\testproject\integration services project5\Package1.dtsx 1 1
How can I correct these errors?
thanks
Mani
|||You should not remove the old DLL from the GAC, because then the component can no longer be instantiated. The new DLL should be able to live in the GAC alongside the old DLL, because they have different versions (file or assembly version, or even the PublicKeyToken).
Thanks,
Ovidiu
Ovidiu, I don't see how you can do side by side with SSIS components in the GAC. File versions do not differentiate between assemblies, so you cannot use them to install side by side. If you change the assembly version or the key (and therefore the public key token), you can run them side by side, but then you are stuck, since SSIS stores the full strong name in the package. Any change in the strong name means you get the nice error about not being able to load the assembly. Hacking the XML would seem to be the only solution; shame it is not a supported one :)
|||Indeed, you are right. I think it is time for me to go on vacation. |||
In fact you can automatically upgrade a dataflow component's metadata when you upgrade the component's assembly version.
Here's how:
1. Remove the old assembly from GAC.
2. Make sure your new component's CurrentVersion argument of the DtsPipelineComponent attribute is greater than the one saved in the old metadata. That'll ensure that PerformUpgrade of the new component will be called when SSIS opens the package.
3. In the new component override PerformUpgrade and in it make sure to include the following:
ComponentMetaData.CustomPropertyCollection["UserComponentTypeName"].Value = this.GetType().AssemblyQualifiedName;
This changes the metadata's type reference to point to the new type and version. This ensures that things like double-clicking the component will work if you're using the UITypeName argument of the DtsPipelineComponent attribute, for example.
Of course, here you should also upgrade any metadata properties from the old version to the new one, including any UITypeEditor properties of custom properties that have their own custom editors.
4. Install the new assembly to the GAC.
5. Create a policy assembly redirecting the old version to the new one and install it in GAC. For example of how to do that see here: http://samples.gotdotnet.com/quickstart/howto/doc/pubpolicy.aspx
Now when you open packages with the old version the pattern above will automatically upgrade to the new version.
Milen
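Steps 2 and 3 above can be sketched like this. This is a hedged outline, not a complete component: the class name, versions, and omitted members are hypothetical, and it assumes the SSIS 2005 pipeline API:

```csharp
using Microsoft.SqlServer.Dts.Pipeline;

[DtsPipelineComponent(
    ComponentType = ComponentType.DestinationAdapter,
    DisplayName = "Oracle Destination",
    CurrentVersion = 2)]   // must be greater than the version saved in old packages
public class OracleDestination : PipelineComponent
{
    public override void PerformUpgrade(int pipelineVersion)
    {
        // Point the saved metadata at the new type and version so that
        // designer features (e.g. double-click) resolve correctly.
        ComponentMetaData.CustomPropertyCollection["UserComponentTypeName"].Value =
            this.GetType().AssemblyQualifiedName;

        // ...upgrade any old custom properties to their new shape here...

        // Record that the metadata is now at the current version.
        ComponentMetaData.Version = 2;
    }

    // ...the rest of the component (ProvideComponentProperties,
    // ProcessInput, etc.) is omitted from this sketch...
}
```

Combined with the publisher policy assembly from step 5, this lets old packages load and silently migrate to the new version the first time they are opened.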