
Sunday, March 11, 2012

Any T-SQL advice?

Okay, given my newness to SQL, and the complexity of this query, I thought I'd
run this by you for your opinion:

SELECT DISTINCT a.WeekEnding, c.lastdate, c.numlate, b.totaldate
FROM Accomplishment a
LEFT OUTER JOIN (SELECT weekending, COUNT(weekending) AS totaldate
                 FROM Accomplishment
                 WHERE EmployeeID = 50
                 GROUP BY weekending) b ON a.WeekEnding = b.weekending
LEFT OUTER JOIN (SELECT weekending, MAX(entrydate) AS lastdate,
                        COUNT(weekending) AS numlate
                 FROM Accomplishment
                 WHERE EmployeeID = 50 AND entrydate > weekending
                 GROUP BY weekending) c ON a.WeekEnding = c.weekending
ORDER BY a.WeekEnding

What I'm trying to do is find, for each pay period, whether the employee
submitted a timesheet and whether it was late (and if it was late, how many
entries were). However, the query takes a good 5 seconds, and it seems
removing the "entrydate > weekending" clause speeds things up to almost
instant; however, it also ruins the count that I really want. No idea why
that makes such a difference.

CK (c_kettenbach@.hotmail.com) writes:
> Okay, given my newness to SQL, and the complexity of this query, I
> thought I'd run this by you for your opinion:
>...
> What I'm trying to do is for each pay period find which ones the employee
> submitted a timesheet and which they were late (and if they were late, how
> many of them). However, the query takes a good 5 seconds, and it seems
> removing the "entrydate > weekending" clause speeds things up to almost
> instant, however it does ruin the count that I really want. No idea why
> that makes such a difference..

Why it takes longer with that clause I cannot really tell, as I don't know
your tables or your indexes. However, I found a simplification of
the query:

SELECT a.WeekEnding, b.lastdate, b.numlate, b.totaldate
FROM Accomplishment a
LEFT JOIN (SELECT weekending,
                  COUNT(weekending) AS totaldate,
                  SUM(CASE WHEN entrydate > weekending THEN 1 ELSE 0 END) AS numlate,
                  MAX(CASE WHEN entrydate > weekending THEN entrydate END) AS lastdate
           FROM Accomplishment
           WHERE EmployeeID = 50
           GROUP BY weekending) b ON a.WeekEnding = b.weekending
ORDER BY a.WeekEnding

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx

|||On Mon, 06 Feb 2006 17:11:38 GMT, CK wrote:

>Okay, given my newness to SQL, and the complexity of this query, I thought I'd
>run this by you for your opinion:
>
>SELECT DISTINCT a.WeekEnding, c.lastdate, c.numlate, b.totaldate
>FROM Accomplishment a LEFT OUTER JOIN
> (SELECT weekending, COUNT(weekending) AS
>totaldate
> FROM Accomplishment
> WHERE (EmployeeID = 50)
> GROUP BY weekending) b ON a.WeekEnding =
>b.weekending LEFT OUTER JOIN
> (SELECT weekending, MAX(entrydate) AS
>lastdate, COUNT(weekending) AS numlate
> FROM accomplishment
> WHERE employeeid = 50 AND entrydate >
>weekending
> GROUP BY weekending) c ON a.WeekEnding =
>c.weekending
>ORDER BY a.WeekEnding
>
>What I'm trying to do is for each pay period find which ones the employee
>submitted a timesheet and which they were late (and if they were late, how
>many of them). However, the query takes a good 5 seconds, and it seems
>removing the "entrydate > weekending" clause speeds things up to almost
>instant, however it does ruin the count that I really want. No idea why
>that makes such a difference..

Hi CK,

The query looks more complicated than it needs to be. Based on a whole
lot of assumptions about your data and without any testing (check out
www.aspfaq.com/5006 if you prefer fewer assumptions and more testing),
I'd suggest changing it to

SELECT WeekEnding,
       MAX(CASE WHEN entrydate > weekending THEN entrydate END) AS LastDate,
       COUNT(CASE WHEN entrydate > weekending THEN 'CountMe' END) AS NumLate,
       COUNT(*) AS TotalDate
FROM Accomplishment
WHERE EmployeeID = 50
GROUP BY WeekEnding
ORDER BY WeekEnding
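The single-pass query above can be exercised without a SQL Server instance: the same conditional-aggregation trick (a CASE expression inside MAX and COUNT) also works in SQLite, so here is a minimal, self-contained Python sketch. The table and column names follow the thread; the sample rows and dates are made up purely for illustration.

```python
import sqlite3

# In-memory database with the thread's Accomplishment table (invented data).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Accomplishment (
    EmployeeID INTEGER,
    WeekEnding TEXT,   -- pay-period end date
    EntryDate  TEXT    -- when the timesheet row was entered
);
INSERT INTO Accomplishment VALUES
    (50, '2006-01-06', '2006-01-05'),  -- on time
    (50, '2006-01-06', '2006-01-09'),  -- late
    (50, '2006-01-13', '2006-01-16'),  -- late
    (50, '2006-01-13', '2006-01-17');  -- late
""")

# One scan of the table: CASE feeds only the late rows into MAX and COUNT,
# while COUNT(*) still counts every row in the group.
rows = conn.execute("""
    SELECT WeekEnding,
           MAX(CASE WHEN EntryDate > WeekEnding THEN EntryDate END) AS LastDate,
           COUNT(CASE WHEN EntryDate > WeekEnding THEN 1 END)       AS NumLate,
           COUNT(*)                                                 AS TotalDate
    FROM Accomplishment
    WHERE EmployeeID = 50
    GROUP BY WeekEnding
    ORDER BY WeekEnding
""").fetchall()

for row in rows:
    print(row)
# → ('2006-01-06', '2006-01-09', 1, 2)
# → ('2006-01-13', '2006-01-17', 2, 2)
```

Because COUNT ignores NULLs, the CASE with no ELSE branch quietly excludes on-time rows from NumLate, which is the whole point of the rewrite.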

Or, if you want a report for all employees:

SELECT EmployeeID,
       WeekEnding,
       MAX(CASE WHEN entrydate > weekending THEN entrydate END) AS LastDate,
       COUNT(CASE WHEN entrydate > weekending THEN 'CountMe' END) AS NumLate,
       COUNT(*) AS TotalDate
FROM Accomplishment
GROUP BY EmployeeID, WeekEnding
ORDER BY EmployeeID, WeekEnding

--
Hugo Kornelis, SQL Server MVP

|||Seeing Hugo's queries, I realize that I made a blunder when I cut the
DISTINCT. Assuming that you want all weekendings - also those when Employee 50
did not enter anything at all - this may be better:

SELECT a.WeekEnding, b.lastdate, b.numlate, b.totaldate
FROM (SELECT DISTINCT WeekEnding FROM Accomplishment) AS a
LEFT JOIN (SELECT weekending,
                  COUNT(weekending) AS totaldate,
                  SUM(CASE WHEN entrydate > weekending THEN 1 ELSE 0 END) AS numlate,
                  MAX(CASE WHEN entrydate > weekending THEN entrydate END) AS lastdate
           FROM Accomplishment
           WHERE EmployeeID = 50
           GROUP BY weekending) b ON a.WeekEnding = b.weekending
ORDER BY a.WeekEnding
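As a quick check that the derived DISTINCT table really does preserve weeks in which employee 50 entered nothing, the same shape of query can be run in SQLite from Python. Everything below (employees, weeks, dates) is invented for the demo; only the table and column names come from the thread.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Accomplishment (
    EmployeeID INTEGER,
    WeekEnding TEXT,
    EntryDate  TEXT
);
-- Employee 50 has a row only for the first week; employee 51 worked the second,
-- so '2006-01-13' exists as a weekending but has no rows for employee 50.
INSERT INTO Accomplishment VALUES
    (50, '2006-01-06', '2006-01-09'),  -- late for employee 50
    (51, '2006-01-13', '2006-01-12');  -- a different employee, on time
""")

rows = conn.execute("""
    SELECT a.WeekEnding, b.lastdate, b.numlate, b.totaldate
    FROM (SELECT DISTINCT WeekEnding FROM Accomplishment) AS a
    LEFT JOIN (SELECT WeekEnding,
                      COUNT(WeekEnding) AS totaldate,
                      SUM(CASE WHEN EntryDate > WeekEnding THEN 1 ELSE 0 END) AS numlate,
                      MAX(CASE WHEN EntryDate > WeekEnding THEN EntryDate END) AS lastdate
               FROM Accomplishment
               WHERE EmployeeID = 50
               GROUP BY WeekEnding) AS b ON a.WeekEnding = b.WeekEnding
    ORDER BY a.WeekEnding
""").fetchall()

for row in rows:
    print(row)
# → ('2006-01-06', '2006-01-09', 1, 1)
# → ('2006-01-13', None, None, None)
```

The NULLs in the second row mark exactly the weeks the employee skipped, which the plain GROUP BY version (filtered to EmployeeID = 50 before grouping) cannot report.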

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx

|||>> What I'm trying to do is for each pay period find which of the employees submitted a timesheet and which they were late (and if they were late, how many of them). <<

Where is the pay-period table in your data model?

>> No idea why that makes such a difference.. <<

Because your schema design is a pile of crap? Duh!

Wild idea for you! Please post DDL, so that people do not have to
guess what the keys, constraints, Declarative Referential Integrity,
data types, etc. in your schema are. Sample data is also a good idea,
along with clear specifications. It is very hard to debug code when
you do not let us see it.

Newbies think that DDL is not as important as DML. They assume they
can "repair" a bad schema in code. You cannot. Ever. The best you can
hope for is a horrible nested set of joins, like Sommarskog posted.

Do you want to do it right, or just kludge and patch it?

Tuesday, March 6, 2012

Any other way of data transfer?

Hey everyone,
I have more of a general problem that I need some advice on. In the app I am writing, I am pulling data from a back-end SQL Server 2000 to a PDA running PocketPC 2002. (After long hours and lots of blood, sweat and tears, my app is working and almost finished.) The problem my team and I are facing is the fact that because I have to create a merge publication to enable data transfer, the tables used are locked up and no one can make any changes to those tables unless the publication is deleted, the changes are made, and the publication is recreated.
Does anyone know of any other way of accessing data from a SQL Server 2000 database, other than the merge replication or RDA methods, that will not lock up the tables used? Or is there a way that I am overlooking in using RDA that will not lock the tables in the database?
Any comments or suggestions?

Thanks in advance,
Eric

|||I'm sorry, perhaps I am being thick. If I may, can I try to restate your situation?

You have an App written for PocketPC. You are using the PocketPC version of SQL Server (dunno the actual product name). To enable data to then transfer to your main SQL server (on a network), you use Merge Replication.

The problem is that because it's Merge Replication, to make any schema changes to any of the articles included in the publication, you must delete and re-initialize the replication.

Have I correctly re-stated the problem, or did I miss something?

Regards,

hmscott

|||You're not being thick, my friend. Yes, the problem as you stated it is correct. Forgive me, as I am not a DBA, just an ordinary programmer who is dealing with databases right now. (Though I am learning.)

So far the best solution we (I) have come up with is a SQL script that deletes the publication, makes the changes, and then recreates the publication. But is this the only (or best) way to make any changes to the schema? The tables (articles) used in the publication are constantly locked up, and the only other way I have figured out to make changes is through the properties form of the publication (for each individual article affected).
Is there another way I am missing?

Thanks,
Eric

Monday, February 13, 2012

Any advice on where to start with learning SQL and C#?

Hi all,

Not sure if this is the correct place to post this thread, but I'm looking for advice on where to start with learning SQL and C#, as I wish to start a new career as a DBA.

Cheers,

Your best bet is to download the Express editions and buy some books. C# and SQL are obviously different languages; you'll also need to learn ADO.NET, the part of the .NET library that is for talking to databases.

In the first instance, I suggest you learn C#. That means buying a book and working through it. It also means writing console applications first, unless you have a C++ background, in which case you should pick up the basics quickly. Then you should learn enough SQL to build a database, and start working on connecting the two.

What's your background? Have you done programming before? What's prompting this move?

|||

I have a legal copy of Visual Studio 2005 Pro and a copy of SQL Server 2005 Standard, so as far as the software goes I'm sorted. I have a limited understanding of VB6 but not of the .NET Framework; I'm not sure if it will help, as I have been told the syntaxes are very different.

The move is being prompted because I am leaving the armed forces; I have gained the MCP and MCSE through them, although that was on Server 2003.

As my main aim is to become a DBA, would I be better off learning VB or C#?

|||

VB.NET is very different from VB6, enough that some people regard it as a whole new language. However, its syntax is more VB-like than C#'s is; C# syntax is C++-like.

There is no reason to choose between VB.NET and C# that has any basis in capability or performance; you should choose whatever suits you, probably VB.NET. There are differences, but none of them stop either language from being able to write the same applications as the other.

|||Thanks for the advice. I think I will move to VB.NET.

Any Advice on transact repl error

Hi All, We are using SQL 2000 servers with transactional replication to a warm
standby server. We are seeing the following error from the Log Reader agents.

Does anyone have any insight on this error?

The process could not execute 'sp_MSadd_repl_commands27hp'

TIA Scott B.

|||"Scott Bradley" <blah@.blah.comwrote in message
news:HTm_i.9413$ww2.2129@.newssvr19.news.prodigy.ne t...

> Hi All, We are using SQL 2000 servers with transactional replication to a warm
> standby server. We are seeing the following error from the Log Reader agents.
>
> Does anyone have any insight on this error?
>
> The process could not execute 'sp_MSadd_repl_commands27hp'


We saw this happen from time to time. Never found the solution.

Put a retry on the job and then alert upon COMPLETION (not just failure) and
restart when needed.


--
Greg Moore
SQL Server DBA Consulting Remote and Onsite available!
Email: sql (at) greenms.com http://www.greenms.com/sqlserver.html

|||What job? I haven't been able to determine which job is calling the sp in
question. The Log Reader agent retries 10 times and then fails. Sometimes I
can get things working by restarting the agent; other times I have to reboot
the subscribing server, which is where I also run the distribution agents.
Thanks, Scott B.
"Greg D. Moore (Strider)" <mooregr_deleteth1s@.greenms.comwrote in message
news:13jksd0rea3mabe@.corp.supernews.com...


Any advice on testing Data Warehouse solution?

We are looking at deploying a data warehouse solution soon that has been in
development for a few months. The development team has been building this for
the company. Soon they will be turning it over to the DBA group to do some
level of integration testing with the rest of our production data.
What guidelines are there for testing data warehouse solutions?
What types of things should I be looking for, other than user sign-off that
the data looks good?

Functional testing of backup/restore at full data size.
Performance/load testing at full data size.
Kevin Connell, MCDBA
----
The views expressed here are my own
and not of my employer.
----
"Doug Needham" <dneedham@.cfsloans.com> wrote in message
news:eqTaIH3xDHA.2148@.TK2MSFTNGP12.phx.gbl...
> We are looking at deploying a data warehouse solution soon that has been
> being developed for a few months. The development team has been working on
> developing this for the company. Soon they will be turning this over to the
> DBA group to do some level of integration testing with the rest of our
> production data.
> What guidelines are there for testing data warehouse solutions?
> What types of things should I be looking for other than user sign off that
> the data looks good?

Any Advice on Monitoring Performance of Processing Cubes in AS2000

Hi,
I want to monitor an AS2000 installation to see if and where we have any
bottlenecks in the cube processing. Does anyone know of a good document or
advice to offer. I have already read the SQL Server 2000 Resource Kit on
monitoring AS. It lists the counters to examine, but I'm not really sure what
would be good figures and what would be poor ones.
We are using a 4 CPU box running Win2003.
The fact tables are stored in a SQL Server 2000 on the same box.
Thanks in advance.
Standard best practices to correctly optimize your server:
1. Process & monitor.
2. Change settings (like the memory properties of AS).
3. Process & monitor again...
You'll find the best parameters for your system.
Also, try to process your partitions/cubes in parallel, and put the distinct
count measures in a dedicated cube.
Your best counter is the overall time taken to do your processing ;-)
"Al" <Al@.discussions.microsoft.com> wrote in message
news:BA3F3FA0-BE1D-4437-BB9B-161577221AEF@.microsoft.com...
> Hi,
> I want to monitor an AS2000 installation to see if and where we have any
> bottlenecks in the cube processing. Does anyone know of a good document or
> advice to offer. I have already read the SQL Server 2000 Resource Kit on
> monitoring AS. It lists the counters to examine, but I'm not really sure
> what
> would good figures and what would be poor ones.
> We are using a 4 CPU box running Win2003.
> The fact tables are stored in a SQL Server 2000 on the same box.
> Thanks in advance.
|||First, these two white papers:
http://www.microsoft.com/technet/pro.../anservog.mspx
http://www.microsoft.com/technet/pro.../ansvcspg.mspx
Second, configure your server to capture the system-wide processing log file
(run Analysis Manager, right-click on the server, and look at the "Logging"
tab of the server properties). This allows you to gather statistics on when
and how long processing took.
Hope that helps.
Dave Wickert [MSFT]
dwickert@.online.microsoft.com
Program Manager
BI Systems Team
SQL BI Product Unit (Analysis Services)
This posting is provided "AS IS" with no warranties, and confers no rights.
"Al" <Al@.discussions.microsoft.com> wrote in message
news:BA3F3FA0-BE1D-4437-BB9B-161577221AEF@.microsoft.com...
> Hi,
> I want to monitor an AS2000 installation to see if and where we have any
> bottlenecks in the cube processing. Does anyone know of a good document or
> advice to offer. I have already read the SQL Server 2000 Resource Kit on
> monitoring AS. It lists the counters to examine, but I'm not really sure
> what
> would good figures and what would be poor ones.
> We are using a 4 CPU box running Win2003.
> The fact tables are stored in a SQL Server 2000 on the same box.
> Thanks in advance.


Saturday, February 11, 2012

Answers and the truth.. Need advice on replication..

First off I have...
A) I have a database with no primary keys.
B) I have two machines set up with SQL 2000 Advanced.
C) One with my data on it running in production right now.
D) An app which uses ODBC to connect.
What are my options and my best route for redundancy and high availability
of this database?
Thanks,
Jeff
That depends.
For automatic failover you need clustering. Clustering is very expensive.
Clustering's latency can be 1-2 minutes. Clustering requires SQL 2000
Enterprise, which I assume is what you mean by SQL 2000 Advanced.
With log shipping you get redundancy at a low cost but higher latency.
Latency typically is 5 minutes, but can be much less depending on the size
of the logs you are restoring. There is no automatic failover with log
shipping. SQL 2000 Enterprise edition has a log shipping wizard, but you can
roll your own solution. Here is an example:
http://www.sql-server-performance.co...g_shipping.asp
You can also use replication. By default transactional replication needs
primary keys on the tables you are replicating, but there are ways around
this. Please see this link:
http://groups.google.com/groups?selm...&output=gplain
Other than that, you can use merge replication.
Latency is typically around 1 minute.
With replication there is no automatic failover.
Hilary Cotter
Looking for a SQL Server replication book?
http://www.nwsu.com/0974973602.html
"Jeff Bade" <jbade@.collidingstar.com> wrote in message
news:8Gdbd.7702$5b1.5654@.newssvr17.news.prodigy.co m...
> First off I have...
> A) I have a database with no primary keys ..
> B) I have two machines setup with SQL 2000 Advanced..
> C) One with my data on it running in production right now...
> D) An app which uses ODBC to connect..
> What are my options and my best route for redundancy and high availability
> of this database//?
> Thanks,
> Jeff
>
|||Thanks for all the information. This is a big help.
As far as clustering goes, are there any problems I need to look at here?
Especially with no primary keys, or login/security rights?
Thanks,
Jeff
"Hilary Cotter" <hilary.cotter@.gmail.com> wrote in message
news:%23zIfPbesEHA.2668@.TK2MSFTNGP12.phx.gbl...
> That depends.
> For automatic failover you need clustering. Clustering is very expensive.
> Clustering's latency can be 1-2 minutes. Clustering requires SQL 2000
> Enterprise, which I assume is what you mean by SQL 2000 Advanced.
> With log shipping you get redundancy at a low cost but higher latency.
> Latency typically is 5 minutes, but can be much less depending on the size
> of the logs you are restoring. There is no automatic failover with log
> shipping. SQL 2000 Enterprise edition has a log shipping wizard, but you can
> roll your own solution. Here is an example:
> http://www.sql-server-performance.co...g_shipping.asp
> You can also use replication. By default transactional replication needs
> primary keys on the tables you are replicating, but there are ways around
> this. Please see this link:
>
http://groups.google.com/groups?selm...&output=gplain
> Other than that, you can use merge replication.
> Latency is typically around 1 minute.
> With replication there is no automatic failover.
> --
> Hilary Cotter
> Looking for a SQL Server replication book?
> http://www.nwsu.com/0974973602.html
>