
Thursday, March 22, 2012

Any workaround for the 4000-character limit in expressions?

Hello,

I have to build a dynamic SQL statement in an Execute SQL Task.

The SQL statement is well over 4,000 characters.

The expression builder complains about the length of the expression.

Is there any workaround for this limitation?

Thanks a lot!

Concatenate multiple varchar(4000) variables together:

DECLARE @sql1 varchar(4000), @sql2 varchar(4000), @sql3 varchar(4000)

SET @sql1 = 'stuff'
SET @sql2 = 'more stuff'
SET @sql3 = 'even more stuff'

EXEC (@sql1 + @sql2 + @sql3)

Are you building the expression using variables and other things, or can you stick that SQL statement in a view?

Instead of using an expression, you could build your SQL statement in a Script Task and write the result out to a file. The Execute SQL Task can read statements from a file connection. The OLE DB Source component cannot, but you could use the Execute SQL Task to put the results in a variable, then shred the results in your Data Flow using a Script Source Component.


Tuesday, March 20, 2012

Any way to prevent SSAS memory from paging?

This morning msmdsrv.exe looked like the following in Task Manager:

Memory Usage: 600,000 K

VM Size: 2,500,000 K

If I understand correctly, that means that a good deal of SSAS memory has been swapped to disk by the OS. Is there any way to prevent this? (I have read about the Lock Pages In Memory privilege at http://msdn2.microsoft.com/en-us/library/ms179301.aspx, which lets SQL Server prevent the OS from paging sqlservr.exe memory. I don't suppose there's an equivalent SSAS setting.)

In particular, the symptom I'm seeing is that after we finish processing, all that paged memory has to be loaded back into RAM before the transaction can be committed. Committing the transaction is usually very quick, but when most of the SSAS process memory is paged out, it takes quite a while, during which I can see the Memory Usage number in Task Manager growing. (I don't think it's a blocked transaction commit.)

Looks like maybe there is a way to do this in SP2. Can anyone explain the PreAllocate setting further?

http://www.sdsqlug.org/presentations/November2006/November_2006_64-bit_SQL_Server.pdf (page 36)
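Based on that presentation, my understanding (an assumption on my part, not something I can vouch for from official documentation) is that PreAllocate is a server property under the Memory node of msmdsrv.ini, expressed as a percentage of physical memory that Analysis Services grabs and holds at startup instead of growing and shrinking its working set. Something along these lines:

```
<!-- Hypothetical msmdsrv.ini fragment: value and placement are assumptions.
     A value of 20 would ask SSAS to reserve roughly 20% of physical RAM
     at service startup, which should reduce how much the OS can page out. -->
<Memory>
    <PreAllocate>20</PreAllocate>
</Memory>
```

If that reading is right, the trade-off is that the reserved memory is unavailable to other processes on the box even when SSAS is idle, so it makes the most sense on a dedicated SSAS server.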

Monday, March 19, 2012

Any way to check if a log file (.txt) already exists before BCPing a file?

I want to have one log file for each task run within a job. There are several tasks for each job that may be handled by different people, so I would like to have a running log file on the network that can be checked by supervisors. I know how to create the log file:

EXEC master..xp_cmdshell 'bcp "SELECT * FROM ##logFile" queryout "c:\log.txt" -c -T'

The problem is that the next time a task is run, the original text file gets overwritten by the new one. I came up with a workaround: use bcp to export a temporary log file, append that file to my existing log file, then delete the temporary file, and that works fine:

EXEC master..xp_cmdshell 'copy log_temp.txt + log.txt'

The problem is that the second command obviously fails if the main log file does not exist.

Ideally I would like to check whether the log file exists: if not, create it; if it does exist, append to it.

Having a master log table stored in SQL has been suggested and shot down. They want one text file per job run.

Any help would be greatly appreciated.

Thanks.

Tim

Write VBScript code (for example) and run it in a SQL Agent job. Don't use xp_cmdshell for this sort of thing: it is less flexible and a security vulnerability. It takes only a few lines of VBScript to check for the file. Alternatively, you can write a CMD script and run it in a SQL Agent job.
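As a sketch of that VBScript approach (the paths below are placeholder examples, not taken from the original post), something like this could run as a CScript step in a SQL Agent job:

```
' Sketch: append a temporary log to a master log, creating the master if missing.
' File paths are hypothetical placeholders.
Const ForAppending = 8

Dim fso, logFile, masterLog, tempFile, tempText
Set fso = CreateObject("Scripting.FileSystemObject")

masterLog = "\\server\share\log.txt"
tempFile  = "C:\logs\log_temp.txt"

' Create the master log if it does not already exist.
If Not fso.FileExists(masterLog) Then
    fso.CreateTextFile(masterLog).Close
End If

' Append the contents of the temp log, then delete the temp file.
tempText = fso.OpenTextFile(tempFile).ReadAll
Set logFile = fso.OpenTextFile(masterLog, ForAppending)
logFile.Write tempText
logFile.Close
fso.DeleteFile tempFile
```

The FileExists check is what the copy-and-append approach was missing: the script creates the master log on first use, so subsequent appends never fail.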

Thursday, March 8, 2012

Any solution for that?

I've got a DTS package which loads data to several SQL 2000 servers using the Copy SQL Server Objects Task. I'd like to add another task pointing at a SQL Server 2005 instance, but I can't, because it demands SMO.

It seems that such a task needs DMO. I know there are other ways to do this kind of thing in DTS, but I wonder if there is any trick or shortcut to get around this barrier.

How odd: if you use the DTS Import/Export Wizard from Enterprise Manager, SQL Server 2005 is accepted.