Thursday, March 22, 2012
Anybody seen this error? - 37000 - Storage Allocation record not a
> in the SQL Server error log. Does Stopping/Starting SQL server help? Have you
> tried defragmenting the disc? Are you using a fixed value for file growth on
> all the databases (including tempdb)?
> John
It's SQL Server 7 on NT.
Stopping and starting SQL Server does not resolve the problem.
Defragmentation is not an issue either. We're really stumped, and have
opened a call with the makers of the software that is reporting the
error from SQL Server.
> "jonathan.beckett" wrote:
Hi Jonathan
You should be able to run SQL profiler to find out what commands are being
sent to SQL Server and possibly track down what is causing it.
John
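For reference, a trace can also be scripted rather than run from the Profiler GUI. A minimal sketch using the SQL Server 2000 sp_trace procedures (SQL Server 7, as used here, predates them, so on 7 the Profiler GUI John suggests is the practical route); the file path is hypothetical:

DECLARE @TraceID int
-- Create a trace writing to a file (path is hypothetical)
EXEC sp_trace_create @TraceID OUTPUT, 0, N'c:\traces\commands'
-- Capture the text of completed batches
-- (event 12 = SQL:BatchCompleted, column 1 = TextData)
EXEC sp_trace_setevent @TraceID, 12, 1, 1
-- Start the trace
EXEC sp_trace_setstatus @TraceID, 1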
"jonathan.beckett" wrote:
>
> It's SQL Server 7 on NT.
> Stopping and starting SQL Server does not resolve the problem.
> Defragmentation is not an issue either. We're really stumped, and have
> opened a call with the makers of the software that is reporting the
> error from SQL Server.
>
>
>
2012年3月19日星期一
any way to record login failures?
sql2000 sp3a
i'd like to keep track of login failures in a table in addition to the
sql log. is there any way to do this?
i've created an alert for error 18456, login failed for user '%ls'.
if i configure the alert to call a job, how do i get that error message
into the job so that i can insert it into a table?
also, i'd like to record the hostname or ip address of the client
machine from which the login failure occurs. i know sysprocesses has
that info once a user gets logged in, but where is that info if the
login fails?
Another option is to use a trace (or profiler) to monitor
for failed logins. You can import the trace file into a
table.
The IP and Host name won't be directly available for failed
logins. Host name isn't that reliable anyway as it's
controlled by the client. For the ip address, you would need
to capture this using a network tool.
-Sue
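A minimal sketch of the import step Sue describes, assuming SQL Server 2000 and a trace file that captured the Audit Login Failed event; the path and table name are hypothetical:

SELECT *
INTO dbo.FailedLogins   -- hypothetical table name
FROM ::fn_trace_gettable('c:\traces\failedlogins.trc', DEFAULT)
WHERE EventClass = 20   -- 20 = Audit Login Failed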
On Wed, 02 Jun 2004 09:04:30 -0500, ch <ch@.dontemailme.com>
wrote:
>sql2000 sp3a
>i'd like to keep track of login failures in a table in addition to the
>sql log. is there any way to do this?
>i've created an alert for error 18456, login failed for user '%ls'.
>if i configure the alert to call a job, how do i get that error message
>into the job so that i can insert it into a table?
>also, i'd like to record the hostname or ip address of the client
>machine from which the login failure occurs. i know sysprocesses has
>that info once a user gets logged in, but where is that info if the
>login fails?
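As for getting the error message into the job, one possibility (a sketch, not taken from the thread) is SQL Server Agent's alert tokens: when an alert invokes a job, [A-ERR] and [A-MSG] in the job step text are replaced with the error number and message text. The table and columns here are hypothetical:

-- T-SQL job step for a job fired by the error-18456 alert.
-- [A-ERR] and [A-MSG] are SQL Server Agent alert tokens, replaced at run time;
-- dbo.LoginFailures is a hypothetical table.
INSERT INTO dbo.LoginFailures (ErrorNumber, ErrorMessage, LoggedAt)
VALUES ([A-ERR], '[A-MSG]', GETDATE())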
Any way to check if a log file (.txt) already exists before BCPing a file?
I want to have one log file for each task run within a job. There are several tasks for each job that may be handled by different people, so I would like to have a running log file on the network that can be checked by supervisors. I know how to create the log file:
EXEC master..xp_cmdshell 'bcp "SELECT * FROM ##logFile" queryout "c:\log.txt" -c -T'
Problem is, the next time a task is run, the original text file gets written over by the new one. I came up with a solution for that by using bcp to export a temporary log file, then appending that file to my existing log file, then killing the temporary file, and that works fine:
EXEC master..xp_cmdshell 'copy log_temp.txt + log.txt'
The problem is the second command obviously fails if the main log file does not exist.
Ideally I would like to check to see if the log file exists, if not create it, if it does exist, append to it.
Having a master log table stored in SQL has been suggested and shot down. They want one text file per job run.
Any help would be greatly appreciated.
Thanks.
Tim
Write VBScript code (for example) and run it in a SQLAgent job. Don't use xp_cmdshell for this sort of thing; it is less flexible and a security vulnerability. It takes a few lines of code in VBScript to check for a file. Or you can write a CMD script and run it in a SQLAgent job.
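If the T-SQL route is kept despite that advice, here is a sketch of the check-then-append logic using the undocumented xp_fileexist extended procedure; the file names follow the post, and the temp-file location is assumed to be c:\:

DECLARE @exists int
-- @exists is set to 1 if the file is found
EXEC master..xp_fileexist 'c:\log.txt', @exists OUTPUT
IF @exists = 1
    -- append the temporary log to the existing one
    EXEC master..xp_cmdshell 'copy c:\log.txt + c:\log_temp.txt c:\log.txt'
ELSE
    -- no log yet, so the temporary file becomes the log
    EXEC master..xp_cmdshell 'copy c:\log_temp.txt c:\log.txt'
-- remove the temporary file either way
EXEC master..xp_cmdshell 'del c:\log_temp.txt'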
Any way to attach a dual-log-file database without the log files?
Odd problem I ran into today - we'll restore from backups, but I'd
like to know if there's another way around it.
Our problem is that a log file got so big it filled up the drive.
Couldn't even switch it to simple mode. So the thought was to detach
it, then attach it without the log file. However, it appears that we
had two log files in that database, so we get the following when
trying to use sp_attach_single_file_db:
File activation failure. The physical file name "E:\generic\generic_log.ldf"
may be incorrect.
The log was not rebuilt because there is more than one log file.
Msg 1813, Level 16, State 2, Line 1
Could not open new database 'Generic'. CREATE DATABASE is aborted.
Any thoughts? Thanks.

Did you try sp_attach_single_file_db? According to BOL for 2005, this should allow attaching a db
having one data file but multiple log files. Actually, the sp_attach* procedures are deprecated, so use
CREATE DATABASE ... FOR ATTACH (or in this case FOR ATTACH_REBUILD_LOG) instead. Also, see the
information in BOL for CREATE DATABASE and ATTACH_REBUILD_LOG.
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://sqlblog.com/blogs/tibor_karaszi
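A minimal sketch of the ATTACH_REBUILD_LOG route Tibor describes. The database name and log path come from the error message above; the data file path is hypothetical, and the log can only be rebuilt if the database was shut down cleanly:

CREATE DATABASE Generic
ON (FILENAME = 'E:\generic\generic_data.mdf')  -- hypothetical .mdf path
FOR ATTACH_REBUILD_LOG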
"M Bourgon" <bourgon@.gmail.com> wrote in message
news:f5c0e2d4-3530-461d-a2e0-9238074f7c0f@.21g2000hsj.googlegroups.com...
> Odd problem I ran into today - we'll restore from backups, but I'd
> like to know if there's another way around it.
> Our problem is that a log file got so big it filled up the drive.
> Couldn't even switch it to simple mode. So the thought was to detach
> it, then attach it without the log file. However, it appears that we
> had two log files in that database, so we get the following when
> trying to use sp_attach_single_file_db:
> File activation failure. The physical file name "E:\generic
> \generic_log.ldf" may be incorrect.
> The log was not rebuilt because there is more than one log file.
> Msg 1813, Level 16, State 2, Line 1
> Could not open new database 'Generic'. CREATE DATABASE is aborted.
>
> Any thoughts? Thanks.
Thursday, March 8, 2012
Any side effects from backing up DB and log at the same time?
Do you know of any other possible dangers, apart from transaction log backup
not being made and waiting for the full DB backup to complete, when letting
the transaction log backup run during the full DB backup?
What else besides the failing transaction log backup can happen when making
the backup during the time the database is in simple recovery mode?
-- Many thanks. Oskar.
In simple recovery mode a t-log backup is useless.
"Oskar" <Oskar@.discussions.microsoft.com> wrote in message
news:4B5DCB3A-80B3-4E06-9633-9E6886BD9273@.microsoft.com...
> Hi,
> Do you know of any other possible dangers, apart from transaction log
> backup
> not being made and waiting for the full DB backup to complete, when
> letting
> the transaction log backup run during the full DB backup?
> What else besides the failing transaction log backup can happen when
> making
> the backup during the time the database is in simple recovery mode?
> -- Many, thanks. Oskar.
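To make that concrete: a log backup against a simple-recovery database does not silently do nothing, it fails. A sketch with a hypothetical database name:

BACKUP LOG Sales TO DISK = 'c:\backups\sales_log.trn'
-- Fails with error 4208: "The statement BACKUP LOG is not allowed
-- while the recovery model is SIMPLE."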
Any Reporting Service Log experts out there?
We are working on developing some statistical reports to help us to better
understand how our users are (or are not) leveraging our Reporting Services
implementation. We have been writing execution log data to a database based
on the Msft-provided SSIS package that pulls data from the Report Server
database, and I've discovered a hole in the information that I'm hoping
someone can help me fix.
We use data-driven subscriptions pretty heavily, which of course require
cached data credentials. In the "ExecutionLogs" table, these entries appear
as being requested by "System" and the user shows our proxy account. If you
review the "ReportServerService_..." log on the report server, you can see
the actual detail surrounding the processing of the subscription, but I can't
find a way to correlate these entries back to the ExecutionLog table. The
"ExecutionLogId" in the ExecutionLog table doesn't reference any of the
uniqueidentifiers that you see in the text ReportServerService log.
My end goal is to be able to update the User column in the ExecutionLog
table with the user who was the actual recipient of the report as identified
in the ReportServerService log file.
Anyone tackle this yet or have any ideas as to how it might be accomplished?

I'll give you a big hint.. :)
Remember, you are not limited to just select statements in your datasets for
your reports.
I have a dataset that does INSERTS for my data driven reports... (since it
is data driven, you KNOW who the users are that are going to get the
reports)
cheers!
=-Chris
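A sketch of the trick Chris is hinting at, with hypothetical table and column names: the data-driven subscription's dataset first records who is about to receive the report, then returns the recipient rows the subscription consumes:

-- Dataset query for a data-driven subscription (all names hypothetical)
INSERT INTO dbo.SubscriptionAudit (ReportName, Recipient, ExecutionTime)
SELECT 'MyReport', EmailAddress, GETDATE()
FROM dbo.ReportRecipients

SELECT EmailAddress
FROM dbo.ReportRecipients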
"KS" <ks@.community.nospam> wrote in message
news:C44736CF-304A-4478-98E9-01BDA5A7C0B2@.microsoft.com...
> We are working on developing some statistical reports to help us to better
> undertand how our users are (or are not) leveraging our Reporting Services
> implementation. We have been writing execution log data to a database
> based
> on the Msft provided SSIS package that pulls data from the Report Server
> database and I've discovered a hole in the information that I'm hoping
> someone can help me fix.
> We use data drive subscriptions pretty heavily which of course require
> cached data credentials. In the "ExecutionLogs" table, these entries
> appear
> as being requested by "System" and the user shows our proxy account. If
> you
> review the "ReportServerService_..." log on the report server, you can
> see
> the actual detail surrounding the processing of the subscription, but I
> can't
> find a way to correlate these entries back to the ExecutionLog table. The
> "ExecutionLogId" in the ExecutionLog table doesn't reference any of the
> uniqueidentifiers that you see in the text ReportServerService log.
> My end goal is to be able to update the User column in the ExecutionLog
> table with the user who was the actual recipient of the report as
> identified
> in the ReportServerService log file.
> Anyone tackle this yet or have any ideas as to how it might be
> accomplished?

Thanks for the hint, Chris.
Unless I'm missing something, however, that still doesn't provide you with
the ability to cross reference the actual report delivery with the
ExecutionLog record to be able to access the other metrics that are being
captured (TimeDataRetrieval, TimeProcessing, TimeRendering, ByteCount,
RowCount...)
Need to find a way to accurately identify a specific data driven
subscription execution with the corresponding ExecutionLog entry.
Any other ideas? Like I said, all the necessary info is in
ReportServerServices_xxx.log, but I don't see how I can accurately tie that
back to the appropriate ExecutionLogId in the ExecutionLog table. The text
file includes a number of uniqueidentifiers, but none of them match up to
the ExecutionLog. Can't go by exact time either as the text file is not
precise enough in the event of batch data driven subscription processing.
"Chris Conner" wrote:
> I'll give you a big hint.. :)
> Remember, you are not limited to just select statements in your datasets for
> your reports.
> I have a dataset that does INSERTS for my data driven reports... (since it
> is data driven, you KNOW who the users are that are going to get the
> reports)
> cheers!
> =-Chris
>
> "KS" <ks@.community.nospam> wrote in message
> news:C44736CF-304A-4478-98E9-01BDA5A7C0B2@.microsoft.com...
> > We are working on developing some statistical reports to help us to better
> > undertand how our users are (or are not) leveraging our Reporting Services
> > implementation. We have been writing execution log data to a database
> > based
> > on the Msft provided SSIS package that pulls data from the Report Server
> > database and I've discovered a hole in the information that I'm hoping
> > someone can help me fix.
> >
> > We use data drive subscriptions pretty heavily which of course require
> > cached data credentials. In the "ExecutionLogs" table, these entries
> > appear
> > as being requested by "System" and the user shows our proxy account. If
> > you
> > review the "ReportServerService_..." log on the report server, you can
> > see
> > the actual detail surrounding the processing of the subscription, but I
> > can't
> > find a way to correlate these entries back to the ExecutionLog table. The
> > "ExecutionLogId" in the ExecutionLog table doesn't reference any of the
> > uniqueidentifiers that you see in the text ReportServerService log.
> >
> > My end goal is to be able to update the User column in the ExecutionLog
> > table with the user who was the actual recipient of the report as
> > identified
> > in the ReportServerService log file.
> >
> > Anyone tackle this yet or have any ideas as to how it might be
> > accomplished?
>
Hmm... this is a nice challenge.
Let's try this - forget the log for the moment - I know I could look this
up - but I'm not at a report server at the moment - does the report
Globals!ExecutionTime match the time stored in the ExecutionLog? I mean, you
know the report name ...
Here is what I was thinking:
select c.name, b.executiontime
from reportserver.dbo.catalog c
inner join reportserver.dbo.executionlog ex
ON (c.ItemID = ex.ReportID)
inner join BobTable b on (c.name = b.ReportName)
where b.executiontime between ex.TimeStart and ex.TimeEnd
Where "BobTable" is your table that you store the report name and report
execution time when the report runs.
=-Chris
"KS" <ks@.community.nospam> wrote in message
news:2E0EE652-8A08-4D89-9DB3-50EB8B367FF6@.microsoft.com...
> Thanks for the hint, Chris.
> Unless I'm missing something, however, that still doesn't provide you with
> the ability to cross reference the actual report delivery with the
> ExecutionLog record to be able to access the other metrics that are being
> captured (TimeDataRetrieval, TimeProcessing, TimeRendering, ByteCount,
> RowCount...)
> Need to find a way to accurately identify a specific data driven
> subscription execution with the corresponding ExecutionLog entry.
> Any other ideas? Like I said, all the necessary info is in
> ReportServerServices_xxx.log, but I don't see how I can accurately tie
> that
> back to the appropriate ExecutionLogId in the ExecutionLog table. The
> text
> file inlcudes a number of uniqueidentifiers, but none of which match up to
> the ExecutionLog. Can't go by exact time either as the text file is not
> precise enough in the event of batch data driven subscription processing.
> "Chris Conner" wrote:
>> I'll give you a big hint.. :)
>> Remember, you are not limited to just select statements in your datasets
>> for
>> your reports.
>> I have a dataset that does INSERTS for my data driven reports... (since
>> it
>> is data driven, you KNOW who the users are that are going to get the
>> reports)
>> cheers!
>> =-Chris
>>
>> "KS" <ks@.community.nospam> wrote in message
>> news:C44736CF-304A-4478-98E9-01BDA5A7C0B2@.microsoft.com...
>> > We are working on developing some statistical reports to help us to
>> > better
>> > undertand how our users are (or are not) leveraging our Reporting
>> > Services
>> > implementation. We have been writing execution log data to a database
>> > based
>> > on the Msft provided SSIS package that pulls data from the Report
>> > Server
>> > database and I've discovered a hole in the information that I'm hoping
>> > someone can help me fix.
>> >
>> > We use data drive subscriptions pretty heavily which of course require
>> > cached data credentials. In the "ExecutionLogs" table, these entries
>> > appear
>> > as being requested by "System" and the user shows our proxy account.
>> > If
>> > you
>> > review the "ReportServerService_..." log on the report server, you can
>> > see
>> > the actual detail surrounding the processing of the subscription, but I
>> > can't
>> > find a way to correlate these entries back to the ExecutionLog table.
>> > The
>> > "ExecutionLogId" in the ExecutionLog table doesn't reference any of the
>> > uniqueidentifiers that you see in the text ReportServerService log.
>> >
>> > My end goal is to be able to update the User column in the ExecutionLog
>> > table with the user who was the actual recipient of the report as
>> > identified
>> > in the ReportServerService log file.
>> >
>> > Anyone tackle this yet or have any ideas as to how it might be
>> > accomplished?
>>
Any report runs once and then I have to log back in to my app
We have an asp.net app that calls the report server for our reports. We are
running into an issue; we don't even understand why it's happening, let alone
how to fix it...
You can only run one report and then you have to logout of the app and log
back in to run another report. This is with any of our reports! They all
work, but only the first report run succeeds. After that, none of the
reports work. Log out and log back in, choose another report, and it
works...?
Here's our code that is basically the same for every report...
dstDataSet = New DataSet
Dim strSPName As String = "spSRS_Appeal_TaxAgents"
'Passing the parameter to the SQL stored procedure
Dim aryParams(1) As SqlParameter
aryParams(0) = New SqlClient.SqlParameter("@TaxYear", SqlDbType.Int)
aryParams(0).Value = CType(strTaxYear, Integer)
aryParams(1) = New SqlClient.SqlParameter("@Cycle", SqlDbType.Int)
aryParams(1).Value = CType(strCycle, Integer)
Try
dstDataSet = SqlHelper.ExecuteDataset(cnnConn, strSPName,
aryParams)
Dim intCount As Integer = dstDataSet.Tables(0).Rows.Count
If intCount < 1 Then
lblInfo.Text = "Report records not found! Canceled operation."
Exit Sub
End If
Catch exn As Exception
lblInfo.Text = "Error. Failed to retrieve records! " & exn.Message
End Try
'Create the URL string to render this report from the SQL report server
'SPParam is the parameter which is submitted to the SQL stored procedures
Dim str1, str2, str3, str4, strURL As String
str1 = "http://tennessee/reportserver?/astransc/AppealTaxAgents"
str2 = "&rs:Command=Render&rs:Format=PDF"
str3 = strTaxYear
str4 = strCycle
Dim strPath As New System.Text.StringBuilder
strPath.Append(str1)
strPath.Append(str2)
strPath.Append("&TaxYear=").Append(str3)
strPath.Append("&Cycle=").Append(str4)
strURL = strPath.ToString
Response.Redirect(strURL)
The report is in PDF format and when the report runs it pops up a form
asking if we want to open, save, or cancel the PDF file...
The programmer who wrote this doesn't have this issue on his machine.
However, on the test server this is occurring to us. In fact, the programmer
asked for assistance because on his machine it would, on rare occasion, fail
to create the PDF file, but he couldn't find a cause for it. On our test
server it occurs like clockwork. Log in, run one report (any of them), and
then no other report, including the one you just called, will work until you
log back in....
Looking at the report server logs, the subsequent requests are not even
making it to the report server. Any ideas? We were wondering if there was
something about the Response.Redirect. I can take the string created by
the code, type it into a URL address bar, and it works every time.
Thanks,
Kevin

I have somewhat figured out what is occurring...
The viewstate is being clobbered when this occurs (not entirely sure how,
but I suspect it is occurring because the redirect does not physically
change the page I am on; it only creates a "do you want to open, save, or
cancel this PDF file" dialog box). I can only assume that the
Response.Redirect is the cause. I am going to figure out another way to
accomplish this task.
Kevin
"Kevin" wrote:
> We have an asp.net app that calls the report server for our reports. We are
> running into an issue we don't even understand why it's happening let alone
> fixing it...
> You can only run one report and then you have to logout of the app and log
> back in to run another report. This is with any of our reports! They all
> work but only the first report runs the first time. After that all of the
> reports don't work. Logout and log back in and choose another report and it
> works...?
> Here's our code that is basically the same for every report...
> dstDataSet = New DataSet
> Dim strSPName As String = "spSRS_Appeal_TaxAgents"
> 'Passing the parameter to the SQL stored procedure
> Dim aryParams(1) As SqlParameter
> aryParams(0) = New SqlClient.SqlParameter("@.TaxYear", SqlDbType.Int)
> aryParams(0).Value = CType(strTaxYear, Integer)
> aryParams(1) = New SqlClient.SqlParameter("@.Cycle", SqlDbType.Int)
> aryParams(1).Value = CType(strCycle, Integer)
> Try
> dstDataSet = SqlHelper.ExecuteDataset(cnnConn, strSPName,
> aryParams)
> Dim intCount As Integer = dstDataSet.Tables(0).Rows.Count
> If intCount < 1 Then
> lblInfo.Text = "Report records not found! Canceled operation."
> Exit Sub
> End If
> Catch exn As Exception
> lblInfo.Text = "Error. Failed to retrieve records! " & exn.Message
> End Try
> 'Create the URL string to render this report from the SQL report
> server
> 'SPParam is the parameter which is submitted to the SQL sotred
> procedures
> Dim str1, str2, str3, str4, strURL As String
> str1 = "http://tennessee/reportserver?/astransc/AppealTaxAgents"
> str2 = "&rs:Command=Render&rs:Format=PDF"
> str3 = strTaxYear
> str4 = strCycle
> Dim strPath As New System.Text.StringBuilder
> strPath.Append(str1)
> strPath.Append(str2)
> strPath.Append("&TaxYear=").Append(str3)
> strPath.Append("&Cycle=").Append(str4)
> strURL = strPath.ToString
> Response.Redirect(strURL)
> The report is in PDF format and when the report runs it pops up a form
> asking if we want to open, save, or cancel the PDF file...
> The programmer who wrote this doesn't have this issue on his machine.
> However, on the test server this is occuring to us. In fact the programmer
> asked for assistance because on his machine it on a rare occasion would not
> create the PDF file but he couldn't find a cause for it. On our test server
> it occurs like clock work. Log in, run one report (any of them) and then no
> other report including the one you just called won't work until you log back
> in....
> In looking at the report server logs the subsequent requests are not even
> making it to the report server? Any ideas? We were wondering if there was
> something about the response.redirect? I can take the string created by
> the code type it into a url address and it works every time?
> Thanks,
> Kevin
Saturday, February 25, 2012
any link that talks about different DR options
We want to set up Disaster Recovery for our SQL databases and want to
evaluate the different technologies out there that can do so (log
shipping, mirroring, stretch clustering, SAN replication, 3rd party, etc.)
and wanted to get a quick overview of the pros and cons of each approach
that may be listed on a link already.
Can someone let me know if there is one out there and send me the path?
Thanks

Hi Hassan,
Check these links:
http://technet.microsoft.com/en-us/sqlserver/bb331801.aspx
http://support.microsoft.com/kb/822400
Jonathan
Hassan wrote:
> We want to set up Disaster Recovery for our SQL databases and want to
> evaluate the different technologies out there that can do so..( log
> shipping, mirroring, stretch clustering, SAN replication, 3rd party,etc.)
> and wanted to get a quick overview of the pros and cons of each approach
> that may be listed on a link already.
> Can someone let me know if there is one out there and send me the path ?
> Thanks
>
Monday, February 13, 2012
Any blocking when we extend the data files?
Will there be any blocking if I grow the data files and log files from, say,
5GB to 25GB on a highly transactional system? Let me know your thoughts.

Hi
No locks are taken at data level, but the high I/O does affect performance
on queries.
Do it when usage is lowest and, as always, test it on a non-production
system before you do it.
Regards
Mike
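For completeness, a sketch of the grow operation itself, with hypothetical database and logical file names. Log files are always zero-initialized when grown, and data files are too unless instant file initialization is available (SQL Server 2005 and later), which is where the I/O cost Mike mentions comes from:

ALTER DATABASE Sales
MODIFY FILE (NAME = Sales_Data, SIZE = 25GB)

ALTER DATABASE Sales
MODIFY FILE (NAME = Sales_Log, SIZE = 25GB)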
"Hassan" wrote:
> Will there be any blocking ifi grow the data files and Log files from say
> 5GB to 25GB on a highly transactional system ? Let me know your thoughts.
>
>
Any Advice on transact repl error
Hi All, We are using SQL2000 servers with transact replication to a warm
stand-by server. We are seeing the following error from the Log reader
agents.
Does anyone have any insight on this error?
The process could not execute 'sp_MSadd_repl_commands27hp'
TIA Scott B.

"Scott Bradley" <blah@.blah.com> wrote in message
news:HTm_i.9413$ww2.2129@.newssvr19.news.prodigy.net...
> Hi All, We are using SQL2000 servers with transact replication to a warm
> stand-by server. We are seeing the following error from the Log reader
> agents.
> Does anyone have any insight on this error?
> The process could not execute 'sp_MSadd_repl_commands27hp'
> TIA Scott B.
We saw this happen from time to time. Never found the solution.
Put a retry on the job and then alert upon COMPLETION (not just failure) and
restart when needed.
--
Greg Moore
SQL Server DBA Consulting Remote and Onsite available!
Email: sql (at) greenms.com http://www.greenms.com/sqlserver.html

What job? I haven't been able to determine which job is calling the sp in
question.
The log reader agent retries 10 times then 'fails'. Sometimes I can get things
working by restarting the agent. Other times I have to reboot the subscribing
server, which is where I also run the distribution agents.
Thanks, Scott B.
"Greg D. Moore (Strider)" <mooregr_deleteth1s@.greenms.com> wrote in message
news:13jksd0rea3mabe@.corp.supernews.com...
> "Scott Bradley" <blah@.blah.com> wrote in message
> news:HTm_i.9413$ww2.2129@.newssvr19.news.prodigy.net...
>> Hi All, We are using SQL2000 servers with transact replication to a warm
>> stand-by server. We are seeing the following error from the Log reader
>> agents.
>> Does anyone have any insight on this error?
>> The process could not execute 'sp_MSadd_repl_commands27hp'
>> TIA Scott B.
> We saw this happen from time to time. Never found the solution.
> Put a retry on the job and then alert upon COMPLETION (not just failure)
> and restart when needed.
> --
> Greg Moore
> SQL Server DBA Consulting Remote and Onsite available!
> Email: sql (at) greenms.com
> http://www.greenms.com/sqlserver.html
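A sketch of Greg's retry advice applied to the Log Reader agent's job via msdb's sp_update_jobstep; the job name is hypothetical, and step 2 ('Run agent.') assumes the default layout of replication agent jobs:

EXEC msdb.dbo.sp_update_jobstep
    @job_name = N'REPL-LogReader-MyPublisherDB',  -- hypothetical job name
    @step_id = 2,           -- 'Run agent.' in the default layout
    @retry_attempts = 5,
    @retry_interval = 1     -- minutes between retries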
Saturday, February 11, 2012
Ansi_Padding - how to get rid of
I copied and pasted a database (and log) from Sql Server 2000 to Sql Server
2005 (and attached it).
Everything appeared okay until my VB 6 app had problems with its comboboxes.
I realized that Ansi_Padding was automatically put in by Sql Server 2005 (it
wasn't there in Sql Server 2000), thus making the text in the textbox of the
combo display strangely (e.g. not showing the first few letters of a
particular item, etc).
My question is: How do I get the Ansi_Padding out? It is a varchar field.
Any help will be greatly appreciated!
--
Sandy

Sandy (Sandy@.discussions.microsoft.com) writes:
> I copied and pasted a database (and log) from Sql Server 2000 to Sql
> Server 2005 (and attached it).
> Everything appeared okay until my VB 6 app had problems with its
> comboboxes.
> I realized that Ansi_Padding was automatically put in by Sql Server
> 2005 (it wasn't there in Sql Server 2000), thus making the text in the
> textbox of the combo display strangely (e.g. not showing the first few
> letters of a particular item, etc).
> My question is: How do I get the Ansi_Padding out? It is a varchar
> field.
If you attached the database file from SQL 2000, the setting of ANSI_PADDING
should not change, as it is saved with the column.
You can verify this by running:
select name, is_ansi_padded
from sys.columns
where object_id = object_id('usrdictwords')
Generally I would recommend that you stick with ANSI_PADDING on, since
there are features that require this setting. In SQL 2000 it was
indexed views and indexed computed columns. In SQL 2005 this requirement
also applies when you use XQuery.
As for the behaviour of your VB app, it does not sound like ANSI_PADDING
to me. What ANSI_PADDING is about is what happens to trailing blanks
in varchar when you insert it. With ANSI_PADDING off, they are trimmed,
with ANSI_PADDING on, they are retained.
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx
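To illustrate the trailing-blank behaviour Erland describes, a small sketch with hypothetical table names; the ANSI_PADDING setting in effect when a varchar column is created determines whether trailing blanks survive an insert:

SET ANSI_PADDING ON
CREATE TABLE dbo.PadOn (v varchar(10))
SET ANSI_PADDING OFF
CREATE TABLE dbo.PadOff (v varchar(10))

INSERT INTO dbo.PadOn VALUES ('abc   ')
INSERT INTO dbo.PadOff VALUES ('abc   ')

SELECT DATALENGTH(v) FROM dbo.PadOn    -- 6: trailing blanks retained
SELECT DATALENGTH(v) FROM dbo.PadOff   -- 3: trailing blanks trimmed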