Showing posts labeled "file".

Sunday, March 25, 2012

Anyone know about Parsing an email?

Hello everyone

Here's what it looks like:

I have a large file of over 40k email records. The emails are all mixed up and come in various formats, but I noticed that most of them are in this format:

firstname.lastname@.email.com
firstname.middlename.lastname@.email.com

For all those emails with a period (.) in between, the (.) actually separates an individual's first and last name.

My task is to separate all the emails that are in this format into first and last name fields. I'm stumped, folks, and I'll really appreciate any pointers or ideas on how to go about solving this task.

Thanks|||Use SUBSTRING and CHARINDEX.
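Following the SUBSTRING/CHARINDEX suggestion, here is a minimal sketch. The table and column names (`Emails`, `Email`) are made up for illustration, and it only handles addresses with exactly one period before the @:

```sql
-- Hypothetical table: Emails(Email varchar(255))
-- FirstName = text before the first '.', LastName = text between that '.' and the '@'
SELECT Email,
       SUBSTRING(Email, 1, CHARINDEX('.', Email) - 1) AS FirstName,
       SUBSTRING(Email,
                 CHARINDEX('.', Email) + 1,
                 CHARINDEX('@', Email) - CHARINDEX('.', Email) - 1) AS LastName
FROM Emails
WHERE CHARINDEX('.', Email) > 0
  AND CHARINDEX('.', Email) < CHARINDEX('@', Email)
  -- exclude firstname.middlename.lastname rows: no second '.' before the '@'
  AND CHARINDEX('.', SUBSTRING(Email, CHARINDEX('.', Email) + 1,
                               CHARINDEX('@', Email) - CHARINDEX('.', Email) - 1)) = 0;
```

The three-part (middle-name) variant would need one more CHARINDEX call with a start position to locate the second period.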

Anyone imported a text file generated from DB2?

Has anyone imported into SQL Server a text file generated by DB2?

The DB2 file has a bunch of weird characters in it that look like they are used for formatting...as opposed to actually being data.

Also, SQL can't find a row terminator.

I received a COBOL declaration copybook for the data, but I have no idea what to do with the special characters.
Any help would be appreciated.

Here's a sample of the copybook...

* COBOL DECLARATION FOR TABLE @.TNJ00.PL_JUDGM_ATTORNEY *
******************************************************************
01 DCLPL-JUDGM-ATTORNEY.
10 PJATY-ATTORNEY-KEY.
15 PJATY-JUDGMENT-ID PIC X(14).
15 PJATY-GROUPING-CDE PIC S9(4) USAGE COMP.
15 PJATY-ROLE-CDE PIC X(1).
88 PJATY-CREDITOR VALUE 'C'.
88 PJATY-DEBTOR VALUE 'D'.
88 PJATY-TRUSTEE VALUE 'T'.
15 PJATY-GROUP-SEQ PIC S9(4) USAGE COMP.
15 PJATY-ENTRY-SEQ PIC S9(4) USAGE COMP.
10 PJATY-NAME-PREFIX PIC X(4).
10 PJATY-LAST-NME PIC X(25).
10 PJATY-FIRST-NME PIC X(12).
10 PJATY-MIDDLE-NME PIC X(12).
10 PJATY-NAME-SUFFIX PIC X(4).
10 PJATY-CITY-NME PIC X(25).
10 PJATY-STATE PIC X(2).
10 PJATY-ZIP-CDE PIC X(9).
10 PJATY-TELEPHONE PIC X(10).
10 PJATY-LST-MNT-DTE PIC S9(9) USAGE COMP.
10 PJATY-REC-LOCK-CDE PIC X(1).
10 PJATY-ALT-NAME.
49 PJATY-ALT-NAME-LEN PIC S9(4) USAGE COMP.
49 PJATY-ALT-NAME-TEXT PIC X(75).
10 PJATY-ADDR-LINE-1.
49 PJATY-ADDR-LINE-1-LEN PIC S9(4) USAGE COMP.
49 PJATY-ADDR-LINE-1-TEXT PIC X(50).
10 PJATY-ADDR-LINE-2.
49 PJATY-ADDR-LINE-2-LEN PIC S9(4) USAGE COMP.
49 PJATY-ADDR-LINE-2-TEXT PIC X(50).
10 PJATY-ADDR-LINE-3.
49 PJATY-ADDR-LINE-3-LEN PIC S9(4) USAGE COMP.
49 PJATY-ADDR-LINE-3-TEXT PIC X(50).
******************************************************************
* THE NUMBER OF COLUMNS DESCRIBED BY THIS DECLARATION IS 20 *
******************************************************************

Tell them to send you another file and unpack the numeric fields, and to make the sign a separate character.

The "USAGE COMP" (usage computational) packs a digit into 4 bits, so that each byte contains two digits. COBOL also uses a half byte for the sign unless they specify a separate sign and its location (trailing or leading).

PIC 9(04) would mean numeric values occupying 4 bytes.
PIC 9(04) USAGE COMP would mean numeric values occupying 2 bytes.
PIC S9(04) USAGE COMP means numeric values occupying 3 bytes (2 for the digits and one for the sign, with the sign leading the numeric values).


The 01 level is a record-level descriptor. The 10 is a field. The 15 levels are the sub-fields that make up the field if the 10 level does not have a type identifier. All PIC X fields are fixed length. The 88 level describes the allowed values in the 15 field just above them. The 49 level is the same as the 15 level ... it describes the data in the sub-fields that make up the 10 field.

Clear as mud, I'm sure. As for the row terminator, you have to calculate the row length and apply your own. You might be able to import this with DTS, but it would be easier to have them send it to you as all character data, then you import and convert to numeric (all numbers in this example are integers (thank goodness) ... but could have positive or negative signs).|||Very good...I thought I was the last old salt on the planet that knew this stuff...

In either case, not only do you have that, you also have varchars, it looks like...which shouldn't be a problem.

Also, you don't know if any of the columns are nullable.

What you really need is the LOAD Card that is generated by DB2 when they unload the data. THAT will tell you what you have, not the COBOL Copybook.

They should probably unload the data using PARM('SQL') and use SQL DML to create a nice loadable file|||164 150 141 156 153 040 171 157 165 040 166 145 162 171 040 155 165 143 150

in octal

(http://nickciske.com/tools/octal.php)|||You will get CRLF in the file. I've never needed to generate a row terminator...in any event, the column names are definitely not the column names in DB2...they are limited to 18 bytes...another reason to get the LOAD CARD

You want them to do this

//UNLOAD JOB (B,X,XXXXX),'UNLOAD',PRTY=13,GROUP=XXXXXXXX,
// NOTIFY=&SYSUID,MSGCLASS=V,TIME=60
//*+JBS BIND XTDDBB4.ONLINE
//UNCAT EXEC PGM=IEXUNCAT,COND=(4,LT)
//SYSIN DD *
XXXXXX.DBB4.SBD000DB.UNLOAD.INDEX.D060606
/*
//UNLOAD EXEC PGM=IKJEFT01,REGION=6M,COND=(4,LT)
//STEPLIB DD DSN=BXXXB4.DB2.SDSNLOAD,DISP=SHR
//SYSTSPRT DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
//SYSUDUMP DD DUMMY
//SYSREC00 DD SPACE=(CYL,(100,25),RLSE),
// UNIT=DASD,DISP=(,CATLG),LABEL=RETPD=365,
// DSN=XXXXXX.DBB4.SBD000DB.UNLOAD.INDEX.D060606
//SYSPUNCH DD DUMMY
//*SYSPUNCH DD DISP=SHR,
//* DSN=XXXXXX.DBA.DBB4.SBD000DB.CTLCARD(INDEX)
//SYSTSIN DD *
DSN SYSTEM(DBB4)
RUN PROGRAM(DSNTIAUL) PLAN(DSNTIAUL) -
LIB('BXXXB4.DB2.RUNLIB.LOAD') PARMS('SQL')
END
/*
//SYSIN DD *
SELECT PJATY_JUDGMENT_ID
, CHAR(PJATY_GROUPING_CDE,8)
, PJATY_ROLE_CDE
, CHAR(PJATY_GROUP_SEQ,8)
, CHAR(PJATY_ENTRY_SEQ,8)
, PJATY_NAME_PREFIX
, PJATY_LAST_NME
, PJATY_FIRST_NME
, PJATY_MIDDLE_NME
, PJATY_NAME_SUFFIX
, PJATY_CITY_NME
, PJATY_STATE
, PJATY_ZIP_CDE
, PJATY_TELEPHONE
, CHAR(PJATY_LST_MNT_DTE,18)
, PJATY_REC_LOCK_CDE
, CHAR(PJATY_ALT_NAME_TEXT,75)
, CHAR(PJATY_ADDR_LINE_1_TEXT,50)
, CHAR(PJATY_ADDR_LINE_2_TEXT,50)
, CHAR(PJATY_ADDR_LINE_3_TEXT,50)
FROM TABLE
/*|||And if you really want to smoke their minds...ask them for a comma delimted file

SELECT ' "'||COALESCE(RTRIM(PJATY_JUDGMENT_ID),'')||'"'
||',"'||COALESCE(RTRIM(CHAR(PJATY_GROUPING_CDE,8)),'')||'"'
||',"'||COALESCE(RTRIM(PJATY_ROLE_CDE),'')||'"'
||',"'||COALESCE(RTRIM(CHAR(PJATY_GROUP_SEQ,8)),'')||'"'
||',"'||COALESCE(RTRIM(CHAR(PJATY_ENTRY_SEQ,8)),'')||'"'
||',"'||COALESCE(RTRIM(PJATY_NAME_PREFIX),'')||'"'
||',"'||COALESCE(RTRIM(PJATY_LAST_NME),'')||'"'
||',"'||COALESCE(RTRIM(PJATY_FIRST_NME),'')||'"'
||',"'||COALESCE(RTRIM(PJATY_MIDDLE_NME),'')||'"'
||',"'||COALESCE(RTRIM(PJATY_NAME_SUFFIX),'')||'"'
||',"'||COALESCE(RTRIM(PJATY_CITY_NME),'')||'"'
||',"'||COALESCE(RTRIM(PJATY_STATE),'')||'"'
||',"'||COALESCE(RTRIM(PJATY_ZIP_CDE),'')||'"'
||',"'||COALESCE(RTRIM(PJATY_TELEPHONE),'')||'"'
||',"'||COALESCE(RTRIM(CHAR(PJATY_LST_MNT_DTE,18)),'')||'"'
||',"'||COALESCE(RTRIM(PJATY_REC_LOCK_CDE),'')||'"'
||',"'||COALESCE(RTRIM(PJATY_ALT_NAME_TEXT),'')||'"'
||',"'||COALESCE(RTRIM(PJATY_ADDR_LINE_1_TEXT),'')||'"'
||',"'||COALESCE(RTRIM(PJATY_ADDR_LINE_2_TEXT),'')||'"'
||',"'||COALESCE(RTRIM(PJATY_ADDR_LINE_3_TEXT),'')||'"'
FROM TABLE|||And if you already have a file based on that layout...you can't use it...or at least you can't use it without A LOT of coding on your end

Anyone have experience with SYSFILES?

According to Microsoft, sysfiles.status:

0x80 = File has been written to since last backup.

Unfortunately, this query is not returning what I expect. That is, if I have inserted/updated/deleted any records, I believe that some underlying file must be written to.

So, I tried:

Code Snippet

select * from sysfiles where status & 0x80 <> 0

I updated several fields in my database, doubled and tripled its size, etc. I can't get the above query to return any results.

Does anyone have any experience with this particular flag? I was hoping to check the sysfiles and know if a backup was needed or not.

Thanks mucho.


Interesting. I just tried this with 7.0, 2000, and 2005, and that bit is never set. And 6.5 uses sysdevices rather than sysfiles. I wonder if somewhere in the development cycle they decided not to implement that, and it was never taken out of the documentation drafts. Either that or it's a very long standing, little-known bug. ;-)

There might be some ways to take advantage of the differential changed map to determine if a database has changed since the last backup, but the only way I know to read it isn't well suited for inclusion in a batch job.

Code Snippet

DBCC TRACEON(3604)
DBCC PAGE(databasename, 1, 6, 3)


Granted, a database larger than about 4 GB is probably going to have more than one DCM page, and I'm not entirely sure how to determine where subsequent pages are located.

Tuesday, March 20, 2012

Any way to repair a damaged backup file?

Maybe I posted to the wrong group; I am sorry.
If my backup file has an error so that the restore fails, is there any way
to repair the damaged backup file?
(The restore procedure of 2005 reports there is an error on page 65535:-1)
Thx.
To the best of my knowledge, there's no repair option for a backup. But in 2005, you have the
CONTINUE_AFTER_ERROR option for the RESTORE command. Of course, you will then have a database with
some corruption which you need to handle.
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
Blog: http://solidqualitylearning.com/blogs/tibor/
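For reference, a hedged sketch of the option described above; the database name and backup path are placeholders:

```sql
-- SQL Server 2005: restore past damaged pages instead of aborting.
RESTORE DATABASE MyDb
FROM DISK = 'C:\backups\MyDb.bak'
WITH CONTINUE_AFTER_ERROR;

-- Then assess what corruption made it into the restored database.
DBCC CHECKDB (MyDb);
```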
"Frank Lee" <Reply@.to.newsgroup> wrote in message news:%23VMMehtDGHA.644@.TK2MSFTNGP09.phx.gbl...
> Maybe post to wrong group, I am sorry.
> If my backup file has error so that the restore is failed, is there any way to repair the damage
> backup file?
> (The restore procedure of 2005 report there is an error on page 65535:-1)
> Thx.
>
|||I see. Thanks. Good news for me.
"Tibor Karaszi" <tibor_please.no.email_karaszi@.hotmail.nomail.com>
wrote in message news:eWH4mgwDGHA.2924@.tk2msftngp13.phx.gbl...
> To the best of my knowledge, there's no repair option for a backup. But in
> 2005, you have the CONTINUE_AFTER_ERROR option for the RESTORE command. Of
> could, you will then have a database win some corruption which you need to
> handle.
> --
> Tibor Karaszi, SQL Server MVP
> http://www.karaszi.com/sqlserver/default.asp
> http://www.solidqualitylearning.com/
> Blog: http://solidqualitylearning.com/blogs/tibor/
>
> "Frank Lee" <Reply@.to.newsgroup> wrote in message
> news:%23VMMehtDGHA.644@.TK2MSFTNGP09.phx.gbl...
>


Any way to recover schema (that's it) from an mdf file

Ok, with sqlserver 2005, I've noticed my databases being marked as
suspect far more often than sql 2000. Anyway... after futzing around
with it for quite some time... I'd made no progress.
Now I am at the point where the database wasn't cleanly shut down, and I
don't have a log file.
I am curious if there's any way to grab the schema from the mdf. I am
relatively sure that the mdf file is in fact intact and not corrupt,
but I don't know where to go from here. sp_attach_single_file_db looks
like it'd work if it was cleanly shut down.
Thanks in advance
Weston|||Before I say anything else: this is worth a call to PSS, especially if this
is critical data. That being said...
You can start the database in EMERGENCY mode, which will start it without
attempting recovery. Given that all of the original data files are in place,
you can export data out (although you aren't guaranteed as to transactional
consistency).
I would do that and then try the sp_attach_single_file_db - at least you
can pump out info from the database first.
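A sketch of the EMERGENCY-mode approach described above; the database and table names are placeholders:

```sql
-- SQL Server 2005: bring the database up without running recovery.
ALTER DATABASE MyDb SET EMERGENCY;

-- The database is now readable by sysadmins, so the schema can be
-- scripted and data pumped out, e.g. table by table:
SELECT * INTO RescueDb.dbo.SomeTable FROM MyDb.dbo.SomeTable;
```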
"Weston Weems" <wweemsNO_SPAM.PLEASE@.gmail.com> wrote in message
news:u2i2ZGdoGHA.4776@.TK2MSFTNGP03.phx.gbl...
> Ok, with sqlserver 2005, I've noticed my databases being marked as suspect
> far more often than sql 2000. Anyway... after futzing around with it for
> quite some time... I'd made no progress.
> Now I am at the point where the database wasnt cleanly shutdown, and I
> dont have a log file.
> I am curious if theres anyway to grab schema from the mdf. I am relatively
> sure that the mdf file is in fact in tact and not corrupt, but I dont know
> where to go from here. sp_attach_single_file_db looks like it'd work if it
> was cleanly shutdown.
>
> Thanks in advance
> Weston|||If you've got the MDF, do a reattach. Are you saying your log file has
disappeared?
--
Jack Vamvas
___________________________________
Receive free SQL tips - www.ciquery.com/sqlserver.htm
___________________________________
"Weston Weems" <wweemsNO_SPAM.PLEASE@.gmail.com> wrote in message
news:u2i2ZGdoGHA.4776@.TK2MSFTNGP03.phx.gbl...
> Ok, with sqlserver 2005, I've noticed my databases being marked as
> suspect far more often than sql 2000. Anyway... after futzing around
> with it for quite some time... I'd made no progress.
> Now I am at the point where the database wasnt cleanly shutdown, and I
> dont have a log file.
> I am curious if theres anyway to grab schema from the mdf. I am
> relatively sure that the mdf file is in fact in tact and not corrupt,
> but I dont know where to go from here. sp_attach_single_file_db looks
> like it'd work if it was cleanly shutdown.
>
> Thanks in advance
> Weston|||On Fri, 07 Jul 2006 07:28:20 -0700, Weston Weems wrote:

>Ok, with sqlserver 2005, I've noticed my databases being marked as
>suspect far more often than sql 2000.
Hi Weston,
Having databases marked as suspect should not happen on a regular basis,
unless you're running on wacky hardware or you're doing wacky things.

>Now I am at the point where the database wasnt cleanly shutdown, and I
>dont have a log file.
Any idea what caused you to lose the log file? The cause of that might
be related to your high frequency of suspect databases.
Hugo Kornelis, SQL Server MVP


Monday, March 19, 2012

Any way to create one subscription which delivers server email and file share at the same time?

Hi,

I need to know if it's possible to send out a notification email and deliver the report by file share with one subscription or at least send out a notification email depending on the event of a subscription which delivers the report file share.

My goal is not to have 2 separate subscriptions.

Thanks

You will need to create your own delivery extension for this, as there is no built-in way to combine different delivery methods. But as the classes already exist for those two methods, you can just call them within your own extension.

HTH, jens Suessmeyer.

http://www.sqlserver2005.de|||

Hi Jens,

thanks for your reply.

If you don't mind, could you please explain your reply in detail... I'm not sure about the delivery extension you mentioned.

I'm not an expert like you.

thanks again

Bo

|||

Good question...

So, how do you create a delivery extension? Does anybody know a place where it's possible to read something about how to create one?

Thanks,


Any way to check if a log file (.txt) already exists before BCPing a file?

I want to have one log file for each task run within a job. There are several tasks for each job that may be handled by different people, so I would like to have a running log file on the network that can be checked by supervisors. I know how to create the log file:

master..xp_cmdshell bcp "SELECT * FROM ##logFile" queryout "c:\log.txt"

Problem is next time a task is run, the original text file will get written over by the new one. I came up with a solution for that by using bcp to export a temporary log file, then append that file to my existing log file, then kill the temporary file and that works fine:

master..xp_cmdshell copy log_temp.txt + log.txt

The problem is the second command obviously fails if the main log file does not exist.

Ideally I would like to check to see if the log file exists, if not create it, if it does exist, append to it.

Having a master log table stored in SQL has been suggested and shot down. They want one text file per job run.

Any help would be greatly appreciated.

Thanks.

Tim

Write VBScript code (for example) and run it in a SQLAgent job. Don't use xp_cmdshell for this sort of thing. It is less flexible and a security vulnerability. It takes a few lines of code in VBScript to check for a file. Or you can write a CMD script and run it in a SQLAgent job.
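If the job must stay in T-SQL, one way to branch on the file's existence is the undocumented xp_fileexist procedure. Treat this as a sketch: xp_fileexist is unsupported, and it carries the same security caveats as the xp_cmdshell calls it sits alongside:

```sql
-- xp_fileexist is undocumented; it returns 1 in @exists if the file is there.
DECLARE @exists int;
EXEC master..xp_fileexist 'c:\log.txt', @exists OUTPUT;

IF @exists = 1
    EXEC master..xp_cmdshell 'copy c:\log.txt + c:\log_temp.txt c:\log.txt';  -- append
ELSE
    EXEC master..xp_cmdshell 'copy c:\log_temp.txt c:\log.txt';               -- create

-- Remove the temporary log either way.
EXEC master..xp_cmdshell 'del c:\log_temp.txt';
```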

Sunday, March 11, 2012

Can any SQL wizard help? Reformat the input file and transfer it into SQL Server

I am trying to transfer 200 txt files into SQL Server by using Query Analyzer.
The command is 'Bulk insert [tableName] from 'path\filename.txt'
However, I need to read and modify the txt files.
I am new to SQL Server, but I believe there must be someone, a wizard, who can do what I want easily.

Thank you for the help in advance!

Here is the raw data layout, which is comma delimited.
BDate 1/1/1990 BDate 1/1/1990 BDate 1/1/1990 BDate 1/1/1990
Edate 1/1/2005 Edate 1/1/2005 Edate 1/1/2005 Edate 1/1/2005
Fq D Fq D Fq D Fq D
Date R P M E D Date R P M E D Date R P M E D Date R P M E D
1/1/90 1 2 3 4 5 1/1/90 2 3 4 5 6 1/1/90 3 4 5 6 7 1/1/90 4 5 6 7 8
2 3 4 5 6 1 2 3 4 5 3 4 5 6 7 6 7 8 9 1
1/1/05 ..... 1/1/05 ... 1/1/05 .... 1/1/05 ....

This is the desired output after loading into the table, which stacks each repeating block on top of the others.
Date R P M E D
1/1/90 1 2 3 4 5
2 3 4 5 6
1/1/05 .....
1/1/90 2 3 4 5 6
2 3 4 5 6
1/1/05 .....
1/1/90 3 4 5 6 7
3 4 5 6 7
1/1/05 .....
1/1/90 4 5 6 7 8
6 7 8 9 1
1/1/05 .....|||"I am trying to transfer 200 txt files into SQL server by using query analyzer."
--DTS might be more appropriate.

"I am new to SQL server but I believe there must be some one who is a wizard can do what I want easily."
--Faith is a powerful thing.

"Here is the raw data layout, which is comma delimited."
--What you posted is not comma delimited.

"This is the desired output after load into the table, which is tacking each repeating block on top of each other."
--You are going to need to load this data into a staging table and normalize it before loading into your production tables. The process will be complex, involving several passes through the data.

If at all possible, try to get your source data in a better format. Practically any other format would be preferable to what you posted.|||Blindman,
Thank you for your reply.
You are right... I forgot to put "," in my sample file layout.
I am using another source provider to request time series in Excel. This is the most efficient way I can utilize Excel's capacity (256 columns and over 65,000 rows). That's why the raw data layout looks weird. However, I have to stick to it.

I was thinking of loading these files into a table to normalize them, but I am not sure I know SQL well enough to say this is the best solution. I think I got the answer from you.

What is a staging DB? I assumed it was one of the default DBs in Enterprise Manager; however, I did not see it. Or is this a name you gave it?

Thank you again for the help.
Shiparsons|||Not "Staging DB". "Staging TABLE."

A staging table is basically a table that has the same structure as your input data, with additional columns added as needed to keep track of records as they are being processed. I always add an "Imported" column that defaults to getdate(), and an ImportErrors column that I populate as necessary during processing.

Your staging table should have no Primary Keys or constraints (unless you add a surrogate PKey for processing...), so that your import process never fails because the data does not match what is expected.
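A minimal sketch of that layout; the table name and the raw data columns are illustrative, only Imported and ImportErrors come from the description above:

```sql
-- No keys or constraints, so the bulk load itself can never fail.
CREATE TABLE ImportStaging (
    RawLine      varchar(8000) NULL,                     -- raw input, any shape
    Imported     datetime NOT NULL DEFAULT (GETDATE()),  -- when the row arrived
    ImportErrors varchar(500) NULL                       -- filled during cleansing passes
);
```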

Once the data is in the staging table you cleans it and make sure it satisfies all the business rules required by your production tables. Then you make as many passes through the staging table as necessary to update the various production tables it feeds, starting with the top-level tables.|||Thank you for the explanation.
What datatype should I use when I create my staging table? I assume it should be unconstrained, since my raw data contains text, datetime, and float values.

Thank you,
Qing|||You should try to match the datatype to the type of the data being entered, though some people just make all staging table columns varchar by default. I don't do this, as a rule, but you may have no other choice since your import file is actually a mix of different layouts. String fields are the only column types that will accept any input type.|||Blindman,
Thank you for the help.

I will try.

shiparsons

Tuesday, March 6, 2012

Any Program or Tool that helps in conversion!

Hi,
I need a tool or program that helps me convert an MS SQL backup
database file with the (.bak) extension into (.sql); I need that
urgently, please.
Many Thanks in advance.
Alabdulelah
Hi,
The tool is called "SQL Server"
Seriously. Restore your database, then script it using Enterprise Manager
or Query Analyzer.
Adam Machanic
Pro SQL Server 2005, available now
http://www.apress.com/book/bookDisplay.html?bID=457
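A sketch of that restore step; the logical file names below are placeholders, and the FILELISTONLY form shows the real ones inside the .bak:

```sql
-- Inspect the backup to find its logical file names.
RESTORE FILELISTONLY FROM DISK = 'C:\backups\MyDb.bak';

-- Restore, relocating the files if needed; then script the schema
-- from Enterprise Manager (All Tasks > Generate SQL Script).
RESTORE DATABASE MyDb
FROM DISK = 'C:\backups\MyDb.bak'
WITH MOVE 'MyDb_Data' TO 'C:\data\MyDb.mdf',
     MOVE 'MyDb_Log'  TO 'C:\data\MyDb_log.ldf';
```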
<alabdulelah@.gmail.com> wrote in message
news:1140018773.153566.38100@.z14g2000cwz.googlegro ups.com...
> Hi,
> I need a tool or a program that helps me to convert an MS SQL backup
> database file with the ( .bak) extension into ( .sql) , I need that
> urgently please.
> Many Thanks in advance.
> Alabdulelah
>

Any Program or Tool that helps in conversion!

Hi,
I need a tool or program that helps me convert an MS SQL backup
database file with the (.bak) extension into (.sql); I need that
urgently, please.
Many Thanks in advance.
Alabdulelah|||alabdulelah@.gmail.com wrote:
> Hi,
> I need a tool or a program that helps me to convert an MS SQL backup
> database file with the ( .bak) extension into ( .sql) , I need that
> urgently please.
> Many Thanks in advance.
> Alabdulelah
You can restore a SQL Backup to SQL Server using the RESTORE DATABASE
command. Once you've done that you can script the database schema as a
SQL script using the Generate Script feature of SQL 2000 Enterprise
Manager or SQL 2005 Management Studio. (I assume SQL scripts are what
you mean when you say you want a .SQL file?)
Scripting the data itself may be harder depending on how much and how
complex it is. Two possible solutions:
http://vyaskn.tripod.com/code.htm#inserts
http://www.red-gate.com/products/SQ...mpare/index.htm
David Portas, SQL Server MVP
Whenever possible please post enough code to reproduce your problem.
Including CREATE TABLE and INSERT statements usually helps.
State what version of SQL Server you are using and specify the content
of any error messages.
SQL Server Books Online:
http://msdn2.microsoft.com/library/ms130214(en-US,SQL.90).aspx
--
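A sketch of the restore-then-script workflow David describes (backup path, database name, and logical file names are all hypothetical; RESTORE FILELISTONLY shows the real logical names to use):

```sql
-- Inspect the backup to learn its logical file names (hypothetical path):
RESTORE FILELISTONLY FROM DISK = N'C:\backups\mydb.bak';

-- Restore the .bak into a database first:
RESTORE DATABASE MyRestoredDb
FROM DISK = N'C:\backups\mydb.bak'
WITH MOVE 'mydb_Data' TO N'C:\data\MyRestoredDb.mdf',
     MOVE 'mydb_Log'  TO N'C:\data\MyRestoredDb_log.ldf';

-- Then run the Generate Scripts feature (Enterprise Manager or
-- Management Studio) against MyRestoredDb to produce the .sql file.
```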

Any problem for using a single data file?

Hi there,
Is there any problem using 1 data file with restricted file growth
set to 20GB? I've heard that it's better to have multiple data files with
2GB each. Is that true?
Thanks!
Alex
That was true for Win95/98 and FAT file partitions. If you are using
Win2000 or higher and NTFS, the 20 GB single file is just fine.
Geoff N. Hiten
Microsoft SQL Server MVP
Senior Database Administrator
Careerbuilder.com
I support the Professional Association for SQL Server
www.sqlpass.org
"Alex Cheng" <acheng@.qtcm.com> wrote in message
news:uQisgrEkEHA.1404@.TK2MSFTNGP09.phx.gbl...
> Hi there,
> Is there any problem using 1 data file with restricted file growth
> set to 20GB? I've heard that it's better to have multiple data files with
> 2GB each. Is that true?
> Thanks!
> Alex
>
|||Unless you're splitting filegroups up in order to put them on different
physical devices, there is, IMO, little benefit in creating multiple data
files. All it will accomplish is creating more of a maintenance headache
for you.
"Alex Cheng" <acheng@.qtcm.com> wrote in message
news:uQisgrEkEHA.1404@.TK2MSFTNGP09.phx.gbl...
> Hi there,
> Is there any problem using 1 data file with restricted file growth
> set to 20GB? I've heard that it's better to have multiple data files with
> 2GB each. Is that true?
> Thanks!
> Alex
>
|||Hello Alex
It depends on what you are trying to achieve. There is no set requirement
or recommendation either way. However creating a lot of small files for a
database could lead to additional maintenance chores. From performance
standpoint, there should really be no difference either way unless you
achieve striping with multiple database files across several disk
controllers and drives. However, for a database of about 20GB in size, this
striping may only give you a small performance benefit.
Thank you for using Microsoft newsgroups.
Sincerely
Pankaj Agarwal
Microsoft Corporation
This posting is provided AS IS with no warranties, and confers no rights.
|||Thanks for the information. I really appreciate it.
alex
"Geoff N. Hiten" <SRDBA@.Careerbuilder.com> wrote in message
news:uBkGIFFkEHA.1348@.TK2MSFTNGP15.phx.gbl...
> That was true for Win95/98 and FAT file partitions. If you are using
> Win2000 or higher and NTFS, the 20 GB single file is just fine.
> --
> Geoff N. Hiten
> Microsoft SQL Server MVP
> Senior Database Administrator
> Careerbuilder.com
> I support the Professional Association for SQL Server
> www.sqlpass.org
> "Alex Cheng" <acheng@.qtcm.com> wrote in message
> news:uQisgrEkEHA.1404@.TK2MSFTNGP09.phx.gbl...
>
|||Got it. Thanks!
alex
"Adam Machanic" <amachanic@.hotmail._removetoemail_.com> wrote in message
news:ej7SNGFkEHA.704@.TK2MSFTNGP09.phx.gbl...
> Unless you're splitting filegroups up in order to put them on different
> physical devices, there is, IMO, little benefit in creating multiple data
> files. All it will accomplish is creating more of a maintenance headache
> for you.
>
> "Alex Cheng" <acheng@.qtcm.com> wrote in message
> news:uQisgrEkEHA.1404@.TK2MSFTNGP09.phx.gbl...
>
|||Thanks!
alex
"Pankaj Agarwal [MSFT]" <pankaja@.online.microsoft.com> wrote in message
news:ecnxtZHkEHA.2516@.cpmsftngxa10.phx.gbl...
> Hello Alex
> It depends on what you are trying to achieve. There is no set requirement
> or recommendation either way. However creating a lot of small files for a
> database could lead to additional maintenance chores. From performance
> standpoint, there should really be no difference either way unless you
> achieve stripping with multiple database files across several disk
> controllers and drives. However for a database of about 20GB in size, this
> striping may only give you small performance benefit.
> Thank you for using Microsoft newsgroups.
> Sincerely
> Pankaj Agarwal
> Microsoft Corporation
> This posting is provided AS IS with no warranties, and confers no rights.
>
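The single-file setup discussed in this thread might look like this (database name, paths, and sizes are hypothetical):

```sql
-- One data file with growth capped at 20 GB:
CREATE DATABASE SalesDb
ON PRIMARY (
    NAME = SalesDb_Data,
    FILENAME = N'D:\data\SalesDb.mdf',
    SIZE = 1GB,
    FILEGROWTH = 256MB,
    MAXSIZE = 20GB
)
LOG ON (
    NAME = SalesDb_Log,
    FILENAME = N'E:\log\SalesDb_log.ldf',
    SIZE = 256MB,
    FILEGROWTH = 128MB
);
```

As the replies note, additional data files mostly pay off when they sit on separate physical devices or distinct filegroups; on a single disk they mainly add maintenance work.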


Thursday, February 23, 2012

Any idea why OpenRowSet to open an Excel file doesn't work well in SQL 2005?

Maybe it worked once, but most of the time it doesn't work. Query like below:

select top 10 *
from OpenRowSet('microsoft.jet.oledb.4.0','Excel 8.0;hdr=yes;database=\\ws8\web\ablefiles\sitefiles\4000010\reibc\active.xls',
'select * from [crap2$]')

I got error

OLE DB provider "microsoft.jet.oledb.4.0" for linked server "(null)" returned message "Unspecified error".
Msg 7303, Level 16, State 1, Line 1
Cannot initialize the data source object of OLE DB provider "microsoft.jet.oledb.4.0" for linked server "(null)".

but the same query runs without any problem on a SQL 2000 server in the same network.

Any idea?

After I installed SP2, the query works on the SQL server, but it still doesn't work when I run it in Management Studio on my home machine (I connect by VPN). I get

Msg 7399, Level 16, State 1, Line 2
The OLE DB provider "microsoft.jet.oledb.4.0" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7303, Level 16, State 1, Line 2
Cannot initialize the data source object of OLE DB provider "microsoft.jet.oledb.4.0" for linked server "(null)".

|||

Any idea?

This drives me crazy; it seems it doesn't matter where I run it. I just tried it in the studio on the SQL server itself, and it gives me

OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "(null)" returned message "Unspecified error".
Msg 7303, Level 16, State 1, Line 1
Cannot initialize the data source object of OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "(null)".

It seems it can work sometimes but not others. Every time I run it to handle data in Excel it doesn't work, but later it can work.

Any help?
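One thing worth ruling out on SQL 2005 (a guess, since "Unspecified error" is generic): ad hoc OPENROWSET queries are disabled by default and must be enabled with sp_configure. The query below reuses the path and sheet name from the original post:

```sql
-- Enable ad hoc distributed queries (off by default in SQL 2005):
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;

-- Then retry the Excel query from the post:
SELECT TOP 10 *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
    'Excel 8.0;HDR=YES;Database=\\ws8\web\ablefiles\sitefiles\4000010\reibc\active.xls',
    'SELECT * FROM [crap2$]');
```

If the option is already enabled, the cause is more likely the file path or file-system permissions, as discussed in the replies.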

|||

any suggestion?

help please

|||I've seen this exact error, but only when the path to the XLS file was wrong (SQL couldn't find the file). Suggest you try a simple path (eg. C:\TEMP\XLSFILE.XLS) first. And of course make sure you have privileges to read that folder.|||

Something is not right. It didn't work and then started to work; after restarting SQL Server, it doesn't work again. Even if I copy the file to C:\ of the SQL server, it gives me the same error, although before, with the same account, it worked for me when the file was on a remote server.

I am using BUILTIN\Administrators to connect to SQL server

|||

I get following error while trying to query ACCESS data, any clues?

OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "(null)" returned message "Unspecified error".

Msg 7303, Level 16, State 1, Line 1

Cannot initialize the data source object of OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "(null)".

|||

Ashish,

I'm getting the same error you did. Did you ever resolve the problem?

In my case, I've built and deployed a SSRS 2005 report that's based on a view that connects to a linked server (an Access .MDB). I used my domain account to build and deploy the report, and if I try to view it locally (on the SSRS server -- using http://localhost/reports), it works perfectly without any errors.

But if I log in with my domain account from another PC or server, I get this error in the report server:

An error has occurred during report processing.

Query execution failed for data set 'FinanceLinkedServer'.

Cannot initialize the data source object of OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "FINANCEDB".

My domain account has Full permissions on the SSRS server, the Access .MDB itself, and the folder in which it is stored.

I've also checked to make sure that the domain account under which the SQL Server and SSRS services are running also has Full permissions on the Access .MDB and its containing folder.

Thanks!
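For reference, a linked server like the FINANCEDB one described above is typically created along these lines (the UNC path and login mapping here are hypothetical, not the poster's actual setup):

```sql
-- Hypothetical re-creation of an Access linked server:
EXEC sp_addlinkedserver
    @server     = N'FINANCEDB',
    @srvproduct = N'Access',
    @provider   = N'Microsoft.Jet.OLEDB.4.0',
    @datasrc    = N'\\fileserver\finance\finance.mdb';

-- Map every local login to the default Access 'Admin' user:
EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = N'FINANCEDB',
    @useself     = 'FALSE',
    @locallogin  = NULL,
    @rmtuser     = N'Admin',
    @rmtpassword = NULL;
```

Note that with remote clients the Jet provider may reach the .mdb under the SQL Server service account's context rather than the interactive user's, which could explain why server-side file permissions matter even when your own domain account has Full access.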

|||

I was having a similar issue, and thanks to Sysinternals FileMon I found that the sqlserver process tries to access the %tmp% directory of the profile of the account the process is running under, on behalf of the Windows Authentication credentials of the user. I don't think I stated that very well, so I'll give my example:

My sqlserver was running under an account called 'DOMAIN\SqlServiceAct'. The %tmp% directory of this account was the 'C:\Winnt\Profiles\SqlServiceAct\Local Settings\Temp' folder. By changing the setting of this folder to allow both Read and Write permissions to the built-in Domain Users group, all is well.

|||

Thanks, davery921. I tried that, but it didn't resolve the error.

Does anyone else have any ideas? Has anyone out there actually built a report based on a linked Access database and gotten it to work properly?

First, to answer your question: yes, I've gotten it to work.

Next.... I need help too...

I'm getting the strangest error. I have three SSMS clients: one on a PC and two on servers through Remote Desktop; the servers are both running SQL 2005. If I run this:

DECLARE @RC int
DECLARE @STARTDATE datetime
DECLARE @ENDDATE datetime

-- TODO: Set parameter values here.

EXECUTE @RC = [Reporting].[dbo].[uspPCChargeAudit3]
'07/01/2007','07/31/2007'

On my machine through SSMS, I get this error:

OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "PCCHARGE" returned message "The Microsoft Jet database engine cannot open the file '\\SERVER_NAME_REMOVED_FOR_POST\Active-Charge\pccw.mdb'. It is already opened exclusively by another user, or you need permission to view its data.".

Msg 7303, Level 16, State 1, Procedure uspPCChargeAudit3, Line 30

Cannot initialize the data source object of OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "PCCHARGE".

If I run this on my production server (through Remote desktop), I get the same error when this machine is pointed to my Test database.

OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "PCCHARGE" returned message "The Microsoft Jet database engine cannot open the file '\\SERVER_NAME_REMOVED_FOR_POST\Active-Charge\pccw.mdb'. It is already opened exclusively by another user, or you need permission to view its data.".

Msg 7303, Level 16, State 1, Procedure uspPCChargeAudit3, Line 30

Cannot initialize the data source object of OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "PCCHARGE".

If I run this on my test server through Remote desktop, to the test database, it runs just fine!

Now here is the kicker… If I run this through Remote Desktop on my production server to the production DB, it works!!! If I run the same script pointed to my production database from my PC, it works.

The only difference is that my production machine pointed to the production database is interacting with Oracle 9i in the SP, while my test database SP is interacting with Oracle 10g. They all point to the same Access database on the same server, competing for the same resource. We are just testing 10g.

Any ideas?
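Two quick checks that might narrow this down (sp_testlinkedserver is available on SQL 2005; the .ldb observation is standard Jet behavior):

```sql
-- Probe basic connectivity to the linked server from each machine;
-- PCCHARGE is the linked server named in the error messages above:
EXEC sp_testlinkedserver N'PCCHARGE';
```

When the "already opened exclusively" message appears, also look for a pccw.ldb lock file next to pccw.mdb on the file share; Jet creates it whenever the database is open, and a stale or exclusive lock there can produce exactly this error.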