
Thursday, March 29, 2012

Error when trying to use checkpoint file

I have a package with SaveCheckpoints=True and CheckpointUsage=Never. After the package failed, I fixed the cause of the failure by setting a database column to allow NULLs. Then I went to the web app that we built to monitor package execution and clicked the button to restart the package. The web app loads the package, sets the CheckpointUsage property of the package object to 'Always', and then executes the package on a new thread. The package then produced this error:

Checkpoint file "E:\Package1Checkpoint" failed to open due to error 0x80070005 "Access is denied.".

Since there was only one remaining task to run in the package I ran it manually.

Now here is the really interesting part. I then needed to run the same package, but with different parameters. When I attempted to run it with the saved checkpoint settings (CheckpointUsage=Never, SaveCheckpoints=True), I got this error:

"An existing checkpoint file is found with contents that do not appear to be for this package, so the file cannot be overwritten to start saving new checkpoints. Remove the existing checkpoint file and try again. This error occurs when a checkpoint file exists, the package is set to not use a checkpoint file, but to save checkpoints. The existing checkpoint file will not be overwritten. "

I then attempted to rename the checkpoint file so it would not interfere; however, Windows would not let me, saying that the file was in use.

So what I had to do was add a configuration entry which set SaveCheckpoints to False. Then I was able to run the package.

Any ideas?

Could it be that the web app was still executing, and the package launched by the web app was still running and holding the checkpoint file? This would explain both the "Access is denied" error when trying to read the checkpoint and the "file in use" error when trying to rename it.|||

Thanks for your reply. I'm reviewing the log entries for when the package "failed". I do not see the 'PackageEnd' log entry, so I believe you are correct that the package was still running and holding the checkpoint file. I believe it was running a very long-running ExecuteDTS2000 task. I must have assumed that when a previous task failed it had failed the package as well, which evidently was not the case. Anyway, below is the code from the web app that launches the package, if you are interested. Thanks.

...
pkg.CheckpointUsage = DTSCheckpointUsage.Always;
RunPackage();
}

private void RunPackage() {
    ThreadStart threadStart = new ThreadStart(PackageExecute);
    Thread thread = new Thread(threadStart);
    thread.Name = pkg.Name;
    thread.Start();
}

private void PackageExecute() {
    try
    {
        pkg.Execute();
    }
    catch (Exception ex)
    {
        lblMessage.Text = ex.Message;
    }
}
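Given the diagnosis above, here is a minimal sketch (an assumption, not the original app's code) of how the launcher could keep a reference to the worker thread and refuse a restart while the previous run is still alive. `runningThread` is a hypothetical field; `pkg` and `lblMessage` follow the snippet above:

```csharp
// Sketch only: track the worker thread so a restart request can detect that
// a previous execution is still running and still holds the checkpoint file.
private Thread runningThread;

private void RunPackage() {
    if (runningThread != null && runningThread.IsAlive) {
        lblMessage.Text = "Package is still executing; its checkpoint file is in use.";
        return;
    }
    runningThread = new Thread(new ThreadStart(PackageExecute));
    runningThread.Name = pkg.Name;
    runningThread.Start();
}
```

Fire-and-forget threads make it easy to lose track of a run; holding the thread reference is the cheapest way to detect it.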

Tuesday, March 27, 2012

Error when shrinking data file in SQL 2005

I've just upgraded my production server from 2K to 2K5, and I've got one
database that is way too big (almost 5 GB with 88% free space). I have no
problems shrinking the transaction log, but when I try to shrink the data
file, I get the following message:
"A severe error occurred on the current command. The results, if any,
should be discarded. File ID 1 of database ID 8 cannot be shrunk as it is
either being shrunk by another process or is empty. (Microsoft SQL Server,
Error: 0)"
At one point, I had autoshrink turned on for the database to handle the
transaction log, but this has never been a problem for the data file. During
my upgrade process, I detached this database and copied it over to my test
server, where I am still having no problems. Help!!|||Hi
I assume you are using DBCC SHRINKFILE? Have you tried doing a DBCC CHECKDB?
John
|||"John Bell" wrote:
> I assume you are using DBCC SHRINKFILE? Have you tried doing a DBCC CHECKDB?
John,
I ended up getting that problem solved (by detaching and reattaching the
database and then taking it offline and rebooting the server), but I've got
another somewhat related problem. In a maintenance plan I was doing a
rebuild on the indexes for the database. That maintenance plan never
completed (it hung up), and now my data file size is about 4 GB (it was about
500 MB) with basically no free space. I've run DBCC CHECKDB along with about
everything else I can think of, and I can't find any way to get that file
size back down.
Thanks!|||Hi Gary
Is this the data file or log file? If it is the log file you can use BACKUP
LOG ... WITH TRUNCATE_ONLY and then DBCC SHRINKFILE.
John
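John's suggestion for a bloated log file, sketched as runnable T-SQL (SQL 2005 syntax; TRUNCATE_ONLY was deprecated and later removed in SQL 2008). The database and logical file names are placeholders:

```sql
-- Placeholders: TESTDB / TESTDB_LOG. Truncate the log, then shrink the file.
BACKUP LOG TESTDB WITH TRUNCATE_ONLY;
GO
DBCC SHRINKFILE (N'TESTDB_LOG', 100);  -- target size in MB
GO
```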
|||John,
No, it's the data file. I've tried to manually rebuild/reorganize
individual indexes, and that's not really working, either.
Gary
|||Hi
You can use sp_spaceused to see what is being used by the file, and DBCC
SHRINKFILE will shrink it. In general it is not a good idea to continually
expand/shrink the files.
John
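A sketch of John's sp_spaceused/SHRINKFILE advice (database and logical file names are placeholders):

```sql
-- Placeholders: TESTDB / TESTDB_DATA.
USE TESTDB;
GO
EXEC sp_spaceused;                       -- reserved vs. unallocated space
GO
DBCC SHRINKFILE (N'TESTDB_DATA', 500);   -- target size in MB
GO
```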
|||I came across this problem too. What I did was to restart the service
and executed DBCC SHRINKFILE again. That time it succeeded.|||Hi,
I got the same error.
When I moved all of the user tables and indexes out of the primary FG and into a new
FG, my PRIMARY was left with around 17GB, of which less than 1GB is actually
used. When I ran shrink file on the only file in that FG, it took ages,
so I stopped it (in 2000 it would have taken at most 2-3 hours; our SAN is not
that bad :-) ). When I ran it again I got the error described.
When I back up the db and restore it, then I can shrink the file.
This is a db in simple mode, dev environment. Worried what happens when we go
to production with this.
error:
File ID 1 of database ID X cannot be shrunk as it is either being shrunk by
another process or is empty.
Msg 0, Level 11, State 0, Line 0
A severe error occurred on the current command. The results, if any, should
be discarded.
CTS DBA
|||I tried detaching and re-attaching the database file and running DBCC SHRINKFILE.
But I wouldn't recommend shrinking. Here's an article by the great Tibor
Karaszi on database shrinking
http://www.karaszi.com/SQLServer/info_dont_shrink.asp
|||Hi,
Thanks for the reply.
I know what you mean, but what I don't understand is why SHRINKFILE wouldn't
work when there's so much free space in the file (over 8GB).
If the original database is shrunk in SQL 2000, it works perfectly.
Thanks
--
CTS DBA

|||I had a similar problem with the data file shrinking, but I think I solved the
problem in a safer way (without restoring or detaching the database and without
a service restart):
1) Backup database log
2) Run sp_helpfile and note the size of the data file (6444160 KB, for
example)
3) Run ALTER DATABASE and use the file size number slightly bigger than
actual file size:
ALTER DATABASE TESTDB
MODIFY FILE
( NAME = TESTDB_DATA
, SIZE = 6444165KB )
4) Now shrink file command works:
USE TESTDB
GO
DBCC SHRINKFILE (N'TESTDB_DATA' , TRUNCATEONLY)
GO
Let me know if this worked for anybody else.


Error when running the DTSRUN command

Hi
We're using SQL Server 2000.

I want to run a DTS package using the DTSRUN command.
The command is inside a ".bat" file which my application runs
from a SQL client.
No matter how I run it from the client, it fails with the message:

[ConnectionOpen (Connect()).]SQL Server does not exist or access denied

I've tried to run it on the client from the application and from the
command prompt, and it fails.
I went into Enterprise Manager on the client and tried to run the DTS package from
there. The run failed already on the 1st step, which is a call to a
stored procedure, also giving me the same message. This really struck me
as weird, because in Enterprise Manager I'm accessing the server successfully, so
why the above message? It identifies the server and I am accessing it.

Because it failed on the 1st step, which is a call to a stored procedure,
I tried running the sp from within Query Analyzer and it succeeded.

Has anyone got any idea what the problem is and how to solve it?

Thanks !

David Greenberg

David Greenberg (davidgr@.iba.org.il) writes:



You can't accuse me of knowing too much about DTS, but I see that a
DTS package has connection properties. I would assume that you can
have a package on server A, and run it against server B.

So I would check the connection properties of the package (or the step,
as different tasks surely must be able to connect to different servers).

The error message simply means that you are trying to connect to a
server that is not available. (Not running, misspelled, a firewall
in the way etc.)

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx
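For reference, a minimal DTSRUN sketch with an explicit server; the server name, credentials, and package name are placeholders, and which server /S points at is exactly what Erland suggests checking:

```bat
REM SQL authentication (placeholders throughout):
DTSRun /S SERVER_B /U myuser /P mypassword /N "MyPackage"

REM Windows (trusted) authentication:
DTSRun /S SERVER_B /E /N "MyPackage"
```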

Monday, March 26, 2012

Error when running DTS package as job

I have a DTS package which deletes some rows in a table and then inserts new ones from a Dbase file.

When I execute the package it works perfectly, but when I schedule it as a job it gives this error message:

Executed as user: LOMMELDOM\Administrator. ...DTSStep_DTSExecuteSQLTask_2 DTSRun OnFinish: DTSStep_DTSExecuteSQLTask_2 DTSRun OnStart: Copy Data from GRES to [ZNA].[dbo].[GRES] Step DTSRun OnError: Copy Data from GRES to [ZNA].[dbo].[GRES] Step, Error = -2147221164 (80040154) Error string: Class not registered Error source: Microsoft Data Transformation Services (DTS) Package Help file: sqldts80.hlp Help context: 700 Error Detail Records: Error: -2147221164 (80040154); Provider Error: 0 (0) Error string: Class not registered Error source: Microsoft Data Transformation Services (DTS) Package Help file: sqldts80.hlp Help context: 700 Error: -2147221164 (80040154); Provider Error: 0 (0) Error string: Class not registered Error source: Microsoft Data Transformation Services (DTS) Package Help file: sqldts80.hlp Help context: 4700 Error: -2147221164 (... Process Exit Code 5. The step failed.

The deleting of the rows works but the import from the Dbase file doesn't. I thought it was a security issue, so I made sure the Job Agent has all the rights it needs to access the directory of the Dbase file.

Any clues?

You are using a custom class / transform / task that is not registered on the server. You will need to register it through the GAC...|||But how come it works perfectly on the same server when I just execute the package? And on the same server it fails when the Job Agent runs it?|||

You are executing it on YOUR machine, even if it is hosted on the server.

Edit: Unless if you are running it from the job by right clicking and hitting execute job starting at step...

|||

Hi Eric,

Thanks for the reply.

Euh, my machine = server so there is no other machine around.

It fails when scheduled, but it also fails when I right click the job and say execute. It only works if I right click the package and execute that! Further, there are no other steps in the job, and the job was created by right clicking on the package and scheduling it.

|||Looking a bit more closely at your error messages, are you running SSIS (SQL 2005)? It kind of looks like you are still using SQL 2000 and DTS...|||Yes I'm using SQL 2000 with DTS, that's why I also mentioned this in the title of my question. I didn't find a DTS-specific newsgroup.|||Try this one: http://groups.google.com/group/microsoft.public.sqlserver.dts?lnk=srg
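One hedged way (not suggested in the thread) to narrow down "works interactively, fails as a job": reproduce the Agent's context by running the package on the server itself from a command prompt while logged on as the SQL Server Agent service account, so the missing registration shows up outside the Agent. The package name is a placeholder:

```bat
REM Run on the server, logged on as the Agent service account (hypothetical name).
DTSRun /S "(local)" /E /N "MyDbaseImportPackage"
```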

Error when running DTS package as job

I have a DTS package which deletes some rows in a table and then inserts new ones from a Dbase file.

When I execute the package it works perfectly, but when I shedule it as a job it gives this error message:

Executed as user: LOMMELDOM\Administrator. ...DTSStep_DTSExecuteSQLTask_2 DTSRun OnFinish: DTSStep_DTSExecuteSQLTask_2 DTSRun OnStart: Copy Data from GRES to [ZNA].[dbo].[GRES] Step DTSRun OnError: Copy Data from GRES to [ZNA].[dbo].[GRES] Step, Error = -2147221164 (80040154) Error string: Class not registered Error source: Microsoft Data Transformation Services (DTS) Package Help file: sqldts80.hlp Help context: 700 Error Detail Records: Error: -2147221164 (80040154); Provider Error: 0 (0) Error string: Class not registered Error source: Microsoft Data Transformation Services (DTS) Package Help file: sqldts80.hlp Help context: 700 Error: -2147221164 (80040154); Provider Error: 0 (0) Error string: Class not registered Error source: Microsoft Data Transformation Services (DTS) Package Help file: sqldts80.hlp Help context: 4700 Error: -2147221164 (... Process Exit Code 5. The step failed.

The deleting of the rows works but the import from the Dbase doens't, I thought it was a security issue. So I made sure the JobAgent has all the rights it needs to access the directory of the Dbase file.

Any clues?

You are using a custom class / transform / task that is not registered on the server. You will need to register it through the GAC....|||But how come it works perfectly on the same server when I just execute the package? And on the same server it fails if job agent does it?|||

You are executing it on YOUR machine, even if it is hosted on the server.

Edit: Unless if you are running it from the job by right clicking and hitting execute job starting at step...

|||

Hi Eric,

Thanks for the reply.

Euh, my machine = server so there is no other machine around.

It fails when scheduled, but it also fails when I right-click the job and say execute. It only works if I right-click the package and execute that! Further, there are no other steps in the job, and the job was created by right-clicking on the package and scheduling it.

|||Looking a bit more closely at your error messages, are you running SSIS (SQL 2005)? It kind of looks like you are still using SQL 2000 and DTS...|||Yes, I'm using SQL 2000 with DTS; that's why I also mentioned this in the title of my question. I didn't find a DTS-specific newsgroup.|||Try this one: http://groups.google.com/group/microsoft.public.sqlserver.dts?lnk=srg

Friday, March 23, 2012

Error when Logging on - Error: 18456, Severity: 14, State: 16

Hi All,
I am getting the following error in the ERRORLOG file when I am trying to connect to SQL Server 2005 Express Edition from the command line:

2007-03-23 07:30:05.54 Logon Error: 18456, Severity: 14, State: 16.
2007-03-23 07:30:05.54 Logon Login failed for user 'sa'. [CLIENT: xx.xx.xx.xx]

I have a fresh install of SQL Server on my local machine, which is probably the source of my angst, but I am trying to create a database from scratch using a batch file that is run from the command line. The batch file looks like this:

<snip>
SQLCMD -U %USR% -P %PWD% -S %SVR% -d %DBF% -i CreateDatabase.sql
<snip>

The values of %USR% and %PWD% are correct, as I frequently log into Management Studio with those details; %SVR% is "<ComputerName>\SQLEXPRESS,1433" and %DBF% is <DatabaseName>. The contents of CreateDatabase.sql, when run in Management Studio, successfully create the database that I require, so I know the error doesn't lie there.

The command-line error reads as follows:

Msg 4060, Level 11, State 1, Server <ComputerName>\SQLEXPRESS, Line 1
Cannot open database "<DatabaseName>" requested by the login. The login failed.
Msg 18456, Level 14, State 1, Server <ComputerName>\SQLEXPRESS, Line 1
Login failed for user 'sa'.

I have a feeling that it is related to 'sa' not having the correct permissions with regards to logging into or creating a database from scratch, but I find it odd that I can log in as 'sa' in Management Studio and create the database from there, yet when I try it from the command line (despite having TCP/IP enabled) it won't let me connect to create a database.

Is there something I am missing here? I have been banging my head off of walls for days over this and am unsure of the next steps to take.

If any of this screams "why the hell are you doing this?" then please let me know, as at the moment I am punching in the dark with this problem. I have searched Usenet, Google Groups, and the Web, and although I have found potential solutions to this particular problem, nothing I have done so far has actually managed to fix the issue.

Related articles I have followed are:

Login failed for user 'x'
http://msdn2.microsoft.com/en-us/library/ms366351.aspx

Change Server Authentication Mode
http://msdn2.microsoft.com/en-us/library/ms188670.aspx

Upgrading MSDE 2000 to SQL Server 2005 Express [Enabling network protocols section]
http://www.microsoft.com/technet/pr...e2sqlexpress...

NOTE: I just listed these articles to show what path I am currently looking at with regards to this problem, but again I stress I have found nothing yet to get round this problem, and any further articles or direction would be much appreciated.

Yours Distressingly

Alastair|||I'm still not clear on what database is used for -d <DatabaseName>, and this could be one of the problems. If you are trying to log in using the database you are about to create, then that's likely the issue, as it doesn't exist when logging in. Try specifying master for the initial login database.
-Sue
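In script form, Sue's fix amounts to pointing the initial login database at master (which always exists) and letting CreateDatabase.sql create the target. A minimal sketch of the corrected invocation, shown here as a hypothetical Python helper rather than the original batch file:

```python
def build_sqlcmd(usr, pwd, svr, script="CreateDatabase.sql"):
    """Build the sqlcmd argument list, logging into master because
    the target database does not exist yet at connect time."""
    return ["sqlcmd", "-U", usr, "-P", pwd, "-S", svr,
            "-d", "master", "-i", script]

# Hypothetical values standing in for %USR%, %PWD%, %SVR%
cmd = build_sqlcmd("sa", "secret", r"MYBOX\SQLEXPRESS,1433")
# e.g. execute with subprocess.run(cmd, check=True)
```

The only change from the failing batch file is `-d master` in place of `-d %DBF%`.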
On 23 Mar 2007 08:48:10 -0700, "Alastair" <alastair_anderson@.hotmail.com> wrote:
[quoted message omitted; it repeats the original post above]|||On 25 Mar, 16:17, Sue Hoegemeier <S...@.nomail.please> wrote:
> I'm still not clear on what database is used for -d
> <DatabaseName> and this could be one of the problems. If you
> are trying to log in using the database you are about to
> create, then that's likely the issue as it doesn't exist
> when logging in. Try specifying master for the initial login
> database.
> -Sue
> On 23 Mar 2007 08:48:10 -0700, "Alastair"
>
> <alastair_ander...@.hotmail.com> wrote:
You managed to hit the nail on the head.
Running my current script when the new database already exists works fine; however, on a fresh install of SQL Server, when the database doesn't yet exist, it doesn't work.
Thanks for your help
Alastair

Error when installing SQL Server Express

Hi All,

I have a problem when installing SQL Server Express Advanced Edition

It crashes while installing Database Services; the log file contains this message:

.....

<Func Name='GetSkuIt'>
GetServiceUserGroup failed for SQLEXPRESS, 5
Error Code: 0x80070534 (1332)
Windows Error Text: No mapping between account names and security IDs was done.
Source File Name: sqlca\sqlcax.cpp
Compiler Timestamp: Fri Feb 9 22:35:05 2007
Function Name: SetInstanceProperty
Source Line Number: 1223

Error Code: 1332
MSI (s) (E0!E8) [10:34:39:288]: Product: Microsoft SQL Server 2005 -- Error 29528. Setup has encountered an unexpected error during Setting Internal Properties. The error is: Fatal error during installation.

Error 29528. Setup has encountered an unexpected error during Setting Internal Properties. The error is: Fatal error during installation.

<EndFunc Name='LaunchFunction' Return='1332' GetLastError='203'>
Action ended 10:34:39: SetInstanceProperty.D20239D7_E87C_40C9_9837_E70B8D4882C2. Return value: 3.
Action ended 10:34:39: INSTALL. Return value: 3.
.......

Can someone help me ?

Thx

Hi,

This might help: http://support.microsoft.com/default.aspx/kb/925976.

|||

No, it won't.

Maybe my Windows was in a bad state.

I've seen that there is no group in the local groups... and a lot of other bad news.

I formatted...

Thx for your help.

Error when installing

I'm trying to install MS SQL Server 2000 components and I continuously get the same error: "A previous program installation created pending file operations on the installation machine. You must restart the computer before running setup." I've rebooted 100 times, turned the machine on and off, installed other programs, etc., and still I get the error ONLY when I try to install the components.

Please Help!!!!

Thanks.

Moved to the setup group for a better chance of a response.|||

try...

http://support.microsoft.com/default.aspx?scid=http://support.microsoft.com:80/support/kb/articles/q290/9/91.asp&NoWebContent=1

|||

Hi, this should help:

http://support.microsoft.com/default.aspx?scid=kb;en-us;312995

HTH, Jens Suessmeyer.

http://www.sqlserver2005.de


error when importing text file - saving in notepad first works

I'm trying to import a tab-separated text file into SQL Server 2005 using the Import and Export Wizard. But when running the job I get the error message:

Error 0xc02020c5: Data Flow Task: Data conversion failed while converting column "Column 19" (67) to column "Column 19" (404). The conversion returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
(SQL Server Import and Export Wizard)

The column 19 which reported a problem contains this information:
?searchroot=/_gen_/txt/&template=std.htm&use=prospect&intref=1_26067

However, what is mysterious is that if I open the file in Notepad or Excel and resave it, the job runs perfectly. This is not a workable fix going forward, since it's an automatic job that will run each night on a new text file.

The text file is sent from Norway to Sweden - and I use ANSI latin 1 when importing.

The column has datatype DT_STR with a width of 500.

I use the Swedish locale, and when I save in Notepad it is saved as ANSI.

I use Windows XP Swedish version.

Try DT_WSTR instead, as it's Unicode.

There could be a control character in that field that gets thrown away when you save in Notepad.|||
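The stray-control-character theory can be checked directly: scan the suspect column for non-printable bytes before the file ever reaches the wizard. A small sketch (the column index and sample record are hypothetical; SSIS's "Column 19" is taken as zero-based index 18 here):

```python
def find_control_chars(line, column=18, sep="\t"):
    """Return any control characters found in the given zero-based
    column of one tab-separated record."""
    fields = line.rstrip("\r\n").split(sep)
    if column >= len(fields):
        return []
    # Anything below 0x20 is invisible in Notepad but can make a
    # DT_STR conversion fail with "potential loss of data".
    return [c for c in fields[column] if ord(c) < 0x20]

# Example: a record whose column 19 carries a stray SUB (0x1A) byte
record = "\t".join(["col"] * 18 + ["?searchroot=/_gen_/txt/\x1a&use=prospect"]) + "\n"
print(find_control_chars(record))  # -> ['\x1a']
```

Running this over the nightly file would show whether Notepad's resave is silently dropping such bytes.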

Had the same error importing a text file. The error was resolved by changing the datatype as you advised. Thanks!


Error when importing data from Excel File

Hi!

This is what I'm doing:

IF EXISTS (SELECT srvname FROM master.dbo.sysservers srv
    WHERE srv.srvid != 0 AND srv.srvname = N'ExcelSource')
    EXEC master.dbo.sp_dropserver @server = N'ExcelSource', @droplogins = 'droplogins';

EXEC master.dbo.sp_addlinkedserver
    @server = 'ExcelSource',
    @srvproduct = 'Excel',
    @provider = 'Microsoft.Jet.OLEDB.4.0',
    @datasrc = @Chemin,
    @provstr = 'Excel 8.0';

EXEC master.dbo.sp_addlinkedsrvlogin
    @rmtsrvname = 'ExcelSource',
    @useself = false,
    @locallogin = NULL,
    @rmtuser = 'ADMIN',
    @rmtpassword = NULL;

SET @NomServ = 'ExcelSource';

This creates a linked server to read my Excel file.

Then I'm doing this:

EXEC ('INSERT INTO Elements (No_element, Nom_elem, Desc_elem, Code_grpe_classe, Tps_elem, Code_sgrpe, Code_produit)
    SELECT No_element, Nom_elem, Desc_elem, Code_grpe_classe, Tps_elem, Code_sgrpe, Code_produit
    FROM ' + @NomServ + '...[Elements$];')

This is where I got an error. The error is:
The OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "ExcelSource" does not contain the table "Elements$". The table either does not exist or the current user does not have permissions on that table.

I can't figure out what I'm missing. I've added permissions for EVERYONE on the file and on the folder just to be sure, and I still have the same error. How can I check whether the table [Elements$] exists?
I forgot to mention that my Excel worksheet name is Elements.|||I finally found what the problem was. One of my columns had the wrong name. Now it's working fine!

Wednesday, March 21, 2012

Error when file is saved to PDF

I am calling a procedure that will save a file to PDF. It runs successfully, but when I go to look at the PDF, I get the message, "Acrobat Reader could not open 'Doc.pdf' because it is either not a supported file type or because the file has been damaged."

This is how I'm calling the procedure:

SaveFile(Dts.Variables("varSSRS_URL").Value.ToString() + "?/TestReports/MyReport&rs:Command=Render&rs:Format=PDF", "C:\" + "MyReport.pdf")

Ifaber,

Is this an SSIS-related issue? If so, please provide more details about the error, which task, etc...

|||I'm with Rafael. I cannot find a SaveFile function when searching in an SSIS script component.|||

Yes, this is a SSIS issue. I put this script in a Script Task.

See http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1173906&SiteID=1&PageID=1 for reference. I have jumped through a lot of errors, and this is my last problem.

|||

ifaber wrote:

Yes, this is a SSIS issue. I put this script in a Script Task.

See http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1173906&SiteID=1&PageID=1 for reference. I have jumped through a lot of errors, and this is my last problem.

In that case, you're going to have to provide your script for us to look at. The link you provided doesn't serve as a valid foundation for your question.

My question though, is where is the PDF data stream? You can't create a PDF file by merely adding a .pdf extension. The data going into SaveFile needs to be a PDF binary stream, I would think.|||

I don't know anything about a PDF data stream. I read somewhere else on the net that 'rs:Command=Render&rs:Format=PDF' would work.

This is the sample that I got and used:

' Sample call
SaveFile(Dts.Variables("varSSRS_URL").Value.ToString() + "?%2fYOUR_SSRS_FOLDER%2fYOUR_REPORT_NAME&rs:Command=Render&rs:Format=PDF", outpath + "FILENAME.pdf")

' Note that you can replace "EXCEL" with "CSV" or "PDF" or any other supported export format

' The get-and-save-file method
Protected Sub SaveFile(ByVal url As String, ByVal localpath As String)
    Dim loRequest As System.Net.HttpWebRequest
    Dim loResponse As System.Net.HttpWebResponse
    Dim loResponseStream As System.IO.Stream
    Dim loFileStream As New System.IO.FileStream(localpath, System.IO.FileMode.Create, System.IO.FileAccess.Write)
    Dim laBytes(256) As Byte
    Dim liCount As Integer = 1

    Try
        loRequest = CType(System.Net.WebRequest.Create(url), System.Net.HttpWebRequest)
        loRequest.Credentials = New System.Net.NetworkCredential(Dts.Variables("varSSRS_LOGIN").Value.ToString(), Dts.Variables("varSSRS_PASSWORD").Value.ToString(), Dts.Variables("varSSRS_DOMAIN").Value.ToString())
        loRequest.Timeout = 1000 * 60 * 15 ' timeout 15 minutes
        loRequest.Method = "GET"
        loResponse = CType(loRequest.GetResponse(), System.Net.HttpWebResponse)
        loResponseStream = loResponse.GetResponseStream()
        ' Copy the response stream to the local file in 256-byte chunks
        Do While liCount > 0
            liCount = loResponseStream.Read(laBytes, 0, 256)
            loFileStream.Write(laBytes, 0, liCount)
        Loop
        loResponseStream.Close()
    Catch ex As Exception
        ' Swallowing the exception here hides download failures;
        ' consider logging ex.Message or raising a package error instead.
    Finally
        loFileStream.Flush()
        loFileStream.Close()
    End Try

End Sub
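Since Acrobat reports the file as damaged, one quick diagnostic is to check whether the bytes that came back are PDF data at all: if the SSRS URL or credentials are wrong, the response is typically an HTML error page that still gets written out with a .pdf extension. A sketch of such a check (the file path is hypothetical):

```python
def looks_like_pdf(path):
    """A real PDF begins with the magic bytes '%PDF-'; an HTML error
    page saved with a .pdf extension does not."""
    with open(path, "rb") as f:
        return f.read(5) == b"%PDF-"
```

If this returns False for MyReport.pdf, opening the file in a text editor will usually reveal the actual SSRS error inside the HTML.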

|||Unless you're having issues with the variables, this looks like it's an SSRS issue....

I think the thread already going in the SSRS forum would be more beneficial.

http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1199641&SiteID=1|||I had issues with the variables because I had a typo. Now it just won't create the PDF.|||

ifaber wrote:

I had issues with the variables because I had a typo. Now it just won't create the PDF.

Yep, then we might want to look at moving this back to the SSRS forum for starters. The code you are building isn't specific to SSIS (though you might be using SSIS variables) so we are likely not going to be able to help you any further.|||Ok...thanks|||

ifaber wrote:

Ok...thanks

Moved to the SSRS Forum to see if anyone over here has any ideas for this custom code.

ifaber is trying to build a PDF file using a reporting services call. The user is trying to perform this in an SSIS task, but the code the user has built isn't SSIS specific.

Thanks,
Phil Brammer
SSIS Moderator


Error when doing fuzzy lookup

I am trying to run an SSIS package that contains a fuzzy lookup. I am using a flat file with about 7 million records as the input. The reference table has about 2,000 records. The package fails after about 40,000 records with the following information:

Warning: 0x8007000E at Data Flow Task, Fuzzy Lookup [228]: Not enough storage is available to complete this operation.
Warning: 0x800470E9 at Data Flow Task, DTS.Pipeline: A call to the ProcessInput method for input 229 on component "Fuzzy Lookup" (228) unexpectedly kept a reference to the buffer it was passed. The refcount on that buffer was 2 before the call, and 1 after the call returned.
Error: 0xC0047022 at Data Flow Task, DTS.Pipeline: The ProcessInput method on component "Fuzzy Lookup" (228) failed with error code 0x8007000E. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
Error: 0xC0047021 at Data Flow Task, DTS.Pipeline: Thread "WorkThread0" has exited with error code 0x8007000E.
Error: 0xC02020C4 at Data Flow Task, Flat File Source [1]: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
Error: 0xC0047039 at Data Flow Task, DTS.Pipeline: Thread "WorkThread1" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.
Error: 0xC0047021 at Data Flow Task, DTS.Pipeline: Thread "WorkThread1" has exited with error code 0xC0047039.
Error: 0xC0047038 at Data Flow Task, DTS.Pipeline: The PrimeOutput method on component "Flat File Source" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Error: 0xC0047021 at Data Flow Task, DTS.Pipeline: Thread "SourceThread0" has exited with error code 0xC0047038.

-

I have tried many things - changing the BufferTempStoragePath to a drive that has plenty of space, changing the MaxInsertCommitSize to 5,000...

What else can I do?

Thanks!

Did you get this on an RTM build?

If yes, is it possible you apply SP1 to see whether that helps?

Thanks

wenyang

|||

Yes, it is on an RTM build.

The server does not have SP1 on it yet. Is there something else that I could try before that is installed?

Thanks!

|||

I know there was a memory issue fixed in SP1 for Fuzzy Lookup, and the error looks very similar to yours, so I wanted you to try SP1.

For temporary workarounds, you may also try one of the following two on your RTM build:

1) Increase your page file to >10GB

2) Mitigate the problem by splitting the input into smaller batches (e.g. 100k at a time) and tearing down the package in between each batch.

HTH

wenyang
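Workaround 2 can be done outside SSIS as a pre-processing step: split the 7-million-row flat file into fixed-size batches and run the package once per batch. A sketch (the file names, encoding, and batch size are hypothetical):

```python
def split_file(src, rows_per_batch=100_000, prefix="batch_"):
    """Split a flat file into numbered chunk files of at most
    rows_per_batch lines each; returns the chunk file names."""
    def flush(batch, n, chunks):
        name = f"{prefix}{n}.txt"
        with open(name, "w", encoding="latin-1") as out:
            out.writelines(batch)
        chunks.append(name)

    chunks, batch, n = [], [], 0
    with open(src, "r", encoding="latin-1") as f:
        for line in f:
            batch.append(line)
            if len(batch) == rows_per_batch:
                flush(batch, n, chunks)
                batch, n = [], n + 1
    if batch:  # trailing partial batch
        flush(batch, n, chunks)
    return chunks
```

Each chunk can then be fed to the package in turn, tearing the package down between batches as suggested above.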

|||

I have installed SP1, and am getting the same errors.

Any other suggestions besides the temporary workarounds you mentioned? As I said earlier, the process is stopping after only 40,000 records or so. That seems like a very small number, and I'd hate to have to split the input into pieces smaller than that.

Thanks!

|||

Does increasing the page file size help? If not, then we probably need to repro in-house to investigate further.

Could you log a bug with a repro? (The bug report site is http://lab.msdn.microsoft.com/productfeedback/default.aspx.) We'll get back to you ASAP.

Thanks

Wenyang

|||

Sorry - I was told that SP1 had been installed before I wrote my previous message. It actually hadn't been installed at that point. Once that was successfully installed on Integration Services, the fuzzy lookup issues seem to have been resolved.

Thanks!

|||

Hi Wenyang,

I have installed SQL Server 2005 Service Pack 2 and unfortunately this error does not seem to have gone away.

Increasing my page file does not seem to have helped the situation either.

It seems to occur whenever I choose to perform a match on more than 10 columns within a Fuzzy Lookup.

Can you please advise what I should do next, or when this will be fixed?

Many Thanks,

Gary


Monday, March 19, 2012

Error when deleting

I have an MSDE back end for my application.
I also have an Access data file with linked tables to the SQL Server one.
My app accesses the data through the Access file via ODBC.

I have a data control, and when I try to delete a record like so:
DataControl.Recordset.Delete
I get an 'ODBC Call Failed' error.

Any ideas why?
The recordset's updateable property is true.

Thanks.|||THAT is the reason why I hate bound controls! Throw them away, replace dynamic calls with sp's, and live happily ever after! You think you have problems now while developing? Try putting it into prod, with more than 1 user, and you'll see what I mean (this is not an attack, just my heart screaming "WATCH OUT! THIS IS NOT A LIGHT AT THE END OF THE TUNNEL! IT IS A TRAIN!")|||Have you posted this on the Access forum for input?

Have you considered changing your Access interface to an Access Data Project? It synchronizes with SQL Server much better.

Have you checked to make sure that each of your linked tables has an enforced unique key and that Access was aware of this key when your table was linked?

Have you checked that your login has permission to delete those records?
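If permissions turn out to be the culprit, they can be inspected and granted directly in T-SQL. A minimal sketch — the table and user names (dbo.Orders, AppUser) are hypothetical placeholders, not from this thread:

```sql
-- Show current object permissions (works on MSDE / SQL Server 2000):
EXEC sp_helprotect 'dbo.Orders';

-- Grant the application's database user the right to delete rows:
GRANT DELETE ON dbo.Orders TO AppUser;
```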

Have you checked whether the records in the table are participating in non-cascading relationships with other tables?|||...Have you checked it's plugged in?

Have you checked its pulse?

Have you checked your pulse?

...Have you checked if the neighborhood bar has started the happy hour? Go check it! It helps me!|||Here is my ms_sql_dba approved response:

OBVIOUSLY it is plugged in, or he wouldn't be getting an error message! But you didn't THINK about that, did you? And computers don't HAVE pulses, which you'd know if you bothered to read the manual!|||...?...|||...That was a joke...though I know I was accused of making my jokes somewhat untranslatable at times...But hey, that's what made me what I am, so there you have it!|||Originally posted by blindman
Here is my ms_sql_dba approved response:

OBVIOUSLY it is plugged in, or he wouldn't be getting an error message! But you didn't THINK about that, did you? And computers don't HAVE pulses, which you'd know if you bothered to read the manual!

Incorrigible

Friday, March 9, 2012

Error when attempting to backup to a second disk file

In SQL Server 2005, via the GUI, I wish to backup a database to an additional disk file (there is already an existing backup disk file for this database), so that I can have more than one backup. I've added the new disk file name, highlighted it, and clicked OK.

I get an immediate error (see below). Note, the 2nd error message is specifying the existing backup disk file, not the new one I'm attempting to create.

"Backup failed for Server 'WCS-DEV-TPA'. (Microsoft.SqlServer.Smo)"

"System.Data.SqlClient.SqlError: The volume on device 'D:\Program Files\Microsoft SQL Server\MSSQL\BACKUP\WCS_ADV_Longmont.bak' is not part of a multiple family media set. BACKUP WITH FORMAT can be used to form a new media set. (Microsoft.SqlServer.Smo)"

Does anyone know what causes this and how to correct it?

Try removing all but the new disk file name. Also you can just copy the first backup(s) to another drive.|||

While that would be a good workaround, it won't meet my needs.

(a) The users will need to make multiple backups and retain several older versions in order to be able to restore a specific version if necessary. (b) The users who will be making the backups will not have access to the file system on the computer that will host the backup disk files.

Is there a way to correct this problem so that we can have multiple disk backups via the GUI? (The users in question won't be knowledgeable (or privileged) enough to create SQL scripts to do the backups on this 'ad-hoc' basis.)

|||

From another post:

You can use the UI to create a Maintenance Plan with multiple backups. If these are ad-hoc backups, they'll have to be done separately.

Kevin Farlee

Answer Re: Backup error - not part of a multiple family media set

When you back up to two files, you are creating a stripe set. The restriction is that all backups sent to a media set must have the same number of stripes. That's the meaning of your error.

You need to create a media set with the number of stripes you want to use. You can't add members later.

The method for creating a new media set is to use the WITH FORMAT option on your backup command. This is the equivalent of reformatting a tape for backups. It wipes out any previous data in the file(s), and sets up the headers correctly.

So, if you issue the same command in your script, adding WITH FORMAT for ONE TIME ONLY, the first time you use that media family, you should be good to go. From then on, you can just use it as your script is now.
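A sketch of what that looks like in T-SQL, assuming a hypothetical database MyDB striped across two illustrative file names (not from this thread):

```sql
-- One time only: create the two-file media set, wiping any prior contents.
BACKUP DATABASE MyDB
    TO DISK = N'D:\Backups\MyDB_stripe1.bak',
       DISK = N'D:\Backups\MyDB_stripe2.bak'
    WITH FORMAT;

-- From then on, append backups to the same media set without FORMAT.
-- Every backup must go to the same two files (same number of stripes).
BACKUP DATABASE MyDB
    TO DISK = N'D:\Backups\MyDB_stripe1.bak',
       DISK = N'D:\Backups\MyDB_stripe2.bak';
```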


Microsoft SQL Server Storage Engine PM

|||

[1] This is for use in ad-hoc backups and needs to be done by fairly inexperienced users, as desired, before running application software that will modify the database.

[2] We don't want to back up the database to a multiple disk file set - we want to be able to specify a new disk file to receive the database backup, giving us the capability of keeping multiple versions of the database (based on the backup snapshot in each disk file) and restoring (if necessary) from any one of the backup disk files. Doing this through the GUI is necessary because it will be done by less privileged and less knowledgeable users.

|||

Sorry I can't be of more help.

The only solution I can see is for your users to use the UI to back up to only one "location" at a time. They will just have to use the UI twice, each time backing up to one location.

|||Does this mean that the problem I'm reporting is a known flaw (bug?) in the SQL Server management GUI? It can't be corrected to allow the specification of several backup disk files?|||

I would just say that the Backup Database dialog only allows for one backup at a time, to either a single file or device, or to a multiple-family media set. It has to be used twice for two backups.

You can use the Maintenance Plan UI to create a plan with two backup tasks, to two different folders, that run in series and you can schedule that plan.

The UI for the Maintenance Plan - Backup Database Task is similar to the Backup Database dialog:

|||

OK, I see. It looks like we'll have to re-evaluate how the permissions will be allocated to users in consideration of this. Thank you (all of y'all) for your help.

Regards,

Douglas

|||

Help me understand more about what you're trying to accomplish:

Are you trying to get two files, each containing exactly the same backup content?

or are you wanting to have a collection of backups of the same database at different times?

If you want the former, you want to use the MIRROR TO clause in the Backup command. I can't find a button on the GUI for it, but if you set up a backup, and specify two files, then click on the "script" button, you'll get a backup command with two disk files separated by a comma. i.e.

BACKUP DATABASE foo TO DISK = N'C:\somefile.bak', DISK = N'D:\AnotherFile.BAK' WITH FORMAT;

If you replace the comma with 'MIRROR TO', as in

BACKUP DATABASE foo TO DISK = N'C:\somefile.bak' MIRROR TO DISK = N'D:\AnotherFile.BAK' WITH FORMAT;

then hit "GO", you'll get two copies of a backup of database foo instead of a stripe set.

|||We're wanting to "have a collection of backups of the same database at different times".
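For that goal, one possible approach (the database name and path below are illustrative, not from this thread) is to write each backup to its own date-stamped file, so each file holds exactly one version and any of them can be restored independently:

```sql
-- Build a file name containing the current date and time, then back up to it.
-- WITH INIT ensures the file holds only this one backup.
DECLARE @file nvarchar(260);
SET @file = N'D:\Backups\MyDB_'
    + CONVERT(nvarchar(8), GETDATE(), 112)                         -- yyyymmdd
    + N'_'
    + REPLACE(CONVERT(nvarchar(8), GETDATE(), 108), ':', '')       -- hhmmss
    + N'.bak';

BACKUP DATABASE MyDB TO DISK = @file WITH INIT;
```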