I am trying to run an SSIS package that contains a Fuzzy Lookup. I am using a flat file with about 7 million records as the input. The reference table has about 2,000 records. The package fails after about 40,000 records with the following information:
Warning: 0x8007000E at Data Flow Task, Fuzzy Lookup [228]: Not enough storage is available to complete this operation.
Warning: 0x800470E9 at Data Flow Task, DTS.Pipeline: A call to the ProcessInput method for input 229 on component "Fuzzy Lookup" (228) unexpectedly kept a reference to the buffer it was passed. The refcount on that buffer was 2 before the call, and 1 after the call returned.
Error: 0xC0047022 at Data Flow Task, DTS.Pipeline: The ProcessInput method on component "Fuzzy Lookup" (228) failed with error code 0x8007000E. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
Error: 0xC0047021 at Data Flow Task, DTS.Pipeline: Thread "WorkThread0" has exited with error code 0x8007000E.
Error: 0xC02020C4 at Data Flow Task, Flat File Source [1]: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
Error: 0xC0047039 at Data Flow Task, DTS.Pipeline: Thread "WorkThread1" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.
Error: 0xC0047021 at Data Flow Task, DTS.Pipeline: Thread "WorkThread1" has exited with error code 0xC0047039.
Error: 0xC0047038 at Data Flow Task, DTS.Pipeline: The PrimeOutput method on component "Flat File Source" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Error: 0xC0047021 at Data Flow Task, DTS.Pipeline: Thread "SourceThread0" has exited with error code 0xC0047038.
I have tried many things - changing BufferTempStoragePath to a drive with plenty of free space, changing MaxInsertCommitSize to 5,000...
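(For reference, a minimal sketch of applying this kind of override at run time with dtexec's /SET option instead of editing the package in the designer; the package path, drive letter, and "Data Flow Task" name below are all placeholders:)

import subprocess

# Run the package with dtexec, overriding the data flow task's
# BufferTempStoragePath via /SET so different locations can be tried
# without reopening the designer. Package path and task name are
# placeholders - substitute your own.
subprocess.run([
    "dtexec", "/F", r"C:\Packages\MyPackage.dtsx",
    "/SET", r"\Package\Data Flow Task.Properties[BufferTempStoragePath];E:\SSISTemp",
], check=True)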
What else can I do?
Thanks!
Did you get this on an RTM build?
If so, could you apply SP1 to see whether that helps?
Thanks
wenyang
|||Yes, it is on an RTM build.
The server does not have SP1 on it yet. Is there something else I could try before it is installed?
Thanks!
|||I know a memory issue in Fuzzy Lookup was fixed in SP1, and the error it produced looks very similar to yours, which is why I wanted you to try SP1.
As temporary workarounds on your RTM build, you may also try one of the following:
1) Increase your page file to more than 10 GB.
2) Mitigate the problem by splitting the input into smaller batches (e.g. 100k records at a time) and tearing down the package between batches (see the sketch below).
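A minimal sketch of option 2, assuming dtexec is on the PATH; the file paths, package path, and "FlatFileInput" connection manager name are all placeholders, and the script assumes a single header row in the input file:

import itertools
import subprocess

INPUT_PATH = r"C:\Data\input.txt"               # placeholder: the 7M-record flat file
PACKAGE_PATH = r"C:\Packages\FuzzyLookup.dtsx"  # placeholder package path
CHUNK_ROWS = 100_000                            # batch size suggested above

def run_in_batches():
    with open(INPUT_PATH, "r", encoding="utf-8") as src:
        header = src.readline()  # assumes one header row; drop if the file has none
        for i in itertools.count():
            rows = list(itertools.islice(src, CHUNK_ROWS))
            if not rows:
                break
            chunk_path = rf"C:\Data\chunk_{i}.txt"
            with open(chunk_path, "w", encoding="utf-8") as out:
                out.write(header)
                out.writelines(rows)
            # Running dtexec once per chunk tears the package down between
            # batches, releasing whatever memory Fuzzy Lookup accumulated.
            # "FlatFileInput" is a placeholder connection manager name;
            # adjust it to match the package.
            subprocess.run([
                "dtexec", "/F", PACKAGE_PATH,
                "/SET",
                rf"\Package.Connections[FlatFileInput].Properties[ConnectionString];{chunk_path}",
            ], check=True)

if __name__ == "__main__":
    run_in_batches()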
HTH
wenyang
|||I have installed SP1, and am getting the same errors.
Any suggestions other than the temporary workarounds you mentioned? As I mentioned earlier, the process is stopping after only 40,000 records or so. That seems like a very small number, and I'd hate to have to split the input into pieces smaller than that.
Thanks!
|||Does increasing the page file size help? If not, then we probably need to repro this in house to investigate further.
Could you log a bug with a repro? (The bug report site is http://lab.msdn.microsoft.com/productfeedback/default.aspx.) We'll get back to you ASAP.
Thanks
Wenyang
|||Sorry - I was told that SP1 had been installed before I wrote my previous message. It actually hadn't been installed at that point. Once that was successfully installed on Integration Services, the fuzzy lookup issues seem to have been resolved.
Thanks!
|||Hi Wenyang,
I have installed SQL Server 2005 Service Pack 2 and unfortunately this error does not seem to have gone away.
Increasing my page file does not seem to have helped the situation either.
It seems to occur whenever I choose to match on more than 10 columns within a Fuzzy Lookup.
Can you please advise what I should do next, and when this will be fixed?
Many Thanks,
Gary