
Friday, March 30, 2012

nvarchar and output to flat file.

Hello,
I'm outputting a SQL table to a text file. One of the columns in this table is an nvarchar(1200), but the result in the flat file is truncated to 254 characters. How do I ensure that it outputs all characters, up to 1200?
Thanks.

John,
Which approach / utility are you using to output the SQL table to a text file?
AMB
"John Smith" wrote:

> Hello,
> I'm outputting a SQL table to a text file. One of the columns in this table
> is an nvarchar(1200), but the result in the flat file is truncated to 254
> characters. How do I ensure that it outputs all characters, up to 1200?
> Thanks
>
|||
I think I gave him a suitable answer on a separate post.

Barry
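For reference, and not part of the thread above: a truncation at 254/255 characters often points to a destination column width that has defaulted to 255 in the export tool rather than to the data itself. One hedged workaround, assuming access to the bcp utility and using made-up server, database, table, and column names, is to export the column in Unicode character mode, which preserves the full nvarchar(1200) length:

    bcp "SELECT LongColumn FROM MyDatabase.dbo.MyTable" queryout C:\export\MyTable.txt -w -T -S MyServer

If the export is done with a DTS or SSIS flat file destination instead, widening that column's output width to 1200 in the flat file connection definition is the equivalent fix.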

Wednesday, March 7, 2012

Null values when extracting data from an Excel file

Hello,

I'm trying to import some data from a spreadsheet into a database table using a package in Integration Services. The problem is that I can see the data when I open the Excel file, but when I try to run my package, it doesn't insert any rows into the table and it finishes with a success status.

My Excel file has some formulas that get the data from other worksheets. I added a Data Viewer, and all I see is null values in every cell.

I need help...does anyone know what's wrong?

This may seem obvious, but make sure you are pointing to the right sheet. What happens when you click the Preview button in the Excel source?

Rafael Salas

|||
It's the same thing... null values :(

|||
As soon as I close the Excel workbook containing links, I start getting NULLs through the data flow.

When the workbook is open, the linked data works fine.

|||

In my case, it happens in both situations, closed or open. But it happens when I modify something in the sheet. I inserted some blank rows, and that's when I cannot read the data in SSIS. I still have the original file, and I was able to import that specific sheet (before the change) into the database.

I did the same thing in another file (I'm looping through files) and it worked OK, so I don't think the problem is the blank rows...
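As a side note (not a confirmed cause for this particular workbook): a very common reason for an Excel source returning NULLs is that the Jet driver guesses each column's type from the first handful of rows and then returns NULL for values that don't match the guess. A hedged sketch of the usual mitigation, with a hypothetical file path, is to add IMEX=1 to the extended properties of the Excel connection string so mixed columns are read as text:

    Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Data\MyWorkbook.xls;
    Extended Properties="Excel 8.0;HDR=YES;IMEX=1"

Raising the TypeGuessRows setting of the Jet Excel engine is sometimes suggested alongside this, but treat both as things to test rather than guaranteed fixes.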

NULL values returned when reading values from a text file using Data Reader.

I have a DTSX package which reads values from a fixed-length text file using a data reader and writes some of the column values from the file to an Oracle table. We have used this DTSX several times without incident, but recently the process started inserting NULL values for some of the columns when there was a valid value in the source file. If we extract some of the rows from the source file into a smaller file (i.e. 10 rows which incorrectly returned NULLs) and run them through the same package, they write the correct values to the table, but running the complete file again results in the NULL values error. As well, if we rerun the same file multiple times, the incidence of NULL values varies slightly and does not always seem to impact the same rows. I tried outputting data to a log file to see if I can determine what happens, and no error messages are returned, but it seems to be the case that the NULL values occur after pulling in the data via a Data Reader. Has anyone seen anything like this before, or does anyone have a suggestion on how to get some additional debugging information around this error?

|||
Why aren't you using the flat file source connector?

|||

Sorry: I reread my initial posting and it contained some incorrect details. I have modified the message accordingly and the modified content is in italics

I have a DTSX package which reads values from a delimited text file using a Flat File source component (and a Lookup for validating some of the data) and loads the data into a table in an Oracle database. We have used this DTSX several times without incident, but recently the process started inserting NULL values for some of the columns when there was a valid value in the source file. If we extract some of the rows from the source file into a smaller file (i.e. 10 rows which incorrectly returned NULLs) and run them through the same package, they write the correct values to the table, but running the complete file again results in the NULL values error. The typical file length is between 300,000 and 500,000 rows of data. As well, if we rerun the same file multiple times, the incidence of NULL values varies slightly and does not always seem to impact the same rows. I tried outputting data to a log file to see if I can determine what happens, and no error messages are returned, but it seems to be the case that the NULL values occur after pulling in the data via a Flat File source component. Has anyone seen anything like this before, or does anyone have a suggestion on how to get some additional debugging information around this error?

|||
One additional detail which may be pertinent: the DTSX is running inside of a virtual machine.

|||
Since you are using a lookup, a couple of things to consider: Lookup matching is case sensitive, so you may have to adjust your data for comparison accordingly.

|||
It appears that memory may be the answer. The VM where the DTSX was running only had 512 MB of RAM. When this was moved to an environment with 1 GB, we do not see the same issues.

|||
Hmmm - seems I spoke too soon. One file which was about 75,000 rows of data / 30 MB processed successfully. However, another file which was around 300,000 rows of data / 150 MB resulted in the same "false NULL" situation. I'm going to try breaking the larger file down into 4 smaller segments and will process each individually to see if this makes a difference. I am not aware of any explicit size limitations in SSIS that we should be bumping up against with these file sizes, but can anyone tell me if there are thresholds that should not be exceeded as a best practice when dealing with flat files or lookups?

|||
OK, running the smaller file also resulted in the same error as before. We're now operating under the theory that this has to do with something being kept in memory after the package initially runs, since we usually seem to be able to generate a "clean" result file the first time we try running a package. To test this theory we will reboot the computer where the DTSX is stored and will then rerun a file which has already generated errors.

|||

It appears that we have a solution: we broke up the DTSX package into 4 smaller packages, broke the file up into 4 smaller files, ran the packages on a non-virtual machine with 1 GB of RAM, and executed the packages through the command line. When all of these changes were combined, we got the expected result without any false NULL issues. Initial testing seems to reveal that omitting any one of these steps may still result in the original error, but this isn't conclusive at this point in time as we haven't tried all of the different scenarios. All these changes seem to indicate that the root cause of the issue is related to available memory, and I am interested if anyone else has any insight - thanks.

|||

I am encountering this error. My solution was to uncheck the "Retain null values..." option within the Flat File Source. However, I had to change my logic that checked for nulls to check for empty strings instead.
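As an illustration of that empty-string check (assuming the rows are staged in a SQL Server table first; the table and column names are hypothetical), the blanks can be turned back into NULLs in T-SQL after the load:

    -- Convert empty or all-space strings back to NULL once the rows are staged
    UPDATE dbo.StagingTable
    SET SomeColumn = NULLIF(LTRIM(RTRIM(SomeColumn)), '');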

I really wish this would be resolved by Microsoft, because it seems to happen when the files have many rows. I'm importing about 7 million rows at a time. It would be nice to have more confidence in the SSIS product.

|||

Shizelmah wrote:

I really wish this would be resolved by Microsoft, because it seems to happen when the files have many rows. I'm importing about 7 million rows at a time. It would be nice to have more confidence in the SSIS product.

So what are you going to do about it? Leaving pithy comments on here won't make the slightest bit of difference, I'm afraid. The correct place to submit your bugs and suggestions is http://connect.microsoft.com/sqlserver/feedback

Hope that helps.

Regards

-Jamie

|||
I have encountered the same problem. It seems to be a bug in SSIS.

After doing some investigation, it turned out that the Union All component used in the data flow was causing this situation. When I removed the Union All component, the problem disappeared.

I can't explain this weird behaviour of SSIS - I suppose it is a bug related to internal SSIS buffer management.

If you have a Union All component with more than 4 inputs in your data flow, do try removing it. Maybe it will help, as it did in my case.

Regards,
Grzegorz

Null Values

I set up a new SQL database file and allowed nulls in it. When I went through code to save the record, the exception says it doesn't allow nulls.

Before I get too involved with SQL, is it bad practice to use nulls?

If it is, what do you enter in place of the null value? That will lead to more code, right?

Davids Learning SQL

Having and using nulls is not bad practice.....

Your problem may be that you're trying to have a null in your key field.

|||

No, the primary key had a number assigned that did not already exist.

I did a msgbox to echo the value back to me just to check, and it was the number I chose.

I intentionally did not enter a value into an int field to see what would happen.

I opened the database file; Allow Nulls is checked for the field, and the field is not the primary key.

Any thoughts?

Davids Learning

|||
Is the field a date field?

|||
I remember running into a similar problem, but I am not sure if it is exactly the same. The workaround (or the solution, for that matter) was that I opened the DataSet containing my DataTable, clicked on the field in that DataTable, and in Properties changed the 'NullValue' property to (Nothing). By default it's set to (Throw Exception), which forces the DataSet to throw an exception even if its AllowDBNull is set to true. See if that solves your issue.

|||

Is there a way to default to "Nothing" instead of throwing an exception?

I won't be able to try your suggestion until Monday, when I get back to my server at work.

Thanks

Davids Learning

|||
Move the thread to this forum in order to get better answers.

|||
That depends: what does "nothing" mean in your case? Normally, nothing is equal to NULL when "nothing" is entered.

Jens K. Suessmeyer.

http://www.sqlserver2005.de
|||

Hey, sorry for the delay; I've been having ticker problems, but I'm back now.

What I was getting at was to have the designer default to not checking for null, instead of throwing exceptions.

There is only one field that must not be null, and that's the primary key field; I am taking care of that.

Is there a way to accomplish that?

The tables that I am dealing with contain a lot of fields, and it will take some time to go through and change them all.

Thanks Again

Davids Learning

|||Hi,

What about setting a default value in SQL Server, then?

Jens K. Suessmeyer.

http://www.sqlserver2005.de
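For reference, a minimal sketch of the default-value suggestion, using hypothetical table and column names:

    -- Rows inserted without a value for MyColumn now get '' instead of failing or needing NULL handling
    ALTER TABLE dbo.MyTable
    ADD CONSTRAINT DF_MyTable_MyColumn DEFAULT ('') FOR MyColumn;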
|||

I need to post what I did to mask the problem

I went into the table designer in VB and changed how each column handles nulls, basically setting them to not throw an exception on dbnulls.

I intentionally didn't want values in some fields because of other factors, like dates; if I set a default date in an expiration field, that would be bad.

I will, on the other hand, have to watch for and plan on handling nulls now, and that's alright.

Thanks

Davids Learning

|||
To me, it looked like a VB bug, but I will leave the judgement to you guys. If I allow nulls in my database table and create a typed DataSet from that table, the DataSet throws an error when it encounters a null value. If the DataSet's AllowDBNull is set to true, then it should default to 'Nothing' for the NullValue property. It does not seem logical to me (allow DBNull, but throw an error!!!). For those who want to reproduce the problem, use a bound DataGridView with some fields that allow DBNull and try to save without entering data in those fields.

|||

Glad to see someone in the same boat

Davids Learning

Saturday, February 25, 2012

Null value handling in SSIS package

Hi All

I am facing this problem while loading data from a text file into a table.

The scenario is:

There is a chance that the text file contains spaces where values should be null.

When I try to run my SSIS package, it fails.

How can I avoid this problem? I want null values to be inserted when there are spaces for that field in the text file.

Thanks,

Anshu

Hi,

You can use a Derived Column transformation and check whether your input column value is all spaces: trim it, take the length, and if the length is 0, set the column value to null.

Hope this helps...
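A sketch of that Derived Column expression, assuming a Unicode string column named MyColumn with a width of 50 (the name and width are assumptions); configure the transformation to replace the column:

    LEN(LTRIM(RTRIM(MyColumn))) == 0 ? NULL(DT_WSTR, 50) : MyColumn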