Friday, March 23, 2012

Problem in bulk insert

Hi All,

I am trying to run a Bulk Insert task that loads the data files in an existing folder into an existing destination table. I get the following error:

[Bulk Insert Task] Error: An error occurred with the following error message: "Cannot bulk load because the file "C:\PD Data\PDDW_md_costs\Files_to_load" could not be opened. Operating system error code 5(Access is denied.).".

Here Files_to_load is the existing folder that contains the data files. I am not sure why it says I do not have access to it; I have also tried sharing the folder and granting full access to it.

One detail I should add: I am performing all of these tasks over a remote connection to the server machine that hosts SQL Server 2005, and I am connecting with the administrator ID.

If anyone has an idea why this has occurred, please let me know, as it would really help solve the problem.

Thanks & regards,

Satya

I am having the same issue. Did anyone ever figure this out?

Thanks in advance.

JM

|||Have you tried using UNC format for the folder name and path?|||

I'm also having this problem. What is UNC format (sorry if that's a stupid question - I'm a newbie)?

I've specified my input file using the connection manager, selecting the file from the drop-down list inside Integration Services. It makes no sense to me why it suddenly can't see the file at runtime.

Thanks,

Laurie

|||Also, I should note that the file that I'm trying to access for the bulk insert is on my own local PC, not a remote drive somewhere on the network.|||

A UNC path has the form \\MachineName\ShareName\FilePath. However, I tried this with no luck. Anyone else have any ideas? I have been stuck on this for a week and ordered a book that was no help.

JM

|||

I got past this problem by moving my data file to a shared drive. While I am running Integration Services on my local PC, SQL 2005 itself actually resides on another server on my network. When the package attempted to run, it was running from that remote server, causing the unable to access file error.

The fact that I could see the file and link to it easily with no warning from the Connection manager strikes me as inconsistent design. It should give you an error message or something if the file is going to be inaccessible at runtime. Just my 2 cents worth.

|||One more thing - you also need to use UNC format for the file name and path.|||

>While I am running Integration Services on my local PC, SQL 2005 itself actually resides on another server on my network. When the package attempted to run, it was running from that remote server, causing the unable to access file error.

Is that the case? To run the package remotely you would need to be invoking it with SQL Agent.

However, even if the package is actually running locally, if the Bulk Insert Task uses a remote SQL Server, then you are asking that remote SQL Server to perform the bulk insert. That SQL Server must have access to the source file.

You make an interesting point about whether we should validate that remote access at design time. However, the SQL Server Bulk Insert operation does not provide us with an option to validate - we have to invoke the operation to do so, which in turn would execute the Bulk Insert operation.

I'm thinking (before coffee, so don't expect too much) that it would be possible to craft a stored proc which could be called by OnValidate on the BulkInsert task that could effectively validate access to the source file.

Donald Farmer
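A rough sketch of that validation idea, assuming the undocumented xp_fileexist extended procedure is available on the target instance; the procedure and parameter names here are made up for illustration:

-- Sketch only: checks whether the SQL Server instance itself can see the source file.
-- Relies on the undocumented master.dbo.xp_fileexist extended procedure.
CREATE PROCEDURE dbo.usp_ValidateBulkSource
    @SourcePath NVARCHAR(260)
AS
BEGIN
    DECLARE @check TABLE (FileExists INT, IsDirectory INT, ParentExists INT);
    INSERT INTO @check EXEC master.dbo.xp_fileexist @SourcePath;

    IF NOT EXISTS (SELECT 1 FROM @check WHERE FileExists = 1)
        RAISERROR('File %s is not accessible to this SQL Server instance.', 16, 1, @SourcePath);
END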

|||

I have used the UNC path name to specify the folder to read from. I get the same error. Also, I am not working remotely; I am working directly on the server, actually at its desktop.

Every post I have read has pointed to a permissions issue with the SQL Server service. Where can I find information on adjusting these permissions?

Also, I am logged in as Administrator when I run the package. I have successfully run the package by specifying a specific file name, and I only get the Access Denied error when I choose a folder to read from. Am I doing something wrong?

I too am writing this before coffee, and not sure if my post makes much sense. But I do thank everyone for the help...jm

|||

Donald,

Thanks for clarifying what was going on. I'm still very new to Integration Services.

|||

Hi,

We are having the same issue. We have two SQL Servers running on the same machine. One is SQLDEV-DBS and it is SQL Server 2000. The other is SQLDEV-DBS\SQL2005 and it is SQL Server 2005. We can bulk insert without issue into the SQL Server 2000 instance, but when we try to do a bulk insert into the SQL Server 2005 instance we get the message:

Error: 0xC002F304 at Load Stumast, Bulk Insert Task: An error occurred with the following error message: "Cannot bulk load because the file "\\sqlprd-dbs\sqlimport\DLSTD4.TXT" could not be opened. Operating system error code 5(Access is denied.).".

Both instances are running under the same domain accounts, so this is a real brain teaser.

Anyone have any ideas? Which service do we need to check? There are a lot more under 2005 than 2000.

Thanks for any and all help.

|||Hi! I am having the same problem but could not find a solution to this...|||Have you had any luck with this? I still cannot get this to work. Thanks!|||The flat file connection manager in the Bulk Insert task should point to the file containing the source data, not to the folder. So, set the path to the file.
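The equivalent T-SQL makes the point concrete. A minimal sketch, assuming a share on the server so the remote SQL Server service account can reach the folder; the table name, share name and file name below are hypothetical:

-- Point the load at a single data file, not at the folder.
-- Table, share and file names are made-up examples.
BULK INSERT dbo.md_costs
FROM '\\ServerName\PDData\PDDW_md_costs\Files_to_load\costs_01.txt'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);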

Wednesday, March 21, 2012

Problem importing a CSV-delimited text file into a SQL Server 2005 table

I am using the BULK INSERT command to import a CSV-delimited text file into a table, and I am having problems with the quote text qualifiers (", "). The command below works, but it pulls in all the "" quotes as well, and the comma field terminator only works when commas appear purely as separators. If there is a comma within an address field, for example, the data gets imported into the wrong fields. What can I use to indicate that the text qualifier is "? I don't see a way to specify this with the BULK INSERT command. Is there another command I can use, or am I using this command incorrectly? I thank you in advance for any response or suggestion you may have.

BULK INSERT AdventureWorks.dbo.MbAddress
FROM 'a:\mbAddress.txt'
WITH (
    DATAFILETYPE = 'char',
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    CODEPAGE = '1252',
    KEEPIDENTITY,
    KEEPNULLS,
    FIRSTROW = 2
)

Here is a sample of the ASCII file I am importing; as you can see, record 6330 has an extra comma in the address line.

"AddressAutoID","Memkey","Type","BadAddress","Address1","Address2","Address3","City","State","Zip","Foreign","CarrierRoute","Dpbc","County","CountyNo","ErrorCode","ChangeDate","UserID"
6317,26517,1,0,"1403 W. Kline Ave","","","MILWAUKEE","WI","53221","","",0.00,"MILWAUKEE",79,"",1/25/2006 0:00:00,"admin"
6318,26225,1,0,"501 Dunford Dr","","","BURLINGTON","WI","53105","","",0.00,"RACINE",101,"",1/25/2006 0:00:00,"admin"
6319,20101,1,0,"2115 Cappaert Rd #35","","","MANITOWOC","WI","54220","","",0.00,"MANITOWOC",71,"",1/25/2006 0:00:00,"admin"
6320,23597,1,0,"728 Woodland Park Dr","","","DELAFIELD","WI","53018","","",0.00,"WAUKESHA",133,"",1/25/2006 0:00:00,"admin"
6321,23392,1,0,"7700 S. 51st St","","","FRANKLIN","WI","53132","","",0.00,"MILWAUKEE",79,"",1/25/2006 0:00:00,"admin"
6322,26537,1,0,"W188 S6473 GOLD DRIVE","","","MUSKEGO","WI","53150","","",0.00,"WAUKESHA",133,"",1/26/2006 0:00:00,"admin"
6323,25953,1,0,"3509 N. Downer Ave","","","MILWAUKEE","WI","53211","","",0.00,"MILWAUKEE",79,"",1/26/2006 0:00:00,"admin"
6324,19866,1,0,"10080 E. Mountain View Lake Rd. #145","","","SCOTTSDALE","AZ","85258","","",0.00,"MARICOPA",13,"",1/27/2006 0:00:00,"admin"
6325,25893,1,0,"W129 N6889 Northfield Dr. Apt 114","","","MENOMONEE FALLS","WI","53051-0517","","",0.00,"WAUKESHA",133,"",1/27/2006 0:00:00,"admin"
6326,26569,1,0,"8402 64th Street","","","KENOSHA","WI","53142-7577","","",0.00,"KENOSHA",59,"",1/27/2006 0:00:00,"admin"
6327,24446,4,0,"83 Sweetbriar Br","","","LONGWOOD","FL","32750","","",0.00,"SEMINOLE",117,"",1/30/2006 0:00:00,"admin"
6328,19547,1,0,"4359 MERCHANT AVENUE","","","SPRING HILL","FL","34608","","",0.00,"HERNANDO",53,"",2/8/2006 0:00:00,"admin"
6329,26524,1,0,"264 Lakeridge Drive","","","OCONOMOWOC","WI","53066","","",0.00,"WAUKESHA",133,"",2/10/2006 0:00:00,"admin"
6330,23967,1,0,"3423 HICKORY ST","100 Tangerine Blvd., Brownsville, TX 78521-4368","Texas Phone Number: 956-546-4279","SHEBOYGAN","WI","53081","","",0.00,"SHEBOYGAN",117,"",2/15/2006 0:00:00,"admin"
6331,25318,1,0,"3960 S. Prairie Hill Lane Unit 107","","","Greenfield","WI","53228","","",0.00,"MILWAUKEE",79,"",2/20/2006 0:00:00,"admin"
6332,24446,1,0,"83 Sweetbriar BR","","","LONGWOOD","FL","32750","","",0.00,"SEMINOLE",117,"",2/21/2006 0:00:00,"admin"
6333,26135,1,0,"P.O. Box 8 127 Main Street","","","CASCO","WI","54205","","",0.00,"KEWAUNEE",61,"",2/21/2006 0:00:00,"admin"

Hi

You can do this by using a format file. You can define each field in the format file and specify a different field terminator for each field.

For example:

<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
 <RECORD>
  <FIELD ID="1" xsi:type="CharTerm" TERMINATOR='","' MAX_LENGTH="30"/>
  <FIELD ID="2" xsi:type="CharTerm" TERMINATOR='"\n' MAX_LENGTH="30"/>
  [....]
 </RECORD>
 <ROW>
  <COLUMN SOURCE="1" NAME="c1" xsi:type="SQLNVARCHAR"/>
  <COLUMN SOURCE="2" NAME="c2" xsi:type="SQLNVARCHAR"/>
 </ROW>
</BCPFORMAT>

=== Edited by Hatzi74 @ 25 Apr 2006 2:49 PM UTC ===
Edit 2: You could also use the Data Import Wizard. Select "Flat File" as the source, and then match the field delimiters for every field the way you need them to be.

For example: at a text -> number boundary you would need --",-- as the delimiter; going from text -> text you need --","--; and coming from number -> text you need --,"--.

Hope this makes sense. The row delimiter could also be adjusted to match --"\n-- if the last field is a text field, but this will leave the field "dirty" (it will include a trailing ").
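Applied to the address file posted above, the first few field boundaries would look roughly like this; this is only a sketch, with most of the fields and the real lengths omitted:

<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
 <RECORD>
  <!-- fields 1 to 3 are unquoted numbers: plain comma terminators -->
  <FIELD ID="1" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="100"/>
  <FIELD ID="2" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="100"/>
  <FIELD ID="3" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="100"/>
  <!-- number to quoted-text boundary -->
  <FIELD ID="4" xsi:type="CharTerm" TERMINATOR=',"' MAX_LENGTH="100"/>
  <!-- quoted-text to quoted-text boundary -->
  <FIELD ID="5" xsi:type="CharTerm" TERMINATOR='","' MAX_LENGTH="100"/>
  [....]
  <!-- last field is quoted text, so the row terminator absorbs the closing quote -->
  <FIELD ID="18" xsi:type="CharTerm" TERMINATOR='"\r\n' MAX_LENGTH="100"/>
 </RECORD>
 <ROW>
  <COLUMN SOURCE="1" NAME="AddressAutoID" xsi:type="SQLINT"/>
  <COLUMN SOURCE="5" NAME="Address1" xsi:type="SQLNVARCHAR"/>
  [....]
 </ROW>
</BCPFORMAT>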

|||

Thank you for your quick reply. I did try a non-XML format file using the bcp command, and it came out like the listing below. It worked for the import, but I still had the same problem with the commas as the delimiter. I wanted to add the quotes as a delimiter, but I could not figure out how to do that in this type of file.

I will try an XML format this time, but I am not sure from your instructions how to tell it that the file has quotes for the text qualifiers. I think you are saying I need "," as the delimiter.

Thank you so much for your time and effort. I am going to try it.

Most sincere,

Sarah

My format file is below (.fmt type):

9.0
18
1 SQLCHAR 0 100 "," 1 AddressAutoID SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 100 "," 2 Memkey SQL_Latin1_General_CP1_CI_AS
3 SQLCHAR 0 100 "," 3 Type SQL_Latin1_General_CP1_CI_AS
4 SQLCHAR 0 100 "," 4 BadAddress SQL_Latin1_General_CP1_CI_AS
5 SQLCHAR 0 100 "," 5 Address1 SQL_Latin1_General_CP1_CI_AS
6 SQLCHAR 0 100 "," 6 Address2 SQL_Latin1_General_CP1_CI_AS
7 SQLCHAR 0 100 "," 7 Address3 SQL_Latin1_General_CP1_CI_AS
8 SQLCHAR 0 100 "," 8 City SQL_Latin1_General_CP1_CI_AS
9 SQLCHAR 0 100 "," 9 State SQL_Latin1_General_CP1_CI_AS
10 SQLCHAR 0 100 "," 10 Zip SQL_Latin1_General_CP1_CI_AS
11 SQLCHAR 0 100 "," 11 Foreign SQL_Latin1_General_CP1_CI_AS
12 SQLCHAR 0 100 "," 12 CarrierRoute SQL_Latin1_General_CP1_CI_AS
13 SQLCHAR 0 100 "," 13 Dpbc SQL_Latin1_General_CP1_CI_AS
14 SQLCHAR 0 100 "," 14 County SQL_Latin1_General_CP1_CI_AS
15 SQLCHAR 0 100 "," 15 CountyNo SQL_Latin1_General_CP1_CI_AS
16 SQLCHAR 0 100 "," 16 ErrorCode SQL_Latin1_General_CP1_CI_AS
17 SQLCHAR 0 100 "," 17 ChangeDate SQL_Latin1_General_CP1_CI_AS
18 SQLCHAR 0 100 "\r\n" 18 UserID SQL_Latin1_General_CP1_CI_AS

|||

Try this:

3 SQLCHAR 0 100 "," 3 Type SQL_Latin1_General_CP1_CI_AS
4 SQLCHAR 0 100 ",""" 4 BadAddress SQL_Latin1_General_CP1_CI_AS
5 SQLCHAR 0 100 """," 5 Address1 SQL_Latin1_General_CP1_CI_AS

|||

I am trying that, but it does not seem to be working.

Sarah

|||

I think it is getting confused with the quotes because they seem to be reserved for the program or something. I get an error.

Do you know the bcp command I need to generate the XML format file? I will try that as well.

Sarah
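For what it's worth, the SQL Server 2005 bcp utility can generate an XML format file skeleton straight from the target table, which you can then edit by hand. Roughly like the line below, run from a command prompt; -T uses Windows authentication, and the output path is just an example:

rem Generate an XML format file (-x) in character mode (-c) for the target table
bcp AdventureWorks.dbo.MbAddress format nul -c -x -f C:\mbAddress.xml -T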

|||

Hi

I just wanted to thank you for your reply and for pointing me in the right direction.

I just wanted to let you know that I finally figured it out. I had to strip the " quotes out of the ASCII file and replace them with | pipes, and then I created the .fmt format file and imported the data. Here is a copy of the .fmt file for your information; I had to tell it exactly where the pipes started and ended for each field delimiter. I tried setting up the format file to look for the quotes, but it had a problem with that because " characters were already being used in the format file.

Anyway, I got it working, and now I am having a problem accessing the A:\ drive with the BULK INSERT command: SQL Server thinks I am accessing the server's own A:\ and not my local drive. What a nightmare just to read a simple ASCII file into a table. Unfortunately the operator has to import data from floppy diskettes. I will repost this problem.

Anyway, thank you very much for your input.

Most sincere,

Sarah Diane Reid

Sample Format File below:

9.0
18
1 SQLCHAR 0 100 "," 1 AddressAutoID SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 100 "," 2 Memkey SQL_Latin1_General_CP1_CI_AS
3 SQLCHAR 0 100 "," 3 Type SQL_Latin1_General_CP1_CI_AS
4 SQLCHAR 0 100 ",|" 4 BadAddress SQL_Latin1_General_CP1_CI_AS
5 SQLCHAR 0 100 "|,|" 5 Address1 SQL_Latin1_General_CP1_CI_AS
6 SQLCHAR 0 100 "|,|" 6 Address2 SQL_Latin1_General_CP1_CI_AS
7 SQLCHAR 0 100 "|,|" 7 Address3 SQL_Latin1_General_CP1_CI_AS
8 SQLCHAR 0 100 "|,|" 8 City SQL_Latin1_General_CP1_CI_AS
9 SQLCHAR 0 100 "|,|" 9 State SQL_Latin1_General_CP1_CI_AS
10 SQLCHAR 0 100 "|,|" 10 Zip SQL_Latin1_General_CP1_CI_AS
11 SQLCHAR 0 100 "|,|" 11 Foreign SQL_Latin1_General_CP1_CI_AS
12 SQLCHAR 0 100 "|," 12 CarrierRoute SQL_Latin1_General_CP1_CI_AS
13 SQLCHAR 0 100 ",|" 13 Dpbc SQL_Latin1_General_CP1_CI_AS
14 SQLCHAR 0 100 "|," 14 County SQL_Latin1_General_CP1_CI_AS
15 SQLCHAR 0 100 ",|" 15 CountyNo SQL_Latin1_General_CP1_CI_AS
16 SQLCHAR 0 100 "|," 16 ErrorCode SQL_Latin1_General_CP1_CI_AS
17 SQLCHAR 0 100 ",|" 17 ChangeDate SQL_Latin1_General_CP1_CI_AS
18 SQLCHAR 0 100 "|\r\n" 18 UserID SQL_Latin1_General_CP1_CI_AS
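For completeness, a format file like this is then wired into the load with the FORMATFILE option. A rough sketch reusing the table name from earlier in the thread, with illustrative paths:

-- Sketch: load the pipe-substituted data file using the format file above.
BULK INSERT AdventureWorks.dbo.MbAddress
FROM 'C:\import\mbAddress.txt'
WITH (
    FORMATFILE = 'C:\import\mbAddress.fmt',
    FIRSTROW = 2,      -- skip the header row
    KEEPIDENTITY,
    KEEPNULLS
);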

Problem generating error file using BULK INSERT or BCP thru xp_cmdshell.

BCP through xp_cmdshell from a stored procedure:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;
EXEC xp_cmdshell 'bcp database.dbo.table in c:\scheduled.csv -S SERVER\SQLEXPRESS -T -t, -r\n -c -e "error.txt"';

This returns the following error. I even tried placing the command in a separate command file and calling that, with no success. If I run it from the command line, the error file generation does work.

=================================================================

SQLState = HY000, NativeError = 0

Error = [Microsoft][SQL Native Client]Unable to open BCP error-file

=================================================================

The error message when using BULK INSERT is as follows:

BULK INSERT database.dbo.table FROM 'c:\unscheduled.csv'
WITH (FIELDTERMINATOR = ',', ERRORFILE = 'c:\error.txt');

Returns the following error message:

=================================================================

Msg 4861, Level 16, State 1, Procedure pro_cedure, Line 9

Cannot bulk load because the file "c:\error.txt" could not be opened. Operating system error code 80(The file exists.).

Msg 4861, Level 16, State 1, Procedure pro_cedure, Line 9

Cannot bulk load because the file "c:\error.txt.Error.Txt" could not be opened. Operating system error code 80(The file exists.).

=================================================================

The BULK INSERT actually creates an empty error.txt file (0 KB) and never performs the insert. I cannot find any examples of anyone using the ERRORFILE option with BULK INSERT. Probably there is some default security setting for file creation/modification that I am missing. Can anyone help me out? Thanks.

EDIT: SQL SERVER EXPRESS 2005 - WINXP PRO SP2

bcp and BULK INSERT will not overwrite or append to the error file. If you get an error the first time and want to rerun your command, you need to either delete these files or specify a new location for the error file.
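In practice that means clearing the old error files before each rerun; a sketch along these lines, reusing the paths from the post above and assuming xp_cmdshell is still enabled:

-- Remove leftover error files from the previous run, then load again.
EXEC xp_cmdshell 'del /q c:\error.txt c:\error.txt.Error.Txt';

BULK INSERT database.dbo.table FROM 'c:\unscheduled.csv'
WITH (FIELDTERMINATOR = ',', ERRORFILE = 'c:\error.txt');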

|||

Unfortunately I am deleting the file; I'm still doing so by hand in testing.

With bcp through xp_cmdshell it does not generate an error file at all (it works just fine from the command line, just not through a stored procedure). With BULK INSERT it generates an error file that is completely blank (even though I know there are 6 rows that cannot be imported), of size 0 KB, so totally empty. Also, it does not actually execute the insert.

Most likely it is a security setting, but I cannot find it. Another possibility is that I disabled a required service.

Anyways, thanks for the reply.

Running SQL Express 2005 on WINXP PRO SP2.

|||

Does the service account which SQL Server is running under have write permissions to the disk?

According to the bulk insert error message, it does (your message says the file exists), but you say in the next post you're deleting it?

|||

Ah, you're the bomb.

The SQL Express service was running under the Network Service account rather than the Local System account. I switched it to Local System, and now the bulk import with the ERRORFILE option is working from sqlcmd. Write permissions or something; anyway, I know where to look now, thanks!