
I am using this method to upload data to SQL Server.

private void button5_Click(object sender, EventArgs e)
{
    string filepath = textBox2.Text;

    string connectionString_i = string.Format(@"Provider=Microsoft.Jet.OleDb.4.0; Data Source={0};Extended Properties=""Text;HDR=YES;FMT=Delimited""", Path.GetDirectoryName(filepath));

    using (OleDbConnection connection_i = new OleDbConnection(connectionString_i))
    {
        connection_i.Open();

    OleDbCommand command = new OleDbCommand("SELECT * FROM [" + Path.GetFileName(filepath) + "]", connection_i);

        command.CommandTimeout = 180;

        using (OleDbDataReader dr = command.ExecuteReader())
        {
            string sqlConnectionString = MyConString;

            // Stream the rows read from the CSV straight into the SQL Server table.
            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(sqlConnectionString))
            {
                bulkCopy.BulkCopyTimeout = 180;
                bulkCopy.DestinationTableName = "Table_Name";
                bulkCopy.WriteToServer(dr);

                MessageBox.Show("Upload Successful!");
            }
        }
        // The using block disposes (and closes) the connection automatically.
    }
}

I have an Excel sheet saved in .CSV format with about 1,048,313 entries. This bulk copy method only works for about 36,000 to 60,000 entries. I want to ask whether there is any way I can select the first 30,000 entries from the CSV and upload them to a SQL Server table, then select the next chunk of 30,000 rows and upload those to SQL Server, and so on until the last entry has been stored.

2 Comments
  • Well, CSV is a very simple format. Isn't it much easier to manipulate the file with a file reader? (Just read the next 30,000 lines each time; a rough sketch of this idea follows below.) Commented Feb 27, 2017 at 21:52
  • Well, if I had to create a new table every time it would be simple, but I have to update an existing table, and if the connection drops or some error happens I also have to roll back all of the current changes. That is the problem. Commented Feb 27, 2017 at 22:09
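For reference, here is a minimal sketch of the chunked approach suggested in the first comment, wrapped in a single transaction to address the rollback concern in the second. The method names (BulkCopyCsvInChunks, WriteChunk), the three-column string layout, and the naive comma split are all assumptions; MyConString and Table_Name are reused from the question's code.

// Sketch only: requires System, System.Data, System.Data.SqlClient and System.IO.
private static void BulkCopyCsvInChunks(string filepath, string myConString)
{
    const int chunkSize = 30000;

    using (SqlConnection sqlConnection = new SqlConnection(myConString))
    {
        sqlConnection.Open();

        // One transaction around the whole load so any failure can roll everything back.
        using (SqlTransaction transaction = sqlConnection.BeginTransaction())
        {
            try
            {
                // Assumed layout: three text columns. Adjust to the real CSV.
                DataTable chunk = new DataTable();
                chunk.Columns.Add("Col1", typeof(string));
                chunk.Columns.Add("Col2", typeof(string));
                chunk.Columns.Add("Col3", typeof(string));

                using (StreamReader reader = new StreamReader(filepath))
                {
                    reader.ReadLine(); // skip the header row (HDR=YES in the question)

                    string line;
                    while ((line = reader.ReadLine()) != null)
                    {
                        // Naive split; quoted fields containing commas need a real CSV parser.
                        chunk.Rows.Add(line.Split(','));

                        if (chunk.Rows.Count == chunkSize)
                        {
                            WriteChunk(chunk, sqlConnection, transaction);
                            chunk.Clear();
                        }
                    }

                    if (chunk.Rows.Count > 0)
                        WriteChunk(chunk, sqlConnection, transaction); // last partial chunk
                }

                transaction.Commit();
            }
            catch
            {
                transaction.Rollback();
                throw;
            }
        }
    }
}

private static void WriteChunk(DataTable chunk, SqlConnection connection, SqlTransaction transaction)
{
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, transaction))
    {
        bulkCopy.BulkCopyTimeout = 180;
        bulkCopy.DestinationTableName = "Table_Name";
        bulkCopy.WriteToServer(chunk);
    }
}

An alternative is to keep the single WriteToServer(dr) call and set bulkCopy.BatchSize = 30000, optionally with SqlBulkCopyOptions.UseInternalTransaction; note that the internal transaction commits each batch separately rather than making the whole load all-or-nothing.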

1 Answer

  1. Create a DataTable to store the values from your CSV file that need to be inserted into your target table. Each column in the DataTable would correspond to a data column in the CSV file.
  2. Create a custom table-valued data type on SQL Server to match your DataTable, including data types and lengths. (This assumes SQL Server is the target, since the post was tagged sql-server and not access, even though your sample connection string seems to suggest otherwise.)
  3. Using a text reader and a counter variable, populate your datatable with 30,000 records.
  4. Pass the DataTable to your insert query or stored procedure; the parameter type is SqlDbType.Structured (see the sketch after this list).
  5. In the event that the job fails and you need to restart, the first step could be to determine the last inserted value from a predefined key field. You could also use a left outer join as part of your insert query to insert only the records that do not already exist in the table. These are just a few of the more common techniques for restarting a failed ETL job.
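Below is a minimal sketch of steps 1 to 4, assuming a hypothetical three-column table-valued type named dbo.CsvRowType and a target table dbo.Table_Name; the left outer join in the insert follows the restart idea from step 5. The names and column types are placeholders, not taken from the question.

// Sketch only. Assumed one-off setup on SQL Server (step 2):
//   CREATE TYPE dbo.CsvRowType AS TABLE (Col1 varchar(50), Col2 varchar(50), Col3 varchar(50));
private static void InsertChunk(DataTable chunk, string connectionString)
{
    // Left outer join so only rows not already in the target table are inserted (step 5).
    const string sql =
        "INSERT INTO dbo.Table_Name (Col1, Col2, Col3) " +
        "SELECT s.Col1, s.Col2, s.Col3 FROM @Rows s " +
        "LEFT OUTER JOIN dbo.Table_Name t ON t.Col1 = s.Col1 " +
        "WHERE t.Col1 IS NULL;";

    using (SqlConnection connection = new SqlConnection(connectionString))
    using (SqlCommand command = new SqlCommand(sql, connection))
    {
        // Pass the whole 30,000-row DataTable as a single table-valued parameter (step 4).
        SqlParameter rows = command.Parameters.AddWithValue("@Rows", chunk);
        rows.SqlDbType = SqlDbType.Structured;
        rows.TypeName = "dbo.CsvRowType";

        connection.Open();
        command.ExecuteNonQuery();
    }
}

The DataTable passed in would be built with a text reader and a counter as described in step 3, with its columns declared to match dbo.CsvRowType.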

This technique has some tactical advantages over bulk copy: it adds flexibility and is less tightly coupled to the target table, so changes to the table can be less disruptive, depending on the nature of the change.


3 Comments

Thanks SteveD, I did change that; I cannot post the code I changed here in a comment. Now I am getting an error that the source or destination column mappings don't match.
The column data types in the DataTable need to match the table-valued type in SQL Server. You also need to make sure the data you are inserting from your CSV file into your DataTable (C#) conforms to the types defined for the data columns. So everything needs to match: text -> DataTable -> table-valued parameter.
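To illustrate that comment, a tiny hypothetical example where the DataTable column types mirror a made-up table-valued type and each CSV field is converted before the row is added (column names and types are invented for the example):

// Hypothetical table-valued type on the server:
//   CREATE TYPE dbo.MyRowType AS TABLE (Id int, Name varchar(50), CreatedOn datetime);
DataTable table = new DataTable();
table.Columns.Add("Id", typeof(int));             // matches Id int
table.Columns.Add("Name", typeof(string));        // matches Name varchar(50)
table.Columns.Add("CreatedOn", typeof(DateTime)); // matches CreatedOn datetime

// Convert each CSV field to the declared column type before adding the row.
string[] fields = "42,Example,2017-02-27".Split(',');
table.Rows.Add(int.Parse(fields[0]), fields[1], DateTime.Parse(fields[2]));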
You're welcome, I'm glad this worked for you. Typically, if an answer satisfies the question, you would mark it as the accepted answer.
