Are there downsides from opening a SQL Connection 50k times on separate threads?

I want to know if there is a better recommended way to do this. My DBA has notified me that the code below is causing page latches and page waits in production, and eventually timeouts. I have to process as many messages as arrive on a messaging server. The only way I could figure out how to do this is with loops and threading. Each thread opens a new connection to SQL Server and passes an XML list of data.

I am not sure why he is saying page waits are happening, as I thought SQL Server could handle millions of connections. Could it be the connection string itself? I am closing my connections, but any advice here would help. Thank you.

Here is the connection string. I have the pool set to 10k because without it I get pool exhaustion errors. I could lower the value to maybe 1000, but I am not sure whether that even makes a difference.

  var connectionString = "Server=myServerAddress;Database=myDataBase;Trusted_Connection=True;MultipleActiveResultSets=true;Asynchronous Processing=True;Pooling=True;Max Pool Size=10000;Min Pool Size=100;";
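
For readability, here is the same connection string built with SqlConnectionStringBuilder (same System.Data.SqlClient namespace the rest of the code uses). The values match the raw string above, and the server and database names are placeholders just like in the string.

  // Equivalent to the raw connection string above, just built explicitly.
  // "myServerAddress" and "myDataBase" are placeholders.
  var builder = new SqlConnectionStringBuilder
  {
      DataSource = "myServerAddress",
      InitialCatalog = "myDataBase",
      IntegratedSecurity = true,         // Trusted_Connection=True
      MultipleActiveResultSets = true,
      AsynchronousProcessing = true,     // Asynchronous Processing=True
      Pooling = true,
      MaxPoolSize = 10000,
      MinPoolSize = 100
  };
  var connectionString = builder.ConnectionString;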

Here is an example of the threads. I chose 50k here but it could be upwards of 100k messages in the queue at peak hours.

  var results = Enumerable.Range(0, 50000)
                          .AsParallel()
                          .WithDegreeOfParallelism(Environment.ProcessorCount)
                          .Select(x => InsertPayloadtoDB(xmlList))
                          .ToList();

The payload is an XML that is sent to the stored procedure for processing.

public Task InsertPayloadtoDB(XmlElement xmlList)
{
    // The returned task is collected into "results" above but never awaited,
    // so the inserts keep running after ToList() completes.
    return Task.Run(async () => await ExecuteNonQueryAsync("MyStoredProcedure",
        new List<SqlParameter>{ new SqlParameter("@xmlList", xmlList.OuterXml)}));
}

public static async Task<bool> ExecuteNonQueryAsync(string storedProcedureName,
    List<SqlParameter> parameters)
{
    using(var connection = new SqlConnection(connectionString))
    {
        using(var cmd = new SqlCommand(storedProcedureName, connection))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.CommandTimeout = 300;
            cmd.Parameters.AddRange(parameters);
            await connection.OpenAsync();
            await cmd.ExecuteNonQueryAsync();
            cmd.Parameters.Clear();
        }

        connection.Close();
        return true;
    }
}
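
Would something like the throttled version below be the better way? This is only a sketch of what I could switch to, reusing the same ExecuteNonQueryAsync as above (and the usual System.Threading / System.Threading.Tasks usings); the SemaphoreSlim limit of 32 concurrent inserts is an arbitrary guess on my part, not something I have measured.

public static readonly SemaphoreSlim Throttle = new SemaphoreSlim(32);

public async Task InsertAllAsync(XmlElement xmlList, int messageCount)
{
    var tasks = new List<Task>();

    for (var i = 0; i < messageCount; i++)
    {
        // Limit how many inserts run at the same time.
        await Throttle.WaitAsync();

        tasks.Add(Task.Run(async () =>
        {
            try
            {
                await ExecuteNonQueryAsync("MyStoredProcedure",
                    new List<SqlParameter>{ new SqlParameter("@xmlList", xmlList.OuterXml)});
            }
            finally
            {
                Throttle.Release();
            }
        }));
    }

    // Unlike the PLINQ version, every insert is awaited before returning.
    await Task.WhenAll(tasks);
}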

Here is the stored procedure, which just takes what I send it and inserts it into a table.

DECLARE @xmlList XML = NULL

BEGIN
    DECLARE @MyTable TABLE
                     (
                         First varchar(40),
                         Last varchar(40),
                         Address varchar(50),
                         -- ... 15 more columns
                     )

    BEGIN TRANSACTION
        INSERT INTO @MyTable (First,
                              Last,
                              Address,
                              --... 15 more columns
                             )
            SELECT
                N.value('(Name)[1]', 'varchar(40)') AS Name,
                N.value('(Last)[1]', 'varchar(40)') AS Last,
                N.value('(Address)[1]', 'varchar(50)') AS Address
            FROM
                @xmlList.nodes('//ClientInfo') AS T(N)

        -- This is where the deadlock seems to happen, as when
        -- I comment this out everything is smooth sailing.
        INSERT INTO ClientInfoTable (Name, Last, Address, ...)
            SELECT
                First,
                Last,
                Address,
                ...
            FROM @MyTable

    COMMIT TRANSACTION
END
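
In case the payload shape matters, the XML sent in @xmlList looks roughly like this. The values and the root element name are made up for illustration; the real ClientInfo elements carry the extra 15 fields as well.

<Clients>
  <ClientInfo>
    <Name>John</Name>
    <Last>Doe</Last>
    <Address>123 Example St</Address>
    <!-- ... 15 more fields -->
  </ClientInfo>
  <ClientInfo>
    <Name>Jane</Name>
    <Last>Doe</Last>
    <Address>456 Example Ave</Address>
    <!-- ... 15 more fields -->
  </ClientInfo>
</Clients>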

