FOR JSON PATH results in SSMS truncated to 2033 characters

I'm concatenating strings together using "for JSON path('')".

I have set the Tools->Options->SQL Server->Results to Grid options to max.

I have set the Tools->Options->SQL Server->Results to Text options to max.

Executing the query in Grid mode and copying the one row/one column results, I see the return value is limited to 2033 characters.

How can I ensure the returned value isn't truncated?



Solution 1:[1]

Inserting into an nvarchar(max) variable or into a table will do it.

declare @json table (j nvarchar(max));
insert into @json select * from (select * from Table where Criteria1 for json auto) a(j);
insert into @json select * from (select * from Table where Criteria2 for json auto) a(j);
select * from @json;

or

DECLARE @JSON nvarchar(max)
SET @JSON = (SELECT * FROM Table FOR JSON AUTO)
SELECT @JSON

Solution 2:[2]

I had a similar question and solved it by casting the result to varchar. Please be aware that a plain varchar is still capped at 8,000 characters, which is why the cast below uses VARCHAR(MAX). It worked for my situation; hope it helps.

SELECT CAST((SELECT [Columns] FROM [Tables] FOR JSON PATH('')) AS VARCHAR(MAX)) AS JSONDATA

Solution 3:[3]

I've noticed that it suffices to just put the JSON (or XML) into a scalar subquery:

-- Useless
select row_number() over(order by (select 1)) as rn
from sys.all_objects
for json auto

-- Useful
select (
  select row_number() over(order by (select 1)) as rn
  from sys.all_objects
  for json auto
)

... at least in newer versions of SQL Server, such as 2019.

Solution 4:[4]

I got this to work by using a string builder and appending the additional rows inside the DataReader loop.

My original logic only read a single row, so whenever the JSON exceeded ~2033 characters and the reader returned more than one row, the result appeared truncated.
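The reassembly described above can be sketched in any client language. Below is a minimal Python illustration; the 2033-character chunking is simulated in-process rather than produced by a real data reader, and the chunk size is only an approximation of what SQL Server streams per row:

```python
import json

CHUNK_SIZE = 2033  # approximate size of each row SQL Server streams for FOR JSON


def reassemble(rows):
    """Concatenate the partial-JSON fragments returned row by row."""
    return "".join(rows)


# Simulate a large FOR JSON result split across several reader rows.
full_json = json.dumps([{"id": i, "name": "x" * 50} for i in range(200)])
rows = [full_json[i:i + CHUNK_SIZE] for i in range(0, len(full_json), CHUNK_SIZE)]

document = reassemble(rows)
data = json.loads(document)  # parses only because all fragments were joined first
```

Parsing any single fragment on its own would fail, which is exactly the symptom the original single-row logic produced.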

Solution 5:[5]

With a recent SSMS (18.8), truncation is no longer an issue: just click the highlighted output cell in the grid view and the entire string opens in a new document window.

Solution 6:[6]

You can generate JSON for each row to avoid the line limit, then manually replace the line breaks in the result with commas and add brackets to turn it into an array.

SELECT (
    SELECT [column1], [column2]
    FOR JSON PATH, without_array_wrapper
)
FROM [dbo].[table_name]

Note that you cannot use * to select all columns. You can query column names with:

SELECT column_name FROM information_schema.columns WHERE table_name = N'table_name'

If you are running from the command line, you should use the bcp utility instead of sqlcmd.

Here is a sample PowerShell script:

# Fetch columns
$columnsQuery = "SET NOCOUNT on;SELECT column_name FROM information_schema.columns WHERE table_name = N'$tableName'"
$columns = sqlcmd -S $DB_SERVER -d $DB_NAME -U $DB_USERNAME -P $DB_PASSWORD -h -1 -u -y 8000 -Q $columnsQuery

# Separate column names with `,`
$columnsCommaSeparated = ($columns -join ',').replace(' ','') -replace '[a-zA-Z0-9_]+', '[$0]'

# Run bcp command
$query="select (select $columnsCommaSeparated for json path, without_array_wrapper) from [$tableName] $conditionExpression"
bcp """$query""" queryout $outFile -S $DB_SERVER -d $DB_NAME -U $DB_USERNAME -P $DB_PASSWORD -w

# Make json Array
$result = Get-Content $outFile 
$jsonWithoutBracket = $result -join ','
$json = "[$jsonWithoutBracket]"

# Write
$Utf8WithBomEncoding = New-Object System.Text.UTF8Encoding $True
[System.IO.File]::WriteAllLines($outFile , $json, $Utf8WithBomEncoding)
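The post-processing step at the end of the script (join the per-row JSON objects with commas, then wrap the whole thing in brackets) can also be sketched in Python, assuming each line of the bcp output file holds one row's JSON object produced with WITHOUT_ARRAY_WRAPPER:

```python
import json


def lines_to_json_array(lines):
    """Join per-row JSON objects into a single JSON array string."""
    objects = [line.strip() for line in lines if line.strip()]
    return "[" + ",".join(objects) + "]"


# Each element stands in for one line of the bcp output file.
rows = ['{"id":1,"name":"a"}', '{"id":2,"name":"b"}']
array_text = lines_to_json_array(rows)
data = json.loads(array_text)
```

This mirrors the `$result -join ','` and `"[$jsonWithoutBracket]"` lines in the PowerShell above.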

Solution 7:[7]

I was able to retrieve a giant JSON document from SQL Server into .NET Core using Dapper, like this:

using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;

public class AccountMigrationQueries : SingleBaseRepository
{
    public AccountMigrationQueries(IReadOnlyDbConnection readOnlyConnection) : base(readOnlyConnection) { }

    private readonly string sqlReturnJsonLarge = @"
        declare @json table (result nvarchar(max));
        insert into @json
        select * from (select ... from ... where ... for Json auto, without_array_wrapper) a(result)
        select * from @json
    ";
        
    public async Task<YourClass> GetByIdAsync()
    {
        var dReader = await ExecuteReaderAsync(sqlReturnJsonLarge);

        StringBuilder json = new StringBuilder();
        while (dReader.Read())
        {
            json.Append(dReader.GetString(0));
        }
        dReader.Close();
        return JsonConvert.DeserializeObject<YourClass>(json.ToString());
    }
}

// other file:
using System.Data;
using System.Threading.Tasks;
using Dapper;
public abstract class SingleBaseRepository
{
    protected readonly IReadOnlyDbConnection _readOnlyConnection;
    protected SingleBaseRepository(IReadOnlyDbConnection readOnlyConnection)
    {
        _readOnlyConnection = readOnlyConnection;
    }

    protected async Task<IDataReader> ExecuteReaderAsync(string sql, object parameters = null)
    {
        return await _readOnlyConnection.ExecuteReaderAsync(sql, parameters); // <= Dapper
    }
}

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 Charlieface
Solution 2 SheldonZh
Solution 3
Solution 4
Solution 5 GeekMystique
Solution 6 Lagyu
Solution 7 Cesar Alvarado Diaz