Avoiding truncation errors in Microsoft Fabric Data Warehouse: Understanding UTF‑8, collations, and column lengths
Loading data into a Microsoft Fabric Data Warehouse can feel effortless, right up until it suddenly isn't. As a total MS Fabric noob, I found the following error message confusing at first, and it sent me on a learning journey I'd like to share:
String or binary data would be truncated while reading column of type 'VARCHAR(255)'.
Turns out that if you’re working with Parquet files, multilingual data, or Copy Into pipelines, this error can appear even when the source column seems to match the target column. In this post, I’ll walk through why this happens, why collation changes alone don’t fix it, and what you can do to avoid it.
The scenario
I was loading data from an on-premises data source through a data gateway into a Fabric Data Warehouse table. The target column is defined as:
NAME_LB VARCHAR(255)
To give you an early hint, the source column contains values like:
RÖ
The column lengths match. I even switched the collation to a UTF‑8 compatible one:
COLLATE Latin1_General_100_CI_AS_KS_WS_SC_UTF8
And yet… the load still fails with a truncation error. Why?
The root cause: Bytes vs. Characters
Fabric Warehouses store text in UTF‑8 when using UTF‑8 collations. That means:
- VARCHAR(255) = 255 bytes, not 255 characters
- ASCII characters → 1 byte
- Extended characters (Ö, é, ç, …) → 2–4 bytes
So a string that is 200 characters long may exceed 255 bytes if it contains multi‑byte characters.
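You can see the difference directly in T-SQL: LEN counts characters, while DATALENGTH counts the bytes actually stored. A minimal sketch using the Ö example from above (the COLLATE clause just forces UTF-8 storage for the literal):

-- LEN counts characters; DATALENGTH counts bytes.
SELECT
    LEN('RÖ' COLLATE Latin1_General_100_CI_AS_KS_WS_SC_UTF8)        AS char_count,  -- 2
    DATALENGTH('RÖ' COLLATE Latin1_General_100_CI_AS_KS_WS_SC_UTF8) AS byte_count;  -- 3, because Ö takes 2 bytes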
This is the key insight:
Truncation errors in Fabric are about byte length, not character length.
Changing the collation to a UTF‑8 one ensures that characters like Ö are stored correctly, but it does not change the byte limit of the column.
Even with UTF‑8 collation:
VARCHAR(255)
still means:
- Maximum 255 bytes
- Not 255 characters
- Not “255 Unicode characters”
So if the source contains a value that exceeds 255 bytes, the load fails.
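If it is not obvious which rows are the culprits, the same LEN/DATALENGTH trick helps to hunt them down. The following is only a sketch and assumes the data already sits in a staging table inside the warehouse, where strings are stored as UTF-8 (STAGING.TABLE_X and NAME_LB are placeholder names); it lists every row whose byte length would not fit into VARCHAR(255):

-- Placeholder names: find rows whose UTF-8 byte length exceeds the target column size.
SELECT NAME_LB,
       LEN(NAME_LB)        AS char_count,
       DATALENGTH(NAME_LB) AS byte_count
FROM STAGING.TABLE_X
WHERE DATALENGTH(NAME_LB) > 255;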
How to inspect the column metadata
If you encounter this problem, a good starting point is to inspect the column metadata. This can be done with the following SQL query:
SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH, COLLATION_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'SCHEMA-ABCD'
  AND TABLE_NAME = 'TABLE-ABCD';
The fix
We fixed it by increasing the column size for the affected columns. Since you cannot ALTER an existing column in a Fabric Data Warehouse directly, we dropped and re-added the column (keep in mind that this discards whatever is already stored in that column, so it has to be reloaded afterwards):
ALTER TABLE SCHEMA.TABLE_X DROP COLUMN NAME_LB;
ALTER TABLE SCHEMA.TABLE_X ADD NAME_LB VARCHAR(300) COLLATE Latin1_General_100_CI_AS_KS_WS_SC_UTF8;
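If you need to keep the data already in the column, one alternative worth considering (a sketch with placeholder names, not the route we took) is to rebuild the table with a wider column via CREATE TABLE AS SELECT and then swap it in:

-- Rebuild the table with a wider NAME_LB column while keeping its data.
CREATE TABLE SCHEMA.TABLE_X_NEW AS
SELECT CAST(NAME_LB AS VARCHAR(300)) COLLATE Latin1_General_100_CI_AS_KS_WS_SC_UTF8 AS NAME_LB
       -- , plus all other columns of SCHEMA.TABLE_X
FROM SCHEMA.TABLE_X;
-- Afterwards, drop the original table and rename or swap in SCHEMA.TABLE_X_NEW.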
That’s it!
