Difference between varchar and nvarchar:


The data types varchar and nvarchar are both used to store string values.
Varchar - non-Unicode data (1 byte per character)
NVarchar - Unicode data (2 bytes per character)

Storage Size:
Varchar - the actual string length in bytes, plus 2 bytes of length overhead
NVarchar - 2 times the string length in bytes, plus 2 bytes of length overhead
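
As a quick illustration (a minimal T-SQL sketch), DATALENGTH returns the number of bytes a value actually occupies, so it shows the one-byte versus two-byte difference directly:

    DECLARE @v varchar(50)  = 'Hello';
    DECLARE @n nvarchar(50) = N'Hello';

    -- Same five characters, twice the bytes for nvarchar:
    SELECT DATALENGTH(@v) AS varchar_bytes,   -- 5
           DATALENGTH(@n) AS nvarchar_bytes;  -- 10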



VARCHAR is an abbreviation for variable-length character string. A varchar(n) column can hold up to 8,000 bytes, bounded by the 8 KB page size of the database table holding the column in question, and varchar(max) can store up to 2 GB.

The "N" in NVARCHAR means uNicode. Essentially, NVARCHAR is nothing more than a VARCHAR that supports two-byte characters. The most common use for this sort of thing is to store character data that is a mixture of English and non-English symbols

The key difference between the two data types is how they are stored. VARCHAR is stored as 8-bit data, interpreted through the code page of the column's collation.

NVARCHAR strings are stored in the database as UTF-16: two bytes per character for characters in the Basic Multilingual Plane, and four bytes (a surrogate pair) for supplementary characters such as emoji.
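
To see this, the sketch below compares a Basic Multilingual Plane character with a supplementary character; DATALENGTH reports the bytes used (the specific characters are just illustrative):

    DECLARE @bmp  nvarchar(10) = N'é';   -- BMP character: one UTF-16 code unit
    DECLARE @supp nvarchar(10) = N'😀';  -- supplementary character: a surrogate pair

    SELECT DATALENGTH(@bmp)  AS bmp_bytes,   -- 2
           DATALENGTH(@supp) AS supp_bytes;  -- 4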

Unicode:
Using Unicode data types, a column can store any character defined by the Unicode Standard, which includes all of the characters defined in the various character sets. Unicode data types take twice as much storage space as non-Unicode data types.
Unicode data is stored using the nchar, nvarchar, and ntext data types in SQL Server (ntext is deprecated; use nvarchar(max) for new work).
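
Putting it together, here is a hypothetical table definition (the table and column names are illustrative, not from the text above) mixing Unicode and non-Unicode columns:

    CREATE TABLE Customers (
        CustomerCode char(10)      NOT NULL,  -- fixed-length, non-Unicode
        CountryCode  nchar(2)      NOT NULL,  -- fixed-length, Unicode
        Email        varchar(255)  NULL,      -- variable-length, non-Unicode
        FullName     nvarchar(100) NOT NULL   -- variable-length, Unicode (names in any script)
    );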