Missing nvarchar columns when reading SQL Server database table from Oracle
I have a SQL Server database with a table that has a column of
nvarchar(4000) data type. When I try to read the data from Oracle through a dblink, I don’t see the
nvarchar(4000) column. All the other columns’ data is displayed properly.
Can anyone help me find the issue and how to fix it?
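For reference, the read looks roughly like this (a minimal sketch; MSSQL_LINK and "Docs" are placeholder names for my dblink and table):

-- every column except the nvarchar(4000) one comes back
select * from "Docs"@MSSQL_LINK;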
One solution, collected from the web:
Appendix A-1 …

ODBC               Oracle     Comment
SQL_WCHAR          NCHAR      –
SQL_WVARCHAR       NVARCHAR   –
SQL_WLONGVARCHAR   LONG       if Oracle DB Character Set = Unicode.
                              Otherwise, it is not supported

nvarchar(max) is mapped to SQL_WLONGVARCHAR, and this data type can only be mapped to Oracle if the Oracle database character set is a Unicode character set.
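To see which columns the gateway actually exposes, you can describe the table over the dblink from SQL*Plus (link and table names are placeholders; columns the gateway cannot map are simply omitted):

desc "Docs"@MSSQL_LINK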
To check the database character set, please execute:
select * from nls_database_parameters;
and have a look at NLS_CHARACTERSET: it needs to be a Unicode character set, for example AL32UTF8. (Change it only if you know what you are doing, or ask your DBA to do it.)
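If you only want the two relevant parameters (both the view and the parameter names are standard):

-- database character set: used for CHAR/VARCHAR2/LONG, so this one must be Unicode
select value from nls_database_parameters where parameter = 'NLS_CHARACTERSET';
-- national character set: not used for this mapping (see below)
select value from nls_database_parameters where parameter = 'NLS_NCHAR_CHARACTERSET';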
The NCHAR character set isn’t used, because the mapping is to Oracle LONG, which uses the normal database character set.
A second solution would be to create, on the SQL Server side, a view that splits the nvarchar(max) into several nvarchar(xxx) columns, and then to select from that view and concatenate the content again in Oracle. (If changing the character set to Unicode is a problem for you, this approach is the best way to go.) A sketch follows below.
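A minimal sketch of that workaround, assuming a SQL Server table dbo.Docs with a key column id and an nvarchar(max) column body, read through a dblink named MSSQL_LINK (all names are placeholders, and you need enough body_n pieces to cover your longest value). On the SQL Server side:

-- explicit casts keep each piece as SQL_WVARCHAR, which the gateway can map;
-- substring() past the end of the value just returns an empty string
create view dbo.Docs_Split as
select id,
       cast(substring(body,    1, 2000) as nvarchar(2000)) as body_1,
       cast(substring(body, 2001, 2000) as nvarchar(2000)) as body_2,
       cast(substring(body, 4001, 2000) as nvarchar(2000)) as body_3
from dbo.Docs;

And on the Oracle side:

-- to_clob() keeps the concatenated result from hitting the 4000-byte limit of VARCHAR2 in SQL
select "id",
       to_clob("body_1") || to_clob("body_2") || to_clob("body_3") as body
  from "Docs_Split"@MSSQL_LINK;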