ORA-12899: importing a .dmp file + character set conversion

character-set import oracle oracle-xe

I'm working on a Django project on Ubuntu 10.04 with an Oracle database server, so I've installed Oracle Database 10g XE Universal Release 10.2.0.1.0 and cx_Oracle-5.0.4-10g-unicode-py26-1.x86_64.

When I import a .dmp file generated by Oracle 10gR2 Enterprise Edition (on Windows XP), I get the errors listed below (they occur on VARCHAR2 columns). The problem is probably related to character set conversion (my database contains Greek), and my data don't import completely.

Username:
Connected to: Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production

Export file created by EXPORT:V10.02.01 via conventional path
import done in US7ASCII character set and AL16UTF16 NCHAR character set
import server uses AL32UTF8 character set (possible charset conversion)
export client uses EL8MSWIN1253 character set (possible charset conversion)
. . importing table "xxx"
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "xxx"."xxx"."xxx" (actual: 41, maximum: 40)

and so on.

My Oracle XE server is running with these settings:

NLS_CHARACTERSET AL32UTF8 
NLS_DATE_LANGUAGE AMERICAN 
NLS_LANGUAGE AMERICAN 
NLS_LENGTH_SEMANTICS BYTE 
NLS_NCHAR_CHARACTERSET AL16UTF16 
NLS_NCHAR_CONV_EXCP FALSE 
NLS_TERRITORY AMERICA

And the database server which generated the .dmp file is running with:

NLS_CHARACTERSET EL8MSWIN1253
NLS_NCHAR_CHARACTERSET AL16UTF16
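
(For reference, each server's character set can be checked with a query like the following cx_Oracle sketch; the connection string below is a placeholder.)

import cx_Oracle

# Placeholder credentials/DSN -- point this at the server you want to check.
connection = cx_Oracle.connect("user/password@localhost/XE")
cursor = connection.cursor()
cursor.execute(
    "SELECT parameter, value FROM nls_database_parameters "
    "WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET')"
)
for parameter, value in cursor:
    print("%s = %s" % (parameter, value))   # e.g. NLS_CHARACTERSET = AL32UTF8
cursor.close()
connection.close()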

Does anyone have a hint how I could solve this problem?

Best Answer

The error occurs because Greek characters take one byte each in EL8MSWIN1253 but two bytes each in AL32UTF8, so with NLS_LENGTH_SEMANTICS set to BYTE the converted values no longer fit the columns' byte limits. One solution is to have both databases use the same character set. Since your destination is the free XE edition, why not do a fresh install? Just make sure you choose the EL8MSWIN1253 character set and keep the same NCHAR character set (AL16UTF16).
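
For illustration, here is a minimal Python sketch of the byte arithmetic behind the "actual: 41, maximum: 40" message; the 40-character value below is made up, but any value that fills a 40-byte column and contains a Greek letter behaves the same way:

# Hypothetical 40-character value: one Greek alpha plus 39 ASCII letters.
value = u"\u03b1" + u"x" * 39

# In the source character set (EL8MSWIN1253 / cp1253) every character,
# including the Greek one, takes a single byte, so it fits VARCHAR2(40 BYTE).
print(len(value.encode("cp1253")))   # 40

# In AL32UTF8 the Greek letter takes two bytes, so the converted value
# needs 41 bytes and no longer fits the 40-byte column -> ORA-12899.
print(len(value.encode("utf-8")))    # 41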