Oracle Performance – Varchar2 (x char) vs Varchar2 (x byte) Comparison

Tags: oracle, varchar

I know the difference in definition between Varchar2 (x char) and Varchar2 (x byte).

varchar2(20 char) means you can store 20 characters, whereas varchar2(20 byte) (which is what a plain varchar2(20) means by default) means you can store 20 bytes.
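For example, a quick sketch of the difference (assuming an AL32UTF8 database; the table and column names are just illustrative):

    -- Illustrative table in an AL32UTF8 database
    CREATE TABLE semantics_demo (
      col_char VARCHAR2(5 CHAR),  -- holds up to 5 characters, whatever their byte width
      col_byte VARCHAR2(5 BYTE)   -- holds up to 5 bytes
    );

    -- Five ASCII characters are 5 bytes in AL32UTF8, so they fit in both columns
    INSERT INTO semantics_demo VALUES ('hello', 'hello');

    -- Five 'é' characters are 10 bytes in AL32UTF8: they fit in the CHAR column
    -- but overflow the 5-byte column and raise ORA-12899
    INSERT INTO semantics_demo (col_char) VALUES ('ééééé');  -- succeeds
    INSERT INTO semantics_demo (col_byte) VALUES ('ééééé');  -- ORA-12899: value too large for column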

  1. What about performance?
  2. Is there a problem if my entire database uses Varchar2 (x char), even though all of the data is just English?

Thanks!

Best Answer

Assuming that your database character set is AL32UTF8 (if there is a difference between the two, that is almost certainly your character set) and that all the data consists of characters that exist in the US7ASCII character set ("English" might cover more than that, depending on the definitions we're using), there won't be any noticeable performance difference between a varchar2(n CHAR) and a varchar2(n BYTE).

With a multi-byte database character set like AL32UTF8, Oracle has to do a tiny bit more work when reading a string to determine that each byte represents a single character rather than being part of a multi-byte character. But checking the length of a string in either characters or bytes is a pretty trivial operation. Unless you're doing something like running a TPC benchmark, where you're already doing a crazy number of things you'd never do in reality to squeeze out tiny fractions of a percent, it's not something worth worrying about.
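To see how cheap the character-vs-byte distinction is at query time, here's a rough illustration (again assuming an AL32UTF8 database): LENGTH counts characters, LENGTHB counts bytes, and they differ only for non-ASCII data.

    -- 'é' occupies 2 bytes in AL32UTF8, so the two counts differ by one
    SELECT LENGTH('café')  AS char_count,   -- 4
           LENGTHB('café') AS byte_count    -- 5
    FROM   dual;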

If you were going to go to TPC-benchmark levels of effort to chase every last thousandth of a percent, it would probably be slightly more efficient to use a database character set of US7ASCII and declare everything using byte semantics, given that you only need to store English. Practically, though, if your application is so efficient that this sort of thing matters, you've already solved every other performance problem, so you should be focusing much more heavily on supportability and maintainability, which argues for an AL32UTF8 character set and character length semantics.
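If you do standardize on character length semantics, one way to make that the default for new DDL (a sketch; it assumes you can set session parameters rather than spelling out CHAR on every column) is the NLS_LENGTH_SEMANTICS parameter:

    -- Make CHAR the default length semantics for DDL issued in this session,
    -- so a plain VARCHAR2(20) behaves like VARCHAR2(20 CHAR)
    ALTER SESSION SET NLS_LENGTH_SEMANTICS = CHAR;

    CREATE TABLE demo_defaults (
      name  VARCHAR2(20),       -- 20 characters (session default)
      notes VARCHAR2(20 CHAR)   -- 20 characters (explicit)
    );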