In our previous episode, Graeme Geldenhuys said:
> The Char type would be defined as String[4] (max size in bytes of a
> unicode codepoint)

Doesn't sound wise. Length(stringtype) = n should mean that the string
takes SizeOf(Char) * n bytes (give or take the #0#0).
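
For illustration (not from the original thread): a minimal Free Pascal
sketch of the objection, assuming the proposal means the ShortString
form String[4]; the type name TChar4 is hypothetical.

program charsize;
{$mode objfpc}
type
  { Hypothetical type per the quoted proposal: a "char" wide enough
    to hold any UTF-8 encoded codepoint (up to 4 bytes). }
  TChar4 = String[4];  { ShortString: 1 length byte + 4 data bytes }
begin
  { Prints 5, not 4: every such "char" carries its own length byte,
    so Length(s) * SizeOf(Char) no longer matches the byte size of
    the encoded text. }
  WriteLn('SizeOf(TChar4) = ', SizeOf(TChar4));
end.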