The current position seems like "well, you can write exploratory programs using only the numbers 0 to 7, so clearly they're the first priority".
String handling is an important feature of any programming language, and a string is a sequence of characters. Doesn't that suggest character sets are pretty core?
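And "a sequence of characters" is itself trickier than it sounds: a minimal Python sketch (assumptions: Python 3, where strings are sequences of Unicode code points) showing that one user-perceived character can be several code points.

```python
import unicodedata

# 'e' followed by a combining acute accent: one visible character,
# two code points.
s = "e\u0301"
print(s)         # renders as a single glyph: é
print(len(s))    # 2 -- the "length" depends on what a "character" is

# Canonical normalization (NFC) composes it into a single code point.
print(len(unicodedata.normalize("NFC", s)))  # 1
```

A language designed after Unicode can at least make these distinctions explicit; one designed before it tends to conflate bytes, code points, and characters.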
I'd suggest the main reason character handling is currently such an enormous time suck for developers is that many of the languages in common use were designed before Unicode came along. If you want to design a language that will be an enormous time suck in use now, so be it.
Supporting short programs is great - but supporting short programs that will actually work is much better. A solution that is special-cased to a limited range of values is not a reusable solution.
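To make that concrete, here's a minimal Python sketch (the function name is hypothetical, chosen for illustration) of a solution special-cased to a limited range of values, the ASCII letters, and how it quietly fails on perfectly ordinary input.

```python
def is_capitalized_ascii(word):
    # Special-cased assumption: letters live in A-Z / a-z.
    # Short, and it "works" -- on English test data.
    return "A" <= word[0] <= "Z"

print(is_capitalized_ascii("Hello"))  # True
print(is_capitalized_ascii("Élan"))   # False -- wrong: É is uppercase

# The Unicode-aware check is no longer, and actually reusable.
print("Élan"[0].isupper())            # True
```

The short program and the correct program are the same length here; the difference is whether the language's core model of characters makes the correct one the natural one to write.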