(char) (i + 'a') JAVA: what's happening here? - Stack Overflow
Learning Java: What is the point of the Char datatype?
What is the difference between char and Character in Java? - Stack Overflow
Character literal in Java? - Stack Overflow
The ASCII value of 'a' is 97.
When i = 0, i + 'a' = 0 + 97 = 97, which cast to char is 'a'.
When i = 1, i + 'a' = 1 + 97 = 98, which cast to char is 'b'.
When i = 2, i + 'a' = 2 + 97 = 99, which cast to char is 'c'.
...and so on
Java's char is a 16-bit integral type. 'a' is the same as 97, which you can verify with System.out.println((int) 'a'); it follows that 98 is 'b', and so on across the ASCII range.
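A minimal sketch of the expression being discussed, assuming the usual loop over i (the class name here is just for illustration):

```java
public class CharDemo {
    public static void main(String[] args) {
        // i + 'a' promotes the char to int (97) and adds i;
        // the cast back to char reinterprets the sum as a character.
        for (int i = 0; i < 3; i++) {
            System.out.println((char) (i + 'a')); // prints a, b, c on successive lines
        }
        System.out.println((int) 'a'); // prints 97
    }
}
```

Note that the explicit cast is required: i + 'a' has type int, and Java will not narrow it back to char implicitly.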
Apologies as I come from Python, but the Char datatype seems extremely pointless...
I'm just starting to learn Java, so can you shed some light on this?
char is a primitive type that represents a single 16-bit Unicode character (a UTF-16 code unit), while Character is a wrapper class that lets you work with a char in an object-oriented way.
Example for char,
char ch = 'a';
Example of Character,
char upper = Character.toUpperCase(ch);
It returns 'A'; the original ch is unchanged.
From the JavaDoc:
The Character class wraps a value of the primitive type char in an object. An object of type Character contains a single field whose type is char. In addition, this class provides several methods for determining a character's category (lowercase letter, digit, etc.) and for converting characters from uppercase to lowercase and vice versa.
Character information is based on the Unicode Standard, version 6.0.0.
So, char is a primitive type while Character is a class. You can use Character's static methods, such as Character.toUpperCase(char c), to work with char values in a more "OOP way".
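As a small sketch of the contrast (the class name is illustrative; the Character methods shown are standard library calls):

```java
public class CharacterDemo {
    public static void main(String[] args) {
        char ch = 'a';        // primitive char
        Character boxed = ch; // autoboxed into the wrapper object

        // Static helpers on the Character class operate on primitive chars:
        System.out.println(Character.toUpperCase(ch)); // A
        System.out.println(Character.isLetter(ch));    // true
        System.out.println(Character.isDigit(ch));     // false

        // Unwrap the object back to a primitive when needed:
        System.out.println(boxed.charValue());         // a
    }
}
```

Autoboxing makes the wrapper mostly transparent, but the wrapper is what you need for generics (e.g. List&lt;Character&gt;), since type parameters cannot be primitives.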
I imagine in your program there was an 'OOP' mistake(like init of a Character) rather than char vs Character mistake.
char is actually an integer type: it stores the 16-bit Unicode code-unit value of the character in question.
You can look at something like http://asciitable.com to see the different values for different characters.
In Java, char literals represent UTF-16 code units. UTF-16 is a character encoding scheme that maps integer values (and the way they are stored in memory) to the corresponding characters (the graphical representation of each code unit).
You can enclose characters in single quotes, so you don't need to remember the UTF-16 values of the characters you use. You can still get the integer value of a char and store it, for example, in an int (but generally not in a short: both use 16 bits, but short values range from -32768 to 32767, while char values range from 0 to 65535).
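Those range rules can be sketched like this (the class name is illustrative):

```java
public class CharRangeDemo {
    public static void main(String[] args) {
        char ch = 'a';
        int asInt = ch;            // widening conversion: every char value fits in an int
        System.out.println(asInt); // 97

        // short cannot hold all char values: both are 16 bits, but short is signed.
        char big = '\uFFFF';         // 65535, the maximum char value
        short asShort = (short) big; // explicit cast required; the bit pattern reads as -1
        System.out.println(asShort); // -1

        // The char range, straight from the wrapper class:
        System.out.println((int) Character.MIN_VALUE + " .. " + (int) Character.MAX_VALUE); // 0 .. 65535
    }
}
```

This is why char-to-int assignment needs no cast, while char-to-short (and int-to-char, as in the (char) (i + 'a') example) does.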