The ASCII value of 'a' is 97.
When i = 0, i + 'a' = 0 + 97 = 97, which cast to char is 'a'.
When i = 1, i + 'a' = 1 + 97 = 98, which cast to char is 'b'.
When i = 2, i + 'a' = 2 + 97 = 99, which cast to char is 'c'.
...and so on.
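The pattern above can be sketched as a small loop (class and variable names are illustrative):

```java
public class AlphabetLoop {
    public static void main(String[] args) {
        // Adding an int to 'a' promotes the char to int (97),
        // so we cast the sum back to char to get a letter.
        for (int i = 0; i < 26; i++) {
            char letter = (char) (i + 'a');
            System.out.print(letter); // prints abcdefghijklmnopqrstuvwxyz
        }
        System.out.println();
    }
}
```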
Java char is a 16-bit integral type. 'a' is the same as 97, which you can see with System.out.println((int) 'a'); - it follows that 98 is 'b', and so on through the ASCII range (Unicode shares ASCII for its first 128 code points).
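A quick demonstration of that numeric identity (casts shown are standard Java behavior):

```java
public class CharAsInt {
    public static void main(String[] args) {
        System.out.println((int) 'a'); // cast char to int: 97
        System.out.println((char) 98); // cast int to char: b
        // Arithmetic on a char promotes it to int, so this prints 98, not 'b'
        System.out.println('a' + 1);
    }
}
```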
Apologies as I come from Python, but the Char datatype seems extremely pointless...
I'm just starting to learn Java, so can you shed some light on this?
char is a primitive type that represents a single 16-bit Unicode character (a UTF-16 code unit), while Character is a wrapper class that lets us use the char primitive in an object-oriented way.
Example for char:
char ch = 'a';
Example for Character:
char upper = Character.toUpperCase(ch);
It returns 'A'; the original ch is unchanged.
From the JavaDoc:
The Character class wraps a value of the primitive type char in an object. An object of type Character contains a single field whose type is char. In addition, this class provides several methods for determining a character's category (lowercase letter, digit, etc.) and for converting characters from uppercase to lowercase and vice versa.
Character information is based on the Unicode Standard, version 6.0.0.
So, char is a primitive type while Character is a class. You can use Character's static methods, such as Character.toUpperCase(char c), to work with char values in a more "OOP way".
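A short sketch of the distinction (the autoboxing and utility methods shown are standard Java):

```java
import java.util.List;

public class CharVsCharacter {
    public static void main(String[] args) {
        char ch = 'a';        // primitive value
        Character boxed = ch; // autoboxed into a Character object

        // Static utility methods on the Character class operate on the primitive
        System.out.println(Character.toUpperCase(ch)); // A
        System.out.println(Character.isLetter(ch));    // true
        System.out.println(Character.isDigit(ch));     // false

        // Only the object form can be stored in generic collections
        List<Character> letters = List.of(boxed, 'b', 'c');
        System.out.println(letters); // [a, b, c]
    }
}
```

This is why both exist: primitives are compact and fast, while the wrapper is needed wherever an Object is required (generics, collections, reflection).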
I imagine the mistake in your program was an OOP one (such as how a Character was initialized) rather than a char vs. Character confusion.