There is no difference. Chars are numbers. What makes them appear as characters is a set of special overloads for them, plus a thing called a character table, which defines the mapping from a character value to the actual symbol on screen.
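A minimal sketch of the overload part, assuming ASCII: `std::cout`'s `operator<<` has a `char` overload that prints a glyph, while the `int` overload prints digits, even though the stored bits are identical.

```cpp
#include <iostream>

int main() {
    char c = 97;                               // same bit pattern either way
    std::cout << c << '\n';                    // char overload: prints "a"
    std::cout << static_cast<int>(c) << '\n';  // int overload: prints "97"
}
```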
Everything in a computer is represented by numbers in binary. Decimal, octal, hexadecimal, and so on are just different ways we humans express numbers, and any given binary number can freely be expressed in whichever of those bases we choose.
Characters have no intrinsic numeric representation, so people came up with ASCII, EBCDIC, Unicode, and other character sets to assign them one so they can be represented in computers.
In memory, the binary sequence 01100001 could represent many things; what it actually means depends entirely on how our programs interpret it.
As a number, that sequence can be read as decimal 97, hex 61, octal 141, or in any other base you wish to use. Read as an ASCII character, it represents 'a'.
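Here is a short sketch printing that same byte in each of the bases above and then as a character (the `0b` binary literal needs C++14; the unary `+` promotes the byte to `int` so the stream prints it numerically):

```cpp
#include <iostream>

int main() {
    unsigned char byte = 0b01100001;               // the bit pattern from above

    std::cout << std::dec << +byte << '\n';        // 97  (decimal)
    std::cout << std::hex << +byte << '\n';        // 61  (hexadecimal)
    std::cout << std::oct << +byte << '\n';        // 141 (octal)
    std::cout << static_cast<char>(byte) << '\n';  // a   (as an ASCII character)
}
```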
In C++, char is an integral type. That means it differs from int essentially only in size (and in signedness, which for plain char is implementation-defined). It can be passed to functions expecting int, short, long, or other integral types, and you can perform math operations on it: 'a' * '0' % '!' is a valid expression, because characters are integral values. This is unlike, say, pointer types, which are a separate kind of entity: they can be explicitly cast to and from integral values, but they are not integral types themselves.
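A sketch of both points, assuming ASCII values ('a' = 97, '0' = 48, '!' = 33); the `square` function here is just a hypothetical example of a function expecting int:

```cpp
#include <iostream>

int square(int x) { return x * x; }  // expects int, happily accepts a char

int main() {
    // The char literals promote to int before the arithmetic happens:
    int result = 'a' * '0' % '!';    // 97 * 48 % 33 == 4656 % 33 == 3
    std::cout << result << '\n';     // prints 3

    // A char converts implicitly where an int is expected:
    std::cout << square('a') << '\n';  // 97 * 97 == 9409
}
```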