A decimal literal is an integer literal that is written in base 10. This is the normal way of writing numbers that we are all used to. Examples of decimal literals are 123 and 400000000.
Normally, when you write a decimal literal it has the type int (assuming you don't add a suffix such as u or L). If int is not big enough to hold the value, the compiler uses long instead, and if long is not big enough either, it uses long long.
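Here is a minimal sketch (assuming a C++17 compiler) that uses decltype to check which type the compiler actually picks. The second assertion additionally assumes a platform with 32-bit int and 64-bit long, such as x86-64 Linux:

```cpp
#include <type_traits>

// 123 fits in int on every conforming implementation, so its type is int.
static_assert(std::is_same_v<decltype(123), int>);

// Assumption: 32-bit int and 64-bit long (e.g. x86-64 Linux).
// 3000000000 does not fit in a 32-bit int, so the literal's type is long.
static_assert(std::is_same_v<decltype(3000000000), long>);

int main() {}
```

On Windows, where long is also 32 bits, the same literal 3000000000 would get the type long long instead.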
Integer literals starting with 0 are octal literals, written in base 8, which means they use only the digits 0-7. Examples of octal literals are 010 (= 8) and 0755 (= 493).
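The digit-by-digit expansion can be checked directly:

```cpp
#include <cassert>

int main() {
    // A leading 0 switches the literal to base 8.
    assert(010  == 8);    // 1*8 + 0
    assert(0755 == 493);  // 7*64 + 5*8 + 5 (the classic Unix permission mask)
}
```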
Integer literals starting with 0x (or 0X) are hexadecimal literals, written in base 16, which uses the letters A-F (or a-f) as extra digits for the values 10-15. Examples of hexadecimal literals are 0xFF (= 255) and 0x10 (= 16).
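And the same check for base 16:

```cpp
#include <cassert>

int main() {
    // A leading 0x switches the literal to base 16; A-F stand for 10-15.
    assert(0xFF == 255);   // 15*16 + 15
    assert(0x10 == 16);    // 1*16 + 0
    assert(0xff == 0xFF);  // lowercase digits mean the same thing
}
```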
The type of an octal or hexadecimal literal is chosen in a similar way to that of a decimal literal. The difference is that unsigned types are also considered. The default type is still int. If int is not big enough to hold the value, the compiler tries unsigned int. If unsigned int is not big enough, it tries long, then unsigned long, then long long, then unsigned long long.
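A sketch of where this matters, again assuming a 32-bit int: the value 2147483648 is one past INT_MAX, and the hexadecimal spelling ends up unsigned while the decimal spelling does not:

```cpp
#include <type_traits>

// Assumption: 32-bit int. 0x7FFFFFFF (2147483647) is INT_MAX and fits,
// so the literal is still a plain int.
static_assert(std::is_same_v<decltype(0x7FFFFFFF), int>);

// 0x80000000 (2147483648) does not fit in int but fits in unsigned int,
// so this hexadecimal literal has type unsigned int.
static_assert(std::is_same_v<decltype(0x80000000), unsigned int>);

// The same value as a decimal literal skips the unsigned types and
// becomes long (or long long) instead.
static_assert(!std::is_same_v<decltype(2147483648), unsigned int>);

int main() {}
```

One practical consequence: with 32-bit int, writing -0x80000000 negates an unsigned int, so the result wraps around to 2147483648 instead of being a negative number.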