Why does the compiler complain "integer number too big" when I declare "int a = 0000009"?

 
Greenhorn
Posts: 3
This doesn't make sense.

int a = 0000009;

It should automatically be converted to "9", shouldn't it?

 
Bartender
Posts: 612
Greetings,

You need to do a little research into int literals.

As a hint: integer literals that start with a 0 are octal, not decimal, and there is no such thing as a 9 digit in octal (nor an 8).
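
To make that concrete, here is a small sketch (the class name OctalDemo is just for illustration); the commented-out line reproduces the error from the question:

public class OctalDemo {
    public static void main(String[] args) {
        int a = 0000007; // leading 0 makes this an octal literal; octal 7 equals decimal 7
        int b = 010;     // octal 10 equals decimal 8, a classic surprise
        int c = 9;       // a plain decimal literal: just drop the leading zeros
        // int d = 0000009; // won't compile: 9 is not a valid octal digit
        System.out.println(a); // prints 7
        System.out.println(b); // prints 8
        System.out.println(c); // prints 9
    }
}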

-steve
 