The Joel on Software Discussion Group (CLOSED)

A place to discuss Joel on Software. Now closed.

int64/long long xcode/gcc, hardcore...

I'm compiling a static library using Xcode (the latest version, I believe), and I'm getting a compile error that doesn't seem like a real error. The error message is:

"error: Integer constant is too large for 'long' type"

Now, the code is part of an encryption package that runs on Windows.

Here's the code snippet:

const long long sha512_K[80] = {
    0x428a2f98d728ae22, 0x7137449123ef65cd, 0xb5c0fbcfec4d3b2f,

So it's complaining about the constants. The problem is:
"How do I get 64-bit integers, and 64-bit integer math, working side by side with the standard 32-bit integers?"

On Windows it's simple; the compiler provides "__int64".

So, does Apple suck, or is this possible? I looked up stuff on the web; it basically said "long long" means 64-bit for GCC, but it doesn't compile that way. I tried messing with the settings, switching 64-bit math on and off in Xcode; still no go.

thanks.
lemon obrien on his mac
Monday, February 04, 2008
 
 
> it basically said "long long" for gcc compiler's meant 64 bit, but it doesn't compile that way

What version of GCC? Are you compiling C or C++? What error did you get? Was it really an error, or a warning like this:

test.c:6: warning: ISO C90 does not support 'long long'

If you got that warning but it failed to compile, your problem lies somewhere else.

FWIW, this program compiles as both C and C++ on my system, with or without -ansi. It only gives a warning for "long long" if I both specify -pedantic and omit --std=c99.

#include <stdint.h>

int main(void)
{
  int64_t n = 0;
  long long m = 0;
  return n + m;
}
clcr
Monday, February 04, 2008
 
 
I also have no problem using 'long long' with Xcode / GCC 4 on OS X 10.5 (and also had no problem with GCC 3 before I upgraded from OS X 10.4)...
JJ
Monday, February 04, 2008
 
 
The problem is that the literal constants, as written, default to int size, so they overflow 32 bits. That is what the compiler is complaining about. Add an 'LL' suffix:

0x428a2f98d728ae22LL,
0x7137449123ef65cdLL,
0xb5c0fbcfec4d3b2fLL,
.
Monday, February 04, 2008
 
 
Thanks, it was the "LL" suffix on the numbers. I thought something like that would just be automatic. Oh well; I wonder if I just broke the Windows version.

thanks.

BTW, I'm using the latest and greatest from Apple: Xcode 3.0, with GCC 4.?, I believe.

Thanks again. I never would have thought of manually suffixing the constants.
lemon obrien on his mac
Monday, February 04, 2008
 
 
Actually, the last one does not fit in a "signed long long"; it's an "unsigned long long" value. You probably have to add the ULL suffix.
Someone else
Monday, February 04, 2008
 
 
It looks like GCC will produce correct code for the 64-bit constants, even with the warning.

This is GCC 3.4.4 under Cygwin, x86 32-bit CPU.
frustrated
Monday, February 04, 2008
 
 
Yep, looks like the ULL is needed. Or else consider splitting it up into 32-bit chunks?

const unsigned long sha512_K[160] = {
    0x428a2f98, 0xd728ae22, 0x71374491, 0x23ef65cd, 0xb5c0fbcf, 0xec4d3b2f,

...?
Architecture Astronaut
Tuesday, February 05, 2008
 
 
I always get cold shudders down my back when I see someone take an algorithm designed for unsigned integer operations, with constants obviously meant to be unsigned, and assign them to signed integers... What's wrong with "unsigned long long"?
Secure
Tuesday, February 05, 2008
 
 

This topic is archived. No further replies will be accepted.

 