The requested length is too int
The type long is problematic in C because its exact size is not defined; in practice it can be either 32-bit or 64-bit on common platforms. Sometimes people will do a mass search-and-replace to change all variables declared as long to int in order to fix portability issues. If you do this, make sure you check the diffs, or you will end up publishing something nonsensical:
This function returns status values as follows:
CBS_INVARG requested file name length is too int
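For illustration, here's a hypothetical before-and-after of what such a blind replace does when a string literal happens to contain the word (the identifiers here are made up, not from any real codebase):

    /* Before the global search-and-replace: */
    long namelen = strlen(filename);
    if (namelen > MAX_NAME_LEN)
        return "requested file name length is too long";

    /* After blindly replacing every 'long' with 'int': */
    int namelen = strlen(filename);
    if (namelen > MAX_NAME_LEN)
        return "requested file name length is too int";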
Comments
Ah well, that does sound quite like a Clbuttic Mistake! (see http://thedailywtf.com/Articles/The-Clbu.. )
Marcel - 04 07 10 - 11:42
What's amazing is that people still think that strstr() is a good way to censor text. Nothing like trying to read a blog on a major website about the World Cup and seeing comments that refer to "compe***ion" and "ener***."
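For the curious, a minimal sketch of that kind of naive strstr()-based filter (hypothetical code, not from any real site), which stars out a banned substring even when it sits in the middle of an innocent word:

    #include <stdio.h>
    #include <string.h>

    /* Star out every occurrence of 'bad' inside 'text',
       with no regard for word boundaries. */
    static void censor(char *text, const char *bad) {
        size_t n = strlen(bad);
        for (char *p = text; (p = strstr(p, bad)) != NULL; p += n)
            memset(p, '*', n);
    }

    int main(void) {
        char s[] = "The competition heated up.";
        censor(s, "tit");
        puts(s);  /* prints: The compe***ion heated up. */
        return 0;
    }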
Phaeron - 04 07 10 - 12:14
'int' is not defined precisely either.
http://c-faq.com/decl/inttypes.html
dmh2000 - 04 07 10 - 18:38
Was it really necessary for you to post that? That's not the point I'm trying to make here.
Phaeron - 04 07 10 - 20:19
This is hilarious :D
mark - 05 07 10 - 03:59
There's a tale, I think on one of the MSDN blogs, where the guy was doing a presentation and, as usual, let the publicity people at the slides to make sure everything was copyrighted and trademarked properly. Unfortunately the publicity people either didn't know to leave the source code alone or just did a global find/replace, and so destroyed half the variable names.
Torkell (link) - 05 07 10 - 05:33
Yup. It was from Raymond Chen's blog, as usual:
http://blogs.msdn.com/b/oldnewthing/arch..
Phaeron - 05 07 10 - 07:00
When QBASIC shipped with DOS, IBM went through the help file and replaced all occurrences of Microsoft with IBM. Unfortunately "Microsoft" was used as an example string for the MID$ command.
Robert Claypool - 05 07 10 - 08:31
Would be longeresting to know the value of sizeof(long long) on platforms that have 64 bit longs.
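(For anyone with such a platform handy, a quick way to check:)

    #include <stdio.h>
    int main(void) {
        /* On typical LP64 systems both print 8; the standard only
           guarantees that long long is at least 64 bits. */
        printf("sizeof(long) = %zu\n", sizeof(long));
        printf("sizeof(long long) = %zu\n", sizeof(long long));
        return 0;
    }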
Gabest - 06 07 10 - 03:16
I apologize if I'm being Captain Obvious by suggesting this, but sometimes the obvious gets overlooked. If the only real goal is playing a console game and using the laptop as a TV, there may be a simple workaround. Does the laptop have a microphone port? Try using the microphone port for audio and the capture card plus Virtualdub for video and see if the latency goes away.
I don't have a game system, a laptop, or an ATI Theater 750 USB capture device to test this. I tested this using two DTA converter boxes tuned to the same channel. One was connected to a TV, and the other to the ATI TV Wonder 650 PCIe capture device installed in my desktop PC. The PC was 3-4 syllables behind the TV when using the capture card for both audio and video. When I set it up using the microphone port for audio and the capture card for video, it eliminated the delay as far as I can tell. The audio and video on the PC side still seemed pretty well synchronized even without going through the capture card.
Ms. R - 21 08 10 - 14:30
I posted in the wrong topic without realizing it. My apologies.
Ms.R - 21 08 10 - 16:29
What's amazing is that people actually write code in which they use types like "int" and "long". You can save yourself the portability grief later on by always using your own fixed-width typedefs. Then when your existing compilers were all LLP64 and the new one turns out to be LP64, you just change a couple of typedefs and everything is fine. Reasoning about the possible code behaviours on various platforms gets much easier when you know the actual sizes of your variables (though you still have to watch out for stupid integer promotion gotchas). Most people who think about what their code does at all will do that thinking with one "intended" size in mind when they look at a variable, and will overlook the possible behaviours if that type has some other size on that platform. Best to avoid that by making the sizes explicit (by naming the types "uint32" etc.).
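(A minimal sketch of what such a typedef header might look like, assuming an LLP64 Windows compiler versus an LP64 Unix one; modern code would just pull these from <stdint.h>:)

    #if defined(_MSC_VER)            /* LLP64: long is 32-bit */
    typedef int                sint32;
    typedef unsigned int       uint32;
    typedef long long          sint64;
    typedef unsigned long long uint64;
    #else                            /* assume LP64: long is 64-bit */
    typedef int                sint32;
    typedef unsigned int       uint32;
    typedef long               sint64;
    typedef unsigned long      uint64;
    #endif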
moo - 08 12 11 - 18:11
Yup, although you run into library functions and system calls that are still defined in terms of int.
The nastiest mess I remember with the standard types was with Lattice C on the 68000, which defined int as 32-bit even though the efficient integer size on the 68K was generally 16-bit. The result was that everyone purposely used short for values all over the place for speed, which led to headaches when porting the code to x86 and having to undo all of those shorts.
Phaeron - 08 12 11 - 18:22