Ugh, did Oracle mis-implement jdeSnprintf?

jolly

... with standard snprintf, you should be able to "print" zero characters into a NULL buffer to find out how many characters the formatted output needs, e.g.

nSize = jdeSnprintf(NULL, 0, _J("%s %s"), szName, szValue);

You can then jdeAlloc (nSize + 1) * sizeof(JCHAR) (the extra character is for the null terminator) and know the buffer is exactly the right size.

However, when I call the above it immediately explodes with a memory violation. I suspect JDE have messed this up. It isn't #defined as a macro that maps to snprintf or snwprintf depending on the character set, either.

Remarkably, jdeSnprintf never appears within the application source code!

Is it just me or has anyone else run into this apparent flaw?

Cheers
 
Semi-Short answer:
Yeah, it probably isn't implemented the way you would expect. I wouldn't say it's "wrong" because:

a. There isn't a standard C snwprintf (Windows has _snwprintf, and I'm not sure even that works the way you expect).
b. If you know the size of the buffer to allocate, you don't really need jdeSnprintf; you can just use jdeSprintf.

I do use jdeSnprintf, but usually only when I have to output to a fixed-size buffer and simply want to prevent buffer overruns. You could try jdeSprintf instead and see if it works the way you expect, i.e. use it to get the size of the buffer and then use jdeSprintf again to output the formatted string into your dynamically allocated buffer.
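For what it's worth, that fixed-size usage is just a guard against overruns, roughly like the sketch below. The count argument is assumed to be in JCHARs, matching the call in the first post; check whether your release expects characters or bytes.

Code:
/* Guarded formatting into a fixed-size buffer.  Signature assumed from the
   first post: jdeSnprintf(buffer, count, format, ...) -- not verified. */
JCHAR szMsg[128];
jdeSnprintf(szMsg, sizeof(szMsg) / sizeof(JCHAR), _J("%s %s"), szName, szValue);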


Longer answer:
For the most part the "jde" versions of the string functions are identical to their standard C counterparts, but you can't always count on that. jdeStrcmp is a good example - JDE's implementation, although different, is much more useful in the context of JDE development, since most strings are going to have trailing spaces (the jde implementation ignores trailing spaces, so "Hello World" == "Hello World ").
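In code, that behaviour looks roughly like this (assuming jdeStrcmp returns 0 for a match, like strcmp does):

Code:
/* jdeStrcmp treats trailing blanks as insignificant (per the example above);
   strcmp/wcscmp would report these two strings as different. */
if (jdeStrcmp(_J("Hello World"), _J("Hello World ")) == 0)
{
    /* equal under JDE's string semantics */
}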

We are a Windows shop so I might get this wrong, but I think it all comes down to the fact that JCHAR is effectively defined as two bytes on all platforms.

So, JDE can't simply do

Code:
#define jdeSnprintf  swprintf
/* or, Windows only: */
#define jdeSnprintf  _snwprintf

because the code needs to be portable across multiple platforms, and unfortunately Windows and Linux differ in how Unicode is implemented. On Windows a wide character (wchar_t) is 2 bytes and on Linux it is 4 bytes; however, for whatever reason JDE defines JCHAR as two bytes on both platforms (I am sure JDE has a good reason for this). So while swprintf would work on Windows, it wouldn't work on Linux, since the Linux implementation would be expecting 4 bytes per character.
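You can see the mismatch with a trivial standalone check like this (the JCHAR typedef here is only an assumption for the demo, not JDE's actual definition):

Code:
#include <stdio.h>
#include <wchar.h>

typedef unsigned short JCHAR;   /* assumption: a 2-byte type on every platform */

int main(void)
{
    printf("sizeof(JCHAR)   = %zu\n", sizeof(JCHAR));    /* 2 everywhere             */
    printf("sizeof(wchar_t) = %zu\n", sizeof(wchar_t));  /* 2 on Windows, 4 on Linux */
    /* On Linux, handing a JCHAR string to swprintf would get it misread two
       bytes at a time as halves of 4-byte wchar_t characters. */
    return 0;
}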
 
Yeah, it's kind of dumb that there's no snwprintf. But if JDE are going to implement jdeSnprintf, they should at the very least make it behave like snprintf, inasmuch as you can safely "print" 0 characters to NULL in order to learn the right size when you are dynamically allocating a buffer. If the items being plugged into the print format are unknown at design time, you can never know what a safe size is.
 
I haven't tried it, but does jdeSprintf return the required buffer size when you pass NULL for the output buffer? If it does, you wouldn't even need jdeSnprintf.
 
That's unfortunate. There are ways, even in JDE, to create a portable (Windows/gcc) header-only lib for use in C BSFNs, so you could define something like getJdeSprintfBufferSize (I keep a common header-only lib for my organization for things like this), but it may be overkill here...

I don't know what your exact use case is, but your best bet is probably to estimate how big a buffer you need, alloc an even bigger one, and then use jdeSnprintf so there can't be a buffer overrun should something unexpected happen. If JCHAR[n-1] contains a non-null character (i.e. you filled the entire buffer), reallocate a bigger buffer and try again, as in the sketch below.
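Here is a rough, untested sketch of that estimate-and-grow pattern. The jdeSnprintf signature (buffer, count in JCHARs, format, ...) is taken from the call in the first post, and the jdeAlloc/jdeFree arguments follow the usual COMMON_POOL / MEM_ZEROINIT pattern, so treat all of it as assumptions to verify against your headers:

Code:
#include <jde.h>   /* JCHAR, _J(), jdeAlloc, jdeFree, jdeStrlen, jdeSnprintf */

/* Sketch only: format "<name> <value>" into a dynamically sized buffer,
   growing the buffer whenever the formatted output fills it. */
static JCHAR * formatNameValue(const JCHAR *szName, const JCHAR *szValue)
{
    /* rough estimate of the output length, then generous padding */
    size_t nCount = (jdeStrlen(szName) + jdeStrlen(szValue) + 2) * 2;
    JCHAR  *szOut;

    for (;;)
    {
        /* MEM_ZEROINIT leaves the unused tail of the buffer as zeros */
        szOut = (JCHAR *)jdeAlloc(COMMON_POOL, nCount * sizeof(JCHAR), MEM_ZEROINIT);
        jdeSnprintf(szOut, nCount, _J("%s %s"), szName, szValue);

        /* If the last two slots were never written, nothing was cut off --
           true whether jdeSnprintf null-terminates on truncation
           (snprintf-style) or not (_snprintf-style). */
        if (szOut[nCount - 2] == 0 && szOut[nCount - 1] == 0)
            return szOut;                 /* caller jdeFree()s this when done */

        jdeFree(szOut);                   /* buffer filled up: double it and retry */
        nCount *= 2;
    }
}

The two-slot check is deliberately conservative: at worst it costs one extra reallocation when the output exactly fills the buffer.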
 