1 decade ago by mkreitler
Hi all,
I'm porting my Impact game (http://www.freegamersjournal.com/boxt) from vanilla Impact to iOSImpact, and I'm running into the following problem.
I have a class called "UserFont" that extends the Font class. UserFont calls the Font constructor to load its texture, then overrides the widthMap, indices, and first-character variables with parameters passed to the constructor. The game correctly creates my UserFont instance and calls into the base class constructor. Once there, the call to glGenTextures() always returns 0, which prevents proper initialization of the JavaScript object.
Here is the declaration of JS_UserFont (in JS_UserFont.h):
@interface JS_UserFont : JS_Font { }
@end
Here is its constructor (note that I'm not overriding the widthMap, indices, or first character yet, but that's irrelevant because the [super init] call fails to create a valid JavaScript object):
- (id)initWithContext:(JSContextRef)ctx object:(JSObjectRef)obj
                 argc:(size_t)argc argv:(const JSValueRef [])argv {
    if( self = [super initWithContext:ctx object:obj argc:argc argv:argv] ) {
        // Override widthMap, indices, and firstChar here...
    }
    return self;
}
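For what it's worth, my plan for that override block is to unpack the extra arguments with the JavaScriptCore C API, roughly like this (a rough sketch; how JS_Font actually stores its widthMap internally may differ, so the final store is just a placeholder):

// Rough sketch: read the widthMap array (argv[1]) via the
// JavaScriptCore C API. The actual widthMap storage inside
// JS_Font may differ.
if( argc > 1 && JSValueIsObject(ctx, argv[1]) ) {
    JSObjectRef widthArray = JSValueToObject(ctx, argv[1], NULL);

    // Read the JS array's length property to size the loop.
    JSStringRef lengthName = JSStringCreateWithUTF8CString("length");
    size_t count = (size_t)JSValueToNumber(ctx,
        JSObjectGetProperty(ctx, widthArray, lengthName, NULL), NULL);
    JSStringRelease(lengthName);

    for( size_t i = 0; i < count; i++ ) {
        JSValueRef entry = JSObjectGetPropertyAtIndex(ctx, widthArray, (unsigned)i, NULL);
        int width = (int)JSValueToNumber(ctx, entry, NULL);
        // ...store width in this font's width map...
    }
}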
Here is how I invoke the object from my game.js module:
numberFont: new native.UserFont('media/boxt/images/fat_font.png', [40, 22, 31, 33, 32, 32, 34, 30, 39, 37], [ 9, 64, 105, 154, 202, 250, 298, 350, 393, 448], 48),
Tracing the code in the debugger, I see the UserFont getting created and calling into the base class constructor.
My understanding is that glGenTextures() usually returns 0 when no valid OpenGL context is current on the calling thread. Am I somehow not passing the correct context to the base class constructor?
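To check that theory, I'm thinking of dropping a sanity check like this right before the glGenTextures() call in the Font constructor (this assumes iOSImpact renders with OpenGL ES 1.x; the NSLog messages are just for illustration):

#import <OpenGLES/EAGL.h>
#import <OpenGLES/ES1/gl.h>

// If no EAGLContext is current on this thread, glGenTextures
// silently hands back 0 instead of a valid texture name.
if( [EAGLContext currentContext] == nil ) {
    NSLog(@"No current EAGLContext when creating the font texture!");
}

GLuint textureId = 0;
glGenTextures(1, &textureId);
NSLog(@"glGenTextures -> %u, glGetError -> 0x%04x", textureId, glGetError());

If currentContext comes back nil, that would point to the UserFont being constructed before (or on a different thread than) the GL context setup, rather than to the parameters I'm passing.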
Any help is appreciated...
Thanks,
Mark