I think the iPad Mini 1 and iPhone 4s don't support the ETC2_A8 format. In that case, the font should fall back to the L8A8 format.
When an Image is loaded, the following function in Graphics\Image IO.cpp is called:
Code:
static Bool Load(Image &image, File &f, C ImageHeader &header, IMAGE_TYPE type_on_fail)
with the line:
Code:
if(image.createTryEx(want.size.x, want.size.y, want.size.z, want.type, want.mode, want.mip_maps, 1, type_on_fail)) // don't use 'want' after this call, instead operate on 'image' members
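The 'type_on_fail' contract can be sketched like this (a standalone illustration with made-up helper names and a stubbed hardware-support check, not the actual engine source):

```cpp
#include <cassert>

enum ImageType { IMAGE_NONE, IMAGE_ETC2_A8, IMAGE_L8A8 };

// stub: pretend hardware support check; assumption for illustration only -
// iPad Mini 1 / iPhone 4s would report no ETC2 support here
static bool hwSupports(ImageType type, bool device_has_etc2)
{
   if(type==IMAGE_ETC2_A8)return device_has_etc2;
   return type!=IMAGE_NONE; // uncompressed formats are assumed to always work
}

// sketch of the fallback contract: try 'type' first, then 'type_on_fail'
static ImageType createTryExSketch(ImageType type, ImageType type_on_fail, bool device_has_etc2)
{
   if(hwSupports(type        , device_has_etc2))return type;
   if(type_on_fail!=IMAGE_NONE
   && hwSupports(type_on_fail, device_has_etc2))return type_on_fail;
   return IMAGE_NONE; // creation failed completely
}
```

So on a device without ETC2 support, a request for ETC2_A8 with L8A8 as the fail type should come back as an L8A8 image.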
Then, inside Graphics\Image.cpp, the line:
Code:
glCompressedTexImage2D(GL_TEXTURE_2D, 0, ImageTI[hwType()].format, hwW(), hwH(), 0, size, temp.elms() ? temp.data() : null);
attempts to set up the texture in the desired compressed format.
If creating it in that format fails, the engine retries with another (uncompressed) one:
Code:
Bool ok=(glGetError()==GL_NO_ERROR);
if( !ok && type_on_fail)
In your case, 'ok' should be false, and the texture should get created as L8A8.
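The error-check-then-retry pattern boils down to something like this (a self-contained sketch with stubbed GL calls standing in for the real glCompressedTexImage2D/glGetError, purely for illustration):

```cpp
#include <cassert>

// stubbed GL error state for illustration; the real engine queries OpenGL ES
enum { STUB_GL_NO_ERROR=0, STUB_GL_INVALID_ENUM=0x0500 };
static int g_last_error=STUB_GL_NO_ERROR;

// stub of glCompressedTexImage2D: fails when the driver lacks the format
static void stubCompressedTexImage(bool format_supported)
{
   g_last_error = format_supported ? STUB_GL_NO_ERROR : STUB_GL_INVALID_ENUM;
}
// stub of glGetError: returns and clears the last error, like the real call
static int stubGetError() {int e=g_last_error; g_last_error=STUB_GL_NO_ERROR; return e;}

// mirrors the engine pattern: attempt compressed upload, check the error,
// fall back to an uncompressed format (e.g. L8A8) if one was provided
static bool uploadWithFallback(bool etc2_supported, bool have_type_on_fail, bool &used_fallback)
{
   used_fallback=false;
   stubCompressedTexImage(etc2_supported);
   bool ok=(stubGetError()==STUB_GL_NO_ERROR);
   if(!ok && have_type_on_fail)
   {
      used_fallback=true; // retry as uncompressed; assumed to always succeed here
      ok=true;
   }
   return ok;
}
```

On your devices the first attempt should fail, the fallback branch should be taken, and the upload should still succeed overall.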
Can you compile the engine in DEBUG mode and verify that this is indeed what happens when loading your "Font" inside "bool Init"?
1) run Esenthel Builder and compile iOS in DEBUG mode
2) Modify your code to the following:
Code:
void InitPre()
{
   EE_INIT();
 //Gui.default_skin = UID(2932164563, 1175793352, 1139732360, 3412372619);
}
bool Init()
{
   Font font;
   font.load(EncodeFileName(UID(1420520785, 1917302748, 4184509362, 3843416765)));
   return true;
}
void Shut()
{
}
bool Update()
{
   return true;
}
void Draw()
{
   D.clear(WHITE);
}
3) Insert a breakpoint at the font load, and verify what happens when the engine attempts to create and load that image by stepping into the call in the debugger.
I will do some more tests too.