
ConsoleDemoInit forces VRAM_C, but ConsoleInit requires it?

Posted: Sun Aug 07, 2011 10:32 pm
by Discostew
I know ConsoleDemoInit sets VRAM_C for use, but for some reason, when setting up the console myself like this...

Code: Select all

	videoSetModeSub( MODE_0_2D  );
	vramSetBankI( VRAM_I_SUB_BG_0x06208000 );
	consoleInit( NULL, 0, BgType_Text4bpp, BgSize_T_256x256, 23, 2, false, true );
...it doesn't work. The odd thing is that it used to work before (it is part of the Paletted_Cube example). Odder still, the console works as long as I don't touch vramSetBankC, or if I use it only to set the bank to VRAM_C_SUB_BG. If I set that bank to anything else, the console breaks, even though the code above doesn't use bank C at all. For some reason, it just requires VRAM_C.

Re: ConsoleDemoInit forces VRAM_C, but ConsoleInit requires

Posted: Mon Aug 08, 2011 2:19 pm
by elhobbs
You may want to check out a simple example in the DeSmuME emulator; it has tools to inspect the VRAM mapping as well as each background independently. The mapping you used looks right to me, so your issue may be a background priority problem rather than a mapping/VRAM configuration one. You are using background 0, but I think background 1 may be covering it.

Re: ConsoleDemoInit forces VRAM_C, but ConsoleInit requires

Posted: Sat Aug 13, 2011 9:08 pm
by Discostew
I'm thinking it was a fluke in DeSmuME, as now, after multiple tests to find the problem, the exact same code I originally had the problem with shows no problems at all. Not sure what was going on. One thing I did find with my tests was that, examining the VRAM state right after the "main" entry point (reading VRAM_CR), it held a value of 0x84848281, which stood for...

VRAM_A = MAIN_BG | ENABLE
VRAM_B = MAIN_SPRITE | ENABLE
VRAM_C = SUB_BG | ENABLE
VRAM_D = SUB_SPRITE | ENABLE

Not sure if this is the work of DeSmuME or of the binary's initialization code that runs before the actual "main" entry point is reached.

Re: ConsoleDemoInit forces VRAM_C, but ConsoleInit requires

Posted: Sat Aug 13, 2011 10:08 pm
by WinterMute
That's the init code. At some point we decided it would be a good idea to have a simple default VRAM setup in place on reaching main; later it was extended to clear VRAM too. The default can be overridden by implementing vramDefault in your own code (with C linkage). The current code looks like this:

Code: Select all

//---------------------------------------------------------------------------------
u32 __attribute__((weak)) vramDefault() {
//---------------------------------------------------------------------------------

	// map all VRAM banks to lcdc mode
	VRAM_CR = 0x80808080;
	VRAM_E_CR = 0x80;
	VRAM_F_CR = 0x80;
	VRAM_G_CR = 0x80;
	VRAM_H_CR = 0x80;
	VRAM_I_CR = 0x80;

	dmaFillWords(0, BG_PALETTE, (2*1024));	// clear main and sub palette
	dmaFillWords(0, OAM, 2*1024);			// clear main and sub OAM
	dmaFillWords(0, VRAM, 656*1024);		// clear all VRAM


	return vramSetPrimaryBanks(VRAM_A_MAIN_BG, VRAM_B_MAIN_SPRITE, VRAM_C_SUB_BG, VRAM_D_SUB_SPRITE);
}
There's a chance this could backfire if you're setting banks explicitly. For instance, in your example, if you moved bank C after your consoleInit call then I believe you'd lose the font, on hardware at least. I haven't checked yet, but I think that when banks occupy the same memory, priority runs from A to I, high to low.

Re: ConsoleDemoInit forces VRAM_C, but ConsoleInit requires

Posted: Sat Aug 13, 2011 10:53 pm
by Discostew
WinterMute wrote:There's a chance this could backfire if you're setting banks explicitly. For instance, in your example, if you moved bank C after your consoleInit call then I believe you'd lose the font, on hardware at least. I haven't checked yet, but I think that when banks occupy the same memory, priority runs from A to I, high to low.
Alright, I think that was the issue. I began having the problem again, but then I set the primary banks to LCD mode before my console code and changed them again afterwards to what I needed, and now it works just fine.

Re: ConsoleDemoInit forces VRAM_C, but ConsoleInit requires

Posted: Sun Aug 14, 2011 2:45 am
by mtheall
WinterMute wrote:There's a chance this could backfire if you're setting banks explicitly. For instance, in your example, if you moved bank C after your consoleInit call then I believe you'd lose the font, on hardware at least. I haven't checked yet, but I think that when banks occupy the same memory, priority runs from A to I, high to low.
Even if you found this to be true in some scenario, I wouldn't count on it being true in every scenario. I recommend always initializing every VRAM bank explicitly for the purposes you want.

Re: ConsoleDemoInit forces VRAM_C, but ConsoleInit requires

Posted: Sun Aug 14, 2011 10:48 am
by WinterMute
That was an observation and a possible explanation for the problem, not a recommendation or advice.