Why isn't my alpha blending working?

cyanprime
Posts: 3
Joined: Sun Jan 01, 2012 12:22 pm

Why isn't my alpha blending working?

Post by cyanprime » Sun Jan 01, 2012 12:26 pm

So I'm trying to get sprites billboarded like in Doom, and the first step is getting them to show up without a background.
So for example, I want to either make the texture use its own alpha channel, or make a color like magenta render as transparent. I've written this function in hopes of producing a transparent texture, but it doesn't seem to be working. Anyone know why?

Code:

int LoadGLTextures()
{
	sImage pcx;

	// load our texture
	loadPCX((u8*)drunkenlogo_pcx, &pcx);

	// convert to 16-bit, with palette index 0 as the transparent colour
	image8to16trans(&pcx, 0);

	// set the alpha bit on every texel
	for(unsigned i = 0; i < (TEXTURE_SIZE_128 * TEXTURE_SIZE_128); i++)
	{
		const u16 ALPHA_BIT = 1 << 15;
		pcx.image.data16[i] |= ALPHA_BIT;
	}

	glGenTextures(1, &texture[0]);
	glBindTexture(0, texture[0]);
	glTexImage2D(0, 0, GL_RGBA, TEXTURE_SIZE_128, TEXTURE_SIZE_128, 0, TEXGEN_TEXCOORD, pcx.image.data16);

	imageDestroy(&pcx);

	return TRUE;
}

elhobbs
Posts: 358
Joined: Thu Jul 02, 2009 1:19 pm

Re: Why isn't my alpha blending working?

Post by elhobbs » Mon Jan 02, 2012 3:43 am

GL_RGBA is 15-bit RGB + a 1-bit alpha channel (which is in bit 15). You are forcing all of the pixels in the source image to solid in your for loop.
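
In other words, image8to16trans() already does the alpha work: it leaves bit 15 clear on pixels matching the key colour (palette index 0 here) and sets it on everything else, so the simplest fix is to drop the loop entirely. A minimal sketch of the corrected function, assuming that libnds behaviour:

Code:

int LoadGLTextures()
{
	sImage pcx;

	// load our texture
	loadPCX((u8*)drunkenlogo_pcx, &pcx);

	// palette index 0 stays transparent (bit 15 clear); every other
	// pixel already gets bit 15 set by image8to16trans()
	image8to16trans(&pcx, 0);

	glGenTextures(1, &texture[0]);
	glBindTexture(0, texture[0]);
	glTexImage2D(0, 0, GL_RGBA, TEXTURE_SIZE_128, TEXTURE_SIZE_128, 0, TEXGEN_TEXCOORD, pcx.image.data16);

	imageDestroy(&pcx);

	return TRUE;
}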

cyanprime
Posts: 3
Joined: Sun Jan 01, 2012 12:22 pm

Re: Why isn't my alpha blending working?

Post by cyanprime » Mon Jan 02, 2012 6:30 am

elhobbs wrote:GL_RGBA is 15-bit RGB + a 1-bit alpha channel (which is in bit 15). You are forcing all of the pixels in the source image to solid in your for loop.
Ah, thank you very much, it works now :3
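
For the other approach from the opening post (keying out a colour like magenta instead of using a palette index), the same bit-15 rule applies: clear the bit on texels matching the key colour, set it everywhere else. A hypothetical sketch, assuming libnds's plain image8to16() conversion and its RGB15() colour macro; note that TEXTURE_SIZE_128 is an enum token in libnds rather than the number 128, so the loop runs over the image's real dimensions:

Code:

// hypothetical colour-key pass, replacing the image8to16trans() call
const u16 KEY = RGB15(31, 0, 31);   // magenta in RGB15 (5 bits per channel)

image8to16(&pcx);                   // plain 8-to-16-bit conversion, no key index

for(int i = 0; i < pcx.width * pcx.height; i++)
{
	u16 c = pcx.image.data16[i];
	if((c & 0x7FFF) == KEY)
		pcx.image.data16[i] = c & ~(1 << 15);   // key colour -> transparent
	else
		pcx.image.data16[i] = c | (1 << 15);    // everything else -> opaque
}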

