Hi Adam
I'm currently processing an OSC image of the IC405 area. I've used SXT to remove the stars and am using DBE in subtraction mode to remove some gradients in the starless image. I found that after applying DBE I lost a lot of contrast: the mean level in the image shot up from around 0.03 in each channel to 0.35. This surprised me; I'd never noticed it happening before, and I'd expected the mean to be lower after subtracting the gradient, not higher.
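In case it helps show what I mean, here's a toy NumPy sketch of what I suspect is going on. This is only my guess at the arithmetic (in particular the rescale step at the end), not anything I've confirmed about how DBE actually handles the subtraction result:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear image: faint nebulosity around 0.03 plus a gradient of similar size.
h, w = 100, 100
signal = 0.03 + 0.005 * rng.random((h, w))
gradient = np.tile(np.linspace(0.0, 0.02, w), (h, 1))
image = signal + gradient

# Hypothetical background model that slightly misestimates the true gradient.
model = gradient + 0.01 * rng.standard_normal((h, w))

diff = image - model  # raw subtraction; residuals are tiny and some dip below zero
print("mean before subtraction:", image.mean())  # ~0.04
print("raw difference mean:    ", diff.mean())   # ~0.03, lower, as I'd expected

# My guess: without Normalize, the result gets rescaled to fill [0,1],
# which stretches the tiny residual range and pushes the mean well up.
rescaled = (diff - diff.min()) / (diff.max() - diff.min())
print("rescaled mean:          ", rescaled.mean())  # well above the original level
```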
But then I noticed the Normalize checkbox (I've always wondered when I should use this). It was unchecked, so I enabled it, and with this setting the mean levels after gradient removal were similar to those before running DBE, and the resulting image looked much better. Can you explain when I should and shouldn't use the Normalize option in DBE?
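My (possibly wrong) mental model of what Normalize does is a simple linear adjustment that brings the corrected image's median back to the original's. Again, just an illustrative sketch with made-up numbers:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy corrected (gradient-subtracted) image, plus the pre-DBE median we
# want to get back to; the numbers are illustrative, not from my real data.
corrected = 0.02 + 0.005 * rng.random((100, 100))
original_median = 0.03

# My understanding of Normalize: a linear shift so the corrected image's
# median matches the original image's median.
normalized = corrected + (original_median - np.median(corrected))
print("corrected median: ", np.median(corrected))   # a bit below 0.03
print("normalized median:", np.median(normalized))  # ~0.03, like before DBE
```

If that's roughly right, it would explain why the mean levels matched once I ticked the box, but I'd like to understand when leaving it off is the correct choice.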
Thanks
Gordon