Loures
Hello,
I am working to determine an optimum threshold for wavelet-transform image denoising, and I have the following question. Consider a vector of data d*:
d* = d + noise
where the noise is zero-mean with variance sigma, noise ~ N(0, sigma), and the signal d has variance sigma_x.
I have to infer the noise variance sigma and the signal variance sigma_x.
How can I do this following the Bayesian methodology of inference?
I have experience with Bayesian inference, but I have never faced this problem before.
Thank you,
Loures
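One way to set this up (a sketch of my own, not something stated in the post) is to treat the clean coefficients d as latent variables and run a Gibbs sampler with conjugate inverse-gamma priors on both variances. Everything below is an illustrative assumption: synthetic data standing in for wavelet coefficients, and arbitrary prior parameters a0, b0.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data standing in for wavelet coefficients:
# d* = d + noise, with d ~ N(0, sigma_x^2) and noise ~ N(0, sigma^2).
n = 5000
true_sigma_x, true_sigma = 2.0, 1.0
d_true = rng.normal(0.0, true_sigma_x, n)
d_star = d_true + rng.normal(0.0, true_sigma, n)

# Conjugate inverse-gamma priors IG(a0, b0) on both variances
# (prior parameters chosen arbitrarily for illustration).
a0, b0 = 2.0, 1.0
sig2, sigx2 = 1.0, 1.0  # initial values for sigma^2 and sigma_x^2
draws = []

for it in range(2000):
    # 1) Sample the latent clean signal d | d*, sigma^2, sigma_x^2.
    #    Conditionally Gaussian with variance v and mean v * d*/sigma^2.
    v = 1.0 / (1.0 / sigx2 + 1.0 / sig2)
    mu = v * d_star / sig2
    d_lat = mu + rng.normal(0.0, np.sqrt(v), n)

    # 2) Sample sigma^2 | d, d*  ~  IG(a0 + n/2, b0 + ||d* - d||^2 / 2)
    sig2 = 1.0 / rng.gamma(a0 + n / 2,
                           1.0 / (b0 + 0.5 * np.sum((d_star - d_lat) ** 2)))

    # 3) Sample sigma_x^2 | d  ~  IG(a0 + n/2, b0 + ||d||^2 / 2)
    sigx2 = 1.0 / rng.gamma(a0 + n / 2,
                            1.0 / (b0 + 0.5 * np.sum(d_lat ** 2)))

    if it >= 500:  # discard burn-in
        draws.append((sig2, sigx2))

post = np.mean(draws, axis=0)
print("posterior mean sigma^2   =", post[0])
print("posterior mean sigma_x^2 =", post[1])
```

One caveat worth noting: because d* is marginally N(0, sigma_x^2 + sigma^2), the likelihood only identifies the sum of the two variances, so the priors (or extra structure) are what separate them. In wavelet practice the noise variance is often pinned down independently, e.g. from the median absolute deviation of the finest-scale detail coefficients, and only sigma_x is then inferred.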