java - What information is lost by starting an edge detection method with a Gaussian blur? -
Full disclaimer: this is my first attempt at edge detection. I think I know software, but not necessarily computer vision or related areas.
I am reading about different edge detection methods, and many of them share a natural first step, which is to smooth the image. That sounds reasonable, since you don't want random noise interfering with any higher-level logic.
My question is: what useful information can be lost due to the Gaussian blur (if any)?
Useful contrast information is lost. Blurring (not only Gaussian blur) averages neighboring image intensities, which naturally reduces contrast. In signal-processing terms, this operation acts as a low-pass filter. An example of what happens with a very aggressive blur:
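To make the contrast loss concrete, here is a minimal self-contained Java sketch (class and method names are my own, not from any library): it convolves a 1D step edge with a normalized Gaussian kernel and measures the maximum adjacent-pixel difference, a crude proxy for edge strength. The bigger the sigma, the weaker the measured edge.

```java
public class BlurContrast {

    // Build a normalized 1D Gaussian kernel for a given sigma.
    static double[] gaussianKernel(double sigma) {
        int radius = (int) Math.ceil(3 * sigma);
        double[] k = new double[2 * radius + 1];
        double sum = 0;
        for (int i = -radius; i <= radius; i++) {
            k[i + radius] = Math.exp(-(i * i) / (2 * sigma * sigma));
            sum += k[i + radius];
        }
        for (int i = 0; i < k.length; i++) k[i] /= sum; // normalize so intensities are averaged
        return k;
    }

    // Convolve a 1D signal with the kernel, replicating the border pixels.
    static double[] convolve(double[] signal, double[] kernel) {
        int r = kernel.length / 2;
        double[] out = new double[signal.length];
        for (int i = 0; i < signal.length; i++) {
            double acc = 0;
            for (int j = -r; j <= r; j++) {
                int idx = Math.min(signal.length - 1, Math.max(0, i + j));
                acc += signal[idx] * kernel[j + r];
            }
            out[i] = acc;
        }
        return out;
    }

    // Maximum absolute difference between adjacent samples: a crude edge-strength measure.
    static double maxGradient(double[] s) {
        double m = 0;
        for (int i = 1; i < s.length; i++) {
            m = Math.max(m, Math.abs(s[i] - s[i - 1]));
        }
        return m;
    }

    public static void main(String[] args) {
        // A step edge: intensity jumps from 0 to 255 halfway along the row.
        double[] row = new double[64];
        for (int i = 32; i < 64; i++) row[i] = 255;

        System.out.println("sharp edge gradient: " + maxGradient(row));
        System.out.println("after sigma=1 blur:  " + maxGradient(convolve(row, gaussianKernel(1))));
        System.out.println("after sigma=5 blur:  " + maxGradient(convolve(row, gaussianKernel(5))));
    }
}
```

Running this, the sharp edge has gradient 255, and each increase in sigma shrinks it further: the step gets spread over more pixels, so an edge detector that thresholds on gradient magnitude may miss weak edges entirely after a heavy blur. That is the trade-off: sigma large enough to suppress noise, small enough to keep real edges detectable.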