Fraunhofer Institute of Optronics, System Technologies and Image Exploitation IOSB

Hierarchical Perceptual Grouping for Object Recognition

Theoretical Views and Gestalt-Law Applications


Offering new insights and proposing novel methods to advance the field of machine vision, this work will be of great benefit to students, researchers, and engineers active in the area.


Dr.-Ing. Eckart Michaelsen is a researcher at the Object Recognition Department of Fraunhofer IOSB, Ettlingen, Germany.

Dr.-Ing. Jochen Meidow is a researcher at the Scene Analysis Department of Fraunhofer IOSB, Ettlingen, Germany.

About this book

This unique text/reference presents a unified approach to the formulation of Gestalt laws for perceptual grouping, and the construction of nested hierarchies by aggregation utilizing these laws. The book also describes the extraction of such constructions from noisy images showing man-made objects and clutter. Each Gestalt operation is introduced in a separate, self-contained chapter, together with application examples and a brief literature review. These are then brought together in an algebraic closure chapter, followed by chapters that connect the method to the data – i.e., the extraction of primitives from images, cooperation with machine-readable knowledge, and cooperation with machine learning.

Topics and features:

  • offers the first unified approach to nested hierarchical perceptual grouping
  • presents a review of all relevant Gestalt laws in a single source
  • covers reflection symmetry, frieze symmetry, rotational symmetry, parallelism and rectangular settings, contour prolongation, and lattices
  • describes the problem from all theoretical viewpoints, including syntactic, probabilistic, and algebraic perspectives
  • discusses issues important to practical application, such as primitive extraction and any-time search
  • provides an appendix detailing a general adjustment model with constraints

Series Title

Advances in Computer Vision and Pattern Recognition
For more information, please visit the Springer website.