Low-cost robots, such as vacuum cleaners or lawn mowers, employ simplistic and often random navigation policies. Although a large number of sophisticated localization and planning approaches exist, they require additional sensors such as LIDAR, cameras, or time-of-flight sensors. In this work, we propose a global localization method biologically inspired by simple insects such as the desert ant Cataglyphis, which is able to return to its nest from distant locations with limited or no perceptual cues. As in Cataglyphis, the underlying idea of our localization approach is to first compute a pose estimate from proprioceptive sensors only, using land navigation, and thereafter refine this estimate through a systematic search in a particle filter that integrates the rare visual feedback. In simulation experiments in multiple environments, we demonstrate that this bioinspired principle can be used to compute accurate pose estimates from binary visual cues only. Such intelligent localization strategies can improve the performance of any robot with limited sensing capabilities, such as household robots or toys.
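
The described combination of proprioceptive prediction and sparse binary visual updates can be illustrated with a minimal particle-filter sketch. The following Python code is not the authors' implementation; it assumes a planar pose (x, y, theta), a noisy odometry prediction step, a hypothetical binary "cue detected / not detected" observation model with illustrative parameters (cue_range, p_hit, p_false), and standard systematic resampling.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, delta_dist, delta_theta, noise=(0.02, 0.01)):
    """Propagate (x, y, theta) particles with a noisy odometry step (dead reckoning)."""
    n = len(particles)
    d = delta_dist + rng.normal(0.0, noise[0], n)
    a = delta_theta + rng.normal(0.0, noise[1], n)
    particles[:, 2] += a
    particles[:, 0] += d * np.cos(particles[:, 2])
    particles[:, 1] += d * np.sin(particles[:, 2])
    return particles

def update_binary(particles, weights, observed, cue_pos,
                  cue_range=1.0, p_hit=0.9, p_false=0.05):
    """Reweight particles from a binary cue: was the landmark detected or not?"""
    dist = np.linalg.norm(particles[:, :2] - cue_pos, axis=1)
    expected = dist < cue_range                      # particles that should see the cue
    likelihood = np.where(expected == observed, p_hit, p_false)
    weights *= likelihood
    weights /= weights.sum()
    return weights

def resample(particles, weights):
    """Systematic resampling to concentrate particles on likely poses."""
    n = len(weights)
    positions = (np.arange(n) + rng.random()) / n
    idx = np.minimum(np.searchsorted(np.cumsum(weights), positions), n - 1)
    return particles[idx].copy(), np.full(n, 1.0 / n)

# Minimal usage: particles spread over a 10 m x 10 m area, one motion step,
# one binary (non-)detection of a hypothetical cue at the origin.
particles = np.column_stack([rng.uniform(0, 10, 500),
                             rng.uniform(0, 10, 500),
                             rng.uniform(-np.pi, np.pi, 500)])
weights = np.full(500, 1.0 / 500)
particles = predict(particles, delta_dist=0.5, delta_theta=0.1)
weights = update_binary(particles, weights, observed=False,
                        cue_pos=np.array([0.0, 0.0]))
particles, weights = resample(particles, weights)
print(np.average(particles[:, :2], axis=0, weights=weights))  # rough position estimate
```

In this sketch, the prediction step alone corresponds to the proprioceptive (land-navigation) estimate; the binary update and resampling stand in for the refinement from rare visual feedback described in the abstract.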