We report on research conducted as part of the Universal Cognitive User Interface (UCUI) project, which aims to develop a universal, autarkic module for intuitive interaction with technical devices. First, we present an empirical study of image schemas as basic building blocks of human knowledge. Image schemas have been studied extensively in cognitive linguistics, but insufficiently in the context of human-computer interaction (HCI). Some image schemas develop early, at pre-verbal stages (e.g., up-down), and may thus exert a greater influence on human knowledge than image schemas that develop later (e.g., centre-periphery). To investigate this in HCI contexts, we conducted a speech interaction task using a Wizard of Oz paradigm. Our results show that users apply early image schemas more frequently than late ones; early image schemas should therefore be given preference in interface design. The second part of this contribution focuses on the appropriate representation and processing of semantics. We introduce novel theoretical work on feature-value relations and Petri net transducers, and discuss its impact on the behaviour control of cognitive systems. In addition, we illustrate details of the implementation regarding learning strategies and the graphical user interface.