There is a wide range of methodologies for policy evaluation and socio-economic impact assessment. A fundamental distinction can be made between micro and macro approaches. In contrast to micro models, which focus on the micro-unit, macro models are used to analyze aggregate variables. The ability of microsimulation models to capture interactions occurring at the micro-level makes them particularly suitable for modeling complex real-world phenomena. The inclusion of a behavioral component into microsimulation models provides a framework for assessing the behavioral effects of policy changes.
The labor market is a primary area of interest for both economists and policy makers. The projection of labor-related variables is particularly important for assessing economic and social development needs, as it provides insight into the potential trajectory of these variables and can be used to design effective policy responses. As a result, the analysis of labor market behavior is a primary area of application for behavioral microsimulation models. Behavioral microsimulation models allow for the study of second-round effects, including changes in hours worked and participation rates resulting from policy reforms. It is important to note, however, that most microsimulation models do not consider the demand side of the labor market.
The combination of micro and macro models offers a promising way to integrate the strengths of both approaches. Of particular relevance is the combination of microsimulation models with general equilibrium models, especially computable general equilibrium (CGE) models. CGE models are structural macroeconomic models, defined by their grounding in economic theory; time series models form another important category of macroeconomic models. This thesis examines the potential for linking micro and macro models. The different types of microsimulation models are presented, with special emphasis on discrete-time dynamic microsimulation models. The concept of behavioral microsimulation is introduced to demonstrate the integration of a behavioral element into microsimulation models. To this end, the concept of utility is introduced and the random utility approach is described in detail. In addition, a brief overview of macro models is given, with a focus on general equilibrium models and time series models. Various approaches for linking micro and macro models, categorized as either sequential or integrated approaches, are presented. Furthermore, the concept of link variables, which play a central role in combining both models, is introduced. The focus is on the most complex sequential approach, i.e., the bi-directional linking of behavioral microsimulation models with general equilibrium macro models.
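The bi-directional sequential linking of a behavioral micro model with a macro model can be illustrated by a deliberately stylized sketch (all functions and numbers below are invented for illustration and are not taken from the thesis): a micro model of individual labor supply and a toy macro wage equation exchange their link variables, aggregate labor supply and the wage, until both are mutually consistent.

```python
# Toy sketch of bi-directional sequential micro-macro linking. The micro
# model aggregates individual decisions; the macro model feeds back a price.
# Link variables: aggregate labor supply (micro -> macro), wage (macro -> micro).

def micro_labor_supply(wage, reservation_wages):
    """Micro model: each individual works if the market wage reaches
    their reservation wage; returns aggregate labor supply."""
    return sum(1 for r in reservation_wages if wage >= r)

def macro_wage(labor_supply, demand_intercept=100.0, demand_slope=0.5):
    """Stylized macro model: wage read off a linear labor demand curve."""
    return demand_intercept - demand_slope * labor_supply

def link_models(reservation_wages, wage=50.0, damping=0.5,
                tol=1e-6, max_iter=1000):
    """Iterate micro step and macro step with damping until the wage
    (the link variable sent back to the micro model) converges."""
    for _ in range(max_iter):
        supply = micro_labor_supply(wage, reservation_wages)  # micro step
        new_wage = macro_wage(supply)                         # macro step
        if abs(new_wage - wage) < tol:
            break
        wage += damping * (new_wage - wage)                   # damped update
    return wage, supply

reservation = [20 + i for i in range(100)]  # heterogeneous micro units
wage, supply = link_models(reservation)
```

Here a damped fixed-point iteration stands in for the far richer convergence schemes used in actual micro-macro linkages.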
The use of drones has increased significantly in recent years, owing among other things to their improved performance, good availability, and ease of use. This has enabled research applications that were previously impossible or prohibitively expensive. In research, a camera is frequently used as the data-recording sensor: combined with a drone, areas can be overflown easily and inexpensively for exploration, observation, or monitoring. Besides conventional cameras, multispectral cameras and lidar are also widely used, whereas radar has so far found hardly any application on small drones. The goal of this research was to investigate whether state-of-the-art radar technology can add value to remote sensing with small drones.
For this purpose, modern radar sensors from the automotive sector were selected. Both quadcopters and a fixed-wing drone were used as platforms, and MATLAB was used for the analysis, computation, and evaluation of the data. The first approach relied on a fixed-wing drone, which offers open access to its control system, so that special flight-control requirements can be taken into account. However, a fixed-wing drone cannot perform slow or even static aerial recordings, which would be needed to gain experience with the radar data. For this reason, a radar measurement system was subsequently designed that can be operated independently of the drone. Together with a quadcopter, static radar measurements could thus be carried out to confirm the usability of the radar data in remote sensing. In this form, however, the measurement system could only be used for 2-dimensional applications. The subsequent research therefore investigated whether a radar sensor that measures only in two dimensions can produce a 3-dimensional recording. A hut was chosen as the test object to be reconstructed from the radar data. To this end, an eleven-step data-processing pipeline was designed, with which the hut could be reconstructed to an accuracy of 0.6 meters. The final part of the research examined whether the accuracy of the measurement system could be increased further in order to serve additional use cases. A new radar sensor with higher accuracy was employed, and the work focused on removing the dependence of the radar data on the imprecise attitude sensor: the flight attitude was computed from the radar data themselves, which determines it more accurately than the attitude sensor alone. Only in this way can the higher accuracy of the new radar sensor actually be exploited.
With the results of this research and the radar sensors presented, remote sensing with small drones will in future be able to draw on radar sensors alongside the classical sensors. The measurement system and the findings of this work are already being used to investigate first specific applications in research projects, and use cases outside remote sensing have also been identified. Ongoing development in autonomous driving will drive further performance gains in radar sensors, so that even better radar sensors will become available to remote sensing in the future.
This thesis analyses the hedging behaviour of airlines from 2005 to 2019 using an unbalanced panel dataset of 78 airlines from 39 countries. The analysis focuses on financial and operational hedging as well as their influence on CO2 emissions and on the development of CO2 emissions over time. For the analysis, Probit models with random effects and OLS models with fixed effects were used.
The results indicate a negative relationship between leverage and financial fuel hedging, and a non-linear convex relationship for highly leveraged airlines, which is contrary to the theory of financial distress.
In addition, the study provides evidence that airlines using other types of derivatives, such as interest rate derivatives, engage in more fuel hedging.
In terms of operational hedging, the analysis suggests that operating a diversified fleet is a complement to, rather than a substitute for, financial hedging. With regard to alliance membership, the results do not show that alliance membership is a substitute for financial hedging, as members of alliances are more likely to engage in hedging transactions and to a greater extent.
The analysis shows that relative CO2 emissions fell over the period under review, although this does not apply to the absolute amount. No general statement can be made about the influence of financial and operational hedging on CO2 emissions, as the results are mixed.
Circularity and circular business models in the wood industry: an empirical investigation
(2025)
The ecological state of the Earth is critical as a result of pollution, waste generation, and CO₂-driven climate change. The building and construction sector contributes substantially to global greenhouse gas emissions, accounting for around 40 %. Wood is considered a climate-friendly alternative to concrete and steel, but it too must be used sustainably. With reuse, the circular economy offers a forward-looking concept: roughly 45 % of the wood arising from building demolition is potentially usable as a raw material, opening up alternative raw-material sources and reducing waste.
Despite this potential, the circularity rate of the global economy currently stands at only 7.2 %. Against this background, the dissertation investigates which competitive strategies and which organizational capabilities foster the development of circular business models. The focus is on the wood industry of the DACH region, which is historically shaped by sustainable forestry but has so far followed predominantly linear structures.
The work combines theoretical grounding, a four-year literature review, expert interviews, and, at its core, a quantitative company survey (n = 200). From this, an activity-oriented scale for assessing the circularity of a business model was developed. Three perspectives were analyzed: capabilities, strategies, and stakeholders.
With respect to the capability perspective, dynamic capabilities were found to have a positive effect on the implementation of circularity. Within the strategy perspective, innovation leadership showed positive effects on the implementation of the circular economy; moreover, both innovation leadership and quality leadership exert a positive indirect effect on the development of circular business models via dynamic capabilities. Within the stakeholder perspective, stakeholder pressure was found to act as a catalyst in combination with a green corporate image: the influence of interest groups leads companies to translate a green image into a substantive implementation phase. Furthermore, stakeholder pressure emerged as a central driver of change: while the direct effects of dynamic capabilities decline under pressure, their indirect effects on achieving circularity increase. Finally, recommendations for companies as well as scientific implications and future research opportunities are derived.
Case-Based Reasoning (CBR) is a symbolic Artificial Intelligence (AI) approach that has been successfully applied across various domains, including medical diagnosis, product configuration, and customer support, to solve problems based on experiential knowledge and analogy. A key aspect of CBR is its problem-solving procedure, where new solutions are created by referencing similar experiences, which makes CBR explainable and effective even with small amounts of data. However, one of the most significant challenges in CBR lies in defining and computing meaningful similarities between new and past problems, which heavily relies on domain-specific knowledge. This knowledge, typically only available through human experts, must be manually acquired, leading to what is commonly known as the knowledge-acquisition bottleneck.
One way to mitigate the knowledge-acquisition bottleneck is through a hybrid approach that combines the symbolic reasoning strengths of CBR with the learning capabilities of Deep Learning (DL), a sub-symbolic AI method. DL, which utilizes deep neural networks, has gained immense popularity due to its ability to automatically learn from raw data to solve complex AI problems such as object detection, question answering, and machine translation. While DL minimizes manual knowledge acquisition by automatically training models from data, it comes with its own limitations, such as requiring large datasets, and being difficult to explain, often functioning as a "black box". By bringing together the symbolic nature of CBR and the data-driven learning abilities of DL, a neuro-symbolic, hybrid AI approach can potentially overcome the limitations of both methods, resulting in systems that are both explainable and capable of learning from data.
The focus of this thesis is on integrating DL into the core task of similarity assessment within CBR, specifically in the domain of process management. Processes are fundamental to numerous industries and sectors, with process management techniques, particularly Business Process Management (BPM), being widely applied to optimize organizational workflows. Process-Oriented Case-Based Reasoning (POCBR) extends traditional CBR to handle procedural data, enabling applications such as adaptive manufacturing, where past processes are analyzed to find alternative solutions when problems arise. However, applying CBR to process management introduces additional complexity, as procedural cases are typically represented as semantically annotated graphs, increasing the knowledge-acquisition effort for both case modeling and similarity assessment.
The key contributions of this thesis are as follows: It presents a method for preparing procedural cases, represented as semantic graphs, to be used as input for neural networks. Handling such complex, structured data represents a significant challenge, particularly given the scarcity of available process data in most organizations. To overcome the issue of data scarcity, the thesis proposes data augmentation techniques to artificially expand the process datasets, enabling more effective training of DL models. Moreover, it explores several deep learning architectures and training setups for learning similarity measures between procedural cases in POCBR applications. This includes the use of experience-based Hyperparameter Optimization (HPO) methods to fine-tune the deep learning models.
Additionally, the thesis addresses the computational challenges posed by graph-based similarity assessments in CBR. The traditional method of determining similarity through subgraph isomorphism checks, which compare nodes and edges across graphs, is computationally expensive. To alleviate this issue, the hybrid approach seeks to use DL models to approximate these similarity calculations more efficiently, thus reducing the computational complexity involved in graph matching.
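The cost contrast can be made concrete with a toy sketch (invented graphs, not the thesis' process models): exact similarity assessment requires a subgraph-isomorphism test, which is NP-hard in general, whereas an embedding-based approximation only compares fixed-size vectors. Here a simple node-label histogram stands in for a learned graph embedding.

```python
import numpy as np
import networkx as nx
from networkx.algorithms.isomorphism import GraphMatcher

def embed(graph, labels=("start", "task", "end")):
    """Cheap stand-in for a neural graph embedding: normalized label counts."""
    counts = np.array([sum(1 for _, d in graph.nodes(data=True)
                           if d["label"] == lab) for lab in labels], float)
    return counts / np.linalg.norm(counts)

g1 = nx.path_graph(4)  # a stored "case" process graph
nx.set_node_attributes(g1, {0: "start", 1: "task", 2: "end", 3: "task"}, "label")
g2 = nx.path_graph(3)  # a smaller "query" graph
nx.set_node_attributes(g2, {0: "start", 1: "task", 2: "end"}, "label")

# Exact route: does g1 contain an induced subgraph isomorphic to g2?
exact = GraphMatcher(
    g1, g2, node_match=lambda a, b: a["label"] == b["label"]
).subgraph_is_isomorphic()

# Approximate route: one cosine similarity between precomputed embeddings.
approx = float(embed(g1) @ embed(g2))
```

Once embeddings are precomputed for all cases, retrieval reduces to vector comparisons, which is the efficiency gain the hybrid approach targets.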
The experimental evaluations of the corresponding contributions provide consistent results that indicate the benefits of using DL-based similarity measures and case retrieval methods in POCBR applications. The comparison with existing methods, e.g., those based on subgraph isomorphism, shows several advantages but also some disadvantages of the compared methods. In summary, the methods and contributions outlined in this work enable more efficient and robust hybrid applications of CBR and DL in process management.
When natural phenomena and data-based relations are driven by dynamics which are not purely local, they cannot be described satisfactorily by partial differential equations. As a consequence, mathematical models governed by nonlocal operators are of interest. This thesis is concerned with nonlocal operators of the form
$\mathcal{L}u(x) = \mathrm{p.v.} \int_{\mathbb{R}^d} \bigl(u(x)-u(y)\bigr) \, K(x,\mathrm{d}y), \quad x \in \mathbb{R}^d$,
which are determined through a family of Borel measures $K=(K(x, \cdot))_{x \in \mathbb{R}^d}$ on $\mathbb{R}^d$ and which act on the vector space of Borel measurable functions $u: \mathbb{R}^d \rightarrow \mathbb{R}$. For a large class of families $K$, namely those where $K$ is a symmetric transition kernel satisfying a specific non-degeneracy condition, a variational theory for nonlocal equations of the type $\mathcal{L}u=f$ is established which builds upon tools from both measure theory and classical analysis. Measure theory provides a nonlocal integration by parts formula that makes it possible to set up a reasonable variational formulation of the above equation depending on the particular boundary condition (Dirichlet, Robin, Neumann) considered, while Hilbert space theory and fixed-point approaches are utilized to develop sufficient conditions for the existence of variational solutions. This theory is then applied to two specific realizations of $\mathcal{L}$ of interest before a weak maximum principle is established, which is finally used to study overlapping domain decomposition methods for the nonlocal homogeneous Dirichlet problem.
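For symmetric kernels, a nonlocal integration by parts formula of this kind typically leads to a bilinear energy form of the following standard shape (a sketch consistent with the operator defined above, not a verbatim excerpt of the thesis):

```latex
% Symmetric bilinear (energy) form associated with \mathcal{L}:
\mathcal{E}(u,v) = \frac{1}{2} \int_{\mathbb{R}^d} \int_{\mathbb{R}^d}
  \bigl(u(x)-u(y)\bigr)\bigl(v(x)-v(y)\bigr)\, K(x,\mathrm{d}y)\,\mathrm{d}x .
% A variational solution of \mathcal{L}u = f then satisfies
% \mathcal{E}(u,v) = \int f v \,\mathrm{d}x for all admissible test functions v,
% where the choice of admissible space encodes the boundary condition.
```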
Bilevel problems are optimization problems for which parts of the variables are constrained to be an optimal solution to another, nested optimization problem. This structure renders bilevel problems particularly well-suited for modeling hierarchical decision-making processes. They are widely applicable in areas such as energy markets, transportation systems, security planning, and pricing. However, the hierarchical nature of these problems also makes them inherently challenging to solve, both in theory and in practice.
In this thesis, we study different nonlinear problem settings for the nested optimization problem. First, we focus on nonlinear but convex bilevel problems with purely integer variables. We propose a solution algorithm that uses a branch-and-cut framework with tailored cutting planes. We prove correctness and finite termination of the method under suitable assumptions and put it into the context of the existing literature. Moreover, we provide an extensive numerical study to showcase the applicability of our method, and we compare it to the state-of-the-art approach for a less general setting on suitable instances from the literature. Furthermore, we discuss challenges that arise when we try to generalize our approach to the mixed-integer setting.
Next, we study mixed-integer bilevel problems for which the nested problem has a nonconvex quadratic objective function, linear constraints, and continuous variables. We state and prove a complexity-theoretical hardness result for this problem class and develop a lower and upper bounding scheme to solve these problems. We prove correctness and finite termination of the proposed method under suitable assumptions and test its applicability in a numerical study.
Finally, we consider bilevel problems with continuous variables, where the nested problem has a convex-quadratic objective function and linear constraints. We reformulate them as single-level optimization problems using necessary and sufficient optimality conditions for the nested problem. Then, we explore the family of so-called P-split reformulations for this single-level problem and test their applicability in a preliminary numerical study.
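For the last setting, the single-level reformulation can be sketched in generic notation (assumed here for illustration, not quoted from the thesis): if the nested problem is $\min_y \tfrac12 y^\top Q y + c^\top y$ subject to $Ay \le b$ with $Q$ positive semidefinite, its KKT conditions are both necessary and sufficient, so the requirement that $y$ solves the nested problem can be replaced by

```latex
% KKT system of the convex-quadratic nested problem
% (Q, c, A, b may depend on the upper-level variables):
Qy + c + A^{\top}\lambda = 0, \qquad
Ay \le b, \qquad
\lambda \ge 0, \qquad
\lambda^{\top}(b - Ay) = 0 .
```

The complementarity condition in the last line is the source of nonconvexity that subsequent reformulations of the single-level problem must handle.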
Spatial microsimulation is an important tool for integrating geographical information into the evaluation of public policies and the analysis of social phenomena in urban regions. These models simulate the behavior of and interaction between units of the region, such as individuals, households, or firms, under specific conditions that may or may not involve projections over time. This requires a representative base data set for the respective units.
In this thesis, we focus on the geo-referencing step of the population in the construction of this data set, where we define the location of the individuals so that the allocation obtained is representative in relation to the population of the region. To do this, we consider the assignment of households to dwellings with specific coordinates by solving a maximum weight matching problem where side constraints are included so that the allocation obtained satisfies statistical structures intrinsic to the considered region.
The model represents each feasible assignment of a household to a dwelling as a binary variable, which results in billions of variables for medium-sized municipalities such as the city of Trier, Germany. Standard solvers for mixed-integer linear optimization are therefore unable to solve it within reasonable time and memory limits. Hence, we develop two approaches capable of producing high-quality allocations with a reasonable amount of computational resources: one based on specific decomposition algorithms, and the other characterized by the application of an approximation algorithm within a Lagrangian relaxation of the side constraints.
We theoretically explore the allocations obtained by both approaches and perform an extensive computational study using synthetic data sets and real-world data sets associated with the city of Trier. The results show that the developed methods are able to obtain near-optimal solutions using significantly less memory and time than the solver Gurobi, which enables them to tackle significantly larger instances, with approximately 100 000 households and dwellings. Furthermore, the allocations obtained for the real-world data sets correspond to a realistic population distribution, which strengthens the practical applicability of our methods.
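The Lagrangian-relaxation idea behind the second approach can be illustrated on a hand-made toy instance (the data below are invented, and a plain subgradient scheme stands in for the thesis' approximation algorithm): the side constraint is moved into the objective with a multiplier, the remaining pure assignment problem is solved exactly, and the multiplier is updated from the constraint violation.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Toy household-to-dwelling instance: maximize total match weight subject to
# one side constraint sum_ij G[i,j] * x[i,j] <= budget, handled by
# Lagrangian relaxation with a subgradient update on the multiplier.
W = np.array([[10.0, 4.0, 0.0],
              [ 4.0, 8.0, 0.0],
              [ 0.0, 0.0, 6.0]])   # match weights
G = np.array([[1, 0, 0],
              [0, 1, 0],
              [0, 0, 0]])          # side-constraint coefficients
budget = 1                         # at most one "constrained" pair

lam, step, best = 0.0, 0.5, None
for _ in range(50):
    # Relaxed problem: a plain assignment problem with penalized weights.
    rows, cols = linear_sum_assignment(W - lam * G, maximize=True)
    used = int(G[rows, cols].sum())
    if used <= budget:             # feasible for the original problem:
        value = float(W[rows, cols].sum())   # keep the best such candidate
        if best is None or value > best[0]:
            best = (value, rows, cols)
    lam = max(0.0, lam + step * (used - budget))  # subgradient step

value, rows, cols = best
```

On this instance the scheme recovers the optimal feasible assignment; in general, Lagrangian relaxation yields bounds and feasible candidates rather than a guaranteed optimum.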
United in diversity? Constructions of European identity in German federal discourse since 1990
(2025)
From a discourse-linguistic perspective, this thesis examines the German federal discourse on European integration since 1990, understanding it as an arena in which constructions of European identity are negotiated. Its starting point is the assumption that the institutional deepening and geographical enlargement of the EU cannot be understood solely as legally codified integration steps but always carry identity-political dimensions as well. The study aims to make visible the linguistic constitution of the EU as a reference system of identity politics and thereby to contribute a discourse-linguistic complement to interdisciplinary integration research. On the basis of a diachronic corpus covering central stages of integration policy and phases of crisis, a mixed-methods approach is developed that combines corpus-driven procedures with the hermeneutic annotation of discourse-linguistic categories. The analysis considers not only lexical-semantic representations of Europe, but above all fundamental discursive figures such as unity, diversity, self and other, and their connection to political attributions of meaning. The results show the extent to which a stable identity-political point of reference to the EU has emerged in German discourse, how normative guiding images and functional rationalities overlap, and how European integration is negotiated linguistically between symbolic charging and strategic instrumentalization.
Extracellular enzymes in microbial communities play a central role in nutrient cycling and the degradation of (pollutant) substances in various natural and anthropogenic systems. Bound in aquatic biofilms and sludge aggregates, or even unbound at their interfaces, they are of great importance for both the environment and human health. In wastewater treatment plants and inland waters in particular, hydrolytic activities have a far-reaching influence on the efficiency of nutrient removal and self-purification, thus contributing significantly to overall water quality.
The main goal of this dissertation project was to investigate the factors that influence enzymatic activity and the health of microbial communities in activated sludge and river systems, particularly in relation to anthropogenic influences and natural environmental conditions. The aim was to contribute to a better understanding of the sensitivity of our freshwater ecosystems and to support the long-term preservation of water quality and ecological stability. The development and optimization of appropriate methods, as well as their testing and applicability, were the focal points.
For this purpose, a fluorometric microplate assay was developed and adapted to determine both extracellular enzyme activities (EEAs) in activated sludge samples and in intact biofilms. Its suitability for field studies was subsequently tested. Inhibition and activity of selected hydrolases under different conditions were investigated to better understand the mechanisms and potential environmental risks posed by anthropogenic influences and seasonal fluctuations of hydrochemical and climatic parameters.
The first phase of the doctoral thesis involved studies on the inhibition of alkaline phosphatase in activated sludge by oxyanions. Using the fluorometric microplate assay, the inhibitory effect was sensitively detected over a pH range of 7.0 to 8.5. IC50- and IC20-concentrations were calculated from modeled dose-response functions. It was found that vanadate and tungstate caused strong inhibitory effects, while molybdate moderately inhibited the enzyme. An increasing pH led to a reduction in the inhibitory effect of tungstate and molybdate. The inhibition effects of vanadate were not significantly affected by the pH. In municipal wastewater, the concentrations of such metal ions are usually low, but industrial wastewater may have pollutant loads that can significantly impact the removal of phosphorus-containing compounds, and thus the efficiency of treatment plants.
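The dose-response modeling described above can be sketched as follows, using synthetic data rather than the thesis' measurements: a two-parameter logistic curve is fitted to relative enzyme activity, and IC50 and IC20 are read off the fitted function.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative sketch with synthetic data: fit a logistic dose-response
# function to relative enzyme activity and derive IC50 and IC20.

def dose_response(conc, ic50, hill):
    """Relative activity in %, assuming 100 % activity without inhibitor."""
    return 100.0 / (1.0 + (conc / ic50) ** hill)

conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])  # inhibitor conc.
activity = dose_response(conc, ic50=0.5, hill=1.2)        # synthetic "data"

params, _ = curve_fit(dose_response, conc, activity,
                      p0=[1.0, 1.0], bounds=([1e-3, 0.1], [100.0, 10.0]))
ic50_fit, hill_fit = params

# IC20 (20 % inhibition, i.e. 80 % remaining activity) from the fitted curve:
# 80 = 100 / (1 + (c/IC50)^h)  =>  c = IC50 * (100/80 - 1)^(1/h)
ic20 = ic50_fit * (100.0 / 80.0 - 1.0) ** (1.0 / hill_fit)
```

With real assay data, replicate wells and measurement noise would of course make the fitted parameters come with confidence intervals rather than exact values.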
In the second phase, an attempt was made to further adapt the developed methodology to investigate EEA and kinetics in intact freshwater biofilms. Four different types of bead materials (lava, glass, sintered quartz, and ceramics) fitting into a 96-well microplate were tested as carriers for biofilms on both the laboratory and field scale. The analysis included a total of seven hydrolases as representatives of key nutrient cycles such as phosphorus, carbon, and nitrogen: phosphatases, glucosidases, peptidases (two different types), and sulfatase. Experiments with increasing substrate concentrations led to classical kinetic profiles according to the Michaelis-Menten mechanism. This allowed for the prediction of the biofilm enzymes’ response to different substrate concentrations. Parameters such as Vmax and Km could be derived from the modeled curves.
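The kinetic analysis above can be sketched in a few lines with synthetic values (not the thesis data): the Michaelis-Menten equation $v = V_{max} S / (K_m + S)$ is fitted to rates measured at increasing substrate concentrations, yielding Vmax and Km.

```python
import numpy as np
from scipy.optimize import curve_fit

# Minimal sketch: estimate Vmax and Km from synthetic rate measurements.

def michaelis_menten(s, vmax, km):
    """Reaction rate as a function of substrate concentration s."""
    return vmax * s / (km + s)

substrate = np.array([5.0, 10.0, 20.0, 50.0, 100.0, 200.0, 400.0])  # conc.
rate = michaelis_menten(substrate, vmax=12.0, km=40.0)  # synthetic rates

(vmax_fit, km_fit), _ = curve_fit(michaelis_menten, substrate, rate,
                                  p0=[1.0, 1.0], bounds=(0.0, np.inf))
# At S = Km the model predicts exactly half the maximum rate:
half_rate = michaelis_menten(km_fit, vmax_fit, km_fit)
```

Nonlinear least squares is generally preferred over linearizations such as Lineweaver-Burk plots, which distort the error structure of the data.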
Ceramic beads are particularly suitable for long-term studies due to their high stability, while sintered quartz beads should be preferred for use in stagnant media (material loss under turbulent conditions). Lava and glass beads, on the other hand, proved suboptimal for uniform biofilm development due to their surface properties. The potential use of this fast and sensitive test for ecotoxicological or even human-toxicological studies was demonstrated by the effects of caffeine on the activity of PDE. The outcome of this part of the research is a powerful tool for assessing environmental pollution and monitoring water quality.
The high application potential was clearly highlighted in the final phase of the project. The goal here was to deepen the understanding of interactions between seasonal factors, anthropogenic influences, and biofilm processes in rivers by investigating EEAs and biofilm parameters such as biomass and relating them to hydrochemical and climatic factors. Ceramic beads were exposed both upstream and downstream of a wastewater treatment plant discharge and sampled over a period of seven months. EEAs and biomass varied depending on season and location, with higher microbial activity observed upstream in winter. Winter conditions led to the dilution of most nutrients as well as an increase in dissolved oxygen. Nutrient concentrations analyzed downstream were significantly higher in summer. Accumulation of nutrients or pollutants during the summer months cannot be excluded, which may have led to a general reduction in enzyme activities.
Potential causes could be inhibitory effects on the enzymes or reduced enzyme activity due to a sufficiently high nutrient supply. In general, the upstream sampling site showed more pronounced seasonal dynamics, with a significant proportion of the variance in the biological parameters (activity and biomass) attributable to seasonal factors. A secondary component, likely reflecting the impact of the treatment plant discharge, explained another portion of the data variance. Regardless of the season, high correlations between the biological parameters were observed upstream, while downstream the data were more decorrelated. This could be because the biofilms, under chronic stress, respond less dynamically to seasonal fluctuations.
This dissertation illustrates that, in addition to anthropogenic stress factors, seasonal fluctuations of hydrochemical and climatic parameters should also be considered in "stress downstream the pipe" studies. The selected methods are recommended for explaining and accounting for the data variance, as they highlight the complex interplay between microbial enzymatic activity, environmental factors, and pollutants in the activated sludge of wastewater treatment plants as well as in aquatic systems. The novel bead assay could pave the way for the future standardization of effect-oriented studies on intact aquatic biofilms.
Perennial crops eliminate soil disturbance and reduce the amount of synthetic chemicals that are applied to the soil, improving soil biodiversity and food web structure. Additionally, perennial cropping is characterised by all year-round surface coverage which benefits soil biota in terms of habitat and food sources. Perennial intermediate wheatgrass (Thinopyrum intermedium, IWG) was domesticated and commercialised by The Land Institute in Kansas as Kernza® and serves as an example for these nature-based solutions. It develops an extensive root system that has a higher nutrient retention, possibly reducing nutrient runoff. It thereby follows a more resource-conservative strategy with improved belowground-oriented resource allocation in its root system. This may reduce the need for excessive fertiliser as the crop has a higher nitrogen efficiency, among other things.
IWG promoted the earthworm community and its diversity, more specifically, the occurrence of epigeic species (litter inhabitants), since those species benefit from the increased soil coverage and elimination of disturbances in the soil. As IWG creates a dense and extensive root system, as shown by the increased occurrence of root-feeding nematodes, endogeic species (horizontal burrowers) are supported through the provision of a reliable food source. IWG was characterised as a mostly undisturbed system with a highly structured food web through nematode analysis, as expressed through the promotion of structure indicators, for example, that are sensitive to disturbances in the soil and are therefore supported under no-till management. The root microbiome is continuously being shaped by the host as the crop regrows from the roots each vegetation period. This creates a symbiotic relationship and a beneficial feedback loop for the crop. As a result, the root-endophytic microbiome under IWG had a higher network complexity, connectivity and stability compared to annual wheat. The regrowth from the roots for IWG requires increased nutrient and energy storage, which was indicated by increased starch values. Correspondingly, the longer residence time of the roots in the soil resulted in higher lignin values. Furthermore, the decomposition pathway was dominated by fungivorous nematodes, which may correspond to stimulated nutrient cycling and a heterogeneous resource environment, as seen for low-input systems.
Overall, perennial wheat cultivation improved soil biodiversity after only 3-6 years of establishment. As these benefits were present in all three countries, the varying soil and climate conditions do not appear to interfere with the positive effect of perennial wheat on the soil ecosystem, demonstrating wide transferability and adaptability of the crop to other sites. The enhanced complexity and connectivity of the food web compared with annual wheat may indicate resistance against abiotic stress, suggesting IWG cultivation as a viable option for sustainable and resilient agriculture. The improved nutrient cycling and resource-efficient cultivation strategy of IWG could enable cultivation on marginal land where annual cropping is not possible because the soils are susceptible to erosion and nutrient runoff. This opens up new possibilities for agricultural cultivation on previously unused land, thus contributing to future food security.
Modelling of o-PO4 inputs into Saarland surface water bodies under dry-weather conditions
(2025)
The availability of ortho-phosphate (o-PO₄) contributes substantially to the eutrophication of rivers and streams and thus jeopardises the achievement of "good ecological status" under the EU Water Framework Directive. Since municipal wastewater treatment plants are central input sources, reducing o-PO₄ at this point is gaining importance. In addition to chemical phosphorus elimination, the fourth treatment stage in particular, although primarily designed to remove micropollutants, offers a synergy effect with potential phosphorus removal rates of up to 85%.
To assess the influence of such a treatment stage, a model was developed for selected Saarland surface water bodies (OWK) that represents the dry-weather case as a eutrophication-relevant scenario. A central component is a newly developed retention approach that accounts for biochemical and physical processes such as adsorption, sedimentation and biological assimilation. Based on the difference between the emission-side load balance and the measured o-PO₄ load, reduction rates per flow metre were derived for each water body, and finally an equation was formulated to estimate retention as a function of catchment size. Validation shows sufficient model accuracy, although negative load differences in some water bodies point to additional inputs that cannot be clearly quantified, for instance from agriculture or sewer losses.
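The per-flow-metre retention and the load-balance back-calculation described above can be sketched as a simple first-order decay model; the exponential form, the parameter names and the numbers below are illustrative assumptions, not the thesis's actual equation:

```python
import math

def retained_load(load_in, flow_length_m, k_per_m):
    """Load (e.g., kg/d o-PO4) remaining after flow_length_m metres,
    assuming first-order retention along the reach."""
    return load_in * math.exp(-k_per_m * flow_length_m)

def rate_per_metre(load_emitted, load_measured, flow_length_m):
    """Back-calculate the per-metre retention rate k from the difference
    between the emission-side load balance and the measured load."""
    return math.log(load_emitted / load_measured) / flow_length_m

# Hypothetical water body: 120 kg/d emitted, 90 kg/d measured after 8 km
k = rate_per_metre(load_emitted=120.0, load_measured=90.0, flow_length_m=8000)
```

A retention-vs-catchment-size relation, as formulated in the thesis, would then replace the constant k with a function of catchment area.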
The scenario analysis shows that a fourth treatment stage does in principle contribute to reducing o-PO₄ at the monitoring points. However, the applicable guide value is only undercut if all treatment plants in a water body are upgraded, and even then only in some cases. The fourth treatment stage alone is therefore not a sufficient alternative to the measures of Saarland's third river basin management plan, but it can serve as a complementary strategy for reducing phosphorus inputs.
Price indices play a vital role in economic measurement as they reflect price levels
and measure price fluctuations. Price level measures are used with macroeconomic
indicators to express them in real terms. These measures are also used to index wages,
rents, and pensions. Furthermore, they are used as a reference for monetary policy
conducted by central banks. Therefore, the provision of accurate price indices is one
of the most important goals of National Statistical Institutes (NSIs), and numerous
studies have been devoted to this goal.
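As a concrete illustration of how such a price level measure is computed, here is the textbook Laspeyres index, which weights price changes by base-period quantities; the example numbers are hypothetical:

```python
def laspeyres_index(p0, pt, q0):
    """Laspeyres price index (base = 100): current prices pt valued at
    base-period quantities q0, relative to base prices p0."""
    numerator = sum(pt[i] * q0[i] for i in p0)
    denominator = sum(p0[i] * q0[i] for i in p0)
    return 100.0 * numerator / denominator

p0 = {"bread": 2.00, "milk": 1.00}   # base-period prices
pt = {"bread": 2.20, "milk": 1.10}   # current prices (+10 % each)
q0 = {"bread": 100, "milk": 200}     # base-period quantities
index = laspeyres_index(p0, pt, q0)  # ≈ 110.0
```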
This cumulative dissertation also contributes to this goal. It contains four chapters,
each of which represents a separate research study. The first two studies are devoted
to the treatment of seasonal products using different price index methods; the first
study is co-authored with Ken van Loon. The third study is dedicated to finding the
most accurate method for predicting the prices of missing products. The fourth study
focuses on the treatment of products using different price index methods when the
products' quality characteristics are available.
Measuring the economic activity of a country requires high-quality data on businesses. In the case of Germany, such data are required not only at the national level, but also at the federal-state level and for different economic sectors. Important sources of high-quality business data are the business register and, among others, 14 business surveys conducted by the Federal Statistical Office of Germany. However, the quality requirements of the Federal Statistical Office conflict with the interests of the businesses themselves: for them, answering a survey questionnaire is an additional cost factor, also known as response burden. A high response burden should be avoided, since it can negatively affect the quality of the businesses' responses. Sample coordination can therefore be used as a method to control the distribution of response burden while securing high-quality data.
When applying existing business survey coordination systems, developed by different statistical institutes, the legal and administrative standards of German official statistics have to be taken into account. These standards concern the different sampling fractions, rotation fractions, periodicities, and stratifications of the aforementioned 14 business surveys. The aim of this doctoral thesis is therefore to assess the applicability of existing business survey coordination systems in the context of German official statistics and, if necessary, to modify them accordingly. These modifications include the introduction of individual burden indicators, which aim to take the individual perception of response burden into account.
For this purpose, several synthetic data sets were created to test the modified versions of the different business survey coordination systems in Monte Carlo simulation studies. These data sets comprise a large panel data set reflecting the business landscape of Rhineland-Palatinate and three smaller synthetic data sets, the latter created with the R package BuSuCo, which was developed within the scope of this thesis. The simulation studies are evaluated using different measures of estimation quality as well as of the concentration and distribution of response burden.
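Coordination systems of this kind are commonly built on permanent random numbers (PRNs); the following minimal sketch of PRN-based coordinated sampling is an illustrative assumption about the general mechanism, not the thesis's specific modifications:

```python
import random

def assign_prns(business_ids, seed=2024):
    """Assign each business a permanent random number (PRN) in [0, 1)."""
    rng = random.Random(seed)
    return {b: rng.random() for b in business_ids}

def coordinated_sample(prns, fraction, start=0.0):
    """Select businesses whose PRN lies in [start, start + fraction) mod 1.
    Overlapping intervals give positively coordinated samples; shifting
    `start` between surveys spreads the response burden."""
    end = (start + fraction) % 1.0
    if start < end:
        return {b for b, u in prns.items() if start <= u < end}
    return {b for b, u in prns.items() if u >= start or u < end}

prns = assign_prns(range(1000))
survey_a = coordinated_sample(prns, 0.10, start=0.00)
survey_b = coordinated_sample(prns, 0.10, start=0.10)  # disjoint: burden rotated
```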
The application of machine learning and deep learning methods to hydrological modelling has advanced significantly in recent years, offering alternatives to traditional conceptual and physically based approaches. Among the numerous algorithms, long short-term memory (LSTM) networks have proven particularly useful for streamflow modelling. This thesis provides a collection of publications that investigate the capabilities, limitations and interpretability of LSTMs for streamflow modelling and climate change impact assessment within the lowland Ems catchment in Northwest Germany.
In a comparative performance evaluation, the LSTM and its predecessor, the recurrent neural network, demonstrate superior accuracy compared with the conceptual HBV model across various statistical performance metrics. However, a decline in performance was observed during low-flow conditions in certain sub-catchments. The evaluation of the flow duration curve revealed that the ML models capture the water balance more effectively, while HBV better represents streamflow dynamics.
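A flow duration curve of the kind used in this comparison can be computed by ranking flows against their exceedance probability; the Weibull plotting position used below is one common convention, assumed here for illustration:

```python
def flow_duration_curve(flows):
    """Return (exceedance probability, flow) pairs, highest flow first,
    using the Weibull plotting position i / (n + 1)."""
    ranked = sorted(flows, reverse=True)
    n = len(ranked)
    return [((i + 1) / (n + 1), q) for i, q in enumerate(ranked)]

fdc = flow_duration_curve([4.2, 1.3, 2.8, 0.9, 3.5])
low_flows = [q for p, q in fdc if p > 0.7]  # flows exceeded >70 % of the time
```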
To enhance the interpretability of LSTM, six explainable artificial intelligence techniques were applied. These methods consistently identified seasonal patterns in the temporal relevance of hydroclimatic input data. In combination with an observed correlation between the internal LSTM states and catchment-scale soil moisture dynamics, the findings suggest that LSTM models are capable of implicitly learning the relevant hydrological processes.
Next, the capability of LSTMs to model climate change impact scenarios, particularly those extending beyond historically observed climate conditions, is addressed. An ensemble of climate change projections provides the hydroclimatic input for evaluating the performance of LSTMs and conceptual models. While all models reveal heterogeneous alterations in streamflow under future climate conditions, significant differences emerge depending on the model type. The results provide evidence that LSTMs, in combination with the temperature-based Haude formula for estimating potential evaporation, perform inadequately under altered climatic regimes, raising concerns about their applicability in long-term projections. The study also indicates a potential need to incorporate physical constraints into LSTM architectures to ensure model robustness and hydrological plausibility beyond the historical training range.
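The Haude formula mentioned above estimates potential evaporation from a single early-afternoon temperature and humidity reading. The sketch below follows common German practice, but the monthly coefficients and the Magnus constants are assumed typical values, not taken from the thesis:

```python
import math

# Monthly Haude coefficients f in mm/(hPa*day) for a grass reference;
# typical published values, assumed here for illustration.
HAUDE_F = {1: 0.22, 2: 0.22, 3: 0.22, 4: 0.29, 5: 0.29, 6: 0.28,
           7: 0.26, 8: 0.25, 9: 0.23, 10: 0.22, 11: 0.22, 12: 0.22}

def haude_pet(month, t14_c, rh14_pct):
    """Potential evaporation (mm/day) from 2 p.m. air temperature (deg C)
    and relative humidity (%), capped at the customary 7 mm/day."""
    # Magnus formula: saturation vapour pressure in hPa
    e_s = 6.112 * math.exp(17.62 * t14_c / (243.12 + t14_c))
    pet = HAUDE_F[month] * e_s * (1.0 - rh14_pct / 100.0)
    return min(pet, 7.0)

pet_july = haude_pet(7, 25.0, 50.0)  # roughly 4 mm/day
```

Because the formula depends only on temperature and humidity, it illustrates why such temperature-based estimates may extrapolate poorly under altered climatic regimes.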
Collectively, this thesis contributes important insights into the applicability and interpretability of LSTM models in streamflow modelling. Despite a physically realistic representation of soil moisture dynamics in the Ems catchment, no robust change signals for streamflow under climate change could be derived. These results underscore the potential of LSTM approaches for accurate streamflow simulation, but they also demand that LSTM results be questioned critically, particularly when the models are applied outside their training range.
Several research findings show that emotional states influence, or are related to, cognitive domains. Building on these findings, two studies were designed. Study 1 examined the relationship between the valences of dispositional emotional states and the global self-assessment of memory (metamemory) in student teachers (N = 218). Dispositional states were measured with the German Positive and Negative Affect Schedule (PANAS) (Krohne, Egloff, Kohlmann & Tausch, 1996) and the global self-assessment of memory with the German Squire Subjective Memory Questionnaire (SSMQ) (Wolf, 2017). It was hypothesised that positive valence, in contrast to negative valence, would be positively related to higher memory self-assessment. The results confirm the hypotheses. In Study 2, current valence was induced using the Open Affective Standardized Image Set (OASIS) (Kurdi, Lozano & Banaji, 2017) to examine changes in metamemory and actual memory performance in student teachers (N = 44). It was hypothesised that positive valence would affect metamemory and memory performance positively, negative valence negatively, and neutral valence not at all. Further associations between metamemory and memory performance, and between induced valence and memory performance, were also hypothesised. The measurement instruments from Study 1 remained the same; memory performance was operationalised using a nonsense-syllable test following Ebbinghaus (1885). The results do not confirm the hypotheses. The emotion induction was unsuccessful, so the results cannot be attributed to a change in valence. As in Study 1, a relationship between dispositional states and metamemory emerged.
Further exploratory results, particularly with regard to gender, were presented. The findings are relevant for the professionalisation of student teachers.
This dissertation examines how individuals unlock their personal power by investigating individual differences in self-regulation, in particular, how situational conditions interact with the personality dispositions of action versus state orientation. Action-oriented individuals are well able to regulate their affective states and to bridge the intention–behavior gap, showing initiative, implementing demanding intentions, and resisting temptations. State-oriented individuals, by contrast, often struggle to regulate affect and experience difficulties enacting intentions, especially under demanding conditions, tending to hesitate and ruminate. While extensive research has highlighted the advantages of action orientation across various domains such as education and health, this thesis challenges the prevailing one-sided perspective that presents action orientation as inherently superior and frames state orientation negatively. Drawing on Personality Systems Interactions theory, the dissertation adopts a dynamic view that understands these dispositions as context-sensitive rather than fixed. The central assumption is that action and state orientation each require different kinds of situational conditions to fully unlock their potential. Across six empirical studies (overall N = 1,067) using a multimethod approach that combines experimental and survey-based research in diverse populations and contextual settings, this dissertation examines (1) action and state orientation as distinct dispositions, (2) their dynamic interaction with situational factors, and (3) ways to support each in mobilizing personal power. Overall, the findings show that each disposition offers unique advantages - they simply require different situational conditions for their potential to unfold.
The role of implicit motives in affective, cognitive and behavioral processes has been a focal point of psychological research for decades. Yet the majority of research in this field has concentrated on processes involving implicit motives in adulthood; the systematic investigation of developmental correlates of implicit motives remains largely uncharted. The studies cumulated in this thesis aim to add to the sparse research on implicit motives in childhood and adolescence. Specifically, they investigate the development of the implicit power motive in the transition from middle to late childhood as a function of parenting behavior (Chapter 4), the predictive value of the implicit achievement motive for objective swimming performance in children and adolescents (Chapter 5), and the role of motive congruence for successful goal realization in adolescent samples across two cultures (Chapter 6). Results of Study 1 (Chapter 4) indicate a negative longitudinal association of authoritarian parenting with the implicit power motive in children, moderated by children's perception of psychologically controlling parenting. Study 2 (Chapter 5) extends existing research on the assumed positive association between the implicit achievement motive and sports performance and demonstrates the moderating role of competitive anxiety on this association. Finally, Study 3 (Chapter 6) illustrates a moderating effect of implicit motives on the association of goal commitment and successful goal realization in German and Zambian adolescents; however, this effect was only observed in the domain of power motivation. Findings from all three studies are discussed with regard to the significance of implicit motives for psychological research.
This thesis presents four contributions in the domains of schema/ontology alignment and query processing. First, we present a novel alignment approach, denoted as FiLiPo (Finding Linkage Points), to align the schema of RDF knowledge bases with the response schema of RESTful Web APIs. FiLiPo only requires knowledge about a knowledge base (e.g., class names) but no prior knowledge about the
Web APIs’ data structure. It uses fifteen different string similarity metrics to find an alignment between the schema of a knowledge base and that of a Web API.
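The kind of string-similarity matching FiLiPo performs can be illustrated with a single stdlib metric (FiLiPo itself combines fifteen); the labels and keys below are hypothetical:

```python
from difflib import SequenceMatcher

def best_alignment(kb_labels, api_keys, threshold=0.6):
    """Map each knowledge-base label to its most similar API response key,
    keeping only matches above a similarity threshold."""
    mapping = {}
    for label in kb_labels:
        best_key, best_score = None, threshold
        for key in api_keys:
            score = SequenceMatcher(None, label.lower(), key.lower()).ratio()
            if score > best_score:
                best_key, best_score = key, score
        if best_key is not None:
            mapping[label] = best_key
    return mapping

# Hypothetical KB property names vs. hypothetical API response keys
mapping = best_alignment(["authorName", "title"], ["author_name", "paperTitle"])
```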
Next, a benchmark system named ETARA (Evaluation Toolkit for API and RDF Alignment) is introduced. It was created to simulate RESTful Web APIs and covers all important characteristics of Web APIs, i.e., latency, timeouts and rate limits, and furthermore provides configurable response structures (e.g., JSON or XML). Additionally, it was designed to support researchers during the development of alignment systems.
Afterward, the alignments determined by FiLiPo are used to create a hybrid and federated query processor named TunA (Tunable Query Optimizer for Web APIs and User Preferences), which allows SPARQL queries to combine knowledge bases and RESTful Web APIs and is tunable towards user preferences, i.e., coverage, reliability and execution time. The primary goal of TunA is to return a query result that satisfies the user's preferences in terms of data quality, even when unreliable data sources are used, by performing a majority vote over multiple sources.
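The majority vote over sources can be sketched as follows; this is a minimal illustration of the idea, not TunA's actual implementation:

```python
from collections import Counter

def majority_vote(source_values):
    """Resolve conflicting values for one query binding by majority vote
    over the answers returned by different (possibly unreliable) sources."""
    counts = Counter(v for v in source_values if v is not None)
    if not counts:
        return None
    return counts.most_common(1)[0][0]

year = majority_vote(["2019", "2019", "2020", None])  # → "2019"
```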
Lastly, we present a federated query processor, denoted as ORAQL (Overlap and Reliability Aware Query Processing Layer), which uses overlap information to reduce the number of sources selected from a federation. The goal is to reduce redundant data and hence improve query execution speed. To this end, ORAQL uses a profile feature that provides information about the overlap between all data sources of a federation. Furthermore, we extend the quality estimation of TunA to cover Triple Pattern Fragment interfaces, ensuring a user-provided reliability goal.