The development of our society has contributed to an increased occurrence of emerging substances (pesticides, pharmaceuticals, personal care products, etc.) in wastewater. Because of their potential hazard to ecosystems and humans, wastewater treatment plants (WWTPs) need to adapt to remove these compounds more effectively. Technology or policy development should, however, comply with sustainable development, e.g. as measured by Life Cycle Assessment (LCA) metrics. Nevertheless, the reliability and consistency of LCA results can sometimes be debatable. The main objective of this work was to explore how LCA can better support the implementation of innovative wastewater treatment options, in particular by including removal benefits. The method was applied to support solutions for pharmaceuticals elimination from wastewater with respect to (i) UV technology design, (ii) the choice of advanced technology and (iii) centralized versus decentralized treatment policy. The assessment approach followed by previous authors, based on net impact calculation, seemed very promising for considering both the environmental effects induced by treatment plant operation and the environmental benefits obtained from pollutant removal. It was therefore applied to compare UV configuration types; the LCA outcomes were consistent with degradation kinetics analysis. For the comparison of advanced technologies and policy scenarios, the common practice (net impacts based on the EDIP method) was compared to other assessments in order to better account for elimination benefits. First, the USEtox consensus model was applied for the avoided (eco)toxicity impacts, in combination with the more recent ReCiPe method for generated impacts. Then, an eco-efficiency indicator (EFI) was developed to weigh the treatment efforts (generated impacts based on the EDIP and ReCiPe methods) by the average removal efficiency, thereby circumventing (eco)toxicity uncertainty issues.
In total, the four types of comparative assessment showed the same trends: (i) ozonation and activated carbon perform better than UV irradiation, and (ii) no clear advantage distinguished the policy scenarios from one another. However, it cannot be concluded that advanced treatment of pharmaceuticals is unnecessary, because other criteria should be considered (risk assessment, bacterial resistance, etc.) and large uncertainties were embedded in the calculations. Indeed, a significant part of this work was dedicated to the discussion of uncertainty and of the limitations of the LCA outcomes. At the inventory level, it was difficult to model technology operation at the development stage. For impact assessment, the newly developed characterization factors (CFs) for pharmaceuticals' (eco)toxicity showed large uncertainties, mainly due to the scarcity and limited quality of toxicity test data. The use of information made available under the REACH framework to develop CFs for detergent ingredients was intended to address this issue, but the benefits were limited by the mismatch between the information available under REACH and that required by the USEtox method. The highlighted uncertainties were treated with sensitivity analyses to understand their effects on the LCA results. Finally, this research presents perspectives on the use of transparently generated data (technology inventories and (eco)toxicity factors) and on the further development of the EFI indicator. Emphasis is also placed on increasing the reliability of LCA outcomes, in particular through the implementation of advanced techniques for uncertainty management. To conclude, innovative technology and product development (e.g. based on a circular economy approach) needs the involvement of all types of actors and the support of sustainability metrics.
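The logic of an eco-efficiency indicator of the kind described above can be sketched in a few lines. This is a hypothetical illustration only: the function name, the aggregation (a simple arithmetic mean) and all numbers are assumptions made for demonstration, not the thesis's actual formula or data.

```python
# Hypothetical sketch of an eco-efficiency-style indicator: average
# pharmaceutical removal efficiency per unit of generated life-cycle
# impact. All values below are illustrative, not measured data.

def eco_efficiency(removal_efficiencies, generated_impact):
    """Average removal efficiency divided by the generated impact score."""
    avg_removal = sum(removal_efficiencies) / len(removal_efficiencies)
    return avg_removal / generated_impact

# Illustrative comparison of three advanced treatment options:
# (per-compound removal fractions, normalized impact score)
scenarios = {
    "ozonation":        ([0.90, 0.85, 0.95], 1.2),
    "activated carbon": ([0.88, 0.92, 0.90], 1.1),
    "UV irradiation":   ([0.60, 0.55, 0.70], 1.5),
}

for name, (removals, impact) in scenarios.items():
    print(f"{name}: indicator = {eco_efficiency(removals, impact):.2f}")
```

Under such a ratio, a technology scores higher either by removing more of the target compounds or by generating fewer impacts per functional unit, which mirrors the trade-off the EFI is meant to capture.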
Exposure to fine and ultrafine environmental particles is still a problem of concern in many industrialized parts of the world, and the intensified use of nanotechnology may further increase exposure to small particles. Air pollution has been recognized for many years as a critical problem in western countries, which led to rigorous regulation of air quality and the introduction of strict guidelines. However, the upper thresholds for particulates in ambient air recommended by the World Health Organization are often exceeded several times over in newly industrialized countries. Such high levels of air pollution have the potential to induce adverse effects on human health. The response triggered by air pollutants is not limited to local effects in the respiratory system but is often systemic, resulting in endothelial dysfunction or atherosclerotic disease. The link between air pollution and cardiovascular disease is now accepted by the scientific community, but the underlying mechanisms responsible for the pro-atherogenic potential still need to be unraveled in detail. Based on the results of in vivo and in vitro studies, the production of reactive oxygen species upon exposure to particles is the most important mechanism proposed to explain the observed adverse effects. However, the doses applied in many in vivo and in vitro studies are far beyond the range of what humans are exposed to, and there is a need for more realistic exposure studies. Complex in vitro coculture systems may be valuable tools to study particle-induced processes and to extrapolate the effects of particles on the lung. One of the objectives of this PhD thesis was the establishment and further improvement of a complex coculture system initially described by Alfaro-Moreno et al. [1].
The system is composed of an alveolar type-II cell line (A549), differentiated macrophage-like cells (THP-1), mast cells (HMC-1) and endothelial cells (EA.hy 926), seeded in a 3D orientation on a microporous membrane to mimic the cell response of the alveolar surface in vitro in conjunction with native aerosol exposure (Vitrocell™ chamber). The tetraculture system was carefully characterized to ensure its performance and the repeatability of results. The spatial distribution of the cells in the tetraculture was analyzed by confocal laser scanning microscopy (CLSM), showing a confluent layer of endothelial and epithelial cells on both sides of the Transwell™ membrane. Macrophage-like cells and mast cells were found on top of the epithelial cells. The latter cells formed colonies under submerged conditions, which disappeared at the air-liquid interface (ALI). The Vitrocell™ aerosol exposure system did not significantly influence viability. Using this system, cells were exposed to an aerosol of 50 nm SiO2-rhodamine nanoparticles (NPs) in PBS. The distribution of the NPs in the tetraculture after exposure was evaluated by CLSM. Fluorescence from internalized particles was detected in CD11b-positive THP-1 cells only. Furthermore, all cell lines were found to be able to respond to xenobiotic model compounds, such as benzo[a]pyrene (B[a]P) or 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD), with the upregulation of CYP1 mRNA. With this tetraculture system, the response of the endothelial part of the alveolar barrier was studied in vitro in a still realistic exposure scenario, representing the conditions of a polluted situation without direct exposure of the endothelial cells.
After exposure to diesel exhaust particulate matter (DEPM), the expression of different anti-oxidant and inflammatory target genes, such as NAD(P)H dehydrogenase quinone 1 (NQO1), superoxide dismutase 1 (SOD1) and heme oxygenase 1 (HMOX1), as well as the nuclear translocation of nuclear factor erythroid 2-related factor 2 (Nrf2), was evaluated. In addition, the potential of DEPM to induce the upregulation of CYP1A1 mRNA in the endothelium was analyzed. DEPM exposure did not lead to an upregulation of the anti-oxidant or inflammatory target genes, but it did lead to a clear nuclear translocation of Nrf2. The endothelial cells also responded to the DEPM treatment with the upregulation of CYP1A1 mRNA and the nuclear translocation of the aryl hydrocarbon receptor (AhR). Overall, DEPM triggered a response in the endothelial cells after indirect exposure of the tetraculture system to low doses of DEPM, underlining the sensitivity of ALI exposure systems. The use of the tetraculture together with the native aerosol exposure equipment may ultimately lead to a more realistic judgment of the hazard posed by new compounds and/or new nano-scaled materials. For the first time, it was possible to study the response of the endothelial cells of the alveolar barrier in vitro in a realistic exposure scenario, avoiding direct exposure of the endothelial cells to high amounts of particulates.
The main achievement of this thesis is an analysis of the accuracy of computations with Loader's algorithm for the binomial density. In later work, this analysis could be used to establish a theorem about the numerical accuracy of algorithms that compute rectangle probabilities for scan statistics of a multinomially distributed random variable. An example that illustrates the practical use of probabilities for scan statistics is the following, which arises in epidemiology: let n patients arrive at a clinic over d = 365 days, each patient arriving with probability 1/d on each of these d days, and all patients independently of each other. Knowing the probability that there exist 3 adjacent days in which, taken together, more than k patients arrive helps us decide, after observing data, whether there is a cluster that we would not expect to have occurred randomly and for which we therefore suspect there must be a reason. Formally, this epidemiological example can be described by a multinomial model. Since multinomially distributed random variables are examples of Markov increments - a fact already used implicitly by Corrado (2011) to compute the distribution function of the multinomial maximum - we can use a generalized version of Corrado's algorithm to compute the probability described in our example. To compute its result, the algorithm for rectangle probabilities for Markov increments always uses transition probabilities of the corresponding Markov chain. In the multinomial case, these transition probabilities are binomial probabilities. Therefore, we begin with an analysis of the accuracy of Loader's algorithm for the binomial density, which is used, for example, by the statistical software R. With the help of accuracy bounds for the binomial density, we would be able to derive accuracy bounds for the computation of rectangle probabilities for scan statistics of multinomially distributed random variables.
To assess how sharp the derived accuracy bounds are, they can be compared in examples to rigorous upper and lower bounds obtained by interval-arithmetic computations.
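The clinic example above can be made concrete with a small Monte Carlo sketch. Note that the thesis computes such rectangle probabilities exactly via the generalized Corrado algorithm; the simulation below only approximates the quantity in question, and all parameter values are illustrative.

```python
import random

def scan_exceedance_prob(n, d, window, k, trials=20000, seed=1):
    """Monte Carlo estimate of P(some `window` adjacent days receive
    more than k patients in total), for n patients arriving uniformly
    and independently over d days (a multinomial model)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        counts = [0] * d
        for _ in range(n):
            counts[rng.randrange(d)] += 1
        # sliding-window maximum over all `window` adjacent days
        s = sum(counts[:window])
        best = s
        for i in range(window, d):
            s += counts[i] - counts[i - window]
            best = max(best, s)
        if best > k:
            hits += 1
    return hits / trials

# e.g. 100 patients over 365 days: probability that some 3 adjacent
# days together see more than 5 patients (threshold k is illustrative)
p = scan_exceedance_prob(100, 365, 3, 5)
print(f"estimated probability: {p:.3f}")
```

A large estimated probability would tell us that such a cluster is unremarkable under pure chance; only a cluster whose probability is small under this null model calls for an epidemiological explanation.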
In the first overview lecture, we take a look at conceptualizations of water - from the hydrological cycle to socio-political perspectives on water. During the 20th century, water management developed from traditional uses and local industrial schemes to the “hydraulic paradigm” and finally, at the turn of the millennium, to the concept of modern water governance. We will raise the question of whether there has truly been a paradigm shift from the natural-science-based hydraulic paradigm to water governance, and how dualisms of culture/society and nature are still being reproduced. With this in mind, we will also take an introductory look at the much-discussed global water crisis.
Stakeholder Mapping
(2016)
This report presents the results of a stakeholder mapping exercise carried out in the WaterPower project. The mapping was conducted for the following main research areas of the project: water supply, land use planning and management, wetland management and climate change adaptation/disaster risk reduction. The report gives an overview of the stakeholders that play a role in these respective areas and identifies those who have concomitant responsibilities in different sectors. It represents the first step towards further involvement of stakeholders in the WaterPower project.
The equity premium (Mehra and Prescott, 1985) is still a puzzle in the sense that there is still no convincing explanation for its size. In this dissertation, we study this long-standing puzzle and several possible behavioral explanations. First, we apply the IRR methodology proposed by Fama and French (1999) to obtain large-scale firm-level data on the equity premia of N = 28,256 companies in 54 countries around the world. Second, using preference data from the INTRA study (Rieger et al., 2014), we test the relevant risk factors together with time cognition as explanations of the equity premium. We document the failure of the myopic loss aversion hypothesis of Benartzi and Thaler (1995) but provide rigorous empirical evidence supporting the behavioral theory of ambiguity aversion as an account of the equity premium. These observations shed some light on a new approach of integrating risk and ambiguity (together with time preferences) into a more general model of uncertainty, in which both the risk premium and the ambiguity premium play a role in asset pricing models.
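The IRR idea underlying the Fama-French methodology can be sketched as follows: the internal rate of return is the discount rate that sets the net present value of a cash-flow stream to zero. This is a minimal illustration assuming simple annual cash flows and bisection root finding, not the actual estimation procedure used in the dissertation.

```python
def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return via bisection: the rate r at which the
    net present value of `cashflows` (cashflows[t] received at time t)
    equals zero. Assumes one sign change (initial outlay, then inflows),
    so NPV is decreasing in r on [lo, hi]."""
    def npv(r):
        return sum(cf / (1.0 + r) ** t for t, cf in enumerate(cashflows))
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if npv(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Illustrative: buy at 100, receive a dividend of 5 after one year,
# then 115 (dividend plus sale proceeds) after two years
print(f"IRR = {irr([-100, 5, 115]):.4f}")
```

Applied to the aggregate cost of and payoff from corporate investment, such a rate can be read as an estimate of the realized return on equity, which is then compared to the risk-free rate to back out an equity premium.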
The present work considers the normal approximation of the binomial distribution and yields estimates of the supremum distance between the distribution functions of the binomial distribution and the corresponding standardized normal distribution. The type of these estimates corresponds to the classical Berry-Esseen theorem in the special case that all random variables are identically Bernoulli distributed. In this case, we state the optimal constant for the Berry-Esseen theorem. In the proof of these estimates, several inequalities regarding the density as well as the distribution function of the binomial distribution are presented. Furthermore, in the estimates mentioned above, the distribution function is replaced by the probability of arbitrary, not only unbounded, intervals, and in this new situation we also present an upper bound.
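The supremum distance studied here can be computed numerically for given n and p. The sketch below evaluates the gap at the jump points of the binomial distribution function (where the supremum over all real arguments is attained, since the binomial CDF is constant between jumps while the normal CDF increases); it illustrates the quantity being bounded, not the thesis's analytical estimates.

```python
import math

def normal_cdf(x):
    """Standard normal distribution function via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def binomial_normal_sup_distance(n, p):
    """Supremum distance between the Binomial(n, p) distribution
    function and the normal distribution function with matching mean
    and variance, checked at the left limit and value of each jump."""
    mu = n * p
    sigma = math.sqrt(n * p * (1.0 - p))
    cdf = 0.0
    pmf = (1.0 - p) ** n          # P(X = 0)
    worst = 0.0
    for k in range(n + 1):
        z = (k - mu) / sigma
        worst = max(worst, abs(cdf - normal_cdf(z)))  # left limit at k
        cdf += pmf
        worst = max(worst, abs(cdf - normal_cdf(z)))  # value at k
        if k < n:
            pmf *= (n - k) / (k + 1) * p / (1.0 - p)  # P(X = k+1)
    return worst

print(binomial_normal_sup_distance(100, 0.5))   # symmetric case: small
print(binomial_normal_sup_distance(100, 0.05))  # skewed case: larger
```

In the symmetric case p = 1/2, the gap is dominated by half the central jump of the binomial CDF, while skewed parameters (p far from 1/2) produce a visibly larger distance, in line with the p-dependence of Berry-Esseen-type bounds.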
Phase-amplitude cross-frequency coupling is a mechanism thought to facilitate communication between neuronal ensembles. This mechanism could underlie the implementation of complex cognitive processes, like executive functions, in the brain. This thesis contributes to answering the questions of whether phase-amplitude cross-frequency coupling - assessed via electroencephalography (EEG) - is a mechanism by which executive functioning is implemented in the brain, and whether an assumed performance effect of stress on executive functioning is reflected in phase-amplitude coupling strength. A large body of studies shows that stress can influence executive functioning, in essence having detrimental effects. In two independent studies, each comprising two core executive function tasks (flexibility and behavioural inhibition, as well as cognitive inhibition and working memory), beta-gamma phase-amplitude coupling was robustly detected over the left and right prefrontal hemispheres. No systematic pattern of coupling strength modulation by either task demands or acute stress was detected. Beta-gamma coupling might also be present in more basic attention processes. This is the first investigation of the relationship between stress, executive functions and phase-amplitude coupling, so many aspects have not been explored yet - for example, studying phase precision instead of coupling strength as an indicator of phase-amplitude coupling modulations. Furthermore, the data were analysed in source space (independent component analysis); comparability to sensor space has still to be determined. These as well as other aspects should be investigated, given the promising finding of very robust and strong beta-gamma coupling for all executive functions. Additionally, this thesis tested the performance of two widely used phase-amplitude coupling measures (mean vector length and modulation index). Both measures are specific and sensitive to coupling strength and coupling width.
The simulation study also drew attention to several confounding factors that influence phase-amplitude coupling measures (e.g. data length, multimodality).
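The mean vector length measure mentioned above can be sketched on synthetic data. In this hypothetical example, a gamma-band amplitude series is either modulated by a beta-band phase series (coupled) or held constant (uncoupled); the frequency, modulation depth and signal duration are arbitrary choices for illustration.

```python
import cmath
import math

def mean_vector_length(phase, amplitude):
    """Mean vector length PAC measure: the magnitude of the average of
    amplitude-weighted unit phase vectors. Larger values indicate
    stronger phase-amplitude coupling."""
    n = len(phase)
    mv = sum(a * cmath.exp(1j * p) for p, a in zip(phase, amplitude)) / n
    return abs(mv)

# Synthetic data: 4 s at 500 Hz, a 20 Hz (beta) phase series, and an
# amplitude series that is either phase-modulated or flat
fs, dur = 500, 4.0
t = [i / fs for i in range(int(fs * dur))]
beta_phase = [2 * math.pi * 20 * x for x in t]
coupled_amp = [1.0 + 0.8 * math.cos(p) for p in beta_phase]
flat_amp = [1.0] * len(t)

print(mean_vector_length(beta_phase, coupled_amp))  # clearly above zero
print(mean_vector_length(beta_phase, flat_amp))     # near zero
```

In practice the phase and amplitude series would be extracted from band-pass-filtered EEG via the Hilbert transform, and the raw measure would be compared against surrogate data; both steps are omitted here for brevity.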
Global food security poses large challenges to a fast-changing human society and has been a key topic for scientists, agriculturists, and policy makers in the 21st century. The United Nations predicts a total world population of 9.15 billion in 2050 and defines the provision of food security as the second major point of the UN Sustainable Development Goals. As the capacities of both land and water resources are finite and locally heavily overused, reducing agriculture’s environmental impact while meeting the increasing demand for food of a constantly growing population is one of the greatest challenges of our century. Therefore, a multifaceted solution is required, including approaches that use geospatial data to optimize agricultural food production.
The availability of precise and up-to-date information on vegetation parameters is mandatory to fulfill the requirements of agricultural applications. Direct field measurements of such vegetation parameters are expensive and time-consuming. In contrast, remote sensing offers a variety of techniques for the cost-effective and non-destructive retrieval of vegetation parameters. Although not yet widely used, hyperspectral thermal infrared (TIR) remote sensing has been demonstrated to be a valuable addition to existing remote sensing techniques for the retrieval of vegetation parameters.
This thesis examined the potential of TIR imaging spectroscopy as an important contribution to the growing need for food security. The main scientific question dealt with the extraction of vegetation parameters from imaging TIR spectroscopy. To this end, two studies impressively demonstrated the feasibility of extracting vegetation-related parameters from leaf emissivity spectra: (i) the discrimination of eight plant species based on their emissivity spectra and (ii) the detection of drought stress in potato plants using temperature measures and emissivity spectra.
The datasets used in these studies were collected using the Telops Hyper-Cam LW, a novel imaging spectrometer. Since this FTIR spectrometer presents some particularities, special attention was paid to the development of dedicated experimental data acquisition setups and data processing chains. The latter include data preprocessing and the development of algorithms for extracting precise surface temperatures, reproducible emissivity spectra and, finally, vegetation parameters.
The spectrometer’s versatility allows the collection of airborne imaging spectroscopy datasets. Since the general availability of airborne TIR spectrometers is limited, the preprocessing and data extraction methods are underexplored compared to those of reflective remote sensing. This applies especially to atmospheric correction (AC) and temperature and emissivity separation (TES) algorithms. Therefore, we implemented a powerful simulation environment for the development of preprocessing algorithms for airborne hyperspectral TIR image data. This simulation tool is designed in a modular way and covers the image data acquisition and processing chain from surface temperature and emissivity to the final at-sensor radiance data. It includes a series of available algorithms for TES and AC, as well as combined AC and TES approaches. Using this simulator, one of the most promising algorithms for the preprocessing of airborne TIR data - ARTEMISS - was significantly optimized: the retrieval error of the atmospheric water vapor during the atmospheric characterization was reduced. This improvement in atmospheric characterization accuracy substantially enhanced the subsequent retrieval of surface temperatures and surface emissivities.
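At the core of any TES approach lies Planck's law, which links the radiance emitted by a surface to its temperature and emissivity. The sketch below shows this basic radiometric relation and its inversion to a brightness temperature; it is an illustration of the underlying physics, not the ARTEMISS algorithm itself, and the wavelength, temperature and emissivity values are arbitrary examples.

```python
import math

# Physical constants (SI, CODATA values)
H = 6.62607015e-34   # Planck constant (J s)
C = 2.99792458e8     # speed of light (m/s)
KB = 1.380649e-23    # Boltzmann constant (J/K)

def planck_radiance(wavelength_um, temp_k):
    """Blackbody spectral radiance (W m^-2 sr^-1 m^-1) at the given
    wavelength (micrometres) and temperature (kelvin)."""
    lam = wavelength_um * 1e-6
    return (2 * H * C**2 / lam**5) / (math.exp(H * C / (lam * KB * temp_k)) - 1)

def brightness_temperature(wavelength_um, radiance):
    """Invert Planck's law: temperature of a blackbody that would emit
    the given spectral radiance at the given wavelength."""
    lam = wavelength_um * 1e-6
    return (H * C / (lam * KB)) / math.log(1 + 2 * H * C**2 / (lam**5 * radiance))

# A leaf at 300 K with emissivity 0.98, observed at 10 um: the emitted
# radiance yields a brightness temperature slightly below 300 K, which
# is the temperature/emissivity ambiguity that TES algorithms resolve.
L_emit = 0.98 * planck_radiance(10.0, 300.0)
print(brightness_temperature(10.0, L_emit))
```

Because a cooler blackbody and a warmer grey body can emit the same radiance in a single band, TES algorithms exploit the spectral shape across many TIR bands (plus atmospheric constraints) to separate the two unknowns.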
Although the potential of hyperspectral TIR applications in ecology, agriculture, and biodiversity has been impressively demonstrated, a serious contribution to the global provision of food security requires the retrieval of vegetation-related parameters with global coverage, high spatial resolution and high revisit frequencies.
Emerging from the findings of this thesis, the spectral configuration of a spaceborne TIR spectrometer concept was developed. The sensor's spectral configuration aims at the retrieval of precise land surface temperatures and land surface emissivity spectra. Complemented with additional characteristics, i.e. short revisit times and a high spatial resolution, this sensor would potentially allow the retrieval of the valuable vegetation parameters needed for agricultural optimization. The technical feasibility of such a sensor concept underlines its potential contribution to the multifaceted solution required for achieving the challenging goal of guaranteeing global food security in a world of increasing population.
In conclusion, thermal remote sensing, and more precisely hyperspectral thermal remote sensing, has been presented as a valuable technique for a variety of applications contributing to the final goal of global food security.
This working paper examines the concept of metabolism and its potential as a critical analytical lens for studying the contemporary city from a political perspective. The paper illustrates how the metabolism concept has been used historically, both as a metaphor to describe the technological, social, political and economic dimensions of human-environment relations, and as a concrete analytical tool to quantify and better understand how flows of matter and energy shape the territorial and spatial configurations of cityscapes. Drawing on the example of the urban water metabolism of the Greater Accra Metropolitan Area (GAMA), it is argued that contemporary approaches to metabolic analysis should be extended in two ways to increase the integrative potential of the urban water metabolism concept. On the one hand, the paper demonstrates that a political ecology approach is particularly well suited to illuminate the contested production of urban environments and to move beyond a narrow technical, managerial and state-centric focus in research on urban metabolic relations. On the other hand, the paper advocates an approach to metabolic analysis that views the urban environment not simply as a relatively static exteriority produced by dynamic flows of matter, energy and information, but rather as a dynamic, nested and co-evolutionary network of complex biosocial and material relations, which itself shapes how various metabolisms interact across scales. The paper concludes by briefly discussing how a combination of metabolic analysis and political ecology research can inform urban water governance. In sum, the paper emphasizes the need for metabolic analysis to remain open to a plurality of knowledge forms and perspectives, and to remain attentive to the inherently political nature of material and technological phenomena, in order to allow for mutually beneficial exchanges between various scholarly communities.