Understanding the Urgent Need for Direct Climate Cooling

The intensifying impacts of climate change are exceeding projections and amplifying the risk of catastrophic harm to the environment and society throughout the 21st century. Planned and proposed rates of emissions reduction and removal are not proceeding at a pace or magnitude sufficient to meet either the 1.5°C or the 2.0°C target of the Paris Agreement. Moreover, the impacts, damage and loss occurring at today's 1.2°C of global warming are already significantly disrupting the environment and society. Relying exclusively on greenhouse gas (GHG) emissions reduction and removal, without including climate cooling options, is thus proving incompatible with responsible planetary stewardship. Multiple approaches to exerting a cooling influence have the potential to offset at least some of the projected climate disruption if deployed in the near term. Employed thoughtfully, such approaches could be used to limit global warming to well below 1°C, a level that has already led to large reductions in sea ice, destabilization of ice sheets, loss of biodiversity, and transformation of ecosystems. An effective plan for avoiding "dangerous anthropogenic interference with the climate system" would include: a) early deployment of one or more direct cooling influences, initially focused on offsetting amplified polar warming; b) accelerated reductions in emissions of CO2, methane and other short-lived warming agents; and c) building capacity to remove legacy GHG loadings from the atmosphere. Only the application of emergency cooling "tourniquets," researched and applied reasonably soon to a "bleeding" Earth, has the potential to slow or reverse ongoing and increasingly severe climate disruption.

Implementing psychologically informed environments in homelessness services: a qualitative exploration of staff teams’ self-assessments

Purpose: This study aims to explore the perceptions of staff in four teams regarding the implementation of psychologically informed environments (PIE) across a community service and three hostels supporting individuals facing severe and multiple disadvantage.
Design/methodology/approach: Using a pre-post design, the PIEs Assessment and Self-Development for Services (known as the Pizazz) was completed by staff before the implementation of PIEs and at a six-month follow-up. A narrative review of the results and a thematic analysis of the qualitative data are presented.
Findings: The majority of the Pizazz elements were rated as improved following PIE implementation. Thematic analysis developed three themes influencing staff members' ability to develop a PIE: Complexities of Our and Wider Systems; Ready-made or Baked from Scratch; and Reflective and Responsive Staff.
Research limitations/implications: A planned one-year follow-up was obstructed by the coronavirus pandemic, limiting understanding of the longer-term impact. Because data were gathered only from staff members, the results cannot corroborate staff members' perceptions. Further research could explore other stakeholder perspectives, the impact of PIE implementation on staff perceptions of resources, and a possible ceiling effect for hostels trying to develop a PIE.
Originality/value: To the best of the authors' knowledge, this is the first UK study to use the Pizazz to evaluate the implementation of a PIE.

Advances in numerical weather prediction, data science, and open‐source software herald a paradigm shift in catastrophe risk modeling and insurance underwriting

Recent advances in numerical weather prediction, combined with a new generation of high-resolution climate simulations and open-source loss modeling frameworks, herald a move beyond the limited statistical representation of catastrophe risk based on past observations. In this new forward-looking view of risk, an appreciation that our observed record of past natural catastrophes represents a limited sample of possible events, and that the statistics of weather and climate are changing as the planet warms, highlights a key limitation of traditional catastrophe modeling approaches built on statistical relationships fitted to the observed record. Instead, ensembles of new spatially and dynamically consistent simulations of weather and climate provide physically plausible, but as-yet-unseen, events at scales appropriate for making effective risk management and risk transfer decisions. This approach is especially useful in locations where observational records are unobtainable or of short historical duration, such as in low-income countries. We take a forward-looking view of how future catastrophe modeling and insurance underwriting could evolve in response to these technological and scientific advances, using open-source loss model frameworks.
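
The shift the abstract describes, from fitting statistics to a short observed record to estimating risk from large simulated event sets, can be illustrated with a few lines of code. The sketch below is not from the paper and uses purely synthetic, illustrative numbers; it contrasts an empirical loss exceedance estimate built from roughly 40 "historical" years with one built from a 100,000-year simulated catalogue.

```python
# Sketch: return-period losses estimated from a short observed record
# versus a large simulated event catalogue. All values are synthetic.
import numpy as np

rng = np.random.default_rng(42)

def sample_annual_losses(n: int) -> np.ndarray:
    """Draw n annual maximum losses (in $M) from a heavy-tailed process."""
    return rng.lognormal(mean=2.0, sigma=1.2, size=n)

historical = sample_annual_losses(40)       # ~40 years of observations
simulated = sample_annual_losses(100_000)   # large catalogue of plausible years

def return_period_loss(losses: np.ndarray, rp_years: float) -> float:
    """Loss exceeded on average once every rp_years years (empirical quantile)."""
    return float(np.quantile(losses, 1.0 - 1.0 / rp_years))

for rp in (10, 50, 200):
    print(f"{rp:>4}-yr loss | historical: {return_period_loss(historical, rp):8.1f}"
          f" | simulated: {return_period_loss(simulated, rp):8.1f}")
```

Note that the historical estimate of a 200-year loss can never exceed the largest loss in the 40-year sample, which is precisely the limitation the abstract attributes to observation-based catastrophe models.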

Is the rule of halves still relevant today? A cross-sectional analysis of hypertension detection, treatment and control in an urban community.

To estimate the percentages of patients with undiagnosed hypertension, diagnosed but untreated hypertension, and diagnosed, treated but uncontrolled hypertension, and to identify sociodemographic factors associated with diagnosed, uncontrolled hypertension and with not having a blood pressure (BP) reading recorded. Data from 320 094 patients aged 18 to less than 80 years from general practices in inner London were analysed using both the last recorded BP and the mean BP. Logistic regression models identified factors associated with uncontrolled hypertension and with no recorded BP. Twenty-nine thousand, seven hundred and nineteen (9.3%) patients had a recorded diagnosis of hypertension. On the basis of the last BP value, 14.2% (n = 4207) were untreated and 46.3% (n = 13 749) had uncontrolled hypertension; 10.0% (n = 28 274) without a prior hypertension diagnosis had undiagnosed hypertension. Corresponding values based on mean BP were 8.9% (n = 2367) untreated, 51.5% (n = 13 734) uncontrolled and 4.1% (n = 11 446) undiagnosed. A total of 17.5% (n = 55 960) had no recorded BP value. Black ethnicity was a predictor of uncontrolled hypertension: compared with the White British population, the adjusted odds ratio (AOR) for the Black African population was 1.39 (95% CI: 1.25-1.53) and for the Black Caribbean population 1.31 (95% CI: 1.19-1.45). The White Other group was most likely to have no record of BP measurement (AOR: 1.52; 95% CI: 1.47-1.57); conversely, unrecorded BP was less likely in the Black African (AOR: 0.79; CI: 0.74-0.83) and Black Caribbean (AOR: 0.71; CI: 0.66-0.76) groups, relative to the White British population. In an inner-city, multiethnic population, the 'rule of halves' still broadly applies to the diagnosis and control of hypertension, although only a small proportion were untreated.
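
For readers unfamiliar with how adjusted odds ratios like those above are produced, the following is a minimal sketch of the analysis pattern (logistic regression with a reference ethnic category), not the study's actual code: the synthetic data, column names and covariates are illustrative assumptions.

```python
# Sketch: deriving adjusted odds ratios (AORs) with 95% CIs via logistic
# regression. Data are synthetic and independent, so AORs will be ~1.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5_000
df = pd.DataFrame({
    "uncontrolled": rng.integers(0, 2, n),   # outcome: BP above target (0/1)
    "ethnicity": rng.choice(
        ["White British", "Black African", "Black Caribbean", "White Other"], n),
    "age": rng.integers(18, 80, n),
    "male": rng.integers(0, 2, n),
})

# Fit with White British as the reference category; exponentiated
# coefficients are the adjusted odds ratios.
model = smf.logit(
    "uncontrolled ~ C(ethnicity, Treatment('White British')) + age + male",
    data=df,
).fit(disp=False)

aor = np.exp(model.params)
ci = np.exp(model.conf_int().rename(columns={0: "2.5%", 1: "97.5%"}))
print(pd.concat([aor.rename("AOR"), ci], axis=1).round(2))
```

Exponentiating the fitted coefficients and their confidence limits yields AORs with 95% CIs in the form reported above (e.g. AOR 1.39, 95% CI: 1.25-1.53).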

SCHEMA

Purpose: This paper aims to review the development of thinking about the information needed by companies to create an accurate picture of how well they manage their engagement with customers, taking into account the evolution of thinking and practice in this area over the past three decades towards the idea of data-driven customer engagement. It then describes the evolution and use of an assessment and benchmarking process and tool which provide the needed information.
Design/methodology/approach: Literature review, conceptual analysis and explanation of the management consulting process are used.
Findings: Companies can get an accurate picture of how well they manage customer engagement provided that a careful assessment approach is used, where assessors are properly selected and trained and there is a strong focus on compliance with requirements rather than "box-ticking" based upon managers' perceptions.
Research limitations/implications: The assessment and benchmarking process was developed mainly for use by larger companies, though the findings could be adapted for use by smaller companies.
Practical implications: Companies whose success depends upon customer engagement should consider using the assessment and benchmarking tool to guide their planning and implementation. They should heed the warnings about the risks of inaccurate assessments, which may arise because of the incentives by which managers are managed.
Social implications: The assessment and benchmarking process has been used by the public sector and government and, given government's desire to engage citizens better, they should consider adopting the ideas in this paper to reform citizen engagement.
Originality/value: This is the only paper which reviews the development of the assessment process for customer engagement.

Integrated Reservoir Study of the Who Dat Field

The Who Dat field is located in Mississippi Canyon 503 under 3,100 ft of water and penetrates 11 stacked horizons between 10,000 and 18,000 ft, of Pliocene and Upper Miocene age. The Who Dat OPTI-EX® semi-submersible FPS was the first FPS built on speculation and the first privately owned FPS in the world. The field was discovered in December 2007, began production in December 2011, and paid out in 2014. It has produced more than 52 MMSTB of oil and 99 BCF of gas as of year-end 2017. This paper covers the integrated reservoir study of the Who Dat field, including geological modeling, data management, reservoir surveillance, construction of the integrated reservoir models, history matching and forecasting with the integrated models, depletion plan optimization, production operation optimization, and sensitivity and uncertainty analysis.

The eleven horizons in the Who Dat field have varied depositional environments, petrophysical properties, and fluid properties. As a result, the optimum depletion plan must be designed uniquely for each reservoir, as well as for the field as a whole. The biggest reservoir, the 4600, exhibited steep pressure decline and excellent connectivity, which made it a potential candidate for water flooding. However, the facilities have neither the space nor the weight capacity for the water flood equipment, leaving only high-cost solutions such as major platform modifications (wing decks, hull blisters) or building a standalone facility on the Shelf. Based on a thermal simulation model, the viscosity of the injected cold water is several times higher than that of the in-situ water, which decreases the injectivity of the water injectors. Even though the water flood project was expected to increase ultimate oil recovery significantly, the project team recommended not going forward with it based on the integrated reservoir study and the resulting economics: had the project been sanctioned, the field would not have paid out for years and the net present value of the project would have been significantly decreased.

The second biggest reservoir, the 4700, had a downhole sample showing undersaturated oil with a GOR of 1,280 SCF/STB; however, the well exhibited an abnormally high GOR of 4,500 SCF/STB when put on production. A later well that penetrated the updip area showed the zone to be gas bearing, indicating that the reservoir is not in equilibrium, with a gas cap above an undersaturated oil rim. The project team decided to decrease the rate to see whether the well was rate sensitive. With the help of multiphase meters, the team observed that at the decreased rate the GOR dropped to less than 2,500 SCF/STB and the productivity index increased from 4 STB/psi to 14 STB/psi. By continuing to produce the well at a reduced rate, the ultimate recovery was significantly increased.

The reservoir models have been very consistent after a couple of years of production: the production and pressure forecasts from the 2014 history-matched models are within 10% of the historical data, which provides confidence in the integrated model. After six years of production, the project team is still actively updating and utilizing the integrated models to evaluate future development wells, secondary recovery opportunities, and production optimization to further increase the value of the Who Dat field.
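
The rate-sensitivity result for the 4700 sand rests on simple productivity-index arithmetic, PI = q / (p_res - p_wf), where the abstract quotes PI in STB/psi (i.e. STB/D per psi of drawdown, under the usual convention). The sketch below reproduces the reported jump from 4 to 14 STB/psi using rates and pressures that are purely assumed, chosen only so the arithmetic matches those two figures; none of the pressure or rate values come from the paper.

```python
# Sketch: productivity-index arithmetic behind the 4700-sand rate test.
# Only the PI endpoints (4 -> 14 STB/psi) come from the abstract; the
# rates and pressures below are illustrative assumptions.
def productivity_index(rate_stb_d: float, p_res_psi: float, p_wf_psi: float) -> float:
    """Oil productivity index: rate divided by drawdown (STB/D per psi)."""
    return rate_stb_d / (p_res_psi - p_wf_psi)

p_res = 9_000.0  # assumed reservoir pressure, psi

# At the original high rate, large drawdown yields relatively little oil:
pi_high = productivity_index(rate_stb_d=8_000, p_res_psi=p_res, p_wf_psi=7_000)  # -> 4.0

# After choking back, drawdown per barrel shrinks and PI recovers:
pi_low = productivity_index(rate_stb_d=7_000, p_res_psi=p_res, p_wf_psi=8_500)   # -> 14.0

print(f"PI at high rate: {pi_high:.1f} STB/psi; at reduced rate: {pi_low:.1f} STB/psi")
```

The PI recovery at reduced rate is consistent with the observed GOR drop: less free gas flowing near the wellbore leaves more of the drawdown working on oil.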
