The data is falling!

Abstract

This GIS Trends column offers another angle on data preservation (see my "Here Today, Here Tomorrow…" piece from the Summer 2024 issue, linked in the references section below). While last time we discussed the loss of research outputs, and of the data hosted within them, when Esri retired its "Classic Story Maps", today we discuss the loss of supposedly more "stable" data sources, namely government sources. As we've seen, even something as simple as the name of a place can be changed (see Treisman, 2025 on the Gulf of Mexico), though even that raises the question of changed for whom (see Thiessen, 2025 on Denali).

Similar Papers
  • Book Chapter
  • 10.1093/acrefore/9780190264079.013.815
Measuring Violent Crime
  • Jan 30, 2024
  • Joanne Savage

There are a variety of considerations related to the measurement of violence. Concerns regarding collection of primary data include the fact that violence is rare, and typical samples would yield very low variability. Self-report data may provide rich detail, but asking questions about violent behavior may result in social desirability biases. Techniques such as life history calendars have enhanced data collection in some studies. Children may be too young to respond to surveys; measuring their physical aggression can be done with ratings. Aggregate measures of violence are typically derived from government reports. The primary source of official crime data in the United States has been the summary Uniform Crime Reports (UCR), collected from police departments by the Federal Bureau of Investigation for many decades, but it has been replaced by the National Incident-Based Reporting System (NIBRS), a change made nationwide as of 2021. Considerable gaps in the data are expected to be narrowed as local jurisdictions and states adapt to the new system. To complement UCR and NIBRS official estimates of reported crime, the Bureau of Justice Statistics also fields the annual National Crime Victimization Survey (NCVS) to understand victimization in more depth. Researchers wishing to use these sources of data should be aware of the many published discussions of limitations and caveats for each data set. Those interested in studying specific forms of violent crime, including homicide, rape, robbery, and assault, should be aware of idiosyncrasies in the measurement of those crime types. For homicide, there are a variety of additional sources of official data, such as the Supplemental Homicide Reports, the National Vital Statistics System, and the National Death Index. Changes in the collection of assault data, including a move away from the “aggravated assault” category commonly used in the past by UCR and NCVS, deserve particular attention. 
Cross-national data are limited, and most investigators rely on World Health Organization data for homicide or the International Crime Victimization Survey. These have limited reach and scope compared to Interpol reports that were available in the past. Researchers interested in individual-level data should also be aware that there are many large-scale, secondary data sets available for analysis. These are sometimes restricted and require approval of an application to access sensitive data. Data include those derived from longitudinal studies of children and teenagers such as AddHealth and the National Longitudinal Survey of Youth, the Social Development Project, the Cambridge Study in Delinquent Development, the Dunedin Multidisciplinary Health and Development Study, and the Pathways to Desistance study. There are several other categories of violence that deserve special attention. Because family violence is underreported to police, and official sources of data are thought to be inadequate, researchers have been systematically working to measure and track domestic violence for a long time. The data and reports generally come from noncriminal justice agencies. Many studies use the Conflict Tactics Scale and the Revised Conflict Tactics Scale (CTS2) to gauge intimate partner violence. This index includes subscales that account for severity of violence and context (e.g., fighting in self-defense). The CTS2 has been adapted for a variety of family and dating relationships. The Centers for Disease Control and Prevention fields the National Intimate Partner and Sexual Violence Survey, and researchers may access those data. In the early 21st century, the public and researchers have expressed growing interest in violence by and against the police, and there are now several sources of data for researchers to use.
These include the Law Enforcement Officers Killed and Assaulted data collection program and nongovernment sources such as The Washington Post’s police shootings database. Similarly, for those interested in gun violence, government sources are limited, and many investigators turn to nongovernment sources such as the Gun Violence Archive, which has no government or advocacy affiliation. Finally, those interested specifically in mass violence will also find the National Mass Violence Resource Center. The magazine Mother Jones publishes data on mass shooting events, by year, dating back to 1982, and The Washington Post and USA Today also publish an archive of mass violence events.

  • Conference Article
  • 10.4043/20797-ms
SS: Metocean: Validation of HYCOM current profiles using MMS NTL observations
  • May 3, 2010
  • Shejun Fan + 5 more

The HYCOM (HYbrid Coordinate Ocean Model) consortium, sponsored by the National Ocean Partnership Program (NOPP), provides near-real-time global estimates of daily mean current data back to November 2003. In April 2005, the US Minerals Management Service (MMS) issued a Notice to Lessees and Operators (NTL) regarding the reporting of ocean current data in the deep water of the Gulf of Mexico. An extensive body of NTL current data has since been collected by the offshore oil and gas industry and made available via the National Data Buoy Center (NDBC) web site. This provides an extremely valuable source of observational data for current model validation at deepwater drilling locations in the Gulf of Mexico. Careful validation of the HYCOM model is required to ensure critical features of the current regime are adequately represented and to assess model skill. The paper describes the methodology and results of a HYCOM current model validation exercise using the MMS NTL observations in the Gulf of Mexico. The suitability of the model for the offshore industry in the Gulf of Mexico is discussed.
Introduction
As the offshore industry moves to ever-deeper waters, assessment of ocean currents is required. Knowledge of the ocean currents through depth is essential to riser design and control, operation of dynamically positioned vessels, and other elements of engineering design and operation of deepwater oil and gas facilities. The Hybrid Coordinate Ocean Model (HYCOM) consortium is a multi-institutional effort funded by the National Ocean Partnership Program (NOPP), as part of the U.S. Global Ocean Data Assimilation Experiment (GODAE), to develop and evaluate a data-assimilative hybrid isopycnal-sigma-pressure (generalized) coordinate ocean model. The horizontal dimensions of the global grid are 4500 × 3298 grid points, resulting in ~7 km spacing on average.
There are up to 32 vertical layers, depending on the water depth, with output at standard Levitus depth levels. Daily data are available from 3 November 2003 to three days into the future (Chassignet et al., 2009). On April 21, 2005, the US Minerals Management Service (MMS) issued a Notice to Lessees and Operators (NTL) regarding the reporting of ocean current data in the deep water of the Gulf of Mexico. Since then, the offshore oil and gas industry has collected and reported current data, using Acoustic Doppler Current Profilers (ADCPs), at 78 deep water drilling locations in the Gulf of Mexico. The extensive body of NTL ADCP current data has been made available via the National Oceanic and Atmospheric Administration (NOAA) National Data Buoy Center (NDBC) web site. This provides an extremely valuable source of observational data for model validation at deepwater drilling locations in the Gulf of Mexico. Based on the results of the HYCOM validation in the Gulf of Mexico, this study assesses the suitability of the model for application by the offshore industry in the Gulf of Mexico.

  • Book Chapter
  • 10.1007/978-1-4939-3423-2_9
Sources of Morbidity Data
  • Dec 19, 2015
  • Richard K Thomas

Because there is no centralized source of morbidity data, health data users must often access multiple sources. Morbidity data are available from government sources, association sources, and private industry sources, with each source having advantages and disadvantages. The characteristics of each source are described, and this information is followed by a discussion of the synthetic data (i.e., estimates and projections) that are generated to fill gaps in existing morbidity data. The issues faced by health data users in drawing from a variety of different sources are discussed. This chapter describes the various available sources of data on sickness and disability for the US population and reviews their attributes, level of accessibility, and usefulness. The quality of the data sources is evaluated as appropriate.

  • Research Article
  • Cited by 53
  • 10.1016/j.marpol.2017.05.003
Offshore pipeline construction cost in the U.S. Gulf of Mexico
  • May 25, 2017
  • Marine Policy
  • Mark J Kaiser


  • Research Article
  • Cited by 12
  • 10.5194/essd-13-645-2021
A gridded surface current product for the Gulf of Mexico from consolidated drifter measurements
  • Feb 25, 2021
  • Earth System Science Data
  • Jonathan M Lilly + 1 more

Abstract. A large set of historical surface drifter data from the Gulf of Mexico – 3770 trajectories spanning 28 years and more than a dozen data sources – are collected, uniformly processed and quality controlled, and assimilated into a spatially and temporally gridded dataset called GulfFlow. This dataset is available in two versions, with 1/4° or 1/12° spatial resolution, respectively, both of which have overlapping monthly temporal bins with semimonthly spacing and which extend from the years 1992 through 2020. Together these form a significant resource for studying the circulation and variability in this important region. The uniformly processed historical drifter data from all publicly available sources, interpolated to hourly resolution, are also distributed in a separate product called GulfDriftersOpen. Forming a mean surface current map by directly bin-averaging the hourly drifter data is found to lead to severe artifacts, a consequence of the extremely inhomogeneous temporal distribution of the drifters. Averaging instead the already monthly-averaged data in GulfFlow avoids these problems, resulting in the highest-resolution map of the mean Gulf of Mexico surface currents yet produced. The consolidated drifter dataset is freely available at https://doi.org/10.5281/zenodo.3985916 (Lilly and Pérez-Brunius, 2021a), while the gridded products are available for noncommercial use only (for reasons discussed herein) at https://doi.org/10.5281/zenodo.3978793 (Lilly and Pérez-Brunius, 2021b).
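The averaging pitfall this abstract describes can be illustrated with a toy sketch. This is hypothetical data and our own function names, not the GulfFlow processing code; it only shows why pooling all hourly samples differs from averaging the monthly means when sampling is temporally inhomogeneous:

```python
# Toy illustration: with uneven temporal sampling, a direct bin average is
# dominated by heavily sampled periods, while a two-stage (monthly-then-
# overall) average weights each month equally. Hypothetical values, not the
# GulfFlow pipeline itself.
from statistics import mean

def pooled_mean(samples_by_month):
    """Direct bin average: pool every hourly sample, then average once."""
    all_values = [v for month in samples_by_month for v in month]
    return mean(all_values)

def mean_of_monthly_means(samples_by_month):
    """Two-stage average: compute each monthly mean, then average those."""
    return mean(mean(month) for month in samples_by_month)

# Month 1: a drifter cluster yields 100 samples of strong eastward flow;
# Month 2: a single sample of no flow.
months = [[1.0] * 100, [0.0]]
print(pooled_mean(months))            # skewed toward the crowded month
print(mean_of_monthly_means(months))  # each month contributes equally
```

The skew in the first estimate is the "severe artifact" the authors avoid by averaging the already monthly-averaged GulfFlow bins.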

  • Conference Article
  • Cited by 3
  • 10.4043/12961-ms
Deep Water Gulf of Mexico Sea Floor Features Revealed Through 3D Seismic
  • Apr 30, 2001
  • E Scott + 4 more

The Gulf of Mexico sea floor has been mapped with a variety of different methods with varying resolution. Until recently, higher-resolution 3D seismic surveys have generally been limited to site-specific localities. With hydrocarbon exploration advancing and evolving, larger 3D seismic surveys have been acquired and are now available on a regional scale. Mapping of the sea floor reflector on a regional 3D seismic data set in the Green Knoll/Walker Ridge area has resulted in a higher resolution map than previous data sets. The seismic data have revealed a set of regionally extensive furrows, a previously unknown Gulf of Mexico sea floor feature. Preliminary investigations show that the erosive style of the furrows changes in a predictable pattern as current flow velocities increase. The erosional features of the furrow field indicate the presence of strong ocean bottom currents.
Introduction
A variety of methods during the past few decades have been used to map the sea floor in the northern part of the Gulf of Mexico. Until recently, the highest resolution data sets with the most extensive coverage have been surface-towed, multibeam, side-scan-sonar (NOAA Seabeam) and long-range sonar (GLORIA). Maps from these data sets, such as those produced by Jia Liu and William Bryant, have approximately a 30 meter resolution on the sea floor (Fig. 1). However, in deep and ultra-deep water depths the resolution is lower due to the greater travel time from source to receiver. Other data sources that have been used to generate sea floor maps of the Gulf of Mexico include gravity, magnetics, satellite sensors and 2D seismic surveys. While maps generated from gravity, magnetics and satellite data cover the whole of the northern Gulf of Mexico or a larger area, they are of much lower resolution than either the NOAA or GLORIA data.
2D seismic data provide a higher resolution view of the sea floor, but because the data form a lattice of separate lines, they lack the spatial resolution to define subtle features on the sea floor. As seismic acquisition and processing technology evolved, 3D seismic surveys became available. With the close spacing of data points, 3D surveys provide greater spatial resolution to map the sea floor and illuminate subtle features. However, due to cost and technological limitations, the first surveys were restricted to site-specific localities with small areal extents. With their increased spatial resolution, these initial surveys provided a more detailed map of the sea floor but represented only a postage-stamp look in the regional context of the northern Gulf of Mexico. With recent advances in technology, along with exploration for hydrocarbons advancing to deep and ultra-deep water, larger 3D seismic surveys have been acquired and are now available on a regional scale. With the availability of these regional surveys, sea floor features that were previously unrecognizable have now been identified.

  • Research Article
  • Cited by 1
  • 10.1306/819a3ea6-16c5-11d7-8645000102c1865d
Late Neogene Stratigraphy (Foraminiferal, Coccolith, and Paleomagnetic), Upper Coastal Group, Jamaica, West Indies: ABSTRACT
  • Jan 1, 1972
  • AAPG Bulletin
  • J H Beard, W V Sliter, L A Sm

Late Neogene planktonic foraminiferal and calcareous nannofossil biostratigraphy of the Upper Coastal Group on the island of Jamaica is compared with the planktonic succession in the Gulf of Mexico and with the standard European stages and reference sections in Italy. Correlation of epoch boundaries and other paleontologic data from the Italian to the Caribbean and Gulf of Mexico regions utilizes restricted occurrences of planktonic foraminiferal and calcareous nannofossil species common to both regions. Species important for this intercontinental correlation and dating include: Globorotalia acostaensis, Sphaeroidinellopsis sphaeroides, Discoaster challengeri, and D. extensus in late Miocene; early Pliocene Globorotalia margaritae and Discoaster quinqueramus; middle and late Pliocene species of the Globorotalia crassaformis lineage, Sphenolithus abies, and Reticulofenestra pseudoumbilica; and appearance of Globorotalia truncatulinoides, Helicopontosphaera sp., and Gephyrocapsa oceanica, and faunal evidence for onset of climatic deterioration in early Pleistocene. Climatic criteria obtained by analyses of the planktonic fauna provide a basis for recognition of the Pliocene-Pleistocene boundary within the most continuous and fossiliferous exposures of late Neogene marine sediments in the Gulf of Mexico and Caribbean region. On the basis of these data a sequence of planktonic foraminiferal zones and subzones is compared with the polarity reversal stratigraphy within the Gilbert, Gauss, and Matuyama geomagnetic epochs.

  • Research Article
  • Cited by 18
  • 10.3389/fpubh.2020.578463
Framework for a Community Health Observing System for the Gulf of Mexico Region: Preparing for Future Disasters
  • Oct 15, 2020
  • Frontiers in Public Health
  • Paul Sandifer + 36 more

The Gulf of Mexico (GoM) region is prone to disasters, including recurrent oil spills, hurricanes, floods, industrial accidents, harmful algal blooms, and the current COVID-19 pandemic. The GoM and other regions of the U.S. lack sufficient baseline health information to identify, attribute, mitigate, and facilitate prevention of major health effects of disasters. Developing capacity to assess adverse human health consequences of future disasters requires establishment of a comprehensive, sustained community health observing system, similar to the extensive and well-established environmental observing systems. We propose a system that combines six levels of health data domains, beginning with three existing, national surveys and studies plus three new nested, longitudinal cohort studies. The latter are the unique and most important parts of the system and are focused on the coastal regions of the five GoM States. A statistically representative sample of participants is proposed for the new cohort studies, stratified to ensure proportional inclusion of urban and rural populations and with additional recruitment as necessary to enroll participants from particularly vulnerable or under-represented groups. Secondary data sources such as syndromic surveillance systems, electronic health records, national community surveys, environmental exposure databases, social media, and remote sensing will inform and augment the collection of primary data. Primary data sources will include participant-provided information via questionnaires, clinical measures of mental and physical health, acquisition of biological specimens, and wearable health monitoring devices. A suite of biomarkers may be derived from biological specimens for use in health assessments, including calculation of allostatic load, a measure of cumulative stress. The framework also addresses data management and sharing, participant retention, and system governance. 
The observing system is designed to continue indefinitely to ensure that essential pre-, during-, and post-disaster health data are collected and maintained. It could also provide a model/vehicle for effective health observation related to infectious disease pandemics such as COVID-19. To our knowledge, there is no comprehensive, disaster-focused health observing system such as the one proposed here currently in existence or planned elsewhere. Significant strengths of the GoM Community Health Observing System (CHOS) are its longitudinal cohorts and ability to adapt rapidly as needs arise and new technologies develop.

  • Conference Article
  • Cited by 1
  • 10.4043/22580-ms
Permanent Borehole Seismic in Ultra Deep Offshore Appraisal Wells
  • Oct 4, 2011
  • Scott Taylor + 1 more

Incomplete reservoir characterization is a leading concern in estimating and predicting the reservoir recovery factor and thus the commercial viability of a discovery. The importance of reservoir characterization is amplified for both offshore and ultra deep water offshore asset classes, where huge investment decisions are based upon an inequality of initial information from appraisal well tests. Characterization and monitoring of the reservoir between wells still depends largely on data from either towed streamers or ocean bottom cabled or node-based sensor networks. The available seismic data is typically of very limited bandwidth, giving similarly limited spatial resolution, so uncertainties related to reservoir content and continuity (compartmentalization) remain high. In principle, this risk can be mitigated by well tests and by obtaining higher-resolution seismic data, such as 3D VSPs or even cross-well tomography data; however, standard operational procedures and current workflows make these solutions both costly and risky. Offshore appraisal wells are an excellent potential source of dynamic borehole seismic data, in addition to distributed pressure and temperature measurements. The sensors can be installed closer to the reservoir horizon and, most importantly, below salt layers, yielding significant enhancements in resolution from time-lapse active (VSP/cross-well) and passive (microseismic) seismic surveillance. Additionally, permanent seismic sensors are a field-proven approach to providing operators with a dynamic and deterministic source of casing and cement bond integrity data in real time (Smith et al., 1998, 2001, 2002, 2010). The acquisition of seismic data from appraisal wells is not currently a standard operating procedure or available option for operators due to a number of prevailing technical issues.
The authors present an overview of instrumenting offshore appraisal wells during both temporary and permanent abandonment procedures and describe state-of-the-art approaches to meeting current challenges.
Introduction
"Deepwater" oil and gas assets are simply conventional reserves in an unconventional setting. They constitute an asset class of their own largely because they share a common set of technical challenges over the entire exploration and production lifecycle, from identification and development through production. Primary deepwater regions are located in the Gulf of Mexico (GOM), Brazil, and West Africa. Approximately 70% of Brazil's current portfolio of recoverable hydrocarbon reserves is located in deepwater basins. Although Brazil has some of the world's most prolific deepwater fields, as an investment target Brazil must compete with West Africa, where deepwater exploration success rates run as high as 80% in some areas, and with the Gulf of Mexico (GOM), where discoveries are smaller but more frequent (Figueiredo, 2006). One key investment driver is the ability of an operator to optimize an asset's recovery factor. This consideration is of even more significance for deepwater assets.

  • Research Article
  • Cited by 18
  • 10.1080/19425120.2016.1227404
Was Everything Bigger in Texas? Characterization and Trends of a Land-Based Recreational Shark Fishery
  • Jan 1, 2016
  • Marine and Coastal Fisheries
  • Matthew J Ajemian + 4 more

Although current assessments of shark population trends involve both fishery-independent and fishery-dependent data, the latter are generally limited to commercial landings that may neglect nearshore coastal habitats. Texas has supported the longest organized land-based recreational shark fishery in the United States, yet no studies have used this “non-traditional” data source to characterize the catch composition or trends in this multidecadal fishery. We analyzed catch records from two distinct periods straddling heavy commercial exploitation of sharks in the Gulf of Mexico (historical period = 1973–1986; modern period = 2008–2015) to highlight and make available the current status and historical trends in Texas’ land-based shark fishery. Catch records describing large coastal species (>1,800 mm stretched total length [STL]) were examined using multivariate techniques to assess catch seasonality and potential temporal shifts in species composition. These fishery-dependent data revealed consistent seasonality that was independent of the data set examined, although distinct shark assemblages were evident between the two periods. Similarity percentage analysis suggested decreased contributions of Lemon Shark Negaprion brevirostris over time and a general shift toward the dominance of Bull Shark Carcharhinus leucas and Blacktip Shark C. limbatus. Comparisons of mean STL for species captured in historical and modern periods further identified significant decreases for both Bull Sharks and Lemon Sharks. Size structure analysis showed a distinct paucity of landed individuals over 2,000 mm STL in recent years. Although inherent biases in reporting and potential gear-related inconsistencies undoubtedly influenced this fishery-dependent data set, the patterns in our findings documented potential declines in the size and occurrence of select large coastal shark species off Texas, consistent with declines reported in the Gulf of Mexico. 
Future management efforts should consider the use of non-traditional fishery-dependent data sources, such as land-based records, as data streams in stock assessments. Received January 8, 2016; accepted August 17, 2016

  • Book Chapter
  • 10.1016/b978-0-12-820288-3.00010-x
Chapter 10 - Gulf of Mexico Pipeline Construction Cost
  • Jan 1, 2020
  • The Offshore Pipeline Construction Industry
  • Mark J Kaiser


  • Book Chapter
  • 10.1007/978-3-540-78849-2_66
A Framework for Query Capabilities and Interface Design of Mediators on the Gulf of Mexico Data Sources
  • Apr 26, 2008
  • Longzhuang Li + 2 more

The integration of various data sources collected from the Gulf of Mexico (GOM) will become a valuable resource for the public, local government officials, scientists, natural resource managers, and educators. Due to the exclusive and distributed nature of these data, a new framework is developed to retrieve partial results from the underlying data sources to answer more user queries for the union operator. In addition, the user interface of mediators is considered when computing the query capabilities.

  • Conference Article
  • Cited by 2
  • 10.3997/2214-4609.201413633
Integration of Seismic and Well Data to Characterize Facies Variation in a Carbonate Reservoir - The Tau Model Revisited
  • Sep 7, 2015
  • M Elahi Naraghi + 1 more

In this paper, we present a novel method of data integration based on the permanence of ratios hypothesis. In order to model the conditional probability, it would be convenient if the information from each data source could be assessed independently to find P(A|B) and P(A|C), with these probabilities then merged to calculate P(A|B,C) while accounting for the redundancy between the different data sources. We propose a methodology for calculating the redundancy between different sources of information. Our formulation is based on the information from each data source being modeled using a mixture-of-Gaussians assumption, indicative of the multiple facies or categories of rock properties observed in the reservoir. We implemented the proposed methodology to characterize a carbonate reservoir in the Gulf of Mexico. The available data sets were drill cutting data, core data, well log measurements and a 3D seismic volume. We used core data to calibrate log measurements to lithofacies. Then, we merged the probability maps of lithofacies using the permanence of ratios hypothesis and generated multiple realizations by Monte Carlo sampling from the probability maps. The modeling resulted in identification of reservoir regions that have a higher proportion of dolomitized grainstones and that might be suitable drilling targets.
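The permanence-of-ratios combination underlying the tau model can be sketched for the two-source case as follows. The function names and the default tau exponents are our own simplification for illustration; the paper's actual redundancy calibration is more involved:

```python
# Sketch of the permanence-of-ratios (tau model) combination of two
# conditional probabilities P(A|B) and P(A|C) into P(A|B,C). Simplified
# two-source illustration; not the authors' full redundancy calibration.

def odds(p):
    """Distance x = (1 - p) / p used by the permanence-of-ratios model."""
    return (1.0 - p) / p

def combine_permanence_of_ratios(p_a, p_a_given_b, p_a_given_c,
                                 tau_b=1.0, tau_c=1.0):
    """Merge P(A|B) and P(A|C) into an estimate of P(A|B,C).

    With tau_b = tau_c = 1 this is the standard permanence-of-ratios
    combination; other tau values adjust for redundancy between the
    data sources B and C.
    """
    x = odds(p_a)            # prior distance
    b = odds(p_a_given_b)    # distance after seeing B
    c = odds(p_a_given_c)    # distance after seeing C
    x_combined = x * (b / x) ** tau_b * (c / x) ** tau_c
    return 1.0 / (1.0 + x_combined)

# Two moderately confident, non-redundant sources reinforce each other:
# starting from a 0.5 prior, two independent 0.8 updates yield ~0.94.
p = combine_permanence_of_ratios(0.5, 0.8, 0.8)
```

Setting one tau exponent to zero discards that source, recovering the single-source probability, which is a quick sanity check on any implementation.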

  • Research Article
  • Cited by 4
  • 10.4031/002533206787353250
Development of a Coastal Ocean Observing System for the Gulf of Mexico
  • Dec 1, 2006
  • Ann E Jochens + 1 more

The Gulf of Mexico Coastal Ocean Observing System (GCOOS) is being developed as one of the regional coastal ocean observing systems under the U.S. Integrated Ocean Observing System (IOOS), which is a contribution to the Global Ocean Observing System (GOOS). GCOOS will be a sustained ocean observing system that provides data, information, and products on marine and estuarine systems to a wide range of users. A GCOOS Regional Association (GCOOS-RA) has been established to develop GCOOS. Activities to build GCOOS have included development of an inventory of extant observing systems, connection of real-time physical data from extant systems into the National Data Buoy Center via the Internet, and establishment and implementation of mechanisms for ongoing identification of (1) stakeholder requirements and priorities and (2) priority pilot projects to meet regional needs. A storm surge and inundation workshop is being held to identify the measurements and products needed by emergency managers and responders to better predict and mitigate effects of storm surge and inundation in the southeastern U.S. and Gulf of Mexico. Funding for enhancements to the GCOOS is being sought from governmental and private sources. For GCOOS to evolve to its full potential, new federal resources targeted to regional coastal ocean observing systems must be committed.

  • Conference Article
  • Cited by 4
  • 10.4043/8845-ms
Troika Subsea Production System - An Overview
  • May 4, 1998
  • J.M Bednar

The Troika Subsea Production System is located in the Green Canyon Area of the Gulf of Mexico, 150 miles offshore and in 2700 feet of water. Featuring an eight-well subsea manifold with five wells installed initially, Troika is designed to produce 80-100,000 barrels of oil per day to its host platform located 14 miles away in 1350 feet of water. Concern about paraffin and hydrates necessitated considerable analysis, which in turn pointed to the need to insulate the trees, manifold, jumpers, and flowlines. Key features of the Troika development are described in this paper.
Introduction
The Troika development is an oil field located 150 miles offshore Louisiana in the Green Canyon area of the Gulf of Mexico. Designated as the Green Canyon 244 Unit, which was formed in June 1993, the development includes Green Canyon blocks 200, 201, 244, and 245 (Fig. 1). Troika is owned equally by BP, Shell, and Marathon, with BP serving as the designated operator. Water depth at the development site is 2700 feet. Troika features an eight-slot subsea manifold with five wells installed initially (Fig. 2). The compact size of the subsea manifold permitted its installation in a novel manner using only a supply boat and the drilling rig. The Troika wells are non-TFL and have a 10,000 psi pressure rating. Production from Troika is processed at Shell's Bullwinkle platform located 14 miles away in 1350 feet of water on Green Canyon block 65. Concern about the potential to form hydrates and paraffin was a key factor in formulating the design and operating plan for the subsea production system. As a result, the trees, jumpers, manifold, and flowlines are all insulated to minimize heat loss during production, as well as to extend reaction time to manage hydrate formation potential following a shut-in. Methods used to insulate the trees, manifold, and jumpers were novel and innovative. Insulated tubing was installed in one well to facilitate start-up.
Troika represents the longest multi-phase subsea tie-back system in the Gulf of Mexico. First production from Troika was successfully achieved approximately 39 months after discovery, further demonstrating industry's ability to deliver fast-track development schedules in deepwater. While this paper is intended to provide an overview of the Troika project, further details may be found in the various companion papers listed in the Reference section.
Development History
Productive hydrocarbons were discovered at Troika in June 1994 when Marathon drilled the Green Canyon 244-1 well and found 250 feet of net oil pay in the primary S-10 reservoir. In the summer of 1995, Marathon further tested and defined the lateral and vertical continuity of the S-10 reservoir by drilling the Green Canyon 200-1 appraisal well. A second appraisal well, Green Canyon 245-1, was drilled in September 1995 and discovered a separate, but much smaller, reservoir in a section of the Unit designated as Area 2. Both appraisal wells were temporarily abandoned. During 1995, tradeoff studies were conducted to determine a development approach for Troika, with the two leading candidate systems being a subsea tie-back and a floating production system, the latter featuring a converted drilling rig. The subsea tie-back to Shell's Bullwinkle platform was ultimately selected as it provided the most cost-effective solution based on NPV and capex utilization.
