_This article, written by JPT Technology Editor Chris Carpenter, contains highlights of paper SPE 211115, “IBTIKAR Digital Laboratory: A Collaborative Approach Toward Research and Development Challenges in Oil and Gas Upstream,” by Rauf Iqbal, Pranav Kumar, and Muhammad Aamir, ADNOC, et al. The paper has not been peer reviewed._

The complete paper describes the creation of a research and development (R&D) environment that supports the business community. The operator’s Ibtikar laboratory is the result of an initiative to develop and provide in-house infrastructure (hardware and software) support within a national oil company’s operations. This synopsis is devoted mostly to the R&D challenges the developers faced and the steps taken to overcome them.

**Efficient Seismic Attribute Extraction for Azimuthal or Angle Stacks**

An asset team acquired several high-density, broadband, single-source/single-sensor seismic surveys characterized by an enormous amount of data. From acquisition to processing, multiple subcubes (90 GB/cube) were produced to capture subtle geological features. On a conventional machine, loading the seismic cubes took 40 hours and realizing them took more than 7 days, and that was for only one seismic attribute. To use the data fully in the static and dynamic models of the respective reservoirs, the asset team collaborated with the team behind the described “digital laboratory” to provide a solution best suited to such a large processing requirement. The latter team analyzed the existing system, estimated requirements, and provided a high-performance-computing (HPC) system that could handle the massive workload. Table 1 of the complete paper compares the conventional workstation with the HPC system.
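The speedup the asset team obtained comes from the fact that trace-based attribute extraction is embarrassingly parallel: each chunk of the cube can be processed independently, so a many-core HPC system cuts turnaround roughly in proportion to the core count. The paper does not publish its code; the sketch below only illustrates the idea, using RMS amplitude (a common attribute, chosen here for simplicity) on a tiny synthetic stand-in for a 90-GB stack:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def rms_amplitude(chunk):
    """RMS amplitude per trace for an (inlines, xlines, samples) chunk."""
    return np.sqrt(np.mean(chunk.astype(np.float64) ** 2, axis=-1))

def extract_attribute(cube, n_workers=4):
    """Split the cube along the inline axis and compute the attribute on each
    chunk concurrently. NumPy releases the GIL inside these calls, so threads
    scale here; a full-size cube on a 96-core node would use one chunk per core."""
    chunks = np.array_split(cube, n_workers, axis=0)
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return np.concatenate(list(pool.map(rms_amplitude, chunks)), axis=0)

# Tiny stand-in for an azimuthal stack: 64 inlines x 64 xlines x 256 samples.
cube = np.random.default_rng(0).normal(size=(64, 64, 256)).astype(np.float32)
attr = extract_attribute(cube)
print(attr.shape)  # (64, 64): one attribute value per trace
```

The same chunking pattern extends to multinode HPC clusters, where each node takes a slab of inlines and results are stitched back into a single attribute volume.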
**Computationally Intensive Fault-Likelihood Attribute Generation and Fault-Extraction Process**

In complex geological settings, fault identification and extraction of fault planes have always been challenging with traditional seismic-interpretation techniques. Conventional discontinuity-based seismic attributes often are limited in delineating complex fault patterns, especially in areas of strike-slip and compressional tectonic regimes. To address these limitations, geoscientists often use sophisticated seismic attributes such as fault likelihood, which follows a sequential multiattribute work flow to generate a fault-imaging volume from which fault planes are later extracted directly.

The key challenge in running such a computationally intensive attribute on a traditional workstation is that the process is time-consuming and performance is highly compromised by hardware limitations. The fault-likelihood process itself generates six different seismic volumes during automatic computation. The task becomes even more challenging when multiple scenarios with varying parameters are required to assess and finalize the fault-imaging volumes and the extracted fault planes.

The digital laboratory was used to run this computationally intensive fault-likelihood process on a high-end machine with 800 GB of random-access memory and 96 cores, with improved efficiency. Running this multiattribute work flow on a seismic volume took only 10 days, including multiple scenarios with different attribute parameters to reduce fault uncertainty and extract fault planes automatically.
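The paper does not detail its fault-likelihood implementation, but attributes of this family are commonly built on semblance, with likelihood taken as 1 minus a high power of semblance so that coherent reflectors score near 0 and lateral breaks score near 1 (the eighth power below is borrowed from the published fault-likelihood literature, not from the paper). A minimal 2D sketch on a synthetic section:

```python
import numpy as np

def semblance(window):
    """Semblance of a (n_traces, n_samples) window: ~1 for identical traces,
    near 0 for incoherent ones."""
    n = window.shape[0]
    num = np.sum(np.sum(window, axis=0) ** 2)
    den = n * np.sum(window ** 2) + 1e-12
    return num / den

def fault_likelihood(section, half=2):
    """Scan a 2D section (traces x samples) with a small trace window and
    return 1 - semblance**8 per trace; high values flag lateral
    discontinuities, i.e., candidate fault positions."""
    fl = np.zeros(section.shape[0])
    for i in range(half, section.shape[0] - half):
        fl[i] = 1.0 - semblance(section[i - half:i + half + 1]) ** 8
    return fl

# Synthetic section: one coherent reflector, broken at trace 32 by a
# vertical "fault" with a half-period throw.
t = np.arange(128)
section = np.tile(np.sin(2 * np.pi * t / 16), (64, 1))
section[32:] = np.roll(section[32:], 8, axis=1)
fl = fault_likelihood(section)
print(int(np.argmax(fl)))  # peaks at the fault, near trace 32
```

The work flow described in the paper does this in 3D, scans over dips, and writes several intermediate volumes (six, per the authors) before extracting fault planes, which is what makes it heavy enough to warrant 96 cores and 800 GB of memory.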