Guest editorial

In recent times, we have been inundated with articles concerning the oil and gas industry's race toward digitalization and automation. The pillars of such a transformation are the rapidly evolving Industrial Internet of Things (IIoT), secure cloud computing, data analytics, artificial intelligence (AI), and machine learning. In addition, advances in sensor technology have enabled important breakthroughs in near-continuous real-time measurement, potentially transforming our ability to derive pertinent information from processes that change over time. Sensors that continuously report current downhole conditions in producing wells and surface facilities have become powerful tools for managing oilfields. The skillful combination of these rapidly emerging digital and permanent-sensor technologies is moving the industry toward automation that optimizes asset performance across the full life cycle of a reservoir with minimal human intervention.

Seeing the Unseen

We strive to see what lies beneath the surface, and our ability to construct a digital-twin image of the subsurface depends entirely on the data we measure. Oilfield data are classified into two categories: data at rest (archived data) and data in motion (live streaming data). The huge amounts of data gathered over time pose a formidable challenge to automation, yet the secrets to many E&P problems may lie hidden within that accumulated mass.

This quandary brings to mind a concept I read about in the MIT Sloan Management Review in the early 1990s: metaknowledge, an appreciation of what we know and what we do not know. We normally define knowledge as all the facts we have accumulated over time. Metaknowledge, by contrast, is a measure of our awareness of the nature, scope, limits, and uncertainties of that primary knowledge, and it is often more important than the primary knowledge itself. The authors, J. Edward Russo and Paul J. H. Schoemaker, illustrated the difference metaphorically: knowing when to see a doctor (metaknowledge) matters more than how much we know about medicine (primary knowledge).

To image the subsurface, we must extract and interpret the pertinent information (metaknowledge) buried inside the massive amount of archived data (primary knowledge). This requires Big Data solutions and the development of complex mathematical algorithms that can, at times, mimic the reasoning processes of the human brain through AI.

The integration of 3D seismic data with geology and petrophysics is at the core of defining a static reservoir model, which can predict reservoir performance. The static reservoir model provides a 3D database for storing 3D reservoir properties, referred to as data at rest. The data at rest are updated periodically as new data are acquired (e.g., well logs and time-lapse 3D seismic). Live data streamed by permanent sensors are referred to as data in motion. Data in motion include pressure and rate measured at the wellhead or bottomhole, and temperature from distributed temperature sensing (DTS) fiber-optic cables, at various measurement points (nodes) in the wellbore, on the seabed, and in surface facilities.