
The Evolution, Not Revolution, of Digital Integration in Oil and Gas

By

Michael Trevathan

MEng, Petroleum Engineering, Texas A&M University, 2012

BSc, Chemical Engineering, Texas A&M University, 2010

SUBMITTED TO THE SYSTEM AND DESIGN MANAGEMENT PROGRAM

IN PARTIAL FULFILLMENT OF THE REQUIREMENTS

FOR THE DEGREE IN

MASTER OF SCIENCE IN ENGINEERING AND MANAGEMENT

AT THE

MASSACHUSETTS INSTITUTE OF TECHNOLOGY

September 2020

© 2020 Michael Trevathan. All rights reserved.

The author hereby grants to MIT permission to reproduce and to distribute publicly paper and

electronic copies of this thesis document in whole or in part in any medium now known or

hereafter created.

Signature of Author_____________________________________________________________________________________________ Michael Trevathan

MIT System and Design Management Program August 6, 2020

Certified by_______________________________________________________________________________________________________ Donna H. Rhodes

Principal Research Scientist, Sociotechnical Systems Research Center Thesis Supervisor

Accepted by______________________________________________________________________________________________________ Joan Rubin

Executive Director, System Design and Management Program


THE EVOLUTION, NOT REVOLUTION, OF DIGITAL INTEGRATION IN OIL AND GAS

Submitted to the System and Design Management (SDM) program

On August 6, 2020 in Partial Fulfillment of the

Requirements for the Degree of

Master of Science in Engineering and Management

Abstract

High-impact digital innovations present opportunities for organizations to transform their

business capabilities to adapt for future sustainability. To adopt new platforms offered by

disruptive technologies, organizations must alter or retire existing business models, create and

develop new competencies, and build an agile business culture. An organization’s failure to

respond to evolving digital initiatives will inevitably lead to a loss of competitive advantage and

even obsolescence. Undertaking and managing transformative digital solutions may seem risky, but

the alternative is riskier.

This thesis explores the opportunities associated with integrating digital technologies into

established oil and gas (O&G) organizations where transformation will be exceedingly difficult.

Investing in the right technologies that fit the organizational size, competencies, and culture is

critical for the success of adopted digital initiatives. Case studies reviewing digital investment

portfolios within the O&G industry are presented to evaluate the investment size, capabilities,

and realized value creation associated with digital integration on design and operations.

A systems approach was employed to understand the barriers and limitations to digital

integration in the following areas: data value chain and workflows, data architecture

standardization, and end-to-end lifecycle integration, with emphasis on O&G drilling and

completion operations. Additionally, a business strategy roadmap was created to recommend

realized value opportunities for a digital investment portfolio to succeed in this constantly

evolving marketplace.

Thesis Supervisor: Donna H. Rhodes

Title: Principal Research Scientist, Sociotechnical Systems Research Center


Acknowledgements

The support and guidance from MIT professors and fellow students have made this program and

thesis an invaluable experience both academically and professionally. I want to thank my thesis

advisor, Dr. Donna Rhodes, for the time she dedicated to framing and influencing the direction

of my thesis work. Without her continued support and guidance, this thesis would not have been

possible. I want to also thank all of the MIT SDM and Sloan professors and students for creating

an incredible learning environment throughout the past year. The breadth and diversity of

education and experience within the MIT community has had a pronounced influence on my

personal and professional growth.

Additionally, I am immensely grateful to Chevron for enabling this opportunity to develop new

skills that will be instrumental to my career and future work with the company. The dedicated

support from the Chevron community throughout this program has made this a rewarding and

meaningful experience. I would also like to thank the employees at Chevron, FutureOn, Corva,

eDrilling, Seeq, Veros Systems, XMPro, ChaiOne, and Intellicess who devoted their time to share

insights into the digital space within the O&G industry through multiple interviews and

discussions. Finally, I want to thank my family for their enduring support throughout this

initiative.


Table of Contents

ABSTRACT ............................................................................................................................................ 3

ACKNOWLEDGEMENTS ........................................................................................................................ 5

EXECUTIVE SUMMARY ........................................................................................................................ 15

1 INTRODUCTION .......................................................................................................................... 20

1.1 RESEARCH OBJECTIVE AND QUESTIONS ............................................................................................. 25

1.2 RESEARCH APPROACH AND SCOPE .................................................................................................... 26

1.3 OUTLINE OF THESIS ........................................................................................................................ 28

2 DIGITAL MARKET IN O&G ........................................................................................................... 30

3 O&G DIGITAL PORTFOLIOS ......................................................................................................... 36

3.1 SHELL .......................................................................................................................................... 39

3.2 BP .............................................................................................................................................. 41

3.3 CHEVRON ..................................................................................................................................... 43

3.4 EXXONMOBIL ............................................................................................................................... 45

3.5 EQUINOR ..................................................................................................................................... 46

3.6 SAUDI ARAMCO ............................................................................................................................ 47

3.7 OTHER OPERATORS ....................................................................................................................... 48

3.8 DC&I DIGITAL INITIATIVES .............................................................................................................. 49

3.9 DIGITAL PARTNERSHIP CATEGORIZATION ........................................................................................... 51

4 SYSTEMS APPROACH TO A DIGITAL PORTFOLIO .......................................................................... 52

4.1 SYSTEM CONTROL .......................................................................................................................... 57

4.2 CURRENT SYSTEM STATE ................................................................................................................. 59

4.2.1 ENGINEERING DESIGN ................................................................................................................ 59

4.2.2 OPERATIONS ............................................................................................................................ 61

4.3 FUTURE SYSTEM STATE ................................................................................................................... 68

4.3.1 ENGINEERING DESIGN ................................................................................................................ 69

4.3.2 OPERATIONS ............................................................................................................................ 78

4.4 IDENTIFIED GAPS ........................................................................................................................... 90


4.4.1 HUMAN TO SYSTEM INTEGRATION ............................................................................................... 91

4.4.2 SENSOR & DATA QUALITY ........................................................................................................... 92

4.4.3 DATA ACCESSIBILITY ................................................................................................................... 93

4.4.4 STANDARDIZATION AND INTEROPERABILITY .................................................................................... 94

4.4.5 RETURN ON INVESTMENT ........................................................................................................... 94

4.4.6 PARTNERSHIPS AND ALLIANCES .................................................................................................... 95

4.4.7 DEVOPS .................................................................................................................................. 95

5 MODEL BASED SYSTEM ENGINEERING ........................................................................................ 97

6 SYSTEMS APPROACH TO DIGITAL ARCHITECTURE ..................................................................... 100

6.1 O&G REAL-TIME DATA ARCHITECTURE ........................................................................................... 100

6.2 DATA STANDARDS ....................................................................................................................... 103

6.3 ENERGISTICS ............................................................................................................................... 105

6.4 ENTERPRISE ARCHITECTURE ........................................................................................................... 107

7 DATA ANALYTICS IN WELL DESIGN AND OPERATIONS ............................................................... 111

7.1 DATA MINING ............................................................................................................................. 114

7.2 EXAMPLE CASE STUDIES & APPLICATIONS ........................................................................................ 117

8 DIGITAL PLATFORM DESIGN ..................................................................................................... 120

8.1 DATA SCIENCE AND MACHINE LEARNING (DSML) PLATFORMS ........................................................... 120

8.2 CHALLENGES TO DIGITAL PLATFORM ADOPTION ............................................................................... 122

8.3 DIGITAL EVALUATION METHODOLOGY ............................................................................................ 127

9 ECONOMIC DESIGN FOR UNCERTAINTY .................................................................................... 132

10 COLLABORATION INITIATIVES ................................................................................................... 137

10.1 VIRTUAL ASSISTANTS .................................................................................................................... 137

10.2 SOCIO-NETWORK KNOWLEDGE GRAPH ........................................................................................... 140

11 CONCLUSION ............................................................................................................................ 144

11.1 LIMITATIONS AND FUTURE RESEARCH ............................................................................................. 148

REFERENCES ..................................................................................................................................... 150


Figures

Figure 1: Digital Transformation with Focus on Organizational Capabilities .............................................. 16

Figure 2: IEA World Energy Outlook 2019: Oil Demand (International Energy Agency (IEA), n.d.) ............ 21

Figure 3: Digital Initiatives Capital Investment vs. Realized Value (Espinoza, Thatcher, and Eldred 2019) 24

Figure 4: Thesis Roadmap .......................................................................................................................... 27

Figure 5: Value-at-Stake for O&G Digital Initiatives, Adapted from (World Economic Forum 2017) ......... 31

Figure 6: O&G Investments in Digital Technology, Adapted from (World Economic Forum 2017) ........... 35

Figure 7: Oil & Gas Industry Decomposition .............................................................................................. 54

Figure 8: Oil and Gas Data Pipeline ............................................................................................................ 55

Figure 9: Non-Productive Time Categorization .......................................................................................... 56

Figure 10: IHS Markit Rushmore Reviews Database US Deepwater NPT% and Well Cost, 2012-2020 ...... 57

Figure 11: ANSI/ISA-95 Enterprise-Control System Model, adaptation from (de Wardt 2019) ............................. 58

Figure 12: Current State Drilling Engineering Design Workflow ................................................................ 60

Figure 13: Current State Drilling Operations Workflow ............................................................................. 62

Figure 14: Example Real-Time Drilling Data ............................................................................................... 64

Figure 15: Current State Operations Data Flow Architecture .................................................................... 65

Figure 16: Drilling Data System Flow Diagram – Emphasis on Data Stakeholders ..................................... 67

Figure 17: Front End Expectations versus Back-End Reality ....................................................................... 69

Figure 18: Future System State: Digital Design Platform ........................................................................... 71

Figure 19: Operations Edge Computing Data Flow Architecture (“Oil and Gas at the Edge | Automation World” n.d.) ................................................................................................ 80

Figure 20: Digital Twin Architecture ........................................................................................................... 83

Figure 21: O&G Digital Twin Architecture .................................................................................................. 85

Figure 22: Systems Methods & Tools, Adapted from (“INCOSE: Systems Engineering Vision 2025” 2014) .................................................................................................... 99

Figure 23: General Field-to-Office Data Flow Architecture ...................................................................... 101

Figure 24: On-Shore Production Operations Data Telecommunications Architecture ............................ 101

Figure 25: Offshore Drilling Operations Data Telecommunication Architecture ..................................... 102

Figure 26: Enterprise Data Platform Architecture, Portions of Microsoft Data Platform Reference Architecture Image (https://docs.microsoft.com/en-us/azure/architecture/example-scenario/dataplate2e/data-platform-end-to-end) used with permission from Microsoft. ..................... 108


Figure 27: Data Science and Artificial Intelligence (“How To Be A Data Scientist - Quantum Computing” n.d.) [left] and (“Part 1: Artificial Intelligence Defined | Deloitte | Technology Services” n.d.) [right] ... 112

Figure 28: Artificial Intelligent Applications in E&P Industry, derived from (Bravo et al. 2012) data ...... 113

Figure 29: 10 Years of Digital Search Trend Data on Google Trends ........................................................ 114

Figure 30: Common Algorithm Analytic Methods, Adapted from (Rexer 2017) ...................................... 116

Figure 31: Bayesian Network Feedback Loop ........................................................................................... 117

Figure 32: Neural Network for Drilling Rate (Xue 2020) ........................................................................... 118

Figure 33: Relative Risk Priority of Digital Barriers ................................................................................... 127

Figure 34: Example Plot of Digital Risk vs. Value for Focus on Tool Characterization .............................. 130

Figure 35: Example Digital Tool Analysis Results ...................................................................................... 131

Figure 36: Oil Production Break-Even Price, Adapted from (“Rystad Energy Ranks the Cheapest Sources of Supply in the Oil Industry” n.d.) ........................................................................... 132

Figure 37: Historical Chart of Crude Oil Prices, Adapted from (McNally 2017; “EIA: Petroleum & Other Liquids | Spot Price Data,” n.d.) ............................................................................... 133

Figure 38: Cost Breakdown to Produce a Barrel of Oil: US Shale (MacroTrends n.d.) ............................. 134

Figure 39: Economic Model Assumptions ................................................................................................ 135

Figure 40: Monte-Carlo Economic Model of Digital Initiatives in Drilling ................................................ 136

Figure 41: Conversational Bot Architecture, Portions of Enterprise-Grade Conversational Bot Image (https://docs.microsoft.com/en-us/azure/architecture/reference-architectures/ai/conversational-bot) used with permission from Microsoft. ..................................................... 139

Figure 42: Sociotechnical Knowledge Graph of O&G Major Capital Project, Created from (“Kumu” n.d.) .................................................................................................. 141

Figure 43: Strategic Digital Approach ....................................................................................................... 146


Tables

Table 1: Shell’s Digital Portfolio Initiatives 1 of 2 ....................................................................................... 39

Table 2: Shell’s Digital Portfolio Initiatives 2 of 2 ....................................................................................... 40

Table 3: BP’s Digital Portfolio Initiatives ..................................................................................................... 41

Table 4: Chevron’s Digital Portfolio Initiatives 1 of 2 ................................................................................. 43

Table 5: Chevron’s Digital Portfolio Initiatives 2 of 2 ................................................................................. 44

Table 6: ExxonMobil's Digital Portfolio Initiatives ...................................................................................... 45

Table 7: Equinor’s Digital Portfolio Initiatives ............................................................................................ 46

Table 8: Saudi Aramco’s Digital Portfolio Initiatives ................................................................................... 47

Table 9: Aker BP’s and Total’s Digital Portfolio Initiatives .......................................................................... 48

Table 10: Digital DC&I Initiatives ................................................................................................................ 49

Table 11: Summary of O&G Digital Initiatives ............................................................................................ 51

Table 12: Oilfield Data Transfer Speeds per Technology .......................................................................... 103

Table 13: Digital Platform Characterization Methodology ....................................................................... 121

Table 14: Barriers to Organizational Digital Adoption .............................................................................. 124

Table 15: Example of Organization Digital Risk Self-Assessment ............................................................. 128

Table 16: Example of Quantification of Digital Tool Risk and Value ......................................................... 129


Acronyms

AI Artificial Intelligence

ANN Artificial Neural Networks

ASCII American Standard Code for Information Interchange

DC&I Drilling, Completions, and Intervention

DDR Daily Drilling Report

DHP Downhole Pressure

DHT Downhole Temperature

DOF Digital Oilfield

DSATS Drilling System Automation Technical Section

ECD Equivalent Circulating Density

EIA U.S. Energy Information Administration

ER Extended Reality

ETP Energistics Transfer Protocol

HDF5 Hierarchical Data Format v5

IaaS Infrastructure as a Service

IDS Independent Data Services

IEA International Energy Agency

INCOSE International Council on Systems Engineering

IIoT Industrial Internet of Things

IoT Internet of Things

IPT Invisible Productive Time

IT Information Technology

JSON JavaScript Object Notation

LEO Low Earth Orbit

MBSE Model Based Systems Engineering

ML Machine Learning

MWD Measurements While Drilling

MR Mixed Reality

NLP Natural Language Processing

NPT Non-Productive Time


OC Organizational Capabilities

OPC Open Platform Communications

PaaS Platform as a Service

PCA POSC Caesar Association

PLC Programmable Logic Controllers

PPDM Professional Petroleum Data Management Association

PRODML Production Markup Language

PT Productive Time

QAQC Quality Assurance Quality Control

RESQML Reservoir Q Markup Language

ROP Rate of Penetration

RPM Revolutions Per Minute

RTOC Real-Time Operations Center

RTU Remote Terminal Unit

SaaS Software as a Service

SCADA Supervisory Control and Data Acquisition

SLC Standards Leadership Council

SoS System of Systems

SPE Society of Petroleum Engineers

SysML Systems Modeling Language

T&D Torque and Drag

UML Unified Modeling Language

VSAT Very Small Aperture Terminal

VR Virtual Reality

V&V Verification and Validation

WITSML Wellsite Information Transfer Standard Markup Language

WOB Weight on Bit

XML eXtensible Markup Language

XR Extended Reality


Executive Summary

The Oil and Gas (O&G) industry has an opportunity to redefine its operational perspective

through digitalization and artificial intelligence (AI). The industry has experienced high

commodity volatility, increased climate change accountability, challenging geo-political and

geological environments, and disruptions in consumer and business energy selections.

Digitalization empowers the O&G industry to address these disruptive challenges and to provide

innovative solutions that create value to the industry overall. The value at stake for achieving

high-functioning digital initiatives is estimated to be $1.6 to $1.9 trillion (World Economic Forum

2017). The challenge for the O&G industry is to understand the organizational and technical

transformation required to realize the full potential value enabled by digital solutions. To be sure,

the digital roadmap to data science and machine learning (DSML) platform adoption is opaque,

and organizations are at risk of initiating long-term investments that fail to produce value.

However, an organization’s failure to respond to evolving digital initiatives will inevitably lead to

a loss of competitive advantage and even obsolescence. Without a clear digital roadmap, the

tension between digital uncertainty and the immediate call-to-action risks isolated initiatives and

siloed solutions.

Digital transformation is a journey of leveraging connectivity and analytics at an

enterprise-scale in order to drive design and operational efficiencies. The focus around platform

standardization, interoperability, and flexibility is critical to ensure that the multi-disciplinary value-

chain ecosystem is integrated for both local and holistic optimizations, which emphasizes the

value of robust collaborative environments. Operational competency (OC) in digital literacy and

AI has been revealed as a bottleneck for digital growth and innovation. Developing digital

competency is essential for creating a data-driven organization that utilizes the value and

commoditization of data, and for developing innovative solutions for leveraging data to engineer

a competitive advantage. The 2020 NewVantage AI survey indicated that “over 90% of the

challenges to becoming data-driven are in the people, process, and culture – not the technology”

(“Big Data and AI Executive Survey 2020” 2020). The advancements in available technology,

including big data, internet of things (IoT), automation, augmented reality, lifecycle system

integration, machine learning, and simulation optimization have surpassed the O&G industry’s


current ability to leverage it – and this technology gap continues to increase. Figure 1 shows the

system integration of emerging digital technologies with an emphasis on the organization’s role

to not only connect the ecosystem of digital elements, but also to leverage the elements

as an integrated system for improved industrial performance. To remove the bottleneck, an

organization must develop a data-driven (versus experience driven) culture that integrates with

digital platforms and services, with a focus on realized business value.

Figure 1: Digital Transformation with Focus on Organizational Capabilities

Interviews were conducted for this thesis with innovative leaders in the oilfield digital

space. Their input and perspective were utilized to develop and understand the current and

future state of the role of digital in oilfield design and process control. The bullets below outline

the guidance and takeaways from these organizations’ journey to transform the O&G industry

into a data-driven community:

• Digital innovations and technologies are revolutionary; however, the process of adopting

and integrating digital platforms into the social and technical workflows within an

organization is evolutionary.

• Digital opportunities are transforming the design and operational workflows within the

O&G industry; however, the greatest challenges to programmatically adopting

technology enterprise-wide are internal digital competency and awareness, cultural

acceptance of change and disruption, and trust with leaders, partners, and alliances.


• The O&G industry has exhibited a historically siloed competitive ecosystem and this

isolated philosophy will no longer remain competitive. Digital platforms built to integrate

systems, partnerships, and expertise will thrive in a networked marketplace.

• A data-driven organization must fundamentally embrace the value of revitalizing,

monetizing, and democratizing organizational data, including both real-time and

historical data, as well as structured and unstructured data.

• Digital developments and services are more likely to provide value if the organizational

capabilities, capacities, and resources are aligned with the functional requirements of the

digital solution (i.e. organizations need to better understand their internal limits before

developing or buying digital solutions that exceed the abilities of the intended

stakeholders).

• Enterprise digital platforms and solutions should be developed with an integrated, open-

source, and standardized approach for interoperability and sustainability. Integrated

systems build integrated solutions where management of greater complexities and

optimizations is possible (a minimal sketch of reading a standardized payload follows this list).

• Internal and external digital partnerships and alliances are essential for the continual

growth and development of an enterprise digital ecosystem. Organizations need to

transition to partnerships that embrace the “open” approach to shared growth

opportunities and resources. Innovations and breakthroughs are built off the incremental

improvements from predecessors (network of growth), and this theme should continue

with the system development of a digital enterprise. The objective should shift from

selecting a single system to replace all existing systems, to selecting a digital platform that

integrates and builds off the existing systems.

• Digitization is fundamentally geared toward creating faster and better business decisions

for both design and operations. Within the digital system, data quality (volume, variety,

veracity, velocity, and value) will persist as a continual challenge to analytic accuracy and

capabilities. The data lifecycle must be holistically evaluated to determine the data quality

and processing requirements to model actionable decisions for specific processes. The

data lifecycle is inclusive of data creation with sensor type, placement, and quality, to


data filtering, processing, and performing analytical models. Understanding the statistical

accuracy required for a data-driven model to influence business decisions will help dictate

the system requirements to govern a specific design or operation.

• Digitization is about connection, visualization and prediction (analytics), and action

(optimization and automation). The connectivity and repeated analytics create the full

system understanding that generates value. An organization can shift significant energy

from monitoring and data mining to design and operational decision support. Ultimately,

the information and business insight availability, democratized across an entire

organization, allows the workforce to focus entirely on value.

• Notably, successful asset performance management has been achieved through the

integration of cloud enterprise systems, unified operational centers, and operational

lifecycle management, including automation, optimization, and maintenance.
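
To make the interoperability point above concrete, the sketch below parses a simplified, WITSML-style real-time log into generic records using only the Python standard library. This is an illustrative assumption, not the actual Energistics schema: the element names, attributes, and channel mnemonics (WOB, RPM, ROP) are hypothetical stand-ins chosen to show how a standardized, self-describing payload can be consumed without a vendor-specific parser.

```python
# Illustrative sketch only: parse a simplified, WITSML-style XML payload into generic
# Python records. Element names and channel mnemonics are hypothetical stand-ins for
# the real Energistics schema; the point is that a standardized, self-describing format
# can be read by any consumer without a vendor-specific parser.
import xml.etree.ElementTree as ET

SIMPLIFIED_LOG = """
<log well="Well-A7" indexType="time">
  <curve mnemonic="WOB" unit="klbf"/>
  <curve mnemonic="RPM" unit="rpm"/>
  <curve mnemonic="ROP" unit="ft/hr"/>
  <data>2020-08-06T12:00:00Z,22.4,120,85.1</data>
  <data>2020-08-06T12:00:05Z,23.1,121,87.6</data>
</log>
"""

def parse_log(xml_text):
    """Return one dict per data row, keyed by timestamp plus curve mnemonic."""
    root = ET.fromstring(xml_text)
    mnemonics = [curve.attrib["mnemonic"] for curve in root.findall("curve")]
    rows = []
    for row in root.findall("data"):
        timestamp, *values = row.text.strip().split(",")
        rows.append({"time": timestamp, **dict(zip(mnemonics, map(float, values)))})
    return rows

if __name__ == "__main__":
    for record in parse_log(SIMPLIFIED_LOG):
        print(record)
```

Because the payload describes its own channels and units, an edge gateway, a cloud historian, and an office analytics tool could all consume the same record structure, which is the practical benefit of the open, standardized approach described above.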

The data shows that advanced software applications, infused with AI on a cloud-based

platform, are able to boost and enhance business decisions for design and operations to provide

a strong competitive advantage. However, the opportunity is advertised as a revolutionary

change, and while the potential capability improvements are immense, the rhetoric is a disservice

to the movement because it underemphasizes the actual amount of infrastructural, cultural,

competency, and workflow shift required for success. Instead, the digital adoption is proving to

be more of an evolutionary process than a single revolutionary shift.

This thesis concludes that organizations underestimate the extent of the challenges

associated with developing, scaling, adopting, executing, and maintaining a digital initiative

across a large organization. The digital literacy, awareness, inquisitiveness, and culture of most

O&G organizations are not yet fully positioned to foster an environment where digital initiatives

can succeed without heavy reliance on external resources. Organizations should consider a

holistic systems perspective when selecting digital partnerships and services. The digital

evaluation method presented in this thesis provides a robust approach to consider the

interdependencies and relationships of the digital system being analyzed with the rest of the

organization. The safe approach is to start with a business challenge, create a digital vision or

strategy, and embrace open-source collaboration and standardization. The capabilities in the


digital space are growing rapidly, and organizations must develop or adopt digital initiatives that

prioritize extensibility and sustainability.


1 Introduction

Emerging digital technologies and rapid innovations are transforming the oilfield in what

is termed the Digital Oilfield (DOF) transformation. The term has been highly leveraged, both

through hype and to create value, to highlight growth and innovation to CEOs, managers, and

shareholders. The DOF has garnered attention from the entire industry, with integrated oil

companies (IOC), national oil companies (NOC), independents, and oilfield services (OFS) all

heavily investing to achieve the competitive advantage that the transformation promises. The

digital transformation can be simply defined as an organization’s adaption or journey to

restructure and strategize in order to capture the business opportunities enabled by digital

technology. The O&G industry has developed and adopted new technologies for decades,

however, not at the forecasted DOF rate or magnitude required to stay competitive in the

present marketplace. The advent of inexpensive sensors, increased bandwidth capabilities,

improved data processing power, cloud storage, and internet accessibility, combined with new

ways to source and analyze information, sparked the transition toward digital dependency. The

paradigm shift will require organizations to develop new business models, workflows, and

collaboration efforts to ultimately move from an experience-driven to a data-driven culture.

The momentum toward digital in the O&G industry is not only about availability and

accessibility to digital products and services; the rapid push has also been influenced by economic

instability from reduced and volatile oil prices, political pressures to reduce environmental

impact, specifically around carbon footprint, and from competitors’ early digital adoption. Due

to the holistic, multi-disciplinary integrated approach that is aimed at enabling optimal economic

recovery, the implementation of a fully connected oilfield provides an opportunity for oil

companies to be profitable in a lower commodity price environment. O&G will be able to create

a business model that does not risk obsolescence during the boom and bust cycles of the oil

industry. Even though the IEA ‘Current Policy’ projects future oil demand to continuously increase

into 2040, oil prices can still fluctuate to uneconomic levels, as shown in Figure 2 with the Stated

Policy and Sustainable Development demand forecasts. The IEA has oil demand forecast models for three

scenarios: Current Policy, Stated Policy, and Sustainable Development – more information on

these forecast assumptions can be found in their Energy Outlook Report. By developing


techniques that step-change the profitability margins, companies will have more stability and

control with their business strategies.

The other strong driver for digital adoption is competition. Operational asset

performance efficiencies are now achievable through data capture, data management, and data

visualization feeding advisory and automation algorithms. Operational experience as a core

competency no longer has the competitive advantage that it once did. The advantage is now

leveraged through linking data from thousands of wells to make informed business and

operational decisions, which is not a core competency of major oil companies. This opportunity

allows smaller and less experienced companies the ability to compete and outperform more

experienced organizations.

Figure 2: IEA World Energy Outlook 2019: Oil Demand (International Energy Agency (IEA), n.d.). [Chart: WEO 2019 energy demand forecast in Mtoe (million tonnes of oil equivalent) and CO2 emissions in Gt for 2000-2040 under the IEA Current Policies, Stated Policies, and Sustainable Development scenarios, broken out by coal, oil, natural gas, nuclear, renewables, and solid biomass.]

In addition to the improved return on investment (ROI) on core business, the emerging

digital technologies provide opportunities for companies to develop new growth ventures in

areas outside their core business. Familiar companies that failed to innovate and transition to the

digital model are Kodak, Blackberry, Sears, Xerox, and Blockbuster – which demonstrates the

difficulty of disrupting the core business model while at the top of an industry. Even though oil

has projected growth into 2040, other influences around carbon restrictions, renewable energy,

alternative energy (solar, wind, water, hydrogen, nuclear), as well as breakthroughs in battery

capacity and electric cars, can all have significant disruptions against the forecast. Companies are


going to take this opportunity, and lessons learned, to explore different business models in the

digital space to ensure future sustainability.

The strategic incorporation of digital technology is critical for future sustainability in the

oilfield; however, the roadmap to successful digital adoption is opaque, where misguided

“digital” projects can lead down frustrating and unproductive paths. Understanding the core

value initiative of the digital transformation is helpful for making value-driven decisions. As a

baseline, the objective of any new initiative includes one or more of the following: reducing costs,

increasing revenue, improving safety, or reducing environmental impact. These objectives are

accomplished through digital platforms by improved machine and human connectivity, abilities

to monitor and measure, big data analytics and algorithms, automation, improved modeling

techniques, collaboration, and optimizations. These opportunities demonstrate just a few core

principles that can be referenced to understand the added value from digital initiatives. The

digital world is able to integrate into the entire O&G value-chain ecosystem to develop strong

innovations, and because of the enterprise-scale impact, the ability to integrate, connect, and

adapt becomes the key driver of success.

Upstream O&G exploration and production is a complex, multi-faceted business that

requires large coordination efforts to succeed. Upstream operations is the term that defines all

stages of finding, accessing, and producing oil and gas. This thesis primarily focuses on the

integration of digital platforms in the upstream O&G sector, and specifically around the increased

value from the various service types and methodologies. It will explore the details on each of

these subsystems to explain the infrastructure and software requirements, company and service

partnerships, as well as the financial, environmental, and safety value attained. The current

trends in digital portfolios in the upstream oil and gas sector are listed below.

• Real-time operational surveillance (drilling, completions, production, and facilities);

• Real-time edge computing (field advisory and automation);

• Cloud computing and data management for accessibility, visualizations, and analytics;

• Workflow integration and knowledge platforms (relational databases and mapping);

• Remote operating centers (remote advisory, control, and automation);


• Predictive maintenance (combination of sensors, IoT, and algorithms for anomaly

detection);

• Operational asset performance efficiencies;

• Automation and cognitive computing (combining physics-based models with analytical

reasoning);

• Subsurface seismic modeling and characterization (big data analytics for pattern

recognition);

• Multi-disciplinary optimizations (connecting models and historical operation data across

multiple disciplines);

• Production performance forecasting;

• Connected worker (IoT); and

• Supply chain management (smart contracts, logistics, inventory optimization).

The digital initiatives listed above all appear to be investment-worthy;

however, there are foundational infrastructure and capability requirements needed before an ROI

is achieved. Digital platforms generate value from the network effect – more data and entities

connected within the framework allows the algorithmic methodologies to more effectively

optimize value. The graphic in Figure 3 shows the investment sequential order to ensure the

foundational infrastructure requirements are available to enable the next order of digital

initiative or innovation. The graphic also represents the initiatives as having diminishing returns, as is

the case with most types of investments. However, given the current early stages of digital adoption, the

uncertainty of the realized value will most likely be increasing (potentially exponential

improvements) and not diminishing as companies begin developing machine learning,

cognitive computing, and optimizations. As these technologies mature within the O&G industry

there will be continued growth in applications and innovation to build additional operational

efficiencies. The initial three investment phases have been leveraged in the O&G industry for

decades with SCADA systems and well log monitoring. However, the volume and variety of data

and resources in that space are going to change dramatically. The data volume and variety are

where digital platforms are able to leverage their pattern recognition and anomaly detection for

advisory and automation control; as process complexity increases, it becomes increasingly

difficult for an engineer’s mental models to recognize these patterns and take action.

Figure 3: Digital Initiatives Capital Investment vs. Realized Value (Espinoza, Thatcher, and Eldred 2019)

Several interpretations from Figure 3 are highlighted below:

• Data quality is paramount and is the first step to enabling quality analytic techniques.

Data is the feedstock that aggregates into analytic models, and organizations should value

the data quality input as much as they value the decision output (a minimal quality-gate sketch follows this list).

• Data management is about capturing the five V’s of big data: volume, velocity, variety,

veracity, and value. Data management infrastructure is critical for data connectivity and

accessibility to algorithm methodologies and visualizations.

• Workflows are critical but are presented at a later stage because they require trust in the

technology. The ability to adopt new workflows and depend on automation techniques

largely depends on the level of trust associated with that technology. This means that the

implemented technology needs to have a history of model accuracy before an

organization will be willing to alter their business model.

• Holistic optimization is challenging to scale to an enterprise-level. Chasing this initiative

can have diminishing returns versus investing in higher quality platforms in the previous


(1-7) initiatives. However, once there is a clear path on how to manage scaled

optimizations, this will have significant savings and business improvements.
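
As a concrete illustration of the data quality and data management points above, the sketch below screens a batch of real-time drilling samples before it is handed to an analytic model. The channel names, physical ranges, and pass thresholds are assumptions made for the example; in practice they would be derived from sensor specifications and from the statistical accuracy the downstream model requires.

```python
# Illustrative sketch only: a minimal "quality gate" for real-time drilling samples.
# Channel names, valid ranges, and thresholds are hypothetical assumptions.
from dataclasses import dataclass

# Assumed plausible physical ranges per channel (WOB in klbf, RPM, ROP in ft/hr).
VALID_RANGES = {"WOB": (0.0, 80.0), "RPM": (0.0, 250.0), "ROP": (0.0, 500.0)}

@dataclass
class QualityReport:
    completeness: float  # share of expected channel values actually present
    validity: float      # share of present values inside their assumed physical range
    passed: bool

def assess_batch(samples, min_completeness=0.95, min_validity=0.98):
    """Score a list of {channel: value} samples with simple completeness/validity checks."""
    expected = present = in_range = 0
    for sample in samples:
        for channel, (low, high) in VALID_RANGES.items():
            expected += 1
            value = sample.get(channel)
            if value is None:
                continue
            present += 1
            in_range += low <= value <= high
    completeness = present / expected if expected else 0.0
    validity = in_range / present if present else 0.0
    return QualityReport(completeness, validity,
                         completeness >= min_completeness and validity >= min_validity)

if __name__ == "__main__":
    batch = [{"WOB": 22.4, "RPM": 120.0, "ROP": 85.1},
             {"WOB": 23.1, "RPM": 121.0}]  # missing ROP lowers completeness
    print(assess_batch(batch))
```

A gate of this kind makes the data-quality risk explicit: the example batch fails on completeness, so it would be flagged before it can skew a model’s output, which is exactly the "value the input as much as the output" discipline described above.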

The methodologies in Figure 3 are not specific to any O&G function and they can be

applied across all sectors and disciplines for a full end-to-end digitally influenced asset lifecycle

management. It is important to recognize that digital initiatives are only an enhancement to core

business value, and unless developing a new growth venture, digital tools only enhance

performance improvements for increased profitability. The importance of this statement is that

the goal is not to be “digital,” the goal is to enhance and improve operational performance with

better data-driven decisions, more efficient workflows, and process and machine automation

(“Refueling the Oil Industry: Transforming Traditional O&G with the Oil of the 21st Century - Red

Chalk Group” n.d.). Understanding and acknowledging these benefits from digital methodologies

does not mean your company is “digital.” It takes a cultural paradigm shift to view problems and

solutions from an information technology perspective (Oil&Gas Journal n.d.). It takes investments

in data structuring, data management, and standardization. And, most of all, it takes trust to

develop and learn from the digital methodologies employed to develop these business

efficiencies (World Economic Forum 2017).

1.1 Research Objective and Questions

The objective of this thesis is to explore the tradespace of digital initiatives within large

O&G companies to develop insight into which strategic platforms and methodologies are

providing realized value. Adopting new digital platforms and workflows can be a burdensome

endeavor for large O&G organizations, considering capital required for digital infrastructure,

platform development, employee development (competency and culture), external competency

resources, administration, and temporary productivity decline during transition and rollout.

Figure 4 outlines the roadmap for this thesis from hypothesis to recommendation. The

hypothesis is aligned with the theory of the modern “productivity paradox” (Silver 2012) where

new technology, coupled with hype and misdirection, creates a period of productivity decline. This

occurs during periods where technology growth outpaces the understanding of how to utilize

and process it. The transition to digital is undoubtedly complex, with cloud computing, platform

integration, sensor integration, complex algorithms, and data management. It is a challenging


endeavor to understand how each software algorithm or platform works, how and when to trust

the predictions, and what investments to make to lead toward bottom-line realized business

improvements. Ultimately, the hypothesis is that the O&G industry is currently mixed up in a

productivity paradox where an immense amount of energy and investments is directed toward

“getting to digital,” leading to tangential projects that fail to achieve a return on investment

and deviate business models away from the optimal path. The approach to understanding the

breadth of digital influence in O&G is outlined below.

1.2 Research Approach and Scope

The approach of this thesis is to provide a holistic, systems approach to evaluate the

integration of digital initiatives into the O&G industry. The objective is to evaluate the

interdependencies of the organizational system shift from traditional to digital workflows. The

scope and structure of this thesis is outlined below.

• Evaluate the O&G digital market.

• Identify digital initiatives currently adopted in industry.

• Evaluate digital initiatives from a systems perspective, including:

o Market potential value creation;

o Current and future digital state comparison;

o Technical and digital infrastructure requirements; and

o Organizational capabilities.

• Identify systemic problems or barriers to digital system integration:

o Organizational – People, processes, and culture;

o Standardization – Platform and data standardization; and

o Ecosystem – Full ecosystem integration (local versus holistic optimization).

The intention is not to identify all of the digital dead-ends, which would be difficult to

achieve because most companies avoid advertising failed initiatives. Instead, this thesis highlights

the programs and initiatives that have shown promising results in the O&G industry, and

evaluates the dynamics and characteristics that contribute to that success. The approach is to

provide guidance on the general direction for portfolio and organizational investments to achieve

operational and design improvements.


Figure 4: Thesis Roadmap

This thesis aims to evaluate and understand the following questions:

1. What are the current digital investments and digital trends employed in the O&G industry,

specifically to DC&I design and operations?

a. How does this current digital state compare to the future digital vision?

b. What are the critical organizational and technical system barriers to achieving the

transformation to impactful business digitization?

2. Is the O&G industry currently mixed up in a productivity paradox where investments are

directed only toward “getting to digital” and fail to contribute to added business value?

3. Is there a framework that can more effectively evaluate the business value and alignment

of a digital initiative with respect to how it integrates with an organization’s culture,

competency, capability, and vision?

The evaluation of the current digital initiatives in upstream O&G, and more specifically

around drilling, completions, and intervention (DC&I) operations, will leverage a systems

methodology to identify the systems, subsystems, interrelationships and dependencies. Systems

methodologies are best utilized for simplifying complex problems into understandable and


actionable entities. Emerging O&G digital technologies and organizational dynamics are

increasingly complex, and our mental models are challenged to conceptualize the full dynamics

of the system without employing systems principles (Meadows and Wright 2008). The influential

components of the integrated system are both multifaceted and relational, with tangible and

intangible dynamics, making the emergent properties unpredictable without the proper

evaluation techniques. A systems approach to this problem is necessary to visualize and

understand the underlying dynamics of the integration of disruptive technologies in an

organization.

The recommendation of this thesis is based only on the analysis performed in this

research, and, as a disclaimer, it is difficult to draw absolute conclusions about the performance

of digital initiatives. There may be methods or strategies that appear to make more

sense, but equating one company or algorithm quality to another is extremely difficult. The

general adage for models is “trash in, trash out,” which would not be the fault of the

technology, but potentially the fault of the user or the data quality. Nonetheless, despite the

uncertainties, this thesis aims to provide insight on the current digital portfolios of several O&G

companies, as well as insights into the techniques that are providing operational improvement.

The goal is to empower the investor to ask the right questions, challenge hype, and direct a digital

portfolio customized to the needs of the organization.

1.3 Outline of Thesis

Chapter 2: Digital Market in O&G

This chapter provides a background in digital and artificial intelligence applications within

the O&G industry, and aims to provide the value at stake for digital initiatives, as outlined

by published market research reports.

Chapter 3: O&G Digital Portfolios

This chapter reviews the digital portfolios for major O&G companies. The information for

the digital initiatives was captured from news, industry reports, and market research.

Chapter 4: Systems Approach to a Digital Portfolio

This chapter takes a system approach to understanding the current and future state of

digital initiatives within DC&I design and operations processes. The systems approach


evaluates the people, tools, and processes involved with bridging the gap from the

current to future state.

Chapter 5: Model Based System Engineering

This chapter briefly reviews model-based systems engineering (MBSE) in order to

evaluate how traditional MBSE methods integrate with new innovative digital workflows.

Chapter 6: Systems Approach to Digital Architecture

This chapter reviews the process flow of data throughout an organization from field

sensors to office analysis. This section reviews data standardization protocols, processing

speeds, and data requirements.

Chapter 7: Data Analytics in Well Design and Operations

This chapter reviews data analytic techniques that are being leveraged for specific design

and operation applications, as described in industry literature.

Chapter 8: Digital Platform Design

This chapter reviews all the principles outlined in the previous chapters to develop a

methodology to review and compare how digital initiatives align with the value and goals

of the organization to reach the specified future digital state.

Chapter 9: Economic Design for Uncertainty

This chapter describes economic evaluation techniques to include uncertainty and

Monte-Carlo models to assess the potential value of a digital investment in DC&I

operations.

Chapter 10: Collaboration Initiatives

This chapter reviews digital collaboration initiatives for an organization, which includes

discussions around the value of removing silos and improving collaboration efforts.

Additionally, the organizational value of socially oriented digital opportunities with social

network graphs and virtual conversational chatbots are explored.

Chapter 11: Conclusion

The conclusion summarizes the findings and proposes a recommended strategy for

evaluating digital initiatives in an organization. The conclusion also discusses future

research and the path forward.


2 Digital Market in O&G

The World Economic Forum developed a joint White Paper with Accenture titled “Digital

Transformation Initiative – Oil and Gas Industry (2017).” The report reviews the value creation

from digital trends and themes in the Oil and Gas industry. Four themes that play a critical role

in the digital transformation were identified as (1) Digital Asset Life Cycle Management, (2)

Circular Collaborative Ecosystem, (3) Beyond the Barrel, and (4) Energizing New Energies. While

all these themes are impactful, this thesis will mostly explore the digital asset lifecycle

management in upstream O&G. The applicable technologies are mapped in Figure 5, visualized

by Industry Impact ($Billion) versus Societal Impact ($Billion). The Societal Impact is measured by

the economic impact of emissions (CO2, SO2, NOx, and CO), reduction in water usage and oil

spills, time savings, and reduction in cost to customers. The plotted relationship is an insightful

metric to understand the overall financial value of the digital initiative and the bearing for future

sustainability.

Within the digital asset lifecycle management theme, “operations optimization” has the

potential to unlock the most value for the industry (approximately $275 billion), followed by

predictive maintenance. According to the World Economic Report, within the upstream sector,

90% of the value associated with “operations optimization” is projected to be realized primarily

by optimizing extraction (DC&I) and production operations. The value is expected to accrue from

leveraging and integrating the disparate, multi-disciplinary data sources to feed advanced

analytics algorithms in order to reduce non-productive time and enhance production

performance.
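
As a worked illustration of the non-productive time (NPT) metric referenced above, the short sketch below rolls an NPT percentage up from a coded rig-activity log. The activity codes and hours are invented for the example; operators would classify NPT against their own daily drilling report (DDR) codes.

```python
# Illustrative sketch only: roll up a non-productive time (NPT) percentage from a
# coded rig-activity log. Codes and hours below are invented for the example.
NPT_CODES = {"waiting_on_weather", "equipment_failure", "stuck_pipe"}

activity_log = [
    ("drilling", 14.0),
    ("tripping", 6.0),
    ("equipment_failure", 3.0),   # NPT
    ("casing", 12.0),
    ("waiting_on_weather", 1.0),  # NPT
]

total_hours = sum(hours for _, hours in activity_log)
npt_hours = sum(hours for code, hours in activity_log if code in NPT_CODES)

print(f"NPT = {100 * npt_hours / total_hours:.1f}% of {total_hours:.0f} rig hours")
# Prints: NPT = 11.1% of 36 rig hours
```

The analytics described above aim to shrink the NPT share of this roll-up by catching the underlying failure modes earlier than a manual review of daily reports would.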


Figure 5: Value-at-Stake for O&G Digital Initiatives, Adapted from (World Economic Forum 2017). [Chart plots Societal Impact ($ Billion) versus Industry Impact ($ Billion) for operations optimization, predictive maintenance, remote operations centers, connected worker, autonomous operations and robots, real-time supply/demand balancing, and consumer energy choices; bubble size represents total societal plus industry impact, with the societal impact of emissions valued at $97 per ton of CO2 (World Economic Forum 2017).]

According to a new market report from BIS Research titled “Global Artificial Intelligence

(AI) in Energy Market, 2019,” the global market for AI in energy is anticipated to reach

$7.79 Bn by 2024. This projection indicates that over the next 3-5 years, organizations will be

investing significantly in digital technologies. However, this also means that there will be many

new entrants into the market that will aim to capitalize on the new investment trends. Accenture

performed a global survey titled “Accenture Upstream Oil and Gas Digital Trends Survey 2019”

where they surveyed 255 upstream leaders in 47 countries regarding digital technology

investments (Holsman, n.d.). The survey identified five trends in the AI market in energy that are

summarized below with added discussion points.

1. Digital investments are a key enabler to upstream business success, and investments

will continue to increase.

The biggest driver of digital investments indicated in the Accenture survey is the fear of losing competitive advantage. Porter’s Five Forces method is useful for analyzing the competitive risk of any business or industry. These forces determine the competitive intensity, and thus the attractiveness, of an industry in terms of profitability. Understanding


these Five Forces in O&G is key to understanding the driving influences behind the digital

transformation initiative. Porter’s Five Forces are Competitive Rivalry (revenues, profits,

ROIC), Threat of New Entrants (financial, regulatory, or geographical barriers), Threat of

Substitutes (solar, nuclear, hydrogen, biofuels), Bargaining Power of Buyers (refineries,

local, international, customers), and Bargaining Power of Suppliers (global demand

patterns, OPEC, geopolitics, climate governance) (“Porter’s Five Forces Model for Oil and

Gas Industry – Energy Routes” n.d.). AI will have the largest influence on Competitive Rivalry, with analytics and automated workflows increasing revenue and profits, and on the Threat of New Entrants, as experience carries less weight relative to innovative design and operation advisory algorithms, empowering smaller companies to compete.

2. Cybersecurity leads digital investments today, followed by Cloud, Big Data Analytics,

AI/ML, and IoT, respectively.

According to a market report published by Transparency Market Research, the global oil

and gas data monetization market is expected to reach $81.6 Bn by 2026. This value is

generated from the selling and trading of large volumes of data to drive value from

advanced data analytic solutions and platforms to improve asset productivity. Operators currently store vast amounts of unused operational data in their historians. While recognizing the value of this data for developing digital model improvements that can be leveraged internally and across the industry, companies are fearful of losing their intellectual property (IP) to digital platform service providers. As digital partnerships consolidate into single platforms, IP has become a central industry concern regarding the property rights to data.

3. Digital helps to optimize core business with cost reductions by making faster and better

decisions.

The integration of disparate, multi-functional data sources to feed machine learning and other algorithms for visualization, advisory, and automation is entirely focused on improving decisions and optimizing productivity. Additionally, open-source platforms

create enhanced workflows that reduce cost and time for design and development. This


value contribution over the end-to-end asset lifecycle of O&G operations must continue

to progress for the digital evolution to endure. Although digital aims at improving the core business and opening up new venture opportunities, it also disrupts how organizations conduct business. Digital adoption must coincide with employee

development of skills and capabilities; there must be specific roles allocated in the

organization for digital integration, and engineers need to collaborate better with IT to

build a data-driven culture.

4. Full value from digital is not being realized due to challenges with scaling.

The reference to hype does not mean that digital technologies or machine learning algorithms do not work per se; in fact, the opposite is true. Hype is frequently fueled by reports, presentations, or articles that highlight a technology delivering unique and promising functionality, but often at a small scale. The significant challenge that industry is addressing is how to scale these AI tools from proof-of-concept to the enterprise level. Scaling beyond a skunkworks pilot is where most digital platforms and initiatives fail. Organizations are highly complex sociotechnical

systems, and designing digital platforms and solutions to integrate with that complex

environment is critical for success.

5. External skills and partnerships are key to unlocking the value of digital.

The O&G industry has traditionally been siloed, both among operators and among oilfield service providers (OFS). The fear of losing IP and market share has influenced the business philosophy of developing internally rather than externally. Operators and OFS are now entering joint partnerships, both with each other and with IT companies, to further develop their digital portfolios. The most critical adaptation for unlocking the value of digital has been open platforms that integrate expert services from their respective providers through standardized OpenAPI protocols (a minimal client-side sketch follows this list). This means that digital platform providers are not competing in the digital analytics and algorithms business model, but are instead developing adaptable workflow platforms for AI/ML services (C3.ai, DELFI, etc.), physics-based modeling services (Prosper, OLGA, Landmark, PIPESIM, etc.), and other modular services to create a single source of truth. This is where the breadth of impact feeds into operations, design, and workflow optimization.
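As a concrete illustration of what such interoperability looks like from the consumer side, the sketch below shows a Python client posting a job to a hypothetical REST endpoint of the kind an open platform would describe in its OpenAPI specification. The host, token, endpoint path, and payload fields are assumptions made purely for illustration.

```python
# Minimal sketch of a client calling a hypothetical REST endpoint exposed by an
# open digital platform. The URL, token, and JSON schema are illustrative
# assumptions; a real service would publish these in its OpenAPI specification.
import requests

BASE_URL = "https://platform.example.com/api/v1"   # hypothetical platform host
API_TOKEN = "replace-with-a-service-token"

payload = {
    "well_id": "WELL-0042",                        # illustrative identifiers only
    "service": "hydraulics-model",
    "inputs": {"flow_rate_gpm": 650, "mud_weight_ppg": 12.5},
}

response = requests.post(
    f"{BASE_URL}/model-runs",
    json=payload,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=30,
)
response.raise_for_status()                        # fail loudly on HTTP errors
result = response.json()
print(result.get("status"), result.get("outputs"))
```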

The last part of this section, before reviewing digital portfolios, will address the

pervasiveness of digital applications in O&G. Figure 6 shows the different areas of digital

presence in their respective order of industry investment. The colored bars represent the

variance from the benchmark outlined in the World Economic Forum's “Digital Transformation Initiative” as discovered by the research presented in this thesis. The key point related to this graphic is that big data processing, management, and analytics continue to be the top investment for O&G companies. As an overlap, cloud services have been employed to make that process easier (Microsoft Azure being the primary partner for most O&G companies) and to provide interoperable data management services that handle the complex processing, cleaning, and revitalization of data as a third-party initiative.

As data quality improves with cloud computing and big data management, artificial intelligence and machine learning algorithms are able to use the data to develop complex anomaly and pattern detection for improved design and decision-making. Collaborative tools are

on the rise with companies employing virtual assistants and building shared platforms to improve

communications. Digital platforms are integrating into collaborative environments that enable

open applications and connectivity with all involved stakeholders and analytic tools in one system

of truth. Robotics and automation have long been the end goal; however, progress in automation is plagued by trust and standardization issues. As digital capabilities evolve, robotics will play a critical role in optimizing and maximizing

productivity. There is still movement in the wearables and connected worker space, along with

virtual and extended realities for training, but this has largely taken a backseat to big data and

analytics investments.


Figure 6: O&G Investments in Digital Technology, Adapted from (World Economic Forum 2017)


3 O&G Digital Portfolios

The first step to understanding the current value drivers in the O&G digital market is to

identify the companies and services that are making moves and developing partnerships

throughout the industry. This section provides a brief list of the major investments and

partnerships that have been discussed in recent news. The portfolio breakdown includes

investments and integration initiatives from Shell, BP, Chevron, ExxonMobil, Equinor, and Saudi

Aramco. Because it is difficult to discover this information without access to internal strategies, this list is not exhaustive and is only representative of the transparency provided in publicly available sources. The portfolio analysis is followed by a review of digital investments as applied to real-

time operational data, specifically to the oil extraction phases (DC&I). The following trends were

identified during the review of the major IOC’s digital portfolios:

1. Microsoft (MS) Azure is the comprehensive cloud-based platform of choice for the industry. It

is an enterprise-wide platform that facilitates the data pipeline infrastructure. MS Azure

is robust, adaptable, and scalable, with a variety of powerful digital opportunities.

Microsoft has entered into partnerships to connect with various digital service providers

to create a single source architecture solution.

2. Data management systems are being leveraged to revitalize, clean, and orchestrate data

through the Azure pipeline. Services like Databricks and SEEQ are designed to integrate

historians and develop workflows for dormant data, for use in analytic algorithms. As

discussed above, the quality and volume of data are paramount for higher-level analytics

to provide value. O&G companies are focused on generating value from their current data

repositories, but this is a challenging effort due to the lack of historical standardization

with data structuring. In the current evolutionary stage of digital in O&G, the data management space may be critical for early market capture, because revitalizing existing data, rather than waiting for enough new, high-quality data to arrive from operations, will expedite the digital journey toward actionable success.

3. Open platforms that integrate any service or analytic tool through OpenAPI specifications

for REST APIs are the current trend for future digital sustainability. OpenAPI standards


allow for single truth workflow platforms that are interoperable with any software that

has developed that interface capability. This allows both digital service providers and

operators to gain flexibility with custom development and to freely leverage any new

software or algorithm packages into the platform. The trends for digital platforms and

initiatives are to enable cross-functional and cross-company collaboration and

coordination. This is the foundational principle of the FutureON platform for the subsea

workflow development, which is similar to Equinor’s Omnia. Most new services, including

real-time drilling advisor systems, can connect via REST APIs in order to integrate into business and data workflows.

4. Companies are exploring big data analytics in seismic and reservoir characterization,

which has the potential to unlock discoveries for new reservoir plays. This is probably the

highest data density discipline within an O&G organization, which is a perfect

environment to test big data analytic techniques. However, having enough quality data

and computational power for revolutionary reservoir characterization for oil discoveries

is likely a few years away. The current strategy is to develop characterizations and connect

geological data into workflow platforms for enhanced design and development.

Additionally, reservoir properties can be fed into Digital Twin models to forecast well extraction and production performance.

5. Companies are investing in predictive maintenance to connect and analyze equipment

performance. GE Proficy is measuring electrical current data to train algorithms for

anomaly detection and for forecasting maintenance performance on rotating equipment.

Additionally, O&G companies have shown recent success in combining sensor connectivity with machine learning to forecast, predict, and customize maintenance plans and, by extension, supply chain strategies (an anomaly-detection sketch follows this list).

6. Processing real-time data to create actionable operational decisions is influencing

significant changes within the industry. Real-time sensor data is mapped to mathematical

equations that define the state of the operation. For example, weight-on-bit, revolutions per minute, and torque can inform the driller of the mechanical specific energy (MSE) of the drilling operation (a minimal MSE calculation is sketched after this list), and, based on the bottom-hole-assembly parameters, this can advise on the optimal drilling strategy. Companies are leveraging

offset data and lithology data to integrate into the real-time advisory platform, where

performance and drilling events can be forecasted – this starts entering the realm of a

digital drilling twin. Additionally, an advisory program is only as powerful as the trust and

discipline of the operating company to follow the recommendations. Companies are

developing ways to integrate automation into the workflow. NOV’s NOVOS is a platform

that allows for automation on repetitive tasks like drilling or tripping but has the ability to

adapt to new circumstances. Automation is a big challenge for industry to address in the

O&G operation spectrum, as each system can have different control packages and

elements associated with the equipment. The burden here falls largely on the rig contractors to standardize machine control packages; until then, customizing software for each rig platform remains an arduous task.
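The mechanical specific energy referenced in item 6 can be computed directly from surface parameters using Teale's relation. The sketch below is a minimal Python implementation with illustrative input values; a field system would typically add corrections for downhole torque, bit hydraulics, and motor output.

```python
# Minimal sketch of a mechanical specific energy (MSE) calculation of the kind a
# real-time advisory system might run on surface sensor data (Teale's relation,
# in psi). All input values below are illustrative assumptions.
import math

def mechanical_specific_energy(wob_lbf, rpm, torque_ftlbf, rop_ft_hr, bit_diameter_in):
    """Return MSE in psi from surface drilling parameters."""
    bit_area_in2 = math.pi * (bit_diameter_in ** 2) / 4.0
    axial_term = wob_lbf / bit_area_in2
    rotary_term = (120.0 * math.pi * rpm * torque_ftlbf) / (bit_area_in2 * rop_ft_hr)
    return axial_term + rotary_term

mse = mechanical_specific_energy(wob_lbf=25_000, rpm=120, torque_ftlbf=8_000,
                                 rop_ft_hr=90, bit_diameter_in=8.5)
print(f"MSE is approximately {mse:,.0f} psi")
```

For the predictive maintenance trend in item 5, the sketch below shows one common way to flag anomalous equipment sensor readings with an isolation forest. The synthetic current-draw data and the choice of algorithm are assumptions for illustration and do not represent any particular vendor's product.

```python
# Minimal sketch of anomaly detection on rotating-equipment sensor data.
# The synthetic "current draw" signal and IsolationForest settings are
# illustrative assumptions only.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=7)
normal_current = rng.normal(loc=40.0, scale=1.5, size=(500, 1))   # healthy pump, amps
drifted_current = rng.normal(loc=48.0, scale=4.0, size=(10, 1))   # degrading readings
readings = np.vstack([normal_current, drifted_current])

model = IsolationForest(contamination=0.02, random_state=7).fit(readings)
labels = model.predict(readings)          # -1 = anomaly, 1 = normal
print(f"Flagged {int((labels == -1).sum())} of {len(readings)} readings as anomalous")
```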

Table 1 through Table 9 provide a detailed summary of O&G industry digital initiatives

that were identified through interviews, news articles, professional papers, and market research.

The organizations included are Shell, BP, Chevron, ExxonMobil, Equinor, Saudi Aramco, and Aker

BP. The summary provides a framework of historical digital adoption and of the direction in which large O&G companies are heading with the digital transformation. The intention of this work was to

identify key players and initiatives that were already creating an impact in the O&G industry, and

with this knowledge, to be able to identify digital themes that are reported as generating realized

value. Table 10 provides a digital initiative summary with respect to DC&I operations, and Table

11 summarizes the initiatives into functional categories of data storage, processing, analytics,

and visualization. Note that in Table 11 the arrows signify potential cross-functional capabilities

as most digital initiatives perform in multiple areas, but they are listed in the table with respect

to their core functionality.


3.1 Shell

Table 1: Shell’s Digital Portfolio Initiatives 1 of 2

Shell (company overview). Shell has been at the forefront of digital innovation and initiatives in the oil and gas industry. Shell has developed and incorporated a diverse set of digital platforms into its portfolio, including machine learning, computer vision, virtual assistants, robotics, and advanced data management and workflows, all to enhance operational efficiencies. Shell GameChanger is a program that works with start-ups and early-stage innovative ideas that have the potential to create a future impact in the energy industry; the autonomous team within Shell invests in novel ideas throughout the globe, from ideation to proof of concept, with a focus on longer-term opportunities. The group has existed since 1996. Shell Technology Ventures (STV) is the venture capital branch of Royal Dutch Shell, designed to enhance the development of new technologies that have the potential to create deployment value in Shell. Shell TechWorks is an innovation center created in 2013 in Boston, MA to build an entrepreneurial environment for advanced, short-term product development capabilities; the group consists of teams with backgrounds outside the energy industry to provide a unique perspective and more agile problem-solving methodologies.

Digital Initiatives:

Microsoft Azure – Microsoft Azure is a comprehensive cloud platform service (PaaS) that provides enterprise-scale computation, analytics, storage, and networking opportunities. Azure is also extremely diverse and flexible – the platform allows the flexibility to use preferred tools and technologies. Shell uses Azure as its cloud-based enterprise data pipeline infrastructure.

C3 IoT – Shell partnered with C3 IoT in 2018 to develop its AI platform at a global scale by deploying it on Microsoft Azure. C3 IoT offers a broad set of AI applications (i.e., ML, NLP, IoT, etc.) that Shell intends to implement both upstream and downstream, providing analytics in predictive maintenance, supply chain optimization, sensor networks, energy management, and more.

MAANA – MAANA is a knowledge-graph-based data processing platform. Shell partnered with MAANA in 2017 to (1) analyze corrosion-related concerns due to the impact of crude oil selection on refinery equipment integrity, and (2) support HES risk applications designed to help Shell identify operational incidents, with the objective of improving understanding and preventing recurrences.

Quantico Energy Solutions – Shell invested in Quantico Energy Solutions in 2015 to develop innovation in horizontal shale logging, drilling, and fracking. Quantico offers subsurface artificial intelligence with data insights generated from its solutions: QRes, QLog, QDrill, QFrac, and QResXAI. The platform leverages improved subsurface acquisition techniques to optimize drilling, fracking, and production.

Azure Databricks – Azure Databricks is an Apache Spark-based analytics platform designed for Microsoft Azure. The Databricks platform extracts landed data from multiple sources (i.e., Azure Data Lake, Azure Data Warehouse, etc.) and feeds this data into the analytic workflow – Power BI, Deep Learning/ML, and other applications. This is a critical application in Shell's digital workflow.

Alteryx – Shell began using the analytics platform Alteryx in 2014 and has worked closely with the company to develop products like Alteryx Connect and other advanced platforms to further its digital capabilities. Alteryx Connect is a platform for managing data with an emphasis on accelerating insights within the organization. Additionally, Alteryx created a “New Well Portal” for managing E&P data.


Table 2: Shell’s Digital Portfolio Initiatives 2 of 2

References:

(“Shell’s Companywide AI Effort Shows Early Returns - C3.Ai” n.d.; “Shell Announces Plans to Deploy AI Applications at Scale - CIO Journal. - WSJ” n.d.;

“Shell Selects C3 IoT as Strategic AI Software Platform | Business Wire” n.d.; “Artificial Intelligence Has Royal Dutch Shell ‘Super Excited’ | Investor’s Business

Daily” n.d.; “The Incredible Ways Shell Uses Artificial Intelligence To Help Transform The Oil And Gas Giant” n.d.; “Pursuing Advanced Analytics at SHELL -

Alteryx Community” n.d.; “Shell, Know Thyself!” n.d.; “Maana and Shell to Co-Present on How to Accelerate Digital Transformation with the Maana

Knowledge Platform at Forrester’s Digital Transformation Forum | Business Wire” n.d.; “Shell GameChanger – A Safe Place to Get Crazy Ideas Started |

Management Innovation EXchange” n.d.; “JPT Shell’s Well Pad of the Future Is Open for Business” n.d.)

Shell – iShale: Shell initiated iShale in 2016 to develop a vision for the shale field of the future. The information below reflects both the Shell iShale website home page and a Journal of Petroleum Technology (JPT) article.

Shell iShale:
• Increasing automation in drilling and frac operations to reduce human exposure and increase safety.
• Building fit-for-purpose (modular) facilities that can be scaled with production needs.
• Constructing ‘Smart’ wells with onsite wireless instruments for better monitoring.
• Putting in place surveillance that is driven by ‘exceptions’ to standard conditions for efficient operations.
• Establishing central operating centers for production data analysis and cross-portfolio learnings.
• Building a fluid ‘organization of the future’ enabled by a digitally connected and broadly skilled workforce.

JPT – “Shell’s Well Pad of the Future Is Open for Business”:
• Wireless controls and process automation to reduce well pad construction and instrumentation costs
• Digital reporting to optimize vendor communications and reduce inefficient reporting during development and construction operations
• Multiphase meters on well pads to minimize work required for routine well tests and reduce HSE exposure
• Simplified well pad design to remove pad separators and enable separation at central processing facilities
• Multifunction central processing facility to reduce site storage requirements at central processing facilities
• Remote sensors and analytics to minimize production deferment and remotely detect incidents
• Exception-based ways of working for efficient operator routes and surveillance
• Mobile personal productivity tools for faster decision making and work efficiency
• Failure analysis and advanced analytics to enable system-wide optimization and reduce cost

Digital Initiatives:

Udacity – Udacity is an online education program that offers “nano-degrees” in various data analytics programs, including data science, artificial intelligence, cloud computing, etc. Shell has allowed employees to volunteer for a preferred education program and to take the courses at their own pace, enabling Shell to enroll thousands of employees to build digital competencies.


3.2 BP

Table 3: BP’s Digital Portfolio Initiatives

BP (company overview). BP Ventures is the venture capital branch of BP that identifies and invests in game-changing technology applicable to any aspect of the energy industry. The group aims to discover enabling technology that supports the balance between reducing carbon emissions and meeting growing world energy demand. The BP Center for High-Performance Computing (CHPC) in Houston is one of the most powerful commercial research computers in the world; the research center opened in 2013 and has been used for high-performance seismic image processing and rock physics research to support exploration, appraisal, and development plans, and supercomputing has also been used for fluid dynamics research for refinery and pipeline optimization in the downstream business. BP's Field of the Future (SPE 112194) was a program from 2004-2008 that installed 1,800 km of fiber optic cable, increased real-time tags by almost two million, and created twenty Advanced Collaborative Environments (ACE) to improve drilling and production performance; initiatives included advanced sand monitoring, production visualization/optimization, and remote performance management, and BP reported 1-3% improvements in all areas across the 20 fields.

Digital Initiatives:

Belmont Tech. – BP Ventures invested $5MM in Belmont Tech. in 2019 to further BP's digital portfolio. Belmont is a cloud-based geoscience platform that includes specially designed knowledge graphs. The platform connects geology, geophysics, reservoir, and historical data into unified relational information. Nicknamed “Sandy,” this platform is expected to unlock crucial subsurface data.

Kelvin Inc. – Kelvin Inc. is an integrated platform for intelligent control, with a vision of connectivity and analytics around sensor control of physical systems. Kelvin Inc. outfitted BP's wells in Wamsutter, WY with various arrays of sensors to gather and transmit data for optimization simulations – with realized performance improvements in venting, production, and costs.

Beyond Limits – Beyond Limits is focused on cognitive computing and how it can be applied to the oil and gas industry. The ultimate vision is to learn cognitive decision-making logic in order to act as an assistant that solves problems through reasoning as a team/human enhancement, including logic reasoning across multiple disciplines. BP partnered with Beyond Limits in 2018.

Azure Machine Learning – Microsoft unveiled the Azure Machine Learning service on its cloud services platform in 2018. The goal is to provide an end-to-end machine-learning pipeline capable of developing predictions from any landed data set. This technology reduces the work required of a data scientist, as it runs the various variable combinations that were previously tested manually.

Microsoft Azure (cloud-first) – BP has been on a cloud-first mission in which the company aims to eliminate the single-datacenter strategy. BP has been steadily reducing the overhead associated with database management, with Microsoft Azure providing the solution for both cost and flexibility around innovation, scale, and applications.

RigNet Intelie Live – RigNet was selected to provide Intelie Live across BP's drilling fleet in 2019. BP intends to use Intelie Live in its Remote Collaboration Center to improve operations efficiency. The machine learning platform will be leveraged to process and map all the real-time data elements across multiple systems during operations.


References:

(“BP Technology Outlook 2018,” n.d.; “JPT: BP and Startup Beyond Limits Try To Prove That Cognitive AI Is Ready for Oil and Gas | beyond.Ai” n.d.; “BP Has

a New AI Tool for Drilling into Data – and It’s Fueling Smarter Decisions | Transform” n.d.; “BP’s New Oilfield Roughneck Is An Algorithm” n.d.; “‘Sandy’

Joins the Dots for BP” n.d.; “BP Upgrades Houston HPC, World’s Most Powerful Corporate Supercomputer - DCD” n.d.; “BP Supercomputer Now World’s

Most Powerful for Commercial Research” n.d.; “BP Invests in AI to Focus on Digital Energy|Powertech Review|” n.d.; “Microsoft Customer Stories: BP

Embraces Digital Transformation and the Cloud to Disrupt the Energy Industry” n.d.; “Microsoft Customer Stories: BP Explores Azure AI to Boost Safety,

Increase Efficiency, and Drive Business Sucess” n.d.; “Microsoft Customer Stories: BP Adopts Hybrid Cloud and Moves Applications and Datacenter

Operations to Azure” n.d.; “BP Invests in Chinese AI Energy Management Tech Specialist R&B | News and Insights | Home” n.d.; “Xpansiv Continues Its

Transformation of Commodity Value in Global Markets; Announces Strategic Investment from BP Ventures, Reflective Ventures, and S&P Global” n.d.;

“Xpansiv Completes Strategic $10M Series A Funding Round with Investments from BP Ventures, Avista, S&P Global, and Energy Innovation Capital” n.d.;

“RigNet Signs Strategic Agreement with BP for Intelie Live | RigNet” n.d.)


3.3 Chevron

Table 4: Chevron’s Digital Portfolio Initiatives 1 of 2

Chevron (company overview). Chevron has a robust digital transformation strategy compared to most competitors in the oil and gas industry. The accelerated digital portfolio includes industry-transforming collaborations, particularly a seven-year partnership with Microsoft Azure and a new collaboration between Chevron, Microsoft, and Schlumberger (DELFI). These multi-party collaboration efforts with both Silicon Valley and key O&G service providers separate Chevron's digital strategy from the rest. Digital initiatives have proven that collaboration, standardization, and data accessibility are critical for impactful operational enhancements, and Chevron is following these principles both within the organization and within the entire O&G industry. Chevron Technology Ventures (CTV) was created in 1999 and is Chevron's venture capital branch that pursues innovations through start-ups and technology developments with the potential to impact the current or future oil and gas industry. CTV has partnered with many innovation and entrepreneurial hubs to champion technology development, including Houston Exponential, The Cannon, Chevron Innovation Lab, Fab Foundation, and the Catalyst Program.

Digital Initiatives:

Microsoft Azure – Chevron signed a partnership with Microsoft in 2017. Microsoft Azure is extremely diverse and flexible – the platform allows the flexibility to use preferred tools and technologies. Chevron uses Azure as its cloud-based enterprise data pipeline infrastructure, which includes opportunities with other Microsoft services: Data Lake, Machine Learning, IoT Edge, etc.

DELFI (Schlumberger) – DELFI is a collaboration between Schlumberger, Chevron, and Microsoft. DELFI is a collaborative cloud-based AI platform that connects E&P lifecycle data in the Azure Cloud. The partnership was created in 2019, and the solution is intended to integrate the full E&P asset lifecycle, including exploration, design, and operations.

MAANA – MAANA is a knowledge-graph-based data processing platform. Chevron partnered with MAANA to develop semantic and analytic models that support better decision making at an enterprise scale. MAANA's Knowledge Platform develops relationships between design/operational data and human expertise to create knowledge representations for enhanced decision making.

SeeQ – SeeQ is a data processing platform that expedites the accessing, cleaning, and modeling/reporting of the massive amounts of time-series data built up and stored in historians (or other storage platforms). SeeQ integrates disparate data from multiple historians into a single source, where data analytics and even third-party customization can be developed.

Moblize – Moblize is a cloud-based big data analytics platform that aims to simplify the entire end-to-end lifecycle process across drilling (ProACT), completions (ProFRAC), rig performance (ProINSIGHTS), and data sharing (ProWISE). In 2016, it was reported that Chevron had utilized ProACT on 7,000 unconventional wells. The platform aims to improve data process workflows.

Worlds – In early 2020, Chevron invested in Worlds from Sensory Sciences, which builds extended reality models for “active physical analytics (XR).” Worlds creates a 4D platform environment to enable teams to understand automation, efficiency, and improved safety opportunities.


Table 5: Chevron’s Digital Portfolio Initiatives 2 of 2

References:

(“JPT Chevron, Schlumberger, Microsoft Team To Improve Digital, Petrotechnical Work Flows” n.d.; “JPT Seeq’s Focus on Time-Series Data Draws in

Chevron, Shell, and Pioneer” n.d.; “Houston-Based Chevron Technology Ventures Makes Investments in Carbon Capture and Spatial Artificial Intelligence -

InnovationMap” n.d.; “Meet Snake Arm: Robot Technology That Saves Lives — Chevron.Com” n.d.; “Chevron Finds Data Recovery as Hard as Oil Recovery

- Taps Panzura Controllers for Cloud-Based Storage Solution” n.d.; “Worlds - Hypergiant” n.d.; “Industrial AI Company, Veros Systems, Closes $4.3 Million

in Series B Funding” n.d.; “Moblize | Moblize Achieves Huge Milestone: 7,000 plus Wells - Moblize” n.d.; “Bringing AI To Data Analytics And Knowledge

Management: Startups Anodot And Maana Snag New Financing” n.d.; “Seeq Secures $23 Million Series B to Fuel IIoT Advanced Analytics Growth Strategy

| Seeq” n.d.; “The Amazing Technology Disrupting Oil and Gas (Think: AI, 3D Printing and Robotics)” n.d.)

Chevron – Digital Initiatives (continued):

Panzura Freedom™ – Panzura Freedom™ is a complete cloud solution that offers deployable, enterprise-scale data storage and accessibility and advertises “unprecedented cloud performance with military-grade security.” Chevron has used Panzura Freedom™ to revitalize large-volume archives (specifically seismic and geological data) for real-time data processing and analytics.

Veros Systems – Chevron partnered with Veros Systems in 2018. Veros Systems is an artificial intelligence company that utilizes a novel algorithm for capturing high-resolution electrical waveforms to monitor industrial machine health and performance and to predict failures. The platform can be either stand-alone or embedded in a cloud platform.


3.4 ExxonMobil

Table 6: ExxonMobil's Digital Portfolio Initiatives

References:

(“ExxonMobil Is Optimising Oil and Gas Operations with Microsoft” n.d.; “World Oil Newsroom: ExxonMobil Awards License to EON Reality for Immersive

3D Operator Training Simulator Technology - EON Reality” n.d.; “New SMART Procurement Platform | ExxonMobil” n.d.; “Oil and Gas Engineering | L&T

Technology Services” n.d.)

ExxonMobil (company overview). ExxonMobil is leading the way for the oil and gas industry in developing a data-driven focus to improve efficiencies in design and operations. Most of the focus uncovered in this research is around the partnership with Microsoft and the enabling benefits in the Permian Basin. The Microsoft Azure platform offers capabilities in all aspects of the data capture, filtering, processing, analyzing, and visualizing ecosystem. Exxon has built partnerships to support the full data pipeline, with some focus on revitalizing geoscience historians (similar to other operators), but the company has also built data analytics platforms from within ExxonMobil. Its Drilling Advisory System (DAS) algorithm is a rig-based drilling-surveillance program that leverages real-time drilling data along with rock mechanics and analytic algorithms to forecast and advise on drilling and completion operations. ExxonMobil has demonstrated a “build” philosophy in the strategic digital decision of “buy, build, or outsource” for innovative platforms – it will be interesting to see how its Permian performance compares against companies with a more “partnership/outsource” philosophy.

Digital Initiatives:

L&T Technology Services (LTTS) – ExxonMobil partnered with L&T Technology Services in 2018 to convert historical geoscience data into automated utilities and applications. LTTS is an engineering services company with a long history in the oil and gas industry – it advertises building enterprise accelerators around digital twins, robotics, worker safety, and drill automation.

EON Reality Inc. – ExxonMobil partnered with EON Reality Inc. in 2015 to develop 3D and 4D (VR/XR) simulations and training for operational efficiencies and workforce safety. The commercial license was awarded for developing an immersive-reality operator training simulator to prevent incidents and to practice response strategies.

SMART by GEP – SMART by GEP is a unified procurement cloud platform that creates a single integrated structure for both parties to collaborate. The partnership was created in 2018 and includes supplier management, bidding, terms and negotiations, as well as procurement tracking.

XTO Energy / Microsoft Azure – XTO Energy (an ExxonMobil subsidiary) partnered with Microsoft Azure to enhance operational efficiencies in the Permian Basin. The cloud platform is being utilized to collect and process real-time operational data for improved decision making, as well as to offer the flexibility to incorporate third-party solutions that are already being leveraged in the Permian.


3.5 Equinor

Table 7: Equinor’s Digital Portfolio Initiatives

References:

(“Meet Omnia- the Statoil Data Platform That Enables Our Digital Roadmap” n.d.; “GitHub - Equinor/OmniaPlant: Documentation on How to Get Started

Building Industrial Applications and Services by Using Omnia Plant Data Platform” n.d.; “Equinor Taps FutureOn for Cloud-Based Offshore Data Visualization

Software” n.d.; “Equinor Will Broadly Implement Ambyint’s IoT Solution to Optimize Production in North Dakota - Equinor.Com” n.d.; “FutureOn for Cloud-

Based Offshore Data Visualization Software” n.d.; “GE Introduces Proficy SmartSignal Shield 4.0 | Business Wire” n.d.; “GE SmartSignal Classic” n.d.)

Equinor – Digital Initiatives:

Omnia Plant Data Platform – Omnia is Equinor's platform for accessing and processing industrial data from all of Equinor's operations. Equinor is taking an “API-first” approach and has standardized its platform sharing using OpenAPI. This platform is the backbone of Equinor's digital roadmap and enables applied analytics for its Integrated Operations Center (IOC) and Digital Twin/AR/VR efforts.

FutureOn – Equinor partnered with FutureOn in 2018. FutureOn is a cloud-based digital platform enabling workflow visualization and next-generation project planning and collaboration. Equinor licensed its Field Activity Planner (FieldAP); FutureOn offers two other platforms, DigitalTwin and SubseaAlliance. The tool is designed for advanced project and engineering decision making.

Microsoft Azure – Equinor entered a seven-year partnership with Microsoft cloud services in 2018 to continue developing new digital solutions for its oil and gas assets.

GE Proficy SmartSignal – Equinor partnered with GE for Proficy SmartSignal, which provides actionable warnings of industrial equipment anomalies and diagnostics. The software tracks the historical data of equipment to develop a normal operating model and then compares real-time sensor readings to the current and predicted performance of the equipment.

Ambyint – Equinor Technology Ventures invested in Ambyint in 2017. Equinor is using Ambyint's AI-powered artificial lift and production optimization software in its Bakken shale assets. The technology includes cloud-based AI and edge computing, along with an “autonomous set-point management system” to optimize well performance.


3.6 Saudi Aramco

Table 8: Saudi Aramco’s Digital Portfolio Initiatives

References:

(“FogHorn, Stanley Black & Decker, Saudi Aramco and Linde Highlight the Value of Edge Intelligence at the ARC Industry Forum” n.d.; “How Saudi Aramco

Is Digitalising Its Operations - Products & Services, Digitalisation, Digital, Saudi Aramco, 4IR, Fourth Industrial Revolution, AI, Drones, VR, AR - Oil & Gas

Middle East” n.d.; “Corporate VC Arms of Saudi Aramco and Chevron Invest in $24M Round for Seattle Startup Seeq - GeekWire” n.d.; “Saudi Aramco

Energy Ventures Invests in Norwegian Artificial Intelligence Software Provider Earth Science Analytics | Business Wire” n.d.; “Data Gumbo Blockchain

Expands from Oil to Geothermal Drilling in Asia - Ledger Insights - Enterprise Blockchain” n.d.; “Data Gumbo Secures $6M in Series A Funding from Venture

Arms of Leading International Oil & Gas Companies | Business Wire” n.d., 44)

Saudi Aramco – Digital Initiatives:

Earth Science Analytics – Saudi Aramco invested in the Norwegian AI software provider Earth Science Analytics in 2019. The goal of the platform (EarthNET) is to perform rock and fluid analytics and predictions to improve oil and gas exploration and production. The platform provides geoscience-driven data analytics as well as database and workflow solutions.

FogHorn – FogHorn is an AI developer of edge computing technology for IIoT, focused on real-time onsite intelligence with edge computing for centralized connectivity and analytics of industrial sensors and equipment. Saudi Aramco used FogHorn solutions to monitor and reduce stack flaring in its gas refineries, but the technology has a broad spectrum of applications.

Data Gumbo Corp. – Data Gumbo is a Houston-based company that developed a Blockchain-as-a-Service (BaaS) platform for industry smart contracts. Smart contracts build transparency by connecting contract specifications with physical transactions for all stakeholders involved, and they enable real-time detection of anomalies or deviations that can be quickly corrected.

SeeQ – SeeQ is a data processing platform that expedites the accessing, cleaning, and modeling/reporting of the massive amounts of time-series data built up and stored in historians (or other storage platforms). SeeQ integrates disparate data from multiple historians into a single source, where data analytics and even third-party customization can be developed.


3.7 Other Operators

Table 9: Aker BP’s and Total’s Digital Portfolio Initiatives

References:

(“Application of Artificial Intelligence in Oil and Gas Industry: Exploring Its Impact” n.d.; “Woodside Energy Drills for Insight with Cognitive Computing” n.d.;

“Artificial Intelligence Improves Real-Time Drilling Data Analysis | Offshore” n.d.; “Exploring the Potential of Robotics in the Oil and Gas Industry | Aker BP

ASA” n.d.)

Aker BP – Digital Initiatives:

Cognite – Aker BP formed a strategic partnership with AI SaaS firm Cognite to explore the potential for robotics on offshore oil and gas platforms. Cognite's cloud-based industrial data operations and intelligence platform, Cognite Data Fusion (CDF), will serve as the data infrastructure for the initiative.

SparkCognition – SparkCognition is used in an analytics solution platform called SparkPredict, which monitors topside and subsea installations for more than 30 offshore structures. It uses machine learning to analyze sensor data, which feeds a predictive model to flag failures before they occur. This is part of Aker BP's ‘Cognitive Operations’ initiative.

Spot – The quadruped robot Spot has been involved in the robotics initiative with Cognite to test the robot's performance for autonomous inspections. The inspections will be directed at high-risk areas and coupled with automated report-generation technology to alert and provide insights to operators.

Total – Digital Initiatives:

Real-time predictive drilling analytics – Applying real-time predictive drilling analytics to address challenges associated with “significant NPT events and performance variability,” looking for a “finger-print” ahead of an undesired situation. “The predictions gives the operating team sufficient warning time to take actions to avoid events.”


3.8 DC&I Digital Initiatives

Table 10: Digital DC&I Initiatives

Drilling Digital Initiatives:

Precision Drilling Alpha™ – Precision Drilling developed a technology suite called Alpha™ in 2019, which includes AlphaAutomation™, AlphaApps™, and AlphaAnalytics™. AlphaAutomation™ was previously referred to as Process Automation Control (PAC).

Corva – Corva is an all-in-one drilling and completions AI platform. Corva integrates disparate data sources to perform analytics and visualizations for operational performance optimization. The algorithms learn from both historic well data and physics-based models to provide real-time guidance during operations.

AI Driller – AI Driller is focused on automating the drilling process to build the drilling system of the future. AI Driller currently provides self-drilling applications for rotary and slide drilling operations. The system is available on NOV's NOVOS Reflexive Drilling Platform.

Baker Hughes / C3.ai / Microsoft – Baker Hughes, C3.ai, and Microsoft Corp. entered an alliance in 2019 to develop enterprise-scale AI solutions for the energy industry. The initiative will be rolled out on Microsoft's Azure platform and is tailored to address predictive maintenance, process and equipment reliability, energy/production management, and inventory optimization.

eDrilling – eDrilling provides software called wellAhead™ for automated monitoring and real-time optimization of drilling operations, designed to be used on the rig and in remote operating centers. The full suite from eDrilling includes automated drilling control, drilling plan optimization, drill well in simulator, dynamic well control, and real-time MPD.

Petro.AI – Petro.AI is a cloud-based platform (PaaS) that can clean, process, and store operational data without requiring sharing with third-party vendors. Petro.AI uses machine learning on real-time and historic data to understand the state of the rig and to advise on and predict operational performance.

Intellicess – Intellicess provides two products, Sentinel™ and RigDAP™. Sentinel™ is a real-time drilling data analysis engine that works as a backend program to combine sensor data with physics-based models to develop rig state, drilling anomalies, and drilling advisories. RigDAP™ is an open SCADA platform to perform data analytics and enable the transition to drilling automation.

NOV NOVOS – NOV's NOVOS Reflexive Drilling System is an open drilling advisory and drilling automation platform to enhance drilling performance. The platform provides the capabilities to automate repetitive processes (i.e., drilling, tripping, rotating, etc.) and to perform real-time drilling optimization (custom and open-source algorithms).

Intelie Live – Intelie Live is a stream analytics platform to collect, filter, aggregate, and visualize data from any source. Petrobras is utilizing Intelie to address real-time well issues at its drilling remote operating center. The platform leverages Intelie Pipes, an advanced real-time query language. Intelie is a RigNet company, and its algorithms can be applied as functions over source data.

Moblize – Moblize is a cloud-based big data analytics platform that aims to simplify the entire end-to-end lifecycle process across drilling (ProACT), completions (ProFRAC), rig performance (ProINSIGHTS), and data sharing (ProWISE).


References:

(“Precision Drilling Corporation - Alpha” n.d.; “Homepage - Corva” n.d.; “Directional Drilling Automation” n.d.; “Baker Hughes & C3.Ai Release BHC3

Production OptimizationTM | Baker Hughes” n.d.; “Products - EDrilling” n.d.; “Applications of Machine Learning in Drilling with Petro.Ai — Data Shop Talk”

n.d.; “Services | Intellicess” n.d.; “NOVOS Reflexive Drilling System” n.d.; “Intelie Live Machine Learning Analytics - RigNet” n.d.)


3.9 Digital Partnership Categorization

Table 11: Summary of O&G Digital Initiatives

[Table 11 groups the digital initiatives identified above into four functional categories – Data Storage, Data Processing, Data Analytics, and Data Visualization – with each initiative (for example, Equinor's OMNIA) listed under its core functionality and arrows indicating cross-functional capabilities.]


4 Systems Approach to a Digital Portfolio

Adopting and integrating emerging digital technologies into an organization is a daunting

task, and so is embracing a new way of thinking. ‘Systems thinking’ methodology approaches a

problem from a holistic life-cycle point-of-view. The interdisciplinary approach views the entire

system as a hierarchy of interconnected subsystems. Instead of the traditional method of a direct

mindset from problem to solution, the systems method evaluates the entire system of

interactions and controls for discovery. The systems belief is that everything in some way is

interconnected and understanding those control and feedback relationships is the key to solving

complex problems. The central considerations employed in a systems analysis include system

decomposition (hierarchical structure), relationship mapping (control and feedback loops), and

system function or emergence.

The hierarchical decomposition of a system breaks down the system into its foundational

elements. At this level of abstraction, a system can be understood by the organization of the

individual parts or entities. Although seemingly simple, the process of creating a representative

decomposition assists with visually understanding where a component fits within a system.

Relationship mapping is utilized to identify the control mechanisms of the system. Control and

feedback loops can be both physically and informationally driven. The value created from a

control structure depends on how well all the relevant stakeholders and components are mapped

relationally within the system. The system components should be represented at the appropriate

abstraction level for the problem being evaluated. The architecture of the system form (i.e. the

physical or cyber construct) and the respective relationships generate emergent functional

properties. The most valuable discovery from systems engineering is uncovering the resulting

emergent properties of a system, and how different architectures and respective relationships

impact the resulting system function.
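As a minimal illustration of these ideas, the sketch below captures a hierarchical decomposition and a few directed relationships as plain Python data structures. The node names loosely mirror the industry decomposition discussed later in this chapter and are illustrative assumptions only.

```python
# Minimal sketch: a hierarchical decomposition plus a few directed influence /
# feedback relationships, represented with plain dictionaries and tuples.
decomposition = {
    "Oil & Gas Industry": ["Upstream", "Midstream", "Downstream"],
    "Upstream": ["Exploration", "Production"],
    "Exploration": ["Geologic Discovery", "Drilling", "Completion"],
}

relationships = [
    ("Geologic Discovery", "Drilling", "informs target depth and well placement"),
    ("Downstream", "Exploration", "economics constrain exploration budgets"),
]

def leaves(node, tree):
    """Recursively collect the lowest-level elements beneath a node."""
    children = tree.get(node, [])
    if not children:
        return [node]
    found = []
    for child in children:
        found.extend(leaves(child, tree))
    return found

print(leaves("Upstream", decomposition))
for source, target, note in relationships:
    print(f"{source} -> {target}: {note}")
```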

Emerging digital innovations are tools that are adopted and integrated into a system to

improve performance. The systems approach can be leveraged to evaluate a holistic view of the

current system, to forecast the future state of an ideal system, and to synthesize how to

effectively bridge the gap. This type of analysis includes the existing tools, processes, procedures,

and people that are involved with the operation or system today and compares this current state


to the ideal future state. This path of discovery helps to identify where entities (tools, process,

procedures, or people) are broken or need improvements to reach the goal of the new state.

Bridging the gap, or mapping the transient state between current and ideal, assists with creating

a roadmap of tools that create positive momentum and contribution in the intended direction.

This thesis takes a systems approach to the integration of digital technologies with respect

to drilling systems within the oil and gas industry. The approach starts with the holistic view of

the drilling sector within the oil and gas industry and progresses into more granular detail around

the digital evolution of drilling systems. This analysis reviews the current state and future ideal

state of drilling systems, as well as the oil and gas industry as a whole. This provides a framework

around where the digital platforms, outlined in Chapter 3, contribute to the evolution of digital

integration into the oil and gas industry.

To start, the oil and gas industry is decomposed in Figure 7 to show where drilling is

positioned within the hierarchical structure of the system. This structure does not represent the associated relationships; for example, the drilling design, location, and target depth are largely influenced by the geologic discovery and future production operations. It is also important to understand that Midstream capabilities and Downstream economics play a large role in strategic decisions around exploration budgets and asset investments. So, in terms of systems theory, there is some type of connection from drilling to all the elements represented in the decomposition.


Figure 7: Oil & Gas Industry Decomposition. The decomposition breaks the industry into Upstream (Exploration and Production), Midstream (Processing and Transport), and Downstream, with elements including Geologic Discovery, Drilling, Completion, Production Operations, Workover & Interventions, Field and Gas Processing, Pipelines, Storage, Trucks & Shipping, Supply & Trading, Refining and Distribution, Petrochemical Plants, and Marketing & Retail.

The focus of this thesis is on digital integration and, more specifically, how it relates to improved DC&I design and operations. The objective of this systems review is to understand what type of data availability and analytics would help to

improve the productivity of operations, and the graphics should be viewed with that type of data-

driven lens. On a macroscale of digital connectivity, Figure 8 shows an ideal data pipeline where

all real-time and historic data is accessible across the organization. This figure is over-simplistic,

to say the least, but it provides a broad idea of the level of data and collaborative connectivity

that is being pursued with digital platforms. This ideal endeavor, shown as a state-shift

revolutionary change, is an unfeasibly expensive and disruptive challenge to perform in a singular

fashion. The challenge that the systems approach undertakes to resolve is determining how to

implement O&G subsystem projects, over time, that can eventually interconnect into an ideal

infrastructure of the future. More importantly, the systems approach aims to demonstrate which investment steps provide immediate benefit and prepare the organization for future digital synergies.


Figure 8: Oil and Gas Data Pipeline. The figure depicts field operations (drilling, completions, interventions, hydraulic fracturing, and production from drillships, platforms, land rigs, and pump jacks), midstream facilities, pipelines, and storage, and downstream refining and petrochemicals feeding a cloud data pipeline that supports data processing, data visualization, and digital applications (operations optimization, predictive maintenance, remote operations centers, connected worker), with intellectual property, security, standardization, data connectivity, data accessibility, and the value chain ecosystem as cross-cutting concerns.

The system problem statement for the adoption and integration of digital technology is

to increase productivity, improve safety, and decrease environmental impact. For this thesis, we

will assume that increasing the efficiency and productivity of an operation with digital and

automation techniques will produce an environment that reduces the environmental impact per

barrel extracted and reduces the worker exposure to high-risk scenarios. Therefore, the

discussion will be mainly driven around productivity improvements, with the understanding that

this has resulting impacts in the other areas of focus. Improving productivity in operations can be

defined as an objective to extract oil reserves at a bottom-line cheaper cost per barrel. This can

be accomplished by many mechanisms that can be leveraged:

• Discovering larger, more accessible, and more productive oil reservoirs (seismic

processing with data analytics and visualization)

• Optimized well placement to exhibit improved reservoir production performance

(interdisciplinary physical model optimizations)

• Improved well design techniques for both drilling and completions – this contributes to

both cost of operations and to reduced administrative design overhead (data mining,

case-based modelling, physics-based modeling, interdisciplinary optimizations, improved

workflows)


• Reduced cost of operations for both CapEx and OpEx (data analytics for enhanced

decision making and process or machine automation)

• Reduced organizational overhead (improved workflows and collaboration)

The problem statement addressed in this systems analysis focuses on reducing operational costs for drilling or any rig-based operation, and on the role that digital initiatives play in achieving this objective while maintaining an interoperable approach for future sustainability. The main lever for reducing operational costs is reducing non-productive time. Figure 9 shows the breakdown of operational time into three categories: Technical Limit Well

Time, Invisible Lost Time (ILT), and Conventional Non-Productive Time (NPT). The graph

demonstrates that digital enhancements and automation can reduce well time more than

industry believes is currently possible with the employed drilling techniques. The reduction of

both the ILT and NPT represents the value at stake for operational efficiency through the use of

digital enhancements.

Figure 9: Non-Productive Time Categorization

As a reference for the value at stake in relation to non-productive time, Figure 10

displays a quartile plot of US deepwater drilling (DRL) and completions (COM) from 2012 to 2020,

as recorded by IHS Markit Rushmore Reviews Database. The conventional NPT for drilling

operations has a P50 (i.e., probability of 50%) of 27% and, for completions operations, a P50 of

21%. The total median cost of a new deepwater drill and completion is $144 million and $61

million, respectively. The total cost of a new U.S. Land well is between $6 million and $8 million,

with some variation between the Eagle Ford, Bakken, Marcellus, Permian, and Delaware basins

(“Trends in U.S. Oil and Natural Gas Upstream Costs” 2016). Data retrieved from Statista indicates that the United States drills approximately 20,000 wells per year (US Land plus US Offshore), with a small percentage being deepwater wells (Garside, n.d.). The extrapolated savings are substantial


and can be realized through efficiency improvements, technology innovations, and automation

to reduce conventional NPT and ILT.
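A back-of-the-envelope calculation, sketched below in Python, illustrates the scale of this value at stake using the figures quoted above. The land-well NPT fraction, the assumed NPT reduction, and the simplification that NPT scales proportionally with well cost are assumptions made only for illustration.

```python
# Back-of-the-envelope sketch of NPT value at stake, using the costs and P50 NPT
# percentages quoted in the text; the land NPT fraction and assumed reduction
# are illustrative assumptions.
deepwater_drill_cost = 144e6      # median US deepwater drilling cost, $
deepwater_drill_npt = 0.27        # P50 conventional NPT, drilling
deepwater_compl_cost = 61e6       # median US deepwater completion cost, $
deepwater_compl_npt = 0.21        # P50 conventional NPT, completions

npt_cost_per_deepwater_well = (deepwater_drill_cost * deepwater_drill_npt
                               + deepwater_compl_cost * deepwater_compl_npt)
print(f"NPT cost per deepwater well: roughly ${npt_cost_per_deepwater_well / 1e6:.0f} million")

land_well_cost = 7e6              # midpoint of the $6-8 million range, $
land_wells_per_year = 20_000      # approximate US wells per year
assumed_land_npt = 0.20           # assumption for illustration
assumed_npt_reduction = 0.25      # assume digital tools remove a quarter of NPT

annual_land_savings = (land_well_cost * land_wells_per_year
                       * assumed_land_npt * assumed_npt_reduction)
print(f"Illustrative annual US land savings: roughly ${annual_land_savings / 1e9:.1f} billion")
```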

Figure 10: IHS Markit Rushmore Reviews Database US Deepwater NPT% and Well Cost, 2012-2020

With this objective at the core of the analysis, the rest of the system and its subsystems can be analyzed from the perspective of their contribution toward reaching this goal. Other goals and objectives can be added to the analysis, thereby creating a weighted contribution function. However, the steps taken to achieve operational improvements will also have incidental positive influences on the other objectives, as noted in the analysis. Now that the system has been

decomposed and the objective function defined, the next step is to understand the controls and

feedback loops of the system. This step is critical for understanding the influence behaviors have

on system performance.
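A weighted contribution function of the kind described above can be sketched very simply. The objectives, weights, and 0-10 scores below are illustrative assumptions, not values derived in this thesis.

```python
# Minimal sketch of a weighted contribution (scoring) function for comparing
# digital initiatives against multiple objectives. Weights and scores are
# illustrative assumptions.
objectives = {"productivity": 0.60, "safety": 0.25, "environment": 0.15}

initiatives = {
    "real-time drilling advisory": {"productivity": 8, "safety": 6, "environment": 4},
    "predictive maintenance":      {"productivity": 6, "safety": 7, "environment": 5},
}

def weighted_score(scores, weights):
    """Weighted sum of 0-10 objective scores."""
    return sum(weights[objective] * scores[objective] for objective in weights)

for name, scores in initiatives.items():
    print(f"{name}: {weighted_score(scores, objectives):.1f}")
```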

4.1 System Control

ANSI/ISA-95 is an international standard from the International Society of Automation that defines a hierarchical control structure from enterprise to operations.

Control structures are fundamental to understanding the system levers that can be leveraged to

influence system performance. The ISA-95 enterprise-control interface hierarchy was adapted to

show the control mechanisms from enterprise to operations for Drilling operations in Figure 11.

The adapted ISA-95 graphic shows how asset business strategy, well design, operation


surveillance, and physical and automated manipulation influence the operational results. Digital

techniques can be initiated in any of these control categories, and it is important to recognize that the impact and propagation realized on the system depend on the level at which the system is influenced.

Figure 11: ISA-95 Enterprise-Control System Model, adapted from (de Wardt 2019). [The figure maps the ISA-95 hierarchy of control onto drilling: Level 4, Business Planning & Logistics; Level 3, Manufacturing Operations Management; Level 2, Monitoring & Supervising; Level 1, Sensing & Manipulating; Level 0, Production Process.]

While enterprise initiatives are the overarching governance of the organizational system,

this area will not necessarily achieve the most value-creating initiatives with respect to the

system problem statement. Additionally, the objective here is not to single out a specific area for

improved control, but to identify how added measures in all the categories can be combined to

create the most cost-effective solutions. It is also important to note that from Level 4 to Level 0,

information is the controlling force. Digital initiatives are geared to leverage data to develop

enhanced informational insights, and understanding that this impacts control from enterprise to

operation provides perspective on why the digital journey is so influential on productivity

improvements. These initiatives, with progression, will change the way the entire system is fundamentally controlled. The next section decomposes the current digital state with an emphasis on the control and feedback loops that govern the current process.

4.2 Current System State

The current state of the drilling system will act as the benchmark for comparison against the future, digitally enhanced state. The current state is split into two categories for analysis: design and operations. The design phase reviews the data aggregation and workflow processes required to design a drilling or completions program, and the operations phase reviews the data flow and actionable responses that influence real-time operations. Referring back to Figure 11, this approach spans all levels of control.

The purpose of developing a system state is to understand the processes, procedures,

and tools that are currently being utilized in the control workflow. This step maps the interactions

of the systems to evaluate interconnectedness and relationships that can be structured more

efficiently, or to leverage new technology for improvements.

4.2.1 Engineering Design

Engineering design relates to the entire socio-technical workflow involved with creating

an end-to-end engineering solution for the objective or problem. The workflow consists of three

components: input, process, and output. The input is defined by external data and sources

required to feed the internal system to perform the function’s work. The processing refers to the

physics-based modeling and design requirements. The output refers to the final product generated from the internal processing, which can also serve as inputs to other functions'

workflows. The design system for an end-to-end oilfield development is highly interrelated and

iterative, but this diagram provides a framework that defines the immediately relevant

stakeholders to the system process. Figure 12 shows the workflow for the drilling engineering

design, along with the medium by which data is traditionally transferred.


Figure 12: Current State Drilling Engineering Design Workflow

The drilling design system has more dimensionality than what is conveyed in Figure 12, as

input sources, processing, and output products all have different time and quality requirements.

That said, the figure provides a baseline framework to better understand current communication

patterns within the industry. The following assumptions were derived from the current drilling

design process:

• Data is sourced from multiple functions as inputs for the drilling design process.

• Functions can communicate critical information through various channels.

• Drilling design progress is dependent on the quality and speed at which data inputs are

provided.

• Data can be sourced from multiple individuals, even within the same function.

• Any change in an individual function’s input can result in rework for the drilling design.

• Functions operate in silos and communicate information as progress or project dictates.

• Data communicated through multiple channels is susceptible to inconsistencies or errors that result in potential rework or redesign.

• Data communicated through the specified channels has limited accessibility (e.g., archived emails or files saved in a specific folder).

• Individual functions perform their own expert physics-based modeling and data-driven case-based modeling (quality subject to the engineer) from individual desktop or network applications.

Engineering design workflows suffer from communication silos, limited data accessibility, coordination inefficiencies, error-prone manual entries, limited data mining capability, individual competency dependencies, and other time-consuming repetitive tasks. The process workflow should be viewed as an entire socio-technical system that follows an iterative path of interrelated dependencies that ultimately develops an end-to-end project proposal. From the system perspective, an enormous amount of complexity is built into the traditional design and collaboration methods, causing delays, unnecessary iterations and rework, and productivity inefficiencies. These key issues will be addressed with the evolution of a more digitally collaborative workspace, as described in the Future State and Identified Gaps section of this thesis.

4.2.2 Operations

Operations relate to the execution of a project and, in this specific example, the process

of drilling an oil well. The operation control workflow consists of three components (similar to

the design phase): input, process, and output. The input is defined by external data and sources

required to feed the field or office workers information about the operation. The processing

refers to the physics-based modeling, case-based modeling, or experience cognition that

determines what manipulations to make to control the operations. The output refers to the actual manipulation of the physical operation, which can be executed through either manual intervention or automation. The operations phase of drilling an oil well is represented as a

workflow in Figure 13. The objective is to understand what data is produced and consumed by

the system, both to drive informed operational decisions and to understand how action is taken

on those decisions to improve performance.


Figure 13: Current State Drilling Operations Workflow

Innovations in operations have evolved more quickly than those in the office design workflows. Productivity improvements in operations have an immediate and quantifiable return on investment, if successful. Even small improvements in operations can accumulate into large CapEx

and OpEx savings. The objective is to understand the key procedural and data points that are

leveraged to make operational decisions. The following assumptions can be derived from the current operational workflow:

• Well Programs provide the governance and strategy for well execution. They are communicated as digital documents (e.g., Word or Adobe PDF) but are subject to multiple revisions and manual entry errors.

• Physics-based model outputs used to twin well operations and to guide operating specifications are provided as digital attachments or as an appendix to the Well Program. In the current state, this requires manually checking that the Well Program matches the models and manual comparison during execution for assurance validation.

• Data streaming is the primary source of well state and well data. Technology innovations

have evolved in this area with improved transfer speeds and enhanced data analytics. See

Figure 14 and Figure 15 for additional information regarding the types of operational data being streamed and the general data flow from rig to office.


• In the operation optimization space, the trend is pushing toward data analytic monitoring

with manual manipulation and control. The gap to an automated system is further

discussed in the Automation section of this thesis. However, the limit of monitoring and advising will always be the ability of the human interface to correctly manipulate the well operation per the recommended response.

• Data analytics for monitoring and advising on improved productivity are generally being developed from efficiency optimization through mathematical equations (e.g., mechanical specific energy for improving rate of penetration) and through offset performance analysis.

• As mentioned in the Portfolio chapter, organizations have established Real-Time Remote Operating Centers to deploy advanced analytics and monitoring techniques across assets that are then communicated back to the rig site to improve productivity.

• Daily rig reports are still being managed manually, but companies like RigNet are pursuing

automation techniques to drastically reduce the man-hours required to complete daily

reporting. This also improves the data structuring and consistency of reporting, which can

then be leveraged with data analytics.

Figure 14 shows a decomposition of some of the drilling data points that are aggregated, visualized, and streamed for further analysis. These instrument data points are visualized in real time at the rig site and remotely for analysis and operational decision-making. Instrument data points can originate either at the surface, where the system state is defined by a surface electric signal, or downhole, where the data is transferred via hydraulic telemetry through the mud system. The transfer latency differs according to the respective transmission medium. Additionally, drilling is a dynamic operation, which would typically suggest that high-quality and high-frequency data are required for accurate process or machine control. However, understanding the required reaction time for both safety and process efficiency is critical to understanding the potential solutions for improvement.


Figure 14: Example Real-Time Drilling Data

A high-level, real-time data path and feedback loop is shown in Figure 15 to demonstrate

how real-time data is being leveraged within the design and operations system. WITSML and PRODML are data exchange standards from Energistics that will be discussed in the Data

Architecture section of this thesis. However, understanding the high-level abstraction of the on-

site and off-site data feedback loops for operational control can help to identify where digital

solutions will have the highest impact. Additionally, the interfaces shown in the graphic can help

direct attention to how digital platform interoperability can better connect data sources, even

from the design phase, to provide better guidance for operational efficiencies.


Figure 15: Current State Operations Data Flow Architecture
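To make the streaming data flow more tangible, the sketch below parses a simplified, WITSML-style time-series payload into per-sample records that a monitoring or advisory application could consume. The XML shown is a hypothetical, heavily trimmed illustration of the mnemonic/unit/data idea, not the actual Energistics WITSML schema, and the channel names and values are invented.

```python
# Minimal sketch: parse a simplified, WITSML-like real-time log payload.
# The XML below is a hypothetical, trimmed-down illustration of
# mnemonic/unit/data rows; it is not the full Energistics WITSML schema.
import xml.etree.ElementTree as ET

payload = """
<log well="DEMO-1">
  <mnemonicList>TIME,DEPTH,WOB,RPM,TORQUE,ROP</mnemonicList>
  <unitList>s,m,kN,rpm,kN.m,m/h</unitList>
  <data>3600,1520.4,85.2,120,14.6,32.1</data>
  <data>3610,1520.5,86.0,120,14.9,31.7</data>
</log>
"""

root = ET.fromstring(payload)
mnemonics = root.findtext("mnemonicList").split(",")
units = root.findtext("unitList").split(",")

rows = []
for row in root.findall("data"):
    values = [float(v) for v in row.text.split(",")]
    rows.append(dict(zip(mnemonics, values)))

for record in rows:
    # Each record is one time-stamped sample that a monitoring or advisory
    # application could trend, compare against the well program, or archive.
    print(record)
```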

As per the systems methodology, the control and feedback loops are fundamental to

understanding the emergent properties of a system. Figure 16 shows a detailed diagram that

identifies the stakeholders involved with leveraging real-time data to improve operations. The

diagram provides perspective on the data flow paths and data users that are leveraged to provide

feedback for the drilling system control. This diagram also provides insights on the stakeholders

at a higher level, emphasizing that the need for improving efficiency is driven from high levels

within the organization, industry, and government. The arrows that connect each entity are

labeled with descriptions that detail the type of feedback or information being

transferred in the control structure. This diagram demonstrates the ubiquity of data flow and

need for accessibility throughout an organization to drive further efficiencies. Traditional limited-

accessibility data flows like email, PowerPoint, and Excel Spreadsheets are no longer sufficient to

provide the accessible and collaborative environments needed for optimal feedback control for

operations. The current trends are moving toward single systems of truth that allow accessibility for all stakeholders involved with both design and operations. This is only a step in the digital evolution toward full automation, where comprehensive operation optimization can be realized. However, as discussed in the Automation section later in this thesis, the models, processing speed, accuracy, and, most importantly, trust needed for automation are not yet established in industry.


The next section, Future State, will provide an idealized system to address the limitations

that were discovered in mapping the system control structures. The purpose of this section is to

set the benchmark against which new innovations can be measured for how well they align with the ideal

direction. This allows clear scope definition on whether a new technology or innovation should

be adopted as part of the organization’s digital portfolio.


Figure 16: Drilling Data System Flow Diagram – Emphasis on Data Stakeholders


4.3 Future System State

The future system state for drilling design and operations has been derived from the projection of digital initiatives outlined in the digital portfolio research in Chapter 3. The purpose of this section is to extrapolate the digital evolution from the trends of both the adopted industry digital initiatives and industry interviews. Developing an end vision to direct a digital roadmap is critical for ensuring that implemented initiatives contribute value toward the overall organization's objective.

The systems approach to the future state will review both the visualization and

functionality of the frontend of the digital architecture, as well as the interdependencies and

platform architecture of the backend. The frontend functionality of the future state is typically

more intuitive to understand – i.e. knowing what you want is easier than knowing how to build

it. This is an important recognition, as digital initiatives often suffer from development and

scalability issues due to the miscalculation of backend requirements. Figure 17 demonstrates the

relationship of the frontend versus the backend development of a digital platform with respect

to an iceberg model. The infrastructure and architecture developments contribute to the largest

portion of the project and are often not well understood in terms of required effort for feasibility

or for effective scalability. The companies listed below the “water level” are employed specifically

to develop the infrastructure and architecture needed to build the connectivity that develops the

backend for business productivity initiatives.

The iceberg model is mentioned early in this chapter to underscore the holistic mindset

needed to analyze the proposed future system state. The consensus through industry interviews

is that the predominant barrier between engineers and IT developers is the misunderstanding of

software platform backend requirements. Digital initiatives fail to deliver when expectations are

high, but the required allocated resources and investments are not met. Frontend functionality

is the exciting topic for business and engineering intelligence, but without the corresponding

development of the backend infrastructure, the system benefits and competitive advantage will

not be realized.


Figure 17: Front End Expectations versus Back-End Reality

With the recognition and focus on the importance of the backend software architecture,

the next section of this thesis will discuss the future trends of design workflow and operational

digital initiatives. The discussion is largely driven by the frontend functionality because that is the

interface where the value is realized; however, with the systems approach, the backend dependencies and feasibility will always be considered.

4.3.1 Engineering Design

The engineering design system consists of a socio-technical system of multiple disciplines

coordinating together to develop a cohesive project design for implementation. However, all of

the individual functional work is mostly performed in silos with the impacting interfaces

communicated through various means, as outlined in Figure 12. The digital trends are directed

toward the democratization and monetization of information, which translates to the need for

data accessible to the entire organization and digital tools that support reliable business decisions. The idea is also to reduce the amount of siloed processing and to create collaborative environments that create transparent interactions across all disciplines. Additionally, an iterative system communicating through slow mediums is difficult to optimize; providing real-time design connectivity through core work processes can enable workflow optimization that creates substantial project savings.

There are two types of design workflows in drilling: factory and custom. Factory-type design usually describes highly repetitive wells in more predictable environments, such as shale land assets. Custom wells are usually found in complex and unique environments, such as deepwater assets. Both types of design workflows have high potential for improvement with digital innovations, but factory-type wells are better suited to more automated processes, while custom wells benefit more from collaborative working environments. In either case, both can be improved through process standardization on a single digital platform where data aggregation and automation can be leveraged and adapted to each approach as

needed. Creating a single-source digital workflow that adopts consistent standardization will not

only improve the design efficiency, optimization potential, and profitability, but consistent

automated workflows can be developed and scaled to the entire organization. This next section

will discuss possibilities and recommendations around digital platforms that connect the entire

end-to-end design process. Also note that these design models, if developed correctly, can then

act as the Digital Twin or model reference to the operations phase when being implemented.

4.3.1.1 Digital Workflows

The idea of a cohesive digital platform for design workflow is to have a single source of

truth that is accessible to any engineer involved in the project. A single platform

democratizes the design phase data across functions where real-time design decisions can be

adjusted to optimize the entire project. Figure 18 represents the digital design “sandbox” that is

developed from multiple design layers to create the collaborative environment. This specific

scenario outlines the use-case for subsurface, drilling, subsea, production, and facilities

development to be performed in a single system. The idea is not to replace or cannibalize the

existing systems, but to create standardized protocols that connect each system into the same

working environment. Inputs and outputs can be relationally mapped from each expert system

that is connected into the workflow. A design change at one end could automatically propagate preliminary updates throughout the system. However, this

initiative would need support from participating expert systems. Physics-based, case-based, and


analytic-based models would need to develop standard protocols that would integrate into an

interoperable environment. Details regarding the different platform layers are explained below

Figure 18.

Figure 18: Future System State: Digital Design Platform. [Layers shown: graphic visualization; design governance & specification; data analytics; economic integration; expert physics-based model integration; and OpenAPI standard protocol connectivity, all built on a single-source design platform “sandbox” spanning reservoir, well, facility, and field optimization design.]

Single-Source Design Platform

The single-source cloud-based platform approach offers a working environment that

creates design continuity through cross-functional connectivity and collaboration. The

visualization platform is a frontend development that enables integration with other backend

systems or tools. This digital initiative enables visualized workflows that incorporate design layers

to provide immediate cross-functional feedback, such as the design and cost impacts of suggested changes. The idea is to develop a digital representation of the entire development field

by leveraging visualization mapping with currently used expert systems to process the flow of

model inputs and outputs to generate real-time design results. Backend systems that can be

integrated into a design platform are physics-based modeling systems, rule-based governance,

case-based or data analytic models, and economic models.


The evolutionary change toward this type of single source of truth system will significantly

increase the transparency, competency, and speed at which work is performed. However, this

change will be challenging because the transformation toward a collaborative environment

cannot be pursued alone but must be developed through cooperation and partnerships.

Organizations will have a difficult time transitioning away from traditional software systems and

companies that refuse to develop in collaboration. But it will be necessary to create new alliances with organizations that are pursuing the same digitally innovative initiatives, given the standardization and product commitment required to produce successful results.

If the process flow of model inputs and outputs is mapped correctly within a single design platform, then the entire end-to-end workflow can be performed automatically. The intention is for the automation to provide preliminary, unvalidated design results that would then be checked and processed by an engineer. An analytic layer can provide additional insight into offset design and selection criteria. For example, a preliminary well design can be developed from offset-generated inputs taken from the nearest geographically or environmentally similar wells, with a location and mechanical earth model as inputs into the design platform. These results would then be verified and validated by a drilling engineer to ensure they are in compliance with design standards. Any change in the subsurface data would trigger an automatic redesign – this

real-time feedback would enhance attention and competency cross-functionally. This

methodology can be repeated down the value chain with subsea infrastructure, pipeline design,

and facility design for a deepwater or land environment.
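To illustrate the kind of change propagation described above, the sketch below wires a few hypothetical design functions into a simple dependency graph: when an upstream input (e.g., the mechanical earth model) changes, downstream preliminary designs are recomputed and flagged for engineering validation. The node names, dependencies, and recompute functions are invented for illustration and do not represent any particular platform's implementation.

```python
# Minimal sketch of cross-functional change propagation on a single design
# platform. Node names and dependencies are hypothetical.
from collections import defaultdict

class DesignGraph:
    def __init__(self):
        self.downstream = defaultdict(list)   # input -> dependent design steps
        self.recompute = {}                   # design step -> callable

    def register(self, name, depends_on, func):
        self.recompute[name] = func
        for dep in depends_on:
            self.downstream[dep].append(name)

    def on_change(self, changed):
        """Propagate a change through dependents (breadth-first, deduplicated).
        A real platform would recompute in topological order."""
        queue, seen = list(self.downstream[changed]), set()
        while queue:
            step = queue.pop(0)
            if step in seen:
                continue
            seen.add(step)
            result = self.recompute[step]()   # preliminary, unvalidated result
            print(f"{step} recomputed -> {result} (pending engineer validation)")
            queue.extend(self.downstream[step])

graph = DesignGraph()
graph.register("well_design", ["mechanical_earth_model"], lambda: "rev B casing plan")
graph.register("subsea_layout", ["well_design"], lambda: "updated tie-back route")
graph.register("cost_forecast", ["well_design", "subsea_layout"], lambda: "new AFE estimate")

graph.on_change("mechanical_earth_model")
```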

OpenAPI Standardization

For sustainable development and innovation with a single system, there must be strict

standardization for unified system interoperability. The current trend in the digital platform

connectivity space is the adoption of the OpenAPI Specification (OAS) standard. This standard is

a programming language agnostic interface for REST APIs, which allows for the understanding

and interaction of a service with minimal additional knowledge or logic (“OpenAPI Initiative”

n.d.). It provides the foundation for any system to be integrated into the single platform system

as a service. OpenAPI not only creates a layer for all software services and data sources to

connect, but it allows for third-party application support and development to drive further


system customization. Support for API standardization should be mandatory when selecting digital partnerships, to ensure sustainable integration into the designed workflow. Physics-based expert

systems that drive the design process in oil and gas development need to design their software

services for API connectivity.

Innovative software-as-a-service (SaaS) or platform-as-a-service (PaaS) opportunities are

providing new ways to enhance productivity across the entire spectrum in O&G. Digital

methodologies gain exponential value from the network effect, where increasing the

interoperability (i.e., connectedness) with other services greatly enhances the possibilities.

Industry is starting to focus on end-to-end workflows by integrating all software services onto a

single platform using OpenAPI standardization. The research data suggests that this is the

direction in which the industry is heading in order to create value from innovations in digital

technology. The API standardization is a necessary step that provides the building block for

creating a cohesive digital platform for information democratization and process efficiencies that

creates value within an organization.
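As a small illustration of what OAS-based connectivity looks like in practice, the snippet below sketches a minimal OpenAPI 3.0 description for a hypothetical well-design service endpoint, expressed as a Python dictionary so it could be served or validated programmatically. The path, schema, and service name are invented for illustration; they are not drawn from any vendor's actual API.

```python
# Minimal sketch of an OpenAPI 3.0 description for a hypothetical
# well-design service. Endpoint names and fields are illustrative only.
import json

openapi_spec = {
    "openapi": "3.0.3",
    "info": {"title": "Well Design Service (example)", "version": "0.1.0"},
    "paths": {
        "/wells/{wellId}/trajectory": {
            "get": {
                "summary": "Return the planned trajectory for a well",
                "parameters": [{
                    "name": "wellId", "in": "path",
                    "required": True, "schema": {"type": "string"},
                }],
                "responses": {
                    "200": {
                        "description": "Planned trajectory stations",
                        "content": {"application/json": {"schema": {
                            "type": "array",
                            "items": {"$ref": "#/components/schemas/Station"},
                        }}},
                    }
                },
            }
        }
    },
    "components": {"schemas": {"Station": {
        "type": "object",
        "properties": {
            "md_m": {"type": "number"},
            "inclination_deg": {"type": "number"},
            "azimuth_deg": {"type": "number"},
        },
    }}},
}

# Any OAS-aware tool (documentation generators, client generators, gateways)
# can consume this same description, which is what enables third-party
# integration with minimal additional knowledge of the service.
print(json.dumps(openapi_spec, indent=2)[:300], "...")
```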

Physics-Based Model Integration

The evolution of the digital roadmap requires cooperation of physics-based modeling

software providers to develop software services that can integrate with digital workflows.

Notably, FutureOn created a Subsea Digital Alliance that includes the expert systems RagnaRock

Geo (geoscience AI), Oliasoft (well design), Entail (engineering), and WINS (operations)

integrating through Microsoft Azure to develop a single workflow platform. In fact, FutureOn’s

FieldAp™ provides a cloud-based platform and rapid visualizations that integrate, in collaboration with other physics-based software modeling systems, to improve end-to-end workflows.

FutureOn’s PaaS creates a connective platform through OpenAPI that integrates all the expert

design systems to enhance design speed with real-time collaboration. This platform can leverage

offset well or field information to develop preliminary designs and cost forecasts, and allow the

user to run any of the expert design software from within the platform (“Subsea Digital Alliance

- FutureOn” n.d.).

Equinor has developed the digital platform Omnia that is entirely built from OpenAPI

standardization for connecting resources. The platform is advertised on GitHub with specific


standardization protocols to instruct developers on how to build and integrate applications into

their platform (“GitHub - Equinor/OmniaPlant: Documentation on How to Get Started Building

Industrial Applications and Services by Using Omnia Plant Data Platform” n.d.; “Meet Omnia- the

Statoil Data Platform That Enables Our Digital Roadmap” n.d.).

The current software services that are leveraged in the design process should be analyzed

to determine if there is alignment with future innovations. Adopting new design software will be

a challenging transition for organizations, but failure to build the right competencies early on will

result in a loss of innovation potential and competitive advantage. The pursuit for platform

connectivity between all design software in the oilfield development value chain can be realized

with strategic organizational decisions regarding digital partnerships and alliances.

Visualization Layer

The visualization layer, more commonly referred to as the user interface and user

experience (UIUX), is one of the most critical elements for a successful digital platform. The

usability and satisfaction from the customer are heavy influences on whether a digital platform

or software is readily adopted by an organization. The amount of training and competency

required is largely driven by the UIUX, including whether coding or a programming language is

required to navigate the offered capabilities. Selecting software that does not focus heavily on the UIUX can create a challenging barrier to overcome as the initiative diffuses through

the organization.

The UIUX for a PaaS that connects to various expert and analytic systems must have both

a field layout visualization and a UIUX to interface with each software application. The idea is to

be able to perform all of the work within one development interface. Integrated visualization will

help develop new insights and cross-functional transparencies that have not been robustly

explored within the oil and gas industry. As an example, well designs and casing structures

developed within the 3D lithology layers can help visualize potential execution issues that need

to be mitigated. Additionally, well designs imposed on seafloor bathymetry, along with the

subsea infrastructure, will develop enhanced visualized relationships with the operation. Oil and

gas design and operations suffer from the lack of visibility, due to operations occurring below the

earth’s surface. Therefore, any type of enhanced visualization will build stronger competencies


and improve operation efficiencies. Entire companies are dedicated to developing extended

reality and visualization discovery to improve operator performance. The visualization layer

should be a key focus when selecting a PaaS to drive efficient workflows. The usability of the

software will influence employees' acceptance of the transition and their adoption away from old design and workflow tools. The importance of UIUX quality cannot be

overstated to ensure the success of scaling a new workflow to the entire organization.

Economic Layer

The economic layer ties the holistic optimization together. The purpose is to have mapped

economic assumptions for each function that integrate together for transparency on the total

project cost and forecast. The idea is to have both real-time updates on project economics, and

to also understand cross-functional impacts from change propagation. Preliminary economic

assumptions can be driven from the data analytic layer from offset well developments in SAP or

other financial data sources. The preliminary economic evaluation can be updated with quotes

(potentially real-time, if service providers have application access to specific entries) to provide

validated costs. The economic layer can be tied together with the decision analysts’ probabilistic

Monte-Carlo models for economic assessment. This holistic approach develops the financial

acumen of all engineers participating in the project – employees receive immediate project economic feedback from suggested local changes. This type of transparency builds capital stewardship within the organization and helps to guide enhanced decision making from not only an engineering perspective but also a holistic project-value perspective.
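A minimal sketch of the kind of probabilistic economic assessment mentioned above is shown below: a Monte Carlo simulation of well cost and production revenue feeding a simple NPV distribution. All distributions, prices, production profiles, and discount rates are hypothetical placeholders, not calibrated project data.

```python
# Minimal Monte Carlo NPV sketch for the economic layer.
# All inputs (cost ranges, production profile, price, discount rate) are
# hypothetical placeholders, not calibrated project data.
import random

def npv(cash_flows, discount_rate):
    return sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows))

def simulate_once():
    capex = random.triangular(120e6, 220e6, 160e6)   # drill + complete, USD
    annual_prod = random.uniform(1.5e6, 3.0e6)       # barrels per year
    oil_price = random.normalvariate(55, 10)         # USD/bbl
    opex_per_bbl = 12.0
    yearly_cf = annual_prod * max(oil_price - opex_per_bbl, 0.0)
    cash_flows = [-capex] + [yearly_cf] * 10          # 10 producing years
    return npv(cash_flows, discount_rate=0.10)

results = sorted(simulate_once() for _ in range(10_000))
p10, p50, p90 = (results[int(len(results) * p)] for p in (0.10, 0.50, 0.90))
print(f"NPV P10/P50/P90: {p10/1e6:.0f} / {p50/1e6:.0f} / {p90/1e6:.0f} $MM")
```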

Data Analytic Layer

Visualization (UIUX), economic relationships, and transparency of holistic

interdependencies are all essential, but what sparked the digital evolution is the advent of robust

data analytics that democratizes and monetizes the immense amount of server-stored stagnant

data. Data analytics provides anomaly detection and relationship mapping that promotes

enhanced design and operations decision making. The value driver for the digital transition has

been largely focused on the innovative capabilities of machine learning, an artificial intelligence technique, and how these algorithms, with current computational power, are able to

leverage available data to provide operational efficiencies. The momentum for a digital workflow


connectivity is largely centered on the ability to aggregate disparate data sources to perform

actionable data analytics. The idea is to create a higher level of understanding and insights with

design and operational decisions. The analytic layer provides tremendous value by connecting

offset performance data, offset design data, and recorded lessons learned to build a robust

design plan where realized value is achieved.

The data mining that once took engineers weeks to perform can now be aggregated into

a single platform where not only the historic data is readily available, but also performance

curves, design selections, and recommendations, depending on the functionality of the

algorithms. Moreover, having a single-source platform with OpenAPI standardization allows third-

party participation to develop data analytic algorithms that can be integrated within the

platform.

From this perspective, data sources can be pulled from anywhere to develop more robust

design workflows. For example, IHS Rushmore data for drilling performance can be pulled from

any participating operator. The dashboard can be compared against other offset data sources to

understand the appropriate design decisions – this is an immediate feedback response on how a

design proposal compares against industry, which will expedite the design process. Finally, these

same methodologies can be applied to the respective geology, where any current or past

operations performance issues can be linked to specific lithology layers or with specific casing

equipment to provide an alert within the platform as a potential risk. The power behind data

analytics greatly reduces the experience required to develop complex engineering designs and

allows organizations to build more resilient trust with proposed designs in order to vastly improve

both the quality and design lifecycle of project developments.
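As a small, concrete example of the anomaly detection described above, the sketch below flags unusual values in a streamed drilling parameter using a rolling mean and standard deviation (a z-score test). The channel, window size, and threshold are illustrative assumptions; production systems would use far richer models and offset data.

```python
# Minimal rolling z-score anomaly detector for a streamed drilling channel
# (e.g., standpipe pressure). Window size and threshold are assumptions.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(samples, window=20, threshold=3.0):
    history = deque(maxlen=window)
    flagged = []
    for i, value in enumerate(samples):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                flagged.append((i, value))   # candidate anomaly for review
        history.append(value)
    return flagged

# Synthetic example: steady signal with one injected spike.
signal = [200 + 0.5 * (i % 5) for i in range(100)]
signal[60] = 260
print(detect_anomalies(signal))   # -> [(60, 260)]
```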

Governance Layer

The idea of the governance layer is to build objective reasoning around internal and

regulatory standards for the design criteria. Dependence on a single engineer or a group of engineers to review all the governing design standards and specifications is an unnecessarily time-consuming and risk-prone endeavor. Having the criteria coded into a reasoning algorithm

that can identify potential issues is a more robust strategy. These types of assurances can then

be automated and streamlined to provide verification and validation on the design specification


of a well or field development, enabling engineers to spend more time focusing on innovating

ideas instead of reviewing documentation to ensure design criteria is met. Ontological semantics

language (OWL) can be leveraged to build relationship mapping between structured design

reasoning and regulatory standard specifications. This layer will alleviate the inherent risk an

organization accepts with mechanical processes and assurances to catch design and operational

issues. Verification and validation (V&V) will still need to be performed by a group of expert

engineers, but the focus may be redirected to unique issues and circumstances requiring the

attention of engineers, thereby saving needless time spent on mundane assurance requirements

and checklists.
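A minimal sketch of coded design governance is shown below: a few machine-checkable rules evaluated against a proposed well design, returning findings for engineer review. The rule thresholds and field names are invented placeholders, not actual internal or regulatory standards.

```python
# Minimal sketch of a coded governance layer: rules are plain functions that
# inspect a proposed design and return findings. Thresholds and field names
# are hypothetical placeholders, not real design or regulatory standards.

def check_kick_tolerance(design):
    if design["kick_tolerance_bbl"] < design["min_kick_tolerance_bbl"]:
        return "Kick tolerance below the specified minimum"

def check_casing_design_factor(design):
    if design["burst_design_factor"] < 1.1:   # placeholder threshold
        return "Burst design factor below placeholder limit of 1.1"

RULES = [check_kick_tolerance, check_casing_design_factor]

def run_governance_checks(design):
    findings = [msg for rule in RULES if (msg := rule(design))]
    return findings or ["All coded checks passed; route to engineer for V&V"]

proposed = {
    "kick_tolerance_bbl": 45,
    "min_kick_tolerance_bbl": 50,
    "burst_design_factor": 1.25,
}
for finding in run_governance_checks(proposed):
    print(finding)
```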

4.3.1.2 Design Workflow Automation

The previous section analyzed improved design opportunities through knowledge

democratization and enhanced data analytics, but there are other inefficiencies within design

workflows that absorb time and energy away from important engineering decisions. These

inefficiencies are generated from coordination and communication requirements within a

project system. Reports, presentations, emails, meetings, and other communication mediums for

updates and phase gates consume a substantial amount of time. The other objective of a cloud-

based, single-platform system is to build accessibility and transparency to reduce coordination

efforts and provide an opportunity for information output standards to be automated. The

journey toward workflow optimization requires that all repetitive tasks become automated –

this not only reduces the time spent on each task, but also greatly reduces the likelihood of

communication and coordination errors. Each progress update communicated via presentation,

report, meeting, or email consumes time, and when this is repeated across a large organization

and across multiple functions, this can accumulate to a significant portion of the project cost.

The idea of having a single platform for a digital workflow means that digital applications

may be connected to develop structured communication reports. Project phase presentations

can be automated to include required model and offset outputs that provide the V&V for project

decisions. Automated design and economic reports provide cross-functional accessibility to

reflect the current position of the project at any point in time during development. This means

leaders will be able to access these files and stay up-to-speed without engineers creating updated


presentations and communication reports to keep leaders informed about project decisions and

forecasts.

Automated communication workflows provide significant savings for an organization.

With the aggregation of all the project design data into a single system, workflows may now

adopt outputs that can be generated through API applications within the system. Additionally, all

engineers will have accessibility to cross-functional documentation that has traditionally been

communicated formally through structured meetings and emails. This presents a notable

opportunity for large organizations that utilize formal workflow milestones, reports, and

presentations. This initiative offers significant time reductions thereby enabling substantial

economic advantages by bringing projects to delivery much earlier than previously thought

possible. This digital initiative provides organizations with a huge competitive advantage when

considering discount rates required for project selections. Big deepwater projects can become

more competitive with short-term land projects by reducing design and development durations.

4.3.2 Operations

The future operations state relates to the digital techniques that offer advances in

performance during the execution phase of the oil and gas project. These initiatives impact the

physical control of the operation by either machine or human interface, as described by Levels 0-2 in Figure 11. Digital enhancements in operations can be separated into three increasingly complex categories: monitoring, advisory, and automation. First, monitoring

systems aggregate instrument data into a visual interface to trend, evaluate, and understand

system state. Second, advisory systems further utilize the same aggregated instrument data to

perform mathematical, case-based, or physical model comparisons in order to advise on

operational decisions for improved efficiencies. Advisory systems can be as simple as alerts on

process trends or as complicated as predictive cognition for evaluating the likelihood of future

events with recommended mitigation strategies. Finally, automation systems can be used either for machine automation of repetitive tasks (e.g., tripping pipe in the hole) or for process automation (e.g., pump or other parameter control in response to operational changes), with

some gray area in between, which will be discussed in a later chapter. Machine and process

automation are not new – both techniques have been leveraged in many industries – but due to


the dynamic nature of drilling operations, this is a new area of application. For the purpose of

this thesis, most of the discussion around automation innovations is focused on process

automation, in light of the advent of data modeling and data analytic techniques that, with sufficiently low processing latency, are enabling reliable real-time control of drilling and process operations.

The instrument sensor data quality, availability, and process latency requirements are

significantly different for each operational control hierarchy. Most of the current trends and

adopted innovations have focused on the advisory control, as this provides significant efficiency

gains with minimal operational disruptions. Instrument sensor data are processed and analyzed

in real-time, both on-site and at remote operating centers, as shown in Figure 19. The real-time

processing becomes increasingly powerful with respect to the volume of quality data and with

the speed at which data is able to be analyzed. The value of data-driven insights diminishes with

increasing control-response delay and with reduced reliability of the control response itself caused by insufficient data quality. Data quality issues can originate either from the sensors in operations or from the analytic algorithms, which may ingest other external data for processing.

The objective of the advisory system is to ingest real-time data, offset performance data,

and/or modeling data to develop enhanced insights through relationship mapping and anomaly

detection that far surpass the ability of a human observer. Drilling, completions, and intervention

operations are extremely dynamic processes, where it is sometimes difficult to understand the

current operational state and to optimize accordingly. Organizations are leveraging these new

analytic techniques to deliver wells more safely and more efficiently than before. And, these

techniques can be linked with the design processes and expert modeling platforms to provide

more robust insights on operations. It is the interoperability of systems that provides the highest

value; the more high-quality information available, the better the performance of the machine

learning algorithms. For this reason, it is always important to focus on the interoperability and

connectivity of these systems within a digital portfolio. This section of the thesis will review

operational innovations and how these systems can be linked with the future state of the design

workflow processes.


Figure 19: Operations Edge Computing Data Flow Architecture (“Oil and Gas at the Edge | Automation World” n.d.)

4.3.2.1 Monitor and Advisory

Monitoring and advisory systems are connected to real-time instrument data to develop

actionable operational insights. These insights currently leverage mathematical equations to

optimize performance efficiencies, and the algorithms can be integrated into an onsite

computational system (Edge Computing), cloud-based system, or at remote operating centers.

Figure 19 shows the data flow process for Edge Computing and for office or RTOC advisory

computing. These techniques are gaining traction within the oil and gas industry as they are proving

to exhibit “10x return” (Reference: Research Interview) on investment, and they are minimally

invasive to the operation. Many digital companies advertise these monitoring and advisory

capabilities; however, from a holistic perspective, few are creating alliances and partnerships to

integrate into more powerful and sustainable systems.

The future state of advisory systems is to integrate operations with design platforms and

expert modeling systems for descriptive and predictive analytics. The caveat is that most current

digital infrastructures and modeling techniques have latencies that are too high to

support real-time operational control. And, this becomes even more of a challenge with transient

models that are required for process automation – this will be discussed in the Closed-Loop


Automation section of this thesis. The physics-based mathematical models leveraged to optimize

energy efficiency for improved drilling performance have proven to be an effective technique.

These efficiency calculations hold true in most environments with the respective updated system

parameters. The evolution of advisory systems is to have integrated physics-based modeling

systems that allow the entire system state to be better understood. The drilling environment is

both dynamic and complex with varying lithologies, pressures, temperatures, hole size,

inclination, depth, etc. This operational environment is difficult to truly optimize without the

incorporation of advanced physical modelling into the data analytics layer of advisory control.
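A widely used example of the energy-efficiency calculations referenced here (and in the earlier operations assumptions) is mechanical specific energy (MSE). The sketch below implements the commonly cited Teale form of MSE in consistent SI units; the example inputs are arbitrary illustrations, not field data.

```python
# Mechanical specific energy (MSE) in the commonly cited Teale form,
# using consistent SI units. Example inputs are arbitrary illustrations.
import math

def mse_pa(wob_n, bit_diameter_m, rpm, torque_nm, rop_m_per_hr):
    area = math.pi * (bit_diameter_m / 2) ** 2           # bit face area, m^2
    rev_per_s = rpm / 60.0
    rop_m_per_s = rop_m_per_hr / 3600.0
    thrust_term = wob_n / area                            # axial contribution
    rotary_term = (2 * math.pi * rev_per_s * torque_nm) / (area * rop_m_per_s)
    return thrust_term + rotary_term                      # Pa (J/m^3)

# Example: 8.5 in bit, 90 kN WOB, 120 rpm, 10 kN·m torque, 50 m/h ROP
value = mse_pa(wob_n=90e3, bit_diameter_m=0.2159, rpm=120,
               torque_nm=10e3, rop_m_per_hr=50.0)
print(f"MSE ≈ {value/1e6:.0f} MPa")   # ~250 MPa with these inputs
# Advisory systems trend MSE against rock strength: rising MSE at constant
# parameters often signals inefficient drilling (e.g., bit balling, vibration).
```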

Other interoperability suggestions are to incorporate the aggregation of performance

data into real-time monitoring services. The connectivity of advisory platforms to design or

performance data enables optimization applications that suggest tool and parameter selections. There are many capabilities that can be leveraged if the systems are able

to integrate together, but the vision needs to be understood and communicated early in the

selection processes to ensure alignment on future innovation growth.

The suggested roadmap for integrating advisory tools into an operational environment is

as follows:

1. Incorporate real-time analytic algorithm services into operations via edge-based or cloud-based computing.

• The system should leverage physics-based mathematical algorithms to measure

system state.

• Focus on algorithm accuracy and historical return on investment.

• Evaluate sensor data quality and pre-processing algorithms for data ingestion into

the descriptive model.

• Focus on usability with user interface and user experience.

• Align with objectives to integrate offset performance data for visual comparison

during operations (define cause of performance deviations).

• Align on future objectives of incorporating expert modeling systems to provide

real-time comparison against a full system physical model (this can be referred to

as a Digital Twin).


2. Incorporate optimization applications into the digital platform to provide tool and

parameter advisory based on analytic offset performance data

• Connect design, tools, and operational decisions to performance.

• Develop optimization algorithms for operational advisory recommendations.

3. Connect with physics-based modeling systems to evaluate current state and predict

future state of operations

• Connect directly to physics-based models through real-time inputs or develop

proxy models with machine learning (a simple proxy-model sketch follows this list).
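Step 3 above mentions developing proxy (surrogate) models with machine learning when direct physics-based model calls are too slow for real-time use. The sketch below fits a simple polynomial surrogate to a stand-in physics function; the "physics model" is a toy placeholder, and a real surrogate would be trained on outputs of the actual engineering model.

```python
# Minimal surrogate-model sketch: fit a fast polynomial proxy to a slow
# physics-based model so it can be evaluated at real-time advisory rates.
# The "physics model" below is a toy placeholder, not a drilling simulator.
import numpy as np

def slow_physics_model(flow_rate):
    # Stand-in for an expensive hydraulics calculation (e.g., pressure loss).
    return 0.002 * flow_rate ** 2 + 0.5 * flow_rate + 30.0

# Offline: sample the expensive model and fit a cheap proxy.
samples = np.linspace(500, 5000, 50)              # pump rate, L/min (assumed)
responses = np.array([slow_physics_model(q) for q in samples])
proxy_coeffs = np.polyfit(samples, responses, deg=2)
proxy = np.poly1d(proxy_coeffs)

# Online: the proxy answers almost instantly inside the advisory loop.
q = 3200.0
print(f"proxy: {proxy(q):.1f}  vs  physics: {slow_physics_model(q):.1f}")
```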

4.3.2.2 Digital Twin

A digital twin is a representation of a physical entity within a digital model. The digital

twin is not exactly the model itself, but more the specific representation of the process, product,

or asset. The representation can describe the characteristics, attributes, and behaviors of the

physical system (van Schalkwyk 2019). The specificity of that instance is driven from the sensor

connectivity that defines an individual model to represent the current state of that entity. The

advent of digital twins has been driven by more economically available sensors and by improved

data computational power. The aggregation of industry instrument sensors connected into a

cloud-based environment is referred to as the Industrial Internet of Things (IIoT). The processing of time-series and real-time IIoT data into physics-based or analytic-based models creates the

holistic view of the modeled asset – this modeled instance is the digital twin (“The Promise of a

Digital Twin Strategy,” n.d.). Industry has leveraged IIoT cloud-based connectivity to develop

digital twin platforms to create visualizations and cognitive insights that provide advanced

business intelligence for operation management. The general architecture of a digital twin is

represented in Figure 20, which consists of a feedback loop between assets, data, models,

visualization, and response.


Figure 20: Digital Twin Architecture

Smart, connected equipment and products provide aggregated sensor data for advanced

predictive analytics. Machine learning can be applied to the large volume of data to perform pattern recognition and anomaly detection that provide business insights for predictive

maintenance, performance information, and design and operation mitigation strategies. Unique

relationships between operations and equipment can now be discovered through advanced

mathematical algorithms. The operation fingerprint (time-series) and accumulation of pressure

cycles, temperature cycles, actuations, revolutions, stresses, and strains can be correlated to

performance data to provide actionable insights to improve efficiencies. These time-series

correlations have been leveraged in analytical models for predictive maintenance where a model

instance for a specific piece of equipment or system carries a probability of overall health relative to its accumulated operational exposure. Companies like Veros Systems Inc. are applying machine learning to current and voltage waveforms to identify performance patterns of

rotating pumps and equipment.


Analytic models are not the only data application that creates digital twins. Physics-based

models can be utilized to represent system state through real-time data input from IIoT. Physics-

based models can represent system’s physical interactions with mechanical models,

thermodynamic models, or hydraulics models. The mechano-, thermo-, or hydro- representation

of a process or system can provide valuable insights on the current state, as well as, predict future

state. The advantage of a physics-based model is that parameter trending, as an extrapolation,

can be inputted into the model as a time-series. This methodology can predict, at current

operating trends, where a system state is heading, and what actions are needed to mitigate the

trending state, if needed. Physics-based digital twin models can be used for understanding

dynamic systems, like drilling operations. The digital twin initiative ties together with the future

design platform as a connectivity to expert, physics-based modeling systems, and to the cloud-

based advisory system as a source for aggregated data and to combine analytic-based and

physics-based models to evaluate system state.
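The sketch below illustrates, in highly simplified form, the feedback loop of Figure 20: sensor data updates a model instance, the model's expectation is compared with the measured value, and a persistent mismatch is surfaced as a possible equipment or state issue. The pit-volume balance, thresholds, and sample stream are invented for illustration and stand in for a real physics-based well model.

```python
# Highly simplified digital-twin loop: stream sensor data into a model
# instance, compare modeled vs. measured state, and flag divergence.
# The pit-volume balance used here is an illustrative stand-in for a real
# physics-based well model.

class PitVolumeTwin:
    """Tracks expected active pit volume from flow in/out (material balance)."""

    def __init__(self, volume_m3):
        self.expected_m3 = volume_m3

    def step(self, flow_in_m3, flow_out_m3):
        self.expected_m3 += flow_in_m3 - flow_out_m3
        return self.expected_m3

def run_twin(measurements, tolerance_m3=1.0):
    twin = PitVolumeTwin(volume_m3=measurements[0]["pit_m3"])
    for m in measurements[1:]:
        expected = twin.step(m["in_m3"], m["out_m3"])
        drift = m["pit_m3"] - expected
        if abs(drift) > tolerance_m3:
            # A sustained gain could indicate an influx; a loss, lost returns.
            print(f"t={m['t']}: measured pit differs from twin by {drift:+.1f} m3")

stream = [
    {"t": 0, "pit_m3": 80.0, "in_m3": 0.0, "out_m3": 0.0},
    {"t": 1, "pit_m3": 80.1, "in_m3": 2.0, "out_m3": 2.0},
    {"t": 2, "pit_m3": 82.5, "in_m3": 2.0, "out_m3": 2.0},  # unexplained gain
]
run_twin(stream)
```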

The digital twin initiative in oil and gas has been heavily driven by opportunities within

facilities and topsides equipment. Analytical models are used on historical data, real-time IIoT

data, and time-series IIoT data for business and operational insights. Equipment monitoring

algorithms can be repeated within a single system or across multiple systems to generate unique

representations of all operating assets. Figure 21 shows a generic O&G analytic digital twin data

flow architecture from IIoT data generation to insight analytics, and to actionable business

visualization and response. Different machine learning techniques can be applied to the sensor

data as shown in the Insights hub.


Figure 21: O&G Digital Twin Architecture

The future state of digital twins within well operations is to apply analytics to topside and downhole equipment for failure prediction, and also to apply system- or operational-state algorithms built on physics-based models. Well operations are dynamic, and the traditional control practice is to follow parameters outlined in model outputs delivered as Word documents or PDFs. This

creates a manual comparison protocol during operations without any guidance on how or when

to alter operations if deviations are experienced. The inclusion of physics-based models on a

monitoring and advisory platform to determine operation state will enhance the operator’s

ability to identify and address issues. Data analytic applications can be included to determine

similar trends in offset scenarios with a list of potential mitigation actions. However, there are

some limitations with steady-state physics-based models, as they can only describe specific steady-state points in an operation and not the transient, or in-between, states. Transient models have

not been readily used within Well operations or design environments, but they provide the

means necessary to understand the full dynamics of the system. The idea of this section is to

understand the future use and connectivity of the systems, and how they will interrelate to

develop advanced business insights.

Aggregating physics-based models and analytic-based models into an ensemble model

can lead the way to process automation. Closed-loop process automation, or system automation

without human intervention, requires processing speeds faster than operational changes. For


monitoring and advisory systems, steady-state models are sufficient, but as further control

mechanisms (process automation) are pursued, transient models will be necessary. For

automation opportunities the system needs to know more than just point A and B for an

operational state – the control system will need the pathway from A and B to ensure accurate

and optimized decision-making. This will be discussed further in the next section.

4.3.2.3 Automation

Well operations are challenged with increasingly complex environments with narrow geo-

operating windows and higher temperatures and pressures. The challenge further extends to the

economic viability of all types of wells provided the current commodity pricing conditions. And,

whether the strategy is to pioneer difficult environments or to reduce the well cycle-time, the

common denominator is to improve operational efficiencies and capabilities to become

profitable. Drilling automation or semi-automation provides the means to construct wells into an

optimized state. Referring back to Figure 9, process and machine automation principles provide the only solution with a theoretical capability of reaching the Technical Limit Well Time. Monitoring and advisory systems allow performance efficiencies that relieve some of the Invisible Lost Time (ILT), but due to the information, knowledge, and reaction-time gap between the human and the machine interface with the process control, complete optimization will never

be achieved.

For the purpose of this section, repetitive machine automation will not be discussed as an

integrative solution. Machine repetition only requires spatial and functional recognition to

complete the programmed task, and although dynamic responses are programmed conditionally

to mitigate against hazards, this type of automation does not require interoperability with the

physics-based models for control dependency like process automation. Repetitive machine

automation can be deployed in many areas of Well operations to improve efficiencies, such as

picking up pipe from the deck or making up bottom-hole assemblies. This is an important

initiative to improve operating time and reduce human exposure to lifting heavy equipment.

Machine automation still has dependencies on sensor quality, similar to the other digital

initiatives, but the focus for this section is on closed-loop or semi-autonomous drilling

automation. Drilling automation encompasses the dynamic control and response with respect to


surface equipment (top drive, rotary, and pumps) and downhole equipment (directional,

telemetry, and activation) as a complete system.

Automation initiatives have been applied across many facets of Well operations with

varying levels of success. Most applications have been directed toward surface control, but

innovations are now trending toward downhole automation. Successful examples in Well

operations include: rotary steering technologies, auto-drilling for WOB or ROP control, managed

pressure drilling (MPD) control systems, top drive control for pipe and casing tripping, safety

control, and monitoring (“Automated Drilling Gains Momentum in Offshore Operations |

Offshore” n.d.). Automation has gained momentum with high-repetition, low-dynamics tasks, with the goal of providing safety reflexes, task efficiencies, and enhanced decision support amid a distraction-polluted environment. As automation initiatives are challenged with increasingly complex tasks, the limitations of data quality and standardization are becoming more evident; this push to advance the core supporting infrastructure for automation is driving the evolution toward closed-loop automation opportunities.

The value of integrating automation into DC&I operations is realized through reducing non-productive time (NPT), improving the precision and speed of tasks, managing operating envelopes, and maximizing performance through entity and system optimizations. Note that automation has the potential to reduce both NPT and productive time (PT) of the operation. NPT is reduced by operating equipment within strict boundaries for actuation sequences and normal operations – this level of precision greatly reduces the strain on equipment and can prevent downtime. Additionally, NPT is reduced by early recognition of, and quicker reaction times to, deviations in the process. Automation can help mitigate trends that are deviating toward a hazardous environment. PT is reduced by improving the efficiency of task execution and maximizing operating speeds during pipe tripping and drilling operations. Automated drilling can respond to different lithologies and downhole environments to optimize the mechanical specific energy for improved rate of penetration. Automation of trajectory control can improve the homogeneity of the well curvature, which reduces the potential for stuck pipe, and, because of minimum curvature calculations, continuous closed-loop directional control can help with hitting targets more accurately. Employment of automation techniques in other industries has produced 20-30% operational improvements (de Wardt 2019), which is a substantial amount when extrapolated to the entire industry. Another value driver of automation is that these opportunities reduce human exposure to machinery and hazardous environments. This includes both removing human presence and reducing the likelihood of equipment exceeding specified operational boundaries or operating outside pre-determined sequences, which holistically reduces the overall system risk.
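To make the mechanical specific energy (MSE) point concrete, the sketch below estimates MSE from surface drilling parameters using the commonly cited Teale formulation; the function name and the example input values are illustrative assumptions rather than field data or a method drawn from this thesis's sources. An automation system would adjust weight on bit (WOB) and RPM to drive MSE down toward the rock's intrinsic specific energy, improving rate of penetration.

import math

def mechanical_specific_energy(wob_lbf, rpm, torque_ftlb, rop_ft_hr, bit_diameter_in):
    """Estimate MSE (psi) from surface parameters using Teale's equation:
    MSE = WOB/A + (120 * pi * RPM * Torque) / (A * ROP),
    with bit area A in square inches and ROP in ft/hr."""
    bit_area = math.pi * (bit_diameter_in ** 2) / 4.0
    axial_term = wob_lbf / bit_area
    rotary_term = (120.0 * math.pi * rpm * torque_ftlb) / (bit_area * rop_ft_hr)
    return axial_term + rotary_term

# Illustrative values only: 30 klbf WOB, 120 RPM, 8,000 ft-lb torque, 60 ft/hr ROP, 8.5-in bit.
print(round(mechanical_specific_energy(30_000, 120, 8_000, 60, 8.5)), "psi")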

Process automation aims to address four focal points for successful deployment: detection of state (physics-based and analytic-based models), equipment mechanical boundaries (safe operating conditions), process operational windows (the boundaries of the safe system state), and response sequences (safely and effectively controlling response variables to account for either planned or unplanned deviations) (Cayeux et al., n.d.). These opportunities create realized value and deliver a competitive advantage to the organization. However, due to the complexity around standardization, interoperability, and organizational competency, if automation is an objective of an organization's digital portfolio, these objectives and subsequent innovations must be aligned with business partners. It is important to note that NPT during operations is not solely caused by issues associated with rig performance, as equipment can exhibit poor design quality and suffer from a lack of V&V oversight. Automation and enhanced data analytics can assist with developing robust equipment and tools that are utilized during operations. This is why selecting business partners and alliances is so important – operations are only as strong as the weakest link in the process. Developing these partnerships and alliances in pursuit of the intermediate goal of automating repetitive tasks and creating semi-autonomous processes will help pave the way to understanding the infrastructure and standardization requirements for developing a fully autonomous system. Three major barriers to employing automation opportunities will be discussed in this thesis: system architecture standardization and interoperability, sensor data quality and accessibility, and physics-based modeling techniques.
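As a purely illustrative sketch of the four focal points described above, the snippet below checks an estimated system state against equipment and process boundaries and selects a response sequence. The limit values, variable names, and the simple rule-based response are assumptions made for illustration, not a production control design from the cited sources.

from dataclasses import dataclass

@dataclass
class Limits:
    # Equipment mechanical boundary and process operating window (illustrative values only).
    max_standpipe_pressure_psi: float = 5_000.0   # equipment mechanical boundary
    pore_pressure_eq_psi: float = 3_200.0         # lower process-window boundary
    fracture_eq_psi: float = 4_600.0              # upper process-window boundary

def select_response(estimated_downhole_pressure_psi: float,
                    standpipe_pressure_psi: float,
                    limits: Limits) -> str:
    """Map the detected state plus boundaries to a response sequence (planned or corrective)."""
    if standpipe_pressure_psi > limits.max_standpipe_pressure_psi:
        return "reduce pump rate: equipment boundary exceeded"
    if estimated_downhole_pressure_psi >= limits.fracture_eq_psi:
        return "reduce ECD: approaching fracture boundary"
    if estimated_downhole_pressure_psi <= limits.pore_pressure_eq_psi:
        return "increase backpressure: approaching pore-pressure boundary"
    return "continue planned sequence"

# Example: estimated downhole pressure near the upper window triggers a corrective response.
print(select_response(4_650.0, 4_200.0, Limits()))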

Standardization for efficient and effective software connectivity and data exchange is critical for developing automation opportunities. Interoperability is the term used to describe the ability of software systems to work in conjunction and exchange data. OPC-UA is an interoperability standard in the automation space that promotes platform independence, enhanced security, simplified architectures, scalability, future-proofing, and easy deployment (“What Is OPC? - OPC Foundation” n.d.) (see Software Architecture section for more details). The complexity associated with aggregating software systems and connecting to entire rig control systems for process automation dictates the need for an open industry approach to standardization. The evolution to system automation would be far too expensive and time consuming if each rig system, and each subsequently upgraded system model, had its own operability protocols. Standards commonality creates the building blocks for innovation to thrive and deliver industry-wide realized value. The initiative needs to make economic sense for both the operator and the service providers. If software and connectivity development to control systems for process automation required extensive programming and testing for each individual mechanical system, no one would have the economic means to develop it to its full potential. However, the value at stake for automation is high enough that competitors are influenced to continue working in silos – this is an example of the Prisoner’s Dilemma (Game Theory), where individual parties would rather work for their own self-interest to protect themselves than produce an optimal outcome. Therefore, standardization and interoperability initiatives need to be created through partnerships and alliances that make it worthwhile for companies to work together. Committing to standardization or, as an O&G operator, requiring a specific standard will not only influence service providers to contribute in that domain, but will also allow their efforts to achieve higher economic feasibility due to the shared network of applicability.
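To illustrate what OPC-UA interoperability looks like in practice, the sketch below reads a single value from an OPC-UA server using the open-source python-opcua client library. The endpoint URL and node identifier are hypothetical placeholders for a rig control gateway; this is a minimal sketch under those assumptions, not a rig-vendor integration.

from opcua import Client  # assumes the python-opcua (FreeOpcUa) package is installed

ENDPOINT = "opc.tcp://rig-gateway.example.com:4840"  # hypothetical rig control gateway
NODE_ID = "ns=2;s=TopDrive.Torque"                   # hypothetical node in the server's address space

client = Client(ENDPOINT)
client.connect()
try:
    # Because OPC-UA standardizes the information model, the same client code can
    # read from any compliant server regardless of the equipment vendor.
    torque_node = client.get_node(NODE_ID)
    print("Top-drive torque:", torque_node.get_value())
finally:
    client.disconnect()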

Data quality standards increase as control trends toward semi-autonomous or closed-loop automation. The tolerance for data accuracy and quality issues diminishes as the human interface is removed from system control. The objective is to understand the necessary control requirements and the operational tolerance for errors. Sensor model and quality standards can be adopted through industry automation recommendations or through improved data-ingestion V&V filtering techniques. It is important to note that, in the specific case of a drilling system, data transfer to and control of downhole equipment occurs through mud-pulse telemetry. That data quality and communication latency could be the weakest link that needs to be addressed before assuming further autonomous control of the system process. The systems approach is to understand the holistic view of the problem and to develop solutions that start bridging the gap to the ideal state. Sensor quality and data accuracy is an evolutionary gap that will take time to close, but communicating the objectives to vendors to influence sensor selection and higher-quality data telemetry will drive energy in the right direction toward achieving innovation and value for digital initiatives.
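A minimal sketch of the kind of ingestion-side V&V filtering mentioned above is shown below, assuming a pandas time series for a single sensor channel. The column name, valid range, and maximum step change are illustrative assumptions; the point is that flagged samples can be handled differently by advisory versus closed-loop consumers.

import pandas as pd

def filter_sensor_channel(df: pd.DataFrame,
                          column: str = "flow_rate",     # assumed column name
                          valid_range=(0.0, 1_500.0),    # assumed physically plausible range
                          max_step=200.0) -> pd.DataFrame:
    """Flag out-of-range values and implausible step changes before models consume the data."""
    out = df.copy()
    in_range = out[column].between(*valid_range)
    step_ok = out[column].diff().abs().fillna(0.0) <= max_step
    out["quality_ok"] = in_range & step_ok
    return out

# Illustrative usage with synthetic data: the spike and the sudden drop are flagged.
data = pd.DataFrame({"flow_rate": [480.0, 485.0, 9999.0, 490.0, 120.0]})
print(filter_sensor_channel(data))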

The last concept to discuss with automation initiatives for process control is physics-based modeling. The aggregation of sensor data such as temperature, pressure, depth, and flow rate does not necessarily provide an understanding of system state without the integration of physics-based modeling. And, without an understanding of system state, there can be only limited automated process control. As discussed earlier, for a system to assume full control, the control system needs to understand both the steady-state and the transient state of the system. Model processing, either directly or indirectly through proxy analytic models, must exhibit latencies low enough to allow operational control against process changes. The challenge within industry is to shift design and operational competency from utilizing only steady-state models for DC&I design to services that offer transient capabilities. Additionally, efforts toward improving integration (refer to the single-platform design workflow) and reducing the processing latency and run times of models are needed to match the level of desired process control.
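As a simple illustration of why sensor data alone does not reveal system state, the sketch below estimates a circulating bottomhole pressure by combining the standard oilfield hydrostatic relation (0.052 x mud weight in ppg x true vertical depth in ft, giving psi) with a friction term that would normally come from a hydraulics model. The friction value and the example inputs are placeholder assumptions; a transient model would be required for tighter, closed-loop control.

def static_bottomhole_pressure_psi(mud_weight_ppg: float, tvd_ft: float) -> float:
    """Hydrostatic pressure from the standard 0.052 conversion (ppg and ft to psi)."""
    return 0.052 * mud_weight_ppg * tvd_ft

def estimated_bhp_psi(mud_weight_ppg: float, tvd_ft: float,
                      annular_friction_psi: float) -> float:
    """Steady-state circulating bottomhole pressure = hydrostatic + annular friction losses.
    The friction term is a placeholder for the output of a hydraulics model."""
    return static_bottomhole_pressure_psi(mud_weight_ppg, tvd_ft) + annular_friction_psi

# Illustrative values: 12.5 ppg mud, 15,000 ft TVD, 350 psi modeled annular friction loss.
print(round(estimated_bhp_psi(12.5, 15_000, 350.0)), "psi")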

Complete closed-loop automation is the only avenue to achieve true system optimization.

This endeavor may or may not be worth the investment, depending on the economic advantages

from development and adoption for the specific system. As standardization and interoperability

initiatives trend toward open industry alliances, the economic advantages become more

apparent. DC&I operations are a complex and highly dynamic environment that can realize substantial efficiency benefits from automation, but these same environmental conditions contribute to the barriers that challenge innovation breakthroughs. The next section discusses the identified gaps to achieving realized value in the transitional shift to the future state of digitization.

4.4 Identified Gaps

The systems methodology employs a holistic approach to identifying the barriers and

associated gaps to achieve the determined ideal future state. The process of mapping the interconnectivity and feedback relationships between systems, tools, and stakeholders provides

objective insights and opportunities to create robust and innovative solutions. This section of the

thesis will categorically explain the identified barriers to achieving realized value with

organizational digital initiatives for DC&I design and operations. These categories should be

reviewed as specific areas to challenge and question when selecting partnerships and alliances

with digital services.

4.4.1 Human to System Integration

Human capabilities and organizational competencies are among the largest barriers to deploying successful digital initiatives. Scaling transformative digital initiatives across the enterprise is challenging both on the frontend architecture, with respect to employees' abilities and preferences toward usability, and on the backend architecture, with respect to infrastructure and integration development. Each type of software and digital system requires a different level of digital competency to become a power user. Key questions to brainstorm and understand when selecting a digital service are as follows:

• Do you currently have the core competency to effectively use the selected software?

• What is the intended relationship of your employees with the software interface?

• Do you currently have the required talent for successful development, deployment, and

maintenance of transforming digital initiatives?

• Are you able to afford or attract the talent necessary to be successful?

• Has this digital platform been leveraged for the same function and scale as intended? Or,

will the organization have to further develop to scale and to full functionality?

These are just some of the questions that may be asked. Ultimately, though, recognizing these barriers means understanding current organizational capabilities and taking mitigation measures not to overcommit to digital platforms that exceed those capabilities. Capabilities are all relative to the expectations of the human interface requirements within the organization. The expectations of IT, engineers, and operators, as well as the support structure from the service provider, should each be evaluated separately.

Another topic that has been discussed is how to develop a data-driven culture. Traditional engineering is derived from physics-based models that define the specifications and boundaries of designed systems. Changes to designs are influenced by experience, events, or novel ideas; however, with the integration of data analytics, new discoveries from pattern recognition and anomaly detection technology will provide an enhanced understanding of the system. What is critical about data collection is the sound use of statistics with respect to statistical significance, data biases, causation, and correlation. These principles need to be

properly understood to ensure appropriate decisions are being implemented into operations that

add value to productivity. Engineers must both understand the capabilities and the limitations of

the data analytic tools before they are able to apply the discoveries effectively in the design and

operations process.

Addressing the usability of the software services, whether it is for the design engineer or

the operator in the field, is critical for successful deployment within the organization.

Understanding and addressing the usability and capabilities early in the software’s deployment

will dictate how well it will be adopted and scaled across the organization. Asking an employee

to adopt new tools and processes is disruptive to their workflow and will produce resistance and

rejection if the new tool or process does not add equivalent realized value to the employee or

project.

The last point with respect to employee competency is training. For IT and engineers to

have productive discussions that influence transformation, they need to speak the same

language. IT professionals must have a better understanding of the process engineering and

workflows, and the engineers must have a better understanding of the data system infrastructure

and capabilities. When the holistic system is understood by all stakeholders, realistic

expectations are set, and higher productivity is achieved. And, due to the evolutionary trend of digital progression, early deficiencies in the required digital competencies can lead to poor decisions and error propagation throughout new designs and workflows.

4.4.2 Sensor & Data Quality

The accuracy necessary to monitor and control a system is determined by the acceptable tolerance for error. Highly dynamic systems require higher-frequency and higher-quality data to achieve effective control of the process, whereas simple and slower system processes require a lower data frequency and can potentially tolerate larger deviations because of the delayed reaction response. Additionally, higher levels of control, such as automation versus advisory, require higher-quality and higher-frequency data. These levels of model error tolerance need to match the level of control instilled into the digital system or platform. All data quality issues associated with the holistic process should be managed with intent, from sensor selection in equipment and tools to incomplete data sets leveraged in historians (Cayeux et al., n.d.). To be sure, this thesis does not recommend a revolutionary change; it suggests the creation of a roadmap that defines the data limitations relative to the process control objectives, and that organizations work with service providers and IT professionals to invest in equipment and software infrastructure designed to meet the digital strategies of the organization.
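As a back-of-envelope illustration of matching data frequency to the level of control, the sketch below applies a common control-engineering rule of thumb that the sampling interval should be a small fraction of the fastest process dynamic being controlled. The oversampling factor of 10 and the example response times are assumptions for illustration, not requirements taken from this thesis's sources.

def required_sample_rate_hz(process_response_time_s: float, oversample_factor: float = 10.0) -> float:
    """Rule-of-thumb sampling rate: sample several times faster than the fastest dynamic of interest."""
    return oversample_factor / process_response_time_s

# Illustrative comparison: a slow advisory-level tank-level loop versus a fast pressure-control loop.
for name, response_s in [("tank level (advisory)", 60.0), ("managed pressure control", 2.0)]:
    print(f"{name}: >= {required_sample_rate_hz(response_s):.2f} Hz")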

4.4.3 Data Accessibility

Digital investments should focus on organizational data accessibility. The time required for an engineer to perform data mining and data processing in order to reach robust engineering decisions should be a major organizational concern. The accessibility of quality data

can rapidly advance both design and operational decision making. The barrier to data accessibility

is the lack of quality backend platform infrastructure and standardization. The end goal,

holistically, is understanding how to get the right data to the right people to make business driven

decisions. If that goal is well understood and communicated, then the direction of digital

investments and digital functionality should align with achieving the desired organizational data

accessibility. This means that the organization will have to accept standardization across

disciplines and functions to allow for the shared connectivity of data and digital platforms.

The pursuit for data accessibility calls for innovative ways to store, share, and

communicate data, where accessibility is intuitive and maximizes productivity efforts. The O&G

industry has been notoriously siloed to protect competitive advantages, but innovations in open

platforms have demonstrated that collaboration efforts are creating more powerful and efficient

solutions. Opening the barriers and inviting others to review working data (versus final results)

and data repositories is a necessary vulnerability and discomfort that sparks holistic thought and

innovation. This initiative is as much of a technology shift as it is a cultural shift.


4.4.4 Standardization and Interoperability

Interoperability gaps refer to developing digital solutions as partnerships and as alliances,

both internally and externally. This references the need for organizations to design digital

solutions as a cross-functional effort to ensure that data transfer and data integration

opportunities are preserved for future sustainability. Interoperability efforts include creating

standardization protocols across the organization to ensure developed applications are able to

connect together for synergistic benefits. Standardization should be pursued as an industry as a

whole, to create an environment where service organizations can create solutions that are

applicable to all operations and working platforms.

Innovative breakthroughs to difficult problems are developed through incremental improvements over the work of predecessors, given the right combination of circumstance, character, education, and intelligence (Chollet, n.d.). If we accept that great innovations are derived from the efforts and momentum of others, then we can conclude that collaborative environments expedite the development of technological advancements. The recent explosion of data analytic capabilities and applications was founded on the development of increased computational processing, improved internet capabilities, and cost-efficient IoT devices – none of which were developed specifically by data analytics providers. Given that collaborative environments are creating a new benchmark for speed of development and innovation, remaining in a siloed operation will quickly result in a suppressed competitive environment for engineering design and operations.

4.4.5 Return on Investment

The hypothesis that digital initiatives are entering a modern “productivity paradox” suggests that organizations' efforts and direction are not producing optimal realized value. Commitment to “digital” without thorough benchmarking and evaluation of system value emergence can contribute to the sanctioning of ineffective projects. This includes a potential lack of understanding of the time, headcount, competencies, and culture required to develop and deploy digital initiatives at scale. The holistic focus needs to be value driven and not digitally driven, with the philosophy of value creation through digital tools and platforms. The future state of digital goals should be generated in collaboration with each digital investment, related to its value and capability contribution to the roadmap.

Project management metrics, KPIs, and dashboards are critical for understanding the

evolution of performance with the adoption of digital initiatives. Detailed tracking and

benchmarking will help to understand the value and gaps associated with developing and scaling

of changes throughout an organization. Analytics on these details will sharpen the expectations

for sanctioning future projects and help to avoid investing in projects that show historic difficulty

in implementation. This thesis suggests approaching the digital evolution with strict adherence to value creation. Leveraging digital initiatives that have a clear contribution to future goals and a proven track record of success is a clear way to avoid ineffective programs.
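A minimal sketch of the kind of tracking and benchmarking described above is shown below, assuming a simple per-well record of productive time (PT), invisible lost time (ILT), and non-productive time (NPT) hours; the well names and figures are hypothetical and serve only to show how a before/after KPI comparison might be computed.

import pandas as pd

# Hypothetical benchmarking records for wells drilled before and after a digital initiative.
wells = pd.DataFrame({
    "well": ["A-1", "A-2", "B-1", "B-2"],
    "initiative_deployed": [False, False, True, True],
    "pt_hours": [520, 545, 470, 455],
    "ilt_hours": [60, 70, 40, 35],
    "npt_hours": [95, 110, 55, 60],
})

wells["total_hours"] = wells[["pt_hours", "ilt_hours", "npt_hours"]].sum(axis=1)
wells["npt_pct"] = 100 * wells["npt_hours"] / wells["total_hours"]

# KPI: average NPT percentage and total well time, grouped by deployment status.
print(wells.groupby("initiative_deployed")[["npt_pct", "total_hours"]].mean())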

4.4.6 Partnerships and Alliances

Discussions through interviews and market research suggest that partnerships and

alliances are critical for success. Organizations will have to reevaluate their partnerships for

software, equipment, and tool services to select providers that will innovate and contribute to

organizations’ future digital goals. The digital evolution exponentially improves with the network

effect of connected partnerships, and organizations that are resistant to change will fall behind

in achieving competitive advantage in the design and operational space.

Service providers are taking the initiative to create alliances among themselves to provide

single-platform services that include needed design and operational workflow systems. These

alliances are pioneering the way for new workflows and processes that improve collaboration,

reduce design cycle-time, and increase operational efficiencies through optimizations and

analytics applied across the aggregated data. This thesis recommends developing a culture that

is agile and open for change through adoption of new tools and techniques. Although change can

initially be burdensome, the process efficiencies gained through collaborative partnerships that

share the same values can be a game-changer for creating a sustainable competitive advantage.

4.4.7 DevOps

As described in Figure 17, the lack of understanding of the backend complexities to digital

developments poses a significant gap for creating realistic expectations between engineers and

IT professionals. This gap can be bridged by integrating engineers with IT professionals in collaboration groups to ensure stakeholders, needs, and expectations are better understood.

Additionally, digital training can be employed to develop the digital acumen and competency of

engineers to provide insight on the capabilities and complexities required to develop specific

functionalities. The development operations space is rapidly changing with the integration of

cloud computing and the subsequent transformation of digital workflows and partnerships.

Companies are dedicated to restructuring these developments to increase data accessibility and

interoperability of software platforms. This is another area where developing partnerships and

selecting organizations that understand how to design for future capabilities will provide

immense financial benefits.

The next chapter briefly reviews model-based systems engineering (MBSE) in order to

evaluate how traditional MBSE methods integrate with the new innovative digital workflows that

were presented in this chapter. Integrated engineering design and operational workflows have

been utilized for many years in other industries through system modeling techniques, but these

methods required in-depth programming skills to master. However, these principles can be used

to develop design governance and verification layers to incorporate effective MBSE values into

digital initiative developments.


5 Model Based System Engineering

Model-based systems engineering (MBSE) is a methodology for systematically managing design, analysis, verification, and validation through modeling applications (Hause and Ashfield 2018). System models are built to represent systems, subsystems, or components and to digitally map behavioral, performance, structural, and other engineering dependencies and requirements. System models are an accepted and utilized practice in the aerospace and defense industry and are gaining interest in other industries as increasing complexity becomes more difficult to manage. The International Council on Systems Engineering (INCOSE) is a professional

society that promotes and advances systems engineering, and is most recognized for their

published Systems Engineering Handbook (INCOSE 2015). The Object Management Group (OMG)

is an organization that develops standardization and specifications, most notably their Unified

Modeling Language (UML) for system design visualization and the extension or subset language

System Modeling Language (SysML) (Hause 2013). These two professional and standards

organizations account for many of the solution concepts that are currently employed in the MBSE

space.

The objective of this section of the thesis is to provide an understanding of the vision of

MBSE and how it relates to the digital initiatives in DC&I design and operations. INCOSE published

a Systems Engineering Vision 2025 (“INCOSE: Systems Engineering Vision 2025” 2014) to guide

the direction and evolution of systems engineering in response to rapid advancements in

technology across a diverse set of industry domains. The vision incorporates the extension

Systems of Systems (SoS) modeling to integrate the interconnection of other independent

systems and devices (IoT). The vision further suggests the integration into a “single, consistent,

[and] unambiguous” (“INCOSE: Systems Engineering Vision 2025” 2014) representation of the

system that provides transparencies and accessibility to all stakeholders over the full end-to-end

lifecycle of a system process. The integrated, digital engineering model enables rapid knowledge

representation of concept and designs. The INCOSE 2025 Vision aligns with the single platform,

multi-functional integration as outlined in the Future Digital State section and provides clear

guidance and considerations at the system level on how to effectively implement.


MBSE has been leveraged in complex engineering environments for many years using OMG UML and SysML modeling languages to represent system architectures and components. As MBSE continues to evolve with increasing system complexity and improving digital technologies, the question remains whether new modeling languages or more intuitive, user-friendly platforms will replace the traditional methods. The Department of Defense maintains the DoD Architecture Framework (DoDAF), which utilizes SysML and provides a comprehensive workflow for developing system models (“The DoDAF Architecture Framework Version 2.02” n.d.). This framework and methodology carry a historical track record of critical engineering experience and have been adapted through lessons learned and efficiency improvements, which provides a level of system reliability. Deviating from these proven methods does impose some inherent risk, but from a competency perspective, requiring organizations to adopt a new programming language might be a difficult barrier to overcome for widespread adoption and scalability.

Systems engineering can be adapted to support many types of applications, and perhaps

the oil and gas industry needs to adopt a fit-for-purpose methodology. The principles promoted

by INCOSE and OMG, as well as the lessons learned and best practices from NASA and DoD,

should be considered and integrated into the new oil and gas system. INCOSE has created an Oil

and Gas Working Group (“Oil and Gas Working Group” n.d.) to advance systems engineering

within the Oil and Gas industry. The steering committee is composed of representatives from

major oil and gas companies that are seeking to define the best practices and strategy for

integrating systems engineering into the oil and gas industry. The principles are similar to the

integrated design approach that creates a cross-functional collaborative digital environment that

hosts and visualizes the interdependencies across the full scope of a project system. The intent of the platform is to create a standardized protocol where additional applications can be integrated – including design, analysis, verification, and validation standards mapping as outlined by INCOSE. As these new system platforms are being developed, referencing the systems

experience from INCOSE will be a valuable resource to shape the functionality requirements for

robust engineering design.


The INCOSE Systems Engineering Vision 2025 provided a figure that represents the full

spectrum of the systems engineering methodology. Figure 22 shows the integration of concepts

that develop and contribute to the holistic approach for project engineering management.

Systems engineering is designed to ensure all system components and interdependencies are

designed to work together to achieve the objective of the whole system.

Figure 22: Systems Methods & Tools, Adapted from (“INCOSE: Systems Engineering Vision 2025” 2014)

The vision of the future model-based approach will encompass visualizations, designs,

requirements, architectures, reasonings, optimizations, and other pertinent design and

operational data. The model will allow for transparent knowledge communication through

unified navigation portals, both cross-functionally (horizontally) and within systems hierarchies

(vertically). This enables complex design insights and system emergent properties to be rapidly

evaluated and identified for efficient workflows and robust design changes. This integration will

shift the communication medium from siloed document dependency to transparent and

accessible virtualization.

[Figure 22 content: problem definition, systems synthesis, systems analysis, systems verification, and systems validation, supported by methods and tools including defining system value, stakeholder analysis, system decomposition, design structure matrices (DSM), system architecting, system models (OPD), tradespace modeling, tensions and tradeoffs, composable and platform design, system safety, interface management, V&V (V-model), risks and uncertainties, simulation and optimization, and project management.]

6 Systems Approach to Digital Architecture

This chapter reviews the process flow of data throughout an organization, from field sensors to office analysis, covering data standardization protocols, processing speeds, and data requirements. The intention of this chapter is to provide the details of the system architectural diagrams and standardization requirements that are fundamental to the core digital design attributes and decisions.

6.1 O&G Real-Time Data Architecture

The oil and gas data system architecture is the core enabler for digital innovation. As sensor technology continues to become more economical and data processing opportunities (e.g., machine learning) greatly improve, the oil and gas industry is transitioning to a digital oilfield (DOF). Understanding the process flow of data from field to office, where information can be collected, processed, and analyzed, will help to improve capabilities with respect to operational efficiencies. These operational efficiencies include automated workflows, remote operations, improved operational transparency for better decision making, reduced downtime, etc. SCADA (Supervisory Control and Data Acquisition) systems have been used for decades in the oil and gas industry to gather instrumentation data from producing assets and communicate it to a server for processing. Figure 23 represents a simple data flow architecture where operational data is processed through programmable logic controllers (PLCs) to a remote terminal unit (RTU) or SCADA system, from which the data can be transmitted via satellite or cellular to local servers or databases. The sensor data options described in Figure 23 are not limited to those examples, as there is a multitude of instrumentation from different manufacturers whose data can be transmitted. Further discussion will be presented regarding the number of stakeholders that can leverage the data for improved process efficiencies, where the case for data and data protocol standardization becomes more apparent.


Figure 23: General Field-to-Office Data Flow Architecture

Figure 24 further expands on the details of data communications, as well as some typical

sensors that are being monitored with respect to oil and gas production operations. This is a

specific case where sensor data can be routed to a PLC-RTU (either wired or wirelessly) and

communicated to a central server or database. As described earlier, this list of sensor types is not exhaustive, as it could also include any equipment that is run downhole or any other additional sensors installed in the operations process.

Figure 24: On-Shore Production Operations Data Telecommunications Architecture

[Figure content: field assets (drillship, land rig, horizontal drilling, hydraulic fracturing, pump jacks, production facilities, pipelines, storage, and midstream) feed sensor data (torque, drag, ROP, MWD, depth, pressure, temperature, flow rate, volumes, water cut, sand, vibrations, tank levels) and wellhead instrumentation (tubing head pressure and temperature, casing head pressure, choke position, flowline pressure and temperature, flow meters, tank meters, and test separation systems) through a PLC-RTU (programmable logic controller – remote terminal unit) to SCADA, historian, and database systems via satellite or cellular links. VSAT is most widely used in remote locations but has higher latency and smaller bandwidth than other options; cellular can only be used where coverage exists, such as the nearshore US Gulf of Mexico.]

Offshore operations are unique with respect to data transfer opportunities, given that they can either be in extremely remote environments or among existing facilities and infrastructure. Figure 25 depicts an offshore drillship communicating operational drilling and/or completion data. The most common form of data transfer from an offshore drilling vessel is via satellite or VSAT. This allows data to be transferred in remote locations but has limitations in bandwidth and data transfer rates. Microwave technology is usually only used if a vessel is within a short range of an existing facility where the data can be relayed to land/office locations. Existing facilities and infrastructure are typically wired with fiber optics, which is the optimal technology for data transfer. Offshore operations that are nearshore (Gulf of Mexico Shelf) may use cellular technology if the operation is within range.

Figure 25: Offshore Drilling Operations Data Telecommunication Architecture

Now that the general operational flow of data has been reviewed, the next step is to

review the data standardization (i.e., the format of the data as it is being transferred). The value

creation generated from the combination of high-volume, high-quality data, plus advanced

analytic algorithms is creating a large competitive advantage to early adopters. The critical

component of the digital journey is to create a foundation that is both value-capturing and

flexible toward breakthrough technology. Data standardization is a steppingstone to digitalizing

the oil field, where initial investments lead an organization to a vast amount of operationally

improving opportunities.

[Figure content: a drillship, riser, blowout preventer (BOP), and well below the mudline, with offshore communication options. VSAT is most widely used in remote locations but has higher latency and smaller bandwidth than other options; microwave offers higher bandwidth over limited distances when a vessel is in close proximity to other vessels, facilities, or infrastructure; cellular is available only where coverage exists, such as the nearshore US Gulf of Mexico; fiber is the optimal solution but requires cables run from point to point and is typically used from a supporting facility in conjunction with microwave from the rig; Low Earth Orbit (LEO) satellites, such as SpaceX's planned Starlink constellation, have the potential to rival or even exceed fiber optic speeds.]

Table 12: Oilfield Data Transfer Speeds per Technology

Technology          Typical Data Transfer Rate          Typical Latency   Range                   Characteristics
VSAT Satellite      1-6 Mbit/s                          500-650 ms        Global                  High cost
3G Cellular         0.1-8 Mbit/s                        0.1 s             Regional                Lowest cost
4G Cellular         15-90 Mbit/s                        0.05 s            Regional                Lowest cost
5G Cellular         150-200 Mbit/s                      0.001 s           Regional                Lowest cost
Microwave (WiMAX)   70 Mbit/s                           50 ms             Regional [~30 miles]    Low cost
Fiber               2.5 Gbit/s                          5 µs per km       Regional [120 miles]    Highest cost
LEO Satellite       100-120 Mbit/s [proven – Kepler]    30-50 ms          Global                  35x closer to Earth than GEO orbit satellite

Note: average rig data collection is 1-2 TB per day.

References:
(“What Is the Latency for Satellite Connectivity? - DTP” n.d.; “Download Speeds: Comparing 2G, 3G, 4G & 5G Mobile Networks” n.d.; “Microwave Link - Gigabit Microwave Connectivity” n.d.; “Optical Fiber’s Gigabit Bandwidth, 200 Km Range Attractive for Subsea Work | Offshore” n.d.; “Real-Time Latency: Rethink Possibilities with Remote Networks” n.d.; “Satellite Broadband Internet and Megaconstellations | Deloitte Insights” n.d.; “WiMAX Coverage and Speed | HowStuffWorks” n.d.; “Beyond the Barrel: How Data and Analytics Will Become the New Currency in Oil and Gas” n.d.; “The Maritime VSAT Advantage” n.d.)
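To put the rates in Table 12 into perspective, the short sketch below estimates how long it would take to move one day of rig data (roughly 1 TB, per the table's note) over each link at its typical sustained rate. The calculation ignores protocol overhead, contention, and compression, so the results should be read as order-of-magnitude figures only; they suggest why edge filtering and prioritized streaming matter on anything slower than fiber.

TERABYTE_BITS = 1e12 * 8  # 1 TB expressed in bits (decimal convention)

links_mbit_s = {           # typical sustained rates taken from Table 12
    "VSAT satellite": 6,
    "4G cellular": 90,
    "Microwave (WiMAX)": 70,
    "LEO satellite": 120,
    "Fiber": 2_500,
}

for link, rate_mbit_s in links_mbit_s.items():
    hours = TERABYTE_BITS / (rate_mbit_s * 1e6) / 3600
    print(f"{link}: ~{hours:,.1f} hours per terabyte")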

6.2 Data Standards

Data standardization across the multi-disciplinary value chain is critical for workflow

connectivity. The Standards Leadership Council (SLC) is a group of 11 members who work

together to build data standardization in a collaborative environment that promotes the

adoption of data standards in order to (1) avoid conflicting standards, (2) build synergy, (3) deliver

maximum value, and (4) harmonize workflows and semantics. The focus for real-time data streaming is Energistics; however, as the industry becomes increasingly collaborative, it is important to be aware of the other standards organizations. Software development around strict data standardization will ensure more reliable deployment and modularity within integrated systems. Four of the critical data standards organizations, all participating members of the SLC, are described below.

SLC Standards Leadership Council

Website: http://www.oilandgasstandards.org/about

Energistics Energistics is responsible for developing standards for the transmittal of data

related to upstream oil and gas operations and activities. This includes


interoperability requirements for both data transfer standards and data transfer

protocols. The common technical architecture for Energistics data standards that

allow for cross-functional workflows are WITSML (drilling), PRODML

(production), and RESQML (reservoir).

Website: https://www.energistics.org/

OPC The Open Platform Communications (OPC) Foundation creates and maintains

data transfer standards for industrial automation. The organization creates

specifications, certifies, and collaborates with industry to manage secure and

reliable, multi-vendor, multi-platform automation data transfer protocols.

Future collaboration efforts are in the works with DSATS (Drilling Systems Automation

Technical Section) of the Society of Petroleum Engineers (SPE) to build guidelines

around rig controls systems and a roadmap for Drilling Systems Automation.

OPC also works in collaboration with Energistics to build integration of real-time

data with drilling system automation and process controls. The drilling industry

is becoming more automated, and it is critical to build standardization and

connectivity between real-time data and automation.

Website: https://opcfoundation.org/.

PCA

[ISO 15926]

POSC Caesar Association (PCA) is an organization that develops and promotes

data interoperability standards with special focus on ISO 15926 and Web

services. ISO 15926 is an International Standard that facilitates data interface

representation among all domains within the lifecycle (i.e., engineering,

construction, and operation) of oil and gas processing plants. ISO 15926 provides

an information semantic model (ontology) for enterprise applications to

facilitate the interoperability of system engineering design, construction,

commissioning, and operation.

Website: https://www.posccaesar.org/.

PPDM The Professional Petroleum Data Management (PPDM) Association developed

data management standards for Exploration and Production (E&P) data. PPDM

created the PPDM Data Model to provide a standard data management solution for data representation for all acquired data in E&P. Data management and

metadata standardization has become increasingly important with the explosion

in volume and sophistication of E&P operational data.

Website: https://ppdm.org/ppdm.

6.3 Energistics

The oil and gas data transfer standards created from Energistics provide a true data

interoperability capability to improve multidisciplinary communications between the rig site,

service companies, and office-based professionals (Hollingsworth 2015). The idea of a single

software architecture standard is to provide a common interface to transmit, exchange, and

receive data. This allows for consistency across the workflow value chain, which drives efficiency

and performance (Khudiri et al., n.d.). The Energistics portfolio can be summarized by the following standards: WITSML, PRODML, RESQML, and ETP.

WITSML – Wellsite Information Standards Markup Language is an industry initiative to provide

consistent interfaces for all instrumentation and software in the digital oilfield (DOF) to

optimize operations. The current capabilities are related to wellsite-to-office data

transfers for all operational and equipment data. The initiative drives the importance for

all technologies to work together to provide a highly innovative platform that can lead to

advanced process and operation optimization. Utilizing this standard with ETP provides near real-time data to office engineers for improved efficiency and decision making, and even to third-party analytic software packages (“Adopting Data Standards: Cheapest Way to Survive Downturn | Rigzone” n.d.).

PRODML – Production Markup Language is a standard focused on the transfer of data from the

reservoir-wellbore boundary. This initiative is consistent with the evolving transformation

for smart/intelligence fields. The current capabilities include production volumes, well

tests, fiber optic distributed temperature surveys (DTS) and distributed acoustic sensing

(DAS), and PVT analysis. The data transfer standard applies to the full inventory of

downhole equipment and operations performed on a producing well.


RESQML – Reservoir Q Markup Language is an XML- and HDF5-based data exchange standard

that facilitates reliable, automated exchange of data among software packages used in

subsurface workflows. This standard addresses the complications associated with the

multiple software packages required to interpret, model, and simulate subsurface

environments. The intent is to build a data transfer environment that allows for flexible

workflows, productivity improvement, and clear data sharing paths for improved

efficiency.

ETP – Energistics Transfer Protocol – is a data exchange specification that enables the data

transfer between applications, including the full family of data standards – WITSML,

PRODML, and RESQML. This transfer protocol was specifically designed for the oil and gas

industry and supports both real-time transfers and historical data queries. This protocol

uses WebSocket, Avro, and JSON to transfer both real-time and static data from server to client (Hollingsworth 2015).
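As a minimal sketch of consuming WITSML-style log data once it has been retrieved from a store, the snippet below parses mnemonics and data rows from a simplified, hand-written XML fragment. The fragment is not a schema-complete WITSML document (namespaces, units metadata, and identifiers are omitted), and the element layout follows the commonly published log/logData/mnemonicList/data pattern as an assumption for illustration.

import xml.etree.ElementTree as ET

# Simplified, namespace-free WITSML-style log fragment (illustrative only).
WITSML_LOG = """
<logs>
  <log uidWell="W-001" uidWellbore="WB-01">
    <logData>
      <mnemonicList>DEPT,ROP,SPP</mnemonicList>
      <unitList>ft,ft/h,psi</unitList>
      <data>10000,62.0,3450</data>
      <data>10010,58.5,3470</data>
    </logData>
  </log>
</logs>
"""

root = ET.fromstring(WITSML_LOG)
log_data = root.find("./log/logData")
mnemonics = log_data.findtext("mnemonicList").split(",")
# Pair each comma-delimited data row with its mnemonics so downstream code is vendor-agnostic.
rows = [dict(zip(mnemonics, row.text.split(","))) for row in log_data.findall("data")]
print(rows)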

CASE STUDY 1: “WITSML is the Key Data Source for Automated Daily Drilling Reports”

Independent Data Services (IDS) developed an automated Daily Drilling Report (DDR)

application, DrillNet, to auto-populate daily activities, saving 1 to 3 hours of work each day. The

IDS DDR application leverages WITSML data to improve reporting accuracy and consistency,

which frees up personnel for other important responsibilities. IDS has termed the reporting

technique as Lean Automated Reporting (LAR). LAR can access data from a WITSML data store to

capture contextual and numeric data – this includes the ability to categorize state of operation

(OSD – operation state detection) in order to populate relevant parameters. IDS is reporting that

more than 80% of the DDR is able to be automatically populated.

Standardization with reporting has multiple benefits:

• Reduction in inconsistencies and errors from reporting;

• Improvement in data quality for enhanced data analytics;

• Provision of real-time, natural language operation updates to office engineers; and

• Saving time and money by allowing highly paid personnel on the rig to focus on current

operations.


As service companies are developing software architectures based on real-time sensor

data from WITSML (Energistics) Protocols, digital solutions can become more plug-and-play

versus customized. This standardization incentivizes operators to adopt this framework to

improve capabilities for more cost-effective digital solutions (Energistics and Independent Data

Services n.d.).

CASE STUDY 2: “Maximizing the Benefits of Wellsite Data using WITSML”

Murphy Oil Company was challenged with the amount, variety, and velocity of data

created from the wellsite, and managing non-standardized data was time consuming, error-

prone, and costly to the organization. Murphy leveraged WITSML to standardize wherever

possible to build a consistent, streamlined digital platform for their wellsite. The WITSML system

integration is shown below. This standard data transfer protocol allows Murphy to (1) automate

data entry and reporting, (2) reduce QA/QC processes on data, (3) perform reliable and effective

data analytics for better decision making, and (4) open integration opportunities for Third-Party

service providers. Murphy projects that this initiative saved the company 10 to 15 hours per week

from the original data management processes. This initiative also provides Murphy with a digital

software architecture where service providers are capable of connecting and running detailed

analytics on real-time operations data. These opportunities provide significant value to

operational efficiency by vastly speeding up deployment time of new digital initiatives. And this type of digital solution spares Murphy from expending resources to develop tools in-house, which can sometimes be beneficial but is also challenging given the analytic capabilities of Silicon Valley.

6.4 Enterprise Architecture

The enterprise system architecture is the information technology (IT) foundation that

manages all evolving IT systems. The architecture manages data flow, accessibility, storage,

processing, and visualization, and is thus a fundamental contributor to business operations. The

systems methodology outlines the importance of understanding the dynamics and

interdependencies of all contributing systems and subsystems associated with the objective as a

whole. The primary focus of engineering innovations and improvements has been on the physical operation; however, with the capabilities associated with developing analytic and visualization techniques, directing new attention to IT infrastructure will create a greater economic advantage. The purpose of this section is to outline the enterprise architecture platform and the available tools in order to develop an understanding of the capabilities and stakeholders involved in creating digital opportunities within an organization.

As identified in the O&G Portfolio section above, many O&G companies are adopting

Microsoft (MS) Azure as the cloud-based system architecture for their organization. MS Azure

provides the data ingestion, filtering, processing, storage, and visualization capabilities within the

available suite package, but also allows for the opportunity to host other data applications

through an API Gateway. Figure 26 shows a high-level diagram of an enterprise architecture for connecting and processing disparate data sources to develop visualized business insights (“Azure Architecture Center | Microsoft Docs” n.d.). The intent of this thesis is not to analyze the specifics of the system architecture to develop a recommendation, but rather to offer MS Azure as an example of the system components and interdependencies of an IT architecture that must be managed and considered when developing and adopting new digital initiatives within an organization.

Figure 26: Enterprise Data Platform Architecture, Portions of Microsoft Data Platform Reference Architecture Image

(https://docs.microsoft.com/en-us/azure/architecture/example-scenario/dataplate2e/data-platform-end-to-end) used with

permission from Microsoft.

The MS Azure data platform architecture can be leveraged to understand the different

steps or applications required for data to flow from creation or storage to visualization and


business insights. The example architecture has two paths: Hot Path and Cold Path. These two

avenues are specific to the type of data being processed and to the type of processing

capabilities. The Hot Path originates from real-time streaming data, where reduced latency for direct visualization is the priority. The MS Azure architecture shows that the real-time streaming data is routed as directly as possible to the end user through Power BI.

However, the data is also directed to the Cold Path, where it is combined with disparate data

sources of structured, semi-structured, and unstructured data. The structured, semi-structured,

and unstructured data is ingested by the Data Factory, which acts as the pipeline and engine to

organize data into acceptable inputs for data analytics. Through the workflow enablement and

capabilities of Databricks, the data can then be processed for data analytics and machine learning

to provide enhanced business insights.
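A deliberately cloud-agnostic sketch of the hot-path/cold-path split is shown below: each streaming reading is routed both to a small low-latency window (standing in for the real-time analytics and dashboarding services) and to a batch store (standing in for the data lake that feeds Databricks-style processing). The function and variable names are assumptions for illustration; in an Azure deployment these roles would be played by the managed services named in Figure 26.

from collections import deque

cold_store = []                 # stands in for a data lake landing zone (batch analytics, ML)
hot_window = deque(maxlen=60)   # stands in for a low-latency stream backing live dashboards

def ingest(reading: dict) -> None:
    """Route each streaming reading down both paths, as in the hot/cold pattern."""
    hot_window.append(reading)   # hot path: keep only a recent window for real-time views
    cold_store.append(reading)   # cold path: retain everything for later enrichment and analytics

def hot_path_summary() -> float:
    """Example real-time insight: rolling average of the monitored value."""
    return sum(r["value"] for r in hot_window) / max(len(hot_window), 1)

# Illustrative usage with synthetic standpipe-pressure readings.
for i in range(5):
    ingest({"sensor": "standpipe_pressure", "value": 3400 + i * 5})
print("rolling average:", hot_path_summary(), "| cold-store rows:", len(cold_store))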

Other service providers might offer similar applications that are fit-for-purpose or prove to have higher capabilities than those currently offered in the MS Azure suite. And, due to the flexibility of the cloud-based host, these applications can be connected and integrated into the MS Azure platform to provide these beneficial services as needed. Understanding the

platform architecture helps with three fundamental questions: (1) What are current system

capabilities that can be leveraged to advance business insights? (2) How do new digital

applications integrate or replace current services for improved functionality? And, (3) what are

the identified functionality gaps that are specific to the needs of the organization or industry?

The importance of the systems approach is to understand the comprehensive system and its

interdependencies to challenge the effectiveness and integration of capabilities proposed by new

digital solutions.

As it relates to artificial intelligence specifically, different applications and algorithms can

be hosted on the Azure or other cloud-based platform to contribute to the business insight

discovery process. This application is specifically called out in the MS Azure architecture as

Cognitive Services (machine learning). Different data models can be leveraged side-by-side for

comparison purposes, or in series as an ensemble approach, to achieve additional accuracy in the

model predictability.


The next section will discuss different artificial intelligence techniques that are being utilized for pattern recognition and causation discovery within the oil and gas industry. This will provide an idea of the mathematical algorithms that have demonstrated proof-of-concept and realized value in industry.


7 Data Analytics in Well Design and Operations

Data analytics and artificial intelligence have been integrated into O&G Exploration and

Production (E&P) design and operations workflows for many years, by employing methods like

regressions, forecasting, and optimizations (Bravo et al. 2012). However, the resurgence of the

term artificial intelligence for business applications has been sparked by the growing capabilities

of machine learning (ML). With the increasing volume of digital connectivity and digital abilities

within industrial applications, AI provides a means to “sense, reason, engage, and learn” from

the immense amount of data being produced (“Part 1: Artificial Intelligence Defined | Deloitte |

Technology Services” n.d.). To understand the value and functionality of digital platforms and

digital initiatives, it is necessary to be familiar with the tools and algorithms being leveraged in

those applications.

The science of AI is often misunderstood due to the broad range of capabilities

encompassed within the term. The umbrella of AI includes both the future breakthrough

associated with the digital singularity, where computation technology exceeds the intelligence of

human intellect, and the classical technique of linear or logistic regression. The term AI has been

hyped to such a high degree that it is the only description used to advertise any new digital

capabilities, even though the range of algorithmic capabilities is extremely broad. Therefore, the

advertised rhetoric of these emerging digital technologies contributes to the misunderstanding

of the science. Figure 27 shows the breakdown of data science and AI technology to provide a

more holistic view of where these terms fit into the discussion. To effectively evaluate a digital

initiative, one must be able to speak the digital language associated with the techniques a

company is employing to produce their advertised benefits. Artificial intelligence is more

commonly integrated into industrial applications than most people recognize, and the

information included in this chapter aims to equip the reader with the knowledge and tools to

challenge and discuss the techniques being utilized in order to improve business insights.


Figure 27: Data Science and Artificial Intelligence (“How To Be A Data Scientist - Quantum Computing” n.d.) [left] and (“Part 1:

Artificial Intelligence Defined | Deloitte | Technology Services” n.d.) [right]

The former, current, and future data analytic and AI methods were evaluated to provide

a recommendation for developing infrastructure, competencies, and tools for digital adoption.

In 2012, a survey was conducted among professional members of the Society of Petroleum

Engineers (SPE) to capture the business integration of artificial intelligence in oil and gas

Exploration and Production (E&P). The survey received 612 responses to general AI questions

regarding applications, value, and techniques (Bravo et al. 2012). Figure 28 shows both the

relative E&P applications where AI was integrated into the workflow (left) and the type of AI that

was leveraged (right). Although the data is dated, it shows that data science has been integral to

the design and operational workflow in O&G for many years. This data provides some perspective

to the former and current state because oil and gas has been historically slow to adopt new digital

innovations.

The top five areas of AI and data analytic integration in E&P were production optimization,

reservoir modeling, data management, production management, and process control (Bravo et

al. 2012). And the top five AI or data analytic techniques utilized were data mining, workflow

automation, neural networks, expert systems, and automatic process control. These areas and techniques are similar to the orientation and direction still being pursued today (Future Digital State), with the exception of the business disruption from the more advanced ML techniques – note that ML made up only a small portion of E&P applications in 2012. Data mining

continues to dominate at the forefront of innovation, with both the backend infrastructure and

the frontend visualization developed to efficiently deliver to the user the right information to

make enhanced business decisions. The development and application for each of these available

analytical techniques in O&G could be a thesis on its own, but the intent here is to show the

analytic approach as a whole system, in order to connect the technique and application in

proportion to its contribution in industry.

Figure 28: Artificial Intelligence Applications in E&P Industry, derived from (Bravo et al. 2012) data

As noted, the current state of effective AI integration into O&G remains similar in

proportion to the 2012 survey results. However, the term “effective” is used because

the industry is currently in a transitory state, where large digital investments and initiatives are

starting to fundamentally disrupt the design and operational workflows, but they are yet to be

holistically effective. This thesis has already discussed the enabling of cloud-based platforms that

improve data aggregation, visualization, collaboration, and communication effectiveness. But, in

terms of data analytic techniques, the most disruptive innovation has been caused by

applications of machine learning. To visualize and understand the explosion of the application of

machine learning, Figure 29 shows the Google Trends of Web Searches for different AI and

analytical methods. This graph, provided by Google Trends, shows Web Searches relative to a


peak total – the relative peak is represented as 100 for the most searched topic in a given time

frame. The duration for the graph is over the last ten years of internet searches through Google.

The Trends show that interest in machine learning was sparked in 2015 and has surpassed

that of AI, Data Analytics, Data Mining, and Digital Transformation. This provides a relative

visualization and perspective on the evolution of interest in the applications of machine

learning.
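A comparable view can be reproduced programmatically. The sketch below is a minimal example assuming the unofficial pytrends package (not a tool used in this thesis); the keyword list and five-year timeframe are illustrative.

```python
# Minimal sketch: pull relative Google Trends search interest for a few digital
# terms, assuming the unofficial pytrends package (an illustrative choice).
from pytrends.request import TrendReq

keywords = ["machine learning", "artificial intelligence", "data mining",
            "data analytics", "digital transformation"]

pytrends = TrendReq(hl="en-US", tz=360)
pytrends.build_payload(keywords, timeframe="today 5-y")

# Each column is scaled 0-100 relative to the peak search volume in the window,
# mirroring the relative-peak convention described above.
interest = pytrends.interest_over_time()
print(interest.tail())
```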

Figure 29: 10 Years of Digital Search Trend Data on Google Trends

The advent of machine learning has sparked capabilities and promises for pattern

recognition, business insight discovery, and efficiency improvements. This is especially attractive

in the O&G industry, where there are immense volumes of archived design and operational data,

as well as daily real-time data acquisition, that are not currently monetized to the optimal extent.

However, with the growing platform technologies providing integrated systems for data

aggregation, all analytic methods can be more easily applied to learn, forecast, and take action.

The next sections in this chapter will review key analytical methods with their current and future

applications within the O&G industry.

7.1 Data Mining

Data democratization and monetization can yield significant value for O&G organizations.

Significant opportunities can be extracted from tapping into large data reservoirs and applying

advanced data mining techniques to develop descriptive, predictive, and optimization models


(“Demystifying Data Mining,” n.d.) for enhanced business decisions. Quantitative algorithms

explore data to discover patterns, anomalies, and historical trends to develop useful business

models. Data mining techniques can filter, cleanse, and aggregate disparate data sources that

enable advanced algorithms, like machine learning applications, to further process and analyze

data.

The value associated with monetizing data will force engineers to embrace new ways of

thinking and problem solving – physics-based modeling is the current core competency, but data

cleansing, data analytics, and data forecasting are the competencies of the future. Engineers will

need training in statistical inference and statistical thinking to be able to understand both the power and

limitations associated with data-driven models. Data mining techniques provide the capability of

understanding massive reservoirs of data in short time frames, which can significantly improve

decision-making efficiencies.

Data aggregation and preparation are critical aspects of the data mining process that

arrange data appropriately for the specific problem. This data can further be utilized with other

analytical and AI techniques for discovery. Cloud-based systems are paving the way for

collaborative and connected environments where data can be easily accessed and processed

from data lakes or enterprise data warehouses through API enabled analytic applications. Data

mining returns value to organizations by developing business insights and operational efficiencies

that can create compounded savings across thousands of wells and assets. Accessing, organizing,

and preparing data is traditionally a time-consuming task for an engineer; however, with the

inclusion of cloud-based data accessibility and data processing applications through API

connectivity, specific data models can be created by an engineer in minutes instead of weeks.
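As a concrete illustration of that API-enabled workflow, the sketch below assumes a hypothetical REST endpoint exposed by a cloud data platform and uses the requests and pandas libraries; the URL, parameters, and column names are placeholders rather than any specific vendor's API.

```python
# Minimal sketch: pull well data from a (hypothetical) platform API, then apply
# typical data mining preparation steps - filter, cleanse, and aggregate.
import pandas as pd
import requests

resp = requests.get(
    "https://example-data-platform/api/v1/wells/production",  # placeholder URL
    params={"field": "EXAMPLE", "start": "2020-01-01"},
    timeout=30,
)
resp.raise_for_status()
df = pd.DataFrame(resp.json())

# Cleanse and aggregate before handing the data to analytic or ML applications.
df["date"] = pd.to_datetime(df["date"])                       # placeholder columns
df = df.dropna(subset=["oil_rate"]).query("oil_rate >= 0")
daily = df.groupby(["well_id", pd.Grouper(key="date", freq="D")])["oil_rate"].mean()
print(daily.head())
```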

Data mining opportunities are ubiquitous within the O&G sector; techniques can be

applied to exploration, reservoir, drilling, production, pipeline/transport, refinery, and sales.

Models can be built for seismic interpretation to increase probability of oil discovery, reservoir

characterization to better evaluate oil-in-place for economic viability and extraction strategies,

drilling design and performance to reduce costs and exposure, and production trends to inform

completion and facility designs for efficiency (Mohammadpoor and Torabi 2019).

For perspective on analytic tool utilization, Rexer Analytics periodically

surveys analytic professionals to better evaluate analytics usage in the corporate, consultant,

academic, and government workforce. The Rexer 2017 survey consisted of 1,123 analytic

professionals from 91 countries. Figure 30 shows the results of the analytic algorithms that are

most utilized in the respondents’ fields. Rexer Analytics has also identified Regressions, Decision

Trees, and Cluster Analysis as the top three primary analytic algorithms since 2007. While

complex machine learning algorithms have generated much of the attention, the foundational

algorithms continue to dominate in application and usage. Although it is important to recognize

that the machine learning tools are deemed more transformational, there is still significant value

to be gained by robust applications of older methods. From a systems perspective, recognizing

and understanding the key contributing algorithms for data analytics is necessary to evaluate the

functionality of digital tools. There is potentially inherent risk associated with applications that

promote lesser used analytic tools, as they do not have the benefit of experience-driven best

practices and proven results. This information can assist with challenging the track-record and

competency regarding the advertised tool, which will provide insight on value versus hype.

Additionally, the Rexer survey identified the top skills of a data scientist as Data Preparation and

Management Skills and Specific Domain Knowledge (Rexer 2017), which aligns with this thesis’

points about standardization and the approach of a single system of truth.

Figure 30: Common Algorithm Analytic Methods, Adapted from (Rexer 2017)

[Figure 30 chart: the share of respondents using each algorithm “Most of the time,” “Often,” “Sometimes,” or “Rarely,” covering Regression, Decision Trees, Cluster Analysis, Time Series, Random Forests, Ensemble Methods, Factor Analysis, Text Mining, Anomaly Detection, Bayesian Methods, Neural Nets, Association Rules, Monte Carlo Methods, Support Vector Machines (SVM), Survival Analysis, Social Network Analysis, Rule Induction, and Deep Learning.]

7.2 Example Case Studies & Applications

Having covered the different types of data analytics and the proportions in which

they are leveraged in O&G, this section provides several examples of how the different analytic

algorithms have been applied to specific applications in O&G.

Reduce Drilling Nonproductive Time

A detection and alerting system was created using Bayesian Networks to define event

conditional probabilities to indicate drilling, equipment, or sensor failures. The belief or reasoning

system is built from selected nodes of conditions that would be used to define an event logic.

Values can be discrete or continuous, and the model probabilities are conditioned with training

data sets. For example, a belief system for a drilling washout would consist of rig activity, pump

rate, flow-out rate, pump pressure trend, and other details as desired. The defined nodes would

be linked to an “unplanned state” that can then be trained with actual washout events to

determine appropriate probabilities. When this model is run in real-time, the aggregated sensor

data can be used to estimate the probability that a washout is occurring (Ashok and

Behounek 2018). These types of models can be leveraged for many different “unplanned states,”

where alert systems are built to define the likelihood of potential issues. Early detection of process

or mechanical issues can provide significant value to the operation. Beyond forecasting the

likelihood of events, the reasoning behind an alert can be inspected when there is a sudden

change in one of the defining variables.
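To make the structure concrete, the sketch below builds a toy version of such a belief network, assuming the open-source pgmpy library; the nodes, states, and probabilities are illustrative placeholders and not the cited authors' calibrated model.

```python
# Minimal sketch of a discrete Bayesian network for washout alerting, assuming
# pgmpy (older pgmpy versions name the model class BayesianModel). All node
# names, states, and probabilities below are illustrative placeholders.
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# The "unplanned state" (washout) influences the observable symptoms.
model = BayesianNetwork([
    ("Washout", "PumpPressureTrend"),
    ("Washout", "FlowOutRate"),
])

# Prior and conditional probability tables (toy numbers for illustration).
cpd_washout = TabularCPD("Washout", 2, [[0.98], [0.02]])   # 0 = no, 1 = yes
cpd_pressure = TabularCPD(
    "PumpPressureTrend", 2,                  # 0 = steady, 1 = falling
    [[0.95, 0.20],                           # P(steady | washout state)
     [0.05, 0.80]],                          # P(falling | washout state)
    evidence=["Washout"], evidence_card=[2])
cpd_flow = TabularCPD(
    "FlowOutRate", 2,                        # 0 = normal, 1 = reduced
    [[0.90, 0.30],
     [0.10, 0.70]],
    evidence=["Washout"], evidence_card=[2])

model.add_cpds(cpd_washout, cpd_pressure, cpd_flow)
model.check_model()

# Real-time use: condition on streamed sensor observations and read off the
# posterior probability of the unplanned state.
posterior = VariableElimination(model).query(
    ["Washout"], evidence={"PumpPressureTrend": 1, "FlowOutRate": 1})
print(posterior)
```

In practice, the conditional probability tables would be trained on labeled washout events, and the posterior query would run continuously against the aggregated real-time sensor feed.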

Figure 31: Bayesian Network Feedback Loop

[Figure 31 diagram: a sense-reason-engage feedback loop spanning drilling, completions, and interventions, drawing on rig data sources such as the mud pit, BOP, drill pipe, drill bit, top drive, derrick, and rig.]

Improve Drilling Performance

The drilling rate is dictated by several factors, including lithology (one aspect is

compression strength), bit model, bit type, weight on bit, rotary speed, and mud weight. A neural

network was utilized to develop a predictive model for mechanical drilling rate (Xue 2020). This

model can be trained with a large volume of drilling data to assist with bit selection, duration

forecasting, and optimization. Performance models can be integrated together to create proxy

models for an entire system. Figure 32 shows the conceptualization of a neural network with a

single hidden layer. The neural network is a collection of neurons (nodes) and connections that

work by processing input data through the weighted connections and nodes to produce an

output response. The network learns by traversing forward- and back- propagation through the

network to develop node activations and weights that produce the correct final result (“Machine

Learning for Everyone” n.d.).

Figure 32: Neural Network for Drilling Rate (Xue 2020)
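A minimal sketch of this idea is shown below, assuming scikit-learn and a synthetic stand-in for a drilling dataset; it is not the cited study's model, only a single-hidden-layer proxy of the same general shape.

```python
# Minimal sketch of a drilling-rate (ROP) proxy model with one hidden layer,
# assuming scikit-learn; the features mirror the factors named above and the
# synthetic data stands in for a real drilling dataset.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n = 5_000
# Features: rock strength, weight on bit, rotary speed, mud weight, bit type code.
X = rng.uniform([5, 5, 50, 8, 0], [50, 40, 250, 16, 3], size=(n, 5))
# Placeholder response: a nonlinear combination plus noise (not a physics model).
y = 60 * X[:, 1] * X[:, 2] / (X[:, 0] * X[:, 3] * 100) + rng.normal(0, 1, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Scale inputs, then fit a single-hidden-layer network like the one in Figure 32.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
model.fit(X_train, y_train)
print("R^2 on held-out data:", round(model.score(X_test, y_test), 3))
```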

Anomaly Detection

Operational anomaly detection was achieved through the use of regression and

classification models for drilling and workovers (Alotaibi, Aman, and Nefai 2019). Machine

learning models were constructed to identify abnormal behaviors and alerts during a well

operation. The method employed suggested a continuously looped train and predict model in a

predetermined moving time window – this prevents the decay of the model validity with


increasing time. These types of anomaly detection models can quickly identify abnormal trends

and monitor the health of the operation.
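The sketch below illustrates that looped train-and-predict pattern on a synthetic sensor stream, assuming scikit-learn; IsolationForest is used as a stand-in detector (the cited study used regression and classification models), and the window length, refresh rate, and injected anomaly are placeholders.

```python
# Minimal sketch of a looped train-and-predict anomaly monitor over a moving
# time window, so the model is always conditioned on recent behavior only.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
stream = rng.normal(100, 2, size=2_000)      # e.g., a pump-pressure channel
stream[1_500:1_510] += 25                    # injected abnormal excursion

WINDOW, STEP = 500, 50                       # training window and refresh rate
alerts = []
for start in range(WINDOW, len(stream), STEP):
    history = stream[start - WINDOW:start].reshape(-1, 1)
    detector = IsolationForest(contamination=0.01, random_state=0).fit(history)
    block = stream[start:start + STEP].reshape(-1, 1)
    flags = detector.predict(block)          # -1 marks an anomalous sample
    alerts.extend(start + i for i, f in enumerate(flags) if f == -1)

print(f"{len(alerts)} samples flagged; first alert near index "
      f"{alerts[0] if alerts else None}")
```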

Data analytics and artificial intelligence are powerful tools for understanding complex

systems. There are a multitude of proven methods and tools that can be incorporated into

monitoring, performance, and design models. However, as with any data-driven model, an

understanding of the statistical accuracy and limitations should be properly evaluated before

taking action. As noted in the data mining section, the majority of the work is directed toward

aggregating, filtering, cleaning, and preparing data for processing. Most of these analytical

models are relatively simple to run once the right variables, volume, and quality of data are

available. This chapter demonstrated the value associated with applying machine learning

techniques to large data sets. Organizations can sense, learn, and respond to operational

anomalies and abnormal conditions before nonproductive events occur. Additionally,

correlative discoveries and optimizations can be leveraged in the design stage to develop more

robust programs. It is extremely important for an engineer to understand both the power and

limitations of analytic tools that can be leveraged to develop enhanced knowledge of the design

and operational system. These techniques show that when organizations invest in developing

platforms that aggregate and connect disparate data sources and real-time operational sensors,

valuable opportunities are available to take action.


8 Digital Platform Design

Integrating digital tools and platforms into an organization’s workflow takes a substantial

amount of effort and resources. The industry interviews, market research, and professional

industry papers reviewed for this thesis all indicate that the effort and resources required to

develop and manage a digital initiative for sustainable use and scalability are commonly

underestimated. This thesis hypothesizes that this is due to a lack of understanding of the holistic

system view of the platform integration process. This chapter reviews a proposed methodology

to question, understand, and rank different digital platforms and initiatives. This methodology is

applied to several of the digital initiatives as outlined in the Digital Portfolio chapter. Note that

the questions presented offer perspective on attributes beneficial to successful digital

development; however, the importance and ranking of each characterization are subjective and

should be customized to the specific needs of an organization. The objective is to build a visual

understanding with respect to the holistic questions of whether a digital initiative is building your

competitive advantage within a digital ecosystem for faster and better decision making across

the entire organizational value chain.

8.1 Data Science and Machine Learning (DSML) Platforms

Gartner, a global research and advisory firm, developed a “Magic Quadrant” to rank data

science and machine learning platforms (DSML) with respect to capability and vision (Krensky et

al. 2020). Some of these principles, along with others from research articles and professional

papers, were leveraged to develop the Platform Characterization in Table 13. The

Characterization reviews 16 critical categories that this thesis recommends be thoroughly

questioned, reviewed, and relatively ranked prior to digital adoption in any organization. The

method starts with data management principles, like accessibility, preparation, and visualization,

and ends with the consideration of the business strategy. This promotes conversations around

standardization, interoperability, collaboration, user interface, required competencies, required

resources, scalability, security, automation, and business value. Additionally, the platform

characterization is linked with potential organizational barriers, as outlined in the next chapter,

which are Digital Infrastructure, Organizational Capabilities, Working Environment, External

Ecosystem, Process & Governance, and Measures of Success. The idea behind linking the


organizational risks to the platform characteristics is to emphasize their dependency on one

another to ensure that organizations are selecting initiatives that are aligned with their current

capabilities and business strategy.

Table 13: Digital Platform Characterization Methodology

Platform Characterization – Questions – Risk Type
1. Data Accessibility – How well does a platform access and manage different data types and data sources? (Risk Type: Digital Infrastructure; Process & Governance)
2. Data Preparation – How well does a platform filter and clean ingested data? (Risk Type: Digital Infrastructure; Process & Governance)
3. Data Visualization – How well can data be visualized and explored? (Risk Type: Digital Infrastructure; Organizational Capabilities)
4. Advanced Analytics – What analytic applications are available? How well is an analytic suite packaged together? (Risk Type: Digital Infrastructure; Organizational Capabilities; Measures of Success)
5. User Interface – Does the product have an intuitive and coherent UI/UX? What is the rated customer experience? How well has the user experience been recorded, tracked, and integrated? What is the competency required to operate or use the platform or digital tool? (Risk Type: Organizational Capabilities; Working Environment; External Ecosystem)
6. Interoperability & Standardization – How well does the platform integrate with the data infrastructure (i.e., cloud-based host)? How well does the platform or tool integrate or connect with other systems via APIs (i.e., does it contribute to the data pipe ecosystem)? Does the platform or tool offer 3rd-party applications or open-source capabilities? (Risk Type: Digital Infrastructure; Organizational Capabilities; External Ecosystem; Process & Governance; Measures of Success)
7. Scalability – How well can the platform be deployed at a large scale, and what is the track record? (Risk Type: Digital Infrastructure; Organizational Capabilities; Working Environment)
8. Platform Management – How well can the platform be managed internally and externally? What are the resources required? (Risk Type: Organizational Capabilities; Working Environment; External Ecosystem)
9. Model Management – How are models monitored and managed? What are the resources required? (Risk Type: Organizational Capabilities; Working Environment; External Ecosystem)
10. Collaboration & Partnerships – How does this platform link or connect with other functions and workflows? Does the company have any partnerships or alliances with other tools or services? (Risk Type: Working Environment; External Ecosystem; Process & Governance)
11. Automation – How well can automated processes and procedures be integrated into the platform? (Risk Type: Digital Infrastructure; Organizational Capabilities; Process & Governance)
12. Operational Value – What is the historical performance of the platform with other adopters? What is the predicted value-added of adoption and integration? (Risk Type: Organizational Capabilities; Measures of Success)
13. Pricing – What is the cost to develop and/or integrate the platform? (Risk Type: Organizational Capabilities; Measures of Success)
15. Security – How has the consideration for business security been integrated into the design and architecture of the platform or tool? Does the platform present any cybersecurity issues? (Risk Type: Process & Governance; Measures of Success)
16. Business Strategy – What is the future vision of the platform or digital tool? How well does it integrate and align with the business strategy? What is the competitive advantage with adopting? Build versus buy? What is the extensibility of the digital platform? How easily can it adapt to future changes? (Risk Type: Digital Infrastructure; Organizational Capabilities; Working Environment; External Ecosystem; Process & Governance; Measures of Success)

Chapter 8.2 reviews the internal organizational risks associated with enterprise-scale

digital adoption. The Risk Type from Figure 12 is decomposed in greater detail to clarify

the considerations associated with each risk category. The critical questions and evaluation

strategy outlined in Chapter 8.1 and Chapter 8.2 are combined in Chapter 8.3 to create a ranking

method that aligns both organizational limits and digital tool attributes. This step-by-step ranking

methodology helps to ensure that the holistic system view is considered when selecting a digital

initiative.

8.2 Challenges to Digital Platform Adoption

Risk recognition and analysis is important for understanding the dynamic influencers

within a system. Recognizing potential issues provides the opportunity to develop mitigation

strategies that help reduce the likelihood of unintended events. The barriers and challenges

associated with organizational digital development and adoption were recorded throughout the

research process. This section presents barriers and challenges to digital platform design and

organizational adoption. The design challenges reference architectural decisions that are in

tension with one another – the difficulty is selecting the right design attributes that contribute

most to the business value. The barriers to organizational digital adoption reference the

infrastructure, competencies, and cultural limitations – the difficulty is shaping the organization

to adapt to the required disruption. These risks are not exhaustive, but they have been identified

as common themes throughout the research process.

The utility attributes of a digital initiative can be quantified or measured to define the

quality of the system. Utility attributes are commonly in tension, or have negative influence on

one another, and require architectural design decisions to define the system’s functionality that

meets the business needs of the end user. For example, system performance exhibits design

tension around latency, capacity, and accuracy, where latency is the response speed, capacity is

the volume handled, and accuracy is the margin of error in the response. The system needs of

these requirements can be substantially different depending on the type of system and control

expectations. Additionally, scalability presents sacrifices depending on the size and diversity of

users required to share the platform. Customizations or fit-for-purpose design characterizations

would be in tension with the ability of the system to be compatible with many different


applications and use cases. The availability of the platform with respect to accessibility and

usability would be in tension with system infrastructure and design costs. Platform extensibility,

or the ability to adapt to future changes (sustainable design), would be in design tension with

cost and schedule constraints (“Architecting For The -Ilities - Towards Data Science” n.d.). These

design tensions are aligned with the Iron Triangle where Cost, Scope, and Schedule are in

constant tension with one another. It is important to recognize what trade-offs are necessary for

your operation within a specific use context. These principles will guide the design and overall

functionality of the platform, so it is necessary to design or select digital tools with the trade-offs

thoroughly understood. Design tensions can be a major barrier for organizations, as there is

uncertainty in how the desired system behavior will emerge to achieve optimal business value. It would be a grave

mistake to ignore the architectural design tensions and pursue all beneficial utility attributes

without accepting trade-offs, as this leads to excessive overages in cost

and schedule on digital project development and implementation.

The digital transformation is a disruptive effort that requires a fundamental shift within the

organization to improve collaboration, agility, inquisitiveness, competency, and leadership to be

successful. Table 14 summarizes a selective list of critical barriers to organizational adoption with

consideration for digital infrastructure, organizational capabilities, working environment,

external ecosystem, process and governance, and measures of success (Thajudeen 2018). The

challenges highlighted in bold were influenced by a PETRONAS Offshore Technology Conference

(OTC) paper, OTC-28591-MS.


Table 14: Barriers to Organizational Digital Adoption

Table 14 highlights a theme of challenged connectedness, knowledge, trust, and

transparency, where the digital bottleneck is driven by the complex dynamics of the social and

technical organizational system. The Digital Infrastructure limitations suggest persistent issues

with data management and data accessibility. The transition toward a cloud-native platform (e.g.,

Microsoft Azure or AWS) alleviates many of these concerns by developing a unified system for

data and workflow connectivity. The opportunities and scalability enabled by a single cloud-

based platform promote collaboration and standardization efforts across working groups. The

challenges associated with Organizational Capabilities (OC) suggest a lack of core digital

competency and awareness. Issues around minimal or scattered digital resources, as well as low

digital literacy or inquisitiveness, can be mitigated with clear digital leadership and transparency.

Low digital OC can also be managed by selecting initiatives that have a higher dependency on


external resources. An organization needs to conduct an honest OC self-evaluation to determine

the right investment balance to ensure successful development, deployment, and maintenance

of a digital system.

The Working Environment challenges show an inherent lack of trust and ownership for

the pursuit of a digital transition. Digital capabilities are an entirely new field of knowledge that

is rapidly reshaping design and operational workflows. Leaders who fully

understand the capabilities, opportunities, and value of the digital transformation can sometimes

drastically underestimate how the disruptive changes can make the workforce feel. The

complexity, black-box nature, and ambiguous rhetoric regarding digital and AI may invoke a

feeling of fear and intimidation about job security among many employees. Trust and intentional

transparency are thus required to facilitate the shift to a data- and digitally-driven culture. The

digital evolution is not an individual effort, but a collaborative effort, which requires all

employees to understand, align, and take charge toward achieving a sustainable competitive

advantage, or, otherwise, risk obsoletion. The perception that “IT = Digital” further isolates

design engineers and operations from collaborating with IT for engineering solutions. A digital

organization should have IT integrated with the base business, where IT no longer acts strictly as

a cost center but shares responsibility for profit and loss.

The challenges associated with the External Ecosystem suggest a lack of understanding of

both the overall capabilities of the digital transformation and the synergistic value of combining

collaborative efforts. Organizations risk building systems and tools with the “this works for us”

attitude, without the recognition of the immense opportunities of connecting with open-source

resources to take advantage of multiple application types and digital tool offerings. The internal,

isolated approach risks forgoing extensible growth

opportunities related to the combined network effect of external digital development. The

approach involving industry partnerships, alliances, standardization, and open-source

opportunities aligns with the digital perspective of full industrial connectivity. It is important to

recognize the value gained by having the ability to build and contribute to previous innovations

and accomplishments – this is the source of exponential growth opportunities.


The Process and Governance barriers are related to the lack of trust and leadership with

respect to the digital changes. A top-down approach is difficult when the organization is resistant

to recognizing or realizing the value of digitization, whereas a bottom-up approach is much more

contagious as the value is exhibited on the frontlines of the operation. The goal is to develop an

organization that is continuously seeking to optimize operations through a digital-first approach,

where valuable innovations will rapidly diffuse through the organization for adoption. Finally, the

challenges with Measures of Success are associated with not having a clear vision of the future

digital state. The difficulty lies in defining the realized value and creating an intuitive roadmap

where digital initiatives are able to progressively advance the organization toward the end goal.

Even though digitization is going to play an immense role in the optimization and efficiency

improvements of the future work environment, it is still critical to focus on value-adding

initiatives that enhance the competitive advantage of the organization.

The top 10 risks are plotted on a risk map in Figure 33. The relative risk ranking is

subjective and based on conversations, technology articles, and professional papers explored

during this thesis. The highest risk is characterized as the ability to scale a tool or platform to an

enterprise level, which encompasses infrastructure, standardization, collaboration, competency,

and trust. The next diagonal (orange) row of similar risks is characterized as infrastructure issues,

which highlight capabilities, standardization, and collaboration. The next diagonal risk row

(yellow) is characterized as organizational competency and trust.


Figure 33: Relative Risk Priority of Digital Barriers

8.3 Digital Evaluation Methodology

This methodology was applied to several of the DC&I oriented digital initiatives outlined

in the Digital Portfolio chapter. Ranking evaluation methods not only help to characterize and

order a preferred digital tool, but they can also provide a signature to define and record

the barriers and challenges experienced during the integration process. These lessons learned can be

used to adapt a new method that highlights newly identified gaps that were not previously

understood.

The method employs an organizational self-assessment against the digital barriers outlined in

the previous section. The self-assessment ranks each risk category for an organization as low,

medium, or high, depending on how well an organization can handle a deficiency in a specific

digital tool characteristic. Notional quantitative values were applied to each qualitative ranking,

with a “low” importance characteristic contributing 33%, a “medium” importance characteristic

contributing 66%, and a “high” importance characteristic contributing 100% to the overall

cumulative ranking. This creates a custom benefit-utility weighting for an organization to

evaluate whether or not a digital initiative is right for it. For example, a high-performing


tool that requires advanced programming or complex inputs might not be the best fit for an

organization that lacks the required functional competencies, whereas a lower-performing but

highly usable tool may deliver better benefits due to network effects and a higher organizational

adoption rate. The next step is to evaluate the digital initiative with a simple 1-10 ranking of its

ability to achieve sufficiency toward the objectives outlined in the respective category (Table 13),

with 1 as the lowest score and 10 as the highest. Additionally, the sum-product of the self-

assessment risk ranking and the platform characteristic ranking is calculated to determine the

overall total ranking of the digital tool for the organization.
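The arithmetic is simple enough to sketch. The snippet below, with hypothetical characteristics, risk rankings, and tool scores, shows how the low/medium/high weights (33/66/100%) are averaged per characteristic and combined with a tool's 1-10 scores into a cumulative ranking.

```python
# Minimal sketch of the ranking arithmetic described above; the categories,
# rankings, and scores are hypothetical examples, not the thesis' full tables.
RISK_WEIGHT = {"low": 0.33, "medium": 0.66, "high": 1.00}

def characteristic_weight(risk_rankings):
    """Average the organization's risk rankings linked to one characteristic."""
    return sum(RISK_WEIGHT[r] for r in risk_rankings) / len(risk_rankings)

def total_score(assessment, tool_scores):
    """Sum-product of characteristic weights and the tool's 1-10 scores."""
    return sum(characteristic_weight(risks) * tool_scores[name]
               for name, risks in assessment.items())

# Hypothetical subset of Table 13 characteristics and a candidate tool's scores.
assessment = {
    "Data Accessibility": ["low", "medium"],
    "Scalability": ["low", "high", "high"],
    "Operational Value": ["high", "medium"],
}
tool = {"Data Accessibility": 7, "Scalability": 8, "Operational Value": 7}
print(round(total_score(assessment, tool), 1))
```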

An example analysis, which steps through the evaluation method, is shown in Table 15,

Table 16, and Figure 34. This analysis was performed over three of the companies and digital tools

identified in the Digital Portfolio chapter, but the company names have been removed for

anonymity. The ranking of the importance of platform characteristics is based on a medium-to-

large integrated oil and gas corporation (IOC) that has core competency primarily in the

core business design and operations, but minimal capability to deploy programming-intensive

initiatives. Table 15 shows the relative risk ranking of an IOC with respect to the organizational

barriers.

Table 15: Example of Organization Digital Risk Self-Assessment (Medium-to-Large Integrated Oil & Gas Company)

Barriers to Digital [includes OTC-28591-MS: PETRONAS Journey Toward a Data Driven Organization]. The key digital challenges in the original table list, per barrier category, the Table 14 entries: digital initiatives are siloed; data storage and management lack standardization; large amount of disparate data sources; limited analytic capabilities with available data structure; no single data repository; low organizational digital literacy; scattered digital resources; lack of digital inquisitiveness; difficulty attracting digital resources; insufficient data scientists; insufficient design/software architects; lack of understanding of OC to determine digital build vs. buy; fear of digital initiatives and innovation; lack of agile mindset and structure; "Digital = IT"; lack of digital collaboration; lack of understanding of digital capabilities and value; "Digital = Black Box"; difficulty scaling to enterprise level; lack of standardization among providers; difficulty in understanding realized value vs. hype; requires high level of commitment to adopt a new external platform; unaware of internal steps to make external partnerships more compatible; unclear vision and direction; difficulty developing trust to adopt new workflows; resistance to sharing data; cybersecurity and IP barriers; difficulty with skunkworks development and scaling; difficulty with understanding how to measure digital; digital value vs. IT costs; difficulty understanding realized performance improvements; lack of focus on realized value vs. digital initiative; lack of connectivity between short-term and long-term goals.

Self-assessed risk by barrier category:
Digital Infrastructure – Low. Most IOCs have adopted a cloud enterprise platform with Microsoft Azure or Amazon Web Services. This provides the hosting environment to layer cloud-based digital applications for connectivity and scalability.
Organizational Capabilities (OC) – High. Digital resources and digital competency are a high-risk area for large IOCs, where the core competency has historically been with operations and physics-based modeling (design). The data-driven culture requires a new set of skills around data management, statistics, analytics, and software acumen for solutions.
Working Environment – High. The work environment still tends to separate the digital space from the P&L centers within an organization. This can be viewed by how IT departments are positioned within companies: do they act as a cost center, or is IT integrated into the business with P&L responsibilities? The lack of integration with the base business still creates a divide between traditional design/operations and digital.
External Ecosystem – Medium. This risk is a medium during the initial adoption phase, where organizations are still transitioning to a cloud-based enterprise service. However, partnering with SaaS companies that develop through multi-service, integrated alliances will ultimately create the competitive advantage needed to sustain continual efficiencies.
Process & Governance – Medium. Most large organizations are challenged with breaking the silos associated with different satellites within the organization. Customization is still viewed as more advantageous than making sacrifices for the connected environment. IT is still positioned to protect security (which is valuable), but not to take P&L responsibility.
Measures of Success – Medium. Most organizations have created a digital roadmap and, on a corporate level, understand the value associated with the digital transition. The risks in this area are associated with how digital initiatives are being measured for success, especially with the design- and collaboration-oriented tools.

Table 16 demonstrates how the organizational barrier risk ranking can be linked to the

benefit utility of a digital platform that is being reviewed. The Risk Ranking values

are averaged over the Risk Types associated with each Characterization, as shown in the

example, to create an overall contribution to the total risk alignment that a digital solution has

for a specific organization. The method is focused on linking an organization with the benefit

attributes that provide the highest value for the organization.

Table 16: Example of Quantification of Digital Tool Risk and Value

Figure 34 shows the overall risk versus value diagram that provides a customized

approach to understanding the important focus attributes for digital characterization. This

diagram shows that large O&G corporations are at high risk for achieving (1) operational value,

(2) scalability, (3) user interface for enterprise usability, (4) business strategy, and (5)

standardization and interoperability. These items have been identified as high value areas that

should provide insight into what characteristics are required for a digital tool to be successful.

This method suggests that digital tools need to be strong in these characterization areas for them

to be a feasible solution at a medium-to-large O&G corporation. The other high value attributes

are important, but they are not a key focus, largely because the organization is not necessarily at

risk for being able to manage or mitigate the deficiencies in those areas.

(The guiding questions for each characterization are listed in Table 13.)
1. Data Accessibility – Risk Types (Ranking): Digital Infrastructure (Low), Process & Governance (Medium) – Avg. Risk 50% – Value: High (100%) – Risk x Value: 50%
2. Data Preparation – Digital Infrastructure (Low), Process & Governance (Medium) – Avg. Risk 50% – Value: Low (33%) – Risk x Value: 16%
3. Data Visualization – Digital Infrastructure (Low), Organizational Capabilities (High) – Avg. Risk 67% – Value: Medium (66%) – Risk x Value: 44%
4. Advanced Analytics – Digital Infrastructure (Low), Organizational Capabilities (High), Measures of Success (Medium) – Avg. Risk 66% – Value: High (100%) – Risk x Value: 66%
5. User Interface – Digital Infrastructure (Low), Organizational Capabilities (High), Working Environment (Medium) – Avg. Risk 77% – Value: High (100%) – Risk x Value: 77%
6. Interoperability & Standardization – Digital Infrastructure (Low), Organizational Capabilities (High), External Ecosystem (High), Process & Governance (Medium), Measures of Success (Medium) – Avg. Risk 73% – Value: High (100%) – Risk x Value: 73%
7. Scalability – Digital Infrastructure (Low), Organizational Capabilities (High), Working Environment (High) – Avg. Risk 78% – Value: High (100%) – Risk x Value: 78%
8. Platform Management – Organizational Capabilities (High), Working Environment (Medium), External Ecosystem (Medium) – Avg. Risk 77% – Value: Low (33%) – Risk x Value: 26%
9. Model Management – Organizational Capabilities (High), Working Environment (Medium), External Ecosystem (Medium) – Avg. Risk 77% – Value: Medium (66%) – Risk x Value: 51%
10. Collaboration & Partnerships – Working Environment (Medium), External Ecosystem (High), Process & Governance (Medium) – Avg. Risk 77% – Value: Medium (66%) – Risk x Value: 51%
11. Automation – Digital Infrastructure (Low), Organizational Capabilities (High), Process & Governance (Medium) – Avg. Risk 66% – Value: High (100%) – Risk x Value: 66%
12. Operational Value – Organizational Capabilities (High), Measures of Success (Medium) – Avg. Risk 83% – Value: High (100%) – Risk x Value: 83%
13. Pricing – Organizational Capabilities (High), Process & Governance (Medium), Measures of Success (Medium) – Avg. Risk 77% – Value: Medium (66%) – Risk x Value: 51%
15. Security – Process & Governance (Medium), Measures of Success (Medium) – Avg. Risk 66% – Value: High (100%) – Risk x Value: 66%
16. Business Strategy – Digital Infrastructure (Low), Organizational Capabilities (High), Working Environment (Medium), External Ecosystem (High), Process & Governance (Medium), Measures of Success (Medium) – Avg. Risk 72% – Value: High (100%) – Risk x Value: 72%

Example: Data Accessibility Average Rank = Low x 0.5 + Medium x 0.5 = 0.33 x 0.5 + 0.66 x 0.5 = 50%
Risk Type: links the associated organizational risks with each digital platform characterization.
Value: the determined value-added of developing a digital tool characteristic within a medium-to-large IOC.

Figure 34: Example Plot of Digital Risk vs. Value for Focus on Tool Characterization

The Platform Characteristics were organized in descending order of the Risk x Value

number, as shown in Figure 35. Three digital tools were ranked in the

respective categories on a 1-10 scale, and each score was multiplied by the Risk x Value to

achieve a cumulative score or ranking. This evaluation exercise helped to identify and prioritize

characteristics that are critical for the organization.

Ultimately, there are many available digital tools that perform the same functional

analytical tasks; however, as demonstrated by the prioritization list, the historically achieved

value, scalability, user interface, interoperability, and business strategy are the dominating

factors that separate where a digital tool aligns with the cultural and digital acumen of an

organization. The conclusion from this assessment is that the objective is not necessarily to find

the most technologically advanced tool, but to invest into a digital culture and ecosystem that is

part of a more holistic system integration. These digital tools often have similar capabilities, but

the growth opportunity relates more to the cumulative contributions of an entire network

of systems that build an integrated digital system. Also note that the method outlined in Figure

35 is a relative value for comparison for tools that serve similar purposes, as different digital tools

that serve different purposes will not have meaningful comparable overall scores.

[Figure 34 plots each platform characteristic on axes of organizational risk (low to high) versus value (low to high), with quadrants labeled Low Attention/Lower Value, Low Attention/Higher Value, High Attention/Lower Value, and High Attention/Higher Value.]

Figure 35: Example Digital Tool Analysis Results

With the digital tool properties and characteristics aligned with an organization’s

capabilities, the next chapter describes an economic evaluation technique to incorporate

uncertainties into financial models in order to assess the potential operational value of a digital

investment in DC&I operations.

(Characterizations are ordered by descending Risk x Value; each company is scored 1-10 per characterization, and the risk rankings and questions match Table 16 and Table 13.)
1. Operational Value – Risk x Value 83% – Company #1: 7, Company #2: 8, Company #3: 8
2. Scalability – Risk x Value 78% – Company #1: 8, Company #2: 7, Company #3: 7
3. User Interface – Risk x Value 77% – Company #1: 7, Company #2: 8, Company #3: 8
4. Interoperability & Standardization – Risk x Value 73% – Company #1: 9, Company #2: 7, Company #3: 8
5. Business Strategy – Risk x Value 72% – Company #1: 9, Company #2: 7, Company #3: 8
6. Advanced Analytics – Risk x Value 66% – Company #1: 8, Company #2: 8, Company #3: 8
7. Automation – Risk x Value 66% – Company #1: 8, Company #2: 6, Company #3: 7
8. Security – Risk x Value 66% – Company #1: 7, Company #2: 7, Company #3: 7
9. Model Management – Risk x Value 51% – Company #1: 7, Company #2: 7, Company #3: 7
10. Collaboration & Partnerships – Risk x Value 51% – Company #1: 10, Company #2: 6, Company #3: 7
11. Pricing – Risk x Value 51% – Company #1: 7, Company #2: 7, Company #3: 7
12. Data Accessibility – Risk x Value 50% – Company #1: 7, Company #2: 7, Company #3: 7
13. Data Visualization – Risk x Value 44% – Company #1: 8, Company #2: 7, Company #3: 7
14. Platform Management – Risk x Value 26% – Company #1: 7, Company #2: 7, Company #3: 7
15. Data Preparation – Risk x Value 16% – Company #1: 6, Company #2: 6, Company #3: 6
TOTAL (sum-product of Risk x Value and each company's 1-10 scores): Company #1: 68, Company #2: 62, Company #3: 64

9 Economic Design for Uncertainty

To evaluate the realized value of a digital initiative for an investment decision, the return

on investment must be part of the conversation. It is important to recognize that value does not

necessarily need to be defined strictly as financial benefit, because improvements in safety and

environmental performance provide significant value to an organization as well. However, the scope of the

desirable value for success should be well defined before evaluating digital solutions. This section

will provide a brief overview of how to evaluate financial value with the inclusion of uncertainty

in a simple Monte-Carlo model. This methodology provides a complete range of outcomes for

the net present value (NPV), represented as a cumulative distribution function (CDF). The

uncertainties (although not exhaustive) built into this model focus on oil price, initial

production, production volatility, well CapEx, digital initiative CapEx, and digital performance.

The intention of this economic model is not to provide a perfect evaluation of any specific digital

technology, but, instead, to provide a workflow for properly understanding and evaluating a

digital decision that recognizes uncertainty in costs and performance.

The financial driver for digital initiatives is to reduce operating costs. This will enable O&G

companies to be competitive in any oil price environment. Figure 36 provides the estimated

break-even oil price for global and field type assets from the market research group Rystad

Energy. Compare this graph to Figure 37 to understand different assets’ risk and vulnerability to

oil price volatility.

Figure 36: Oil Production Break-Even Price, Adapted from (“Rystad Energy Ranks the Cheapest Sources of Supply in the Oil

Industry” n.d.)

[Figure 36 chart: Brent break-even price by supply segment, USD/bbl (Rystad Energy UCube, 2019), with 60% confidence intervals; the segments shown are producing fields, onshore Middle East, North American tight liquids, shelf, deepwater, Russia onshore, extra heavy oil, rest-of-world onshore, and oil sands, with break-even prices ranging from $26 (producing fields) to $83 (oil sands).]

There are significant challenges associated with evaluating long-term digital

developments because of the difficulty of clearly defining baselines and measuring improvement while removing all

other influencing variables. Enterprise-wide digital platforms are more complex to evaluate than

the presented methodology, but the general principle of understanding uncertainty in both the

business model and digital performance is applicable for any evaluation.

There are many different methods to evaluate and predict the price of oil, and the

intention of this thesis is not to perfect that model. The economic model fits a Beta

distribution to best match the price of oil over the last three years in order to develop the

probability distribution function (PDF) – this is more for a measure of volatility than for achieving

a high accuracy on oil price.
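A minimal sketch of that fitting step is shown below, assuming SciPy and a synthetic stand-in for the three-year price history; the fitted shape parameters are then used to sample prices for the Monte-Carlo model.

```python
# Minimal sketch: fit a Beta distribution to (synthetic) historical oil prices
# and sample from it, assuming SciPy; the price series is a placeholder.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
prices = rng.normal(60, 8, 750).clip(30, 90)   # stand-in for ~3 years of daily prices

# Rescale to (0, 1), fit the Beta shape parameters, then sample 3,000 prices.
lo, hi = prices.min(), prices.max()
x = ((prices - lo) / (hi - lo)).clip(1e-3, 1 - 1e-3)
a, b, loc, scale = stats.beta.fit(x, floc=0, fscale=1)
samples = lo + (hi - lo) * stats.beta.rvs(a, b, size=3_000, random_state=7)
print(f"fitted a={a:.2f}, b={b:.2f}; sampled P50 = {np.median(samples):.1f} $/bbl")
```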

Figure 37: Historical Chart of Crude Oil Prices, Adapted from (McNally 2017; “EIA: Petroleum & Other Liquids | Spot Price Data,”

n.d.)

The break-even price per barrel was further broken down by Rystad Energy into four

different categories: administration, production costs, capital spending, and taxes. Figure 38

shows the Rystad Energy estimates for U.S. Shale. These values were leveraged as the baseline

in the economic model to evaluate the digital improvements with respect to the size, scope, and

variability of the asset.

[Figure 37 chart: WTI and Brent spot prices and the WTI-Brent spread, dollars per barrel (EIA), 1987-2017, with noted highlights from "Crude Volatility" by Robert McNally: the Gulf War, the Asian financial crisis, the OPEC-era boom-bust III, Chinese demand and weak non-OPEC supply, the Great Recession, and Saudi Arabia refusing to swing production.]

Figure 38: Cost Breakdown to Produce a Barrel of Oil: US Shale (MacroTrends n.d.)

The economic model evaluates three use cases, defined as follows:

• Economic Model Assumptions:

o Well Pad Count: 4 wells

o Well Cost ($MM): 7.5

o Well IP (BOEPD): 2,000

o Production Period: 1 year

• Case 1:

o No digital initiatives incorporated (digital CapEx=$0)

• Case 2:

o $3M investment in digital CapEx with impacts reducing well CapEx, CapEx

cost variability, and OpEx (operation efficiency)

• Case 3:

o $5M pad investment in digital CapEx with impacts reducing well CapEx,

CapEx cost variability, OpEx (operation efficiency), administration costs

(workflow efficiencies), and increase in initial production (seismic and well

placement optimizations)


Figure 39: Economic Model Assumptions

The Data Tables function in Excel was used to generate a 3,000-run Monte-Carlo model

with a fitted Beta PDF to define the variability in the assumed asset performance and

productivity. Figure 40 shows the output cumulative distribution functions of the three cases,

illustrating the variability of potential outcomes based on the uncertainties embedded in the economic

model. The model demonstrates that operational efficiencies at the P50 created an NPV increase of

about $1M per four-well pad ($250k per well), and the additional digital investment in

seismic and subsurface collaboration efforts created another $1M per pad ($250k per

well). The calculations here are not meant to depict an actual digital initiative, as these savings

are company confidential. However, this presents a methodology to understand levels of

investment as it pertains to initiating digital integration into the operation program.
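The sketch below reproduces the spirit of that calculation in Python rather than Excel, assuming NumPy; the distributions, decline factor, cost per barrel, and digital-impact percentages are illustrative placeholders, not the confidential inputs behind Figure 40.

```python
# Minimal sketch of a Monte-Carlo NPV comparison for a 4-well pad with and
# without digital investment; all inputs are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(42)
RUNS, WELLS, DISCOUNT = 3_000, 4, 0.10

def simulate_npv(digital_capex, capex_reduction, cost_reduction, ip_uplift):
    # Sampled uncertainties (Beta-shaped draws scaled to plausible ranges).
    oil_price = 35 + 50 * rng.beta(2, 3, RUNS)                          # $/bbl
    ip = 2_000 * (1 + ip_uplift) * (0.7 + 0.6 * rng.beta(2, 2, RUNS))   # BOEPD
    well_capex = 7.5e6 * (1 - capex_reduction) * (0.9 + 0.2 * rng.beta(2, 2, RUNS))
    cost_per_bbl = 35 * (1 - cost_reduction)     # lifting + G&A + tax placeholder

    first_year_bbl = ip * 365 * 0.6              # crude first-year decline factor
    cash_flow = first_year_bbl * (oil_price - cost_per_bbl) * WELLS
    capex = well_capex * WELLS + digital_capex
    return (cash_flow / (1 + DISCOUNT) - capex) / 1e6   # one-year NPV, $MM

cases = {
    "NPV1 - No Digital":        simulate_npv(0,   0.00, 0.00, 0.00),
    "NPV2 - Digital ($3M pad)": simulate_npv(3e6, 0.05, 0.05, 0.00),
    "NPV3 - Digital ($5M pad)": simulate_npv(5e6, 0.05, 0.10, 0.03),
}
for name, npv in cases.items():
    print(f"{name}: P50 = {np.percentile(npv, 50):.1f} $MM")
```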


Figure 40: Monte-Carlo Economic Model of Digital Initiatives in Drilling

Understanding the holistic economics and the respective uncertainties associated with a

digital initiative is critical in the selection and development process. Inclusion of the uncertainty

for system variables with respect to operational and organizational challenges will help to build

a more robust economic model to understand the forecasted realized value of the initiative. The

next chapter will outline two ancillary collaborative initiatives that were discovered during the

research process that provide additional insight to the valuable business efficiencies achieved

through relationships and data connectivity.

[Figure 40 chart: "Digital Initiative NPV - 4 Well Pad - US Shale - Cumulative Density Function," plotting probability (0-100%) against net present value (roughly -$15MM to +$35MM) for three cases: NPV1 - No Digital; NPV2 - Digital - Reduction in OpEx & CapEx; NPV3 - Digital - Reduction in OpEx, CapEx, and G&A, with Increased Recovery.]

10 Collaboration Initiatives

Removing silos and improving collaboration efforts are cornerstones of the digital

evolution. Innovation rates rapidly increase with the catalyst of community knowledge sharing.

Breakthroughs are derived from incremental improvements compounded over the work of

predecessors. With business knowledge recognized as the key to design and operations

improvements, efforts directed toward streamlining knowledge delivery to the end user possess

significant value. Organizations have high dependencies on the experience of their workforce,

and that poses a risk for the reliability and sustainability of success. This thesis explores two

additional types of collaboration efforts, outside specific design platforms and operational

workflows. The two areas are (1) virtual conversational chatbots, and (2) social network graphs.

Conversational chatbots, or virtual assistants, aggregate and deliver pertinent information in

response to a user request to streamline knowledge sharing across an organization. Social network graphs are

developed to connect a user with the right people and resources that can be associated with

many different dynamics, like a specific project or specific competency. These two areas will be

analyzed to understand the capabilities, value, and integration opportunities associated with

these new collaboration initiatives.

10.1 Virtual Assistants

The O&G industry operates in extremely dynamic environments, where unique

techniques and operation standards are custom to specific areas. Even DC&I operations that are

a few feet apart can experience completely different operational or geological challenges. This

has historically created a siloed work environment, given that information and protocols from one

asset are not typically applicable to another asset. However, as history has proven, there are

more similarities between assets than previously recognized, and these can be leveraged in the

pursuit of optimization. Offshore assets are adopting land techniques to create

pseudo-factory design and execution programs to improve efficiencies, and land operations are

adopting advanced downhole tools that were originally developed for complex offshore

operations.

Three chatbots that have been designed for the oil and gas industry are Sandy (Belmont Technology), Nesh, and Ralphie (Earth-Index) (Jacobs 2019). Think Siri, Alexa, and Cortana, but specific to O&G engineering. Chatbots leverage artificial intelligence in the form of natural-language processing (NLP), machine learning (ML), and deep learning to imitate human conversations and provide intelligent responses based on available data (“Ultimate Guide to Artificial Intelligence Chatbots | Smartsheet” n.d.). Because it is infeasible to program an accurate response to every possible input prompt or question, powerful chatbots have only become practical through the learning and language-processing capabilities of artificial intelligence innovations.

Machine learning uses neural networks that calculate an output from provided inputs through weighted connections across a multi-layer (hidden-layer) architecture. These networks are iteratively trained, adjusting node weights, to produce the desired output. The number of hidden layers and the amount of training data can be adjusted to reach a specified or required accuracy. The learned responses, or patterns, are used as references to evaluate an input prompt and deliver the best response. The input prompt can be a question or any type of sentence, and the chatbot can leverage the neural network algorithm to search for and develop the appropriate response. Because of the learning aspect of the artificial intelligence, the sentence or question is weighted based on how the algorithm was trained. This means that sentences and requests do not have to be exact, as would be required if each request and response were manually programmed into the algorithm. The ML technique provides the flexibility needed for large-scale deployment of an algorithm capable of searching through large volumes of data and returning a useful response (“How Do Chatbots Work? A Guide to the Chatbot Architecture” n.d.).
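To make the training idea concrete, the toy sketch below (a hypothetical example using scikit-learn; the prompts, intent labels, and hidden-layer sizes are invented for illustration and are not taken from any of the chatbots named above) trains a small multi-layer network to map free-text prompts onto intent labels:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Tiny, invented training set: prompt text -> intent label
prompts = [
    "show me offset well logs for this pad",
    "pull the lessons learned for stuck pipe events",
    "what is the latest casing design standard",
    "who is accountable for the completion procedure document",
]
intents = ["well_data", "lessons_learned", "standards", "documents"]

# TF-IDF turns each sentence into a numeric feature vector; the MLP is the
# multi-layer (hidden-layer) network whose connection weights are adjusted
# iteratively during training.
model = make_pipeline(
    TfidfVectorizer(),
    MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
)
model.fit(prompts, intents)

# A new request does not need to match the training sentences exactly;
# the learned weights generalize to similar wording.
print(model.predict(["find the offset logs near this well"]))
```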

Natural language processing provides the computational synthesis of human language. NLP is used to convert the input sentence into structured data that can be further processed by the ML algorithm. NLP can provide context, sentiment, named-entity recognition, and other capabilities to break down the meaning of a request. Human language is extremely dynamic and complex, and there are subtleties that complicate meaning and make computational understanding a challenge. NLP is the bridge from linguistics to computer science, and innovations in this branch of study are growing rapidly, with Google, Amazon, IBM, and others all contributing to its advancement. The current capabilities in NLP have made it possible to implement powerful chatbots and scale them throughout an organization.
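As a minimal sketch of that conversion step (assuming the open-source spaCy library and its small English model, neither of which is prescribed by the thesis), the fragment below turns a free-text request into structured fields that a downstream search index or ML component could consume:

```python
import spacy

# Requires the small English model: python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

prompt = "Show me the 2019 drilling reports for the pad that BP operated in the Permian."
doc = nlp(prompt)

# Named entities and normalized tokens turn the free-text request into
# structured fields for downstream matching and retrieval.
structured_request = {
    "entities": [(ent.text, ent.label_) for ent in doc.ents],  # e.g. DATE, ORG, GPE
    "keywords": [tok.lemma_.lower() for tok in doc
                 if not tok.is_stop and not tok.is_punct],
}
print(structured_request)
```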


The specifics of the artificial intelligence behind conversational virtual bots are endless, especially as the technology continues to advance each day. The intent of the systems approach is to understand the available technology, where it can add value, and how it will integrate into the current system. An example chatbot platform architecture, adapted from a Microsoft reference architecture and shown in Figure 41, illustrates the feedback loops and interdependencies of the data flow from user prompt to system response (“Azure Architecture Center | Microsoft Docs” n.d.).

Figure 41: Conversational Bot Architecture, Portions of Enterprise-Grade Conversational Bot Image

(https://docs.microsoft.com/en-us/azure/architecture/reference-architectures/ai/conversational-bot) used with permission

from Microsoft.

The MS architecture shows the inclusion and connectivity of the user prompt with Bot Logic and UX, Security and Governance, Bot Cognition and Intelligence, Quality Assurance, Logging and Monitoring, and Databases. The required or suggested system components for a successful virtual assistant can be further broken down into subsystem components and analyzed to understand the dependencies. Additionally, communicating these dependencies to stakeholders such as data managers, ML algorithm developers, and monitoring and reporting application owners can help secure participation and input from the affected parties, ensuring that the chatbot development addresses the needs of the entire system.

[Figure 41 diagram components: Bot UI/UX; Bot Cognition & Intelligence (web app, search index, knowledge base/FAQ, intent & entities); Authentication; Security & Governance; Logging, Monitoring, & Reporting; Quality Assurance (retraining, testing, debugging, feedback analysis); and Data Extract, Transform, & Load (raw data, data lake, ETL hub, logic apps) over non-structured, semi-structured, and relational sources, connecting business user requests and responses to queries and results.]
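The sketch below traces a single request through those stages. It is a hypothetical outline only: the collaborating objects (auth, nlu, search_index, faq, logger) stand in for the architecture blocks in Figure 41 and do not correspond to any specific Microsoft API.

```python
from dataclasses import dataclass, field

@dataclass
class BotResponse:
    text: str
    sources: list = field(default_factory=list)

def handle_request(user, prompt, auth, nlu, search_index, faq, logger):
    """Hypothetical end-to-end flow mirroring the blocks in Figure 41."""
    if not auth.is_authorized(user):                  # Security & Governance
        return BotResponse("Access denied.")

    intent, entities = nlu.parse(prompt)              # Bot Cognition & Intelligence
    logger.info("prompt=%r intent=%s entities=%s", prompt, intent, entities)

    if intent == "faq":
        hits = faq.lookup(prompt)                     # Knowledge base (FAQ)
    else:
        hits = search_index.query(intent, entities)   # Search index over the data lake

    answer = hits[0].summary if hits else "No matching information was found."
    logger.info("returned %d results", len(hits))     # Logging, Monitoring, & Reporting
    return BotResponse(answer, [h.source for h in hits])
```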


Virtual chatbots in O&G can provide capabilities to data mine well logs, reservoir data, seismic data, standards, lessons learned, best practices, and other pertinent information from either aggregated or disconnected sources. The chatbot platform can connect with both internal and external resources, depending on the architecture; external data sources for O&G include regulatory agencies, energy and industry news, and even industry academic repositories.

The value of a chatbot is derived from the increased worker efficiency in finding and processing useful information; this is the more intuitive use. However, value can also be created by providing relevant information unsolicited, meaning that a virtual assistant can be trained to act as a new-age “spell check” for engineering design and operations. If integrated into an engineering design platform, a virtual assistant can recognize deviations in design patterns and suggest revisiting or validating that specific design decision. For a simple example, if a subsea engineer specifies a 10-inch seafloor pipeline in the project design, but all offset assets have a 12-inch pipeline, then an alert can be presented with the statistics of previous design decisions along with the asset locations.
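A minimal sketch of that kind of deviation check is shown below (an illustrative heuristic only; the parameter name, asset names, and threshold are hypothetical, and a deployed assistant would learn such patterns from the design platform’s historical data):

```python
from collections import Counter

def check_design_deviation(parameter, proposed_value, offset_designs, min_share=0.8):
    """Flag a proposed design value that deviates from the dominant choice in
    offset assets. offset_designs maps asset name -> value chosen for the
    same parameter in that asset's design."""
    counts = Counter(offset_designs.values())
    dominant_value, dominant_count = counts.most_common(1)[0]
    share = dominant_count / len(offset_designs)

    if proposed_value != dominant_value and share >= min_share:
        return (f"Alert: {parameter} = {proposed_value} deviates from "
                f"{share:.0%} of offset assets, which use {dominant_value} "
                f"(assets: {sorted(offset_designs)})")
    return None

offsets = {"Asset-A": "12-inch", "Asset-B": "12-inch", "Asset-C": "12-inch"}
print(check_design_deviation("seafloor pipeline diameter", "10-inch", offsets))
```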

These capabilities are determined by the technology readiness and by the current software platform architecture. These potential capabilities, along with the MS architecture, show that information connectivity becomes more seamless as systems are designed with interoperability standards. The vision of creating a single, collaborative design platform has its advantages, with digital opportunities, such as the integration of a virtual chatbot for data accessibility and verification/validation purposes, available to further improve its capabilities.

10.2 Socio-Network Knowledge Graph

Socio-network knowledge graphs provide an intuitive means to build transparency

between the organization, employees, projects, and documents. The knowledge graph works by

mapping relationships between multi-variate nodes to establish a visualized framework of

connections within an organization. These graphs provide quick searchability, which can identify

organizational connections and resources for a user’s request. Note that these platforms provide

the flexibility to include almost any type of node and defined relationship to fit a specific purpose,

for example, stakeholder mapping, system mapping, social network mapping, and concept


mapping (“Kumu” n.d.). This type of platform is included in this thesis because of the connectivity

and collaboration opportunities that it creates.

This thesis provides an example graph of an entire major capital project group linked through their respective functions. This is only a small subset of the potential capabilities, but it provides insight into the visual interface that is used for relationship discovery. Figure 42 is composed of three images, with the top left representing the relationship and search functionality, the middle showing the full platform visualization, and the bottom right showing the search output when querying for all Drilling, Completion, and Intervention (DC&I) employees on the project.

Figure 42: Sociotechnical Knowledge Graph of O&G Major Capital Project, Created from (“Kumu” n.d.)

The intent was to extend the example knowledge graph with the inclusion of project documents and project outputs linked to the employees responsible for the work. This would create a project ecosystem where employee-to-group-to-document matching can help identify required resources and direct questions to the right people. Additionally, metadata can be included in node profiles, so that specific competencies or experiences can be searched to facilitate communication and collaboration. The metadata would need some standardization to be effective, but essentially the future state would be to provide the relationships of any node holistically mapped to the entire organization. This availability of relationships would greatly assist with breaking silos and seamlessly connecting resources. These opportunities can ultimately contribute to improving productivity, relationships, and culture within an organization.

Here are some possible use cases for organizational knowledge graphs:

Use Case 1:

An engineer is assigned to work on a specific project document. The relationship graph provides a platform to immediately discover all individuals responsible and accountable for each project document.

Use Case 2:

A technology venture group within the organization is working with a start-up to develop technology that impacts a specific function. If a relationship is mapped from that project to the function within the platform, the venture becomes transparent to the organization, which can then leverage resources that would otherwise have no idea the project was being pursued.

Use Case 3:

An engineer needs a subject matter expert’s (SME’s) support on something specific, which he or she would normally have to track down through several layers of recommendations. The knowledge graph allows employees to “tag” themselves as an expert or as knowledgeable in a specific area. If the engineer needs someone in optimization, a search for “tag = optimization SME” returns all of the optimization SMEs on the graph or in a query table, as sketched below.
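A toy version of such a graph and the queries behind Use Cases 1 and 3 is sketched below (assuming the open-source networkx library; the node names, tags, and relations are invented for illustration and do not come from the Kumu example in Figure 42):

```python
import networkx as nx

# Hypothetical miniature of an organizational knowledge graph: employees,
# functions, and documents are nodes; edges carry the relationship type.
G = nx.Graph()
G.add_node("Alice", kind="employee", tags={"optimization SME"})
G.add_node("Bob", kind="employee", tags={"drilling engineer"})
G.add_node("DC&I", kind="function")
G.add_node("Pad-4 Drilling Program", kind="document")

G.add_edge("Alice", "DC&I", relation="member_of")
G.add_edge("Bob", "DC&I", relation="member_of")
G.add_edge("Bob", "Pad-4 Drilling Program", relation="responsible_for")

# Use Case 1: who is responsible/accountable for a given project document?
owners = [n for n in G.neighbors("Pad-4 Drilling Program")
          if G.nodes[n]["kind"] == "employee"]

# Use Case 3: query by self-assigned competency tag ("tag = optimization SME").
optimization_smes = [n for n, data in G.nodes(data=True)
                     if data.get("kind") == "employee"
                     and "optimization SME" in data.get("tags", set())]

print(owners, optimization_smes)
```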

Virtual assistants and knowledge graphs pursue the same holistic objective: to act as a pipeline feeding reliable information to the end user for improved organizational productivity. The digital space provides the capabilities to connect an entire organization through a single, intuitive UI/UX. And, if the infrastructure and architecture are designed for agile integration of new initiatives, then the opportunities, once the platform system is connected, are endless. The systems approach places value, opportunities, stakeholders, and sustainability at the forefront of concept design.


11 Conclusion

The O&G industry has a major opportunity to redefine its organizational and operational

perspective through digitalization and artificial intelligence. Several challenges face the industry,

including commodity volatility, social pressures, geo-political uncertainty, and increased

geological complexity. These disruptions pose an existential threat to the O&G industry, where

drastic improvements in design efficiency, operational performance, safety, and the

environmental footprint are necessary for sustainable survival. The digital transformation

empowers companies to address these disruptive challenges by creating efficiencies and

optimizations across the entire value chain to create a monumental shift in the industry’s

competitive advantage. This thesis recognizes the opportunities and capabilities of current digital

advancements and innovations but also identifies significant organizational and technical barriers

to achieving successful digital integration.

The hypothesis of this thesis was that a “productivity paradox” exists with respect to digital developments in the O&G industry: that the current pursuit of digitization has not yet achieved ideal realized value in engineering design or operations, and that the high level of digital investment is not commensurate with the achieved organizational and operational performance efficiencies. This thesis analyzed this

hypothesis with a systems-based approach to provide a unique holistic perspective of the entire

digital upstream O&G ecosystem. The digital system analyzed included O&G operators, O&G

service providers, and digital software and platform service providers. The analysis suggests that

digital capabilities, specifically Cloud and AI initiatives, are providing the means to transform the capacity and capabilities of an organization. Cloud services can be leveraged to unify the data management layer of an organization, where the scalability and connectivity provide a step-change in workflow collaboration efficiencies. And, with integration

of a single source of data accessibility, AI and ML techniques can be applied to optimize and

automate many critical functions and workflows across the asset life cycle. However, the

opportunity is advertised as a revolutionary change, and while potential capability improvements

are immense, this rhetoric is a disservice to the movement insofar as it underemphasizes the

actual amount of infrastructural, cultural, competency, and workflow shift required for success.


Instead, digital adoption is proving to be more of an evolutionary process. The revolutionary shift is an infeasibly expensive and disruptive change to perform in a singular fashion. The challenge addressed by the systems approach is determining how to implement O&G subsystem projects over time that can eventually interconnect into an ideal infrastructure of the future, and, more importantly, determining which investment steps both provide immediate benefit and prepare the organization for future digital synergies.

The data shows that advanced software applications, infused with AI on a cloud-based platform, are able to enhance business decisions for design and operations and provide a strong competitive advantage. To put it simply, the entire digital movement is driven by the need to make faster and better business decisions; this is a decision economy benchmarked by the quality and speed of the business insight. And with digital connectivity to

assets and process control systems, this can be further enhanced with automation methods.

However, the Challenges to Adoption chapter of this thesis demonstrates the presence of

significant barriers limiting the adoption and diffusion of digital techniques.

This thesis concludes that organizations underestimate the extent of the challenges

associated with developing, scaling, adopting, executing, and maintaining a digital initiative

across a large organization. The digital literacy and culture of most O&G organizations are not yet

positioned to foster an environment where digital initiatives can succeed without a high

dependency on external resources. Even with the recognition that the digital space is not a core

competency, O&G organizations are still electing to build digital systems internally, where these

systems are isolated from external support and development. All digital techniques and

technologies have been developed from incremental innovations founded on the achievements of others; open-source, innovative collaboration is what made this digital transformation possible. This collaborative and open-source theme must continue in order to achieve the network synergies of combined and connected innovative initiatives. Ignorance is not bliss: overlooking these business opportunities puts an organization at risk during difficult times.


This thesis recommends that organizations consider the holistic systems perspective

when selecting digital partnerships and services. The digital evaluation method presented

provides a robust approach to consider the interdependencies and relationships of the digital

system being analyzed with the rest of the organization. The safe approach is to start with a

business challenge, create a digital vision or strategy, and embrace open-source collaboration and standardization. A top-down approach is difficult when the organization is resistant to recognizing or realizing the value of digitization, whereas a bottom-up approach is more contagious

as the value is exhibited on the frontlines of the operation. The capabilities in the digital space

are growing rapidly, and organizations should develop or adopt digital initiatives with

extensibility and sustainability at the forefront of the requirements, with strict adherence to

value creation.

Figure 43: Strategic Digital Approach

The bullets below outline the recommendations and takeaways from the research and

interviews performed for this thesis:

• Digital innovations and technologies are revolutionary; however, the process of adopting and integrating digital platforms into the social and technical workflows within an organization is evolutionary.

• Digital opportunities are transforming the design and operational workflows within the O&G industry; however, the greatest challenges to programmatically adopting technology enterprise-wide are internal digital competency and awareness, cultural acceptance of change and disruption, and trust with leaders, partners, and alliances.

• The O&G industry has historically exhibited a siloed, competitive ecosystem, and this isolated philosophy will no longer remain competitive. Digital platforms built to integrate systems, partnerships, and expertise will thrive in a networked marketplace.


• A data-driven organization must fundamentally embrace the value of revitalizing,

monetizing, and democratizing organizational data, including both real-time and

historical data, as well as structured and unstructured data.

• Digital developments and services are more likely to provide value if the organizational

capabilities, capacities, and resources are aligned with the functional requirements of the

digital solution (i.e. organizations need to better understand their internal limits before

developing or buying digital solutions that exceed the abilities of the intended

stakeholders).

• Enterprise digital platforms and solutions should be developed with an integrated, open-

sourced, and standardized approach for interoperability and sustainability. Integrated

systems build integrated solutions where management of greater complexities and

optimizations is possible.

• Internal and external digital partnerships and alliances are essential for the continual

growth and development of an enterprise digital ecosystem. Organizations need to

transition to partnerships that embrace the “open” approach to shared growth

opportunities and resources. Innovations and breakthroughs are built off the incremental

improvements from predecessors (network of growth), and this theme should continue

with the system development of a digital enterprise. The objective should shift from

selecting a single system to replace all existing systems, to selecting a digital platform that

integrates and builds off the existing systems.

• Digitization is fundamentally geared toward creating faster and better business decisions

for both design and operations. Within the digital system, data quality (volume, variety,

veracity, velocity, and value) will persist as a continual challenge to analytic accuracy and

capabilities. The data lifecycle must be holistically evaluated to determine the data quality

and processing requirements to model actionable decisions for specific processes. The

data lifecycle spans data creation (sensor type, placement, and quality) through data filtering, processing, and analytical modeling. Understanding the statistical

accuracy required for a data-driven model to influence business decisions will help dictate

the system requirements to govern a specific design or operation.


• Digitization is about connection, visualization and prediction (analytics), and action

(optimization and automation). The connectivity and repeated analytics create the full

system understanding that generates value. An organization can shift significant energy

from monitoring and data mining to design and operational decision support. Ultimately,

the information and business insight availability, democratized across an entire

organization, allows the workforce to focus entirely on value.

• Notably, successful asset performance management has been achieved through the

integration of cloud enterprise systems, unified operational centers, and operational

lifecycle management, including automation, optimization, and maintenance.

In conclusion, it is important to recognize that digital initiatives are only an enhancement to core business value; unless an organization is developing a new growth venture, digital tools serve only to enhance operations and profitability. The goal is not to be “digital,” but, rather, to improve operational performance with better data-driven decisions, more efficient workflows,

and process and machine automation (“Refueling the Oil Industry: Transforming Traditional O&G

with the Oil of the 21st Century - Red Chalk Group” n.d.). Understanding and acknowledging

these benefits from digital methodologies does not mean your company is “digital.” It takes a

cultural paradigm shift to view problems and solutions from an information technology

perspective (Oil&Gas Journal n.d.). It takes investments in data structuring, data management,

and standardization. And, most of all, it takes trust to develop and learn from the digital

methodologies employed to develop these business efficiencies (World Economic Forum 2017).

11.1 Limitations and Future Research

The work performed in this thesis addressed the O&G digital transformation from a high-

level systems perspective, with a focus on drilling and completion design and operations. Digital

success and connectivity create a powerful competitive advantage for organizations and, thus,

financial information for integration and resulting realized value was difficult to uncover.

Information was gathered through the outward lens of news outlets, interviews, and available

industry articles, where it was sometimes difficult to differentiate the advertised rhetoric from the actual value. Additionally, there was no hands-on involvement or experience with the management or integration of digital platforms in an O&G organization. Future research should be focused on

disclosing the financial value associated with the adoption of different digital techniques across

the O&G value chain. The financial value can be compared with the presented ranking

methodology with respect to buy versus build, alliances, partnerships, standardization, and open-

source approaches. Developing a profitability correlation to the different benefit attributes of a

digital platform would be a powerful tool to help guide how organizations invest in their digital

transformation. The proposed methodology can be applied to digital initiatives that have been adopted within organizations to determine how well the organization’s digital strategy aligns

with the initiative, and this can be tracked with resulting successes or failures. The methodology

can be used to rank digital tools, but a limitation is that it has not been thoroughly evaluated with

in-depth analysis over a sufficiently large sample of digital initiatives.

Finally, future research should be directed toward a more detailed review of the

available O&G digital tools and applications. This thesis approached the problem from a holistic

systems perspective with a broad boundary of investigation. However, additional research into

the specific details of the digital platforms and tool offerings with respect to the attributes

outlined in the ranking method would provide more insight into the value-added functionalities.

This information can help to create a more accurate O&G digital roadmap with the inclusion of

critical benefit attributes. Digital tools, attributes, and the respective performance can be

correlated to better understand the trends and the direction in which the industry is heading.


References

“Adopting Data Standards: Cheapest Way to Survive Downturn | Rigzone.” n.d. Accessed March 28,

2020.

https://www.rigzone.com/news/oil_gas/a/142161/Adopting_Data_Standards_Cheapest_Way_t

o_Survive_Downturn/?all=HG2.

Alotaibi, Bader, Beshir Aman, and Mohammad Nefai. 2019. “Real-Time Drilling Models Monitoring Using

Artificial Intelligence.” Society of Petroleum Engineers.

“Application of Artificial Intelligence in Oil and Gas Industry: Exploring Its Impact.” n.d. Accessed May 12,

2020. https://www.offshore-technology.com/features/application-of-artificial-intelligence-in-

oil-and-gas-industry/.

“Applications of Machine Learning in Drilling with Petro.Ai — Data Shop Talk.” n.d. Accessed May 12,

2020. https://datashoptalk.com/applications-of-machine-learning-in-drilling/.

“Architecting For The -Ilities - Towards Data Science.” n.d. Accessed April 1, 2020.

https://towardsdatascience.com/architecting-for-the-ilities-6fae9d00bf6b.

“Artificial Intelligence Has Royal Dutch Shell ‘Super Excited’ | Investor’s Business Daily.” n.d. Accessed

May 12, 2020. https://www.investors.com/news/artificial-intelligence-royal-dutch-shell-

permian-ceraweek/.

“Artificial Intelligence Improves Real-Time Drilling Data Analysis | Offshore.” n.d. Accessed May 12,

2020. https://www.offshore-mag.com/drilling-completion/article/16764029/artificial-

intelligence-improves-realtime-drilling-data-analysis.

Ashok, Pradeep, and Michael Behounek. 2018. “An Artificial Intelligence Belief System Reduces

Nonproductive Time.” Journal of Petroleum Technology, October.

“Automated Drilling Gains Momentum in Offshore Operations | Offshore.” n.d. Accessed May 29, 2020.

https://www.offshore-mag.com/drilling-completion/article/16758411/automated-drilling-gains-

momentum-in-offshore-operations.

“Azure Architecture Center | Microsoft Docs.” n.d. Accessed June 2, 2020.

https://docs.microsoft.com/en-us/azure/architecture/browse/.

“Baker Hughes & C3.Ai Release BHC3 Production OptimizationTM | Baker Hughes.” n.d. Accessed May 12,

2020. https://www.bakerhughes.com/company/news/baker-hughes-c3ai-release-bhc3-

production-optimizationtm.

“Barrel Breakdown - WSJ.Com.” n.d. Accessed March 29, 2020. http://graphics.wsj.com/oil-barrel-

breakdown/.


“Beyond the Barrel: How Data and Analytics Will Become the New Currency in Oil and Gas.” n.d.

Accessed July 13, 2020. https://gblogs.cisco.com/ca/2018/06/07/beyond-the-barrel-how-data-

and-analytics-will-become-the-new-currency-in-oil-and-gas/.

“Big Data and AI Executive Survey 2020.” 2020. NewVantage Partners LLC. http://newvantage.com/wp-

content/uploads/2020/01/NewVantage-Partners-Big-Data-and-AI-Executive-Survey-2020-1.pdf.

“BP Has a New AI Tool for Drilling into Data – and It’s Fueling Smarter Decisions | Transform.” n.d.

Accessed May 12, 2020. https://news.microsoft.com/transform/bp-ai-drilling-data-fueling-

smarter-decisions/?_lrsc=3134a45f-15b6-426a-8288-d8c8178e9b55.

“BP Invests in AI to Focus on Digital Energy|Powertech Review|.” n.d. Accessed May 12, 2020.

https://powertechreview.com/bp-invests-ai-focus-digital-energy/.

“BP Invests in Chinese AI Energy Management Tech Specialist R&B | News and Insights | Home.” n.d.

Accessed May 12, 2020. https://www.bp.com/en/global/corporate/news-and-insights/press-

releases/bp-invests-in-chinese-ai-energy-management-tech-specialist-r-and-b.html.

“BP Supercomputer Now World’s Most Powerful for Commercial Research.” n.d. Accessed May 12,

2020. https://www.maritime-executive.com/article/bp-supercomputer-now-worlds-most-

powerful-for-commercial-research.

“BP Technology Outlook 2018.” n.d., 72.

“BP Upgrades Houston HPC, World’s Most Powerful Corporate Supercomputer - DCD.” n.d. Accessed

May 12, 2020. https://www.datacenterdynamics.com/en/news/bp-upgrades-houston-hpc-

worlds-most-powerful-corporate-supercomputer/.

“BP’s New Oilfield Roughneck Is An Algorithm.” n.d. Accessed May 12, 2020.

https://www.forbes.com/sites/christopherhelman/2018/05/08/how-silicon-valley-is-helping-bp-

bring-a-i-to-the-oil-patch/.

Bravo, Cesar E., Luigi Alfonso Saputelli, Francklin Ivan Rivas, Anna Gabriela Perez, Michael Nikolaou,

Georg Zangl, Neil de Guzman, Shahab D. Mohaghegh, and Gustavo Nunez. 2012. “State-of-the-

Art Application of Artificial Intelligence and Trends in the E&P Industry: A Technology Survey.” In

SPE Intelligent Energy International. Utrecht, The Netherlands: Society of Petroleum Engineers.

https://doi.org/10.2118/150314-MS.

“Bringing AI To Data Analytics And Knowledge Management: Startups Anodot And Maana Snag New

Financing.” n.d. Accessed May 12, 2020. https://www.crn.com/news/applications-

os/300097101/bringing-ai-to-data-analytics-and-knowledge-management-startups-anodot-and-

maana-snag-new-financing.htm.


Cayeux, E, B Daireaux, E W Dvergsnes, and F Florence. n.d. “Toward Drilling Automation: On the

Necessity of Using Sensors That Relate to Physical Models,” 25.

“Chevron Finds Data Recovery as Hard as Oil Recovery - Taps Panzura Controllers for Cloud-Based

Storage Solution.” n.d. Accessed May 12, 2020. https://1067ec1jtn84131jsj2jmuv3-

wpengine.netdna-ssl.com/wp-content/uploads/2019/08/CS-Chevron-FNAS.pdf.

Chollet, François. n.d. “The Implausibility of Intelligence Explosion,” 13.

“Corporate VC Arms of Saudi Aramco and Chevron Invest in $24M Round for Seattle Startup Seeq -

GeekWire.” n.d. Accessed May 12, 2020. https://www.geekwire.com/2020/seattle-startup-

seeq-raises-24m-help-companies-analyze-manufacturing-data/.

“Data Gumbo Blockchain Expands from Oil to Geothermal Drilling in Asia - Ledger Insights - Enterprise

Blockchain.” n.d. Accessed May 12, 2020. https://www.ledgerinsights.com/data-gumbo-

blockchain-expands-oil-geothermal-asia/.

“Data Gumbo Secures $6M in Series A Funding from Venture Arms of Leading International Oil & Gas

Companies | Business Wire.” n.d. Accessed May 12, 2020.

https://www.businesswire.com/news/home/20190507005752/en/Data-Gumbo-Secures-6M-

Series-Funding-Venture.

“Demystifying Data Mining.” n.d., 11.

“Directional Drilling Automation.” n.d. Accessed May 12, 2020. https://aidriller.com/.

“Download Speeds: Comparing 2G, 3G, 4G & 5G Mobile Networks.” n.d. Accessed July 13, 2020.

https://kenstechtips.com/index.php/download-speeds-2g-3g-and-4g-actual-meaning.

“EIA: Petroleum & Other Liquids | Spot Price Data.” n.d.

https://www.eia.gov/dnav/pet/hist/RBRTED.htm.

Energistics, and Independent Data Services. n.d. “Case Study: WITSML Is the Key Data Source for

Automated Daily Drilling Reports.” Accessed March 28, 2020. https://www.energistics.org/wp-

content/uploads/2018/01/2017ids-case-study.pdf.

“Equinor Taps FutureOn for Cloud-Based Offshore Data Visualization Software.” n.d. Accessed May 12,

2020. https://www.futureon.com/wp-content/uploads/2018/08/FutureOn_Equinor-Press-

Release_for-distribution-082418.pdf.

“Equinor Will Broadly Implement Ambyint’s IoT Solution to Optimize Production in North Dakota -

Equinor.Com.” n.d. Accessed May 12, 2020. https://www.equinor.com/en/how-and-why/etv-

news/equinor-will-broadly-implement-ambyint-solution-in-north-dakota.html.


Espinoza, Roberto, Jimmy Thatcher, and Morgan Eldred. 2019. “Turning an Offshore Analog Field into

Digital Using Artificial Intelligence.” Society of Petroleum Engineers, no. SPE-195027-MS

(March).

“Exploring the Potential of Robotics in the Oil and Gas Industry | Aker BP ASA.” n.d. Accessed May 12,

2020. https://www.akerbp.com/en/exploring-the-potential-of-robotics-in-the-oil-and-gas-

industry/.

“ExxonMobil Is Optimising Oil and Gas Operations with Microsoft.” n.d. Accessed May 12, 2020.

https://www.technologyrecord.com/Article/exxonmobil-is-optimising-oil-and-gas-operations-

with-microsoft-87758.

“FogHorn, Stanley Black & Decker, Saudi Aramco and Linde Highlight the Value of Edge Intelligence at

the ARC Industry Forum.” n.d. Accessed May 12, 2020. https://www.globenewswire.com/news-

release/2020/01/30/1977672/0/en/FogHorn-Stanley-Black-Decker-Saudi-Aramco-and-Linde-

Highlight-the-Value-of-Edge-Intelligence-at-the-ARC-Industry-Forum.html.

“FutureOn for Cloud-Based Offshore Data Visualization Software.” n.d. Accessed May 12, 2020.

https://www.futureon.com/equinor-taps-futureon/.

Garside, M. n.d. “Number of Oil and Gas Wells Drilled in the United States from 2014 to 2022*.”

“GE Introduces Proficy SmartSignal Shield 4.0 | Business Wire.” n.d. Accessed May 12, 2020.

https://www.businesswire.com/news/home/20120223005954/en/GE-Introduces-Proficy-

SmartSignal-Shield-4.0.

“GE SmartSignal Classic.” n.d. Accessed May 12, 2020.

https://www.ge.com/digital/sites/default/files/download_assets/smartsignal-classic-from-ge-

digital.pdf.

“GitHub - Equinor/OmniaPlant: Documentation on How to Get Started Building Industrial Applications

and Services by Using Omnia Plant Data Platform.” n.d. Accessed May 12, 2020.

https://github.com/equinor/OmniaPlant.

Hause, Matthew. 2013. “How to Fail at MBSE.” Atego.

Hause, Matthew, and Steve Ashfield. 2018. “The Oil and Gas Digital Engineering Journey.” INCOSE

International Symposium 28 (1): 337–51. https://doi.org/10.1002/j.2334-5837.2018.00485.x.

Hollingsworth, James L. 2015. “Digital Oilfield Standards Update.” In SPE Digital Energy Conference and

Exhibition. The Woodlands, Texas, USA: Society of Petroleum Engineers.

https://doi.org/10.2118/173398-MS.


Holsman, Richard. n.d. “The Search for Value: Trends in Digital Investment.” Accenture.

https://www.accenture.com/us-en/insights/energy/trends-digital-investment.

“Homepage - Corva.” n.d. Accessed May 12, 2020. https://www.corva.ai/.

“Houston-Based Chevron Technology Ventures Makes Investments in Carbon Capture and Spatial

Artificial Intelligence - InnovationMap.” n.d. Accessed May 12, 2020.

https://houston.innovationmap.com/chevron-technology-ventures-invests-insvante-and-

worlds-2645102500.html.

“How Do Chatbots Work? A Guide to the Chatbot Architecture.” n.d. Accessed June 3, 2020.

https://marutitech.com/chatbots-work-guide-chatbot-architecture/.

“How Saudi Aramco Is Digitalising Its Operations - Products & Services, Digitalisation, Digital, Saudi

Aramco, 4IR, Fourth Industrial Revolution, AI, Drones, VR, AR - Oil & Gas Middle East.” n.d.

Accessed May 12, 2020. https://www.oilandgasmiddleeast.com/products-services/36150-how-

saudi-aramco-is-digitalising-its-operations.

“How To Be A Data Scientist - Quantum Computing.” n.d. Accessed June 7, 2020.

https://quantumcomputingtech.blogspot.com/2019/06/how-to-be-data-scientist.html.

INCOSE. 2015. “Systems Engineering Handbook.” Wiley.

“INCOSE: Systems Engineering Vision 2025.” 2014. International Council on Systems Engineering.

“Industrial AI Company, Veros Systems, Closes $4.3 Million in Series B Funding.” n.d. Accessed May 12,

2020. https://verossystems.com/wp-content/uploads/2018/04/Veros-Systems-Series-B-Press-

Release-April-24-2018-1.pdf.

“Intelie Live Machine Learning Analytics - RigNet.” n.d. Accessed May 12, 2020.

https://www.rig.net/rignet_resources/intelie-live-machine-learning-analytics.

International Energy Agency (IEA). n.d. “World Energy Outlook 2019,” 810.

Jacobs, Trent. 2019. “The Oil and Gas Chat Bots Are Coming.” Journal of Petroleum Technology 71 (02):

34–36. https://doi.org/10.2118/0219-0034-JPT.

“JPT: BP and Startup Beyond Limits Try To Prove That Cognitive AI Is Ready for Oil and Gas | beyond.Ai.”

n.d. Accessed May 12, 2020. https://www.beyond.ai/news/jpt-bp-oil-and-gas/.

“JPT Chevron, Schlumberger, Microsoft Team To Improve Digital, Petrotechnical Work Flows.” n.d.

Accessed May 12, 2020. https://pubs.spe.org/en/jpt/jpt-article-detail/?art=5969.

“JPT Seeq’s Focus on Time-Series Data Draws in Chevron, Shell, and Pioneer.” n.d. Accessed May 12,

2020. https://pubs.spe.org/en/jpt/jpt-article-detail/?art=4699.


“JPT Shell’s Well Pad of the Future Is Open for Business.” n.d. Accessed May 12, 2020.

https://pubs.spe.org/en/jpt/jpt-article-detail/?art=6404.

“Khudiri et al. - Open Standard Protocol Can Improve Real-Time Drill.Pdf.” n.d. Accessed March 29, 2020.

https://www.petrolink.com/petrocustom/wp/Open_Standard_Protocol_can_Improve_Real-

time_Drilling_Surveillance.pdf.

Krensky, Peter, Pieter den Hamer, Erick Brethenoux, Jim Hare, Carlie Idoine, Alexander Linden, Svetlana

Sicular, and Farhan Choudhary. 2020. “Magic Quadrant for Data Science and Machine Learning

Platforms.” Gartner.

“Kumu.” n.d. Accessed June 3, 2020. https://kumu.io/.

“Maana and Shell to Co-Present on How to Accelerate Digital Transformation with the Maana

Knowledge Platform at Forrester’s Digital Transformation Forum | Business Wire.” n.d. Accessed

May 12, 2020. https://www.businesswire.com/news/home/20170505005180/en/Maana-Shell-

Co-Present-Accelerate-Digital-Transformation-Maana.

“Machine Learning for Everyone.” n.d. Accessed June 9, 2020.

https://vas3k.com/blog/machine_learning/.

McNally, Robert. 2017. Crude Volatility: The History and Future of Boom-Bust Oil Prices. Columbia

University Press.

Meadows, Donella H., and Diana Wright. 2008. Thinking in Systems: A Primer. Green Publishing.

“Meet Omnia- the Statoil Data Platform That Enables Our Digital Roadmap.” n.d. Accessed May 12,

2020. https://www.linkedin.com/pulse/meet-omnia-statoil-data-platform-enables-our-digital-

larsen/.

“Meet Snake Arm: Robot Technology That Saves Lives — Chevron.Com.” n.d. Accessed May 12, 2020.

https://www.chevron.com/stories/snake-arm-the-robot-who-saves-lives.

“Microsoft Customer Stories: BP Adopts Hybrid Cloud and Moves Applications and Datacenter

Operations to Azure.” n.d. Accessed May 12, 2020. https://customers.microsoft.com/en-

us/story/724138-bp-infrastructure-energy-azure.

“Microsoft Customer Stories: BP Embraces Digital Transformation and the Cloud to Disrupt the Energy

Industry.” n.d. Accessed May 12, 2020. https://customers.microsoft.com/en-in/story/724142-

bp-digital-transformation-energy-azure.

“Microsoft Customer Stories: BP Explores Azure AI to Boost Safety, Increase Efficiency, and Drive

Business Sucess.” n.d. Accessed May 12, 2020. https://customers.microsoft.com/en-

us/story/bp-mining-oil-gas-azure-machine-learning.


“Microwave Link - Gigabit Microwave Connectivity.” n.d. Accessed July 13, 2020.

https://www.microwave-link.com/.

“Moblize | Moblize Achieves Huge Milestone: 7,000 plus Wells - Moblize.” n.d. Accessed May 12, 2020.

https://moblize.com/moblize-achieves-huge-milestone-7000-plus-wells/.

Mohammadpoor, Mehdi, and Farshid Torabi. 2019. “Big Data Analytics in Oil and Gas Industry: An

Emerging Trend.” Southwest Petroleum University.

“New SMART Procurement Platform | ExxonMobil.” n.d. Accessed May 12, 2020.

https://corporate.exxonmobil.com/Procurement/SMART-procurement-platform.

“NOVOS Reflexive Drilling System.” n.d. Accessed May 12, 2020. https://www.nov.com/products/novos.

“Oil and Gas at the Edge | Automation World.” n.d. Accessed March 31, 2020.

https://www.automationworld.com/products/control/blog/13317745/oil-and-gas-at-the-edge.

“Oil and Gas Engineering | L&T Technology Services.” n.d. Accessed May 12, 2020.

https://www.ltts.com/industry/oil-and-gas.

“Oil and Gas Working Group.” n.d. Accessed June 1, 2020. https://www.incose.org/incose-member-

resources/working-groups/Application/oil-and-gas.

Oil&Gas Journal. n.d. “Digital Transformation: Powering the Oil & Gas Industry.” Accessed April 19, 2020.

https://www.ogj.com/home/article/17297879/digital-transformation-powering-the-oil-gas-

industry.

“OpenAPI Initiative.” n.d. Accessed May 21, 2020. https://www.openapis.org/.

“Optical Fiber’s Gigabit Bandwidth, 200 Km Range Attractive for Subsea Work | Offshore.” n.d. Accessed

July 13, 2020. https://www.offshore-mag.com/subsea/article/16763423/optical-fibers-gigabit-

bandwidth-200-km-range-attractive-for-subsea-work.

“Part 1: Artificial Intelligence Defined | Deloitte | Technology Services.” n.d. Accessed June 7, 2020.

https://www2.deloitte.com/se/sv/pages/technology/articles/part1-artificial-intelligence-

defined.html.

“Porter’s Five Forces Model for Oil and Gas Industry – Energy Routes.” n.d. Accessed June 27, 2020.

https://energyroutes.eu/2016/05/23/porters-five-forces-model-for-oil-and-gas-industry/.

“Precision Drilling Corporation - Alpha.” n.d. Accessed May 12, 2020.

https://www.precisiondrilling.com/alpha/.

“Products - EDrilling.” n.d. Accessed May 12, 2020. https://edrilling.no/products/.


“Pursuing Advanced Analytics at SHELL - Alteryx Community.” n.d. Accessed May 12, 2020.

https://community.alteryx.com/t5/Alteryx-Use-Cases/Pursuing-Advanced-Analytics-at-SHELL/ta-

p/182837.

“Real-Time Latency: Rethink Possibilities with Remote Networks.” n.d. Accessed July 13, 2020.

https://www.telesat.com/sites/default/files/telesat/brochures/telesat_leo_-_real-

time_latency_rethink_the_possibilities_with_remote_networks.pdf.

“Refueling the Oil Industry: Transforming Traditional O&G with the Oil of the 21st Century - Red Chalk

Group.” n.d. Accessed April 19, 2020. https://www.redchalk.com/industry/oil-gas/refueling-the-

oil-industry-transforming-traditional-og-with-the-oil-of-the-21st-century/.

Rexer, Karl. 2017. “A Decade of Surveying Analytic Professionals: 2017 Survey Highlights.” Rexer

Analytics. www.rexeranalytics.com.

“RigNet Signs Strategic Agreement with BP for Intelie Live | RigNet.” n.d. Accessed May 12, 2020.

https://investor.rig.net/news-releases/news-release-details/rignet-signs-strategic-agreement-

bp-intelie-live.

“Rystad Energy Ranks the Cheapest Sources of Supply in the Oil Industry.” n.d. Accessed March 29, 2020.

https://www.rystadenergy.com/newsevents/news/press-releases/Rystad-Energy-ranks-the-

cheapest-sources-of-supply-in-the-oil-industry-/.

“‘Sandy’ Joins the Dots for BP.” n.d. Accessed May 12, 2020. https://www.maritime-

executive.com/article/sandy-joins-the-dots-for-bp.

“Satellite Broadband Internet and Megaconstellations | Deloitte Insights.” n.d. Accessed July 13, 2020.

https://www2.deloitte.com/us/en/insights/industry/technology/technology-media-and-

telecom-predictions/2020/satellite-broadband-internet.html.

“Saudi Aramco Energy Ventures Invests in Norwegian Artificial Intelligence Software Provider Earth

Science Analytics | Business Wire.” n.d. Accessed May 12, 2020.

https://www.businesswire.com/news/home/20190308005326/en/Saudi-Aramco-Energy-

Ventures-Invests-Norwegian-Artificial.

Schalkwyk, Pieter van. 2019. “The Ultimate Guide to Digital Twins.” XMPRO.

“Seeq Secures $23 Million Series B to Fuel IIoT Advanced Analytics Growth Strategy | Seeq.” n.d.

Accessed May 12, 2020. https://www.seeq.com/resources/news-press/47-news-releases/484-

seeq-secures-23-million-series-b-to-fuel-iiot-advanced-analytics-growth-strategy.

“Services | Intellicess.” n.d. Accessed May 12, 2020. https://www.intellicess.com/services/.


“Shell Announces Plans to Deploy AI Applications at Scale - CIO Journal. - WSJ.” n.d. Accessed May 12,

2020. https://blogs.wsj.com/cio/2018/09/20/shell-announces-plans-to-deploy-ai-applications-

at-scale/.

“Shell GameChanger – A Safe Place to Get Crazy Ideas Started | Management Innovation EXchange.”

n.d. Accessed May 12, 2020. https://www.managementexchange.com/story/shell-game-

changer.

“Shell, Know Thyself!” n.d. Accessed May 12, 2020. https://www.maana.io/downloads/STV-Wonder-

Maana-Shell-Know-Thyself.pdf.

“Shell Selects C3 IoT as Strategic AI Software Platform | Business Wire.” n.d. Accessed May 12, 2020.

https://www.businesswire.com/news/home/20180920005470/en/Shell-Selects-C3-IoT-

Strategic-AI-Software.

“Shell’s Companywide AI Effort Shows Early Returns - C3.Ai.” n.d. Accessed May 12, 2020.

https://c3.ai/shells-companywide-ai-effort-shows-early-returns/.

Silver, Nate. 2012. The Signal and the Noise. Penguin Group.

“Subsea Digital Alliance - FutureOn.” n.d. Accessed May 21, 2020. https://www.futureon.com/digital-

subsea-alliance/.

Thajudeen, S. Syed. 2018. “Advanced Analytics Solutions: Towards a Data Driven Organization.” In .

Offshore Technology Conference.

“The Amazing Technology Disrupting Oil and Gas (Think: AI, 3D Printing and Robotics).” n.d. Accessed

May 12, 2020. https://www.entrepreneur.com/article/308094.

“The DoDAF Architecture Framework Version 2.02.” n.d. Accessed June 1, 2020.

https://dodcio.defense.gov/Library/DoD-Architecture-Framework/.

“The Incredible Ways Shell Uses Artificial Intelligence To Help Transform The Oil And Gas Giant.” n.d.

Accessed May 12, 2020. https://www.forbes.com/sites/bernardmarr/2019/01/18/the-

incredible-ways-shell-uses-artificial-intelligence-to-help-transform-the-oil-and-gas-giant/.

“The Maritime VSAT Advantage.” n.d. Accessed July 13, 2020.

https://www.groundcontrol.com/Maritime_VSAT/Marine_VSAT_Comparison.pdf.

“The Promise of a Digital Twin Strategy.” n.d. Microsoft, 23.

“Trends in U.S. Oil and Natural Gas Upstream Costs.” 2016, 141.

“Ultimate Guide to Artificial Intelligence Chatbots | Smartsheet.” n.d. Accessed June 3, 2020.

https://www.smartsheet.com/artificial-intelligence-chatbots.


Wardt, John P. de. 2019. “Drilling Systems Automation Roadmap 2019-2025.” Joint Industry Project 19

05 31. DSARoadmap.org. www.dsaroadmap.org.

“What Is OPC? - OPC Foundation.” n.d. Accessed May 30, 2020. https://opcfoundation.org/about/what-

is-opc/.

“What Is the Latency for Satellite Connectivity? - DTP.” n.d. Accessed July 13, 2020.

http://www.dtp.net.id/knowledge/what-is-the-latency-for-satellite-connectivity/.

“WiMAX Coverage and Speed | HowStuffWorks.” n.d. Accessed July 13, 2020.

https://computer.howstuffworks.com/wimax2.htm.

“Woodside Energy Drills for Insight with Cognitive Computing.” n.d. Accessed May 12, 2020.

https://www.ibm.com/blogs/client-voices/woodside-energy-cognitive-computing/.

World Economic Forum. 2017. “Digital Transformation Initiative: Oil and Gas Industry.” World Economic

Forum, no. REF 060117 (January).

“World Oil Newsroom: ExxonMobil Awards License to EON Reality for Immersive 3D Operator Training

Simulator Technology - EON Reality.” n.d. Accessed May 12, 2020.

https://eonreality.com/world-oil-newsroom-exxonmobil-awards-license-to-eon-reality-for-

immersive-3d-operator-training-simulator-technology/.

“Worlds - Hypergiant.” n.d. Accessed May 12, 2020. https://www.hypergiant.com/worlds/.

“Xpansiv Completes Strategic $10M Series A Funding Round with Investments from BP Ventures, Avista,

S&P Global, and Energy Innovation Capital.” n.d. Accessed May 12, 2020.

https://www.prnewswire.com/news-releases/xpansiv-completes-strategic-10m-series-a-

funding-round-with-investments-from-bp-ventures-avista-sp-global-and-energy-innovation-

capital-300779046.html.

“Xpansiv Continues Its Transformation of Commodity Value in Global Markets; Announces Strategic

Investment from BP Ventures, Reflective Ventures, and S&P Global.” n.d. Accessed May 12,

2020. https://www.globenewswire.com/news-release/2018/09/06/1566553/0/en/Xpansiv-

Continues-Its-Transformation-of-Commodity-Value-in-Global-Markets-Announces-Strategic-

Investment-from-BP-Ventures-Reflective-Ventures-and-S-P-Global.html.

Xue, Qilong. 2020. Data Analytics for Drilling Engineering: Theory, Algorithms, Experiments, Software.

Information Fusion and Data Science. Cham: Springer International Publishing.

https://doi.org/10.1007/978-3-030-34035-3.