Risk Reward Optimization Spreadsheet


Transcript of Risk Reward Optimization Spreadsheet

PFP Project Exit Criteria for 1 Jan 2017: Goal/Objective

PFP Procurement process

Budget

Procurement vehicles

OTA

Low barrier FAR vehicles

PFP Project Management

PFP Acquisition Strategy (AS)

PFP objectively demonstrates cost-effectiveness, i.e. "bends the cost curve"

USAF IT budget reflects growing investment in PFP Open System Acquisition (OSA) process

Lessons learned from PFP OTA prototyping translate to traditional FAR vehicles

PFP objectively demonstrates process-effectiveness, i.e. "bends the cost curve"

PFP Acquisition Strategy standardized and easily adaptable

PFP Project Execution Plan (PEP)

PFP PEP standardized and easily adaptable

PFP Engineering

Requirements

Development

Testing

DPS supports USAF DT & OT requirements

Certification

PFP objectively demonstrates engineering-effectiveness, i.e. "bends the cost curve"

Persistent customer feedback loop enables continuous evolution of mission-centric requirements

Persistent market outreach process allows continuous situational awareness of commercial technology state of the art

Continuously evolving Distributed Plugtest System (DPS) supports rapid evolutionary development

DPS supports rapid, iterative, test-based development

DPS supports Joint Interoperability certification requirements

DPS supports IA certification and accreditation requirements

Certification and Accreditation (C&A)

Need-to-share services

Outreach

Recruit PFP project sponsors

Recruit PFP technology providers

Training and Education

AFIT curriculum

DPS supports IA certification and accreditation requirements

Information Assurance and Cyber Security

effectiveness, i.e. cost-effective, process-effective, engineering-effective; balance between need-to-share and need-to-protect information system resources

C&A, per Risk Management Framework (RMF) is standardized, streamlined, and reciprocal.

PFP affordable, virtual, open standard IA services, including cross domain services, support operational dynamic need-to-share policy decisions at runtime

PFP process becomes method of choice among many government project offices

Marketplace of traditional, and especially non-traditional, technology providers crowdsources USAF information system requirements

DAU curriculum

USAF PFP training process

PFP Project Exit Criteria for 1 Jan 2017: Metric

USAF OSA standard AS strategy template approved

against baseline in (cost-per-capability)-(per-development-time) across portfolio of PFP projects

Overall budget for USAF PFP OSA increased 10X from FY15-FY17
POM reflects PFP OSA investments from all USAF PEOs across the FYDP

USAF PFP OSA consortium-type OTAs exist with $500M ceiling and 5-year Period of Performance
USAF PFP OSA OTA consortia membership of 100+ non-traditional performers
No fraud, waste, or abuse associated with USAF PFP OSA OTA
established
Contract schedule for OSA COTS products on the APL in place
Standard FAR plugtest-based solicitation language approved
Standard FAR OSA Statement of Objectives (SOO) template approved
Standard FAR language data rights options for incentivizing open system development approved
Standard FAR language for plugtest-based process for source selection and contract performance verification and validation approved

speed-to-capability across portfolio of PFP projects
process from requirement identification through contract award to delivery of new capability increment

Risk-Reward optimization tools
OSA schedule template
OSA Test Plan template emphasizing plugtesting

DPS provisions on-demand virtual test services
PFP project demonstrates successful DT & OT

USAF OSA standard PEP template approved, includes the following
process

capability-per-cost across portfolio of PFP projects

Distributed Plugtest System (DPS) includes low barrier, convenient access to operational beta users

DPS encourages low barrier to broad industrial community of potential solution providers

DPS "cloud" has multiple and increasing number of federated governmental and industrial nodes, including Internet, CFBLnet, DDTE, TS test networks.
DPS supports live, virtual, and constructive simulations of USAF mission workflows
DPS provisions GFE OSA reference implementations

approves PFP NR-KPP plugtest plan. (Inherit NR-KPP controls from DPS reference implementation.)
virtual IA/Cross Domain Services across at least one security layer
DPS achieves Authority to Operate on all required networks.

Typical cost of C&A for new capability <$50K. Typical C&A timeline for new capability <30 days

Zero security violations

dynamic policy-based C&A plan.(Inherit controls from DPS security layer.)

PFP project MOE and MOP show 100% improvement in cost and time required for C&A, in cost of certified devices, and speed of execution of need-to-share decisions.

Official (AO) is automatically accepted by all relevant other AOs

Need-to-share authorization change decision executed in < 10 seconds
Typical cost for high assurance device (e.g. server blade) <$10K

At least ten USAF projects successfully execute PFP projects
At least one non-USAF service or agency contributes resources to PFP project

days) and sharing of data, GFE/I, including in context with ongoing procurements, with non-traditional providers.
At least ten non-traditional firms participate in funded PFP project work
At least 3 PFP pre-approved COTS products available for immediate purchase via low-barrier government contract schedule

Continuing AFIT course work includes instruction in PFP OSA

SAF AQ requests DAU to incorporate PFP OSA lessons learned into standard training curriculum for PMs, KOs, and Engineers

underway.

Risk Status Risk criteria

Procurement

Project Management

Engineering

Security

Outreach

Responsibility assigned...and...Sufficient resources allocated...and...Risk management plan status is green

Responsibility assigned...and...Risk management plan exists...but...Resources assigned are insufficient...or...Risk management plan status is yellow

Responsibility unassigned...or...Risk management plan does not exist...or...No resources assigned...or...Risk management plan status is red

Training and Education

Responsibility assigned...and...Risk management plan exists...but...Resources assigned are insufficient...or...Risk management plan status is yellow

Responsibility unassigned...or...Risk management plan does not exist...or...No resources assigned...or...Risk management plan status is red
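The green/yellow/red criteria above amount to a simple decision rule. A minimal sketch in Python (the field names and the three-level resource scale are assumptions made only for this illustration):

    def risk_status(responsibility_assigned, plan_exists, resources, plan_status):
        # resources: "none" | "insufficient" | "sufficient"
        # plan_status: "green" | "yellow" | "red"
        if (not responsibility_assigned or not plan_exists
                or resources == "none" or plan_status == "red"):
            return "red"
        if resources == "sufficient" and plan_status == "green":
            return "green"
        return "yellow"

    print(risk_status(True, True, "insufficient", "green"))  # -> yellow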

XX? = a guess
CG/GH PFP Procurement process

BF = Brad Frye
Budget
CG = Camron Gorguinpou
GH PFP cycle #1

cg = Chris Gunderson
GH PFP cycle #2
DD = Dick Dramstad
GH? PFP cycle #3
GH = Gabe Hiley
MM = Mike Mayhew
OTA consortium
RD = Ryan Durante
GH

C5 OTA

RD USAF PFP OTA

CG/cg

GH? PFP Acquisition Strategy (AS)
cg PFP AS template
DD? DCGS-AF transition plan

GH? PFP Project Execution Plan (PEP )

cg PFP PEP Template
cg PFP requirements template
cg PFP risk management template
cg
cg PFP test plan template
cg PFP solicitation template
cg PFP contract language template

DD? DCGS-AF PEP
DD/MM? MRIP test case

Action Assigned

PFP Project Management Process Development

template

CG/RD PFP Engineering

Requirements
MM PFP cycle #1
MM? PFP cycle #2

MM? PFP cycle #3

Federated nodes
MM HmC
DD? DGS-X
cg Commercial cloud

Reference implementation (RI)

Mission layers
Security layer

MM? Plugtest suite
MM? Persistent DPS test services
BF? Live operational simulation

MM? Networks
MM? Internet
MM/cg? DDTE
MM/cg? CFBLnet
DD? DCGS TS test network
DD? DCGS TS operational network

CG/RD? Security

MM/cg? Accreditation Official Engagement
MM/cg? HmC AO

Distributed Plugtest System (DPS) Development

DD? DCGS AO
MM/cg? DDTE AO (DISA)
MM/cg? CFBLnet AO

MM/cg?

MM/cg? RMF for DPS

MM RMF for MRIP
RMF for DDTE
RMF for CFBLnet
RMF for TS test network
RMF for TS operational network

Requirements
Budget

CG/cg Outreach

GH/cg? Recruit PFP project sponsors

CG/GH? Internal USAF
CG/cg Other US Gov sponsors

cg Recruit PFP technology providers

CG/cg Training and Education

cg

cg

Reciprocal Certification and Accreditation (C&A), i.e. Risk Management Framework (RMF)

Services

curriculum

Incorporate PFP lessons learned into DAU curriculum

cg Establish internal USAF PFP training process

PFP Plugtest #1 (PT #1) Entry and Exit Criteria: Entry Criteria

Funding identified

MRIP Solicitation released
MRIP Credible bids received

PFP OTA Solicitation released

PFP Transition Strategy template ver #1
DCGS-mod transition target identified

PFP FY15 critical path mapped

PFP PEP template ver #1 drafted

PFP risk template ver #1 drafted
PFP sked template ver #1 drafted
PFP TP template ver #1 drafted

SOO template ver #1 drafted

DCGS-mod FY15 PFP critical path mapped
MRIP test case published

Funding identified (includes funding for security services)

drafted

drafted

HmC DPS node on line

Security layer rqmts identified

DPS tool suite #1 installed at HmC

HmC Internet POP established

AO for HmC engaged

delivered
published

HmC

RI#1 mission tech stack (GWE) installed as GFE on HmC

HmC

AO for DCGS engaged

Select USAF PEOs briefed
Select others briefed

Vendor participation

AFIT informed of PFP strategy

DAU informed of PFP strategy

RMF template ver 1 for DPS environment delivered
Draft RMF template for MRIP application delivered

Draft USAF internal training strategy published

PFP Plugtest #1 (PT #1) Entry and Exit Criteria: Exit Criteria

Sponsors confirm

Sponsors confirm

Award made within 7 days
SOO agreed within 14 days
MIPR executed within 45 days
SAF AQ acknowledges

SAF AQ comments
DCGS-AF SPO approves

SAF AQ approves

SAF AQ comments
SAF AQ comments
SAF AQ comments
SAF AQ comments
SAF AQ comments
SAF AQ comments
SAF AQ comments

SPO approves
SPO approves

Sponsor(s) confirm

Plugtest validates and verifies MRIP bids

SAF AQ approves

Plugtest validates and verifies MRIP bids

Plugtest validates and verifies MRIP bids

HmC AO acknowledges

candidates

MRIP bidders registered and trained on RI#1

DCGS AO acknowledges

DCGS AO acknowledges

SAF AQ comments

SAF AQ comments

Interim Authority to Test (IATT) established

Hot washup for USAF PFP senior stakeholders within 3 weeks of PT#1

workspace in HmC DPS environment and bids on MRIP

SAF AQ comments

PFP Plugtest #2 (PT #2) Entry and Exit Criteria: Entry Criteria

Funding identified
Funding identified

PFP cycle #2 solicitations released
PFP cycle #2 credible bids received

PFP OTA awarded

PFP Transition strategy published
Draft DCGS-mod AS presented

PFP FY16 critical path mapped

PFP PEP template ver #1 published
PFP rqmts template ver #1 published
PFP risk template ver #1 published
PFP sked template ver #1 published
PFP TP template ver #1 published

SOO template ver #1 published

PFP cycle #2 test cases published

published

mapped

HmC DPS node on line
DGS-X DPS node established

Security layer prototype 0.1 installed

delivered
Preliminary high level requirements published

established

PFP cycle #2 tech stack, RI#2, installed as GFE

Project sponsor other than DCGS
Project sponsor other than DCGS

USAF Internal PFP training strategy published

PFP Plugtest #2 (PT #2) Entry and Exit Criteria: Exit Criteria

MIPR executed within
SAF AQ acknowledges

Award made within 7 days
SOO agreed within 7 days
MIPR executed within 30 days
PFP OTA open for business NLT 1 Oct 2015

SAF AQ approves
SAF AQ comments

SAF AQ approves

SAF AQ approves
SAF AQ approves
SAF AQ approves
SAF AQ approves
SAF AQ approves
SAF AQ approves
SAF AQ approves

SPO approves
Sponsors approve

Sponsors approve

Sponsors confirm

IATT
IATT
IATT

IATT

PFP cycle #2 bidders registered and trained on RI#2

Funding commitment
Funding commitment

SAF AQ approves

PFP FYXX QY Schedule

ACTIVITY                          PLAN START  PLAN DURATION  ACTUAL START  ACTUAL DURATION
1000 Identify project sponsors    1           4              1             4
1010 Identify operational SMEs    1           4              1             4
1020                              3           2              3             2
1030 Identify budgets             1           4              1             4
1040                              3           2              3             2
1050 Prepare testcases            3           2              3             2
1060 Issue solicitations          4           1              4             1
1070 Execute TEM                  5           1              5             1
1080                              6           1              6             1
1090 Make awards                  6           1              6             1
1100                              6           1              6             1
1110                              7           6              7             6
1120 Perform C&A                  7           15             7             15
1130                              20          4              20            4

Track #

Specify requirements and MOEs
Identify transition targets

Execute plugtests in Distributed Plugtest System (DPS)

Establish Statement of Objectives (SOO)
Perform prototyping in DPS
Transition capability as pluggable off-the-shelf technology

PFP FYXX QY Schedule Period Highlight: 10

[Gantt chart scale and legend: weekly columns spanning Month #1 through Month #6 and Jul through Dec; PERCENT COMPLETE column; legend: Plan, Actual, % Complete, Actual (beyond plan), % Complete (beyond plan).]

PFP Better-Speed-to-Better-Capability Risk-Reward Strategy

PFP Enterprise Interoperability Risk-Reward Hypothesis

PFP Enterprise Information Assurance (IA) Risk-Reward Hypothesis

Assumptions and Boundary Conditions
1 Technology value half-life is one year
2 Adversary has ready access to up-to-date COTS technology

3 Adversary is able to share information and network resources at will

4

5 Budget is fixed at $?

Risk = Requirements will obsolesce faster than capability is deployed, wasting all time and money expended.
Reward = Leveraging interoperability associated with open systems approaches (OSA) can allow delivery of new and better capability within the mission evolution cycle, providing profound operational advantage.

Risk = Failure to adequately protect information and network resources will lead to unacceptable vulnerability.

Reward = Sharing information and network resources across security boundaries will provide significant financial and logistical efficiencies.
Risk = Failure to efficiently comply with C&A requirements will lead to unacceptable impact to cost and schedule.

Reward = Streamlining and standardizing C&A compliance will significantly reduce time and cost to deploy capability

Risk = Failure to share critical information and resources with mission partners will lead to unacceptable mission vulnerabilities.
Reward = Sharing critical information and network resources, across security boundaries, and within critical, time sensitive, mission threads, will provide asymmetric advantage.

Security regime will support rapid-onboarding of new technology (see security risk/reward strategy)

DCGS-modernization Risk Reward Hypothesis

Assumptions and Boundary Conditions
Technology value half-life is one year
Adversary has ready access to up-to-date COTS technology
Adversary is able to share information and network resources at will

Budget is fixed at $??
DCGS-modernization Cost Risk

Risk = Requirements will obsolesce faster than capability is deployed wasting all time and money expended

Reward = delivering new technology within mission evolution cycle provides measurably improved operational advantage against baseline

Security regime will support rapid-onboarding of new technology (see security risk/reward strategy)

Risk = Insufficient investment in upfront interoperability vis-à-vis open system infrastructure will result in one-off solutions with unsustainable lifecycle tech refresh tails.

Reward = Engineering interoperability through open system approaches can reduce tech refresh costs through plug-and-play of off-the-shelf capability.

DCGS-modernization Performance Risk

DCGS-modernization Schedule Risk

DCGS-modernization IA Risk

Schedule
Risk-Reward Optimization Actions
Conceptual Risk-Reward Optimization Actions

x Establish persistent low barrier procurement vehicles

Risk = Information processing and sharing requirements, in the face of adversaries' access to rapidly evolving commercial technology, evolve too fast to address them with specialized solutions and associated long RDT&E cycles.
Reward = Operational and engineering interoperability, enabled by open system approaches, can enable better capability-per-cost by crowdsourcing DCGS-modernization measures of effectiveness to broader communities of off-the-shelf solution providers.

Risk = Traditional serial processes and excessive bureaucracy will prevent deploying new technology in time to harvest its marginal benefits.

Reward = Parallelizing acquisition processes such as collecting requirements, performing AoA, contracting, developing, testing, and certifying can enhance speed-to-capability.

Risk = Traditional IA paradigms based on physical separation and monolithic access control policies preclude achieving all objectives regarding cost, performance, and schedule.
Reward = New IA paradigms based on logical separation and dynamic "need-to-share" policies can support objectives regarding cost, performance, and schedule and adequately protect classified resources.

Use Approved Product Lists and COTS schedules to immediately deploy "80% solutions"

Create standard procurement templates appropriate for OSA
Build in security from the ground up
Engage Accrediting Officials (AO) early and often

Incrementally implement open standard virtual dynamic "IA layer"

Establish project management team with expertise in OSA
Select internal contractors based on OSA prior performance

Establish feedback loop with operational customers

Develop persistent buildtime/runtime "pluggable" reference implementation

Engage testing authorities early and often

Execute incremental tests
Schedule incremental bundling/integration events

Perform marketing and outreach

Identify agile procurement vehicles friendly to non-traditional contractors and with scope consistent with project objectives.

Select government team based on OSA prior performance, or conduct OSA training

Establish developmental and test tools, processes, and infrastructure aligned with OSA MOP & MOE

Establish testable operational Measures of Effectiveness (MOE) and correlating Measures of Performance (MOP)

Scope work for parallel, incremental development of interoperable components
Write test procedures for test-driven development, build-a-little/test-a-little, approach

Publicize plans and visions

Publish Acquisition Strategy, budget, and Systems Engineering Plan

Issue brief solicitations based on clear description of MOE, MOP and associated test case

Expose persistent buildtime/runtime reference implementation to solution providers
Establish flexible "data rights" regimes that incentivize COTS providers to participate

PFP Better-Speed-to-Better-Capability Risk-Reward Strategy

PFP Enterprise Interoperability Risk-Reward Hypothesis

PFP Enterprise Information Assurance (IA) Risk-Reward Hypothesis

Assumptions and Boundary Conditions
Technology value half-life is one year
Risk-Reward Strategy
Adversary has ready access to up-to-date COTS technology

Adversary is able to share information and network resources at will
Target technologies/processes with high reward potential

Budget is fixed at $?

Risk = Requirements will obsolesce faster than capability is deployed
Conditional Risk Statement: If new capability cannot be deployed within 3 months of requirements identification, then technology will be obsolete when deployed and all time and money wasted.

Reward = Leveraging interoperability associated with open systems approaches (OSA) can allow delivery of new and better capability within the mission evolution cycle, providing profound operational advantage.

Conditional Reward Hypothesis: If critical new capability can be deployed within 3 months of requirements identification, on budget then enterprise will achieve threshold MOE = X % improvement over baseline

Risk = Failure to adequately protect information and network resources will lead to unacceptable vulnerability.
Risk Hypothesis: If security components do not assure cross security domain separation, then adversaries can compromise the system.

Reward = Sharing information and network resources across security boundaries will provide significant financial and logistical efficiencies.

Reward Hypothesis: If security components assure cross security domain separation, and provide policy-based access, then valued information and resources can be efficiently shared across security boundaries.

Risk = Failure to efficiently comply with C&A requirements will lead to unacceptable impact to cost and schedule.
Risk Hypothesis: If the C&A (RMF) process precludes new capability being deployed within 3 months of requirements identification, then technology will be obsolete when deployed and all time and money wasted.

Reward = Streamlining and standardizing C&A compliance will significantly reduce time and cost to deploy capability.
Reward Hypothesis: If RMF of new capability can inherit controls from a pre-certified standard security stack, then RMF can be achieved fast enough to allow new capability to be deployed within 3 months of requirements identification.

Risk = Failure to share critical information and resources with mission partners will lead to unacceptable mission vulnerabilities.

Risk Hypothesis: If security policy and security components do not allow sharing critical information across security boundaries, then blue forces will become vulnerable and/or lose potential for asymmetric information advantage.

Reward = Sharing critical information and network resources, across security boundaries, and within critical, time sensitive, mission threads, will provide asymmetric advantage.

Reward Hypothesis: If security policy and security components allow sharing critical information across security boundaries within critical mission thread time lines, then blue forces will achieve asymmetric information advantage.

Security regime will support rapid-onboarding of new technology (see security risk/reward strategy)
• Operators identify critical mission threads and associated desired outcomes up front
• Establish associated testable Measures of Effectiveness (MOE) lag metrics

Target technology portfolio with balanced risk profile

DCGS-modernization Risk Reward Hypothesis

Assumptions and Boundary Conditions
Technology value half-life is one year
Adversary has ready access to up-to-date COTS technology
Adversary is able to share information and network resources at will

Budget is fixed at $??
DCGS-modernization Cost Risk

• Establish Measures of Performance (MOP) lead metrics that are testably coupled to MOE lag metrics
• Build iterative test plan that assures MOP lead metrics and MOE lag metrics

•Perform AoA of potential technology components per the above

• At least 80% of technology components must exist as COTS/GOTS*
• Any developed technology has known transition path to COTS/GOTS
• All performers have prior success with Open System development
• Project scope and process must support technology onboarding within “Moore’s Law” time window
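The bulleted boundary conditions above can be read as a go/no-go screen for candidate technology components. A hypothetical sketch (thresholds taken from the text; the 18-month window is an assumption about the intended "Moore's Law" horizon):

    def passes_osa_screen(pct_cots_gots, transition_path_known,
                          prior_osa_success, onboarding_months,
                          moore_window_months=18):
        # >= 80% off-the-shelf components, known COTS/GOTS transition path,
        # prior open-system success, onboarding inside the assumed window.
        return (pct_cots_gots >= 80 and transition_path_known
                and prior_osa_success and onboarding_months <= moore_window_months)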

Risk = Requirements will obsolesce faster than capability is deployed, wasting all time and money expended.

Reward = Delivering new technology within mission evolution cycle provides measurably improved operational advantage against baseline.

Security regime will support rapid-onboarding of new technology (see security risk/reward strategy)

Risk = Insufficient investment in upfront interoperability vis-à-vis open system infrastructure will result in one-off solutions with unsustainable lifecycle tech refresh tails.

Conditional Risk Hypothesis: If upfront investments required to establish "plug-and-play" infrastructure are insufficient, then targeted downstream efficiencies will not be harvested.

Reward = Engineering interoperability through open system approaches can reduce tech refresh costs through plug-and-play of off-the-shelf capability.

Conditional Reward Hypothesis: If upfront investments are sufficient to establish "plug-and-play" infrastructure, then lifecycle tech refresh cost-per-capability measurably better than baseline values can be achieved.

DCGS-modernization Performance Risk

DCGS-modernization Schedule Risk

DCGS-modernization IA Risk

Risk-Reward Optimization Actions Due Complete? PFP DCGS Modernization R-R Actions

Establish persistent low barrier procurement vehicles

### Establish DCGS-m Approved Product List (APL) criteria

Risk = Information processing and sharing requirements, in the face of adversaries' access to rapidly evolving commercial technology, evolve too fast to address them with specialized solutions and associated long RDT&E cycles.

Conditional Risk Hypothesis: If upfront investments required to establish "plug-and-play" infrastructure are insufficient, targeted downstream efficiencies will not be harvested.

Reward = Operational and engineering interoperability, enabled by open system approaches, can enable better capability-per-cost by crowdsourcing DCGS-modernization measures of effectiveness to broader communities of off-the-shelf solution providers.

Conditional Reward Hypothesis: If critical new capability can be deployed within 3 months of requirements identification, on budget, then enterprise will achieve threshold MOE = X% improvement over baseline.

Risk = Traditional serial processes and excessive bureaucracy will prevent deploying new technology in time to harvest its marginal benefits.

Conditional Risk Statement: If the same acquisition processes that have led to the unsatisfactory current DCGS capability deployment time lines are applied to DCGS-modernization, then similarly unsatisfactory timelines will result.

Reward = Parallelizing acquisition processes such as collecting requirements, performing AoA, contracting, developing, testing, and certifying can enhance speed-to-capability.

Conditional Reward Hypothesis: If DCGS-modernization streamlines bureaucratic processes and performs traditionally serial processes in parallel, then DCGS-modernization will measurably improve speed-to-capability measured against baseline values.

Risk = Traditional IA paradigms based on physical separation and monolithic access control policies preclude achieving all objectives regarding cost, performance, and schedule.

Conditional Risk Hypothesis: If traditional IA arguments are used to certify and accredit DCGS-modernization open system approaches, then the targeted efficiencies re cost, performance, and schedule cannot be achieved.

Reward = New IA paradigms based on logical separation and dynamic "need-to-share" policies can support objectives regarding cost, performance, and schedule.

Conditional Reward Hypothesis: If AOs accept new IA paradigms based on logical separation and dynamic need-to-share policies, then targeted OSA efficiencies can be achieved.

30-May x Employ Army C5 OTA
30-Sep Create USAF OSA OTA

### Create standard project management templates appropriate for OSA
Build in security from the ground up
### Establish IA/CDS/C&A rqmts and team

2/30/15 x Engage USAF RDT&E AO
### Engage USAF DCGS AO

### Develop prototype
### Refine prototype

Establish project management team with expertise in OSA
Select internal contractors based on OSA prior performance
### Select internal contractors based on OSA prior performance

### Identify government lead for OSA program management best practice
### Schedule OSA training

Establish feedback loop with operational customers
2/30/15 x Identify operational SMEs
5/5/2015 Create operational simulation of targeted workflow

### Vendor SME engagement plan

Develop persistent buildtime/runtime "pluggable" reference implementation
### Make and execute plan for continuous evolution of Distributed Plugtest System
### Make and execute plan for continuous evolution of Distributed Plugtest System
### Make and execute plan for continuous evolution of Distributed Plugtest System
### Make and execute plan for continuous evolution of Distributed Plugtest System

2/30/15 x Establish MOE & MOP for development cycle #1

### Establish MOE & MOP for development cycle #2
Engage testing authorities early and often
### Engage JITC. Make and execute NR-KPP compliance plan.

### Engage DCGS testing authorities. Make and execute test plan.

Execute incremental tests Schedule incremental bundling/integration events

Perform marketing and outreach

Select government team based on OSA prior performance, or conduct OSA training

Establish developmental and test tools, processes, and infrastructure aligned with OSA MOP & MOE

Establish testable operational Measures of Effectiveness (MOE) and correlating Measures of Performance (MOP)

Scope work for parallel, incremental development of interoperable components
Write test procedures for test-driven development, build-a-little/test-a-little, approach

Publicize plans and visions
quarterly Review marketing plan and execution
### Issue brief solicitations based on clear description of MOE, MOP and associated test case
### Issue brief solicitations based on clear description of MOE, MOP and associated test case
### Issue brief solicitations based on clear description of MOE, MOP and associated test case
### Issue brief solicitations based on clear description of MOE, MOP and associated test case

### Expose HmC DPS to solution providers
### Invite vendors to assist development of "data rights" templates in PFP OTA language
### Invite vendors to assist development of "data rights" templates in PFP OTA language
### Invite vendors to assist development of "data rights" templates in PFP OTA language

Publish Acquisition Strategy, budget, and Systems Engineering Plan
### x Publish PFP Plan of Action
### Determine legal guidelines

Continuing x Use FedBizOps to share PFP goals, visions, and specific projects
2/30/15 x Establish collaborative portal on DPS (HmC/Confluence)

### Establish official web site for sharing PFP information

Issue brief solicitations based on clear description of MOE, MOP and associated test case

Expose persistent buildtime/runtime reference implementation to solution providers
Establish flexible "data rights" regimes that incentivize COTS providers to participate

PFP Better-Speed-to-Better-Capability Risk-Reward Strategy

PFP Enterprise Interoperability Risk-Reward Hypothesis

PFP Enterprise Information Assurance (IA) Risk-Reward Hypothesis

Risk-Reward Strategy

Target technologies/processes with high reward potential

: If new capability cannot be deployed within 3 months of requirements identification, then technology will be obsolete when deployed and all time and money wasted.

: If critical new capability can be deployed within 3 months of requirements identification, on budget, then enterprise will achieve threshold MOE = X% improvement over baseline.

: If security components do not assure cross security domain separation, then adversaries can compromise system.

: If security components assure cross security domain separation, and provide policy-based access, then valued information and resources can be efficiently shared across security boundaries.

: If C&A (RMF) process precludes new capability being deployed within 3 months of requirements identification, then technology will be obsolete when deployed and all time and money wasted.
: If RMF of new capability can inherit controls from pre-certified standard security stack, then RMF can be achieved fast enough to allow new capability to be deployed within 3 months of requirements identification.

: If security policy and security components do not allow sharing critical information across security boundaries, then blue forces will become vulnerable and/or lose potential for asymmetric information advantage.
: If security policy and security components allow sharing critical information across security boundaries within critical mission thread time lines, then blue forces will achieve asymmetric information advantage.

Operators identify critical mission threads and associated desired outcomes up front

Establish associated testable Measures of Effectiveness (MOE) lag metrics

Target technology portfolio with balanced risk profile

DCGS-modernization Cost Risk

Establish Measures of Performance (MOP) lead metrics that are testably coupled to MOE lag metrics
Build iterative test plan that assures MOP lead metrics and MOE lag metrics

Perform AoA of potential technology components per the above

At least 80% of technology components must exist as COTS/GOTS*
Any developed technology has known transition path to COTS/GOTS
All performers have prior success with Open System development
Project scope and process must support technology onboarding within “Moore’s Law” time window

: If upfront investments required to establish "plug-and-play" infrastructure are insufficient, then targeted downstream efficiencies will not be harvested.

: If upfront investments are sufficient to establish "plug-and-play" infrastructure, then lifecycle tech refresh cost-per-capability measurably better than baseline values can be achieved.

DCGS-modernization Performance Risk

DCGS-modernization Schedule Risk

DCGS-modernization IA Risk

Risk-Reward Optimization Actions PFP DCGS Modernization R-R Actions

Establish DCGS-m Approved Product List (APL) criteria

: If upfront investments required to establish "plug-and-play" infrastructure are insufficient, targeted downstream efficiencies will not be harvested.

: If critical new capability can be deployed within 3 months of requirements identification, on budget, then enterprise will achieve threshold MOE = X% improvement over baseline.

: If the same acquisition processes that have led to the unsatisfactory current DCGS capability deployment time lines are applied to DCGS-modernization, then similarly unsatisfactory timelines will result.

: If DCGS-modernization streamlines bureaucratic processes and performs traditionally serial processes in parallel, then DCGS-modernization will measurably improve speed-to-capability measured against baseline values.

: If traditional IA arguments are used to certify and accredit DCGS-modernization open system approaches, then the targeted efficiencies re cost, performance, and schedule cannot be achieved.

: If AOs accept new IA paradigms based on logical separation and dynamic need-to-share policies, then targeted OSA efficiencies can be achieved.

Employ Army C5 OTA

Create standard project management templates appropriate for OSA
Establish IA/CDS/C&A rqmts and team

Select internal contractors based on OSA prior performance
Identify government lead for OSA program management best practice

Identify operational SMEs
Create operational simulation of targeted workflow
Vendor SME engagement plan

Make and execute plan for continuous evolution of Distributed Plugtest System
Make and execute plan for continuous evolution of Distributed Plugtest System
Make and execute plan for continuous evolution of Distributed Plugtest System
Make and execute plan for continuous evolution of Distributed Plugtest System

Establish MOE & MOP for development cycle #1

Establish MOE & MOP for development cycle #2
Engage JITC. Make and execute NR-KPP compliance plan.
Engage DCGS testing authorities. Make and execute test plan.

Review marketing plan and execution
Issue brief solicitations based on clear description of MOE, MOP and associated test case
Issue brief solicitations based on clear description of MOE, MOP and associated test case
Issue brief solicitations based on clear description of MOE, MOP and associated test case
Issue brief solicitations based on clear description of MOE, MOP and associated test case

Expose HmC DPS to solution providers
Invite vendors to assist development of "data rights" templates in PFP OTA language
Invite vendors to assist development of "data rights" templates in PFP OTA language
Invite vendors to assist development of "data rights" templates in PFP OTA language

Publish PFP Plan of Action

Use FedBizOps to share PFP goals, visions, and specific projects
Establish collaborative portal on DPS (HmC/Confluence)
Establish official web site for sharing PFP information

PFP Better-Speed-to-Better-Capability Risk-Reward Strategy

Risk-Reward optimization factors

RO = Ability to continuously capture the operational customers’ perception of value within rapidly evolving operational domains (e.g. by establishing continuous feedback loop.) An MP might be “customer contact hours.” i.e. a measure of developers’ performance in communicating with the customer, and a leading indicator of developers’ ability to achieve greater utility in the eyes of the customer.)

RT = Ability to continuously harvest technological value in rapidly evolving technological domains (e.g. by applying best commercial practices for open standard product line architecture.) An MP might be “time required to configure component in the EIS stack,” which might be a leading indicator of the ME “time it takes to perform an increment of tech refresh.”)

RT = Ability to continuously harvest technological value in rapidly evolving technological domains (e.g. by applying best commercial practices for open standard product line architecture.) An MP might be “time required to configure component in the EIS stack,” which might be a leading indicator of the ME “time it takes to perform an increment of tech refresh.”)

R$ = Ability to predict lifecycle costs for continuously evolving capability (e.g. by heavily leveraging existing off-the-shelf technologies that come with well established life cycle tech refresh cost models. MP might be “lifecycle costs are known and are less than ‘X’.”)

RIA = Ability to balance the need-to-protect information and EIS network resources with the need-to-share them across security domains (e.g. by implementing need-to-share and need-to-protect with high assurance virtual technology. An MP/E might be “accredited ability to execute dynamic-policy-based need-to-share decision.”)
RVI = Ability to find and deliver valued information bits within tightly constrained decision windows, given large and growing backdrop of available information bits (e.g. by identifying critical conditions of interest and implementing automated “smart push” alerts. MP/E might be “run time demonstration of decision cycle time compression against use case of interest.”)
RPS = Availability of professional skills required for rapid evolutionary development (e.g. by evaluating vendors’ prior performance against similar open standard EIS projects. MP might be “documented success in prior performance on similar open system project.”)

PFP DCGS Modernization R-R Actions

Employ Army C5 OTA

Create standard project management templates appropriate for OSA
Establish IA/CDS/C&A rqmts and team

Select internal contractors based on OSA prior performance
Identify government lead for OSA program management best practice

Identify operational SMEs
Create operational simulation of targeted workflow
Vendor SME engagement plan

Make and execute plan for continuous evolution of Distributed Plugtest System
Make and execute plan for continuous evolution of Distributed Plugtest System
Make and execute plan for continuous evolution of Distributed Plugtest System
Make and execute plan for continuous evolution of Distributed Plugtest System

Establish MOE & MOP for development cycle #1

Establish MOE & MOP for development cycle #2
Engage JITC. Make and execute NR-KPP compliance plan.
Engage DCGS testing authorities. Make and execute test plan.

Review marketing plan and execution
Issue brief solicitations based on clear description of MOE, MOP and associated test case
Issue brief solicitations based on clear description of MOE, MOP and associated test case
Issue brief solicitations based on clear description of MOE, MOP and associated test case
Issue brief solicitations based on clear description of MOE, MOP and associated test case

Expose HmC DPS to solution providers
Invite vendors to assist development of "data rights" templates in PFP OTA language
Invite vendors to assist development of "data rights" templates in PFP OTA language
Invite vendors to assist development of "data rights" templates in PFP OTA language

Publish PFP Plan of Action

Use FedBizOps to share PFP goals, visions, and specific projects

Risk/Reward Factor

Risk Factor

Risk Factor


Risk/Reward Factor

[Chart: Risk Factor axis from 25 down to 0, plotted by month (Month 1, Month 3, Month 5); activity annotations #1,2,3.]

Risk Mitigation

Reward Enhancement

Completed Activity


[Chart: Reward Factor axis from 0 to 25, plotted by month (Month 5 through Month 11); legend: Risk Mitigation, Reward Enhancement, Planned, Actual, Planned Activity, Negative Consequence; activity annotations #4,5, #6, #7, #8.]

Likelihood Rationale


Likelihood Rationale

Consequence Rationale
Consequence = Return on Investment measured against a baseline value (Vb)
Cost (c) = Lifecycle cost including continuous tech refresh
Utility (u) = objective, testable ability to satisfy requirements, usually described in terms of Measures of Effectiveness (MOE)
Time (td) = time required to develop and deploy new capability
RoI = Value (V) = (Δu/c) ÷ td
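A minimal sketch of the value arithmetic above, V = (Δu/c) ÷ td, with consequence expressed as the ratio of a candidate's value to a baseline value Vb. All numbers below are hypothetical, chosen only to show the calculation:

    def value(delta_u, cost, t_d):
        # delta_u: utility gain (e.g. % MOE improvement); cost: lifecycle cost ($M);
        # t_d: months to develop and deploy new capability
        return (delta_u / cost) / t_d

    v_baseline = value(delta_u=100, cost=2.0, t_d=18)   # hypothetical baseline program
    v_candidate = value(delta_u=100, cost=1.0, t_d=6)   # hypothetical PFP increment
    print(v_candidate / v_baseline)                     # RoI relative to baseline -> 6.0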

COST Probability of Good Consequence

0.95960198

Upfront 3 5

RDT&E 20 15

Certification 12 10

Tech refresh 40 60

Other 25 10
ce 100 100

4.03980198

SCHEDULE Probability of Good Consequence

0.8

Scheduled Work Unit weight (w)

Training cycle 1 1 1 1
Training event 2 1 1 0
Develop cycle 1 0
Develop cycle 2 1 1 1
Bundling event 1 2 2 1
etc…
Sum 5 5 3

P(c) ≈ Aco = (ce − σco) ÷ ce

Actual Budget (% planned)

Planned Budget (%)

σco =

P(s) ≈ Adv = Σ(w × fn) ÷ Σ(w × pn)
w × scheduled work unit (p)

test result

0.98614359

Test 8 10

Customer Feedback 8 10

Train 8 10

Market research 8 10

Overhead 18 10
50

0.69282032

0.75704428

10012

4

Consequence

P(u) ≈ Aca = (td − σca) ÷ td

ta ts

td

σca =

P = P(c) × P(p) × P(s)
w × completed work unit (f)
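A hedged sketch of the roll-up above. The spreadsheet approximates probability of a good outcome from three actuals-vs-plan ratios, cost P(c) ≈ (ce − σco) ÷ ce, schedule P(s) ≈ Σ(w·f) ÷ Σ(w·p), and performance P(u) ≈ (td − σca) ÷ td, then multiplies them. Treating σco as budget overrun and σca as test-schedule slip is an interpretation, and the numbers are hypothetical:

    def p_cost(planned_budget, actual_budget):
        sigma_co = abs(actual_budget - planned_budget)      # cost variance
        return max(0.0, (planned_budget - sigma_co) / planned_budget)

    def p_schedule(weights, completed, scheduled):
        return (sum(w * f for w, f in zip(weights, completed))
                / sum(w * p for w, p in zip(weights, scheduled)))

    def p_performance(t_d, sigma_ca):
        return max(0.0, (t_d - sigma_ca) / t_d)             # sigma_ca: slip in months

    P = (p_cost(100, 105)
         * p_schedule([1, 2, 1], [1, 1, 0], [1, 1, 1])
         * p_performance(6, 1))
    print(round(P, 3))   # combined probability of good consequence, ~0.594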

Risk - Reward Optimization Plan for Distributed Plugtest System Hanscom Mil Cloud Node Development

Probability of positive consequence

Positive consequence
Reward factor

Probability of negative consequence

Negative consequence
Risk factor

3.1 Setup HmC Projects for each PFP Team

3.2.1

3.2.2

Distributed Plugtest System (DPS) development activity

3.2 Provide AF DCGS components as assets in HmC and provide the DIB Enterprise Suite as asset (GXP Xplorer)
3.2 Provide AF DCGS components as assets in HmC and provide the DIB Enterprise Suite as asset (Socet GXP)

3.2.3

3.2.4

3.3 Provide ISR data to PFP Teams
3.4.1 Provide Weekly Wednesday Dedicated PFP classes
3.4.2 Weekly vendor training man hours
3.5.1 Weekly vendor HmC DPS development activity (hrs)
3.5.2 Vendor contact hours with military SMEs
3.5.3 Weekly TWG vendor contact hours

3.6 Testing
4 IA activity

4.1 DPS RMF development
4.2 MRIP RMF development

5 Overhead activity

Consequence Rationale

MRIP MOE = process time; u = process time target; Δu = 100% improvement

3.2 Provide AF DCGS components as assets in HmC and provide the DIB Enterprise Suite as asset (DIB)
3.2 Provide AF DCGS components as assets in HmC and provide the DIB Enterprise Suite as asset (GDES (DDF))

Moore's Law expectation = 100% improvement every 18 months per constant X dollar investment = ~5% improvement-per-month-per-constant-dollar-investment

MRIP Positive Consequence

redaction process time, in 6 months, @ $1M = 13% improvement per month per $1M outlay = ~2X Moore's Law based expectation.

MRIP Negative Consequence

Status quo = 100 % reduction in relative capability every 18 months per adversary's ability to leverage Moore's law = ~5% relative decrease in capability per dollar per month.
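A quick check of the arithmetic in the two cells above, treating the rates as simple per-month percentages as the spreadsheet does:

    moore_rate = 100 / 18      # 100% improvement per 18 months ≈ 5.6% per month
    mrip_rate = 13             # claimed MRIP improvement, % per month per $1M
    print(round(moore_rate, 1), round(mrip_rate / moore_rate, 1))   # 5.6, 2.3 -> "~2X Moore's Law"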

Risk-Reward Optimization Plan for Distributed Plugtest System Hanscom Mil Cloud Node Development: RISK, REWARD

Positive Consequences: Maximal (5), Major (4); Negative Consequence: Minimal (1), Minor (2)
Plotted probability of positive consequence = 0.6772; probability of negative consequence = 0.3228
Each cell lists: risk factor, reward factor

Probability of negative consequence   Maximal/Minimal   Major/Minor
80-100% (5)                           5, 5              10, 4
61-80% (4)                            4, 10             8, 8
41-60% (3)                            3, 15             6, 12
21-40% (2)                            2, 20             4, 16
0-20% (1)                             1, 25             2, 20

Risk factor = (magnitude of neg consequence) X (probability of occurrence)

Reward factor = (magnitude of pos consequence) X (probability of occurrence)
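The matrix transcribed above follows directly from these two definitions. An illustrative reconstruction (the pairing of positive consequence n with negative consequence 6 − n, and of complementary probability bands, is inferred from the transcribed cell values):

    def cell(pos_consequence, prob_pos_level):
        # levels run 1..5; probability band 1 = 0-20%, ..., 5 = 80-100%
        neg_consequence = 6 - pos_consequence
        prob_neg_level = 6 - prob_pos_level
        risk = neg_consequence * prob_neg_level      # risk factor = magnitude x probability
        reward = pos_consequence * prob_pos_level    # reward factor = magnitude x probability
        return risk, reward

    for prob in range(1, 6):
        print([cell(c, prob) for c in (5, 4, 3, 2, 1)])   # rows of (risk, reward) pairs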

3.1 Setup HmC Projects for each PFP Team

Activity Value Weighting

Sked hours

Actual hours

Man hours training

Distributed Plugtest System (DPS) development

3.2 Provide AF DCGS components as assets in HmC and provide the DIB Enterprise Suite as asset (GXP Xplorer)
3.2 Provide AF DCGS components as assets in HmC and provide the DIB Enterprise Suite as asset (Socet GXP)

3.3 Provide ISR data to PFP Teams
Provide Weekly Wednesday Dedicated PFP classes
Weekly vendor training man hours 37
Weekly vendor HmC DPS development activity (hrs)
Vendor contact hours with military SMEs 2
Weekly TWG vendor contact hours
Testing 2

DPS RMF development
MRIP RMF development

1
c = $1M
td = 6 months

Upfront 5 5

RDT&E 15 15

3.2 Provide AF DCGS components as assets in HmC and provide the DIB Enterprise Suite as asset (DIB)
3.2 Provide AF DCGS components as assets in HmC and provide the DIB Enterprise Suite as asset (GDES (DDF))

COST Probability of Good Consequence

P(c) ≈ Aco = (ce − σco) ÷ ce =

Actual Budget (% planned)

Planned Budget (%)

Certification 10 10
Tech refresh 60 60
Other 10 10
ce 100 100

0

0.69231

3.1 1 1
3.2.1 1 1
3.2.2 1 1
3.2.3 1
3.2.4 1 1

3.3 1 1
3.4 1 1
3.4 1 1

3.5.1 1 1
3.5.2 1 1
3.5.3 1 1

4.1 1 1
4.2 1 1

5 1 1

Sum 13

σco =
SCHEDULE Probability of Good

Consequence

P(s) ≈ Adv = Σ(w × fn) ÷ Σ(w × pn) =

Scheduled Work Unit

weight (w)

w X scheduled work unit (p)

Risk-Reward Optimization Plan for Distributed Plugtest System Hanscom Mil Cloud Node Development: RISK, REWARD

Positive Consequences: Moderate (3), Minor (2), Minimal (1); Negative Consequence: Moderate (3), Major (4), Maximal (5)
Each cell lists: risk factor, reward factor

Probability of positive consequence   Moderate/Moderate   Minor/Major   Minimal/Maximal
0-20% (1)                             15, 3               20, 2         25, 1
21-40% (2)                            12, 6               16, 4         20, 2
41-60% (3)                            9, 9                12, 6         15, 3
61-80% (4)                            6, 12               8, 8          10, 4
80-100% (5)                           3, 15               4, 10         5, 5

Risk factor = (magnitude of neg consequence) X (probability of occurrence)

Reward factor = (magnitude of pos consequence) X (probability of occurrence)

Sked Date
Actual Date

On time =1, Late = 0

Test Sked Date

Actual Test Date

0.97824

3.6 Test 6 8

3.5.2 4 4

Performance Probability of Good Consequence

P(u) ≈ Aca = (td − σca) ÷ td =

ta ts
Customer Feedback

Training 2 2
3.5.1 + 3.5.2 8 10

5 Overhead 5 2
26

0.56568542495

0.67725

1 11 1

00

1 10

1 11 1

01 1

01 11 11 1

9

3.4.1 + 3.4.2

research

td

σca =

P = P(c) × P(p) × P(s) =

test result

w X completed work unit (f)

Pass = 1, Fail = 0

Vendor V&V, Yes=1, No=0

PFP Critical Path Ex Summary

ACTIVITY   PLAN START   PLAN DURATION   ACTUAL START

1000 Procurement 1 50 1

1020 1 21 1

1030 Prepare RFP 1 10 1
1040 Award 11 4

1050 15 14

1070 1 40

1080 Spiral 1 1 1
1090 Spiral 2 1 16
1100 Spiral 3 24 8

1110 20 16

5 6

2000 Engineering 1 15

2020 Create plugtest case#1 4 12 4

2030 4 12 4
2040 Create plugtest case#2 21 4

Risk Level

Track #

Establish USAF Open System Acquisition OTA

Establish Consortium governance

Identify FY15 sponsors, requirements, and budgets

Create PFP OSA acquisition artifacts

Establish DPS ver 0.1 hosted at HmC

2060
2070 Establish DGS-X node 12 16

2080 24 8

2090 12 24

2100

3000

3020 12 4

3030 21 24

3040 21 24
3050 Accredit DPS 9 40
3060 IATT ver 0.1 9 10
3070 IATC Test networks 12 24
3080 IATO ver 1.0 9 36

3090 12 24
3100 ATO ver 1.1 9 40

Interim Risk Status Criteria
Issue identified but no plan to address
Issue assigned to credible POC
A plan addresses issue with sufficient resources

Establish federated DPS ver. 1.0

Establish commercial cloud node(s)
Establish test network POPs
Establish operational network POP

Security

Identify security team FTE and budget
Build open standard virtual security layer
Establish streamlined C&A paradigm

IATT virtual cross domain services (VCDS)

PFP Critical Path Ex Summary Period Highlight: 10 Plan Actual

[Gantt chart scale and legend: ACTUAL DURATION and PERCENT COMPLETE columns; weekly columns spanning Jan through Dec; sample values 20% / 20%; legend: Plan, Actual, % Complete, Actual (beyond plan), % Complete (beyond plan).]