AFCEA Washington, DC Mobile Technology Summit



Wednesday, March 1, 2017

7:15 AM–5:15 PM

Ronald Reagan Building and International Trade Center

1300 Pennsylvania Ave., NW

Washington, DC 20004


AFCEA Washington, DC invites you to attend the 6th Annual Mobile Tech Summit on March 1, 2017. This event will feature keynotes from government leaders, government-led discussion panels, and an emerging mobile technologies expo.


The program explores mobile solutions from industry and government, focusing on solutions that solve real-world tactical and in-garrison challenges. The summit continues the dialogue between government and industry and addresses how government can capitalize on commercial innovation. The program also includes industry’s response to plans from the DoD CIO, civilian agencies, and the military service organizations.


The event features tactical mobility challenges and solutions within:

  • Emerging Technologies
  • Application Ecosystems
  • Sensors
  • Internet of Things
  • Tactical Cloud
  • Security


View the agenda


Click here for registration







MTD 2017: The Ninth International Workshop on Managing Technical Debt

The Ninth International Workshop on Managing Technical Debt will be held in conjunction with XP 2017 in Cologne, Germany, on May 22, 2017.


Visit the SEI’s website.

Technical debt is a metaphor that software developers and managers increasingly use to communicate key tradeoffs related to release and quality issues. The Managing Technical Debt workshop series has, since 2010, brought together practitioners and researchers to discuss and define issues related to technical debt and how they can be studied.

Workshop participants reiterate the usefulness of the concept each year, share emerging practices used in software development organizations, and emphasize the need for more research and better means for sharing emerging practices and results.


Call for Papers


Big up-front design has largely been replaced by iterative and agile development approaches. In some agile environments, the architecture is even meant to emerge over the course of the project through continuous revisiting and refactoring of the product code. Projects have shown that both approaches can lead to long-term insufficiencies in design or implementation, known as technical debt, a metaphor used to communicate key tradeoffs related to release and quality issues.


The Ninth Workshop on Managing Technical Debt will bring together leading software researchers and practitioners, especially from the area of iterative and agile software development, for the purpose of exploring theoretical and practical techniques that quantify technical debt.


Questions of interest for the workshop include but are not limited to the following:

  • What are root causes for technical debt outside of the code, and how do we evaluate them?
  • What is the impact of agile and iterative software development approaches on technical debt?
  • Are agile techniques and their iterative development potential root causes for the introduction of technical debt?
  • Does strategic use of technical debt provide insight into the balance between upfront and emergent design and architecture in an agile environment?
  • Can encouraged deprecation mechanisms, versioning, and architectural approaches like microservices help avoid technical debt by simply disposing of code?

The Managing Technical Debt workshop series has provided a forum since 2010 for practitioners and researchers to discuss issues related to technical debt, share emerging practices used in software development organizations, and emphasize the need for more research and better means for sharing emerging results. Consensus from our community indicates a need to focus on quantification approaches as well as qualification and measurement of technical debt on higher levels of design and architecture. Contributions from the area of agile and incremental development and their impact on technical debt are of special interest.
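
The principal-and-interest framing used in this community can be made concrete with a toy model (a sketch for illustration only, not a workshop result; all names and numbers are invented): principal is the one-time cost of remediating a debt item, and interest is the recurring extra maintenance cost paid each iteration the debt is carried.

```python
def carrying_cost(interest_per_iteration: float, iterations: int) -> float:
    """Total interest paid by carrying a debt item for `iterations`
    iterations without remediating it."""
    return interest_per_iteration * iterations

def payoff_is_cheaper(principal: float, interest_per_iteration: float,
                      iterations: int) -> bool:
    """Repaying now beats carrying the debt when the one-time remediation
    cost (principal) is less than the interest accrued over the horizon."""
    return principal < carrying_cost(interest_per_iteration, iterations)

# Illustrative numbers: a refactoring costing 40 hours now vs. 6 extra
# maintenance hours per iteration.
print(payoff_is_cheaper(40, 6, 10))  # → True  (60 hours of interest > 40)
print(payoff_is_cheaper(40, 6, 5))   # → False (30 hours of interest < 40)
```

Real quantification approaches, as the call for papers notes, are far harder: principal and interest must be estimated from code, design, and architecture measures rather than assumed.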


The following topics are aligned with the theme:

  • techniques and tools for managing technical debt in agile and DevOps environments
  • techniques and tools for calculating technical debt principal and interest
  • technical debt in code, design, architecture, and development and delivery infrastructure
  • measurements and metrics for technical debt
  • analyzing technical debt
  • visualizing technical debt
  • empirical studies on technical debt evaluations
  • relationship of technical debt to software evolution, maintenance, and aging
  • economic models for describing technical debt
  • technical debt and software life-cycle management
  • technical debt within the software ecosystem
  • technical debt in software models
  • concrete practices and tools used to measure and control technical debt

The SEI invites submissions of papers in any areas related to the themes and goals of the workshop in the following categories:

  • Research Papers: describing innovative and significant original research in the field (up to 8 pages)
  • Industrial Papers: describing industrial experience, case studies, challenges, problems, and solutions (up to 8 pages)
  • Position and Future Trend Papers: describing ongoing research, new results, and future trends (up to 4 pages)

Submissions should be original and unpublished work. Each submitted paper will undergo a rigorous review process by three members of the program committee. Papers must be submitted online via EasyChair and must conform to ACM’s general guidelines for academic publishing. Accepted papers will be presented at the workshop and published in the XP 2017 post-conference proceedings.



Paper submissions: March 3, 2017

Notification of acceptance: March 24, 2017

Camera-ready copy: TBD

Workshop: May 22, 2017

SEI’s Software Solutions Symposium



In March, the Carnegie Mellon University Software Engineering Institute (SEI) will host the Software Solutions Symposium (SSS) 2017 at the Hilton Crystal City in Arlington, VA.


The SEI is a federally funded research and development center at Carnegie Mellon University dedicated to helping government and industry organizations acquire, develop, operate, and sustain software systems that are affordable, enduring, and trustworthy. Building on the success of an inaugural event held in 2015, the symposium will provide information on emerging technologies and technical strategies for software-reliant systems.


In the dynamic environment of software development, it is challenging to keep up to date with new technologies and methods. The Software Solutions Symposium is a forum for learning about emerging technologies and practical solutions that you can apply today for help with systemic software issues such as assurance, cost, and schedule.


Senior researchers and practitioners who have spent years making the most complex systems and software work in industry and government programs will join together to offer half-day tutorials, talks, and panel discussions.


Click here to learn more and register now










Event Summary: Cyber Resilience Summit, October 20, 2016

CYBER RESILIENCE SUMMIT: Ensure Resiliency in Federal Software Acquisition

Topic: Improving System Development & Sustainment Outcomes with Software Quality and Risk Measurement Standards

Hosted by: Consortium for IT Software Quality (CISQ) in cooperation with Object Management Group, Interoperability Clearinghouse, IT Acquisition Advisory Council

Date: 20 October 2016 from 0800 – 1230

Location: Army Navy Country Club, 1700 Army Navy Drive, Arlington, VA

Agenda and Presentations:


Event Background


The Consortium for IT Software Quality (CISQ) held its semiannual Cyber Resilience Summit at the Army Navy Country Club in Arlington, Virginia in cooperation with the IT Acquisition Advisory Council (IT-AAC) and other IT leadership organizations. “Titans of Cyber” from the U.S. Federal Government attended the Summit to share critical insights from the front lines of the cyber risk management battle. The program focused on standards and best practices for measuring risk and quality in IT-intensive programs from the standpoint of productivity, software assurance, overall quality and system/mission risk. The discussion addressed proven methods and tools of incorporating such standard metrics into the IT software development, sustainment and acquisition processes.


Discussion Points


John Weiler, IT-AAC Vice Chair, and Dr. Bill Curtis, CISQ Executive Director, opened the Summit.
Dr. Curtis gave an overview of CISQ, explaining that it was co-founded in 2009 by the Software Engineering Institute (SEI) at Carnegie Mellon University and the Object Management Group (OMG) and is currently managed by OMG. The Consortium is chartered to create international standards for measuring the size and structural quality of software. Its mission is to increase the use of software product measures in software engineering and management. Dr. Curtis developed the original Capability Maturity Model (CMM) while at the SEI and now directs CISQ. Current sponsors include CAST, Synopsys, Booz Allen Hamilton, Cognizant, and others.


Significant CISQ contributions include:

  • A standard for automating Function Points that mirrors IFPUG counting guidelines
  • Four measures of structural quality to quantify violations of good architectural and coding practice:
    • Reliability
    • Performance Efficiency
    • Security
    • Maintainability
  • It is important to note that most measures of reliability assess system availability or downtime, which are behavioral measures. The CISQ measures assess flaws in the software that can cause operational problems. Thus, the CISQ measures provide prerelease indicators for operational or cost of ownership risks.

CISQ measures can be used to track software performance against agreed targets, as well as aggregated into management reports to track vendor performance. The continuing stream of multi-million-dollar failures is causing increased demand for certifying software. Although CISQ will not provide a certification service, it will provide an assessment process to endorse technologies that can detect the critical weaknesses comprising the CISQ Quality Characteristic measure standards.


The Security measure effort was led by the next speaker, Robert Martin, who oversees the Common Weakness Enumeration Repository maintained by MITRE Corporation. This repository contains over 800 known weaknesses that hackers exploit to gain unauthorized entry into systems.


Robert Martin, Senior Principal Engineer at MITRE, gave a presentation on: Defending Against Exploitable Weaknesses When Acquiring Software-Intensive Systems.



Mr. Martin’s main themes were:

  • We are more dependent upon software-enabled cyber technology than ever
  • Hardware and software are highly vulnerable so the possibility of disruption is greater than ever
  • Software in end items (e.g., cars, fighter jets) is growing at an exponential rate
  • Almost everything is cyber connected and co-dependent during operations and/or other phases of life
  • Today, up to 90% of an application consists of third-party code

Mr. Martin’s main questions were: How do we track and measure all the code characteristics flowing into software development? How do we determine and track what is really important?


Mr. Martin then described how to establish assurance by using an Assurance Case Model (Safety Case Tooling) with the elements of Claim/Sub-claim, Argument, and Evidence. He pointed out that this evidence-based assurance is an emerging part of NIST SPs 800-160 (draft) and 800-53 Rev 4.

  • This technique is good for capturing complicated relationships
  • Tying the evidence to supported claims can be an ongoing part of creating and maintaining the system
  • It is useful for Mission Impact Analysis and Cyber Risk Remediation Analysis
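
The Claim/Argument/Evidence structure described above can be sketched as a small data model (a hypothetical rendering for illustration, not MITRE’s actual Safety Case Tooling; all class and field names are invented):

```python
from dataclasses import dataclass, field

@dataclass
class Evidence:
    description: str  # e.g., "static analysis report, build 142"

@dataclass
class Claim:
    statement: str                 # the claim or sub-claim being argued
    argument: str = ""             # why the evidence supports the claim
    evidence: list[Evidence] = field(default_factory=list)
    subclaims: list["Claim"] = field(default_factory=list)

    def supported(self) -> bool:
        """A claim is supported when it has direct evidence or sub-claims,
        and every sub-claim is supported in turn."""
        has_backing = bool(self.evidence) or bool(self.subclaims)
        return has_backing and all(c.supported() for c in self.subclaims)

# Tiny example: a top-level claim backed by one evidenced sub-claim.
top = Claim(
    "System resists common input-handling weaknesses",
    argument="All sub-claims hold",
    subclaims=[Claim("Input validation weaknesses absent",
                     evidence=[Evidence("SAST scan: no CWE-20 findings")])],
)
print(top.supported())  # → True
```

As the bullet points note, the value of this shape is that evidence can be re-tied to claims continuously as the system evolves, rather than assembled once at accreditation time.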

Mr. Martin also identified the Common Weakness Scoring System (CWSS) and the Common Weakness Risk Analysis Framework (CWRAF), which apply to the SANS Top 25 list. He then identified the benefits of using multiple detection methods, as some are good at finding cause while others are good at finding effect. We can use multiple detection methods to collect evidence through all phases of development, from design review through red teaming, all the while considering the most important common weaknesses.
Mr. Martin then discussed program protection planning for prioritization/criticality analysis, using the assurance case to tie claims to supporting evidence. He also introduced the concept of “trustworthiness,” which is a combination of factors like safety, privacy, security, reliability, and resilience. For example, a security “false positive” may be a safety or reliability issue.


Finally, Mr. Martin pointed out that we can manage assurance cases with claims and the association of evidence to claims and that the evidence is articulated using structures related to common weaknesses. This accounts for all types of threats including human error, system faults, and environmental disruptions. Assurance cases can also be exchanged and integrated to aid extended system analysis.


CISQ’s standards measure security, safety, and reliability in a consistent and computable way.


Next, the Titans of Cyber panel was led by Dr. Marv Langston, Principal, Langston Associates.




Dr. Marv Langston introduced the members:

  • Ray Letteer of the USMC stated that the Marine Corps’ cyber concerns center on operational working metrics based on standards. The need is for technical details, not a “trust me” plan of action and milestones.
  • Kevin Dulany of the DIAP (a protégé of Dr. Letteer) stated that networks are “hard on the outside but soft in the middle.” Attacks today are data driven, operating inside the networks, but current risk management frameworks (RMFs) are based on traditional IT constructs. Procedures require the use of these RMFs, but the mitigations do not actually apply. Embedded computing drives the need for a different approach with new mitigations. Kevin also noted that some systems tend to be categorized at a lower security level because of the lack of resources.
  • Chris Page of ONI stated that we do have superiority because of our great technology, citing that the Navy pub, “Design for Maintaining Naval Superiority” is a message (warning) to our adversaries.
  • Martin Stanley of DHS said that their focus is on securing high-value assets. Their process assesses the security posture, applies measures for a year, and then reassesses with lessons learned. He went on to say that root causes are related to basic IT practices and ways the organization must operate systems. His organization is producing enterprise architecture (EA) guidance that is unusual for cyber because it attempts to address root causes.
  • J. Michael Gilmore of DoD OT&E said that our systems are not designed with cyber security as a priority. He gave an example of a supporting network for an aircraft that was not considered in recovery and was also tied to a vendor network. He explained that capabilities need to be secured on both the government and contractor sides. Mike also noted that people can be a major conduit for bad cyber, especially worldwide partners. Finally, he added that the Joint Regional Security Stacks (JRSS) are essential for DoD but that people in the field are not fully trained and don’t understand their use or vulnerabilities.

Marv Langston kicked off the discussion by stating:

  • We test and deliver but don’t look back. Why don’t we do cyber tests on operational systems – on a daily basis?
  • Gilmore – We have a limited project to do and are facing resource constraints so we push back on current cyber authorities. The commercial sector does continuous red teaming but the Government is resisting – deploy and forget.
  • Letteer – Testing must be continuous. The USMC has established “White Teams” to do continuous scans, coordinating with red teams. In addition, we have cyber protection teams to help put in mitigations. This is not as widespread as we would like but we are making progress. The USAF has a similar program.
  • Stanley – Many agencies are working with DHS on continuous monitoring. The traditional Certification and Authorization (C&A) process has a place, but continuous monitoring is supplemental. Today, compliance is treated as more important than continuous monitoring, and this must change.
  • Dulany – We used to have high emphasis on system security engineering but with too much reliance on contractors. Today we look at controls but that does not get us down into useable specifications or standards. The RMF is a good tool but we have problems keeping up with technologies. We cannot use tools in certain environments, so continuous RMF would help to reinforce compliance.

Langston – I’m concerned that we will wear ourselves out with all these processes but will miss the critical checks, like the daily cyber check.

  • John Weiler – The software market is constantly refreshing, but we still have weapon systems running 1985-era software.
  • Gilmore – We resist processes that are not spelled out in specifications. What percentage of this good stuff is actually in RFPs? We will get resistance to innovative metrics because they are not in the specs.
  • Letteer – I agree; we are trying to get cyber security measures into RFPs, but this is hard with any specifications. We are used to doing cyber (requirements) in general but cannot do it in the specifications. Because of this, cyber security is not a mandated function.
  • Gilmore – In response to cyber security, we used to hear, “There are no formal requirements for cyber security in DoD requirements.” The Joint Staff is working on cyber Key Performance Parameters (KPPs), but we are not there yet. All we have so far is a document that describes acceptable degradations after a cyber attack. As of yet, there are no cyber security requirements blessed by the JROC.
  • Page – Also, we see people shopping around for a threat profile to fit the security they implemented.

Questions from the audience:

  • Question – Isn’t there software assurance metrics language that could be adapted to programs?
  • Panel – Sounds good to us, but we do not tend to use the most modern tools like the Google desktop.
  • Question – Today’s cyber activities seemed to be aimed at the whole stack but not at individual levels.
  • Panel – This relates to how we cement Government – Industry partnerships. We are good at sharing high-level information but not at collaborating on the details. How do we change what we do across the environment? We can look at the kill chain concept to identify our weak points. However, we must look at needed capabilities first then look at tools.
  • Question – Relating to cyber KPPs, we are trying to work with operators on how we manage risk. We need new commercial standards and it is important to work with industry.
  • Panel – Agree, but we are disappointed that this is taking two years. In addition, we are not sure we have PMO experience to understand cyber security engineering architectures. There is also more emphasis on getting complex mobile IT networks to function in the field. This has analogies in the commercial market but not specifics. Therefore, we struggle to get them to work and to facilitate links to supporting entities who can address failures. Common sense cyber controls would cause the (mobile) system to fail – not sure what we can do about this.
  • Question – The scale of nodes is moving from millions to billions, what does the panel think of this?
  • Panel – Going to IPV6 will help drive this complexity. Of course, our mobile devices have access to our networks. We have to focus on the assets we really want to protect, not everything. We must be prepared for continuous surprise. We need to keep up with bad actors, but the solution is not necessarily to modify the RMF. This is a continual slog that we continue to do over the years.




Keynote speaker Dr. David Bray, CIO of the FCC, presented: Charting Cyber Terra Incognita: A CIO’s Perspective and Challenges.




Dr. Bray began his presentation by emphasizing the exponential growth in IT, human participants, and networked devices. His Terra Incognita (unknown land) is the combination of complex legacy infrastructure and the explosive growth of internet-connected systems, all with human actors and behaviors. This complexity and human interaction make it impossible not to have cyber issues, so we must strive for resiliency. He said that cyber threats will run like infectious diseases across borders, and that the public and private sectors, academia, and non-profits must all build bridges to deal with this. Fortune 500 CEOs cite having one cyber security engineer for each $1B of data. In addition, threats are over-classified, so it is often hard to make the case for cyber security support.

Dr. Bray mentioned “DIY” and the “internet of everything” that is outpacing cyber controls, citing examples such as industrial controls (moving to the internet with weak security that consumers are not willing to pay to improve) and capabilities for grassroots entrepreneurs pioneering civic and social innovation (which could be exploited by terrorists). He also described a greater reliance on IT, with machine learning as an essential complement to human activities. Exponential growth in technology is also spilling over into bio warfare with DNA-engineered bugs; everything in cyber could be in bio within 5 years. He cited the example of the FCC, which was a sitting target for cyberattacks but then went to 100% public cloud, moving from on-premise to off-premise.

In the briefing, Dr. Bray then described the giant leap from IPv4 to IPv6: the address space grows from 2^32 to 2^128. This is like moving from the volume of a beach ball to the volume of the sun! Dr. Bray talked about the importance of 21st-century “public service” vs. 20th-century governance. He ended by emphasizing the need for more “change agents” in these exponential times.
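
The scale of the IPv4-to-IPv6 jump (32-bit to 128-bit addresses) is easy to verify with quick arithmetic:

```python
# IPv4 uses 32-bit addresses; IPv6 uses 128-bit addresses.
ipv4_addresses = 2 ** 32
ipv6_addresses = 2 ** 128

print(f"IPv4:  {ipv4_addresses:,}")    # about 4.3 billion addresses
print(f"IPv6:  {ipv6_addresses:.3e}")  # about 3.4e38 addresses
# The ratio is itself an astronomical power of two: 2^(128-32) = 2^96.
print(f"Ratio: 2^{(ipv6_addresses // ipv4_addresses).bit_length() - 1}")  # 2^96
```

A factor of 2^96 (roughly 7.9e28) makes the beach-ball-to-sun comparison a fair order-of-magnitude image.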


At the conclusion of Dr. Bray’s presentation, John Weiler (IT-AAC) asked, “What do we see as the difference between (big) agile acquisition and agile development?”


Dr. Bray – We should not be in the code writing business. We are trying to procure IT capabilities in 6-9 months so agile acquisition should be an “a la carte” method with selectable modules.


Leo Garciga, Joint Improvised Threat Defeat Agency (JIDO) – In this commoditized environment, why do we still build custom stuff? Even standards are commoditized.


Dr. Bray – The commodity approach is good. Instead of Business Process Engineering, we should just keep it simple and draw on the board “how do you want to work.” In the FCC, we tried to automate an online form; the initial estimate was $17M, but we found we could do it for $450K using the commodity approach.


Question from audience – What should we do about weapon systems and cyber vulnerabilities?


Dr. Bray – We must balance availability and protection. Sometimes we rule out cloud-based solutions by asking ourselves “do I want this on the internet?”


John Weiler – What are services that can be on the internet?


Dr. Bray – We can move to limited public and Government internets (Taiwan and Australia do this.)


Question from audience – How do we retrofit TCP/IP to be more secure in flight?


Dr. Bray – 1. Trust but verify (red teams). 2. Focus on Mission (what do you really need?)


Next, Leo Garciga, J6 Chief / CIO, JIDO and Ryan Skousen, Software Engineer, Booz Allen Hamilton presented: Integration of Security and Agile/DevOps Processes.




The presentation began with a review of JIDO’s mission as a quick reaction capability – to bring timely solutions to war fighters. The J6’s mission for IT is to:

  • Build a Big Data analytic platform, “Catapult,” and tool suite based on real-time, tactical needs
  • Embed with users worldwide to understand available data, analytic methodologies, and capability/data gaps
  • Provide solutions that are at times required the same day

JIDO has been doing Agile SDLC for five years. Continuous integration is already implemented with nightly security scans. Release management with traditional CM/CCB is still hard. Agile alone is not enough.

  • Quick reaction capability to emerging threats
  • Quicker than standard DoD process; seeing agility and speed
  • Length of time to approve is standard
  • Intel fusion system with focus on how to change and when we need to change


JIDO started its DevOps evolution in 2015. Security and compliance are built in up front. JIDO’s goal is to completely automate deployments from code to production. We think this is a great capability.

  • Focus on managing risk and not compliance
  • Small changes
  • No manual/human review gate
  • Affordable by other agencies


Security/accreditation – ongoing authorization is secure agile + DevOps + continuous monitoring. JIDO has also developed an automated ongoing-authorization pipeline.

  • Think through C&A before writing code
  • Adopt mission focus
  • Security accreditation (per NIST SP 800-37) can be automated to a large extent and should help to implement decisions by continuous monitoring instead of one-by-one inspection of packets – sort of an “ongoing authorization”

JIDO is still working to transition the capability. This is hard to do, but we are working to make it transferable to other agencies.


Major takeaways:

  • Secure design and planning throughout SDLC
  • Containers for standardized deployment packaging
  • Secured, transparent DevOps pipeline
    • Prohibits tampering; provides monitoring and traceability
    • Escalation based on code triggers (code delta, coverage)
  • Type-accredited platform to receive and run containers
  • It is like having a trusted candy factory, packaging goodies into bulletproof briefcases, transporting them through a point-to-point hyperloop, and delivering them to candy shops with turrets – do we really need to lick every lollipop?
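
The “escalation based on code triggers” takeaway can be sketched as a simple gate function (a hypothetical illustration; the function name and all thresholds are invented, not JIDO’s actual pipeline logic):

```python
def deployment_gate(lines_changed: int, test_coverage: float,
                    scan_findings: int) -> str:
    """Decide whether a change flows through the automated pipeline
    or is escalated for human review. Thresholds are illustrative."""
    if scan_findings > 0:
        return "block"        # security scan findings always stop the pipeline
    if lines_changed > 500 or test_coverage < 0.80:
        return "escalate"     # large code delta or weak coverage -> human review
    return "auto-deploy"      # small, well-covered, clean change

print(deployment_gate(lines_changed=42, test_coverage=0.91, scan_findings=0))
# → auto-deploy
```

The point of such a gate is the same as the summary’s: risk-based automation replaces a manual review gate for routine changes, while code triggers pull humans back in for the risky ones.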


Question from audience – How long will it take to fully implement JIDO Agile/DevOps process?


JIDO – We are now in full deployment across the classified and unclassified environments. We still have some staff education issues, but technically we are up and running and working out CM problems.


John Weiler – In the DevOps world, speed sacrifices some assurance. How do we recognize and incorporate engineering needs for safety/security?


JIDO – DevOps cannot be used for new ground-up systems. Our process incorporates assurance by having daily scrums.


John Weiler – Security engineering cannot be determined by engineering. We must force rigorous engineering of systems with knowledge infusion.


JIDO – Yes, we do that by scanning in real time.


Question from audience – How do you characterize the tech challenge vs. the people challenge?


JIDO – This is a HUGE cultural change. We initially had a lot of push back with people worried about their rules.


The second panel, Standards of Practice for IT Modernization and Software Assurance, was led by Dr. Bill Curtis, CISQ Executive Director.




Don Davidson, DoD, kicked off the panel with a short presentation on DoD cyber resilience:

  • Cyber resilience ensures that DoD missions (and the systems that critically enable them) are dependable in the face of cyber warfare by a capable cyber adversary.
  • The DoD cybersecurity campaign:
    • Cybersecurity discipline implementation plan
    • Cybersecurity scorecard
    • Culture and compliance
  • The campaign covers these cybersecurity disciplines:
    • Strong authentication for access
    • Device hardening with configuration management / SW patching
    • Reduction of attack surface
    • Monitoring and diagnostics
  • Mission appropriate cybersecurity balances risks vs. additional security (beyond cybersecurity discipline) for trusted systems
  • Approach incorporates fundamental basis of supply chain risk management and addresses compliance through policy.


Joe Jarzombek, Synopsys – We are starting to implement SW assurance systems to address low hanging fruit.


Tom Hurt, DoD – Layers of cybersecurity are like multiple Maginot Lines, applying 95% of assets to 16% of problems. Software must be integrated into system engineering.


Emile Monette, DHS – We have challenges interpreting cases and do not cover them all. We have many weaknesses in thousands of categories, and automation is difficult. The system security measures we discussed today are useful, but we can also focus on human expertise and leave other forms of assurance to automation.


Mr. Jarzombek – It is about leadership, not technical issues. KPPs get diminished for functionality. We need to be more demanding on providers and have more specific requirements, MOEs, and testing. We can specify industry standards, but we must also help providers work through issues.


Mr. Davidson – We need to write KPPs because there are baseline security requirements that cannot be traded away. CIOs and CISOs are always fighting – but it needs to be a healthy dialog.


At the Black Hat conference, we heard:

  • Major breaches will continue for two years (bad for CISOs)
  • Industry may have to provide software with warranties
  • Software as a Service (SaaS) is a good model. Self-driving cars will lead to insuring software!
  • Sourcing untrusted libraries may drive some away from COTS to in-sourcing


Mr. Hurt – For mission assurance, we can take successful attacks back through architects and engineers to analyze with tools, including penetration testing. Why don’t we have red hat (penetration) tests as part of O&M?

  • We could avoid vulnerabilities in development
  • It always takes more money to fix something after the fact


Dr. Curtis – 40% of software engineers are self-taught.


Panel members – We should ask if people and products we have are certified. We need (strong) leadership to avoid deploying dangerous products. This can be part of the RFI. One approach to vetting would be to have industry recommend proper controls, but other vendors may reject the recommendations.


Dr. Curtis – We need to know that a piece of software has some sort of certification. Education may help, but this is a complex issue and cyber courses in schools are not standardized. Institutions are now promoting cybersecurity basics in software engineering schools. We could approach this like a community “buyers’ club” – putting assurance in all Agency networks with requirements to build security into the software. This idea is emerging in industry such as the Vendor Security Alliance. These are models we could use to promote Government standards.


Mr. Hurt – The DoD Program Protection Plan requires use of assurance measures. We need assessments that are passed on to DT, OT, and O&M. We had a Joint Federation Information Assurance IOC in April 2016.


Dr. Curtis – How does cybersecurity work with agile? Agile is not incompatible with this and assumes activities are engaged with customers.


Mr. Hurt – For each sprint, we need a good set of allocated requirements and they must cover assurance – so we blend assurance into agile.


Question from audience – Do we need continuing education for cybersecurity professionals?


Panel – Yes, it is required for CISSPs. In addition, software engineers should be networked to work fixes to bugs. There are software development courses that cover cybersecurity but we still lack hard and fast requirements. The Government always asks for Project Management Professionals (PMPs) but rarely for cyber credentials.


Who is teaching formally verified code? This is a great concept for merging AI with humans, but we don’t know how mature it is or how long it would take to train someone.


Question from audience – What are we doing to give “tactical,” hands-on knowledge?


Dr. Curtis – Industry does not want to train and generally looks for experience. We have professional students vs. untrained practitioners. There is lots of pressure to push out code.


Question from audience – How does Government want industry to train? What certifications?


Mr. Hurt – New DoD 5000.2 will have software tools. We hope policy will move into guidance and best practices on websites. (DoD has 100,000 system engineers)


Question from audience – There is no certification in industry for security in software coding so we have to use contract (language) to govern security requirements. The FAR allows us to make suppliers fix bad software, but who exercises this? It does not seem to stand up in court.


Mr. Garciga – Scanning of software helps to deal with this. We should scan before acceptance. We must also get source code and the software design description (SDD) to promote organizational maturity. See WhiteHouse.gov on open source code, which is forcing PMs to build document libraries of software with access to source code.


The Cyber Resilience Summit ended at 12:30 with closing comments from John Weiler (IT-AAC) and Dr. Curtis (CISQ).



Join us at the next Cyber Resilience Summit on March 21, 2017 in Reston, Virginia.


Contact: Tracie Berardi, Program Manager, Consortium for IT Software Quality (CISQ); 781-444-1132 x149









Cyber Resilience Summit: Measuring and Managing Software Risk, Security and Technical Debt


The Consortium for IT Software Quality is bringing the Cyber Resilience Summit to Europe, to take place on 6 June 2017 in Brussels, Belgium — the vibrant heart of political Europe and headquarters of the European Commission. The theme of the Summit is “Measuring and Managing Software Risk, Security and Technical Debt.” Discussion will focus on the latest strategic thinking from innovative American and European CIOs and IT policy makers.


Registration: Admission is complimentary. RSVP is required. Click here to register now.


The program will cover:

  • Managing security and risk with software measurement
  • Applying standard quality metrics to internal benchmarking, vendor agreements, and governance
  • Outcome-based contracts and service level agreements
  • Using software quality standards to comply with regulations
  • Positioning software measurement as a support mechanism for your team while prioritizing actions for business
  • Managing system complexity from a technology and architectural standpoint


Meeting Location:
Radisson Blu Royal Hotel
Rue du Fossé-aux-Loups 47
Wolvengracht 47
1000 Brussels, Belgium


The Cyber Resilience Summit is part of the OMG® Technical Meeting from 5-9 June 2017 in Brussels. Enjoy a discounted hotel room rate until 18 May 2017. Click here to book a hotel room.




09.00 – 15.00 includes lunch, refreshment breaks and networking. Formal agenda to be published soon!


OMG and CISQ Mission – Dr. Richard Soley, CEO, Object Management Group

Dr. Soley introduces the Consortium for IT Software Quality™, an IT leadership group co-founded by the OMG and Software Engineering Institute at Carnegie Mellon University to deliver automated measures of software size and quality from system source code.  Dr. Soley shares the critical context behind CISQ’s formation and mission objectives.


Advances in Software Quality Measurement – Dr. Bill Curtis, Executive Director, Consortium for IT Software Quality

Dr. Curtis introduces software quality standards for use in IT benchmarking, productivity analysis, service level agreements, and vendor relationships. Dr. Curtis will discuss how to manage IT risk and mitigate technical debt.


Keynote: Dr. J. Michael Gilmore, former Director of Operational Test and Evaluation, U.S. Department of Defense, now at RAND


Government Panel Discussion
Discussion lead:  
Professor Georges Ataya from Solvay Brussels School, Academic Director of Information Security Management Education, Managing Partner ICTC.EU, and Vice President of the Belgian Cybersecurity Coalition

Delegates from NATO, the European Commission, and the U.S. Department of Defense are invited to discuss the protection of government systems and applications, improving software procurement, industry regulations, and compliance directives.


Lunch 12:00 – 13:00


CIO Panel Discussion
Discussion lead:
Matthew Crabbe, Editor,

Innovative CIOs across industries are invited to discuss their strategies for improving software quality, security and cyber risk management. Discussion points include: Managing security and risk with software measurement; managing system complexity from a technology and architectural standpoint; enabling digital transformation.


Deloitte Presentation on Operational Risk

Deloitte will discuss the relationship between operational risk and IT software quality. By identifying and managing key areas of risk, an organization can release capital to work on new projects.




We look forward to hosting you in June!  Save your seat and register today!


















Cyber Resilience Summit: Securing Systems inside the Perimeter




Topic: Improving System Development and Sustainment Outcomes with Software Quality and Risk Measurement Standards


Hosted by: Consortium for IT Software Quality (CISQ) in cooperation with Object Management Group (OMG) and IT Acquisition Advisory Council (IT-AAC)


Date: Tuesday, March 21, 2017, 8:00am – 12:30pm


Location: Hyatt Reston Town Center, 1800 Presidents Street, Reston, VA 20190


RSVP: The event is sold out! Contact Tracie Berardi, 781-444-1132 x149.


As the journey to secure our nation’s IT cyber infrastructure gains momentum, it is important to apply proven standards and methodologies that reduce risk and help us meet objectives for acquiring, developing and sustaining secure and reliable software-intensive systems. The theme of the March Cyber Resilience Summit is Securing Systems inside the Perimeter. Defending the network is NOT enough. The most damaging of system failures and security breaches are caused by vulnerabilities lurking inside the network at the application layer.


The discussion focused on meeting assurance-driven objectives, digital transformation, and cyber risk measurement at scale, covering risk-managed evolution and the practical application of systems engineering to support cloud readiness, big data, technical debt control, and risk management of complex mission, C2, weapon, and citizen-facing systems. 300 attendees registered from the White House, OMB, DoD, DHS, NSA, and several other Federal agencies.






Visit CISQ Members Area, “Event and Seminar Presentations,” or click links in agenda below



MeriTalk: Government Cyber Efforts May Focus on Wrong Things and FCW: NGA wants 24-hour cloud ATOs


Request an On-site Meeting, Lunch Briefing or Speakers

Over 1,000 people have attended the Cyber Resilience Summit series. Education on these topics is critical if we are to effectively manage these risks in Public Sector acquisition and IT program management. We can bring Cyber Resilience best practices to your site or event. To schedule a meeting or speakers contact:






Emcee: Don Davidson, Chief, Lifecycle Risk Management & Cybersecurity/Acquisition, U.S. Department of Defense


7:45am Registration Desk and Refreshments
8:00am Welcome to the Cyber Resilience Summit
– Dr. Bill Curtis, Executive Director, Consortium for IT Software Quality (CISQ)
– John Weiler, Vice Chair, IT Acquisition Advisory Council (IT-AAC)
– Marc Jones, Director of Public Sector Outreach, Consortium for IT Software Quality (CISQ)
– Don Davidson, Chief, Lifecycle Risk Management & Cybersecurity/Acquisition, U.S. Department of Defense
8:20am Keynote: What’s Holding Us Back? – Maj Gen Dale Meyerrose (Download presentation PDF)
Dr. Dale Meyerrose, Major General, U.S. Air Force retired, was the first President appointed, Senate-confirmed chief information officer and information sharing executive for the U.S. Intelligence Community.
8:50am Advances in Measuring the Security and Architectural Integrity of Mission-Critical Systems (Download presentation PDF)
Dr. Bill Curtis, Executive Director, Consortium for IT Software Quality (CISQ)
9:20am Modernizing and Securing Legacy IT Systems
A review of the Presidential Executive Order for Cyber Security and Modernizing Government Technology Act (Meeting Handout)
Lead: John Weiler, Vice Chair, IT Acquisition Advisory Council (IT-AAC)

– Dr. Mitch Crosswait, Deputy Director, Net Centric and Missile Defense Systems, Operational Test and Evaluation, U.S. Department of Defense
– Dr. J. Brian Hall, Acting Deputy Assistant Secretary of Defense for Developmental Test and Evaluation
– Dave Epperson, CIO of NPPD, U.S. Department of Homeland Security
– Jason Hess, Chief, Cloud Security, Office of the Chief Information Officer (OCIO), National Geospatial-Intelligence Agency
– David McKeown, GS-15, CISSP, Chief, Cyber Security Center, Joint Service Provider, DISA
– Tony Davis, Acting Command Acquisition Executive, USCYBERCOM
10:00am Refreshment Break
10:15am Remarks from Dr. Ben Calloni, co-chair of the OMG’s Systems Assurance Task Force
10:30am Titans of Cyber: Critical Insights from the Front Lines of the Cyber Risk Management Battle

Lead: Don Davidson, Chief, Lifecycle Risk Management & Cybersecurity Acquisition, U.S. Department of Defense


Titans of Cyber speakers:

– Sonny Bhagowalia, CIO, U.S. Department of the Treasury
– Dr. Ray Letteer, Chief, Cyber Security Division, U.S. Marine Corps
– Dr. Ron Ross, Fellow, National Institute of Standards and Technology (NIST)
– Rod Turk, Acting CIO, U.S. Department of Commerce
– Danny Toler, Deputy Assistant Secretary, CS&C, NPPD, U.S. Department of Homeland Security (US CERT website)

11:30am Use Case: Putting CISQ Standards into Action at Agile Speed
Barry Snyder, DevOps Manager, AD&M Development Services, Fannie Mae
12:00pm The Value of Security Benchmarks and Controls (Download presentation PDF)
Curtis Dukes, Executive Vice President, Center for Internet Security
12:30pm Closing Remarks












Marc Jones, CISQ’s Director of Public Sector Outreach, welcomes attendees to the Cyber Resilience Summit and introduces emcee, Don Davidson (DoD).


Keynote speaker, Dr. Dale Meyerrose, Major General, U.S. Air Force retired, presents What’s Holding Us Back?


Dr. Bill Curtis, CISQ’s Executive Director, presents Advances in Measuring the Security and Architectural Integrity of Mission-Critical Systems.


John Weiler, IT-AAC Vice Chair, leads the power panel, Modernizing and Securing Legacy IT, with (L-R) Jason Hess (NGA), Tony Davis (USCYBERCOM), David McKeown (DISA), Dr. Mitch Crosswait (DoD), Dr. J. Brian Hall (DoD).


Cyber Resilience Summit emcee, Don Davidson (DoD), leads the Titans of Cyber panel with (L-R) Dr. Ray Letteer (USMC), Dr. Barry Horowitz (UVA), Danny Toler (DHS NPPD), and Rod Turk (Commerce).


Barry Snyder, DevOps Manager at Fannie Mae, presents Putting CISQ Standards into Action at Agile Speed.

Curtis Dukes, EVP of the Center for Internet Security, presents The Value of Security Benchmarks and Controls.


The Cyber Resilience Summit sold out! Thank you for participating in this important discussion.

















Gartner Application Strategies & Solutions Summit



The Premier Event for Accelerating Engagement, Driving Customer Experience and Delivering Digital Business Innovation


Today’s applications are key drivers of business advantage. Application leaders are looking for ways to update their application strategy to deliver agility and stability. They need to bridge today’s application silos for greater impact and efficiencies.


Craft an application strategy to drive digital transformation. Explore insights from architecture, development and integration as well as employee engagement and the customer experience at Gartner Application Strategies & Solutions Summit 2016, December 6 – 8, in Las Vegas, NV.


CISQ members save $200 off the standard rate! Use the code GARTCISQ. 


Click here to view the agenda and register








Cybersecurity Workshop at OMG’s Technical Meeting



Cyber threats facing a nation’s critical infrastructure, mission-critical systems, or any Internet of Things (IoT) system, demand a cyber infrastructure that matches their combined enormity and complexity. Risk management solutions must be capable of understanding intricate attack patterns and assessing complex vulnerabilities to give stakeholders confidence in their system’s ability to withstand malicious attacks.


At the Cybersecurity Workshop, practitioners will break down the Security Engineering Lifecycle to help organizations plan for, budget, and reduce costs when building/acquiring secure and resilient software-intensive systems. Cyber experts who have written (and are writing) critical IT/software standards will share business cases where automated risk management, blended with engineering and assurance solutions, addressed key cyber risk issues and enabled real-time reaction capability.

The learning objectives of this workshop:

  • Emerging technologies that contain the cost curve for cyber development and integration
  • The costs involved for budgeting cyber architecture, Risk Management Framework (RMF) analysis, and cyber integration during system integration
  • How to efficiently maintain cyber protection with constantly evolving threats
  • Case studies that reveal the costs of integrating cyber into new systems

Attendees will be presented with business cases and practical guidance in achieving these objectives.


Dr. Bill Curtis, CISQ Executive Director, presents Measuring the Cybersecurity of Software.


This Cybersecurity Workshop is part of the OMG® Technical Meeting, December 5-9, 2016 in Coronado, California. The registration fee for the workshop is $149. For groups of 5+ people, the registration fee is $99. Contact for group registration. If you register for the Technical Meeting week, there is no additional fee to attend any or all of the special events. 


View the agenda and register here









The QA Financial Forum: New York

The Harvard Club, 35 West 44th Street, New York, NY 10036, United States


The QA Financial Forum is the only conference focused on: Quality Assurance and Testing for Continuous App Delivery, IT Risk Management and Regulatory Compliance, Test Automation, Virtualization and the Cloud.


This conference is produced to meet the information needs of senior IT professionals and procurement at banks, asset management companies and insurance companies.


See the agenda and register for this conference now.


You will be investing your valuable time in attending a conference where you will:

  • Hear from expert speakers from leading financial firms, including Fannie Mae, Capital One and leading investment banks and asset management companies.
  • Network with your peers — learn, exchange views and make new contacts — in the ideal central location of the Harvard Club of New York.
  • Discuss new technologies with carefully selected vendors and consultants — a chance for due diligence and exchange of information in an informal, confidential environment.


Paul Bentz, CISQ Director of Government and Industry Programs, is speaking on the panel, “Embedding Regulatory Compliance and Security into Software Quality Assurance.”








Texas IT Forum: A Vision for Improving the Success Rates in Texas State Agency IT Projects

The Texas IT Forum is being held in the Texas State Capitol Extension Building (E1.004) in Austin, Texas.


Prevention of a troubled project is the approach with the lowest total cost of ownership. A starting point in creating community consensus is identifying a small set of underlying principles that can unify and enable our strategies and approaches moving into the future. Underlying those principles is a suite of best practices that could be applied within and across Texas state agencies.


The Texas IT Forum brings together state legislative representatives, state agency and public sector CIOs/CTOs/IRMs, members of other key state agency organizations that monitor such projects, industry IT and software development professionals and other subject matter experts. The breakout sessions will explore potential solution areas (e.g. IT procurement, early interventions, performance measurement, software development best practices, etc.). The event is complemented by social opportunities, providing further venues for discussion and networking.


Dr. Bill Curtis, CISQ Executive Director, will present a plenary talk, “Future Directions in IT Procurement Metrics.”


This event is free. Click here to learn more and register now.