CISQ’s Automated Function Points: History and Calculation

  David Herron, Co-Founder, David Consulting Group, Editor of IFPUG MetricViews, and Bill Curtis, Executive Director, CISQ   After requests from numerous commercial enterprises, the Consortium for IT Software Quality (CISQ) was formed in 2010 by the Software Engineering Institute at Carnegie Mellon University and the Object Management Group (OMG), an international IT standards organization. CISQ was chartered to create international standards for automating the measurement of size and structural quality from software source code. During early executive forums held in Washington DC, Frankfurt, and Bangalore, five measures were selected for initial specification, among which was a request to automate the counting of Function Points from source code based as closely as possible on counting guidelines from the International Function Point Users Group (IFPUG).   The David Consulting Group (DCG), a leader in Function Point analysis, was one of the founding members of CISQ. David Herron, co-founder of DCG, co-author of Function Point Analysis, and a leader in IFPUG, was selected to head the international team chartered to develop a … Continue reading

Applying Coding Standards to the NIST Cybersecurity Framework

  The NIST Cybersecurity Framework was first published in 2014 for operators of U.S. critical infrastructure and is now the de facto cybersecurity framework for a wide range of businesses and organizations across industries. Organizations link their cyber approaches to the Framework’s core functions of Identify, Protect, Detect, Respond and Recover to manage their cybersecurity strategy and identify areas for improvement.   Once aligned, an organization can use the NIST Cybersecurity Framework as evidence when seeking certifications or shopping for cyber insurance. Good cyber risk practices will result in a less expensive premium for cyber insurance services.   NIST hosted a Cybersecurity Risk Management Conference from November 7-9 in Baltimore, MD to discuss the current state of cybersecurity risk management and approaches being employed to strengthen quality and resiliency in the software development lifecycle and supply chain. Marc Jones, CISQ Director of Public Sector Outreach, presented on the automated quality characteristic measures developed by CISQ for measuring software Security, Reliability, Performance Efficiency and Maintainability to industry-supported standards.   The slide … Continue reading

CISQ announces new study: The Cost of Poor Software Quality in the US: A 2018 Report

  This report was written by Herb Krasner, a member of CISQ’s Advisory Board. Herb spent many years at the University of Texas at Austin as Professor of Software Engineering, the Director of Outreach Services for the UT Center for Advanced Research in Software Engineering (ARiSE), and founder and CTO of the UT Software Quality Institute (SQI).   The report aggregates publicly available source material to arrive at a rough estimate of the cost of poor software quality in the United States today.  This report fills a gap in our understanding of the financial implications of poor-quality software affecting society today and into the future.   In summary, the cost of poor-quality software in the US in 2018 is approximately $2.8 trillion, the main components of which are outlined in the body of the report.  If we remove the future principal cost of technical debt, the total then becomes $2.26 trillion.   It was our intention to use this report as a starting point for a community discussion.  Recommendations … Continue reading

College Degrees Now Available for Secure Software Development

Tracie Berardi, Program Manager, Consortium for IT Software Quality (CISQ)   Cybersecurity training and workforce development is a common theme and solution that’s proposed at conferences that discuss the challenges of cybersecurity and the future as we know it – developing, architecting and living within digital IT ecosystems. Who’s steering the ship? Do leaders understand the security threats, and do their teams know how to develop secure, resilient and trustworthy systems for the future? For years, IT was siloed and focused predominantly on functionality. Web-based applications and services expanded the attack surface.   Amidst these fast-paced technological changes, there is good news for workforce development, because with a skills gap comes opportunity.   The Software Engineering Institute (SEI) at Carnegie Mellon University is one of the premier universities in the U.S. for software engineering.  The SEI has developed Software Assurance Curricula with support from the U.S. Department of Homeland Security.  The courses available include –   Master of Software Assurance Curriculum Undergraduate Software Assurance Curriculum Community College Software Assurance Curriculum … Continue reading

Coverity now features integrated on-demand developer training

News item submitted by Elizabeth Samet, Public Relations Manager, Synopsys   Read the full post on Synopsys blog here   The latest release of Coverity by Synopsys features seamless integration with our completely rebuilt eLearning platform, an on-demand developer training solution focusing on secure coding best practices and security guidance.   On-demand developer training enables development teams to actively build their security competency.   Synopsys eLearning is an outcome-driven, learner-centric training solution that makes learning about security easy, relevant, and accessible. With eLearning, learners have on-demand access to an immersive, continuous learning ecosystem that unifies security expertise, instructional design, and storytelling into an intuitive platform. Features include: Content gamification Modularized courses Hands-on exercises Peer-based discussions Role-based training Training impact metrics   These features, among many others, enable developers to actively build their security competency. The integration provides developers who have eLearning licenses and accounts with convenient access—directly from the Coverity interface—to short, context-relevant training modules to help them address security issues Coverity detects in their code.   “As more organizations adopt rapid and iterative development methodologies, … Continue reading

8 takeaways from NIST’s application container security guide

By Tim Mackey, Senior Technical Evangelist for Black Duck Software by Synopsys   Link to original article on Synopsys blog, published May 1, 2018   Companies are leveraging containers on a massive scale to rapidly package and deliver software applications. But because it is difficult for organizations to see the components and dependencies in all their container images, the security risks associated with containerized software delivery have become a hot topic in DevOps. This puts the spotlight on operations teams to find security vulnerabilities in the production environment.   Closely tracking the explosive growth of containers in the last couple of years, Black Duck by Synopsys created OpsSight — our first product that gives IT operations teams visibility into their container images to help prevent applications with open source vulnerabilities from being deployed.   Synopsys isn’t the only organization to identify this trend. The National Institute of Standards and Technology (NIST) published the “Application Container Security Guide” in September 2017 to address the security risks associated with container adoption.   Chances … Continue reading

Scope Measurement on Large IT Projects in Texas: A Position Paper

Herb Krasner, University of Texas at Austin (ret.), CISQ Advisory Board member   A new Texas state law adds specific monitoring requirements for large IT projects in Texas state agencies. It requires regular monitoring and reporting on IT project performance indicators of: schedule, cost, scope, and quality. IT scope measurement is the focus of this new report.   Download it here: Scope Measurement on Large IT Projects in Texas: A Position Paper   IT scope metrics are used to define and deliver the right product or system to achieve an organization’s project goals and objectives. This paper recommends several specific IT scope metrics that should be used on all large IT projects: the Balanced Scorecard (BAL); system/project size and its growth or change; and system requirements metrics for anomalies, quality, volatility, change impact, and satisfaction. How these IT scope metrics are combined to answer the question “Are we on track?” is discussed, along with typical scope challenges and next steps to properly implement the metrics.   For more information, read these … Continue reading
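Two of the scope metrics mentioned above lend themselves to simple calculation. The sketch below is illustrative only: the formulas (volatility as the share of the baseline touched by change, growth as relative size delta) are our assumptions, not the exact definitions from the position paper.

```python
# Illustrative sketch of two scope metrics: requirements volatility and
# size growth. The formulas are assumed for illustration, not taken
# from the position paper.

def requirements_volatility(added, changed, deleted, baseline):
    """Share of the baselined requirements affected by change activity."""
    if baseline == 0:
        raise ValueError("baseline must be non-zero")
    return (added + changed + deleted) / baseline

def size_growth(current_size, baseline_size):
    """Relative growth of system size (e.g., in function points)."""
    return (current_size - baseline_size) / baseline_size

# Example: 12 added, 30 changed, 8 deleted against a 400-requirement baseline
volatility = requirements_volatility(12, 30, 8, 400)
growth = size_growth(1150, 1000)
print(f"Requirements volatility: {volatility:.1%}")  # 12.5%
print(f"Size growth: {growth:.1%}")                  # 15.0%
```

A project office could trend these two numbers per reporting period; a sustained rise in either is the kind of out-of-bounds signal the monitoring regime is meant to surface.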

Preventing the Next Equifax – All CVEs Have Root Causes in CWEs

Tracie Berardi, Program Manager, CISQ   The Equifax data breach in 2017 was the result of attackers exploiting an unpatched vulnerability in Equifax software. The vulnerability – Apache Struts: CVE-2017-9805: Possible Remote Code Execution as titled in the NIST National Vulnerability Database – was a flaw discovered in Apache Struts web application software. Equifax was employing the open source code from Apache. The patch became available in March. The breach of Equifax occurred two months later in May. Outrage, lawsuits, and Federal investigations ensued…   A couple of key takeaways from the breach –   Developers commonly use third-party components, both open source and commercial-off-the-shelf, in their code and products. It is critical for the development team to maintain an inventory of its third-party components to manage each component’s source, versions, and patches. SAFECode has published an excellent guide on the subject. Read: Managing Security Risks Inherent in the Use of Third-party Components. In the case of Equifax, action came too late. Basic security prevention can help to protect … Continue reading
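The inventory discipline described above can be sketched in a few lines: keep a record of each component and its pinned version, and flag any entry that matches a known-vulnerable version. The component names and vulnerability data below are hypothetical examples, not a real vulnerability feed.

```python
# Minimal sketch of a third-party component inventory check.
# The component names and the vulnerability map are hypothetical,
# illustrative data only -- not a real feed such as the NVD.

# Inventory: component -> version currently in use
inventory = {
    "struts": "2.5.10",
    "log-lib": "1.2.0",
}

# Known-vulnerable versions per component (assumed for illustration)
known_vulnerable = {
    "struts": {"2.5.10", "2.5.12"},
}

def flag_vulnerable(inventory, known_vulnerable):
    """Return components whose pinned version appears in the vulnerable set."""
    return sorted(
        name for name, version in inventory.items()
        if version in known_vulnerable.get(name, set())
    )

print(flag_vulnerable(inventory, known_vulnerable))  # ['struts']
```

Run against a current vulnerability source as part of the build, a check like this turns "patch available in March" into an alert in March rather than a breach in May.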

IT Quality: Measurement Implications for Large IT Projects in Texas

Herb Krasner, University of Texas at Austin (ret.), CISQ Advisory Board member   A new law in Texas necessitates the enhanced monitoring of all large IT projects in state agencies. It requires regular measurement and reporting of project performance indicators: schedule, cost, scope, and quality. Quality is believed to be the most challenging of the performance indicators, yet it has the largest potential for driving significant improvement.   To address this, I’ve published a new position paper on IT quality measurement, which outlines the strategic and tactical reasons for doing so, and lays out a definitional framework for implementation of the new law.   Download it here: IT Quality: Measurement Implications for Large IT Projects in Texas   IT quality metrics for the following work products are defined and explained in the report: plan quality, requirements quality, architecture/design quality, software code quality, data quality, test quality, and operational system quality. The larger implications for process maturity, lean and agile development, effective use of industry standards, and cybersecurity measurement are also … Continue reading

Code Quality Standards Highlighted in U.S. State Department CSM (Consular Systems Modernization) Project

The U.S. State Department Office of Acquisitions referenced code quality requirements in the Consular Systems Modernization (CSM) statement of work.   From the State Dept. CSM acquisition document on page 23, section C.4.2:   “The contractor shall adhere to CST application coding standards intended to assist in creating code that is free of critical quality defects and is highly maintainable.”   CST = Consular Systems and Technology   “CST will employ a Software Code Review process by which it will analyze all source code by measuring application level code quality and code assurance across the portfolio of COTS configurations and custom developed software. CST will also employ Software Code Quality (SCQ), an analysis that will evaluate application risk around robustness (stability, resiliency), performance, architectural security, transferability, system maintainability (sustainment) and changeability of applications as they evolve. These measurements are based upon industry best practices and standards related to complexity, programming practices, architecture, database access and documentation. They are derived from standards bodies such as the International Organization for Standardization (ISO), … Continue reading

CISQ Metrics in GSA Schedule 70 Blanket Purchase Agreement for IT and Development Services

Federal IT Acquisition Example Citing CISQ Metrics CISQ has been referenced by the U.S. General Services Administration (GSA), which formally cited CISQ requirements in an Information Technology (IT) statement of work from the Office of the CIO for the Office of Public Buildings. GSA is an independent agency of the U.S. government that supports general services of Federal agencies. See page 21, section 5.9 in GSA’s document, Schedule 70 Blanket Purchase Agreement for IT and Development Services, citing CISQ…   “PB-ITS (Project Based IT Services) is seeking to establish code quality standards for its existing code base, as well as new development tasks. As an emerging standard, PB-ITS references the Consortium for IT Software Quality (CISQ) for guidance on how to measure, evaluate and improve software.”   Link to GSA doc   About CISQ   The Consortium for IT Software Quality (CISQ) was founded by the Object Management Group, a technology standards organization, and the Software Engineering Institute (SEI) at Carnegie Mellon University, a Federally Funded Research and Development Center, to … Continue reading

Texas Cybersecurity Legislation Passed In 2017 – A Summary

Herb Krasner, University of Texas at Austin (ret.), CISQ Advisory Board member   Here is a summary of the cybersecurity legislation that was passed this year that will have an impact on state agencies and institutions of higher education (all from the 85th regular session of the Texas legislature). The Texas Dept. of Information Resources (DIR) and state agency CISOs will be the primary actors to make these new laws happen. The 2017 cybersecurity legislation (HB 8, except where noted otherwise) includes the following summarized provisions: Establishment of legislative select committees for cybersecurity in the House and Senate. Establishment of an information sharing and analysis center to provide a forum for state agencies to share information regarding cybersecurity threats, best practices, and remediation strategies. Providing mandatory guidelines to state agencies for the continuing education requirements for cybersecurity training that must be completed by all IT employees of the agencies. Creating a statewide plan (by DIR) to address cybersecurity risks and incidents in the state. DIR will collect the following information … Continue reading

Measuring IT Project Performances in Texas: House Bill (HB) 3275 Implications

CISQ Advisory Board member, Herb Krasner, has released a position paper for Texas state CIOs and IT leaders seeking guidance on House Bill (HB) 3275, passed in June 2017, which requires the reporting of software quality measurement in Texas State IT projects. Krasner drafted the legislation that was signed into law by Texas Governor Greg Abbott. Directives go into effect on January 1, 2018.   The new law, HB 3275, is available on the CISQ website for review.   Abstract from the position paper, Measuring IT Project Performances in Texas: House Bill (HB) 3275 Implications:   “Texas’ usage of IT is big and getting bigger, but past project performances have a “checkered” history. In June 2017 HB 3275 became law in Texas. It requires state agencies to improve the measuring and monitoring of large IT projects to collect and report on performance indicators for schedule, cost, scope, and quality. If these indicators go out of bounds, more intense scrutiny is then triggered, potentially requiring corrective action. These indicators will be made … Continue reading

How Outsourcing Can Mitigate Cyberrisks in DevOps

  Dr. Erik Beulen, Principal, Amsterdam office; Dr. Walter W. Bohmayr, Senior Partner, Vienna office; Dr. Stefan A. Deutscher, Associate Director, Berlin office; and Alex Asen, Senior Knowledge Analyst, Boston office   DevOps agility requires organizational adjustments and additional tooling to ensure cybersecurity. At the same time, the challenges of the cybersecurity labor market drive the need to increase tooling’s impact and to consider outsourcing. In turn, these require carefully focusing on cybersecurity governance, including the assignment of accountability and responsibility.   In DevOps, the business is in the driver’s seat. DevOps characteristics (such as iterative prioritizing and deployment) plus the combined responsibility for development and operations present cybersecurity risks. They also create opportunities. DevOps tools, infrastructure, processes, and procedures can be used to fully automate patch deployments and continuously monitor, for example, open ports. Best practices are to automate information security platforms using at a minimum programmable APIs, but preferably automated to control access, containers and container orchestration combined with hypervisors or physical separation to … Continue reading

“Risk-Managed” Digital Transformation at Forrester Forum

Now in its second year, the Digital Transformation Forum is an event series hosted by Forrester Research in cities across the U.S., Europe and India. CISQ is a proud partner along with parent organization, The Object Management Group® (OMG®). This week (May 9-10) we’ve been at Digital Transformation in Chicago with 500+ attendees discussing multiple, important angles of the subject: Creating customer-centric experiences through digital technology Changing business models and operations Discovering new growth opportunities Supporting digital transformation through technology, culture, leadership, skills and processes   CISQ’s expertise in the digital transformation discussion is at the software level – specifically the IT systems and applications that are being built or modernized to enable these new capabilities. Digital systems (software) are powering the enterprise. Operational excellence is critical in terms of system performance, reliability, maintainability, and security (see CISQ’s Automated Quality Characteristic Measures).   Digital is all about the software that runs your business. What we’re hearing at the Forrester Digital Transformation Forum, and from our members, is that they are going … Continue reading

CIOs Can Master Transformation By Becoming Digital Leaders

Paul Bentz, Director of Government and Industry Programs, CISQ   Digital leaders outperform their peers because they can quickly recognize and scale innovation across the business. According to Gartner, high-performing digital businesses lead because of their participation in a digital ecosystem. This means digital leaders will surround themselves with other innovators, and in turn they will work together on collaborative solutions that move the business forward.   While Digital Transformation may still be a difficult term to define, it presents significant opportunities to the CIO, giving them a platform to drive synergies with business units. This is a game-changing opportunity for CIOs, giving them unprecedented visibility and an ability to effect real, sustained innovation.   This is one of the reasons CISQ is hosting the Software Risk & Innovation Summit this April.   Often, what distinguishes a leading digital CIO is the ability to build trust and manage risk while driving innovation. It’s not enough to simply tackle one or two of these…the most successful CIOs achieve a … Continue reading

Survey on Time-to-Fix Technical Debt

CISQ is working on a standard measure of Technical Debt. Technical debt is a measure of software cost, effort, and risk due to defects remaining in code at release. Like financial debt, technical debt incurs interest over time in the form of extra effort and cost to maintain the software. Technical debt also represents the level of risk to the business due to the increased cost of ownership.   Completing the measure requires estimates of the time required to fix software weaknesses included in the definition of Technical Debt.   Please take our Technical Debt Survey   The survey is a PDF form that is posted to the CISQ website. To take the survey: Download the PDF form Fill in your responses Press the “send survey” button on the last page of the survey Alternatively, you can save the PDF file to your desktop and email it directly to:   As a “thank you” for your time, we are giving away $20 Amazon Gift cards to the first 50 respondents. … Continue reading
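The role of time-to-fix estimates in the measure can be sketched with a small calculation: the debt principal is the remediation effort for remaining violations times a labor rate. The hours-to-fix values and rate below are placeholder assumptions for illustration, not figures from the CISQ specification or the survey.

```python
# Sketch of a technical-debt principal estimate: remediation effort
# times labor rate. Hours-to-fix and the hourly rate are placeholder
# assumptions, not values from the CISQ standard.

violations = [
    {"severity": "critical", "count": 10},
    {"severity": "high", "count": 40},
    {"severity": "medium", "count": 200},
]

HOURS_TO_FIX = {"critical": 8.0, "high": 2.5, "medium": 0.5}  # assumed
HOURLY_RATE = 75.0  # assumed blended rate in USD

def technical_debt_principal(violations):
    """Estimated cost to remediate all remaining violations."""
    hours = sum(v["count"] * HOURS_TO_FIX[v["severity"]] for v in violations)
    return hours * HOURLY_RATE

print(f"${technical_debt_principal(violations):,.2f}")  # $21,000.00
```

The survey exists precisely to replace the assumed HOURS_TO_FIX values with community-sourced estimates, so the resulting debt figures are comparable across organizations.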

Event Summary: Cyber Resilience Summit, October 20, 2016

CYBER RESILIENCE SUMMIT: Ensure Resiliency in Federal Software Acquisition Topic: Improving System Development & Sustainment Outcomes with Software Quality and Risk Measurement Standards Hosted by: Consortium for IT Software Quality (CISQ) in cooperation with Object Management Group, Interoperability Clearinghouse, IT Acquisition Advisory Council Date: 20 October 2016 from 0800 – 1230 Location: Army Navy Country Club, 1700 Army Navy Drive, Arlington, VA Agenda and Presentations:   Event Background   The Consortium for IT Software Quality (CISQ) held its semiannual Cyber Resilience Summit at the Army Navy Country Club in Arlington, Virginia in cooperation with the IT Acquisition Advisory Council (IT-AAC) and other IT leadership organizations. “Titans of Cyber” from the U.S. Federal Government attended the Summit to share critical insights from the front lines of the cyber risk management battle. The program focused on standards and best practices for measuring risk and quality in IT-intensive programs from the standpoint of productivity, software assurance, overall quality and system/mission risk. The discussion addressed proven methods and tools of incorporating such standard metrics … Continue reading

Sourcing Innovation

Lev Lesokhin, EVP Strategy and Analytics, CAST CISQ Governing Board Member   I just had the opportunity to spend a day at the Gartner Sourcing Summit. Always a useful event and interesting to catch up with the latest in the IT sourcing space. I thought I’d share just a couple of observations.   Picture of CISQ’s table at Gartner Sourcing & Strategic Vendor Relationships Summit, September 21-23, 2016, Grapevine, Texas, USA   Browsing the booths, I saw mostly the traditional mix of IT services providers. Led by Wipro and Hexaware, the sponsors are all talking up their latest methodological and technical prowess. The leading edge of innovation on the vendor side is around IT automation. Almost entirely on the operations side, automation allows the vendors to more quickly and proactively deal with defects. Everything must align to a trend in our industry, and the trend here is certainly “cognitive.” Every vendor has a snappy name for their cognitive agent, Holmes, Ignio, Watson, etc., all of which bring forth the ability … Continue reading

Takeaways from the 2016 Software Risk Summit

Tracie Berardi, CISQ Program Manager   Software risk has historically been overlooked as a security concern by business leaders, and companies have paid a high price as a result. Remember the debacle earlier this year when HSBC services went down, leaving customers unable to access their online banking? That was during the peak of tax season, causing a flurry on social media and deeply damaging the company’s reputation with customers.   Companies have also had to dish out high sums to compensate their customers. RBS paid £231 million for their IT failures a few years ago, and the Target breach cost the retailer $152 million in addition to chief executive turnover. Most recently, Jeep controls have been taken over by hackers, and a similar incident with Toyota-Lexus leaves the manufacturer fixing a software bug that disabled cars’ GPS and climate control systems.   Poor structural quality of IT systems and software risk are not just IT issues. They are big problems that can lead to lost revenue and a decline in consumer … Continue reading

“Government Gets a ‘D’ for Cybersecurity”

Secure Coding Standards Needed for Cyber Resilience   On March 15, 2016 the Consortium for IT Software Quality, with support from the IT Acquisition Advisory Council, hosted IT leaders from the U.S. Federal Government to discuss IT risk, secure coding standards, and areas of innovation to reduce the risk of Federal software-intensive systems. The following three themes were repeatedly emphasized by speakers and panelists and underline the need for secure coding standards in cyber resilience efforts.   Three alarms from the March 15 Cyber Resilience Summit tying code quality to secure coding standards   1) The current level of risk in Federal IT is unacceptable and processes must change. Cyberattacks are becoming more prevalent and complex, and the nation’s IT systems, both public and private, are unprepared, explained Curtis Dukes, director of the National Security Agency’s Information Assurance Directorate. He scores the government’s national security systems at 70 to 75 percent, a ‘C’; the government as a whole gets a ‘D’; and the nation as a whole receives … Continue reading

Adjusting Agile for Remote Environments

Bill Dickenson, Independent Consultant, Strategy On The Web   In most commercial environments the developers are distributed — rarely occupying the same physical site and often on very different hours. Faced with this reality, Agile struggles. Among the 12 principles of the “Agile Manifesto” is the principle that “The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.” This is clearly true and, taken as a fixed principle, would rule out Agile for remote teams.   Research from CISQ (Consortium for IT Software Quality) has recently evaluated the effectiveness of software teams using Agile as well as Waterfall and found a surprising result. While both Agile and Waterfall produced quality software, organizations that used both methodologies produced higher quality software than organizations that used exclusively one or the other. This opens up some interesting approaches for Agile in a distributed environment.   Start with the Right Projects   In general, Agile is best suited when the requirements are too high level … Continue reading

What Developers Should Expect from Operations in DevOps

Bill Dickenson, Independent Consultant, Strategy On The Web   Expectation Management As DevOps becomes increasingly mainstream, it is essential that expectations are met for each group involved in the process. Part 1 of this blog focused on what operations should expect from the developers in DevOps, while this part (Part 2) will focus on what developers should expect from Operations. Managing both sides is essential to a successful flow.   To be successful, software must operate efficiently on the target platform, handle exceptions without intervention, and be easily changed while remaining secure. It must deliver the functionality at the lowest cost possible. CISQ has evolved a set of quality characteristic measures that, when combined with automated software tools, provide a way to make sure that the code delivered, delivers. To deliver on this, Operations must provide the right tools and the right processes to succeed.   Specifications for Continuous Release   DevOps dramatically increases the speed at which application code is developed and moved into production, and the first requirement is … Continue reading

What Operations Should Expect from Developers in DevOps

Bill Dickenson, Independent Consultant, Strategy On The Web   Expectation Management DevOps brings both the developers and operations processes into alignment. This blog focuses on what operations should expect from the developers, while my next blog will focus on what developers should expect from Operations. Managing both sides is essential to a successful flow.   One of the major weaknesses in application development is that while software only delivers value when it is running, few universities or professional training organizations focus on how to make software operate smoothly. To be successful, software must operate efficiently on the target platform, handle exceptions without intervention, and be easily changed while remaining secure. Security may sound like an odd addition here, but studies continue to validate that many violations in security are at the application level. It must deliver the functionality at the lowest cost possible.  CISQ has evolved a set of quality characteristic measures that, when combined with automated software tools, provide a way to make sure that the code delivered, delivers. … Continue reading

How to Identify Architecturally Complex Violations

Bill Dickenson, Independent Consultant, Strategy On The Web   Dr. Richard Soley, the Chairman and CEO of OMG, published a paper for CISQ titled, How to Deliver Resilient, Secure, Efficient, and Easily Changed IT Systems in Line with CISQ Recommendations, that outlines the software quality standard for IT business applications. The last post explored the relationship between unit and system level issues.   The logical and obvious conclusion is to dramatically increase the effort focused on detecting the few really dangerous architectural software defects. Unfortunately, identifying such ‘architecturally complex violations’ is anything but easy. It requires holistic analysis at both the Technology and System Levels, as well as a comprehensive, detailed understanding of the overall structure and layering of an application. For those needing further confirmation and explanation of such problems, the most common examples for each of the four CISQ characteristics are described below.      #1 Reliability & Resiliency: Lack of reliability and resilience is often rooted in the “error handling.” Local, Unit Level analysis can help find missing … Continue reading
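The error-handling weakness named under Reliability & Resiliency can be illustrated at the unit level. The snippet below is our own illustrative example, not code from the paper: a swallowed exception is the kind of local pattern a static analyzer flags, while judging its real impact requires the system-level context the post describes.

```python
# Illustrative unit-level reliability violation: a swallowed exception.
# Unit-level analysis can flag this pattern; whether the silent failure
# matters often depends on system-level context (this example is ours,
# not drawn from the CISQ paper).

def load_config_bad(path):
    try:
        with open(path) as f:
            return f.read()
    except OSError:
        pass  # violation: error silently swallowed, caller receives None

def load_config_good(path, default=""):
    try:
        with open(path) as f:
            return f.read()
    except OSError as exc:
        # Remediated: the failure is made visible and a safe default is used
        print(f"warning: could not read {path}: {exc}")
        return default

print(load_config_bad("/nonexistent/app.cfg"))         # None
print(load_config_good("/nonexistent/app.cfg", "{}"))  # {}
```

The architecturally complex variant of the same defect is harder to see: each unit handles its own errors, but no layer in the call chain ever reports the failure to the component responsible for recovery.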

The Relationship Between Unit and System Level Issues

Bill Dickenson, Independent Consultant, Strategy On The Web   Dr. Richard Soley, the Chairman and CEO of OMG, published a paper for CISQ titled, How to Deliver Resilient, Secure, Efficient, and Easily Changed IT Systems in Line with CISQ Recommendations, that outlines the software quality standard for IT business applications. He classified software engineering best practices into two main categories: Rules of good coding practice within a program at the Unit Level without the full Technology or System Level context in which the program operates, and Rules of good architectural and design practice at the Technology or System level that take into consideration the broader architectural context within which a unit of code is integrated. Correlations between programming defects and production defects revealed something really interesting and, to some extent, counterintuitive. It appears that basic Unit Level errors account for 92% of the total errors in the source code. That’s a staggering number. It implies that in fact the coding at the individual program level is much weaker than expected … Continue reading

CISQ Interviewed by SD Times – Dr. Bill Curtis (CISQ) and Dr. Richard Soley (OMG) Cited

Read About CISQ’s Mission, Standards Work, and Future Direction   Tracie Berardi, Program Manager, Consortium for IT Software Quality (CISQ)   Rob Marvin published an article in the January issue of SD Times that details the work of the Consortium for IT Software Quality (CISQ). Rob interviewed Dr. Richard Soley, CEO of the Object Management Group (OMG) and Dr. Bill Curtis, Executive Director of CISQ.  The article sheds light on the state of software quality standards in the IT marketplace.   I can supplement what’s covered in the article for CISQ members.   CISQ was co-founded by the Object Management Group (OMG) and the Software Engineering Institute (SEI) at Carnegie Mellon University in 2009.   Says Richard Soley of OMG, “Both Paul Nielsen (CEO, Software Engineering Institute) and I were approached to try to solve the twin problems of software builders and buyers (the need for consistent, standardized quality metrics to compare providers and measure development team quality) and SI’s (the need for consistent, standardized quality metrics to lower the … Continue reading

CISQ to Start Work on Automated Enhancement Function Point Specification

By Tracie Berardi, Program Manager, Consortium for IT Software Quality (CISQ)   In January 2013 CISQ published a specification for Automated Function Points (AFP) that enables the automated sizing of software by function points. The spec was developed by an international team led by David Herron of the David Consulting Group. The CISQ AFP spec was designed to be as similar as possible to the IFPUG Counting Guidelines, but also to be objective so counts are consistent (the same every time) and can be automated for use in tools. You can learn more about AFP here.   In 2015 CISQ begins work on a specification for Automated Enhancement Function Points (AEFP). The existing AFP specification is not suitable for productivity analysis, as it does not solve the problem of measuring maintenance work, which can leave the total number of function points unchanged even after substantial changes to the code.   The primary challenge is to identify a counting or weighting method for AEFPs that is correlated with maintenance effort – … Continue reading
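To see why a total function point count can hide maintenance work, consider a toy enhancement count in the IFPUG spirit. This is purely an illustrative sketch: the `enhancement_fp` function and its weights are invented here and are not taken from the CISQ specification.

```python
# Illustrative sketch only -- not the CISQ AEFP method. It follows the
# general IFPUG-style idea that enhancement size counts added, changed,
# and deleted functionality separately, so maintenance work is visible
# even when the application's total function point count is unchanged.
# The weights below are hypothetical.

def enhancement_fp(added_fp, changed_fp, deleted_fp,
                   change_weight=1.0, delete_weight=0.4):
    """Toy enhancement size: work on changed and deleted functions
    still counts, even though it may not grow the total FP count."""
    return added_fp + change_weight * changed_fp + delete_weight * deleted_fp

# A release that adds nothing but rewrites 50 FPs of existing function
# and retires 10: the application's FP total does not grow, yet the
# enhancement size is clearly non-zero.
work = enhancement_fp(added_fp=0, changed_fp=50, deleted_fp=10)
print(work)  # 54.0
```

The point of the sketch is only that a size measure for maintenance must count touched functionality, not the delta in the application total.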

How Do You Measure System Complexity?

By Tracie Berardi, Program Manager, Consortium for IT Software Quality (CISQ)   Chris Kohlhepp proposed the Law of Tangential Complexity in an article he wrote on the complexity of large-scale systems. He explains: To successful systems we add functionality, inter-dependencies, and layers of abstraction. Pressures exist to continue adding value. Over time systems become so complex that they eventually reach a “cognitive horizon,” i.e., a psychological limit on the ability of humans to understand the complexity of the system. We may add lateral breadth of functionality to the system (tangent to the cognitive horizon), but in time, control is lost and TECHNICAL DEBT ensues.    Image credit: Chris Kohlhepp, Law of Tangential Complexity   As steps are taken to make the system manageable – refactoring, and perhaps the hiring of new staff – the system will again find itself nearing an even greater cognitive horizon. “Recruiting more exceptionally talented engineers who can cope with the cognitive horizon of the system proves less fruitful upon later iterations of this cycle,” … Continue reading

Software Risk Management

By David Gelperin, CTO, ClearSpecs Enterprises   40-60% of larger projects fail; smaller projects fail less often. Therefore, do smaller projects.   It’s safer to do projects you have done successfully before, e.g., build another ecommerce website. Therefore, repeat successful projects.   If you must do something larger and unfamiliar, identify its hazards and how you plan to mitigate them.   Functions are the goals that customers care about and focus on. Developers are told to focus on customer value. Qualities like security, privacy, reliability, and robustness are goals that customers rarely think about.    Functions are easy. Qualities are hard. When system failures make the news, e.g., security breaches, it is rarely because of a functional failure. Qualities are commonly missing from software estimates and inadequately supported in operational software.    Quality may be free, but qualities need investment. Providing a quality is nothing like providing a function. Qualities are dangerous because they are unfamiliar and out of focus.   Current Agile development ignores qualities or treats them like functions. … Continue reading

The Other Requirements

By David Gelperin, CTO, ClearSpecs Enterprises   Bob, the developer, is excited. This is his first assignment with his new employer and he really wants to show them what he can do. They are asking him to develop a “make a hotel reservation” function and he is listening carefully to understand exactly what they want. He has done something similar, except for rental cars. He asks a few clarifying questions and feels fortunate that they asked him to do something he is familiar with.   He heads back to his office to develop an estimate and then tells Sue, his supervisor, that he is ready to begin work. When he meets with Sue, she asks if he has included the relevant “crosscutting requirements” in his estimates. Bob is not sure, because he doesn’t understand what she is asking.   She explains that understanding the domain function is important, but its associated crosscutting requirements need to be understood as well and factored into estimates. Crosscutting requirements constrain multiple domain functions or … Continue reading

Seeking Beta Sites for Quality-First Agile Development

By David Gelperin, CTO, ClearSpecs Enterprises   Seeking sites to refine and use a hybrid Agile process containing two phases. The second phase is “pure” Agile development and focuses on user functions. The first phase (Quality-First) identifies and manages quality goals such as reliability, understandability, or response time, which matter to your application.   Quality-First contains the following steps:   1. Identify relevant quality goals and their acceptable quality levels early (workshop).   Some quality goals are universal, i.e., relevant to most applications. These include: reliability, response time, modularity, ease of use and learning, and all basic qualities (compliance, sufficiency, understandability, and verifiability).   The remaining (nonuniversal) quality goals are reviewed to identify those which matter to your application.   <A comprehensive quality model will be supplied to speed this step>   2. Refine quality goal information and identify “quality champions” among your team.   3. Create master lists of development restrictions including quality constraints and design, coding, and verification tactics derived from your quality goals.   Each quality … Continue reading

What is Quality?

By Bill Ferrarini, Senior Quality Assurance Analyst at SunGard Public Sector, and CISQ Member   Quality is more than just a word; it’s a passion of mine.   In 1974 I was fortunate enough to experience Quality Circles. It was one of those moments when you realize that you can make a difference. I got into the PC software development industry in the early days, at a time when the industry was in need of direction, crying for standards and quality. The first decade of this emerging industry was extremely tumultuous, a young industry struggling with its identity, finding the players that would shape it into what it is today, a multi-billion dollar industry.   Somewhere along the journey, quality became important to companies that developed and published software. Providing software that was relatively ‘bug free’ took the industry by storm. In the early 1980s, companies, left and right, were adopting Best Practice guidelines like ISO 9000. An entire industry of management and training in the art of … Continue reading

CISQ Seminar Presentations Now Available: Measuring and Managing Software Risk, Security, and Technical Debt, September 17, 2014, Austin, TX

By Tracie Berardi, Program Manager, Consortium for IT Software Quality (CISQ)   Hello Seminar Attendees and CISQ Members,   Last week we met in Austin, Texas for a CISQ Seminar: Measuring and Managing Software Risk, Security, and Technical Debt.    Presentations are posted to the CISQ website under “Event & Seminar Presentations.” Login with your CISQ username/password, or request a login here   The seminar was kicked off by Dr. Bill Curtis, CISQ Director, and Herb Krasner, Principal Researcher, ARiSE University of Texas. Are you looking to prove the ROI of software quality? Mr. Krasner’s presentation is exploding with helpful statistics. Dr. Israel Gat (Cutter) and Dr. Murray Cantor (IBM) went on to discuss the economics of technical liability and self-insuring software. Dr. William Nichols (SEI Carnegie Mellon) revealed results from studying the practices of agile teams. Robert Martin from MITRE, Director of the Common Weakness Enumeration (CWE), and lead on the CISQ security specification, talked about the latest advancements in fighting software security weaknesses.    Thank you for participating … Continue reading

Interesting Interview – The Internet of Things and the Honda Recall: An Interview with Anders Wallgren

By Tracie Berardi, Program Manager, Consortium for IT Software Quality (CISQ)   In case you didn’t catch this interview with Anders Wallgren, CTO of Electric Cloud, I’m circulating it here. On August 8, 2014 Anders was interviewed by StickyMinds editor Cameron Philipp-Edmonds about the recent Honda recall and lessons learned (and to be learned) as we develop the “internet of things.”   You can read or watch the interview here:   Software is pervasive. As Anders notes in the interview, even cars can contain two to three hundred million lines of code. (Wow!) “Today you’ve got lots of systems interacting in cars with each other, every car these days is basically a distributed network of computers that need to operate together,” he says. It won’t be long before cars are driving themselves.   Honda is recalling thousands of vehicles because of a pesky software bug that impacts acceleration. High-profile quality issues like this are popping up more and more, and consumers are taking notice. Consumers have more avenues … Continue reading

So you want to implement Quality Assurance… or should it be Quality Control?

By Bill Ferrarini, Senior Quality Assurance Analyst at SunGard Public Sector, and CISQ Member   Most companies will use these terms interchangeably, but the truth is Quality Assurance is a preventative method while Quality Control is a detection method.   Don’t go shooting the messenger on this one; I know that each and every one of us has a different point of view when it comes to quality. The truth of the matter is we all have the same goal, but defining how we get there is the difficult part.   Let’s take a look at the two definitions:   Quality Assurance – the planned and systematic activities implemented in a quality system so that quality requirements for a product or service will be fulfilled.   Quality Control – the observation techniques and activities used to fulfill requirements for quality.   Quality Assurance is a failure prevention system that predicts almost everything about product safety, quality standards and legality that could possibly go wrong, and then takes steps to control and prevent flawed products or … Continue reading

Wall St. Journal Cyber Attack Highlights Need for Security

Last week a hacker known as “w0rm” attacked the Wall St. Journal website. W0rm is a hacker (or group of hackers) known to infiltrate news websites, post screenshots on Twitter as evidence, and solicit the sale of database information and credentials. Information stolen from the site would let someone “modify articles, add new content, insert malicious content in any page, add new users, delete users and so on,” said Andrew Komarov, chief executive of IntelCrawler, who brought the hack to the attention of the Journal.   See “WSJ Takes Some Computer Systems Offline After Cyber Intrusion.”   Security is a major issue that’s highlighted by the rising number of multi-million dollar computer outages and security breaches in the news today. The breach of the Wall St. Journal website was the result of a SQL injection into a vulnerable web graphics system. Since the 1990s the IT community has been talking about SQL injections (which are relatively simple to prevent), yet input validation issues still represent a significant majority of web … Continue reading
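For readers who haven’t seen why SQL injections are considered simple to prevent, here is a minimal sketch using Python’s built-in sqlite3 module. The table and the payload are hypothetical; the fix is the same in any language: never build queries by string concatenation, and pass user input as a bound parameter instead.

```python
import sqlite3

# A throwaway in-memory database with one hypothetical table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"   # a classic injection payload

# VULNERABLE: string concatenation lets the payload rewrite the query,
# turning the WHERE clause into a condition that matches every row.
vulnerable = conn.execute(
    "SELECT role FROM users WHERE name = '%s'" % user_input).fetchall()

# SAFE: with a bound parameter, the driver treats the entire payload
# as a literal value, so it cannot change the query's structure.
safe = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)).fetchall()

print(vulnerable)  # [('admin',)] -- the injection matched every row
print(safe)        # []           -- no user is literally named the payload
```

The parameterized form costs nothing extra to write, which is what makes the persistence of this bug class so remarkable.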

The Aging IT Procurement Processes of the Pentagon

About two months ago, a blog article written for the NDIA exposed the difficulties the Defense Department faces in buying new IT systems. Pentagon acquisitions chief Frank Kendall was on the hot seat during an April 30th hearing. Senate Armed Services Committee Chairman Carl Levin, D-Mich., said that the track record for procurement has been “abysmal.” Sen. Claire McCaskill, D-Mo., angrily said “You’re terrible at it, just terrible at it.”   Yet the Pentagon requested $30.3 billion for unclassified IT programs in fiscal year 2015 (a drop of $1 billion, or 3.3 percent, from fiscal 2014). So what are the issues? Well, one of them is the complex approval process. “I think we’re imposing too much burden on people and we’re micromanaging,” said Kendall. “We have a tendency in the department, I think, to try to force the business systems that we acquire to do things the way we’ve historically done business.” And there is little incentive to change.   David Ahearn, a partner at BluestoneLogic, wrote in … Continue reading

What Software Developers Can Learn From the Latest Car Recalls

By Sam Malek, CTO / Co-Founder of Transvive Inc., and CISQ Member   If you have been following the news these days, you probably heard about the recall of some General Motors cars because of an ignition switch issue. The recall is estimated to cover 2.6 million cars (1) and to cost around $400 million (2), which is roughly $166 per vehicle. That is a steep price for a 57-cent part that could easily have been replaced on the assembly line.   As we enter the third wave of the industrial revolution (Toffler), where information technology is starting to dominate major parts of everyday life, software is becoming a critical component of day-to-day activities: from the coffee machine that might be running a small piece of code to the control unit that governs vehicles, and everything else in between.    However, these days with the overflow of news about applications that have made millions – even billions – of dollars for their developers, the stories we hear about the development … Continue reading

A Compounded Comedy of Software Errors Underpins the Latest Healthcare Signup Glitch

Last week, an article from IEEE SPECTRUM outlined the latest set of issues related to the Obamacare Affordable Care Act (ACA): hundreds of thousands of California Medi-Cal health insurance applications can’t seem to get past the approval finish line, significantly delaying the start of healthcare coverage for over 900,000 Californians.   Several issues are to blame for this, continuing a string of problems that have plagued the site since its go-live date in October 2013:   The health insurance exchange website and infrastructure were not built to handle the more than 3.2 million residents who enrolled for Medi-Cal health insurance coverage – more than 2.5 times the original estimate. The state-run Covered California exchange computer system was supposed to integrate with the 58 individual county social services computer systems by October 1, 2013, so that an applicant’s eligibility could be corroborated and the county managed care plan the applicant selected could be confirmed. However, this functionality wasn’t fully operational until January 21, 2014. Since the state is expected to take no longer than … Continue reading

Productivity Challenges in Outsourcing Contracts

By Sridevi Devathi, HCL Estimation Center of Excellence, and CISQ Member   In an ever-competitive market, year-on-year productivity gains and output-based pricing models are standard ‘asks’ in most outsourcing engagements. Mature and accurate SIZING is the KEY to meeting these demands!   It is essential that the challenges stated below are clearly understood and addressed in outsourcing contracts for successful implementation.   Challenge 1 – NATURE OF WORK Not all IT services provided by IT vendors are measurable using ISO-certified Functional Sizing Measures like IFPUG FP, NESMA FP or COSMIC FP (referred to as Function Points hereafter). While pure application development and large application enhancement projects are taken care of by Function Points, there are no industry-standard SIZING methods for projects/work units that are purely technology driven, like the following: Pure technical projects like data migration, technical upgrades (e.g. VB version x.1 to VB version x.2) Performance fine tuning and other non-functional projects Small fixes in business logic, configuration to enable a business functionality Pure … Continue reading
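As a rough illustration of how the functional sizing measures mentioned above work where they do apply, here is a simplified sketch of an IFPUG-style unadjusted function point count using average-complexity weights. Real counts also classify each component as low, average, or high complexity; the component tallies below are invented for illustration.

```python
# Simplified sketch of IFPUG-style unadjusted function point counting.
# Each of the five component types carries a weight; a real count
# classifies every component as low/average/high complexity, while this
# sketch applies the average weight across the board.
AVERAGE_WEIGHTS = {
    "EI": 4,    # external inputs
    "EO": 5,    # external outputs
    "EQ": 4,    # external inquiries
    "ILF": 10,  # internal logical files
    "EIF": 7,   # external interface files
}

def unadjusted_fp(counts):
    """counts: mapping of component type -> number of components."""
    return sum(AVERAGE_WEIGHTS[t] * n for t, n in counts.items())

# A hypothetical small application:
size = unadjusted_fp({"EI": 5, "EO": 4, "EQ": 3, "ILF": 2, "EIF": 1})
print(size)  # 5*4 + 4*5 + 3*4 + 2*10 + 1*7 = 79
```

Because the count is driven entirely by user-visible functionality, purely technical work such as a VB upgrade or performance tuning adds nothing to it, which is exactly the sizing gap the article describes.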

CISQ Seminar – Software Quality in Federal Acquisitions

CISQ hosted its latest Seminar at the HYATT Reston Town Center in Reston, VA, USA. The topic for this installment was “Software Quality in Federal Acquisitions”, and included the following speakers:   David Herron, David Consulting Group Robert Martin, Project Lead, Common Weakness Enumeration, MITRE Corp. John Keane, Military Health Systems Dr. Bill Curtis, Director, CISQ John Weiler, CIO Interop. Clearinghouse Joe Jarzombek, Director for Software & Supply Chain Assurance, DHS Dr. William Nichols, Software Engineering Institute   Over 75 senior leaders from public and private sector organizations such as BASF, MITRE, US Department of Defense, Northrop Grumman, NSA, Fannie Mae, US Army, and NIST were in attendance listening to presentations, engaging in discussions, and networking with peers.   Dr. Curtis began the day by discussing the recent changes in the regulatory environment at the Federal level, especially as they relate to software risk prevention. Kevin Jackson (IT-AAC) stressed how innovation cannot be adopted if it cannot be measured.   Mr. Herron introduced the uses of productivity analysis and Function … Continue reading

Open Source is Not Immune to Software Quality Problems

The Heartbleed Bug reinforces the need to monitor the quality of open source software   OpenSSL came under fire this past week through the now infamous Heartbleed bug.   This open source encryption software is used by over 500,000 websites, including Google, Facebook, and Yahoo, to protect their customers’ valuable information. While generally a solid program, OpenSSL harbors a security vulnerability that allows hackers to access the memory of data servers and potentially steal a server’s digital keys that are used to encrypt communications, thus gaining access to an organization’s internal documents.   Known technically as CVE-2014-0160, the Heartbleed bug allows hackers to access up to 64 kilobytes of memory during any one attack and provides the ability for repeat attacks. Faulty code within OpenSSL is responsible for the vulnerability and – as an open source project – it’s hard to pinpoint who is responsible, much less scrutinize all the complex code created for the SSL project to find such a minute vulnerability.   While I’m definitely not knocking open-source projects … Continue reading
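The underlying flaw is easy to illustrate. The sketch below is a deliberately simplified model of the bug class, not the actual OpenSSL code: a heartbeat handler that echoes back as many bytes as the sender claims to have sent, versus one that validates the claimed length against the bytes actually received. The memory buffer and message layout are hypothetical.

```python
import struct

# Hypothetical server memory: the received payload sits next to
# sensitive data, as adjacent heap contents did in the real bug.
SERVER_MEMORY = b"heartbeat-payload" + b"SECRET-PRIVATE-KEY-MATERIAL"

def heartbeat_vulnerable(message):
    # message = 2-byte big-endian claimed payload length + payload bytes
    (claimed_len,) = struct.unpack(">H", message[:2])
    # BUG: echo claimed_len bytes from memory without checking the
    # claim against the number of bytes actually received.
    return SERVER_MEMORY[:claimed_len]

def heartbeat_fixed(message):
    (claimed_len,) = struct.unpack(">H", message[:2])
    payload = message[2:]
    if claimed_len > len(payload):   # the missing bounds check
        return b""                   # drop malformed requests silently
    return payload[:claimed_len]

# The attacker sends a 9-byte payload but claims it is 40 bytes long.
evil = struct.pack(">H", 40) + b"heartbeat"
print(heartbeat_vulnerable(evil))  # leaks bytes beyond the payload
print(heartbeat_fixed(evil))       # b'' -- request rejected
```

The fix in the real patch was conceptually the same one-line comparison: discard any heartbeat whose declared length exceeds the received payload.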

Software Quality beyond Application Boundaries

The retail security crisis continues…   A recent Wall Street Journal article exposed potential issues with Bitcoin’s transaction network. This left Tokyo-based Mt. Gox exchange and Gavin Andresen, Chief Scientist at the Bitcoin Foundation, pointing fingers at each other.   So far the retail industry has felt the pain of sophisticated hackers stealing sensitive information:   Target Corp. – the latest news suggests that the breach started with a malware-laced email phishing attack sent to employees at an HVAC firm that did business with the nationwide retailer; Neiman Marcus – 1.1 million debit and credit cards used at its stores may have been compromised; Michaels – investigating a possible security breach on its payment card network.   According to a Business Insider article, smaller breaches on at least three other well-known U.S. retailers also took place during the U.S. holiday shopping season last year and were conducted using techniques similar to those used on Target. Those breaches have yet to come to light in the mainstream media.   Memory-scraping … Continue reading

Startups Need Software Quality Too

Last week Phil Libin, the CEO of Evernote, wrote an honest blog article titled “On Software Quality and Building a Better Evernote in 2014.” It was in response to an initial blog article by Jason Kincaid that criticized Evernote for a decline in quality over the last few months. In the response, Libin accepted the criticism well and publicly vowed changes to their software in 2014. Libin explained how, as a startup, the focus on growing fast had the unfortunate side effect of introducing more bugs and ultimately affecting quality and user experience. He discussed how constant improvement is key, trading the rush of releasing new product versions for more thorough testing, how software quality must be ingrained in culture, and that quality improvements need to be shown rather than just discussed.   This story brings to light the importance of software quality, not just through updated tools for testing and measurement but also by empowering the culture of an organization to always focus on software quality and customer experience. … Continue reading

Software Robustness and Resiliency in Capital Markets

CISQ hosted its latest Technology Executive Roundtable at the Marriott at Grand Central (NYC). The topic for this installment was “Software Robustness and Resiliency in Capital Markets”, and featured the following speakers: Corey Booth, Partner and Managing Director, Boston Consulting Group; Dr. Bill Curtis, Director, CISQ; JP Chauvet, Chief Architect of Equities, Credit Suisse. Over 25 senior leaders from organizations such as Bridgewater Associates, BNY Mellon, NYSE Euronext, Deutsche Bank, The Depository Trust & Clearing Corporation, and J.P.Morgan were in attendance listening to presentations, engaging in discussions, and networking with peers.   Dr. Curtis started off by discussing the recent changes in the regulatory environment at the Federal level, especially as they relate to software risk prevention. He covered some of the highlights of Regulation SCI, and the feedback provided to the SEC by CISQ. A link to the presentation can be found here.   Mr. Booth then talked about the tradeoffs between risk and development speed, and their implications on software quality frameworks and processes. He discussed the two … Continue reading

Software Startup Quality – High Quality Software Must Be Usable, Reliable, Secure and Available

Building, maintaining, and enhancing high quality software is not a trivial exercise, yet it is critical to software-based startups. Entering the marketplace with a feature-laden but unstable, insecure, difficult to enhance, and poorly performing product ensures a fast track to startup failure.   Producing high quality software demands the convergence of engaged, quality-focused stakeholders, results-based incentive programs, and a developer culture of quality. It also includes finding the right technology partners, making best use of productivity enhancers like appropriate software development platforms and cloud-based services, and leveraging open standards and open source assets. Miss any one of these and your software startup may turn out a software turn-off.   Avoid Startup Software Development Risks from Day One   It remains challenging for all organizations to consistently produce high quality software that meets potential customers’ needs on time and on budget. For the startup the challenges are greater, and so are the stakes.   Software quality starts with governance, or establishing sound development principles, policies, and decision rights. However, governance … Continue reading

Software Measurement: Its Estimation and Metrics Used

Software measurements and metrics: fundamentals (using the example of eGovernment and eCommerce) With the recent establishment of new regulatory bodies and eGovernment organizations, the number of software developers and quality assurance professionals has almost doubled in the past 2-3 years. To ensure the sound and more predictable development of high quality systems, it is important for developers to gather and evaluate measurable data that guide estimation, decision-making and assessment. It is common sense that the ability to measure and analyze will lead to improved control and management.   Product metrics are also referred to as software metrics. They are directly associated with the product itself and attempt to measure product quality or characteristics of the product that can be connected with product quality. Process metrics concentrate on the process of software development and measure process structures with the aim of either distinguishing problems or pushing forward effective practices. Resource metrics are associated with the properties that are essential for the development of software systems and their realization.   Measurement is … Continue reading
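As a concrete example of a product metric of the kind described above, here is a minimal sketch of defect density, i.e., defects found per thousand lines of code (KLOC). The release figures are invented for illustration.

```python
# Defect density: defects per thousand lines of code (KLOC), one of
# the most common product metrics for tracking quality across releases.
def defect_density(defects, lines_of_code):
    return defects / (lines_of_code / 1000.0)

# Hypothetical release data: (defects found, lines of code).
releases = {"1.0": (120, 80_000), "1.1": (45, 95_000)}
for version, (defects, loc) in releases.items():
    print(version, round(defect_density(defects, loc), 2))
# 1.0 -> 1.5 defects/KLOC; 1.1 -> 0.47 defects/KLOC: quality improving
```

Even a metric this simple supports the article’s point: once the measurement exists, trends become visible and management decisions can be grounded in data rather than impressions.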

The Problem of Software Quality Metrics

A fine example of a problem posed by software risk came over a decade ago, when the then-CIO of the United States Air Force divulged that the US military forces were dependent on hundreds of thousands of copies of a specific piece of software. This piece of software comprised around 65,000,000 lines of code and, because it was a trade secret, the Pentagon had not even been allowed to see it. This information was interesting yet terrifying, particularly because the US knew that some of this code had been written by developers in what was considered to be a potentially belligerent nation. However, the code, of course, turned out to be Microsoft Windows and the CIO of the US Air Force wasn’t worried about Microsoft or even the potential threat of adversarial software developers. No, his problem, like so many others, arose from his software supply chain.   Supply Chain Risk and Service Chain Risk Whenever a major manufacturer purchases parts from suppliers, there are a number of acceptance … Continue reading