CISQ Interviewed by SD Times – Dr. Bill Curtis (CISQ) and Dr. Richard Soley (OMG) Cited

Read About CISQ’s Mission, Standards Work, and Future Direction


Tracie Berardi, Program Manager, Consortium for IT Software Quality (CISQ)


Rob Marvin published an article in the January issue of SD Times that details the work of the Consortium for IT Software Quality (CISQ). Rob interviewed Dr. Richard Soley, CEO of the Object Management Group (OMG), and Dr. Bill Curtis, Executive Director of CISQ. The article sheds light on the state of software quality standards in the IT marketplace.


I can supplement what’s covered in the article for CISQ members.


CISQ was co-founded by the Object Management Group (OMG) and the Software Engineering Institute (SEI) at Carnegie Mellon University in 2009.


Says Richard Soley of OMG, “Both Paul Nielsen (CEO, Software Engineering Institute) and I were approached to try to solve the twin problems of software builders and buyers (the need for consistent, standardized quality metrics to compare providers and measure development team quality) and SI’s (the need for consistent, standardized quality metrics to lower the cost of providing quality numbers for delivered software). It was clear that while CMMI is important to understanding the software development process, it doesn’t provide feedback on the artifacts developed. Just as major manufacturers agree on specific processes with their supply chains, but also test parts as they enter the factory, software developers and acquirers should have consistent, standard metrics for software quality. It was natural for Paul and I to pull together the best people in the business to make that happen.”


Richard Soley reached out to Dr. Bill Curtis to take the reins at CISQ. Bill Curtis is well known in software quality circles, having led the creation of the Capability Maturity Model (CMM) and the People CMM while at the Software Engineering Institute. Bill has published five books and more than 150 articles, and was elected a Fellow of the Institute of Electrical and Electronics Engineers (IEEE) for his career contributions to software process improvement and measurement. He is currently SVP and Chief Scientist at CAST Software.


“Industry and government badly need automated, standardized metrics of software size and quality that are objective and computed directly from source code,” he says.


Bill Curtis organized CISQ working groups to start work on specifications. The Automated Function Point (AFP) specification was led by David Herron of the David Consulting Group and became an officially supported standard of the OMG in 2013. Currently, Software Quality Measures for Security, Reliability, Performance Efficiency, and Maintainability are undergoing standardization by the OMG.


The SD Times article in which Dr. Curtis and Dr. Soley are cited, "CISQ aims to ensure industry-wide software quality standards," summarizes these specifications and their adoption. We encourage you to read it.


A media reprint of the article has been posted to the members' area of the CISQ website.


You can also watch this video interview with Dr. Bill Curtis.


Later this year CISQ will start work on specs for Technical Debt and Quality-Adjusted Productivity.


How Do You Measure System Complexity?

By Tracie Berardi, Program Manager, Consortium for IT Software Quality (CISQ)


Chris Kohlhepp proposed the Law of Tangental Complexity in an article he wrote on the complexity of large-scale systems. He explains: to successful systems we add functionality, interdependencies, and layers of abstraction, because pressure exists to keep adding value. Over time a system becomes so complex that it eventually reaches a "cognitive horizon," i.e., a psychological limit on the ability of humans to understand the system's complexity. We may still add lateral breadth of functionality (tangent to the cognitive horizon), but in time control is lost and technical debt ensues.


[Figure: Cognitive Horizon. Image credit: Chris Kohlhepp, Law of Tangental Complexity]


As steps are taken to make the system manageable – refactoring, and perhaps hiring new staff – the system again approaches an even greater cognitive horizon. "Recruiting more exceptionally talented engineers who can cope with the cognitive horizon of the system proves less fruitful upon later iterations of this cycle," the author writes. The law of diminishing returns kicks in.


Mr. Kohlhepp discusses two traditional mitigation strategies: (1) limit the complexity of the system, and (2) refactor the system into two or more subsystems so that complexity can be managed on a smaller scale. Either way, one cannot change what one cannot measure.
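As a rough illustration of "measuring what you want to change," complexity can be tracked with simple automated counts of decision points per function – an approximation of McCabe's cyclomatic complexity. This is not a CISQ measure, just a minimal hypothetical sketch using Python's ast module; the node types counted are a simplification.

```python
import ast

# Node types treated as decision points that add an independent path
# (a common, simplified approximation of cyclomatic complexity).
BRANCH_NODES = (ast.If, ast.For, ast.While,
                ast.ExceptHandler, ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(func_node):
    """Complexity = 1 + number of decision points inside the function."""
    return 1 + sum(isinstance(node, BRANCH_NODES)
                   for node in ast.walk(func_node))

def complexity_report(source):
    """Map each function name in the source text to its complexity."""
    tree = ast.parse(source)
    return {node.name: cyclomatic_complexity(node)
            for node in ast.walk(tree)
            if isinstance(node, ast.FunctionDef)}

sample = """
def simple(x):
    return x + 1

def branchy(x):
    if x > 0:
        for i in range(x):
            if i % 2 == 0:
                x += i
    return x
"""

print(complexity_report(sample))  # {'simple': 1, 'branchy': 4}
```

Run over every commit or release, a report like this gives a trend line: when the numbers climb steadily, the system is drifting toward its cognitive horizon and it may be time to refactor or split.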


At CISQ we agree with this concept. A large project is 10x more likely to fail than a small one (Standish Group, CHAOS Report 2013). At the September 2014 CISQ seminar in Austin, Texas, CISQ Director Bill Curtis stressed that over half of maintenance effort is spent simply understanding the code. When teams lose grasp of a system in its entirety, they spend their time reacting to problems rather than working proactively.


How do you measure the complexity of a system over time? How do you identify when it’s time to be proactive and split applications? Have you considered applying IT quality metrics developed by CISQ to measure and automate reports on your applications?


We would like to hear your thoughts on this subject. Please comment below.

CISQ Seminar Presentations Now Available: Measuring and Managing Software Risk, Security, and Technical Debt, September 17, 2014, Austin, TX

By Tracie Berardi, Program Manager, Consortium for IT Software Quality (CISQ)


Hello Seminar Attendees and CISQ Members,


Last week we met in Austin, Texas for a CISQ Seminar: Measuring and Managing Software Risk, Security, and Technical Debt. 


Presentations are posted to the CISQ website under "Event & Seminar Presentations."
Log in with your CISQ username/password, or request a login here


The seminar was kicked off by Dr. Bill Curtis, CISQ Director, and Herb Krasner, Principal Researcher, ARiSE, University of Texas. Are you looking to prove the ROI of software quality? Mr. Krasner's presentation is packed with helpful statistics. Dr. Israel Gat (Cutter) and Dr. Murray Cantor (IBM) went on to discuss the economics of technical liability and self-insuring software. Dr. William Nichols (SEI, Carnegie Mellon) revealed results from studying the practices of agile teams. Robert Martin of MITRE, Director of the Common Weakness Enumeration (CWE) and lead on the CISQ security specification, talked about the latest advancements in fighting software security weaknesses.


Thank you for participating in this lively event! If you couldn’t make it to Austin, please feel free to view the presentations. Our next seminar will be in Reston, Virginia in late March 2015. 


CISQ aims to turn software quality into a measurable science. CISQ has developed quality measures for Security, Performance Efficiency, Reliability, and Maintainability that are going through the OMG standardization process now. You can view CISQ Quality Standard Version 2.1 on the CISQ site. We expect the measures to become official standards in early 2015.