What Developers Should Expect from Operations in DevOps

Bill Dickenson, Independent Consultant, Strategy On The Web   Expectation Management As DevOps becomes increasingly mainstream, it is essential that expectations are met for each group involved in the process. Part 1 of this blog focused on what operations should expect from the developers in DevOps, while this part (Part 2) will focus on what developers should expect from Operations. Managing both sides is essential to a successful flow.   To be successful, software must operate efficiently on the target platform, handle exceptions without intervention, and be easily changed while remaining secure. It must deliver the functionality at the lowest cost possible. CISQ has evolved a set of quality characteristic measures that, when combined with automated software tools, provide a way to make sure that the code delivered, delivers. To deliver on this, Operations must provide the right tools and the right processes to succeed.   Specifications for Continuous Release   DevOps dramatically increases the speed at which application code is developed and moved into production, and the first requirement is … Continue reading

What Operations Should Expect from Developers in DevOps

Bill Dickenson, Independent Consultant, Strategy On The Web   Expectation Management DevOps brings both the developers and operations processes into alignment. This blog focuses on what operations should expect from the developers, while my next blog will focus on what developers should expect from Operations. Managing both sides is essential to a successful flow.   One of the major weaknesses in application development is that while software only delivers value when it is running, few universities or professional training organizations focus on how to make software operate smoothly. To be successful, software must operate efficiently on the target platform, handle exceptions without intervention, and be easily changed while remaining secure. Security may sound like an odd addition here, but studies continue to validate that many security violations occur at the application level. It must deliver the functionality at the lowest cost possible.  CISQ has evolved a set of quality characteristic measures that, when combined with automated software tools, provide a way to make sure that the code delivered, delivers. … Continue reading

How to Identify Architecturally Complex Violations

Bill Dickenson, Independent Consultant, Strategy On The Web   Dr. Richard Soley, the Chairman and CEO of OMG, published a paper for CISQ titled, How to Deliver Resilient, Secure, Efficient, and Easily Changed IT Systems in Line with CISQ Recommendations, that outlines the software quality standard for IT business applications. The last post explored the relationship between unit and system level issues.   The logical and obvious conclusion is to dramatically increase the effort focused on detecting the few really dangerous architectural software defects. Unfortunately, identifying such ‘architecturally complex violations’ is anything but easy. It requires holistic analysis at both the Technology and System Levels, as well as a comprehensive, detailed understanding of the overall structure and layering of an application. For those needing further confirmation and explanation of such problems, the most common examples for each of the four CISQ characteristics are described below.      #1 Reliability & Resiliency: Lack of reliability and resilience is often rooted in the “error handling.” Local, Unit Level analysis can help find missing … Continue reading
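
As a rough illustration of what a Unit Level check on error handling can look like – this is a hypothetical Python sketch, not CISQ's or any vendor's analyzer – a purely local static analysis can flag weaknesses such as bare or empty except blocks without any System Level context:

```python
# Hypothetical Unit Level check (not CISQ tooling): flag bare or empty
# "except" blocks, a common error-handling weakness that local static
# analysis can detect without any System Level context.
import ast

def find_weak_handlers(source: str):
    """Return (line, description) pairs for suspicious exception handlers."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.ExceptHandler):
            if node.type is None:
                # "except:" swallows every exception type indiscriminately.
                findings.append((node.lineno, "bare except clause"))
            if len(node.body) == 1 and isinstance(node.body[0], ast.Pass):
                # The error is caught and silently discarded.
                findings.append((node.lineno, "exception silently ignored"))
    return findings

if __name__ == "__main__":
    sample = "try:\n    risky()\nexcept:\n    pass\n"
    for line, issue in find_weak_handlers(sample):
        print(f"line {line}: {issue}")
```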

The Relationship Between Unit and System Level Issues

Bill Dickenson, Independent Consultant, Strategy On The Web   Dr. Richard Soley, the Chairman and CEO of OMG, published a paper for CISQ titled, How to Deliver Resilient, Secure, Efficient, and Easily Changed IT Systems in Line with CISQ Recommendations, that outlines the software quality standard for IT business applications. He classified software engineering best practices into two main categories: rules of good coding practice within a program at the Unit Level, without the full Technology or System Level context in which the program operates; and rules of good architectural and design practice at the Technology or System Level, which take into consideration the broader architectural context within which a unit of code is integrated. Correlations between programming defects and production defects revealed something really interesting and, to some extent, counter-intuitive. It appears that basic Unit Level errors account for 92% of the total errors in the source code. That’s a staggering number. It implies that in fact the coding at the individual program level is much weaker than expected … Continue reading

CISQ Interviewed by SD Times – Dr. Bill Curtis (CISQ) and Dr. Richard Soley (OMG) Cited

Read About CISQ’s Mission, Standards Work, and Future Direction   Tracie Berardi, Program Manager, Consortium for IT Software Quality (CISQ)   Rob Marvin published an article in the January issue of SD Times that details the work of the Consortium for IT Software Quality (CISQ). Rob interviewed Dr. Richard Soley, CEO of the Object Management Group (OMG) and Dr. Bill Curtis, Executive Director of CISQ.  The article sheds light on the state of software quality standards in the IT marketplace.   I can supplement what’s covered in the article for CISQ members.   CISQ was co-founded by the Object Management Group (OMG) and the Software Engineering Institute (SEI) at Carnegie Mellon University in 2009.   Says Richard Soley of OMG, “Both Paul Nielsen (CEO, Software Engineering Institute) and I were approached to try to solve the twin problems of software builders and buyers (the need for consistent, standardized quality metrics to compare providers and measure development team quality) and SIs (the need for consistent, standardized quality metrics to lower the … Continue reading

CISQ to Start Work on Automated Enhancement Function Point Specification

By Tracie Berardi, Program Manager, Consortium for IT Software Quality (CISQ)   In January 2013 CISQ published a specification for Automated Function Points (AFP) that enables the automated sizing of software by function points. The spec was developed by an international team led by David Herron of the David Consulting Group. The CISQ AFP spec was designed to be as similar as possible to the IFPUG Counting Guidelines, but also to be objective so counts are consistent (the same every time) and can be automated for use in tools. You can learn more about AFP here.   In 2015 CISQ begins work on a specification for Automated Enhancement Function Points (AEFP). The existing AFP specification is not suitable for productivity analysis, as it does not solve the problem of measuring maintenance work, which can leave the total number of function points unchanged even after substantial changes to the code.   The primary challenge is to identify a counting or weighting method for AEFPs that is correlated with maintenance effort – … Continue reading
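
As a purely hypothetical illustration of that weighting challenge – the categories and weights below are invented for this sketch and are not the CISQ counting rules, which are still to be defined – an enhancement size measure might weight added, modified, and deleted function points separately so that maintenance work registers effort even when the application's total function point count does not change:

```python
# Hypothetical illustration only: these categories and weights are invented
# for the sketch and are NOT the CISQ AEFP counting rules, which were still
# being defined when this post was written.
from dataclasses import dataclass

@dataclass
class EnhancementCounts:
    added_fp: float     # function points added by the enhancement
    modified_fp: float  # existing function points whose logic was changed
    deleted_fp: float   # function points removed

# Assumed weights: changing or removing functionality still costs effort even
# though the application's total function point count may not grow at all.
WEIGHTS = {"added": 1.0, "modified": 0.75, "deleted": 0.25}

def enhancement_size(c: EnhancementCounts) -> float:
    """A weighted size intended to correlate with maintenance effort."""
    return (WEIGHTS["added"] * c.added_fp
            + WEIGHTS["modified"] * c.modified_fp
            + WEIGHTS["deleted"] * c.deleted_fp)

if __name__ == "__main__":
    # Rewriting 40 FP and removing 10 FP registers as real work (32.5) even
    # though the total function point count of the application is unchanged.
    print(enhancement_size(EnhancementCounts(added_fp=0, modified_fp=40, deleted_fp=10)))
```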

How Do You Measure System Complexity?

By Tracie Berardi, Program Manager, Consortium for IT Software Quality (CISQ)   Chris Kohlhepp proposed the Law of Tangental Complexity in an article he wrote on the complexity of large scale systems. He explains: To successful systems we add functionality, inter-dependencies, and layers of abstraction. Pressures exist to continue adding value. Over time systems become so complex that they eventually reach a “cognitive horizon,” i.e. a psychological limit on the ability of humans to understand the complexity of the system. We may add lateral breadth of functionality to the system (tangent to the cognitive horizon), but in time, control is lost and TECHNICAL DEBT ensues.    Image credit: Chris Kohlhepp, Law of Tangental Complexity   As steps are taken to make the system manageable – refactoring, and perhaps the hiring of new staff – the system will again find itself nearing an even greater cognitive horizon. “Recruiting more exceptionally talented engineers who can cope with the cognitive horizon of the system proves less fruitful upon later iterations of this cycle,” … Continue reading

Software Risk Management

By David Gelperin, CTO, ClearSpecs Enterprises   40-60% of larger projects fail. Fewer smaller projects fail. Therefore, do smaller projects.   It’s safer to do projects you have done successfully before, e.g., build another ecommerce website. Therefore, repeat successful projects.   If you must do something larger and unfamiliar, identify its hazards and how you plan to mitigate them.   Functions are the goals that customers care about and focus on. Developers are told to focus on customer value. Qualities like security, privacy, reliability, and robustness are goals that customers rarely think about.    Functions are easy. Qualities are hard. When system failures make the news, e.g., security breaches, it is rarely because of a functional failure. Qualities are commonly missing from software estimates and inadequately supported in operational software.    Quality may be free, but qualities need investment. Providing a quality is nothing like providing a function. Qualities are dangerous because they are unfamiliar and out of focus.   Current Agile development ignores qualities or treats them like functions. … Continue reading

The Other Requirements

By David Gelperin, CTO, ClearSpecs Enterprises   Bob, the developer, is excited. This is his first assignment with his new employer and he really wants to show them what he can do. They are asking him to develop a “make a hotel reservation” function and he is listening carefully to understand exactly what they want. He has done something similar, except for rental cars. He asks a few clarifying questions and feels fortunate that they asked him to do something he is familiar with.   He heads back to his office to develop an estimate and then tells Sue, his supervisor, that he is ready to begin work. When he meets with Sue, she asks if he has included the relevant “crosscutting requirements” in his estimates. Bob is not sure, because he doesn’t understand what she is asking.   She explains that understanding the domain function is important, but its associated crosscutting requirements need to be understood as well and factored into estimates. Crosscutting requirements constrain multiple domain functions or … Continue reading

Seeking Beta Sites for Quality-First Agile Development

By David Gelperin, CTO, ClearSpecs Enterprises   Seeking sites to refine and use a hybrid Agile process containing two phases. The second phase is “pure” Agile development and focuses on user functions. The first phase (Quality-First) identifies and manages quality goals such as reliability, understandability, or response time, which matter to your application.   Quality-First contains the following steps:   1. Identify relevant quality goals and their acceptable quality levels early (workshop).   Some quality goals are universal, i.e., relevant to most applications. These include: reliability, response time, modularity, ease of use and learning, and all basic qualities (compliance, sufficiency, understandability, and verifiability).   The remaining (nonuniversal) quality goals are reviewed to identify those which matter to your application.   <A comprehensive quality model will be supplied to speed this step>   2. Refine quality goal information and identify “quality champions” among your team.   3. Create master lists of development restrictions including quality constraints and design, coding, and verification tactics derived from your quality goals.   Each quality … Continue reading

What is Quality?

By Bill Ferrarini, Senior Quality Assurance Analyst at SunGard Public Sector, and CISQ Member   Quality is more than just a word; it’s a passion of mine.   In 1974 I was fortunate enough to experience Quality Circles. It was definitely one of those moments when you realize that you can make a difference. I got into the PC software development industry in the early days, at a time when the industry was in need of direction, crying out for standards and quality. The first decade of this emerging industry was extremely tumultuous: a young industry struggling with its identity, finding the players that would shape it into the multi-billion dollar industry it is today.   Somewhere along the journey, quality became important to companies who developed and published software. Providing software that was relatively ‘bug free’ took the industry by storm. In the early 1980s, companies left and right were adopting Best Practice guidelines like ISO 9000. An entire industry of management and training in the art of … Continue reading

CISQ Seminar Presentations Now Available: Measuring and Managing Software Risk, Security, and Technical Debt, September 17, 2014, Austin, TX

By Tracie Berardi, Program Manager, Consortium for IT Software Quality (CISQ)   Hello Seminar Attendees and CISQ Members,   Last week we met in Austin, Texas for a CISQ Seminar: Measuring and Managing Software Risk, Security, and Technical Debt.    Presentations are posted to the CISQ website under “Event & Seminar Presentations.” Log in with your CISQ username/password, or request a login here   The seminar was kicked off by Dr. Bill Curtis, CISQ Director, and Herb Krasner, Principal Researcher, ARiSE, University of Texas. Are you looking to prove the ROI of software quality? Mr. Krasner’s presentation is exploding with helpful statistics. Dr. Israel Gat (Cutter) and Dr. Murray Cantor (IBM) went on to discuss the economics of technical liability and self-insuring software. Dr. William Nichols (SEI Carnegie Mellon) revealed results from studying the practices of agile teams. Robert Martin from MITRE, Director of the Common Weakness Enumeration (CWE), and lead on the CISQ security specification, talked about the latest advancements in fighting software security weaknesses.    Thank you for participating … Continue reading

Interesting Interview – The Internet of Things and the Honda Recall: An Interview with Anders Wallgren

By Tracie Berardi, Program Manager, Consortium for IT Software Quality (CISQ)   In case you didn’t catch this interview with Anders Wallgren, CTO of Electric Cloud, I’m circulating it here. On August 8, 2014 Anders was interviewed by StickyMinds editor Cameron Philipp-Edmonds about the recent Honda recall and lessons learned (and to be learned) as we develop the “internet of things.”   You can read or watch the interview here: http://www.stickyminds.com/interview/internet-things-and-honda-recall-interview-anders-wallgren   Software is pervasive. As Anders notes in the interview, even cars can contain two to three hundred million lines of code. (Wow!) “Today you’ve got lots of systems interacting in cars with each other, every car these days is basically a distributed network of computers that need to operate together,” he says. It won’t be long before cars are driving themselves.   Honda is recalling thousands of vehicles because of a pesky software bug that impacts acceleration. High profile quality issues like this are popping up more and more, and consumers are taking notice. Consumers have more avenues … Continue reading

So you want to implement Quality Assurance… or should it be Quality Control?

By Bill Ferrarini, Senior Quality Assurance Analyst at SunGard Public Sector, and CISQ Member   Most companies will use these terms interchangeably, but the truth is Quality Assurance is a preventative method while Quality Control is an identifier.   Don’t go shooting the messenger on this one; I know that each and every one of us has a different point of view when it comes to quality. The truth of the matter is we all have the same goal, but defining how we get there is the difficult part.   Let’s take a look at the different definitions taken from ASQ.org.   Quality Assurance: the planned and systematic activities implemented in a quality system so that quality requirements for a product or service will be fulfilled. Quality Control: the observation techniques and activities used to fulfill requirements for quality.   Quality Assurance is a failure prevention system that predicts almost everything about product safety, quality standards and legality that could possibly go wrong, and then takes steps to control and prevent flawed products or … Continue reading

Wall St. Journal Cyber Attack Highlights Need for Security

Last week a hacker known as “w0rm” attacked the Wall St. Journal website. W0rm is a hacker (or group of hackers) known to infiltrate news websites, post screenshots on Twitter as evidence, and solicit the sale of database information and credentials. Information stolen from the site would let someone “modify articles, add new content, insert malicious content in any page, add new users, delete users and so on,” said Andrew Komarov, chief executive of IntelCrawler, who brought the hack to the attention of the Journal.   See “WSJ Takes Some Computer Systems Offline After Cyber Intrusion.”   Security is a major issue that’s highlighted by the rising number of multi-million dollar computer outages and security breaches in the news today. The breach of the Wall St. Journal website was the result of a SQL injection attack against a vulnerable web graphics system. Since the 1990s the IT community has been talking about SQL injections (which are relatively simple to prevent) yet input validation issues still represent the significant majority of web … Continue reading
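
To illustrate why SQL injections are considered relatively simple to prevent, here is a minimal sketch – hypothetical table and input, using Python's standard sqlite3 module – contrasting string concatenation with a parameterized query:

```python
# Minimal sketch: why parameterized queries prevent SQL injection.
# The table, data, and hostile input are invented for the example;
# sqlite3 is used only because it ships with Python.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE articles (id INTEGER, title TEXT)")
conn.executemany("INSERT INTO articles VALUES (?, ?)",
                 [(1, "Front page story"), (2, "Exclusive draft")])

user_input = "1 OR 1=1"  # hostile input masquerading as an article id

# Vulnerable pattern: the input is spliced into the SQL text and rewrites
# the query's logic, so every row comes back.
unsafe_rows = conn.execute(
    f"SELECT title FROM articles WHERE id = {user_input}"
).fetchall()

# Safe pattern: the driver binds the input strictly as data, never as SQL,
# so the hostile string simply matches no id.
safe_rows = conn.execute(
    "SELECT title FROM articles WHERE id = ?", (user_input,)
).fetchall()

print(unsafe_rows)  # both rows returned -- the injection succeeded
print(safe_rows)    # []                 -- the injection is neutralized
```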

The Aging IT Procurement Processes of the Pentagon

About two months ago a blog article was written for the NDIA exposing the difficulties the Defense Department faces in buying new IT systems. Pentagon acquisitions chief Frank Kendall was on the hot seat during an April 30th hearing. Senate Armed Services Committee Chairman Carl Levin, D-Mich., said that the track record for procurement has been “abysmal.” Sen. Claire McCaskill, D-Mo., angrily said, “You’re terrible at it, just terrible at it.”   Yet the Pentagon requested $30.3 billion for unclassified IT programs in fiscal year 2015 (a drop of $1 billion, or 3.3 percent, from fiscal 2014). So what are the issues? Well, one of them points to the complex approval process. “I think we’re imposing too much burden on people and we’re micromanaging,” said Kendall. “We have a tendency in the department, I think, to try to force the business systems that we acquire to do things the way we’ve historically done business.” And there is little incentive to change.   David Ahearn, a partner at BluestoneLogic, wrote in … Continue reading

What Software Developers Can Learn From the Latest Car Recalls

By Sam Malek, CTO / Co-Founder of Transvive Inc., and CISQ Member   If you have been following the news these days, you probably heard about the recall of some General Motors cars because of an ignition switch issue. It is estimated to affect 2.6 million cars (1) and will cost around $400 million (2), which works out to roughly $154 per vehicle. This is a steep price for a 57-cent part that could have been easily replaced on the assembly line.   As we enter the third wave of the industrial revolution (Toffler), where information technology is starting to dominate major parts of everyday life, software is becoming a critical component of day-to-day activities: from the coffee machine that might be running a small piece of code to the control unit that governs vehicles, and everything else in between.    However, these days, with the overflow of news about applications that have made millions – even billions – of dollars for their developers, the stories we hear about the development … Continue reading

A Compounded Comedy of Software Errors Underpins the Latest Healthcare Signup Glitch

Last week, an article from IEEE SPECTRUM outlined the latest set of issues related to the Obamacare Affordable Care Act (ACA): hundreds of thousands of California Medi-Cal health insurance applications can’t seem to get past the approval finish line, significantly delaying the start of healthcare coverage for over 900,000 Californians.   Several issues are to blame for this, and they continue a string of problems for this site since its go-live date back in October 2013:   The health insurance exchange website and infrastructure were not built to handle the more than 3.2 million residents who enrolled for Medi-Cal health insurance coverage – more than 2.5 times the original estimate. The state-run Covered California exchange computer system was supposed to integrate with the 58 individual county social services computer systems by October 1, 2013, so that an applicant’s eligibility could be corroborated and the county managed care plan the applicant selected could be confirmed. However, this functionality wasn’t fully operational until January 21, 2014. Since the state is expected to take no longer than … Continue reading

Productivity Challenges in Outsourcing Contracts

By Sridevi Devathi, HCL Estimation Center of Excellence, and CISQ Member   In an increasingly competitive market, year-on-year productivity gains and output-based pricing models are standard ‘asks’ in most outsourcing engagements. Mature and accurate SIZING is the KEY to addressing them!   It is essential that the challenges stated below are clearly understood and addressed in outsourcing contracts for successful implementation.   Challenge 1 – NATURE OF WORK Not all IT services provided by IT vendors are measurable using the ISO-certified Functional Sizing Measures like IFPUG FP, NESMA FP or COSMIC FP (referred to as Function Points hereafter). While pure application development and large application enhancement projects are taken care of by Function Points, there are no industry standard SIZING methods for projects/work units that are purely technology driven, like the following: pure technical projects like data migration and technical upgrades (e.g. VB version x.1 to VB version x.2); performance fine tuning and other non-functional projects; small fixes in business logic or configuration to enable a business functionality; pure … Continue reading

CISQ Seminar – Software Quality in Federal Acquisitions

CISQ hosted its latest Seminar at the HYATT Reston Town Center in Reston, VA, USA. The topic for this installment was “Software Quality in Federal Acquisitions”, and included the following speakers:   David Herron, David Consulting Group Robert Martin, Project Lead, Common Weakness Enumeration, MITRE Corp. John Keane, Military Health Systems Dr. Bill Curtis, Director, CISQ John Weiler, CIO Interop. Clearinghouse Joe Jarzombek, Director for Software & Supply Chain Assurance, DHS Dr. William Nichols, Software Engineering Institute   Over 75 senior leaders from public and private sector organizations such as BSAF, MITRE, US Department of Defense, Northrop Grumman, NSA, Fannie Mae, US Army, and NIST were in attendance listening to presentations, engaging in discussions, and networking with peers.   Dr. Curtis began the day by discussing the recent changes in the regulatory environment at the Federal level, especially as they relate to software risk prevention. Kevin Jackson (IT-AAC) stressed how innovation cannot be adopted if it cannot be measured.   Mr. Herron introduced the uses of productivity analysis and Function … Continue reading

Open Source is Not Immune to Software Quality Problems

The Heartbleed Bug reinforces the need to monitor the quality of open source software   OpenSSL came under fire this past week through the now infamous Heartbleed bug.   This open source encryption software is used by over 500,000 websites, including Google, Facebook, and Yahoo, to protect their customers’ valuable information. While generally a solid program, OpenSSL harbors a security vulnerability that allows hackers to access the memory of data servers and potentially steal a server’s digital keys that are used to encrypt communications, thus gaining access to an organization’s internal documents.   Technically known as CVE-2014-0160, the Heartbleed bug allows hackers to access up to 64 kilobytes of server memory per attack, and attacks can be repeated. Faulty code within OpenSSL is responsible for the vulnerability, and – because OpenSSL is an open source project – it’s hard to pinpoint who is responsible, much less scrutinize all the complex code created for the SSL project to find such a minute vulnerability.   While I’m definitely not knocking open-source projects … Continue reading
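
As a simplified illustration of the class of flaw behind Heartbleed – a toy Python sketch with an invented message format, not OpenSSL's actual C code – the bug comes down to trusting a client-supplied length field instead of checking it against the payload actually received:

```python
# Toy illustration of the class of flaw behind Heartbleed, with an invented
# message format; this is not OpenSSL's actual code. The bug is trusting a
# client-supplied length field instead of checking it against the payload.
import struct

def parse_heartbeat(message: bytes):
    """Assumed toy format: 2-byte big-endian claimed length, then the payload."""
    (claimed_len,) = struct.unpack(">H", message[:2])
    return claimed_len, message[2:]

def respond_vulnerable(message: bytes, adjacent_memory: bytes) -> bytes:
    claimed_len, payload = parse_heartbeat(message)
    # BUG: echoes back claimed_len bytes, reading past the payload into
    # whatever happens to sit next to it (keys, passwords, session data).
    return (payload + adjacent_memory)[:claimed_len]

def respond_fixed(message: bytes, adjacent_memory: bytes) -> bytes:
    claimed_len, payload = parse_heartbeat(message)
    if claimed_len > len(payload):
        return b""  # discard malformed heartbeats instead of over-reading
    return payload[:claimed_len]

if __name__ == "__main__":
    secrets = b"---PRIVATE-KEY-MATERIAL---"
    evil = struct.pack(">H", 64) + b"hat"  # claims 64 bytes, sends only 3
    print(respond_vulnerable(evil, secrets))  # leaks the adjacent "memory"
    print(respond_fixed(evil, secrets))       # returns nothing
```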

Software Quality beyond Application Boundaries

The retail security crisis continues…   A recent Wall Street Journal article exposed potential issues with Bitcoin’s transaction network. This left the Tokyo-based Mt. Gox exchange and Gavin Andresen, Chief Scientist at the Bitcoin Foundation, pointing fingers at each other.   So far the retail industry has felt the pain of sophisticated hackers stealing sensitive information: Target Corp. – the latest news suggests that the breach started with a malware-laced email phishing attack sent to employees at an HVAC firm that did business with the nationwide retailer; Neiman Marcus – 1.1 million debit and credit cards used at its stores may have been compromised; Michaels – investigating a possible security breach on its payment card network.   According to a Business Insider article, smaller breaches on at least three other well-known U.S. retailers also took place during the U.S. holiday shopping season last year and were conducted using similar techniques as the one on Target. Those breaches have yet to come to light in the mainstream media.   Memory-scraping … Continue reading

Startups Need Software Quality Too

Last week Phil Libin, the CEO of Evernote, wrote an honest blog article titled “On Software Quality and Building a Better Evernote in 2014.” It was in response to an initial blog article by Jason Kincaid which criticized Evernote for a decline in quality over the last few months. In the response, Libin accepted the criticism well and publicly vowed changes to their software in 2014. Libin explained how, as a startup, the focus on growing fast had the unfortunate side effect of introducing more bugs and ultimately affecting quality and user experience. He discussed how constant improvement is key, trading the rush of releasing new product versions for more thorough testing, how software quality must be ingrained in culture, and that quality improvements need to be shown rather than just discussed.   This story brings to light the importance of software quality, not just through updated tools for testing and measurement but also by empowering the culture of an organization to always focus on software quality and customer experience. … Continue reading

Software Robustness and Resiliency in Capital Markets

CISQ hosted its latest Technology Executive Roundtable at the Marriott at Grand Central (NYC). The topic for this installment was “Software Robustness and Resiliency in Capital Markets”, and featured the following speakers: Corey Booth, Partner and Managing Director, Boston Consulting Group; Dr. Bill Curtis, Director, CISQ; JP Chauvet, Chief Architect of Equities, Credit Suisse. Over 25 senior leaders from organizations such as Bridgewater Associates, BNY Mellon, NYSE Euronext, Deutsche Bank, The Depository Trust & Clearing Corporation, and J.P.Morgan were in attendance listening to presentations, engaging in discussions, and networking with peers.   Dr. Curtis started off by discussing the recent changes in the regulatory environment at the Federal level, especially as they relate to software risk prevention. He covered some of the highlights of Regulation SCI, and the feedback provided to the SEC by CISQ. A link to the presentation can be found here.   Mr. Booth then talked about the tradeoffs between risk and development speed, and their implications for software quality frameworks and processes. He discussed the two … Continue reading

Software Startup Quality – High Quality Software Must Be Usable, Reliable, Secure and Available

Building, maintaining, and enhancing high quality software is not a trivial exercise, yet it is critical to software-based startups. Entering the marketplace with a feature-laden but unstable, insecure, difficult to enhance, and poorly performing product ensures a fast track to startup failure.   Producing high quality software demands the convergence of engaged, quality-focused stakeholders, results-based incentive programs, and a developer culture of quality. It also includes finding the right technology partners, making best use of productivity enhancers like appropriate software development platforms and cloud-based services, and leveraging open standards and open source assets. Miss any one of these and your software startup may turn out a software turn-off.   Avoid Startup Software Development Risks from Day One   It remains challenging for all organizations to consistently produce high quality software that meets potential customers’ needs on time and on budget. For the startup the challenges are greater, and so are the stakes.   Software quality starts with governance, or establishing sound development principles, policies, and decision rights. However, governance … Continue reading

Software Measurement: Its Estimation and Metrics Used

Software measurements and metrics: fundamentals (on the example of eGovernment and eCommerce) With the recent establishment of new regulatory bodies and eGovernment organizations, the number of software developers and quality assurance professionals has almost doubled in the past 2-3 years. To ensure the sound and more predictable development of high quality systems, it is important for developers to gather and evaluate measurable data that guide estimation, decision-making and assessment. It is common sense that the ability to measure and analyze will lead to improved control and management.   Product metrics are also referred to as software metrics. They are directly associated with the product itself and attempt to measure product quality or characteristics of the product that can be connected with product quality. Process metrics concentrate on the process of software development and measure process structures with the aim of either identifying problems or promoting effective practices. Resource metrics are associated with the properties that are essential for the development of software systems and their realization.   Measurement is … Continue reading
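
As a small, illustrative example of a product metric – the two measures below (lines of code and comment density) are chosen for the sketch, not a prescribed metric set – product metrics are computed directly from the artifact itself:

```python
# Small sketch: two elementary product metrics (physical lines of code and
# comment density) computed directly from the artifact itself, which is what
# separates product metrics from process and resource metrics. The metric
# choice here is illustrative, not a prescribed set.
def product_metrics(source: str) -> dict:
    lines = [l.strip() for l in source.splitlines() if l.strip()]
    comment_lines = [l for l in lines if l.startswith("#")]
    code_lines = [l for l in lines if not l.startswith("#")]
    comment_density = len(comment_lines) / max(len(lines), 1)
    return {"loc": len(code_lines), "comment_density": round(comment_density, 2)}

if __name__ == "__main__":
    sample = "# fetch a record by id\ndef get(record_id):\n    return db[record_id]\n"
    print(product_metrics(sample))  # {'loc': 2, 'comment_density': 0.33}
```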

The Problem of Software Quality Metrics

A fine example of a problem posed by software risk came over a decade ago, when the then CIO of the United States Air Force divulged that the US military forces were dependent on hundreds of thousands of copies of a specific piece of software. This piece of software comprised around 65,000,000 lines of code and, because it was a trade secret, the Pentagon had not even been allowed to see it. This information was interesting yet terrifying, particularly because the US knew that some of this code had been written by developers in what was considered to be a potentially belligerent nation. However, the code, of course, turned out to be Microsoft Windows, and the CIO of the US Air Force wasn’t worried about Microsoft or even the potential threat of adversarial software developers. No, his problem, like so many others, arose from his software supply chain.   Supply Chain Risk and Service Chain Risk Whenever a major manufacturer purchases parts from suppliers, there are a number of acceptance … Continue reading