Software Engineering

Created by Pradeep Gowda. Updated: Jan 18, 2024

A Senior Engineer’s CheckList - Little Blah


Paper reviews

Done as part of the CSCI 50400 class under Dr. Hill at IUPUI.

No Silver Bullet by Fred Brooks

Brooks, Frederick P. “No Silver Bullet: Essence and Accidents of Software Engineering.” Computer 20 (4), 1987.

This paper by Fred Brooks addresses the inherent difficulties involved in the construction of software. The author distinguishes between accidental difficulties, which affect the process of development but are not inherent to the final system, and essential difficulties, which are inherent to the system. Progress in hardware technology and the development of new, higher-level programming languages and constructs help mitigate the accidental difficulties. Of the silver bullets proposed, some have influenced the direction of software development (e.g., Ada, OOP); others, like program verification, IDEs, and cheap, abundant processing power, have managed to reduce accidental complexity to a large extent. Expert systems, AI, and automated and graphical programming are of little help in deciding and defining what the software should do. The author identifies complexity, conformity, changeability, and invisibility as the essential properties of software systems. Developing and distributing reusable software components in place of custom development has the benefits of reduced costs via cost-sharing, improved documentation, and generalisation of features required by the cross-section of users. Rapid prototyping allows clients to narrow down the scope of their requirements faster and helps them understand the capabilities of software systems in an iterative process. Moving beyond the idea of writing software to growing software boosts the morale of engineers and stakeholders alike. All of these proposed methods have to be championed by great designers who are trained in sound design methodologies, have the ability to distinguish between good and great designs, and over time develop the capability to deliver structures that are efficient, simple, and effective. Managing the entropy generated by the interplay of people, software, and hardware is at the core of software engineering.

The success of RAD tools has given credence to the rapid prototyping approach. Practicing software construction as an ongoing activity encourages designers to make choices that address the essential problems instead of getting lost in the accidental aspects, which in practice demand significant attention. Many recent software development methodologies, such as continuous integration and agile programming, emphasise the importance of engineers and customers having access to the software as it is built. This allows all the stakeholders to become familiar with the system’s capabilities from the top down and consequently influence the direction of development. The idea of software as a living, growing entity is a powerful one, and one that encourages the designer to make decisions congruent with that expectation. The pervasive adoption of component vendors and the open-source development model has proven that the “buy over build” approach has succeeded spectacularly. Reviewed: Aug 22, 2012

Quantitative evaluation of software quality by Boehm

Boehm, Barry W., James R. Brown, and M. Lipow. “Quantitative Evaluation of Software Quality.” Proceedings of the 2nd International Conference on Software Engineering, 1976.

Software quality evaluation deals with fundamental issues such as: defining characteristics of software quality that are measurable and non-overlapping, how well one can measure overall software quality, and how this information can be used to improve the software life-cycle process. The paper concludes that the desirable qualities of a product are not universal and there is no single metric that can universally rate software quality. However, a useful approximation is achievable through checklists and priorities. Even then, the overall rating is suggestive rather than conclusive, and lessons from one product cannot be wholly transposed onto a new product. Quality metrics should be used to indicate the presence of anomalies and as guides to develop, test, acquire, and maintain software products.

Software quality is a fuzzy phrase. This paper does an excellent job of enumerating the characteristics that any proposed metric claiming to improve quality should have. Many metrics are considered and methodically analyzed to evaluate their efficacy in achieving the goals identified for the product. Metrics that are specific, help in tracking errors over the life-cycle, and are amenable to automation are considered to be of higher value. It is important to develop a catalog of software quality metrics that transcends individual projects. Selecting which metrics to apply should be an important part of planning any software project. Reviewed: Aug 27, 2012
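As a rough illustration of the checklists-and-priorities idea, here is a minimal Python sketch (my own; the characteristic names, weights, and threshold are invented, not taken from the paper) that computes a suggestive quality index and flags anomalous characteristics:

    from dataclasses import dataclass

    @dataclass
    class Metric:
        name: str        # quality characteristic, e.g. "testability"
        weight: float    # project-specific priority
        score: float     # assessed value, normalised to 0..1

    def quality_index(metrics):
        """Weighted approximation of overall quality: suggestive
        rather than conclusive, per the paper's caveat."""
        total = sum(m.weight for m in metrics)
        return sum(m.weight * m.score for m in metrics) / total

    def anomalies(metrics, floor=0.5):
        """Characteristics falling below a project-chosen floor,
        used as anomaly indicators rather than ratings."""
        return [m.name for m in metrics if m.score < floor]

    checklist = [
        Metric("reliability", weight=3.0, score=0.8),
        Metric("testability", weight=2.0, score=0.4),
        Metric("portability", weight=1.0, score=0.9),
    ]
    print(round(quality_index(checklist), 2))  # 0.68
    print(anomalies(checklist))                # ['testability']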

NAOMI – An Experimental Platform for Multi-modeling

Denton, Trip, Edward Jones, Srini Srinivasan, Ken Owens, and Richard W. Buskens. “NAOMI – An Experimental Platform for Multi-modeling,” 2008.

This paper documents an experimental platform designed to address the challenge of combining Domain-Specific Modeling Languages (DSMLs). The key challenges are capturing interdependencies, maintaining consistency, and semantic precision. A version-controlled repository stores all the artefacts and provides a model registry, constraint checking, and notifications on model change. The platform provides data exchange between models, executes the models (processing inputs into outputs), and propagates changes between connected models. It also checks the consistency of model attributes and constraints and their interactions. The components of the NAOMI system are: connectors to provide interoperability between models, a manager to orchestrate modelers in integrating and invoking connectors, and an execution engine that determines the order in which the individual models are executed. The system is implemented and tested using an experiment that simulates a traffic light.

This paper does a good job of documenting the NAOMI project. The authors start by identifying the key challenges that the proposed system should solve. They then identify the requirements, captured as testable outcomes, and model the software that provides the functionality required to achieve the desired outcome. Overall, the paper does an excellent job of explaining why the project was necessary and how they implemented and tested it. It made me aware of the modeling languages used in various contexts and the problem of interoperability that exists in large projects. However, I was a little disappointed that they did not test the tool on multi-models from a real-life project, which could have substantiated some of the challenges they faced and validated the results obtained by using the platform. Reviewed: Aug 29, 2012
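To make the connector/manager/execution-engine split concrete, here is a hypothetical Python sketch; the class and method names are my own invention, not NAOMI’s actual API, and a real engine would also handle versioning, constraint checking, and change notifications:

    class Connector:
        """Adapts one DSML model: declares the attributes it reads
        and runs the model to produce output attributes."""
        def __init__(self, name, inputs, compute):
            self.name, self.inputs, self.compute = name, inputs, compute

        def execute(self, store):
            # pull this model's inputs from the shared attribute store
            return self.compute({k: store[k] for k in self.inputs})

    class Manager:
        """Orchestrates connectors and propagates attribute changes
        between connected models."""
        def __init__(self):
            self.connectors, self.store = [], {}

        def register(self, connector):
            self.connectors.append(connector)

        def execute_all(self):
            # naive execution ordering: run any connector whose inputs
            # are available; a real execution engine would sort the
            # model-dependency graph instead
            pending = list(self.connectors)
            while pending:
                ready = [c for c in pending
                         if all(k in self.store for k in c.inputs)]
                if not ready:
                    raise RuntimeError("unsatisfiable or cyclic model dependencies")
                for c in ready:
                    self.store.update(c.execute(self.store))
                    pending.remove(c)

    # Toy two-model setup echoing the paper's traffic-light experiment:
    mgr = Manager()
    mgr.store["timer_expired"] = True
    mgr.register(Connector("controller", ["timer_expired"],
                           lambda a: {"phase": "red" if a["timer_expired"] else "green"}))
    mgr.register(Connector("display", ["phase"],
                           lambda a: {"lamp": a["phase"].upper()}))
    mgr.execute_all()
    print(mgr.store["lamp"])  # RED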

Where Do Operations Come From? A Multiparadigm Specification Technique

Zave, Pamela, and Michael Jackson. “Where Do Operations Come From? A Multiparadigm Specification Technique.” IEEE Transactions on Software Engineering 22 (7), 1996.

In this paper, a multiparadigm technique is discussed for organizing and writing complex specifications of systems that are event-driven and whose outputs modify the system state. The Z notation is augmented with automata and grammars to map input stimuli to operations and arguments of the Z specification. The specification is made of partial specifications written in different languages: finite automata, first-order logic, and Z. Partial specifications have vocabularies whose elements are categorised as event classes, argument functions, and state components. The technique assumes that the application is restricted to event-oriented input and state-oriented output. The authors apply this technique to write the specifications for a graphical user interface and a telephone switch. They then show how the multiparadigm specification is checked for consistency. There are many techniques that allow us to write partial specifications, which can then be combined into a complete specification of the system. While it is appealing to use a single paradigm, having multiple, cooperating paradigms allows us to play them against each other to check for overall system consistency.

Writing formal specifications for real-life complex applications is very challenging, and there are different ways of capturing specifications formally. I came to know of the existence of the Z notation, which is, impressively, also an ISO standard. Some problems fit naturally with some paradigms (e.g., state-dependent behaviour maps naturally to state diagrams), while some paradigms are better suited for low-level details (e.g., first-order logic). I am of the opinion that the entire system should be specified using the same language, preferably a full-fledged, standard one like Z, as this allows the designers to get the complete system picture. However, I see that choosing the right paradigm is a judgement call on the part of the designer. So, while the paper demonstrates the use of multiple paradigms on a real problem, it is not prescriptive for all projects.
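As a flavour of how one partial specification can drive another, here is an illustrative Python sketch (my own, not the paper’s Z text): a finite automaton maps input stimuli to operations on a telephone-switch-like state component, and a first-order-style invariant is checked after every operation:

    # Automaton: (state, stimulus) -> (next state, operation)
    TRANSITIONS = {
        ("idle", "offhook"):    ("dialtone", "start_dial"),
        ("dialtone", "digits"): ("ringing", "connect"),
        ("ringing", "onhook"):  ("idle", "disconnect"),
    }

    def invariant(state, active_calls):
        # logic-style constraint over the state component
        return active_calls >= 0 and (state != "idle" or active_calls == 0)

    def step(state, active_calls, stimulus):
        next_state, operation = TRANSITIONS[(state, stimulus)]
        if operation == "connect":
            active_calls += 1
        elif operation == "disconnect":
            active_calls -= 1
        # consistency check between the two partial specifications
        assert invariant(next_state, active_calls), "partial specifications disagree"
        return next_state, active_calls

    state, calls = "idle", 0
    for stimulus in ["offhook", "digits", "onhook"]:
        state, calls = step(state, calls, stimulus)
    print(state, calls)  # idle 0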

Seven Myths of Formal Methods

Hall, Anthony. “Seven Myths of Formal Methods.” IEEE Software 7 (5), 1990.

There are many commonly held beliefs that hinder the adoption of formal methods in software development. Formal methods cannot guarantee that software is perfect, because there is a limit to what can be proven correct and users of a formal system are not immune to making mistakes. However, formal methods can demonstrate the correctness of the specification and find errors. Formal methods are often assumed to be useful only for program proving; however, in the author’s experience, they are more useful for writing formal specifications, proving properties of those specifications, constructing programs by mathematically manipulating specifications, and verifying programs by mathematical argument. The paper also counters the lament that formal methods are useful only for safety-critical systems by giving examples from the development of CASE tools. Formal methods do not require the advanced mathematics that is often feared: familiarity with logic and set theory, in addition to some training in the Z notation, can bring a professional software engineer up to speed. The rigour brought by formal methods is claimed to reduce development costs by identifying potential problems early on, and by keeping formal specifications in sync with the implementation, it is claimed, life-cycle changes become easier. The author also references the use of formal methods in large, industrial system development.

The author has made a sincere effort to address the unsaid, yet often decision-influencing, assumptions about formal methods. The paper identifies some of these commonly held beliefs and provides anecdotal evidence from the author’s own software development projects. It serves as a good starting point for addressing the concerns that formal methods are “not practical” and “too academic”, given the splash of mathematical symbols on any material related to formal methods. What the paper lacks in experimental rigour is made up for by the author’s enthusiasm in urging the reader to take a second look at formal methods, providing convenient exits along the way. The author’s emphasis that formal methods are primarily a tool for capturing specifications rings true to me. Since specifications are contracts between the stakeholders and the implementers, capturing them with utmost rigour is beneficial. There will always be a cultural backlash from programmers who resist deviating from their programming-language and platform worldview. However, a serious software engineer will welcome the clarity formal methods bring to the table.

Reviewed: Sep 10, 2012
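
For a flavour of what a small formal specification looks like, here is a sketch based on Spivey’s well-known BirthdayBook Z example (my own rendering in Python, not taken from Hall’s paper): the state is a set and a partial function, an invariant ties them together, and operations carry preconditions one can reason about rather than merely test:

    # Z-style state: known : P NAME ; birthday : NAME -+> DATE
    # with the invariant known = dom birthday.
    class BirthdayBook:
        def __init__(self):
            self.birthday = {}           # partial function NAME -> DATE

        def known(self):
            return set(self.birthday)    # invariant: known = dom birthday

        def add_birthday(self, name, date):
            # precondition from the specification: name not already known
            assert name not in self.known(), "precondition violated"
            self.birthday[name] = date
            # a property provable from the spec by set-theoretic argument:
            assert name in self.known()

    book = BirthdayBook()
    book.add_birthday("Mike", "25-Mar")
    print(book.known())  # {'Mike'}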

Requirements Engineering in the Year 00 – A Research Perspective

Van Lamsweerde, Axel. “Requirements Engineering in the Year 00: A Research Perspective.” Proceedings of the 22nd International Conference on Software Engineering, 2000.

This paper covers the milestones in the evolution of requirements engineering as a discipline. An existing system can be modeled by asking ontological why/what/how questions and reasoning about the resulting model. The scope of requirements engineering also involves identifying the elements of the ontology, such as goals, viewpoints, data, operations, agents, and resources. Goal-based reasoning allows us to capture the why questions. Formal frameworks (the KAOS methodology) let us capture goal refinements through AND/OR graph structures, while qualitative frameworks (the NFR methodology) capture soft goals and subgoals. Conflicts between stakeholders’ viewpoints are addressed by applying conjunction to pairs of viewpoints. Multiparadigm specifications like OMT capture entity-relationship, dataflow, and state-transition views, an approach now popular(!) as UML. To elicit better expression from stakeholders, scenario-based “storytelling” has proven successful. Scenarios capture exceptional cases and abstract conceptual models, validate requirements, and generate acceptance tests.
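A toy Python encoding (my own, not the KAOS tooling) may make the AND/OR goal-refinement idea concrete: goals form a graph in which an AND-refined goal needs all of its subgoals to hold, while an OR-refined goal needs any one alternative:

    from dataclasses import dataclass, field

    @dataclass
    class Goal:
        name: str
        kind: str = "leaf"                      # "leaf", "and", or "or"
        subgoals: list = field(default_factory=list)
        satisfied: bool = False                 # meaningful for leaves only

    def holds(goal):
        """A goal holds if its refinement is satisfied."""
        if goal.kind == "leaf":
            return goal.satisfied
        results = [holds(g) for g in goal.subgoals]
        return all(results) if goal.kind == "and" else any(results)

    # "Why" at the root, "how" toward the leaves:
    safe_crossing = Goal("SafeCrossing", "and", [
        Goal("SignalsMutuallyExclusive", satisfied=True),
        Goal("PedestrianNotified", "or", [
            Goal("AudibleSignal", satisfied=False),
            Goal("VisibleCountdown", satisfied=True),
        ]),
    ])
    print(holds(safe_crossing))  # True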

This paper makes a distinction between requirements and software specifications. The distinction is important: stakeholders often think in terms of objects and agents in the real world, while specifications are formulated in terms of software objects. I am curious to learn more about how the goal-driven nature of requirements engineering can segue into writing better software specifications. I suspect that the oft-mentioned screed of “insufficient requirements” can perhaps be attributed to the impedance mismatch between these two vocabularies. Another aspect of requirements engineering that appeals to me is that it has specific methodologies (NFR) to capture non-functional requirements, which are often brushed under the rug.

Reviewed: Sep 12, 2012

Software Architecture Patterns

Formal Methods in Software Engineering

Documentation

tutorials, how-to guides, explanation, reference

Misc reading

To read

Software Design X-ray