
Evaluating Software Architectures: Methods and Case Studies by Paul Clements, Rick Kazman, and Mark Klein


$30.99
Condition - Very Good
Out of stock

Summary

Drawing on identified connections between architecture design decisions and resulting software properties, this book describes systematic methods for evaluating software architectures and applies them to real-life cases. It shows you how such evaluation can reduce risk, and introduces the conceptual background for architecture evaluation.

Evaluating Software Architectures Summary

Evaluating Software Architectures: Methods and Case Studies by Paul Clements, Rick Kazman, and Mark Klein

The foundation of any software system is its architecture. Using this book, you can evaluate every aspect of an architecture in advance, at remarkably low cost -- identifying changes that can dramatically improve any system's performance, security, reliability, and maintainability. As the practice of software architecture has matured, it has become possible to identify causal connections between architectural design decisions and the qualities and properties of the systems that result from them. This book shows how, offering step-by-step guidance as well as detailed practical examples, complete with sample artifacts reflective of those that evaluators will encounter. The techniques presented here apply not only to software architectures but also to system architectures encompassing computing hardware, networking equipment, and other elements. For all software architects, software engineers, developers, IT managers, and others responsible for creating, evaluating, or implementing software architectures.

About the Authors

Paul Clements is a senior member of the technical staff at the SEI, where he works on software architecture and product line engineering. He is the author of five books and more than three dozen papers on these and other topics.

Rick Kazman is a senior member of the technical staff at the SEI. He is also an Associate Professor at the University of Hawaii. He is the author of two books, editor of two more, and has written more than seventy papers on software engineering and related topics.

Mark Klein is a senior member of the technical staff at the SEI. He is an adjunct professor in the Master of Software Engineering program at Carnegie Mellon and a coauthor of A Practitioner's Handbook for Real-Time Analysis: Guide to Rate Monotonic Analysis for Real-Time Systems (Kluwer Academic Publishers, 1993).




Table of Contents



List of Figures.


List of Tables.


Preface.


Acknowledgments.


Reader's Guide.


1. What Is Software Architecture?

Architecture as a Vehicle for Communication among Stakeholders.

Architecture and Its Effects on Stakeholders.

Architectural Views.

Architecture Description Languages.

Architecture as the Manifestation of the Earliest Design Decisions.

Architectural Styles.

Architecture as a Reusable, Transferable Abstraction of a System.

Summary.

For Further Reading.

Discussion Questions.



2. Evaluating a Software Architecture.

Why Evaluate an Architecture?

When Can an Architecture Be Evaluated?

Who's Involved?

What Result Does an Architecture Evaluation Produce?

For What Qualities Can We Evaluate an Architecture?

Why Are Quality Attributes Too Vague for Analysis?

What Are the Outputs of an Architecture Evaluation?

Outputs from the ATAM, the SAAM, and ARID.

Outputs Only from the ATAM.

What Are the Benefits and Costs of Performing an Architecture Evaluation?

For Further Reading.

Discussion Questions.



3. The ATAM -- A Method for Architecture Evaluation.

Summary of the ATAM Steps.

Detailed Description of the ATAM Steps.

Step 1: Present the ATAM.

Step 2: Present the Business Drivers.

Step 3: Present the Architecture.

Step 4: Identify the Architectural Approaches.

Step 5: Generate the Quality Attribute Utility Tree.

Step 6: Analyze the Architectural Approaches.

Step 7: Brainstorm and Prioritize Scenarios.

Step 8: Analyze the Architectural Approaches.

Step 9: Present the Results.

The Phases of the ATAM.

Phase 0 Activities.

Phase 1 Activities.

Phase 2 Activities.

Phase 3 Activities.

For Further Reading.

Discussion Questions.



4. The Battlefield Control System -- The First Case Study in Applying the ATAM.

Preparation.

Phase 1.

Step 1: Present the ATAM.

Step 2: Present the Business Drivers.

Step 3: Present the Architecture.

Step 4: Identify the Architectural Approaches.

Step 5: Generate the Quality Attribute Utility Tree.

Step 6: Analyze the Architectural Approaches.

Phase 2.

Step 7: Brainstorm and Prioritize Scenarios.

Step 8: Analyze the Architectural Approaches.

Step 9: Present the Results.

Results of the BCS Evaluation.

Documentation.

Requirements.

Sensitivities and Tradeoffs.

Architectural Risks.

Summary.

Discussion Questions.



5. Understanding Quality Attributes.

Quality Attribute Characterizations.

Performance.

Availability.

Modifiability.

Characterizations Inspire Questions.

Using Quality Attribute Characterizations in the ATAM.

Attribute-Based Architectural Styles.

Summary.

For Further Reading.

Discussion Questions.



6. A Case Study in Applying the ATAM.

Background.

Phase 0: Partnership and Preparation.

Phase 0, Step 1: Present the ATAM.

Phase 0, Step 2: Describe Candidate System.

Phase 0, Step 3: Make a Go/No-Go Decision.

Phase 0, Step 4: Negotiate the Statement of Work.

Phase 0, Step 5: Form the Core Evaluation Team.

Phase 0, Step 6: Hold Evaluation Team Kick-off Meeting.

Phase 0, Step 7: Prepare for Phase 1.

Phase 0, Step 8: Review the Architecture.

Phase 1: Initial Evaluation.

Phase 1, Step 1: Present the ATAM.

Phase 1, Step 2: Present Business Drivers.

Phase 1, Step 3: Present the Architecture.

Phase 1, Step 4: Identify Architectural Approaches.

Phase 1, Step 5: Generate Quality Attribute Utility Tree.

Phase 1, Step 6: Analyze the Architectural Approaches.

Hiatus between Phase 1 and Phase 2.

Phase 2: Complete Evaluation.

Phase 2, Step 0: Prepare for Phase 2.

Phase 2, Steps 1-6.

Phase 2, Step 7: Brainstorm and Prioritize Scenarios.

Phase 2, Step 8: Analyze Architectural Approaches.

Phase 2, Step 9: Present Results.

Phase 3: Follow-Up.

Phase 3, Step 1: Produce the Final Report.

Phase 3, Step 2: Hold the Postmortem Meeting.

Phase 3, Step 3: Build Portfolio and Update Artifact Repositories.

For Further Reading.

Discussion Questions.



7. Using the SAAM to Evaluate an Example Architecture.

Overview of the SAAM.

Inputs to a SAAM Evaluation.

Outputs from a SAAM Evaluation.

Steps of a SAAM Evaluation.

Step 1: Develop Scenarios.

Step 2: Describe the Architecture(s).

Step 3: Classify and Prioritize the Scenarios.

Step 4: Individually Evaluate Indirect Scenarios.

Step 5: Assess Scenario Interactions.

Step 6: Create the Overall Evaluation.

A Sample SAAM Agenda.

A SAAM Case Study.

ATAT System Overview.

Step 1: Develop Scenarios, First Iteration.

Step 2: Describe the Architecture(s), First Iteration.

Step 1: Develop Scenarios, Second Iteration.

Step 2: Describe the Architecture(s), Second Iteration.

Step 3: Classify and Prioritize the Scenarios.

Step 4: Individually Evaluate Indirect Scenarios.

Step 5: Assess Scenario Interactions.

Step 6: Create the Overall Evaluation -- Results and Recommendations.

Summary.

For Further Reading.

Discussion Questions.



8. ARID -- An Evaluation Method for Partial Architectures.

Active Design Reviews.

ARID: An ADR/ATAM Hybrid.

The Steps of ARID.

Phase 1: Rehearsal.

Phase 2: Review.

A Case Study in Applying ARID.

Carrying Out the Steps.

Results of the Exercise.

Summary.

For Further Reading.

Discussion Questions.



9. Comparing Software Architecture Evaluation Methods.

Questioning Techniques.

Questionnaires and Checklists.

Scenarios and Scenario-Based Methods.

Measuring Techniques.

Metrics.

Simulations, Prototypes, and Experiments.

Rate-Monotonic Analysis.

Automated Tools and Architecture Description Languages.

Hybrid Techniques.

Software Performance Engineering.

The ATAM.

Summary.

For Further Reading.

Discussion Questions.



10. Growing an Architecture Evaluation Capability in Your Organization.

Building Organizational Buy-in.

Growing a Pool of Evaluators.

Establishing a Corporate Memory.

Cost and Benefit Data.

Method Guidance.

Reusable Artifacts.

Summary.

Discussion Questions.



11. Conclusions.

You Are Now Ready!

What Methods Have You Seen?

Why Evaluate Architectures?

Why Does the ATAM Work?

A Parting Message.



Appendix A: An Example Attribute-Based Architectural Style.

Problem Description.

Stimulus/Response.

Architectural Style.

Analysis.

Reasoning.

Priority Assignment.

Priority Inversion.

Blocking Time.

For Further Reading.



References.


Index.

Additional information

SKU: GOR002936691
ISBN-13: 9780201704822
ISBN-10: 020170482X
Title: Evaluating Software Architectures: Methods and Case Studies by Paul Clements, Rick Kazman, and Mark Klein
Condition: Used - Very Good
Binding: Hardback
Publisher: Pearson Education (US)
Publication date: 2001-11-08
Pages: 368
Book picture is for illustrative purposes only, actual binding, cover or edition may vary.
This is a used book. There is no escaping the fact that it has been read by someone else, and it will show signs of wear and previous use. Overall we expect it to be in very good condition, but if you are not entirely satisfied, please get in touch with us.
