Extract from the white paper: The role of the architect – The Cercle CESAM
Preamble

Throughout the architecture design cycle, the architect must monitor the confidence that can be placed in the architecture definition, using indicators that the architect defines or that are imposed by the project or organization. These indicators help both the design team and the stakeholders track progress.

THE ESSENTIAL

The essentials are to clearly define the objectives of the indicators at the start of the project, to set up maturity indicators from the outset, and to carry out peer reviews on a regular basis (both as audits and as collaborative engineering).

THE MAIN PITFALLS

Among the main pitfalls:

  • Failing to steer the rise in maturity of the definition
  • Using indicators that measure only the production of elements, not their quality
  • Tying contractual progress solely to document-production indicators, which can lead to maintaining two sets of documentation or, worse, to documentation that exists only for reporting
  • Reusing existing architectures of proven maturity, which can mask areas for improvement
  • Adopting a poorly analyzed strict-reuse hypothesis that does not match the real context of use, generating potentially costly readjustments
  • Underestimating the difficulty of assessing maturity for innovative subjects or unknown areas
  • Assuming that the maturity of a system is the sum of the maturity of its constituents (overlooking emergent properties, integration, etc.)
  • Not tracking the maturity of the definition over the entire life cycle (especially during the downstream phases of design)
  • Forgetting the maturity of the interfaces when assessing the maturity of the solution definition
  • Forgetting the maturity of the data model when assessing the maturity of the solution definition
  • Failing to account for evolving issues when monitoring the evolution of maturity

BEST PRACTICES

Here are some good practices to consider:

  • Set up standard evaluation criteria, even at a high level, that projects can instantiate and that give management and the architects a transversal view across projects (see the sketch after this list). Some ideas:
    • Assess maturity against the entire life cycle: use cases, functions and operational scenarios
    • Functionality: Are the use cases identified, and do they correspond to the stakeholders' needs? Are the functions, their decomposition and the functional dependencies identified?
    • Structuring: Are the subsystems and components clearly defined and organized?
    • Organization: What is the level of correspondence between the organization (embodiment & communication) and the structuring of the solution?
    • Interfaces: Are the external and internal interfaces (functional/physical) identified, and how many are there?
    • Allocation: Are functions allocated to components? Are functional interfaces allocated to physical interfaces?
    • Reuse rate (of components, models, interfaces, etc.)
  • Separate the documentation/deliverable (e.g. a Word file) from the architectural elements that constitute it (e.g. models, data, etc.)
  • Perform regular maturity assessment reviews of the solution definition with the right stakeholders (technical team, experts covering the full life cycle)
  • Take into account the opinion of experts and work package managers
  • Plan progressive maturity loops into the development
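
To make the idea of a standard, instantiable evaluation grid concrete, here is a minimal Python sketch. Everything in it (the MaturityGrid class, the axis names, the 0-4 scale and the example scores) is an illustrative assumption, not a CESAM-defined standard.

from dataclasses import dataclass, field

# Evaluation axes taken from the list above; the 0-4 scale is an assumption.
AXES = ("functionality", "structuring", "organization",
        "interfaces", "allocation", "reuse")

@dataclass
class MaturityGrid:
    project: str
    scores: dict = field(default_factory=dict)  # axis name -> score in 0..4

    def overall(self) -> float:
        """Average maturity over the axes scored so far (0 if none)."""
        rated = [self.scores[a] for a in AXES if a in self.scores]
        return sum(rated) / len(rated) if rated else 0.0

def dashboard(grids):
    """Transversal view across projects, for management and architects."""
    for g in sorted(grids, key=MaturityGrid.overall):
        row = " ".join(f"{a}={g.scores.get(a, '-')}" for a in AXES)
        print(f"{g.project:<10} overall={g.overall():.1f}  {row}")

dashboard([
    MaturityGrid("ProjectA", {"functionality": 3, "interfaces": 1, "allocation": 2}),
    MaturityGrid("ProjectB", {"functionality": 4, "structuring": 3, "reuse": 2}),
])

One grid per project, all using the same axes, is what makes the view transversal: management can compare projects at a glance while each project fills in only the axes it has assessed so far.
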
TESTIMONIALS

We have compiled here several verbatim statements from project managers and system architects at different companies that echo this phase:

We have put in place an architecture completeness file that traces each expectation of the architecture analyses in terms of deliverables and activities (e.g. life-phase study rate, number of use cases, scenarios…), which makes it possible to measure progress on the architecture.

We track the progress of the architecture definition along two axes: on the one hand, the number of macro use cases studied (defined from the start of the design), and on the other, the quality of the analysis carried out on these use cases (based on the number of peer reviews).
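
As an illustration of this two-axis tracking, here is a minimal Python sketch, assuming an invented list of macro use cases, invented peer-review counts and an assumed threshold of two reviews per use case:

# Macro use cases fixed at the start of the design (hypothetical list).
MACRO_USE_CASES = ["UC-01", "UC-02", "UC-03", "UC-04"]

# Peer reviews completed per studied use case (hypothetical state).
peer_reviews = {"UC-01": 2, "UC-02": 1, "UC-04": 0}

REVIEWS_FOR_MATURE = 2  # assumed quality threshold per use case

studied = [uc for uc in MACRO_USE_CASES if uc in peer_reviews]
coverage = len(studied) / len(MACRO_USE_CASES)  # axis 1: quantity
quality = (sum(min(peer_reviews[uc], REVIEWS_FOR_MATURE) for uc in studied)
           / (REVIEWS_FOR_MATURE * len(studied)) if studied else 0.0)  # axis 2: quality

print(f"coverage: {coverage:.0%} of macro use cases studied")
print(f"quality:  {quality:.0%} of expected peer reviews performed")

Keeping the two axes separate, rather than folding them into one number, is the point of the testimonial: a project can look complete on coverage while its quality axis shows the analysis has barely been reviewed.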

Interesting tools are available from INCOSE (SRL, System Readiness Level) and NASA (ARL, Application Readiness Level) for instantiation.
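
For readers who want to instantiate such a scale, here is a minimal sketch of a composite SRL computation following one formulation from the systems engineering literature (Sauser et al.), which combines Technology Readiness Levels (TRL) with Integration Readiness Levels (IRL); the TRL and IRL values below are invented for illustration:

def composite_srl(trl, irl):
    """Composite SRL in [0, 1] from TRLs (1-9) and a symmetric IRL matrix (0-9)."""
    n = len(trl)
    trl_n = [t / 9 for t in trl]                   # normalize TRLs to [0, 1]
    irl_n = [[v / 9 for v in row] for row in irl]  # normalize IRLs to [0, 1]
    srl_per_subsystem = []
    for i in range(n):
        m = sum(1 for v in irl[i] if v > 0)  # integrations defined for subsystem i
        srl_per_subsystem.append(sum(irl_n[i][j] * trl_n[j] for j in range(n)) / m)
    return sum(srl_per_subsystem) / n

# Three hypothetical subsystems; by convention IRL[i][i] = 9 (self-integration).
trl = [7, 5, 8]
irl = [[9, 4, 0],
       [4, 9, 6],
       [0, 6, 9]]
print(f"composite SRL = {composite_srl(trl, irl):.2f}")

Note how this echoes the pitfall above: the composite SRL is not the average of the TRLs, because immature integrations (low IRL entries) pull the system-level readiness down even when each constituent is individually mature.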

This post is also available in PDF format.
Any comments?

Your comments will be considered by the members of the Cercle at the next monthly meeting.