Features

Double check your digital designs

CROSS design
Image: Dreamstime.com

Without proper professional oversight, independent checking and sound design principles, computational tools including AI have the potential to lead to unsafe structures and costly mistakes, says CROSS.

Digital engineering can be considered from four perspectives: software, people, process and hardware. These are linked, and a weakness in any can result in a safety issue.

Confidential Reporting for Safer Structures (CROSS) has received a significant number of reports relating to computational design that suggest there is a gap between the use of software and the understanding of it.

This widening gap has the potential to lead to unsafe outcomes. The rapid expansion of artificial intelligence (AI) and other digital tools may mean that this trend will accelerate unless steps are taken to address it.

To help professionals engage with these risks, CROSS has collected these safety reports on digital engineering in a dedicated theme page on its website (www.cross-safety.org/uk/digital-engineering). This page aims to help engineers understand common errors, learn how to mitigate them and share their experiences for the benefit of others.

The circumstances in which the misuse of computational models may lead to unsafe structures include:

  • People without adequate structural engineering knowledge or training developing structural analysis tools.
  • Limitations of computational models not being sufficiently apparent to users.
  • Software being applied by inexperienced engineers beyond its limitations.
  • Inadequate checking processes that fail to catch errors.
  • Even experienced engineers struggling to spot weaknesses in programs when applied to unusual structures.
  • Automated design software creating a false sense of security, where errors can easily be hidden.

Professionals with awareness of a safety issue connected to digital engineering are encouraged to contribute to this growing knowledge base by submitting their own confidential reports to CROSS.

CROSS safety report: Modelling of concrete frame building

One report featured on the CROSS page, Concern over Modelling of Concrete Frame Building for Construction Stage (March 2022), highlights serious risks caused by over-reliance on computational tools without proper validation. 

In this case, the designers used a global 3D model that assumed the building was complete and the concrete fully cured, and hence at its full design strength.

This overlooked the temporary conditions during construction and drastically underestimated loads on a critical transfer slab. The result was that the slab was under-reinforced and at real risk of failure or even disproportionate collapse.
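The scale of such an error is easy to illustrate. EN 1992-1-1 gives an age factor for mean concrete compressive strength, β_cc(t) = exp(s·(1 − √(28/t))), where s depends on cement class. The sketch below uses an assumed s = 0.25 (a normal-hardening cement) to show how far short of its 28-day strength a young slab can be; the values are illustrative, not taken from the report.

```python
import math

def strength_fraction(t_days: float, s: float = 0.25) -> float:
    """EN 1992-1-1 age factor beta_cc(t): fraction of the 28-day mean
    compressive strength reached at t days (s depends on cement class;
    0.25 assumed here for a normal-hardening cement)."""
    return math.exp(s * (1.0 - math.sqrt(28.0 / t_days)))

for t in (3, 7, 14, 28):
    print(f"{t:>2} days: {strength_fraction(t):.0%} of 28-day strength")
```

A slab loaded at seven days has only around three-quarters of the strength a "complete and fully cured" model assumes, which is exactly the kind of gap a construction-stage check is meant to catch.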

What the report describes illustrates a broader challenge: computational models can generate unrealistic load paths and omit consideration of temporary conditions during construction, especially where self-weight is significant. 

The risks were compounded by a lack of hand checks, weak internal review and inexperience among the engineers involved. Together, these factors demonstrate the dangers of treating software outputs as definitive, rather than as results that require engineering judgment to interpret.

The key lesson identified by the CROSS expert panel is that digital engineering must always be paired with rigorous validation and independent checks. Designers should sanity-check outputs against conventional methods, consider buildable construction sequences and explicitly communicate the assumed methodology. 

If contractors propose sequencing changes, designs must be reassessed. Robustness and redundancy should be built into every stage of construction, not only the final condition.

Ultimately, this safety report shows that computational tools are only as reliable as the engineering judgment and processes that underpin their use.

CROSS safety report: Unqualified engineer’s unsafe design

Another report, Unqualified Engineer’s Unsafe Computer-aided Design of a Retaining Wall, shows the dangers of unqualified individuals relying on computer-aided design without the expertise to validate results. 

The reporter describes how retaining walls designed by a non-engineer were found to be unsafe, with inadequate resistance against overturning despite extensive computer calculations.

The fundamental issue was a misapplication of Eurocode 7 principles and a lack of understanding of structural equilibrium. 

Computer outputs were produced in large volumes, but they concealed the fact that the design was fundamentally unsafe. The result was that the retaining walls would likely need to be demolished and rebuilt, with both safety and financial consequences.
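A hand check of the kind that would have exposed such a problem is short. The sketch below is a simplified gravity-wall overturning check about the toe, with assumed dimensions and soil parameters, Rankine active pressure, and no surcharge or partial factors applied — an equilibrium sanity check, not a full Eurocode 7 verification.

```python
import math

# Illustrative inputs (assumed, not from the report)
H = 3.0                  # retained height, m
t = 1.0                  # wall thickness, m
gamma_soil = 18.0        # soil unit weight, kN/m3
gamma_conc = 24.0        # concrete unit weight, kN/m3
phi = math.radians(30)   # soil angle of shearing resistance

# Rankine active earth pressure coefficient
Ka = (1 - math.sin(phi)) / (1 + math.sin(phi))

# Active thrust per metre run, resultant acting at H/3 above the base
Pa = 0.5 * Ka * gamma_soil * H**2
M_overturning = Pa * H / 3

# Restoring moment: wall self-weight acting at t/2 from the toe
W = gamma_conc * H * t
M_restoring = W * t / 2

print(f"Overturning moment: {M_overturning:.1f} kNm/m")
print(f"Restoring moment:   {M_restoring:.1f} kNm/m")
print(f"Factor against overturning: {M_restoring / M_overturning:.2f}")
```

A factor barely above unity, before any partial factors are applied, is an immediate red flag — and this check takes minutes, however many pages of output the software produced.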

This CROSS safety report shows how computational tools are only effective when used by competent, qualified engineers who can interpret, check and challenge results. 

Over-reliance on software without adequate knowledge of the underlying engineering principles creates the risk of unsafe designs passing into construction. Where building control bodies lack capacity for detailed technical checking, this risk is heightened. 

CROSS safety report: Errors in steelwork connection design

A more recent CROSS report, from June 2025, Errors in Steelwork Connection Design Risk Unsafe Beam Sagging Moments, highlights the risks of misusing structural software, particularly when programs are used outside their intended scope. 

In the case described by the reporter, software that did not support a required beam-to-column web moment connection led designers to substitute a beam-to-beam connection type. This substitution was not identified by the designer and hence not challenged.

The difference is critical: the web of a column behaves very differently from the end plate of a beam. Using the wrong assumptions led to errors in bolt forces, yield line patterns and force distribution. The potential consequences included web deformation, unintended rotation of connections and higher sagging moments in beams than allowed for in the structural model.
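Why mis-modelled connection fixity changes the sagging moment is easy to show. For a uniformly loaded beam, the midspan sagging moment is wL²/24 with fully rigid ends but wL²/8 if the ends cannot in fact transfer moment — three times larger. The figures below are illustrative values, not taken from the report.

```python
w = 20.0   # uniformly distributed load, kN/m (assumed)
L = 8.0    # span, m (assumed)

# Midspan sagging moment for a uniformly loaded beam
M_fixed = w * L**2 / 24   # fully rigid end connections
M_pinned = w * L**2 / 8   # ends unable to transfer moment

print(f"Rigid ends:  {M_fixed:.1f} kNm")
print(f"Pinned ends: {M_pinned:.1f} kNm  ({M_pinned / M_fixed:.0f}x larger)")
```

A real connection sits somewhere between these bounds, which is why the stiffness actually assumed in the analysis model must be stated by the lead designer and verified against the detailed connection.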

This case demonstrates how software can mislead designers if its limitations are not understood. Outputs may appear precise and detailed but can still be fundamentally flawed. Structural analysis models often idealise member connections, ignoring local stress effects that can critically influence safety.

The broader lesson is that engineering judgment and awareness of software boundaries are essential. Designers must not manipulate software to fit unsupported conditions, nor treat simplified models as complete reflections of real behaviour. To mitigate risks, lead designers should prescribe connection assumptions, review fabrication details and ensure all designs remain within codified and tested guidance. 

Once again, the overarching lesson is that computational design is only reliable when validated against engineering knowledge.

Overall themes and guidance

Taken together, these reports demonstrate a growing industry-wide concern: computational design tools are being widely used, but sometimes without sufficient understanding or validation. 

For those seeking advice, the Institution of Structural Engineers (IStructE) has published useful guidance on the use of software for engineering calculations, which is free to download at www.istructe.org. 

The guidance focuses on the management and control of calculation processes, including:

  • Establishing clear workflows for analysis models.
  • Interpreting results effectively.
  • Carrying out assessment checks.
  • Sizing and detailing components appropriately.
  • Reporting conclusions clearly and transparently.

This guidance reinforces the same lessons that emerge from the CROSS reports: software must be treated as a tool, not as a substitute for engineering knowledge.

Conclusion

Digital engineering is an essential part of modern structural design, but its safety depends on the balance of software, people, process and hardware. A weakness in any of these areas can lead to failure. The CROSS safety reports show that the misuse or misunderstanding of computational tools has the potential to lead to unsafe structures and costly mistakes.

Digital tools can support engineers but cannot replace the need for competence, judgment and rigorous validation. As the use of AI and other digital engineering technologies grows, the need for professional oversight, independent checking and sound design principles becomes ever more important.

By learning from these case studies, sharing experiences and applying published guidance, the industry can ensure that digital engineering enhances safety rather than undermines it.

Beware the AI hype 

The challenge in using AI is to cut through inflated expectations and focus on practical, safe applications, writes Peter Debney.

Risks and limitations

  • Bias and prejudice: AI systems trained on historical data can perpetuate existing biases.
  • Lack of understanding: AI encapsulates knowledge without true understanding. This means outputs can be plausible but wrong, creating risks if not carefully validated.
  • Data security and privacy: Using AI tools embedded within company systems may be safe, but putting sensitive information into public AI platforms carries risks. Data may be reused for training or published.
  • Hallucinations: AI can generate convincing but false information (hallucinations). Neural networks also lack transparency, making it difficult to trace exactly how decisions are reached.
  • Environmental impact: Training and running large models is energy-intensive, with significant carbon costs.

Benefits and practical uses

Despite these risks, AI has potential in engineering for a number of tasks, when applied with care:

  • Optimisation: AI can generate and test multiple design options quickly, including complex topology and shape optimisation. This can reduce carbon impacts and improve performance.
  • Document search and productivity: AI tools help scan large volumes of material such as codes, standards and client specifications, supporting engineers with routine tasks.
  • Inspiration and writing support: AI can produce outlines or generate ideas for refinement, provided outputs are carefully checked.

There are also deeper questions for the profession. If AI takes on basic calculations and routine design work, junior engineers may lose opportunities to build the skills needed to become senior decision-makers. Experience ‘through the mill’ is essential for developing judgment.

Ultimately, AI is best viewed as a tool. Used wisely, it can extend human capability, but it cannot currently replace the engineering knowledge, responsibility and critical checking that ensure safety.

Peter Debney is a CROSS expert panel member, a fellow of the Institution of Structural Engineers and the author of Computational Engineering.

