In an ideal world, research facilities would anticipate and accommodate every emerging trend in research and development. Of course, the world isn’t perfect, and the ways in which we work change constantly and unpredictably, particularly in research.
The use of computer modelling to analyse what cannot be examined in the physical environment has increased significantly. What were once chemical and water-based “wet labs” now operate on a predominantly machine- and computer-based model.
Traditionally, labs have been designed to operate autonomously, featuring emergency power systems and N+1 heating, ventilation and air conditioning (HVAC) systems; in an N+1 arrangement, one more unit is installed than the load requires, so any single unit can fail or be taken offline for service without interrupting operations. Today’s labs also require new infrastructure so that the rapid influx of digital tools can perform at optimal levels.
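To make the redundancy arithmetic concrete, the short Python sketch below compares a load served by four air handlers with no spare against a 4+1 arrangement. The 99% per-unit availability figure and the assumption of identical, independently failing units are illustrative only, not sizing guidance.

```python
from math import comb

def n_plus_one_availability(n: int, unit_availability: float) -> float:
    """Probability that at least n of n+1 identical, independent units run.

    Illustrative only: real HVAC units often share power feeds, controls,
    and maintenance windows, so failures are rarely independent.
    """
    a = unit_availability
    total = n + 1
    all_up = a ** total                            # every unit running
    one_down = comb(total, 1) * a ** n * (1 - a)   # exactly one unit failed
    return all_up + one_down

# Hypothetical example: a load that needs four air handlers.
a = 0.99  # assumed 99% per-unit availability
print(f"4 units, no spare: {a ** 4:.4%}")                          # ~96.06%
print(f"4+1 arrangement:   {n_plus_one_availability(4, a):.4%}")   # ~99.90%
```

Under these assumptions, a single spare lifts system availability from roughly 96.1% to 99.9%, which is why N+1 remains a common baseline for critical lab plant.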
Over the past decade, high-speed internet access and cloud computing have become increasingly critical to the maintenance, security and operational consistency of any lab facility.
To maximise productivity and reduce the risk of research loss, facility design must be nimble. In today’s terms, that means connectivity that is reliable, resilient, and redundant. To achieve this, new design best practices and elements of infrastructure must be considered: