Canadian Consulting Engineer

New laboratory at U of T allows researchers to “see” rock failures

January 28, 2008
By Canadian Consulting Engineer

Engineers will have a clearer picture of how the rock on which they are building might change over time, thanks to a new laboratory that was unveiled at the University of Toronto last week.
The new Rock Fracture Dynamics Laboratory (RFDL) in the Civil Engineering department is the first of its kind in the world, combining true-triaxial testing with integrated monitoring and modelling.
The laboratory tests samples of rock in a non-destructive way. Instead of cutting into the rock to observe changes, sensors are attached to all sides of the sample, and the data from these sensors are used to build three-dimensional computer models of deformations in the material.
The rock samples — 80 mm cubes and 50 mm diameter cylinders — are subjected to high stress, temperature and pore pressure until they fail. Sensors pick up a variety of data from the rock, including acoustic waves, and send it to a supercomputer for analysis and modelling. The laboratory can mimic real earth conditions (thermal, hydraulic and mechanical) at depths of up to four kilometres.
The research will be useful in a number of engineering areas, such as mine construction, dams and reservoirs. For example, says Dave Collins, a research associate in the Department of Civil Engineering at the university, when water levels in a reservoir change, the pressures on the rock deep below also change. The computer models could help in these kinds of analyses, as well as in areas such as risk assessments for earthquake activity and underground radioactive waste storage.
Located in the Sanford Fleming building on King’s College Circle at the University of Toronto, the laboratory took three years to build. The heavy physical testing machinery, weighing 25 tonnes, is on the basement level. Collins says building the equipment was a complex task because most of the parts are unique.
On the floor above the rock testing equipment is a high-performance computing cluster (HPCC), or supercomputer, for processing the data from the testing at very high speeds. The system provides 256 processing cores — 64 quad-core, 64-bit processors, each fitted with four 8 GB memory modules. In near real time the supercomputer can process and display results from the 400 MB of data being recorded per second. It allows much larger and higher-resolution models to be produced than before.
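As a rough illustration of the figures quoted above, the cluster's totals work out as follows (a back-of-envelope sketch; the variable names and the assumption of an even spread of data across cores are ours, not the laboratory's):

```python
# Back-of-envelope arithmetic using only the figures quoted in the article.
processors = 64               # quad-core, 64-bit processors
cores_per_processor = 4
ram_per_processor_gb = 4 * 8  # four 8 GB memory modules per processor

total_cores = processors * cores_per_processor
total_ram_gb = processors * ram_per_processor_gb

data_rate_mb_s = 400          # data recorded per second during a test
per_core_mb_s = data_rate_mb_s / total_cores  # if spread evenly (an assumption)

print(total_cores)    # 256
print(total_ram_gb)   # 2048
print(per_core_mb_s)  # 1.5625
```

So the cluster holds roughly 2 TB of RAM in total, and even distributed evenly the incoming stream amounts to only about 1.6 MB per second per core — which is what makes near-real-time processing of a 400 MB/s stream plausible.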
Ten graduate students, researchers, and IT staff work at the facilities, alongside the principal investigator, Professor Paul Young.
The $5-million laboratory was realized through grants from the Canada Foundation for Innovation, the Province of Ontario and the U.S.-based Keck Foundation, as well as contributions from MTS Systems, Dell Canada and Microsoft.