Instruments used to measure the energy content of a sample by measuring the heat released or absorbed during a reaction such as combustion; available in portable and benchtop configurations.
The simplest calorimeter consists of a thermometer, a metal container of water, and a combustion or reaction chamber. Calorimeters were first devised in the mid-1700s, when heat production was determined from the amount of snow or ice a reaction melted.
In an exothermic reaction, for example, the heat produced is absorbed by the solution, raising its temperature; an endothermic reaction instead draws heat from the solution, lowering its temperature. The temperature change, together with the specific heat and mass of the solution, can then be used to calculate the amount of heat involved in either case.
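The calculation described above follows the relation q = m · c · ΔT. A minimal sketch in Python, using illustrative figures (100 g of water and a 3.0 °C rise are assumptions, not values from the text):

```python
def heat_from_temperature_change(mass_g: float, specific_heat: float, delta_t: float) -> float:
    """Heat exchanged with the solution: q = m * c * delta_T.

    mass_g        -- mass of the solution in grams
    specific_heat -- specific heat in J/(g*degC), ~4.184 for water
    delta_t       -- temperature change in degC (positive = temperature rose)
    """
    return mass_g * specific_heat * delta_t

# Hypothetical example: 100 g of water warming by 3.0 degC
q = heat_from_temperature_change(100.0, 4.184, 3.0)
print(q)  # 1255.2 J absorbed by the solution, so the reaction was exothermic
```

A negative ΔT would yield a negative q, indicating heat drawn from the solution by an endothermic reaction.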
Today’s most commonly used calorimeter types include:
- Differential scanning calorimeters
- Isothermal micro calorimeters
- Titration calorimeters
- Accelerated rate calorimeters
Another type, the bomb calorimeter, has a constant volume and is used to measure energy from reactions that produce heat and gaseous products, including combustion reactions. It consists of a strong steel container (the “bomb”) into which the reactants and high-pressure oxygen are added. The bomb is submerged in water and the mixture is ignited; the energy released is absorbed by the vessel and the surrounding water. Bomb calorimeters are routinely calibrated using a standard reaction of known heat output to determine their heat capacity and ensure accurate results.
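The calibration step works in two stages: a standard of known heat output establishes the calorimeter's heat capacity (C = q_known / ΔT), which is then used to convert a sample's temperature rise into heat (q = C · ΔT). A sketch under assumed figures (the benzoic acid value of ≈26,430 J/g is a commonly cited heat of combustion; the temperature rises are hypothetical):

```python
def heat_capacity_from_standard(q_standard_j: float, delta_t: float) -> float:
    """Calibration: heat capacity of the calorimeter, C = q_known / delta_T (J/degC)."""
    return q_standard_j / delta_t

def heat_released(c_cal: float, delta_t: float) -> float:
    """Measurement: heat released by a sample, q = C * delta_T (J)."""
    return c_cal * delta_t

# Calibration run: burning 1.000 g of benzoic acid (~26,430 J) raises the
# calorimeter temperature by an assumed 2.50 degC.
c_cal = heat_capacity_from_standard(26430.0, 2.50)
print(c_cal)  # 10572.0 J/degC

# Measurement run: an unknown sample raises the temperature by 1.80 degC.
q_sample = heat_released(c_cal, 1.80)
print(q_sample)  # 19029.6 J released by the sample
```

Because the bomb holds volume constant, the measured heat corresponds to the internal-energy change of the reaction rather than its enthalpy change.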
Calorimeters are useful in many applications:
- Fire effluent characterization
- Geothermal well testing
- Radionuclide standardization
- Natural gas heating values
- Crystallization of hydrates and waxes
- Furniture burning behavior
- Metal injection molding (MIM) material characterizations