Soil compaction, the process of densifying soil by expelling air from its voids, is a fundamental aspect of geotechnical engineering. Its effectiveness depends on two primary factors: the compaction effort applied and the moisture content of the soil. In the laboratory, compaction tests are commonly carried out with 2.5 kg and 4.5 kg rammers, and the maximum dry densities achieved with these two rammers broadly bracket the range achieved by field compaction plant.
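The difference between the two rammer tests comes down to the compactive energy delivered per unit volume of soil. As a rough illustration, the sketch below computes that energy using parameter values (drop height, blows, layers, mould size) taken from the common BS 1377 "light" and "heavy" procedures; these figures are assumptions here, so check the relevant standard before relying on them.

```python
G = 9.81  # gravitational acceleration, m/s^2

def compaction_energy(rammer_kg, drop_m, blows_per_layer, layers, mould_m3):
    """Compactive energy delivered per unit volume of soil, in kJ/m^3."""
    energy_j = rammer_kg * G * drop_m * blows_per_layer * layers
    return energy_j / mould_m3 / 1000.0  # J -> kJ

# 2.5 kg rammer: 300 mm drop, 27 blows x 3 layers, 1-litre mould (assumed BS light test)
light = compaction_energy(2.5, 0.300, 27, 3, 0.001)

# 4.5 kg rammer: 450 mm drop, 27 blows x 5 layers, 1-litre mould (assumed BS heavy test)
heavy = compaction_energy(4.5, 0.450, 27, 5, 0.001)

print(f"light: {light:.0f} kJ/m^3")  # roughly 600 kJ/m^3
print(f"heavy: {heavy:.0f} kJ/m^3")  # roughly 2700 kJ/m^3
```

The heavy test delivers around four and a half times the energy of the light test, which is why it yields a higher maximum dry density for the same soil.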
The second factor, moisture content, plays a pivotal role in the compaction process. When moisture content is low, soil particles resist compaction because of inter-particle friction. As moisture content increases, water acts as a lubricant, facilitating particle rearrangement and improving compaction efficiency. Beyond a certain point, however, additional water begins to occupy volume that soil solids would otherwise fill, and the dry density falls. The moisture content at which dry density peaks is known as the optimum moisture content, and the corresponding density is the maximum dry density.
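This behaviour is what a laboratory compaction test captures: each compacted specimen yields a bulk density and a moisture content, from which the dry density follows as ρ_d = ρ / (1 + w). A minimal sketch of that reduction is below; the (moisture, bulk density) pairs are hypothetical illustrative values, not measured results.

```python
def dry_density(bulk_density_mg_m3, moisture_content):
    """rho_d = rho / (1 + w): dry density from bulk density (Mg/m^3)
    and moisture content (as a decimal fraction)."""
    return bulk_density_mg_m3 / (1.0 + moisture_content)

# Hypothetical test points: (moisture content w, bulk density in Mg/m^3).
results = [(0.08, 1.95), (0.10, 2.05), (0.12, 2.12), (0.14, 2.14), (0.16, 2.12)]

# Reduce to the dry-density curve and locate its peak.
curve = [(w, dry_density(rho, w)) for w, rho in results]
w_opt, rho_d_max = max(curve, key=lambda point: point[1])

print(f"optimum moisture content ~ {w_opt:.0%}, "
      f"maximum dry density ~ {rho_d_max:.2f} Mg/m^3")
```

Note that the peak of the dry-density curve does not coincide with the wettest or densest specimen: the last two points have higher bulk densities, but the extra water in them lowers the dry density.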
In practical terms, soil compaction tests compare the dry density achieved by on-site compaction using vibrating rollers or plates to the maximum dry density obtained in laboratory tests with a 2.5 kg (or 4.5 kg) rammer on similar soil samples. This comparison gauges the effectiveness of field compaction against a standardized laboratory benchmark.
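The comparison reduces to a single ratio, often called relative compaction: field dry density as a percentage of the laboratory maximum. A short sketch, using hypothetical density values for illustration:

```python
def relative_compaction(field_dry_density, lab_max_dry_density):
    """Field dry density expressed as a percentage of the
    laboratory maximum dry density (same units for both)."""
    return 100.0 * field_dry_density / lab_max_dry_density

# Hypothetical values: field measurement of 1.82 Mg/m^3 against
# a laboratory maximum of 1.89 Mg/m^3.
print(f"{relative_compaction(1.82, 1.89):.1f}%")  # ~96.3%
```

Earthworks specifications commonly set a minimum acceptable value for this ratio, though the required figure varies by project and soil type.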
Results exceeding 100% indicate that the on-site compactive effort exceeds the laboratory standard, which is normally considered satisfactory. Excessive compaction, however, particularly in granular soils, can crush individual particles and degrade the soil's strength parameters.
Therefore, while achieving adequate compaction is essential for soil stability and performance, excessive compaction should be avoided. Striking the right balance between compaction effort and moisture content is crucial to maximize soil density and strength without jeopardizing its structural integrity.