Answer
In 1877, Ludwig Boltzmann laid the foundation for the modern statistical understanding of entropy by relating it to the number of microscopic arrangements (microstates) compatible with a system's macroscopic state. This relationship is now written as Boltzmann's entropy formula, S = k ln W, where k is Boltzmann's constant and W is the number of microstates (note that "Boltzmann equation" usually refers to his separate kinetic transport equation). The formula revolutionized statistical mechanics by giving a precise microscopic measure of disorder: it explains irreversible phenomena such as heat flow, since isolated systems evolve toward macrostates that correspond to vastly more microstates.
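As a minimal numerical sketch of the formula (the function name and example microstate counts are my own, not from the source), one can compute S = k ln W directly and see that entropy grows with the number of accessible arrangements:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def boltzmann_entropy(W: int) -> float:
    """Entropy S = k_B * ln(W) for a system with W equally likely microstates."""
    if W < 1:
        raise ValueError("W must be a positive integer")
    return K_B * math.log(W)

# A perfectly ordered system (one microstate) has zero entropy.
print(boltzmann_entropy(1))  # 0.0

# More accessible arrangements mean higher entropy, which is why
# heat spreads out: the mixed macrostate has far more microstates.
print(boltzmann_entropy(10) < boltzmann_entropy(10**6))  # True
```

Note that the logarithm here is the natural log; the base only rescales the constant, which is why the formula is sometimes quoted with "log" rather than "ln".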