The Bohr radius, often denoted as "a₀," is a fundamental physical constant in quantum mechanics and atomic physics. It is named after the Danish physicist Niels Bohr, who made significant contributions to our understanding of atomic structure.
The Bohr radius represents the most probable distance between the nucleus and the electron in the lowest energy state (ground state) of a hydrogen atom; in a hydrogen-like ion with a single electron (e.g., He⁺, a helium ion that has lost one of its electrons), the corresponding distance is a₀ divided by the nuclear charge. It is a key parameter in the Bohr model of the hydrogen atom.
The Bohr radius is defined as:
a₀ = (4πε₀ħ²) / (mₑe²),
where:
- ε₀ is the vacuum permittivity (electric constant),
- ħ is the reduced Planck constant,
- mₑ is the rest mass of the electron, and
- e is the elementary charge.
Calculating the Bohr radius from these constants gives approximately 5.29177210903 × 10⁻¹¹ meters, or about 0.5292 angstroms (Å).
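As a quick numerical check, the following sketch (in Python, with standard SI values of the constants written out by hand rather than taken from a library) reproduces this number from the formula above.

```python
import math

# Standard SI values of the constants appearing in the formula for a_0.
epsilon_0 = 8.8541878128e-12   # vacuum permittivity, F/m
hbar = 1.054571817e-34         # reduced Planck constant, J*s
m_e = 9.1093837015e-31         # electron rest mass, kg
e = 1.602176634e-19            # elementary charge, C

# a_0 = 4*pi*eps_0*hbar^2 / (m_e * e^2)
a_0 = 4 * math.pi * epsilon_0 * hbar**2 / (m_e * e**2)

print(f"Bohr radius: {a_0:.6e} m")          # prints roughly 5.291772e-11 m
print(f"In angstroms: {a_0 * 1e10:.4f} Å")  # prints roughly 0.5292 Å
```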
The Bohr radius is a critical parameter for understanding the structure of atoms, particularly hydrogen-like atoms. It sets the basic length scale for atomic orbitals and appears directly in the expressions for the energy levels of electrons in these atoms.
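To make this concrete, here is a brief illustrative sketch (Python) of how a₀ sets both scales in the Bohr model: the radius of the n-th orbit is n²·a₀, and the n-th energy level is −e²/(8πε₀a₀n²), which works out to about −13.6 eV/n² for hydrogen. The helper name bohr_model_level and the choice of levels to print are just for illustration.

```python
import math

EPSILON_0 = 8.8541878128e-12   # vacuum permittivity, F/m
E_CHARGE = 1.602176634e-19     # elementary charge, C
A_0 = 5.29177210903e-11        # Bohr radius, m

def bohr_model_level(n: int) -> tuple[float, float]:
    """Return (orbit radius in meters, energy in eV) for hydrogen's n-th level."""
    r_n = n**2 * A_0                                             # orbit radius scales as n^2
    e_n = -E_CHARGE**2 / (8 * math.pi * EPSILON_0 * A_0 * n**2)  # energy in joules
    return r_n, e_n / E_CHARGE                                   # convert J to eV

for n in (1, 2, 3):
    r, energy_ev = bohr_model_level(n)
    print(f"n={n}: r = {r:.3e} m, E = {energy_ev:.2f} eV")
# n=1 gives roughly r = 5.292e-11 m and E = -13.61 eV.
```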
Earth's polar radius, the semi-minor axis of the reference ellipsoid (often denoted "b"), is the distance from the center of the Earth to the surface at either the North Pole or the South Pole. It represents the Earth's radius measured along its polar (rotation) axis. The polar radius is shorter than the equatorial radius because the Earth is slightly flattened at the poles and bulges at the equator due to its rotation.
Earth's polar radius is about 6,357 kilometers (roughly 3,950 miles). The exact figure varies slightly depending on the reference ellipsoid used to model the Earth's shape, but this value is an accurate approximation for most purposes.
In contrast to the polar radius, Earth's equatorial radius (measured from the center to a point on the equator) is slightly longer, approximately 6,378.1 kilometers (3,963.2 miles).
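That difference of roughly 21 kilometers is what the flattening at the poles amounts to. The small sketch below (Python) computes the flattening, (equatorial radius − polar radius) / equatorial radius, from the approximate values quoted above; it comes out near the commonly cited figure of about 1/300, though the precise number depends on the reference ellipsoid.

```python
# Approximate radii from the text, in kilometers (not tied to any specific
# reference ellipsoid).
equatorial_radius_km = 6378.1  # center to a point on the equator
polar_radius_km = 6357.0       # center to either pole

# Flattening f = (a - b) / a, where a and b are the equatorial and polar radii.
flattening = (equatorial_radius_km - polar_radius_km) / equatorial_radius_km

print(f"Difference:  {equatorial_radius_km - polar_radius_km:.1f} km")
print(f"Flattening:  {flattening:.5f}  (about 1/{1 / flattening:.0f})")
# Prints a difference of about 21 km and a flattening near 1/300.
```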