What Is an Ohm?
Last updated: April 2, 2026
Key Facts
- One ohm (1Ω) is defined as one volt per ampere; the underlying relationship between voltage, current, and resistance was established by Georg Ohm's research in the 1820s
- The symbol Ω represents the ohm, using the uppercase Greek letter omega, standardized internationally in the SI system
- Copper at 20°C (68°F) has a resistivity of approximately 1.68 × 10⁻⁸ ohm-meters, making it an excellent electrical conductor; a typical household copper wire has a resistance of only a few thousandths of an ohm per meter
- Following the 2019 SI revision, the ohm is defined in terms of exact fundamental physical constants (the Planck constant and the elementary charge) rather than physical measurement artifacts
- Typical household appliances operate at resistances from roughly 10 to a few hundred ohms, while household electrical wiring ranges from about 0.001 to 0.01 ohms per meter depending on wire gauge
Overview: Understanding Electrical Resistance
The ohm is the fundamental unit used to measure electrical resistance, which describes how much a material or component opposes the flow of electric current. When electricity flows through a conductor, it encounters opposition—this resistance generates heat and reduces the efficiency of electrical transmission. The ohm provides a standardized way to quantify this opposition, allowing engineers, electricians, and scientists to calculate electrical behavior, design safe circuits, and select appropriate components. Named after Georg Ohm, a German physicist who discovered the relationship between voltage, current, and resistance in the 1820s, the ohm has become indispensable in modern electrical systems worldwide.
Historical Development and Ohm's Law
Georg Ohm conducted experiments in the 1820s that revealed a fundamental relationship between voltage (electrical potential), current (flow of electrons), and resistance. His findings, published in 1827, established what became known as Ohm's Law: V = IR (Voltage equals Current times Resistance). This mathematical relationship revolutionized electrical science by providing a predictable, quantifiable framework for understanding electrical behavior. One ohm was subsequently defined as the exact amount of resistance that allows one volt to push one ampere of current through a conductor. Ohm's work was initially dismissed by the scientific community, but by the 1840s, his contributions were widely recognized, and the unit was named in his honor at the 1881 International Electrical Congress. His discoveries remain foundational to all electrical engineering and continue to be taught in physics and electronics courses globally.
Technical Definition and Measurement
The ohm is defined through a precise mathematical relationship: 1Ω = 1V ÷ 1A (one volt per ampere). This means that if a voltage of one volt is applied across a conductor and produces a current of one ampere, that conductor has a resistance of one ohm. Modern digital multimeters measure resistance in ohms by applying a small test voltage and measuring the resulting current, then calculating resistance using Ohm's Law. Materials exhibit varying resistance based on their atomic structure, temperature, and physical dimensions. The resistance of a wire, for example, is determined by the formula R = ρL/A, where ρ is the material's resistivity, L is the length, and A is the cross-sectional area. Copper, with a resistivity of approximately 1.68 × 10⁻⁸ ohm-meters at 20°C (68°F), is one of the best electrical conductors used in wiring and electronics. In contrast, rubber and glass are insulators with extremely high resistance values (millions or billions of ohms), making them suitable for electrical insulation. The 2019 SI revision redefined the ohm in terms of fixed fundamental constants, specifically the Planck constant and the elementary charge (which make the quantum Hall resistance h/e² an exact quantity), so the unit is no longer derived from physical artifact standards.
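The wire-resistance formula R = ρL/A can be sketched in a few lines of Python; the wire length and cross-section below are illustrative examples, not values from any standard:

```python
RHO_COPPER = 1.68e-8  # resistivity of copper in ohm-meters at 20 °C

def wire_resistance(rho, length_m, area_m2):
    """Resistance in ohms of a uniform conductor: R = rho * L / A."""
    return rho * length_m / area_m2

# A 10 m copper wire with a 1.5 mm^2 cross-section (a common household gauge):
r = wire_resistance(RHO_COPPER, 10.0, 1.5e-6)
print(f"{r:.4f} ohms")  # 0.1120 ohms, i.e. about 0.011 ohms per meter
```

The tiny result illustrates why copper wiring wastes so little power compared with the tens-of-ohms loads it feeds.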
Practical Applications in Daily Life
Electrical resistance measured in ohms is critical in countless everyday devices. Household incandescent light bulbs, for example, typically have resistances ranging from 100 to 300 ohms when operating at full brightness—the electrical resistance causes the filament to heat up and emit light. Heating elements in electric ovens, toasters, and water heaters operate at resistances between 10 and 50 ohms, converting electrical energy into thermal energy through resistance. Fuses exploit resistive heating: a deliberately thin element melts when dangerous current flows through it, cutting off power to prevent fires, while circuit breakers trip on thermal or magnetic overload. The wiring in homes is designed to have very low resistance (typically 0.001 to 0.01 ohms per meter) to minimize heat loss and power waste during transmission from the electrical panel to outlets and appliances. Mobile phones, computers, and televisions contain thousands of microscopic components called resistors with precise ohm ratings, from a fraction of an ohm to millions of ohms, that control voltage and current throughout the device to ensure proper operation and prevent component damage.
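A device's operating resistance can be estimated from its voltage and power rating using R = V²/P, which follows from Ohm's Law combined with P = VI. A short sketch, using illustrative ratings rather than figures from any specific product:

```python
def operating_resistance(volts, watts):
    """Effective resistance of a device at its rated power: R = V^2 / P."""
    return volts ** 2 / watts

print(operating_resistance(120, 60))    # a 60 W bulb on 120 V: 240.0 ohms
print(operating_resistance(120, 1500))  # a 1500 W heater on 120 V: 9.6 ohms
```

This matches the ranges above: bright filaments land in the hundreds of ohms, heating elements in the tens.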
Common Misconceptions About Resistance
One widespread misconception is that higher resistance always means less current flow, leading some to believe that resistance is inherently bad. In reality, resistance is neither good nor bad—it is simply a property of materials. Resistance is essential in electronics: precise resistor components with exact ohm values are fundamental building blocks of virtually every electronic device, used to control voltage levels, limit current, create timing circuits, and protect sensitive components. Another common myth is that resistance generates no useful effect beyond heat waste. While resistance does produce heat (which is why power transmission lines and electronics must be cooled), this same property enables useful technologies like space heaters, electric stoves, and incandescent lighting. A third misconception is that a perfect conductor (zero ohms) is always desirable. While superconductors with zero resistance have specialized applications in medical imaging and research, most electrical systems require controlled resistance to function properly—zero resistance can cause dangerous electrical shorts.
Temperature Effects and Resistance Variations
Most materials show significant changes in resistance with temperature variations. For metallic conductors like copper and aluminum, resistance increases with temperature at a rate of approximately 0.4% per degree Celsius, meaning a copper wire at 40°C has about 16% higher resistance than the same wire at 0°C. This temperature coefficient of resistance is critical in electrical engineering: power transmission lines run hotter in warm weather (and also sag as they expand thermally), and their increased resistance leads to greater power losses. Thermistors are specialized components that exploit this temperature-resistance relationship, changing resistance dramatically with small temperature changes and serving as temperature sensors in thermostats and medical devices. Superconductors, discovered in the early 1900s, exhibit a dramatic transition where resistance drops to virtually zero when cooled below a critical temperature (different for each material, but typically between -270°C and -150°C)—this property enables high-field electromagnets and precision instruments in medical and scientific applications. Understanding how temperature affects resistance is essential for designing equipment that must operate reliably across wide temperature ranges, from arctic mining operations to desert power plants.
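The linear temperature model described above, R(T) = R_ref × (1 + α(T − T_ref)), can be sketched in Python; the coefficient used is the approximate value for copper near room temperature:

```python
ALPHA_COPPER = 0.004  # temperature coefficient per °C, approximate for copper

def resistance_at(r_ref, t_ref_c, t_c, alpha=ALPHA_COPPER):
    """Linear model: R(T) = R_ref * (1 + alpha * (T - T_ref))."""
    return r_ref * (1 + alpha * (t_c - t_ref_c))

# A conductor measuring 1.00 ohm at 0 °C, warmed to 40 °C:
print(resistance_at(1.00, 0, 40))  # about 1.16 ohms, i.e. ~16% higher
```

The model is only accurate over moderate temperature ranges; near a superconductor's critical temperature the behavior is nothing like linear.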
Related Questions
How does Ohm's Law work?
Ohm's Law states that voltage equals current multiplied by resistance (V = IR). This fundamental relationship shows that doubling the resistance while keeping voltage constant cuts the current in half, and doubling the voltage while keeping resistance constant doubles the current. For example, a 120-volt household circuit with a 12-ohm toaster will draw 10 amperes of current. Understanding this relationship allows engineers to calculate the current flow through any electrical device when they know the voltage and resistance.
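The toaster example and the doubling relationships can be checked with a few lines of Python:

```python
def current(volts, ohms):
    """Ohm's Law rearranged to solve for current: I = V / R."""
    return volts / ohms

print(current(120, 12))  # the 12-ohm toaster on a 120 V circuit: 10.0 amperes
print(current(120, 24))  # doubling the resistance halves the current: 5.0
print(current(240, 12))  # doubling the voltage doubles the current: 20.0
```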
What is the difference between resistance and resistivity?
Resistance (measured in ohms) is the opposition to current flow in a specific conductor or component, while resistivity (measured in ohm-meters) is a material property that describes how much any sample of that material opposes current flow. Copper has a very low resistivity of about 1.68 × 10⁻⁸ ohm-meters, meaning all copper conductors have relatively low resistance. A thin copper wire has higher resistance than a thick copper wire of the same length because resistivity is constant but cross-sectional area differs, demonstrating that resistance depends on both the material's resistivity and the conductor's dimensions.
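A short sketch makes the distinction concrete: the resistivity ρ stays fixed for copper, while the resistance R = ρL/A changes with the conductor's geometry. The cross-sections below are illustrative:

```python
RHO_COPPER = 1.68e-8  # ohm-meters: a property of the material, not the wire

def resistance(rho, length_m, area_m2):
    """R = rho * L / A for a uniform conductor."""
    return rho * length_m / area_m2

thin = resistance(RHO_COPPER, 1.0, 0.5e-6)   # 0.5 mm^2 cross-section
thick = resistance(RHO_COPPER, 1.0, 2.0e-6)  # 2.0 mm^2 cross-section
print(thin / thick)  # ratio of 4: quadrupling the area quarters the resistance
```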
Why do devices get hot when current flows through resistance?
When electric current flows through a material with resistance, electrical energy is converted into thermal energy according to Joule's Law (Power = I²R, or Power = Voltage × Current). The electrical energy is absorbed by the material's atoms, increasing their vibration and generating heat. A 1000-watt electric heater operating at 120 volts draws about 8.33 amperes through an effective resistance of about 14.4 ohms, converting nearly all electrical energy into useful heat. This same process occurs in all electronic devices, which is why computers, phones, and power supplies require cooling systems to prevent overheating.
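The heater figures can be derived and cross-checked in Python from Ohm's Law and Joule's Law:

```python
def heater_figures(volts, watts):
    """Current and effective resistance of a resistive heater at its rating."""
    amps = watts / volts   # from P = V * I
    ohms = volts / amps    # Ohm's Law: R = V / I
    return amps, ohms

amps, ohms = heater_figures(120, 1000)
print(f"{amps:.2f} A, {ohms:.1f} ohms")      # 8.33 A, 14.4 ohms
print(f"{amps**2 * ohms:.0f} W dissipated")  # 1000 W, matching Joule's Law
```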
What is the purpose of resistors in electronic circuits?
Resistors are components with precise ohm ratings used to control voltage and current in electronic circuits, protect sensitive components, and create timing circuits. A typical smartphone circuit might contain thousands of resistors ranging from a few ohms to millions of ohms, each serving a specific purpose such as limiting current through an LED indicator, setting the frequency of a clock signal, or protecting the device from power surges. Without resistors, excessive current would flow through delicate semiconductors, causing immediate failure, making resistors essential to circuit design and electronic device reliability.
How do you measure resistance in ohms?
A digital multimeter measures resistance by selecting the ohms setting, disconnecting power to the circuit being measured, and connecting the meter's probes across the component. The meter applies a small test voltage and measures the resulting current, then calculates resistance using Ohm's Law (R = V/I). Modern digital multimeters typically measure resistances from 0.1 ohms to 20 megaohms (20 million ohms) with reasonable accuracy, though precise measurements of very high resistance values (above 1 megaohm) require special equipment due to factors like humidity and contact resistance affecting the measurement.