How Much Does It Cost to Run a TV All Day?
Televisions are often left on for long stretches of time, whether for background noise, entertainment, or all-day viewing. Because TVs can run for hours at a time, many people wonder how much electricity they actually use and whether leaving one on all day adds much to the electric bill. In this article, we’ll break down how TV electricity costs are calculated, show realistic cost examples, and explain what factors make the biggest difference.
What affects the cost of running a TV
The cost of running a TV depends on a few main factors.
TV size and type
Larger TVs generally use more electricity than smaller ones. The display technology also matters. LED TVs usually use less power than older plasma TVs, while OLED TVs can use more power depending on brightness.
Power usage
Most modern TVs use between 50 and 200 watts while operating. Very large or high-brightness TVs can use more.
Brightness and settings
Higher brightness levels increase power consumption. Picture modes like “Vivid” typically use more electricity than standard or energy-saving modes.
How long the TV is on
A TV running for a few hours a day costs much less than one left on all day.
How TV electricity cost is calculated
The basic calculation looks like this:
Watts ÷ 1,000 × hours of use × electricity rate (in dollars per kWh) = cost
This formula can be used to estimate daily, weekly, or monthly TV costs. The examples below use an illustrative electricity rate of $0.20 per kWh; your actual rate is listed on your utility bill.
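As a quick sketch, the formula translates directly into a few lines of Python (the function name and the sample numbers here are just illustrative):

```python
def tv_cost_per_day(watts, hours, rate_per_kwh):
    """Estimate daily cost: watts / 1,000 * hours * rate ($/kWh)."""
    kwh = watts / 1000 * hours  # energy used, in kilowatt-hours
    return kwh * rate_per_kwh

# Example: 100 W TV, 4 hours per day, $0.20 per kWh
print(round(tv_cost_per_day(100, 4, 0.20), 2))  # 0.08
```

Multiply the daily figure by 30 or 31 for a rough monthly estimate.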
Real-world cost examples
Running a 100-watt TV for about 4 hours per day
0.1 kW × 4 hours = 0.4 kWh
0.4 kWh × $0.20 = $0.08 per day
That’s about $2.40 to $2.50 per month.
Running a 100-watt TV for about 8 hours per day
0.1 kW × 8 hours = 0.8 kWh
0.8 kWh × $0.20 = $0.16 per day
That works out to roughly $5 per month.
Running a 100-watt TV all day (24 hours)
0.1 kW × 24 hours = 2.4 kWh
2.4 kWh × $0.20 = $0.48 per day
That comes out to about $14 to $15 per month.
This represents a maximum or upper-bound scenario, assuming the TV stays on continuously.
Larger TVs and higher power usage
A larger or brighter TV using around 200 watts:
8 hours per day
0.2 kW × 8 hours = 1.6 kWh
1.6 kWh × $0.20 = $0.32 per day
Around $10 per month
24 hours per day
0.2 kW × 24 hours = 4.8 kWh
4.8 kWh × $0.20 = $0.96 per day
Around $29 per month
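The scenarios above can be reproduced in one short loop; this sketch assumes the same illustrative $0.20/kWh rate and a 30-day month:

```python
# (watts, hours per day) for each scenario discussed above
scenarios = [(100, 4), (100, 8), (100, 24), (200, 8), (200, 24)]
rate = 0.20  # illustrative rate, $ per kWh

for watts, hours in scenarios:
    daily = watts / 1000 * hours * rate   # daily cost in dollars
    monthly = daily * 30                  # rough 30-day month
    print(f"{watts} W, {hours} h/day: ${daily:.2f}/day, ~${monthly:.0f}/month")
```

Swapping in your own wattage, hours, and rate gives a personalized estimate.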
Do TVs use electricity when they’re “off”?
Most modern TVs use a small amount of standby power when turned off but still plugged in. This allows features like quick startup and remote control activation.
Standby power is usually very low, often under 1 watt, and typically adds only a small amount to monthly electricity costs.
When TV electricity costs add up
TV costs become more noticeable when:
The TV is left on most of the day
Multiple TVs are running at the same time
Very large or high-brightness TVs are used
Electricity rates are higher than average
Even then, TV electricity use is usually modest compared to major appliances.
When TV electricity costs are usually minor
The cost is often small when:
The TV is used a few hours per day
Energy-saving picture modes are enabled
Smaller or more efficient models are used
For many households, the cost of running a TV is relatively low.
How to estimate your own TV’s cost
To estimate your TV’s cost:
Check the TV’s wattage in the manual or on the manufacturer’s website
Look up your electricity rate on your utility bill
Estimate how many hours per day the TV is on
Some smart plugs and power meters can also show actual usage.
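The three steps above plug straight into the earlier formula; in this sketch, the wattage, rate, and hours are placeholder values to replace with your own:

```python
# Hypothetical inputs -- substitute your own numbers
watts = 120          # from the manual or manufacturer's spec sheet
rate_per_kwh = 0.15  # from your utility bill, $ per kWh
hours_per_day = 5    # your estimated daily viewing time

monthly_kwh = watts / 1000 * hours_per_day * 30
monthly_cost = monthly_kwh * rate_per_kwh
print(f"~{monthly_kwh:.0f} kWh/month, about ${monthly_cost:.2f}/month")
```

A smart plug or power meter will give a measured figure you can substitute for the wattage estimate.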
The bottom line
Most modern TVs are relatively efficient and inexpensive to run, even for several hours a day. Leaving a TV on all day can add up over time, but the total cost is usually modest compared to larger household appliances. Screen size, brightness settings, and daily run time have the biggest impact on electricity use.