New SAE paper says EVs worse than ICEs at hitting EPA-rated range estimates
Anecdotally, we probably come across about as many comments, here and elsewhere, claiming battery-electric vehicles don’t achieve their EPA-rated range estimates as we do comments claiming they meet or exceed them. A new technical paper from the Society of Automotive Engineers (SAE), co-authored by Car and Driver, asserts that “most BEVs tested to date fall short of both their electric consumption and range label values.” Not only that, the writers say the disparity between claimed and actual EV driving ranges is wider than the same disparity for vehicles powered by internal combustion engines. Presenting the paper at this week’s SAE World Congress Experience, a conference on the hurdles facing transportation and mobility, C/D testing director Dave VanderWerp and SAE’s Gregory Pannone assert the problem isn’t with BEVs, but with EPA testing and calculation procedures.
The quick backstory is that in 2016, Car and Driver added a highway portion to its fuel economy testing. The route is a 200-mile out-and-back loop on I-94 in Michigan, run with the cruise control set at 75 miles per hour and the speed verified by GPS. The mag explained the process in detail in 2020, including how EV testing adds a few extra steps to ensure state-of-charge consistency, measure range fluctuations over time, and extrapolate data points to adjust for the last few percent of unused charge. Last October, VanderWerp explained, “We chose 75 mph for our test because driving at a steady elevated speed is the worst range case for an EV (this is not the case for a gas-powered car, which we run at 75 mph to get a highway fuel-economy figure). Another reason is that range matters most in a scenario like a highway road trip where you’re driving a lot of miles in a day; nearly every EV on the market has sufficient range when stuck in slow city traffic for hours upon hours.”
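As for that extrapolation step, here’s a minimal sketch of how such a projection could work in principle, assuming roughly constant energy use per mile at a steady 75 mph. The function and figures below are our illustration, not C/D’s actual math; the magazine’s 2020 write-up covers its real procedure.

```python
# Hypothetical illustration: projecting a full-battery range figure from a
# test that ends with a few percent of charge still unused. Assumes energy
# use per mile stays roughly constant at a steady 75 mph; C/D's actual
# bookkeeping may differ.

def extrapolated_range(miles_driven: float, start_soc: float, end_soc: float) -> float:
    """Project total range from miles covered and the state of charge consumed."""
    soc_used = start_soc - end_soc            # fraction of the pack consumed
    miles_per_soc = miles_driven / soc_used   # distance per unit of charge
    return miles_per_soc * start_soc          # project down to an empty pack

# Example: 190 miles driven from 100% down to 4% remaining
print(round(extrapolated_range(190, 1.00, 0.04)))  # ~198 miles
```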
In August 2022, VanderWerp wrote, “Unlike gas- or diesel-powered vehicles, which regularly beat their EPA ratings in our highway testing, only three of the 33 EVs that we’ve run range tests on to date have exceeded their EPA highway and combined figures.” Comparing numbers by powertrain in relation to the new paper, C/D wrote that on the highway test, “more than 350 internal-combustion vehicles averaged 4.0 percent better fuel economy than what was stated on their labels. But the average range for an EV was 12.5 percent worse than the price sticker numbers.”
What could explain the findings? Among the reasons, the paper cites the variable speeds used in EPA testing, whose decelerations help EVs recoup energy and extend range. The EPA also runs its highway tests indoors and at speeds lower than 75 mph. Then there’s the EPA’s weighted rating system, which derives an EV’s combined range figure from 55% city and 45% highway results, and the fact that the agency doesn’t make those constituent numbers easily available to the public, having stopped publishing them with the 2022 model year. The city and highway figures on the EPA site are for efficiency, which isn’t the same as pure range. VanderWerp told Autoblog, “While efficiency and range are related, unlike gas-powered vehicles, you can’t directly compute range from the MPGe figures like you can with gas mpg and the tank size. That’s because there are an unknown amount of charging losses included in the MPGe figures that don’t factor into the range figure. Further complicating matters is that every automaker isn’t good about providing usable battery capacity.”
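To make the efficiency-versus-range distinction concrete, here’s a rough back-of-the-envelope sketch of the arithmetic VanderWerp is describing. Every vehicle figure and the charging-loss percentage below are hypothetical assumptions on our part; only the 33.7-kWh gasoline-gallon equivalence and the 55/45 weighting reflect the EPA method described above.

```python
# Illustrative only: why an EV's MPGe doesn't convert cleanly to range the
# way a gas car's mpg and tank size do. All vehicle numbers are hypothetical.

KWH_PER_GALLON = 33.7  # EPA's energy equivalence for a gallon of gasoline

# Gas car: label mpg times tank size gives range directly.
gas_range = 32 * 15.0                     # 32 mpg x 15 gal = 480 miles

# EV: MPGe is metered at the wall, so it includes charging losses, while the
# battery's usable capacity (often unpublished) does not. Converting naively
# treats wall energy as battery energy and misses the mark.
mpge = 100
usable_kwh = 77.0
charging_efficiency = 0.90                # assumed wall-to-pack efficiency

naive_range = mpge * usable_kwh / KWH_PER_GALLON        # ~228 miles, understated
loss_aware_range = naive_range / charging_efficiency    # ~254 miles

# The combined range figure itself is a 55/45 blend of city and highway results.
city_range, highway_range = 280.0, 230.0  # hypothetical constituent figures
combined_range = 0.55 * city_range + 0.45 * highway_range  # 257.5 miles
```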
On top of all of this, the EPA uses a small range of multipliers to convert its test data into the real-world estimates customers can expect. Automakers often self-certify and submit their own testing info, and the EPA multipliers shift depending on whether automakers choose to perform two-cycle or five-cycle tests, making comparisons across models difficult. On this last point, VanderWerp made it clear to us that, “One of the key things we’re highlighting in the paper is that not all EPA figures are exactly equal. To get to the label values, they take the results from the city and highway-cycle tests and reduce them by a factor. The default is 0.7 (i.e., a result of 300 miles on the test cycle equals a 210-mile label figure), but automakers have the option to do five-cycle testing to earn a more advantageous reduction factor. Some Teslas are as high as 0.77, so that same hypothetical 300 miles on the test cycle would instead be a label figure of 231 miles. It’s perhaps no surprise then that vehicles using more aggressive reduction factors [higher multipliers] do relatively worse in our real-world highway range testing. In fact, a Tesla (the automaker that underperforms its label the most in our testing) with a 400-mile label range figure and a Porsche Taycan with a 300-mile range figure are probably about the same range at 75 mph on the highway. That’s kind of problematic for potential buyers comparing label range values during the shopping process.”
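Because the reduction factor does the heavy lifting in that comparison, here’s the same arithmetic laid out as a quick sketch. The 300-mile test-cycle result and the 0.7 and 0.77 factors are the ones VanderWerp cites in the quote above; nothing beyond that is implied.

```python
# Label range = test-cycle result x EPA reduction factor. The factor depends
# on whether the automaker kept the default or earned a higher one through
# five-cycle testing, per the figures VanderWerp cites above.

def label_range(test_cycle_miles: float, reduction_factor: float) -> float:
    return test_cycle_miles * reduction_factor

default_label = label_range(300, 0.70)   # 210-mile label with the 0.7 default
higher_factor = label_range(300, 0.77)   # 231-mile label with a 0.77 factor

# Same underlying test result, a 21-mile gap on the window sticker, which is
# why two vehicles with different label ranges can end up about the same at a
# steady 75 mph.
```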
The vagaries can also throw results the other way, as when the EPA rated the just-launched Porsche Taycan Turbo at 201 miles of total range. Porsche’s third-party testing claimed around 275 miles, and our eight-hour, real-world test of the same model showed more like 254 miles.
Three solutions the authors say could aid consumers are standardized testing procedures, a more realistic range multiplier, and having the EPA publish its city and highway ranges for EVs, as the agency does with ICE vehicles.
Head to C/D for its explanation of the findings. If you’re super keen, you can buy the paper at the SAE site; non-members will pay $35.