This is a guest post written by our friend Bill as a result of some discussion around the race traces I've been posting and in particular how straight the lines are in some races. Over to you Bill...

Tyre saving has become a hot topic in F1 over the past few seasons. I've heard engineers remind drivers to save tyres over the radio, and drivers blame it for poor race performance. But what should a team and driver do through a race to best exploit tyre saving? Is there anything that can be done?

To begin thinking about this, we need a couple of things in place - a model for tyre performance, and an idea of the size of the effects.

Models can be complicated, but luckily something simple seems to fit the evidence pretty well. If we assume that every lap at racing speed increases, by a constant amount, the minimum potential lap time a car can achieve on all subsequent laps, then everything looks right: we have an explanation for why lap times fall massively after a pit stop (when tyres carrying many laps' worth of this "tyre degradation" are replaced), and for why lap times don't get much faster through a stint.

Why are constant lap times through a stint evidence of cumulative tyre deg? Well, we know F1 cars go faster when they are lighter - estimates in the public domain seem to hover around 0.03 s/lap per kg. This means an F1 car should get quicker as it burns off fuel. So when it doesn't, we know a cumulative slowing effect must exist. If we assume all of this is tyre deg, then we instantly have an estimate for the size of the per-lap tyre degradation effect - it has to be about the same as the gain expected from burning 1 lap of fuel. If fuel consumption is 1.75 kg a lap, then that makes tyre deg about 1.75 kg × 0.03 s/lap/kg ≈ 0.05 s/lap lost every lap. With this model and 20-lap-old tyres, we'd be going 20 × 0.05 = 1 s/lap slower than if we were on fresh new tyres.
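As a quick sanity check on that arithmetic (the 1.75 kg/lap and 0.03 s/lap/kg figures are just the rough public-domain estimates quoted above):

```python
# Rough per-lap tyre deg estimate, assuming it exactly cancels the fuel-burn gain.
fuel_burn_per_lap_kg = 1.75   # fuel consumed per lap (estimate)
pace_gain_per_kg = 0.03       # s/lap gained per kg of weight removed (estimate)

deg_per_lap = fuel_burn_per_lap_kg * pace_gain_per_kg   # about 0.05 s/lap

tyre_age = 20
pace_loss = tyre_age * deg_per_lap   # about 1 s/lap on 20-lap-old tyres
```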

So what's tyre saving all about then? The idea seems to be that, if the driver goes slower at particular points on the lap, he can reduce the tyre degradation he accumulates on that lap. With our model for tyre deg, we know tyre deg slows him on all subsequent laps. Hence a reduction in tyre deg benefits him over the remainder of the laps he completes on those tyres. How a driver might save tyre deg during a lap is likely pretty complicated, and probably the focus of a fair amount of research in F1. Fortunately, we can make progress without it here - we can just model it as a function d(y) between the deliberate slowness y (in seconds) a driver adds to his lap time and the per-lap tyre deg (in s/lap) he accumulates on that lap.

What shape should this relationship take? The upper and lower end points seem pretty obvious: it seems unlikely that tyres would get faster no matter how slow a car goes, and you'd expect tyre deg to be maximal at a driver's flat-out speed, when he is doing no tyre saving. This suggests the most likely shape is something like an exponential decay:

d(y) = d0 · e^(−k·y)   for y ≥ 0

Where:

k = a positive constant.

y = the deliberate slowness a driver adds to his lap time to save tyres on a lap (s).

d(y) = tyre deg accumulated on this lap that will affect all subsequent laps (s/lap).

d0 = tyre deg with y = 0; no deliberate slowness (s/lap).

So if we go flat out (y = 0) we accrue our maximum deg (d0); if we go really slow (big y), we accrue close to zero deg. If there are n laps left on these tyres, the effect on total race time of going y s/lap slow on this lap is the cost on this lap plus the tyre deg effect on every remaining lap. If we call the change to race time due to saving on this lap Δ(y):

Δ(y) = y + n · d(y) = y + n · d0 · e^(−k·y)
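In code, the race-time effect of slowing on one lap reads as follows (a minimal sketch; the default d0 and k values here are placeholders for illustration, not fitted numbers):

```python
import math

def delta(y, n, d0=0.05, k=2.0):
    """Effect on total race time of adding y s of deliberate slowness on a
    lap with n laps still to run on these tyres: the cost this lap (y)
    plus the deg carried into each of the n remaining laps."""
    return y + n * d0 * math.exp(-k * y)
```

Note that delta(0, n) is just n·d0 — the deg cost of a flat-out lap — and that for large enough n the function initially decreases as y grows, so some deliberate slowness pays for itself.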

We can look at the minimum of this by differentiating it and setting the result to zero. This yields:

dΔ/dy = 1 − n · k · d0 · e^(−k·y) = 0

Which results in:

y = ln(n · k · d0) / k

This solution has a problem: y can be negative (whenever n·k·d0 < 1), which means it can ask the driver to go faster than his fastest, accumulating even more tyre degradation than at his flat-out pace. We've specifically disallowed this as unhelpful in our model, and believe the driver is flat out when he says he is at y = 0. This makes our real solution for the minimum:

y* = max(0, ln(n · k · d0) / k)

This doesn't depend on our behaviour on any other lap, so the fastest way to the end of the race is to go this optimal amount slower on every lap, recomputing it with the n laps then remaining.
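The closed-form optimum is easy to check numerically against a brute-force search over y (again with placeholder values for d0 and k):

```python
import math

D0, K = 0.05, 2.0  # placeholder values for illustration, not fitted numbers

def delta(y, n):
    # race-time effect of y s of slowness with n laps left on the tyres
    return y + n * D0 * math.exp(-K * y)

def y_star(n):
    # closed-form optimum: max(0, ln(n*k*d0) / k)
    return max(0.0, math.log(n * K * D0) / K)

# coarse grid search over y in [0, 3] should land on the same minimum
n = 15
grid_best = min((i / 1000 for i in range(3001)), key=lambda y: delta(y, n))
```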

Was all this worth bothering about? Let's put some numbers in and see.

From our argument above, let's assume d0 is about the same as our reducing-fuel-load weight effect: 0.05 s/lap.

For k, pick a value such that going 1 s/lap slower than flat out saves 0.007 s/lap of tyre deg -- which doesn't seem ridiculous.

With these settings, the optimal slowness y* (s/lap) looks like this:

Fig. 1: Optimal tyre saving with 20 laps remaining and the parameter values above.

The most striking feature of this is that it is zero towards the end of the stint. This suggests the driver shouldn't be doing any saving from lap 11 onwards - just rinsing his tyres for all they are worth. It's just not worth going slow at all from here on in as there aren't enough future laps to recoup how slow you had to go on this lap to get the performance. All the important tyre saving is done at the start of a stint.
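The per-lap schedule is easy to generate (d0 = 0.05 s/lap from the fuel argument; k = 2 is an assumed value chosen only so the effect is visible over a short stint): positive saving early, tapering to exactly zero once too few laps remain to pay the slowness back.

```python
import math

D0, K = 0.05, 2.0   # d0 from the fuel-burn argument; k assumed for illustration
STINT = 20

def y_star(n):
    # optimal deliberate slowness with n laps still to run on these tyres
    return max(0.0, math.log(n * K * D0) / K) if n > 0 else 0.0

# laps remaining after lap 1, lap 2, ... of the stint
schedule = [y_star(STINT - lap) for lap in range(1, STINT + 1)]
# saving shrinks lap by lap and is exactly zero over the final laps
```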

Slightly less expected is the behaviour at the start of a long stint with a lot of laps left - you don't go that much slower than on the previous lap; the curve is almost flat in this region. Despite the massive gain from having your saving last for a lot of subsequent laps, you're already going quite slow and well into the steeply diminishing part of the exponential, so you barely get any return for going a lot slower.

So how would a car driven like this stack up in a race with a car driven flat out from the start? I've compared those two, and a car driven with optimal constant slowness and optimal linear reducing slowness in the race trace below.

Fig. 2: Race-trace for various tyre saving strategies, one stint only.

Pleasingly, our optimal deliberate slowness model wins. It performs only a little better than optimal linear reducing slowness, but a load better than going flat out every lap. The race trace shows the optimally driven car drops back by over a second over the first few laps, but then catches up and then some as he puts his saved tyre performance to use.
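A toy simulation of a single stint reproduces the qualitative picture (the base lap time, d0 and k here are all assumed purely for illustration):

```python
import math

D0, K, LAPS, BASE = 0.05, 2.0, 20, 90.0   # all assumed, for illustration

def y_star(n):
    # optimal deliberate slowness with n laps still to run
    return max(0.0, math.log(n * K * D0) / K) if n > 0 else 0.0

def race_time(strategy):
    total, carried_deg = 0.0, 0.0
    for lap in range(1, LAPS + 1):
        n = LAPS - lap                          # laps remaining after this one
        y = strategy(n)                         # deliberate slowness this lap
        total += BASE + carried_deg + y         # lap time = base + deg + slowness
        carried_deg += D0 * math.exp(-K * y)    # deg accrued, felt on later laps
    return total

flat_out = race_time(lambda n: 0.0)
optimal = race_time(y_star)
# the optimally-saving car finishes ahead despite its slower early laps
```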

This profile (and the more extreme ones for higher deg) makes for some interesting possibilities. In an effective two-horse race, there seems little disadvantage to the second car in going a little bit slower than the lead car at the start of the stint – you will save tyre performance and be quicker at the end of the stint. If the first car is driven optimally, the second won't catch it before the end of the race, but if the first car has any issues at all (safety cars, missing a chicane, a slow lap...) the second will not only close the gap, but will have a faster car than his opponent for the remainder of the race. Moreover, if the lead car has underestimated the actual tyre deg rate, the second car will be driving closer to the optimum and will be able to catch it, with a chance to pass, before the end of the race.

The model also gives us a clue as to why tyre saving seems to be a relatively recent hot topic. With just a small reduction in tyre deg (to 75% of our estimate), the optimal slowness is always zero and a driver should be going flat out from the start. No tyre saving helps.
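The cut-off falls straight out of the formula: tyre saving is only worthwhile on a lap when n·k·d0 > 1. A quick check of how the saving window responds to lower deg (same assumed k = 2 as earlier; with these placeholder numbers the window merely shrinks, while with sufficiently low deg it closes entirely, which is the behaviour described above):

```python
K, STINT = 2.0, 20   # k assumed for illustration

def saving_laps(d0):
    # laps of the stint on which any tyre saving is optimal: n * k * d0 > 1
    return sum(1 for n in range(1, STINT) if n * K * d0 > 1)

full = saving_laps(0.05)           # deg from our fuel-burn estimate
reduced = saving_laps(0.75 * 0.05) # 75% of that estimate
```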

As ever, the real situation is likely to be more complicated than we have modelled. We've ignored the probability of being slowed by other cars and we've assumed tyre deg behaviour is constant and known throughout a race, rather than variable and hard to predict. All of these are likely to be important factors, and must make for some interesting race day strategy debates within teams.