PlaymakerAI are the first in the world to introduce effective time!

Challenging the traditional 90-min game metric, our analysis reveals significant discrepancies in player stats due to overlooked stoppage time.

Time is relative

In Einstein’s Theory of Relativity, time is relative: how it passes can vary between different observers. In football, however, we have always treated time as absolute. One minute is one minute and a game is 90 minutes.

Is a game really 90 minutes long?

Let's look at how long games really are. Here I have broken down the average game length for several leagues over the last eight years.

What implications does this have on the "per 90" metric?

The idea behind the "per 90" metric is basically that we want to quantify a player's actions per game played. So now that we know that a "game" is 101 minutes (PL 23/24) and not 90, what do we do with that information?
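The per-90 calculation itself is just a rate. A minimal sketch (the function name per_90 is mine; the minute totals are the 28-full-games figures used later in this article) shows how much the result depends on which minute count you divide by:

```python
def per_90(total_actions: float, minutes_played: float) -> float:
    """Normalize a season total to a 90-minute rate."""
    return total_actions / minutes_played * 90

# 1000 passes over 28 full games:
print(round(per_90(1000, 28 * 90), 1))  # 90-minute assumption -> 35.7
print(round(per_90(1000, 2816), 1))     # stoppage time included -> 32.0
```

Same player, same passes; only the denominator changed.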

In analytics and statistics alike, the widespread assumption has been that a game lasts exactly 90 minutes. But given the above, is this really fair? In my opinion the answer is obviously no.

This could be the end of this article. A simple no. But it gets more complicated.

How are minutes calculated?

If we acknowledge that a football game isn't 90 minutes but in fact 95 or even 100, a player who has played every game of a season (38 in the PL) will have played more than 38 × 90 = 3420 minutes, right? Well, if we consult the biggest online database for football statistics, the answer is no. It turns out that all games are considered to be exactly 90 minutes long; no stoppage time is counted in either half. And what implications does this way of calculating minutes played have?

Analyzing minutes played in the Premier League

Let's take two examples from the Premier League: one player who plays most of every game, and one who has relatively few minutes per appearance.

Bruno Fernandes has played 28 games without being substituted on or off, so given how the biggest online database calculates minutes, his total is as simple as 28 × 90 = 2520 minutes.

But if you include all stoppage time, the actual minutes played come to 2816. A difference of 11%.

If we instead take a player on the other side of the spectrum, the difference becomes more evident. Pierre-Emile Højbjerg has appeared in just as many games as Bruno Fernandes, but he has only played 1323 minutes, at least if you ask us, who include stoppage time. If you ask the biggest online database, it says he has played 1000 minutes. A difference of 28%.
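Both percentages can be reproduced with the symmetric percent difference, which compares two measurements against their mean (the function name pct_diff is mine; the minute totals are the ones quoted above):

```python
def pct_diff(a: float, b: float) -> float:
    """Symmetric percent difference: gap relative to the mean of a and b."""
    return abs(a - b) / ((a + b) / 2) * 100

# Bruno Fernandes: 2816 actual minutes vs. 2520 assumed
print(round(pct_diff(2816, 2520)))  # -> 11
# Pierre-Emile Hojbjerg: 1323 actual minutes vs. 1000 assumed
print(round(pct_diff(1323, 1000)))  # -> 28
```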

How a simple metric like minutes played differs among data providers

If we look at the three biggest online sources for football stats (from different data providers) they all differ a bit.

The above is a bit confusing when trying to decipher what each source does and doesn't count. My best guess is that #1 and #2 simply assume that all games are exactly 90 minutes long, while data source #3 seems to include stoppage time in the second half but not in the first.

The error percentages above also give a good indication of the error margin for playing time in general. The 11% difference we see for Fernandes correlates quite well with the fact that the actual game time is around 100 minutes and not 90. Then we see what happens when a player plays many fractions of games: the error increases. In fact, a player could theoretically have been substituted on in minute 89 of every one of the 28 games. He or she would then have 28 minutes played according to two of the data sources, while we would record around 300. The difference would then be 166%.
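The super-sub figure follows from the same symmetric percent difference used above, assuming roughly 10 minutes of stoppage time on top of the single official minute per game (pct_diff is my name for the formula; the 300-minute total is the rough estimate from the text):

```python
def pct_diff(a: float, b: float) -> float:
    """Symmetric percent difference: gap relative to the mean of a and b."""
    return abs(a - b) / ((a + b) / 2) * 100

recorded = 28 * 1   # substituted on in minute 89: 1 official minute per game
actual = 300        # roughly 11 real minutes per game, as estimated in the text
print(round(pct_diff(actual, recorded)))  # -> 166
```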

Let's then assume that Højbjerg has made 1000 successful passes over the course of the season and scored 10 goals. Using the minutes above, we can calculate the passes/90 and goals/90 metrics according to the different data providers.
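With the two minute totals from above, the per-90 rates come out like this (a sketch using the hypothetical 1000 passes and 10 goals from the text):

```python
passes, goals = 1000, 10

for source, minutes in [("stoppage time included", 1323),
                        ("90-minute assumption", 1000)]:
    p90 = passes / minutes * 90
    g90 = goals / minutes * 90
    print(f"{source}: {p90:.1f} passes/90, {g90:.2f} goals/90")
# stoppage time included: 68.0 passes/90, 0.68 goals/90
# 90-minute assumption: 90.0 passes/90, 0.90 goals/90
```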

This is a monumental difference. Honestly, I am left speechless. I will not say more in this section. I can't.

The theoretical super sub scoring 2 goals…

Does 0.4532 goals/90 make sense?

When you have let this sink in, keep in mind that all the data providers tend to use at least two decimals when displaying metrics. Using two decimals communicates extremely high precision in the underlying data. Do they really have that, given what we have just found?

To assume that a match is 90 minutes long when the truth is closer to 100, yet still insist on specifying all "per match" metrics to two decimals, feels incredibly unserious. As long as the error margin on minutes played is much larger than that of the other part of the calculation, even for a high-volume metric like passes, the discussion should rather be whether these metrics should carry decimals at all, or be rounded to the nearest integer.

Also, a very precise number is often given for a player's playing time in a certain season. Let's say you claim that a player has played 1603 minutes. To me this signals minute-level precision in the minutes played metric. But if the actual minutes played (stoppage time in both halves included) are in fact 1741, a better way of communicating the uncertainty would be to use the approximately-equal sign: ≈ 1600.
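Communicating that uncertainty is a one-liner: round the reported total to a precision that matches the real error margin (a sketch using the 1603-minute example from the text):

```python
minutes_reported = 1603   # database figure, built on the 90-minute assumption
minutes_actual = 1741     # with stoppage time in both halves included

# The ~140-minute gap dwarfs minute-level precision, so round to
# the nearest hundred and flag the number as approximate.
print(f"\u2248 {round(minutes_reported, -2)}")  # -> ≈ 1600
```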

How should minutes played be calculated?

So far we have defined time as it is displayed on the official game clock. This means that all players accumulate minutes played during all stoppages (ball out of play, injuries, substitutions, etc.).

The biggest problem with this, as I see it, is that players on different teams have more or less actual time available in games to do things with. The best way to visualize this is to use effective time as a metric.

Here we look at the average game time for teams in the Premier League last season.

We see that possession-oriented teams in general tend to have more effective game time. When we calculate effective game time, we stop the clock at every stoppage (ball out of play, injuries, substitutions, etc.).
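Stopping the clock at every stoppage amounts to summing only the in-play spans of the game. A minimal sketch, assuming a simplified, hypothetical event feed of (clock second, state) transitions; real providers use richer event data:

```python
def effective_seconds(events):
    """events: list of (timestamp_s, state) transitions sorted by time,
    where state is 'in_play' or 'stopped'. Returns total in-play seconds."""
    total = 0
    last_t, last_state = None, None
    for t, state in events:
        if last_state == "in_play":
            total += t - last_t  # clock only runs while the ball is in play
        last_t, last_state = t, state
    return total

# One hypothetical half: play, a one-minute stoppage, more play, etc.
half = [(0, "in_play"), (600, "stopped"), (660, "in_play"),
        (2700, "stopped"), (2760, "in_play"), (3300, "stopped")]
print(effective_seconds(half) / 60)  # -> 53.0 effective minutes
```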

So what does this say?

Well, in general you could argue that Erling Haaland has more time available per game for scoring his goals than Elijah Adebayo, the top goalscorer at Luton.

Effective time

And this becomes very clear when we use effective time to calculate time played instead of the running clock. Tick tock, tick tock…

To me - using effective time is way more fair.

The solution

We never try to force our customers into one single metric or a single solution. We always want to provide options that go hand in hand with how you want to work. Therefore we give our customers their own choice on how to solve the problem of time relativity.

At PlaymakerAI you can choose whether minutes played are calculated using effective time or not, and whether the per-game rate is calculated using 90 minutes or the average game time of the current dataset.
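The second choice, which baseline to normalize to, is just a parameter. A sketch (the function name and the 2816-minute/1000-pass figures are the illustrative numbers from earlier in the article; 101 minutes is the PL 23/24 average quoted above):

```python
def per_game_rate(actions: float, minutes: float, baseline: float = 90.0) -> float:
    """Per-game rate with a configurable baseline: the classic 90,
    or the average game length of the dataset."""
    return actions / minutes * baseline

print(round(per_game_rate(1000, 2816, 90), 1))   # per 90 -> 32.0
print(round(per_game_rate(1000, 2816, 101), 1))  # per average PL game -> 35.9
```

Both answers are defensible; the point is that the choice should be explicit rather than baked silently into the data.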

Conclusion

In conclusion, the exploration of time in football through the lens of Einstein's Theory of Relativity has led us to question and redefine one of the key concepts used in football analytics. The evidence presented challenges the long-standing assumption that a game lasts precisely 90 minutes, revealing a significant discrepancy in how we calculate and interpret player statistics. This discrepancy underscores the need for a more accurate and adaptable metric that reflects the true nature of the game.

Ola Lidmark Eriksson
Founder of PlaymakerAI
