Alfie's suggestion about calculating over rates with DRS delays included is a sound one. It feels a little bizarre that the true over rate (not the one Sky uses) removes drinks breaks and falls of wickets from the equation, but not DRS breaks, which probably contribute as much lost time, if not more. This would be an easy stat to compute.
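For what it's worth, here is a minimal sketch of what that calculation might look like. The stoppage durations below are illustrative assumptions, not real data; in practice you'd pull them from ball-by-ball records or the match officials' timings.

```python
# Rough sketch of a "true" over rate that treats DRS reviews like any other stoppage.
# All inputs are assumptions for illustration -- the real numbers would come from
# ball-by-ball data or the match officials' timings.

def true_over_rate(overs_bowled: float,
                   playing_minutes: float,
                   drinks_minutes: float,
                   wicket_minutes: float,
                   drs_minutes: float) -> float:
    """Overs per hour after removing drinks breaks, wicket stoppages and DRS delays."""
    effective_minutes = playing_minutes - drinks_minutes - wicket_minutes - drs_minutes
    return overs_bowled / (effective_minutes / 60)

# Hypothetical day: 87 overs in 6 hours of scheduled play, with made-up stoppage times.
print(round(true_over_rate(87, 360, 12, 20, 10), 2))   # ~16.42 overs/hour vs 14.5 raw
```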
BUT......
The real key to this topic is the false assumption that over rates have declined, which the evidence does not support; it is something of an unprovable nostalgic myth. It's impossible to track "over rates" through history as a pure stat, but if you isolate the number of overs bowled per match, and take into account that the requirement of six hours' play per day has been consistent in the rules, you actually find a slight and gradual increase in the number of overs bowled per Test since 1980.
The number did decline significantly at that point, but there is a crucial difference: the six-ball over was implemented across all countries in the winter of 1979-80, along with the every-session drinks break. Roughly 90 minutes per match was therefore instantly lost to drinks breaks, while the drop in overs per match equated to roughly 30 overs. It seems natural to assume the extra five or so overs lost, beyond the time taken up by drinks breaks, could be accounted for by fewer bowling changes and changes of ends... which in practice only made a difference of about one over per day, give or take a decimal. Over rates before 1979 are calculated by dividing the number of balls bowled by six, so of course eight-ball overs, with fewer changes of end, will come out quicker; it's just natural. Looking at the last Ashes series played in the late 70s in Australia, and very roughly calculating it from the proportion of the total score reached at the end of each day's play, you get nowhere near 90 overs a day. The estimate comes out at about 60-65.
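To make that conversion concrete, here's the arithmetic with made-up numbers (the 68-over day is purely hypothetical): counting balls and dividing by six makes an eight-ball-over day look like 90-plus overs, even though far fewer changes of end actually happened.

```python
# Illustrative conversion: pre-1979 "over rates" are just balls bowled divided by 6,
# so a day of 8-ball overs looks inflated against a modern 6-ball benchmark.
# The figures here are made up purely to show the arithmetic.

balls_per_over_old = 8
overs_bowled_old = 68                      # hypothetical day of 8-ball overs
balls = overs_bowled_old * balls_per_over_old

six_ball_equivalent = balls / 6            # 544 / 6 ~= 90.7 "overs"
changes_of_end = overs_bowled_old          # only 68 actual changes of end, not 90+

print(round(six_ball_equivalent, 1), changes_of_end)
```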
You could still make the case that over rates were higher going back to the Bradman era and then past WWII into the 50s, 60s and 70s... why could that be? Well, there are two possible reasons you could isolate.
Run rates - The short answer is that the data shows a very clear link between over rates and scoring rates.
The 1940s and 1980s are comparable in overs bowled per game, and the 1940s is the era in Test cricket with the highest batting averages, well ahead of its time in scoring rates. This changed abruptly as the 1950s-70s produced the slowest scoring rates of any era, slowest at the start of the period and gradually quickening... and over rates pretty much mirror this. In the 50s they were rattling off overs better than ever, but with scoring rates of only about 35 runs per 100 balls. Over the 30-year period after 1950, scoring rates gradually rise, and there is a proportional and consistent drop in over rates to match.
The statistical outliers in the data set are all post-2000. The increase in scoring rates is not matched by the drop in over rates that the rest of the historical data would predict. So interestingly, despite the very obvious link between high scoring and the fact that a ball constantly being hit to the boundary takes longer to retrieve than one that hits the wicketkeeper's gloves and is instantly sent back, this seems to be made up for in the modern day by an inclination to get on with it.
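If someone wanted to test that link properly, a simple correlation across era averages would be the obvious check. The sketch below uses placeholder decade figures only, so the output means nothing until real aggregates from a ball-by-ball database are plugged in.

```python
# Minimal sketch of the scoring-rate / over-rate relationship described above.
# The decade figures are placeholders, not real data.

from statistics import correlation   # Python 3.10+

decades       = ["1950s", "1960s", "1970s", "1980s", "1990s", "2000s", "2010s"]
runs_per_100b = [35.0, 38.0, 41.0, 43.0, 45.0, 50.0, 52.0]   # placeholder scoring rates
overs_per_day = [95.0, 92.0, 90.0, 88.0, 87.0, 86.0, 85.0]   # placeholder over rates

# A strongly negative coefficient would support the "higher scoring, slower overs" link.
print(round(correlation(runs_per_100b, overs_per_day), 3))
```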
Left-handed batters - Another interesting statistical narrative you find in the data set: the increased proportion of left-handers seems to affect over rates. It goes without saying that if you have a left-hander batting with a right-hander, every single necessitates a complete mirroring of the field, with players in the outfield having to make up potentially 50 metres of ground between every ball. Eras with a higher percentage of left-handers also seem to show lower over rates.