There’s a common idea that you can translate the age of a dog into the equivalent age of a human by multiplying by 7. It’s easily disprovable at a basic level, inasmuch as different breeds of dogs tend to live for different numbers of years. I chanced upon a preprint that suggests a linear “dog years multiplied by X” rule isn’t right either.
Something closer to reality turns out to involve slightly more complicated math. The researchers used Labrador retrievers as the reference dog and, by comparing methylomes between species, determined that the relationship between human and dog ages is better expressed as:
human age = 16ln(dog age) + 31
The ln there represents taking the natural logarithm of a number, something any scientific calculator or spreadsheet can compute.
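To make the formula concrete, here's a small Python sketch (the function name and input checking are my own, not from the preprint):

```python
import math

def dog_to_human_age(dog_age_years: float) -> float:
    """Estimate a dog's human-equivalent age using the preprint's
    logarithmic relationship: human age = 16 * ln(dog age) + 31."""
    if dog_age_years <= 0:
        raise ValueError("dog age must be a positive number of years")
    return 16 * math.log(dog_age_years) + 31

# A 1-year-old dog maps to 31 human years (ln(1) = 0),
# and a 12-year-old dog to roughly 71 human years.
print(dog_to_human_age(1))   # → 31.0
print(round(dog_to_human_age(12)))  # → 71
```

Note that because the curve is logarithmic, a dog's equivalent age rises steeply in its first couple of years and then flattens out, which is quite different from the constant slope of the multiply-by-7 rule.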
One advantage of this non-linear view is that it allows certain physiological milestones to line up much better - for example, the typical ages at which puppies and humans develop teeth, as well as correspondingly similar average overall lifespans.