There must be some effect of fps_max on how long things take to happen. Based on the manual I thought that was the case as well, but it seems something else is at work.
I did a little experiment, and now I'm confused. If one tick is 1/16 of a second, then changing fps_max should have no effect on how fast something falls. To test this, I had a model fall from the sky every time I pressed the space bar, incrementing fps_max by 10 with each drop.
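In other words, my expectation was that the wall-clock fall time is just the tick count multiplied by the tick interval, independent of fps_max. Here's a quick sketch of that arithmetic (the tick count is a made-up placeholder, not something I actually measured; 13 ticks just happens to land near the ~0.8 s I saw at the start):

[code]
# Sanity check of my assumption: if a tick is always 1/16 s,
# the wall-clock fall time should be (ticks taken) * (1/16),
# no matter what fps_max is set to.
TICK_INTERVAL = 1.0 / 16.0   # seconds per tick, as I understood the manual

ticks_taken = 13             # placeholder tick count, not a measurement
expected_time = ticks_taken * TICK_INTERVAL

print(f"Expected fall time: {expected_time:.2f} s, regardless of fps_max")
[/code]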
I measured the duration using "time" and "time_physics", and the relationship between the two changed as I changed fps_max. I also timed the actual event and found that the real time taken to fall the distance did increase: it got as high as 2.4 s for a drop that started out taking about 0.8 s, before I stopped. However, the number of ticks taken stayed roughly constant throughout. Here's a graph of what I found:
ticksgraph.bmp
Essentially, the graph shows a mostly linear relationship between ticks passed and time passed, but the gradient of that line is governed by fps_max.
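To put a rough number on that gradient, this is how I'd turn each run into a seconds-per-tick figure. The fps_max values and tick counts below are placeholders for illustration; only the 0.8 s and 2.4 s fall times come from my actual runs:

[code]
# Placeholder data: only the 0.8 s and 2.4 s fall times are from my runs;
# the fps_max values and tick counts are made up for illustration.
measurements = {
    # fps_max: (wall-clock fall time in s, ticks counted during the fall)
    30:  (0.8, 13),
    150: (2.4, 13),
}

for fps_max, (fall_time, ticks) in sorted(measurements.items()):
    secs_per_tick = fall_time / ticks   # gradient of the time-vs-ticks line
    print(f"fps_max={fps_max}: {secs_per_tick:.3f} s per tick "
          f"({ticks / fall_time:.1f} ticks per second)")
[/code]

If ticks stay constant while wall time grows, seconds-per-tick grows with it, which is exactly what the changing gradient in the graph suggests.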
So why does fps_max affect the number of ticks in a second? Does anyone know exactly what the relationship is?
P.S. Is there a way to make the image come up on this post? I'm new at this and I couldn't make the "image" UBB code work.