Is Time.deltaTime dependent on computer speed?

Sorry for the rookie question, I read some posts and I didn't really understand it. The Unity documentation says that Time.deltaTime is the time in seconds it took to complete the last frame. Doesn't that mean Time.deltaTime will change depending on the computer's speed? Faster computers should be able to process more frames per second, so Time.deltaTime on faster computers will be smaller, right? I actually tried this on two computers and I did find a difference. If I'm right, what constant would I use so that it's the same on all computers? Thanks.

Time.deltaTime is the time in seconds since the last frame finished being drawn. Depending on the computer, this value will either get smaller (faster computers) or bigger (slower computers). For example, if your computer runs at 24 FPS (frames per second), then Time.deltaTime would be roughly 1 second / 24 frames ≈ 0.042 seconds, i.e. about 42 ms per frame.
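
If you want to see that relationship on your own machine, you can log the value each frame. This is just a minimal sketch; the component name DeltaTimeLogger is an illustrative placeholder, not anything built into Unity:

```csharp
using UnityEngine;

// Logs the previous frame's delta time and the frame rate derived from it.
// Attach to any GameObject in the scene to watch the values in the Console.
public class DeltaTimeLogger : MonoBehaviour
{
    void Update()
    {
        float dt = Time.deltaTime;     // seconds taken by the previous frame
        float approxFps = 1f / dt;     // e.g. dt ≈ 0.042 s corresponds to ~24 FPS
        Debug.Log($"deltaTime: {dt:F4} s, approx FPS: {approxFps:F1}");
    }
}
```

On a faster machine you should see a smaller deltaTime and a higher FPS, which is exactly the difference you noticed between your two computers.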

You are supposed to use Time.deltaTime in scripts that involve movement over time (physics, motion, etc.) and user mouse input (turning around, etc.).

You would need to post how and in what situation you want to use that constant to get a more specific answer.

Let me explain it with an example: you want to move something at a specific speed. You can do this in the Update() function. Update() gets called once per frame, so if you move your object a fixed distance every frame, it will move faster on faster computers. If you instead multiply the object's speed by Time.deltaTime, the object will move at the same speed on every computer.
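
As a concrete sketch of that idea (the class name Mover and the speed field are just placeholder names for illustration):

```csharp
using UnityEngine;

// Minimal sketch of frame-rate-independent movement.
public class Mover : MonoBehaviour
{
    public float speed = 5f; // units per second, regardless of frame rate

    void Update()
    {
        // Moving by "speed" alone would be units per frame, so faster machines
        // would cover more distance. Multiplying by Time.deltaTime converts it
        // into units per second, which is the same on every computer.
        transform.Translate(Vector3.forward * speed * Time.deltaTime);
    }
}
```

With this, the object covers 5 units per second whether the game runs at 30 or 300 FPS: the smaller per-frame step on fast machines is balanced out by the larger number of frames.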