Lerping takes 10x longer than it should

Hi all,
I’ve written a generic lerp function for use in my project, but I’ve noticed some odd behaviour.
The lerp seems to take 10x longer than the duration (in seconds) that I’m passing to the function. If I divide seconds by 10 inside the function, the timing comes out as I expect. Can someone explain why this is happening?

//INITIALISATION CODE
function Start () {
	// Move this object from its current position to (0, 0, 10) over 10 seconds
	GlobalFunctions.Lerp(transform, transform.position, Vector3(0, 0, 10), 10);
}

//STATIC FUNCTION IN ANOTHER SCRIPT
static function Lerp (object: Transform, startPos: Vector3, endPos: Vector3, seconds: float)
{
	//seconds = seconds / 10; ---- BY ENABLING THIS THE FUNCTION WORKS AS EXPECTED
	var startTime = Time.time;
	var per: float = 0.0;
	var timeSinceStart: float = 0.0;

	// Runs as a coroutine: one step per frame until the object reaches endPos
	while (object.position != endPos)
	{
		timeSinceStart = Time.time - startTime;
		per = timeSinceStart / seconds;   // fraction of the duration elapsed (0..1)
		Debug.Log(timeSinceStart);
		object.position = Vector3.Lerp(startPos, endPos, per);
		yield;   // wait for the next frame
	}
}

Ahh, I’m not sure what you mean by it taking 10x longer than it should. This example lasts 10 seconds, exactly as it should: per only reaches 1 (and the object only reaches endPos) once timeSinceStart equals seconds, which you’re passing in as 10. Do you mean you want it to last 1 second? If so, why not change the seconds argument from 10 to 1?
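
If it helps, here’s a minimal sketch of the same idea (untested UnityScript; the name LerpTimed is just for illustration) that loops on elapsed time instead of comparing positions, so the move lasts exactly the number of seconds you pass in:

//ALTERNATIVE SKETCH: LOOP ON ELAPSED TIME, NOT POSITION EQUALITY
static function LerpTimed (object: Transform, startPos: Vector3, endPos: Vector3, seconds: float)
{
	var startTime = Time.time;
	var per: float = 0.0;

	while (per < 1.0)
	{
		per = (Time.time - startTime) / seconds;
		object.position = Vector3.Lerp(startPos, endPos, per);   // Lerp clamps per to [0, 1]
		yield;   // wait for the next frame
	}
	object.position = endPos;   // snap to the exact end position
}

// Called the same way as before, e.g. for a 1-second move:
// GlobalFunctions.LerpTimed(transform, transform.position, Vector3(0, 0, 10), 1);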