gstutils: Fix linear regression comparison

The check for dropping precision was wrong when sxx and syy were negative:
in that case "G_MAXINT64 - val" would always overflow.

The check was meant to use G_MININT64 (like in the loop contained just
after).
Edward Hervey 2017-11-24 12:05:26 +01:00 committed by Edward Hervey
parent 3afc575062
commit 741ff6a371

@@ -4368,7 +4368,7 @@ gst_calculate_linear_regression (const GstClockTime * xy,
         tmp /= 4;
       } while (G_MAXINT64 - sxx <= tmp);
       break;
-    } else if (G_UNLIKELY (tmp < 0 && sxx < 0 && (G_MAXINT64 - sxx >= tmp))) {
+    } else if (G_UNLIKELY (tmp < 0 && sxx < 0 && (G_MININT64 - sxx >= tmp))) {
       do {
         /* Drop some precision and restart */
         pshift++;
@@ -4387,7 +4387,7 @@ gst_calculate_linear_regression (const GstClockTime * xy,
         tmp /= 4;
       } while (G_MAXINT64 - syy <= tmp);
       break;
-    } else if (G_UNLIKELY (tmp < 0 && syy < 0 && (G_MAXINT64 - syy >= tmp))) {
+    } else if (G_UNLIKELY (tmp < 0 && syy < 0 && (G_MININT64 - syy >= tmp))) {
       do {
         pshift++;
         syy /= 4;