The row/column bounds (for a screen window box) are changed from
'offset one' to 'offset zero' and bound to the screen size using:

    v->xs = min_t(u16, v->xs - 1, vc->vc_cols - 1);

This has the side effect of converting zero to the limit.

A check I'm adding to min_t() reports that (u16)(v->xs - 1) (etc.)
discards significant bits (because v->xs is promoted to 'int' before
the subtraction). If v->xs is zero (it comes from userspace) it
converts -1 to 0xffff. This is then bounded to 'vc->vc_cols - 1',
which will be fine.

Replace with:

    v->xs = umin(v->xs - 1, vc->vc_cols - 1);

which again converts a -1 to unsigned - this time to 0xffffffff, with
the same overall effect.

Whether zero is meant to mean the 'maximum size' is unknown. I can't
find any documentation for the ioctl and it pre-dates git.

Detected by an extra check added to min_t().

Signed-off-by: David Laight <david.laight.linux@gmail.com>
Reviewed-by: Jiri Slaby <jirislaby@kernel.org>
Link: https://patch.msgid.link/20251119224140.8616-28-david.laight.linux@gmail.com
Signed-off-by: Greg Kroah-Hartman <gregkh@linuxfoundation.org>
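[Editor's sketch] A minimal standalone C program, assuming xs = 0 and
cols = 80 as example values, illustrating the promotion and wrapping
behaviour described above. The kernel's min_t() and umin() macros are
not used here; their comparisons are written out by hand, so this is a
demonstration of the arithmetic only, not the kernel implementation:

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        uint16_t xs = 0;         /* user-supplied value, worst case: zero */
        unsigned int cols = 80;  /* stands in for vc->vc_cols */

        /* xs is promoted to int before the subtraction, so this is -1. */
        int promoted = xs - 1;

        /* min_t(u16, ...) casts both sides to u16: -1 becomes 0xffff,
         * discarding the upper (significant) bits of the int. */
        uint16_t a16 = (uint16_t)promoted;
        uint16_t b16 = (uint16_t)(cols - 1);
        uint16_t min_t_result = a16 < b16 ? a16 : b16;

        /* umin() compares as unsigned: -1 becomes 0xffffffff instead. */
        unsigned int au = (unsigned int)promoted;
        unsigned int umin_result = au < cols - 1 ? au : cols - 1;

        /* The wrapped values differ (0xffff vs 0xffffffff), but both
         * clamp to the same bound, cols - 1, so the overall effect of
         * the replacement is unchanged. */
        printf("min_t: %u, umin: %u\n", min_t_result, umin_result);
        return 0;
    }

Both lines print 79 for these inputs, matching the commit's claim that
the change preserves behaviour while avoiding the truncating cast.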