Integer division in Java [duplicate]

This feels like a stupid question, but I can't find the answer anywhere in the Java documentation. If I declare two ints and then divide them, what exactly is happening? Are they converted to floats/doubles first, divided, then cast back to an integer, or is the division "done" as integers?

Also, purely from experimentation, integer division seems to round the answer towards zero (i.e. 3/2 = 1 and -3/2 = -1). Am I right in believing this?



Solution 1:[1]

They are divided using integer arithmetic; there is no intermediate conversion to float or double. Dividing integer a by integer b tells you how many times b fits into a, and a % b gives you the remainder of that division, so (a / b) * b + a % b == a always holds.
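For illustration, a minimal sketch (the class and variable names are placeholders) verifying the quotient/remainder identity:

```java
public class IntDivisionDemo {
    public static void main(String[] args) {
        int a = 7;
        int b = 2;

        int quotient = a / b;   // 3: how many times b fits into a
        int remainder = a % b;  // 1: what is left over

        // The identity (a / b) * b + a % b == a holds for any int a and nonzero int b
        System.out.println(quotient * b + remainder == a); // prints: true
    }
}
```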

Solution 2:[2]

Java does autoconvert types in some situations:

"It autoconverts ints to doubles. It autoconverts shorts and bytes to ints even when no ints are involved, requiring constant annoying casts when you want to do short or byte arithmetic. It autoconverts primitives to wrappers and vice versa for boxing and autoboxing." - user2357112

What Java never does is narrow a type without an explicit cast: widening conversions such as int to double happen automatically, but converting in the other direction requires a cast you write yourself.

Still, integer / integer = integer: when both operands are ints, the division itself is carried out as integer division, regardless of what you assign the result to.
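For illustration, a minimal sketch (the class name is a placeholder) showing that the operand types, not the destination type, decide which kind of division runs:

```java
public class DivisionTypesDemo {
    public static void main(String[] args) {
        int a = 3;
        int b = 2;

        double d1 = a / b;          // 1.0: int / int runs first, then the int result widens
        double d2 = (double) a / b; // 1.5: casting one operand forces floating-point division

        System.out.println(d1);    // prints: 1.0
        System.out.println(d2);    // prints: 1.5

        int back = (int) d2;       // narrowing back to int needs an explicit cast
        System.out.println(back);  // prints: 1
    }
}
```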

Integer division also always truncates the result toward zero: the fractional part is discarded, not rounded. So even if the exact quotient were 0.999999, the integer division would still return 0. This matches the questioner's observation that 3 / 2 is 1 and -3 / 2 is -1.
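A short sketch confirming the round-toward-zero behavior; when you want rounding toward negative infinity instead, Math.floorDiv (available since Java 8) is the standard-library alternative:

```java
public class TruncationDemo {
    public static void main(String[] args) {
        System.out.println(3 / 2);                // prints: 1  (rounded toward zero)
        System.out.println(-3 / 2);               // prints: -1 (also toward zero, not -2)
        System.out.println(7 / 8);                // prints: 0  (0.875 truncates to 0)

        // Math.floorDiv rounds toward negative infinity; floorMod is the matching remainder
        System.out.println(Math.floorDiv(-3, 2)); // prints: -2
        System.out.println(Math.floorMod(-3, 2)); // prints: 1
    }
}
```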

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Andrei Bardyshev
Solution 2: Community