Given an integer range [A,B],
- What is the probability of getting a 1-bit if we first randomly choose a number x from the range and then randomly choose a bit of x?
- What is the expected number of 1-bits if we randomly choose a number x from the range? (See the formulas below.)
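Equivalently, treating both choices as uniform (the statement only says "randomly", so uniformity is an assumption) and writing popcount(x) for the number of 1-bits of x and bitlen(x) for the number of bits in the binary representation of x, the two quantities are

P = \frac{1}{B-A+1} \sum_{x=A}^{B} \frac{\mathrm{popcount}(x)}{\mathrm{bitlen}(x)}, \qquad E = \frac{1}{B-A+1} \sum_{x=A}^{B} \mathrm{popcount}(x).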
Input Format
The first line of input is the number of test cases
Each test case is a line containing 2 integers A and B separated by a space.
Output Format
For each test case, output a line containing 2 floating-point numbers separated by a space. The first is the probability and the second is the expected number. Output each number accurate to 5 digits after the decimal point.
Constraints
Sample Input
1
2 4
Sample Output
0.61111 1.33333
Explanation
The numbers in the range are 2 = (10), 3 = (11), 4 = (100).
(1) The probability of picking a 1-bit is (1/2 + 2/2 + 1/3) / 3 = 11/18 ≈ 0.61111.
(2) The expected number of 1-bits is (1 + 2 + 1) / 3 = 4/3 ≈ 1.33333.
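The sample can be reproduced with a brute-force sketch like the one below (my own illustration, not part of the original statement). It assumes A >= 1 and a range small enough to enumerate directly, since no constraints are given above; for very large ranges one would instead count 1-bits per bit position with a prefix-counting (digit-DP style) argument.

# Brute-force sketch: enumerate every x in [A, B] and average
# popcount(x)/bitlen(x) and popcount(x) over the range.
def solve(a, b):
    n = b - a + 1
    prob_sum = 0.0  # sum over x of popcount(x) / bitlen(x)
    ones_sum = 0    # sum over x of popcount(x)
    for x in range(a, b + 1):
        bits = x.bit_length()      # length of x's binary representation
        pop = bin(x).count("1")    # number of 1-bits in x
        prob_sum += pop / bits
        ones_sum += pop
    return prob_sum / n, ones_sum / n

if __name__ == "__main__":
    t = int(input())
    for _ in range(t):
        a, b = map(int, input().split())
        p, e = solve(a, b)
        print(f"{p:.5f} {e:.5f}")

Running it on the sample input (1 test case, range 2 4) prints 0.61111 1.33333, matching the sample output.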