JirR02 88e0b5ed69 Converted everything to orgmode
converted everything to orgmode and added solution to the README files
2025-03-31 08:40:43 +02:00

Task 3b: Fibonacci overflow check

Background

Fibonacci numbers are the integers in the following sequence: \(0, 1, 1, 2, 3, 5, 8, 13, 21, ...\), where each number is the sum of the two previous numbers.

Task

Fibonacci numbers grow fast and can thus easily exceed the value range of a 32-bit int. Think of a general way to check whether the result of an addition would exceed the range of a 32-bit int (i.e., overflow) without actually performing the addition that causes the overflow.

Remember that we consider signed integers. Because half of the representable values are negative, only 31 bits remain to store the magnitude of a non-negative value.

Write a program that asks the user for an integer \(n\) and then prints the first \(n\) Fibonacci numbers, each number on a new line. Use an int (we assume 32 bits, including the sign bit) to represent the current Fibonacci number. Most importantly: exit the print loop as soon as you detect that an overflow would occur.

Finally, again on a new line, output the count \(c\) of Fibonacci numbers previously printed, and the initial input \(n\) from the user, in the format: c of n.

Example:

Let's (wrongly!) assume that \(5\) cannot be represented using a 32-bit int. This would mean that \(3\) is the largest representable Fibonacci number. If your program is asked to print the first \(4\) Fibonacci numbers, the output should look as follows:

0
1
1
2
Printed 4 of 4 Fibonacci numbers

If you instead ask it to print the first 100 Fibonacci numbers the output should look as follows:

0
1
1
2
3
Printed 5 of 100 Fibonacci numbers

Important: using anything other than int (e.g., unsigned int, floating-point numbers, long, or long double) is forbidden.

Restrictions:

  • The program must not rely on the knowledge of its final result. In particular, it is not allowed to hard-code

    • the largest 32-bit Fibonacci number, or
    • the number of digits that it has, or
    • the total number of Fibonacci numbers representable with a 32-bit int
  • Most importantly: do not perform additions that cause an overflow on a 32-bit int.

Note: It is straightforward to compute the largest (signed) integer representable with 32 bits. You are also explicitly allowed to hard-code this value in your program.


Warning: The autograder does not detect whether an addition causes an overflow or whether you do anything else that is disallowed in the "Restrictions" section above, but you will receive 0 points if your TA catches it during correction.

Mistakes

  • The variable j went into overflow, which is not allowed! The check max - a < b tested a and b before the rotation, but the addition actually performed afterwards was j = a + b with the updated values (the old b and j), so the wrong operands were checked. The corrected version below prints the current number first and then checks max - b < j, which tests exactly the operands of the next addition.

Solution

#include <iostream>

int main() {
  int a = 0;            // Scratch variable for the rotation below
  int b = 1;            // Previous Fibonacci number (seeded with F(-1) = 1)
  int j = 0;            // Current Fibonacci number
  int c = 0;            // Count of Fibonacci numbers printed
  int max = 2147483647; // Largest signed 32-bit int (hard-coding this is allowed)
  int n;                // Input integer

  std::cin >> n;

  for (int i = 0; i < n; ++i) {
    std::cout << j << "\n";
    ++c;
    // The next number is computed as j = b + j (with the current values).
    // It would overflow exactly when b + j > max, i.e. when max - b < j;
    // the subtraction is safe because b is non-negative.
    if (max - b < j) {
      break;
    }
    a = b;
    b = j;
    j = a + b;
  }
  std::cout << c << " of " << n << "\n"; // End message
}

-----

Made by JirR02 in Switzerland 🇨🇭