Why is Counting Down Considered Less Work in Recursive Functions?

In summary, the professor claims that counting up instead of down would be more work, but the benefit of counting down is not clear to the original poster. The replies argue that the number of recursive calls is the same either way, and that any practical difference comes down to how the upper limit is passed to the function. A side discussion also notes that variables should be initialized before they are used.
  • #1
JimmyK
In class, we're learning about recursive functions. One example given was figuring out 1 + 2 + 3 + 4 + 5 + 6 + 7 + 8 + 9 + 10. My professor stated that to address this, we start with 10 in the function and decrement for the next recursive call, so 10 + f(9). He said going the other way, 1 + f(2), would be considered more work, and that a recursive function should do the least amount of work possible. I don't get why this is considered more work. It would be the same number of steps either way. What's the benefit of counting down instead of up in terms of the workload? Thanks.
 
  • #2
http://ideone.com/4q4wC
 
  • #3
There's no benefit to adding from right to left rather than left to right, assuming the "work" to add two numbers is always the same.

Xitami, there are issues with the code you posted.

Anyway, here is code that demonstrates that it takes the same number of recursive steps and arrives at the same result either way.

http://ideone.com/MFJth
 
  • #4
The overhead is the same for a fixed input, but the decrementing function is more flexible since the max value is specified as an input instead of a define or global variable.

Code:
#include <stdio.h>

static int ld, lu, i;    /* call counters for fd/fu and the upper limit for fu */

int fd(int n){           /* sums n..0, counting down */
    ld++;
    if(n>0)return n+fd(n-1);
    return 0;}

int fu(int n){           /* sums n..i, counting up to the global limit i */
    lu++;
    if(n<i)return n+fu(n+1);
    return i;}

int main(){
    ld = lu = 0;
    for(i = 0; i <= 10; i++){
        printf("%d ",fd(i));
        printf("%d ",fu(0));}
    printf("\n%d %d\n", lu, ld);
    return 0;}

output:
Code:
0 0 1 1 3 3 6 6 10 10 15 15 21 21 28 28 36 36 45 45 55 55 
66 66
 
  • #5
retracted
 
  • #6
MisterX said:
using a for loop to do iterative addition when the topic is about finding a sum with recursion & using recursive incrementation to implement addition.
Unless someone deleted a post, none of the posts in this thread include an example of summation via a for loop. The for loop in my post was used to invoke the recursive summation routines 11 times, to run the summation for 0->0 to 0->10 cases. The incrementing done in the recursive functions was done to compare the overhead (how many times each function was called); the 11 cases resulted in a total of 66 instances of each function being invoked.

Getting back to the original post: in the case of the recursive function that increments the value on each call, the maximum value could be passed as a parameter instead of using a global. If that parameter is passed on the stack for each call, it increases the overhead. If it is simply kept in the same register for all instances of the function, then there is no significant overhead. (Register-based parameters are limited to two with the Microsoft 32-bit compiler, which requires the fastcall (/Gr) option, and to four with the Microsoft 64-bit compiler, where fastcall is the default.)
 
  • #7
rcgldr said:
Unless someone deleted a post, none of the posts in this thread include an example of summation via a for loop. The for loop in my post was used to invoke the recursive summation routines 11 times, to run the summation for 0->0 to 0->10 cases.

Sorry, that wasn't clear to me (each time you call one of the functions, it requires i+1 recursive steps, and since you were summing these step counts for i from 0 to 10, getting lu and ld is similar to solving the OP's problem).

Please initialize variables before using them in C or C++.
 
  • #8
MisterX said:
Please initialize variables before using them in C or C++.
I missed that when modifying Xitami's code to show that the number of instances was the same for both functions. Normally I declare those types of variables as static, which initializes them to zero. I updated my previous post to properly initialize them.
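
For reference, a minimal sketch of that point (variable names are mine): objects with static storage duration are zero-initialized before the program starts, while automatic locals hold indeterminate values until they are assigned.

Code:
#include <stdio.h>

static int counter;        /* static storage duration: guaranteed to start at 0 */

int main(){
    int local = 0;         /* automatic storage: indeterminate unless initialized */
    printf("%d %d\n", counter, local);   /* prints 0 0 */
    return 0;}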
 
  • #9
JimmyK said:
In class, we're learning about recursive functions. One example given was figuring out 1 + 2 + 3 + 4 + 5 + 6 + 7 + 8 + 9 + 10. My professor stated that to address this, we start with 10 in the function and decrement for the next recursive call, so 10 + f(9). He said going the other way, 1 + f(2), would be considered more work, and that a recursive function should do the least amount of work possible. I don't get why this is considered more work. It would be the same number of steps either way. What's the benefit of counting down instead of up in terms of the workload? Thanks.

Hey Jimmyk and welcome to the forums.

For this example it should not matter: there will be the same number of evaluations no matter which way you go. It might have been more intuitive to go left to right, because that feels natural; it's the order we use all the time (mathematical formulas, written words, and so on).

The comment about doing the minimal amount of computation is probably the most important. In this specific example there is a one- or two-line calculation that will figure out the answer [(n+1)n/2, I think it is], so in this particular case you don't need recursion at all. But many things out there do require it, since many problems don't simplify like your example does, and recursion provides a really powerful tool.
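
To make that concrete, here is a small sketch (the name rsum is mine) comparing the recursive sum with the closed form n(n+1)/2:

Code:
#include <stdio.h>

/* recursive sum of 0..n, as discussed in this thread */
int rsum(int n){
    if(n > 0)return n + rsum(n - 1);
    return 0;}

int main(){
    int n = 10;
    printf("%d %d\n", rsum(n), n*(n+1)/2);   /* prints 55 55 */
    return 0;}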

With regard to doing recursion problems in the future, it can be handy to think of your data in terms of some kind of "chain". Each recursion step divides up the chain, and how you divide it up can help you decide on your function parameters (a sketch of this idea follows below). It is a personal style comment, but if you end up using common templates like these for all your recursive functions, it will help you understand your own code when you come back to it months or even a year later.
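
One way to picture the "chain" idea in C (a sketch with hypothetical names): each call handles the head of the chain and hands the rest of the chain to the next call.

Code:
#include <stdio.h>
#include <stddef.h>

struct node {              /* one link of the "chain" */
    int value;
    struct node *next;};

int chain_sum(const struct node *head){
    if(head == NULL)return 0;                      /* base case: empty chain */
    return head->value + chain_sum(head->next);}   /* head + sum of the rest */

int main(){
    struct node c = {3, NULL}, b = {2, &c}, a = {1, &b};   /* chain 1 -> 2 -> 3 */
    printf("%d\n", chain_sum(&a));                 /* prints 6 */
    return 0;}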

Good luck in your learning.
 
  • #10
JimmyK said:
My professor stated to address this, we start with 10 in the function and decrement for the next recursive call, so 10 + f(9). He said going the other way would be considered more work, 1 + f(2) ... What's the benefit of counting down instead of up in terms of the workload?
My guess is that the professor was considering "semi" generic functions where the upper value is an input while the lower value is fixed (zero in this case). The decrement version only needs one parameter, while the increment version needs two:

Code:
int rsumdec(int i){
    if(i>0)return i+rsumdec(i-1);
    return i;} 

// versus

int rsuminc(int i, int n){
    if(i<n)return i+rsuminc(i+1, n);
    return i;}

However, if the compiled code for rsuminc() leaves n in a register as opposed to storing a copy of it on the stack for each instance of rsuminc(), then there's no additional overhead. If the summation is more generic, where both the lower and upper input parameters are variable, then both functions would need two input parameters, it wouldn't matter whether you incremented or decremented, and rsumdec() would look like this:

Code:
int rsumdec(int i, int n){
    if(n>i)return n+rsumdec(i, n-1);
    return n;}
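
As a quick sanity check, a small standalone driver (it restates the two generic routines above so it compiles on its own) shows both directions give the same total for 1..10:

Code:
#include <stdio.h>

int rsumdec(int i, int n){
    if(n>i)return n+rsumdec(i, n-1);
    return n;}

int rsuminc(int i, int n){
    if(i<n)return i+rsuminc(i+1, n);
    return i;}

int main(){
    printf("%d %d\n", rsumdec(1, 10), rsuminc(1, 10));   /* prints 55 55 */
    return 0;}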
 

Related to Why is Counting Down Considered Less Work in Recursive Functions?

1. What is recursion?

Recursion is a programming technique where a function calls itself repeatedly until a specific base case is reached. In terms of work, this means a task is broken down into smaller subtasks, which are then solved recursively until the original task is completed.

2. How does recursion break the work down?

Recursion works by breaking a task into smaller subtasks, which are then solved using the same function. This process continues until a base case is reached, at which point the results are combined to solve the original task.
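
As a minimal illustration (names are mine), here is a recursive sum with an explicit base case and a recursive case that combines the subtask's result:

Code:
#include <stdio.h>

int sum(int n){
    if(n == 0)return 0;        /* base case: nothing left to add */
    return n + sum(n - 1);}    /* combine n with the result of the subtask */

int main(){
    printf("%d\n", sum(10));   /* prints 55 */
    return 0;}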

3. What are the benefits of using recursion?

Recursion allows for a more elegant and concise solution to complex problems by breaking them down into smaller, more manageable subtasks. It also helps reduce code duplication and can be more efficient in certain cases.

4. Are there any limitations to using recursion?

One limitation of recursion is that it can lead to a stack overflow if the base case is never reached, resulting in unbounded recursion. Additionally, recursion may not always be the most efficient solution, and it can be harder to debug than an iterative solution.
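
A small sketch of that pitfall (names are mine): if the base case can never be reached, the recursion never stops and eventually exhausts the stack.

Code:
#include <stdio.h>

int sum_to_zero(int n){
    if(n == 0)return 0;               /* base case: only reached when n >= 0 */
    return n + sum_to_zero(n - 1);}   /* for n < 0 this keeps decrementing forever */

int main(){
    printf("%d\n", sum_to_zero(10));  /* fine: prints 55 */
    /* sum_to_zero(-1) would recurse without bound until the stack overflows */
    return 0;}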

5. How can I determine when to use recursion?

Recursion is best used when a problem can be broken down into smaller subproblems that can be solved with the same function. It is also useful when the input data is of variable size or when a more concise solution is desired. However, it is important to consider the potential limitations and the efficiency of recursion before using it.
