Calculate a running total during a for loop - Python
Edit: Below is my working code based on the feedback/answers I received.
This question stems from a previous question I asked while learning Python/CS using open courseware from MIT. --See my previous question here--
I am using the following code to print a list of monthly payments and related values. However, at the end of the loop I need to print a running total of the amount that has been paid over the months.
Original Code
balance = float(raw_input("Outstanding Balance: "))
interestRate = float(raw_input("Interest Rate: "))
minPayRate = float(raw_input("Minimum Monthly Payment Rate: "))

for month in xrange(1, 12+1):
    interestPaid = round(interestRate / 12.0 * balance, 2)
    minPayment = round(minPayRate * balance, 2)
    principalPaid = round(minPayment - interestPaid, 2)
    remainingBalance = round(balance - principalPaid, 2)
    print 'Month: %d' % (month,)
    print 'Minimum monthly payment: %.2f' % (minPayment,)
    print 'Principle paid: %.2f' % (principalPaid,)
    print 'Remaining balance: %.2f' % (remainingBalance,)
    balance = remainingBalance
    if month in xrange(12, 12+1):
        print 'RESULTS'
        print 'Total amount paid: '
        print 'Remaining balance: %.2f' % (remainingBalance,)
The problem is that I have not been able to figure out how to keep a running total of the amounts paid. I tried adding totalPaid = round(interestPaid + principalPaid, 2), but that just gave the total for a single month; I can't seem to get it to keep the value from each month and then add them all up at the end to be printed out.
Also, I know that the resulting amount should be 1131.12.
I have found many examples of doing this when each value is known, via a list, but I can't seem to extrapolate that correctly.
Fixed Code
balance = float(raw_input("Outstanding Balance: "))
interestRate = float(raw_input("Interest Rate: "))
minPayRate = float(raw_input("Minimum Monthly Payment Rate: "))
totalPaid = 0

for month in xrange(1, 12+1):
    interestPaid = round(interestRate / 12.0 * balance, 2)
    minPayment = round(minPayRate * balance, 2)
    principalPaid = round(minPayment - interestPaid, 2)
    remainingBalance = round(balance - principalPaid, 2)
    totalPaid += round(minPayment, 2)
    print 'Month: %d' % (month,)
    print 'Minimum monthly payment: %.2f' % (minPayment,)
    print 'Principle paid: %.2f' % (principalPaid,)
    print 'Remaining balance: %.2f' % (remainingBalance,)
    balance = remainingBalance
    if month in xrange(12, 12+1):
        print 'RESULTS'
        print 'Total amount paid: %.2f' % (totalPaid,)
        print 'Remaining balance: %.2f' % (remainingBalance,)
Before your loop, initialize a variable to accumulate the total:
total_paid = 0
And then, in the body of your loop, add the appropriate amount to it. You can use the += operator to add to an existing variable, e.g.

total_paid += 1

is shorthand for total_paid = total_paid + 1. You don't want to give total_paid a new value each iteration; rather, you want to add to its existing value.
I'm not sure about the specifics of your problem, but this is the general form for accumulating a value as you loop.
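For example, a minimal sketch of that accumulator pattern (the payment amount here is a placeholder, not taken from your program):

total_paid = 0  # initialize the accumulator once, before the loop

for month in xrange(1, 12 + 1):
    payment = 100.0  # placeholder amount; your program computes this each month
    total_paid += payment  # add this month's payment to the running total

print 'Total paid over 12 months: %.2f' % (total_paid,)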
You always make the minimum payment? Just use minPayment instead of figuring out that math again. Keep a running total, then print it out after the loop.
balance = float(raw_input("Outstanding Balance: "))
interestRate = float(raw_input("Interest Rate: "))
minPayRate = float(raw_input("Minimum Monthly Payment Rate: "))
paid = 0

for month in xrange(1, 12+1):
    interestPaid = round(interestRate / 12.0 * balance, 2)
    minPayment = round(minPayRate * balance, 2)
    principalPaid = round(minPayment - interestPaid, 2)
    remainingBalance = round(balance - principalPaid, 2)
    paid += minPayment
    print  # Make the output easier to read.
    print 'Month: %d' % (month,)
    print 'Minimum monthly payment: %.2f' % (minPayment,)
    print 'Principle paid: %.2f' % (principalPaid,)
    print 'Remaining balance: %.2f' % (remainingBalance,)
    balance = remainingBalance

print
print 'RESULTS'
print 'Total amount paid:', paid
print 'Remaining balance: %.2f' % (remainingBalance,)
Also notice that xrange(12, 12+1) contains exactly one value, so you could just check month == 12; but the check simply isn't necessary here, since the summary is printed after the loop.
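A quick throwaway check of that equivalence (not part of the program):

for month in xrange(1, 12+1):
    # Membership in a one-element xrange is just an equality test.
    assert (month in xrange(12, 12+1)) == (month == 12)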
This answer worked for me:
First, create the derived value:

df.loc[0, 'C'] = df.loc[0, 'D']

Then iterate through the remaining rows and fill in the calculated values:

for i in range(1, len(df)):
    df.loc[i, 'C'] = df.loc[i-1, 'C'] * df.loc[i, 'A'] + df.loc[i, 'B']
The result:

Index_Date | A | B | C | D
---|---|---|---|---
2015/01/31 | 10 | 10 | 10 | 10
2015/02/01 | 2 | 3 | 23 | 22
2015/02/02 | 10 | 60 | 290 | 280
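For completeness, a self-contained sketch of that approach; the DataFrame construction below is an assumption, rebuilt from the table above:

import pandas as pd

# Rebuild the example data from the table above (assumed values).
df = pd.DataFrame({
    'A': [10, 2, 10],
    'B': [10, 3, 60],
    'D': [10, 22, 280],
})

# Seed the first derived value, then fill the rest from the recurrence
# C[i] = C[i-1] * A[i] + B[i].
df.loc[0, 'C'] = df.loc[0, 'D']
for i in range(1, len(df)):
    df.loc[i, 'C'] = df.loc[i-1, 'C'] * df.loc[i, 'A'] + df.loc[i, 'B']

print df[['A', 'B', 'C', 'D']]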
You actually have to initialize totalPaid to 0 before the loop and then, inside the loop, do:

totalPaid = round(interestPaid + principalPaid, 2) + totalPaid

Your problem is that you're not accumulating the total; you're just setting a new one on each iteration.
Sounds like you were close. The problem is that you were overwriting the total each time. Try something like this:
totalPaid = totalPaid + round(interestPaid + principalPaid, 2)