There’s a buddy of mine who I chat with frequently on AIM by the name of Josh, and well, one of our favorite activities (besides endless political debates and what-not) is essentially a giant pissing contest between Ruby and Python.
Josh is your typical die-hard Python fan. In his mind, there really isn’t anything that can compare.
Naturally, it’s only my God-given duty to show him how and why I prefer Ruby. :)
One of our recent contests involved the matter of a simple task: mathematically, find and print all numbers between 1 and 100, designating which were divisible by 3, 5, or both. He was the first one to write any code, and here’s what he came up with:
i = 1
while i <= 100:
    if i % 15 == 0:
        print("%s is divisible by 3 and 5" % i)
    elif i % 3 == 0:
        print("%s is divisible by 3" % i)
    elif i % 5 == 0:
        print("%s is divisible by 5" % i)
    else:
        print(i)
    i = i + 1
Fairly simple, right? Here’s my first iteration, which is basically a mirror of Josh’s code in Ruby:
i = 0
loop do
  i = i + 1
  if i >= 101
    exit
  end
  case
  when i % 15 == 0
    puts("#{i} is divisible by 3 and 5")
  when i % 3 == 0
    puts("#{i} is divisible by 3")
  when i % 5 == 0
    puts("#{i} is divisible by 5")
  else
    puts("#{i}")
  end
end
As anyone can tell you, there are some inherent issues with that code. Let’s talk about them:
- The incrementation of i (i = i + 1) shouldn’t come at the beginning of a loop. It’s bad practice, and it forces you to compensate elsewhere (here, by starting i below 1 so the first number isn’t skipped). I’m not going to go off into a huge rant as to why – it’s been covered in enough textbooks that it’d be a waste of my time.
- The code doesn’t intuitively “know” to stop at 100. It manually checks on every pass whether we’re past 100, which is just plain unnecessary.
- Interpolating the number into a string in the else branch was entirely unnecessary. “puts(i)” would have achieved the same effect with cleaner code.
To summarize: there are better ways of doing it. One is a ‘while’ loop, but that’s not the best way either. In the end I settled on this implementation using a “for foo in bar” loop:
for i in (1..100)
  if i % 15 == 0
    puts("#{i} is divisible by both 3 and 5")
  elsif i % 3 == 0
    puts("#{i} is divisible by 3")
  elsif i % 5 == 0
    puts("#{i} is divisible by 5")
  else
    puts(i)
  end
end
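For the curious, the ‘while’ version I passed over would look something like this sketch (the `label_for` helper is just my own naming to keep it tidy, not anything from the contest):

```ruby
# Returns the message for a given number; same conditional as the for-loop version.
def label_for(i)
  if i % 15 == 0
    "#{i} is divisible by both 3 and 5"
  elsif i % 3 == 0
    "#{i} is divisible by 3"
  elsif i % 5 == 0
    "#{i} is divisible by 5"
  else
    i.to_s
  end
end

i = 1
while i <= 100
  puts label_for(i)
  i += 1  # manual increment and exit test -- the bookkeeping the for-loop removes
end
```

It works, but the counter and the exit condition are still managed by hand, which is exactly what the for-loop does for free.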
Of course, after I did this, Josh realized he could do the same in Python:
for i in range(1, 101):
    if i % 15 == 0:
        print("FizzBuzz")
    elif i % 3 == 0:
        print("Fizz")
    elif i % 5 == 0:
        print("Buzz")
    else:
        print(i)
And for kicks, he even did it in LISP:
(defun count ()
  (setq i 1)
  (while (<= i 100)
    (cond
      ((= 15 (gcd i 15)) (princ i) (princ " is divisible by 5 and 3\n"))
      ((= 5 (gcd i 5)) (princ i) (princ " is divisible by 5\n"))
      ((= 3 (gcd i 3)) (princ i) (princ " is divisible by 3\n"))
      (t (princ i) (princ "\n")))
    (setq i (+ 1 i))))
Let’s analyze the code (that is, my Ruby code), shall we?
The code opens with our main loop, which is a fairly readable type of loop. Simply put, it translates into “For the variable ‘i’ in the range of numbers 1 through 100, including 1 and 100, do the following”. Fairly easy to understand, right? That’s Ruby for you. :) From there, the code breaks down into a simple conditional: it first checks whether the number is divisible by both 3 and 5, then whether it is divisible by 3 or by 5. Finally, if none of the above is true, the number itself is printed.
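As a quick illustration of that “including 1 and 100” bit: Ruby’s two-dot range includes its end value, while the three-dot form excludes it.

```ruby
# Two dots include the end value; three dots exclude it.
inclusive = (1..5).to_a
exclusive = (1...5).to_a
puts inclusive.inspect  # [1, 2, 3, 4, 5]
puts exclusive.inspect  # [1, 2, 3, 4]
```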
It’s important to understand why this code is better and more efficient. First and foremost, it maintains readability, which is always a must when working with code, while at the same time using some of the nicer features in Ruby to avoid redundant work. The biggest change from switching to this type of loop is that the loop now knows to stop at 100 without us manually asserting that the program exit. We were also able to remove the redundant incrementation of ‘i’, because it’s factored into the loop already.
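For what it’s worth, this no-bookkeeping behavior isn’t unique to “for foo in bar” – either of these common Ruby iteration idioms would have worked in its place (shown here on a small range so the result is easy to check):

```ruby
# Two equivalents of "for i in (1..100)", shown on 1..3;
# neither needs a manual counter or an exit check.
from_each = []
(1..3).each { |i| from_each << i }  # Range#each with a block
from_upto = []
1.upto(3) { |i| from_upto << i }    # Integer#upto
puts from_each.inspect  # [1, 2, 3]
puts from_upto.inspect  # [1, 2, 3]
```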
So why am I writing about something as trivial as a pissing contest? Because it’s an important tool for learning how to teach coding, and more specifically Ruby. Since I’m still learning (albeit at a fairly rapid rate), I’m essentially a perfect guinea pig. Until you break down things like “for foo in bar” into human-understandable syntax, you’ll find yourself attacking problems from a longer, more difficult angle. High-level languages have very strong and usable syntax for a reason. However, you’ve got to make a conscious effort to learn more than just the most basic of the basics. The quirks might not be enjoyable to learn, but they’ll improve your code ten-fold.
PS. Thanks to Nathan for tipping me off about using “for foo in bar” instead of while! :)