this post was submitted on 06 Jan 2025
160 points (96.0% liked)

Programmer Humor


A couple of years ago, my friend wanted to learn programming, so I was giving her a hand with resources and reviewing her code. She got to the part on adding code comments, and wrote the now-infamous line,

i = i + 1 #this increments i

We've all written superfluous comments, especially as beginners. It's not even really funny, but for whatever reason we both remember this specific line years later and laugh at it together.

Years later (this week), to poke fun, I started writing sillier and sillier ways to increment i:

Beginner level:

# this increments i:
x = i 
x = x + int(True)
i = x

Beginner++ level:

# this increments i:
def increment(val):
    for i in range(val+1):
        output = i + 1
    return output

Intermediate level:

# this increments i:
class NumIncrementor:
	def __init__(self, initial_num):
		self.internal_num = initial_num

	def increment_number(self):
		incremented_number = 0
		# we add 1 each iteration for indexing reasons
		for i in list(range(self.internal_num)) + [len(range(self.internal_num))]: 
			incremented_number = i + 1 # fix off-by-one error by incrementing i. I won't use recursion, I won't use recursion, I won't use recursion

		self.internal_num = incremented_number

	def get_incremented_number(self):
		return self.internal_num

i = int(input("Enter a number: ")) # input() returns a string, so convert it first

incrementor = NumIncrementor(i)
incrementor.increment_number()
i = incrementor.get_incremented_number()

print(i)

Since I'm obviously very bored, I thought I'd hear your take on the "best" way to increment an int in your language of choice - I don't think my code is quite expert-level enough. Consider it a sort of Advent of Code challenge? Any code which does not contain the comment "this increments i:" will produce a compile error and fail to run.

No AI code pls. That's no fun.

[–] Sonotsugipaa@lemmy.dbzer0.com 16 points 3 days ago* (last edited 3 days ago) (3 children)
// C++20

#include <concepts>
#include <cstdint>

template <typename T>
concept C = requires (T t) { { b(t) } -> std::same_as<int>; };

char b(bool v) { return char(uintmax_t(v) % 5); }
#define Int jnt=i
auto b(char v) { return 'int'; }

// this increments i:
void inc(int& i) {
  auto Int == 1;
  using c = decltype(b(jnt));
  // edited mistake here: c is a type, not a value
  // i += decltype(jnt)(C<decltype(b(c))>);
  i += decltype(jnt)(C<decltype(b(c(1)))>);
}

I'm not quite sure it compiles; I wrote this on my phone, and with the sheer number of landmines here, making a mistake is almost inevitable.

[–] henfredemars@infosec.pub 12 points 3 days ago (1 children)

I think my eyes are throwing up.

[–] Sonotsugipaa@lemmy.dbzer0.com 9 points 3 days ago

Just surround your eyes with try { ... } catch(Up& up) { }, easy fix

[–] Sonotsugipaa@lemmy.dbzer0.com 1 points 3 days ago* (last edited 3 days ago)

Just tested this: the "original+" code compiles, but does not increment i.

There were two problems:

  • b(bool) and b(char) are ambiguous (quick fix: change the signatures to char b(bool&) and auto b(char&& v));
  • The concept definition has to come after the b functions, even though the constraint is only checked after both exist; I was unaware of this (fix: define C immediately before void inc(int&)). A sketch with both fixes applied is below.
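
For reference, here is a sketch of the snippet with both fixes applied (reconstructed from the notes above, so treat it as an approximation of the tested version rather than the exact code):

// C++20

#include <concepts>
#include <cstdint>

// reference parameters disambiguate the two b overloads
char b(bool& v) { return char(uintmax_t(v) % 5); }
#define Int jnt=i
auto b(char&& v) { return 'int'; }

// the concept now comes after both b overloads, so they are visible to it
template <typename T>
concept C = requires (T t) { { b(t) } -> std::same_as<int>; };

// this increments i:
void inc(int& i) {
  auto Int == 1;              // macro-expands to: auto jnt = i == 1; so jnt is a bool
  using c = decltype(b(jnt)); // b(bool&) returns char, so c is char
  // b(c(1)) picks b(char&&), whose return type is int ('int' is a multi-character
  // literal of type int), so the check is C<int>; an int lvalue converts to a
  // temporary that binds to char&& but cannot bind to bool&, so C<int> is true
  // and decltype(jnt)(true), i.e. bool(true), adds exactly 1
  i += decltype(jnt)(C<decltype(b(c(1)))>);
}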