this post was submitted on 08 Nov 2024
740 points (98.2% liked)

Programmer Humor

19551 readers
1021 users here now

Welcome to Programmer Humor!

This is a place where you can post jokes, memes, humor, etc. related to programming!

For sharing awful code, there's also Programming Horror.

founded 1 year ago
[–] state_electrician@discuss.tchncs.de 132 points 6 days ago (2 children)

My favorite is StalinSort. You go through the list and eliminate all elements which are not in line.
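A minimal sketch of the idea in Python (naming and details are my own, not from any canonical implementation):

```python
def stalin_sort(items):
    """Keep only the elements that are in line.

    Any element smaller than the last survivor is eliminated,
    so the result is always sorted after a single O(n) pass.
    """
    survivors = []
    for x in items:
        if not survivors or x >= survivors[-1]:
            survivors.append(x)
    return survivors

# stalin_sort([1, 5, 3, 7, 2, 8]) -> [1, 5, 7, 8]
```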

[–] pyre@lemmy.world 49 points 5 days ago* (last edited 5 days ago) (6 children)

you should post this on lemmy.ml

[–] affiliate@lemmy.world 30 points 5 days ago

it would be a pretty funny post for the full 5 minutes it would last until it got stalin sorted out of lemmy.ml

[–] magic_lobster_party@fedia.io 3 points 5 days ago

They would see nothing wrong with it

[–] Incandemon@lemmy.ca 4 points 5 days ago (1 children)

I tend to prefer Hiroshima sort. Sorting completed in O(1) time, and it frees up memory too.

[–] BatmanAoD@programming.dev 149 points 6 days ago (5 children)

Reminds me of quantum-bogosort: randomize the list; check if it is sorted. If it is, you're done; otherwise, destroy this universe.

[–] xmunk@sh.itjust.works 95 points 6 days ago (3 children)

Guaranteed to sort the list in nearly instantaneous time and with absolutely no downsides that are capable of objecting.

[–] frezik@midwest.social 48 points 6 days ago (2 children)

You still have to check that it's sorted, which is O(n).

We'll also assume that destroying the universe takes constant time.
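That O(n) check is just one linear pass over adjacent pairs; a sketch:

```python
def is_sorted(items):
    # One O(n) pass: every adjacent pair must be in order.
    return all(a <= b for a, b in zip(items, items[1:]))
```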

[–] BatmanAoD@programming.dev 43 points 6 days ago (2 children)

In the universe where the list is sorted, it doesn't actually matter how long the destruction takes!

[–] groet@feddit.org 13 points 5 days ago (1 children)

It actually takes a few trillion years, but it's fine because we just stop considering the "failed" universes; they'll be gone soon™ anyway.

[–] MBM@lemmings.world 8 points 5 days ago

Eh, trillion is a constant

[–] FiskFisk33@startrek.website 8 points 6 days ago

amortized O(0)

[–] Benjaben@lemmy.world 9 points 5 days ago (1 children)

We'll also assume that destroying the universe takes constant time.

Well yeah just delete the pointer to it!

[–] vithigar@lemmy.ca 16 points 6 days ago (1 children)

Except you missed a bug in the "check if it's sorted" code and it ends up destroying every universe.

[–] db2@lemmy.world 7 points 6 days ago

There's a bug in it now, that's why we're still here.

[–] Zaphod@discuss.tchncs.de 25 points 5 days ago (1 children)

The creation and destruction of universes is left as an exercise to the reader

[–] BatmanAoD@programming.dev 4 points 5 days ago

Creation is easy, assuming the many-worlds interpretation of quantum mechanics!

[–] random72guy@lemmy.world 17 points 5 days ago (1 children)

Instead of destroying the universe, can we destroy prior, failed shuffle/check iterations to retain O(1)? Then we wouldn't have to reload all of creation into RAM.

[–] BatmanAoD@programming.dev 6 points 5 days ago

Delete prior iterations of the loop in the same timeline? I'm not sure there's anything in quantum mechanics to permit that...

[–] SubArcticTundra@lemmy.ml 13 points 6 days ago (2 children)

What library are you using for that?

[–] jcg@halubilo.social 29 points 6 days ago* (last edited 6 days ago) (1 children)

is-sorted and a handful of about 300 other npm packages. Cloning the repo and installing takes about 16 hours, but after that you're pretty much good for the rest of eternity.

[–] Swedneck@discuss.tchncs.de 8 points 5 days ago (1 children)

that explains why it took god 7 days to make the universe

[–] SkaveRat@discuss.tchncs.de 11 points 5 days ago

In Python you just use

import destroy_universe
[–] ChaoticNeutralCzech@feddit.org 51 points 6 days ago (1 children)
[–] MajorHavoc@programming.dev 32 points 6 days ago
// portability

Gave me the giggles. I've helped maintain systems where this portable solution would have left everyone better off.

[–] Swedneck@discuss.tchncs.de 32 points 5 days ago* (last edited 5 days ago) (4 children)
import yhwh

def interventionSort(unsortedList):
    sortedList = yhwh.pray(
        "Oh great and merciful Lord above, let thine glory shine upon yonder list!",
        unsortedList,
    )
    return sortedList
[–] porous_grey_matter@lemmy.ml 11 points 5 days ago

camelCase in Python? Ew, a fundamentalist would do that.

[–] Allero@lemmy.today 28 points 5 days ago (6 children)

The most beautiful thing about this program is that it would work.

Given enough time, random bit flips will eventually leave all the numbers in the correct order. No guarantee they'll be the same numbers, though...

[–] fallingcats@discuss.tchncs.de 8 points 5 days ago (2 children)

Those bit flips are probably more likely to make execution skip the check erroneously than to ever leave the array sorted.

[–] Zoomboingding@lemmy.world 6 points 5 days ago

Reminds me of a program in Homestuck. It's code that iterates until the author/universe dies, then executes some unknown code. The coding language is ~ath, or TilDeath.

[–] Midnitte@beehaw.org 7 points 5 days ago

Might also take a very long time (or a large amount of radiation).

[–] ProgrammingSocks@pawb.social 3 points 5 days ago (2 children)

Not necessarily. I don't have the numbers in front of me, but there is a point past which something is so unlikely that you can consider it impossible (i.e. it will never happen within the lifetime of the universe).

[–] TheTechnician27@lemmy.world 40 points 6 days ago (1 children)
[–] iAvicenna@lemmy.world 14 points 5 days ago (1 children)

you can also call it quantum sort, since there is a non-zero probability that it will sort itself by random bit flips

[–] Comment105@lemm.ee 5 points 5 days ago (1 children)

It would actually have happened an infinite number of times already, if either the universe is infinite or there are infinite universes.

[–] CanadaPlus@lemmy.sdf.org 2 points 5 days ago

I really, deeply hope the universe is finite for this reason. Otherwise every great or terrible thing happens forever, and there is no causality or consequence.

[–] fluckx@lemmy.world 24 points 5 days ago* (last edited 4 days ago) (1 children)

I prefer the one where you randomly shuffle the array until all elements are in order. (Bogosort)
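A sketch of single-universe Bogosort, assuming we only get the one timeline (expected number of shuffles grows factorially, so keep n tiny):

```python
import random

def bogosort(items):
    """Shuffle until sorted. Strictly for entertainment purposes."""
    items = list(items)
    # Keep shuffling until no adjacent pair is out of order.
    while any(a > b for a, b in zip(items, items[1:])):
        random.shuffle(items)
    return items
```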

[–] 1984@lemmy.today 25 points 5 days ago (1 children)

This is the algorithm I use at work.

[–] lemmydividebyzero@reddthat.com 19 points 5 days ago (1 children)
[–] Ephera@lemmy.ml 34 points 5 days ago

I hear, it actually significantly increases the chance of the miracle occurring when you pass the array into multiple threads. It's a very mysterious algorithm.

[–] aeharding@vger.social 13 points 6 days ago* (last edited 6 days ago) (1 children)

Shameless plug for my sort lib

edit: Looking at my old code it might be time to add typescript, es6 and promises to make it ✨  p r o d u c t i o n   r e a d y  ✨

[–] breakcore@discuss.tchncs.de 2 points 5 days ago

Good stuff, I will start using it. My code needs to chill out anyway.

[–] 1boiledpotato@sh.itjust.works 8 points 5 days ago (1 children)

And the time complexity is only O(1)

[–] voldage@lemmy.world 14 points 5 days ago (1 children)

I don't think you can check whether an array of n elements is sorted in O(1). If you skip the check, though, and just assume it is sorted now (have faith), then the time is constant, depending on how long you're willing to wait until the miracle happens. As long as the MTM (Mean Time to Miracle) is constant, faithful miracle sort has O(1) time complexity, even if the MTM is infinite. Faithless miracle sort has, at best, the complexity of the algorithm that checks whether the array is sorted.

Technically you can go down to O(0) if you assume all arrays are always sorted.

[–] TheOakTree@lemm.ee 5 points 5 days ago (2 children)

Hello programmers...

I recently took a course that went through basic python, C, and C++.

I had a hard time implementing various forms of sorting functions by hand (these were exercises for exam study). Are there any resources you folks would recommend so that I can build a better grasp of sorting implementations and efficiency?

[–] 90s_hacker@reddthat.com 7 points 5 days ago* (last edited 5 days ago) (1 children)

Skiena's Algorithm Design Manual is very widely recommended for learning algorithms; I've also heard good things about A Common-Sense Guide to Data Structures and Algorithms. Skiena also has video lectures on YouTube if you prefer videos.

From what I've seen, A Common-Sense Guide seems to be geared more towards newer programmers, while Skiena assumes more experience. Consequently, Skiena goes into more depth, while A Common-Sense Guide seems more focused on what you specifically asked for.

[–] AllHailTheSheep@sh.itjust.works 3 points 5 days ago* (last edited 5 days ago)

don't get discouraged. sorting algorithms come up frequently in interviews, and yes, you use them a decent amount (especially in languages without built-in sorts, like C), but they are one of the harder things to visualize in terms of how they work. I'd say avoid anything recursive for now until you can get selection and insertion sort down pat. check out the geeksforgeeks articles on them, but also don't be afraid to google willy-nilly; you'll find the resource that makes it click eventually.
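For reference, a minimal insertion sort sketch (my own toy example, not taken from any particular resource):

```python
def insertion_sort(items):
    """Sort a list in place and return it. O(n^2) worst case."""
    # Grow a sorted prefix; insert each new element into its place.
    for i in range(1, len(items)):
        key = items[i]
        j = i - 1
        while j >= 0 and items[j] > key:
            items[j + 1] = items[j]  # shift larger elements right
            j -= 1
        items[j + 1] = key
    return items
```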

in terms of efficiency, it does become a little more difficult to grasp without some math background. big O is a form of asymptotic notation, and it describes how a function grows. for example, if you graph f1(x) = 15log(x) and f2(x) = x, you'll notice that if x is bigger than 19, then f2(x) always has a higher output value than f1(x). in computer science terms, we'd say f1 is O(log(n)), meaning it has logarithmic growth, and f2 is O(n), or linear growth. the formal definition of big O is that f(x) is O(g(x)) if and only if (sometimes abbreviated as iff) there exist constants N and C such that |f(x)| <= C|g(x)| for all x > N. in our example, we can say that C = 1 and N = 19, which fulfills the definition since |15log(x)| <= 1|x| whenever x > 19. therefore, f1(x) is O(f2(x)). apologies for just throwing numbers at you (or if you've heard all this before), but having even just the most basic grasp of the math is gonna help a lot. again, in terms of resources, geeksforgeeks is always great, and googling can help you find thousands more. trust that you are not the first person to have trouble with these; most people before you have asked online about it as well.

I also highly recommend grabbing a copy of Discrete Mathematics and Its Applications by Kenneth Rosen to dig further into the math. there are a few other types of asymptotic notation, such as big omega and big theta, even little o, that I didn't mention here but are useful for comparing functions in slightly different ways. it's a big book, but it starts at the bottom and is generally pretty well written and well laid out.

feel free to dm me if you have questions, I'm always down to talk math and comp sci.

edit: in our example, we could also pick C = 19 and N = 1, or probably a few other combinations as well. as long as it fulfills the definition, it's correct.
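The crossover in that example can be sanity-checked numerically (assuming log means log base 10 here, which is what makes the x > 19 claim work):

```python
import math

# C = 1, N = 19: |15*log10(x)| <= 1*|x| holds for every x > 19.
assert all(15 * math.log10(x) <= x for x in range(20, 10_000))

# Just below the crossover the inequality still fails:
assert 15 * math.log10(19) > 19
```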

[–] mavu@discuss.tchncs.de 3 points 5 days ago (1 children)

Wait, that's exactly how i tidy up my kitchen!
