this post was submitted on 10 Jul 2023
72 points (97.4% liked)


I've started noticing articles and YouTube videos touting the benefits of branchless programming, making it sound like this is a hot new technique (or maybe a hot old technique) that everyone should be using. But it seems like it's only really applicable to data processing applications (as opposed to general programming) and there are very few times in my career where I've needed to use, much less optimize, data processing code. And when I do, I use someone else's library.
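
To be concrete, the kind of thing these articles and videos demonstrate is roughly this (a toy C sketch I'm writing from memory, not taken from any particular article): replace a conditional jump with arithmetic and masking so the branch predictor never gets involved.

```c
#include <stdint.h>
#include <stdio.h>

/* Branchy version: the compiler may emit a conditional jump here,
   which the CPU has to predict. */
int32_t max_branchy(int32_t a, int32_t b) {
    if (a > b)
        return a;
    return b;
}

/* Branchless version: turn the comparison into an all-zeros or
   all-ones mask and blend the two values, so there is no jump
   to mispredict. */
int32_t max_branchless(int32_t a, int32_t b) {
    int32_t mask = -(int32_t)(a > b);   /* 0 -> 0x00000000, 1 -> 0xFFFFFFFF */
    return (a & mask) | (b & ~mask);
}

int main(void) {
    printf("%d %d\n", max_branchy(3, 7), max_branchless(3, 7));   /* 7 7 */
    return 0;
}
```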

How often does branchless programming actually matter in the day to day life of an average developer?

[–] LaggyKar@programming.dev 22 points 1 year ago* (last edited 1 year ago) (1 children)

Or are GPUs particularly bad at branches?

Yes. GPUs don't do per-core branching; they run dozens of cores in lockstep on the same instruction stream. So if some cores should take the if branch and others the else branch, every core in the group executes both branches, and each one masks out the results of the branch it shouldn't have taken. I also don't think they have the advanced branch prediction that CPUs have.

https://en.wikipedia.org/wiki/Single_instruction,_multiple_threads
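
Roughly, you can picture the masking like this (just a toy C sketch of the idea in software, not actual GPU code, and the lane count is made up):

```c
#include <stdint.h>
#include <stdio.h>

#define LANES 8   /* pretend warp size; real GPUs use something like 32 or 64 lanes */

/* Toy model of what the hardware effectively does with a divergent
   "if (x < 0) y = -x; else y = x;": every lane computes BOTH sides,
   and a per-lane mask decides which result is actually kept. */
void simt_abs(const int32_t in[LANES], int32_t out[LANES]) {
    for (int lane = 0; lane < LANES; lane++) {
        int32_t then_val = -in[lane];             /* "if" side, run by all lanes */
        int32_t else_val = in[lane];              /* "else" side, run by all lanes */
        int32_t mask = -(int32_t)(in[lane] < 0);  /* all-ones where the "if" applies */
        out[lane] = (then_val & mask) | (else_val & ~mask);
    }
}

int main(void) {
    int32_t in[LANES] = {3, -1, 4, -1, 5, -9, 2, -6};
    int32_t out[LANES];
    simt_abs(in, out);
    for (int lane = 0; lane < LANES; lane++)
        printf("%d ", out[lane]);   /* prints 3 1 4 1 5 9 2 6 */
    printf("\n");
    return 0;
}
```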

Makes sense. The most programming I've ever done for a GPU was a few simple shaders for a toy project.