Discussion about this post

Dr. Robert van Engelen:

"In place" algorithms are typically faster. But one has to avoid code changes that accidentally violate data dependencies in loops and in the algorithm overall, e.g. when the right-hand side suddenly consumes an already-updated value as a result of the optimization. In a few cases, though, we can make this "mistake" on purpose to get more accurate algorithms! For example, Gauss-Seidel iterations (which reuse updated values as they converge) versus Jacobi iterations (no reuse).
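A minimal sketch of the contrast above, solving a diagonally dominant system A x = b (the matrix and helper names here are illustrative, not from the original comment). Jacobi writes into a fresh array, so every row reads only the previous iterate; Gauss-Seidel updates x in place, so later rows deliberately consume the values already updated in the same sweep:

```python
def jacobi(A, b, x, iters):
    """Jacobi: each sweep reads only the previous iterate (no reuse)."""
    n = len(b)
    for _ in range(iters):
        x_new = [0.0] * n  # fresh array: updated values are NOT reused
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x_new[i] = (b[i] - s) / A[i][i]
        x = x_new
    return x

def gauss_seidel(A, b, x, iters):
    """Gauss-Seidel: in-place updates, so rows i > j reuse this sweep's x[j]."""
    n = len(b)
    for _ in range(iters):
        for i in range(n):
            # x[j] for j < i already holds this sweep's value: deliberate reuse
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = (b[i] - s) / A[i][i]
    return x
```

The only textual difference is whether the update target is a fresh buffer or the iterate itself; that single change is exactly the "violated" dependency that makes Gauss-Seidel both in-place and usually faster to converge.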

Funny that reusing memory appears to be controversial to some folks, judging from the LinkedIn comments. Perhaps functional programming comes to mind, in which nothing is mutated (at least not in the abstract!). But efficient imperative computing with numerical algorithms almost always entails implementations that reuse memory. Local reuse matters too, e.g. block-wise matrix algorithms optimize cache use by improving spatial and temporal locality. Code readability is not the ultimate goal here, since the algorithms are well understood and documented. Documentation is key.
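A sketch of the block-wise idea mentioned above (the function name and the block size are illustrative assumptions): tiling the triple loop of a matrix multiply keeps a small working set of each operand hot in cache, improving spatial and temporal locality, while computing exactly the same result as the naive loop.

```python
def matmul_blocked(A, B, block=32):
    """Tiled matrix multiply: same arithmetic as the naive triple loop,
    but iterating over block-by-block tiles improves cache locality."""
    n, m, p = len(A), len(B), len(B[0])
    C = [[0.0] * p for _ in range(n)]  # result buffer, accumulated in place
    for ii in range(0, n, block):
        for kk in range(0, m, block):
            for jj in range(0, p, block):
                # work on one (ii, kk, jj) tile at a time
                for i in range(ii, min(ii + block, n)):
                    for k in range(kk, min(kk + block, m)):
                        a = A[i][k]  # temporal reuse: one load per inner row
                        for j in range(jj, min(jj + block, p)):
                            C[i][j] += a * B[k][j]
    return C
```

In pure Python the constant factors swamp any cache effect, so this is meant to show the loop structure, not to benchmark; the same tiling pattern pays off in C, Fortran, or BLAS-style kernels.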
