Ask HN: Looking for a Talk on Software Optimization
Some academic research related to software optimization was mentioned on HN a while back (I don't know exactly when). I can't track down the source and need some help. From what I can remember, the gist is this:
- How do you know your optimization is useful?
- It could be fragile, dependent on compilers, etc.
- Modelling in this research was done by manipulating the notion of time: something like assuming a given subsystem ran faster, using a novel technique I can't recall.
- Instrumentation within this framework produced data that pointed to the subsystems where optimization would be fruitful without being fragile.
Some of the details above may be misremembered; sorry. Hopefully someone can help.

Thank you for the suggestion, but no, this wasn't explored recently; it's been at least a year since this was discussed. The work comes from an academic researcher in a CS or Engineering department. I'm interested because it presents a much higher-level view of optimization, beyond micro-optimization where people worry about things like cache effects, which are bound to micro-architectural implementation choices.

Woah, I can't believe it: I've found the discussion that linked the thing I was looking for. [1] It's called Coz, the causal profiler [2]. I've submitted the talk again because I think it's super interesting.
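For anyone curious what this looks like in practice, here is a minimal sketch of annotating a program with a Coz progress point, assuming the coz.h header from the plasma-umass/coz project; process_item is a hypothetical stand-in for the hot path, not anything from the talk. The "modified time" idea is Coz's virtual speedup: it briefly slows down everything except the code region under test, so relatively that region behaves as if it ran faster, and it measures how that changes the rate at which execution passes the progress point.

    // Minimal sketch of Coz progress-point annotation
    // (assumes coz.h from plasma-umass/coz; process_item is a placeholder).
    #include <coz.h>
    #include <cstdio>

    // Stand-in for the real work whose throughput we care about.
    static long process_item(long i) {
        return (i * i) % 7919;
    }

    int main() {
        long acc = 0;
        for (long i = 0; i < 10000000; ++i) {
            acc += process_item(i);
            // Progress point: Coz tracks how quickly execution reaches this
            // line while virtually speeding up other code regions, and reports
            // which regions would actually improve end-to-end throughput.
            COZ_PROGRESS;
        }
        std::printf("acc = %ld\n", acc);
        return 0;
    }

If I remember right, you build with debug info (e.g. g++ -g -O2, possibly linking -ldl depending on the platform), run the binary under the profiler with something like "coz run --- ./a.out", and view the resulting profile.coz in Coz's plotting UI; treat the exact flags as approximate and check the project README for the authoritative instructions.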