Monte Carlo Long-Range Interacting System Simulations

uni-leipzig.de

56 points by sandoze 2 years ago · 17 comments

conformist 2 years ago

arXiv: https://arxiv.org/abs/2207.14670

If I understand correctly after skimming, one of the fundamental ideas behind this appears to be similar to the well-known Fast Multipole Method [1]. It’s also a tree-based approach where far-away points are aggregated into larger chunks? (A rough sketch of that aggregation idea follows below.)

[1] https://en.m.wikipedia.org/wiki/Fast_multipole_method
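
To make the aggregation idea concrete, here is a minimal Barnes-Hut-flavored sketch in 1D (my own illustration, not the paper’s algorithm): each tree cell stores its total charge and center of mass, and any cell that looks small from the evaluation point (size/distance below an opening angle theta) is treated as a single chunk.

    import numpy as np

    class Cell:
        # One node of a binary spatial tree over 1D points (assumed distinct).
        # Stores the aggregate (total charge, center of mass) so that far-away
        # evaluation points can treat the whole cell as one pseudo-particle.
        def __init__(self, x, q):
            self.size = x.max() - x.min()
            self.q = q.sum()
            self.com = (x * q).sum() / self.q
            self.children = []
            if len(x) > 1:                          # split until single points
                mid = 0.5 * (x.min() + x.max())
                left = x <= mid
                self.children = [Cell(x[left], q[left]), Cell(x[~left], q[~left])]

    def potential(cell, x0, theta=0.5):
        # 1/r potential at x0; open a cell only if it subtends a large
        # "angle" from x0, i.e. if size/distance >= theta.
        d = abs(x0 - cell.com)
        if not cell.children or cell.size < theta * d:
            return cell.q / d                       # whole cell as one chunk
        return sum(potential(c, x0, theta) for c in cell.children)

    # theta = 0 always opens cells down to the leaves, recovering the exact
    # O(N) sum, so the two values below should be close:
    rng = np.random.default_rng(0)
    x, q = rng.uniform(0, 1, 1000), rng.uniform(0.5, 1.5, 1000)
    tree = Cell(x, q)
    print(potential(tree, 2.0), potential(tree, 2.0, theta=0.0))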

  • physicsguy 2 years ago

    From a quick glance, it’s exactly the same as the Barnes-Hut method; there’s nothing particularly new in this paper as far as I can see. There are various papers whose merit lies in releasing specific codes that support this, but people have been using it in spin systems and even in micromagnetics for years. I did my PhD in this area years ago and implemented it in Monte Carlo and dynamical simulations at the time… and I cited papers going back to the 80s and 90s when I wrote up my thesis!

    • radioactivist 2 years ago

      While the spatial decomposition matches what is done in Barnes-Hut, the details of the underlying algorithm are somewhat different (they outline this in the introduction).

      In particular, their scheme uses exact bounds on energy differences (evaluated using a hierarchical tree, as in BH), but in such a way that no approximation is made. The tree is evaluated to whatever depth is needed to decide whether to accept/reject the MC move (which in the worst case could be a brute-force sum over the whole lattice/system) -- this is different, I think, from BH and other multipole-inspired methods (which have a kind of "truncation" or "tolerance" parameter). A toy sketch of this refine-until-decided loop follows below.

      [This also works well with systems where updates are local rather than global, which I think is a difference from some other spatial partitioning schemes -- but I'm less conversant with that aspect.]
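
      For concreteness, here is a toy rendition of that refine-until-decided loop. The padded interval bounds below are arbitrary stand-ins for the rigorous node-level bounds the paper derives from the interaction, so this is a cartoon of the control flow, not the authors' algorithm.

          import math, random

          class Node:
              # A tree node carrying lower/upper bounds on its contribution
              # to dE. Internal bounds here are just the children's sums
              # padded by an arbitrary slack, purely for illustration;
              # leaves carry their exact value.
              def __init__(self, value=None, children=()):
                  self.children = list(children)
                  if children:
                      self.lo = sum(c.lo for c in children) - 0.2
                      self.hi = sum(c.hi for c in children) + 0.2
                  else:
                      self.lo = self.hi = value

          def metropolis_accept(root, beta=1.0):
              # Standard Metropolis: accept iff dE < t, where t = -ln(u)/beta.
              t = -math.log(random.random()) / beta
              frontier, lo, hi = [root], root.lo, root.hi
              while lo < t < hi:                    # decision still ambiguous
                  node = max(frontier, key=lambda n: n.hi - n.lo)
                  frontier.remove(node)             # expand the widest node;
                  frontier.extend(node.children)    # it is internal whenever
                  lo = sum(n.lo for n in frontier)  # lo < hi, since leaves
                  hi = sum(n.hi for n in frontier)  # have zero width
              return hi < t                         # accept iff dE < t

          # Tiny demo: four exact leaf contributions, two levels of padding.
          leaves = [Node(value=v) for v in (0.3, -0.1, 0.4, 0.2)]
          root = Node(children=[Node(children=leaves[:2]),
                                Node(children=leaves[2:])])
          print(metropolis_accept(root, beta=2.0))

      In the worst case the loop expands everything down to the leaves and the test reduces to the brute-force sum, exactly as described above.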

      • physicsguy 2 years ago

        I read the full paper after posting my initial comment. Not really - BH does use an “opening angle” parameter, but for FMM you use an order of expansion, which provides a strict error bound in terms of accuracy; this was derived in Greengard and Rokhlin’s original paper and has since been extended to other potentials.

        To me it’s just a minor (but nonetheless interesting) variation of an existing method; there have been many of these. They don’t compare its performance to other tree-based methods in the paper, which says to me that its merits aren’t really that clear… with long-range potentials, floating-point inaccuracies mean none of this is exact anyway.

    • conformist 2 years ago

      Ah, yes, and they do indeed cite Barnes-Hut, too. So the (claimed) novelty seems to be how they connect it to Monte Carlo steps.

    • JBits 2 years ago

      That sounds cool! I think I'd be interested in reading your thesis.

      • physicsguy 2 years ago

        Would rather not share on this account, but if you're interested in software that does this, have a look at ExaFMM for a highly performant implementation.

Fede_V 2 years ago

Isn't this a stochastic variant of Barnes-Hut? https://en.wikipedia.org/wiki/Barnes%E2%80%93Hut_simulation

HaoZeke 2 years ago

Also, the example cases they tested on (the Ising model for long range, Lennard-Jones for short range) are almost too trivial…

punnerud 2 years ago

Duplicate? https://news.ycombinator.com/item?id=36930857

ckrapu 2 years ago

Any commentary on what this means for the wider world of MCMC?

  • tnecniv 2 years ago

    I imagine a similar scheme could be used for general inference if you have a way to cluster the components of your model in a similar fashion, even if the parameters are not spatial. Perhaps your model has some kind of natural tree structure.
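
    Purely speculative, but the interface might transfer like this: if the per-term log-likelihood differences of a proposed move admit cheap group-level bounds (the per-cluster Lipschitz constant below is a hypothetical example, not something from the paper), likelihood clusters could play the role of tree cells in the same refine-until-decided Metropolis test.

        # Hypothetical bound: if each log-likelihood term is L-Lipschitz in
        # theta within cluster g, the cluster's total change under a proposed
        # step satisfies |sum_i dloglik_i| <= L_g * n_g * |step|.
        def cluster_bounds(clusters, lipschitz, step):
            return [(-L * len(c) * abs(step), L * len(c) * abs(step))
                    for c, L in zip(clusters, lipschitz)]

        # Only clusters whose interval straddles the Metropolis threshold
        # t = -ln(u)/beta would need their exact per-term sums computed.
        clusters = [list(range(5)), list(range(8))]     # toy index groups
        print(cluster_bounds(clusters, lipschitz=[0.5, 1.0], step=0.1))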

phyalow 2 years ago

Welcome to reinforcement learning

  • wrp 2 years ago

    This is a funny comment, not because it's so wrong, but because of the irony. Statisticians used to complain that neural net research was just badly reproducing statistics. As late as the early 2000s, I had a stats professor actually get angry when I expressed interest in ANN theory.

  • angus-mackaiver 2 years ago

    Could you elaborate on that or point to any literature/websites because I'm curious to learn about what these have in common. Thank you!

    • wrp 2 years ago

      The relation between ANNs and statistics is a topic over which much ink was spilled in the 1990s. This is the most concise discussion I can think of:

      Sarle, Warren S. (1994). Neural Networks and Statistical Models. Proceedings of the Nineteenth Annual SAS Users Group International Conference, April, 1994.

      If you want an in-depth discussion, these books relate ANNs and statistical methods:

      Schürmann, Jürgen. (1996). Pattern Classification: A Unified View of Statistical and Neural Approaches.

      Raudys, Sarunas. (2001). Statistical and Neural Classifiers: An Integrated Approach to Design.

  • whatever1 2 years ago

    Really, communities should start talking to each other.
