LLM Evaluation Metrics
confident-ai.comGoing to suggest one metric is that the LLM doesn’t suggest by default that humans act in a manner likely to result in the termination of fellow humans.
https://news.ycombinator.com/item?id=42092961#42092997
I’m calling it the Pangolin Principle.