Ask HN: Are there LLMs that can do UX testing?

1 point by edwcross 2 months ago · 0 comments · 1 min read

One of the main difficulties in testing user interfaces is that, even when you have volunteers, each of them can only be used once for the initial test of an interface.

After the first test, they have already learned something, and that memory will interfere with further tests.

AI could be very useful in this case: imitate human behavior (can multi-modal LLMs "read" an interface screenshot and pretend to try to interact with it? Are there tools that interpret what the LLM replies, e.g. "I'll try to click 'Details'", and then feed it the next screenshot?), but then immediately forget everything when a different version of the interface is presented.
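
Roughly the loop I have in mind, as a minimal sketch: send a screenshot to a multi-modal model, parse the action it says it would take, apply it, and send back the next screenshot. The OpenAI Python SDK call is real, but render_screenshot() and apply_action() are hypothetical stand-ins for whatever actually drives the UI (e.g. Playwright):

    import base64
    from openai import OpenAI

    client = OpenAI()

    def ask_model(png_bytes, history):
        """Show the current screenshot and ask what the simulated user does next."""
        image_b64 = base64.b64encode(png_bytes).decode()
        history.append({
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "Here is the current screen. What do you do next? "
                         "Answer as: CLICK <label>, TYPE <text>, or DONE."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/png;base64,{image_b64}"}},
            ],
        })
        reply = client.chat.completions.create(model="gpt-4o", messages=history)
        answer = reply.choices[0].message.content
        history.append({"role": "assistant", "content": answer})
        return answer

    # A fresh history per run is what makes the "user" forget everything.
    history = [{"role": "system",
                "content": "You are a first-time user testing this interface. "
                           "Think aloud briefly, then state exactly one action."}]
    screenshot = render_screenshot()       # hypothetical: capture current UI state
    for _ in range(10):                    # cap the number of steps per session
        action = ask_model(screenshot, history)
        print(action)
        if action.strip().startswith("DONE"):
            break
        screenshot = apply_action(action)  # hypothetical: click/type, re-render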

Bonus points if you can give the LLM "personas" (e.g. "you are a hurried user who barely reads the text", "you are a patient beginner who studies the whole screen before trying anything", etc.).
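
Continuing the sketch above, I imagine personas would just be different system prompts, and a fresh message history per run would be what makes each "test subject" memory-free:

    # Hypothetical persona prompts; swap in whichever one the run needs.
    PERSONAS = {
        "hurried": "You are a hurried user who barely reads any text and "
                   "clicks the first thing that looks plausible.",
        "patient": "You are a patient beginner who studies the whole screen "
                   "carefully before trying anything.",
    }

    def fresh_session(persona):
        """Start a new message history for one persona and one test run."""
        return [{"role": "system",
                 "content": PERSONAS[persona] +
                            " Think aloud briefly, then state exactly one action."}]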

Maybe all of this is already available through agent frameworks and currently in use?

No comments yet.
