Self-driving cars less likely to detect children and people of color

businessinsider.com

69 points by Tozen 2 years ago · 91 comments

jl6 2 years ago

> While the study did not use the exact software companies like Tesla use to power self-driving cars because they are confidential, the software systems used for the study are based on the same open-source AI those companies use, according to Zhang.

I was under the impression that commercial self-driving software was deeply proprietary and confidential, and there is no way to know whether this study would generalize if run on state-of-the-art detectors. Tesla and Cruise are name-checked in the article - how do we know this isn't a problem they have worked extensively on and made great improvements to, relative to the open-source components?

Feels like a case of outrage-for-clicks.

  • jnovek 2 years ago

    > Feels like a case of outrage-for-clicks.

    The BI article is definitely outrage for clicks. I wouldn't be surprised if the actual journal article was more measured in its conclusions and this is just typical bad science reporting.

  • dee-bee 2 years ago

    Presumably these companies are free to provide their software for research. The onus is on them to demonstrate it works in the first place....

    > how do we know this isn’t a problem they have worked extensively on and made great improvements to, relative to the open source components?

    They are a private, for-profit entity with a strong incentive to mislead people about their products. I see no reason to assume they've addressed this issue.

  • candiddevmike 2 years ago

    The point of the article, to me, is computer vision will never be enough. These are machines and need to be augmented with radar and other object detection methods.

    • ben_w 2 years ago

      Radar etc. almost certainly make it easier, but for it to be "never" this would also have to be provably impossible for humans.

      Which isn't to say the answer can't be "never": I remember studies in my own childhood that said human drivers were also bad at recognising how far away children were, and I've never heard of human perception of skin colour being tested in this way, so it might just turn out that melanin is unfortunately good camouflage against tarmac…

      …but unless and until that suggestion turns out to be correct of all humans, I default to assuming we're an existence proof of the capability to do without, and that means I still wouldn't say "never" to sufficiently advanced AI doing at least as well.

      • wongarsu 2 years ago

        I'm confident that a comprehensive study would show that on average humans are worse at detecting people of color against dark backgrounds (if we agree this is code for "people with highly pigmented skin" and not Asians or Latinos). There is just much less contrast to work with, and dark skin also makes facial features stand out less (which is an issue because faces are the thing humans can recognize best).

        There is a discussion we could have whether we want to measure self-driving cars against an ideal perfect baseline or against the status quo. But of course the ideal case is much easier to define, and has fewer things that make some people uncomfortable.
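
        To put a rough number on the contrast point, here's a toy WCAG-style contrast-ratio calculation. The RGB swatches are illustrative guesses, not measured values, and this obviously isn't how the paper quantified anything:

            # Relative luminance and contrast ratio (WCAG formulas) between
            # assumed skin-tone swatches and dark tarmac. Values are guesses.
            def relative_luminance(rgb):
                def lin(c):
                    c /= 255
                    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
                r, g, b = (lin(c) for c in rgb)
                return 0.2126 * r + 0.7152 * g + 0.0722 * b

            def contrast_ratio(a, b):
                hi, lo = sorted((relative_luminance(a), relative_luminance(b)), reverse=True)
                return (hi + 0.05) / (lo + 0.05)

            tarmac = (60, 60, 60)         # dark grey asphalt (assumed)
            light_skin = (230, 190, 170)  # illustrative light skin tone
            dark_skin = (90, 60, 50)      # illustrative dark skin tone

            print(contrast_ratio(light_skin, tarmac))  # ~6.5:1
            print(contrast_ratio(dark_skin, tarmac))   # ~1.1:1, barely above 1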

        • seanmcdirmid 2 years ago

          I’m harder to see on a rainy winter night if I wear a black jacket vs a bright orange one. I learned early on to wear bright orange on those nights, given that I was almost clipped by human-driven cars a few times. Clothing choices are important.

        • tbrownaw 2 years ago

          > But of course the ideal case is much easier to define, and has fewer things that make some people uncomfortable.

          As another bonus, it also provides an extra excuse to advocate against replacing humans with AIs.

    • sokoloff 2 years ago

      It’s intended to replace human drivers who have not yet evolved radar. I agree that radar could make it easier and/or more reliable, but there’s a pretty strong argument that building a system to equal/exceed humans using vision alone is possible.

      • micromacrofoot 2 years ago

        Humans haven’t evolved radar, but computer vision systems are also worse than human vision systems in many ways, so they need to compensate

        • sokoloff 2 years ago

          They are currently worse in many ways. So, perhaps the computer vision systems need to continue to improve until such time as they deliver clearly superior results to the as-observed outcomes of human drivers.

          I worked on the vision system for an autonomous vehicle program in 1991, using the processing power available then. Our team held several world records at the time for different categories of completely autonomous travel on public highways.

          If you fit any kind of curve between what (relatively little) we could do then for ~$200K in equipment and what a production car with < $1K of BOM costs can do today, it's reasonable to predict that well within my lifetime that vision-only autonomous driving systems could be better than a human on typical roads (absent snow cover).
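
          As a back-of-envelope version of that curve fit (assuming the simplest possible exponential decline and ~32 years between the two data points; the end year is my guess, not stated in the comment):

              # Two data points from the comment: ~$200K of equipment in 1991
              # vs. <$1K of BOM cost today (taken here as ~2023). Assume a
              # constant exponential cost decline between them.
              cost_1991, cost_now, years = 200_000, 1_000, 32
              annual = (cost_now / cost_1991) ** (1 / years)
              print(f"~{1 - annual:.0%} cheaper per year")           # ~15%
              print(f"~${cost_now * annual ** 20:,.0f} in 20 years")  # ~$36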

          • micromacrofoot 2 years ago

            > better than a human on typical roads (absent snow cover).

            the weather caveats feel like evergreen statements about self-driving and make me feel like it's further off than most people realize — I agree that the improvements have been impressive over our lifetimes, but like in most general tasks there's still an enormous gulf between biology and technology

            we'll get there eventually, and much faster than biology did (ie, not millions of years)... but I wouldn't be surprised if full self-driving was another 20 years out

            • sokoloff 2 years ago

              I've made the same "never, ever will we completely replace human drivers" predictions before, informed by my experience trying to do it 30+ years ago but also by dozens of winters' worth of driving on snow-covered roads in New England.

              But there are huge potential gains even if self-driving is only usable in 99.8% of driving scenarios, provided there's adequate safeguards and sensible hand-overs to human drivers. (Not dumping an out-of-control, at-speed automobile into the human driver's lap with 50 milliseconds of notice.)

      • mola 2 years ago

        The human eye is a better "camera" than anything we use today in self driving cars (just read about the eyes dynamic range). Not to mention that human hearing has unbelievable audiolocalization ability that we struggle to explain.

    • AdamN 2 years ago

      Isn't radar and object detection standard for these types of systems?

  • foooorsyth 2 years ago

    > Feels like a case of outrage-for-clicks

    Like 99% of these “AI discrimination” articles.

    >human-detecting AI is developed in a western country with ~60% white population. Most of the training data is collected there

    >the AI performed slightly worse in Uttar Pradesh, where the people and everything else in the background look different

    >AI is prejudiced! Get outraged!

    Every time.

    • mola 2 years ago

      Weird, you articulated the exact point of bias in AI, but your tone is dismissive. Yes, obviously AI is not a moral agent and it isn't racist per se. But if its input is biased and the test is biased, then the application will be biased. That's a problem if you go and deploy these models where their training data is lacking. Say a self-driving car using the AI you described is deployed in Uttar Pradesh: it's less safe because of that bias.

      What is wrong with this statement in your opinion?

      • foooorsyth 2 years ago

        I'm dismissive of endless streams of unhelpful clickbait articles written by barely-tech-literate journalists aiming to spark racial outrage. I'm not dismissive of the threat of bias in AI, and I'm certainly not a fan of cavalier automotive companies running clearly-alpha autonomous software on public roads.

        Still, I find most of these kinds of articles to be obnoxious and unhelpful in their shaming. Did you read the paper linked in the OP? It investigates the following datasets: CityPersons, EuroCityPersons, and BDD100k.

        * CityPersons data is from mostly Germany (with all of it coming from central Europe)

        * EuroCityPersons, as the name implies, is data from European cities

        * BDD100k data is from NYC and Bay Area

        So are we shocked with the outcomes here, or are they more or less obvious? If I trained a popular object detector with pedestrian image data solely collected in India, would you be surprised if it performed poorly outside of India? Would that warrant a racially-inflammatory article title?

        And, as others have pointed out, these types of investigations make the implicit assumption that autonomous companies aren't working hard to reduce these biases in their own internal training pipelines. As reckless as some of those companies are, I promise you that they are not solely relying on Western data before deploying to non-Western streets. The investigators here may have simply been lazy/biased themselves (only investigating openly-available datasets from Western regions), and then projecting that laziness/bias onto AV companies.

tetromino_ 2 years ago

Human drivers' eyes (and I suspect any other optical systems working in the visible color range) are also less likely to detect people of color. Five years ago I avoided running over a pedestrian at night only by luck: he was black, wearing a black jacket and black pants, walking across a badly-lit suburban street; I think that either my visual system did not perceive him at all until the last fraction of a second, or perhaps perceived him as a shadow. I managed to swerve. But a fraction of a second later? I am afraid to think about it...

I am a big fan of Scandinavian-style pedestrian safety reflectors. Attach one to your bag or jacket if you are walking late at night; it might save your life. But if you don't have a reflector, wear at least one piece of bright, light-colored clothing; this is particularly important if your skin color is dark!

  • joker_minmax 2 years ago

    Regardless of race, it shouldn't be on the road if it cannot detect jaywalkers wearing all black at night. Where I live, POC are more likely to be the pedestrians because it's an immigrant-heavy community. I've seen a white guy jaywalking at night wearing all black between street lights, but I couldn't tell he was white until he got across the street and I could see him head-on.

  • zephrx1111 2 years ago

    Apparently the media wanted to sell an optical problem as a racial problem.

    As the saying goes: racists are people who are thinking of race, talking about race, and acting based upon race.

Lewton 2 years ago

they're testing 8 different detection algorithms

> The detection systems were 19.67% more likely to detect adults than children, and 7.52% more likely to detect people with lighter skin tones than people with darker skin tones, according to the study.

while they all had a harder time detecting children than adults, that 7.52% figure is obtained by averaging 2 algorithms that performed abysmally with 6 that showed no statistically significant differences

https://arxiv.org/pdf/2308.02935.pdf table 6
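
For intuition about what a gap like that means statistically, here's a toy two-proportion z-test on invented detection counts. This is not the paper's data or its exact methodology, just the general shape of the comparison:

    # Toy sketch of the comparison behind a "7.52% more likely" figure:
    # miss rates for two groups plus a two-proportion z-test. The counts
    # are made up for illustration, not taken from the paper.
    from math import sqrt
    from statistics import NormalDist

    def miss_rate_gap(miss_a, n_a, miss_b, n_b):
        p_a, p_b = miss_a / n_a, miss_b / n_b
        pooled = (miss_a + miss_b) / (n_a + n_b)           # pooled proportion
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        z = (p_a - p_b) / se
        p_value = 2 * (1 - NormalDist().cdf(abs(z)))
        return p_a - p_b, p_value

    gap, p = miss_rate_gap(miss_a=300, n_a=1000, miss_b=225, n_b=1000)
    print(f"gap={gap:.2%}, p={p:.4f}")  # gap=7.50%, p~0.0001: significant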

  • strken 2 years ago

    The two with significantly worse performance were RetinaNet and YOLOX. I don't really know anything about the field, but it's interesting that they're both performant single-stage models, while the slower but lower-miss-rate RCNN variants are two-stage. It's also interesting that the pedestrian-specific models are all worse than the general models at detecting people!

    The conclusion is kind of weird: apparently their "findings reveal significant bias in the current pedestrian detectors" despite the bias being almost entirely within the single-pass general object detectors. And where it's statistically significant in the other models, the miss rate is low in both cases, and the effect is reversed! (Dry-weather Cascade-RCNN does better on dark-skin than light-skin, among others.)
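
    If you want to poke at the one-stage vs. two-stage difference yourself, torchvision ships pretrained COCO versions of both families. A minimal sketch with a placeholder image path, not the paper's pipeline:

        # Run one single-stage and one two-stage detector on the same image
        # and count "person" detections (COCO category 1).
        import torch
        from torchvision.io import read_image
        from torchvision.models.detection import (
            retinanet_resnet50_fpn, fasterrcnn_resnet50_fpn,
            RetinaNet_ResNet50_FPN_Weights, FasterRCNN_ResNet50_FPN_Weights,
        )

        img = read_image("street_scene.jpg")  # hypothetical test image

        for name, ctor, weights in [
            ("RetinaNet (one-stage)", retinanet_resnet50_fpn,
             RetinaNet_ResNet50_FPN_Weights.DEFAULT),
            ("Faster R-CNN (two-stage)", fasterrcnn_resnet50_fpn,
             FasterRCNN_ResNet50_FPN_Weights.DEFAULT),
        ]:
            model = ctor(weights=weights).eval()
            with torch.no_grad():
                out = model([weights.transforms()(img)])[0]
            persons = ((out["labels"] == 1) & (out["scores"] > 0.5)).sum().item()
            print(f"{name}: {persons} person(s) above 0.5 confidence")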

  • yorwba 2 years ago

    I think you misunderstood table 6. All algorithms show significant differences in miss rate for children, two show significant differences based on gender, and four others based on skin color. The four that showed no statistically significant difference between light and dark skin had very high miss rates overall. Of the other four, two are much worse for dark skin, and two are slightly better. Those last two are also best at detecting children, but 28% miss rate is still a bit too high for my taste.

    • Lewton 2 years ago

      Yeah I missed the two statistically significant algorithms that favor darker skin since they have smaller percentage differences than the ones they didn't mark as statistically significant (but I guess that's because of how it relates to the overall miss rate)

      RE: 28% miss rate, I think this is meaningless as it's looking at single images/data points, while self-driving cars get a continuous stream of data
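
      A quick way to see the point, under a very strong (and almost certainly false) independence assumption between frames:

          # If per-frame miss probability were 0.28 and frames independent
          # (they aren't: consecutive frames are highly correlated), the
          # chance of missing a pedestrian in every frame falls off fast.
          per_frame_miss = 0.28
          for frames in (1, 5, 10, 30):  # 30 frames ~ 1 second at 30 fps
              print(frames, per_frame_miss ** frames)
          # 1 -> 0.28, 5 -> ~0.0017, 10 -> ~3e-06, 30 -> ~3e-17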

  • techwizrd 2 years ago

    Are these pedestrian detection models in use in any widely-deployed commercial self-driving car? Is there a limitation since these are images rather than videos? I would've expected these to be addressed in the "Threats to Validity." There is also no control comparison to humans, beyond the two annotators. Are these detectors significantly worse than humans?

    There is no telling whether these results are valid or applicable at all, yet they purport that there is statistically significant unfairness based on gender and skin color. At best, this feels misleading.

zirgs 2 years ago

How do they work in winter then? You can't see much skin if someone is wearing a winter coat. Right - self-driving cars are a solution for Silicon Valley only, so they don't even bother testing those cars elsewhere.

  • mrweasel 2 years ago

    The skin color is going to be the least of your issues in the winter time. How are the cars going to "see" the road under 10cm of snow? Granted, humans shouldn't really be driving in those conditions either, but we do, and mostly successfully, to avoid sleeping on the side of the road at -10C.

    What is it that makes it so hard for all these algorithms to work on people with darker skin? This has been an issue for more than ten years; surely someone has started adding various skin colors into the training data. Is it a case of lack of training material, or is it just faster to focus on one skin type?

    • snakeyjake 2 years ago

      My employer makes a multi-band synthetic aperture radar that penetrates snow and is high-resolution enough to "see" painted road markings and reflectors beneath a layer of snow.

      It is nowhere near small or cheap enough for self-driving car applications, but will be one day.

      Another challenge is affordable real-time processing of the data. Churning through 3,200MB/s of phase-history data is expensive but again that will solve itself given time.
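
      For a sense of scale of that rate, simple arithmetic in decimal units:

          rate_mb_s = 3_200                      # quoted phase-history rate
          tb_per_hour = rate_mb_s * 3600 / 1e6   # MB/s -> TB/h
          print(tb_per_hour)                     # ~11.5 TB per hour of driving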

      • mrweasel 2 years ago

        That is pretty neat, but what about roads with no markings? None of the roads leading into the town where I live have road markings or reflectors. Or are you able to target the reflectors on posts by the side of the road as well? I mean, see through the snow on those.

      • ChatGTP 2 years ago

        Snow clearers always destroy road markers. Every summer where I live they have to be repainted. So this would suck.

        Honestly, why don’t we just install magnetic or some other type of “tracks” in roads to help cars work like trains?

    • mrkeen 2 years ago

      > This has been an issue for more than ten years

      Yes

      > Surely someone has started adding various skin colors into the training data.

      What has to occur for this to happen:

          * Someone has to take the time and effort to measure things, to identify that there is a problem.
          * They have to get that message out so that it's heard.
          * That message needs to:
              * hit the public hard enough that people demand intervention from their elected representatives
              * or, alert the company directly, and hope that the incentives align. (Will the company make more money by fixing this?)
      
      There's plenty of easier alternatives:

        * Call the problem too hard to solve
        * Call it bad science
        * Call it ragebait
        * Call it woke
        * Make up a bunch of equivalences and channel it into inertia:
          * If people are wearing winter coats, then they won't show enough skin for the cars to be racist.  And if the cars aren't racist in cold places, then it isn't a problem in warm places.
          * People don't have radar/lidar either, and they're allowed to drive
    • zirgs 2 years ago

      Yup - as someone from a country where -10 degrees is considered a normal winter day, I think that self-driving cars should not rely on road markings at all. Even road signs can be unreadable after a particularly nasty blizzard.

      I've yet to see self driving cars successfully navigating during bad winter conditions. They can't even avoid killing pedestrians in California.

  • mrkeen 2 years ago

    > A team of researchers in the UK and China tested how well eight popular pedestrian detectors worked depending on a person's race, gender, and age.

    - edit -

    Sorry, I read the article too quickly and assumed it was talking about the countries UK and China. Perhaps they only bothered testing the cars in UK, Silicon Valley and China, Silicon Valley.

boomboomsubban 2 years ago

It's strange that the paper doesn't seem to include any of the actual data, but it is available on their github page https://github.com/FairnessResearch/Fairness-Testing-of-Auto...

From what I can see, a couple of the detectors used really seem shit overall, making the combined data of questionable value.

hermannj314 2 years ago

"A new technology reduces mortality risk for all people, but has slightly better outcomes for white adults."

Conclusion - we call on lawmakers to make this technology illegal. We prefer that more people die at equal rates than that fewer people die at unequal rates.

I am not sure I agree with the ethics that underlies this way of seeing the world.

  • asddubs 2 years ago

    What if the technology was only/primarily tested with white people, rather than inherently having better outcomes for white adults? It's not really as clear cut as you make it out to be, technologies at this level of complexity aren't just derived from physical/biological principles. Perhaps there was a better variant of the technology that was scrapped as a cost-cutting measure, because it performed the same for white adults, but better for other classes of people (alluding to radar here, though I'm not sure it really performed the same, but I'm trying to make a larger point than any specific technology anyway).

    • hermannj314 2 years ago

      Fair point. Cost benefit analysis for adoption of safety features in automobiles is inconsistent.

      Drunk drivers kill more Americans every few months than terrorists on planes have in the last 25 years. Yet every airline passenger must prove they aren't a terrorist, while no one driving a car faces a default presumption of being drunk. Unless you've been convicted previously; then maybe, sometimes.

      I wouldn't be surprised if a better model exists for object detection and we aren't using it to save pennies. The politics and ethics of automobile safety are asinine. Fair point.

    • mrweasel 2 years ago

      > What if the technology was only/primarily tested with white people, rather than inherently having better outcomes for white adults?

      I think we have a precedent for that in the testing of drugs. The majority of drugs are primarily tested on white men, meaning that their effects and dosages may be problematic for women or people of color.

      There's also the issue of the majority of tools being designed for right-handed people; anyone left-handed either needs to spend more on tools or accept a certain risk when operating a chainsaw.

  • candiddevmike 2 years ago

    It should be illegal; the current systems will never deliver without additional object-detection methods (radar, lidar, etc.). Get this shit recalled and back to the drawing board.

    • luma 2 years ago

      Do you have lidar and radar when you drive? Should it be illegal for you to drive without those things?

      • notamy 2 years ago

        What is it with people on this website (not specifically the user I’m replying to) and going with "the computer vaguely does the thing the human does, literally what's the difference???" (or similar) as an argument/gotcha/etc. around ML/AI/self-driving/...? It’s painfully obvious that the computer and the human are different systems that process information differently and therefore likely need different sets of inputs for optimal function.

        • matwood 2 years ago

          I agree that they are different, but reading the comment you responded to charitably, it's pointing out that the bar often seems unreasonably high for self-driving. Human drivers are terrible, leading to ~100 deaths/day in the US. How much better does self-driving need to be in order to be worthwhile? Does it have to be perfect?

          • hermannj314 2 years ago

            The government mandates adding rear-view cameras to all cars at a cost of something like $1 million per life saved.

            There is no rational basis for vehicle safety standards and the outcomes (deaths per passenger mile) show that we are doing a horrible job at it.

            Regulating the addition of LIDAR-based object detection to all vehicles will have no basis in safety until domestic manufacturers can't compete with imports and need another expensive component to tack onto every car to narrow the price gap created by their inability to compete globally. Then there will be a PR campaign, funded by Detroit, about how necessary it is for "safety" to add LIDAR or whatever to every car.

        • luma 2 years ago

          It’s a counterargument by example. Very Serious Scientists argued that heavier-than-air flight was impossible while birds were flying outside their window. Likewise, I am 100% confident that it is possible in theory to drive a car without the assistance of LIDAR or RADAR.

          Does this mean current systems require this or that? Nope, but claiming that all systems must be equipped with a certain technology seems short-sighted, particularly when the proven counterexample is putting a few million miles on the road every day.

      • jhot 2 years ago

        Why guess depth from images when you can know depth?
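
        For what "knowing depth" buys you: lidar measures range directly, while a camera rig has to infer it, e.g. from stereo disparity, where errors blow up exactly for distant objects. A toy example with assumed camera parameters:

            # depth = focal_length * baseline / disparity (pinhole stereo)
            focal_px, baseline_m = 1000.0, 0.3   # assumed rig parameters
            for disparity_px in (50, 10, 2):     # smaller = farther away
                print(disparity_px, focal_px * baseline_m / disparity_px, "m")
            # 50px -> 6m, 10px -> 30m, 2px -> 150m; a 1px error at 2px
            # disparity shifts the estimate by tens of metres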

    • elcano 2 years ago

      I wouldn't say 'never'. But the right thing to do is to keep the training wheels (radar, lidar) until it's proven that the system has learned to drive without them when there's good visibility, and to reduce speed (or stop, if necessary) when visibility is reduced. This also includes learning to recognize almost every possible situation, including zero-shot events.

HunterWare 2 years ago

In other news, it turns out that detecting smaller and lower-contrast objects is harder with optical sensors. Almost, you know, like how it is with real people.

Eddy_Viscosity2 2 years ago

It won't be long till they won't just be detecting people, but identifying specifically which people they are. Think of the data! All those cars logging time/location for all those people. (Which, of course, will only be used for good and the occasional targeted ad.)

  • solarkraft 2 years ago

    Buddy, you're commenting on an article showing that cars aren't even that good at identifying people as people.

    • pie420 2 years ago

      Hey buddy, you realize things improve, right? Especially novel software that has improved tremendously over the past 10 years. Are you actually suggesting that it will suddenly stop improving today? Weird.

  • candiddevmike 2 years ago

    I think what you're trying to say is cars will start identifying folks and logging their locations. Cell tracking/triangulation data is already available from cell companies. Self-driving cars aren't going to make things worse.

    • Eddy_Viscosity2 2 years ago

      Optimistic to think that having more ways to track people won't make things worse.

    • helmette 2 years ago

      Carrying a mobile phone is optional.

      • ben_w 2 years ago

        And visual detection is more precise. You can tell not only that someone is roughly within a few tens of metres, as you can with cellphone data, but exactly where they are, what they're doing, and who/what they're looking at.

  • gizajob 2 years ago

    Urgh

sbuttgereit 2 years ago

This seems like a pretty poor article on a pretty poor research subject.

The way I read it is something like this...

Some researchers got their hands on software that purports to do similar stuff to what self-driving cars might also do, but crucially isn't the same as what the cars actually use, and then extrapolated the results into the headline-like title of the research paper: "Dark-Skin Individuals Are at More Risk on the Street: Unmasking Fairness Issues of Autonomous Driving Systems". That's justified, isn't it? After all, all software in a category is more or less the same program, and the car companies' software and their research-subject software all run on computers? Right? Must be valid... clearly you can make factual assertions about computer systems and software from that kind of extrapolation.

Then some bright-eyed-bushy-tailed reporter comes along and applies the criticality of the typical college educated/professional journalist, which is to say they carefully considered the headline they could write, but otherwise just took the word of the researchers that something resembling knowledge was actually gained by the study. News is delivered! Job done!

Look, sarcasm aside, could I have read/understood things incorrectly? Sure... I'm not an expert in this field. Could this be a problem in production-used-in-the-real-world pedestrian detection systems? Sure. But insofar as I can tell, the best the paper could be telling us is that racial bias in pedestrian detection systems is a viable possibility: not the assertion that "Dark-Skin Individuals Are at More Risk on the Street". It might be true, but I don't think these researchers know that any better than I do. Of course, "Dark-Skin Individuals Could Be at More Risk on the Street" isn't nearly so catchy or attention-grabbing, is it?

And who knows... maybe this research team should pick up the search for low temperature/low pressure super-conductivity... sounds like they have the right temperament.

steveBK123 2 years ago

To be fair, human drivers would likely have the same problem given seating position in cars & certain lighting conditions.

  • skinkestek 2 years ago

    Can confirm to some degree.

    I had a dark-skinned kid in dark clothes casually crossing the road in front of me during the dark months here where I live.

    I didn't have to slam the brakes or anything because he was a bit ahead of me, but it was scary because of how hard he was to detect.

    • amluto 2 years ago

      To be fair, in cool or cold weather, a pedestrian can easily have little or no visible skin — long sleeves, hair, and possibly a hat of any sort can hide skin very well.

      Pedestrians (and cyclists) should wear bright clothing and/or retroreflectors at night! (Ideally both. Retroreflectors are mostly useless if a car has its lights off.)

      • skinkestek 2 years ago

        Around here you get a fine if you drive with the lights off, even in the summer.

        • steveBK123 2 years ago

          Taking your life in your hands (biking in dark clothing, especially at night) because someone is going to get a fine for having their lights off after they kill you is not a good risk calculation.

GaggiX 2 years ago

>This is due to bias in open-source AI, on which self-driving cars rely, researchers say.

Or they are simply less visible.

  • cj 2 years ago

    What's "less visible" to the human eye shouldn't be used to lower our expectations of what can be visible to a machine.

    • GaggiX 2 years ago

      Well, if it's using optical recognition, then a lower-contrast or smaller object will always make the job more difficult than a bigger object with more contrast; that's the reason the model is better at recognizing white adults. Even if the car never actually strikes a child or a black person, the model's recognition would still be better for white adults.

    • willcipriano 2 years ago

      Don't let physics get in the way of your ideals.

  • elcano 2 years ago

    You mean, to all possible sensors and spectrum signals in the arsenal?

    • lsaferite 2 years ago

      Pretty sure they are discussing 3-band cameras between 400nm and 700nm (roughly). If we start adding multi-spectral or hyper-spectral cameras to the mix then you'd have a lot more information to use, but then you'd have a lot more information to process.

  • gcanyon 2 years ago

    > This is due to bias in open-source AI, on which self-driving cars rely, researchers say.

    The second line in the article

    • DennisP 2 years ago

      Yes, but by "bias" they just mean the miss rate is different. Digging into the study, they talk about visibility and contrast, and they checked whether the problem is worse at night (it is). They don't appear to be suggesting that the algorithms are prejudiced against darker-skinned people.

    • hardware2win 2 years ago

      How do they know?

haunter 2 years ago

The abstract sounds a bit different from the (rage-baiting) article:

>bias towards dark-skin pedestrians increases significantly under scenarios of low contrast and low brightness

https://arxiv.org/pdf/2308.02935.pdf

endymi0n 2 years ago

In other words: "Objects with less surface area and lower albedo are less reliably picked up by visual deep neural nets. We haven't benchmarked this against human drivers."

im3w1l 2 years ago

The researchers supposedly used a similar, but not quite the same, approach as Tesla, and claimed it worked worse for people of color. That makes sense, since optical recognition is harder for dark-skinned people.

However, the article leads with a picture of a Cruise car, which uses lidar technology. Lidar should, afaik, recognize people with the same accuracy regardless of skin color.

  • seanmcdirmid 2 years ago

    I thought Waymo and Cruise both use optical and LiDAR (with fancy sensor-fusion algorithms to make them work safely together), so this would impact them as well? LiDAR alone isn't going to be able to tell whether something is a pedestrian, just that something is there; optical is still needed to determine that the thing is a person.

  • joe__f 2 years ago

    Also, children are small

elif 2 years ago

They should have controlled with a person painted blue or purple to determine whether it is simply a matter of photonics rather than AI bias.

gcanyon 2 years ago

Anyone know how likely it is that this is the result of imbalanced training data? If you have fewer dark-skinned people and children in the training data, you end up with a model less skilled at detecting those people.
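
If imbalance is the cause, one standard mitigation is to reweight sampling during training so under-represented groups are seen more often. A minimal sketch, assuming you had per-image group annotations (most pedestrian datasets don't ship with them, which is part of the problem):

    # Inverse-frequency sampling weights for hypothetical group labels.
    from collections import Counter
    from torch.utils.data import WeightedRandomSampler

    group_labels = ["light"] * 800 + ["dark"] * 150 + ["child"] * 50  # assumed mix
    counts = Counter(group_labels)
    weights = [1.0 / counts[g] for g in group_labels]

    sampler = WeightedRandomSampler(weights, num_samples=len(weights),
                                    replacement=True)
    # pass sampler=sampler to a DataLoader: each group then gets drawn
    # roughly equally often per epoch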

  • causi 2 years ago

    You'd have to compare the machine's performance to real human performance. I suspect humans also have an uneven detection rate between those categories.

Horffupolde 2 years ago

If skin colors differ, how can you possibly expect to obtain the same result?

epivosism 2 years ago

One good bit of news is that I can confidently predict that at least they don't follow the sizeism of our society, and are likely able to detect larger people more easily than skinnier ones.

lemper 2 years ago

as much as I dislike "AI" and its adjacent topics on HN, I think this can be solved by the companies that have a stake in getting data from Asian and African nations. I don't know, pay some group of people to drive around Cape Town, Bengaluru, Shanghai, Shaanxi, Jakarta, and whichever places have a lot of "PoC" and kids.

archo 2 years ago

https://archive.is/8rf7Z

bigfryo 2 years ago

It's almost as if the media is looking for racism in everything
