Show HN: The remaining unregistered one-word .com domains
randomdotcom.com
We made this as part of a weekend hack / side-project. We originally thought that the only words left would be mega long or ones that no one would recognize, but surprisingly there are some quite good ones!
As it stands, there are around 1.8k left.
Interestingly, since we showed people on Reddit yesterday, around 100 domains that were on the list have been taken.
Could we get a list on the site of those that are taken? Would be cool to see what people actually buy!
This is a great idea :) thanks
Just added this!
Very cool, some hilarious ones left. Kind of like the "land" no one wants.
Yeah. A side benefit of this is that I'm extending my vocabulary!
I just bought three new domains and now I feel dirty. Hope I don't get buyer's remorse :)
Cool app
Is it using a US English dictionary, or does it have both UK and US English words?
What'd you find that you bought yourself?
For those that are curious, here's ~1400 of them:
https://gist.github.com/theycallmeswift/6e479d61b772543aa613
Did you just DDoS them to get the list? Why is there no downvote button here?
Please respect others' work. Writing a scraper isn't cool anymore!
Nope, they actually have a public API.
Sounds like you might want to check out the FAQ for answer to your other question. https://news.ycombinator.com/newsfaq.html
Considered putting in a mechanism to limit the number of words per person but decided not to. Oh well, it's the Internet, what to do eh :)
...and no one wants "benumbs.com" :)
The majority of them seem to be verbs.
Cool. Some of the words are very exotic, but some are surprisingly common. I wonder if there's an index with a "commonality rating" that could be used to sort or weight the randomiser, so that more common words come up first?
WordNet has this. You can see it in action here [1] but you have to enable "Show Frequency Counts" in the pulldown. It has frequency counts for each "sense" of a word, not just the word.
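For illustration, here's a minimal sketch of how a frequency-weighted pick could work (TypeScript; the frequencies map is a made-up stand-in for WordNet-style counts, not the site's actual data):

    // Weight the random pick by a word-frequency score so common words
    // surface more often. The frequency values below are placeholders;
    // the words come from elsewhere in this thread.
    const frequencies: Record<string, number> = {
      benumbs: 2,
      gangrening: 1,
      teargassing: 5,
    };

    function weightedRandomWord(freq: Record<string, number>): string {
      const entries = Object.entries(freq);
      const total = entries.reduce((sum, [, f]) => sum + f, 0);
      let r = Math.random() * total;
      for (const [word, f] of entries) {
        r -= f;
        if (r <= 0) return word;
      }
      return entries[entries.length - 1][0]; // guard against float rounding
    }

    console.log(weightedRandomWord(frequencies));

Words with higher counts get picked proportionally more often, so the exotic ones still show up, just less frequently.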
Here's where I pimp my side-project, which has a frequency score for words.
Very nice. What did you base the data on? A third-party source, or did you compile stats from a corpus of random texts yourself?
I love that the first suggestion it made to me was "gangrening.com"
Publish the full list somewhere, maybe? It's trivial to set up a Node app to pull all 1.8k unique words from /shortname/domain, so you're at risk of getting unintentionally DDoSed this way.
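Something along these lines would do it, which is why publishing the list is the kinder option (a rough TypeScript/Node sketch; the base URL and JSON shape are assumptions, not the site's documented API):

    // Keep requesting random words until no new ones appear for a while.
    // This coupon-collector loop is exactly what would hammer the server.
    // Requires Node 18+ for the built-in fetch.
    async function collectWords(endpoint: string): Promise<Set<string>> {
      const words = new Set<string>();
      let misses = 0;
      while (misses < 500) {
        const res = await fetch(endpoint);
        const { domain } = (await res.json()) as { domain: string };
        if (words.has(domain)) {
          misses++;
        } else {
          words.add(domain);
          misses = 0;
        }
      }
      return words;
    }

    collectWords("https://example.com/shortname/domain")
      .then((w) => console.log(`collected ${w.size} unique words`));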
Can't believe teargassing.com isn't taken yet.
This rocks! There are still some great domain names. A huge time saver.
Is this open sourced?
This isn't, no. It's based on/powered by our main project www.domcomp.com - a domain price comparison and availability checking tool. However, we are considering whether we can split some bits off to be open sourced :)