Show HN: TextBlob API Demo delivered inside docker
Demo: textblob-api-1743413701.us-east-1.elb.amazonaws.com

@jduckles started a competition on Twitter: https://twitter.com/jduckles/status/372151101581033472 Any competitors? :)
Running with Docker 0.6.1 on EC2 Ubuntu instances behind an ELB. Pretty cool docker.io use case: docker run sguignot/textblob-api-server
Uhh, is there really any point to using docker for this other than bragging rights?
IMO Docker is made to deliver plug'n'play apps/components, and I think a sentiment analysis API is a good example. I hope it creates a virtuous loop: easier to install => more people playing with it => more feedback => easier for devs to improve the sentiment analysis => new versions shipped => docker pull. Your thoughts? Any feedback on the analysis quality?
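For anyone wondering what the API actually computes: TextBlob's default analyzer is lexicon-based (it uses the pattern library) and returns a polarity score in [-1, 1]. Here's a toy sketch of the general idea, just to make the discussion concrete. This is not TextBlob's actual implementation, and the word scores are made up for the example:

```python
# Toy lexicon-based polarity scorer -- illustrative only, NOT TextBlob's
# real code. The lexicon entries and scores below are invented.
LEXICON = {
    "great": 0.8, "good": 0.7, "cool": 0.4,
    "bad": -0.7, "terrible": -0.9, "overkill": -0.3,
}

def polarity(text):
    """Average the scores of known words; words not in the lexicon are ignored."""
    scores = [LEXICON[w] for w in text.lower().split() if w in LEXICON]
    return sum(scores) / len(scores) if scores else 0.0

print(polarity("docker is great"))  # positive (0.8)
print(polarity("a bit overkill"))   # negative (-0.3)
```

The real analyzer is obviously smarter (it handles modifiers like "very" and negations like "not good"), but the core idea of averaging per-word scores is the same family of technique.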
I don't really know anything about NLP, so I can't comment on the analysis quality. What I meant to say is that a full-on Linux container seems a bit overkill for distributing what is essentially just a python app.
If I just wanted to run an instance of this locally to play around with, I would probably want to git clone it, make a virtualenv, run "pip install -r requirements.txt" or "python setup.py develop" to grab the dependencies, and then run some python script which starts up the web server.
And if I wanted to deploy the app to EC2, I would probably just grab an AMI and launch an instance off of that.
And if you couldn't find the right AMI or if you wanted to use a different provider, you could use Docker.
This is exactly why Docker containers are becoming popular.