Show HN: Python CLI for Compiling Jupyter Notebooks into FastAPI Apps

docs.neutrinolabs.dev

5 points by ricardoagzz 2 years ago · 4 comments

Hi HN!

I recently built Neutrino Notebooks, an open-source Python library for compiling Jupyter notebooks into FastAPI apps.

I work with notebooks a ton and typically find myself refactoring notebook code into a backend or some other Python script, so I made this to streamline that process.

In short, it lets you:

- Expose cells as HTTP or WebSocket endpoints with comment directives like `@HTTP` and `@WS` (see the sketch after this list)

- Periodically run cells as scheduled tasks for simple data pipelines with `@SCHEDULE`

- Get automatic routing based on filename and directory structure, similar to Next.js

- Ignore sandbox files by naming them `_sandbox`
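
To make the cell directives concrete, here's a minimal sketch of what an exposed cell might look like. Only the `@HTTP` marker comes from this post; the route/body declaration syntax and the `add` handler are illustrative guesses, so check the linked docs for the exact format.

```python
# Sketch of a notebook cell exposed as an endpoint via a comment directive.
# The @HTTP marker is from the post; the route/body syntax below is a guess.
"""
@HTTP
POST /api/add
body: x: int, y: int
"""
def add(x: int, y: int) -> dict:
    # The compile step would turn this cell into a FastAPI route handler.
    return {"result": x + y}
```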

You can compile your notebooks, which creates a `/build` folder with a Dockerized FastAPI app for local testing or deployment.

GitHub repo: https://github.com/neutrino-ai/neutrino-notebooks

Excited for feedback from the HN community!

TechBro8615 2 years ago

This is a really cool idea. I love the simplicity of annotating cells and "compiling" a serverless HTTP application. In fact this idea of "compiling" the app might be the real novelty here. I suspect the adoption of the project will depend on how seamless you can make compilation and deployment for specific use cases.

Have you considered combining this idea with LLMs and templated prompts?

I'd also encourage you to focus your efforts on creating a smooth experience for "deploying" the resultant application to arbitrary platforms e.g. Fly.io, Render, Google Cloud Run, etc. (I would suggest Cloudflare Workers as well, but as far as I know that's not a platform that supports Python. So maybe it's also worth thinking about how to adapt the same idea to TypeScript.)

  • ricardoagzz (OP) 2 years ago

    Thanks for the feedback!

    I wrote an example showing how you can use the scheduled cells to build a simple pipeline that scrapes Wikipedia's daily featured article, chunks it, and stores it in a vector DB, and then use the HTTP cells to expose an endpoint that calls GPT with RAG over that vector DB (https://docs.neutrinolabs.dev/example-up-to-date-chatbot). Any specific use cases you had in mind?
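
    For readers following along, here is a rough, self-contained sketch of that two-cell shape. Only the `@SCHEDULE`/`@HTTP` markers come from this thread; the declaration syntax is a guess, and the scraper, chunker, storage, and GPT call are stubbed placeholders rather than real Neutrino or OpenAI APIs.

    ```python
    # Hypothetical sketch of the scrape-then-serve pipeline described above.
    # Every helper below is a stub so the two-cell shape stays runnable.
    _store: list[str] = []  # toy in-memory stand-in for a vector DB

    def fetch_featured_article() -> str:
        return "..."  # stub: scrape Wikipedia's daily featured article

    def chunk(text: str, size: int = 500) -> list[str]:
        # Naive fixed-width chunking; a real pipeline would embed each chunk.
        return [text[i:i + size] for i in range(0, len(text), size)]

    def ask_gpt(question: str, context: list[str]) -> str:
        return "..."  # stub: call GPT with the retrieved context (RAG)

    """
    @SCHEDULE
    interval: 24h
    """
    def ingest_featured_article() -> None:
        for piece in chunk(fetch_featured_article()):
            _store.append(piece)  # a real version would upsert into the vector DB

    """
    @HTTP
    POST /api/chat
    body: question: str
    """
    def chat(question: str) -> dict:
        context = _store[:5]  # stub retrieval; real version: similarity search
        return {"answer": ask_gpt(question, context)}
    ```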

    And I agree, the deployment point is super important; it's in the works!

Rutvikrau 2 years ago

Have been using notebooks as a testing ground for all my Python-related code, and it seems interesting to be able to transfer this over to an API.

Would this be something that can be moved to production as well?

  • ricardoagzz (OP) 2 years ago

    Of course. We're working on better support for deployment soon, but since it's just a FastAPI app with a Dockerfile, you could deploy it anywhere. It also depends on your use case, though; for example, if you have auth requirements you could add a middleware to your endpoints to verify a JWT, or call the endpoints from another backend service.
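
    To illustrate that middleware point, here is a minimal sketch of JWT verification on a FastAPI app. It assumes the PyJWT package; the secret, algorithm, and route are placeholders, not part of Neutrino's generated build.

    ```python
    # Minimal sketch: verify a JWT on every request via FastAPI middleware.
    # Assumes PyJWT (pip install pyjwt); secret and route are placeholders.
    import jwt
    from fastapi import FastAPI, Request
    from fastapi.responses import JSONResponse

    app = FastAPI()
    SECRET = "change-me"  # placeholder; load from env or a secrets manager

    @app.middleware("http")
    async def verify_jwt(request: Request, call_next):
        auth = request.headers.get("Authorization", "")
        token = auth.removeprefix("Bearer ").strip()
        try:
            # Raises if the signature is invalid or the token has expired.
            request.state.claims = jwt.decode(token, SECRET, algorithms=["HS256"])
        except jwt.PyJWTError:
            return JSONResponse({"detail": "invalid token"}, status_code=401)
        return await call_next(request)

    @app.get("/ping")
    def ping():
        return {"ok": True}
    ```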
