stevenschmatz/clarifai-ios-nsfw-detector: Upload images from your user library and see if they're NSFW or not. And if they are, it sends them to me. :smiling_imp:


In this project, the user uploads an image and receives an NSFW/SFW classification along with its likelihood. If the probability of NSFW is greater than 85%, the image is classified as NSFW.
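The decision rule above can be sketched in Swift. This is a minimal illustration, not the app's actual code: the function name and parameter are hypothetical, while the 0.85 threshold comes from the description above.

```swift
import Foundation

// Hypothetical helper mirroring the decision rule described above:
// an image is flagged NSFW when the model's NSFW probability
// exceeds 0.85. Names here are assumptions, not the app's API.
func isNSFW(nsfwProbability: Double, threshold: Double = 0.85) -> Bool {
    return nsfwProbability > threshold
}
```

For example, `isNSFW(nsfwProbability: 0.9)` returns `true`, while `isNSFW(nsfwProbability: 0.5)` returns `false`.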

For more information about the awesome Clarifai API, check out developer.clarifai.com!

Building and Running

To build this project, you need Xcode 7 and CocoaPods. To build and run:

  1. Install dependencies and generate the workspace.
pod install
  2. Open the workspace in Xcode:
open ClarifaiApiDemo.xcworkspace
  3. Go to developer.clarifai.com/applications, click on your application, then copy the "Client ID" and "Client Secret" values (if you don't already have an account or application, you'll need to create them first).

     Replace the values of clarifaiClientID and clarifaiClientSecret in Credentials.swift with the ones you copied.

  4. Press the "Play" button in the toolbar to build, install, and run the app.
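After completing the steps above, Credentials.swift should look roughly like the sketch below. The variable names come from the instructions; the placeholder values are stand-ins for the ones you copied from your Clarifai application page.

```swift
// Credentials.swift (sketch) — replace the placeholder strings with the
// "Client ID" and "Client Secret" copied from your application at
// developer.clarifai.com/applications.
let clarifaiClientID = "YOUR_CLIENT_ID"
let clarifaiClientSecret = "YOUR_CLIENT_SECRET"
```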