Detect explicit content in images from Google Drive
About
Every time a new file is uploaded to your Google Drive folder, Make will automatically upload that file to Eden AI to detect explicit content in the image.

Created by [email protected]
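As a rough illustration of what this scenario does behind the scenes, the sketch below sends an image file (for example, one just retrieved from the watched Google Drive folder) to Eden AI's explicit-content detection API. The endpoint URL, the providers and file request fields, and the response shape are assumptions based on Eden AI's public REST API and may not match exactly what the Make module uses internally.

```python
# Hypothetical sketch: post an image to Eden AI's explicit-content endpoint.
# Endpoint path, field names, and response keys are assumptions, not the
# Make module's internal implementation.
import requests

EDEN_AI_API_KEY = "your-api-key"  # placeholder credential
URL = "https://api.edenai.run/v2/image/explicit_content"  # assumed endpoint


def detect_explicit_content(image_path: str, provider: str = "google") -> dict:
    """Upload an image and return the raw explicit-content analysis."""
    with open(image_path, "rb") as image_file:
        response = requests.post(
            URL,
            headers={"Authorization": f"Bearer {EDEN_AI_API_KEY}"},
            data={"providers": provider},   # assumed request field
            files={"file": image_file},     # assumed request field
        )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    result = detect_explicit_content("new_drive_upload.jpg")
    print(result)  # per-provider scores (nudity, violence, etc.) -- assumed shape
```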
Create your own workflow
Create custom workflows by choosing triggers, actions, and searches. A trigger is an event that launches the workflow; an action is an event the workflow performs after it is triggered.
Adds a face to the faces database.
Analyzes the syntactic structure of a text.
Anonymizes a document by removing personally identifiable information.
Anonymizes an image by blurring sensitive parts (faces, license plates, etc.).
Anonymizes a text by removing names, addresses, etc.
Provides an answer to a question based on a text.
Provides an answer to a question based on a video.
Generates an answer to a question based on the information present in an image.
Sends a query to a large language model (LLM) and retrieves a response.