This model can be used to remove the background from an image or to create a segmentation mask for an object. It takes an input image and produces two output images:
- The first output image is the original image with the background removed
- The second output image contains the mask of the foreground object
First, clone this repository:

```sh
git clone https://github.com/basetenlabs/truss-examples/
cd dis-segmentation
```
Before deployment:

- Make sure you have a Baseten account and API key.
- Install the latest version of Truss:

```sh
pip install --upgrade truss
```
With `dis-segmentation` as your working directory, you can deploy the model with:

```sh
truss push
```

Paste your Baseten API key if prompted.

For more information, see the Truss documentation.
The model has only one input:

- `input_image` (required): an image represented as a base64 string.
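If you want to construct the base64 string yourself, here is a minimal sketch assuming a local PNG file at a placeholder path; the full client example below builds the same string from a PIL image instead:

```python
import base64

# Read a local image file and encode it as a base64 string
# ("path/to/image.png" is a placeholder path).
with open("path/to/image.png", "rb") as f:
    input_image = base64.b64encode(f.read()).decode("utf-8")
```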
The output of the model contains two images, each returned as a base64 string. Example model output:

```json
{
    "img_without_bg": "Base64 string image",
    "image_mask": "Base64 string image"
}
```
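As a quick sketch of how to consume this output without PIL (assuming the response has already been parsed into a dict named `output`), each base64 string can be decoded and written straight to disk:

```python
import base64

# The returned string may carry a data URL prefix; strip it before decoding
# (`output` is assumed to be the parsed JSON response shown above).
b64_str = output["img_without_bg"].replace("data:image/png;base64,", "")
with open("image_without_background.png", "wb") as f:
    f.write(base64.b64decode(b64_str))
```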
You can invoke the model in Python like so:

```python
import base64
from io import BytesIO

import requests
from PIL import Image

# Set essential values
model_id = ""
baseten_api_key = ""
BASE64_PREAMBLE = "data:image/png;base64,"

def pil_to_b64(pil_img):
    # Encode a PIL image as a base64 PNG string
    buffered = BytesIO()
    pil_img.save(buffered, format="PNG")
    return base64.b64encode(buffered.getvalue()).decode("utf-8")

def b64_to_pil(b64_str):
    # Decode a base64 string (with or without the data URL prefix) into a PIL image
    return Image.open(BytesIO(base64.b64decode(b64_str.replace(BASE64_PREAMBLE, ""))))

# Call model endpoint
res = requests.post(
    f"https://model-{model_id}.api.baseten.co/development/predict",
    headers={"Authorization": f"Api-Key {baseten_api_key}"},
    json={"input_image": pil_to_b64(Image.open("path/to/image.png"))},
)

# Get output images
output = res.json()
background_removed = b64_to_pil(output.get("img_without_bg"))
mask = b64_to_pil(output.get("image_mask"))
background_removed.save("image_without_background.png")
mask.save("image_mask.png")
```
Here is an example of the outputs given the following input.
Input image:
Image without background:
Image mask: