Zeblok Computational
1.2.7
Ai-APIᵀᴹ Engine

Introduction

Introduction to Ai-APIᵀᴹ


Overview

Ai-API™ makes moving trained ML models to production easy:

  • Package models trained with any supported ML framework and containerize the model server for production deployment (see the packaging sketch after this list)

  • Deploy anywhere, as online API serving endpoints or offline batch inference jobs

  • High-Performance API model server with adaptive micro-batching support

  • The Ai-API™ server handles high-volume traffic without crashing and supports multi-model inference, API server Dockerization, a built-in Prometheus metrics endpoint, a Swagger/OpenAPI endpoint for API client library generation, serverless endpoint deployment, and more

  • Central hub for managing models and the deployment process via a web UI and APIs

  • Supports various ML frameworks including:

Scikit-Learn, PyTorch, TensorFlow 2.0, Keras, FastAI v1 & v2, XGBoost, H2O, ONNX, Gluon and more

  • Supports API input data types including:

DataframeInput, JsonInput, TfTensorflowInput, ImageInput, FileInput, MultifileInput, StringInput, AnnotatedImageInput and more

  • Supports API output adapters including:

BaseOutputAdapter, DefaultOutput, DataframeOutput, TfTensorOutput and JsonOutput
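
The input and output adapter names above follow the classic BentoML packaging convention, so a packaged model service will typically resemble the hedged sketch below. The class name, artifact name, and import paths are illustrative assumptions rather than the exact Ai-API™ interface; see the Build Models and SDK sections for the authoritative API.

```python
# Illustrative sketch only: assumes a BentoML-style packaging API, which the
# adapter names listed above suggest. Actual Ai-API(TM) import paths may differ.
import bentoml
from bentoml.adapters import DataframeInput, JsonOutput
from bentoml.frameworks.sklearn import SklearnModelArtifact


@bentoml.env(infer_pip_packages=True)                 # capture pip dependencies for the container image
@bentoml.artifacts([SklearnModelArtifact("model")])   # "model" is a hypothetical artifact name
class IrisClassifier(bentoml.BentoService):
    """Model server exposing a single /predict endpoint."""

    @bentoml.api(input=DataframeInput(), output=JsonOutput(), batch=True)
    def predict(self, df):
        # batch=True enables adaptive micro-batching: `df` may contain several queued requests.
        return self.artifacts.model.predict(df).tolist()


if __name__ == "__main__":
    from sklearn import datasets, svm

    X, y = datasets.load_iris(return_X_y=True)
    clf = svm.SVC(gamma="scale").fit(X, y)

    service = IrisClassifier()
    service.pack("model", clf)   # bundle the trained model with the service
    print(service.save())        # persist the packaged service for containerization and deployment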

Easy steps to Ai-API™ Deployment

  1. Select your notebook
  2. Build your model
  3. Deploy
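
Once a model is deployed, the resulting endpoint can be called like any HTTP API. The sketch below is a hypothetical client call: the host name and the /predict and /metrics routes are placeholder assumptions and will depend on how your deployment is configured.

```python
# Hypothetical client call to a deployed Ai-API(TM) endpoint.
# The URL and routes are placeholders; use the endpoint shown for your deployment.
import requests

payload = [[5.1, 3.5, 1.4, 0.2]]  # one row of features, matching a DataframeInput API

response = requests.post(
    "http://<your-deployment-host>/predict",  # placeholder endpoint
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json())

# The built-in Prometheus metrics are typically exposed on a separate route, e.g.:
# requests.get("http://<your-deployment-host>/metrics")
```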