Phi-3 Model Serverless on Azure
Setting Up a Phi-3 Model as a Serverless API with Azure AI Studio
Created Jul 25, 2024 - Last updated: Jul 25, 2024
Growing 🌿
AI
Azure
SLM
Introduction
The Phi-3 family of small language models (SLMs) offers powerful, cost-effective options for a wide range of AI applications. In this blog, we’ll walk through deploying a Phi-3 model as a serverless API from the Azure AI Studio model catalog and share sample Python code to invoke the deployed endpoint. We’ll also cover how to set up Python on your laptop to run the code.
Prerequisites
- An Azure account.
- Access to Azure AI Studio.
- Basic knowledge of Python.
Step 1: Deploying a Phi-3 Model as a Serverless API in Azure AI Studio
- Sign in to Azure AI Studio: Go to the Azure AI Studio and sign in with your Azure account.
- Select Model Catalog: From the left sidebar, select “Model catalog”.
- Choose Phi-3 Model: Search for and select the Phi-3 model you want to deploy (e.g., Phi-3-mini-4k-Instruct).
- Deploy Model: Click on “Deploy” and choose the “Serverless API” option to open the deployment window.
- Configure Deployment: Follow the prompts to configure your deployment settings and complete the deployment process.
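If you prefer to script the deployment instead of clicking through the portal, here is a minimal sketch using the azure-ai-ml SDK (`pip install azure-ai-ml azure-identity`). It assumes the SDK’s serverless endpoint support; the subscription, resource group, project, and endpoint names are placeholders, and the model ID follows the Azure ML registry convention for the Phi-3 catalog entry.

```python
# Sketch: create a serverless (pay-as-you-go) endpoint for Phi-3 with the azure-ai-ml SDK.
# All names below are placeholders — replace them with your own values.
from azure.ai.ml import MLClient
from azure.ai.ml.entities import ServerlessEndpoint
from azure.identity import DefaultAzureCredential

# Connect to your Azure AI Studio project (an Azure ML workspace under the hood).
ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<your-subscription-id>",
    resource_group_name="<your-resource-group>",
    workspace_name="<your-ai-studio-project>",
)

# Reference the Phi-3 model from the Azure ML model catalog registry.
model_id = "azureml://registries/azureml/models/Phi-3-mini-4k-instruct/labels/latest"

# Create the serverless endpoint and wait for the operation to finish.
endpoint = ServerlessEndpoint(name="phi-3-mini-serverless", model_id=model_id)
created = ml_client.serverless_endpoints.begin_create_or_update(endpoint).result()

# Retrieve the scoring URI and API key for later use.
keys = ml_client.serverless_endpoints.get_keys(created.name)
print("Endpoint URI:", created.scoring_uri)
print("Primary key:", keys.primary_key)
```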
Step 2: Sample Python Code to Invoke the Phi-3 Model
Once the model is deployed, you can use Python code like the sketch below to invoke it:
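This is a minimal sketch that assumes the azure-ai-inference package (`pip install azure-ai-inference`). The endpoint URL and API key come from the deployment’s details page in Azure AI Studio; the environment variable names used here are just placeholders for this example.

```python
# Sketch: call the deployed Phi-3 serverless endpoint with azure-ai-inference.
import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

# Read the endpoint URL and key from environment variables instead of hard-coding them.
# (Variable names are illustrative — set them to the values shown for your deployment.)
endpoint = os.environ["AZURE_INFERENCE_ENDPOINT"]  # e.g. https://<endpoint-name>.<region>.models.ai.azure.com
key = os.environ["AZURE_INFERENCE_KEY"]

client = ChatCompletionsClient(endpoint=endpoint, credential=AzureKeyCredential(key))

# Send a simple chat request to the deployed Phi-3 model.
response = client.complete(
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="Explain what a small language model is in two sentences."),
    ],
    temperature=0.7,
    max_tokens=256,
)

print(response.choices[0].message.content)
```

Run the script with `python invoke_phi3.py` (or whatever you name the file) after setting the two environment variables.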
Read More: Visit Medium.com