Unlocking Generative AI with Phi-3-mini: A Guide to Inference and Deployment

Discover how Phi-3-mini, a new series of models from Microsoft, enables deployment of Large Language Models (LLMs) on edge devices and IoT devices. Learn how to use Semantic Kernel, Ollama/LlamaEdge, and ONNX Runtime to access and infer Phi-3-mini models, and explore the possibilities of generative AI in various application scenarios.

Features

Inference with the Phi-3-mini model using:

  • Semantic Kernel
  • Ollama
  • LlamaEdge
  • ONNX Runtime
  • iOS
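As a quick illustration of one of these paths, the sketch below calls Ollama's local REST API (`POST /api/generate`). It assumes the Ollama server is running on its default port 11434 and that the `phi3` model has already been pulled with `ollama pull phi3`; the function names are illustrative, not part of any demo in this repo.

```python
import json
import urllib.request

# Ollama's default local endpoint (assumes `ollama serve` is running)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "phi3") -> dict:
    """Assemble the JSON body for Ollama's /api/generate endpoint."""
    return {
        "model": model,   # assumes the model was pulled with `ollama pull phi3`
        "prompt": prompt,
        "stream": False,  # one complete response instead of a token stream
    }

def generate(prompt: str) -> str:
    """Send the prompt to a locally running Ollama server and return the reply."""
    body = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With a local server running, `generate("Explain edge AI in one sentence.")` returns the model's text reply; without one, the request simply fails to connect.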

Getting Started

Prerequisites

  • macOS/Windows/Linux
  • Python 3.10+

Guideline

Please read the blog post at https://aka.ms/phi3gettingstarted to run the demo.
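For the ONNX Runtime path in particular, Phi-3-mini expects its chat prompt template before tokenization. The helper below is a minimal sketch of wrapping a user message in that format; the `<|user|>` / `<|end|>` / `<|assistant|>` markers come from the Phi-3 chat format, while the function name itself is my own.

```python
def format_phi3_prompt(user_message: str) -> str:
    """Wrap a user message in the Phi-3 chat template.

    Phi-3 models are trained with <|user|> ... <|end|> <|assistant|> markers;
    the model's generated text continues after the trailing <|assistant|> tag.
    """
    return f"<|user|>\n{user_message}<|end|>\n<|assistant|>"
```

With onnxruntime-genai, this formatted string would be tokenized and fed to the generator; the blog post above walks through the full setup.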

Please read the blog post and follow its guideline to run the iPhone demo.

Resources