Chris & AI

AI is another way to interpret science in life.

Edge AI

Edge AI Optimization and Inference Engine

May 21, 2024 (updated November 12, 2025) by admin

LLM Inference Optimization

  • Here We Give You All the Tips to Run LLM Inference Smoothly
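
As a quick taste of those tips, here is a minimal sketch of LLM generation with the KV cache enabled via Hugging Face Transformers; the model id, dtype, and token budget are placeholders to adapt for your edge hardware.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder model id; pick something that fits your device's memory budget.
model_id = "your-org/your-small-llm"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)
model.eval()

prompt = "Explain edge AI in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    # use_cache=True keeps the KV cache, so each new token attends to past keys/values
    # instead of recomputing the whole prefix at every step.
    output_ids = model.generate(**inputs, max_new_tokens=64, use_cache=True)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```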

Cross-Platform Analysis

  • Code Comparison — OpenVINO VS RyzenAI VS ONNXRuntime
  • Your Debugging Map When Deploying AI Models to Edge Devices
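
The comparison above comes down to one practical point: with ONNX Runtime, the model file and the run() call stay the same across backends, and only the execution-provider list changes. A minimal sketch, assuming the matching vendor build of onnxruntime is installed; the OpenVINO provider name below (and likewise RyzenAI's or Qualcomm's providers shown in later sections) depends on that build:

```python
import numpy as np
import onnxruntime as ort

MODEL = "model.onnx"  # hypothetical exported model
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)

# Same model, same code path; only the provider list is swapped per backend.
for providers in (
    ["CPUExecutionProvider"],                               # plain onnxruntime
    ["OpenVINOExecutionProvider", "CPUExecutionProvider"],  # onnxruntime-openvino build
):
    sess = ort.InferenceSession(MODEL, providers=providers)
    input_name = sess.get_inputs()[0].name
    out = sess.run(None, {input_name: dummy})
    print(providers[0], out[0].shape)
```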

Cross-Platform Optimization – TVM

  • Deploy AI Models Everywhere — Who Is TVM?
  • Speed Up Our AI Inference Using TVM Auto-Tuner
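
A minimal sketch of the classic Relay + AutoTVM tuning loop referenced above (recent TVM releases also offer MetaSchedule); the ONNX path, input name, shape, and trial count are placeholders:

```python
import onnx
import tvm
from tvm import autotvm, relay

# Import an ONNX model into Relay IR.
onnx_model = onnx.load("model.onnx")  # hypothetical path
mod, params = relay.frontend.from_onnx(onnx_model, {"input": (1, 3, 224, 224)})

target = "llvm"  # CPU; use a device-specific target string for other backends

# Extract the tunable tasks (conv/dense kernels) from the model.
tasks = autotvm.task.extract_from_program(mod["main"], target=target, params=params)

measure_option = autotvm.measure_option(
    builder=autotvm.LocalBuilder(),
    runner=autotvm.LocalRunner(number=10),
)

# Tune each task and log the best schedules found.
for task in tasks:
    tuner = autotvm.tuner.XGBTuner(task)
    tuner.tune(
        n_trial=min(200, len(task.config_space)),
        measure_option=measure_option,
        callbacks=[autotvm.callback.log_to_file("tuning.log")],
    )

# Compile with the tuned schedules applied.
with autotvm.apply_history_best("tuning.log"):
    with tvm.transform.PassContext(opt_level=3):
        lib = relay.build(mod, target=target, params=params)
```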

Cross-Platform Optimization – Olive and Onnxruntime

  • Quantize Our Edge AI Model Using Microsoft Olive
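
Olive drives ONNX Runtime's quantization passes through a config file whose schema changes between releases, so as a hedged stand-in the sketch below calls the underlying ONNX Runtime dynamic quantizer directly; the file paths are placeholders:

```python
from onnxruntime.quantization import QuantType, quantize_dynamic

# Dynamic (weight-only) INT8 quantization of an exported ONNX model.
quantize_dynamic(
    model_input="model.onnx",        # hypothetical FP32 model
    model_output="model_int8.onnx",  # quantized output
    weight_type=QuantType.QInt8,
)
```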

AMD Inference Engine

  • Quantize Our Edge AI Model Using AMD RyzenAI
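
On Ryzen AI, the quantized ONNX model is dispatched to the NPU through ONNX Runtime's Vitis AI execution provider. A minimal sketch, assuming the Ryzen AI build of onnxruntime is installed; the provider name and fallback order should be verified against your SDK version:

```python
import onnxruntime as ort

# Load a quantized model and prefer the Vitis AI provider, falling back to CPU.
session = ort.InferenceSession(
    "model_int8.onnx",  # hypothetical model produced by the Ryzen AI quantizer
    providers=["VitisAIExecutionProvider", "CPUExecutionProvider"],
)
print(session.get_providers())  # confirms which providers were actually enabled
```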

Qualcomm Inference Engine

  • Quantize Our Edge AI Model Using Qualcomm QNN
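
For Qualcomm targets, ONNX Runtime's QNN execution provider loads a QNN backend library and offloads the (typically QDQ-quantized) model to the Hexagon NPU. A minimal sketch, assuming the onnxruntime-qnn package; the model path and backend library name are placeholders (use libQnnHtp.so on Linux/Android):

```python
import onnxruntime as ort

# Point the QNN execution provider at the HTP (NPU) backend library.
session = ort.InferenceSession(
    "model_quant.onnx",  # hypothetical QDQ-quantized model
    providers=[("QNNExecutionProvider", {"backend_path": "QnnHtp.dll"})],
)
print(session.get_providers())
```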

Intel Inference Engine

  • Quantize Our Edge AI Model Using Intel OpenVINO
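
With OpenVINO, post-training INT8 quantization is typically done through NNCF against a small calibration set, after which the quantized model is compiled for the target device. A minimal sketch with synthetic calibration data; the model path, input shape, and device are placeholders:

```python
import numpy as np
import nncf
import openvino as ov

core = ov.Core()
model = core.read_model("model.xml")  # hypothetical OpenVINO IR (an .onnx file also works)

# Tiny synthetic calibration set; use real representative samples in practice.
calib_items = [np.random.rand(1, 3, 224, 224).astype(np.float32) for _ in range(100)]
calib_dataset = nncf.Dataset(calib_items)

# Post-training INT8 quantization via NNCF.
quantized = nncf.quantize(model, calib_dataset)
ov.save_model(quantized, "model_int8.xml")

# Compile and run the quantized model on the chosen device.
compiled = core.compile_model(quantized, "CPU")
result = compiled(calib_items[0])
```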
