MCP Server

guidance-for-scalable-model-inference-and-agentic-ai-on-amazon-eks

by aws-solutions-library-samples

A comprehensive, scalable ML inference architecture on Amazon EKS that uses Graviton processors for cost-effective CPU-based inference and GPU instances for accelerated inference. The guidance provides a complete end-to-end platform for deploying LLMs with agentic AI capabilities, including RAG and MCP.

21 stars · 8 forks · Active · Python

Score: 79/100 (Good)

📊 Score Breakdown

🛡️ Security (30%): 5.0/5
Utility (30%): 2.0/5
🔄 Maintenance (25%): 5.0/5
💎 Uniqueness (15%): 4.0/5

Overall = Security (30%) + Utility (30%) + Maintenance (25%) + Uniqueness (15%).
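Assuming each sub-score is out of 5 and the weighted average is rescaled to 0–100, the weights above reproduce the listed overall of 79. A minimal sketch (the `WEIGHTS` dict and `overall_score` helper are illustrative names, not part of any published scoring code):

```python
# Weights from the score breakdown above; each sub-score is assumed to be out of 5,
# and the weighted average is assumed to be rescaled to a 0-100 overall score.
WEIGHTS = {"security": 0.30, "utility": 0.30, "maintenance": 0.25, "uniqueness": 0.15}

def overall_score(scores: dict) -> int:
    """Weighted average of 0-5 sub-scores, rescaled to 0-100 and rounded."""
    weighted = sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)
    return round(weighted / 5 * 100)

# Sub-scores as listed for this server: 5.0, 2.0, 5.0, 4.0
print(overall_score({"security": 5.0, "utility": 2.0,
                     "maintenance": 5.0, "uniqueness": 4.0}))  # → 79
```

The weighted sum is 1.5 + 0.6 + 1.25 + 0.6 = 3.95 out of 5, which rescales to 79.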

ℹ️ Details

Category: 🤖 AI & LLM Tools
Ecosystem: MCP Server
Language: Python
Pricing: Free
License: MIT-0
Status: Active
Platforms: claude

📈 GitHub Signals

Stars: 21
Forks: 8
Commits (30d): 0
Open Issues: 4
Last commit: 1 month ago

Tags: agentic-ai, agentic-workflow, huggingface, inference, inference-engine, langfuse, litellm-ai-gateway, mcp-client


Data last verified: 3 weeks ago.