❓ Frequently Asked Questions

Everything you need to know about OpenFinOps

🚀 Getting Started

What is OpenFinOps?
OpenFinOps is an open-source FinOps platform for AI/ML cost observability and optimization. It provides real-time cost monitoring, LLM-powered recommendations, and multi-cloud tracking (AWS, Azure, GCP) to help teams reduce infrastructure costs by 30-50%.
How long does setup take?
Most teams are up and running in under 5 minutes. Simply install via pip (pip install openfinops), configure your cloud credentials, and launch the dashboard. The system automatically discovers resources and starts tracking costs immediately.
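
For a rough picture of what that looks like in code, here is a minimal quick-start sketch. The class and method names are assumptions for illustration only and may not match the actual OpenFinOps API; the project documentation is the authoritative reference.

```python
# Hypothetical quick-start after `pip install openfinops` -- class and method
# names are illustrative assumptions, not confirmed OpenFinOps API calls.
from openfinops import ObservabilityHub

hub = ObservabilityHub()
hub.register_cloud_provider("aws", region="us-east-1")  # uses credentials from your environment
hub.start_dashboard(port=8080)                          # serve the cost dashboard locally
```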
Is OpenFinOps really free?
Yes! OpenFinOps is 100% free and open source under the Apache 2.0 license. There are no hidden costs, no premium tiers, and no credit card required. You can use all features without any limitations.

💰 Cost Savings

How much can I save with OpenFinOps?
Teams typically save 30-50% on AI/ML infrastructure costs using OpenFinOps through:
• GPU right-sizing: 18% savings
• Auto-scaling: 12% savings
• Spot instances: 15% savings
• LLM recommendations: 7% savings

For a $10,000/month cloud spend, that's $36,000-$60,000 annual savings!
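
The arithmetic behind that range, as a quick sanity check:

```python
# Sanity check on the savings range quoted above.
monthly_spend = 10_000                  # USD per month
savings_low, savings_high = 0.30, 0.50  # typical savings range

annual_low = monthly_spend * savings_low * 12    # 36,000
annual_high = monthly_spend * savings_high * 12  # 60,000
print(f"${annual_low:,.0f} - ${annual_high:,.0f} per year")
```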
What are spot instances and how much can I save?
Spot instances are unused cloud capacity offered at discounts of up to 90% compared to on-demand pricing. OpenFinOps identifies fault-tolerant workloads that are suitable for spot instances and provides specific recommendations. For example, moving an H100 instance billed at $8.20/hour to a spot rate of $0.82/hour saves roughly $64,600 per year for a single instance running 24/7.
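
Here is the same calculation spelled out (the hourly rates are the illustrative figures from above, not quoted prices):

```python
# Worked example of the spot savings above (rates are illustrative, not quotes).
on_demand_rate = 8.20        # USD/hour for an example H100 instance
spot_rate = 0.82             # USD/hour at a ~90% spot discount
hours_per_year = 24 * 365    # instance running continuously

annual_savings = (on_demand_rate - spot_rate) * hours_per_year
print(f"${annual_savings:,.0f} saved per year")  # ~$64,600
```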
What GPU types does OpenFinOps optimize?
OpenFinOps optimizes costs for NVIDIA A100 (80GB), H100 (80GB), L4, T4, and V100 GPUs across AWS, GCP, and Azure. It provides specific recommendations like:
• "Switch from A100 to L4 for inference workloads to save 70%"
• "Use spot instances for H100 to save up to 90%"
• "Right-size from 8x A100 to 4x H100 for better performance and 50% cost reduction"

⚡ Features & Capabilities

Which cloud providers does OpenFinOps support?
OpenFinOps supports Amazon Web Services (AWS), Google Cloud Platform (GCP), and Microsoft Azure. You can monitor costs across all three providers in a single dashboard with unified cost attribution and multi-cloud optimization recommendations.
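
As a sketch of what a multi-cloud setup might look like, assuming a hypothetical registration API (the names below are illustrative, not confirmed):

```python
# Hypothetical multi-cloud registration -- class and method names are
# illustrative assumptions, not confirmed OpenFinOps API calls.
from openfinops import ObservabilityHub

hub = ObservabilityHub()
for provider in ("aws", "gcp", "azure"):
    hub.register_cloud_provider(provider)  # credentials assumed to come from each provider's standard config

summary = hub.cost_summary(group_by="provider")  # one unified view across all three clouds
```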
How does the LLM-powered recommendation work?
OpenFinOps can integrate with OpenAI (GPT-4), Anthropic (Claude), Azure OpenAI, or Ollama (local models) to analyze your infrastructure and provide intelligent optimization recommendations. It analyzes your usage patterns, workload types, and costs to suggest specific actions like GPU downsizing, auto-scaling policies, or spot instance usage. You can also use rule-based recommendations without an LLM.
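
A hedged sketch of what such a configuration could look like; the keys and the fallback option shown are illustrative assumptions rather than documented settings:

```python
# Hypothetical recommender configuration -- the keys and values below are
# illustrative assumptions, not documented OpenFinOps settings.
recommender_config = {
    "provider": "openai",             # or "anthropic", "azure_openai", "ollama"
    "model": "gpt-4",
    "api_key_env": "OPENAI_API_KEY",  # read the API key from the environment
    "fallback": "rule_based",         # rule-based recommendations if no LLM is available
}
```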
Does OpenFinOps work with Kubernetes?
Yes! OpenFinOps tracks costs for Kubernetes clusters (EKS, GKE, AKS) including pod-level cost attribution, node right-sizing recommendations, and cluster auto-scaling policies. It helps optimize container workloads and provides namespace-level cost breakdowns.
Can I track OpenAI and Anthropic API costs?
Yes! OpenFinOps tracks LLM API usage and costs for OpenAI (GPT-4, GPT-3.5), Anthropic (Claude), and other providers. It monitors token usage, calculates costs per request, and provides optimization recommendations like caching, prompt optimization, or model selection.
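
The per-request math is straightforward; the sketch below uses placeholder per-token rates (not current provider pricing) to show how token counts translate into dollars:

```python
# Illustrative per-request cost calculation. The rates below are placeholder
# examples per 1,000 tokens, not current provider pricing.
EXAMPLE_RATES = {
    "gpt-4": {"input": 0.03, "output": 0.06},
    "claude": {"input": 0.008, "output": 0.024},
}

def request_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Cost in USD of one API call under the example rates above."""
    rates = EXAMPLE_RATES[model]
    return input_tokens / 1000 * rates["input"] + output_tokens / 1000 * rates["output"]

print(f"${request_cost('gpt-4', 1_200, 400):.4f}")  # one request: 1,200 input / 400 output tokens
```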
How does cost attribution work?
OpenFinOps provides flexible cost attribution by team, project, environment, or custom tags. You can track costs by ML model, training run, department, or any custom dimension. The system automatically aggregates costs across services and provides detailed reports for chargeback or showback.
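
A hypothetical sketch of tag-based attribution; the method and field names are illustrative assumptions, not the confirmed OpenFinOps API:

```python
# Hypothetical attribution sketch -- method and field names are illustrative
# assumptions, not confirmed OpenFinOps API calls.
from openfinops import ObservabilityHub

hub = ObservabilityHub()
hub.track_cost(
    amount=142.50,                # USD for a single training run
    service="gpu-training",
    tags={
        "team": "ml-platform",    # chargeback/showback dimensions
        "project": "llm-finetune",
        "environment": "prod",
    },
)
report = hub.cost_report(group_by="team")  # aggregate spend by any tag
```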

🤝 Support & Community

How do I get support?
OpenFinOps offers multiple support channels:
• GitHub Discussions - Ask questions and get help from the community
• GitHub Issues - Report bugs and request features
• Comprehensive documentation and guides

Coming soon: Live chat, Discord community, and community office hours

Join our GitHub Discussions to engage with the community and get support!
Can I contribute to OpenFinOps?
Yes! OpenFinOps is open source and welcomes contributions. You can contribute code, documentation, bug reports, feature requests, or help others in the community. Check the CONTRIBUTING.md guide on GitHub to get started. We have a contributor recognition program and monthly community highlights.
Can I use OpenFinOps for production workloads?
Absolutely! OpenFinOps is designed for production environments with features like budget alerts, cost attribution, role-based dashboards (CFO, COO, Infrastructure), and real-time monitoring. It's built to handle large-scale AI/ML workloads.
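
As an illustration, a budget alert might be configured along these lines; the names below are assumptions, not documented OpenFinOps calls:

```python
# Hypothetical budget-alert sketch -- class and method names are illustrative
# assumptions, not confirmed OpenFinOps API calls.
from openfinops import ObservabilityHub

hub = ObservabilityHub()
hub.set_budget(
    scope={"team": "ml-platform"},   # alert on this team's attributed spend
    monthly_limit=25_000,            # USD
    alert_thresholds=[0.8, 1.0],     # notify at 80% and 100% of the budget
)
```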

📊 Comparisons

What's the difference between OpenFinOps and commercial FinOps tools?
OpenFinOps is 100% free and open source with no vendor lock-in. Unlike commercial tools that charge per node or take a percentage of spend, OpenFinOps has zero licensing cost. It's specifically designed for AI/ML workloads with GPU optimization, LLM cost tracking, and intelligent recommendations. You have full control over your data and can self-host or customize the platform.

Still Have Questions?

Join our GitHub community or get started for free