Q&A: Design principles for multi-environment AI architectures
Datacom’s AI and infrastructure experts – Matt Neil (Director – Data Centres), Mike Walls (Director – Cloud) and Daniel Bowbyes (Associate Director – Strategy) – discuss when centralised compute makes sense for AI, and how to orchestrate AI across edge, core data centres and cloud. The team shares governance, readiness and architectural approaches to enable reliable multi-environment AI.

When does centralised cloud or core data centre compute make the most sense for AI workloads?

Mike Walls, Director – Cloud: Centralised compute is sensible when workloads benefit from scale, governance and uniform platform capabilities that are harder to achieve in distributed setups. Think large-scale training, platforms or workloads requiring a consistent, controlled environment with robust security and
Could not retrieve the full article text.
Read the full article on CIO Magazine: https://www.cio.com/article/4153211/qa-design-principles-for-multi-environment-ai-architectures.html